---
license: cc-by-nc-4.0
datasets:
  - PhysicsWallahAI/JEE-Main-2025-Math
language:
  - en
base_model:
  - distilbert/distilbert-base-uncased
pipeline_tag: question-answering
tags:
  - math
  - PhysicsWallah
  - JEE
  - mathematics
library_name: transformers
---

# DistilBERT JEE MCQ Classifier

This model is DistilBERT (base uncased) fine-tuned to answer JEE-style multiple-choice mathematics questions: given a question and four options, it predicts the correct option (A, B, C, or D).
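A minimal sketch of how inputs and outputs might be wired up. The exact input format used during fine-tuning is not documented here, so the concatenation scheme and the label-to-letter mapping below are assumptions:

```python
def format_mcq(question: str, options: list[str]) -> str:
    """Concatenate a question with its four lettered options into one string
    (assumed input format; adapt to match the fine-tuning preprocessing)."""
    lettered = " ".join(f"({letter}) {opt}" for letter, opt in zip("ABCD", options))
    return f"{question} {lettered}"

def logits_to_choice(logits: list[float]) -> str:
    """Map the 4-way classifier logits to an option letter (index 0 -> A, ...)."""
    best = max(range(len(logits)), key=lambda i: logits[i])
    return "ABCD"[best]

text = format_mcq("What is 2 + 2?", ["3", "4", "5", "6"])
# feed `text` through the tokenizer/model; suppose it returns these logits:
print(logits_to_choice([0.1, 2.3, 0.4, 0.2]))  # -> B
```

In a real pipeline, `text` would be tokenized and passed through the fine-tuned model (e.g. via transformers' `AutoModelForSequenceClassification`), and the resulting logits fed to `logits_to_choice`.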


## Training Data

- **Source:** PhysicsWallahAI JEE Main 2025 Math dataset (January + April shifts)
- **Filtering:** only multiple-choice questions (MCQs) were used.
- **Split:** the combined January and April shifts were split 80% train / 20% test.
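The filter-and-split step described above can be sketched in plain Python (the `question_type` field name and the `"MCQ"` value are assumptions about the dataset schema):

```python
import random

def filter_and_split(records, test_frac=0.2, seed=42):
    """Keep only MCQ records, shuffle deterministically, and split train/test."""
    mcqs = [r for r in records if r.get("question_type") == "MCQ"]  # field name assumed
    rng = random.Random(seed)
    rng.shuffle(mcqs)
    cut = int(len(mcqs) * (1 - test_frac))
    return mcqs[:cut], mcqs[cut:]

data = [{"question_type": "MCQ", "id": i} for i in range(10)]
data += [{"question_type": "Numerical", "id": 99}]  # dropped by the filter
train, test = filter_and_split(data)
print(len(train), len(test))  # -> 8 2
```

With Hugging Face `datasets`, the same result can be obtained with `Dataset.filter` followed by `Dataset.train_test_split(test_size=0.2)`.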


## Training Details

- Base model: `distilbert-base-uncased`
- Epochs: 10
- Batch size: 4
- Learning rate: 1e-5
- Weight decay: 0.1
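The hyperparameters above, collected in one place (a sketch; the keys mirror the corresponding transformers `TrainingArguments` names, and output paths or other options are omitted):

```python
# Hyperparameters as reported in this card; keys follow TrainingArguments naming.
HPARAMS = {
    "base_model": "distilbert-base-uncased",
    "num_train_epochs": 10,
    "per_device_train_batch_size": 4,
    "learning_rate": 1e-5,
    "weight_decay": 0.1,
}
```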


## Results

- Evaluation accuracy: 40%
- Evaluation loss: ~1.42
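For context, accuracy can be compared against the 25% random-guess baseline with a few lines of Python (an illustrative helper, not part of the released code):

```python
def accuracy(preds, labels):
    """Fraction of predictions that match the gold labels."""
    assert len(preds) == len(labels) and labels
    return sum(p == g for p, g in zip(preds, labels)) / len(labels)

RANDOM_BASELINE = 1 / 4  # uniform guessing over four options

acc = accuracy(["A", "B", "C", "D", "A"], ["A", "B", "D", "D", "B"])
print(acc, acc > RANDOM_BASELINE)  # -> 0.6 True
```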


## Limitations

- Evaluation accuracy (40%) is above the 25% random-guess baseline but far too low for real exam preparation.
- Trained only on math MCQs from the JEE Main 2025 dataset.
- Does not handle numerical-answer or subjective questions.


## Intended Use

- Research and experimentation with MCQ-style answer classification.
- Baseline model for further fine-tuning or improvement.


**License:** cc-by-nc-4.0