# DistilBERT JEE MCQ Classifier
This model is a fine-tuned DistilBERT (base, uncased) classifier for JEE-style multiple-choice math questions: given a question and its four options, it predicts the correct choice (A, B, C, or D).
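A minimal inference sketch is shown below. The repository id comes from this card's model page; the input format (question and options joined into a single string) is an assumption, since the exact preprocessing used during training is not documented here.

```python
# Minimal inference sketch. The way the question and options are concatenated
# is an assumption and may not match the training-time format.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_ID = "Neural-Hacker/distilbert-jee-math-mcq-2025"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID)

question = "If the sum of the first n terms of an AP is 3n^2 + 5n, find its 10th term."
options = {"A": "59", "B": "62", "C": "64", "D": "68"}

# Join the question and its four options into one sequence (assumed format).
text = question + " " + " ".join(f"({k}) {v}" for k, v in options.items())

inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

pred = logits.argmax(dim=-1).item()
print("Predicted option:", "ABCD"[pred])  # assumes labels 0-3 map to A-D
```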
## Training Data
- **Source:** PhysicsWallahAI JEE Main 2025 Math dataset (January + April shifts)
- **Filtering:** Only multiple-choice questions (MCQs) were used.
- **Split:** The combined January and April shifts were divided into 80% train / 20% test (see the preparation sketch below).
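A rough data-preparation sketch under the description above. The dataset repository id, split names, and column names (e.g. `question_type`) are guesses for illustration only; check the actual PhysicsWallahAI dataset card for the real schema.

```python
# Data-preparation sketch. Repo id, split names ("jan"/"apr"), and the
# "question_type" column are assumptions, not documented facts.
from datasets import load_dataset, concatenate_datasets

raw = load_dataset("PhysicsWallahAI/JEE-Main-2025-Math")  # assumed repo id

# Combine the January and April shift splits (assumed split names).
combined = concatenate_datasets([raw["jan"], raw["apr"]])

# Keep only multiple-choice questions.
mcq_only = combined.filter(lambda ex: ex["question_type"] == "MCQ")

# 80% train / 20% test split, as described above.
splits = mcq_only.train_test_split(test_size=0.2, seed=42)
train_ds, test_ds = splits["train"], splits["test"]
print(len(train_ds), len(test_ds))
```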
## Training Details
- **Base model:** `distilbert-base-uncased`
- **Epochs:** 10
- **Batch size:** 4
- **Learning rate:** 1e-5
- **Weight decay:** 0.1
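A minimal fine-tuning sketch with the Hugging Face `Trainer`, using the hyperparameters listed above. Only the hyperparameter values come from this card; the tokenization, input format, and label mapping (A-D to 0-3) are assumptions, and `train_ds` / `test_ds` refer to the data-preparation sketch above.

```python
# Fine-tuning sketch. Hyperparameter values are from this card; the assumed
# fields are "text" (question plus options) and "label" (correct index 0-3).
import numpy as np
from transformers import (
    AutoTokenizer,
    AutoModelForSequenceClassification,
    TrainingArguments,
    Trainer,
)

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=4  # options A, B, C, D
)

def preprocess(example):
    enc = tokenizer(example["text"], truncation=True)
    enc["labels"] = example["label"]
    return enc

train_tok = train_ds.map(preprocess)
test_tok = test_ds.map(preprocess)

def compute_metrics(eval_pred):
    preds = np.argmax(eval_pred.predictions, axis=-1)
    return {"accuracy": float((preds == eval_pred.label_ids).mean())}

args = TrainingArguments(
    output_dir="distilbert-jee-math-mcq",
    num_train_epochs=10,
    per_device_train_batch_size=4,
    learning_rate=1e-5,
    weight_decay=0.1,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_tok,
    eval_dataset=test_tok,
    tokenizer=tokenizer,  # enables padding via DataCollatorWithPadding
    compute_metrics=compute_metrics,
)
trainer.train()
print(trainer.evaluate())
```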
## Results
- **Evaluation accuracy:** 40%
- **Evaluation loss:** ~1.42
## Limitations
- Accuracy is above the random-guess baseline (25%) but far too low for real exam preparation.
- Trained only on Math MCQs from the JEE Main 2025 dataset.
- Does not handle numerical-answer or subjective questions.
## Intended Use
- Research and experimentation with MCQ-style classification.
- Baseline model for further fine-tuning or improvement.