Model Card

This model is a fine-tuned version of microsoft/deberta-base on the OnionOrNot dataset. It was trained for 5 epochs with a learning rate of 2e-5 and a linear learning-rate schedule; random token dropout was applied during training to reduce overfitting. The classification report on the validation set follows.
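The random token dropout used during training can be sketched as a simple input augmentation: each token id is independently replaced with a mask id (or dropped) with some small probability. This is a hypothetical, minimal implementation; the card does not specify the exact scheme, probability, or mask token used.

```python
import random

def token_dropout(token_ids, drop_prob=0.1, mask_id=0, seed=None):
    """Randomly replace each token id with `mask_id` with probability
    `drop_prob`. A hypothetical sketch of the augmentation; the exact
    scheme used for this model is not documented."""
    rng = random.Random(seed)
    return [mask_id if rng.random() < drop_prob else t for t in token_ids]

# Example: a short (made-up) token-id sequence.
tokens = [101, 2158, 2007, 9092, 102]
augmented = token_dropout(tokens, drop_prob=0.2, seed=0)
print(augmented)  # same length; some ids may be replaced by 0
```

Applying this per batch during training exposes the classifier to slightly corrupted inputs each epoch, which discourages memorization of exact token sequences.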

Final Validation Accuracy: 93.31%

Final Classification Report:
              precision    recall  f1-score   support

    NotOnion       0.94      0.96      0.95      3000
       Onion       0.93      0.89      0.91      1800

    accuracy                           0.93      4800
   macro avg       0.93      0.92      0.93      4800
weighted avg       0.93      0.93      0.93      4800
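The macro and weighted averages in the report follow directly from the per-class scores and supports. A minimal sketch of that arithmetic, using the (rounded) F1 values and supports from the table above; recomputing from rounded inputs can differ from the report in the last digit:

```python
# Per-class F1 and support, taken from the report above (rounded values).
f1 = {"NotOnion": 0.95, "Onion": 0.91}
support = {"NotOnion": 3000, "Onion": 1800}

# Macro average: unweighted mean over classes.
macro_f1 = sum(f1.values()) / len(f1)

# Weighted average: mean weighted by each class's support.
total = sum(support.values())
weighted_f1 = sum(f1[c] * support[c] / total for c in f1)

print(f"macro f1:    {macro_f1:.3f}")     # (0.95 + 0.91) / 2
print(f"weighted f1: {weighted_f1:.3f}")  # support-weighted mean
```

Because NotOnion has the larger support (3000 of 4800 samples), the weighted average sits closer to its F1 than the macro average does.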

Running inference on a new sample gave the correct prediction:

Running example inference...
Text: Man With Fogged-Up Glasses Forced To Finish Soup Using Other Senses
Prediction: Onion
Confidence: 87.76%

Probabilities:
NotOnion: 12.24%
Onion: 87.76%
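The confidence and per-class probabilities above come from a softmax over the model's two output logits. A self-contained sketch of that step; the logit values here are made up for illustration (only their difference determines the probabilities):

```python
import math

def softmax(logits):
    # Subtract the max logit for numerical stability before exponentiating.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits for [NotOnion, Onion]; a difference of about 1.97
# reproduces the split reported above.
probs = softmax([0.0, 1.97])
print(f"NotOnion: {probs[0]:.2%}")
print(f"Onion:    {probs[1]:.2%}")
```

The reported "Confidence" is simply the larger of the two softmax probabilities.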
Model size: 0.1B params (F32, safetensors)

Model: gokul-pv/deberta-base-OnionOrNot (fine-tuned from microsoft/deberta-base)