---
library_name: transformers
language:
- en
base_model:
- microsoft/deberta-base
pipeline_tag: text-classification
---
|
|
|
|
|
# Model Card: DeBERTa-base fine-tuned on Onion or Not
|
|
|
|
|
This model is a fine-tuned version of [microsoft/deberta-base](https://huggingface.co/microsoft/deberta-base) trained on the [Onion or Not](https://www.kaggle.com/datasets/chrisfilo/onion-or-not) dataset, a binary classification task: decide whether a headline was published by the satirical outlet The Onion or is a real headline.
|
|
The model was fine-tuned for 5 epochs with a learning rate of 2e-5 and a linear schedule. Random token dropout was applied during training to reduce overfitting.
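The exact token-dropout implementation is not included in this card, but the idea can be sketched as follows (the function name, the dropout probability, and the token ids are illustrative, not taken from the actual training code):

```python
import random

def random_token_dropout(token_ids, mask_token_id, p=0.1, seed=None):
    """Randomly replace each token id with the mask token with probability p."""
    rng = random.Random(seed)
    return [mask_token_id if rng.random() < p else t for t in token_ids]

# Tokenized input (illustrative ids). With p=0.15, roughly 15% of tokens are
# masked on each pass, so the model rarely sees the exact same sequence twice.
ids = [101, 2054, 2003, 2023, 102]
noisy = random_token_dropout(ids, mask_token_id=103, p=0.15, seed=0)
```

Applying the corruption fresh on every epoch acts as a regularizer, similar in spirit to the masking used in masked-language-model pretraining.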
|
|
The classification report is shown below: |
|
|
```
Final Validation Accuracy: 93.31%

Final Classification Report:
              precision    recall  f1-score   support

    NotOnion       0.94      0.96      0.95      3000
       Onion       0.93      0.89      0.91      1800

    accuracy                           0.93      4800
   macro avg       0.93      0.92      0.93      4800
weighted avg       0.93      0.93      0.93      4800
```
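The macro and weighted averages in the report follow directly from the per-class scores and supports. A quick sanity check using the recall column (values copied from the report above):

```python
def macro_avg(values):
    # Unweighted mean over classes
    return sum(values) / len(values)

def weighted_avg(values, supports):
    # Mean over classes, weighted by each class's support
    total = sum(supports)
    return sum(v * s for v, s in zip(values, supports)) / total

recalls = [0.96, 0.89]    # NotOnion, Onion
supports = [3000, 1800]

macro_r = macro_avg(recalls)                  # ~0.925, shown as 0.92
weighted_r = weighted_avg(recalls, supports)  # ~0.934, shown as 0.93
```

The weighted recall also equals overall accuracy, which matches the reported 93.31% up to the rounding of the per-class values.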
|
|
|
|
|
Running inference on a new sample gave the correct prediction: |
|
|
```
Running example inference...
Text: Man With Fogged-Up Glasses Forced To Finish Soup Using Other Senses
Prediction: Onion
Confidence: 87.76%

Probabilities:
  NotOnion: 12.24%
  Onion: 87.76%
```
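The confidence and probability values above are the softmax of the model's two class logits. A minimal sketch (the logit values below are illustrative, not the model's actual outputs):

```python
import math

def softmax(logits):
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

# Illustrative logits for [NotOnion, Onion]; real values come from the
# model's classification head.
probs = softmax([-1.0, 1.0])
confidence = max(probs)
label = ["NotOnion", "Onion"][probs.index(confidence)]
```

For two classes, the softmax reduces to a sigmoid of the logit difference, so only the gap between the two logits determines the reported confidence.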
|
|
|