# DistilBERT Fine-tuned for Text Classification

This is a fine-tuned version of `distilbert-base-uncased` for a custom text classification task, trained with the Hugging Face Transformers library and logged with Weights & Biases.

## Model Details

- Base model: `distilbert-base-uncased`
- Task: Sequence Classification (binary or multi-class)
- Training Framework: Transformers (🤗)
- Logged with: Weights & Biases

## Training Info

- Epochs: 1
- Final training loss: ~0.08
- Evaluation: ~100% MAP (mean average precision), measured on the training data rather than a held-out set
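Assuming the default cross-entropy loss used by Transformers sequence-classification heads, a final training loss around 0.08 implies the model assigns the correct class an average probability of roughly e^(-0.08). A quick sketch of that back-of-the-envelope conversion:

```python
import math

# Cross-entropy loss for one example is -log(p_correct),
# so an average loss of ~0.08 implies p_correct ≈ exp(-0.08).
final_loss = 0.08
approx_correct_prob = math.exp(-final_loss)
print(f"Implied average correct-class probability: {approx_correct_prob:.3f}")  # → 0.923
```

This is only a rough average; individual examples can have much higher or lower confidence.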

## How to Use

```python
from transformers import pipeline

# Load the fine-tuned model from the Hugging Face Hub
classifier = pipeline("text-classification", model="drzeeIslam/distilbert-finetuned")

result = classifier("Your input text here")
print(result)  # list of dicts with 'label' and 'score' keys
```
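The `score` the pipeline returns is the softmax probability of the predicted label, computed from the model's raw logits. A minimal, self-contained sketch of that conversion (the logit values here are made up for illustration):

```python
import math

def softmax(logits):
    """Convert raw logits to probabilities that sum to 1."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical logits from a binary classifier head
logits = [-1.2, 2.3]
probs = softmax(logits)
predicted = max(range(len(probs)), key=probs.__getitem__)
print(predicted, round(probs[predicted], 3))  # → 1 0.971
```

Passing `top_k=None` to the pipeline returns the scores for all labels instead of only the top one.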