How to use yl0628/text-distilbert-clothes-predictor with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="yl0628/text-distilbert-clothes-predictor")
```

```python
# Load the model and tokenizer directly
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("yl0628/text-distilbert-clothes-predictor")
model = AutoModelForSequenceClassification.from_pretrained("yl0628/text-distilbert-clothes-predictor")
```

This model is a fine-tuned version of distilbert-base-uncased on the danjung9/24679-Project1-aug dataset. It achieves perfect scores (accuracy, F1, precision, and recall of 1.0) on the evaluation set.
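When using the direct-load path, the model's raw logits still need to be turned into a label via softmax and argmax. A minimal pure-Python sketch of that post-processing step, using hypothetical logits and a hypothetical two-label `id2label` mapping (the real mapping lives in `model.config.id2label`):

```python
import math

def softmax(logits):
    # Subtract the max for numerical stability before exponentiating.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical label mapping and logits, for illustration only.
id2label = {0: "not_business_casual", 1: "business_casual"}
logits = [-2.1, 3.4]

probs = softmax(logits)
label = id2label[max(range(len(probs)), key=probs.__getitem__)]
```

The `pipeline` helper performs this same conversion internally and returns the label and score directly.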
This is a DistilBERT model trained on a clothing dataset (see the Training and evaluation data section). It classifies whether a piece of clothing is business casual.
Intended for educational purposes. The dataset may not be large enough, as the uniformly perfect evaluation scores suggest.
Source: https://huggingface.co/datasets/danjung9/24679-Project1-aug
Train/validation/test split: 64/16/20.
The following results were recorded during training:
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall |
|---|---|---|---|---|---|---|---|
| 0.0001 | 1.0 | 800 | 0.0001 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0 | 2.0 | 1600 | 0.0000 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0 | 3.0 | 2400 | 0.0000 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0 | 4.0 | 3200 | 0.0000 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0 | 5.0 | 4000 | 0.0000 | 1.0 | 1.0 | 1.0 | 1.0 |
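The accuracy, precision, recall, and F1 columns above are the standard binary-classification metrics. A minimal pure-Python sketch of how they are computed from true and predicted labels (the function name is illustrative; libraries such as scikit-learn provide equivalent implementations):

```python
def binary_metrics(y_true, y_pred):
    """Compute accuracy, precision, recall, and F1 for binary labels (0/1)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return accuracy, precision, recall, f1
```

When every prediction matches its label, all four metrics are 1.0, as in the table; that uniformity is one reason to suspect the evaluation set is too small or too easy.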
Base model: distilbert/distilbert-base-uncased