# 2025-24679-HW2-Part3-text-distilbert-predictor

This model is a fine-tuned version of distilbert-base-uncased on a small custom dataset of regionally themed text (see *Training and evaluation data* below). It achieves the following results on the evaluation set:
- Loss: 0.0047
- Accuracy: 1.0
- F1: 1.0
- Precision: 1.0
- Recall: 1.0
## Model description

This model classifies short text descriptions as representing either East Coast or West Coast culture. It was fine-tuned on a small dataset of regionally themed examples.
## Intended uses & limitations

- Intended use: educational purposes and small-scale demos of text classification with Hugging Face transformers; see the inference sketch below.
- Limitations: the dataset is small and subjective, so predictions may not generalize beyond the training examples. The perfect evaluation scores above likely reflect the simplicity of the held-out set rather than real-world robustness.
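For a quick demo, the model can be loaded with the `pipeline` API. A minimal sketch, assuming the checkpoint is public on the Hub and keeps the default `LABEL_0`/`LABEL_1` label names; the example inputs are illustrative:

```python
from transformers import pipeline

# Load the fine-tuned checkpoint from the Hugging Face Hub.
classifier = pipeline(
    "text-classification",
    model="SebastianAndreu/2025-24679-HW2-Part3-text-distilbert-predictor",
)

# Illustrative inputs; labels are assumed to map to
# LABEL_0 = East Coast, LABEL_1 = West Coast.
print(classifier("Grabbing a bagel before catching the subway downtown."))
print(classifier("Surfing at sunrise, then a smoothie bowl on the boardwalk."))
```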
## Training and evaluation data
The dataset consisted of short text snippets describing lifestyle, weather, food, and culture, each labeled as East Coast (0) or West Coast (1).
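The exact examples are not published with this card; the sketch below only illustrates the data format, using hypothetical snippets and the stated 0/1 label convention with the `datasets` library:

```python
from datasets import Dataset

# Hypothetical snippets in the style described above; the actual
# training examples are not included in this card.
data = {
    "text": [
        "Bracing for a nor'easter with a hot slice of pizza.",
        "Hiking through the redwoods before hitting the farmers market.",
    ],
    "label": [0, 1],  # 0 = East Coast, 1 = West Coast
}
dataset = Dataset.from_dict(data)
```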
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (reproduced as a `TrainingArguments` sketch after this list):
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch fused) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 5
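A minimal sketch mapping these values onto `transformers.TrainingArguments`; the `output_dir` is hypothetical, and the fused AdamW optimizer uses the stated betas and epsilon as its defaults:

```python
from transformers import TrainingArguments

# Sketch reconstructing the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="distilbert-coast-classifier",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch_fused",  # betas=(0.9, 0.999), eps=1e-08 are the defaults
    lr_scheduler_type="linear",
    num_train_epochs=5,
)
```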
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 | Precision | Recall |
|---|---|---|---|---|---|---|---|
| 0.0064 | 1.0 | 120 | 0.0052 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0016 | 2.0 | 240 | 0.0039 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0009 | 3.0 | 360 | 0.0007 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0006 | 4.0 | 480 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
| 0.0006 | 5.0 | 600 | 0.0005 | 1.0 | 1.0 | 1.0 | 1.0 |
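The accuracy, F1, precision, and recall columns are the kind of values produced by a `compute_metrics` callback passed to `Trainer`. A sketch of such a callback, assuming scikit-learn and binary labels; this is an assumption, not the author's actual evaluation code:

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

# Hypothetical metrics callback of the kind usually passed to Trainer.
def compute_metrics(eval_pred):
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="binary"
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1,
        "precision": precision,
        "recall": recall,
    }
```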
### Framework versions
- Transformers 4.56.1
- PyTorch 2.8.0+cu126
- Datasets 4.0.0
- Tokenizers 0.22.0