# slac-single-head
This model is a fine-tuned version of an unspecified base model on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.2427
- Accuracy: 0.9493
- F1 Macro: 0.9026
- Precision Macro: 0.8893
- Recall Macro: 0.9182
- Total Tf: [938, 215, 5145, 110]
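The macro-averaged metrics above are computed per class and then averaged, so each class contributes equally regardless of its frequency. A minimal pure-Python sketch of that definition (the label set and example predictions below are hypothetical, not taken from this model's evaluation data):

```python
def macro_scores(y_true, y_pred, labels):
    """Compute macro-averaged precision, recall, and F1.

    Each class's precision/recall/F1 is computed from its own
    true-positive / false-positive / false-negative counts, then
    the per-class scores are averaged with equal weight.
    """
    precisions, recalls, f1s = [], [], []
    for c in labels:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        prec = tp / (tp + fp) if (tp + fp) else 0.0
        rec = tp / (tp + fn) if (tp + fn) else 0.0
        f1 = 2 * prec * rec / (prec + rec) if (prec + rec) else 0.0
        precisions.append(prec)
        recalls.append(rec)
        f1s.append(f1)
    n = len(labels)
    return sum(precisions) / n, sum(recalls) / n, sum(f1s) / n


# Hypothetical two-class example to illustrate the averaging:
p, r, f = macro_scores([0, 0, 1, 1], [0, 1, 1, 1], labels=[0, 1])
```

In practice `sklearn.metrics.precision_recall_fscore_support(..., average="macro")` computes the same quantities.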
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 424
- num_epochs: 5
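The linear scheduler with 424 warmup steps ramps the learning rate from 0 up to 2e-05 over the first epoch (425 steps per epoch, per the table below), then decays it linearly to 0 by step 2125. A small sketch of that schedule, assuming the standard warmup-then-linear-decay shape used by the Transformers linear scheduler:

```python
def lr_at_step(step, base_lr=2e-05, warmup_steps=424, total_steps=2125):
    """Linear warmup followed by linear decay to zero.

    - Steps [0, warmup_steps): lr rises linearly from 0 to base_lr.
    - Steps [warmup_steps, total_steps]: lr falls linearly back to 0.
    """
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    remaining = total_steps - step
    return base_lr * max(0.0, remaining / (total_steps - warmup_steps))


# Peak learning rate is reached at the end of warmup:
peak = lr_at_step(424)  # 2e-05
```

With `transformers`, the equivalent is `get_linear_schedule_with_warmup(optimizer, num_warmup_steps=424, num_training_steps=2125)`.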
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 Macro | Precision Macro | Recall Macro | Total Tf |
|---|---|---|---|---|---|---|---|---|
| 0.3241 | 1.0 | 425 | 0.3027 | 0.9203 | 0.8632 | 0.8415 | 0.9114 | [961, 424, 4936, 87] |
| 0.267 | 2.0 | 850 | 0.2239 | 0.9494 | 0.9047 | 0.8951 | 0.9214 | [956, 232, 5128, 92] |
| 0.1973 | 3.0 | 1275 | 0.2265 | 0.9521 | 0.9061 | 0.8995 | 0.9140 | [932, 191, 5169, 116] |
| 0.1418 | 4.0 | 1700 | 0.2320 | 0.9476 | 0.9018 | 0.8855 | 0.9219 | [953, 241, 5119, 95] |
| 0.1222 | 5.0 | 2125 | 0.2427 | 0.9493 | 0.9026 | 0.8893 | 0.9182 | [938, 215, 5145, 110] |
### Framework versions
- Transformers 4.51.1
- Pytorch 2.8.0.dev20250515+cu128
- Datasets 3.5.0
- Tokenizers 0.21.1