# rlcc-new-taste-upsample_replacement-absa-avg

This model is a fine-tuned version of an unspecified base model on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 1.5750
- Accuracy: 0.5370
- F1 Macro: 0.5366
- Precision Macro: 0.5423
- Recall Macro: 0.5349
- F1 Micro: 0.5370
- Precision Micro: 0.5370
- Recall Micro: 0.5370
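Note that the micro precision, recall, and F1 above all equal the accuracy (0.5370): in single-label classification, every prediction contributes exactly one true positive or one false positive/negative, so the micro-averaged metrics collapse to accuracy, while macro averaging weights each class equally. A minimal pure-Python sketch of the difference (the toy labels below are hypothetical, not from this model's data):

```python
def per_class_prf(y_true, y_pred, label):
    """One-vs-rest precision, recall, and F1 for a single class."""
    tp = sum(t == label and p == label for t, p in zip(y_true, y_pred))
    fp = sum(t != label and p == label for t, p in zip(y_true, y_pred))
    fn = sum(t == label and p != label for t, p in zip(y_true, y_pred))
    prec = tp / (tp + fp) if tp + fp else 0.0
    rec = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
    return prec, rec, f1

def macro_f1(y_true, y_pred):
    """Average the per-class F1 scores with equal class weight."""
    labels = sorted(set(y_true) | set(y_pred))
    return sum(per_class_prf(y_true, y_pred, l)[2] for l in labels) / len(labels)

def micro_accuracy(y_true, y_pred):
    """Micro P = micro R = micro F1 = accuracy for single-label tasks."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

# Hypothetical 3-class toy example:
y_true = [0, 0, 1, 1, 2, 2]
y_pred = [0, 1, 1, 1, 2, 0]
print(micro_accuracy(y_true, y_pred))  # 0.6666666666666666
print(macro_f1(y_true, y_pred))
```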
- Total Tf: [196, 169, 561, 169]
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 46
- num_epochs: 25
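From the results table below, each epoch is 47 optimizer steps, so 25 epochs give 1175 total steps, and the 46 warmup steps span roughly the first epoch. A sketch of the `linear` schedule these hyperparameters imply (a simplified stand-in, not the Transformers library's implementation):

```python
def linear_lr(step, base_lr=2e-5, warmup_steps=46, total_steps=47 * 25):
    """Linear warmup from 0 to base_lr, then linear decay back to 0."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

print(linear_lr(0))     # 0.0 (start of warmup)
print(linear_lr(46))    # 2e-05 (peak, end of warmup)
print(linear_lr(1175))  # 0.0 (fully decayed at the final step)
```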
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 Macro | Precision Macro | Recall Macro | F1 Micro | Precision Micro | Recall Micro | Total Tf |
|---|---|---|---|---|---|---|---|---|---|---|---|
| 1.1188 | 1.0 | 47 | 1.0764 | 0.4219 | 0.3373 | 0.2961 | 0.4151 | 0.4219 | 0.4219 | 0.4219 | [154, 211, 519, 211] |
| 0.9219 | 2.0 | 94 | 0.9610 | 0.5425 | 0.5385 | 0.5376 | 0.5396 | 0.5425 | 0.5425 | 0.5425 | [198, 167, 563, 167] |
| 0.7458 | 3.0 | 141 | 0.9686 | 0.5315 | 0.5292 | 0.5340 | 0.5282 | 0.5315 | 0.5315 | 0.5315 | [194, 171, 559, 171] |
| 0.5206 | 4.0 | 188 | 1.0647 | 0.5589 | 0.5555 | 0.5563 | 0.5557 | 0.5589 | 0.5589 | 0.5589 | [204, 161, 569, 161] |
| 0.3686 | 5.0 | 235 | 1.1451 | 0.5507 | 0.5445 | 0.5443 | 0.5469 | 0.5507 | 0.5507 | 0.5507 | [201, 164, 566, 164] |
| 0.3191 | 6.0 | 282 | 1.2655 | 0.5534 | 0.5541 | 0.5590 | 0.5505 | 0.5534 | 0.5534 | 0.5534 | [202, 163, 567, 163] |
| 0.227 | 7.0 | 329 | 1.3891 | 0.5479 | 0.5482 | 0.5529 | 0.5455 | 0.5479 | 0.5479 | 0.5479 | [200, 165, 565, 165] |
| 0.1724 | 8.0 | 376 | 1.4916 | 0.5562 | 0.5499 | 0.5489 | 0.5525 | 0.5562 | 0.5562 | 0.5562 | [203, 162, 568, 162] |
| 0.1441 | 9.0 | 423 | 1.5750 | 0.5370 | 0.5366 | 0.5423 | 0.5349 | 0.5370 | 0.5370 | 0.5370 | [196, 169, 561, 169] |
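The table shows the usual overfitting signature: training loss keeps falling while validation loss bottoms out at epoch 2 (0.9610) and rises thereafter, so the reported final checkpoint (epoch 9, loss 1.5750) is not the best one by validation loss. A small sketch of picking the best epoch from the values copied out of the table:

```python
# (epoch, validation loss) pairs copied from the table above
val_loss = {1: 1.0764, 2: 0.9610, 3: 0.9686, 4: 1.0647, 5: 1.1451,
            6: 1.2655, 7: 1.3891, 8: 1.4916, 9: 1.5750}

# Best checkpoint = epoch with the minimum validation loss
best_epoch = min(val_loss, key=val_loss.get)
print(best_epoch, val_loss[best_epoch])  # 2 0.961
```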
## Framework versions
- Transformers 4.52.4
- Pytorch 2.6.0+cu124
- Datasets 3.6.0
- Tokenizers 0.21.2