rlcc-appearance-upsample_replacement-absa-None

This model is a fine-tuned version of an unspecified base model on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 1.9560
  • Accuracy: 0.6610
  • F1 Macro: 0.6269
  • Precision Macro: 0.6302
  • Recall Macro: 0.6268
  • Total Tf: [271, 139, 1091, 139]
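The macro-averaged scores above are unweighted means of the per-class precision, recall, and F1. A minimal pure-Python sketch of how such macro scores are computed (the labels below are a made-up three-class example, not the actual evaluation data):

```python
from collections import defaultdict

def macro_scores(y_true, y_pred):
    """Unweighted per-class precision/recall/F1, averaged over all classes."""
    classes = sorted(set(y_true) | set(y_pred))
    tp, fp, fn = defaultdict(int), defaultdict(int), defaultdict(int)
    for t, p in zip(y_true, y_pred):
        if t == p:
            tp[t] += 1
        else:
            fp[p] += 1  # predicted p, but true class was t
            fn[t] += 1  # missed an instance of t
    precs, recs, f1s = [], [], []
    for c in classes:
        prec = tp[c] / (tp[c] + fp[c]) if tp[c] + fp[c] else 0.0
        rec = tp[c] / (tp[c] + fn[c]) if tp[c] + fn[c] else 0.0
        f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
        precs.append(prec)
        recs.append(rec)
        f1s.append(f1)
    n = len(classes)
    return sum(precs) / n, sum(recs) / n, sum(f1s) / n

# Toy example with classes 0/1/2 (e.g. negative/neutral/positive):
p, r, f = macro_scores([0, 0, 1, 1, 2, 2], [0, 1, 1, 1, 2, 0])
```

The macro average weights every class equally regardless of its frequency, which is why the F1 Macro (0.6269) sits below the plain accuracy (0.6610) here: rarer classes that the model handles less well pull the macro scores down.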

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 64
  • eval_batch_size: 64
  • seed: 42
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 65
  • num_epochs: 25
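With 66 optimizer steps per epoch (see the results table below) and 25 epochs, training runs about 1650 steps, so the 65 warmup steps cover roughly the first 4% of training. A minimal sketch of the linear-warmup, linear-decay schedule these settings describe (pure Python, standing in for the actual Transformers scheduler):

```python
def linear_schedule_lr(step, max_lr=2e-05, warmup_steps=65, total_steps=1650):
    """Linear warmup from 0 to max_lr, then linear decay back to 0."""
    if step < warmup_steps:
        return max_lr * step / warmup_steps
    return max(0.0, max_lr * (total_steps - step) / (total_steps - warmup_steps))
```

The learning rate peaks at 2e-05 exactly when warmup ends and reaches 0 on the final step.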

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 Macro | Precision Macro | Recall Macro | Total Tf |
|:-------------|:-----|:----|:---------------|:---------|:---------|:----------------|:-------------|:---------|
| 1.1096 | 1.0 | 66 | 1.0995 | 0.5537 | 0.5047 | 0.5097 | 0.5082 | [227, 183, 1047, 183] |
| 0.9472 | 2.0 | 132 | 1.0104 | 0.6488 | 0.6033 | 0.6293 | 0.6126 | [266, 144, 1086, 144] |
| 0.7192 | 3.0 | 198 | 1.0839 | 0.6707 | 0.6348 | 0.6642 | 0.6608 | [275, 135, 1095, 135] |
| 0.5434 | 4.0 | 264 | 1.1165 | 0.6780 | 0.6492 | 0.6490 | 0.6638 | [278, 132, 1098, 132] |
| 0.4376 | 5.0 | 330 | 1.2321 | 0.6805 | 0.6515 | 0.6493 | 0.6596 | [279, 131, 1099, 131] |
| 0.3084 | 6.0 | 396 | 1.4064 | 0.6585 | 0.6260 | 0.6300 | 0.6279 | [270, 140, 1090, 140] |
| 0.2117 | 7.0 | 462 | 1.6170 | 0.6512 | 0.6201 | 0.6284 | 0.6217 | [267, 143, 1087, 143] |
| 0.2113 | 8.0 | 528 | 1.8189 | 0.6610 | 0.6291 | 0.6321 | 0.6428 | [271, 139, 1091, 139] |
| 0.1301 | 9.0 | 594 | 1.8293 | 0.6659 | 0.6348 | 0.6390 | 0.6544 | [273, 137, 1093, 137] |
| 0.0893 | 10.0 | 660 | 1.9560 | 0.6610 | 0.6269 | 0.6302 | 0.6268 | [271, 139, 1091, 139] |
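Validation loss bottoms out at epoch 2 and climbs steadily afterward while training loss keeps falling, a typical overfitting pattern, and accuracy peaks at epoch 5 rather than at the final (epoch 10) checkpoint reported at the top of this card. A small sketch of picking the strongest epoch from the log above:

```python
# (epoch, validation_loss, accuracy) rows copied from the results table above
log = [
    (1.0, 1.0995, 0.5537),
    (2.0, 1.0104, 0.6488),
    (3.0, 1.0839, 0.6707),
    (4.0, 1.1165, 0.6780),
    (5.0, 1.2321, 0.6805),
    (6.0, 1.4064, 0.6585),
    (7.0, 1.6170, 0.6512),
    (8.0, 1.8189, 0.6610),
    (9.0, 1.8293, 0.6659),
    (10.0, 1.9560, 0.6610),
]

# Best checkpoint by validation accuracy (epoch 5, 0.6805)
best_epoch, _, best_acc = max(log, key=lambda row: row[2])

# Best checkpoint by validation loss (epoch 2, 1.0104)
lowest_loss_epoch, lowest_loss, _ = min(log, key=lambda row: row[1])
```

Which criterion to prefer depends on the downstream use; either way, an earlier checkpoint (or early stopping) would likely serve better than the epoch-10 weights.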

Framework versions

  • Transformers 4.47.0
  • Pytorch 2.5.1+cu121
  • Datasets 3.2.0
  • Tokenizers 0.21.0
Model size: 0.1B params (F32, Safetensors)