---
library_name: transformers
license: mit
base_model: roberta-base
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
- precision
- recall
model-index:
- name: checkpoints
  results: []
---

# Qna classifier

This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on an unspecified dataset. It achieves the following results on the evaluation set (a minimal usage sketch follows the metrics):

- Loss: 0.4574
- Accuracy: 0.8526
- F1: 0.8402
- Precision: 0.8566
- Recall: 0.8526
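
A minimal inference sketch, assuming the checkpoint is published under a repo id such as `Deadbody42/Qna_classifier` (hypothetical; substitute the actual model path or a local checkpoint directory) and that it exposes a standard sequence-classification head:

```python
from transformers import pipeline

# Hypothetical repo id -- replace with the real model path or a local checkpoint dir.
classifier = pipeline("text-classification", model="Deadbody42/Qna_classifier")

print(classifier("How do I reset my password?"))
# e.g. [{'label': 'LABEL_3', 'score': 0.97}]  (labels are generic unless id2label was set)
```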

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a rough `Trainer` sketch follows the list):

- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 3
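
A rough `Trainer` sketch matching the settings above. The dataset, label count, and `compute_metrics` function are placeholders, since the card does not document the training data; the 500-step evaluation interval is inferred from the results table below:

```python
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          Trainer, TrainingArguments)

NUM_LABELS = 2  # placeholder -- the actual label count is not stated on the card

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
model = AutoModelForSequenceClassification.from_pretrained(
    "roberta-base", num_labels=NUM_LABELS)

args = TrainingArguments(
    output_dir="checkpoints",
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=3,
    lr_scheduler_type="linear",  # AdamW (torch), betas=(0.9, 0.999), eps=1e-8 is the default optimizer
    eval_strategy="steps",
    eval_steps=500,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,      # placeholder: tokenized training split
    eval_dataset=eval_dataset,        # placeholder: tokenized evaluation split
    compute_metrics=compute_metrics,  # see the sketch after the results table
)
trainer.train()
```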

### Training results

| Training Loss | Epoch  | Step  | Validation Loss | Accuracy | F1     | Precision | Recall |
|:-------------:|:------:|:-----:|:---------------:|:--------:|:------:|:---------:|:------:|
| No log        | 0.1485 | 500   | 3.3156          | 0.2530   | 0.1877 | 0.1772    | 0.2530 |
| No log        | 0.2971 | 1000  | 2.5424          | 0.3081   | 0.2274 | 0.2078    | 0.3081 |
| No log        | 0.4456 | 1500  | 2.1390          | 0.3414   | 0.2649 | 0.2471    | 0.3414 |
| No log        | 0.5942 | 2000  | 1.8717          | 0.4001   | 0.3334 | 0.3560    | 0.4001 |
| No log        | 0.7427 | 2500  | 1.6311          | 0.4522   | 0.3897 | 0.3845    | 0.4522 |
| No log        | 0.8913 | 3000  | 1.4911          | 0.5069   | 0.4460 | 0.4512    | 0.5069 |
| 2.3757        | 1.0398 | 3500  | 1.3221          | 0.5510   | 0.4840 | 0.4818    | 0.5510 |
| 2.3757        | 1.1884 | 4000  | 1.1827          | 0.6173   | 0.5630 | 0.5779    | 0.6173 |
| 2.3757        | 1.3369 | 4500  | 1.0504          | 0.6518   | 0.6041 | 0.6272    | 0.6518 |
| 2.3757        | 1.4854 | 5000  | 0.9556          | 0.6752   | 0.6272 | 0.6453    | 0.6752 |
| 2.3757        | 1.6340 | 5500  | 0.8580          | 0.7113   | 0.6642 | 0.6795    | 0.7113 |
| 2.3757        | 1.7825 | 6000  | 0.7818          | 0.7364   | 0.6984 | 0.7270    | 0.7364 |
| 2.3757        | 1.9311 | 6500  | 0.7009          | 0.7669   | 0.7324 | 0.7616    | 0.7669 |
| 1.0183        | 2.0796 | 7000  | 0.6271          | 0.7957   | 0.7685 | 0.7883    | 0.7957 |
| 1.0183        | 2.2282 | 7500  | 0.5864          | 0.8051   | 0.7812 | 0.8061    | 0.8051 |
| 1.0183        | 2.3767 | 8000  | 0.5439          | 0.8237   | 0.8008 | 0.8104    | 0.8237 |
| 1.0183        | 2.5253 | 8500  | 0.5116          | 0.8297   | 0.8118 | 0.8268    | 0.8297 |
| 1.0183        | 2.6738 | 9000  | 0.4916          | 0.8394   | 0.8237 | 0.8450    | 0.8394 |
| 1.0183        | 2.8223 | 9500  | 0.4696          | 0.8486   | 0.8346 | 0.8514    | 0.8486 |
| 1.0183        | 2.9709 | 10000 | 0.4574          | 0.8526   | 0.8402 | 0.8566    | 0.8526 |
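
The card does not state how F1, precision, and recall are averaged; because Recall equals Accuracy in every row, a weighted average is the likely setting (weighted recall is mathematically identical to accuracy). A `compute_metrics` sketch under that assumption:

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    # eval_pred is the (logits, labels) pair passed by the Trainer.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="weighted", zero_division=0)
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1,
        "precision": precision,
        "recall": recall,
    }
```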

### Framework versions

- Transformers 4.53.1
- PyTorch 2.6.0+cu124
- Datasets 2.14.4
- Tokenizers 0.21.2