# global_mbv4_hybrid_large
This model is a fine-tuned version of an unspecified base model on an unknown dataset. It achieves the following results on the evaluation set (a sketch of how such metrics can be computed appears after the list):
- Loss: 0.0857
- Precision: 0.9806
- Recall: 0.9794
- Accuracy: 0.9834
- F1: 0.9800
- ROC AUC: 0.9986
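The card does not state the task type, the averaging strategy, or the ROC AUC formulation behind these numbers. As a minimal sketch, assuming single-label multi-class classification with macro averaging and one-vs-rest ROC AUC over softmax probabilities (all assumptions, not confirmed by the card), they could be computed with scikit-learn like this:

```python
# A minimal sketch with scikit-learn. The task type, averaging strategy,
# and ROC AUC formulation are NOT stated in the card; single-label
# multi-class classification with macro averaging is assumed here.
import numpy as np
from scipy.special import softmax
from sklearn.metrics import (
    accuracy_score,
    f1_score,
    precision_score,
    recall_score,
    roc_auc_score,
)

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    probs = softmax(logits, axis=-1)    # class probabilities, used for ROC AUC
    preds = np.argmax(logits, axis=-1)  # hard predictions for the other metrics
    return {
        "precision": precision_score(labels, preds, average="macro"),
        "recall": recall_score(labels, preds, average="macro"),
        "accuracy": accuracy_score(labels, preds),
        "f1": f1_score(labels, preds, average="macro"),
        "roc_auc": roc_auc_score(labels, probs, multi_class="ovr", average="macro"),
    }
```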
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch mirroring them appears after the list):
- learning_rate: 0.0001
- train_batch_size: 24
- eval_batch_size: 24
- seed: 42
- optimizer: AdamW (torch fused) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 4
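As a minimal sketch, assuming the standard Hugging Face `Trainer` API was used (the card layout suggests it but does not confirm it), the listed hyperparameters map onto `TrainingArguments` as follows; `output_dir` is a placeholder, and anything not listed above is left at its library default:

```python
# Minimal sketch of TrainingArguments mirroring the hyperparameters above,
# assuming the standard Hugging Face Trainer; output_dir is a placeholder.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="global_mbv4_hybrid_large",  # placeholder path (assumption)
    learning_rate=1e-4,
    per_device_train_batch_size=24,
    per_device_eval_batch_size=24,
    seed=42,
    optim="adamw_torch_fused",              # fused torch AdamW
    adam_beta1=0.9,                         # betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    num_train_epochs=4,
)
```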
### Training results
| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | Accuracy | F1 | ROC AUC |
|---|---|---|---|---|---|---|---|---|
| 0.3639 | 0.2577 | 200 | 12.4134 | 0.8978 | 0.9088 | 0.9128 | 0.8996 | 0.9862 |
| 0.2433 | 0.5155 | 400 | 1.5055 | 0.7737 | 0.6830 | 0.6291 | 0.6382 | 0.9657 |
| 0.2974 | 0.7732 | 600 | 0.1301 | 0.9570 | 0.9435 | 0.9585 | 0.9491 | 0.9944 |
| 0.0867 | 1.0309 | 800 | 0.1688 | 0.9438 | 0.9530 | 0.9552 | 0.9472 | 0.9959 |
| 0.0392 | 1.2887 | 1000 | 0.2777 | 0.9549 | 0.9195 | 0.9448 | 0.9313 | 0.9970 |
| 0.2445 | 1.5464 | 1200 | 0.1142 | 0.9770 | 0.9689 | 0.9772 | 0.9726 | 0.9971 |
| 0.1727 | 1.8041 | 1400 | 0.2502 | 0.9683 | 0.9680 | 0.9737 | 0.9682 | 0.9964 |
| 0.1100 | 2.0619 | 1600 | 0.1452 | 0.9652 | 0.9707 | 0.9730 | 0.9676 | 0.9976 |
| 0.0381 | 2.3196 | 1800 | 0.1217 | 0.9779 | 0.9708 | 0.9788 | 0.9741 | 0.9974 |
| 0.0441 | 2.5773 | 2000 | 0.1124 | 0.9762 | 0.9764 | 0.9804 | 0.9763 | 0.9979 |
| 0.0792 | 2.8351 | 2200 | 0.1406 | 0.9821 | 0.9772 | 0.9832 | 0.9795 | 0.9983 |
| 0.0010 | 3.0928 | 2400 | 0.0857 | 0.9806 | 0.9794 | 0.9834 | 0.9800 | 0.9986 |
| 0.0003 | 3.3505 | 2600 | 0.1904 | 0.9821 | 0.9776 | 0.9832 | 0.9797 | 0.9980 |
| 0.0011 | 3.6082 | 2800 | 0.0997 | 0.9815 | 0.9790 | 0.9837 | 0.9803 | 0.9984 |
| 0.0053 | 3.8660 | 3000 | 0.1172 | 0.9797 | 0.9769 | 0.9821 | 0.9783 | 0.9980 |
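The evaluation results reported at the top of this card match the step-2400 row (epoch 3.09), which also has the lowest validation loss (0.0857) in the table.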
### Framework versions
- Transformers 5.3.0
- Pytorch 2.10.0+cu128
- Datasets 4.7.0
- Tokenizers 0.22.2
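To approximate the training environment, the versions above can be pinned; note that the `+cu128` local version tag on PyTorch denotes a CUDA 12.8 build from the PyTorch wheel index, so a plain pip pin is only an approximation:

```text
# requirements.txt sketch matching the versions above
# (torch 2.10.0+cu128 requires the CUDA 12.8 PyTorch wheel index)
transformers==5.3.0
torch==2.10.0
datasets==4.7.0
tokenizers==0.22.2
```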