---
base_model: Anwaarma/Merged-Server-praj
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
model-index:
- name: S02
  results: []
---
# S02

This model is a fine-tuned version of [Anwaarma/Merged-Server-praj](https://huggingface.co/Anwaarma/Merged-Server-praj) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.5643
- Accuracy: 0.82
- F1: 0.9011
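
Since the card reports accuracy and a single F1 value, the checkpoint presumably exposes a binary sequence-classification head. A minimal inference sketch under that assumption; the repo id below is assumed, not confirmed by the card:

```python
from transformers import pipeline

# Minimal usage sketch, assuming a sequence-classification head.
# "Anwaarma/S02" is an assumed Hub repo id; point this at the actual checkpoint.
classifier = pipeline("text-classification", model="Anwaarma/S02")

print(classifier("Example input to score."))
# e.g. [{'label': 'LABEL_1', 'score': 0.87}]; label names depend on the training setup
```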
## Model description

More information needed
## Intended uses & limitations

More information needed
## Training and evaluation data

More information needed
## Training procedure
### Training hyperparameters

The following hyperparameters were used during training (a reproduction sketch follows the list):
- learning_rate: 4e-05
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
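
These settings map directly onto `TrainingArguments`. A hedged reconstruction follows; the output directory is a placeholder, and the evaluation and logging cadence (every 50 and 500 steps) is inferred from the results table rather than reported:

```python
from transformers import TrainingArguments

# Sketch of the reported configuration (Transformers 4.35.2 API).
# output_dir is a placeholder; eval/logging steps are inferred from the table.
training_args = TrainingArguments(
    output_dir="S02",
    learning_rate=4e-05,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=20,
    evaluation_strategy="steps",
    eval_steps=50,
    logging_steps=500,
)
```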
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:------:|
| No log | 0.0 | 50 | 0.5790 | 0.6 | 0.5992 |
| No log | 0.01 | 100 | 0.5691 | 0.65 | 0.6505 |
| No log | 0.01 | 150 | 0.5678 | 0.65 | 0.6505 |
| No log | 0.01 | 200 | 0.5621 | 0.68 | 0.6773 |
| No log | 0.02 | 250 | 0.5666 | 0.63 | 0.6303 |
| No log | 0.02 | 300 | 0.5721 | 0.65 | 0.6463 |
| No log | 0.02 | 350 | 0.5533 | 0.63 | 0.6260 |
| No log | 0.03 | 400 | 0.5614 | 0.62 | 0.6105 |
| No log | 0.03 | 450 | 0.5756 | 0.62 | 0.6181 |
| 0.5985 | 0.03 | 500 | 0.5666 | 0.6 | 0.5947 |
| 0.5985 | 0.04 | 550 | 0.5613 | 0.64 | 0.6406 |
| 0.5985 | 0.04 | 600 | 0.5541 | 0.63 | 0.6306 |
| 0.5985 | 0.04 | 650 | 0.5571 | 0.62 | 0.6192 |
| 0.5985 | 0.05 | 700 | 0.5536 | 0.62 | 0.6192 |
| 0.5985 | 0.05 | 750 | 0.5614 | 0.63 | 0.6306 |
| 0.5985 | 0.05 | 800 | 0.5667 | 0.63 | 0.6297 |
| 0.5985 | 0.06 | 850 | 0.5466 | 0.66 | 0.6600 |
| 0.5985 | 0.06 | 900 | 0.5532 | 0.66 | 0.6593 |
| 0.5985 | 0.06 | 950 | 0.5482 | 0.67 | 0.6630 |
| 0.5855 | 0.07 | 1000 | 0.5837 | 0.63 | 0.6220 |
| 0.5855 | 0.07 | 1050 | 0.5368 | 0.67 | 0.6705 |
| 0.5855 | 0.07 | 1100 | 0.5793 | 0.62 | 0.6167 |
| 0.5855 | 0.08 | 1150 | 0.5694 | 0.63 | 0.6276 |
| 0.5855 | 0.08 | 1200 | 0.5520 | 0.63 | 0.6306 |
| 0.5855 | 0.09 | 1250 | 0.5572 | 0.66 | 0.6593 |
| 0.5855 | 0.09 | 1300 | 0.5706 | 0.62 | 0.6150 |
| 0.5855 | 0.09 | 1350 | 0.5694 | 0.66 | 0.6593 |
| 0.5855 | 0.1 | 1400 | 0.5559 | 0.65 | 0.6497 |
| 0.5855 | 0.1 | 1450 | 0.5515 | 0.67 | 0.6705 |
| 0.5777 | 0.1 | 1500 | 0.5447 | 0.64 | 0.6393 |
| 0.5777 | 0.11 | 1550 | 0.5453 | 0.65 | 0.6502 |
| 0.5777 | 0.11 | 1600 | 0.5575 | 0.64 | 0.6400 |
| 0.5777 | 0.11 | 1650 | 0.5498 | 0.66 | 0.6584 |
| 0.5777 | 0.12 | 1700 | 0.5620 | 0.66 | 0.6604 |
| 0.5777 | 0.12 | 1750 | 0.5734 | 0.67 | 0.6702 |
| 0.5777 | 0.12 | 1800 | 0.5561 | 0.66 | 0.6593 |
| 0.5777 | 0.13 | 1850 | 0.5376 | 0.67 | 0.6649 |
| 0.5777 | 0.13 | 1900 | 0.5652 | 0.65 | 0.6505 |
| 0.5777 | 0.13 | 1950 | 0.5414 | 0.67 | 0.6689 |
| 0.575 | 0.14 | 2000 | 0.5340 | 0.67 | 0.6665 |
| 0.575 | 0.14 | 2050 | 0.5393 | 0.68 | 0.6794 |
| 0.575 | 0.14 | 2100 | 0.5253 | 0.7 | 0.6994 |
| 0.575 | 0.15 | 2150 | 0.5334 | 0.69 | 0.6834 |
| 0.575 | 0.15 | 2200 | 0.5395 | 0.68 | 0.6773 |
| 0.575 | 0.15 | 2250 | 0.5426 | 0.65 | 0.6446 |
| 0.575 | 0.16 | 2300 | 0.5523 | 0.64 | 0.6370 |
| 0.575 | 0.16 | 2350 | 0.5378 | 0.68 | 0.6804 |
| 0.575 | 0.16 | 2400 | 0.5375 | 0.67 | 0.6649 |
| 0.575 | 0.17 | 2450 | 0.5378 | 0.68 | 0.6742 |
| 0.556 | 0.17 | 2500 | 0.5491 | 0.69 | 0.6867 |
| 0.556 | 0.17 | 2550 | 0.5347 | 0.66 | 0.6517 |
| 0.556 | 0.18 | 2600 | 0.5325 | 0.69 | 0.6852 |
| 0.556 | 0.18 | 2650 | 0.5490 | 0.68 | 0.6794 |
| 0.556 | 0.18 | 2700 | 0.5313 | 0.7 | 0.7005 |
| 0.556 | 0.19 | 2750 | 0.5451 | 0.65 | 0.6314 |
| 0.556 | 0.19 | 2800 | 0.5506 | 0.64 | 0.6312 |
| 0.556 | 0.19 | 2850 | 0.5539 | 0.65 | 0.6497 |
| 0.556 | 0.2 | 2900 | 0.5601 | 0.66 | 0.6604 |
| 0.556 | 0.2 | 2950 | 0.5530 | 0.67 | 0.6705 |
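
The Accuracy and F1 columns come from evaluations every 50 steps; "No log" simply means the training loss had not yet been logged (it is reported every 500 steps). A sketch of a `compute_metrics` callback that would produce these two columns, assuming binary classification (a single F1 value is reported) and the `evaluate` library, which is not listed under framework versions and is therefore an assumption:

```python
import numpy as np
import evaluate

# Sketch of a metrics callback consistent with the table's columns.
# Assumes binary classification: `f1` defaults to the positive class.
accuracy_metric = evaluate.load("accuracy")
f1_metric = evaluate.load("f1")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy_metric.compute(predictions=predictions, references=labels)["accuracy"],
        "f1": f1_metric.compute(predictions=predictions, references=labels)["f1"],
    }
```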
### Framework versions

- Transformers 4.35.2
- PyTorch 2.1.0+cu121
- Datasets 2.16.0
- Tokenizers 0.15.0