---
license: mit
base_model: roberta-base
tags:
- generated_from_trainer
metrics:
- f1
model-index:
- name: model
  results: []
---

# model

This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.9688
- F1 (per class): [0.55895197, 0.65655471, 0.64079208, 0.61947973, 0.4622871, 0.0]

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0002
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 3

### Training results

| Training Loss | Epoch  | Step | Validation Loss | F1 (per class)                                                    |
|:-------------:|:------:|:----:|:---------------:|:------------------------------------------------------------------|
| 1.1276        | 0.2889 | 500  | 1.0034          | [0.0, 0.64762876, 0.62495426, 0.58732057, 0.23529412, 0.0]        |
| 0.9959        | 0.5777 | 1000 | 0.9698          | [0.0, 0.65695931, 0.58746415, 0.59628074, 0.0, 0.0]               |
| 1.0332        | 0.8666 | 1500 | 0.9243          | [0.56649396, 0.62368113, 0.65886525, 0.57998639, 0.42633229, 0.0] |
| 0.9629        | 1.1554 | 2000 | 0.9520          | [0.4842615, 0.65972551, 0.63432836, 0.5464191, 0.01117318, 0.0]   |
| 1.1092        | 1.4443 | 2500 | 0.9043          | [0.58064516, 0.61299597, 0.63804173, 0.63392347, 0.09090909, 0.0] |
| 0.7352        | 1.7331 | 3000 | 0.9064          | [0.5703125, 0.65753425, 0.65755449, 0.57650273, 0.44675325, 0.0]  |
| 0.8081        | 2.0220 | 3500 | 0.9160          | [0.57377049, 0.65274725, 0.64854518, 0.5620389, 0.44813278, 0.0]  |
| 0.6783        | 2.3108 | 4000 | 0.9415          | [0.59016393, 0.66339334, 0.65859375, 0.61950287, 0.496614, 0.0]   |
| 0.8309        | 2.5997 | 4500 | 0.9317          | [0.59960552, 0.6642144, 0.63559663, 0.63245823, 0.42767296, 0.0]  |
| 0.5363        | 2.8885 | 5000 | 0.9688          | [0.55895197, 0.65655471, 0.64079208, 0.61947973, 0.4622871, 0.0]  |

### Framework versions

- Transformers 4.40.1
- Pytorch 2.3.0+cu121
- Datasets 2.19.0
- Tokenizers 0.19.1
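
### Reproducing the training configuration

The hyperparameters listed above map one-to-one onto `transformers.TrainingArguments`. The following is a minimal sketch, not the exact training script: the 500-step evaluation cadence is inferred from the Step column of the results table, `output_dir` is a placeholder, and the dataset and metric wiring are not documented in this card.

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters reported in this card (Transformers 4.40.1).
training_args = TrainingArguments(
    output_dir="model",           # placeholder output directory
    learning_rate=2e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,               # Adam betas=(0.9, 0.999) from the card
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=3,
    evaluation_strategy="steps",  # assumed: evaluate every 500 steps,
    eval_steps=500,               # inferred from the results table
)
```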
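
## Example usage

The card omits usage details, so here is a minimal inference sketch, assuming the checkpoint was fine-tuned as a six-way sequence classifier (the per-class F1 array reported above has six entries). The repo id `model` is the placeholder name from this card; substitute the actual checkpoint path or Hub id.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# "model" is the placeholder repo id from this card; replace it with the
# actual checkpoint directory or Hub id.
checkpoint = "model"

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint)

inputs = tokenizer("Example input sentence.", return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

# The card reports six per-class F1 scores, so we assume six labels.
predicted_class_id = int(logits.argmax(dim=-1))
print(predicted_class_id)
```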