DayCardoso committed
Commit e7d8415 · verified · 1 Parent(s): f51f7e1

Model save

Files changed (2):
  1. README.md +42 -32
  2. model.safetensors +1 -1
README.md CHANGED
@@ -18,13 +18,13 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [roberta-base](https://huggingface.co/roberta-base) on the None dataset.
 It achieves the following results on the evaluation set:
- - Loss: 0.4702
- - Accuracy: 0.5675
- - F1 Macro: 0.5441
- - F1 Micro: 0.5675
- - Precision Macro: 0.5454
- - Recall Macro: 0.5461
- - Roc Auc: 0.7877
+ - Loss: 0.4882
+ - Accuracy: 0.5566
+ - F1 Macro: 0.5333
+ - F1 Micro: 0.5566
+ - Precision Macro: 0.5431
+ - Recall Macro: 0.5389
+ - Roc Auc: 0.7826
 
 ## Model description
 
@@ -58,31 +58,41 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 Macro | F1 Micro | Precision Macro | Recall Macro | Roc Auc |
 |:-------------:|:------:|:----:|:---------------:|:--------:|:--------:|:--------:|:---------------:|:------------:|:-------:|
- | No log | 0.1304 | 200 | 0.6700 | 0.1505 | 0.0654 | 0.1505 | 0.0376 | 0.25 | 0.5065 |
- | No log | 0.2609 | 400 | 0.5908 | 0.3307 | 0.2185 | 0.3307 | 0.3478 | 0.2628 | 0.5265 |
- | 0.6418 | 0.3913 | 600 | 0.5682 | 0.3965 | 0.2420 | 0.3965 | 0.3131 | 0.2958 | 0.5826 |
- | 0.6418 | 0.5217 | 800 | 0.5511 | 0.4446 | 0.2752 | 0.4446 | 0.3725 | 0.3348 | 0.6511 |
- | 0.56 | 0.6522 | 1000 | 0.5183 | 0.4796 | 0.3397 | 0.4796 | 0.5998 | 0.3752 | 0.7004 |
- | 0.56 | 0.7826 | 1200 | 0.5027 | 0.4951 | 0.3960 | 0.4951 | 0.4765 | 0.4243 | 0.7210 |
- | 0.56 | 0.9130 | 1400 | 0.5012 | 0.4936 | 0.4203 | 0.4936 | 0.4761 | 0.4461 | 0.7341 |
- | 0.5136 | 1.0430 | 1600 | 0.4898 | 0.5228 | 0.4273 | 0.5228 | 0.5111 | 0.4450 | 0.7377 |
- | 0.5136 | 1.1735 | 1800 | 0.4818 | 0.5304 | 0.4723 | 0.5304 | 0.4987 | 0.4752 | 0.7504 |
- | 0.4842 | 1.3039 | 2000 | 0.4809 | 0.5330 | 0.4931 | 0.5330 | 0.5035 | 0.4966 | 0.7580 |
- | 0.4842 | 1.4343 | 2200 | 0.4754 | 0.5412 | 0.5091 | 0.5412 | 0.5109 | 0.5110 | 0.7651 |
- | 0.4842 | 1.5648 | 2400 | 0.4689 | 0.5523 | 0.5072 | 0.5523 | 0.5307 | 0.5061 | 0.7695 |
- | 0.4695 | 1.6952 | 2600 | 0.4785 | 0.5380 | 0.4809 | 0.5380 | 0.5346 | 0.4907 | 0.7663 |
- | 0.4695 | 1.8256 | 2800 | 0.4649 | 0.5545 | 0.5144 | 0.5545 | 0.5301 | 0.5081 | 0.7745 |
- | 0.4655 | 1.9561 | 3000 | 0.4652 | 0.5495 | 0.5210 | 0.5495 | 0.5276 | 0.5245 | 0.7757 |
- | 0.4655 | 2.0861 | 3200 | 0.4610 | 0.5654 | 0.5286 | 0.5654 | 0.5444 | 0.5224 | 0.7813 |
- | 0.4655 | 2.2165 | 3400 | 0.4654 | 0.5498 | 0.5186 | 0.5498 | 0.5400 | 0.5255 | 0.7809 |
- | 0.4414 | 2.3469 | 3600 | 0.4660 | 0.5530 | 0.5192 | 0.5530 | 0.5427 | 0.5267 | 0.7817 |
- | 0.4414 | 2.4774 | 3800 | 0.4593 | 0.5644 | 0.5393 | 0.5644 | 0.5431 | 0.5388 | 0.7855 |
- | 0.433 | 2.6078 | 4000 | 0.4605 | 0.5583 | 0.5309 | 0.5583 | 0.5405 | 0.5347 | 0.7849 |
- | 0.433 | 2.7382 | 4200 | 0.4673 | 0.5486 | 0.5245 | 0.5486 | 0.5430 | 0.5359 | 0.7851 |
- | 0.433 | 2.8687 | 4400 | 0.4532 | 0.5686 | 0.5358 | 0.5686 | 0.5499 | 0.5319 | 0.7869 |
- | 0.435 | 2.9991 | 4600 | 0.4587 | 0.5696 | 0.5394 | 0.5696 | 0.5470 | 0.5386 | 0.7865 |
- | 0.435 | 3.1291 | 4800 | 0.4601 | 0.5660 | 0.5352 | 0.5660 | 0.5485 | 0.5373 | 0.7869 |
- | 0.4056 | 3.2596 | 5000 | 0.4702 | 0.5675 | 0.5441 | 0.5675 | 0.5454 | 0.5461 | 0.7877 |
+ | No log | 0.1304 | 200 | 0.6818 | 0.2672 | 0.1648 | 0.2672 | 0.1218 | 0.2567 | 0.4838 |
+ | No log | 0.2609 | 400 | 0.6261 | 0.3230 | 0.1221 | 0.3230 | 0.0808 | 0.25 | 0.5071 |
+ | 0.6589 | 0.3913 | 600 | 0.5625 | 0.3902 | 0.2186 | 0.3902 | 0.2036 | 0.2855 | 0.5948 |
+ | 0.6589 | 0.5217 | 800 | 0.5461 | 0.4307 | 0.2771 | 0.4307 | 0.3373 | 0.3294 | 0.6677 |
+ | 0.5528 | 0.6522 | 1000 | 0.5142 | 0.4806 | 0.3522 | 0.4806 | 0.4562 | 0.3832 | 0.7032 |
+ | 0.5528 | 0.7826 | 1200 | 0.5025 | 0.4966 | 0.3990 | 0.4966 | 0.4866 | 0.4231 | 0.7188 |
+ | 0.5528 | 0.9130 | 1400 | 0.5006 | 0.4939 | 0.4140 | 0.4939 | 0.4853 | 0.4429 | 0.7312 |
+ | 0.5111 | 1.0430 | 1600 | 0.4903 | 0.5165 | 0.4163 | 0.5165 | 0.5065 | 0.4369 | 0.7386 |
+ | 0.5111 | 1.1735 | 1800 | 0.4821 | 0.5267 | 0.4650 | 0.5267 | 0.5003 | 0.4699 | 0.7494 |
+ | 0.4847 | 1.3039 | 2000 | 0.4803 | 0.5273 | 0.4900 | 0.5273 | 0.5013 | 0.4970 | 0.7582 |
+ | 0.4847 | 1.4343 | 2200 | 0.4742 | 0.5438 | 0.5020 | 0.5438 | 0.5153 | 0.5015 | 0.7637 |
+ | 0.4847 | 1.5648 | 2400 | 0.4672 | 0.5476 | 0.4998 | 0.5476 | 0.5270 | 0.4976 | 0.7692 |
+ | 0.47 | 1.6952 | 2600 | 0.4743 | 0.5396 | 0.4820 | 0.5396 | 0.5346 | 0.4885 | 0.7650 |
+ | 0.47 | 1.8256 | 2800 | 0.4675 | 0.5512 | 0.5104 | 0.5512 | 0.5282 | 0.5029 | 0.7734 |
+ | 0.4651 | 1.9561 | 3000 | 0.4671 | 0.5436 | 0.5151 | 0.5436 | 0.5211 | 0.5190 | 0.7747 |
+ | 0.4651 | 2.0861 | 3200 | 0.4631 | 0.5643 | 0.5269 | 0.5643 | 0.5431 | 0.5209 | 0.7804 |
+ | 0.4651 | 2.2165 | 3400 | 0.4681 | 0.5445 | 0.5109 | 0.5445 | 0.5359 | 0.5207 | 0.7798 |
+ | 0.4415 | 2.3469 | 3600 | 0.4695 | 0.5459 | 0.5114 | 0.5459 | 0.5400 | 0.5218 | 0.7801 |
+ | 0.4415 | 2.4774 | 3800 | 0.4607 | 0.5639 | 0.5358 | 0.5639 | 0.5457 | 0.5335 | 0.7843 |
+ | 0.4335 | 2.6078 | 4000 | 0.4649 | 0.5525 | 0.5283 | 0.5525 | 0.5354 | 0.5349 | 0.7830 |
+ | 0.4335 | 2.7382 | 4200 | 0.4676 | 0.5457 | 0.5225 | 0.5457 | 0.5370 | 0.5348 | 0.7854 |
+ | 0.4335 | 2.8687 | 4400 | 0.4581 | 0.5606 | 0.5272 | 0.5606 | 0.5482 | 0.5250 | 0.7854 |
+ | 0.4347 | 2.9991 | 4600 | 0.4612 | 0.5650 | 0.5336 | 0.5650 | 0.5425 | 0.5341 | 0.7853 |
+ | 0.4347 | 3.1291 | 4800 | 0.4654 | 0.5580 | 0.5302 | 0.5580 | 0.5410 | 0.5358 | 0.7856 |
+ | 0.4048 | 3.2596 | 5000 | 0.4659 | 0.5706 | 0.5452 | 0.5706 | 0.5478 | 0.5463 | 0.7873 |
+ | 0.4048 | 3.3900 | 5200 | 0.4627 | 0.5692 | 0.5346 | 0.5692 | 0.5538 | 0.5311 | 0.7859 |
+ | 0.4048 | 3.5204 | 5400 | 0.4733 | 0.5557 | 0.5371 | 0.5557 | 0.5354 | 0.5451 | 0.7858 |
+ | 0.3995 | 3.6509 | 5600 | 0.4755 | 0.5538 | 0.5267 | 0.5538 | 0.5426 | 0.5308 | 0.7857 |
+ | 0.3995 | 3.7813 | 5800 | 0.4759 | 0.5467 | 0.5238 | 0.5467 | 0.5383 | 0.5342 | 0.7860 |
+ | 0.4016 | 3.9117 | 6000 | 0.4698 | 0.5566 | 0.5302 | 0.5566 | 0.5392 | 0.5368 | 0.7859 |
+ | 0.4016 | 4.0417 | 6200 | 0.4786 | 0.5646 | 0.5389 | 0.5646 | 0.5463 | 0.5369 | 0.7830 |
+ | 0.4016 | 4.1722 | 6400 | 0.4840 | 0.5636 | 0.5342 | 0.5636 | 0.5409 | 0.5319 | 0.7814 |
+ | 0.3723 | 4.3026 | 6600 | 0.4760 | 0.5653 | 0.5431 | 0.5653 | 0.5435 | 0.5457 | 0.7855 |
+ | 0.3723 | 4.4330 | 6800 | 0.4821 | 0.5632 | 0.5340 | 0.5632 | 0.5460 | 0.5348 | 0.7829 |
+ | 0.3682 | 4.5635 | 7000 | 0.4882 | 0.5566 | 0.5333 | 0.5566 | 0.5431 | 0.5389 | 0.7826 |
 
 
 ### Framework versions
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:59b2f57ceed981bf513514d3e662857f2ff01ca465f8986c7127218f5c99b3a6
+ oid sha256:2b9372c2204f5922c82f68cdd45ca35da47ca605181325cd80195216d2821c9e
 size 498619448