ilaria-oneofftech committed
Commit e03bfb4 · 1 Parent(s): f6e64d2

update model card README.md

Files changed (1):
  1. README.md (+18 -25)

README.md CHANGED
@@ -2,8 +2,6 @@
 license: apache-2.0
 tags:
 - generated_from_trainer
-metrics:
-- accuracy
 model-index:
 - name: ikitracs_mitigation
   results: []
@@ -16,15 +14,8 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [sentence-transformers/all-mpnet-base-v2](https://huggingface.co/sentence-transformers/all-mpnet-base-v2) on the None dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.9192
-- Precision Micro: 0.1466
-- Precision Weighted: 0.2016
-- Precision Samples: 0.1024
-- Recall Micro: 0.8889
-- Recall Weighted: 0.8889
-- Recall Samples: 0.4268
-- F1-score: 0.1468
-- Accuracy: 0.4855
+- Loss: 0.7245
+- F1-score: 0.2177
 
 ## Model description
 
@@ -43,27 +34,29 @@ More information needed
 ### Training hyperparameters
 
 The following hyperparameters were used during training:
-- learning_rate: 1e-05
-- train_batch_size: 8
-- eval_batch_size: 8
+- learning_rate: 9.24e-05
+- train_batch_size: 3
+- eval_batch_size: 3
 - seed: 42
+- gradient_accumulation_steps: 2
+- total_train_batch_size: 6
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
-- lr_scheduler_warmup_steps: 200
+- lr_scheduler_warmup_steps: 300
 - num_epochs: 8
 
 ### Training results
 
-| Training Loss | Epoch | Step | Validation Loss | Precision Micro | Precision Weighted | Precision Samples | Recall Micro | Recall Weighted | Recall Samples | F1-score | Accuracy |
-|:-------------:|:-----:|:----:|:---------------:|:---------------:|:------------------:|:-----------------:|:------------:|:---------------:|:--------------:|:--------:|:--------:|
-| 1.3559 | 1.0 | 156 | 1.3010 | 0.0806 | 0.1484 | 0.0618 | 0.7819 | 0.7819 | 0.3826 | 0.1027 | 0.0 |
-| 1.3082 | 2.0 | 312 | 1.1945 | 0.1359 | 0.1917 | 0.0840 | 0.8436 | 0.8436 | 0.3995 | 0.1208 | 0.4791 |
-| 1.1954 | 3.0 | 468 | 1.0831 | 0.1322 | 0.1804 | 0.0862 | 0.9012 | 0.9012 | 0.4333 | 0.1276 | 0.4823 |
-| 1.0874 | 4.0 | 624 | 1.0097 | 0.1346 | 0.1860 | 0.0931 | 0.8930 | 0.8930 | 0.4317 | 0.1332 | 0.4855 |
-| 1.0107 | 5.0 | 780 | 0.9662 | 0.1278 | 0.1803 | 0.0857 | 0.9053 | 0.9053 | 0.4365 | 0.1273 | 0.4823 |
-| 0.9531 | 6.0 | 936 | 0.9412 | 0.1429 | 0.1980 | 0.0965 | 0.8889 | 0.8889 | 0.4268 | 0.1401 | 0.4823 |
-| 0.9142 | 7.0 | 1092 | 0.9250 | 0.1429 | 0.1987 | 0.0990 | 0.8889 | 0.8889 | 0.4268 | 0.1425 | 0.4887 |
-| 0.9018 | 8.0 | 1248 | 0.9192 | 0.1466 | 0.2016 | 0.1024 | 0.8889 | 0.8889 | 0.4268 | 0.1468 | 0.4855 |
+| Training Loss | Epoch | Step | Validation Loss | F1-score |
+|:-------------:|:-----:|:----:|:---------------:|:--------:|
+| 1.2983 | 1.0 | 207 | 1.0239 | 0.1101 |
+| 1.0475 | 2.0 | 414 | 0.9027 | 0.1449 |
+| 0.9095 | 3.0 | 621 | 0.8214 | 0.1646 |
+| 0.7996 | 4.0 | 828 | 0.8772 | 0.1629 |
+| 0.6585 | 5.0 | 1035 | 0.7504 | 0.2025 |
+| 0.5473 | 6.0 | 1242 | 0.8324 | 0.2144 |
+| 0.4423 | 7.0 | 1449 | 0.6799 | 0.2176 |
+| 0.3741 | 8.0 | 1656 | 0.7245 | 0.2177 |
 
 
 ### Framework versions
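The updated hyperparameters fit together arithmetically: the effective batch size of 6 comes from a per-device batch of 3 with 2 gradient-accumulation steps, and the training-results table implies 207 optimizer steps per epoch, so 1656 steps over 8 epochs. A minimal plain-Python sketch of the implied `linear` schedule with 300 warmup steps (the linear decay to zero after warmup is an assumption about the scheduler's shape, not stated in the card):

```python
def linear_lr(step, base_lr=9.24e-5, warmup_steps=300, total_steps=1656):
    """Linear warmup to base_lr, then linear decay toward 0.

    Matches the card's lr_scheduler_type: linear with
    lr_scheduler_warmup_steps: 300; total_steps = 8 epochs x 207
    optimizer steps (from the training-results table). The decay-to-zero
    tail is an assumed shape, not stated in the card.
    """
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    remaining = max(0, total_steps - step)
    return base_lr * remaining / (total_steps - warmup_steps)


# Effective batch size: per-device batch 3 x gradient accumulation 2.
train_batch_size, grad_accum = 3, 2
total_train_batch_size = train_batch_size * grad_accum  # 6, as listed above

print(total_train_batch_size)    # 6
print(linear_lr(0))              # 0.0 (start of warmup)
print(linear_lr(300))            # 9.24e-05 (peak, end of warmup)
print(linear_lr(1656))           # 0.0 (end of training)
```

Under these assumptions the learning rate never sits at 9.24e-05 for long: it peaks at step 300 and then shrinks every optimizer step for the remaining ~1350 steps.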