craa committed
Commit 9c529dc · verified · 1 Parent(s): 7c85f8a

Model save

Files changed (3):
  1. README.md +1 -112
  2. model.safetensors +1 -1
  3. training_args.bin +1 -1
README.md CHANGED
@@ -2,8 +2,6 @@
  library_name: transformers
  tags:
  - generated_from_trainer
- metrics:
- - accuracy
  model-index:
  - name: exceptions_exp2_swap_require_to_carry_3591
    results: []
@@ -12,13 +10,10 @@ model-index:
  <!-- This model card has been generated automatically according to the information the Trainer had access to. You
  should probably proofread and complete it, then remove this comment. -->

- [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/craaaa/exceptions_exp2/runs/b1x3irbx)
+ [<img src="https://raw.githubusercontent.com/wandb/assets/main/wandb-github-badge-28.svg" alt="Visualize in Weights & Biases" width="200" height="32"/>](https://wandb.ai/craaaa/exceptions_exp2/runs/5p1kloyr)
  # exceptions_exp2_swap_require_to_carry_3591

  This model is a fine-tuned version of [](https://huggingface.co/) on an unknown dataset.
- It achieves the following results on the evaluation set:
- - Loss: 3.5533
- - Accuracy: 0.3744

  ## Model description

@@ -49,112 +44,6 @@ The following hyperparameters were used during training:
  - num_epochs: 50.0
  - mixed_precision_training: Native AMP

- ### Training results
-
- | Training Loss | Epoch | Step | Accuracy | Validation Loss |
- |:-------------:|:-------:|:------:|:--------:|:---------------:|
- | 4.8353 | 0.2911 | 1000 | 0.2531 | 4.7629 |
- | 4.3374 | 0.5822 | 2000 | 0.2996 | 4.2795 |
- | 4.1489 | 0.8733 | 3000 | 0.3155 | 4.0955 |
- | 3.996 | 1.1642 | 4000 | 0.3252 | 3.9965 |
- | 3.9347 | 1.4553 | 5000 | 0.3318 | 3.9166 |
- | 3.8665 | 1.7464 | 6000 | 0.3372 | 3.8597 |
- | 3.7343 | 2.0373 | 7000 | 0.3416 | 3.8142 |
- | 3.7434 | 2.3284 | 8000 | 0.3447 | 3.7804 |
- | 3.7318 | 2.6195 | 9000 | 0.3477 | 3.7533 |
- | 3.7166 | 2.9106 | 10000 | 0.3496 | 3.7269 |
- | 3.6328 | 3.2014 | 11000 | 0.3518 | 3.7136 |
- | 3.6293 | 3.4925 | 12000 | 0.3536 | 3.6956 |
- | 3.6383 | 3.7837 | 13000 | 0.3553 | 3.6771 |
- | 3.5359 | 4.0745 | 14000 | 0.3568 | 3.6693 |
- | 3.5563 | 4.3656 | 15000 | 0.3578 | 3.6573 |
- | 3.569 | 4.6567 | 16000 | 0.3587 | 3.6432 |
- | 3.5858 | 4.9478 | 17000 | 0.3601 | 3.6322 |
- | 3.4991 | 5.2387 | 18000 | 0.3605 | 3.6344 |
- | 3.5221 | 5.5298 | 19000 | 0.3616 | 3.6235 |
- | 3.5329 | 5.8209 | 20000 | 0.3624 | 3.6119 |
- | 3.4393 | 6.1118 | 21000 | 0.3629 | 3.6139 |
- | 3.4739 | 6.4029 | 22000 | 0.3634 | 3.6083 |
- | 3.4852 | 6.6940 | 23000 | 0.3639 | 3.6001 |
- | 3.4901 | 6.9851 | 24000 | 0.3646 | 3.5918 |
- | 3.4147 | 7.2760 | 25000 | 0.3649 | 3.5968 |
- | 3.4439 | 7.5671 | 26000 | 0.3658 | 3.5878 |
- | 3.4694 | 7.8582 | 27000 | 0.3663 | 3.5779 |
- | 3.3723 | 8.1490 | 28000 | 0.3661 | 3.5911 |
- | 3.405 | 8.4401 | 29000 | 0.3668 | 3.5804 |
- | 3.4347 | 8.7313 | 30000 | 0.3675 | 3.5715 |
- | 3.3197 | 9.0221 | 31000 | 0.3673 | 3.5790 |
- | 3.3719 | 9.3132 | 32000 | 0.3678 | 3.5758 |
- | 3.394 | 9.6043 | 33000 | 0.3685 | 3.5682 |
- | 3.4089 | 9.8954 | 34000 | 0.3688 | 3.5599 |
- | 3.3351 | 10.1863 | 35000 | 0.3686 | 3.5718 |
- | 3.3608 | 10.4774 | 36000 | 0.3686 | 3.5644 |
- | 3.3839 | 10.7685 | 37000 | 0.3696 | 3.5576 |
- | 3.2872 | 11.0594 | 38000 | 0.3697 | 3.5635 |
- | 3.3239 | 11.3505 | 39000 | 0.3692 | 3.5649 |
- | 3.3604 | 11.6416 | 40000 | 0.3702 | 3.5563 |
- | 3.3633 | 11.9327 | 41000 | 0.3705 | 3.5486 |
- | 3.3018 | 12.2236 | 42000 | 0.3699 | 3.5617 |
- | 3.3237 | 12.5147 | 43000 | 0.3704 | 3.5561 |
- | 3.3553 | 12.8058 | 44000 | 0.3710 | 3.5454 |
- | 3.266 | 13.0966 | 45000 | 0.3703 | 3.5622 |
- | 3.298 | 13.3878 | 46000 | 0.3706 | 3.5562 |
- | 3.3193 | 13.6789 | 47000 | 0.3712 | 3.5511 |
- | 3.3377 | 13.9700 | 48000 | 0.3719 | 3.5380 |
- | 3.2707 | 14.2608 | 49000 | 0.3711 | 3.5559 |
- | 3.2986 | 14.5519 | 50000 | 0.3714 | 3.5475 |
- | 3.3352 | 14.8430 | 51000 | 0.3721 | 3.5395 |
- | 3.2328 | 15.1339 | 52000 | 0.3717 | 3.5531 |
- | 3.278 | 15.4250 | 53000 | 0.3720 | 3.5493 |
- | 3.2919 | 15.7161 | 54000 | 0.3724 | 3.5413 |
- | 3.2682 | 16.0070 | 55000 | 0.3720 | 3.5475 |
- | 3.2366 | 16.2981 | 56000 | 0.3725 | 3.5493 |
- | 3.2881 | 16.5892 | 57000 | 0.3725 | 3.5421 |
- | 3.2976 | 16.8803 | 58000 | 0.3728 | 3.5371 |
- | 3.2211 | 17.1712 | 59000 | 0.3723 | 3.5510 |
- | 3.265 | 17.4623 | 60000 | 0.3727 | 3.5464 |
- | 3.2771 | 17.7534 | 61000 | 0.3732 | 3.5357 |
- | 3.1751 | 18.0442 | 62000 | 0.3724 | 3.5497 |
- | 3.2348 | 18.3354 | 63000 | 0.3728 | 3.5502 |
- | 3.2587 | 18.6265 | 64000 | 0.3730 | 3.5414 |
- | 3.2776 | 18.9176 | 65000 | 0.3738 | 3.5317 |
- | 3.1939 | 19.2084 | 66000 | 0.3728 | 3.5513 |
- | 3.2337 | 19.4995 | 67000 | 0.3731 | 3.5424 |
- | 3.2667 | 19.7906 | 68000 | 0.3738 | 3.5364 |
- | 3.1624 | 20.0815 | 69000 | 0.3732 | 3.5489 |
- | 3.2091 | 20.3726 | 70000 | 0.3737 | 3.5438 |
- | 3.245 | 20.6637 | 71000 | 0.3735 | 3.5370 |
- | 3.2399 | 20.9548 | 72000 | 0.3742 | 3.5309 |
- | 3.1951 | 21.2457 | 73000 | 0.3732 | 3.5491 |
- | 3.2234 | 21.5368 | 74000 | 0.3738 | 3.5366 |
- | 3.2386 | 21.8279 | 75000 | 0.3742 | 3.5339 |
- | 3.1712 | 22.1188 | 76000 | 0.3735 | 3.5475 |
- | 3.1945 | 22.4099 | 77000 | 0.3739 | 3.5480 |
- | 3.2309 | 22.7010 | 78000 | 0.3742 | 3.5343 |
- | 3.2372 | 22.9921 | 79000 | 0.3750 | 3.5286 |
- | 3.1752 | 23.2830 | 80000 | 0.3739 | 3.5466 |
- | 3.1695 | 23.5741 | 81000 | 0.3734 | 3.5544 |
- | 3.1993 | 23.8652 | 82000 | 0.3743 | 3.5403 |
- | 3.1541 | 24.1563 | 83000 | 0.3735 | 3.5507 |
- | 3.1831 | 24.4474 | 84000 | 0.3742 | 3.5457 |
- | 3.2054 | 24.7385 | 85000 | 0.3748 | 3.5350 |
- | 3.1135 | 25.0294 | 86000 | 0.3743 | 3.5484 |
- | 3.1559 | 25.3205 | 87000 | 0.3739 | 3.5515 |
- | 3.1787 | 25.6116 | 88000 | 0.3744 | 3.5396 |
- | 3.2158 | 25.9027 | 89000 | 0.3751 | 3.5324 |
- | 3.137 | 26.1936 | 90000 | 0.3743 | 3.5505 |
- | 3.168 | 26.4847 | 91000 | 0.3746 | 3.5460 |
- | 3.2025 | 26.7758 | 92000 | 0.3752 | 3.5367 |
- | 3.1075 | 27.0667 | 93000 | 0.3740 | 3.5545 |
- | 3.1515 | 27.3578 | 94000 | 0.3744 | 3.5459 |
- | 3.1658 | 27.6489 | 95000 | 0.3753 | 3.5345 |
- | 3.1922 | 27.9400 | 96000 | 0.3753 | 3.5348 |
- | 3.1149 | 28.2308 | 97000 | 0.3747 | 3.5507 |
- | 3.1605 | 28.5219 | 98000 | 0.3748 | 3.5423 |
- | 3.1636 | 28.8131 | 99000 | 0.3755 | 3.5381 |
- | 3.0885 | 29.1039 | 100000 | 0.3744 | 3.5533 |
-
-
  ### Framework versions

  - Transformers 4.55.2
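
The card's hyperparameters list `mixed_precision_training: Native AMP`, which the Hugging Face Trainer enables via its fp16/bf16 flags. As a minimal sketch of what the autocast half of native AMP does (the model and data here are hypothetical stand-ins, not the card's actual model), the forward pass is run in a lower-precision dtype while the parameters stay in float32:

```python
import torch

# Toy stand-ins: a linear "model" and an SGD optimizer.
model = torch.nn.Linear(16, 4)
optimizer = torch.optim.SGD(model.parameters(), lr=5e-5)

x = torch.randn(8, 16)
target = torch.randint(0, 4, (8,))

# Autocast runs the forward pass in a reduced-precision dtype;
# bfloat16 is chosen so this sketch also runs on CPU. On CUDA,
# float16 autocast is typically paired with a GradScaler to avoid
# gradient underflow.
with torch.autocast(device_type="cpu", dtype=torch.bfloat16):
    loss = torch.nn.functional.cross_entropy(model(x), target)

# Backward and the optimizer step happen outside autocast,
# against the float32 master weights.
loss.backward()
optimizer.step()
print(model.weight.grad is not None)  # True
```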
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:96389c46dba110f66d7abfcea0c2057e99b44f13f211e18fb5712d430d8523a8
+ oid sha256:50cc00849ff910eda942199e9d70932061e18bf1f2074512ceb1aa9d75eba2b8
  size 497774208
training_args.bin CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:f88068db8eeb85572309493d85166d5828ad1e23480fc7416dddfb2f65f1900a
+ oid sha256:d39ab8dc6e1aa7b3e1f8adede81177fd6cae253a4be46ca231b26bfb507a89d9
  size 5969
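
Both binary files are tracked with Git LFS, so the diffs above show pointer files (version, oid, size) rather than the binaries themselves. A pointer of this form can be parsed like so (a minimal sketch; the `version`/`oid`/`size` lines follow the Git LFS pointer-file format):

```python
def parse_lfs_pointer(text: str) -> dict:
    """Parse a Git LFS pointer file into its key/value fields."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    # The oid field has the form "sha256:<hex digest>".
    algo, _, digest = fields["oid"].partition(":")
    return {
        "version": fields["version"],
        "oid_algo": algo,
        "oid": digest,
        "size": int(fields["size"]),
    }

# The new model.safetensors pointer from the diff above.
pointer = """\
version https://git-lfs.github.com/spec/v1
oid sha256:50cc00849ff910eda942199e9d70932061e18bf1f2074512ceb1aa9d75eba2b8
size 497774208
"""
info = parse_lfs_pointer(pointer)
print(info["oid_algo"], info["size"])  # sha256 497774208
```

The `size` field is the byte length of the real file, which is why it is unchanged here: retraining rewrote the weights (new oid) but the tensor layout, and hence the file size, stayed the same.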