Commit af57aee (verified) by MayBashendy · 1 parent: b7e98a0

Training in progress, step 500

Files changed (4):
  1. README.md +318 -0
  2. config.json +32 -0
  3. model.safetensors +3 -0
  4. training_args.bin +3 -0
README.md ADDED
---
library_name: transformers
base_model: aubmindlab/bert-base-arabertv02
tags:
- generated_from_trainer
model-index:
- name: ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k15_task3_organization
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k15_task3_organization

This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.8387
- Qwk: 0.0460
- Mse: 0.8387
- Rmse: 0.9158

## Model description

More information needed

## Intended uses & limitations

More information needed
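The card does not yet include a usage example, so here is a minimal, hedged inference sketch. It assumes the checkpoint is loaded as the single-label regression model described by the `config.json` in this commit; `MODEL_PATH` is a placeholder, and since this commit ships no tokenizer files the tokenizer is taken from the base model.

```python
# Minimal inference sketch (not part of the original card). MODEL_PATH is a
# placeholder for wherever the checkpoint lives; the tokenizer comes from the
# base model aubmindlab/bert-base-arabertv02.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_PATH = "./ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k15_task3_organization"  # hypothetical local dir
BASE_MODEL = "aubmindlab/bert-base-arabertv02"

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_PATH)
model.eval()

essay = "نص المقال العربي هنا"  # the Arabic essay to score for organization
inputs = tokenizer(essay, truncation=True, max_length=512, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# problem_type="regression" with a single label means one continuous score per essay.
score = logits.squeeze(-1).item()
print(f"predicted organization score: {score:.3f}")
```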
## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
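For readers who want to reproduce the setup, below is a `TrainingArguments` sketch mirroring the list above. Values not listed in the card (weight decay, warmup, logging and eval cadence) are left at the Transformers defaults, and `output_dir` is a placeholder rather than the directory used for this run.

```python
# Sketch of TrainingArguments matching the hyperparameters listed above.
# Anything not stated in the card is left at the transformers defaults.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_task3_organization",  # hypothetical
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```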
### Training results

| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.0260 | 2 | 3.7827 | 0.0017 | 3.7827 | 1.9449 |
| No log | 0.0519 | 4 | 1.8284 | 0.0136 | 1.8284 | 1.3522 |
| No log | 0.0779 | 6 | 0.9332 | 0.0134 | 0.9332 | 0.9660 |
| No log | 0.1039 | 8 | 0.8623 | -0.0842 | 0.8623 | 0.9286 |
| No log | 0.1299 | 10 | 1.3640 | -0.0164 | 1.3640 | 1.1679 |
| No log | 0.1558 | 12 | 0.7873 | -0.0778 | 0.7873 | 0.8873 |
| No log | 0.1818 | 14 | 0.7033 | 0.0964 | 0.7033 | 0.8387 |
| No log | 0.2078 | 16 | 0.7434 | 0.0588 | 0.7434 | 0.8622 |
| No log | 0.2338 | 18 | 0.7891 | 0.0588 | 0.7891 | 0.8883 |
| No log | 0.2597 | 20 | 0.8384 | -0.0280 | 0.8384 | 0.9156 |
| No log | 0.2857 | 22 | 0.8515 | -0.0187 | 0.8515 | 0.9228 |
| No log | 0.3117 | 24 | 0.8668 | 0.1998 | 0.8668 | 0.9310 |
| No log | 0.3377 | 26 | 1.3116 | -0.0272 | 1.3116 | 1.1452 |
| No log | 0.3636 | 28 | 1.3169 | -0.0238 | 1.3169 | 1.1476 |
| No log | 0.3896 | 30 | 0.9412 | 0.0277 | 0.9412 | 0.9702 |
| No log | 0.4156 | 32 | 0.9388 | 0.0416 | 0.9388 | 0.9689 |
| No log | 0.4416 | 34 | 0.9095 | 0.0670 | 0.9095 | 0.9537 |
| No log | 0.4675 | 36 | 1.0151 | -0.0076 | 1.0151 | 1.0075 |
| No log | 0.4935 | 38 | 0.8864 | 0.1673 | 0.8864 | 0.9415 |
| No log | 0.5195 | 40 | 0.8033 | 0.1095 | 0.8033 | 0.8963 |
| No log | 0.5455 | 42 | 0.7796 | 0.2586 | 0.7796 | 0.8830 |
| No log | 0.5714 | 44 | 0.7941 | 0.0831 | 0.7941 | 0.8911 |
| No log | 0.5974 | 46 | 0.7523 | 0.1415 | 0.7523 | 0.8674 |
| No log | 0.6234 | 48 | 0.8062 | 0.1993 | 0.8062 | 0.8979 |
| No log | 0.6494 | 50 | 0.9525 | 0.1284 | 0.9525 | 0.9760 |
| No log | 0.6753 | 52 | 0.8532 | 0.2341 | 0.8532 | 0.9237 |
| No log | 0.7013 | 54 | 0.8357 | 0.2576 | 0.8357 | 0.9142 |
| No log | 0.7273 | 56 | 0.8785 | 0.2513 | 0.8785 | 0.9373 |
| No log | 0.7532 | 58 | 0.8859 | 0.2216 | 0.8859 | 0.9412 |
| No log | 0.7792 | 60 | 0.9198 | 0.1683 | 0.9198 | 0.9591 |
| No log | 0.8052 | 62 | 0.9467 | 0.1452 | 0.9467 | 0.9730 |
| No log | 0.8312 | 64 | 0.8959 | 0.1091 | 0.8959 | 0.9465 |
| No log | 0.8571 | 66 | 1.1017 | 0.0679 | 1.1017 | 1.0496 |
| No log | 0.8831 | 68 | 1.0610 | 0.1300 | 1.0610 | 1.0301 |
| No log | 0.9091 | 70 | 0.9891 | 0.2109 | 0.9891 | 0.9945 |
| No log | 0.9351 | 72 | 1.3542 | 0.0529 | 1.3542 | 1.1637 |
| No log | 0.9610 | 74 | 1.2204 | 0.0721 | 1.2204 | 1.1047 |
| No log | 0.9870 | 76 | 0.9187 | 0.2402 | 0.9187 | 0.9585 |
| No log | 1.0130 | 78 | 0.8594 | 0.2430 | 0.8594 | 0.9270 |
| No log | 1.0390 | 80 | 0.9112 | 0.1672 | 0.9112 | 0.9546 |
| No log | 1.0649 | 82 | 0.7820 | 0.1660 | 0.7820 | 0.8843 |
| No log | 1.0909 | 84 | 0.7905 | 0.1218 | 0.7905 | 0.8891 |
| No log | 1.1169 | 86 | 0.9036 | 0.1005 | 0.9036 | 0.9506 |
| No log | 1.1429 | 88 | 0.9019 | 0.1673 | 0.9019 | 0.9497 |
| No log | 1.1688 | 90 | 0.8481 | 0.1260 | 0.8481 | 0.9209 |
| No log | 1.1948 | 92 | 0.8712 | 0.1264 | 0.8712 | 0.9334 |
| No log | 1.2208 | 94 | 0.8809 | 0.0978 | 0.8809 | 0.9386 |
| No log | 1.2468 | 96 | 0.9137 | 0.1809 | 0.9137 | 0.9559 |
| No log | 1.2727 | 98 | 0.9551 | 0.1871 | 0.9551 | 0.9773 |
| No log | 1.2987 | 100 | 0.9088 | 0.1616 | 0.9088 | 0.9533 |
| No log | 1.3247 | 102 | 0.8746 | 0.1447 | 0.8746 | 0.9352 |
| No log | 1.3506 | 104 | 0.9544 | 0.0991 | 0.9544 | 0.9769 |
| No log | 1.3766 | 106 | 1.1961 | 0.0585 | 1.1961 | 1.0937 |
| No log | 1.4026 | 108 | 0.9617 | 0.0217 | 0.9617 | 0.9807 |
| No log | 1.4286 | 110 | 0.8321 | 0.1321 | 0.8321 | 0.9122 |
| No log | 1.4545 | 112 | 0.9039 | -0.0409 | 0.9039 | 0.9507 |
| No log | 1.4805 | 114 | 0.7704 | 0.1287 | 0.7704 | 0.8777 |
| No log | 1.5065 | 116 | 0.8998 | 0.0207 | 0.8998 | 0.9486 |
| No log | 1.5325 | 118 | 0.9084 | -0.0143 | 0.9084 | 0.9531 |
| No log | 1.5584 | 120 | 0.7506 | 0.1254 | 0.7506 | 0.8664 |
| No log | 1.5844 | 122 | 1.0116 | -0.0142 | 1.0116 | 1.0058 |
| No log | 1.6104 | 124 | 0.9836 | 0.0104 | 0.9836 | 0.9918 |
| No log | 1.6364 | 126 | 0.8551 | 0.0091 | 0.8551 | 0.9247 |
| No log | 1.6623 | 128 | 1.5396 | 0.0822 | 1.5396 | 1.2408 |
| No log | 1.6883 | 130 | 1.5051 | 0.0343 | 1.5051 | 1.2268 |
| No log | 1.7143 | 132 | 0.8941 | 0.0781 | 0.8941 | 0.9456 |
| No log | 1.7403 | 134 | 1.3760 | 0.1515 | 1.3760 | 1.1730 |
| No log | 1.7662 | 136 | 1.5703 | 0.1076 | 1.5703 | 1.2531 |
| No log | 1.7922 | 138 | 1.1920 | 0.1548 | 1.1920 | 1.0918 |
| No log | 1.8182 | 140 | 0.8285 | 0.1723 | 0.8285 | 0.9102 |
| No log | 1.8442 | 142 | 1.0455 | 0.1111 | 1.0455 | 1.0225 |
| No log | 1.8701 | 144 | 1.0487 | 0.0764 | 1.0487 | 1.0241 |
| No log | 1.8961 | 146 | 0.8153 | 0.1495 | 0.8153 | 0.9029 |
| No log | 1.9221 | 148 | 0.7524 | 0.1244 | 0.7524 | 0.8674 |
| No log | 1.9481 | 150 | 0.7661 | 0.0934 | 0.7661 | 0.8752 |
| No log | 1.9740 | 152 | 0.7330 | 0.1244 | 0.7330 | 0.8561 |
| No log | 2.0 | 154 | 0.7291 | 0.2070 | 0.7291 | 0.8539 |
| No log | 2.0260 | 156 | 0.8333 | 0.1687 | 0.8333 | 0.9128 |
| No log | 2.0519 | 158 | 0.9358 | 0.0547 | 0.9358 | 0.9674 |
| No log | 2.0779 | 160 | 0.8121 | 0.1379 | 0.8121 | 0.9012 |
| No log | 2.1039 | 162 | 0.9156 | 0.1605 | 0.9156 | 0.9569 |
| No log | 2.1299 | 164 | 1.0465 | 0.2047 | 1.0465 | 1.0230 |
| No log | 2.1558 | 166 | 1.1609 | 0.1849 | 1.1609 | 1.0774 |
| No log | 2.1818 | 168 | 1.0361 | 0.2287 | 1.0361 | 1.0179 |
| No log | 2.2078 | 170 | 0.9533 | 0.2306 | 0.9533 | 0.9764 |
| No log | 2.2338 | 172 | 0.8739 | 0.2339 | 0.8739 | 0.9348 |
| No log | 2.2597 | 174 | 0.8422 | 0.1624 | 0.8422 | 0.9177 |
| No log | 2.2857 | 176 | 0.7910 | 0.1007 | 0.7910 | 0.8894 |
| No log | 2.3117 | 178 | 0.7947 | 0.1373 | 0.7947 | 0.8914 |
| No log | 2.3377 | 180 | 0.7954 | 0.1037 | 0.7954 | 0.8919 |
| No log | 2.3636 | 182 | 0.7238 | 0.1362 | 0.7238 | 0.8507 |
| No log | 2.3896 | 184 | 0.8371 | 0.0799 | 0.8371 | 0.9149 |
| No log | 2.4156 | 186 | 0.8420 | 0.0799 | 0.8420 | 0.9176 |
| No log | 2.4416 | 188 | 0.7272 | 0.2239 | 0.7272 | 0.8528 |
| No log | 2.4675 | 190 | 0.9144 | 0.1042 | 0.9144 | 0.9562 |
| No log | 2.4935 | 192 | 0.9127 | 0.0681 | 0.9127 | 0.9553 |
| No log | 2.5195 | 194 | 0.7435 | 0.1835 | 0.7435 | 0.8623 |
| No log | 2.5455 | 196 | 0.8867 | 0.0769 | 0.8867 | 0.9416 |
| No log | 2.5714 | 198 | 0.8814 | 0.1105 | 0.8814 | 0.9389 |
| No log | 2.5974 | 200 | 0.7282 | 0.1199 | 0.7282 | 0.8533 |
| No log | 2.6234 | 202 | 0.7984 | 0.0991 | 0.7984 | 0.8935 |
| No log | 2.6494 | 204 | 0.9467 | 0.1116 | 0.9467 | 0.9730 |
| No log | 2.6753 | 206 | 0.7891 | 0.0989 | 0.7891 | 0.8883 |
| No log | 2.7013 | 208 | 0.7442 | 0.0814 | 0.7442 | 0.8627 |
| No log | 2.7273 | 210 | 0.9113 | 0.0799 | 0.9113 | 0.9546 |
| No log | 2.7532 | 212 | 0.8955 | 0.0799 | 0.8955 | 0.9463 |
| No log | 2.7792 | 214 | 0.7506 | 0.1196 | 0.7506 | 0.8664 |
| No log | 2.8052 | 216 | 0.7897 | 0.1358 | 0.7897 | 0.8886 |
| No log | 2.8312 | 218 | 0.7322 | 0.0495 | 0.7322 | 0.8557 |
| No log | 2.8571 | 220 | 0.7059 | 0.1828 | 0.7059 | 0.8402 |
| No log | 2.8831 | 222 | 0.8952 | 0.1493 | 0.8952 | 0.9461 |
| No log | 2.9091 | 224 | 0.8852 | 0.1493 | 0.8852 | 0.9408 |
| No log | 2.9351 | 226 | 0.7224 | 0.2390 | 0.7224 | 0.8499 |
| No log | 2.9610 | 228 | 0.7730 | 0.0989 | 0.7730 | 0.8792 |
| No log | 2.9870 | 230 | 1.0735 | 0.1077 | 1.0735 | 1.0361 |
| No log | 3.0130 | 232 | 1.0207 | 0.1396 | 1.0207 | 1.0103 |
| No log | 3.0390 | 234 | 0.7820 | 0.0861 | 0.7820 | 0.8843 |
| No log | 3.0649 | 236 | 0.9876 | 0.0953 | 0.9876 | 0.9938 |
| No log | 3.0909 | 238 | 1.0354 | 0.1182 | 1.0354 | 1.0176 |
| No log | 3.1169 | 240 | 0.7845 | 0.2092 | 0.7845 | 0.8857 |
| No log | 3.1429 | 242 | 0.6882 | 0.1371 | 0.6882 | 0.8296 |
| No log | 3.1688 | 244 | 0.6977 | 0.0414 | 0.6977 | 0.8353 |
| No log | 3.1948 | 246 | 0.7322 | 0.0874 | 0.7322 | 0.8557 |
| No log | 3.2208 | 248 | 0.7679 | 0.0776 | 0.7679 | 0.8763 |
| No log | 3.2468 | 250 | 0.9265 | 0.1297 | 0.9265 | 0.9625 |
| No log | 3.2727 | 252 | 1.0354 | 0.1110 | 1.0354 | 1.0176 |
| No log | 3.2987 | 254 | 0.9784 | 0.1180 | 0.9784 | 0.9891 |
| No log | 3.3247 | 256 | 0.8459 | 0.1103 | 0.8459 | 0.9197 |
| No log | 3.3506 | 258 | 0.7853 | 0.1359 | 0.7853 | 0.8862 |
| No log | 3.3766 | 260 | 0.7790 | 0.1646 | 0.7790 | 0.8826 |
| No log | 3.4026 | 262 | 0.7631 | 0.1236 | 0.7631 | 0.8736 |
| No log | 3.4286 | 264 | 0.7651 | 0.1143 | 0.7651 | 0.8747 |
| No log | 3.4545 | 266 | 0.9085 | 0.0921 | 0.9085 | 0.9531 |
| No log | 3.4805 | 268 | 1.0924 | 0.1353 | 1.0924 | 1.0452 |
| No log | 3.5065 | 270 | 1.0353 | 0.1111 | 1.0353 | 1.0175 |
| No log | 3.5325 | 272 | 0.8158 | 0.2155 | 0.8158 | 0.9032 |
| No log | 3.5584 | 274 | 0.7422 | 0.0926 | 0.7422 | 0.8615 |
| No log | 3.5844 | 276 | 0.7692 | 0.0937 | 0.7692 | 0.8770 |
| No log | 3.6104 | 278 | 0.7504 | 0.0926 | 0.7504 | 0.8663 |
| No log | 3.6364 | 280 | 0.8242 | 0.1954 | 0.8242 | 0.9078 |
| No log | 3.6623 | 282 | 0.9951 | 0.0856 | 0.9951 | 0.9976 |
| No log | 3.6883 | 284 | 1.0172 | 0.0451 | 1.0172 | 1.0086 |
| No log | 3.7143 | 286 | 0.8420 | 0.1105 | 0.8420 | 0.9176 |
| No log | 3.7403 | 288 | 0.7164 | 0.1758 | 0.7164 | 0.8464 |
| No log | 3.7662 | 290 | 0.6943 | 0.1828 | 0.6943 | 0.8333 |
| No log | 3.7922 | 292 | 0.7167 | 0.1758 | 0.7167 | 0.8466 |
| No log | 3.8182 | 294 | 0.8173 | 0.1522 | 0.8173 | 0.9040 |
| No log | 3.8442 | 296 | 0.9090 | 0.1180 | 0.9090 | 0.9534 |
| No log | 3.8701 | 298 | 0.8568 | 0.2204 | 0.8568 | 0.9256 |
| No log | 3.8961 | 300 | 0.8241 | 0.1760 | 0.8241 | 0.9078 |
| No log | 3.9221 | 302 | 0.7909 | 0.1790 | 0.7909 | 0.8893 |
| No log | 3.9481 | 304 | 0.7410 | 0.2605 | 0.7410 | 0.8608 |
| No log | 3.9740 | 306 | 0.7256 | 0.2544 | 0.7256 | 0.8518 |
| No log | 4.0 | 308 | 0.7021 | 0.2150 | 0.7021 | 0.8379 |
| No log | 4.0260 | 310 | 0.7278 | 0.1644 | 0.7278 | 0.8531 |
| No log | 4.0519 | 312 | 0.7727 | 0.2641 | 0.7727 | 0.8790 |
| No log | 4.0779 | 314 | 0.7644 | 0.2024 | 0.7644 | 0.8743 |
| No log | 4.1039 | 316 | 0.7635 | 0.1282 | 0.7635 | 0.8738 |
| No log | 4.1299 | 318 | 0.7725 | 0.1456 | 0.7725 | 0.8789 |
| No log | 4.1558 | 320 | 0.7649 | 0.1456 | 0.7649 | 0.8746 |
| No log | 4.1818 | 322 | 0.7600 | 0.0889 | 0.7600 | 0.8718 |
| No log | 4.2078 | 324 | 0.7745 | 0.2142 | 0.7745 | 0.8801 |
| No log | 4.2338 | 326 | 0.7533 | 0.1240 | 0.7533 | 0.8679 |
| No log | 4.2597 | 328 | 0.8175 | 0.0348 | 0.8175 | 0.9041 |
| No log | 4.2857 | 330 | 0.7764 | 0.1286 | 0.7764 | 0.8812 |
| No log | 4.3117 | 332 | 0.7584 | 0.1286 | 0.7584 | 0.8709 |
| No log | 4.3377 | 334 | 0.7222 | 0.1362 | 0.7222 | 0.8498 |
| No log | 4.3636 | 336 | 0.7648 | 0.0053 | 0.7648 | 0.8745 |
| No log | 4.3896 | 338 | 0.7367 | 0.0449 | 0.7367 | 0.8583 |
| No log | 4.4156 | 340 | 0.7523 | 0.0953 | 0.7523 | 0.8673 |
| No log | 4.4416 | 342 | 0.8638 | 0.0719 | 0.8638 | 0.9294 |
| No log | 4.4675 | 344 | 0.8479 | 0.0719 | 0.8479 | 0.9208 |
| No log | 4.4935 | 346 | 0.7852 | 0.0953 | 0.7852 | 0.8861 |
| No log | 4.5195 | 348 | 0.7553 | 0.0828 | 0.7553 | 0.8691 |
| No log | 4.5455 | 350 | 0.7795 | 0.1240 | 0.7795 | 0.8829 |
| No log | 4.5714 | 352 | 0.7852 | 0.1141 | 0.7852 | 0.8861 |
| No log | 4.5974 | 354 | 0.7638 | 0.0783 | 0.7638 | 0.8739 |
| No log | 4.6234 | 356 | 0.7352 | 0.1740 | 0.7352 | 0.8575 |
| No log | 4.6494 | 358 | 0.7252 | 0.2180 | 0.7252 | 0.8516 |
| No log | 4.6753 | 360 | 0.7469 | 0.1565 | 0.7469 | 0.8643 |
| No log | 4.7013 | 362 | 0.7280 | 0.2180 | 0.7280 | 0.8532 |
| No log | 4.7273 | 364 | 0.7549 | 0.2053 | 0.7549 | 0.8689 |
| No log | 4.7532 | 366 | 0.8056 | 0.1228 | 0.8056 | 0.8976 |
| No log | 4.7792 | 368 | 0.8273 | 0.1224 | 0.8273 | 0.9095 |
| No log | 4.8052 | 370 | 0.7851 | 0.1585 | 0.7851 | 0.8861 |
| No log | 4.8312 | 372 | 0.7528 | 0.2078 | 0.7528 | 0.8676 |
| No log | 4.8571 | 374 | 0.7064 | 0.0821 | 0.7064 | 0.8405 |
| No log | 4.8831 | 376 | 0.7196 | 0.0918 | 0.7196 | 0.8483 |
| No log | 4.9091 | 378 | 0.6905 | 0.0869 | 0.6905 | 0.8309 |
| No log | 4.9351 | 380 | 0.7093 | 0.1758 | 0.7093 | 0.8422 |
| No log | 4.9610 | 382 | 0.7654 | 0.1047 | 0.7654 | 0.8748 |
| No log | 4.9870 | 384 | 0.7964 | 0.1144 | 0.7964 | 0.8924 |
| No log | 5.0130 | 386 | 0.8680 | 0.1624 | 0.8680 | 0.9317 |
| No log | 5.0390 | 388 | 0.9276 | 0.0906 | 0.9276 | 0.9631 |
| No log | 5.0649 | 390 | 0.9091 | 0.1661 | 0.9091 | 0.9535 |
| No log | 5.0909 | 392 | 0.8770 | 0.1264 | 0.8770 | 0.9365 |
| No log | 5.1169 | 394 | 0.8827 | 0.1609 | 0.8827 | 0.9395 |
| No log | 5.1429 | 396 | 0.8163 | 0.1660 | 0.8163 | 0.9035 |
| No log | 5.1688 | 398 | 0.7888 | 0.2107 | 0.7888 | 0.8882 |
| No log | 5.1948 | 400 | 0.7333 | 0.0863 | 0.7333 | 0.8563 |
| No log | 5.2208 | 402 | 0.7787 | 0.0028 | 0.7787 | 0.8824 |
| No log | 5.2468 | 404 | 0.8527 | 0.0192 | 0.8527 | 0.9234 |
| No log | 5.2727 | 406 | 0.8274 | 0.0172 | 0.8274 | 0.9096 |
| No log | 5.2987 | 408 | 0.7942 | 0.0791 | 0.7942 | 0.8912 |
| No log | 5.3247 | 410 | 0.8594 | 0.2466 | 0.8594 | 0.9270 |
| No log | 5.3506 | 412 | 0.8424 | 0.2466 | 0.8424 | 0.9178 |
| No log | 5.3766 | 414 | 0.7696 | 0.1644 | 0.7696 | 0.8773 |
| No log | 5.4026 | 416 | 0.7638 | 0.0394 | 0.7638 | 0.8740 |
| No log | 5.4286 | 418 | 0.7671 | 0.1199 | 0.7671 | 0.8759 |
| No log | 5.4545 | 420 | 0.8344 | 0.1758 | 0.8344 | 0.9135 |
| No log | 5.4805 | 422 | 0.8640 | 0.0438 | 0.8640 | 0.9295 |
| No log | 5.5065 | 424 | 0.8555 | 0.1329 | 0.8555 | 0.9250 |
| No log | 5.5325 | 426 | 0.8232 | 0.1139 | 0.8232 | 0.9073 |
| No log | 5.5584 | 428 | 0.8269 | 0.0460 | 0.8269 | 0.9094 |
| No log | 5.5844 | 430 | 0.8181 | 0.1094 | 0.8181 | 0.9045 |
| No log | 5.6104 | 432 | 0.9209 | 0.0676 | 0.9209 | 0.9596 |
| No log | 5.6364 | 434 | 0.9284 | 0.1027 | 0.9284 | 0.9635 |
| No log | 5.6623 | 436 | 0.8331 | 0.2092 | 0.8331 | 0.9127 |
| No log | 5.6883 | 438 | 0.7815 | 0.1244 | 0.7815 | 0.8840 |
| No log | 5.7143 | 440 | 0.8235 | 0.0123 | 0.8235 | 0.9075 |
| No log | 5.7403 | 442 | 0.8225 | 0.0123 | 0.8225 | 0.9069 |
| No log | 5.7662 | 444 | 0.7892 | 0.0840 | 0.7892 | 0.8883 |
| No log | 5.7922 | 446 | 0.7982 | 0.2466 | 0.7982 | 0.8934 |
| No log | 5.8182 | 448 | 0.8093 | 0.2248 | 0.8093 | 0.8996 |
| No log | 5.8442 | 450 | 0.8194 | 0.2318 | 0.8194 | 0.9052 |
| No log | 5.8701 | 452 | 0.7653 | 0.1249 | 0.7653 | 0.8748 |
| No log | 5.8961 | 454 | 0.7787 | 0.0834 | 0.7787 | 0.8824 |
| No log | 5.9221 | 456 | 0.8083 | 0.0834 | 0.8083 | 0.8991 |
| No log | 5.9481 | 458 | 0.8322 | 0.1922 | 0.8322 | 0.9123 |
| No log | 5.9740 | 460 | 0.8192 | 0.1863 | 0.8192 | 0.9051 |
| No log | 6.0 | 462 | 0.8166 | 0.1365 | 0.8166 | 0.9036 |
| No log | 6.0260 | 464 | 0.8212 | 0.1415 | 0.8212 | 0.9062 |
| No log | 6.0519 | 466 | 0.8582 | 0.0923 | 0.8582 | 0.9264 |
| No log | 6.0779 | 468 | 0.9130 | 0.0700 | 0.9130 | 0.9555 |
| No log | 6.1039 | 470 | 0.9920 | 0.0918 | 0.9920 | 0.9960 |
| No log | 6.1299 | 472 | 0.9495 | 0.0988 | 0.9495 | 0.9744 |
| No log | 6.1558 | 474 | 0.8351 | 0.1324 | 0.8351 | 0.9139 |
| No log | 6.1818 | 476 | 0.7869 | 0.1304 | 0.7869 | 0.8871 |
| No log | 6.2078 | 478 | 0.7671 | 0.1304 | 0.7671 | 0.8758 |
| No log | 6.2338 | 480 | 0.7622 | 0.2053 | 0.7622 | 0.8730 |
| No log | 6.2597 | 482 | 0.7674 | 0.2431 | 0.7674 | 0.8760 |
| No log | 6.2857 | 484 | 0.8195 | 0.0799 | 0.8195 | 0.9053 |
| No log | 6.3117 | 486 | 0.7855 | 0.2155 | 0.7855 | 0.8863 |
| No log | 6.3377 | 488 | 0.7207 | 0.1740 | 0.7207 | 0.8489 |
| No log | 6.3636 | 490 | 0.7230 | 0.1740 | 0.7230 | 0.8503 |
| No log | 6.3896 | 492 | 0.7481 | 0.2053 | 0.7481 | 0.8649 |
| No log | 6.4156 | 494 | 0.7975 | 0.1465 | 0.7975 | 0.8930 |
| No log | 6.4416 | 496 | 0.8784 | 0.1448 | 0.8784 | 0.9372 |
| No log | 6.4675 | 498 | 0.8989 | 0.1386 | 0.8989 | 0.9481 |
| 0.2854 | 6.4935 | 500 | 0.8877 | 0.1268 | 0.8877 | 0.9422 |
| 0.2854 | 6.5195 | 502 | 0.8102 | 0.0798 | 0.8102 | 0.9001 |
| 0.2854 | 6.5455 | 504 | 0.7941 | 0.0412 | 0.7941 | 0.8911 |
| 0.2854 | 6.5714 | 506 | 0.8259 | 0.0893 | 0.8259 | 0.9088 |
| 0.2854 | 6.5974 | 508 | 0.8524 | 0.1498 | 0.8524 | 0.9232 |
| 0.2854 | 6.6234 | 510 | 1.0122 | 0.0585 | 1.0122 | 1.0061 |
| 0.2854 | 6.6494 | 512 | 1.0943 | 0.0585 | 1.0943 | 1.0461 |
| 0.2854 | 6.6753 | 514 | 0.9967 | 0.0603 | 0.9967 | 0.9984 |
| 0.2854 | 6.7013 | 516 | 0.8835 | 0.1228 | 0.8835 | 0.9399 |
| 0.2854 | 6.7273 | 518 | 0.8387 | 0.0460 | 0.8387 | 0.9158 |
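The `Qwk` column above is quadratic weighted kappa, which compares discrete gold scores against (rounded) model predictions, while `Mse`/`Rmse` are computed on the raw regression outputs. Below is a minimal illustration of how such values can be computed; the numbers are made up and the exact rounding and label range used during training are not stated in the card.

```python
# Illustration only: computing QWK / MSE / RMSE values like those reported above.
# gold and pred are made-up examples; the rounding scheme is an assumption.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

gold = np.array([3, 2, 4, 1, 3])            # integer organization scores from raters
pred = np.array([2.6, 2.1, 3.4, 1.2, 2.9])  # raw regression outputs from the model

qwk = cohen_kappa_score(gold, np.rint(pred).astype(int), weights="quadratic")
mse = mean_squared_error(gold, pred)
print(f"QWK={qwk:.4f}  MSE={mse:.4f}  RMSE={np.sqrt(mse):.4f}")
```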

### Framework versions

- Transformers 4.44.2
- PyTorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
config.json ADDED
{
  "_name_or_path": "aubmindlab/bert-base-arabertv02",
  "architectures": [
    "BertForSequenceClassification"
  ],
  "attention_probs_dropout_prob": 0.1,
  "classifier_dropout": null,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.1,
  "hidden_size": 768,
  "id2label": {
    "0": "LABEL_0"
  },
  "initializer_range": 0.02,
  "intermediate_size": 3072,
  "label2id": {
    "LABEL_0": 0
  },
  "layer_norm_eps": 1e-12,
  "max_position_embeddings": 512,
  "model_type": "bert",
  "num_attention_heads": 12,
  "num_hidden_layers": 12,
  "pad_token_id": 0,
  "position_embedding_type": "absolute",
  "problem_type": "regression",
  "torch_dtype": "float32",
  "transformers_version": "4.44.2",
  "type_vocab_size": 2,
  "use_cache": true,
  "vocab_size": 64000
}
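A quick way to see what this configuration implies in practice (assuming the file above is saved as `config.json` in the working directory):

```python
# Load the config above and inspect the head it defines. With a single label and
# problem_type="regression", BertForSequenceClassification uses an MSE loss and
# emits one continuous score per input.
from transformers import AutoConfig

config = AutoConfig.from_pretrained(".")   # directory containing config.json
print(config.model_type)     # bert
print(config.num_labels)     # 1
print(config.problem_type)   # regression
```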
model.safetensors ADDED
version https://git-lfs.github.com/spec/v1
oid sha256:d9d3f5e3e552f355b52e4953868672927727ceafb7cce389e498d3a98f1bc7b8
size 540799996
training_args.bin ADDED
version https://git-lfs.github.com/spec/v1
oid sha256:37be9528bf12528dee5d9af3d4ac507b21a0eee33aa33c95ab0585c22e5d4c69
size 5304