MayBashendy committed on
Commit 6788d16 · verified · 1 Parent(s): a99e8fb

Training in progress, step 500

Files changed (4)
  1. README.md +318 -0
  2. config.json +32 -0
  3. model.safetensors +3 -0
  4. training_args.bin +3 -0
README.md ADDED
---
library_name: transformers
base_model: aubmindlab/bert-base-arabertv02
tags:
- generated_from_trainer
model-index:
- name: ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k19_task3_organization
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k19_task3_organization

This model is a fine-tuned version of [aubmindlab/bert-base-arabertv02](https://huggingface.co/aubmindlab/bert-base-arabertv02) on an unspecified dataset.
It achieves the following results on the evaluation set (a minimal usage sketch follows the list):
- Loss: 0.8082
- Qwk: -0.1398
- Mse: 0.8082
- Rmse: 0.8990
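The checkpoint is configured as a single-label regressor (see `config.json` below), so scoring a text amounts to reading the one-dimensional logit. The sketch below is a minimal, assumption-laden example: the repo id is guessed from the model name and the essay text is a placeholder, neither comes from the original card.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumed repo id, derived from the model name above; adjust if the card lives elsewhere.
repo_id = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k19_task3_organization"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)  # regression head, num_labels=1
model.eval()

essay = "..."  # placeholder: an Arabic essay to be scored for organization
inputs = tokenizer(essay, truncation=True, max_length=512, return_tensors="pt")
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()  # single continuous organization score
print(score)
```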
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training; a hedged `TrainingArguments` sketch reproducing them follows the list:
- learning_rate: 2e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 100
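These settings map directly onto `TrainingArguments`. The sketch below is a reconstruction under stated assumptions, not the original training script: the output directory is a placeholder, the evaluation/logging/save cadence is inferred from the results table and the commit message, and dataset preparation is omitted.

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer, TrainingArguments

base = "aubmindlab/bert-base-arabertv02"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForSequenceClassification.from_pretrained(
    base, num_labels=1, problem_type="regression")  # matches config.json below

args = TrainingArguments(
    output_dir="arabert_task3_organization",  # placeholder name
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    eval_strategy="steps",  # assumption: the results table evaluates every 2 steps
    eval_steps=2,
    logging_steps=500,      # assumption: training loss first appears at step 500
    save_steps=500,         # assumption, from the commit message "Training in progress, step 500"
)
# These args, the tokenized train/eval datasets (not described in this card), and a
# compute_metrics function for Qwk/Mse/Rmse would then be passed to transformers.Trainer.
```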
### Training results

| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.0204 | 2 | 3.8284 | 0.0023 | 3.8284 | 1.9566 |
| No log | 0.0408 | 4 | 1.9331 | 0.0119 | 1.9331 | 1.3903 |
| No log | 0.0612 | 6 | 1.1702 | -0.0445 | 1.1702 | 1.0818 |
| No log | 0.0816 | 8 | 1.0615 | 0.0378 | 1.0615 | 1.0303 |
| No log | 0.1020 | 10 | 1.0769 | 0.0723 | 1.0769 | 1.0377 |
| No log | 0.1224 | 12 | 0.7414 | -0.0101 | 0.7414 | 0.8611 |
| No log | 0.1429 | 14 | 0.7449 | 0.0506 | 0.7449 | 0.8631 |
| No log | 0.1633 | 16 | 0.9511 | 0.0627 | 0.9511 | 0.9753 |
| No log | 0.1837 | 18 | 1.9542 | -0.0405 | 1.9542 | 1.3979 |
| No log | 0.2041 | 20 | 1.2383 | 0.1118 | 1.2383 | 1.1128 |
| No log | 0.2245 | 22 | 0.6690 | -0.0626 | 0.6690 | 0.8179 |
| No log | 0.2449 | 24 | 0.6931 | 0.0555 | 0.6931 | 0.8325 |
| No log | 0.2653 | 26 | 0.7362 | 0.0 | 0.7362 | 0.8580 |
| No log | 0.2857 | 28 | 0.7323 | 0.0 | 0.7323 | 0.8557 |
| No log | 0.3061 | 30 | 0.7224 | 0.0555 | 0.7224 | 0.8499 |
| No log | 0.3265 | 32 | 0.7453 | -0.0069 | 0.7453 | 0.8633 |
| No log | 0.3469 | 34 | 0.9563 | 0.0067 | 0.9563 | 0.9779 |
| No log | 0.3673 | 36 | 1.2800 | 0.0016 | 1.2800 | 1.1314 |
| No log | 0.3878 | 38 | 1.3556 | 0.0 | 1.3556 | 1.1643 |
| No log | 0.4082 | 40 | 1.2018 | 0.0 | 1.2018 | 1.0963 |
| No log | 0.4286 | 42 | 0.9620 | 0.0543 | 0.9620 | 0.9808 |
| No log | 0.4490 | 44 | 0.8233 | -0.0056 | 0.8233 | 0.9074 |
| No log | 0.4694 | 46 | 0.7787 | 0.0099 | 0.7787 | 0.8824 |
| No log | 0.4898 | 48 | 0.8885 | 0.1191 | 0.8885 | 0.9426 |
| No log | 0.5102 | 50 | 0.8231 | -0.0812 | 0.8231 | 0.9072 |
| No log | 0.5306 | 52 | 0.7392 | -0.0679 | 0.7392 | 0.8598 |
| No log | 0.5510 | 54 | 0.7632 | 0.0555 | 0.7632 | 0.8736 |
| No log | 0.5714 | 56 | 0.7723 | 0.0555 | 0.7723 | 0.8788 |
| No log | 0.5918 | 58 | 0.8086 | -0.0739 | 0.8086 | 0.8992 |
| No log | 0.6122 | 60 | 0.8993 | -0.0425 | 0.8993 | 0.9483 |
| No log | 0.6327 | 62 | 0.7805 | 0.0260 | 0.7805 | 0.8835 |
| No log | 0.6531 | 64 | 0.7391 | -0.0131 | 0.7391 | 0.8597 |
| No log | 0.6735 | 66 | 0.8373 | -0.0214 | 0.8373 | 0.9150 |
| No log | 0.6939 | 68 | 0.9268 | 0.0052 | 0.9268 | 0.9627 |
| No log | 0.7143 | 70 | 0.8790 | -0.0477 | 0.8790 | 0.9375 |
| No log | 0.7347 | 72 | 0.7876 | 0.1660 | 0.7876 | 0.8875 |
| No log | 0.7551 | 74 | 0.8913 | 0.1147 | 0.8913 | 0.9441 |
| No log | 0.7755 | 76 | 0.8651 | 0.1633 | 0.8651 | 0.9301 |
| No log | 0.7959 | 78 | 0.8247 | 0.0172 | 0.8247 | 0.9081 |
| No log | 0.8163 | 80 | 0.9847 | -0.0228 | 0.9847 | 0.9923 |
| No log | 0.8367 | 82 | 1.0141 | -0.0533 | 1.0141 | 1.0070 |
| No log | 0.8571 | 84 | 0.8752 | 0.1591 | 0.8752 | 0.9355 |
| No log | 0.8776 | 86 | 0.8733 | 0.0119 | 0.8733 | 0.9345 |
| No log | 0.8980 | 88 | 0.8476 | -0.0268 | 0.8476 | 0.9206 |
| No log | 0.9184 | 90 | 0.8309 | 0.0056 | 0.8309 | 0.9115 |
| No log | 0.9388 | 92 | 0.9553 | -0.1152 | 0.9553 | 0.9774 |
| No log | 0.9592 | 94 | 0.8898 | 0.2031 | 0.8898 | 0.9433 |
| No log | 0.9796 | 96 | 0.8641 | -0.0287 | 0.8641 | 0.9296 |
| No log | 1.0 | 98 | 0.8091 | 0.0816 | 0.8091 | 0.8995 |
| No log | 1.0204 | 100 | 0.8516 | 0.1400 | 0.8516 | 0.9228 |
| No log | 1.0408 | 102 | 1.1248 | 0.0541 | 1.1248 | 1.0606 |
| No log | 1.0612 | 104 | 0.9803 | 0.0416 | 0.9803 | 0.9901 |
| No log | 1.0816 | 106 | 0.8334 | 0.1529 | 0.8334 | 0.9129 |
| No log | 1.1020 | 108 | 0.8413 | 0.0688 | 0.8413 | 0.9172 |
| No log | 1.1224 | 110 | 0.8573 | 0.0833 | 0.8573 | 0.9259 |
| No log | 1.1429 | 112 | 0.9098 | 0.1221 | 0.9098 | 0.9539 |
| No log | 1.1633 | 114 | 0.9435 | 0.0866 | 0.9435 | 0.9713 |
| No log | 1.1837 | 116 | 1.0304 | 0.0007 | 1.0304 | 1.0151 |
| No log | 1.2041 | 118 | 1.3937 | -0.0249 | 1.3937 | 1.1805 |
| No log | 1.2245 | 120 | 1.5959 | 0.0380 | 1.5959 | 1.2633 |
| No log | 1.2449 | 122 | 1.3763 | 0.0802 | 1.3763 | 1.1732 |
| No log | 1.2653 | 124 | 1.1502 | 0.1308 | 1.1502 | 1.0725 |
| No log | 1.2857 | 126 | 0.9375 | 0.1123 | 0.9375 | 0.9683 |
| No log | 1.3061 | 128 | 0.8915 | 0.1519 | 0.8915 | 0.9442 |
| No log | 1.3265 | 130 | 0.9443 | -0.0237 | 0.9443 | 0.9717 |
| No log | 1.3469 | 132 | 1.0621 | -0.0373 | 1.0621 | 1.0306 |
| No log | 1.3673 | 134 | 1.0022 | 0.0487 | 1.0022 | 1.0011 |
| No log | 1.3878 | 136 | 0.8774 | 0.1925 | 0.8774 | 0.9367 |
| No log | 1.4082 | 138 | 0.8235 | 0.1529 | 0.8235 | 0.9075 |
| No log | 1.4286 | 140 | 0.8571 | 0.1941 | 0.8571 | 0.9258 |
| No log | 1.4490 | 142 | 0.9744 | 0.0729 | 0.9744 | 0.9871 |
| No log | 1.4694 | 144 | 0.8663 | 0.2316 | 0.8663 | 0.9308 |
| No log | 1.4898 | 146 | 0.8334 | 0.2545 | 0.8334 | 0.9129 |
| No log | 1.5102 | 148 | 0.8223 | 0.1608 | 0.8223 | 0.9068 |
| No log | 1.5306 | 150 | 0.9312 | 0.0643 | 0.9312 | 0.9650 |
| No log | 1.5510 | 152 | 1.3826 | 0.0596 | 1.3826 | 1.1759 |
| No log | 1.5714 | 154 | 1.4690 | 0.0603 | 1.4690 | 1.2120 |
| No log | 1.5918 | 156 | 1.0035 | 0.0730 | 1.0035 | 1.0018 |
| No log | 1.6122 | 158 | 0.9029 | 0.0517 | 0.9029 | 0.9502 |
| No log | 1.6327 | 160 | 1.0766 | -0.0133 | 1.0766 | 1.0376 |
| No log | 1.6531 | 162 | 0.8961 | 0.0424 | 0.8961 | 0.9466 |
| No log | 1.6735 | 164 | 0.8162 | 0.0488 | 0.8162 | 0.9034 |
| No log | 1.6939 | 166 | 1.1911 | 0.0542 | 1.1911 | 1.0914 |
| No log | 1.7143 | 168 | 1.5028 | 0.1131 | 1.5028 | 1.2259 |
| No log | 1.7347 | 170 | 1.2878 | 0.0780 | 1.2878 | 1.1348 |
| No log | 1.7551 | 172 | 0.9469 | 0.1196 | 0.9469 | 0.9731 |
| No log | 1.7755 | 174 | 0.9132 | 0.1020 | 0.9132 | 0.9556 |
| No log | 1.7959 | 176 | 1.0073 | 0.1014 | 1.0073 | 1.0036 |
| No log | 1.8163 | 178 | 1.2283 | 0.0277 | 1.2283 | 1.1083 |
| No log | 1.8367 | 180 | 1.2539 | 0.0296 | 1.2539 | 1.1198 |
| No log | 1.8571 | 182 | 1.0306 | 0.1334 | 1.0306 | 1.0152 |
| No log | 1.8776 | 184 | 0.8685 | 0.1696 | 0.8685 | 0.9319 |
| No log | 1.8980 | 186 | 0.8069 | 0.1961 | 0.8069 | 0.8983 |
| No log | 1.9184 | 188 | 0.8220 | 0.2892 | 0.8220 | 0.9066 |
| No log | 1.9388 | 190 | 0.9342 | 0.1321 | 0.9342 | 0.9665 |
| No log | 1.9592 | 192 | 0.9385 | 0.1581 | 0.9385 | 0.9688 |
| No log | 1.9796 | 194 | 0.8027 | 0.1739 | 0.8027 | 0.8960 |
| No log | 2.0 | 196 | 0.8832 | 0.0435 | 0.8832 | 0.9398 |
| No log | 2.0204 | 198 | 0.9433 | 0.0762 | 0.9433 | 0.9712 |
| No log | 2.0408 | 200 | 0.7918 | 0.0586 | 0.7918 | 0.8898 |
| No log | 2.0612 | 202 | 0.8423 | 0.1039 | 0.8423 | 0.9178 |
| No log | 2.0816 | 204 | 1.0490 | 0.0522 | 1.0490 | 1.0242 |
| No log | 2.1020 | 206 | 1.0552 | 0.0497 | 1.0552 | 1.0272 |
| No log | 2.1224 | 208 | 0.8795 | 0.1672 | 0.8795 | 0.9378 |
| No log | 2.1429 | 210 | 0.7837 | 0.0488 | 0.7837 | 0.8853 |
| No log | 2.1633 | 212 | 0.8068 | 0.0165 | 0.8068 | 0.8982 |
| No log | 2.1837 | 214 | 0.9800 | 0.1111 | 0.9800 | 0.9899 |
| No log | 2.2041 | 216 | 1.0120 | 0.0482 | 1.0120 | 1.0060 |
| No log | 2.2245 | 218 | 0.8922 | 0.1228 | 0.8922 | 0.9446 |
| No log | 2.2449 | 220 | 0.8927 | 0.1208 | 0.8927 | 0.9449 |
| No log | 2.2653 | 222 | 0.8693 | 0.1907 | 0.8693 | 0.9324 |
| No log | 2.2857 | 224 | 1.0518 | 0.1017 | 1.0518 | 1.0256 |
| No log | 2.3061 | 226 | 1.1423 | 0.1294 | 1.1423 | 1.0688 |
| No log | 2.3265 | 228 | 1.0390 | 0.0707 | 1.0390 | 1.0193 |
| No log | 2.3469 | 230 | 0.8955 | 0.1304 | 0.8955 | 0.9463 |
| No log | 2.3673 | 232 | 0.9811 | 0.1014 | 0.9811 | 0.9905 |
| No log | 2.3878 | 234 | 1.0109 | 0.1017 | 1.0109 | 1.0054 |
| No log | 2.4082 | 236 | 0.9715 | 0.0516 | 0.9715 | 0.9857 |
| No log | 2.4286 | 238 | 0.8075 | 0.0592 | 0.8075 | 0.8986 |
| No log | 2.4490 | 240 | 0.7681 | 0.1835 | 0.7681 | 0.8764 |
| No log | 2.4694 | 242 | 0.7892 | 0.2118 | 0.7892 | 0.8883 |
| No log | 2.4898 | 244 | 0.8750 | 0.0311 | 0.8750 | 0.9354 |
| No log | 2.5102 | 246 | 0.9347 | 0.0391 | 0.9347 | 0.9668 |
| No log | 2.5306 | 248 | 0.8586 | 0.0311 | 0.8586 | 0.9266 |
| No log | 2.5510 | 250 | 0.7791 | 0.0987 | 0.7791 | 0.8827 |
| No log | 2.5714 | 252 | 0.7331 | 0.1856 | 0.7331 | 0.8562 |
| No log | 2.5918 | 254 | 0.7446 | 0.1928 | 0.7446 | 0.8629 |
| No log | 2.6122 | 256 | 0.8312 | 0.0350 | 0.8312 | 0.9117 |
| No log | 2.6327 | 258 | 0.8766 | 0.0643 | 0.8766 | 0.9362 |
| No log | 2.6531 | 260 | 0.9635 | 0.1146 | 0.9635 | 0.9816 |
| No log | 2.6735 | 262 | 1.0298 | 0.0767 | 1.0298 | 1.0148 |
| No log | 2.6939 | 264 | 0.9685 | 0.0789 | 0.9685 | 0.9841 |
| No log | 2.7143 | 266 | 0.9279 | 0.0154 | 0.9279 | 0.9633 |
| No log | 2.7347 | 268 | 0.8431 | 0.1079 | 0.8431 | 0.9182 |
| No log | 2.7551 | 270 | 0.7329 | 0.1081 | 0.7329 | 0.8561 |
| No log | 2.7755 | 272 | 0.7393 | 0.0814 | 0.7393 | 0.8598 |
| No log | 2.7959 | 274 | 0.7685 | 0.0814 | 0.7685 | 0.8766 |
| No log | 2.8163 | 276 | 0.7879 | 0.0432 | 0.7879 | 0.8877 |
| No log | 2.8367 | 278 | 0.9287 | -0.0036 | 0.9287 | 0.9637 |
| No log | 2.8571 | 280 | 0.9941 | 0.0701 | 0.9941 | 0.9970 |
| No log | 2.8776 | 282 | 0.9918 | 0.0326 | 0.9918 | 0.9959 |
| No log | 2.8980 | 284 | 0.9321 | -0.0142 | 0.9321 | 0.9655 |
| No log | 2.9184 | 286 | 1.0294 | 0.0083 | 1.0294 | 1.0146 |
| No log | 2.9388 | 288 | 1.4105 | 0.0058 | 1.4105 | 1.1876 |
| No log | 2.9592 | 290 | 1.3782 | -0.0191 | 1.3782 | 1.1740 |
| No log | 2.9796 | 292 | 1.1383 | -0.0504 | 1.1383 | 1.0669 |
| No log | 3.0 | 294 | 0.8843 | -0.0226 | 0.8843 | 0.9404 |
| No log | 3.0204 | 296 | 0.8786 | -0.0226 | 0.8786 | 0.9373 |
| No log | 3.0408 | 298 | 0.9931 | 0.0028 | 0.9931 | 0.9965 |
| No log | 3.0612 | 300 | 1.2189 | -0.0077 | 1.2189 | 1.1040 |
| No log | 3.0816 | 302 | 1.2128 | -0.0077 | 1.2128 | 1.1013 |
| No log | 3.1020 | 304 | 1.0070 | 0.0065 | 1.0070 | 1.0035 |
| No log | 3.1224 | 306 | 0.9335 | 0.0185 | 0.9335 | 0.9662 |
| No log | 3.1429 | 308 | 1.0715 | -0.0181 | 1.0715 | 1.0351 |
| No log | 3.1633 | 310 | 1.3956 | 0.0260 | 1.3956 | 1.1813 |
| No log | 3.1837 | 312 | 1.3196 | 0.0260 | 1.3196 | 1.1487 |
| No log | 3.2041 | 314 | 0.9987 | -0.0595 | 0.9987 | 0.9994 |
| No log | 3.2245 | 316 | 0.8371 | -0.0280 | 0.8371 | 0.9149 |
| No log | 3.2449 | 318 | 0.8001 | 0.0503 | 0.8001 | 0.8945 |
| No log | 3.2653 | 320 | 0.8780 | -0.1051 | 0.8780 | 0.9370 |
| No log | 3.2857 | 322 | 1.1622 | 0.0175 | 1.1622 | 1.0781 |
| No log | 3.3061 | 324 | 1.4210 | 0.0308 | 1.4210 | 1.1920 |
| No log | 3.3265 | 326 | 1.3406 | 0.0328 | 1.3406 | 1.1578 |
| No log | 3.3469 | 328 | 1.0510 | 0.0164 | 1.0510 | 1.0252 |
| No log | 3.3673 | 330 | 0.8093 | -0.0786 | 0.8093 | 0.8996 |
| No log | 3.3878 | 332 | 0.7727 | 0.1259 | 0.7727 | 0.8790 |
| No log | 3.4082 | 334 | 0.9681 | -0.0518 | 0.9681 | 0.9839 |
| No log | 3.4286 | 336 | 0.9802 | -0.0518 | 0.9802 | 0.9901 |
| No log | 3.4490 | 338 | 0.8280 | 0.0549 | 0.8280 | 0.9099 |
| No log | 3.4694 | 340 | 0.7879 | 0.0926 | 0.7879 | 0.8876 |
| No log | 3.4898 | 342 | 0.9878 | -0.1152 | 0.9878 | 0.9939 |
| No log | 3.5102 | 344 | 1.1507 | 0.0472 | 1.1507 | 1.0727 |
| No log | 3.5306 | 346 | 1.1147 | 0.0659 | 1.1147 | 1.0558 |
| No log | 3.5510 | 348 | 0.9627 | 0.0093 | 0.9627 | 0.9812 |
| No log | 3.5714 | 350 | 0.9199 | 0.0392 | 0.9199 | 0.9591 |
| No log | 3.5918 | 352 | 0.9114 | 0.0145 | 0.9114 | 0.9547 |
| No log | 3.6122 | 354 | 1.0939 | 0.0149 | 1.0939 | 1.0459 |
| No log | 3.6327 | 356 | 1.2140 | -0.0614 | 1.2140 | 1.1018 |
| No log | 3.6531 | 358 | 1.1000 | 0.0138 | 1.1000 | 1.0488 |
| No log | 3.6735 | 360 | 0.9167 | 0.0296 | 0.9167 | 0.9574 |
| No log | 3.6939 | 362 | 0.8394 | 0.0119 | 0.8394 | 0.9162 |
| No log | 3.7143 | 364 | 0.8324 | -0.0647 | 0.8324 | 0.9124 |
| No log | 3.7347 | 366 | 0.9147 | 0.0680 | 0.9147 | 0.9564 |
| No log | 3.7551 | 368 | 1.0408 | 0.0159 | 1.0408 | 1.0202 |
| No log | 3.7755 | 370 | 1.0326 | 0.0175 | 1.0326 | 1.0162 |
| No log | 3.7959 | 372 | 0.9615 | -0.0661 | 0.9615 | 0.9806 |
| No log | 3.8163 | 374 | 0.8910 | -0.1045 | 0.8910 | 0.9439 |
| No log | 3.8367 | 376 | 0.8673 | -0.0314 | 0.8673 | 0.9313 |
| No log | 3.8571 | 378 | 0.8957 | -0.1412 | 0.8957 | 0.9464 |
| No log | 3.8776 | 380 | 0.9112 | -0.2107 | 0.9112 | 0.9546 |
| No log | 3.8980 | 382 | 0.9022 | -0.2107 | 0.9022 | 0.9498 |
| No log | 3.9184 | 384 | 0.8165 | -0.1268 | 0.8165 | 0.9036 |
| No log | 3.9388 | 386 | 0.7738 | -0.0578 | 0.7738 | 0.8796 |
| No log | 3.9592 | 388 | 0.7682 | -0.0578 | 0.7682 | 0.8765 |
| No log | 3.9796 | 390 | 0.7854 | -0.0449 | 0.7854 | 0.8862 |
| No log | 4.0 | 392 | 0.8152 | -0.0406 | 0.8152 | 0.9029 |
| No log | 4.0204 | 394 | 0.8485 | -0.0226 | 0.8485 | 0.9211 |
| No log | 4.0408 | 396 | 0.8568 | -0.0620 | 0.8568 | 0.9257 |
| No log | 4.0612 | 398 | 0.8925 | -0.1103 | 0.8925 | 0.9447 |
| No log | 4.0816 | 400 | 0.8440 | -0.0572 | 0.8440 | 0.9187 |
| No log | 4.1020 | 402 | 0.8532 | -0.0517 | 0.8532 | 0.9237 |
| No log | 4.1224 | 404 | 0.9543 | -0.0638 | 0.9543 | 0.9769 |
| No log | 4.1429 | 406 | 1.1699 | -0.0023 | 1.1699 | 1.0816 |
| No log | 4.1633 | 408 | 1.1696 | 0.0247 | 1.1696 | 1.0815 |
| No log | 4.1837 | 410 | 1.0133 | -0.0606 | 1.0133 | 1.0066 |
| No log | 4.2041 | 412 | 0.9228 | -0.0103 | 0.9228 | 0.9606 |
| No log | 4.2245 | 414 | 0.8584 | 0.0123 | 0.8584 | 0.9265 |
| No log | 4.2449 | 416 | 0.9174 | -0.0059 | 0.9174 | 0.9578 |
| No log | 4.2653 | 418 | 1.1017 | -0.0096 | 1.1017 | 1.0496 |
| No log | 4.2857 | 420 | 1.1275 | 0.0247 | 1.1275 | 1.0618 |
| No log | 4.3061 | 422 | 0.9379 | -0.1103 | 0.9379 | 0.9684 |
| No log | 4.3265 | 424 | 0.7870 | 0.0518 | 0.7870 | 0.8871 |
| No log | 4.3469 | 426 | 0.7912 | 0.0983 | 0.7912 | 0.8895 |
| No log | 4.3673 | 428 | 0.8301 | -0.0248 | 0.8301 | 0.9111 |
| No log | 4.3878 | 430 | 0.9783 | -0.0245 | 0.9783 | 0.9891 |
| No log | 4.4082 | 432 | 0.9750 | -0.0545 | 0.9750 | 0.9874 |
| No log | 4.4286 | 434 | 0.9842 | -0.0245 | 0.9842 | 0.9921 |
| No log | 4.4490 | 436 | 1.0177 | -0.0181 | 1.0177 | 1.0088 |
| No log | 4.4694 | 438 | 0.9446 | -0.0672 | 0.9446 | 0.9719 |
| No log | 4.4898 | 440 | 0.8824 | -0.0949 | 0.8824 | 0.9393 |
| No log | 4.5102 | 442 | 0.8962 | -0.1249 | 0.8962 | 0.9467 |
| No log | 4.5306 | 444 | 0.9017 | -0.1249 | 0.9017 | 0.9496 |
| No log | 4.5510 | 446 | 0.8941 | -0.0473 | 0.8941 | 0.9456 |
| No log | 4.5714 | 448 | 0.8337 | 0.0449 | 0.8337 | 0.9131 |
| No log | 4.5918 | 450 | 0.8596 | 0.0393 | 0.8596 | 0.9272 |
| No log | 4.6122 | 452 | 0.9129 | 0.0189 | 0.9129 | 0.9555 |
| No log | 4.6327 | 454 | 0.9853 | -0.0353 | 0.9853 | 0.9926 |
| No log | 4.6531 | 456 | 0.9686 | -0.0672 | 0.9686 | 0.9842 |
| No log | 4.6735 | 458 | 0.9914 | -0.0563 | 0.9914 | 0.9957 |
| No log | 4.6939 | 460 | 0.8870 | -0.0458 | 0.8870 | 0.9418 |
| No log | 4.7143 | 462 | 0.7987 | 0.1304 | 0.7987 | 0.8937 |
| No log | 4.7347 | 464 | 0.8613 | 0.0438 | 0.8613 | 0.9280 |
| No log | 4.7551 | 466 | 0.8439 | 0.0956 | 0.8439 | 0.9186 |
| No log | 4.7755 | 468 | 0.7946 | 0.0869 | 0.7946 | 0.8914 |
| No log | 4.7959 | 470 | 0.8750 | -0.0852 | 0.8750 | 0.9354 |
| No log | 4.8163 | 472 | 0.9084 | -0.0811 | 0.9084 | 0.9531 |
| No log | 4.8367 | 474 | 0.8329 | -0.1979 | 0.8329 | 0.9126 |
| No log | 4.8571 | 476 | 0.7900 | 0.0828 | 0.7900 | 0.8888 |
| No log | 4.8776 | 478 | 0.8133 | -0.0444 | 0.8133 | 0.9019 |
| No log | 4.8980 | 480 | 0.9362 | -0.0036 | 0.9362 | 0.9676 |
| No log | 4.9184 | 482 | 1.1817 | -0.0041 | 1.1817 | 1.0871 |
| No log | 4.9388 | 484 | 1.2091 | -0.0041 | 1.2091 | 1.0996 |
| No log | 4.9592 | 486 | 1.0103 | -0.0164 | 1.0103 | 1.0051 |
| No log | 4.9796 | 488 | 0.7908 | -0.1331 | 0.7908 | 0.8893 |
| No log | 5.0 | 490 | 0.7490 | 0.1740 | 0.7490 | 0.8654 |
| No log | 5.0204 | 492 | 0.7737 | 0.1148 | 0.7737 | 0.8796 |
| No log | 5.0408 | 494 | 0.7563 | 0.1259 | 0.7563 | 0.8697 |
| No log | 5.0612 | 496 | 0.7590 | 0.0338 | 0.7590 | 0.8712 |
| No log | 5.0816 | 498 | 0.7699 | 0.0414 | 0.7699 | 0.8774 |
| 0.3636 | 5.1020 | 500 | 0.7814 | 0.0414 | 0.7814 | 0.8839 |
| 0.3636 | 5.1224 | 502 | 0.8000 | -0.0030 | 0.8000 | 0.8944 |
| 0.3636 | 5.1429 | 504 | 0.7694 | 0.0375 | 0.7694 | 0.8772 |
| 0.3636 | 5.1633 | 506 | 0.7793 | 0.0723 | 0.7793 | 0.8828 |
| 0.3636 | 5.1837 | 508 | 0.7816 | 0.0714 | 0.7816 | 0.8841 |
| 0.3636 | 5.2041 | 510 | 0.7720 | 0.0414 | 0.7720 | 0.8787 |
| 0.3636 | 5.2245 | 512 | 0.8359 | -0.1266 | 0.8359 | 0.9143 |
| 0.3636 | 5.2449 | 514 | 0.8905 | -0.1099 | 0.8905 | 0.9436 |
| 0.3636 | 5.2653 | 516 | 0.8801 | -0.1527 | 0.8801 | 0.9381 |
| 0.3636 | 5.2857 | 518 | 0.8082 | -0.1398 | 0.8082 | 0.8990 |

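The validation columns above are the evaluation loss (equal to Mse, since the head is a regressor), Qwk (presumably quadratic weighted kappa on discretised scores), Mse and Rmse. Below is a hedged sketch of how such metrics can be computed; the rounding used to discretise the continuous predictions for kappa is an assumption, not taken from the original training code.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def regression_metrics(preds, labels):
    preds = np.asarray(preds, dtype=float)
    labels = np.asarray(labels, dtype=float)
    mse = mean_squared_error(labels, preds)
    rmse = np.sqrt(mse)
    # Quadratic weighted kappa needs discrete ratings, so round the
    # continuous predictions to the nearest integer score (assumption).
    qwk = cohen_kappa_score(np.rint(labels).astype(int),
                            np.rint(preds).astype(int),
                            weights="quadratic")
    return {"qwk": qwk, "mse": mse, "rmse": rmse}

print(regression_metrics([1.2, 2.8, 3.1], [1, 3, 3]))
```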

### Framework versions

- Transformers 4.44.2
- Pytorch 2.4.0+cu118
- Datasets 2.21.0
- Tokenizers 0.19.1
config.json ADDED
{
  "_name_or_path": "aubmindlab/bert-base-arabertv02",
  "architectures": [
    "BertForSequenceClassification"
  ],
  "attention_probs_dropout_prob": 0.1,
  "classifier_dropout": null,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.1,
  "hidden_size": 768,
  "id2label": {
    "0": "LABEL_0"
  },
  "initializer_range": 0.02,
  "intermediate_size": 3072,
  "label2id": {
    "LABEL_0": 0
  },
  "layer_norm_eps": 1e-12,
  "max_position_embeddings": 512,
  "model_type": "bert",
  "num_attention_heads": 12,
  "num_hidden_layers": 12,
  "pad_token_id": 0,
  "position_embedding_type": "absolute",
  "problem_type": "regression",
  "torch_dtype": "float32",
  "transformers_version": "4.44.2",
  "type_vocab_size": 2,
  "use_cache": true,
  "vocab_size": 64000
}
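The configuration declares a single label with `problem_type: "regression"`, which is what makes `BertForSequenceClassification` emit one continuous score and train with an MSE loss. A small self-contained sketch of that behaviour (randomly initialised weights, illustrative only, not the fine-tuned checkpoint):

```python
import torch
from transformers import BertConfig, BertForSequenceClassification

# Mirror the relevant fields of the config above; weights are random, so this
# only illustrates the head's output shape and loss, not the trained model.
config = BertConfig(vocab_size=64000, num_labels=1, problem_type="regression")
model = BertForSequenceClassification(config)

input_ids = torch.randint(0, config.vocab_size, (2, 16))  # two dummy token sequences
labels = torch.tensor([2.0, 3.0])                         # continuous targets
out = model(input_ids=input_ids, labels=labels)
print(out.logits.shape)  # torch.Size([2, 1]): one score per sequence
print(out.loss)          # MSE loss between the squeezed logits and the labels
```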
model.safetensors ADDED
version https://git-lfs.github.com/spec/v1
oid sha256:285410c4c9cdb7d7f357b0617b40108d6f096a795473bca45138625d41c79f8d
size 540799996
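The weight file (~540 MB) is stored as a Git LFS object, so the entry above is only a pointer stub. A minimal sketch for fetching the actual file with `huggingface_hub`; the repo id is an assumption derived from the model name:

```python
from huggingface_hub import hf_hub_download

# Assumed repo id; replace with the actual namespace/name if it differs.
path = hf_hub_download(
    repo_id="MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k19_task3_organization",
    filename="model.safetensors",
)
print(path)  # local cache path of the downloaded safetensors file
```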
training_args.bin ADDED
version https://git-lfs.github.com/spec/v1
oid sha256:c276dbea149f68fa678edc8c6b17d30aaa8125eb0853afa8e61a1b00fb3398bf
size 5304