ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k1_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.2396
  • Qwk: 0.4878
  • Mse: 1.2396
  • Rmse: 1.1134
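
Qwk here is the quadratic weighted kappa (Cohen's kappa with quadratic weights), and Rmse is the square root of Mse; note that the reported Loss equals the Mse, which suggests a mean-squared-error training objective. A minimal sketch of how these metrics can be reproduced with scikit-learn and NumPy (the score arrays below are placeholders, not the actual evaluation data):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Placeholder arrays standing in for gold and predicted essay-organization
# scores; the real evaluation data is not published with this card.
y_true = np.array([2, 3, 1, 4, 2, 0])
y_pred = np.array([2, 2, 1, 3, 3, 1])

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
print(f"Qwk: {qwk:.4f}  Mse: {mse:.4f}  Rmse: {rmse:.4f}")
```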

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
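
These settings map directly onto Hugging Face TrainingArguments. A sketch of an equivalent configuration (the output directory is a hypothetical placeholder; other details of the original training script are not documented):

```python
from transformers import TrainingArguments

# Sketch reproducing the listed hyperparameters; output_dir is hypothetical.
training_args = TrainingArguments(
    output_dir="arabert-task1-organization",  # placeholder, not the original path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```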

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.5 2 5.6252 0.1306 5.6252 2.3718
No log 1.0 4 3.3807 0.1667 3.3807 1.8387
No log 1.5 6 3.4798 0.0440 3.4798 1.8654
No log 2.0 8 2.4086 0.0462 2.4086 1.5520
No log 2.5 10 1.6055 0.2314 1.6055 1.2671
No log 3.0 12 1.5944 0.3226 1.5944 1.2627
No log 3.5 14 1.4046 0.3636 1.4046 1.1852
No log 4.0 16 1.4824 0.3273 1.4824 1.2176
No log 4.5 18 1.6225 0.2727 1.6225 1.2738
No log 5.0 20 1.6785 0.2243 1.6785 1.2956
No log 5.5 22 1.5509 0.3333 1.5509 1.2453
No log 6.0 24 1.5289 0.3590 1.5289 1.2365
No log 6.5 26 1.4958 0.2832 1.4958 1.2230
No log 7.0 28 1.4903 0.2222 1.4903 1.2208
No log 7.5 30 1.5937 0.2727 1.5937 1.2624
No log 8.0 32 1.4418 0.4138 1.4418 1.2007
No log 8.5 34 1.2709 0.4138 1.2709 1.1274
No log 9.0 36 1.2399 0.4667 1.2399 1.1135
No log 9.5 38 1.3255 0.4878 1.3255 1.1513
No log 10.0 40 1.4004 0.3419 1.4004 1.1834
No log 10.5 42 1.4805 0.3063 1.4805 1.2168
No log 11.0 44 1.4006 0.3478 1.4006 1.1835
No log 11.5 46 1.3231 0.3966 1.3231 1.1503
No log 12.0 48 1.2936 0.4628 1.2936 1.1374
No log 12.5 50 1.3307 0.3548 1.3307 1.1536
No log 13.0 52 1.3646 0.3333 1.3646 1.1682
No log 13.5 54 1.3762 0.3860 1.3762 1.1731
No log 14.0 56 1.2775 0.4348 1.2775 1.1303
No log 14.5 58 1.2391 0.4310 1.2391 1.1131
No log 15.0 60 1.2730 0.4576 1.2730 1.1283
No log 15.5 62 1.2301 0.4444 1.2301 1.1091
No log 16.0 64 1.2338 0.4561 1.2338 1.1108
No log 16.5 66 1.3113 0.4386 1.3113 1.1451
No log 17.0 68 1.2967 0.4696 1.2967 1.1387
No log 17.5 70 1.2399 0.5574 1.2399 1.1135
No log 18.0 72 1.1494 0.5323 1.1494 1.0721
No log 18.5 74 1.1259 0.5366 1.1259 1.0611
No log 19.0 76 1.1146 0.5397 1.1146 1.0558
No log 19.5 78 1.1460 0.5669 1.1460 1.0705
No log 20.0 80 1.2031 0.5669 1.2031 1.0969
No log 20.5 82 1.2098 0.528 1.2098 1.0999
No log 21.0 84 1.2279 0.5669 1.2279 1.1081
No log 21.5 86 1.2065 0.5827 1.2065 1.0984
No log 22.0 88 1.2249 0.5512 1.2249 1.1067
No log 22.5 90 1.2381 0.5827 1.2381 1.1127
No log 23.0 92 1.2412 0.5167 1.2412 1.1141
No log 23.5 94 1.2746 0.3604 1.2746 1.1290
No log 24.0 96 1.2844 0.375 1.2844 1.1333
No log 24.5 98 1.2651 0.4310 1.2651 1.1248
No log 25.0 100 1.3053 0.4677 1.3053 1.1425
No log 25.5 102 1.3902 0.4496 1.3902 1.1791
No log 26.0 104 1.3694 0.4496 1.3694 1.1702
No log 26.5 106 1.2779 0.4640 1.2779 1.1305
No log 27.0 108 1.2839 0.4576 1.2839 1.1331
No log 27.5 110 1.2863 0.4274 1.2863 1.1341
No log 28.0 112 1.2331 0.4833 1.2331 1.1104
No log 28.5 114 1.1793 0.512 1.1793 1.0860
No log 29.0 116 1.2114 0.5231 1.2114 1.1006
No log 29.5 118 1.2108 0.5426 1.2108 1.1004
No log 30.0 120 1.2409 0.5714 1.2409 1.1139
No log 30.5 122 1.2353 0.4538 1.2353 1.1114
No log 31.0 124 1.2565 0.4407 1.2565 1.1209
No log 31.5 126 1.2662 0.4576 1.2662 1.1253
No log 32.0 128 1.3135 0.4833 1.3135 1.1461
No log 32.5 130 1.3370 0.4667 1.3370 1.1563
No log 33.0 132 1.2492 0.5246 1.2492 1.1177
No log 33.5 134 1.1518 0.544 1.1518 1.0732
No log 34.0 136 1.1100 0.5 1.1100 1.0535
No log 34.5 138 1.1010 0.5082 1.1010 1.0493
No log 35.0 140 1.0934 0.496 1.0934 1.0457
No log 35.5 142 1.1680 0.5736 1.1680 1.0807
No log 36.0 144 1.2408 0.5312 1.2408 1.1139
No log 36.5 146 1.3105 0.5 1.3105 1.1448
No log 37.0 148 1.3443 0.5 1.3443 1.1594
No log 37.5 150 1.2811 0.5385 1.2811 1.1319
No log 38.0 152 1.1637 0.5692 1.1637 1.0787
No log 38.5 154 1.0743 0.5781 1.0743 1.0365
No log 39.0 156 1.0397 0.5781 1.0397 1.0197
No log 39.5 158 1.0621 0.5846 1.0621 1.0306
No log 40.0 160 1.0823 0.5938 1.0823 1.0404
No log 40.5 162 1.1186 0.5938 1.1186 1.0576
No log 41.0 164 1.1197 0.5984 1.1197 1.0582
No log 41.5 166 1.1330 0.5538 1.1330 1.0644
No log 42.0 168 1.1154 0.5827 1.1154 1.0561
No log 42.5 170 1.1310 0.5556 1.1310 1.0635
No log 43.0 172 1.1718 0.5041 1.1718 1.0825
No log 43.5 174 1.1747 0.544 1.1747 1.0838
No log 44.0 176 1.2106 0.56 1.2106 1.1003
No log 44.5 178 1.2401 0.5528 1.2401 1.1136
No log 45.0 180 1.2525 0.5528 1.2525 1.1191
No log 45.5 182 1.2113 0.5484 1.2113 1.1006
No log 46.0 184 1.1603 0.5082 1.1603 1.0772
No log 46.5 186 1.1315 0.4333 1.1315 1.0637
No log 47.0 188 1.1155 0.5041 1.1155 1.0562
No log 47.5 190 1.1367 0.5556 1.1367 1.0662
No log 48.0 192 1.2453 0.5197 1.2453 1.1159
No log 48.5 194 1.3105 0.4697 1.3105 1.1448
No log 49.0 196 1.3095 0.4662 1.3095 1.1443
No log 49.5 198 1.2975 0.4733 1.2975 1.1391
No log 50.0 200 1.2273 0.5079 1.2273 1.1079
No log 50.5 202 1.1865 0.6032 1.1865 1.0893
No log 51.0 204 1.1844 0.576 1.1844 1.0883
No log 51.5 206 1.1913 0.576 1.1913 1.0915
No log 52.0 208 1.2184 0.5645 1.2184 1.1038
No log 52.5 210 1.2323 0.5645 1.2323 1.1101
No log 53.0 212 1.2454 0.544 1.2454 1.1160
No log 53.5 214 1.2460 0.5246 1.2460 1.1162
No log 54.0 216 1.2373 0.4874 1.2373 1.1124
No log 54.5 218 1.2291 0.4538 1.2291 1.1086
No log 55.0 220 1.2581 0.4576 1.2581 1.1216
No log 55.5 222 1.2809 0.4274 1.2809 1.1318
No log 56.0 224 1.2858 0.4407 1.2858 1.1339
No log 56.5 226 1.2929 0.528 1.2929 1.1371
No log 57.0 228 1.2858 0.528 1.2858 1.1339
No log 57.5 230 1.2721 0.528 1.2721 1.1279
No log 58.0 232 1.2687 0.528 1.2687 1.1263
No log 58.5 234 1.2553 0.5041 1.2553 1.1204
No log 59.0 236 1.2258 0.5082 1.2258 1.1071
No log 59.5 238 1.2095 0.4793 1.2095 1.0998
No log 60.0 240 1.2071 0.4918 1.2071 1.0987
No log 60.5 242 1.2066 0.4500 1.2066 1.0985
No log 61.0 244 1.2157 0.4202 1.2157 1.1026
No log 61.5 246 1.2428 0.4370 1.2428 1.1148
No log 62.0 248 1.2778 0.4237 1.2777 1.1304
No log 62.5 250 1.2802 0.4274 1.2802 1.1315
No log 63.0 252 1.2618 0.4407 1.2618 1.1233
No log 63.5 254 1.2366 0.4706 1.2366 1.1120
No log 64.0 256 1.2192 0.4959 1.2192 1.1042
No log 64.5 258 1.1971 0.4918 1.1971 1.0941
No log 65.0 260 1.1786 0.4878 1.1786 1.0856
No log 65.5 262 1.1971 0.5203 1.1971 1.0941
No log 66.0 264 1.2277 0.5323 1.2277 1.1080
No log 66.5 266 1.2423 0.5238 1.2423 1.1146
No log 67.0 268 1.2468 0.512 1.2468 1.1166
No log 67.5 270 1.2185 0.5366 1.2185 1.1039
No log 68.0 272 1.1761 0.5203 1.1761 1.0845
No log 68.5 274 1.1532 0.5203 1.1532 1.0739
No log 69.0 276 1.1399 0.5161 1.1399 1.0677
No log 69.5 278 1.1360 0.5161 1.1360 1.0658
No log 70.0 280 1.1397 0.5203 1.1397 1.0676
No log 70.5 282 1.1534 0.5203 1.1534 1.0740
No log 71.0 284 1.1613 0.5203 1.1613 1.0777
No log 71.5 286 1.1943 0.4959 1.1943 1.0928
No log 72.0 288 1.2389 0.5246 1.2389 1.1131
No log 72.5 290 1.2683 0.5 1.2683 1.1262
No log 73.0 292 1.2686 0.4715 1.2686 1.1263
No log 73.5 294 1.2673 0.4754 1.2673 1.1258
No log 74.0 296 1.2666 0.4959 1.2666 1.1254
No log 74.5 298 1.2506 0.4833 1.2506 1.1183
No log 75.0 300 1.2363 0.4576 1.2363 1.1119
No log 75.5 302 1.2410 0.4538 1.2410 1.1140
No log 76.0 304 1.2528 0.4576 1.2528 1.1193
No log 76.5 306 1.2605 0.4274 1.2605 1.1227
No log 77.0 308 1.2698 0.4444 1.2698 1.1269
No log 77.5 310 1.2674 0.4576 1.2674 1.1258
No log 78.0 312 1.2765 0.4833 1.2765 1.1298
No log 78.5 314 1.3051 0.4833 1.3051 1.1424
No log 79.0 316 1.3329 0.4333 1.3329 1.1545
No log 79.5 318 1.3381 0.4628 1.3381 1.1568
No log 80.0 320 1.3405 0.4590 1.3405 1.1578
No log 80.5 322 1.3331 0.4878 1.3331 1.1546
No log 81.0 324 1.3162 0.4878 1.3162 1.1472
No log 81.5 326 1.2878 0.4754 1.2878 1.1348
No log 82.0 328 1.2678 0.5041 1.2678 1.1260
No log 82.5 330 1.2498 0.5246 1.2498 1.1180
No log 83.0 332 1.2452 0.5246 1.2452 1.1159
No log 83.5 334 1.2430 0.5124 1.2430 1.1149
No log 84.0 336 1.2449 0.4833 1.2449 1.1158
No log 84.5 338 1.2533 0.4833 1.2533 1.1195
No log 85.0 340 1.2549 0.5124 1.2549 1.1202
No log 85.5 342 1.2549 0.5203 1.2549 1.1202
No log 86.0 344 1.2412 0.5124 1.2412 1.1141
No log 86.5 346 1.2375 0.5124 1.2375 1.1124
No log 87.0 348 1.2376 0.5124 1.2376 1.1125
No log 87.5 350 1.2396 0.4833 1.2396 1.1134
No log 88.0 352 1.2462 0.4833 1.2462 1.1163
No log 88.5 354 1.2528 0.4833 1.2528 1.1193
No log 89.0 356 1.2570 0.4918 1.2570 1.1211
No log 89.5 358 1.2527 0.4918 1.2527 1.1192
No log 90.0 360 1.2464 0.4833 1.2464 1.1164
No log 90.5 362 1.2377 0.4833 1.2377 1.1125
No log 91.0 364 1.2301 0.4833 1.2301 1.1091
No log 91.5 366 1.2256 0.4833 1.2256 1.1071
No log 92.0 368 1.2272 0.4833 1.2272 1.1078
No log 92.5 370 1.2283 0.4833 1.2283 1.1083
No log 93.0 372 1.2304 0.4833 1.2304 1.1092
No log 93.5 374 1.2337 0.5124 1.2337 1.1107
No log 94.0 376 1.2340 0.5124 1.2340 1.1109
No log 94.5 378 1.2368 0.4918 1.2368 1.1121
No log 95.0 380 1.2405 0.4918 1.2405 1.1138
No log 95.5 382 1.2438 0.4918 1.2438 1.1152
No log 96.0 384 1.2437 0.4918 1.2437 1.1152
No log 96.5 386 1.2422 0.4878 1.2422 1.1145
No log 97.0 388 1.2410 0.4878 1.2410 1.1140
No log 97.5 390 1.2405 0.4878 1.2405 1.1138
No log 98.0 392 1.2389 0.4878 1.2389 1.1131
No log 98.5 394 1.2393 0.4878 1.2393 1.1132
No log 99.0 396 1.2401 0.4878 1.2401 1.1136
No log 99.5 398 1.2399 0.4878 1.2399 1.1135
No log 100.0 400 1.2396 0.4878 1.2396 1.1134

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
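
With the versions above installed, the checkpoint can be loaded along these lines. AutoModelForSequenceClassification is an assumption, since the card does not state the head type used for the organization-scoring task:

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k1_task1_organization"

# Head type is assumed; adjust if the checkpoint uses a regression head.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

inputs = tokenizer("نص المقال هنا", return_tensors="pt")  # placeholder Arabic essay text
print(model(**inputs).logits)
```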

Model size

  • 0.1B params
  • Tensor type: F32 (Safetensors)