ArabicNewSplits6_WithDuplicationsForScore5_FineTuningAraBERT_run2_AugV5_k10_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8533
  • Qwk (quadratic weighted kappa): 0.2469
  • Mse (mean squared error): 0.8533
  • Rmse (root mean squared error): 0.9238
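These metrics can be reproduced with scikit-learn; the labels below are a made-up illustration, not data from this model's evaluation set.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Hypothetical ordinal gold scores and model predictions for illustration
y_true = np.array([0, 1, 2, 2, 3, 1])
y_pred = np.array([0, 2, 2, 1, 3, 1])

# Qwk: quadratic weighted kappa, agreement weighted by squared distance
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
# Mse and Rmse as reported in the card
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
print(f"Qwk={qwk:.4f} Mse={mse:.4f} Rmse={rmse:.4f}")
```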

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10

Training results

Training loss is logged every 500 steps, so rows before step 500 show "No log" in the first column.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0385 2 3.1463 -0.0216 3.1463 1.7738
No log 0.0769 4 1.6608 -0.0101 1.6608 1.2887
No log 0.1154 6 1.1450 0.0588 1.1450 1.0701
No log 0.1538 8 0.6587 0.1195 0.6587 0.8116
No log 0.1923 10 0.5719 0.0222 0.5719 0.7562
No log 0.2308 12 0.5860 0.0222 0.5860 0.7655
No log 0.2692 14 0.6120 0.1895 0.6120 0.7823
No log 0.3077 16 0.8515 0.1930 0.8515 0.9227
No log 0.3462 18 0.5583 0.1884 0.5583 0.7472
No log 0.3846 20 0.5354 0.0569 0.5354 0.7317
No log 0.4231 22 0.5456 0.2248 0.5456 0.7386
No log 0.4615 24 0.5612 0.0 0.5612 0.7491
No log 0.5 26 0.6054 0.0 0.6054 0.7781
No log 0.5385 28 0.7063 0.0 0.7063 0.8404
No log 0.5769 30 0.6868 0.0 0.6868 0.8287
No log 0.6154 32 0.5966 0.1467 0.5966 0.7724
No log 0.6538 34 0.9207 0.1392 0.9207 0.9595
No log 0.6923 36 1.0516 0.0244 1.0516 1.0255
No log 0.7308 38 0.8226 0.1861 0.8226 0.9070
No log 0.7692 40 0.5883 0.1373 0.5883 0.7670
No log 0.8077 42 0.6617 0.0933 0.6617 0.8134
No log 0.8462 44 0.8299 0.1220 0.8299 0.9110
No log 0.8846 46 0.7333 0.1220 0.7333 0.8563
No log 0.9231 48 0.6246 0.1020 0.6246 0.7903
No log 0.9615 50 0.8180 0.1179 0.8180 0.9045
No log 1.0 52 0.9683 0.0933 0.9683 0.9840
No log 1.0385 54 0.9440 0.1765 0.9440 0.9716
No log 1.0769 56 0.7461 0.0659 0.7461 0.8638
No log 1.1154 58 0.7885 0.0955 0.7885 0.8880
No log 1.1538 60 0.8324 0.0980 0.8324 0.9123
No log 1.1923 62 0.6825 0.0692 0.6825 0.8262
No log 1.2308 64 0.6694 0.2281 0.6694 0.8182
No log 1.2692 66 1.1070 0.0769 1.1070 1.0521
No log 1.3077 68 1.1982 0.0522 1.1982 1.0946
No log 1.3462 70 0.8480 0.0991 0.8480 0.9209
No log 1.3846 72 0.5681 0.1895 0.5681 0.7537
No log 1.4231 74 0.6353 0.1304 0.6353 0.7970
No log 1.4615 76 0.7238 0.0703 0.7238 0.8508
No log 1.5 78 0.5891 0.0886 0.5891 0.7676
No log 1.5385 80 0.5347 0.1592 0.5347 0.7312
No log 1.5769 82 0.5302 0.3939 0.5302 0.7281
No log 1.6154 84 0.5359 0.4217 0.5359 0.7320
No log 1.6538 86 0.5609 0.3446 0.5609 0.7489
No log 1.6923 88 0.5774 0.1807 0.5774 0.7599
No log 1.7308 90 0.9312 0.1588 0.9312 0.9650
No log 1.7692 92 1.3366 0.0604 1.3366 1.1561
No log 1.8077 94 1.2462 -0.0070 1.2462 1.1163
No log 1.8462 96 0.8406 0.1543 0.8406 0.9168
No log 1.8846 98 0.8868 0.2000 0.8868 0.9417
No log 1.9231 100 1.4649 0.0609 1.4649 1.2103
No log 1.9615 102 1.4825 0.0843 1.4825 1.2176
No log 2.0 104 0.9842 0.2615 0.9842 0.9921
No log 2.0385 106 0.7578 0.1852 0.7578 0.8705
No log 2.0769 108 0.7276 0.2233 0.7276 0.8530
No log 2.1154 110 0.7504 0.2300 0.7504 0.8662
No log 2.1538 112 0.8150 0.1781 0.8150 0.9028
No log 2.1923 114 0.9845 0.1811 0.9845 0.9922
No log 2.2308 116 1.2162 0.2762 1.2162 1.1028
No log 2.2692 118 1.0663 0.2215 1.0663 1.0326
No log 2.3077 120 0.9986 0.2664 0.9986 0.9993
No log 2.3462 122 1.1797 0.2357 1.1797 1.0861
No log 2.3846 124 1.1576 0.2362 1.1576 1.0759
No log 2.4231 126 1.5641 0.2507 1.5641 1.2506
No log 2.4615 128 1.3598 0.2242 1.3598 1.1661
No log 2.5 130 1.0218 0.2296 1.0218 1.0108
No log 2.5385 132 0.6593 0.3010 0.6593 0.8120
No log 2.5769 134 0.6668 0.4798 0.6668 0.8166
No log 2.6154 136 0.7820 0.2960 0.7820 0.8843
No log 2.6538 138 1.6220 0.1762 1.6220 1.2736
No log 2.6923 140 1.8882 0.1224 1.8882 1.3741
No log 2.7308 142 1.3429 0.2099 1.3429 1.1588
No log 2.7692 144 0.8499 0.2960 0.8499 0.9219
No log 2.8077 146 0.9331 0.2286 0.9331 0.9660
No log 2.8462 148 1.5215 0.2292 1.5215 1.2335
No log 2.8846 150 1.8300 0.1845 1.8300 1.3528
No log 2.9231 152 1.8039 0.1778 1.8039 1.3431
No log 2.9615 154 1.5139 0.2135 1.5139 1.2304
No log 3.0 156 1.2626 0.1383 1.2626 1.1237
No log 3.0385 158 1.0070 0.2353 1.0070 1.0035
No log 3.0769 160 1.1044 0.1507 1.1044 1.0509
No log 3.1154 162 1.3699 0.1111 1.3699 1.1704
No log 3.1538 164 1.2894 0.0831 1.2894 1.1355
No log 3.1923 166 1.1525 0.1557 1.1525 1.0735
No log 3.2308 168 1.5077 0.1864 1.5077 1.2279
No log 3.2692 170 1.5540 0.2041 1.5540 1.2466
No log 3.3077 172 1.4264 0.1724 1.4264 1.1943
No log 3.3462 174 1.6591 0.1323 1.6591 1.2881
No log 3.3846 176 1.6814 0.1323 1.6814 1.2967
No log 3.4231 178 1.4565 0.1549 1.4565 1.2069
No log 3.4615 180 1.0218 0.2688 1.0218 1.0108
No log 3.5 182 1.0443 0.2676 1.0443 1.0219
No log 3.5385 184 1.4944 0.1914 1.4944 1.2225
No log 3.5769 186 1.4427 0.1810 1.4427 1.2011
No log 3.6154 188 1.0072 0.2472 1.0072 1.0036
No log 3.6538 190 0.8126 0.25 0.8126 0.9014
No log 3.6923 192 0.8637 0.2782 0.8637 0.9294
No log 3.7308 194 1.1882 0.2573 1.1882 1.0901
No log 3.7692 196 1.6348 0.1841 1.6348 1.2786
No log 3.8077 198 1.5035 0.2083 1.5035 1.2262
No log 3.8462 200 0.9623 0.2624 0.9623 0.9810
No log 3.8846 202 0.7589 0.2881 0.7589 0.8711
No log 3.9231 204 0.8818 0.2806 0.8818 0.9390
No log 3.9615 206 1.4227 0.2299 1.4227 1.1928
No log 4.0 208 2.1750 0.1435 2.1750 1.4748
No log 4.0385 210 2.0131 0.1461 2.0131 1.4188
No log 4.0769 212 1.3563 0.1471 1.3563 1.1646
No log 4.1154 214 0.7198 0.3035 0.7198 0.8484
No log 4.1538 216 0.5820 0.3725 0.5820 0.7629
No log 4.1923 218 0.5839 0.3398 0.5839 0.7641
No log 4.2308 220 0.8474 0.3156 0.8474 0.9205
No log 4.2692 222 1.2006 0.25 1.2006 1.0957
No log 4.3077 224 1.4431 0.2461 1.4431 1.2013
No log 4.3462 226 1.2410 0.2420 1.2410 1.1140
No log 4.3846 228 0.7850 0.3443 0.7850 0.8860
No log 4.4231 230 0.6955 0.3991 0.6955 0.8340
No log 4.4615 232 0.7903 0.2941 0.7903 0.8890
No log 4.5 234 1.1392 0.1317 1.1392 1.0673
No log 4.5385 236 1.4725 0.1739 1.4725 1.2135
No log 4.5769 238 1.2801 0.2478 1.2801 1.1314
No log 4.6154 240 0.9510 0.3164 0.9510 0.9752
No log 4.6538 242 0.8708 0.2977 0.8708 0.9332
No log 4.6923 244 1.1109 0.1837 1.1109 1.0540
No log 4.7308 246 1.1245 0.1888 1.1245 1.0604
No log 4.7692 248 0.8147 0.2787 0.8147 0.9026
No log 4.8077 250 0.6339 0.3548 0.6339 0.7962
No log 4.8462 252 0.6971 0.3052 0.6971 0.8349
No log 4.8846 254 0.9068 0.2756 0.9068 0.9522
No log 4.9231 256 1.2690 0.2104 1.2690 1.1265
No log 4.9615 258 1.2083 0.2105 1.2083 1.0992
No log 5.0 260 0.8712 0.2771 0.8712 0.9334
No log 5.0385 262 0.7358 0.3028 0.7358 0.8578
No log 5.0769 264 0.8008 0.2838 0.8008 0.8949
No log 5.1154 266 0.9711 0.1635 0.9711 0.9855
No log 5.1538 268 0.8900 0.2203 0.8900 0.9434
No log 5.1923 270 0.7478 0.3091 0.7478 0.8647
No log 5.2308 272 0.8149 0.2696 0.8149 0.9027
No log 5.2692 274 1.0713 0.1785 1.0713 1.0350
No log 5.3077 276 1.3392 0.2303 1.3392 1.1572
No log 5.3462 278 1.2481 0.1788 1.2481 1.1172
No log 5.3846 280 0.9362 0.2314 0.9362 0.9676
No log 5.4231 282 0.7158 0.3548 0.7158 0.8460
No log 5.4615 284 0.6204 0.3833 0.6204 0.7876
No log 5.5 286 0.6343 0.3803 0.6343 0.7964
No log 5.5385 288 0.7310 0.2661 0.7310 0.8550
No log 5.5769 290 0.9481 0.1877 0.9481 0.9737
No log 5.6154 292 1.0073 0.1635 1.0073 1.0036
No log 5.6538 294 0.8900 0.2510 0.8900 0.9434
No log 5.6923 296 0.6846 0.3571 0.6846 0.8274
No log 5.7308 298 0.6419 0.3180 0.6419 0.8012
No log 5.7692 300 0.6588 0.3180 0.6588 0.8117
No log 5.8077 302 0.7292 0.2775 0.7292 0.8539
No log 5.8462 304 0.8031 0.3080 0.8031 0.8962
No log 5.8846 306 0.9399 0.2552 0.9399 0.9695
No log 5.9231 308 1.0053 0.2000 1.0053 1.0026
No log 5.9615 310 1.1867 0.1834 1.1867 1.0894
No log 6.0 312 1.2443 0.1837 1.2443 1.1155
No log 6.0385 314 1.0856 0.2000 1.0856 1.0419
No log 6.0769 316 0.8062 0.2920 0.8062 0.8979
No log 6.1154 318 0.7136 0.3214 0.7136 0.8447
No log 6.1538 320 0.7743 0.2579 0.7743 0.8799
No log 6.1923 322 0.9321 0.3016 0.9321 0.9654
No log 6.2308 324 0.9713 0.2797 0.9713 0.9856
No log 6.2692 326 0.8781 0.2479 0.8781 0.9371
No log 6.3077 328 0.7371 0.2212 0.7371 0.8585
No log 6.3462 330 0.6503 0.3333 0.6503 0.8064
No log 6.3846 332 0.6842 0.3242 0.6842 0.8272
No log 6.4231 334 0.8030 0.2542 0.8030 0.8961
No log 6.4615 336 0.9571 0.2314 0.9571 0.9783
No log 6.5 338 1.1182 0.1608 1.1182 1.0575
No log 6.5385 340 1.1123 0.1648 1.1123 1.0547
No log 6.5769 342 0.9884 0.2061 0.9884 0.9942
No log 6.6154 344 0.9136 0.2248 0.9136 0.9558
No log 6.6538 346 0.8704 0.2863 0.8704 0.9330
No log 6.6923 348 0.9224 0.2243 0.9224 0.9604
No log 6.7308 350 0.9408 0.2243 0.9408 0.9699
No log 6.7692 352 1.0590 0.1822 1.0590 1.0291
No log 6.8077 354 1.0555 0.1822 1.0555 1.0274
No log 6.8462 356 0.9497 0.2243 0.9497 0.9745
No log 6.8846 358 0.9883 0.1822 0.9883 0.9941
No log 6.9231 360 0.9692 0.2126 0.9692 0.9845
No log 6.9615 362 0.9449 0.2713 0.9449 0.9720
No log 7.0 364 1.0609 0.1594 1.0609 1.0300
No log 7.0385 366 1.2614 0.1615 1.2614 1.1231
No log 7.0769 368 1.2694 0.1615 1.2694 1.1267
No log 7.1154 370 1.0945 0.2113 1.0945 1.0462
No log 7.1538 372 0.8812 0.1934 0.8812 0.9387
No log 7.1923 374 0.7375 0.2838 0.7375 0.8588
No log 7.2308 376 0.7261 0.2838 0.7261 0.8521
No log 7.2692 378 0.8034 0.2632 0.8034 0.8963
No log 7.3077 380 0.9311 0.2000 0.9311 0.9649
No log 7.3462 382 1.0914 0.2113 1.0914 1.0447
No log 7.3846 384 1.2635 0.1795 1.2635 1.1240
No log 7.4231 386 1.2933 0.1795 1.2933 1.1373
No log 7.4615 388 1.1450 0.1834 1.1450 1.0701
No log 7.5 390 0.9302 0.1882 0.9302 0.9645
No log 7.5385 392 0.8204 0.2191 0.8204 0.9058
No log 7.5769 394 0.8196 0.2188 0.8196 0.9053
No log 7.6154 396 0.8930 0.2374 0.8930 0.9450
No log 7.6538 398 0.9372 0.2424 0.9372 0.9681
No log 7.6923 400 0.9012 0.2424 0.9012 0.9493
No log 7.7308 402 0.8192 0.2469 0.8192 0.9051
No log 7.7692 404 0.7949 0.2821 0.7949 0.8915
No log 7.8077 406 0.8706 0.2191 0.8706 0.9331
No log 7.8462 408 0.9641 0.2253 0.9641 0.9819
No log 7.8846 410 1.0353 0.1886 1.0353 1.0175
No log 7.9231 412 0.9739 0.1635 0.9739 0.9869
No log 7.9615 414 0.8836 0.25 0.8836 0.9400
No log 8.0 416 0.7791 0.3188 0.7791 0.8826
No log 8.0385 418 0.7542 0.2793 0.7542 0.8685
No log 8.0769 420 0.8208 0.3443 0.8208 0.9060
No log 8.1154 422 0.8697 0.3092 0.8697 0.9326
No log 8.1538 424 0.9750 0.2180 0.9750 0.9874
No log 8.1923 426 1.0335 0.1882 1.0335 1.0166
No log 8.2308 428 1.1005 0.1884 1.1005 1.0491
No log 8.2692 430 1.0801 0.1884 1.0801 1.0393
No log 8.3077 432 0.9926 0.1822 0.9926 0.9963
No log 8.3462 434 0.8838 0.2685 0.8838 0.9401
No log 8.3846 436 0.8622 0.2450 0.8622 0.9285
No log 8.4231 438 0.8448 0.2450 0.8448 0.9191
No log 8.4615 440 0.8530 0.2450 0.8530 0.9236
No log 8.5 442 0.8896 0.2756 0.8896 0.9432
No log 8.5385 444 0.9122 0.2741 0.9122 0.9551
No log 8.5769 446 0.8625 0.2459 0.8625 0.9287
No log 8.6154 448 0.8020 0.2489 0.8020 0.8956
No log 8.6538 450 0.7894 0.25 0.7894 0.8885
No log 8.6923 452 0.8104 0.25 0.8104 0.9002
No log 8.7308 454 0.8267 0.2479 0.8267 0.9092
No log 8.7692 456 0.8245 0.2489 0.8245 0.9080
No log 8.8077 458 0.8329 0.2489 0.8329 0.9127
No log 8.8462 460 0.8709 0.2459 0.8709 0.9332
No log 8.8846 462 0.9206 0.2756 0.9206 0.9595
No log 8.9231 464 0.9323 0.2756 0.9323 0.9655
No log 8.9615 466 0.9120 0.2450 0.9120 0.9550
No log 9.0 468 0.8604 0.2469 0.8604 0.9276
No log 9.0385 470 0.8445 0.2469 0.8445 0.9189
No log 9.0769 472 0.8639 0.2469 0.8639 0.9295
No log 9.1154 474 0.8748 0.2469 0.8748 0.9353
No log 9.1538 476 0.8863 0.2459 0.8863 0.9415
No log 9.1923 478 0.8736 0.2469 0.8736 0.9347
No log 9.2308 480 0.8754 0.2469 0.8754 0.9356
No log 9.2692 482 0.8798 0.2469 0.8798 0.9380
No log 9.3077 484 0.8631 0.2469 0.8631 0.9290
No log 9.3462 486 0.8508 0.2469 0.8508 0.9224
No log 9.3846 488 0.8481 0.2469 0.8481 0.9209
No log 9.4231 490 0.8619 0.2469 0.8619 0.9284
No log 9.4615 492 0.8736 0.2469 0.8736 0.9347
No log 9.5 494 0.8932 0.2459 0.8932 0.9451
No log 9.5385 496 0.9130 0.2450 0.9130 0.9555
No log 9.5769 498 0.9244 0.2450 0.9244 0.9615
0.4117 9.6154 500 0.9370 0.2741 0.9370 0.9680
0.4117 9.6538 502 0.9362 0.2741 0.9362 0.9676
0.4117 9.6923 504 0.9265 0.2450 0.9265 0.9625
0.4117 9.7308 506 0.9124 0.2450 0.9124 0.9552
0.4117 9.7692 508 0.8963 0.2450 0.8963 0.9467
0.4117 9.8077 510 0.8822 0.2450 0.8822 0.9392
0.4117 9.8462 512 0.8691 0.2469 0.8691 0.9323
0.4117 9.8846 514 0.8613 0.2469 0.8613 0.9281
0.4117 9.9231 516 0.8570 0.2469 0.8570 0.9257
0.4117 9.9615 518 0.8538 0.2469 0.8538 0.9240
0.4117 10.0 520 0.8533 0.2469 0.8533 0.9238
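As a sanity check on the table, the Mse column equals the Validation Loss (MSE is the training objective) and the Rmse column is its square root; for the final epoch:

```python
import math

# Final-epoch values from the table above
final_mse = 0.8533
rmse_final = math.sqrt(final_mse)
print(rmse_final)  # ≈ 0.9237, matching the reported Rmse of 0.9238 up to rounding
```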

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

  • 0.1B parameters (Safetensors, F32 tensors)

Model tree for MayBashendy/ArabicNewSplits6_WithDuplicationsForScore5_FineTuningAraBERT_run2_AugV5_k10_task3_organization

  • Fine-tuned from aubmindlab/bert-base-arabertv02