ArabicNewSplits4_WithDuplicationsForScore5_FineTuningAraBERT_run1_AugV5_k15_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.0025
  • Qwk (Quadratic Weighted Kappa): 0.6308
  • MSE: 1.0025
  • RMSE: 1.0013
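Quadratic Weighted Kappa (Qwk) measures agreement between predicted and reference ordinal scores, penalizing disagreements by the squared distance between ratings. The card does not publish its evaluation code; the following is a minimal pure-Python sketch of the metric for reference only:

```python
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred, min_rating=None, max_rating=None):
    """Quadratic Weighted Kappa between two lists of integer ratings."""
    if min_rating is None:
        min_rating = min(min(y_true), min(y_pred))
    if max_rating is None:
        max_rating = max(max(y_true), max(y_pred))
    n = max_rating - min_rating + 1
    # Observed confusion matrix
    O = [[0.0] * n for _ in range(n)]
    for t, p in zip(y_true, y_pred):
        O[t - min_rating][p - min_rating] += 1
    # Marginal histograms for the chance-expected matrix
    hist_t = Counter(t - min_rating for t in y_true)
    hist_p = Counter(p - min_rating for p in y_pred)
    total = len(y_true)
    num = den = 0.0
    for i in range(n):
        for j in range(n):
            # Quadratic disagreement weight, 0 on the diagonal
            w = ((i - j) ** 2) / ((n - 1) ** 2) if n > 1 else 0.0
            E = hist_t[i] * hist_p[j] / total
            num += w * O[i][j]
            den += w * E
    return 1.0 - num / den

# Perfect agreement yields kappa = 1.0
print(quadratic_weighted_kappa([1, 2, 3, 4], [1, 2, 3, 4]))  # 1.0
```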

Model description

More information needed

Intended uses & limitations

More information needed
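In the absence of published usage instructions, here is a hedged inference sketch. It assumes the checkpoint loads as a single-output (regression-style) sequence-classification head, which the MSE/RMSE/Qwk metrics suggest, and it requires network access to download the checkpoint; it is not the authors' own usage code.

```python
# Hypothetical usage sketch, not from the model authors.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = ("MayBashendy/ArabicNewSplits4_WithDuplicationsForScore5_"
            "FineTuningAraBERT_run1_AugV5_k15_task5_organization")

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "..."  # an Arabic text to score for organization
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze()
print(score)
```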

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
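The hyperparameters above map directly onto Hugging Face TrainingArguments. A configuration sketch follows; the output directory is hypothetical, since the actual training script and paths are not published with this card:

```python
# Config sketch only: reproduces the listed hyperparameters,
# not the authors' actual training script.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="arabert-task5-organization",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```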

Training results

Training Loss Epoch Step Validation Loss Qwk MSE RMSE
No log 0.04 2 2.1694 0.0186 2.1694 1.4729
No log 0.08 4 1.4538 0.1900 1.4538 1.2057
No log 0.12 6 1.5658 0.1594 1.5658 1.2513
No log 0.16 8 1.6893 0.2214 1.6893 1.2997
No log 0.2 10 1.6939 0.3062 1.6939 1.3015
No log 0.24 12 1.5603 0.2665 1.5603 1.2491
No log 0.28 14 1.8302 0.2615 1.8302 1.3528
No log 0.32 16 2.1545 0.2214 2.1545 1.4678
No log 0.36 18 1.9999 0.2635 1.9999 1.4142
No log 0.4 20 1.6585 0.3294 1.6585 1.2878
No log 0.44 22 1.5695 0.3500 1.5695 1.2528
No log 0.48 24 1.6724 0.3240 1.6724 1.2932
No log 0.52 26 1.9146 0.2691 1.9146 1.3837
No log 0.56 28 1.8531 0.3080 1.8531 1.3613
No log 0.6 30 1.6669 0.3537 1.6669 1.2911
No log 0.64 32 1.3993 0.3051 1.3993 1.1829
No log 0.68 34 1.3504 0.2663 1.3504 1.1621
No log 0.72 36 1.3199 0.3374 1.3199 1.1489
No log 0.76 38 1.3706 0.3918 1.3706 1.1707
No log 0.8 40 1.3126 0.3917 1.3126 1.1457
No log 0.84 42 1.3826 0.4298 1.3826 1.1759
No log 0.88 44 1.3973 0.4396 1.3973 1.1821
No log 0.92 46 1.6637 0.3827 1.6637 1.2899
No log 0.96 48 1.7307 0.4027 1.7307 1.3156
No log 1.0 50 1.5794 0.4008 1.5794 1.2567
No log 1.04 52 1.3957 0.4805 1.3957 1.1814
No log 1.08 54 1.5493 0.4357 1.5493 1.2447
No log 1.12 56 1.9884 0.3858 1.9884 1.4101
No log 1.16 58 2.4384 0.3903 2.4384 1.5615
No log 1.2 60 2.2110 0.4144 2.2110 1.4869
No log 1.24 62 1.9351 0.4151 1.9351 1.3911
No log 1.28 64 1.6664 0.4381 1.6664 1.2909
No log 1.32 66 1.2700 0.5190 1.2700 1.1269
No log 1.36 68 1.2137 0.4779 1.2137 1.1017
No log 1.4 70 1.4613 0.5055 1.4613 1.2088
No log 1.44 72 2.2239 0.4230 2.2239 1.4913
No log 1.48 74 2.7936 0.3597 2.7936 1.6714
No log 1.52 76 2.5902 0.3771 2.5902 1.6094
No log 1.56 78 1.8008 0.4777 1.8008 1.3420
No log 1.6 80 1.1743 0.5183 1.1743 1.0837
No log 1.64 82 1.0322 0.5279 1.0322 1.0160
No log 1.68 84 1.0627 0.5291 1.0627 1.0309
No log 1.72 86 1.2927 0.4802 1.2927 1.1370
No log 1.76 88 1.6801 0.4546 1.6801 1.2962
No log 1.8 90 1.6735 0.4360 1.6735 1.2936
No log 1.84 92 1.4336 0.4601 1.4336 1.1973
No log 1.88 94 1.1501 0.4952 1.1501 1.0724
No log 1.92 96 1.0449 0.5462 1.0449 1.0222
No log 1.96 98 1.0858 0.5452 1.0858 1.0420
No log 2.0 100 1.3354 0.5334 1.3354 1.1556
No log 2.04 102 1.4607 0.5302 1.4607 1.2086
No log 2.08 104 1.5138 0.5455 1.5138 1.2303
No log 2.12 106 1.4649 0.5851 1.4649 1.2103
No log 2.16 108 1.1616 0.6012 1.1616 1.0778
No log 2.2 110 1.0358 0.6032 1.0358 1.0177
No log 2.24 112 1.0305 0.5992 1.0305 1.0151
No log 2.28 114 1.1861 0.5928 1.1861 1.0891
No log 2.32 116 1.3910 0.5718 1.3910 1.1794
No log 2.36 118 1.4406 0.5766 1.4406 1.2003
No log 2.4 120 1.2009 0.5907 1.2009 1.0958
No log 2.44 122 1.0148 0.6008 1.0148 1.0074
No log 2.48 124 0.9594 0.6076 0.9594 0.9795
No log 2.52 126 1.0509 0.5960 1.0509 1.0251
No log 2.56 128 1.3323 0.5386 1.3323 1.1543
No log 2.6 130 1.6173 0.5440 1.6173 1.2717
No log 2.64 132 1.5357 0.5277 1.5357 1.2392
No log 2.68 134 1.2892 0.5135 1.2892 1.1354
No log 2.72 136 1.0772 0.5725 1.0772 1.0379
No log 2.76 138 0.9743 0.5698 0.9743 0.9871
No log 2.8 140 0.9797 0.5845 0.9797 0.9898
No log 2.84 142 1.1709 0.5982 1.1709 1.0821
No log 2.88 144 1.4827 0.5763 1.4827 1.2177
No log 2.92 146 1.7867 0.5243 1.7867 1.3367
No log 2.96 148 1.7833 0.5557 1.7833 1.3354
No log 3.0 150 1.5397 0.5889 1.5397 1.2409
No log 3.04 152 1.3935 0.6191 1.3935 1.1805
No log 3.08 154 1.2115 0.6304 1.2115 1.1007
No log 3.12 156 1.0893 0.6201 1.0893 1.0437
No log 3.16 158 1.1365 0.5958 1.1365 1.0660
No log 3.2 160 1.3267 0.5821 1.3267 1.1518
No log 3.24 162 1.5854 0.5535 1.5854 1.2591
No log 3.28 164 1.6188 0.5472 1.6188 1.2723
No log 3.32 166 1.4692 0.5911 1.4692 1.2121
No log 3.36 168 1.2836 0.5840 1.2836 1.1330
No log 3.4 170 1.2561 0.5840 1.2561 1.1208
No log 3.44 172 1.2704 0.5792 1.2704 1.1271
No log 3.48 174 1.4152 0.5927 1.4152 1.1896
No log 3.52 176 1.3834 0.5887 1.3834 1.1762
No log 3.56 178 1.1295 0.5883 1.1295 1.0628
No log 3.6 180 0.9341 0.6333 0.9341 0.9665
No log 3.64 182 0.9159 0.6374 0.9159 0.9570
No log 3.68 184 0.9171 0.6389 0.9171 0.9576
No log 3.72 186 0.9220 0.6266 0.9220 0.9602
No log 3.76 188 0.9494 0.6147 0.9494 0.9743
No log 3.8 190 1.1391 0.6160 1.1391 1.0673
No log 3.84 192 1.3431 0.6093 1.3431 1.1589
No log 3.88 194 1.3311 0.6082 1.3311 1.1537
No log 3.92 196 1.2133 0.5980 1.2133 1.1015
No log 3.96 198 1.0329 0.6052 1.0329 1.0163
No log 4.0 200 0.8846 0.6406 0.8846 0.9405
No log 4.04 202 0.8968 0.6308 0.8968 0.9470
No log 4.08 204 0.9800 0.6150 0.9800 0.9899
No log 4.12 206 1.1027 0.6075 1.1027 1.0501
No log 4.16 208 1.1822 0.6094 1.1822 1.0873
No log 4.2 210 1.1803 0.6037 1.1803 1.0864
No log 4.24 212 1.0654 0.6117 1.0654 1.0322
No log 4.28 214 1.0373 0.6060 1.0373 1.0185
No log 4.32 216 1.0464 0.6076 1.0464 1.0229
No log 4.36 218 1.0600 0.6132 1.0600 1.0296
No log 4.4 220 1.2026 0.6060 1.2026 1.0966
No log 4.44 222 1.3520 0.5896 1.3520 1.1627
No log 4.48 224 1.2954 0.5896 1.2954 1.1382
No log 4.52 226 1.1337 0.5999 1.1337 1.0647
No log 4.56 228 0.9692 0.6346 0.9692 0.9845
No log 4.6 230 0.9831 0.6405 0.9831 0.9915
No log 4.64 232 1.0342 0.6272 1.0342 1.0169
No log 4.68 234 1.0434 0.6285 1.0434 1.0215
No log 4.72 236 1.0959 0.6213 1.0959 1.0469
No log 4.76 238 1.1329 0.6083 1.1329 1.0644
No log 4.8 240 1.0307 0.6156 1.0307 1.0152
No log 4.84 242 0.8959 0.6720 0.8959 0.9465
No log 4.88 244 0.8439 0.6916 0.8439 0.9186
No log 4.92 246 0.8557 0.6756 0.8557 0.9250
No log 4.96 248 0.9017 0.6280 0.9017 0.9496
No log 5.0 250 0.8829 0.6397 0.8829 0.9396
No log 5.04 252 0.7903 0.6979 0.7903 0.8890
No log 5.08 254 0.7394 0.7083 0.7394 0.8599
No log 5.12 256 0.7065 0.7473 0.7065 0.8406
No log 5.16 258 0.7460 0.7387 0.7460 0.8637
No log 5.2 260 0.8795 0.6696 0.8795 0.9378
No log 5.24 262 1.0318 0.6449 1.0318 1.0158
No log 5.28 264 1.1164 0.6280 1.1164 1.0566
No log 5.32 266 1.1075 0.6280 1.1075 1.0524
No log 5.36 268 1.0697 0.6250 1.0697 1.0343
No log 5.4 270 0.9487 0.6571 0.9487 0.9740
No log 5.44 272 0.8735 0.6704 0.8735 0.9346
No log 5.48 274 0.8019 0.7268 0.8019 0.8955
No log 5.52 276 0.8188 0.6985 0.8188 0.9049
No log 5.56 278 0.8604 0.6875 0.8604 0.9276
No log 5.6 280 0.8134 0.6976 0.8134 0.9019
No log 5.64 282 0.7311 0.7416 0.7311 0.8550
No log 5.68 284 0.6943 0.7272 0.6943 0.8332
No log 5.72 286 0.6950 0.7251 0.6950 0.8337
No log 5.76 288 0.7517 0.7492 0.7517 0.8670
No log 5.8 290 0.8100 0.6928 0.8100 0.9000
No log 5.84 292 0.8551 0.7008 0.8551 0.9247
No log 5.88 294 0.9205 0.6688 0.9205 0.9594
No log 5.92 296 0.9580 0.6643 0.9580 0.9788
No log 5.96 298 0.9795 0.6524 0.9795 0.9897
No log 6.0 300 0.9559 0.6689 0.9559 0.9777
No log 6.04 302 0.9751 0.6697 0.9751 0.9875
No log 6.08 304 1.0236 0.6528 1.0236 1.0117
No log 6.12 306 1.0223 0.6588 1.0223 1.0111
No log 6.16 308 1.0755 0.6311 1.0755 1.0371
No log 6.2 310 1.1853 0.6198 1.1853 1.0887
No log 6.24 312 1.2092 0.6144 1.2092 1.0997
No log 6.28 314 1.1232 0.6166 1.1232 1.0598
No log 6.32 316 1.0025 0.6542 1.0025 1.0013
No log 6.36 318 0.8956 0.6608 0.8956 0.9464
No log 6.4 320 0.8775 0.6487 0.8775 0.9367
No log 6.44 322 0.9206 0.6508 0.9206 0.9595
No log 6.48 324 1.0414 0.6389 1.0414 1.0205
No log 6.52 326 1.1669 0.6160 1.1669 1.0802
No log 6.56 328 1.2544 0.6053 1.2544 1.1200
No log 6.6 330 1.2680 0.6042 1.2680 1.1261
No log 6.64 332 1.1872 0.6168 1.1872 1.0896
No log 6.68 334 1.0340 0.6385 1.0340 1.0169
No log 6.72 336 0.9109 0.6424 0.9109 0.9544
No log 6.76 338 0.8676 0.6503 0.8676 0.9314
No log 6.8 340 0.8816 0.6488 0.8816 0.9389
No log 6.84 342 0.9293 0.6308 0.9293 0.9640
No log 6.88 344 0.9312 0.6308 0.9312 0.9650
No log 6.92 346 0.9233 0.6308 0.9233 0.9609
No log 6.96 348 0.9161 0.6328 0.9161 0.9571
No log 7.0 350 0.9018 0.6434 0.9018 0.9496
No log 7.04 352 0.9015 0.6449 0.9015 0.9495
No log 7.08 354 0.9599 0.6266 0.9599 0.9797
No log 7.12 356 1.0215 0.6327 1.0215 1.0107
No log 7.16 358 1.0258 0.6327 1.0258 1.0128
No log 7.2 360 1.0263 0.6294 1.0263 1.0131
No log 7.24 362 1.0471 0.6353 1.0471 1.0233
No log 7.28 364 1.0380 0.6294 1.0380 1.0188
No log 7.32 366 0.9898 0.6308 0.9898 0.9949
No log 7.36 368 0.9784 0.6308 0.9784 0.9891
No log 7.4 370 1.0012 0.6248 1.0012 1.0006
No log 7.44 372 1.0088 0.6266 1.0088 1.0044
No log 7.48 374 1.0113 0.6266 1.0113 1.0056
No log 7.52 376 0.9910 0.6266 0.9910 0.9955
No log 7.56 378 1.0048 0.6266 1.0048 1.0024
No log 7.6 380 1.0311 0.6266 1.0311 1.0154
No log 7.64 382 1.0331 0.6266 1.0331 1.0164
No log 7.68 384 1.0148 0.6266 1.0148 1.0074
No log 7.72 386 0.9586 0.6308 0.9586 0.9791
No log 7.76 388 0.9062 0.6400 0.9062 0.9520
No log 7.8 390 0.8730 0.6602 0.8730 0.9343
No log 7.84 392 0.8645 0.6723 0.8645 0.9298
No log 7.88 394 0.8868 0.6519 0.8868 0.9417
No log 7.92 396 0.9097 0.6400 0.9097 0.9538
No log 7.96 398 0.9365 0.6294 0.9365 0.9677
No log 8.0 400 0.9705 0.6294 0.9705 0.9851
No log 8.04 402 0.9710 0.6294 0.9710 0.9854
No log 8.08 404 0.9850 0.6294 0.9850 0.9925
No log 8.12 406 0.9875 0.6308 0.9875 0.9937
No log 8.16 408 0.9995 0.6308 0.9995 0.9998
No log 8.2 410 0.9896 0.6308 0.9896 0.9948
No log 8.24 412 0.9637 0.6308 0.9637 0.9817
No log 8.28 414 0.9443 0.6465 0.9443 0.9717
No log 8.32 416 0.9147 0.6465 0.9147 0.9564
No log 8.36 418 0.9072 0.6465 0.9072 0.9524
No log 8.4 420 0.9310 0.6465 0.9310 0.9649
No log 8.44 422 0.9564 0.6350 0.9564 0.9780
No log 8.48 424 0.9726 0.6248 0.9726 0.9862
No log 8.52 426 0.9905 0.6248 0.9905 0.9952
No log 8.56 428 1.0199 0.6248 1.0199 1.0099
No log 8.6 430 1.0471 0.6294 1.0471 1.0233
No log 8.64 432 1.0629 0.6267 1.0629 1.0310
No log 8.68 434 1.0706 0.6183 1.0706 1.0347
No log 8.72 436 1.0572 0.6153 1.0572 1.0282
No log 8.76 438 1.0439 0.6239 1.0439 1.0217
No log 8.8 440 1.0103 0.6368 1.0103 1.0051
No log 8.84 442 0.9745 0.6308 0.9745 0.9872
No log 8.88 444 0.9344 0.6294 0.9344 0.9666
No log 8.92 446 0.8970 0.6454 0.8970 0.9471
No log 8.96 448 0.8728 0.6545 0.8728 0.9342
No log 9.0 450 0.8632 0.6663 0.8632 0.9291
No log 9.04 452 0.8693 0.6663 0.8693 0.9324
No log 9.08 454 0.8818 0.6555 0.8818 0.9390
No log 9.12 456 0.8965 0.6465 0.8965 0.9468
No log 9.16 458 0.9138 0.6424 0.9138 0.9559
No log 9.2 460 0.9291 0.6308 0.9291 0.9639
No log 9.24 462 0.9442 0.6308 0.9442 0.9717
No log 9.28 464 0.9467 0.6308 0.9467 0.9730
No log 9.32 466 0.9426 0.6308 0.9426 0.9709
No log 9.36 468 0.9508 0.6308 0.9508 0.9751
No log 9.4 470 0.9574 0.6248 0.9574 0.9785
No log 9.44 472 0.9710 0.6248 0.9710 0.9854
No log 9.48 474 0.9760 0.6248 0.9760 0.9879
No log 9.52 476 0.9739 0.6248 0.9739 0.9869
No log 9.56 478 0.9716 0.6248 0.9716 0.9857
No log 9.6 480 0.9720 0.6248 0.9720 0.9859
No log 9.64 482 0.9777 0.6248 0.9777 0.9888
No log 9.68 484 0.9857 0.6248 0.9857 0.9928
No log 9.72 486 0.9906 0.6248 0.9906 0.9953
No log 9.76 488 0.9954 0.6308 0.9954 0.9977
No log 9.8 490 0.9970 0.6308 0.9970 0.9985
No log 9.84 492 0.9991 0.6308 0.9991 0.9996
No log 9.88 494 1.0007 0.6308 1.0007 1.0003
No log 9.92 496 1.0018 0.6308 1.0018 1.0009
No log 9.96 498 1.0026 0.6308 1.0026 1.0013
0.3114 10.0 500 1.0025 0.6308 1.0025 1.0013
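Two of the table's columns are redundant by construction: the validation loss column equals the MSE column in every row (which suggests the model was trained with an MSE objective), and RMSE is the square root of MSE up to rounding. A quick consistency check on the final row:

```python
import math

# Final-epoch row from the table above: Validation Loss, Qwk, MSE, RMSE.
val_loss, qwk, mse, rmse = 1.0025, 0.6308, 1.0025, 1.0013

# Loss equals MSE throughout the table; RMSE is sqrt(MSE) up to rounding.
assert val_loss == mse
assert abs(math.sqrt(mse) - rmse) < 1e-3
print("consistent")
```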

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model size

  • 0.1B parameters (Safetensors, F32)