ArabicNewSplits4_WithDuplicationsForScore5_FineTuningAraBERT_run2_AugV5_k18_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7052
  • Qwk: 0.7617
  • Mse: 0.7052
  • Rmse: 0.8397
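Qwk here is the quadratic weighted kappa, a standard agreement metric for ordinal scoring tasks; note that the reported Loss equals the Mse, which suggests a regression head trained with MSE loss, and Rmse is its square root (√0.7052 ≈ 0.8397). A minimal pure-Python sketch of how these metrics can be computed (the function names and toy labels below are illustrative, not taken from the training code):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic weighted kappa (the card's "Qwk") for integer labels 0..n_classes-1."""
    n = len(y_true)
    # Observed confusion matrix
    O = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1.0
    # Quadratic disagreement weights: 0 on the diagonal, growing with distance
    W = [[(i - j) ** 2 / (n_classes - 1) ** 2 for j in range(n_classes)]
         for i in range(n_classes)]
    # Expected confusion matrix under independent marginals
    row = [sum(O[i]) for i in range(n_classes)]
    col = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]
    E = [[row[i] * col[j] / n for j in range(n_classes)] for i in range(n_classes)]
    num = sum(W[i][j] * O[i][j] for i in range(n_classes) for j in range(n_classes))
    den = sum(W[i][j] * E[i][j] for i in range(n_classes) for j in range(n_classes))
    return 1.0 - num / den

def mse(y_true, y_pred):
    """Mean squared error; the card's Rmse is math.sqrt(mse)."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
```

Perfect agreement yields a kappa of 1.0, and systematic disagreement drives it negative; the 0.7617 reported above indicates substantial ordinal agreement on the evaluation set.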

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
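With 10 epochs the run totals 600 optimizer steps (the results table below ends at step 600, i.e. 60 steps per epoch), so the linear scheduler decays the learning rate from 2e-05 toward zero over the run. A minimal sketch of such a schedule (the function name and the zero-warmup assumption are illustrative; the card does not state a warmup setting):

```python
def linear_lr(step, total_steps=600, base_lr=2e-05):
    """Linearly decay the learning rate from base_lr at step 0 to 0 at total_steps.

    total_steps=600 matches this run (10 epochs x 60 steps/epoch);
    zero warmup steps is an assumption, not stated on the card.
    """
    return base_lr * max(0.0, 1.0 - step / total_steps)
```

Each Adam update then uses `linear_lr(step)` in place of a fixed learning rate, which is why late-epoch validation metrics in the table below change only slightly from step to step.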

Training results

The "No log" entries mean the running training loss had not yet been reported; it is logged every 500 steps, so the first logged value appears at step 500.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0333 2 2.4332 0.0396 2.4332 1.5599
No log 0.0667 4 1.6696 0.1467 1.6696 1.2921
No log 0.1 6 1.5756 0.1254 1.5756 1.2552
No log 0.1333 8 1.6194 0.2439 1.6194 1.2725
No log 0.1667 10 1.5901 0.2926 1.5901 1.2610
No log 0.2 12 1.4603 0.3001 1.4603 1.2084
No log 0.2333 14 1.5482 0.3097 1.5482 1.2443
No log 0.2667 16 1.9156 0.2664 1.9156 1.3840
No log 0.3 18 1.5609 0.3075 1.5609 1.2494
No log 0.3333 20 1.3737 0.3324 1.3737 1.1720
No log 0.3667 22 1.5030 0.3563 1.5030 1.2260
No log 0.4 24 1.5070 0.3658 1.5070 1.2276
No log 0.4333 26 1.9975 0.3291 1.9975 1.4133
No log 0.4667 28 2.1235 0.3044 2.1235 1.4572
No log 0.5 30 1.8635 0.3807 1.8635 1.3651
No log 0.5333 32 1.6958 0.3549 1.6958 1.3022
No log 0.5667 34 1.6362 0.3765 1.6362 1.2791
No log 0.6 36 1.7631 0.3981 1.7631 1.3278
No log 0.6333 38 2.2361 0.2985 2.2361 1.4953
No log 0.6667 40 2.2541 0.3448 2.2541 1.5014
No log 0.7 42 1.6090 0.4264 1.6090 1.2684
No log 0.7333 44 1.0531 0.4378 1.0531 1.0262
No log 0.7667 46 0.9484 0.4672 0.9484 0.9739
No log 0.8 48 0.9488 0.5203 0.9488 0.9741
No log 0.8333 50 1.1387 0.5146 1.1387 1.0671
No log 0.8667 52 1.8572 0.4040 1.8572 1.3628
No log 0.9 54 2.2306 0.3419 2.2306 1.4935
No log 0.9333 56 2.4038 0.3470 2.4038 1.5504
No log 0.9667 58 2.2262 0.3695 2.2262 1.4920
No log 1.0 60 1.5840 0.4728 1.5840 1.2586
No log 1.0333 62 1.4385 0.4591 1.4385 1.1994
No log 1.0667 64 1.7361 0.4436 1.7361 1.3176
No log 1.1 66 1.7658 0.4403 1.7658 1.3288
No log 1.1333 68 1.2492 0.4951 1.2492 1.1177
No log 1.1667 70 1.0912 0.5556 1.0912 1.0446
No log 1.2 72 1.3711 0.5092 1.3711 1.1709
No log 1.2333 74 1.9345 0.4542 1.9345 1.3909
No log 1.2667 76 2.1286 0.4077 2.1286 1.4590
No log 1.3 78 1.6524 0.4854 1.6524 1.2855
No log 1.3333 80 1.1142 0.5360 1.1142 1.0555
No log 1.3667 82 0.9818 0.6053 0.9818 0.9908
No log 1.4 84 0.9980 0.6031 0.9980 0.9990
No log 1.4333 86 1.2293 0.5033 1.2293 1.1087
No log 1.4667 88 1.5627 0.4378 1.5627 1.2501
No log 1.5 90 1.6858 0.4244 1.6858 1.2984
No log 1.5333 92 1.4597 0.4375 1.4597 1.2082
No log 1.5667 94 1.1971 0.5365 1.1971 1.0941
No log 1.6 96 0.9612 0.6415 0.9612 0.9804
No log 1.6333 98 0.9424 0.5835 0.9424 0.9708
No log 1.6667 100 1.0097 0.5675 1.0097 1.0048
No log 1.7 102 1.0703 0.5508 1.0703 1.0346
No log 1.7333 104 1.0263 0.5761 1.0263 1.0131
No log 1.7667 106 1.0664 0.5562 1.0664 1.0327
No log 1.8 108 1.1450 0.5760 1.1450 1.0701
No log 1.8333 110 1.0261 0.6113 1.0261 1.0129
No log 1.8667 112 1.0190 0.6369 1.0190 1.0094
No log 1.9 114 0.8824 0.6551 0.8824 0.9394
No log 1.9333 116 0.6923 0.6705 0.6923 0.8321
No log 1.9667 118 0.6558 0.6959 0.6558 0.8098
No log 2.0 120 0.6541 0.6754 0.6541 0.8088
No log 2.0333 122 0.6947 0.6979 0.6947 0.8335
No log 2.0667 124 0.9229 0.6867 0.9229 0.9607
No log 2.1 126 1.1607 0.6131 1.1607 1.0774
No log 2.1333 128 1.2210 0.5858 1.2210 1.1050
No log 2.1667 130 1.0523 0.5985 1.0523 1.0258
No log 2.2 132 0.9788 0.6204 0.9788 0.9894
No log 2.2333 134 0.8613 0.6600 0.8613 0.9281
No log 2.2667 136 0.8556 0.6756 0.8556 0.9250
No log 2.3 138 0.9252 0.6704 0.9252 0.9618
No log 2.3333 140 0.8494 0.7140 0.8494 0.9216
No log 2.3667 142 0.7286 0.7339 0.7286 0.8536
No log 2.4 144 0.7227 0.7340 0.7227 0.8501
No log 2.4333 146 0.6707 0.7203 0.6707 0.8190
No log 2.4667 148 0.6339 0.7319 0.6339 0.7962
No log 2.5 150 0.6029 0.7388 0.6029 0.7765
No log 2.5333 152 0.5859 0.7439 0.5859 0.7654
No log 2.5667 154 0.6098 0.7579 0.6098 0.7809
No log 2.6 156 0.6158 0.7579 0.6158 0.7847
No log 2.6333 158 0.6051 0.7671 0.6051 0.7779
No log 2.6667 160 0.5990 0.7746 0.5990 0.7739
No log 2.7 162 0.6191 0.7703 0.6191 0.7868
No log 2.7333 164 0.5784 0.7398 0.5784 0.7605
No log 2.7667 166 0.5770 0.7054 0.5770 0.7596
No log 2.8 168 0.5742 0.7149 0.5742 0.7578
No log 2.8333 170 0.5782 0.7395 0.5782 0.7604
No log 2.8667 172 0.6136 0.7513 0.6136 0.7833
No log 2.9 174 0.5743 0.7372 0.5743 0.7578
No log 2.9333 176 0.5706 0.7230 0.5706 0.7554
No log 2.9667 178 0.5645 0.7180 0.5645 0.7513
No log 3.0 180 0.5538 0.7223 0.5538 0.7442
No log 3.0333 182 0.5502 0.7223 0.5502 0.7417
No log 3.0667 184 0.5631 0.7579 0.5631 0.7504
No log 3.1 186 0.5700 0.7552 0.5700 0.7550
No log 3.1333 188 0.5586 0.7587 0.5586 0.7474
No log 3.1667 190 0.5465 0.7387 0.5465 0.7392
No log 3.2 192 0.5551 0.7471 0.5551 0.7451
No log 3.2333 194 0.5608 0.7568 0.5608 0.7489
No log 3.2667 196 0.5781 0.7720 0.5781 0.7604
No log 3.3 198 0.5706 0.7720 0.5706 0.7554
No log 3.3333 200 0.5537 0.7631 0.5537 0.7441
No log 3.3667 202 0.5449 0.7226 0.5449 0.7382
No log 3.4 204 0.5787 0.7460 0.5787 0.7607
No log 3.4333 206 0.6805 0.7459 0.6805 0.8249
No log 3.4667 208 0.6743 0.7464 0.6743 0.8211
No log 3.5 210 0.6414 0.7457 0.6414 0.8008
No log 3.5333 212 0.6305 0.7418 0.6305 0.7940
No log 3.5667 214 0.6419 0.7560 0.6419 0.8012
No log 3.6 216 0.6933 0.7566 0.6933 0.8327
No log 3.6333 218 0.6631 0.7529 0.6631 0.8143
No log 3.6667 220 0.7310 0.7595 0.7310 0.8550
No log 3.7 222 0.9839 0.7015 0.9839 0.9919
No log 3.7333 224 1.1451 0.6695 1.1451 1.0701
No log 3.7667 226 1.0591 0.6855 1.0591 1.0291
No log 3.8 228 0.8516 0.7184 0.8516 0.9228
No log 3.8333 230 0.6980 0.7634 0.6980 0.8354
No log 3.8667 232 0.6159 0.7534 0.6159 0.7848
No log 3.9 234 0.6243 0.7451 0.6243 0.7901
No log 3.9333 236 0.6886 0.7319 0.6886 0.8298
No log 3.9667 238 0.8130 0.6941 0.8130 0.9017
No log 4.0 240 0.8938 0.6632 0.8938 0.9454
No log 4.0333 242 0.8504 0.6950 0.8504 0.9222
No log 4.0667 244 0.7111 0.7484 0.7111 0.8433
No log 4.1 246 0.6646 0.7401 0.6646 0.8152
No log 4.1333 248 0.6686 0.7516 0.6686 0.8177
No log 4.1667 250 0.7245 0.7484 0.7245 0.8512
No log 4.2 252 0.8454 0.7019 0.8454 0.9195
No log 4.2333 254 0.9662 0.6697 0.9662 0.9830
No log 4.2667 256 0.9237 0.6632 0.9237 0.9611
No log 4.3 258 0.7744 0.7282 0.7744 0.8800
No log 4.3333 260 0.6503 0.7531 0.6503 0.8064
No log 4.3667 262 0.6423 0.7493 0.6423 0.8014
No log 4.4 264 0.7297 0.7387 0.7297 0.8542
No log 4.4333 266 0.7962 0.7227 0.7962 0.8923
No log 4.4667 268 0.8584 0.7052 0.8584 0.9265
No log 4.5 270 0.9671 0.6628 0.9671 0.9834
No log 4.5333 272 0.8956 0.6752 0.8956 0.9464
No log 4.5667 274 0.7204 0.7302 0.7204 0.8488
No log 4.6 276 0.6011 0.7599 0.6011 0.7753
No log 4.6333 278 0.5880 0.7441 0.5880 0.7668
No log 4.6667 280 0.6219 0.7651 0.6219 0.7886
No log 4.7 282 0.7359 0.7296 0.7359 0.8578
No log 4.7333 284 0.8277 0.6892 0.8277 0.9098
No log 4.7667 286 0.8182 0.7056 0.8182 0.9045
No log 4.8 288 0.7346 0.7384 0.7346 0.8571
No log 4.8333 290 0.6150 0.7665 0.6150 0.7842
No log 4.8667 292 0.5883 0.7677 0.5883 0.7670
No log 4.9 294 0.6081 0.7739 0.6081 0.7798
No log 4.9333 296 0.7007 0.7633 0.7007 0.8371
No log 4.9667 298 0.8380 0.7209 0.8380 0.9154
No log 5.0 300 1.0822 0.6452 1.0822 1.0403
No log 5.0333 302 1.2118 0.6304 1.2118 1.1008
No log 5.0667 304 1.1400 0.6525 1.1400 1.0677
No log 5.1 306 0.9700 0.6713 0.9700 0.9849
No log 5.1333 308 0.8261 0.7221 0.8261 0.9089
No log 5.1667 310 0.8273 0.7016 0.8273 0.9096
No log 5.2 312 0.8030 0.6867 0.8030 0.8961
No log 5.2333 314 0.8124 0.6779 0.8124 0.9013
No log 5.2667 316 0.8823 0.6847 0.8823 0.9393
No log 5.3 318 0.8958 0.6829 0.8958 0.9465
No log 5.3333 320 0.8606 0.6973 0.8606 0.9277
No log 5.3667 322 0.8149 0.7098 0.8149 0.9027
No log 5.4 324 0.8332 0.7039 0.8332 0.9128
No log 5.4333 326 0.8154 0.7204 0.8154 0.9030
No log 5.4667 328 0.8316 0.7187 0.8316 0.9119
No log 5.5 330 0.8051 0.7248 0.8051 0.8973
No log 5.5333 332 0.8242 0.7303 0.8242 0.9079
No log 5.5667 334 0.8574 0.7169 0.8574 0.9260
No log 5.6 336 0.9066 0.6640 0.9066 0.9521
No log 5.6333 338 0.9968 0.6457 0.9968 0.9984
No log 5.6667 340 0.9537 0.6575 0.9537 0.9766
No log 5.7 342 0.8414 0.7049 0.8414 0.9173
No log 5.7333 344 0.7099 0.7532 0.7099 0.8425
No log 5.7667 346 0.6581 0.7520 0.6581 0.8112
No log 5.8 348 0.6408 0.7488 0.6408 0.8005
No log 5.8333 350 0.6461 0.7519 0.6461 0.8038
No log 5.8667 352 0.6783 0.7564 0.6783 0.8236
No log 5.9 354 0.7024 0.7724 0.7024 0.8381
No log 5.9333 356 0.7402 0.7572 0.7402 0.8604
No log 5.9667 358 0.7203 0.7544 0.7203 0.8487
No log 6.0 360 0.7166 0.7470 0.7166 0.8465
No log 6.0333 362 0.6648 0.7686 0.6648 0.8153
No log 6.0667 364 0.6432 0.7624 0.6432 0.8020
No log 6.1 366 0.6515 0.7727 0.6515 0.8071
No log 6.1333 368 0.6380 0.7727 0.6380 0.7987
No log 6.1667 370 0.6240 0.7679 0.6240 0.7900
No log 6.2 372 0.6243 0.7605 0.6243 0.7901
No log 6.2333 374 0.6526 0.7777 0.6526 0.8078
No log 6.2667 376 0.7178 0.7603 0.7178 0.8472
No log 6.3 378 0.7574 0.7552 0.7574 0.8703
No log 6.3333 380 0.7288 0.7639 0.7288 0.8537
No log 6.3667 382 0.7229 0.7481 0.7229 0.8502
No log 6.4 384 0.7313 0.7475 0.7313 0.8552
No log 6.4333 386 0.7343 0.7469 0.7343 0.8569
No log 6.4667 388 0.7024 0.7475 0.7024 0.8381
No log 6.5 390 0.6582 0.7834 0.6582 0.8113
No log 6.5333 392 0.6386 0.7609 0.6386 0.7991
No log 6.5667 394 0.6197 0.7470 0.6197 0.7872
No log 6.6 396 0.6231 0.7387 0.6231 0.7894
No log 6.6333 398 0.6470 0.7509 0.6470 0.8044
No log 6.6667 400 0.7180 0.7426 0.7180 0.8473
No log 6.7 402 0.7780 0.7353 0.7780 0.8820
No log 6.7333 404 0.8044 0.7290 0.8044 0.8969
No log 6.7667 406 0.7686 0.7509 0.7686 0.8767
No log 6.8 408 0.7110 0.7780 0.7110 0.8432
No log 6.8333 410 0.6735 0.7860 0.6735 0.8207
No log 6.8667 412 0.6716 0.7860 0.6716 0.8195
No log 6.9 414 0.6632 0.7826 0.6632 0.8144
No log 6.9333 416 0.6525 0.7826 0.6524 0.8077
No log 6.9667 418 0.6752 0.7821 0.6752 0.8217
No log 7.0 420 0.7382 0.7457 0.7382 0.8592
No log 7.0333 422 0.8468 0.7123 0.8468 0.9202
No log 7.0667 424 0.9020 0.6944 0.9020 0.9497
No log 7.1 426 0.8816 0.6962 0.8816 0.9389
No log 7.1333 428 0.8145 0.7330 0.8145 0.9025
No log 7.1667 430 0.7440 0.7416 0.7440 0.8626
No log 7.2 432 0.6909 0.7469 0.6909 0.8312
No log 7.2333 434 0.6566 0.7784 0.6566 0.8103
No log 7.2667 436 0.6490 0.7860 0.6490 0.8056
No log 7.3 438 0.6721 0.7745 0.6721 0.8198
No log 7.3333 440 0.7070 0.7699 0.7070 0.8408
No log 7.3667 442 0.7104 0.7699 0.7104 0.8428
No log 7.4 444 0.6761 0.7675 0.6761 0.8223
No log 7.4333 446 0.6440 0.7893 0.6440 0.8025
No log 7.4667 448 0.6361 0.7901 0.6361 0.7976
No log 7.5 450 0.6573 0.7852 0.6573 0.8108
No log 7.5333 452 0.7069 0.7740 0.7069 0.8408
No log 7.5667 454 0.7332 0.7694 0.7332 0.8563
No log 7.6 456 0.7357 0.7694 0.7357 0.8578
No log 7.6333 458 0.7105 0.7629 0.7105 0.8429
No log 7.6667 460 0.7032 0.7633 0.7032 0.8385
No log 7.7 462 0.7086 0.7669 0.7086 0.8418
No log 7.7333 464 0.6850 0.7681 0.6850 0.8277
No log 7.7667 466 0.6669 0.7681 0.6669 0.8166
No log 7.8 468 0.6648 0.7722 0.6648 0.8154
No log 7.8333 470 0.6787 0.7716 0.6787 0.8238
No log 7.8667 472 0.7150 0.7745 0.7150 0.8456
No log 7.9 474 0.7469 0.7733 0.7469 0.8642
No log 7.9333 476 0.7668 0.7649 0.7668 0.8757
No log 7.9667 478 0.7757 0.7609 0.7757 0.8808
No log 8.0 480 0.7767 0.7609 0.7767 0.8813
No log 8.0333 482 0.7710 0.7609 0.7710 0.8781
No log 8.0667 484 0.7476 0.7791 0.7476 0.8647
No log 8.1 486 0.7039 0.7621 0.7039 0.8390
No log 8.1333 488 0.6592 0.7595 0.6592 0.8119
No log 8.1667 490 0.6389 0.7595 0.6389 0.7993
No log 8.2 492 0.6243 0.7722 0.6243 0.7901
No log 8.2333 494 0.6301 0.7722 0.6301 0.7938
No log 8.2667 496 0.6463 0.7595 0.6463 0.8039
No log 8.3 498 0.6721 0.7650 0.6721 0.8198
0.3428 8.3333 500 0.6972 0.7717 0.6972 0.8350
0.3428 8.3667 502 0.7154 0.7705 0.7154 0.8458
0.3428 8.4 504 0.7253 0.7682 0.7253 0.8517
0.3428 8.4333 506 0.7331 0.7682 0.7331 0.8562
0.3428 8.4667 508 0.7437 0.7600 0.7437 0.8624
0.3428 8.5 510 0.7408 0.7563 0.7408 0.8607
0.3428 8.5333 512 0.7223 0.7554 0.7223 0.8499
0.3428 8.5667 514 0.7048 0.7554 0.7048 0.8395
0.3428 8.6 516 0.6917 0.7554 0.6917 0.8317
0.3428 8.6333 518 0.6907 0.7554 0.6907 0.8311
0.3428 8.6667 520 0.6979 0.7554 0.6979 0.8354
0.3428 8.7 522 0.6959 0.7554 0.6959 0.8342
0.3428 8.7333 524 0.7023 0.7628 0.7023 0.8380
0.3428 8.7667 526 0.7163 0.7628 0.7163 0.8464
0.3428 8.8 528 0.7373 0.7688 0.7373 0.8586
0.3428 8.8333 530 0.7690 0.7531 0.7690 0.8769
0.3428 8.8667 532 0.7964 0.7371 0.7964 0.8924
0.3428 8.9 534 0.8126 0.7371 0.8126 0.9014
0.3428 8.9333 536 0.8203 0.7371 0.8203 0.9057
0.3428 8.9667 538 0.8108 0.7391 0.8108 0.9005
0.3428 9.0 540 0.7882 0.7490 0.7882 0.8878
0.3428 9.0333 542 0.7767 0.7490 0.7767 0.8813
0.3428 9.0667 544 0.7651 0.7490 0.7651 0.8747
0.3428 9.1 546 0.7585 0.7490 0.7585 0.8709
0.3428 9.1333 548 0.7501 0.7531 0.7501 0.8661
0.3428 9.1667 550 0.7357 0.7572 0.7357 0.8577
0.3428 9.2 552 0.7297 0.7729 0.7297 0.8542
0.3428 9.2333 554 0.7288 0.7729 0.7288 0.8537
0.3428 9.2667 556 0.7235 0.7694 0.7235 0.8506
0.3428 9.3 558 0.7276 0.7694 0.7276 0.8530
0.3428 9.3333 560 0.7345 0.7729 0.7345 0.8570
0.3428 9.3667 562 0.7347 0.7729 0.7347 0.8571
0.3428 9.4 564 0.7350 0.7688 0.7350 0.8573
0.3428 9.4333 566 0.7280 0.7688 0.7280 0.8532
0.3428 9.4667 568 0.7204 0.7688 0.7204 0.8488
0.3428 9.5 570 0.7175 0.7576 0.7175 0.8470
0.3428 9.5333 572 0.7182 0.7576 0.7182 0.8475
0.3428 9.5667 574 0.7173 0.7576 0.7173 0.8469
0.3428 9.6 576 0.7139 0.7576 0.7139 0.8450
0.3428 9.6333 578 0.7110 0.7576 0.7110 0.8432
0.3428 9.6667 580 0.7062 0.7502 0.7062 0.8404
0.3428 9.7 582 0.7051 0.7502 0.7051 0.8397
0.3428 9.7333 584 0.7041 0.7544 0.7041 0.8391
0.3428 9.7667 586 0.7013 0.7544 0.7013 0.8374
0.3428 9.8 588 0.6984 0.7544 0.6984 0.8357
0.3428 9.8333 590 0.6979 0.7735 0.6979 0.8354
0.3428 9.8667 592 0.6992 0.7735 0.6992 0.8362
0.3428 9.9 594 0.7016 0.7617 0.7016 0.8376
0.3428 9.9333 596 0.7038 0.7617 0.7038 0.8389
0.3428 9.9667 598 0.7050 0.7617 0.7050 0.8396
0.3428 10.0 600 0.7052 0.7617 0.7052 0.8397

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
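
To reproduce the environment, the listed versions can be pinned in a requirements file (package names assumed to be the standard PyPI ones; the `+cu118` build of PyTorch is served from the PyTorch wheel index rather than PyPI):

```
transformers==4.44.2
torch==2.4.0+cu118
datasets==2.21.0
tokenizers==0.19.1
```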
Full model ID: MayBashendy/ArabicNewSplits4_WithDuplicationsForScore5_FineTuningAraBERT_run2_AugV5_k18_task5_organization