ArabicNewSplits6_WithDuplicationsForScore5_FineTuningAraBERT_run2_AugV5_k15_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7004
  • Qwk: 0.7741
  • Mse: 0.7004
  • Rmse: 0.8369
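
The reported metrics are internally consistent: Rmse is the square root of Mse, and Loss equals Mse, which suggests a regression-style objective (an inference from the numbers; the card does not state the loss function). A minimal check:

```python
import math

# Reported evaluation metrics from this card.
mse = 0.7004

# RMSE is the square root of MSE; this reproduces the reported 0.8369.
rmse = math.sqrt(mse)
print(round(rmse, 4))
```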

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
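
The linear scheduler decays the learning rate from its initial value to zero over the course of training. A minimal sketch of that schedule, assuming no warmup steps and taking the 590 total optimizer steps from the final row of the results table below:

```python
def linear_lr(step, base_lr=2e-05, total_steps=590):
    """Linear decay from base_lr at step 0 to zero at total_steps."""
    remaining = max(0.0, (total_steps - step) / total_steps)
    return base_lr * remaining

print(linear_lr(0))    # initial learning rate, 2e-05
print(linear_lr(295))  # halfway through training, 1e-05
print(linear_lr(590))  # fully decayed to 0.0
```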

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0339 2 2.5535 -0.0395 2.5535 1.5980
No log 0.0678 4 1.7808 0.0434 1.7808 1.3345
No log 0.1017 6 1.5752 0.1031 1.5752 1.2551
No log 0.1356 8 1.5605 0.1626 1.5605 1.2492
No log 0.1695 10 1.4656 0.3077 1.4656 1.2106
No log 0.2034 12 1.3634 0.1963 1.3634 1.1677
No log 0.2373 14 1.5370 0.3120 1.5370 1.2397
No log 0.2712 16 1.5821 0.3234 1.5821 1.2578
No log 0.3051 18 1.4941 0.3516 1.4941 1.2223
No log 0.3390 20 1.4911 0.3521 1.4911 1.2211
No log 0.3729 22 1.4540 0.3714 1.4540 1.2058
No log 0.4068 24 1.4053 0.3388 1.4053 1.1855
No log 0.4407 26 1.4169 0.3535 1.4169 1.1903
No log 0.4746 28 1.3647 0.3744 1.3647 1.1682
No log 0.5085 30 1.2968 0.4254 1.2968 1.1388
No log 0.5424 32 1.1226 0.4237 1.1226 1.0595
No log 0.5763 34 1.1507 0.4329 1.1507 1.0727
No log 0.6102 36 1.1458 0.4382 1.1458 1.0704
No log 0.6441 38 1.3604 0.4536 1.3604 1.1663
No log 0.6780 40 1.2900 0.4494 1.2900 1.1358
No log 0.7119 42 1.0355 0.4563 1.0355 1.0176
No log 0.7458 44 0.9951 0.3878 0.9951 0.9976
No log 0.7797 46 0.9760 0.4971 0.9760 0.9879
No log 0.8136 48 0.9119 0.5507 0.9119 0.9549
No log 0.8475 50 0.9863 0.5769 0.9863 0.9931
No log 0.8814 52 0.8934 0.6141 0.8934 0.9452
No log 0.9153 54 0.7858 0.6512 0.7858 0.8865
No log 0.9492 56 0.7705 0.6477 0.7705 0.8778
No log 0.9831 58 0.8188 0.5825 0.8188 0.9049
No log 1.0169 60 0.8124 0.5825 0.8124 0.9013
No log 1.0508 62 0.7565 0.6417 0.7565 0.8698
No log 1.0847 64 1.1370 0.5420 1.1370 1.0663
No log 1.1186 66 1.5848 0.4830 1.5848 1.2589
No log 1.1525 68 1.5213 0.4839 1.5213 1.2334
No log 1.1864 70 1.1964 0.4971 1.1964 1.0938
No log 1.2203 72 0.9206 0.5187 0.9206 0.9595
No log 1.2542 74 0.8842 0.5365 0.8842 0.9403
No log 1.2881 76 1.0279 0.5423 1.0279 1.0139
No log 1.3220 78 1.2401 0.5106 1.2401 1.1136
No log 1.3559 80 1.3180 0.5123 1.3180 1.1481
No log 1.3898 82 1.1695 0.5588 1.1695 1.0815
No log 1.4237 84 1.0579 0.5952 1.0579 1.0285
No log 1.4576 86 0.8970 0.6280 0.8970 0.9471
No log 1.4915 88 0.8800 0.6697 0.8800 0.9381
No log 1.5254 90 0.8522 0.6842 0.8522 0.9231
No log 1.5593 92 0.9022 0.6889 0.9022 0.9499
No log 1.5932 94 0.8306 0.6963 0.8306 0.9114
No log 1.6271 96 0.7751 0.7084 0.7751 0.8804
No log 1.6610 98 0.7599 0.7016 0.7599 0.8717
No log 1.6949 100 0.9054 0.6870 0.9054 0.9515
No log 1.7288 102 1.0261 0.6568 1.0261 1.0130
No log 1.7627 104 0.8367 0.6784 0.8367 0.9147
No log 1.7966 106 0.7303 0.7025 0.7303 0.8546
No log 1.8305 108 0.7112 0.7215 0.7112 0.8434
No log 1.8644 110 0.7868 0.6977 0.7868 0.8870
No log 1.8983 112 1.1436 0.5947 1.1436 1.0694
No log 1.9322 114 1.2781 0.5569 1.2781 1.1306
No log 1.9661 116 1.0152 0.6064 1.0152 1.0076
No log 2.0 118 0.7578 0.7296 0.7578 0.8705
No log 2.0339 120 0.6723 0.7324 0.6723 0.8199
No log 2.0678 122 0.6740 0.7317 0.6740 0.8210
No log 2.1017 124 0.7401 0.7319 0.7401 0.8603
No log 2.1356 126 0.8948 0.6723 0.8948 0.9460
No log 2.1695 128 0.9480 0.6632 0.9480 0.9737
No log 2.2034 130 0.9294 0.6577 0.9294 0.9641
No log 2.2373 132 0.7985 0.7112 0.7985 0.8936
No log 2.2712 134 0.7901 0.7339 0.7901 0.8889
No log 2.3051 136 0.7775 0.7370 0.7775 0.8818
No log 2.3390 138 0.8251 0.7166 0.8251 0.9083
No log 2.3729 140 0.8854 0.6562 0.8854 0.9410
No log 2.4068 142 0.7859 0.7151 0.7859 0.8865
No log 2.4407 144 0.7084 0.7181 0.7084 0.8417
No log 2.4746 146 0.7101 0.7200 0.7101 0.8427
No log 2.5085 148 0.7546 0.7490 0.7546 0.8687
No log 2.5424 150 0.7889 0.7610 0.7889 0.8882
No log 2.5763 152 0.8524 0.7317 0.8524 0.9233
No log 2.6102 154 0.7881 0.7584 0.7881 0.8878
No log 2.6441 156 0.7607 0.7389 0.7607 0.8722
No log 2.6780 158 0.6850 0.7386 0.6850 0.8277
No log 2.7119 160 0.6734 0.7475 0.6734 0.8206
No log 2.7458 162 0.7215 0.7376 0.7215 0.8494
No log 2.7797 164 0.7390 0.7534 0.7390 0.8597
No log 2.8136 166 0.7474 0.7560 0.7474 0.8645
No log 2.8475 168 0.7978 0.7539 0.7978 0.8932
No log 2.8814 170 0.7059 0.7962 0.7059 0.8402
No log 2.9153 172 0.6297 0.7784 0.6297 0.7935
No log 2.9492 174 0.6690 0.7794 0.6690 0.8179
No log 2.9831 176 0.7101 0.7787 0.7101 0.8427
No log 3.0169 178 0.7566 0.7675 0.7566 0.8698
No log 3.0508 180 0.7254 0.7679 0.7254 0.8517
No log 3.0847 182 0.6974 0.7733 0.6974 0.8351
No log 3.1186 184 0.6492 0.7573 0.6492 0.8057
No log 3.1525 186 0.5987 0.7327 0.5987 0.7738
No log 3.1864 188 0.5962 0.7255 0.5962 0.7722
No log 3.2203 190 0.5736 0.7372 0.5736 0.7574
No log 3.2542 192 0.6120 0.7811 0.6120 0.7823
No log 3.2881 194 0.6404 0.7859 0.6404 0.8002
No log 3.3220 196 0.7079 0.7645 0.7079 0.8414
No log 3.3559 198 0.6600 0.7692 0.6600 0.8124
No log 3.3898 200 0.6865 0.7662 0.6865 0.8286
No log 3.4237 202 0.6961 0.7662 0.6961 0.8343
No log 3.4576 204 0.7481 0.7035 0.7481 0.8650
No log 3.4915 206 0.7969 0.6742 0.7969 0.8927
No log 3.5254 208 0.8554 0.6539 0.8554 0.9249
No log 3.5593 210 0.7645 0.7260 0.7645 0.8744
No log 3.5932 212 0.6551 0.7679 0.6551 0.8094
No log 3.6271 214 0.6540 0.7925 0.6540 0.8087
No log 3.6610 216 0.7797 0.7192 0.7797 0.8830
No log 3.6949 218 0.8095 0.7068 0.8095 0.8997
No log 3.7288 220 0.7056 0.7668 0.7056 0.8400
No log 3.7627 222 0.6683 0.7892 0.6683 0.8175
No log 3.7966 224 0.7538 0.7222 0.7538 0.8682
No log 3.8305 226 0.8286 0.7174 0.8286 0.9103
No log 3.8644 228 0.8785 0.7071 0.8785 0.9373
No log 3.8983 230 0.7613 0.7261 0.7613 0.8725
No log 3.9322 232 0.6385 0.7765 0.6385 0.7991
No log 3.9661 234 0.5829 0.7600 0.5829 0.7635
No log 4.0 236 0.5877 0.7576 0.5877 0.7666
No log 4.0339 238 0.6129 0.7872 0.6129 0.7829
No log 4.0678 240 0.5980 0.7745 0.5980 0.7733
No log 4.1017 242 0.6317 0.7786 0.6317 0.7948
No log 4.1356 244 0.7117 0.7686 0.7117 0.8436
No log 4.1695 246 0.6913 0.7732 0.6913 0.8315
No log 4.2034 248 0.6731 0.7732 0.6731 0.8204
No log 4.2373 250 0.6317 0.7844 0.6317 0.7948
No log 4.2712 252 0.5910 0.7858 0.5910 0.7688
No log 4.3051 254 0.5564 0.7410 0.5564 0.7459
No log 4.3390 256 0.5555 0.7372 0.5555 0.7453
No log 4.3729 258 0.5657 0.7599 0.5657 0.7522
No log 4.4068 260 0.6419 0.7733 0.6419 0.8012
No log 4.4407 262 0.7612 0.7209 0.7612 0.8725
No log 4.4746 264 0.7662 0.6976 0.7662 0.8754
No log 4.5085 266 0.6591 0.7556 0.6591 0.8118
No log 4.5424 268 0.5974 0.7583 0.5974 0.7729
No log 4.5763 270 0.5865 0.7798 0.5865 0.7658
No log 4.6102 272 0.6464 0.7507 0.6464 0.8040
No log 4.6441 274 0.6783 0.7615 0.6783 0.8236
No log 4.6780 276 0.6303 0.7674 0.6303 0.7939
No log 4.7119 278 0.5967 0.7826 0.5967 0.7725
No log 4.7458 280 0.5946 0.7852 0.5946 0.7711
No log 4.7797 282 0.6727 0.7692 0.6727 0.8202
No log 4.8136 284 0.8160 0.7127 0.8160 0.9033
No log 4.8475 286 0.8167 0.7129 0.8167 0.9037
No log 4.8814 288 0.7094 0.7722 0.7094 0.8423
No log 4.9153 290 0.6707 0.7727 0.6707 0.8190
No log 4.9492 292 0.6929 0.7770 0.6929 0.8324
No log 4.9831 294 0.6957 0.7729 0.6957 0.8341
No log 5.0169 296 0.6643 0.7799 0.6643 0.8151
No log 5.0508 298 0.6485 0.7685 0.6485 0.8053
No log 5.0847 300 0.6831 0.7629 0.6831 0.8265
No log 5.1186 302 0.6867 0.7640 0.6867 0.8287
No log 5.1525 304 0.6979 0.7409 0.6979 0.8354
No log 5.1864 306 0.7702 0.7430 0.7702 0.8776
No log 5.2203 308 0.8015 0.7350 0.8015 0.8953
No log 5.2542 310 0.8086 0.7179 0.8086 0.8992
No log 5.2881 312 0.7684 0.7316 0.7684 0.8766
No log 5.3220 314 0.6819 0.7661 0.6819 0.8258
No log 5.3559 316 0.6365 0.7684 0.6365 0.7978
No log 5.3898 318 0.6530 0.7640 0.6530 0.8081
No log 5.4237 320 0.7041 0.7787 0.7041 0.8391
No log 5.4576 322 0.7879 0.7051 0.7879 0.8876
No log 5.4915 324 0.8299 0.6905 0.8299 0.9110
No log 5.5254 326 0.7717 0.7240 0.7717 0.8785
No log 5.5593 328 0.7572 0.7421 0.7572 0.8702
No log 5.5932 330 0.7509 0.7520 0.7509 0.8665
No log 5.6271 332 0.7339 0.7445 0.7339 0.8567
No log 5.6610 334 0.7508 0.7168 0.7508 0.8665
No log 5.6949 336 0.7657 0.7358 0.7657 0.8750
No log 5.7288 338 0.7496 0.7358 0.7496 0.8658
No log 5.7627 340 0.7413 0.7322 0.7413 0.8610
No log 5.7966 342 0.7654 0.7357 0.7654 0.8749
No log 5.8305 344 0.8613 0.7112 0.8613 0.9280
No log 5.8644 346 0.9914 0.6711 0.9914 0.9957
No log 5.8983 348 1.0493 0.6628 1.0493 1.0244
No log 5.9322 350 1.0139 0.6555 1.0139 1.0069
No log 5.9661 352 0.8557 0.6996 0.8557 0.9250
No log 6.0 354 0.6880 0.7473 0.6880 0.8294
No log 6.0339 356 0.6201 0.7549 0.6201 0.7874
No log 6.0678 358 0.6132 0.7677 0.6132 0.7831
No log 6.1017 360 0.6456 0.7539 0.6456 0.8035
No log 6.1356 362 0.7399 0.7454 0.7399 0.8602
No log 6.1695 364 0.8615 0.7066 0.8615 0.9282
No log 6.2034 366 0.9102 0.6830 0.9102 0.9541
No log 6.2373 368 0.8739 0.7078 0.8739 0.9348
No log 6.2712 370 0.8326 0.7074 0.8326 0.9125
No log 6.3051 372 0.8469 0.7041 0.8469 0.9203
No log 6.3390 374 0.8729 0.6890 0.8729 0.9343
No log 6.3729 376 0.9098 0.6766 0.9098 0.9538
No log 6.4068 378 0.8619 0.6988 0.8619 0.9284
No log 6.4407 380 0.7908 0.7253 0.7908 0.8892
No log 6.4746 382 0.7318 0.7045 0.7318 0.8554
No log 6.5085 384 0.7103 0.7272 0.7103 0.8428
No log 6.5424 386 0.7252 0.7061 0.7252 0.8516
No log 6.5763 388 0.7373 0.7017 0.7373 0.8587
No log 6.6102 390 0.7476 0.7137 0.7476 0.8647
No log 6.6441 392 0.7693 0.7254 0.7693 0.8771
No log 6.6780 394 0.7792 0.7309 0.7792 0.8827
No log 6.7119 396 0.7231 0.7504 0.7231 0.8504
No log 6.7458 398 0.6936 0.7765 0.6936 0.8328
No log 6.7797 400 0.6972 0.7552 0.6972 0.8350
No log 6.8136 402 0.7390 0.7358 0.7390 0.8597
No log 6.8475 404 0.7493 0.7372 0.7493 0.8656
No log 6.8814 406 0.7039 0.7490 0.7039 0.8390
No log 6.9153 408 0.6447 0.7619 0.6447 0.8029
No log 6.9492 410 0.6304 0.7560 0.6304 0.7940
No log 6.9831 412 0.6494 0.7696 0.6494 0.8058
No log 7.0169 414 0.7023 0.7631 0.7023 0.8380
No log 7.0508 416 0.7398 0.7498 0.7398 0.8601
No log 7.0847 418 0.7494 0.7456 0.7494 0.8657
No log 7.1186 420 0.7558 0.7413 0.7558 0.8693
No log 7.1525 422 0.7326 0.7616 0.7326 0.8559
No log 7.1864 424 0.7365 0.7498 0.7365 0.8582
No log 7.2203 426 0.7454 0.7476 0.7454 0.8634
No log 7.2542 428 0.7060 0.7598 0.7060 0.8402
No log 7.2881 430 0.6951 0.7622 0.6951 0.8337
No log 7.3220 432 0.6804 0.7647 0.6804 0.8249
No log 7.3559 434 0.6639 0.7536 0.6639 0.8148
No log 7.3898 436 0.6786 0.7622 0.6786 0.8238
No log 7.4237 438 0.6809 0.7698 0.6809 0.8252
No log 7.4576 440 0.6582 0.7651 0.6582 0.8113
No log 7.4915 442 0.6503 0.7696 0.6503 0.8064
No log 7.5254 444 0.6595 0.7801 0.6595 0.8121
No log 7.5593 446 0.6470 0.7771 0.6470 0.8043
No log 7.5932 448 0.6425 0.7771 0.6425 0.8016
No log 7.6271 450 0.6416 0.7801 0.6416 0.8010
No log 7.6610 452 0.6469 0.7801 0.6469 0.8043
No log 7.6949 454 0.6808 0.7788 0.6808 0.8251
No log 7.7288 456 0.7297 0.7593 0.7297 0.8542
No log 7.7627 458 0.7702 0.7350 0.7702 0.8776
No log 7.7966 460 0.7738 0.7268 0.7738 0.8796
No log 7.8305 462 0.7535 0.7387 0.7535 0.8680
No log 7.8644 464 0.7446 0.7508 0.7446 0.8629
No log 7.8983 466 0.7722 0.7289 0.7722 0.8787
No log 7.9322 468 0.7752 0.7407 0.7752 0.8805
No log 7.9661 470 0.7399 0.7472 0.7399 0.8602
No log 8.0 472 0.7334 0.7416 0.7334 0.8564
No log 8.0339 474 0.7577 0.7470 0.7577 0.8705
No log 8.0678 476 0.7750 0.7505 0.7750 0.8804
No log 8.1017 478 0.7750 0.7505 0.7750 0.8804
No log 8.1356 480 0.7560 0.7505 0.7560 0.8695
No log 8.1695 482 0.7242 0.7662 0.7242 0.8510
No log 8.2034 484 0.6876 0.7607 0.6876 0.8292
No log 8.2373 486 0.6791 0.7550 0.6791 0.8240
No log 8.2712 488 0.6895 0.7607 0.6895 0.8304
No log 8.3051 490 0.6952 0.7607 0.6952 0.8338
No log 8.3390 492 0.7129 0.7602 0.7129 0.8443
No log 8.3729 494 0.7400 0.7569 0.7400 0.8602
No log 8.4068 496 0.7398 0.7569 0.7398 0.8601
No log 8.4407 498 0.7287 0.7533 0.7287 0.8536
0.3032 8.4746 500 0.7237 0.7711 0.7237 0.8507
0.3032 8.5085 502 0.7212 0.7753 0.7212 0.8493
0.3032 8.5424 504 0.7206 0.7575 0.7206 0.8489
0.3032 8.5763 506 0.7164 0.7871 0.7164 0.8464
0.3032 8.6102 508 0.7223 0.7611 0.7223 0.8499
0.3032 8.6441 510 0.7405 0.7495 0.7405 0.8605
0.3032 8.6780 512 0.7561 0.7398 0.7561 0.8695
0.3032 8.7119 514 0.7589 0.7432 0.7589 0.8712
0.3032 8.7458 516 0.7548 0.7511 0.7548 0.8688
0.3032 8.7797 518 0.7684 0.7448 0.7684 0.8766
0.3032 8.8136 520 0.7818 0.7234 0.7818 0.8842
0.3032 8.8475 522 0.7774 0.7234 0.7774 0.8817
0.3032 8.8814 524 0.7647 0.7330 0.7647 0.8744
0.3032 8.9153 526 0.7343 0.7569 0.7343 0.8569
0.3032 8.9492 528 0.7100 0.7579 0.7100 0.8426
0.3032 8.9831 530 0.6995 0.7705 0.6995 0.8364
0.3032 9.0169 532 0.6992 0.7705 0.6992 0.8362
0.3032 9.0508 534 0.7104 0.7575 0.7104 0.8428
0.3032 9.0847 536 0.7184 0.7610 0.7184 0.8476
0.3032 9.1186 538 0.7349 0.7569 0.7349 0.8573
0.3032 9.1525 540 0.7562 0.7470 0.7562 0.8696
0.3032 9.1864 542 0.7631 0.7470 0.7631 0.8735
0.3032 9.2203 544 0.7713 0.7295 0.7713 0.8782
0.3032 9.2542 546 0.7732 0.7295 0.7732 0.8793
0.3032 9.2881 548 0.7775 0.7295 0.7775 0.8818
0.3032 9.3220 550 0.7836 0.7179 0.7836 0.8852
0.3032 9.3559 552 0.7778 0.7179 0.7778 0.8819
0.3032 9.3898 554 0.7674 0.7254 0.7674 0.8760
0.3032 9.4237 556 0.7579 0.7470 0.7579 0.8706
0.3032 9.4576 558 0.7488 0.7470 0.7488 0.8653
0.3032 9.4915 560 0.7371 0.7569 0.7371 0.8585
0.3032 9.5254 562 0.7245 0.7693 0.7245 0.8512
0.3032 9.5593 564 0.7128 0.7699 0.7128 0.8443
0.3032 9.5932 566 0.7025 0.7699 0.7025 0.8381
0.3032 9.6271 568 0.6915 0.7705 0.6915 0.8315
0.3032 9.6610 570 0.6806 0.7636 0.6806 0.8250
0.3032 9.6949 572 0.6750 0.7636 0.6750 0.8216
0.3032 9.7288 574 0.6740 0.7759 0.6740 0.8210
0.3032 9.7627 576 0.6750 0.7636 0.6750 0.8216
0.3032 9.7966 578 0.6784 0.7636 0.6784 0.8236
0.3032 9.8305 580 0.6831 0.7692 0.6831 0.8265
0.3032 9.8644 582 0.6883 0.7649 0.6883 0.8297
0.3032 9.8983 584 0.6932 0.7649 0.6932 0.8326
0.3032 9.9322 586 0.6968 0.7741 0.6968 0.8348
0.3032 9.9661 588 0.6994 0.7741 0.6994 0.8363
0.3032 10.0 590 0.7004 0.7741 0.7004 0.8369
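
The Qwk column above is Quadratic Weighted Kappa, an agreement metric for ordinal labels that penalizes disagreements by the squared distance between classes. A pure-Python sketch of the standard formulation (the example labels below are illustrative; the card does not state the score range for this task):

```python
def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic Weighted Kappa: 1 - (weighted observed / weighted expected)."""
    # Observed confusion matrix.
    obs = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        obs[t][p] += 1
    n = len(y_true)
    # Marginal histograms of true and predicted labels.
    hist_t = [sum(row) for row in obs]
    hist_p = [sum(obs[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = ((i - j) ** 2) / ((n_classes - 1) ** 2)  # quadratic weight
            expected = hist_t[i] * hist_p[j] / n
            num += w * obs[i][j]
            den += w * expected
    return 1.0 - num / den

print(quadratic_weighted_kappa([0, 1, 2, 3], [0, 1, 2, 3], 5))  # 1.0 for perfect agreement
```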

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model tree

  • MayBashendy/ArabicNewSplits6_WithDuplicationsForScore5_FineTuningAraBERT_run2_AugV5_k15_task5_organization, fine-tuned from aubmindlab/bert-base-arabertv02