ArabicNewSplits7_FineTuningAraBERT_run1_AugV5_k15_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9244
  • Qwk: 0.6569
  • Mse: 0.9244
  • Rmse: 0.9615
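Here Qwk is the quadratic weighted kappa between predicted and gold integer scores, and Rmse is simply the square root of the reported Mse (√0.9244 ≈ 0.9615, matching the two figures above). A minimal pure-Python sketch of these metrics, assuming integer labels in `0..num_labels-1`:

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, num_labels):
    """Quadratic weighted kappa between two integer label sequences."""
    # Confusion matrix of observed ratings.
    observed = [[0] * num_labels for _ in range(num_labels)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # Label histograms; their outer product gives the chance-agreement matrix.
    hist_true = [y_true.count(i) for i in range(num_labels)]
    hist_pred = [y_pred.count(i) for i in range(num_labels)]
    total = len(y_true)
    denom = (num_labels - 1) ** 2
    num = exp = 0.0
    for i in range(num_labels):
        for j in range(num_labels):
            w = (i - j) ** 2 / denom  # quadratic disagreement weight
            num += w * observed[i][j]
            exp += w * hist_true[i] * hist_pred[j] / total
    return 1.0 - num / exp

def mse(y_true, y_pred):
    """Mean squared error between two score sequences."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# RMSE is the square root of MSE, as in the card's 0.9244 -> 0.9615.
print(round(math.sqrt(0.9244), 4))  # 0.9615
```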

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
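As a sketch, the settings above correspond to a transformers `TrainingArguments` configuration along the following lines (the output path is an illustrative placeholder, not taken from the card):

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above; output_dir is a hypothetical path.
training_args = TrainingArguments(
    output_dir="./arabert_task1_organization",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```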

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0290 2 6.8057 0.0116 6.8057 2.6088
No log 0.0580 4 4.2094 0.0769 4.2094 2.0517
No log 0.0870 6 2.9070 0.0633 2.9070 1.7050
No log 0.1159 8 2.6686 0.1351 2.6686 1.6336
No log 0.1449 10 2.3573 0.1343 2.3573 1.5354
No log 0.1739 12 2.7239 0.0282 2.7239 1.6504
No log 0.2029 14 2.2371 0.0889 2.2371 1.4957
No log 0.2319 16 1.7141 0.2807 1.7141 1.3092
No log 0.2609 18 1.8341 0.3306 1.8341 1.3543
No log 0.2899 20 2.3292 0.0544 2.3292 1.5262
No log 0.3188 22 2.3957 0.0400 2.3957 1.5478
No log 0.3478 24 2.0965 0.1233 2.0965 1.4479
No log 0.3768 26 1.8161 0.384 1.8161 1.3476
No log 0.4058 28 1.5903 0.3590 1.5903 1.2611
No log 0.4348 30 1.3912 0.2778 1.3912 1.1795
No log 0.4638 32 1.2788 0.2430 1.2788 1.1308
No log 0.4928 34 1.1975 0.4174 1.1975 1.0943
No log 0.5217 36 1.2841 0.4341 1.2841 1.1332
No log 0.5507 38 2.5178 0.2935 2.5178 1.5867
No log 0.5797 40 2.4255 0.2857 2.4255 1.5574
No log 0.6087 42 1.7457 0.4789 1.7457 1.3212
No log 0.6377 44 1.2953 0.4333 1.2953 1.1381
No log 0.6667 46 1.3130 0.3571 1.3130 1.1458
No log 0.6957 48 1.4378 0.2056 1.4378 1.1991
No log 0.7246 50 1.4610 0.2075 1.4610 1.2087
No log 0.7536 52 1.4353 0.2407 1.4353 1.1980
No log 0.7826 54 1.5336 0.3607 1.5336 1.2384
No log 0.8116 56 1.7093 0.2687 1.7093 1.3074
No log 0.8406 58 1.6430 0.2923 1.6430 1.2818
No log 0.8696 60 1.4367 0.3802 1.4367 1.1986
No log 0.8986 62 1.4254 0.2407 1.4254 1.1939
No log 0.9275 64 1.8093 0.1346 1.8093 1.3451
No log 0.9565 66 1.7916 0.1714 1.7916 1.3385
No log 0.9855 68 1.5465 0.2243 1.5465 1.2436
No log 1.0145 70 1.3842 0.2364 1.3842 1.1765
No log 1.0435 72 1.4016 0.3333 1.4016 1.1839
No log 1.0725 74 1.3660 0.3390 1.3660 1.1688
No log 1.1014 76 1.2754 0.4553 1.2754 1.1293
No log 1.1304 78 1.3015 0.5082 1.3015 1.1408
No log 1.1594 80 1.2437 0.5354 1.2437 1.1152
No log 1.1884 82 1.3256 0.4921 1.3256 1.1514
No log 1.2174 84 1.4786 0.3817 1.4786 1.2160
No log 1.2464 86 1.7815 0.1832 1.7815 1.3347
No log 1.2754 88 1.7394 0.2901 1.7394 1.3189
No log 1.3043 90 1.2811 0.5038 1.2811 1.1318
No log 1.3333 92 1.0823 0.5865 1.0823 1.0403
No log 1.3623 94 1.0607 0.5538 1.0607 1.0299
No log 1.3913 96 1.3137 0.528 1.3137 1.1462
No log 1.4203 98 1.5357 0.3697 1.5357 1.2393
No log 1.4493 100 1.4533 0.4167 1.4533 1.2055
No log 1.4783 102 1.2730 0.5581 1.2730 1.1283
No log 1.5072 104 1.2033 0.5581 1.2033 1.0970
No log 1.5362 106 1.1556 0.5669 1.1556 1.0750
No log 1.5652 108 1.1957 0.5909 1.1957 1.0935
No log 1.5942 110 1.2434 0.5564 1.2434 1.1151
No log 1.6232 112 1.3718 0.4812 1.3718 1.1712
No log 1.6522 114 1.2547 0.5385 1.2547 1.1201
No log 1.6812 116 0.9192 0.6119 0.9192 0.9588
No log 1.7101 118 0.8640 0.6944 0.8640 0.9295
No log 1.7391 120 1.0505 0.5538 1.0505 1.0249
No log 1.7681 122 1.2803 0.4409 1.2803 1.1315
No log 1.7971 124 1.1627 0.5224 1.1627 1.0783
No log 1.8261 126 0.8451 0.6338 0.8451 0.9193
No log 1.8551 128 0.8020 0.6993 0.8020 0.8955
No log 1.8841 130 0.8991 0.6338 0.8991 0.9482
No log 1.9130 132 1.0732 0.5496 1.0732 1.0360
No log 1.9420 134 1.3856 0.5263 1.3856 1.1771
No log 1.9710 136 1.2010 0.5481 1.2010 1.0959
No log 2.0 138 0.8493 0.5957 0.8493 0.9216
No log 2.0290 140 0.9199 0.5857 0.9199 0.9591
No log 2.0580 142 1.0488 0.5522 1.0488 1.0241
No log 2.0870 144 0.9606 0.5957 0.9606 0.9801
No log 2.1159 146 0.9583 0.6069 0.9583 0.9789
No log 2.1449 148 1.2741 0.5072 1.2741 1.1287
No log 2.1739 150 1.5194 0.4196 1.5194 1.2327
No log 2.2029 152 1.3516 0.4714 1.3516 1.1626
No log 2.2319 154 1.1730 0.6099 1.1730 1.0831
No log 2.2609 156 1.0860 0.6438 1.0860 1.0421
No log 2.2899 158 1.1857 0.5417 1.1857 1.0889
No log 2.3188 160 1.0990 0.6 1.0990 1.0483
No log 2.3478 162 0.8786 0.7248 0.8786 0.9373
No log 2.3768 164 0.8468 0.7448 0.8468 0.9202
No log 2.4058 166 0.8967 0.6713 0.8967 0.9469
No log 2.4348 168 0.9288 0.6331 0.9288 0.9637
No log 2.4638 170 1.0102 0.5942 1.0102 1.0051
No log 2.4928 172 0.9579 0.6040 0.9579 0.9787
No log 2.5217 174 0.7905 0.7114 0.7905 0.8891
No log 2.5507 176 0.6830 0.7285 0.6830 0.8265
No log 2.5797 178 0.7408 0.7075 0.7408 0.8607
No log 2.6087 180 0.9707 0.6119 0.9707 0.9853
No log 2.6377 182 1.2053 0.5401 1.2053 1.0979
No log 2.6667 184 1.1616 0.5547 1.1616 1.0778
No log 2.6957 186 0.8990 0.6892 0.8990 0.9481
No log 2.7246 188 0.7399 0.7020 0.7399 0.8602
No log 2.7536 190 0.7211 0.7020 0.7211 0.8492
No log 2.7826 192 0.8606 0.6846 0.8606 0.9277
No log 2.8116 194 1.1204 0.5109 1.1204 1.0585
No log 2.8406 196 1.0664 0.5294 1.0664 1.0327
No log 2.8696 198 0.9710 0.6029 0.9710 0.9854
No log 2.8986 200 0.7849 0.6667 0.7849 0.8859
No log 2.9275 202 0.7964 0.6331 0.7964 0.8924
No log 2.9565 204 0.7014 0.7260 0.7014 0.8375
No log 2.9855 206 0.6727 0.7432 0.6727 0.8202
No log 3.0145 208 0.7417 0.7248 0.7417 0.8612
No log 3.0435 210 0.9042 0.6389 0.9042 0.9509
No log 3.0725 212 0.8671 0.6575 0.8671 0.9312
No log 3.1014 214 0.8386 0.6712 0.8386 0.9158
No log 3.1304 216 0.6928 0.7451 0.6928 0.8323
No log 3.1594 218 0.7017 0.7564 0.7017 0.8377
No log 3.1884 220 0.7418 0.75 0.7418 0.8613
No log 3.2174 222 0.8116 0.6418 0.8116 0.9009
No log 3.2464 224 0.9879 0.6061 0.9879 0.9939
No log 3.2754 226 1.2847 0.5362 1.2847 1.1334
No log 3.3043 228 1.4357 0.4085 1.4357 1.1982
No log 3.3333 230 1.2651 0.5072 1.2651 1.1248
No log 3.3623 232 0.9504 0.6119 0.9504 0.9749
No log 3.3913 234 0.7920 0.7194 0.7920 0.8899
No log 3.4203 236 0.8297 0.6528 0.8297 0.9109
No log 3.4493 238 0.7715 0.6950 0.7715 0.8784
No log 3.4783 240 0.8104 0.6963 0.8104 0.9002
No log 3.5072 242 1.0739 0.5865 1.0739 1.0363
No log 3.5362 244 1.2954 0.5109 1.2954 1.1381
No log 3.5652 246 1.3058 0.5109 1.3058 1.1427
No log 3.5942 248 1.1608 0.5109 1.1608 1.0774
No log 3.6232 250 0.8902 0.6316 0.8902 0.9435
No log 3.6522 252 0.7779 0.6667 0.7779 0.8820
No log 3.6812 254 0.7839 0.7536 0.7839 0.8854
No log 3.7101 256 0.8161 0.7068 0.8161 0.9034
No log 3.7391 258 0.9236 0.6316 0.9236 0.9610
No log 3.7681 260 1.0254 0.6061 1.0254 1.0126
No log 3.7971 262 1.1195 0.5522 1.1195 1.0581
No log 3.8261 264 1.0250 0.5839 1.0250 1.0124
No log 3.8551 266 0.8219 0.6667 0.8219 0.9066
No log 3.8841 268 0.6819 0.6986 0.6819 0.8258
No log 3.9130 270 0.6740 0.7286 0.6740 0.8210
No log 3.9420 272 0.7857 0.6765 0.7857 0.8864
No log 3.9710 274 0.8820 0.6212 0.8820 0.9392
No log 4.0 276 0.9442 0.5954 0.9442 0.9717
No log 4.0290 278 1.0361 0.5985 1.0361 1.0179
No log 4.0580 280 0.9847 0.6187 0.9847 0.9923
No log 4.0870 282 0.8583 0.6423 0.8583 0.9265
No log 4.1159 284 0.8967 0.6269 0.8967 0.9469
No log 4.1449 286 0.9456 0.6176 0.9456 0.9724
No log 4.1739 288 0.8824 0.6269 0.8824 0.9394
No log 4.2029 290 0.9312 0.6212 0.9312 0.9650
No log 4.2319 292 0.9087 0.6269 0.9087 0.9533
No log 4.2609 294 0.8767 0.6619 0.8767 0.9363
No log 4.2899 296 0.8025 0.7067 0.8025 0.8958
No log 4.3188 298 0.7479 0.7333 0.7479 0.8648
No log 4.3478 300 0.6864 0.75 0.6864 0.8285
No log 4.3768 302 0.6756 0.7582 0.6756 0.8219
No log 4.4058 304 0.6252 0.7712 0.6252 0.7907
No log 4.4348 306 0.6497 0.7792 0.6497 0.8060
No log 4.4638 308 0.6844 0.7361 0.6844 0.8273
No log 4.4928 310 0.6261 0.76 0.6261 0.7913
No log 4.5217 312 0.5781 0.7692 0.5781 0.7603
No log 4.5507 314 0.5828 0.7778 0.5828 0.7634
No log 4.5797 316 0.5850 0.7702 0.5850 0.7649
No log 4.6087 318 0.5903 0.75 0.5903 0.7683
No log 4.6377 320 0.6317 0.7582 0.6317 0.7948
No log 4.6667 322 0.6503 0.7260 0.6503 0.8064
No log 4.6957 324 0.6322 0.7692 0.6322 0.7951
No log 4.7246 326 0.6256 0.7821 0.6256 0.7909
No log 4.7536 328 0.6339 0.7821 0.6339 0.7962
No log 4.7826 330 0.6503 0.7843 0.6503 0.8064
No log 4.8116 332 0.6599 0.7662 0.6599 0.8124
No log 4.8406 334 0.7234 0.7639 0.7234 0.8505
No log 4.8696 336 0.7431 0.7413 0.7431 0.8620
No log 4.8986 338 0.7390 0.7785 0.7390 0.8596
No log 4.9275 340 0.7257 0.7843 0.7257 0.8519
No log 4.9565 342 0.7418 0.7564 0.7418 0.8613
No log 4.9855 344 0.7347 0.75 0.7347 0.8572
No log 5.0145 346 0.7526 0.7097 0.7526 0.8675
No log 5.0435 348 0.7973 0.6797 0.7973 0.8929
No log 5.0725 350 0.6657 0.7949 0.6657 0.8159
No log 5.1014 352 0.6659 0.7821 0.6659 0.8160
No log 5.1304 354 0.7071 0.7632 0.7071 0.8409
No log 5.1594 356 0.7335 0.7632 0.7335 0.8565
No log 5.1884 358 0.6574 0.7848 0.6574 0.8108
No log 5.2174 360 0.6465 0.7848 0.6465 0.8040
No log 5.2464 362 0.6699 0.7848 0.6699 0.8185
No log 5.2754 364 0.6297 0.775 0.6297 0.7935
No log 5.3043 366 0.6286 0.7730 0.6286 0.7929
No log 5.3333 368 0.6278 0.7778 0.6278 0.7923
No log 5.3623 370 0.6803 0.7799 0.6803 0.8248
No log 5.3913 372 0.9109 0.6486 0.9109 0.9544
No log 5.4203 374 1.0900 0.5946 1.0900 1.0440
No log 5.4493 376 1.0144 0.6405 1.0144 1.0072
No log 5.4783 378 0.7571 0.7407 0.7571 0.8701
No log 5.5072 380 0.5964 0.7831 0.5964 0.7722
No log 5.5362 382 0.5811 0.8156 0.5811 0.7623
No log 5.5652 384 0.5916 0.8156 0.5916 0.7692
No log 5.5942 386 0.6164 0.7978 0.6164 0.7851
No log 5.6232 388 0.6514 0.7978 0.6514 0.8071
No log 5.6522 390 0.7799 0.7886 0.7799 0.8831
No log 5.6812 392 0.7221 0.7952 0.7221 0.8498
No log 5.7101 394 0.6605 0.7673 0.6605 0.8127
No log 5.7391 396 0.5806 0.7643 0.5806 0.7620
No log 5.7681 398 0.5694 0.75 0.5694 0.7546
No log 5.7971 400 0.5750 0.7632 0.5750 0.7583
No log 5.8261 402 0.6290 0.7682 0.6290 0.7931
No log 5.8551 404 0.7358 0.7347 0.7358 0.8578
No log 5.8841 406 0.7141 0.7123 0.7141 0.8451
No log 5.9130 408 0.6543 0.7682 0.6543 0.8089
No log 5.9420 410 0.7010 0.6957 0.7010 0.8372
No log 5.9710 412 0.7229 0.6957 0.7229 0.8502
No log 6.0 414 0.7736 0.6901 0.7736 0.8796
No log 6.0290 416 0.7652 0.7034 0.7652 0.8748
No log 6.0580 418 0.6629 0.7643 0.6629 0.8142
No log 6.0870 420 0.6001 0.7778 0.6001 0.7747
No log 6.1159 422 0.6022 0.7730 0.6022 0.7760
No log 6.1449 424 0.6352 0.7564 0.6352 0.7970
No log 6.1739 426 0.6877 0.7550 0.6877 0.8293
No log 6.2029 428 0.6707 0.7632 0.6707 0.8190
No log 6.2319 430 0.6607 0.7550 0.6607 0.8128
No log 6.2609 432 0.6695 0.7821 0.6695 0.8182
No log 6.2899 434 0.6486 0.7821 0.6486 0.8054
No log 6.3188 436 0.6108 0.7778 0.6108 0.7815
No log 6.3478 438 0.5742 0.7702 0.5742 0.7578
No log 6.3768 440 0.5636 0.7632 0.5636 0.7508
No log 6.4058 442 0.5496 0.7568 0.5496 0.7414
No log 6.4348 444 0.5601 0.7619 0.5601 0.7484
No log 6.4638 446 0.6071 0.7651 0.6071 0.7792
No log 6.4928 448 0.5988 0.7843 0.5988 0.7739
No log 6.5217 450 0.5741 0.7733 0.5741 0.7577
No log 6.5507 452 0.5714 0.7733 0.5714 0.7559
No log 6.5797 454 0.5744 0.7662 0.5744 0.7579
No log 6.6087 456 0.5828 0.7826 0.5828 0.7634
No log 6.6377 458 0.6131 0.7595 0.6131 0.7830
No log 6.6667 460 0.5993 0.7643 0.5993 0.7742
No log 6.6957 462 0.5620 0.7532 0.5620 0.7497
No log 6.7246 464 0.5568 0.7643 0.5568 0.7462
No log 6.7536 466 0.5846 0.7532 0.5846 0.7646
No log 6.7826 468 0.6268 0.7383 0.6268 0.7917
No log 6.8116 470 0.6457 0.75 0.6457 0.8036
No log 6.8406 472 0.6192 0.7397 0.6192 0.7869
No log 6.8696 474 0.5804 0.7432 0.5804 0.7618
No log 6.8986 476 0.5641 0.7564 0.5641 0.7511
No log 6.9275 478 0.5622 0.8 0.5622 0.7498
No log 6.9565 480 0.5588 0.7517 0.5588 0.7476
No log 6.9855 482 0.6071 0.7568 0.6071 0.7792
No log 7.0145 484 0.6409 0.7448 0.6409 0.8006
No log 7.0435 486 0.6329 0.7448 0.6329 0.7955
No log 7.0725 488 0.6441 0.7568 0.6441 0.8026
No log 7.1014 490 0.6085 0.7534 0.6085 0.7801
No log 7.1304 492 0.5809 0.7383 0.5809 0.7622
No log 7.1594 494 0.5913 0.7448 0.5913 0.7690
No log 7.1884 496 0.6668 0.7286 0.6668 0.8166
No log 7.2174 498 0.8902 0.6712 0.8902 0.9435
0.455 7.2464 500 1.0045 0.6581 1.0045 1.0022
0.455 7.2754 502 0.9554 0.6154 0.9554 0.9775
0.455 7.3043 504 0.8082 0.6667 0.8082 0.8990
0.455 7.3333 506 0.6877 0.7194 0.6877 0.8293
0.455 7.3623 508 0.6623 0.7692 0.6623 0.8138
0.455 7.3913 510 0.6498 0.7799 0.6498 0.8061
0.455 7.4203 512 0.6396 0.7722 0.6396 0.7998
0.455 7.4493 514 0.6829 0.7871 0.6829 0.8264
0.455 7.4783 516 0.7201 0.7451 0.7201 0.8486
0.455 7.5072 518 0.7117 0.7564 0.7117 0.8436
0.455 7.5362 520 0.6776 0.7821 0.6776 0.8232
0.455 7.5652 522 0.6596 0.7763 0.6596 0.8121
0.455 7.5942 524 0.6454 0.7651 0.6454 0.8034
0.455 7.6232 526 0.6494 0.7619 0.6494 0.8058
0.455 7.6522 528 0.6669 0.7586 0.6669 0.8166
0.455 7.6812 530 0.6913 0.7246 0.6913 0.8314
0.455 7.7101 532 0.7331 0.7246 0.7331 0.8562
0.455 7.7391 534 0.7201 0.7194 0.7201 0.8486
0.455 7.7681 536 0.6498 0.7376 0.6498 0.8061
0.455 7.7971 538 0.6027 0.7564 0.6027 0.7764
0.455 7.8261 540 0.6419 0.7403 0.6419 0.8012
0.455 7.8551 542 0.6519 0.7403 0.6519 0.8074
0.455 7.8841 544 0.6047 0.7662 0.6047 0.7776
0.455 7.9130 546 0.5850 0.7643 0.5850 0.7648
0.455 7.9420 548 0.6791 0.7183 0.6791 0.8241
0.455 7.9710 550 0.7810 0.7347 0.7810 0.8837
0.455 8.0 552 0.8799 0.6486 0.8799 0.9380
0.455 8.0290 554 0.9469 0.625 0.9469 0.9731
0.455 8.0580 556 0.9779 0.6 0.9779 0.9889
0.455 8.0870 558 1.0050 0.6232 1.0050 1.0025
0.455 8.1159 560 0.9244 0.6569 0.9244 0.9615

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Safetensors

  • Model size: 0.1B params
  • Tensor type: F32

Model tree for MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run1_AugV5_k15_task1_organization

  • Fine-tuned from aubmindlab/bert-base-arabertv02 (one of 4023 fine-tunes of the base model)