ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k9_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the dataset name was not recorded in the training metadata). It achieves the following results on the evaluation set:

  • Loss: 0.8390
  • Qwk: 0.4560
  • Mse: 0.8390
  • Rmse: 0.9160
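
Note that Loss equals Mse in the results above, which suggests the model was trained with a mean-squared-error objective; "Qwk" is most likely quadratic weighted kappa. A minimal sketch of how these three metrics could be computed with scikit-learn (the function name and the rounding of continuous predictions to classes are assumptions, not taken from the actual training script):

```python
# Hedged sketch: computing Qwk, Mse, and Rmse as reported in this card.
# The rounding step before kappa is an assumption about the evaluation code.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def evaluate(preds, labels):
    """Return quadratic weighted kappa, MSE, and RMSE for model outputs."""
    preds = np.asarray(preds, dtype=float)
    labels = np.asarray(labels, dtype=float)
    mse = float(mean_squared_error(labels, preds))
    rmse = float(np.sqrt(mse))
    # Kappa is defined over discrete classes, so round continuous outputs first.
    qwk = cohen_kappa_score(labels.astype(int),
                            np.rint(preds).astype(int),
                            weights="quadratic")
    return {"qwk": qwk, "mse": mse, "rmse": rmse}
```

With this definition, RMSE is always the square root of MSE, matching the paired columns in the training log below.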

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
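
The hyperparameters above map directly onto the Hugging Face Trainer API. A sketch of the corresponding keyword arguments (argument names follow `transformers.TrainingArguments`; the output directory and dataset variables are placeholders, not values from the original run):

```python
# Hedged sketch: the card's hyperparameters expressed as TrainingArguments
# keyword arguments. "out", model, train_ds, and eval_ds are placeholders.
training_kwargs = dict(
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,            # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)

# Usage, assuming a model and tokenized datasets are already prepared:
# from transformers import TrainingArguments, Trainer
# args = TrainingArguments(output_dir="out", **training_kwargs)
# Trainer(model=model, args=args,
#         train_dataset=train_ds, eval_dataset=eval_ds).train()
```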

Training results

The training loss is only logged periodically (it first appears at step 500), so "No log" in the first column marks evaluation rows recorded before the first logging step.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0833 2 4.1156 0.0024 4.1156 2.0287
No log 0.1667 4 1.9847 0.0633 1.9847 1.4088
No log 0.25 6 1.2650 0.0232 1.2650 1.1247
No log 0.3333 8 1.1427 0.1296 1.1427 1.0690
No log 0.4167 10 1.4212 0.0273 1.4212 1.1921
No log 0.5 12 1.4854 0.1438 1.4854 1.2188
No log 0.5833 14 1.3519 0.0170 1.3519 1.1627
No log 0.6667 16 1.3687 0.0712 1.3687 1.1699
No log 0.75 18 1.0846 0.2539 1.0846 1.0414
No log 0.8333 20 1.0034 0.2035 1.0034 1.0017
No log 0.9167 22 1.1764 0.0427 1.1764 1.0846
No log 1.0 24 1.6202 0.0399 1.6202 1.2729
No log 1.0833 26 1.7089 0.0651 1.7089 1.3073
No log 1.1667 28 1.2862 -0.0296 1.2862 1.1341
No log 1.25 30 1.0896 0.2734 1.0896 1.0438
No log 1.3333 32 1.1734 0.2150 1.1734 1.0833
No log 1.4167 34 1.1268 0.1910 1.1268 1.0615
No log 1.5 36 1.1471 0.1910 1.1471 1.0710
No log 1.5833 38 1.2530 0.0380 1.2530 1.1194
No log 1.6667 40 1.1814 0.1910 1.1814 1.0869
No log 1.75 42 1.1412 0.2150 1.1412 1.0683
No log 1.8333 44 1.1151 0.2150 1.1151 1.0560
No log 1.9167 46 1.1561 0.2295 1.1561 1.0752
No log 2.0 48 1.1455 0.2150 1.1455 1.0703
No log 2.0833 50 1.1505 0.2150 1.1505 1.0726
No log 2.1667 52 1.0827 0.1979 1.0827 1.0405
No log 2.25 54 1.0039 0.2416 1.0039 1.0019
No log 2.3333 56 0.9863 0.2068 0.9863 0.9931
No log 2.4167 58 1.0020 0.2441 1.0020 1.0010
No log 2.5 60 1.1079 0.2175 1.1079 1.0526
No log 2.5833 62 1.1474 0.2143 1.1474 1.0712
No log 2.6667 64 0.9963 0.2781 0.9963 0.9981
No log 2.75 66 0.9530 0.2390 0.9530 0.9762
No log 2.8333 68 1.0258 0.0445 1.0258 1.0128
No log 2.9167 70 0.9939 0.1076 0.9939 0.9970
No log 3.0 72 0.9553 0.2912 0.9553 0.9774
No log 3.0833 74 1.0256 0.2731 1.0256 1.0127
No log 3.1667 76 1.1163 0.2260 1.1163 1.0566
No log 3.25 78 1.0419 0.3131 1.0419 1.0207
No log 3.3333 80 0.9537 0.3370 0.9537 0.9766
No log 3.4167 82 0.9233 0.4438 0.9233 0.9609
No log 3.5 84 0.9231 0.4275 0.9231 0.9608
No log 3.5833 86 0.9396 0.4365 0.9396 0.9693
No log 3.6667 88 0.9266 0.4915 0.9266 0.9626
No log 3.75 90 0.8538 0.4769 0.8538 0.9240
No log 3.8333 92 0.7824 0.6133 0.7824 0.8845
No log 3.9167 94 0.7449 0.5035 0.7449 0.8631
No log 4.0 96 0.7973 0.4421 0.7973 0.8929
No log 4.0833 98 1.0362 0.3283 1.0362 1.0179
No log 4.1667 100 1.1811 0.3001 1.1811 1.0868
No log 4.25 102 1.0545 0.3218 1.0545 1.0269
No log 4.3333 104 0.7491 0.4949 0.7491 0.8655
No log 4.4167 106 0.6625 0.5446 0.6625 0.8139
No log 4.5 108 0.6912 0.5329 0.6912 0.8314
No log 4.5833 110 0.7396 0.4444 0.7396 0.8600
No log 4.6667 112 0.7370 0.5057 0.7370 0.8585
No log 4.75 114 0.7602 0.5127 0.7602 0.8719
No log 4.8333 116 0.7781 0.4615 0.7781 0.8821
No log 4.9167 118 0.8226 0.5065 0.8226 0.9070
No log 5.0 120 0.9131 0.4051 0.9131 0.9556
No log 5.0833 122 0.8026 0.5079 0.8026 0.8959
No log 5.1667 124 0.7402 0.4962 0.7402 0.8603
No log 5.25 126 0.7355 0.5512 0.7355 0.8576
No log 5.3333 128 0.8009 0.5181 0.8009 0.8950
No log 5.4167 130 0.9723 0.4252 0.9723 0.9860
No log 5.5 132 0.8379 0.5538 0.8379 0.9154
No log 5.5833 134 0.7056 0.5692 0.7056 0.8400
No log 5.6667 136 0.8537 0.5019 0.8537 0.9240
No log 5.75 138 0.7698 0.4893 0.7698 0.8774
No log 5.8333 140 0.6772 0.5949 0.6772 0.8229
No log 5.9167 142 0.7273 0.5540 0.7273 0.8528
No log 6.0 144 0.6866 0.6043 0.6866 0.8286
No log 6.0833 146 0.6664 0.5485 0.6664 0.8163
No log 6.1667 148 0.6526 0.5262 0.6526 0.8079
No log 6.25 150 0.6654 0.6325 0.6654 0.8157
No log 6.3333 152 0.6916 0.6315 0.6916 0.8316
No log 6.4167 154 0.6888 0.5980 0.6888 0.8300
No log 6.5 156 0.7031 0.5980 0.7031 0.8385
No log 6.5833 158 0.7387 0.5869 0.7387 0.8595
No log 6.6667 160 0.7054 0.5680 0.7054 0.8399
No log 6.75 162 0.7433 0.5759 0.7433 0.8621
No log 6.8333 164 0.7516 0.5890 0.7516 0.8669
No log 6.9167 166 0.7267 0.5659 0.7267 0.8525
No log 7.0 168 0.7371 0.5204 0.7371 0.8586
No log 7.0833 170 0.6637 0.6307 0.6637 0.8147
No log 7.1667 172 0.6463 0.6762 0.6463 0.8039
No log 7.25 174 0.6659 0.5955 0.6659 0.8160
No log 7.3333 176 0.6305 0.6610 0.6305 0.7941
No log 7.4167 178 0.7525 0.5735 0.7525 0.8675
No log 7.5 180 0.7804 0.5443 0.7804 0.8834
No log 7.5833 182 0.6913 0.5546 0.6913 0.8315
No log 7.6667 184 0.6455 0.6456 0.6455 0.8034
No log 7.75 186 0.6756 0.6165 0.6756 0.8219
No log 7.8333 188 0.7471 0.5397 0.7471 0.8643
No log 7.9167 190 0.7352 0.5751 0.7352 0.8575
No log 8.0 192 0.7066 0.6724 0.7066 0.8406
No log 8.0833 194 0.7465 0.5774 0.7465 0.8640
No log 8.1667 196 0.8732 0.4470 0.8732 0.9345
No log 8.25 198 0.8659 0.4588 0.8659 0.9305
No log 8.3333 200 0.8049 0.5195 0.8049 0.8972
No log 8.4167 202 0.7887 0.5160 0.7887 0.8881
No log 8.5 204 0.8055 0.5301 0.8055 0.8975
No log 8.5833 206 0.7984 0.5017 0.7984 0.8935
No log 8.6667 208 0.8057 0.4375 0.8057 0.8976
No log 8.75 210 0.7880 0.4757 0.7880 0.8877
No log 8.8333 212 0.7852 0.4757 0.7852 0.8861
No log 8.9167 214 0.7984 0.4974 0.7984 0.8935
No log 9.0 216 0.7877 0.5261 0.7877 0.8875
No log 9.0833 218 0.7915 0.5248 0.7915 0.8897
No log 9.1667 220 0.7936 0.5365 0.7936 0.8909
No log 9.25 222 0.7869 0.5798 0.7869 0.8871
No log 9.3333 224 0.7804 0.5607 0.7804 0.8834
No log 9.4167 226 0.7605 0.5540 0.7605 0.8721
No log 9.5 228 0.7408 0.5614 0.7408 0.8607
No log 9.5833 230 0.7789 0.5425 0.7789 0.8825
No log 9.6667 232 0.7729 0.5635 0.7729 0.8791
No log 9.75 234 0.8060 0.5480 0.8060 0.8978
No log 9.8333 236 0.8552 0.4834 0.8552 0.9248
No log 9.9167 238 0.8617 0.4450 0.8617 0.9283
No log 10.0 240 0.8639 0.4537 0.8639 0.9294
No log 10.0833 242 0.8877 0.4455 0.8877 0.9422
No log 10.1667 244 0.9938 0.4021 0.9938 0.9969
No log 10.25 246 0.9196 0.4749 0.9196 0.9589
No log 10.3333 248 0.8804 0.4524 0.8804 0.9383
No log 10.4167 250 0.9162 0.4517 0.9162 0.9572
No log 10.5 252 0.8237 0.4871 0.8237 0.9076
No log 10.5833 254 0.8311 0.4849 0.8311 0.9116
No log 10.6667 256 0.8239 0.5287 0.8239 0.9077
No log 10.75 258 0.8134 0.5518 0.8134 0.9019
No log 10.8333 260 0.8769 0.4639 0.8769 0.9364
No log 10.9167 262 0.8556 0.4954 0.8556 0.9250
No log 11.0 264 0.8359 0.5379 0.8359 0.9143
No log 11.0833 266 0.8676 0.4963 0.8676 0.9315
No log 11.1667 268 0.8329 0.5692 0.8329 0.9126
No log 11.25 270 0.8786 0.4440 0.8786 0.9374
No log 11.3333 272 0.9083 0.4601 0.9083 0.9530
No log 11.4167 274 0.8444 0.5006 0.8444 0.9189
No log 11.5 276 0.7798 0.5657 0.7798 0.8830
No log 11.5833 278 0.8064 0.5266 0.8064 0.8980
No log 11.6667 280 0.8380 0.5358 0.8380 0.9154
No log 11.75 282 0.8222 0.4917 0.8222 0.9068
No log 11.8333 284 0.7998 0.5167 0.7998 0.8943
No log 11.9167 286 0.7570 0.5774 0.7570 0.8700
No log 12.0 288 0.7395 0.6122 0.7395 0.8599
No log 12.0833 290 0.7282 0.5986 0.7282 0.8534
No log 12.1667 292 0.8070 0.5668 0.8070 0.8983
No log 12.25 294 0.7747 0.5587 0.7747 0.8802
No log 12.3333 296 0.6899 0.5594 0.6899 0.8306
No log 12.4167 298 0.6949 0.5647 0.6949 0.8336
No log 12.5 300 0.7088 0.5669 0.7088 0.8419
No log 12.5833 302 0.8788 0.4970 0.8788 0.9374
No log 12.6667 304 1.0405 0.4458 1.0405 1.0200
No log 12.75 306 1.0228 0.4458 1.0228 1.0113
No log 12.8333 308 0.8781 0.4810 0.8781 0.9370
No log 12.9167 310 0.7623 0.5875 0.7623 0.8731
No log 13.0 312 0.7672 0.5788 0.7672 0.8759
No log 13.0833 314 0.7530 0.5810 0.7530 0.8678
No log 13.1667 316 0.7106 0.5438 0.7106 0.8430
No log 13.25 318 0.7122 0.5455 0.7122 0.8439
No log 13.3333 320 0.7457 0.4940 0.7457 0.8636
No log 13.4167 322 0.7297 0.4966 0.7297 0.8542
No log 13.5 324 0.6871 0.5260 0.6871 0.8289
No log 13.5833 326 0.6819 0.5485 0.6819 0.8258
No log 13.6667 328 0.6870 0.5905 0.6870 0.8289
No log 13.75 330 0.7553 0.5532 0.7553 0.8691
No log 13.8333 332 0.7745 0.5439 0.7745 0.8800
No log 13.9167 334 0.7646 0.5647 0.7646 0.8744
No log 14.0 336 0.7124 0.5988 0.7124 0.8440
No log 14.0833 338 0.6667 0.6240 0.6667 0.8165
No log 14.1667 340 0.6996 0.6015 0.6996 0.8364
No log 14.25 342 0.7533 0.5591 0.7533 0.8679
No log 14.3333 344 0.7043 0.5663 0.7043 0.8392
No log 14.4167 346 0.6838 0.6305 0.6838 0.8269
No log 14.5 348 0.6791 0.6272 0.6791 0.8241
No log 14.5833 350 0.6509 0.6229 0.6509 0.8068
No log 14.6667 352 0.6990 0.5645 0.6990 0.8361
No log 14.75 354 0.7301 0.5324 0.7301 0.8545
No log 14.8333 356 0.6983 0.5300 0.6983 0.8357
No log 14.9167 358 0.7037 0.5607 0.7037 0.8389
No log 15.0 360 0.7313 0.6240 0.7313 0.8552
No log 15.0833 362 0.7074 0.4550 0.7074 0.8411
No log 15.1667 364 0.7101 0.4482 0.7101 0.8427
No log 15.25 366 0.6968 0.4849 0.6968 0.8348
No log 15.3333 368 0.6764 0.5248 0.6764 0.8225
No log 15.4167 370 0.7298 0.5630 0.7298 0.8543
No log 15.5 372 0.7277 0.5751 0.7277 0.8531
No log 15.5833 374 0.6721 0.6014 0.6721 0.8198
No log 15.6667 376 0.6558 0.6360 0.6558 0.8098
No log 15.75 378 0.6914 0.5983 0.6914 0.8315
No log 15.8333 380 0.7923 0.5283 0.7923 0.8901
No log 15.9167 382 0.7614 0.5510 0.7614 0.8726
No log 16.0 384 0.7027 0.6344 0.7027 0.8383
No log 16.0833 386 0.6733 0.5938 0.6733 0.8205
No log 16.1667 388 0.6516 0.6460 0.6516 0.8072
No log 16.25 390 0.6223 0.5886 0.6223 0.7889
No log 16.3333 392 0.6184 0.5910 0.6184 0.7864
No log 16.4167 394 0.6107 0.6733 0.6107 0.7815
No log 16.5 396 0.6145 0.6187 0.6145 0.7839
No log 16.5833 398 0.6133 0.6438 0.6133 0.7831
No log 16.6667 400 0.5911 0.6301 0.5911 0.7688
No log 16.75 402 0.5767 0.6733 0.5767 0.7594
No log 16.8333 404 0.5816 0.6537 0.5816 0.7626
No log 16.9167 406 0.5983 0.6133 0.5983 0.7735
No log 17.0 408 0.5998 0.6133 0.5998 0.7745
No log 17.0833 410 0.5985 0.6479 0.5985 0.7736
No log 17.1667 412 0.6036 0.6014 0.6036 0.7769
No log 17.25 414 0.6083 0.6296 0.6083 0.7799
No log 17.3333 416 0.6028 0.6606 0.6028 0.7764
No log 17.4167 418 0.6057 0.6479 0.6057 0.7783
No log 17.5 420 0.5962 0.6606 0.5962 0.7722
No log 17.5833 422 0.5960 0.6405 0.5960 0.7720
No log 17.6667 424 0.6073 0.6584 0.6073 0.7793
No log 17.75 426 0.6695 0.6099 0.6695 0.8182
No log 17.8333 428 0.6412 0.6259 0.6412 0.8007
No log 17.9167 430 0.6163 0.6320 0.6163 0.7850
No log 18.0 432 0.6288 0.6249 0.6288 0.7930
No log 18.0833 434 0.6663 0.5751 0.6663 0.8163
No log 18.1667 436 0.6986 0.5173 0.6986 0.8358
No log 18.25 438 0.6543 0.5917 0.6543 0.8089
No log 18.3333 440 0.6542 0.5261 0.6542 0.8088
No log 18.4167 442 0.6915 0.5986 0.6915 0.8316
No log 18.5 444 0.7041 0.5986 0.7041 0.8391
No log 18.5833 446 0.6947 0.5160 0.6947 0.8335
No log 18.6667 448 0.7040 0.4960 0.7040 0.8390
No log 18.75 450 0.6996 0.4960 0.6996 0.8364
No log 18.8333 452 0.6890 0.4691 0.6890 0.8301
No log 18.9167 454 0.6943 0.5121 0.6943 0.8333
No log 19.0 456 0.6916 0.5516 0.6916 0.8316
No log 19.0833 458 0.6865 0.4912 0.6865 0.8286
No log 19.1667 460 0.6761 0.5563 0.6761 0.8222
No log 19.25 462 0.6787 0.5763 0.6787 0.8238
No log 19.3333 464 0.6973 0.5275 0.6973 0.8350
No log 19.4167 466 0.7035 0.5287 0.7035 0.8387
No log 19.5 468 0.7055 0.5125 0.7055 0.8399
No log 19.5833 470 0.7592 0.5751 0.7592 0.8713
No log 19.6667 472 0.7768 0.4599 0.7768 0.8814
No log 19.75 474 0.7428 0.5117 0.7428 0.8619
No log 19.8333 476 0.7287 0.5485 0.7287 0.8537
No log 19.9167 478 0.7393 0.5345 0.7393 0.8598
No log 20.0 480 0.7781 0.5365 0.7781 0.8821
No log 20.0833 482 0.7917 0.5537 0.7917 0.8898
No log 20.1667 484 0.7648 0.5717 0.7648 0.8745
No log 20.25 486 0.7436 0.5736 0.7436 0.8623
No log 20.3333 488 0.6951 0.5933 0.6951 0.8337
No log 20.4167 490 0.6952 0.6219 0.6952 0.8338
No log 20.5 492 0.7343 0.4995 0.7343 0.8569
No log 20.5833 494 0.7461 0.4520 0.7461 0.8638
No log 20.6667 496 0.7360 0.4794 0.7360 0.8579
No log 20.75 498 0.7165 0.4794 0.7165 0.8465
0.3448 20.8333 500 0.7001 0.5135 0.7001 0.8367
0.3448 20.9167 502 0.6969 0.5357 0.6969 0.8348
0.3448 21.0 504 0.7040 0.4804 0.7040 0.8390
0.3448 21.0833 506 0.7705 0.4424 0.7705 0.8778
0.3448 21.1667 508 0.8004 0.4053 0.8004 0.8947
0.3448 21.25 510 0.7363 0.5396 0.7363 0.8581
0.3448 21.3333 512 0.7376 0.5786 0.7376 0.8589
0.3448 21.4167 514 0.7632 0.5546 0.7632 0.8736
0.3448 21.5 516 0.7349 0.5610 0.7349 0.8573
0.3448 21.5833 518 0.7363 0.5386 0.7363 0.8581
0.3448 21.6667 520 0.7478 0.4807 0.7478 0.8648
0.3448 21.75 522 0.7611 0.4898 0.7611 0.8724
0.3448 21.8333 524 0.7883 0.4507 0.7883 0.8879
0.3448 21.9167 526 0.7731 0.5175 0.7731 0.8793
0.3448 22.0 528 0.7323 0.5370 0.7323 0.8558
0.3448 22.0833 530 0.7427 0.5412 0.7427 0.8618
0.3448 22.1667 532 0.7285 0.5396 0.7285 0.8535
0.3448 22.25 534 0.7150 0.5980 0.7150 0.8456
0.3448 22.3333 536 0.8256 0.5428 0.8256 0.9086
0.3448 22.4167 538 0.9665 0.4444 0.9665 0.9831
0.3448 22.5 540 0.9586 0.4238 0.9586 0.9791
0.3448 22.5833 542 0.8461 0.5376 0.8461 0.9198
0.3448 22.6667 544 0.7478 0.5708 0.7478 0.8648
0.3448 22.75 546 0.7158 0.5103 0.7158 0.8460
0.3448 22.8333 548 0.7275 0.4755 0.7275 0.8529
0.3448 22.9167 550 0.7890 0.5579 0.7890 0.8882
0.3448 23.0 552 0.8276 0.5463 0.8276 0.9097
0.3448 23.0833 554 0.8308 0.5006 0.8308 0.9115
0.3448 23.1667 556 0.7577 0.5602 0.7577 0.8704
0.3448 23.25 558 0.7047 0.5590 0.7047 0.8395
0.3448 23.3333 560 0.7005 0.5248 0.7005 0.8370
0.3448 23.4167 562 0.7011 0.5590 0.7011 0.8373
0.3448 23.5 564 0.7259 0.5933 0.7259 0.8520
0.3448 23.5833 566 0.7231 0.5933 0.7231 0.8504
0.3448 23.6667 568 0.7014 0.5880 0.7014 0.8375
0.3448 23.75 570 0.7065 0.5880 0.7065 0.8405
0.3448 23.8333 572 0.7147 0.5763 0.7147 0.8454
0.3448 23.9167 574 0.7111 0.5763 0.7111 0.8433
0.3448 24.0 576 0.7188 0.5763 0.7188 0.8478
0.3448 24.0833 578 0.7073 0.5763 0.7073 0.8410
0.3448 24.1667 580 0.7170 0.5763 0.7170 0.8468
0.3448 24.25 582 0.7275 0.5121 0.7275 0.8529
0.3448 24.3333 584 0.7214 0.5503 0.7214 0.8493
0.3448 24.4167 586 0.7281 0.4878 0.7281 0.8533
0.3448 24.5 588 0.7206 0.5010 0.7206 0.8489
0.3448 24.5833 590 0.7330 0.4743 0.7330 0.8562
0.3448 24.6667 592 0.7728 0.4935 0.7728 0.8791
0.3448 24.75 594 0.7496 0.5173 0.7496 0.8658
0.3448 24.8333 596 0.7051 0.5375 0.7051 0.8397
0.3448 24.9167 598 0.7065 0.5402 0.7065 0.8405
0.3448 25.0 600 0.7067 0.5415 0.7067 0.8407
0.3448 25.0833 602 0.7042 0.5261 0.7042 0.8391
0.3448 25.1667 604 0.7211 0.5657 0.7211 0.8492
0.3448 25.25 606 0.7326 0.5450 0.7326 0.8559
0.3448 25.3333 608 0.7366 0.5002 0.7366 0.8583
0.3448 25.4167 610 0.7417 0.4893 0.7417 0.8612
0.3448 25.5 612 0.7484 0.4660 0.7484 0.8651
0.3448 25.5833 614 0.7790 0.4615 0.7790 0.8826
0.3448 25.6667 616 0.8390 0.4560 0.8390 0.9160
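
A back-of-envelope reading of the log above: evaluation runs every 2 optimizer steps, and step 24 corresponds to epoch 1.0, so one epoch spans 24 steps. Combined with the batch size of 8, this implies a rough training-set size (assuming no gradient accumulation and no dropped final batch, neither of which is stated in the card):

```python
# Hedged estimate of the training-split size from the log, not a recorded value.
steps_per_epoch = 24          # step 24 aligns with epoch 1.0 in the table
train_batch_size = 8          # from the hyperparameters section
approx_train_examples = steps_per_epoch * train_batch_size
print(approx_train_examples)  # 192
```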

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
  • Model size: 0.1B params (Safetensors, F32)

Model tree for MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k9_task5_organization

  • Finetuned from: aubmindlab/bert-base-arabertv02