ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run3_AugV5_k5_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6831
  • Qwk (quadratic weighted kappa): 0.5300
  • Mse: 0.6831
  • Rmse: 0.8265
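Since the reported MSE and RMSE come from the same evaluation run, RMSE should simply be the square root of MSE. A quick sanity check of the numbers above (pure Python, no extra dependencies):

```python
import math

# Evaluation metrics reported above
mse = 0.6831
rmse_reported = 0.8265

# RMSE is defined as the square root of MSE; the reported values
# agree up to rounding in the fourth decimal place.
computed_rmse = math.sqrt(mse)
print(round(computed_rmse, 4))  # 0.8265
```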

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
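With a linear scheduler and no reported warmup, the learning rate decays from 2e-05 toward 0 over the full run. The training log shows step 2 at epoch 0.1176, i.e. 17 optimizer steps per epoch, so 100 epochs give 1700 total steps. A minimal sketch of the resulting schedule (the step counts are inferred from the log, not stated explicitly in the card):

```python
# Linear LR decay, as produced by the Hugging Face "linear" scheduler
# (assuming zero warmup steps, which the card does not report).
base_lr = 2e-05
steps_per_epoch = round(2 / 0.1176)   # = 17, inferred from the training log
total_steps = steps_per_epoch * 100   # num_epochs = 100

def lr_at(step: int) -> float:
    """Learning rate after `step` optimizer updates."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

print(total_steps)         # 1700
print(lr_at(0))            # 2e-05
print(lr_at(total_steps))  # 0.0
```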

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.1176 2 3.9512 -0.0331 3.9512 1.9878
No log 0.2353 4 1.9517 0.0054 1.9517 1.3970
No log 0.3529 6 1.7389 0.0429 1.7389 1.3187
No log 0.4706 8 1.2289 0.0380 1.2289 1.1085
No log 0.5882 10 1.0338 0.2467 1.0338 1.0168
No log 0.7059 12 1.2293 0.0520 1.2293 1.1088
No log 0.8235 14 1.3886 -0.0777 1.3886 1.1784
No log 0.9412 16 1.2919 -0.0743 1.2919 1.1366
No log 1.0588 18 1.2011 0.0287 1.2011 1.0960
No log 1.1765 20 1.0930 0.2416 1.0930 1.0455
No log 1.2941 22 1.1705 0.1738 1.1705 1.0819
No log 1.4118 24 1.3922 0.0 1.3922 1.1799
No log 1.5294 26 1.3674 0.0 1.3674 1.1694
No log 1.6471 28 1.2346 0.0731 1.2346 1.1111
No log 1.7647 30 1.0999 0.2416 1.0999 1.0487
No log 1.8824 32 1.0559 0.2015 1.0559 1.0275
No log 2.0 34 1.0479 0.1698 1.0479 1.0237
No log 2.1176 36 1.1970 0.2156 1.1970 1.0941
No log 2.2353 38 1.5628 0.1303 1.5628 1.2501
No log 2.3529 40 1.5298 0.1208 1.5298 1.2369
No log 2.4706 42 1.1913 0.1967 1.1913 1.0915
No log 2.5882 44 1.0543 0.1341 1.0543 1.0268
No log 2.7059 46 1.0792 0.3596 1.0792 1.0388
No log 2.8235 48 1.0227 0.2541 1.0227 1.0113
No log 2.9412 50 1.1987 0.2665 1.1987 1.0949
No log 3.0588 52 1.5761 0.2033 1.5761 1.2554
No log 3.1765 54 1.5685 0.2248 1.5685 1.2524
No log 3.2941 56 1.3126 0.3059 1.3126 1.1457
No log 3.4118 58 0.9575 0.2822 0.9575 0.9785
No log 3.5294 60 0.9224 0.3840 0.9224 0.9604
No log 3.6471 62 1.0211 0.3124 1.0211 1.0105
No log 3.7647 64 0.9625 0.3841 0.9625 0.9811
No log 3.8824 66 0.9537 0.3896 0.9537 0.9766
No log 4.0 68 1.0236 0.2145 1.0236 1.0117
No log 4.1176 70 1.0754 0.2032 1.0754 1.0370
No log 4.2353 72 1.0041 0.2911 1.0041 1.0021
No log 4.3529 74 0.9998 0.3414 0.9998 0.9999
No log 4.4706 76 0.9896 0.4059 0.9896 0.9948
No log 4.5882 78 1.0043 0.3985 1.0043 1.0021
No log 4.7059 80 1.0035 0.3696 1.0035 1.0018
No log 4.8235 82 0.9681 0.5090 0.9681 0.9839
No log 4.9412 84 1.1340 0.3486 1.1340 1.0649
No log 5.0588 86 1.1375 0.3486 1.1375 1.0665
No log 5.1765 88 0.9202 0.5090 0.9202 0.9593
No log 5.2941 90 1.0273 0.3784 1.0273 1.0135
No log 5.4118 92 1.3062 0.3316 1.3062 1.1429
No log 5.5294 94 1.2205 0.3432 1.2205 1.1048
No log 5.6471 96 0.9394 0.3714 0.9394 0.9692
No log 5.7647 98 0.8442 0.4676 0.8442 0.9188
No log 5.8824 100 0.9365 0.3897 0.9365 0.9677
No log 6.0 102 0.9399 0.4379 0.9399 0.9695
No log 6.1176 104 0.8820 0.4305 0.8820 0.9391
No log 6.2353 106 0.8695 0.4345 0.8695 0.9325
No log 6.3529 108 0.8397 0.4802 0.8397 0.9163
No log 6.4706 110 0.8489 0.4588 0.8489 0.9214
No log 6.5882 112 0.8904 0.4450 0.8904 0.9436
No log 6.7059 114 0.8020 0.5330 0.8020 0.8955
No log 6.8235 116 0.7688 0.5060 0.7688 0.8768
No log 6.9412 118 0.8194 0.5080 0.8194 0.9052
No log 7.0588 120 0.8161 0.5190 0.8161 0.9034
No log 7.1765 122 0.7751 0.5215 0.7751 0.8804
No log 7.2941 124 0.8160 0.5380 0.8160 0.9033
No log 7.4118 126 0.8697 0.5317 0.8697 0.9326
No log 7.5294 128 0.8588 0.5549 0.8588 0.9267
No log 7.6471 130 0.7855 0.5973 0.7855 0.8863
No log 7.7647 132 0.7330 0.6259 0.7330 0.8562
No log 7.8824 134 0.7534 0.5756 0.7534 0.8680
No log 8.0 136 0.7645 0.5756 0.7645 0.8744
No log 8.1176 138 0.8398 0.5902 0.8398 0.9164
No log 8.2353 140 0.8035 0.5830 0.8034 0.8964
No log 8.3529 142 0.7418 0.5855 0.7418 0.8613
No log 8.4706 144 0.7327 0.5545 0.7327 0.8560
No log 8.5882 146 0.7551 0.5431 0.7551 0.8690
No log 8.7059 148 0.7765 0.5752 0.7765 0.8812
No log 8.8235 150 0.7896 0.5832 0.7896 0.8886
No log 8.9412 152 0.7534 0.5342 0.7534 0.8680
No log 9.0588 154 0.7476 0.5351 0.7476 0.8646
No log 9.1765 156 0.7393 0.5585 0.7393 0.8598
No log 9.2941 158 0.7332 0.5603 0.7332 0.8563
No log 9.4118 160 0.7159 0.6060 0.7159 0.8461
No log 9.5294 162 0.6981 0.5152 0.6981 0.8355
No log 9.6471 164 0.7277 0.5534 0.7277 0.8531
No log 9.7647 166 0.7079 0.5463 0.7079 0.8414
No log 9.8824 168 0.6938 0.5523 0.6938 0.8329
No log 10.0 170 0.6847 0.5415 0.6847 0.8275
No log 10.1176 172 0.6770 0.5542 0.6770 0.8228
No log 10.2353 174 0.6528 0.6035 0.6528 0.8080
No log 10.3529 176 0.6992 0.6063 0.6992 0.8362
No log 10.4706 178 0.8876 0.5387 0.8876 0.9421
No log 10.5882 180 0.9016 0.5374 0.9016 0.9496
No log 10.7059 182 0.9192 0.5272 0.9192 0.9587
No log 10.8235 184 0.7490 0.5798 0.7490 0.8655
No log 10.9412 186 0.6737 0.6035 0.6737 0.8208
No log 11.0588 188 0.6666 0.5843 0.6666 0.8165
No log 11.1765 190 0.6691 0.6114 0.6691 0.8180
No log 11.2941 192 0.6939 0.5853 0.6939 0.8330
No log 11.4118 194 0.8418 0.5706 0.8418 0.9175
No log 11.5294 196 1.0373 0.4978 1.0373 1.0185
No log 11.6471 198 0.9673 0.5190 0.9673 0.9835
No log 11.7647 200 0.7674 0.5489 0.7674 0.8760
No log 11.8824 202 0.7083 0.5580 0.7083 0.8416
No log 12.0 204 0.6871 0.6068 0.6871 0.8289
No log 12.1176 206 0.6611 0.6177 0.6611 0.8131
No log 12.2353 208 0.7271 0.6160 0.7271 0.8527
No log 12.3529 210 0.7552 0.6199 0.7552 0.8690
No log 12.4706 212 0.6898 0.6443 0.6898 0.8305
No log 12.5882 214 0.6750 0.6677 0.6750 0.8216
No log 12.7059 216 0.6799 0.6716 0.6799 0.8246
No log 12.8235 218 0.6570 0.5606 0.6570 0.8105
No log 12.9412 220 0.6581 0.5190 0.6581 0.8112
No log 13.0588 222 0.6307 0.6519 0.6307 0.7942
No log 13.1765 224 0.6691 0.6669 0.6691 0.8180
No log 13.2941 226 0.6421 0.6573 0.6421 0.8013
No log 13.4118 228 0.6021 0.6764 0.6021 0.7760
No log 13.5294 230 0.6193 0.6656 0.6193 0.7869
No log 13.6471 232 0.6687 0.6473 0.6687 0.8178
No log 13.7647 234 0.7573 0.5451 0.7573 0.8702
No log 13.8824 236 0.8045 0.5232 0.8045 0.8969
No log 14.0 238 0.7335 0.5810 0.7335 0.8564
No log 14.1176 240 0.6818 0.4807 0.6818 0.8257
No log 14.2353 242 0.6966 0.5074 0.6966 0.8346
No log 14.3529 244 0.6951 0.5694 0.6951 0.8337
No log 14.4706 246 0.6419 0.5939 0.6419 0.8012
No log 14.5882 248 0.6635 0.6647 0.6635 0.8145
No log 14.7059 250 0.6765 0.5998 0.6765 0.8225
No log 14.8235 252 0.6315 0.6575 0.6315 0.7946
No log 14.9412 254 0.5895 0.6548 0.5895 0.7678
No log 15.0588 256 0.5810 0.6632 0.5810 0.7622
No log 15.1765 258 0.5826 0.6441 0.5826 0.7633
No log 15.2941 260 0.6038 0.6687 0.6038 0.7770
No log 15.4118 262 0.6421 0.6538 0.6421 0.8013
No log 15.5294 264 0.6392 0.6502 0.6392 0.7995
No log 15.6471 266 0.6093 0.5902 0.6093 0.7806
No log 15.7647 268 0.6172 0.6518 0.6172 0.7856
No log 15.8824 270 0.6276 0.6301 0.6276 0.7922
No log 16.0 272 0.7305 0.5973 0.7305 0.8547
No log 16.1176 274 0.8170 0.5842 0.8170 0.9039
No log 16.2353 276 0.7717 0.5898 0.7717 0.8785
No log 16.3529 278 0.6994 0.5855 0.6994 0.8363
No log 16.4706 280 0.6671 0.6164 0.6671 0.8168
No log 16.5882 282 0.6772 0.5955 0.6772 0.8229
No log 16.7059 284 0.6492 0.6518 0.6492 0.8057
No log 16.8235 286 0.6412 0.6147 0.6412 0.8008
No log 16.9412 288 0.6622 0.6438 0.6622 0.8138
No log 17.0588 290 0.6574 0.6147 0.6574 0.8108
No log 17.1765 292 0.6336 0.6473 0.6336 0.7960
No log 17.2941 294 0.6195 0.6187 0.6195 0.7871
No log 17.4118 296 0.6201 0.6084 0.6201 0.7875
No log 17.5294 298 0.6288 0.6084 0.6288 0.7930
No log 17.6471 300 0.6577 0.6084 0.6577 0.8110
No log 17.7647 302 0.6574 0.6055 0.6574 0.8108
No log 17.8824 304 0.6624 0.6442 0.6624 0.8139
No log 18.0 306 0.6785 0.5759 0.6785 0.8237
No log 18.1176 308 0.6570 0.6028 0.6570 0.8106
No log 18.2353 310 0.6539 0.6007 0.6539 0.8086
No log 18.3529 312 0.6687 0.6510 0.6687 0.8178
No log 18.4706 314 0.6528 0.6219 0.6528 0.8079
No log 18.5882 316 0.6383 0.5955 0.6383 0.7989
No log 18.7059 318 0.6357 0.6154 0.6357 0.7973
No log 18.8235 320 0.6403 0.6187 0.6403 0.8002
No log 18.9412 322 0.6557 0.6438 0.6557 0.8098
No log 19.0588 324 0.6767 0.6328 0.6767 0.8226
No log 19.1765 326 0.6450 0.6647 0.6450 0.8031
No log 19.2941 328 0.6295 0.6175 0.6295 0.7934
No log 19.4118 330 0.6461 0.5759 0.6461 0.8038
No log 19.5294 332 0.6681 0.6397 0.6681 0.8174
No log 19.6471 334 0.6918 0.6396 0.6918 0.8317
No log 19.7647 336 0.6827 0.6396 0.6827 0.8263
No log 19.8824 338 0.6570 0.6335 0.6570 0.8106
No log 20.0 340 0.6546 0.4845 0.6546 0.8091
No log 20.1176 342 0.6662 0.5554 0.6662 0.8162
No log 20.2353 344 0.6445 0.5759 0.6445 0.8028
No log 20.3529 346 0.6037 0.6374 0.6037 0.7770
No log 20.4706 348 0.6378 0.6476 0.6378 0.7986
No log 20.5882 350 0.7009 0.6597 0.7009 0.8372
No log 20.7059 352 0.7143 0.6495 0.7143 0.8451
No log 20.8235 354 0.6482 0.6815 0.6482 0.8051
No log 20.9412 356 0.6106 0.6900 0.6106 0.7814
No log 21.0588 358 0.6100 0.6900 0.6100 0.7810
No log 21.1765 360 0.6063 0.6750 0.6063 0.7786
No log 21.2941 362 0.6078 0.6946 0.6078 0.7796
No log 21.4118 364 0.5988 0.6965 0.5988 0.7738
No log 21.5294 366 0.6033 0.6921 0.6033 0.7767
No log 21.6471 368 0.6321 0.6976 0.6321 0.7950
No log 21.7647 370 0.6682 0.6434 0.6682 0.8174
No log 21.8824 372 0.6545 0.6609 0.6545 0.8090
No log 22.0 374 0.6386 0.6798 0.6386 0.7992
No log 22.1176 376 0.6428 0.6798 0.6428 0.8018
No log 22.2353 378 0.6514 0.6965 0.6514 0.8071
No log 22.3529 380 0.6675 0.6008 0.6675 0.8170
No log 22.4706 382 0.6603 0.6008 0.6603 0.8126
No log 22.5882 384 0.6314 0.6482 0.6314 0.7946
No log 22.7059 386 0.6515 0.6006 0.6515 0.8071
No log 22.8235 388 0.6701 0.5459 0.6701 0.8186
No log 22.9412 390 0.6647 0.5610 0.6647 0.8153
No log 23.0588 392 0.7237 0.5595 0.7237 0.8507
No log 23.1765 394 0.7557 0.5777 0.7557 0.8693
No log 23.2941 396 0.7175 0.5898 0.7175 0.8470
No log 23.4118 398 0.6684 0.6218 0.6684 0.8176
No log 23.5294 400 0.6603 0.6055 0.6603 0.8126
No log 23.6471 402 0.6740 0.6547 0.6740 0.8210
No log 23.7647 404 0.7371 0.5875 0.7371 0.8585
No log 23.8824 406 0.8589 0.4994 0.8589 0.9268
No log 24.0 408 0.8545 0.4779 0.8545 0.9244
No log 24.1176 410 0.7780 0.5538 0.7780 0.8820
No log 24.2353 412 0.6874 0.5844 0.6874 0.8291
No log 24.3529 414 0.6517 0.6311 0.6517 0.8073
No log 24.4706 416 0.6469 0.6028 0.6469 0.8043
No log 24.5882 418 0.6366 0.6311 0.6366 0.7979
No log 24.7059 420 0.6568 0.6361 0.6568 0.8104
No log 24.8235 422 0.6860 0.6189 0.6860 0.8282
No log 24.9412 424 0.6556 0.6361 0.6556 0.8097
No log 25.0588 426 0.6403 0.6361 0.6403 0.8002
No log 25.1765 428 0.6292 0.6073 0.6292 0.7932
No log 25.2941 430 0.6236 0.6187 0.6236 0.7897
No log 25.4118 432 0.6322 0.6073 0.6322 0.7951
No log 25.5294 434 0.6404 0.6557 0.6404 0.8003
No log 25.6471 436 0.6609 0.6073 0.6609 0.8130
No log 25.7647 438 0.6635 0.6073 0.6635 0.8146
No log 25.8824 440 0.6696 0.6177 0.6696 0.8183
No log 26.0 442 0.6797 0.6147 0.6797 0.8244
No log 26.1176 444 0.6733 0.6209 0.6733 0.8205
No log 26.2353 446 0.6612 0.5038 0.6612 0.8131
No log 26.3529 448 0.6684 0.5441 0.6684 0.8175
No log 26.4706 450 0.6605 0.5977 0.6605 0.8127
No log 26.5882 452 0.6520 0.5809 0.6520 0.8075
No log 26.7059 454 0.6681 0.6249 0.6681 0.8173
No log 26.8235 456 0.6809 0.6189 0.6809 0.8252
No log 26.9412 458 0.6596 0.6249 0.6596 0.8122
No log 27.0588 460 0.6341 0.6291 0.6341 0.7963
No log 27.1765 462 0.6130 0.6805 0.6130 0.7829
No log 27.2941 464 0.6037 0.7124 0.6037 0.7770
No log 27.4118 466 0.6057 0.7345 0.6057 0.7783
No log 27.5294 468 0.6181 0.6441 0.6181 0.7862
No log 27.6471 470 0.6588 0.6177 0.6588 0.8117
No log 27.7647 472 0.6703 0.6063 0.6703 0.8187
No log 27.8824 474 0.6848 0.6008 0.6848 0.8275
No log 28.0 476 0.7003 0.6426 0.7003 0.8369
No log 28.1176 478 0.7037 0.6257 0.7037 0.8389
No log 28.2353 480 0.7122 0.5842 0.7122 0.8439
No log 28.3529 482 0.7193 0.5842 0.7193 0.8481
No log 28.4706 484 0.7089 0.5948 0.7089 0.8420
No log 28.5882 486 0.6640 0.6081 0.6640 0.8149
No log 28.7059 488 0.6168 0.6894 0.6168 0.7853
No log 28.8235 490 0.6204 0.6461 0.6204 0.7877
No log 28.9412 492 0.6229 0.6078 0.6229 0.7892
No log 29.0588 494 0.6184 0.6690 0.6184 0.7864
No log 29.1765 496 0.6223 0.6706 0.6223 0.7889
No log 29.2941 498 0.6658 0.5634 0.6658 0.8160
0.2408 29.4118 500 0.7325 0.5777 0.7325 0.8559
0.2408 29.5294 502 0.7320 0.5777 0.7320 0.8556
0.2408 29.6471 504 0.7057 0.5912 0.7057 0.8400
0.2408 29.7647 506 0.6768 0.5634 0.6768 0.8227
0.2408 29.8824 508 0.6327 0.6548 0.6327 0.7954
0.2408 30.0 510 0.6342 0.5530 0.6342 0.7963
0.2408 30.1176 512 0.6516 0.5862 0.6516 0.8072
0.2408 30.2353 514 0.6304 0.5633 0.6304 0.7940
0.2408 30.3529 516 0.6095 0.6433 0.6095 0.7807
0.2408 30.4706 518 0.6093 0.6482 0.6093 0.7806
0.2408 30.5882 520 0.6196 0.6538 0.6196 0.7872
0.2408 30.7059 522 0.6146 0.6511 0.6146 0.7840
0.2408 30.8235 524 0.6133 0.6822 0.6133 0.7831
0.2408 30.9412 526 0.6030 0.6790 0.6030 0.7765
0.2408 31.0588 528 0.5973 0.6430 0.5973 0.7728
0.2408 31.1765 530 0.6014 0.6306 0.6014 0.7755
0.2408 31.2941 532 0.6278 0.5982 0.6278 0.7924
0.2408 31.4118 534 0.6365 0.5446 0.6365 0.7978
0.2408 31.5294 536 0.6203 0.5630 0.6203 0.7876
0.2408 31.6471 538 0.6148 0.6144 0.6148 0.7841
0.2408 31.7647 540 0.6186 0.6124 0.6186 0.7865
0.2408 31.8824 542 0.6162 0.5995 0.6162 0.7850
0.2408 32.0 544 0.6103 0.6316 0.6103 0.7812
0.2408 32.1176 546 0.6028 0.6316 0.6028 0.7764
0.2408 32.2353 548 0.5960 0.6407 0.5960 0.7720
0.2408 32.3529 550 0.5939 0.6374 0.5939 0.7707
0.2408 32.4706 552 0.6047 0.6555 0.6047 0.7776
0.2408 32.5882 554 0.6245 0.6782 0.6245 0.7903
0.2408 32.7059 556 0.6219 0.6782 0.6219 0.7886
0.2408 32.8235 558 0.6045 0.6491 0.6045 0.7775
0.2408 32.9412 560 0.6006 0.6857 0.6006 0.7750
0.2408 33.0588 562 0.6114 0.6796 0.6114 0.7819
0.2408 33.1765 564 0.6249 0.6528 0.6249 0.7905
0.2408 33.2941 566 0.6437 0.5928 0.6437 0.8023
0.2408 33.4118 568 0.6725 0.5943 0.6725 0.8201
0.2408 33.5294 570 0.7066 0.5221 0.7066 0.8406
0.2408 33.6471 572 0.7249 0.5305 0.7249 0.8514
0.2408 33.7647 574 0.7140 0.5317 0.7140 0.8450
0.2408 33.8824 576 0.7229 0.5938 0.7229 0.8502
0.2408 34.0 578 0.7098 0.5740 0.7098 0.8425
0.2408 34.1176 580 0.6780 0.5345 0.6780 0.8234
0.2408 34.2353 582 0.6646 0.6055 0.6646 0.8152
0.2408 34.3529 584 0.6682 0.6055 0.6682 0.8174
0.2408 34.4706 586 0.6542 0.6055 0.6542 0.8089
0.2408 34.5882 588 0.6568 0.5590 0.6568 0.8104
0.2408 34.7059 590 0.6853 0.6063 0.6853 0.8278
0.2408 34.8235 592 0.7010 0.6132 0.7010 0.8373
0.2408 34.9412 594 0.6796 0.5798 0.6796 0.8244
0.2408 35.0588 596 0.6484 0.6084 0.6484 0.8053
0.2408 35.1765 598 0.6472 0.6643 0.6472 0.8045
0.2408 35.2941 600 0.7004 0.4912 0.7004 0.8369
0.2408 35.4118 602 0.7166 0.4912 0.7166 0.8465
0.2408 35.5294 604 0.6913 0.5351 0.6913 0.8315
0.2408 35.6471 606 0.6647 0.6196 0.6647 0.8153
0.2408 35.7647 608 0.6696 0.5996 0.6696 0.8183
0.2408 35.8824 610 0.6742 0.6102 0.6742 0.8211
0.2408 36.0 612 0.6697 0.5690 0.6697 0.8184
0.2408 36.1176 614 0.6895 0.5219 0.6895 0.8304
0.2408 36.2353 616 0.7171 0.5244 0.7171 0.8468
0.2408 36.3529 618 0.7088 0.5111 0.7088 0.8419
0.2408 36.4706 620 0.6831 0.5300 0.6831 0.8265

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model weights

  • Format: Safetensors
  • Size: 0.1B params
  • Tensor type: F32
