ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k11_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8255
  • Qwk (Quadratic Weighted Kappa): 0.4734
  • Mse (Mean Squared Error): 0.8255
  • Rmse (Root Mean Squared Error): 0.9086

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
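The hyperparameters above correspond roughly to the following `transformers.TrainingArguments` configuration. This is a sketch, not the author's actual training script: `output_dir` is a placeholder, and the Trainer's default Adam settings already match the listed betas and epsilon.

```python
# Sketch of a TrainingArguments configuration matching the listed
# hyperparameters. output_dir is a placeholder; datasets are unspecified.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./results",          # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    # Trainer's Adam defaults already use betas=(0.9, 0.999), eps=1e-8.
)
```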

Training results

The training loss is only logged every 500 steps, so rows before step 500 show "No log".

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0323 2 4.3571 0.0163 4.3571 2.0874
No log 0.0645 4 2.2404 0.0809 2.2404 1.4968
No log 0.0968 6 1.3929 0.0682 1.3929 1.1802
No log 0.1290 8 1.1460 0.2520 1.1460 1.0705
No log 0.1613 10 1.0986 0.2023 1.0986 1.0482
No log 0.1935 12 1.1377 0.1649 1.1377 1.0666
No log 0.2258 14 1.1493 0.2053 1.1493 1.0721
No log 0.2581 16 1.0938 0.2731 1.0938 1.0458
No log 0.2903 18 1.1430 0.2935 1.1430 1.0691
No log 0.3226 20 1.2841 0.1196 1.2841 1.1332
No log 0.3548 22 1.2804 0.1283 1.2804 1.1315
No log 0.3871 24 1.2531 0.2019 1.2531 1.1194
No log 0.4194 26 1.0611 0.2815 1.0611 1.0301
No log 0.4516 28 1.0141 0.1827 1.0141 1.0070
No log 0.4839 30 1.0060 0.1507 1.0060 1.0030
No log 0.5161 32 1.0585 0.2745 1.0585 1.0288
No log 0.5484 34 1.2873 0.2094 1.2873 1.1346
No log 0.5806 36 1.4793 0.1896 1.4793 1.2163
No log 0.6129 38 1.1869 0.2389 1.1869 1.0895
No log 0.6452 40 0.9884 0.3043 0.9884 0.9942
No log 0.6774 42 1.0304 0.3830 1.0304 1.0151
No log 0.7097 44 0.9969 0.4218 0.9969 0.9985
No log 0.7419 46 0.9509 0.5120 0.9509 0.9751
No log 0.7742 48 0.8815 0.5102 0.8815 0.9389
No log 0.8065 50 0.8430 0.5364 0.8430 0.9181
No log 0.8387 52 0.8384 0.4966 0.8384 0.9157
No log 0.8710 54 0.8342 0.5555 0.8342 0.9133
No log 0.9032 56 0.8533 0.5699 0.8533 0.9238
No log 0.9355 58 0.8212 0.6002 0.8212 0.9062
No log 0.9677 60 0.8186 0.6086 0.8186 0.9048
No log 1.0 62 0.8640 0.5702 0.8640 0.9295
No log 1.0323 64 0.9864 0.5171 0.9864 0.9932
No log 1.0645 66 1.1135 0.4612 1.1135 1.0552
No log 1.0968 68 1.0459 0.4918 1.0459 1.0227
No log 1.1290 70 0.9667 0.5614 0.9667 0.9832
No log 1.1613 72 0.8417 0.6269 0.8417 0.9175
No log 1.1935 74 0.7943 0.5673 0.7943 0.8912
No log 1.2258 76 0.8087 0.5632 0.8087 0.8993
No log 1.2581 78 0.7676 0.6328 0.7676 0.8761
No log 1.2903 80 0.7676 0.6633 0.7676 0.8761
No log 1.3226 82 0.7667 0.6887 0.7667 0.8756
No log 1.3548 84 0.7463 0.6601 0.7463 0.8639
No log 1.3871 86 0.7623 0.6800 0.7623 0.8731
No log 1.4194 88 0.7940 0.6271 0.7940 0.8910
No log 1.4516 90 0.8414 0.5518 0.8414 0.9173
No log 1.4839 92 0.8261 0.5699 0.8261 0.9089
No log 1.5161 94 0.8008 0.6178 0.8008 0.8949
No log 1.5484 96 0.8022 0.6441 0.8022 0.8956
No log 1.5806 98 0.9701 0.5722 0.9701 0.9850
No log 1.6129 100 1.0806 0.5092 1.0806 1.0395
No log 1.6452 102 0.9914 0.5774 0.9914 0.9957
No log 1.6774 104 0.7678 0.6453 0.7678 0.8763
No log 1.7097 106 0.7693 0.5559 0.7693 0.8771
No log 1.7419 108 0.8060 0.5657 0.8060 0.8978
No log 1.7742 110 0.9254 0.6121 0.9254 0.9620
No log 1.8065 112 1.1386 0.4861 1.1386 1.0670
No log 1.8387 114 1.1372 0.4600 1.1372 1.0664
No log 1.8710 116 0.9565 0.5101 0.9565 0.9780
No log 1.9032 118 0.8893 0.5591 0.8893 0.9430
No log 1.9355 120 0.8477 0.5241 0.8477 0.9207
No log 1.9677 122 0.8945 0.5408 0.8945 0.9458
No log 2.0 124 0.9341 0.5879 0.9341 0.9665
No log 2.0323 126 0.8582 0.5899 0.8582 0.9264
No log 2.0645 128 0.8936 0.5669 0.8936 0.9453
No log 2.0968 130 0.9070 0.5847 0.9070 0.9524
No log 2.1290 132 0.8294 0.6052 0.8294 0.9107
No log 2.1613 134 0.7921 0.6412 0.7921 0.8900
No log 2.1935 136 0.7852 0.6382 0.7852 0.8861
No log 2.2258 138 0.8001 0.6081 0.8001 0.8945
No log 2.2581 140 1.0300 0.4783 1.0300 1.0149
No log 2.2903 142 1.1333 0.4508 1.1333 1.0646
No log 2.3226 144 0.9596 0.4965 0.9596 0.9796
No log 2.3548 146 0.9021 0.5002 0.9021 0.9498
No log 2.3871 148 0.8696 0.5150 0.8696 0.9325
No log 2.4194 150 0.9030 0.5135 0.9030 0.9503
No log 2.4516 152 0.9337 0.5370 0.9337 0.9663
No log 2.4839 154 0.8739 0.5208 0.8739 0.9348
No log 2.5161 156 0.8371 0.5357 0.8371 0.9149
No log 2.5484 158 0.8486 0.5887 0.8486 0.9212
No log 2.5806 160 0.9887 0.5082 0.9887 0.9943
No log 2.6129 162 1.2136 0.4943 1.2136 1.1016
No log 2.6452 164 1.3205 0.3882 1.3205 1.1491
No log 2.6774 166 1.1561 0.5027 1.1561 1.0752
No log 2.7097 168 0.9403 0.5334 0.9403 0.9697
No log 2.7419 170 0.9161 0.5334 0.9161 0.9571
No log 2.7742 172 0.9272 0.5614 0.9272 0.9629
No log 2.8065 174 0.8834 0.5637 0.8834 0.9399
No log 2.8387 176 0.7799 0.5356 0.7799 0.8831
No log 2.8710 178 0.7703 0.5189 0.7703 0.8777
No log 2.9032 180 0.7962 0.4798 0.7962 0.8923
No log 2.9355 182 0.7814 0.5102 0.7814 0.8840
No log 2.9677 184 0.7711 0.5751 0.7711 0.8781
No log 3.0 186 1.0460 0.5222 1.0460 1.0227
No log 3.0323 188 1.1980 0.4381 1.1980 1.0945
No log 3.0645 190 1.0398 0.4697 1.0398 1.0197
No log 3.0968 192 0.9169 0.4957 0.9169 0.9576
No log 3.1290 194 0.9716 0.4603 0.9716 0.9857
No log 3.1613 196 1.0529 0.4477 1.0529 1.0261
No log 3.1935 198 1.1249 0.4715 1.1249 1.0606
No log 3.2258 200 1.0768 0.4666 1.0768 1.0377
No log 3.2581 202 0.8997 0.5648 0.8997 0.9485
No log 3.2903 204 0.7552 0.5836 0.7552 0.8690
No log 3.3226 206 0.7225 0.6196 0.7225 0.8500
No log 3.3548 208 0.7849 0.5913 0.7849 0.8859
No log 3.3871 210 1.0679 0.5027 1.0679 1.0334
No log 3.4194 212 1.3682 0.4118 1.3682 1.1697
No log 3.4516 214 1.3514 0.4307 1.3514 1.1625
No log 3.4839 216 1.0817 0.4954 1.0817 1.0401
No log 3.5161 218 0.8065 0.6132 0.8065 0.8981
No log 3.5484 220 0.7397 0.5439 0.7397 0.8601
No log 3.5806 222 0.7725 0.5482 0.7725 0.8789
No log 3.6129 224 0.8832 0.5985 0.8832 0.9398
No log 3.6452 226 1.1103 0.4749 1.1103 1.0537
No log 3.6774 228 1.0820 0.4598 1.0820 1.0402
No log 3.7097 230 0.9729 0.4539 0.9729 0.9863
No log 3.7419 232 0.8460 0.4934 0.8460 0.9198
No log 3.7742 234 0.8651 0.4902 0.8651 0.9301
No log 3.8065 236 0.9616 0.4357 0.9616 0.9806
No log 3.8387 238 1.0893 0.4681 1.0893 1.0437
No log 3.8710 240 1.1623 0.4580 1.1623 1.0781
No log 3.9032 242 1.0060 0.4708 1.0060 1.0030
No log 3.9355 244 1.0044 0.4700 1.0044 1.0022
No log 3.9677 246 0.9509 0.4810 0.9509 0.9751
No log 4.0 248 0.9265 0.5384 0.9265 0.9625
No log 4.0323 250 0.9254 0.5072 0.9254 0.9620
No log 4.0645 252 0.8750 0.5080 0.8750 0.9354
No log 4.0968 254 0.8545 0.4533 0.8545 0.9244
No log 4.1290 256 0.9434 0.4168 0.9434 0.9713
No log 4.1613 258 1.0713 0.4393 1.0713 1.0350
No log 4.1935 260 1.1116 0.4468 1.1116 1.0543
No log 4.2258 262 1.1946 0.4640 1.1946 1.0930
No log 4.2581 264 1.0995 0.4390 1.0995 1.0486
No log 4.2903 266 1.0164 0.4496 1.0164 1.0082
No log 4.3226 268 0.8223 0.5276 0.8223 0.9068
No log 4.3548 270 0.7749 0.5733 0.7749 0.8803
No log 4.3871 272 0.8114 0.5216 0.8114 0.9008
No log 4.4194 274 0.8578 0.5389 0.8578 0.9262
No log 4.4516 276 1.1396 0.4971 1.1396 1.0675
No log 4.4839 278 1.2676 0.5161 1.2676 1.1259
No log 4.5161 280 1.1333 0.5104 1.1333 1.0646
No log 4.5484 282 0.8715 0.4726 0.8715 0.9335
No log 4.5806 284 0.7015 0.6060 0.7015 0.8376
No log 4.6129 286 0.6993 0.5872 0.6993 0.8362
No log 4.6452 288 0.7360 0.6052 0.7360 0.8579
No log 4.6774 290 0.9636 0.4496 0.9636 0.9816
No log 4.7097 292 1.3877 0.3894 1.3877 1.1780
No log 4.7419 294 1.4656 0.3796 1.4656 1.2106
No log 4.7742 296 1.2290 0.4086 1.2290 1.1086
No log 4.8065 298 0.9733 0.4851 0.9733 0.9866
No log 4.8387 300 0.8240 0.4942 0.8240 0.9078
No log 4.8710 302 0.8464 0.4552 0.8464 0.9200
No log 4.9032 304 0.8544 0.4852 0.8544 0.9243
No log 4.9355 306 0.9074 0.4645 0.9074 0.9526
No log 4.9677 308 0.9444 0.4631 0.9444 0.9718
No log 5.0 310 0.9941 0.4523 0.9941 0.9970
No log 5.0323 312 0.9246 0.4638 0.9246 0.9616
No log 5.0645 314 0.9268 0.4820 0.9268 0.9627
No log 5.0968 316 0.8529 0.4712 0.8529 0.9235
No log 5.1290 318 0.8132 0.4555 0.8132 0.9018
No log 5.1613 320 0.8254 0.4418 0.8254 0.9085
No log 5.1935 322 0.8722 0.4712 0.8722 0.9339
No log 5.2258 324 1.0006 0.3884 1.0006 1.0003
No log 5.2581 326 1.1506 0.3723 1.1506 1.0726
No log 5.2903 328 1.0721 0.4291 1.0721 1.0354
No log 5.3226 330 0.9024 0.4289 0.9024 0.9500
No log 5.3548 332 0.8736 0.4425 0.8736 0.9347
No log 5.3871 334 0.8999 0.3827 0.8999 0.9486
No log 5.4194 336 0.9571 0.3256 0.9571 0.9783
No log 5.4516 338 1.1039 0.3259 1.1039 1.0507
No log 5.4839 340 1.1469 0.3745 1.1469 1.0710
No log 5.5161 342 1.0329 0.3798 1.0329 1.0163
No log 5.5484 344 0.8904 0.5029 0.8904 0.9436
No log 5.5806 346 0.8363 0.5374 0.8363 0.9145
No log 5.6129 348 0.7882 0.5380 0.7882 0.8878
No log 5.6452 350 0.8633 0.5280 0.8633 0.9291
No log 5.6774 352 1.0282 0.4571 1.0282 1.0140
No log 5.7097 354 1.0189 0.4640 1.0189 1.0094
No log 5.7419 356 0.8635 0.4921 0.8635 0.9293
No log 5.7742 358 0.7270 0.6219 0.7270 0.8527
No log 5.8065 360 0.7094 0.6507 0.7094 0.8422
No log 5.8387 362 0.7411 0.5872 0.7411 0.8609
No log 5.8710 364 0.8367 0.4694 0.8367 0.9147
No log 5.9032 366 0.9719 0.4618 0.9719 0.9859
No log 5.9355 368 1.0943 0.4111 1.0943 1.0461
No log 5.9677 370 1.0595 0.4035 1.0595 1.0293
No log 6.0 372 0.9305 0.5199 0.9305 0.9646
No log 6.0323 374 0.8862 0.5637 0.8862 0.9414
No log 6.0645 376 0.9495 0.4902 0.9495 0.9744
No log 6.0968 378 0.9995 0.4625 0.9995 0.9998
No log 6.1290 380 1.0124 0.4523 1.0124 1.0062
No log 6.1613 382 0.9616 0.4834 0.9616 0.9806
No log 6.1935 384 0.8877 0.5912 0.8877 0.9422
No log 6.2258 386 0.8483 0.5874 0.8483 0.9210
No log 6.2581 388 0.8930 0.5147 0.8930 0.9450
No log 6.2903 390 0.8965 0.5333 0.8965 0.9468
No log 6.3226 392 0.8836 0.5333 0.8836 0.9400
No log 6.3548 394 0.8723 0.5306 0.8723 0.9340
No log 6.3871 396 0.8843 0.5000 0.8843 0.9404
No log 6.4194 398 0.8710 0.5326 0.8710 0.9333
No log 6.4516 400 0.8782 0.5090 0.8782 0.9371
No log 6.4839 402 0.8980 0.5279 0.8980 0.9476
No log 6.5161 404 0.9678 0.5272 0.9678 0.9838
No log 6.5484 406 1.0444 0.4927 1.0444 1.0219
No log 6.5806 408 1.0287 0.5028 1.0287 1.0142
No log 6.6129 410 0.9280 0.5168 0.9280 0.9634
No log 6.6452 412 0.8306 0.6174 0.8306 0.9114
No log 6.6774 414 0.8215 0.6471 0.8215 0.9064
No log 6.7097 416 0.8062 0.6575 0.8062 0.8979
No log 6.7419 418 0.8554 0.6008 0.8554 0.9249
No log 6.7742 420 0.9358 0.4726 0.9358 0.9674
No log 6.8065 422 1.0247 0.4692 1.0247 1.0123
No log 6.8387 424 1.0503 0.4692 1.0503 1.0248
No log 6.8710 426 0.9510 0.5216 0.9510 0.9752
No log 6.9032 428 0.8762 0.5192 0.8762 0.9360
No log 6.9355 430 0.8304 0.5763 0.8304 0.9113
No log 6.9677 432 0.8448 0.5192 0.8448 0.9191
No log 7.0 434 0.8844 0.5083 0.8844 0.9404
No log 7.0323 436 0.9930 0.4803 0.9930 0.9965
No log 7.0645 438 1.0381 0.4611 1.0381 1.0189
No log 7.0968 440 1.0413 0.4322 1.0413 1.0204
No log 7.1290 442 1.0564 0.4322 1.0564 1.0278
No log 7.1613 444 0.9292 0.5124 0.9292 0.9640
No log 7.1935 446 0.8401 0.5324 0.8401 0.9166
No log 7.2258 448 0.8121 0.5270 0.8121 0.9012
No log 7.2581 450 0.7979 0.4995 0.7979 0.8932
No log 7.2903 452 0.8324 0.4881 0.8324 0.9124
No log 7.3226 454 1.0068 0.4990 1.0068 1.0034
No log 7.3548 456 1.1265 0.4504 1.1265 1.0614
No log 7.3871 458 1.0720 0.4783 1.0720 1.0354
No log 7.4194 460 0.9288 0.5077 0.9288 0.9638
No log 7.4516 462 0.8100 0.5042 0.8100 0.9000
No log 7.4839 464 0.7871 0.5543 0.7871 0.8872
No log 7.5161 466 0.8152 0.5451 0.8152 0.9029
No log 7.5484 468 0.9146 0.4436 0.9146 0.9563
No log 7.5806 470 1.0934 0.4565 1.0934 1.0457
No log 7.6129 472 1.1395 0.4474 1.1395 1.0675
No log 7.6452 474 1.0250 0.4803 1.0250 1.0124
No log 7.6774 476 0.9595 0.4661 0.9595 0.9796
No log 7.7097 478 0.9480 0.4788 0.9480 0.9737
No log 7.7419 480 1.0009 0.5039 1.0009 1.0004
No log 7.7742 482 1.0128 0.5039 1.0128 1.0064
No log 7.8065 484 0.9823 0.4877 0.9823 0.9911
No log 7.8387 486 0.9364 0.5635 0.9364 0.9677
No log 7.8710 488 0.9728 0.5625 0.9728 0.9863
No log 7.9032 490 1.0731 0.4936 1.0731 1.0359
No log 7.9355 492 1.1356 0.5340 1.1356 1.0656
No log 7.9677 494 1.0717 0.5519 1.0717 1.0353
No log 8.0 496 0.9662 0.5490 0.9662 0.9829
No log 8.0323 498 0.8925 0.4958 0.8925 0.9447
0.3073 8.0645 500 0.8457 0.4916 0.8457 0.9196
0.3073 8.0968 502 0.8023 0.5361 0.8023 0.8957
0.3073 8.1290 504 0.7944 0.5110 0.7944 0.8913
0.3073 8.1613 506 0.8690 0.6142 0.8690 0.9322
0.3073 8.1935 508 0.9038 0.5532 0.9038 0.9507
0.3073 8.2258 510 0.8726 0.5561 0.8726 0.9341
0.3073 8.2581 512 0.8351 0.5451 0.8351 0.9138
0.3073 8.2903 514 0.8362 0.5338 0.8362 0.9145
0.3073 8.3226 516 0.8889 0.5601 0.8889 0.9428
0.3073 8.3548 518 0.9233 0.5431 0.9233 0.9609
0.3073 8.3871 520 0.9656 0.5365 0.9656 0.9827
0.3073 8.4194 522 0.9223 0.5250 0.9223 0.9604
0.3073 8.4516 524 0.8584 0.5029 0.8584 0.9265
0.3073 8.4839 526 0.8692 0.4898 0.8692 0.9323
0.3073 8.5161 528 0.8686 0.4898 0.8686 0.9320
0.3073 8.5484 530 0.8284 0.5650 0.8284 0.9102
0.3073 8.5806 532 0.8002 0.5560 0.8002 0.8945
0.3073 8.6129 534 0.7931 0.5560 0.7931 0.8906
0.3073 8.6452 536 0.7920 0.5560 0.7920 0.8899
0.3073 8.6774 538 0.7551 0.5959 0.7551 0.8690
0.3073 8.7097 540 0.7348 0.6064 0.7348 0.8572
0.3073 8.7419 542 0.7123 0.6395 0.7123 0.8439
0.3073 8.7742 544 0.7336 0.5700 0.7336 0.8565
0.3073 8.8065 546 0.7569 0.5875 0.7569 0.8700
0.3073 8.8387 548 0.7863 0.5823 0.7863 0.8867
0.3073 8.8710 550 0.8080 0.5443 0.8080 0.8989
0.3073 8.9032 552 0.8117 0.5733 0.8117 0.9009
0.3073 8.9355 554 0.8060 0.5566 0.8060 0.8978
0.3073 8.9677 556 0.8024 0.5220 0.8024 0.8958
0.3073 9.0 558 0.8082 0.4998 0.8082 0.8990
0.3073 9.0323 560 0.8307 0.5353 0.8307 0.9114
0.3073 9.0645 562 0.8880 0.5135 0.8880 0.9423
0.3073 9.0968 564 0.9336 0.5216 0.9336 0.9662
0.3073 9.1290 566 1.0103 0.4295 1.0103 1.0051
0.3073 9.1613 568 0.9492 0.5113 0.9492 0.9743
0.3073 9.1935 570 0.8623 0.4851 0.8623 0.9286
0.3073 9.2258 572 0.8335 0.4575 0.8335 0.9130
0.3073 9.2581 574 0.8488 0.4539 0.8488 0.9213
0.3073 9.2903 576 0.8513 0.4640 0.8513 0.9227
0.3073 9.3226 578 0.8255 0.4734 0.8255 0.9086

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B parameters (F32 tensors, Safetensors format)