ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k13_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch for recomputing these metrics follows the list):

  • Loss: 0.8907
  • Qwk (Quadratic Weighted Kappa): 0.4098
  • Mse (Mean Squared Error): 0.8907
  • Rmse (Root Mean Squared Error): 0.9437
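
Note that the reported Loss equals the Mse, which suggests a regression-style objective. As a reference point, these metrics can be recomputed from predictions with scikit-learn; the sketch below uses placeholder preds/labels arrays and assumes integer gold scores for the quadratic-weighted kappa:

    import numpy as np
    from sklearn.metrics import cohen_kappa_score, mean_squared_error

    # Placeholder arrays: substitute the model's predicted scores and the gold scores.
    preds = np.array([2.1, 3.0, 1.2, 3.8])
    labels = np.array([2, 2, 1, 3])

    mse = mean_squared_error(labels, preds)                      # Mse
    rmse = float(np.sqrt(mse))                                   # Rmse
    qwk = cohen_kappa_score(labels, np.rint(preds).astype(int),  # Qwk needs discrete labels,
                            weights="quadratic")                 # so predictions are rounded here
    print({"mse": mse, "rmse": rmse, "qwk": qwk})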

Model description

More information needed

Intended uses & limitations

More information needed
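
In the absence of documented usage, the following is a minimal inference sketch; it assumes the checkpoint loads as a standard Transformers sequence-classification head on top of AraBERT (the exact head type and score scale are not documented here):

    import torch
    from transformers import AutoTokenizer, AutoModelForSequenceClassification

    repo_id = "MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k13_task5_organization"

    tokenizer = AutoTokenizer.from_pretrained(repo_id)
    model = AutoModelForSequenceClassification.from_pretrained(repo_id)
    model.eval()

    # Placeholder Arabic essay text; the model targets an essay "organization" scoring task.
    text = "..."
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits
    print(logits)  # interpret as a score (regression) or class scores, depending on the head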

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training; a minimal reproduction sketch follows the list:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
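
The sketch below shows how these settings map onto Transformers TrainingArguments; the model, datasets, and metric function are placeholders, since they are not documented in this card:

    from transformers import TrainingArguments, Trainer

    training_args = TrainingArguments(
        output_dir="outputs",
        learning_rate=2e-05,
        per_device_train_batch_size=8,
        per_device_eval_batch_size=8,
        seed=42,
        adam_beta1=0.9,
        adam_beta2=0.999,
        adam_epsilon=1e-08,
        lr_scheduler_type="linear",
        num_train_epochs=100,
    )

    # Hypothetical objects: the actual dataset, model head, and metrics are not documented here.
    # trainer = Trainer(model=model, args=training_args,
    #                   train_dataset=train_dataset, eval_dataset=eval_dataset,
    #                   compute_metrics=compute_metrics)
    # trainer.train()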

Training results

In the table below, rows showing "No log" in the Training Loss column correspond to evaluation steps before the first training-loss logging event at step 500.
Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0303 2 4.1784 -0.0048 4.1784 2.0441
No log 0.0606 4 2.3348 -0.0062 2.3348 1.5280
No log 0.0909 6 1.5504 0.0724 1.5504 1.2451
No log 0.1212 8 1.2584 0.0909 1.2584 1.1218
No log 0.1515 10 1.0294 0.1755 1.0294 1.0146
No log 0.1818 12 1.0897 0.1532 1.0897 1.0439
No log 0.2121 14 1.0587 0.1908 1.0587 1.0289
No log 0.2424 16 1.0302 0.1418 1.0302 1.0150
No log 0.2727 18 0.9569 0.2695 0.9569 0.9782
No log 0.3030 20 0.9477 0.2933 0.9477 0.9735
No log 0.3333 22 0.9722 0.2135 0.9722 0.9860
No log 0.3636 24 1.0067 0.1954 1.0067 1.0033
No log 0.3939 26 0.9531 0.2594 0.9531 0.9762
No log 0.4242 28 0.9405 0.2736 0.9405 0.9698
No log 0.4545 30 1.1675 0.1700 1.1675 1.0805
No log 0.4848 32 1.2742 0.1138 1.2742 1.1288
No log 0.5152 34 1.1626 0.1770 1.1626 1.0782
No log 0.5455 36 1.0455 0.2212 1.0455 1.0225
No log 0.5758 38 0.9454 0.3243 0.9454 0.9723
No log 0.6061 40 0.9944 0.1487 0.9944 0.9972
No log 0.6364 42 1.0618 0.1487 1.0618 1.0304
No log 0.6667 44 1.0948 0.1360 1.0948 1.0463
No log 0.6970 46 1.0695 0.1516 1.0695 1.0341
No log 0.7273 48 0.9745 0.1671 0.9745 0.9871
No log 0.7576 50 0.9490 0.3693 0.9490 0.9742
No log 0.7879 52 0.9550 0.3857 0.9550 0.9772
No log 0.8182 54 1.0086 0.2782 1.0086 1.0043
No log 0.8485 56 1.0748 0.2754 1.0748 1.0367
No log 0.8788 58 1.0970 0.2271 1.0970 1.0474
No log 0.9091 60 1.0224 0.2318 1.0224 1.0111
No log 0.9394 62 0.9671 0.2015 0.9671 0.9834
No log 0.9697 64 1.0171 0.0919 1.0171 1.0085
No log 1.0 66 1.0412 0.1528 1.0412 1.0204
No log 1.0303 68 0.9949 0.1727 0.9949 0.9974
No log 1.0606 70 1.0101 0.0981 1.0101 1.0050
No log 1.0909 72 1.0119 0.1981 1.0119 1.0059
No log 1.1212 74 1.0046 0.1908 1.0046 1.0023
No log 1.1515 76 0.9919 0.1989 0.9919 0.9960
No log 1.1818 78 0.9716 0.2416 0.9716 0.9857
No log 1.2121 80 0.9400 0.2566 0.9400 0.9695
No log 1.2424 82 0.8949 0.2967 0.8949 0.9460
No log 1.2727 84 0.8889 0.3351 0.8889 0.9428
No log 1.3030 86 1.0885 0.3452 1.0885 1.0433
No log 1.3333 88 1.2649 0.3199 1.2649 1.1247
No log 1.3636 90 1.2093 0.2284 1.2093 1.0997
No log 1.3939 92 1.0139 0.2238 1.0139 1.0069
No log 1.4242 94 0.8803 0.3658 0.8803 0.9382
No log 1.4545 96 0.8831 0.3658 0.8831 0.9398
No log 1.4848 98 0.9830 0.2077 0.9830 0.9915
No log 1.5152 100 1.0990 0.2512 1.0990 1.0483
No log 1.5455 102 1.2958 0.2730 1.2958 1.1383
No log 1.5758 104 1.2998 0.2686 1.2998 1.1401
No log 1.6061 106 1.0784 0.3000 1.0784 1.0385
No log 1.6364 108 0.9715 0.2610 0.9715 0.9857
No log 1.6667 110 1.0287 0.3000 1.0287 1.0142
No log 1.6970 112 1.1288 0.2284 1.1288 1.0625
No log 1.7273 114 1.1735 0.2284 1.1735 1.0833
No log 1.7576 116 1.2445 0.2284 1.2445 1.1156
No log 1.7879 118 1.2156 0.2167 1.2156 1.1025
No log 1.8182 120 1.1007 0.2704 1.1007 1.0491
No log 1.8485 122 0.9070 0.4579 0.9070 0.9524
No log 1.8788 124 0.8456 0.4531 0.8456 0.9195
No log 1.9091 126 0.8620 0.4847 0.8620 0.9284
No log 1.9394 128 0.9978 0.3714 0.9978 0.9989
No log 1.9697 130 1.0757 0.4094 1.0757 1.0371
No log 2.0 132 1.1151 0.4107 1.1151 1.0560
No log 2.0303 134 1.0721 0.3381 1.0721 1.0354
No log 2.0606 136 0.9028 0.4407 0.9028 0.9502
No log 2.0909 138 0.9019 0.4368 0.9019 0.9497
No log 2.1212 140 0.9466 0.4986 0.9466 0.9729
No log 2.1515 142 0.9499 0.4004 0.9499 0.9746
No log 2.1818 144 1.1606 0.3913 1.1606 1.0773
No log 2.2121 146 1.2547 0.3280 1.2547 1.1202
No log 2.2424 148 1.2324 0.2089 1.2324 1.1101
No log 2.2727 150 1.2585 0.1141 1.2585 1.1218
No log 2.3030 152 1.2378 0.1265 1.2378 1.1125
No log 2.3333 154 1.1591 0.1892 1.1591 1.0766
No log 2.3636 156 1.0727 0.2907 1.0727 1.0357
No log 2.3939 158 1.0957 0.3333 1.0957 1.0467
No log 2.4242 160 1.0797 0.3363 1.0797 1.0391
No log 2.4545 162 0.9331 0.3695 0.9331 0.9660
No log 2.4848 164 0.9269 0.3695 0.9269 0.9628
No log 2.5152 166 0.8854 0.4809 0.8854 0.9410
No log 2.5455 168 0.8702 0.4928 0.8702 0.9328
No log 2.5758 170 1.0318 0.3848 1.0318 1.0158
No log 2.6061 172 1.1179 0.3848 1.1179 1.0573
No log 2.6364 174 1.0135 0.4181 1.0135 1.0067
No log 2.6667 176 1.0573 0.4181 1.0573 1.0282
No log 2.6970 178 0.9606 0.3937 0.9606 0.9801
No log 2.7273 180 0.9720 0.3937 0.9720 0.9859
No log 2.7576 182 1.1680 0.3126 1.1680 1.0807
No log 2.7879 184 1.4380 0.2515 1.4380 1.1992
No log 2.8182 186 1.4639 0.2677 1.4639 1.2099
No log 2.8485 188 1.2647 0.1700 1.2647 1.1246
No log 2.8788 190 1.0125 0.2299 1.0125 1.0062
No log 2.9091 192 0.8477 0.3780 0.8477 0.9207
No log 2.9394 194 0.7974 0.4355 0.7974 0.8930
No log 2.9697 196 0.8035 0.4082 0.8035 0.8964
No log 3.0 198 0.8651 0.3939 0.8651 0.9301
No log 3.0303 200 1.0119 0.3954 1.0119 1.0059
No log 3.0606 202 1.1173 0.3400 1.1173 1.0570
No log 3.0909 204 1.1754 0.2506 1.1754 1.0841
No log 3.1212 206 1.1221 0.3326 1.1221 1.0593
No log 3.1515 208 1.0023 0.4232 1.0023 1.0012
No log 3.1818 210 0.8528 0.4696 0.8528 0.9235
No log 3.2121 212 0.7821 0.4743 0.7821 0.8844
No log 3.2424 214 0.8178 0.3859 0.8178 0.9043
No log 3.2727 216 0.8611 0.4612 0.8611 0.9280
No log 3.3030 218 0.8797 0.4465 0.8797 0.9379
No log 3.3333 220 0.8059 0.4815 0.8059 0.8977
No log 3.3636 222 0.8148 0.4695 0.8148 0.9027
No log 3.3939 224 0.9146 0.4119 0.9146 0.9564
No log 3.4242 226 1.0870 0.4435 1.0870 1.0426
No log 3.4545 228 1.0890 0.4435 1.0890 1.0435
No log 3.4848 230 1.0038 0.4435 1.0038 1.0019
No log 3.5152 232 0.8226 0.4102 0.8226 0.9070
No log 3.5455 234 0.7698 0.4368 0.7698 0.8774
No log 3.5758 236 0.7963 0.4366 0.7963 0.8923
No log 3.6061 238 0.8126 0.4935 0.8126 0.9015
No log 3.6364 240 0.8230 0.4929 0.8230 0.9072
No log 3.6667 242 0.8405 0.5140 0.8405 0.9168
No log 3.6970 244 0.8086 0.4826 0.8086 0.8992
No log 3.7273 246 0.8472 0.5039 0.8472 0.9204
No log 3.7576 248 0.8623 0.5131 0.8623 0.9286
No log 3.7879 250 0.8430 0.5153 0.8430 0.9181
No log 3.8182 252 0.9174 0.4681 0.9174 0.9578
No log 3.8485 254 0.9288 0.4341 0.9288 0.9638
No log 3.8788 256 0.9974 0.3596 0.9974 0.9987
No log 3.9091 258 0.9345 0.3654 0.9345 0.9667
No log 3.9394 260 0.8064 0.3637 0.8064 0.8980
No log 3.9697 262 0.7613 0.4345 0.7613 0.8725
No log 4.0 264 0.7852 0.4227 0.7852 0.8861
No log 4.0303 266 0.9323 0.3721 0.9323 0.9656
No log 4.0606 268 1.1264 0.4130 1.1264 1.0613
No log 4.0909 270 1.1379 0.3833 1.1379 1.0667
No log 4.1212 272 1.0161 0.4786 1.0161 1.0080
No log 4.1515 274 0.9826 0.4988 0.9826 0.9912
No log 4.1818 276 0.9967 0.4775 0.9967 0.9983
No log 4.2121 278 1.0318 0.4235 1.0318 1.0158
No log 4.2424 280 0.9560 0.4232 0.9560 0.9778
No log 4.2727 282 0.8447 0.4696 0.8447 0.9191
No log 4.3030 284 0.7785 0.4727 0.7785 0.8823
No log 4.3333 286 0.7339 0.5107 0.7339 0.8567
No log 4.3636 288 0.7553 0.5425 0.7553 0.8691
No log 4.3939 290 0.7966 0.5516 0.7966 0.8926
No log 4.4242 292 0.7933 0.5208 0.7933 0.8907
No log 4.4545 294 0.7832 0.5503 0.7832 0.8850
No log 4.4848 296 0.8152 0.4858 0.8152 0.9029
No log 4.5152 298 0.9355 0.4334 0.9355 0.9672
No log 4.5455 300 1.0096 0.4048 1.0096 1.0048
No log 4.5758 302 0.9407 0.2864 0.9407 0.9699
No log 4.6061 304 0.8895 0.3021 0.8895 0.9431
No log 4.6364 306 0.8755 0.3663 0.8755 0.9357
No log 4.6667 308 0.8807 0.3021 0.8807 0.9384
No log 4.6970 310 0.8949 0.3363 0.8949 0.9460
No log 4.7273 312 0.8525 0.4203 0.8525 0.9233
No log 4.7576 314 0.8455 0.4471 0.8455 0.9195
No log 4.7879 316 0.8630 0.4588 0.8630 0.9290
No log 4.8182 318 0.9185 0.4907 0.9185 0.9584
No log 4.8485 320 0.9327 0.4907 0.9327 0.9657
No log 4.8788 322 1.0291 0.4681 1.0291 1.0145
No log 4.9091 324 1.1802 0.3296 1.1802 1.0864
No log 4.9394 326 1.0845 0.4053 1.0845 1.0414
No log 4.9697 328 0.8639 0.5487 0.8639 0.9295
No log 5.0 330 0.7865 0.5197 0.7865 0.8869
No log 5.0303 332 0.8187 0.5740 0.8187 0.9048
No log 5.0606 334 0.9137 0.5231 0.9137 0.9559
No log 5.0909 336 0.8750 0.5356 0.8750 0.9354
No log 5.1212 338 0.8540 0.5470 0.8540 0.9241
No log 5.1515 340 0.8986 0.5128 0.8986 0.9479
No log 5.1818 342 0.9142 0.4695 0.9142 0.9561
No log 5.2121 344 0.9027 0.4695 0.9027 0.9501
No log 5.2424 346 0.8739 0.4350 0.8739 0.9348
No log 5.2727 348 0.8422 0.4148 0.8422 0.9177
No log 5.3030 350 0.8541 0.4601 0.8541 0.9242
No log 5.3333 352 0.8997 0.4695 0.8997 0.9485
No log 5.3636 354 0.9483 0.4794 0.9483 0.9738
No log 5.3939 356 0.9355 0.4794 0.9355 0.9672
No log 5.4242 358 0.8276 0.4616 0.8276 0.9097
No log 5.4545 360 0.7654 0.4772 0.7654 0.8749
No log 5.4848 362 0.7610 0.5386 0.7610 0.8723
No log 5.5152 364 0.7889 0.5186 0.7889 0.8882
No log 5.5455 366 0.9786 0.4994 0.9786 0.9892
No log 5.5758 368 1.1777 0.3982 1.1777 1.0852
No log 5.6061 370 1.0999 0.4377 1.0999 1.0488
No log 5.6364 372 0.9757 0.4681 0.9757 0.9878
No log 5.6667 374 0.8024 0.4616 0.8024 0.8958
No log 5.6970 376 0.7663 0.4760 0.7663 0.8754
No log 5.7273 378 0.7585 0.4676 0.7585 0.8709
No log 5.7576 380 0.7457 0.4893 0.7457 0.8635
No log 5.7879 382 0.7720 0.4597 0.7720 0.8786
No log 5.8182 384 0.7730 0.4597 0.7730 0.8792
No log 5.8485 386 0.7393 0.4644 0.7393 0.8598
No log 5.8788 388 0.7335 0.4411 0.7335 0.8565
No log 5.9091 390 0.7330 0.4644 0.7330 0.8562
No log 5.9394 392 0.7363 0.4759 0.7363 0.8581
No log 5.9697 394 0.7238 0.4776 0.7238 0.8508
No log 6.0 396 0.7292 0.5288 0.7292 0.8539
No log 6.0303 398 0.7299 0.4893 0.7299 0.8544
No log 6.0606 400 0.7385 0.4511 0.7385 0.8594
No log 6.0909 402 0.7564 0.4759 0.7564 0.8697
No log 6.1212 404 0.7495 0.4511 0.7495 0.8657
No log 6.1515 406 0.7337 0.4893 0.7337 0.8566
No log 6.1818 408 0.7424 0.4573 0.7424 0.8617
No log 6.2121 410 0.7275 0.4923 0.7275 0.8530
No log 6.2424 412 0.7336 0.4759 0.7336 0.8565
No log 6.2727 414 0.7979 0.5183 0.7979 0.8933
No log 6.3030 416 0.7962 0.5305 0.7962 0.8923
No log 6.3333 418 0.7343 0.4629 0.7343 0.8569
No log 6.3636 420 0.7255 0.5032 0.7255 0.8517
No log 6.3939 422 0.7477 0.4742 0.7477 0.8647
No log 6.4242 424 0.7897 0.4726 0.7897 0.8887
No log 6.4545 426 0.9122 0.5208 0.9122 0.9551
No log 6.4848 428 0.9499 0.4987 0.9499 0.9746
No log 6.5152 430 0.8574 0.5231 0.8574 0.9260
No log 6.5455 432 0.7871 0.4487 0.7871 0.8872
No log 6.5758 434 0.7322 0.4995 0.7322 0.8557
No log 6.6061 436 0.7275 0.5052 0.7275 0.8529
No log 6.6364 438 0.7279 0.5010 0.7279 0.8531
No log 6.6667 440 0.7488 0.4615 0.7488 0.8653
No log 6.6970 442 0.7607 0.4727 0.7607 0.8722
No log 6.7273 444 0.7366 0.4742 0.7366 0.8583
No log 6.7576 446 0.7235 0.4743 0.7235 0.8506
No log 6.7879 448 0.7064 0.4629 0.7064 0.8405
No log 6.8182 450 0.7143 0.4629 0.7143 0.8452
No log 6.8485 452 0.7343 0.4742 0.7343 0.8569
No log 6.8788 454 0.7824 0.4343 0.7824 0.8845
No log 6.9091 456 0.8602 0.5119 0.8602 0.9275
No log 6.9394 458 1.0415 0.5090 1.0415 1.0205
No log 6.9697 460 1.1469 0.4581 1.1469 1.0710
No log 7.0 462 1.0885 0.5068 1.0885 1.0433
No log 7.0303 464 0.9507 0.4681 0.9507 0.9750
No log 7.0606 466 0.8914 0.5443 0.8914 0.9442
No log 7.0909 468 0.9053 0.5331 0.9053 0.9515
No log 7.1212 470 0.8526 0.5041 0.8526 0.9234
No log 7.1515 472 0.8235 0.5048 0.8235 0.9075
No log 7.1818 474 0.7823 0.4586 0.7823 0.8845
No log 7.2121 476 0.8096 0.4586 0.8096 0.8998
No log 7.2424 478 0.8608 0.4444 0.8608 0.9278
No log 7.2727 480 0.8217 0.4057 0.8217 0.9065
No log 7.3030 482 0.7536 0.4227 0.7536 0.8681
No log 7.3333 484 0.7155 0.5405 0.7155 0.8459
No log 7.3636 486 0.7036 0.5168 0.7036 0.8388
No log 7.3939 488 0.6974 0.5405 0.6974 0.8351
No log 7.4242 490 0.7038 0.5631 0.7038 0.8389
No log 7.4545 492 0.7705 0.4960 0.7705 0.8778
No log 7.4848 494 0.8830 0.4460 0.8830 0.9397
No log 7.5152 496 0.9333 0.4208 0.9333 0.9661
No log 7.5455 498 0.8874 0.4326 0.8874 0.9420
0.3521 7.5758 500 0.7879 0.4597 0.7879 0.8876
0.3521 7.6061 502 0.7504 0.4988 0.7504 0.8663
0.3521 7.6364 504 0.7572 0.4743 0.7572 0.8702
0.3521 7.6667 506 0.7923 0.5070 0.7923 0.8901
0.3521 7.6970 508 0.8372 0.4460 0.8372 0.9150
0.3521 7.7273 510 0.8691 0.4696 0.8691 0.9322
0.3521 7.7576 512 0.8250 0.4460 0.8250 0.9083
0.3521 7.7879 514 0.7864 0.4471 0.7864 0.8868
0.3521 7.8182 516 0.7725 0.4613 0.7725 0.8789
0.3521 7.8485 518 0.7497 0.4873 0.7497 0.8659
0.3521 7.8788 520 0.7329 0.5002 0.7329 0.8561
0.3521 7.9091 522 0.7534 0.5084 0.7534 0.8680
0.3521 7.9394 524 0.7729 0.4836 0.7729 0.8792
0.3521 7.9697 526 0.8002 0.4711 0.8002 0.8946
0.3521 8.0 528 0.7995 0.4960 0.7995 0.8942
0.3521 8.0303 530 0.7607 0.5246 0.7607 0.8722
0.3521 8.0606 532 0.7764 0.4988 0.7764 0.8812
0.3521 8.0909 534 0.8260 0.5397 0.8260 0.9088
0.3521 8.1212 536 0.8425 0.5160 0.8425 0.9179
0.3521 8.1515 538 0.8199 0.4220 0.8199 0.9055
0.3521 8.1818 540 0.7706 0.4762 0.7706 0.8779
0.3521 8.2121 542 0.7741 0.4762 0.7741 0.8798
0.3521 8.2424 544 0.7911 0.3959 0.7911 0.8894
0.3521 8.2727 546 0.8230 0.3958 0.8230 0.9072
0.3521 8.3030 548 0.8732 0.3298 0.8732 0.9345
0.3521 8.3333 550 0.8994 0.3921 0.8994 0.9484
0.3521 8.3636 552 0.8907 0.4098 0.8907 0.9437

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
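
To approximate this environment, the versions above can be pinned, e.g. with pip install transformers==4.44.2 datasets==2.21.0 tokenizers==0.19.1, together with a PyTorch 2.4.0 build matching the local CUDA toolkit (the card reports the 2.4.0+cu118 build).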