ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run2_AugV5_k7_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the dataset is not documented in this card). It achieves the following results on the evaluation set:

  • Loss: 0.9621
  • Qwk (quadratic weighted kappa): 0.2441
  • Mse (mean squared error): 0.9621
  • Rmse (root mean squared error): 0.9809
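
These metrics can be reproduced from model predictions in a few lines of plain Python. The sketch below is illustrative (the function names are not from the training code): QWK is Cohen's kappa with quadratic disagreement weights, the standard agreement metric for ordinal essay scores, and RMSE is simply the square root of MSE. Note that the reported Loss equals the Mse, consistent with MSE being the training objective.

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic disagreement weights (QWK)."""
    n = len(y_true)
    # Observed confusion matrix and marginal histograms
    observed = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    hist_true = [y_true.count(i) for i in range(n_classes)]
    hist_pred = [y_pred.count(i) for i in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2   # quadratic weight
            expected = hist_true[i] * hist_pred[j] / n  # chance agreement
            num += w * observed[i][j]
            den += w * expected
    return 1.0 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    return math.sqrt(mse(y_true, y_pred))
```

Perfect agreement gives QWK = 1.0, chance-level agreement gives 0, and systematic disagreement can go negative (as in some early epochs of the table below).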

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
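
As a pure-Python illustration of what these optimizer settings mean (a sketch, not the actual training code): Adam keeps exponential moving averages of the gradient and its square, controlled by the two betas, and the linear scheduler decays the learning rate from 2e-05 toward zero over training. No warmup is assumed, since no warmup steps are listed.

```python
import math

def linear_lr(step, total_steps, base_lr=2e-05):
    """lr_scheduler_type: linear -- decay from base_lr to 0 (no warmup assumed)."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

def adam_step(param, grad, m, v, t, lr, beta1=0.9, beta2=0.999, eps=1e-08):
    """One Adam update with the betas/epsilon listed above (t is the 1-based step)."""
    m = beta1 * m + (1 - beta1) * grad          # first-moment EMA
    v = beta2 * v + (1 - beta2) * grad * grad   # second-moment EMA
    m_hat = m / (1 - beta1 ** t)                # bias correction
    v_hat = v / (1 - beta2 ** t)
    param -= lr * m_hat / (math.sqrt(v_hat) + eps)
    return param, m, v
```

With a batch size of 8 the table below logs evaluation every 2 steps; `total_steps` here would be the full 100-epoch step count.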

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0606 2 4.1836 -0.0087 4.1836 2.0454
No log 0.1212 4 2.9631 -0.0169 2.9631 1.7214
No log 0.1818 6 1.6719 0.0198 1.6719 1.2930
No log 0.2424 8 1.0752 0.1918 1.0752 1.0369
No log 0.3030 10 1.0535 0.1908 1.0535 1.0264
No log 0.3636 12 1.4145 -0.0777 1.4145 1.1893
No log 0.4242 14 1.5873 0.0085 1.5873 1.2599
No log 0.4848 16 1.4464 0.0085 1.4464 1.2027
No log 0.5455 18 1.2543 0.0523 1.2543 1.1200
No log 0.6061 20 1.0174 0.2288 1.0174 1.0087
No log 0.6667 22 1.1062 0.1740 1.1062 1.0518
No log 0.7273 24 1.1710 0.0579 1.1710 1.0821
No log 0.7879 26 1.0410 0.2187 1.0410 1.0203
No log 0.8485 28 1.0377 0.1589 1.0377 1.0187
No log 0.9091 30 1.2955 0.1114 1.2955 1.1382
No log 0.9697 32 1.6258 0.0778 1.6258 1.2751
No log 1.0303 34 1.6258 0.0778 1.6258 1.2751
No log 1.0909 36 1.3947 0.0256 1.3947 1.1810
No log 1.1515 38 1.1026 0.2221 1.1026 1.0500
No log 1.2121 40 0.9799 0.3134 0.9799 0.9899
No log 1.2727 42 0.9824 0.3314 0.9824 0.9912
No log 1.3333 44 1.0060 0.1821 1.0060 1.0030
No log 1.3939 46 1.0861 0.2520 1.0861 1.0422
No log 1.4545 48 1.2301 0.1810 1.2301 1.1091
No log 1.5152 50 1.2303 0.1433 1.2303 1.1092
No log 1.5758 52 1.2770 0.1310 1.2770 1.1300
No log 1.6364 54 1.3110 0.0946 1.3110 1.1450
No log 1.6970 56 1.2395 0.1433 1.2395 1.1133
No log 1.7576 58 1.1942 0.1995 1.1942 1.0928
No log 1.8182 60 1.1275 0.1416 1.1275 1.0618
No log 1.8788 62 1.2282 0.1290 1.2282 1.1082
No log 1.9394 64 1.2675 0.1283 1.2675 1.1258
No log 2.0 66 1.3745 0.1168 1.3745 1.1724
No log 2.0606 68 1.3266 0.1168 1.3266 1.1518
No log 2.1212 70 1.1316 0.2163 1.1316 1.0638
No log 2.1818 72 1.0716 0.2038 1.0716 1.0352
No log 2.2424 74 1.1342 0.2045 1.1342 1.0650
No log 2.3030 76 1.3031 0.2041 1.3031 1.1415
No log 2.3636 78 1.2623 0.2596 1.2623 1.1235
No log 2.4242 80 1.2103 0.2687 1.2103 1.1001
No log 2.4848 82 1.1883 0.1818 1.1883 1.0901
No log 2.5455 84 1.2400 0.2612 1.2400 1.1135
No log 2.6061 86 1.1902 0.3169 1.1902 1.0909
No log 2.6667 88 1.1525 0.2440 1.1525 1.0735
No log 2.7273 90 1.1614 0.2097 1.1614 1.0777
No log 2.7879 92 1.2023 0.1960 1.2023 1.0965
No log 2.8485 94 1.2448 0.2187 1.2448 1.1157
No log 2.9091 96 1.3535 0.2343 1.3535 1.1634
No log 2.9697 98 1.3901 0.2288 1.3901 1.1790
No log 3.0303 100 1.3681 0.2288 1.3681 1.1697
No log 3.0909 102 1.2138 0.2669 1.2138 1.1017
No log 3.1515 104 1.1932 0.2934 1.1932 1.0924
No log 3.2121 106 1.2344 0.2761 1.2344 1.1110
No log 3.2727 108 1.4385 0.0285 1.4385 1.1994
No log 3.3333 110 1.3863 0.0640 1.3863 1.1774
No log 3.3939 112 1.1583 0.2040 1.1583 1.0762
No log 3.4545 114 1.1327 0.3106 1.1327 1.0643
No log 3.5152 116 1.2725 0.3326 1.2725 1.1281
No log 3.5758 118 1.1982 0.2417 1.1982 1.0946
No log 3.6364 120 1.3672 0.0323 1.3672 1.1693
No log 3.6970 122 1.6171 -0.0525 1.6171 1.2716
No log 3.7576 124 1.6478 -0.0741 1.6478 1.2837
No log 3.8182 126 1.4711 0.0419 1.4711 1.2129
No log 3.8788 128 1.1910 0.2417 1.1910 1.0913
No log 3.9394 130 1.1381 0.1935 1.1381 1.0668
No log 4.0 132 1.1479 0.3152 1.1479 1.0714
No log 4.0606 134 1.2736 0.1967 1.2736 1.1285
No log 4.1212 136 1.4660 0.1076 1.4660 1.2108
No log 4.1818 138 1.4656 0.1405 1.4656 1.2106
No log 4.2424 140 1.3045 0.1736 1.3045 1.1422
No log 4.3030 142 1.0897 0.3275 1.0897 1.0439
No log 4.3636 144 1.0823 0.4007 1.0823 1.0403
No log 4.4242 146 1.2558 0.3413 1.2558 1.1206
No log 4.4848 148 1.1925 0.3528 1.1925 1.0920
No log 4.5455 150 1.0350 0.2697 1.0350 1.0174
No log 4.6061 152 1.0757 0.2909 1.0757 1.0372
No log 4.6667 154 1.1394 0.2628 1.1394 1.0674
No log 4.7273 156 1.0477 0.3826 1.0477 1.0236
No log 4.7879 158 0.9905 0.3113 0.9905 0.9952
No log 4.8485 160 0.9904 0.3503 0.9904 0.9952
No log 4.9091 162 0.9731 0.3678 0.9731 0.9865
No log 4.9697 164 0.9610 0.4374 0.9610 0.9803
No log 5.0303 166 0.9501 0.3908 0.9501 0.9747
No log 5.0909 168 0.9486 0.4248 0.9486 0.9740
No log 5.1515 170 0.9679 0.4726 0.9679 0.9838
No log 5.2121 172 1.0296 0.3675 1.0296 1.0147
No log 5.2727 174 1.0847 0.3556 1.0847 1.0415
No log 5.3333 176 0.9700 0.4379 0.9700 0.9849
No log 5.3939 178 0.8898 0.4434 0.8898 0.9433
No log 5.4545 180 0.8802 0.4705 0.8802 0.9382
No log 5.5152 182 0.9117 0.4273 0.9117 0.9548
No log 5.5758 184 1.0231 0.3913 1.0231 1.0115
No log 5.6364 186 1.0347 0.3913 1.0347 1.0172
No log 5.6970 188 0.9233 0.3657 0.9233 0.9609
No log 5.7576 190 0.8940 0.4105 0.8940 0.9455
No log 5.8182 192 0.8956 0.4105 0.8956 0.9464
No log 5.8788 194 0.9129 0.4073 0.9129 0.9555
No log 5.9394 196 0.9146 0.3820 0.9146 0.9564
No log 6.0 198 0.9053 0.4130 0.9053 0.9515
No log 6.0606 200 0.9219 0.4249 0.9219 0.9601
No log 6.1212 202 0.9147 0.4221 0.9147 0.9564
No log 6.1818 204 0.9159 0.3398 0.9159 0.9570
No log 6.2424 206 0.9486 0.3510 0.9486 0.9740
No log 6.3030 208 0.9267 0.3897 0.9267 0.9626
No log 6.3636 210 0.8890 0.4065 0.8890 0.9429
No log 6.4242 212 0.8837 0.4856 0.8837 0.9400
No log 6.4848 214 0.8876 0.4657 0.8876 0.9421
No log 6.5455 216 0.8790 0.4856 0.8790 0.9375
No log 6.6061 218 0.8697 0.4578 0.8697 0.9326
No log 6.6667 220 0.8904 0.3860 0.8904 0.9436
No log 6.7273 222 0.9506 0.3298 0.9506 0.9750
No log 6.7879 224 0.9459 0.3008 0.9459 0.9726
No log 6.8485 226 0.9156 0.3188 0.9156 0.9569
No log 6.9091 228 0.8861 0.3915 0.8861 0.9413
No log 6.9697 230 0.8880 0.4180 0.8880 0.9424
No log 7.0303 232 0.9146 0.4106 0.9146 0.9564
No log 7.0909 234 1.0193 0.3787 1.0193 1.0096
No log 7.1515 236 1.0404 0.3787 1.0404 1.0200
No log 7.2121 238 0.9575 0.2819 0.9575 0.9785
No log 7.2727 240 0.9176 0.2316 0.9176 0.9579
No log 7.3333 242 0.9236 0.3023 0.9236 0.9611
No log 7.3939 244 0.9112 0.2880 0.9112 0.9545
No log 7.4545 246 0.9104 0.2579 0.9104 0.9541
No log 7.5152 248 0.9629 0.2721 0.9629 0.9813
No log 7.5758 250 0.9820 0.2336 0.9820 0.9910
No log 7.6364 252 0.9983 0.2128 0.9983 0.9991
No log 7.6970 254 0.9541 0.2480 0.9541 0.9768
No log 7.7576 256 0.9381 0.2993 0.9381 0.9686
No log 7.8182 258 0.9483 0.2787 0.9483 0.9738
No log 7.8788 260 0.9611 0.2676 0.9611 0.9803
No log 7.9394 262 0.9660 0.2092 0.9660 0.9828
No log 8.0 264 0.9687 0.2092 0.9687 0.9842
No log 8.0606 266 0.9737 0.2448 0.9737 0.9868
No log 8.1212 268 0.9805 0.2448 0.9805 0.9902
No log 8.1818 270 1.0183 0.3023 1.0183 1.0091
No log 8.2424 272 1.0296 0.2857 1.0296 1.0147
No log 8.3030 274 1.0400 0.2857 1.0400 1.0198
No log 8.3636 276 1.0449 0.2472 1.0449 1.0222
No log 8.4242 278 1.0398 0.2472 1.0398 1.0197
No log 8.4848 280 1.0343 0.2472 1.0343 1.0170
No log 8.5455 282 1.0287 0.2713 1.0287 1.0142
No log 8.6061 284 1.0264 0.3076 1.0264 1.0131
No log 8.6667 286 1.0306 0.2785 1.0306 1.0152
No log 8.7273 288 1.0286 0.2424 1.0286 1.0142
No log 8.7879 290 1.0155 0.2763 1.0155 1.0077
No log 8.8485 292 1.0107 0.1623 1.0107 1.0053
No log 8.9091 294 1.0138 0.1885 1.0138 1.0069
No log 8.9697 296 1.0140 0.2579 1.0140 1.0070
No log 9.0303 298 1.0074 0.1987 1.0074 1.0037
No log 9.0909 300 1.0311 0.2352 1.0311 1.0154
No log 9.1515 302 1.0030 0.2639 1.0030 1.0015
No log 9.2121 304 0.9903 0.2639 0.9903 0.9952
No log 9.2727 306 0.9926 0.2639 0.9926 0.9963
No log 9.3333 308 0.9959 0.2057 0.9959 0.9980
No log 9.3939 310 1.0216 0.2654 1.0216 1.0107
No log 9.4545 312 1.0165 0.2351 1.0165 1.0082
No log 9.5152 314 1.0436 0.3101 1.0436 1.0216
No log 9.5758 316 1.0322 0.3216 1.0322 1.0160
No log 9.6364 318 1.0241 0.2931 1.0241 1.0120
No log 9.6970 320 1.0230 0.3338 1.0230 1.0115
No log 9.7576 322 0.9785 0.3112 0.9785 0.9892
No log 9.8182 324 0.9996 0.3181 0.9996 0.9998
No log 9.8788 326 1.0581 0.3277 1.0581 1.0286
No log 9.9394 328 1.0753 0.3028 1.0753 1.0369
No log 10.0 330 1.0248 0.3001 1.0248 1.0123
No log 10.0606 332 1.0129 0.2424 1.0129 1.0064
No log 10.1212 334 1.0087 0.2879 1.0087 1.0043
No log 10.1818 336 1.0034 0.2716 1.0034 1.0017
No log 10.2424 338 1.0068 0.3160 1.0068 1.0034
No log 10.3030 340 1.0057 0.2923 1.0057 1.0028
No log 10.3636 342 1.0251 0.2759 1.0251 1.0125
No log 10.4242 344 1.0211 0.2616 1.0211 1.0105
No log 10.4848 346 0.9915 0.2341 0.9915 0.9957
No log 10.5455 348 0.9852 0.2663 0.9852 0.9926
No log 10.6061 350 0.9877 0.2268 0.9877 0.9938
No log 10.6667 352 1.0201 0.2902 1.0201 1.0100
No log 10.7273 354 1.0081 0.2376 1.0081 1.0041
No log 10.7879 356 0.9580 0.3299 0.9580 0.9788
No log 10.8485 358 0.9674 0.3184 0.9674 0.9835
No log 10.9091 360 0.9805 0.3072 0.9805 0.9902
No log 10.9697 362 0.9862 0.4215 0.9862 0.9931
No log 11.0303 364 1.0859 0.3377 1.0859 1.0421
No log 11.0909 366 1.1645 0.3383 1.1645 1.0791
No log 11.1515 368 1.1177 0.3269 1.1177 1.0572
No log 11.2121 370 1.0094 0.3986 1.0094 1.0047
No log 11.2727 372 0.9968 0.2602 0.9968 0.9984
No log 11.3333 374 1.0001 0.2900 1.0001 1.0001
No log 11.3939 376 1.0023 0.2716 1.0023 1.0011
No log 11.4545 378 1.0537 0.2689 1.0537 1.0265
No log 11.5152 380 1.0701 0.3142 1.0701 1.0345
No log 11.5758 382 1.0386 0.3001 1.0386 1.0191
No log 11.6364 384 0.9990 0.3222 0.9990 0.9995
No log 11.6970 386 0.9814 0.2812 0.9814 0.9907
No log 11.7576 388 0.9651 0.2667 0.9651 0.9824
No log 11.8182 390 0.9561 0.2974 0.9561 0.9778
No log 11.8788 392 0.9556 0.3914 0.9556 0.9776
No log 11.9394 394 0.9378 0.3781 0.9378 0.9684
No log 12.0 396 0.9364 0.3896 0.9364 0.9677
No log 12.0606 398 0.9359 0.3781 0.9359 0.9674
No log 12.1212 400 0.9326 0.3154 0.9326 0.9657
No log 12.1818 402 0.9346 0.3138 0.9346 0.9668
No log 12.2424 404 0.9379 0.3101 0.9379 0.9685
No log 12.3030 406 0.9177 0.3525 0.9177 0.9579
No log 12.3636 408 0.9177 0.3215 0.9177 0.9580
No log 12.4242 410 0.9209 0.3896 0.9208 0.9596
No log 12.4848 412 0.9346 0.3188 0.9346 0.9668
No log 12.5455 414 0.9399 0.3250 0.9399 0.9695
No log 12.6061 416 0.9312 0.2965 0.9312 0.9650
No log 12.6667 418 0.9312 0.2812 0.9312 0.9650
No log 12.7273 420 0.9601 0.2736 0.9601 0.9798
No log 12.7879 422 0.9657 0.3063 0.9657 0.9827
No log 12.8485 424 0.9366 0.2978 0.9366 0.9678
No log 12.9091 426 0.9291 0.3365 0.9291 0.9639
No log 12.9697 428 0.9491 0.2857 0.9491 0.9742
No log 13.0303 430 0.9632 0.2857 0.9632 0.9814
No log 13.0909 432 0.9418 0.2615 0.9418 0.9704
No log 13.1515 434 0.9466 0.2088 0.9466 0.9729
No log 13.2121 436 0.9606 0.3107 0.9606 0.9801
No log 13.2727 438 0.9419 0.2965 0.9419 0.9705
No log 13.3333 440 0.9243 0.2615 0.9243 0.9614
No log 13.3939 442 0.9570 0.2902 0.9570 0.9783
No log 13.4545 444 0.9559 0.2902 0.9559 0.9777
No log 13.5152 446 0.9200 0.3243 0.9200 0.9592
No log 13.5758 448 0.9032 0.3435 0.9032 0.9503
No log 13.6364 450 0.9146 0.3250 0.9146 0.9563
No log 13.6970 452 0.9012 0.2944 0.9012 0.9493
No log 13.7576 454 0.9008 0.3631 0.9008 0.9491
No log 13.8182 456 0.9285 0.3202 0.9285 0.9636
No log 13.8788 458 0.9691 0.3869 0.9691 0.9844
No log 13.9394 460 0.9677 0.4023 0.9677 0.9837
No log 14.0 462 0.9046 0.3622 0.9046 0.9511
No log 14.0606 464 0.8612 0.3876 0.8612 0.9280
No log 14.1212 466 0.8645 0.3857 0.8645 0.9298
No log 14.1818 468 0.8765 0.3280 0.8765 0.9362
No log 14.2424 470 0.8919 0.3403 0.8919 0.9444
No log 14.3030 472 0.9087 0.3403 0.9087 0.9533
No log 14.3636 474 0.9113 0.3129 0.9113 0.9546
No log 14.4242 476 0.8924 0.3403 0.8924 0.9447
No log 14.4848 478 0.8856 0.3713 0.8856 0.9410
No log 14.5455 480 0.8782 0.4485 0.8782 0.9371
No log 14.6061 482 0.8677 0.5103 0.8677 0.9315
No log 14.6667 484 0.8595 0.5232 0.8594 0.9271
No log 14.7273 486 0.8480 0.4422 0.8480 0.9209
No log 14.7879 488 0.8469 0.3680 0.8469 0.9203
No log 14.8485 490 0.8472 0.3676 0.8472 0.9204
No log 14.9091 492 0.8651 0.3695 0.8651 0.9301
No log 14.9697 494 0.8576 0.3695 0.8576 0.9261
No log 15.0303 496 0.8439 0.3059 0.8439 0.9186
No log 15.0909 498 0.8545 0.3680 0.8545 0.9244
0.3149 15.1515 500 0.8827 0.3365 0.8827 0.9395
0.3149 15.2121 502 0.9653 0.2441 0.9653 0.9825
0.3149 15.2727 504 1.0250 0.3243 1.0250 1.0124
0.3149 15.3333 506 0.9641 0.2441 0.9641 0.9819
0.3149 15.3939 508 0.8911 0.3059 0.8911 0.9440
0.3149 15.4545 510 0.8938 0.3737 0.8938 0.9454
0.3149 15.5152 512 0.8934 0.3301 0.8934 0.9452
0.3149 15.5758 514 0.8967 0.2787 0.8967 0.9469
0.3149 15.6364 516 0.9445 0.2441 0.9445 0.9718
0.3149 15.6970 518 0.9815 0.2441 0.9815 0.9907
0.3149 15.7576 520 0.9621 0.2441 0.9621 0.9809

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model files

  • Format: Safetensors
  • Model size: 0.1B params
  • Tensor type: F32

Model tree for MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run2_AugV5_k7_task5_organization

  • Base model: aubmindlab/bert-base-arabertv02