ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run2_AugV5_k18_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7778
  • Qwk: 0.5246
  • Mse: 0.7778
  • Rmse: 0.8819
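
Qwk here is quadratic weighted kappa, Mse is mean squared error on the predicted scores, and Rmse is its square root. The card does not include the evaluation script, but a minimal sketch of how these metrics could be computed is shown below (function and variable names are illustrative):

```python
# Hedged sketch: assumed metric definitions, not the card's actual evaluation code.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(y_true, y_pred):
    """Quadratic weighted kappa, MSE, and RMSE on integer score labels."""
    # QWK compares rounded predictions against the gold integer scores.
    qwk = cohen_kappa_score(y_true, np.rint(y_pred).astype(int), weights="quadratic")
    mse = mean_squared_error(y_true, y_pred)
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}
```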

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
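
These values match the defaults of the Hugging Face Trainer apart from the explicit settings above (the listed Adam betas and epsilon are the library defaults). A minimal sketch of how the same configuration might be expressed is shown below; the output path, the single-output head, and the evaluation cadence are assumptions rather than details taken from this card:

```python
# Hedged sketch: the listed hyperparameters expressed as TrainingArguments.
from transformers import AutoModelForSequenceClassification, AutoTokenizer, TrainingArguments

model_name = "aubmindlab/bert-base-arabertv02"
tokenizer = AutoTokenizer.from_pretrained(model_name)
# A single-output (regression-style) head is assumed; the card does not state
# whether the task was framed as regression or classification.
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=1)

args = TrainingArguments(
    output_dir="arabert_task5_organization",  # illustrative path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    eval_strategy="steps",
    eval_steps=2,  # the results table below logs validation every 2 steps
)
# Pass `model`, `args`, and tokenized train/eval datasets to `transformers.Trainer`
# to reproduce a run like the one summarized in the table below.
```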

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0345 2 3.8422 -0.0151 3.8422 1.9602
No log 0.0690 4 2.0386 0.0247 2.0386 1.4278
No log 0.1034 6 1.4058 0.0380 1.4058 1.1857
No log 0.1379 8 1.1044 0.2391 1.1044 1.0509
No log 0.1724 10 1.1958 0.1261 1.1958 1.0935
No log 0.2069 12 1.3160 0.0 1.3160 1.1472
No log 0.2414 14 1.2933 0.0380 1.2933 1.1372
No log 0.2759 16 1.5693 0.0 1.5693 1.2527
No log 0.3103 18 1.4323 0.0 1.4323 1.1968
No log 0.3448 20 1.1650 0.0374 1.1650 1.0794
No log 0.3793 22 1.0867 0.2466 1.0867 1.0424
No log 0.4138 24 1.0753 0.1854 1.0753 1.0370
No log 0.4483 26 1.0536 0.1573 1.0536 1.0264
No log 0.4828 28 1.0670 0.3332 1.0670 1.0329
No log 0.5172 30 1.2430 0.0731 1.2430 1.1149
No log 0.5517 32 1.4658 -0.0113 1.4658 1.2107
No log 0.5862 34 1.7807 0.1213 1.7807 1.3344
No log 0.6207 36 1.6286 0.0757 1.6286 1.2762
No log 0.6552 38 1.2440 0.1114 1.2440 1.1153
No log 0.6897 40 1.0561 0.2639 1.0561 1.0277
No log 0.7241 42 1.0430 0.2239 1.0430 1.0213
No log 0.7586 44 1.0381 0.1740 1.0381 1.0189
No log 0.7931 46 1.0388 0.1891 1.0388 1.0192
No log 0.8276 48 1.0365 0.1767 1.0365 1.0181
No log 0.8621 50 1.0674 0.2293 1.0674 1.0331
No log 0.8966 52 1.1181 0.1810 1.1181 1.0574
No log 0.9310 54 1.1311 0.1903 1.1311 1.0636
No log 0.9655 56 1.0628 0.2543 1.0628 1.0309
No log 1.0 58 1.0286 0.2901 1.0286 1.0142
No log 1.0345 60 1.0743 0.2268 1.0743 1.0365
No log 1.0690 62 1.1889 0.2592 1.1889 1.0904
No log 1.1034 64 1.2340 0.2221 1.2340 1.1108
No log 1.1379 66 1.1627 0.2049 1.1627 1.0783
No log 1.1724 68 1.1112 0.2735 1.1112 1.0541
No log 1.2069 70 1.0401 0.3230 1.0401 1.0198
No log 1.2414 72 0.9978 0.3359 0.9978 0.9989
No log 1.2759 74 0.9141 0.3876 0.9141 0.9561
No log 1.3103 76 0.8817 0.3508 0.8817 0.9390
No log 1.3448 78 0.8374 0.4124 0.8374 0.9151
No log 1.3793 80 0.8971 0.4318 0.8971 0.9471
No log 1.4138 82 1.0221 0.4192 1.0221 1.0110
No log 1.4483 84 0.9346 0.4081 0.9346 0.9668
No log 1.4828 86 0.8926 0.4301 0.8926 0.9448
No log 1.5172 88 0.7755 0.5018 0.7755 0.8806
No log 1.5517 90 0.7378 0.5402 0.7378 0.8589
No log 1.5862 92 0.7275 0.5606 0.7275 0.8529
No log 1.6207 94 0.7698 0.5054 0.7698 0.8774
No log 1.6552 96 0.8964 0.4081 0.8964 0.9468
No log 1.6897 98 0.9244 0.4284 0.9244 0.9615
No log 1.7241 100 0.8726 0.4326 0.8726 0.9341
No log 1.7586 102 0.7970 0.5383 0.7970 0.8928
No log 1.7931 104 0.7782 0.5313 0.7782 0.8822
No log 1.8276 106 0.7962 0.5497 0.7962 0.8923
No log 1.8621 108 0.8196 0.5575 0.8196 0.9053
No log 1.8966 110 0.7967 0.5879 0.7967 0.8926
No log 1.9310 112 0.7764 0.5931 0.7764 0.8811
No log 1.9655 114 0.7626 0.5796 0.7626 0.8733
No log 2.0 116 0.7630 0.5892 0.7630 0.8735
No log 2.0345 118 0.8106 0.5279 0.8106 0.9003
No log 2.0690 120 0.7907 0.5072 0.7907 0.8892
No log 2.1034 122 0.7667 0.5002 0.7667 0.8756
No log 2.1379 124 0.7716 0.5361 0.7716 0.8784
No log 2.1724 126 0.8733 0.4695 0.8733 0.9345
No log 2.2069 128 0.8896 0.4695 0.8896 0.9432
No log 2.2414 130 0.7888 0.4726 0.7888 0.8882
No log 2.2759 132 0.7595 0.5558 0.7595 0.8715
No log 2.3103 134 0.7860 0.5571 0.7860 0.8866
No log 2.3448 136 0.7918 0.5214 0.7918 0.8899
No log 2.3793 138 0.7786 0.5444 0.7786 0.8824
No log 2.4138 140 0.7725 0.5532 0.7725 0.8789
No log 2.4483 142 0.8082 0.5845 0.8082 0.8990
No log 2.4828 144 0.8579 0.5192 0.8579 0.9262
No log 2.5172 146 0.7631 0.6107 0.7631 0.8735
No log 2.5517 148 0.7846 0.5642 0.7846 0.8858
No log 2.5862 150 0.9686 0.4375 0.9686 0.9842
No log 2.6207 152 0.9441 0.3989 0.9441 0.9717
No log 2.6552 154 0.8217 0.4456 0.8217 0.9065
No log 2.6897 156 0.7815 0.4610 0.7815 0.8840
No log 2.7241 158 0.8311 0.4571 0.8311 0.9116
No log 2.7586 160 0.8194 0.4859 0.8194 0.9052
No log 2.7931 162 0.7915 0.4606 0.7915 0.8897
No log 2.8276 164 0.8593 0.4697 0.8593 0.9270
No log 2.8621 166 1.0557 0.3606 1.0557 1.0275
No log 2.8966 168 1.0392 0.3778 1.0392 1.0194
No log 2.9310 170 0.9416 0.4164 0.9416 0.9704
No log 2.9655 172 0.8013 0.5316 0.8013 0.8952
No log 3.0 174 0.7891 0.5431 0.7891 0.8883
No log 3.0345 176 0.8221 0.5304 0.8221 0.9067
No log 3.0690 178 0.8403 0.5144 0.8403 0.9167
No log 3.1034 180 0.8280 0.5279 0.8280 0.9099
No log 3.1379 182 0.9126 0.4383 0.9126 0.9553
No log 3.1724 184 0.9170 0.3688 0.9170 0.9576
No log 3.2069 186 0.8390 0.4093 0.8390 0.9160
No log 3.2414 188 0.8086 0.4534 0.8086 0.8992
No log 3.2759 190 0.8591 0.3945 0.8591 0.9269
No log 3.3103 192 0.9915 0.4013 0.9915 0.9957
No log 3.3448 194 0.9877 0.4415 0.9877 0.9938
No log 3.3793 196 0.8401 0.5342 0.8401 0.9166
No log 3.4138 198 0.7852 0.5645 0.7852 0.8861
No log 3.4483 200 0.7992 0.5516 0.7992 0.8940
No log 3.4828 202 0.9126 0.4906 0.9126 0.9553
No log 3.5172 204 1.0160 0.4590 1.0160 1.0080
No log 3.5517 206 0.9519 0.4796 0.9519 0.9757
No log 3.5862 208 0.7502 0.5798 0.7502 0.8662
No log 3.6207 210 0.7468 0.6147 0.7468 0.8642
No log 3.6552 212 0.8397 0.5614 0.8397 0.9163
No log 3.6897 214 1.0878 0.4199 1.0878 1.0430
No log 3.7241 216 1.2376 0.3688 1.2376 1.1125
No log 3.7586 218 1.1367 0.1845 1.1367 1.0662
No log 3.7931 220 0.9318 0.2441 0.9318 0.9653
No log 3.8276 222 0.8192 0.4764 0.8192 0.9051
No log 3.8621 224 0.7933 0.4781 0.7933 0.8907
No log 3.8966 226 0.7977 0.5287 0.7977 0.8932
No log 3.9310 228 0.7933 0.5261 0.7933 0.8907
No log 3.9655 230 0.8923 0.4812 0.8923 0.9446
No log 4.0 232 1.0278 0.4034 1.0278 1.0138
No log 4.0345 234 0.9332 0.5012 0.9332 0.9660
No log 4.0690 236 0.7708 0.5507 0.7708 0.8779
No log 4.1034 238 0.7578 0.5131 0.7578 0.8705
No log 4.1379 240 0.7605 0.6048 0.7605 0.8721
No log 4.1724 242 0.7512 0.6301 0.7512 0.8667
No log 4.2069 244 0.7544 0.6247 0.7544 0.8685
No log 4.2414 246 0.7884 0.5571 0.7884 0.8879
No log 4.2759 248 0.8886 0.4694 0.8886 0.9426
No log 4.3103 250 1.0471 0.3461 1.0471 1.0233
No log 4.3448 252 0.9367 0.4081 0.9367 0.9678
No log 4.3793 254 0.8007 0.4186 0.8007 0.8948
No log 4.4138 256 0.7689 0.4748 0.7689 0.8768
No log 4.4483 258 0.7426 0.4462 0.7426 0.8617
No log 4.4828 260 0.7216 0.4987 0.7216 0.8495
No log 4.5172 262 0.6966 0.5490 0.6966 0.8346
No log 4.5517 264 0.6725 0.6537 0.6725 0.8201
No log 4.5862 266 0.6641 0.6689 0.6641 0.8149
No log 4.6207 268 0.6651 0.6644 0.6651 0.8155
No log 4.6552 270 0.7456 0.5963 0.7456 0.8635
No log 4.6897 272 0.9565 0.4590 0.9565 0.9780
No log 4.7241 274 0.9891 0.4281 0.9891 0.9945
No log 4.7586 276 0.9032 0.4994 0.9032 0.9504
No log 4.7931 278 0.7870 0.5243 0.7870 0.8871
No log 4.8276 280 0.7157 0.6493 0.7157 0.8460
No log 4.8621 282 0.7124 0.6766 0.7124 0.8440
No log 4.8966 284 0.7373 0.6721 0.7373 0.8587
No log 4.9310 286 0.8470 0.5511 0.8470 0.9203
No log 4.9655 288 0.8848 0.4906 0.8848 0.9407
No log 5.0 290 0.8044 0.5888 0.8044 0.8969
No log 5.0345 292 0.7435 0.5442 0.7435 0.8622
No log 5.0690 294 0.7617 0.5442 0.7617 0.8728
No log 5.1034 296 0.7960 0.5516 0.7960 0.8922
No log 5.1379 298 0.8612 0.5584 0.8612 0.9280
No log 5.1724 300 0.8530 0.5383 0.8530 0.9236
No log 5.2069 302 0.8029 0.5173 0.8029 0.8961
No log 5.2414 304 0.8144 0.5566 0.8144 0.9024
No log 5.2759 306 0.8021 0.5875 0.8021 0.8956
No log 5.3103 308 0.8010 0.6064 0.8010 0.8950
No log 5.3448 310 0.7512 0.6446 0.7512 0.8667
No log 5.3793 312 0.7478 0.6380 0.7478 0.8647
No log 5.4138 314 0.8036 0.5875 0.8036 0.8964
No log 5.4483 316 0.9362 0.4387 0.9362 0.9676
No log 5.4828 318 0.9108 0.4796 0.9108 0.9544
No log 5.5172 320 0.7947 0.4745 0.7947 0.8914
No log 5.5517 322 0.7669 0.5146 0.7669 0.8757
No log 5.5862 324 0.7984 0.5645 0.7984 0.8935
No log 5.6207 326 0.8518 0.5566 0.8518 0.9229
No log 5.6552 328 0.8425 0.5368 0.8425 0.9179
No log 5.6897 330 0.8608 0.4604 0.8608 0.9278
No log 5.7241 332 0.8448 0.4954 0.8448 0.9191
No log 5.7586 334 0.8202 0.4948 0.8202 0.9056
No log 5.7931 336 0.7759 0.5357 0.7759 0.8808
No log 5.8276 338 0.7662 0.5135 0.7662 0.8753
No log 5.8621 340 0.7743 0.4742 0.7743 0.8799
No log 5.8966 342 0.7580 0.4760 0.7580 0.8706
No log 5.9310 344 0.7583 0.5523 0.7583 0.8708
No log 5.9655 346 0.7683 0.5419 0.7683 0.8766
No log 6.0 348 0.8210 0.5888 0.8210 0.9061
No log 6.0345 350 0.8146 0.5614 0.8146 0.9025
No log 6.0690 352 0.7906 0.5614 0.7906 0.8892
No log 6.1034 354 0.7692 0.4743 0.7692 0.8770
No log 6.1379 356 0.7627 0.4878 0.7627 0.8733
No log 6.1724 358 0.7516 0.5288 0.7516 0.8669
No log 6.2069 360 0.7552 0.5614 0.7552 0.8690
No log 6.2414 362 0.8077 0.5864 0.8077 0.8987
No log 6.2759 364 0.8069 0.5864 0.8069 0.8983
No log 6.3103 366 0.7652 0.5482 0.7652 0.8747
No log 6.3448 368 0.7457 0.5565 0.7457 0.8635
No log 6.3793 370 0.7290 0.6058 0.7290 0.8538
No log 6.4138 372 0.7610 0.6593 0.7610 0.8723
No log 6.4483 374 0.8102 0.6227 0.8102 0.9001
No log 6.4828 376 0.8193 0.6344 0.8193 0.9052
No log 6.5172 378 0.7606 0.5822 0.7606 0.8721
No log 6.5517 380 0.7093 0.5955 0.7093 0.8422
No log 6.5862 382 0.7240 0.5630 0.7240 0.8509
No log 6.6207 384 0.7166 0.6124 0.7166 0.8465
No log 6.6552 386 0.7399 0.5510 0.7399 0.8602
No log 6.6897 388 0.7608 0.5697 0.7608 0.8723
No log 6.7241 390 0.7873 0.5697 0.7873 0.8873
No log 6.7586 392 0.7620 0.5318 0.7620 0.8729
No log 6.7931 394 0.7333 0.5131 0.7333 0.8564
No log 6.8276 396 0.7392 0.5316 0.7392 0.8597
No log 6.8621 398 0.7377 0.5316 0.7377 0.8589
No log 6.8966 400 0.7408 0.5467 0.7408 0.8607
No log 6.9310 402 0.8044 0.5912 0.8044 0.8969
No log 6.9655 404 0.8147 0.6045 0.8147 0.9026
No log 7.0 406 0.7661 0.6695 0.7661 0.8753
No log 7.0345 408 0.7462 0.6415 0.7462 0.8638
No log 7.0690 410 0.7569 0.6593 0.7569 0.8700
No log 7.1034 412 0.7885 0.5798 0.7885 0.8880
No log 7.1379 414 0.7568 0.6415 0.7568 0.8699
No log 7.1724 416 0.7572 0.5169 0.7572 0.8702
No log 7.2069 418 0.7534 0.5155 0.7534 0.8680
No log 7.2414 420 0.7725 0.5438 0.7725 0.8789
No log 7.2759 422 0.8204 0.5383 0.8204 0.9058
No log 7.3103 424 0.7944 0.5173 0.7944 0.8913
No log 7.3448 426 0.7597 0.5475 0.7597 0.8716
No log 7.3793 428 0.7647 0.5155 0.7647 0.8745
No log 7.4138 430 0.7773 0.4883 0.7773 0.8816
No log 7.4483 432 0.8847 0.5150 0.8847 0.9406
No log 7.4828 434 1.0734 0.4882 1.0734 1.0361
No log 7.5172 436 1.1147 0.3878 1.1147 1.0558
No log 7.5517 438 1.0232 0.4783 1.0232 1.0115
No log 7.5862 440 1.0376 0.4783 1.0376 1.0186
No log 7.6207 442 1.0709 0.4783 1.0709 1.0348
No log 7.6552 444 1.0938 0.4152 1.0938 1.0458
No log 7.6897 446 1.0359 0.4587 1.0359 1.0178
No log 7.7241 448 0.9953 0.5098 0.9953 0.9977
No log 7.7586 450 0.8578 0.5650 0.8578 0.9262
No log 7.7931 452 0.8221 0.5462 0.8221 0.9067
No log 7.8276 454 0.8641 0.5234 0.8641 0.9296
No log 7.8621 456 0.9145 0.5411 0.9145 0.9563
No log 7.8966 458 0.8971 0.5715 0.8971 0.9471
No log 7.9310 460 0.7987 0.5699 0.7987 0.8937
No log 7.9655 462 0.7418 0.5796 0.7418 0.8613
No log 8.0 464 0.7332 0.6087 0.7332 0.8563
No log 8.0345 466 0.7408 0.5370 0.7408 0.8607
No log 8.0690 468 0.7893 0.4616 0.7893 0.8884
No log 8.1034 470 0.8470 0.5126 0.8470 0.9203
No log 8.1379 472 0.8342 0.5430 0.8342 0.9133
No log 8.1724 474 0.7772 0.6672 0.7772 0.8816
No log 8.2069 476 0.7677 0.6041 0.7677 0.8762
No log 8.2414 478 0.7588 0.6206 0.7588 0.8711
No log 8.2759 480 0.7993 0.5658 0.7993 0.8941
No log 8.3103 482 0.8330 0.5658 0.8330 0.9127
No log 8.3448 484 0.7878 0.5875 0.7878 0.8876
No log 8.3793 486 0.7490 0.6147 0.7490 0.8654
No log 8.4138 488 0.7096 0.6266 0.7096 0.8424
No log 8.4483 490 0.7069 0.5403 0.7069 0.8407
No log 8.4828 492 0.7075 0.5630 0.7075 0.8411
No log 8.5172 494 0.6861 0.6196 0.6861 0.8283
No log 8.5517 496 0.7306 0.5599 0.7306 0.8548
No log 8.5862 498 0.7929 0.5566 0.7929 0.8905
0.2909 8.6207 500 0.7583 0.5566 0.7583 0.8708
0.2909 8.6552 502 0.6960 0.6345 0.6960 0.8342
0.2909 8.6897 504 0.7033 0.5524 0.7033 0.8386
0.2909 8.7241 506 0.7083 0.5421 0.7083 0.8416
0.2909 8.7586 508 0.6841 0.5647 0.6841 0.8271
0.2909 8.7931 510 0.6891 0.6272 0.6891 0.8301
0.2909 8.8276 512 0.6877 0.5647 0.6877 0.8293
0.2909 8.8621 514 0.6949 0.5530 0.6949 0.8336
0.2909 8.8966 516 0.7014 0.5548 0.7014 0.8375
0.2909 8.9310 518 0.6826 0.5054 0.6826 0.8262
0.2909 8.9655 520 0.7000 0.5746 0.7000 0.8366
0.2909 9.0 522 0.7584 0.5059 0.7584 0.8709
0.2909 9.0345 524 0.8067 0.5677 0.8067 0.8982
0.2909 9.0690 526 0.7663 0.5279 0.7663 0.8754
0.2909 9.1034 528 0.7531 0.5710 0.7531 0.8678
0.2909 9.1379 530 0.7521 0.6324 0.7521 0.8672
0.2909 9.1724 532 0.7561 0.6324 0.7561 0.8695
0.2909 9.2069 534 0.7521 0.6157 0.7521 0.8672
0.2909 9.2414 536 0.7802 0.5059 0.7802 0.8833
0.2909 9.2759 538 0.8075 0.4943 0.8075 0.8986
0.2909 9.3103 540 0.7886 0.5059 0.7886 0.8880
0.2909 9.3448 542 0.7613 0.4869 0.7613 0.8725
0.2909 9.3793 544 0.7451 0.6096 0.7451 0.8632
0.2909 9.4138 546 0.7502 0.5972 0.7502 0.8661
0.2909 9.4483 548 0.7509 0.5246 0.7509 0.8666
0.2909 9.4828 550 0.7468 0.5972 0.7468 0.8642
0.2909 9.5172 552 0.7491 0.5485 0.7491 0.8655
0.2909 9.5517 554 0.7612 0.4988 0.7612 0.8724
0.2909 9.5862 556 0.7791 0.4988 0.7791 0.8827
0.2909 9.6207 558 0.7638 0.4981 0.7638 0.8739
0.2909 9.6552 560 0.7489 0.5972 0.7489 0.8654
0.2909 9.6897 562 0.7560 0.5747 0.7560 0.8695
0.2909 9.7241 564 0.7657 0.5485 0.7657 0.8750
0.2909 9.7586 566 0.7903 0.4981 0.7903 0.8890
0.2909 9.7931 568 0.8245 0.4615 0.8245 0.9080
0.2909 9.8276 570 0.8066 0.4615 0.8066 0.8981
0.2909 9.8621 572 0.7778 0.5246 0.7778 0.8819

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
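
With these libraries installed, the checkpoint can be loaded as a sequence-classification model. The sketch below is a minimal usage example; the single-output head and the rounding of the predicted score are assumptions, since the card does not describe the output format:

```python
# Hedged sketch: loading the fine-tuned checkpoint and scoring one essay.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "MayBashendy/ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run2_AugV5_k18_task5_organization"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)

text = "نص المقال هنا"  # placeholder Arabic essay text
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()  # assumes a single-output head
print(round(score))  # predicted organization score (rounded; integer scale assumed)
```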