ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run2_AugV5_k17_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7223
  • Qwk (quadratic weighted kappa): 0.4883
  • Mse (mean squared error): 0.7223
  • Rmse (root mean squared error): 0.8499
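
For reference, the quadratic weighted kappa, MSE, and RMSE reported above can be computed from gold and predicted organization scores with standard tooling. The snippet below is a minimal sketch; scikit-learn and the toy score arrays are assumptions for illustration, not part of this card.

```python
# Minimal sketch of how the reported metrics could be computed.
# scikit-learn and the toy arrays below are assumptions, not from this card.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([3, 2, 4, 1, 3])  # hypothetical gold organization scores
y_pred = np.array([3, 3, 4, 2, 3])  # hypothetical predictions, rounded to the score scale

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
print(f"Qwk: {qwk:.4f}  Mse: {mse:.4f}  Rmse: {rmse:.4f}")
```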

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
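
These values map directly onto a standard Hugging Face TrainingArguments configuration. The sketch below shows that mapping; the output directory and anything else not listed above are illustrative assumptions.

```python
# Sketch of the TrainingArguments implied by the hyperparameters above.
# Only the listed values are taken from the card; output_dir is an
# illustrative assumption.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="arabert_task5_organization",  # assumed name
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer's default
    # optimizer, so it needs no explicit setting here.
)
# `args` would then be passed to transformers.Trainer together with the
# aubmindlab/bert-base-arabertv02 model and the (undocumented) essay datasets.
```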

Training results

Validation metrics were recorded every two steps. "No log" in the Training Loss column means the training loss had not yet been logged at that point; it is first reported at step 500.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0370 2 4.0314 -0.0115 4.0314 2.0078
No log 0.0741 4 2.1774 0.0159 2.1774 1.4756
No log 0.1111 6 1.5129 -0.0180 1.5129 1.2300
No log 0.1481 8 1.3779 0.0506 1.3779 1.1738
No log 0.1852 10 1.5519 -0.0219 1.5519 1.2457
No log 0.2222 12 2.1339 0.0440 2.1339 1.4608
No log 0.2593 14 1.9950 0.0741 1.9950 1.4125
No log 0.2963 16 1.6973 0.0416 1.6973 1.3028
No log 0.3333 18 1.3540 0.0530 1.3540 1.1636
No log 0.3704 20 1.1490 0.2212 1.1490 1.0719
No log 0.4074 22 1.0640 0.3104 1.0640 1.0315
No log 0.4444 24 1.1480 0.3849 1.1480 1.0714
No log 0.4815 26 1.3491 0.1438 1.3491 1.1615
No log 0.5185 28 1.2300 0.2889 1.2300 1.1091
No log 0.5556 30 1.0467 0.3655 1.0467 1.0231
No log 0.5926 32 1.0372 0.2566 1.0372 1.0184
No log 0.6296 34 1.0633 0.2365 1.0633 1.0311
No log 0.6667 36 1.0264 0.2517 1.0264 1.0131
No log 0.7037 38 1.0335 0.3037 1.0335 1.0166
No log 0.7407 40 1.0253 0.3139 1.0253 1.0126
No log 0.7778 42 1.0648 0.2806 1.0648 1.0319
No log 0.8148 44 1.1464 0.2871 1.1464 1.0707
No log 0.8519 46 1.1212 0.2376 1.1212 1.0589
No log 0.8889 48 1.1192 0.1969 1.1192 1.0579
No log 0.9259 50 1.1759 0.1361 1.1759 1.0844
No log 0.9630 52 1.4701 0.0568 1.4701 1.2125
No log 1.0 54 1.5767 0.0371 1.5767 1.2557
No log 1.0370 56 1.2630 0.1352 1.2630 1.1238
No log 1.0741 58 1.0026 0.2663 1.0026 1.0013
No log 1.1111 60 0.9558 0.2490 0.9558 0.9777
No log 1.1481 62 0.9513 0.2716 0.9513 0.9754
No log 1.1852 64 0.9790 0.2068 0.9790 0.9894
No log 1.2222 66 0.9710 0.2466 0.9710 0.9854
No log 1.2593 68 0.9507 0.3094 0.9507 0.9750
No log 1.2963 70 0.9808 0.2795 0.9808 0.9903
No log 1.3333 72 0.9888 0.3071 0.9888 0.9944
No log 1.3704 74 1.0259 0.2539 1.0259 1.0129
No log 1.4074 76 1.1663 0.0938 1.1663 1.0800
No log 1.4444 78 1.1560 0.2094 1.1560 1.0752
No log 1.4815 80 0.9845 0.2424 0.9845 0.9922
No log 1.5185 82 0.8479 0.4482 0.8479 0.9208
No log 1.5556 84 0.9346 0.4033 0.9346 0.9668
No log 1.5926 86 1.0441 0.3117 1.0441 1.0218
No log 1.6296 88 0.9318 0.4456 0.9318 0.9653
No log 1.6667 90 0.8720 0.3970 0.8720 0.9338
No log 1.7037 92 0.8240 0.4370 0.8240 0.9077
No log 1.7407 94 0.7993 0.4482 0.7993 0.8940
No log 1.7778 96 0.7839 0.5399 0.7839 0.8854
No log 1.8148 98 0.8487 0.5084 0.8487 0.9213
No log 1.8519 100 0.9115 0.4710 0.9115 0.9547
No log 1.8889 102 0.8672 0.5858 0.8672 0.9312
No log 1.9259 104 0.7979 0.3836 0.7979 0.8932
No log 1.9630 106 0.9485 0.3672 0.9485 0.9739
No log 2.0 108 1.0164 0.3560 1.0164 1.0082
No log 2.0370 110 0.8851 0.3126 0.8851 0.9408
No log 2.0741 112 0.8054 0.4062 0.8054 0.8974
No log 2.1111 114 0.9276 0.4711 0.9276 0.9631
No log 2.1481 116 0.9232 0.4148 0.9232 0.9608
No log 2.1852 118 0.8734 0.3035 0.8734 0.9346
No log 2.2222 120 0.8190 0.3932 0.8190 0.9050
No log 2.2593 122 0.8107 0.4401 0.8107 0.9004
No log 2.2963 124 0.9129 0.4812 0.9129 0.9555
No log 2.3333 126 0.9718 0.4369 0.9718 0.9858
No log 2.3704 128 1.0128 0.4191 1.0128 1.0064
No log 2.4074 130 0.9207 0.4969 0.9207 0.9595
No log 2.4444 132 0.8779 0.4330 0.8779 0.9369
No log 2.4815 134 0.9524 0.4497 0.9524 0.9759
No log 2.5185 136 0.9344 0.4388 0.9344 0.9667
No log 2.5556 138 0.8326 0.5275 0.8326 0.9125
No log 2.5926 140 0.8509 0.5390 0.8509 0.9224
No log 2.6296 142 0.9871 0.4423 0.9871 0.9935
No log 2.6667 144 1.0090 0.4693 1.0090 1.0045
No log 2.7037 146 0.9046 0.5006 0.9046 0.9511
No log 2.7407 148 0.8713 0.5211 0.8713 0.9335
No log 2.7778 150 0.8079 0.4623 0.8079 0.8989
No log 2.8148 152 0.8478 0.4923 0.8478 0.9207
No log 2.8519 154 0.9160 0.4598 0.9160 0.9571
No log 2.8889 156 0.8297 0.4805 0.8297 0.9109
No log 2.9259 158 0.7337 0.5291 0.7337 0.8565
No log 2.9630 160 0.6929 0.5415 0.6929 0.8324
No log 3.0 162 0.7041 0.4813 0.7041 0.8391
No log 3.0370 164 0.6933 0.5168 0.6933 0.8327
No log 3.0741 166 0.7704 0.5062 0.7704 0.8778
No log 3.1111 168 0.8810 0.4286 0.8810 0.9386
No log 3.1481 170 0.9307 0.4093 0.9307 0.9647
No log 3.1852 172 0.8482 0.4910 0.8482 0.9210
No log 3.2222 174 0.7005 0.6012 0.7005 0.8370
No log 3.2593 176 0.6948 0.6510 0.6948 0.8336
No log 3.2963 178 0.7219 0.5819 0.7219 0.8496
No log 3.3333 180 0.7620 0.5756 0.7620 0.8729
No log 3.3704 182 0.7976 0.5340 0.7976 0.8931
No log 3.4074 184 1.0268 0.4288 1.0268 1.0133
No log 3.4444 186 1.0836 0.4669 1.0836 1.0410
No log 3.4815 188 0.9007 0.4587 0.9007 0.9491
No log 3.5185 190 0.7155 0.5584 0.7155 0.8459
No log 3.5556 192 0.6490 0.6422 0.6490 0.8056
No log 3.5926 194 0.6504 0.6082 0.6504 0.8065
No log 3.6296 196 0.7022 0.5584 0.7022 0.8380
No log 3.6667 198 0.8463 0.5098 0.8463 0.9199
No log 3.7037 200 0.9507 0.4572 0.9507 0.9750
No log 3.7407 202 0.8246 0.5319 0.8246 0.9081
No log 3.7778 204 0.6795 0.5123 0.6795 0.8243
No log 3.8148 206 0.6819 0.5226 0.6819 0.8258
No log 3.8519 208 0.6803 0.5329 0.6803 0.8248
No log 3.8889 210 0.6836 0.4888 0.6836 0.8268
No log 3.9259 212 0.7694 0.5487 0.7694 0.8772
No log 3.9630 214 0.8256 0.5266 0.8256 0.9086
No log 4.0 216 0.8912 0.4812 0.8912 0.9440
No log 4.0370 218 0.9125 0.5230 0.9125 0.9553
No log 4.0741 220 0.9442 0.5115 0.9442 0.9717
No log 4.1111 222 0.9331 0.4783 0.9331 0.9660
No log 4.1481 224 0.8000 0.3804 0.8000 0.8944
No log 4.1852 226 0.7586 0.4794 0.7586 0.8710
No log 4.2222 228 0.7839 0.4841 0.7839 0.8854
No log 4.2593 230 0.7649 0.4174 0.7649 0.8746
No log 4.2963 232 0.7656 0.4227 0.7656 0.8750
No log 4.3333 234 0.8384 0.5020 0.8384 0.9156
No log 4.3704 236 0.8922 0.4783 0.8922 0.9446
No log 4.4074 238 0.8736 0.5230 0.8736 0.9347
No log 4.4444 240 0.8488 0.5242 0.8488 0.9213
No log 4.4815 242 0.7419 0.5054 0.7419 0.8613
No log 4.5185 244 0.7234 0.4843 0.7234 0.8505
No log 4.5556 246 0.7858 0.5385 0.7858 0.8865
No log 4.5926 248 0.8180 0.5675 0.8180 0.9044
No log 4.6296 250 0.7861 0.5385 0.7861 0.8866
No log 4.6667 252 0.7816 0.4466 0.7816 0.8841
No log 4.7037 254 0.7792 0.4461 0.7792 0.8827
No log 4.7407 256 0.7941 0.4335 0.7941 0.8911
No log 4.7778 258 0.8282 0.5266 0.8282 0.9101
No log 4.8148 260 0.8282 0.4935 0.8282 0.9101
No log 4.8519 262 0.8185 0.4935 0.8185 0.9047
No log 4.8889 264 0.7623 0.5397 0.7623 0.8731
No log 4.9259 266 0.7403 0.5844 0.7403 0.8604
No log 4.9630 268 0.7533 0.5688 0.7533 0.8679
No log 5.0 270 0.8105 0.5140 0.8105 0.9003
No log 5.0370 272 0.7861 0.5688 0.7861 0.8866
No log 5.0741 274 0.7311 0.5129 0.7311 0.8551
No log 5.1111 276 0.7349 0.4754 0.7349 0.8573
No log 5.1481 278 0.7600 0.5494 0.7600 0.8718
No log 5.1852 280 0.7555 0.5054 0.7555 0.8692
No log 5.2222 282 0.7784 0.5140 0.7784 0.8822
No log 5.2593 284 0.8674 0.5220 0.8674 0.9313
No log 5.2963 286 1.0306 0.4576 1.0306 1.0152
No log 5.3333 288 0.9775 0.4572 0.9775 0.9887
No log 5.3704 290 0.8215 0.4695 0.8215 0.9064
No log 5.4074 292 0.7504 0.4531 0.7504 0.8663
No log 5.4444 294 0.7541 0.4676 0.7541 0.8684
No log 5.4815 296 0.7686 0.4629 0.7686 0.8767
No log 5.5185 298 0.7851 0.4954 0.7851 0.8861
No log 5.5556 300 0.7612 0.5407 0.7612 0.8725
No log 5.5926 302 0.7724 0.5510 0.7724 0.8788
No log 5.6296 304 0.8256 0.5046 0.8256 0.9086
No log 5.6667 306 0.8999 0.4474 0.8999 0.9486
No log 5.7037 308 0.9424 0.4119 0.9424 0.9708
No log 5.7407 310 0.9266 0.4459 0.9266 0.9626
No log 5.7778 312 0.8804 0.4575 0.8804 0.9383
No log 5.8148 314 0.7819 0.4466 0.7819 0.8843
No log 5.8519 316 0.7455 0.4115 0.7455 0.8634
No log 5.8889 318 0.7537 0.4254 0.7537 0.8682
No log 5.9259 320 0.7913 0.4940 0.7913 0.8896
No log 5.9630 322 0.7696 0.5195 0.7696 0.8773
No log 6.0 324 0.7326 0.5010 0.7326 0.8559
No log 6.0370 326 0.7516 0.5536 0.7516 0.8670
No log 6.0741 328 0.7622 0.5089 0.7622 0.8731
No log 6.1111 330 0.7587 0.4145 0.7587 0.8711
No log 6.1481 332 0.8075 0.4344 0.8075 0.8986
No log 6.1852 334 0.8645 0.3939 0.8645 0.9298
No log 6.2222 336 0.8333 0.3802 0.8333 0.9129
No log 6.2593 338 0.7869 0.3576 0.7869 0.8871
No log 6.2963 340 0.7525 0.3616 0.7525 0.8675
No log 6.3333 342 0.7308 0.4215 0.7308 0.8549
No log 6.3704 344 0.7205 0.4720 0.7205 0.8488
No log 6.4074 346 0.7070 0.4705 0.7070 0.8408
No log 6.4444 348 0.7094 0.6028 0.7094 0.8423
No log 6.4815 350 0.7149 0.6002 0.7149 0.8455
No log 6.5185 352 0.6930 0.6012 0.6930 0.8325
No log 6.5556 354 0.6987 0.5510 0.6987 0.8359
No log 6.5926 356 0.6964 0.5510 0.6964 0.8345
No log 6.6296 358 0.7099 0.5279 0.7099 0.8425
No log 6.6667 360 0.7444 0.4935 0.7444 0.8628
No log 6.7037 362 0.7737 0.5644 0.7737 0.8796
No log 6.7407 364 0.7946 0.5644 0.7946 0.8914
No log 6.7778 366 0.8643 0.5098 0.8643 0.9297
No log 6.8148 368 0.8112 0.5242 0.8112 0.9007
No log 6.8519 370 0.7483 0.5809 0.7483 0.8651
No log 6.8889 372 0.7244 0.5905 0.7244 0.8511
No log 6.9259 374 0.7328 0.5905 0.7328 0.8560
No log 6.9630 376 0.7646 0.5809 0.7646 0.8744
No log 7.0 378 0.8654 0.5242 0.8654 0.9303
No log 7.0370 380 0.8567 0.5033 0.8567 0.9256
No log 7.0741 382 0.7668 0.4974 0.7668 0.8757
No log 7.1111 384 0.7477 0.3741 0.7477 0.8647
No log 7.1481 386 0.7722 0.4510 0.7722 0.8788
No log 7.1852 388 0.7361 0.4087 0.7361 0.8580
No log 7.2222 390 0.7200 0.4371 0.7200 0.8485
No log 7.2593 392 0.7445 0.5305 0.7445 0.8628
No log 7.2963 394 0.7847 0.5033 0.7847 0.8859
No log 7.3333 396 0.7628 0.5266 0.7628 0.8734
No log 7.3704 398 0.7457 0.4960 0.7457 0.8636
No log 7.4074 400 0.7352 0.5317 0.7352 0.8574
No log 7.4444 402 0.7216 0.4313 0.7216 0.8495
No log 7.4815 404 0.7245 0.4423 0.7245 0.8512
No log 7.5185 406 0.7315 0.5112 0.7315 0.8553
No log 7.5556 408 0.7697 0.4382 0.7697 0.8773
No log 7.5926 410 0.8341 0.5150 0.8341 0.9133
No log 7.6296 412 0.8058 0.4382 0.8058 0.8976
No log 7.6667 414 0.7953 0.4499 0.7953 0.8918
No log 7.7037 416 0.8211 0.4375 0.8211 0.9061
No log 7.7407 418 0.8848 0.4224 0.8848 0.9406
No log 7.7778 420 0.8982 0.4465 0.8982 0.9477
No log 7.8148 422 0.8387 0.4483 0.8387 0.9158
No log 7.8519 424 0.8156 0.4743 0.8156 0.9031
No log 7.8889 426 0.8142 0.4743 0.8142 0.9023
No log 7.9259 428 0.8071 0.4742 0.8071 0.8984
No log 7.9630 430 0.8218 0.4491 0.8218 0.9065
No log 8.0 432 0.8759 0.4815 0.8759 0.9359
No log 8.0370 434 0.8774 0.5150 0.8774 0.9367
No log 8.0741 436 0.8232 0.5392 0.8232 0.9073
No log 8.1111 438 0.7935 0.4372 0.7935 0.8908
No log 8.1481 440 0.8138 0.5419 0.8138 0.9021
No log 8.1852 442 0.7955 0.5215 0.7955 0.8919
No log 8.2222 444 0.7998 0.5640 0.7998 0.8943
No log 8.2593 446 0.8049 0.5640 0.8049 0.8972
No log 8.2963 448 0.8697 0.5291 0.8697 0.9326
No log 8.3333 450 0.8982 0.4940 0.8982 0.9478
No log 8.3704 452 0.8995 0.4755 0.8995 0.9484
No log 8.4074 454 0.8939 0.4646 0.8939 0.9455
No log 8.4444 456 0.9237 0.4620 0.9237 0.9611
No log 8.4815 458 0.9669 0.4255 0.9669 0.9833
No log 8.5185 460 1.0196 0.4272 1.0196 1.0097
No log 8.5556 462 1.0114 0.4279 1.0114 1.0057
No log 8.5926 464 0.8957 0.4592 0.8957 0.9464
No log 8.6296 466 0.8156 0.4385 0.8156 0.9031
No log 8.6667 468 0.8196 0.4499 0.8196 0.9053
No log 8.7037 470 0.8041 0.4385 0.8041 0.8967
No log 8.7407 472 0.8166 0.4727 0.8166 0.9037
No log 8.7778 474 0.8258 0.5062 0.8258 0.9088
No log 8.8148 476 0.7834 0.4645 0.7834 0.8851
No log 8.8519 478 0.7725 0.4550 0.7725 0.8789
No log 8.8889 480 0.7768 0.5112 0.7768 0.8814
No log 8.9259 482 0.7705 0.4898 0.7705 0.8778
No log 8.9630 484 0.7659 0.5036 0.7659 0.8752
No log 9.0 486 0.7679 0.5430 0.7679 0.8763
No log 9.0370 488 0.7702 0.5542 0.7702 0.8776
No log 9.0741 490 0.7321 0.5287 0.7321 0.8556
No log 9.1111 492 0.7366 0.4873 0.7366 0.8582
No log 9.1481 494 0.7382 0.5558 0.7382 0.8592
No log 9.1852 496 0.7166 0.5673 0.7166 0.8465
No log 9.2222 498 0.7115 0.5673 0.7115 0.8435
0.2932 9.2593 500 0.6984 0.5673 0.6984 0.8357
0.2932 9.2963 502 0.6857 0.6293 0.6857 0.8280
0.2932 9.3333 504 0.6839 0.6085 0.6839 0.8270
0.2932 9.3704 506 0.7401 0.5614 0.7401 0.8603
0.2932 9.4074 508 0.7876 0.5266 0.7876 0.8875
0.2932 9.4444 510 0.7455 0.5195 0.7455 0.8634
0.2932 9.4815 512 0.7176 0.5480 0.7176 0.8471
0.2932 9.5185 514 0.7223 0.4883 0.7223 0.8499

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
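
A quick way to confirm that a local environment matches these versions is to print them at runtime; the check below is a convenience sketch, not part of the original card.

```python
# Convenience sketch: verify installed versions against those listed above.
import datasets
import tokenizers
import torch
import transformers

print("Transformers:", transformers.__version__)  # expected 4.44.2
print("PyTorch:", torch.__version__)              # expected 2.4.0+cu118
print("Datasets:", datasets.__version__)          # expected 2.21.0
print("Tokenizers:", tokenizers.__version__)      # expected 0.19.1
```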