ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k17_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5573
  • Qwk: 0.4479
  • Mse: 0.5573
  • Rmse: 0.7465

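Qwk (quadratic weighted kappa) and Rmse above follow their standard definitions; the evaluation script itself is not published, but a minimal pure-Python sketch of both metrics, assuming integer class labels, looks like:

```python
from collections import Counter

def rmse(y_true, y_pred):
    """Root mean squared error over paired label lists."""
    return (sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)) ** 0.5

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic disagreement weights (QWK)."""
    n = len(y_true)
    # Observed confusion matrix.
    observed = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # Expected matrix from the outer product of the two marginal histograms,
    # normalized so it sums to n like the observed matrix.
    hist_t, hist_p = Counter(y_true), Counter(y_pred)
    expected = [[hist_t[i] * hist_p[j] / n for j in range(n_classes)]
                for i in range(n_classes)]
    # Quadratic weights penalize distant disagreements more heavily.
    w = lambda i, j: (i - j) ** 2 / (n_classes - 1) ** 2
    num = sum(w(i, j) * observed[i][j]
              for i in range(n_classes) for j in range(n_classes))
    den = sum(w(i, j) * expected[i][j]
              for i in range(n_classes) for j in range(n_classes))
    return 1.0 - num / den
```

Perfect agreement yields a kappa of 1.0, chance-level agreement yields 0.0, and systematic disagreement goes negative, matching the early rows of the table below.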
Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
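Restated as a Python dict, using the corresponding `transformers.TrainingArguments` parameter names (an assumption about how training was launched; the card lists only the values):

```python
# Hyperparameters from the list above, keyed by TrainingArguments names,
# e.g. for transformers.TrainingArguments(output_dir="out", **hparams).
hparams = {
    "learning_rate": 2e-05,
    "per_device_train_batch_size": 8,
    "per_device_eval_batch_size": 8,
    "seed": 42,
    "adam_beta1": 0.9,
    "adam_beta2": 0.999,
    "adam_epsilon": 1e-08,
    "lr_scheduler_type": "linear",
    "num_train_epochs": 100,
}
```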

Training results

Training loss is logged only every 500 steps, so rows before step 500 show "No log" in the first column.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0351 2 2.4984 -0.0593 2.4984 1.5806
No log 0.0702 4 1.2488 0.0987 1.2488 1.1175
No log 0.1053 6 0.9315 0.1685 0.9315 0.9651
No log 0.1404 8 0.8848 -0.0354 0.8848 0.9406
No log 0.1754 10 1.0820 -0.2685 1.0820 1.0402
No log 0.2105 12 1.1396 -0.2311 1.1396 1.0675
No log 0.2456 14 0.9291 -0.0426 0.9291 0.9639
No log 0.2807 16 0.8822 0.0 0.8822 0.9392
No log 0.3158 18 0.9031 -0.0426 0.9031 0.9503
No log 0.3509 20 0.8042 0.0 0.8042 0.8968
No log 0.3860 22 0.7751 0.0393 0.7751 0.8804
No log 0.4211 24 0.8461 0.0078 0.8461 0.9199
No log 0.4561 26 0.8552 0.0547 0.8552 0.9248
No log 0.4912 28 0.9317 0.1685 0.9317 0.9653
No log 0.5263 30 0.8854 0.1328 0.8854 0.9410
No log 0.5614 32 0.6983 0.0027 0.6983 0.8356
No log 0.5965 34 0.6799 0.2857 0.6799 0.8245
No log 0.6316 36 0.7033 0.2085 0.7033 0.8386
No log 0.6667 38 0.7469 -0.0389 0.7469 0.8642
No log 0.7018 40 0.9358 0.0203 0.9358 0.9673
No log 0.7368 42 1.1116 0.0679 1.1116 1.0543
No log 0.7719 44 1.1057 0.0679 1.1057 1.0515
No log 0.8070 46 0.9756 -0.0173 0.9756 0.9877
No log 0.8421 48 0.8167 0.0940 0.8167 0.9037
No log 0.8772 50 0.7631 0.3173 0.7631 0.8736
No log 0.9123 52 0.8424 0.3192 0.8424 0.9178
No log 0.9474 54 0.8142 0.3466 0.8142 0.9023
No log 0.9825 56 0.6228 0.3125 0.6228 0.7892
No log 1.0175 58 0.6276 0.2930 0.6276 0.7922
No log 1.0526 60 0.7168 0.3166 0.7168 0.8466
No log 1.0877 62 0.7462 0.1139 0.7462 0.8638
No log 1.1228 64 0.7654 0.0393 0.7654 0.8749
No log 1.1579 66 0.8057 0.0 0.8057 0.8976
No log 1.1930 68 0.9004 0.0053 0.9004 0.9489
No log 1.2281 70 0.8840 0.1711 0.8840 0.9402
No log 1.2632 72 0.8711 0.2066 0.8711 0.9333
No log 1.2982 74 0.8544 0.2046 0.8544 0.9243
No log 1.3333 76 0.8293 0.2435 0.8293 0.9107
No log 1.3684 78 0.7546 0.1770 0.7546 0.8687
No log 1.4035 80 0.6524 0.1272 0.6524 0.8077
No log 1.4386 82 0.6055 0.1591 0.6055 0.7781
No log 1.4737 84 0.5788 0.2930 0.5788 0.7608
No log 1.5088 86 0.5787 0.2930 0.5787 0.7607
No log 1.5439 88 0.5835 0.3228 0.5835 0.7639
No log 1.5789 90 0.6276 0.2389 0.6276 0.7922
No log 1.6140 92 0.6691 0.2743 0.6691 0.8180
No log 1.6491 94 0.6737 0.2389 0.6737 0.8208
No log 1.6842 96 0.7614 0.2408 0.7614 0.8726
No log 1.7193 98 0.7570 0.2066 0.7570 0.8700
No log 1.7544 100 0.6542 0.2418 0.6542 0.8089
No log 1.7895 102 0.6264 0.3196 0.6264 0.7915
No log 1.8246 104 0.6868 0.3586 0.6868 0.8287
No log 1.8596 106 0.6354 0.3196 0.6354 0.7971
No log 1.8947 108 0.6756 0.3061 0.6756 0.8220
No log 1.9298 110 0.7321 0.3010 0.7321 0.8556
No log 1.9649 112 0.6819 0.3320 0.6819 0.8258
No log 2.0 114 0.6418 0.4535 0.6418 0.8011
No log 2.0351 116 0.6120 0.5034 0.6120 0.7823
No log 2.0702 118 0.6458 0.4788 0.6458 0.8036
No log 2.1053 120 0.6959 0.4292 0.6959 0.8342
No log 2.1404 122 0.8567 0.2045 0.8567 0.9256
No log 2.1754 124 0.8337 0.2294 0.8337 0.9131
No log 2.2105 126 0.7485 0.2574 0.7485 0.8652
No log 2.2456 128 0.8785 0.1823 0.8785 0.9373
No log 2.2807 130 0.9234 0.2323 0.9234 0.9609
No log 2.3158 132 0.8268 0.2476 0.8268 0.9093
No log 2.3509 134 0.7220 0.1558 0.7220 0.8497
No log 2.3860 136 0.7681 0.2913 0.7681 0.8764
No log 2.4211 138 0.9021 0.3261 0.9021 0.9498
No log 2.4561 140 0.9599 0.3131 0.9599 0.9798
No log 2.4912 142 0.9084 0.3167 0.9084 0.9531
No log 2.5263 144 0.7904 0.2498 0.7904 0.8891
No log 2.5614 146 0.7184 0.2043 0.7184 0.8476
No log 2.5965 148 0.7713 0.2708 0.7713 0.8782
No log 2.6316 150 0.8180 0.2886 0.8180 0.9044
No log 2.6667 152 0.8193 0.3103 0.8193 0.9051
No log 2.7018 154 0.7711 0.3113 0.7711 0.8781
No log 2.7368 156 0.6987 0.3729 0.6987 0.8359
No log 2.7719 158 0.6560 0.3884 0.6560 0.8100
No log 2.8070 160 0.6279 0.3738 0.6279 0.7924
No log 2.8421 162 0.6109 0.4505 0.6109 0.7816
No log 2.8772 164 0.5875 0.4249 0.5875 0.7665
No log 2.9123 166 0.5782 0.3919 0.5782 0.7604
No log 2.9474 168 0.5865 0.4315 0.5865 0.7658
No log 2.9825 170 0.5555 0.4419 0.5555 0.7453
No log 3.0175 172 0.5603 0.5397 0.5603 0.7486
No log 3.0526 174 0.5456 0.5050 0.5456 0.7387
No log 3.0877 176 0.5608 0.4504 0.5608 0.7489
No log 3.1228 178 0.5511 0.5457 0.5511 0.7423
No log 3.1579 180 0.5863 0.4389 0.5863 0.7657
No log 3.1930 182 0.5755 0.3942 0.5755 0.7586
No log 3.2281 184 0.5774 0.3029 0.5774 0.7599
No log 3.2632 186 0.6023 0.3445 0.6023 0.7761
No log 3.2982 188 0.6249 0.3494 0.6249 0.7905
No log 3.3333 190 0.6245 0.2181 0.6245 0.7903
No log 3.3684 192 0.6353 0.1277 0.6353 0.7970
No log 3.4035 194 0.6362 0.2041 0.6362 0.7976
No log 3.4386 196 0.6134 0.3213 0.6134 0.7832
No log 3.4737 198 0.5985 0.4752 0.5985 0.7736
No log 3.5088 200 0.5868 0.4059 0.5868 0.7660
No log 3.5439 202 0.5700 0.4455 0.5700 0.7550
No log 3.5789 204 0.5754 0.5491 0.5754 0.7586
No log 3.6140 206 0.6563 0.4568 0.6563 0.8101
No log 3.6491 208 0.5988 0.5421 0.5988 0.7738
No log 3.6842 210 0.5864 0.4704 0.5864 0.7658
No log 3.7193 212 0.6351 0.5042 0.6351 0.7969
No log 3.7544 214 0.6252 0.4684 0.6252 0.7907
No log 3.7895 216 0.5792 0.4020 0.5792 0.7610
No log 3.8246 218 0.5596 0.3788 0.5596 0.7481
No log 3.8596 220 0.5471 0.3919 0.5471 0.7397
No log 3.8947 222 0.5395 0.3919 0.5395 0.7345
No log 3.9298 224 0.5489 0.4243 0.5489 0.7409
No log 3.9649 226 0.5384 0.4243 0.5384 0.7337
No log 4.0 228 0.5280 0.4908 0.5280 0.7266
No log 4.0351 230 0.5630 0.6514 0.5630 0.7503
No log 4.0702 232 0.5821 0.6526 0.5821 0.7630
No log 4.1053 234 0.5617 0.5974 0.5617 0.7495
No log 4.1404 236 0.5589 0.5827 0.5589 0.7476
No log 4.1754 238 0.5556 0.5739 0.5556 0.7454
No log 4.2105 240 0.6062 0.5983 0.6062 0.7786
No log 4.2456 242 0.6560 0.4740 0.6560 0.8099
No log 4.2807 244 0.6548 0.4122 0.6548 0.8092
No log 4.3158 246 0.5668 0.5136 0.5668 0.7529
No log 4.3509 248 0.5298 0.5719 0.5298 0.7279
No log 4.3860 250 0.5475 0.5289 0.5475 0.7400
No log 4.4211 252 0.5703 0.4352 0.5703 0.7552
No log 4.4561 254 0.5669 0.5036 0.5669 0.7529
No log 4.4912 256 0.5698 0.4111 0.5698 0.7549
No log 4.5263 258 0.6521 0.3730 0.6521 0.8075
No log 4.5614 260 0.6698 0.4395 0.6698 0.8184
No log 4.5965 262 0.6963 0.3957 0.6963 0.8345
No log 4.6316 264 0.7033 0.4189 0.7033 0.8386
No log 4.6667 266 0.6275 0.4442 0.6275 0.7921
No log 4.7018 268 0.5527 0.5697 0.5527 0.7435
No log 4.7368 270 0.5818 0.4854 0.5818 0.7627
No log 4.7719 272 0.5899 0.4854 0.5899 0.7680
No log 4.8070 274 0.5672 0.4605 0.5672 0.7531
No log 4.8421 276 0.5793 0.5494 0.5793 0.7611
No log 4.8772 278 0.6502 0.4459 0.6502 0.8064
No log 4.9123 280 0.6368 0.5384 0.6368 0.7980
No log 4.9474 282 0.5586 0.4590 0.5586 0.7474
No log 4.9825 284 0.5729 0.3341 0.5729 0.7569
No log 5.0175 286 0.6788 0.4052 0.6788 0.8239
No log 5.0526 288 0.7099 0.3963 0.7099 0.8425
No log 5.0877 290 0.6505 0.3425 0.6505 0.8065
No log 5.1228 292 0.6422 0.3211 0.6422 0.8014
No log 5.1579 294 0.6760 0.3785 0.6760 0.8222
No log 5.1930 296 0.6737 0.3425 0.6737 0.8208
No log 5.2281 298 0.6666 0.3096 0.6666 0.8165
No log 5.2632 300 0.6809 0.3314 0.6809 0.8252
No log 5.2982 302 0.6867 0.3023 0.6867 0.8287
No log 5.3333 304 0.6646 0.3340 0.6646 0.8153
No log 5.3684 306 0.6558 0.4719 0.6558 0.8098
No log 5.4035 308 0.6858 0.4099 0.6858 0.8282
No log 5.4386 310 0.7065 0.4789 0.7065 0.8405
No log 5.4737 312 0.6611 0.4096 0.6611 0.8131
No log 5.5088 314 0.6353 0.4569 0.6353 0.7971
No log 5.5439 316 0.6187 0.4253 0.6187 0.7866
No log 5.5789 318 0.6110 0.3296 0.6110 0.7817
No log 5.6140 320 0.6216 0.3622 0.6216 0.7884
No log 5.6491 322 0.6248 0.3622 0.6248 0.7904
No log 5.6842 324 0.6002 0.3050 0.6002 0.7747
No log 5.7193 326 0.6119 0.3530 0.6119 0.7822
No log 5.7544 328 0.7103 0.3970 0.7103 0.8428
No log 5.7895 330 0.7962 0.4024 0.7962 0.8923
No log 5.8246 332 0.7607 0.4024 0.7607 0.8722
No log 5.8596 334 0.6546 0.4694 0.6546 0.8091
No log 5.8947 336 0.5979 0.3863 0.5979 0.7732
No log 5.9298 338 0.5935 0.3837 0.5935 0.7704
No log 5.9649 340 0.5898 0.3995 0.5898 0.7680
No log 6.0 342 0.5822 0.3886 0.5822 0.7630
No log 6.0351 344 0.5776 0.3939 0.5776 0.7600
No log 6.0702 346 0.5882 0.3050 0.5882 0.7670
No log 6.1053 348 0.6163 0.3737 0.6163 0.7850
No log 6.1404 350 0.6332 0.3737 0.6332 0.7958
No log 6.1754 352 0.6289 0.3737 0.6289 0.7930
No log 6.2105 354 0.5864 0.3976 0.5864 0.7657
No log 6.2456 356 0.5790 0.4262 0.5790 0.7609
No log 6.2807 358 0.6136 0.5584 0.6136 0.7833
No log 6.3158 360 0.6030 0.5368 0.6030 0.7765
No log 6.3509 362 0.5644 0.4569 0.5644 0.7513
No log 6.3860 364 0.5534 0.4338 0.5534 0.7439
No log 6.4211 366 0.5536 0.4724 0.5536 0.7441
No log 6.4561 368 0.5468 0.4724 0.5468 0.7395
No log 6.4912 370 0.5511 0.5397 0.5511 0.7423
No log 6.5263 372 0.5697 0.5266 0.5697 0.7548
No log 6.5614 374 0.5634 0.5554 0.5634 0.7506
No log 6.5965 376 0.5273 0.6046 0.5273 0.7262
No log 6.6316 378 0.5354 0.4555 0.5354 0.7317
No log 6.6667 380 0.5321 0.4555 0.5321 0.7295
No log 6.7018 382 0.5256 0.4857 0.5256 0.7250
No log 6.7368 384 0.5219 0.4206 0.5219 0.7224
No log 6.7719 386 0.5366 0.5104 0.5366 0.7326
No log 6.8070 388 0.5298 0.5345 0.5298 0.7279
No log 6.8421 390 0.5290 0.5286 0.5290 0.7273
No log 6.8772 392 0.5438 0.6068 0.5438 0.7374
No log 6.9123 394 0.5355 0.5042 0.5355 0.7318
No log 6.9474 396 0.5347 0.5177 0.5347 0.7312
No log 6.9825 398 0.5339 0.4419 0.5339 0.7307
No log 7.0175 400 0.5451 0.3919 0.5451 0.7383
No log 7.0526 402 0.5683 0.4352 0.5683 0.7538
No log 7.0877 404 0.5528 0.4618 0.5528 0.7435
No log 7.1228 406 0.5306 0.4378 0.5306 0.7285
No log 7.1579 408 0.5275 0.5246 0.5275 0.7263
No log 7.1930 410 0.5305 0.5028 0.5305 0.7284
No log 7.2281 412 0.5312 0.4960 0.5312 0.7289
No log 7.2632 414 0.5353 0.5283 0.5353 0.7316
No log 7.2982 416 0.5314 0.5189 0.5314 0.7289
No log 7.3333 418 0.5223 0.6215 0.5223 0.7227
No log 7.3684 420 0.5286 0.4795 0.5286 0.7270
No log 7.4035 422 0.5276 0.4642 0.5276 0.7264
No log 7.4386 424 0.5195 0.6125 0.5195 0.7207
No log 7.4737 426 0.5708 0.5046 0.5708 0.7555
No log 7.5088 428 0.6119 0.5031 0.6119 0.7822
No log 7.5439 430 0.5968 0.4756 0.5968 0.7725
No log 7.5789 432 0.5465 0.5574 0.5465 0.7393
No log 7.6140 434 0.5300 0.5248 0.5300 0.7280
No log 7.6491 436 0.5353 0.4983 0.5353 0.7316
No log 7.6842 438 0.5342 0.5003 0.5342 0.7309
No log 7.7193 440 0.5387 0.5885 0.5387 0.7340
No log 7.7544 442 0.5506 0.5649 0.5506 0.7420
No log 7.7895 444 0.5696 0.5352 0.5696 0.7548
No log 7.8246 446 0.5850 0.5498 0.5850 0.7649
No log 7.8596 448 0.5513 0.6295 0.5513 0.7425
No log 7.8947 450 0.5260 0.5232 0.5260 0.7252
No log 7.9298 452 0.5393 0.5345 0.5393 0.7344
No log 7.9649 454 0.5344 0.5056 0.5344 0.7310
No log 8.0 456 0.5403 0.4634 0.5403 0.7350
No log 8.0351 458 0.5994 0.4237 0.5994 0.7742
No log 8.0702 460 0.6717 0.3783 0.6717 0.8196
No log 8.1053 462 0.6546 0.4289 0.6546 0.8091
No log 8.1404 464 0.5764 0.5010 0.5764 0.7592
No log 8.1754 466 0.5353 0.4837 0.5353 0.7316
No log 8.2105 468 0.5634 0.4835 0.5634 0.7506
No log 8.2456 470 0.5780 0.4835 0.5780 0.7602
No log 8.2807 472 0.5540 0.4270 0.5540 0.7443
No log 8.3158 474 0.5374 0.4437 0.5374 0.7331
No log 8.3509 476 0.5362 0.4437 0.5362 0.7323
No log 8.3860 478 0.5425 0.4437 0.5425 0.7366
No log 8.4211 480 0.5481 0.4534 0.5481 0.7403
No log 8.4561 482 0.5615 0.4911 0.5615 0.7493
No log 8.4912 484 0.5435 0.5086 0.5435 0.7372
No log 8.5263 486 0.5273 0.4945 0.5273 0.7262
No log 8.5614 488 0.5190 0.5498 0.5190 0.7204
No log 8.5965 490 0.5078 0.6082 0.5078 0.7126
No log 8.6316 492 0.5031 0.6518 0.5031 0.7093
No log 8.6667 494 0.5026 0.6518 0.5026 0.7090
No log 8.7018 496 0.5012 0.6518 0.5012 0.7079
No log 8.7368 498 0.5017 0.5995 0.5017 0.7083
0.3285 8.7719 500 0.5110 0.5142 0.5110 0.7148
0.3285 8.8070 502 0.5117 0.4681 0.5117 0.7153
0.3285 8.8421 504 0.5103 0.5307 0.5103 0.7144
0.3285 8.8772 506 0.5098 0.5752 0.5098 0.7140
0.3285 8.9123 508 0.5019 0.6129 0.5019 0.7084
0.3285 8.9474 510 0.5039 0.6359 0.5039 0.7099
0.3285 8.9825 512 0.5034 0.6286 0.5034 0.7095
0.3285 9.0175 514 0.5021 0.6518 0.5021 0.7086
0.3285 9.0526 516 0.5193 0.5741 0.5193 0.7207
0.3285 9.0877 518 0.5045 0.5708 0.5045 0.7103
0.3285 9.1228 520 0.4941 0.5882 0.4941 0.7029
0.3285 9.1579 522 0.5163 0.5017 0.5163 0.7186
0.3285 9.1930 524 0.5483 0.4753 0.5483 0.7404
0.3285 9.2281 526 0.5304 0.4753 0.5304 0.7283
0.3285 9.2632 528 0.4969 0.5209 0.4969 0.7049
0.3285 9.2982 530 0.4929 0.5605 0.4929 0.7021
0.3285 9.3333 532 0.4943 0.5208 0.4943 0.7031
0.3285 9.3684 534 0.4969 0.4970 0.4969 0.7049
0.3285 9.4035 536 0.4995 0.5782 0.4995 0.7068
0.3285 9.4386 538 0.5157 0.5861 0.5157 0.7181
0.3285 9.4737 540 0.5231 0.5796 0.5231 0.7233
0.3285 9.5088 542 0.5159 0.5796 0.5159 0.7183
0.3285 9.5439 544 0.5441 0.5721 0.5441 0.7376
0.3285 9.5789 546 0.5597 0.5313 0.5597 0.7482
0.3285 9.6140 548 0.5149 0.5495 0.5149 0.7175
0.3285 9.6491 550 0.5141 0.5455 0.5141 0.7170
0.3285 9.6842 552 0.5855 0.4602 0.5855 0.7652
0.3285 9.7193 554 0.6102 0.4576 0.6102 0.7812
0.3285 9.7544 556 0.5799 0.4576 0.5799 0.7615
0.3285 9.7895 558 0.5762 0.4479 0.5762 0.7591
0.3285 9.8246 560 0.5573 0.4479 0.5573 0.7465

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
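A minimal loading sketch with the versions above. `AutoModelForSequenceClassification` is an assumption about the head type (the card reports MSE/RMSE, suggesting a regression-style scoring head); the repository id is taken from this card's page.

```python
MODEL_ID = ("MayBashendy/ArabicNewSplits7_usingALLEssays_"
            "FineTuningAraBERT_run1_AugV5_k17_task7_organization")

def load_model(model_id=MODEL_ID):
    # Requires `transformers` and network access to the Hugging Face Hub.
    from transformers import AutoTokenizer, AutoModelForSequenceClassification
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForSequenceClassification.from_pretrained(model_id)
    return tokenizer, model

if __name__ == "__main__":
    tokenizer, model = load_model()
```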
Model size: 0.1B parameters (Safetensors, F32 tensors)
