ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run2_AugV5_k4_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5042
  • QWK (quadratic weighted kappa): 0.5127
  • MSE: 0.5042
  • RMSE: 0.7100
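These metrics can be recomputed from model predictions and gold labels. Below is a minimal pure-Python sketch; the example predictions and labels are illustrative, not from this run, and the QWK implementation assumes integer rating labels (continuous regression outputs are rounded first):

```python
import math

def qwk(y_true, y_pred):
    """Quadratic weighted kappa between two integer rating sequences."""
    ratings = sorted(set(y_true) | set(y_pred))
    idx = {r: i for i, r in enumerate(ratings)}
    k, n = len(ratings), len(y_true)
    # Observed agreement matrix, normalized to frequencies.
    obs = [[0.0] * k for _ in range(k)]
    for t, p in zip(y_true, y_pred):
        obs[idx[t]][idx[p]] += 1 / n
    # Expected matrix is the outer product of the marginals.
    ht = [sum(row) for row in obs]
    hp = [sum(obs[i][j] for i in range(k)) for j in range(k)]
    num = den = 0.0
    for i in range(k):
        for j in range(k):
            w = (i - j) ** 2 / (k - 1) ** 2 if k > 1 else 0.0
            num += w * obs[i][j]
            den += w * ht[i] * hp[j]
    return 1.0 - num / den if den else 1.0

def mse_rmse(preds, labels):
    """Mean squared error and its square root."""
    mse = sum((p - l) ** 2 for p, l in zip(preds, labels)) / len(labels)
    return mse, math.sqrt(mse)

# Illustrative data: round continuous outputs before computing QWK.
preds = [0.9, 2.1, 1.0, 3.2]
labels = [1, 2, 1, 3]
mse, rmse = mse_rmse(preds, labels)
kappa = qwk(labels, [round(p) for p in preds])
```

QWK is the standard agreement metric for essay-scoring tasks such as this one: it penalizes disagreements quadratically in the distance between ratings, so being off by two ratings costs four times as much as being off by one.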

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
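With lr_scheduler_type: linear and no warmup listed, the learning rate decays linearly from 2e-05 toward zero over the total number of optimizer steps. A minimal sketch of that schedule; the total-step count is an assumption inferred from the evaluation table (roughly 21 optimizer steps per epoch × 100 epochs), not a value stated in the card:

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Linear schedule: ramp up over warmup_steps, then decay to zero."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

# Assumed ~21 optimizer steps/epoch (from the eval table) x 100 epochs.
total = 2100
start = linear_lr(0, total)     # 2e-05 at step 0
mid = linear_lr(1050, total)    # half the base rate at the midpoint
end = linear_lr(2100, total)    # 0.0 at the end
```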

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0952 2 2.5716 -0.0924 2.5716 1.6036
No log 0.1905 4 1.3078 0.0126 1.3078 1.1436
No log 0.2857 6 0.8720 0.0049 0.8720 0.9338
No log 0.3810 8 0.7715 0.1598 0.7715 0.8784
No log 0.4762 10 0.7834 0.2615 0.7834 0.8851
No log 0.5714 12 0.7443 0.3496 0.7443 0.8628
No log 0.6667 14 0.7335 0.3931 0.7335 0.8564
No log 0.7619 16 0.6228 0.4337 0.6228 0.7892
No log 0.8571 18 0.7012 0.3231 0.7012 0.8374
No log 0.9524 20 0.7022 0.3425 0.7022 0.8380
No log 1.0476 22 0.6858 0.3090 0.6858 0.8281
No log 1.1429 24 1.0982 0.2537 1.0982 1.0479
No log 1.2381 26 1.5111 0.0192 1.5111 1.2293
No log 1.3333 28 1.4778 0.0192 1.4778 1.2157
No log 1.4286 30 1.0551 0.2081 1.0551 1.0272
No log 1.5238 32 0.6741 0.1267 0.6741 0.8211
No log 1.6190 34 0.8681 0.3294 0.8681 0.9317
No log 1.7143 36 0.9941 0.2504 0.9941 0.9970
No log 1.8095 38 0.8308 0.3675 0.8308 0.9115
No log 1.9048 40 0.6658 0.3238 0.6658 0.8160
No log 2.0 42 0.6182 0.2345 0.6182 0.7863
No log 2.0952 44 0.6467 0.2808 0.6467 0.8042
No log 2.1905 46 0.6911 0.3573 0.6911 0.8313
No log 2.2857 48 0.6219 0.3829 0.6219 0.7886
No log 2.3810 50 0.5983 0.3961 0.5983 0.7735
No log 2.4762 52 0.6648 0.4587 0.6648 0.8154
No log 2.5714 54 0.6471 0.4321 0.6471 0.8044
No log 2.6667 56 0.6014 0.5160 0.6014 0.7755
No log 2.7619 58 0.7219 0.4725 0.7219 0.8497
No log 2.8571 60 0.9442 0.3003 0.9442 0.9717
No log 2.9524 62 0.8868 0.3492 0.8868 0.9417
No log 3.0476 64 0.8250 0.4921 0.8250 0.9083
No log 3.1429 66 0.6430 0.4842 0.6430 0.8018
No log 3.2381 68 0.5643 0.4044 0.5643 0.7512
No log 3.3333 70 0.5511 0.3964 0.5511 0.7423
No log 3.4286 72 0.5842 0.3664 0.5842 0.7643
No log 3.5238 74 0.5620 0.4061 0.5620 0.7497
No log 3.6190 76 0.5441 0.3781 0.5441 0.7377
No log 3.7143 78 0.6402 0.4745 0.6402 0.8001
No log 3.8095 80 0.7814 0.5124 0.7814 0.8839
No log 3.9048 82 0.6720 0.5163 0.6720 0.8197
No log 4.0 84 0.5542 0.4757 0.5542 0.7445
No log 4.0952 86 0.7521 0.4977 0.7521 0.8672
No log 4.1905 88 0.9968 0.3649 0.9968 0.9984
No log 4.2857 90 0.9485 0.3753 0.9485 0.9739
No log 4.3810 92 0.7211 0.5034 0.7211 0.8491
No log 4.4762 94 0.5916 0.3645 0.5916 0.7692
No log 4.5714 96 0.6767 0.4212 0.6767 0.8226
No log 4.6667 98 0.7367 0.4350 0.7367 0.8583
No log 4.7619 100 0.7320 0.4315 0.7320 0.8556
No log 4.8571 102 0.6052 0.4196 0.6052 0.7779
No log 4.9524 104 0.6081 0.5184 0.6081 0.7798
No log 5.0476 106 0.6085 0.4841 0.6085 0.7801
No log 5.1429 108 0.6052 0.4248 0.6052 0.7779
No log 5.2381 110 0.6350 0.4642 0.6350 0.7969
No log 5.3333 112 0.5957 0.4314 0.5957 0.7718
No log 5.4286 114 0.6142 0.4595 0.6142 0.7837
No log 5.5238 116 0.6754 0.4230 0.6754 0.8218
No log 5.6190 118 0.6363 0.4119 0.6363 0.7977
No log 5.7143 120 0.6054 0.4535 0.6054 0.7781
No log 5.8095 122 0.6135 0.4079 0.6135 0.7833
No log 5.9048 124 0.7069 0.4580 0.7069 0.8408
No log 6.0 126 0.7071 0.4670 0.7071 0.8409
No log 6.0952 128 0.7275 0.4946 0.7275 0.8529
No log 6.1905 130 0.7315 0.4768 0.7315 0.8553
No log 6.2857 132 0.6413 0.4866 0.6413 0.8008
No log 6.3810 134 0.5938 0.5304 0.5938 0.7706
No log 6.4762 136 0.7270 0.4597 0.7270 0.8526
No log 6.5714 138 0.6875 0.4243 0.6875 0.8291
No log 6.6667 140 0.5619 0.4856 0.5619 0.7496
No log 6.7619 142 0.6493 0.5030 0.6493 0.8058
No log 6.8571 144 0.7070 0.4602 0.7070 0.8408
No log 6.9524 146 0.6072 0.5607 0.6072 0.7793
No log 7.0476 148 0.5540 0.5405 0.5540 0.7443
No log 7.1429 150 0.5696 0.5003 0.5696 0.7547
No log 7.2381 152 0.5720 0.5396 0.5720 0.7563
No log 7.3333 154 0.5988 0.4581 0.5988 0.7738
No log 7.4286 156 0.5920 0.4789 0.5920 0.7694
No log 7.5238 158 0.5936 0.5647 0.5936 0.7704
No log 7.6190 160 0.5901 0.5252 0.5901 0.7682
No log 7.7143 162 0.6190 0.4388 0.6190 0.7868
No log 7.8095 164 0.7150 0.5018 0.7150 0.8456
No log 7.9048 166 0.6776 0.5122 0.6776 0.8232
No log 8.0 168 0.5948 0.4757 0.5948 0.7712
No log 8.0952 170 0.6567 0.5131 0.6567 0.8104
No log 8.1905 172 0.7030 0.4723 0.7030 0.8384
No log 8.2857 174 0.6341 0.4534 0.6341 0.7963
No log 8.3810 176 0.5954 0.4536 0.5954 0.7716
No log 8.4762 178 0.6246 0.4441 0.6246 0.7903
No log 8.5714 180 0.6124 0.4555 0.6124 0.7826
No log 8.6667 182 0.6175 0.4422 0.6175 0.7858
No log 8.7619 184 0.6474 0.4997 0.6474 0.8046
No log 8.8571 186 0.6924 0.4843 0.6924 0.8321
No log 8.9524 188 0.6626 0.5129 0.6626 0.8140
No log 9.0476 190 0.6576 0.5205 0.6576 0.8109
No log 9.1429 192 0.6885 0.4898 0.6885 0.8298
No log 9.2381 194 0.6415 0.5524 0.6415 0.8009
No log 9.3333 196 0.5877 0.5008 0.5877 0.7666
No log 9.4286 198 0.5786 0.4789 0.5786 0.7607
No log 9.5238 200 0.5867 0.5578 0.5867 0.7660
No log 9.6190 202 0.6701 0.5028 0.6701 0.8186
No log 9.7143 204 0.7201 0.4859 0.7201 0.8486
No log 9.8095 206 0.6538 0.4957 0.6538 0.8086
No log 9.9048 208 0.5934 0.5334 0.5934 0.7703
No log 10.0 210 0.6292 0.5403 0.6292 0.7932
No log 10.0952 212 0.6222 0.4997 0.6222 0.7888
No log 10.1905 214 0.5789 0.4937 0.5789 0.7609
No log 10.2857 216 0.5945 0.4315 0.5945 0.7710
No log 10.3810 218 0.6243 0.4918 0.6243 0.7902
No log 10.4762 220 0.5969 0.4422 0.5969 0.7726
No log 10.5714 222 0.5965 0.5444 0.5965 0.7723
No log 10.6667 224 0.6072 0.5173 0.6072 0.7793
No log 10.7619 226 0.6009 0.4562 0.6009 0.7752
No log 10.8571 228 0.6785 0.5236 0.6785 0.8237
No log 10.9524 230 0.7619 0.4890 0.7619 0.8729
No log 11.0476 232 0.6968 0.4898 0.6968 0.8348
No log 11.1429 234 0.6169 0.4675 0.6169 0.7854
No log 11.2381 236 0.6108 0.4454 0.6108 0.7816
No log 11.3333 238 0.6164 0.4692 0.6164 0.7851
No log 11.4286 240 0.6059 0.4618 0.6059 0.7784
No log 11.5238 242 0.5887 0.5021 0.5887 0.7673
No log 11.6190 244 0.6069 0.4576 0.6069 0.7790
No log 11.7143 246 0.5851 0.4535 0.5851 0.7649
No log 11.8095 248 0.5575 0.5039 0.5575 0.7466
No log 11.9048 250 0.5514 0.5368 0.5514 0.7425
No log 12.0 252 0.5399 0.5868 0.5399 0.7347
No log 12.0952 254 0.5528 0.4935 0.5528 0.7435
No log 12.1905 256 0.5969 0.5106 0.5969 0.7726
No log 12.2857 258 0.5498 0.5368 0.5498 0.7415
No log 12.3810 260 0.5024 0.5538 0.5024 0.7088
No log 12.4762 262 0.5089 0.5723 0.5089 0.7134
No log 12.5714 264 0.5013 0.6334 0.5013 0.7080
No log 12.6667 266 0.5036 0.5046 0.5036 0.7096
No log 12.7619 268 0.6607 0.4961 0.6607 0.8129
No log 12.8571 270 0.7840 0.4977 0.7840 0.8855
No log 12.9524 272 0.7330 0.4977 0.7330 0.8562
No log 13.0476 274 0.5908 0.4909 0.5908 0.7687
No log 13.1429 276 0.5261 0.5768 0.5261 0.7253
No log 13.2381 278 0.5723 0.5744 0.5723 0.7565
No log 13.3333 280 0.5711 0.5712 0.5711 0.7557
No log 13.4286 282 0.5399 0.4914 0.5399 0.7348
No log 13.5238 284 0.5523 0.4838 0.5523 0.7432
No log 13.6190 286 0.5919 0.4977 0.5919 0.7694
No log 13.7143 288 0.5829 0.4632 0.5829 0.7635
No log 13.8095 290 0.5532 0.4384 0.5532 0.7438
No log 13.9048 292 0.5616 0.4681 0.5616 0.7494
No log 14.0 294 0.5864 0.6137 0.5864 0.7657
No log 14.0952 296 0.5832 0.6137 0.5832 0.7637
No log 14.1905 298 0.5590 0.4222 0.5590 0.7477
No log 14.2857 300 0.5833 0.4556 0.5833 0.7637
No log 14.3810 302 0.6268 0.4807 0.6268 0.7917
No log 14.4762 304 0.6101 0.4499 0.6101 0.7811
No log 14.5714 306 0.5850 0.3716 0.5850 0.7648
No log 14.6667 308 0.5727 0.4114 0.5727 0.7567
No log 14.7619 310 0.5929 0.4555 0.5929 0.7700
No log 14.8571 312 0.6050 0.5388 0.6050 0.7778
No log 14.9524 314 0.5972 0.5845 0.5972 0.7728
No log 15.0476 316 0.5859 0.4758 0.5859 0.7655
No log 15.1429 318 0.5626 0.4402 0.5626 0.7501
No log 15.2381 320 0.5557 0.4595 0.5557 0.7454
No log 15.3333 322 0.5503 0.4495 0.5503 0.7418
No log 15.4286 324 0.5779 0.5991 0.5779 0.7602
No log 15.5238 326 0.5678 0.6092 0.5678 0.7535
No log 15.6190 328 0.5426 0.6441 0.5426 0.7366
No log 15.7143 330 0.5062 0.6121 0.5062 0.7114
No log 15.8095 332 0.5417 0.5152 0.5417 0.7360
No log 15.9048 334 0.5720 0.5513 0.5720 0.7563
No log 16.0 336 0.5604 0.4935 0.5604 0.7486
No log 16.0952 338 0.5363 0.5061 0.5363 0.7323
No log 16.1905 340 0.5332 0.4386 0.5332 0.7302
No log 16.2857 342 0.5432 0.4240 0.5432 0.7370
No log 16.3810 344 0.5465 0.4782 0.5465 0.7392
No log 16.4762 346 0.5954 0.5230 0.5954 0.7716
No log 16.5714 348 0.6444 0.5048 0.6444 0.8028
No log 16.6667 350 0.5911 0.5384 0.5911 0.7688
No log 16.7619 352 0.5641 0.5217 0.5641 0.7510
No log 16.8571 354 0.5443 0.4901 0.5443 0.7378
No log 16.9524 356 0.5419 0.4544 0.5419 0.7361
No log 17.0476 358 0.5541 0.5167 0.5541 0.7444
No log 17.1429 360 0.5779 0.4393 0.5779 0.7602
No log 17.2381 362 0.6032 0.4997 0.6032 0.7766
No log 17.3333 364 0.6024 0.5497 0.6024 0.7761
No log 17.4286 366 0.5656 0.5223 0.5656 0.7521
No log 17.5238 368 0.5365 0.4438 0.5365 0.7325
No log 17.6190 370 0.5509 0.4832 0.5509 0.7422
No log 17.7143 372 0.5612 0.4789 0.5612 0.7492
No log 17.8095 374 0.5450 0.5390 0.5450 0.7383
No log 17.9048 376 0.5432 0.4697 0.5432 0.7370
No log 18.0 378 0.5436 0.4776 0.5436 0.7373
No log 18.0952 380 0.5532 0.5399 0.5532 0.7438
No log 18.1905 382 0.5557 0.5399 0.5557 0.7455
No log 18.2857 384 0.5436 0.4776 0.5436 0.7373
No log 18.3810 386 0.5786 0.3713 0.5786 0.7606
No log 18.4762 388 0.6263 0.4393 0.6263 0.7914
No log 18.5714 390 0.6247 0.4484 0.6247 0.7904
No log 18.6667 392 0.5915 0.4951 0.5915 0.7691
No log 18.7619 394 0.5959 0.4951 0.5959 0.7719
No log 18.8571 396 0.5710 0.5397 0.5710 0.7556
No log 18.9524 398 0.5624 0.5195 0.5624 0.7499
No log 19.0476 400 0.5511 0.5238 0.5511 0.7423
No log 19.1429 402 0.5427 0.4149 0.5427 0.7367
No log 19.2381 404 0.5411 0.3786 0.5411 0.7356
No log 19.3333 406 0.5409 0.3834 0.5409 0.7355
No log 19.4286 408 0.5438 0.4561 0.5438 0.7374
No log 19.5238 410 0.5392 0.3863 0.5392 0.7343
No log 19.6190 412 0.5409 0.4402 0.5409 0.7355
No log 19.7143 414 0.5597 0.4067 0.5597 0.7481
No log 19.8095 416 0.5715 0.3788 0.5715 0.7560
No log 19.9048 418 0.5667 0.3813 0.5667 0.7528
No log 20.0 420 0.5549 0.3813 0.5549 0.7449
No log 20.0952 422 0.5381 0.4701 0.5381 0.7335
No log 20.1905 424 0.5338 0.5075 0.5338 0.7306
No log 20.2857 426 0.5325 0.5301 0.5325 0.7297
No log 20.3810 428 0.5286 0.5614 0.5286 0.7270
No log 20.4762 430 0.5249 0.5301 0.5249 0.7245
No log 20.5714 432 0.5238 0.5317 0.5238 0.7238
No log 20.6667 434 0.5310 0.4918 0.5310 0.7287
No log 20.7619 436 0.5250 0.5317 0.5250 0.7246
No log 20.8571 438 0.5235 0.4901 0.5235 0.7236
No log 20.9524 440 0.5322 0.5120 0.5322 0.7295
No log 21.0476 442 0.5284 0.4901 0.5284 0.7269
No log 21.1429 444 0.5298 0.5913 0.5298 0.7279
No log 21.2381 446 0.5365 0.6009 0.5365 0.7324
No log 21.3333 448 0.5300 0.5426 0.5300 0.7280
No log 21.4286 450 0.5311 0.5256 0.5311 0.7287
No log 21.5238 452 0.5296 0.5271 0.5296 0.7277
No log 21.6190 454 0.5254 0.5221 0.5254 0.7248
No log 21.7143 456 0.5317 0.5044 0.5317 0.7292
No log 21.8095 458 0.5322 0.5044 0.5322 0.7295
No log 21.9048 460 0.5193 0.5617 0.5193 0.7206
No log 22.0 462 0.5198 0.6125 0.5198 0.7210
No log 22.0952 464 0.5175 0.6125 0.5175 0.7194
No log 22.1905 466 0.5152 0.5617 0.5152 0.7178
No log 22.2857 468 0.5170 0.5875 0.5170 0.7190
No log 22.3810 470 0.5201 0.6198 0.5201 0.7212
No log 22.4762 472 0.5176 0.6210 0.5176 0.7194
No log 22.5714 474 0.5186 0.6365 0.5186 0.7201
No log 22.6667 476 0.5136 0.5768 0.5136 0.7166
No log 22.7619 478 0.5374 0.4756 0.5374 0.7331
No log 22.8571 480 0.5562 0.5083 0.5562 0.7458
No log 22.9524 482 0.5363 0.5189 0.5363 0.7323
No log 23.0476 484 0.5327 0.5065 0.5327 0.7298
No log 23.1429 486 0.5787 0.3985 0.5787 0.7607
No log 23.2381 488 0.6367 0.4606 0.6367 0.7979
No log 23.3333 490 0.6267 0.4522 0.6267 0.7917
No log 23.4286 492 0.5649 0.3976 0.5649 0.7516
No log 23.5238 494 0.5360 0.4136 0.5360 0.7321
No log 23.6190 496 0.5349 0.5324 0.5349 0.7314
No log 23.7143 498 0.5397 0.5046 0.5397 0.7346
0.2975 23.8095 500 0.5378 0.5422 0.5378 0.7333
0.2975 23.9048 502 0.5288 0.4923 0.5288 0.7272
0.2975 24.0 504 0.5387 0.4984 0.5387 0.7339
0.2975 24.0952 506 0.5717 0.5237 0.5717 0.7561
0.2975 24.1905 508 0.5603 0.4618 0.5603 0.7485
0.2975 24.2857 510 0.5382 0.5656 0.5382 0.7336
0.2975 24.3810 512 0.5428 0.5127 0.5428 0.7368
0.2975 24.4762 514 0.5422 0.5127 0.5422 0.7364
0.2975 24.5714 516 0.5389 0.5178 0.5389 0.7341
0.2975 24.6667 518 0.5361 0.5269 0.5361 0.7322
0.2975 24.7619 520 0.5326 0.5476 0.5326 0.7298
0.2975 24.8571 522 0.5264 0.5753 0.5264 0.7255
0.2975 24.9524 524 0.5397 0.5483 0.5397 0.7346
0.2975 25.0476 526 0.5321 0.5335 0.5321 0.7294
0.2975 25.1429 528 0.5183 0.5649 0.5183 0.7199
0.2975 25.2381 530 0.5170 0.5915 0.5170 0.7190
0.2975 25.3333 532 0.5163 0.5476 0.5163 0.7185
0.2975 25.4286 534 0.5415 0.6104 0.5415 0.7358
0.2975 25.5238 536 0.5539 0.6282 0.5539 0.7443
0.2975 25.6190 538 0.5352 0.5865 0.5352 0.7315
0.2975 25.7143 540 0.5170 0.5634 0.5170 0.7190
0.2975 25.8095 542 0.5249 0.5352 0.5249 0.7245
0.2975 25.9048 544 0.5398 0.5657 0.5398 0.7347
0.2975 26.0 546 0.5544 0.5315 0.5544 0.7446
0.2975 26.0952 548 0.5392 0.5657 0.5392 0.7343
0.2975 26.1905 550 0.5328 0.5300 0.5328 0.7299
0.2975 26.2857 552 0.5412 0.5455 0.5412 0.7357
0.2975 26.3810 554 0.5367 0.5455 0.5367 0.7326
0.2975 26.4762 556 0.5184 0.5065 0.5184 0.7200
0.2975 26.5714 558 0.4991 0.4782 0.4991 0.7065
0.2975 26.6667 560 0.4977 0.4829 0.4977 0.7055
0.2975 26.7619 562 0.4980 0.5009 0.4980 0.7057
0.2975 26.8571 564 0.5042 0.5127 0.5042 0.7100
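Two consistency notes on the table above: the validation loss equals the MSE in every row, which is consistent with a single-output regression head trained on an MSE objective, and the RMSE is the square root of the MSE up to rounding. Training also ends at epoch ~26.9 (step 564) despite num_epochs: 100, presumably due to early stopping. A quick check on the final row:

```python
import math

# Final-row values copied from the table above.
val_loss, mse, rmse = 0.5042, 0.5042, 0.7100

assert val_loss == mse                    # the loss is the MSE
assert abs(rmse - math.sqrt(mse)) < 5e-4  # RMSE = sqrt(MSE), up to rounding
```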

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model size

  • 0.1B params (F32, Safetensors)
