ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k8_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7782
  • Qwk (quadratic weighted kappa): 0.5054
  • Mse (mean squared error): 0.7782
  • Rmse (root mean squared error): 0.8822
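The three metrics above can be reproduced from raw predictions with standard definitions. A minimal pure-Python sketch (the function names are illustrative, not from the training script):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred):
    """Cohen's kappa with quadratic weights over integer rating labels."""
    lo = min(min(y_true), min(y_pred))
    hi = max(max(y_true), max(y_pred))
    n = hi - lo + 1
    # Observed confusion matrix.
    obs = [[0] * n for _ in range(n)]
    for t, p in zip(y_true, y_pred):
        obs[t - lo][p - lo] += 1
    num_items = len(y_true)
    hist_t = [sum(row) for row in obs]
    hist_p = [sum(obs[i][j] for i in range(n)) for j in range(n)]
    numer = denom = 0.0
    for i in range(n):
        for j in range(n):
            w = ((i - j) ** 2) / ((n - 1) ** 2) if n > 1 else 0.0
            expected = hist_t[i] * hist_p[j] / num_items  # chance agreement
            numer += w * obs[i][j]
            denom += w * expected
    return 1.0 - numer / denom

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    return math.sqrt(mse(y_true, y_pred))
```

Note that Mse and Rmse here equal the validation loss and its square root, which matches the reported numbers (0.7782 and 0.8822).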

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
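The listed hyperparameters map onto a transformers TrainingArguments configuration roughly as follows. This is a sketch, not the actual training script (which is not published); output_dir is a placeholder:

```python
from transformers import TrainingArguments

# Configuration implied by the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="arabert-task2-organization",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",  # linear decay over training
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```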

Training results

Validation metrics were recorded every 2 steps. "No log" in the first column means the training loss had not yet been reported at that step (it first appears at step 500).

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0444 2 4.4730 0.0086 4.4730 2.1149
No log 0.0889 4 3.2853 0.0181 3.2853 1.8125
No log 0.1333 6 1.6385 0.0504 1.6385 1.2800
No log 0.1778 8 1.2535 0.0398 1.2535 1.1196
No log 0.2222 10 1.1032 0.2188 1.1032 1.0503
No log 0.2667 12 1.0247 0.3859 1.0247 1.0123
No log 0.3111 14 1.4664 0.2080 1.4664 1.2110
No log 0.3556 16 1.3883 0.2248 1.3883 1.1783
No log 0.4 18 1.0625 0.3430 1.0625 1.0308
No log 0.4444 20 0.9854 0.2899 0.9854 0.9927
No log 0.4889 22 1.4135 0.3289 1.4135 1.1889
No log 0.5333 24 1.3213 0.3898 1.3213 1.1495
No log 0.5778 26 0.9662 0.5204 0.9662 0.9829
No log 0.6222 28 1.1833 0.3071 1.1833 1.0878
No log 0.6667 30 1.0924 0.3424 1.0924 1.0452
No log 0.7111 32 0.9384 0.5735 0.9384 0.9687
No log 0.7556 34 1.1171 0.4954 1.1171 1.0570
No log 0.8 36 1.1237 0.5557 1.1237 1.0600
No log 0.8444 38 0.9354 0.5487 0.9354 0.9672
No log 0.8889 40 1.4872 0.3857 1.4872 1.2195
No log 0.9333 42 1.9248 0.2775 1.9248 1.3874
No log 0.9778 44 1.7925 0.3572 1.7925 1.3389
No log 1.0222 46 1.6638 0.3487 1.6638 1.2899
No log 1.0667 48 1.1361 0.4130 1.1361 1.0659
No log 1.1111 50 0.9549 0.5189 0.9549 0.9772
No log 1.1556 52 0.9850 0.5181 0.9850 0.9924
No log 1.2 54 0.9291 0.5726 0.9291 0.9639
No log 1.2444 56 0.8690 0.5773 0.8690 0.9322
No log 1.2889 58 1.1235 0.3421 1.1235 1.0599
No log 1.3333 60 1.1268 0.3421 1.1268 1.0615
No log 1.3778 62 0.8120 0.5404 0.8120 0.9011
No log 1.4222 64 0.8115 0.5451 0.8115 0.9009
No log 1.4667 66 1.0202 0.5501 1.0202 1.0100
No log 1.5111 68 1.0929 0.5182 1.0929 1.0454
No log 1.5556 70 0.8838 0.5534 0.8838 0.9401
No log 1.6 72 0.7277 0.6120 0.7277 0.8531
No log 1.6444 74 0.7505 0.5021 0.7505 0.8663
No log 1.6889 76 0.9204 0.5139 0.9204 0.9594
No log 1.7333 78 0.8640 0.5124 0.8640 0.9295
No log 1.7778 80 0.6795 0.6371 0.6795 0.8243
No log 1.8222 82 0.7910 0.6162 0.7910 0.8894
No log 1.8667 84 0.9419 0.5094 0.9419 0.9705
No log 1.9111 86 0.8488 0.5241 0.8488 0.9213
No log 1.9556 88 0.7295 0.6176 0.7295 0.8541
No log 2.0 90 0.7823 0.5631 0.7823 0.8845
No log 2.0444 92 0.8202 0.5333 0.8202 0.9056
No log 2.0889 94 0.8265 0.6079 0.8265 0.9091
No log 2.1333 96 0.9707 0.5859 0.9707 0.9853
No log 2.1778 98 1.3796 0.5556 1.3796 1.1746
No log 2.2222 100 1.4665 0.4880 1.4665 1.2110
No log 2.2667 102 1.2650 0.5085 1.2650 1.1247
No log 2.3111 104 0.9557 0.5455 0.9557 0.9776
No log 2.3556 106 0.7500 0.6109 0.7500 0.8660
No log 2.4 108 0.7625 0.5657 0.7625 0.8732
No log 2.4444 110 0.7633 0.6302 0.7633 0.8737
No log 2.4889 112 0.7740 0.6197 0.7740 0.8798
No log 2.5333 114 0.8724 0.5887 0.8724 0.9340
No log 2.5778 116 0.8764 0.6004 0.8764 0.9362
No log 2.6222 118 0.8114 0.6034 0.8114 0.9008
No log 2.6667 120 0.8145 0.5845 0.8145 0.9025
No log 2.7111 122 0.8090 0.5659 0.8090 0.8994
No log 2.7556 124 0.7737 0.6099 0.7737 0.8796
No log 2.8 126 0.8396 0.6367 0.8396 0.9163
No log 2.8444 128 0.7743 0.6809 0.7743 0.8800
No log 2.8889 130 0.7425 0.6849 0.7425 0.8617
No log 2.9333 132 0.7570 0.6369 0.7570 0.8701
No log 2.9778 134 0.7997 0.5947 0.7997 0.8943
No log 3.0222 136 0.7555 0.6369 0.7555 0.8692
No log 3.0667 138 0.7617 0.6471 0.7617 0.8728
No log 3.1111 140 0.8500 0.5789 0.8500 0.9220
No log 3.1556 142 0.9654 0.5493 0.9654 0.9826
No log 3.2 144 0.9983 0.5237 0.9983 0.9991
No log 3.2444 146 0.8735 0.5201 0.8735 0.9346
No log 3.2889 148 0.8021 0.6091 0.8021 0.8956
No log 3.3333 150 0.7493 0.5635 0.7493 0.8656
No log 3.3778 152 0.7347 0.5766 0.7347 0.8571
No log 3.4222 154 0.7549 0.6269 0.7549 0.8689
No log 3.4667 156 0.7593 0.6406 0.7593 0.8714
No log 3.5111 158 0.7531 0.6237 0.7531 0.8678
No log 3.5556 160 0.7671 0.6165 0.7671 0.8758
No log 3.6 162 0.7903 0.6580 0.7903 0.8890
No log 3.6444 164 0.8466 0.6550 0.8466 0.9201
No log 3.6889 166 0.8837 0.5838 0.8837 0.9400
No log 3.7333 168 0.8712 0.5455 0.8712 0.9334
No log 3.7778 170 0.8321 0.5283 0.8321 0.9122
No log 3.8222 172 0.8475 0.5283 0.8475 0.9206
No log 3.8667 174 0.8466 0.5283 0.8466 0.9201
No log 3.9111 176 0.7922 0.5759 0.7922 0.8901
No log 3.9556 178 0.7659 0.6448 0.7659 0.8752
No log 4.0 180 0.7820 0.6647 0.7820 0.8843
No log 4.0444 182 0.9090 0.5571 0.9090 0.9534
No log 4.0889 184 0.9172 0.5410 0.9172 0.9577
No log 4.1333 186 0.8180 0.6362 0.8180 0.9044
No log 4.1778 188 0.8041 0.5852 0.8041 0.8967
No log 4.2222 190 0.8424 0.6269 0.8424 0.9178
No log 4.2667 192 0.9579 0.5626 0.9579 0.9787
No log 4.3111 194 0.9829 0.5201 0.9829 0.9914
No log 4.3556 196 0.8654 0.5073 0.8654 0.9303
No log 4.4 198 0.8218 0.5428 0.8218 0.9065
No log 4.4444 200 0.8480 0.5291 0.8480 0.9209
No log 4.4889 202 0.8924 0.5458 0.8924 0.9447
No log 4.5333 204 0.8861 0.5449 0.8861 0.9413
No log 4.5778 206 0.9469 0.5575 0.9469 0.9731
No log 4.6222 208 0.9312 0.5915 0.9312 0.9650
No log 4.6667 210 0.9335 0.5920 0.9335 0.9662
No log 4.7111 212 0.8718 0.5851 0.8718 0.9337
No log 4.7556 214 0.8909 0.5013 0.8909 0.9439
No log 4.8 216 0.9234 0.5040 0.9234 0.9609
No log 4.8444 218 0.9963 0.5080 0.9963 0.9981
No log 4.8889 220 0.9226 0.5042 0.9226 0.9605
No log 4.9333 222 0.8059 0.4914 0.8059 0.8977
No log 4.9778 224 0.7615 0.5741 0.7615 0.8727
No log 5.0222 226 0.7603 0.4877 0.7603 0.8720
No log 5.0667 228 0.7899 0.6032 0.7899 0.8888
No log 5.1111 230 0.9362 0.5471 0.9362 0.9675
No log 5.1556 232 1.0650 0.5153 1.0650 1.0320
No log 5.2 234 0.9773 0.5027 0.9773 0.9886
No log 5.2444 236 0.8526 0.4340 0.8526 0.9234
No log 5.2889 238 0.8426 0.4474 0.8426 0.9180
No log 5.3333 240 0.8671 0.5028 0.8671 0.9312
No log 5.3778 242 0.9931 0.5027 0.9931 0.9965
No log 5.4222 244 1.0107 0.5027 1.0107 1.0053
No log 5.4667 246 0.8913 0.4521 0.8913 0.9441
No log 5.5111 248 0.8477 0.5312 0.8477 0.9207
No log 5.5556 250 0.8407 0.5112 0.8407 0.9169
No log 5.6 252 0.8540 0.5257 0.8540 0.9241
No log 5.6444 254 0.8773 0.4883 0.8773 0.9367
No log 5.6889 256 0.8599 0.4706 0.8599 0.9273
No log 5.7333 258 0.8055 0.5759 0.8055 0.8975
No log 5.7778 260 0.8341 0.4706 0.8341 0.9133
No log 5.8222 262 0.9945 0.5109 0.9945 0.9972
No log 5.8667 264 1.0868 0.4784 1.0868 1.0425
No log 5.9111 266 0.9923 0.5159 0.9923 0.9961
No log 5.9556 268 0.9162 0.4834 0.9162 0.9572
No log 6.0 270 0.9225 0.5002 0.9225 0.9604
No log 6.0444 272 1.0145 0.5174 1.0145 1.0072
No log 6.0889 274 0.9282 0.4824 0.9282 0.9634
No log 6.1333 276 0.8212 0.5118 0.8212 0.9062
No log 6.1778 278 0.8376 0.5190 0.8376 0.9152
No log 6.2222 280 0.9454 0.4722 0.9454 0.9723
No log 6.2667 282 0.9750 0.5246 0.9750 0.9874
No log 6.3111 284 0.9221 0.5798 0.9221 0.9603
No log 6.3556 286 0.8632 0.5636 0.8632 0.9291
No log 6.4 288 0.7946 0.5935 0.7946 0.8914
No log 6.4444 290 0.8190 0.5441 0.8190 0.9050
No log 6.4889 292 0.8860 0.5682 0.8860 0.9413
No log 6.5333 294 0.8804 0.5682 0.8804 0.9383
No log 6.5778 296 0.8368 0.5563 0.8368 0.9148
No log 6.6222 298 0.8158 0.5635 0.8158 0.9032
No log 6.6667 300 0.8095 0.5437 0.8095 0.8997
No log 6.7111 302 0.8350 0.5521 0.8350 0.9138
No log 6.7556 304 0.9387 0.5217 0.9387 0.9689
No log 6.8 306 1.0920 0.5341 1.0920 1.0450
No log 6.8444 308 1.1738 0.5506 1.1738 1.0834
No log 6.8889 310 1.0814 0.5429 1.0814 1.0399
No log 6.9333 312 0.9504 0.5571 0.9504 0.9749
No log 6.9778 314 0.8812 0.5422 0.8812 0.9387
No log 7.0222 316 0.8373 0.5669 0.8373 0.9150
No log 7.0667 318 0.8602 0.5778 0.8602 0.9275
No log 7.1111 320 0.8493 0.6279 0.8493 0.9216
No log 7.1556 322 0.8942 0.5803 0.8942 0.9456
No log 7.2 324 1.0811 0.5425 1.0811 1.0398
No log 7.2444 326 1.1407 0.5298 1.1407 1.0680
No log 7.2889 328 1.0052 0.5571 1.0052 1.0026
No log 7.3333 330 0.8491 0.5261 0.8491 0.9214
No log 7.3778 332 0.8088 0.5194 0.8088 0.8993
No log 7.4222 334 0.8140 0.5229 0.8140 0.9022
No log 7.4667 336 0.8685 0.5091 0.8685 0.9319
No log 7.5111 338 1.0434 0.5448 1.0434 1.0215
No log 7.5556 340 1.1409 0.5638 1.1409 1.0681
No log 7.6 342 1.0115 0.5286 1.0115 1.0057
No log 7.6444 344 0.8311 0.5414 0.8311 0.9117
No log 7.6889 346 0.7550 0.6025 0.7550 0.8689
No log 7.7333 348 0.7588 0.6256 0.7588 0.8711
No log 7.7778 350 0.7531 0.6365 0.7531 0.8678
No log 7.8222 352 0.8068 0.5728 0.8068 0.8982
No log 7.8667 354 0.9581 0.5363 0.9581 0.9788
No log 7.9111 356 1.0241 0.5258 1.0241 1.0120
No log 7.9556 358 0.9891 0.5551 0.9891 0.9945
No log 8.0 360 0.9664 0.5548 0.9664 0.9830
No log 8.0444 362 0.8898 0.5091 0.8898 0.9433
No log 8.0889 364 0.8370 0.5563 0.8370 0.9149
No log 8.1333 366 0.8062 0.6109 0.8062 0.8979
No log 8.1778 368 0.7850 0.6097 0.7850 0.8860
No log 8.2222 370 0.7849 0.6412 0.7849 0.8859
No log 8.2667 372 0.8337 0.5726 0.8337 0.9131
No log 8.3111 374 0.9136 0.5513 0.9136 0.9558
No log 8.3556 376 0.9246 0.5691 0.9246 0.9616
No log 8.4 378 0.8624 0.5658 0.8624 0.9286
No log 8.4444 380 0.8267 0.5951 0.8267 0.9092
No log 8.4889 382 0.8245 0.5538 0.8245 0.9080
No log 8.5333 384 0.8297 0.5467 0.8297 0.9109
No log 8.5778 386 0.8098 0.4995 0.8098 0.8999
No log 8.6222 388 0.8849 0.4694 0.8849 0.9407
No log 8.6667 390 0.9737 0.5199 0.9737 0.9868
No log 8.7111 392 1.0047 0.5365 1.0047 1.0024
No log 8.7556 394 0.9551 0.5389 0.9551 0.9773
No log 8.8 396 0.9348 0.5416 0.9348 0.9668
No log 8.8444 398 0.8773 0.5079 0.8773 0.9366
No log 8.8889 400 0.8618 0.5187 0.8618 0.9283
No log 8.9333 402 0.8397 0.5041 0.8397 0.9163
No log 8.9778 404 0.8612 0.5185 0.8612 0.9280
No log 9.0222 406 0.8717 0.5120 0.8717 0.9336
No log 9.0667 408 0.8521 0.5519 0.8521 0.9231
No log 9.1111 410 0.8524 0.5498 0.8524 0.9232
No log 9.1556 412 0.8491 0.5498 0.8491 0.9214
No log 9.2 414 0.8462 0.5498 0.8462 0.9199
No log 9.2444 416 0.8929 0.5273 0.8929 0.9449
No log 9.2889 418 0.8436 0.5308 0.8436 0.9185
No log 9.3333 420 0.8116 0.5359 0.8116 0.9009
No log 9.3778 422 0.8290 0.5118 0.8290 0.9105
No log 9.4222 424 0.8897 0.4761 0.8897 0.9433
No log 9.4667 426 0.8771 0.5071 0.8771 0.9365
No log 9.5111 428 0.8625 0.5071 0.8625 0.9287
No log 9.5556 430 0.8140 0.5164 0.8140 0.9022
No log 9.6 432 0.7881 0.5635 0.7881 0.8877
No log 9.6444 434 0.7927 0.5635 0.7927 0.8903
No log 9.6889 436 0.8044 0.5787 0.8044 0.8969
No log 9.7333 438 0.7856 0.5725 0.7856 0.8863
No log 9.7778 440 0.7989 0.5979 0.7989 0.8938
No log 9.8222 442 0.8213 0.5792 0.8213 0.9063
No log 9.8667 444 0.8239 0.5493 0.8239 0.9077
No log 9.9111 446 0.8158 0.5515 0.8158 0.9032
No log 9.9556 448 0.7707 0.6281 0.7707 0.8779
No log 10.0 450 0.7706 0.5930 0.7706 0.8778
No log 10.0444 452 0.7846 0.5756 0.7846 0.8858
No log 10.0889 454 0.7777 0.6132 0.7777 0.8819
No log 10.1333 456 0.7740 0.5695 0.7740 0.8798
No log 10.1778 458 0.7825 0.5647 0.7825 0.8846
No log 10.2222 460 0.8003 0.5623 0.8003 0.8946
No log 10.2667 462 0.8035 0.5623 0.8035 0.8964
No log 10.3111 464 0.7766 0.5658 0.7766 0.8812
No log 10.3556 466 0.7676 0.6277 0.7676 0.8761
No log 10.4 468 0.7715 0.5171 0.7715 0.8784
No log 10.4444 470 0.7851 0.4474 0.7851 0.8861
No log 10.4889 472 0.8051 0.5368 0.8051 0.8973
No log 10.5333 474 0.8212 0.5094 0.8212 0.9062
No log 10.5778 476 0.8411 0.5029 0.8411 0.9171
No log 10.6222 478 0.8689 0.5014 0.8689 0.9321
No log 10.6667 480 0.9150 0.5370 0.9150 0.9566
No log 10.7111 482 0.8745 0.5399 0.8745 0.9351
No log 10.7556 484 0.8537 0.5259 0.8537 0.9240
No log 10.8 486 0.8212 0.5482 0.8212 0.9062
No log 10.8444 488 0.8058 0.4340 0.8058 0.8977
No log 10.8889 490 0.8111 0.4308 0.8111 0.9006
No log 10.9333 492 0.8232 0.4240 0.8232 0.9073
No log 10.9778 494 0.8233 0.5102 0.8233 0.9073
No log 11.0222 496 0.8115 0.5461 0.8115 0.9008
No log 11.0667 498 0.7866 0.5984 0.7866 0.8869
0.3302 11.1111 500 0.7832 0.6129 0.7832 0.8850
0.3302 11.1556 502 0.8081 0.5679 0.8081 0.8989
0.3302 11.2 504 0.8159 0.5679 0.8159 0.9033
0.3302 11.2444 506 0.8227 0.5403 0.8227 0.9070
0.3302 11.2889 508 0.7911 0.5440 0.7911 0.8895
0.3302 11.3333 510 0.7782 0.5054 0.7782 0.8822
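Assuming the checkpoint carries a regression-style sequence-classification head (consistent with the MSE/RMSE metrics reported above), it could presumably be loaded and scored like this. An untested sketch that requires network access to the Hugging Face Hub; the single-logit assumption is inferred, not confirmed by the card:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k8_task2_organization"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# An Arabic essay would go here; the input text is a placeholder.
inputs = tokenizer("...", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits  # assumed to hold the organization score
print(logits)
```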

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1