ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k10_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7423
  • Qwk: 0.0318
  • Mse: 0.7423
  • Rmse: 0.8616
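
Qwk here is quadratic weighted kappa, Mse is mean squared error (it equals the reported loss, so the model is evaluated with an MSE objective), and Rmse is its square root. A minimal pure-Python sketch of these metrics; the 0–4 score scale and the sample labels below are illustrative assumptions, not taken from this model's evaluation set:

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Standard QWK: 1 - sum(W*O) / sum(W*E), with quadratic weights W_ij = (i-j)^2 / (n-1)^2."""
    n = len(y_true)
    observed = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    hist_true = [sum(row) for row in observed]                                    # row sums
    hist_pred = [sum(observed[i][j] for i in range(n_classes)) for j in range(n_classes)]  # column sums
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2
            num += w * observed[i][j]                     # weighted observed disagreement
            den += w * hist_true[i] * hist_pred[j] / n    # weighted expected disagreement
    return 1.0 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# Illustrative scores on a 0-4 scale (NOT the actual evaluation data):
y_true = [0, 2, 3, 1, 4, 2]
y_pred = [0, 2, 2, 1, 3, 2]
print(quadratic_weighted_kappa(y_true, y_pred, n_classes=5))
print(mse(y_true, y_pred), math.sqrt(mse(y_true, y_pred)))
```

A QWK near zero, as reported here (0.0318), indicates agreement with the reference scores barely above chance level.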

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
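
With lr_scheduler_type set to linear, the learning rate decays linearly from 2e-05 toward zero over training. A minimal sketch of that schedule, assuming no warmup (none is listed); the total step count is inferred from the results table (step 2 ≈ epoch 0.0385, i.e. roughly 52 steps per epoch), not stated explicitly:

```python
def linear_lr(step, total_steps, base_lr=2e-05):
    """Linear decay from base_lr at step 0 to 0 at total_steps (no warmup phase)."""
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps)

# ~52 steps/epoch x 100 epochs, inferred from the table above
total_steps = 5200
print(linear_lr(0, total_steps))            # full base learning rate at the start
print(linear_lr(total_steps // 2, total_steps))
print(linear_lr(total_steps, total_steps))  # decays to zero by the end
```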

Training results

The training loss was logged only at step 500 (0.274); earlier rows therefore show "No log".

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0385 2 3.5726 0.0035 3.5726 1.8901
No log 0.0769 4 2.0518 -0.0704 2.0518 1.4324
No log 0.1154 6 1.3248 0.0016 1.3248 1.1510
No log 0.1538 8 1.1643 -0.0446 1.1643 1.0790
No log 0.1923 10 0.7259 0.0506 0.7259 0.8520
No log 0.2308 12 0.7198 -0.0644 0.7198 0.8484
No log 0.2692 14 0.7354 -0.1223 0.7354 0.8576
No log 0.3077 16 0.7240 -0.1223 0.7240 0.8509
No log 0.3462 18 0.7983 -0.1249 0.7983 0.8935
No log 0.3846 20 0.7463 -0.0644 0.7463 0.8639
No log 0.4231 22 0.7370 -0.0035 0.7370 0.8585
No log 0.4615 24 0.7739 -0.0695 0.7739 0.8797
No log 0.5 26 1.2001 0.0025 1.2001 1.0955
No log 0.5385 28 0.9334 -0.0500 0.9334 0.9661
No log 0.5769 30 0.8305 0.0622 0.8305 0.9113
No log 0.6154 32 0.8811 0.1281 0.8811 0.9387
No log 0.6538 34 0.9125 0.0886 0.9125 0.9552
No log 0.6923 36 1.1228 0.0306 1.1228 1.0596
No log 0.7308 38 1.1054 0.0193 1.1054 1.0514
No log 0.7692 40 1.1621 -0.0260 1.1621 1.0780
No log 0.8077 42 1.4081 -0.0172 1.4081 1.1866
No log 0.8462 44 1.1687 -0.0170 1.1687 1.0811
No log 0.8846 46 1.8431 0.0231 1.8431 1.3576
No log 0.9231 48 2.5748 0.0123 2.5748 1.6046
No log 0.9615 50 1.6014 0.0754 1.6014 1.2655
No log 1.0 52 0.8735 0.0816 0.8735 0.9346
No log 1.0385 54 1.0214 0.0531 1.0214 1.0106
No log 1.0769 56 1.0407 0.0531 1.0407 1.0202
No log 1.1154 58 0.7725 0.0393 0.7725 0.8789
No log 1.1538 60 1.0929 0.0543 1.0929 1.0454
No log 1.1923 62 1.0674 0.0569 1.0674 1.0331
No log 1.2308 64 0.8011 0.1092 0.8011 0.8951
No log 1.2692 66 1.2378 0.1106 1.2378 1.1126
No log 1.3077 68 1.2026 0.1106 1.2026 1.0966
No log 1.3462 70 0.8462 0.1575 0.8462 0.9199
No log 1.3846 72 0.8310 0.1423 0.8310 0.9116
No log 1.4231 74 0.8217 0.1456 0.8217 0.9065
No log 1.4615 76 0.8154 0.1372 0.8154 0.9030
No log 1.5 78 0.7886 0.1189 0.7886 0.8881
No log 1.5385 80 0.7719 0.1585 0.7719 0.8786
No log 1.5769 82 0.7725 0.0834 0.7725 0.8789
No log 1.6154 84 0.7614 0.0690 0.7614 0.8726
No log 1.6538 86 0.7820 0.0214 0.7820 0.8843
No log 1.6923 88 0.8097 -0.0406 0.8097 0.8999
No log 1.7308 90 1.1110 0.0789 1.1110 1.0541
No log 1.7692 92 0.9597 -0.0442 0.9597 0.9796
No log 1.8077 94 0.8099 0.0289 0.8099 0.9000
No log 1.8462 96 0.8276 0.0257 0.8276 0.9097
No log 1.8846 98 0.8184 0.0532 0.8184 0.9047
No log 1.9231 100 0.8317 0.1333 0.8317 0.9120
No log 1.9615 102 0.9297 0.0283 0.9297 0.9642
No log 2.0 104 1.2635 0.1432 1.2635 1.1241
No log 2.0385 106 1.1885 0.1057 1.1885 1.0902
No log 2.0769 108 0.8869 0.0065 0.8869 0.9418
No log 2.1154 110 0.8501 0.0535 0.8501 0.9220
No log 2.1538 112 0.8964 0.0964 0.8964 0.9468
No log 2.1923 114 0.8214 0.2005 0.8214 0.9063
No log 2.2308 116 0.9748 0.0585 0.9748 0.9873
No log 2.2692 118 0.8780 0.1007 0.8780 0.9370
No log 2.3077 120 0.9223 0.0239 0.9223 0.9604
No log 2.3462 122 0.8757 0.0221 0.8757 0.9358
No log 2.3846 124 0.8344 0.1529 0.8344 0.9135
No log 2.4231 126 0.9284 0.1306 0.9284 0.9635
No log 2.4615 128 0.7692 0.1244 0.7692 0.8770
No log 2.5 130 0.8152 0.0682 0.8152 0.9029
No log 2.5385 132 0.7210 0.0976 0.7210 0.8491
No log 2.5769 134 0.7874 0.1239 0.7874 0.8874
No log 2.6154 136 1.2262 0.0623 1.2262 1.1073
No log 2.6538 138 1.1730 0.0746 1.1730 1.0831
No log 2.6923 140 0.7865 0.1239 0.7865 0.8868
No log 2.7308 142 0.7296 0.1974 0.7296 0.8542
No log 2.7692 144 0.8939 0.0772 0.8939 0.9455
No log 2.8077 146 0.8943 0.0665 0.8943 0.9457
No log 2.8462 148 0.8558 0.1372 0.8558 0.9251
No log 2.8846 150 1.0648 0.0735 1.0648 1.0319
No log 2.9231 152 0.9071 -0.0425 0.9071 0.9524
No log 2.9615 154 0.7797 0.0898 0.7797 0.8830
No log 3.0 156 0.8611 0.0763 0.8611 0.9280
No log 3.0385 158 0.7419 0.0393 0.7419 0.8614
No log 3.0769 160 0.9299 0.1406 0.9299 0.9643
No log 3.1154 162 1.2139 0.1370 1.2139 1.1018
No log 3.1538 164 0.9544 0.0984 0.9544 0.9769
No log 3.1923 166 0.7552 0.0376 0.7552 0.8690
No log 3.2308 168 0.7690 0.0058 0.7690 0.8769
No log 3.2692 170 0.7611 -0.0145 0.7611 0.8724
No log 3.3077 172 1.0350 -0.0013 1.0350 1.0173
No log 3.3462 174 1.2147 0.0119 1.2147 1.1021
No log 3.3846 176 1.0841 0.0596 1.0841 1.0412
No log 3.4231 178 0.8111 -0.0163 0.8111 0.9006
No log 3.4615 180 0.9243 0.1247 0.9243 0.9614
No log 3.5 182 0.9954 0.0906 0.9954 0.9977
No log 3.5385 184 0.8947 0.0469 0.8947 0.9459
No log 3.5769 186 0.9014 0.1025 0.9014 0.9494
No log 3.6154 188 0.9529 0.0949 0.9529 0.9762
No log 3.6538 190 0.7955 0.1336 0.7955 0.8919
No log 3.6923 192 0.7424 0.0834 0.7424 0.8616
No log 3.7308 194 0.7358 0.0922 0.7358 0.8578
No log 3.7692 196 0.7380 0.0318 0.7380 0.8591
No log 3.8077 198 0.7739 0.0639 0.7739 0.8797
No log 3.8462 200 0.8613 0.0871 0.8613 0.9281
No log 3.8846 202 0.8759 0.1222 0.8759 0.9359
No log 3.9231 204 0.9085 0.0708 0.9085 0.9531
No log 3.9615 206 0.8532 0.1259 0.8532 0.9237
No log 4.0 208 0.8777 0.1386 0.8777 0.9368
No log 4.0385 210 0.7899 -0.0027 0.7899 0.8888
No log 4.0769 212 0.8256 0.0247 0.8256 0.9086
No log 4.1154 214 0.7776 0.0318 0.7776 0.8818
No log 4.1538 216 0.7880 -0.1396 0.7880 0.8877
No log 4.1923 218 0.7973 0.0303 0.7973 0.8929
No log 4.2308 220 0.9016 -0.0373 0.9016 0.9495
No log 4.2692 222 0.8509 -0.0316 0.8509 0.9225
No log 4.3077 224 0.8066 0.0269 0.8066 0.8981
No log 4.3462 226 0.8226 0.0097 0.8226 0.9070
No log 4.3846 228 0.8076 -0.0218 0.8076 0.8986
No log 4.4231 230 0.8142 -0.1191 0.8142 0.9023
No log 4.4615 232 0.7623 0.0338 0.7623 0.8731
No log 4.5 234 0.7505 0.0375 0.7505 0.8663
No log 4.5385 236 0.7643 0.0247 0.7643 0.8743
No log 4.5769 238 0.9579 -0.0056 0.9579 0.9787
No log 4.6154 240 1.0239 -0.0545 1.0239 1.0119
No log 4.6538 242 0.8351 -0.0295 0.8351 0.9138
No log 4.6923 244 0.8232 0.0426 0.8232 0.9073
No log 4.7308 246 0.8789 0.0239 0.8789 0.9375
No log 4.7692 248 0.8468 -0.0144 0.8468 0.9202
No log 4.8077 250 0.7514 0.0807 0.7514 0.8668
No log 4.8462 252 0.8668 -0.0079 0.8668 0.9310
No log 4.8846 254 0.8132 0.0071 0.8132 0.9018
No log 4.9231 256 0.7187 0.0374 0.7187 0.8478
No log 4.9615 258 0.7332 0.0416 0.7332 0.8563
No log 5.0 260 0.7589 0.0318 0.7589 0.8711
No log 5.0385 262 0.7793 0.0318 0.7793 0.8828
No log 5.0769 264 0.7712 -0.0252 0.7712 0.8782
No log 5.1154 266 0.7550 0.0318 0.7550 0.8689
No log 5.1538 268 0.7686 0.0983 0.7686 0.8767
No log 5.1923 270 0.7305 -0.0152 0.7305 0.8547
No log 5.2308 272 0.8182 0.1193 0.8182 0.9046
No log 5.2692 274 0.7952 0.0068 0.7952 0.8917
No log 5.3077 276 0.7298 0.0282 0.7298 0.8543
No log 5.3462 278 0.7311 0.0471 0.7311 0.8550
No log 5.3846 280 0.7343 0.0214 0.7343 0.8569
No log 5.4231 282 0.8580 0.1542 0.8580 0.9263
No log 5.4615 284 0.8245 0.0456 0.8245 0.9080
No log 5.5 286 0.7881 -0.0240 0.7881 0.8878
No log 5.5385 288 0.7741 -0.0240 0.7741 0.8798
No log 5.5769 290 0.7828 0.0562 0.7828 0.8848
No log 5.6154 292 0.7554 0.0214 0.7554 0.8691
No log 5.6538 294 0.7448 -0.0030 0.7448 0.8630
No log 5.6923 296 0.7648 0.0030 0.7648 0.8745
No log 5.7308 298 0.7367 -0.0152 0.7367 0.8583
No log 5.7692 300 0.7520 0.0670 0.7520 0.8672
No log 5.8077 302 0.6984 0.0334 0.6984 0.8357
No log 5.8462 304 0.6990 -0.0033 0.6990 0.8360
No log 5.8846 306 0.7276 0.0 0.7276 0.8530
No log 5.9231 308 0.7292 -0.0033 0.7292 0.8539
No log 5.9615 310 0.7391 0.0152 0.7391 0.8597
No log 6.0 312 0.7736 0.0068 0.7736 0.8795
No log 6.0385 314 0.8119 0.0799 0.8119 0.9010
No log 6.0769 316 0.7934 0.0068 0.7934 0.8907
No log 6.1154 318 0.7727 0.0680 0.7727 0.8790
No log 6.1538 320 0.7789 -0.0118 0.7789 0.8825
No log 6.1923 322 0.7807 0.0282 0.7807 0.8836
No log 6.2308 324 0.7960 0.0282 0.7960 0.8922
No log 6.2692 326 0.8334 0.0226 0.8334 0.9129
No log 6.3077 328 0.8557 0.0741 0.8557 0.9250
No log 6.3462 330 0.8713 0.1009 0.8713 0.9334
No log 6.3846 332 0.9955 0.0794 0.9955 0.9977
No log 6.4231 334 0.9567 0.0824 0.9567 0.9781
No log 6.4615 336 0.8103 0.1049 0.8103 0.9002
No log 6.5 338 0.7995 0.1095 0.7995 0.8942
No log 6.5385 340 0.7915 0.0840 0.7915 0.8896
No log 6.5769 342 0.7739 -0.0506 0.7739 0.8797
No log 6.6154 344 0.7665 0.0303 0.7665 0.8755
No log 6.6538 346 0.7942 0.0956 0.7942 0.8912
No log 6.6923 348 0.9266 0.1897 0.9266 0.9626
No log 6.7308 350 0.9288 0.2015 0.9288 0.9637
No log 6.7692 352 0.8162 -0.0138 0.8162 0.9034
No log 6.8077 354 0.8872 0.0641 0.8872 0.9419
No log 6.8462 356 0.8614 0.0139 0.8614 0.9281
No log 6.8846 358 0.8686 0.1973 0.8686 0.9320
No log 6.9231 360 0.9822 0.1112 0.9822 0.9911
No log 6.9615 362 0.8513 0.1235 0.8513 0.9226
No log 7.0 364 0.8424 0.0537 0.8424 0.9178
No log 7.0385 366 0.9054 0.0627 0.9054 0.9515
No log 7.0769 368 0.8193 0.0905 0.8193 0.9051
No log 7.1154 370 0.8011 0.0999 0.8011 0.8951
No log 7.1538 372 0.8198 0.1243 0.8198 0.9055
No log 7.1923 374 0.8145 0.1243 0.8145 0.9025
No log 7.2308 376 0.7407 0.0999 0.7407 0.8606
No log 7.2692 378 0.7131 0.0282 0.7131 0.8445
No log 7.3077 380 0.7306 0.0436 0.7306 0.8548
No log 7.3462 382 0.7261 -0.0125 0.7261 0.8521
No log 7.3846 384 0.7388 -0.0179 0.7388 0.8595
No log 7.4231 386 0.7679 -0.0179 0.7679 0.8763
No log 7.4615 388 0.7878 0.1144 0.7878 0.8876
No log 7.5 390 0.8616 0.1445 0.8616 0.9282
No log 7.5385 392 0.9316 0.1311 0.9316 0.9652
No log 7.5769 394 0.8656 0.1445 0.8656 0.9304
No log 7.6154 396 0.7744 0.0600 0.7744 0.8800
No log 7.6538 398 0.7592 -0.0125 0.7592 0.8713
No log 7.6923 400 0.7311 -0.0125 0.7311 0.8550
No log 7.7308 402 0.7386 0.0723 0.7386 0.8594
No log 7.7692 404 0.7834 0.0956 0.7834 0.8851
No log 7.8077 406 0.8469 0.1593 0.8469 0.9202
No log 7.8462 408 0.8229 0.0831 0.8229 0.9071
No log 7.8846 410 0.7766 0.1048 0.7766 0.8812
No log 7.9231 412 0.7761 0.0776 0.7761 0.8810
No log 7.9615 414 0.8117 0.0871 0.8117 0.9009
No log 8.0 416 0.8746 0.1646 0.8746 0.9352
No log 8.0385 418 0.8455 0.1646 0.8455 0.9195
No log 8.0769 420 0.7694 0.1146 0.7694 0.8772
No log 8.1154 422 0.7544 0.0768 0.7544 0.8686
No log 8.1538 424 0.7519 0.0600 0.7519 0.8671
No log 8.1923 426 0.7496 0.0600 0.7496 0.8658
No log 8.2308 428 0.7585 0.0600 0.7585 0.8709
No log 8.2692 430 0.7644 0.1146 0.7644 0.8743
No log 8.3077 432 0.7546 0.2138 0.7546 0.8687
No log 8.3462 434 0.7423 0.0723 0.7423 0.8616
No log 8.3846 436 0.7605 0.0953 0.7605 0.8721
No log 8.4231 438 0.7768 0.1716 0.7768 0.8814
No log 8.4615 440 0.7590 0.0909 0.7590 0.8712
No log 8.5 442 0.7441 0.1395 0.7441 0.8626
No log 8.5385 444 0.7205 0.0318 0.7205 0.8488
No log 8.5769 446 0.7505 0.1413 0.7505 0.8663
No log 8.6154 448 0.7926 0.1345 0.7926 0.8903
No log 8.6538 450 0.7585 -0.0054 0.7585 0.8709
No log 8.6923 452 0.8084 0.1716 0.8084 0.8991
No log 8.7308 454 0.8889 0.1897 0.8889 0.9428
No log 8.7692 456 0.8309 0.1196 0.8309 0.9115
No log 8.8077 458 0.7369 0.0999 0.7369 0.8584
No log 8.8462 460 0.7470 -0.0118 0.7470 0.8643
No log 8.8846 462 0.7807 0.0081 0.7807 0.8836
No log 8.9231 464 0.7595 -0.0062 0.7595 0.8715
No log 8.9615 466 0.7172 0.0768 0.7172 0.8469
No log 9.0 468 0.7976 0.1716 0.7976 0.8931
No log 9.0385 470 0.8354 0.1955 0.8354 0.9140
No log 9.0769 472 0.7623 0.1286 0.7623 0.8731
No log 9.1154 474 0.7463 0.0562 0.7463 0.8639
No log 9.1538 476 0.7559 0.0282 0.7559 0.8694
No log 9.1923 478 0.7506 0.0282 0.7506 0.8664
No log 9.2308 480 0.7447 0.0183 0.7447 0.8630
No log 9.2692 482 0.7704 0.0871 0.7704 0.8777
No log 9.3077 484 0.7987 0.0793 0.7987 0.8937
No log 9.3462 486 0.8340 0.1105 0.8340 0.9132
No log 9.3846 488 0.8070 0.0876 0.8070 0.8983
No log 9.4231 490 0.8156 0.0444 0.8156 0.9031
No log 9.4615 492 0.8233 0.0488 0.8233 0.9074
No log 9.5 494 0.8010 0.0768 0.8010 0.8950
No log 9.5385 496 0.8654 0.1445 0.8654 0.9303
No log 9.5769 498 0.8721 0.2147 0.8721 0.9338
0.274 9.6154 500 0.7677 0.0600 0.7677 0.8762
0.274 9.6538 502 0.7319 0.0355 0.7319 0.8555
0.274 9.6923 504 0.7681 0.0 0.7681 0.8764
0.274 9.7308 506 0.7628 0.0 0.7628 0.8734
0.274 9.7692 508 0.7393 0.0355 0.7393 0.8598
0.274 9.8077 510 0.7423 0.0318 0.7423 0.8616

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (Safetensors, tensor type F32)

Model tree for MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k10_task3_organization

  • Finetuned from: aubmindlab/bert-base-arabertv02