ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k4_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9191
  • QWK: -0.0204
  • MSE: 0.9191
  • RMSE: 0.9587
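The reported RMSE is simply the square root of the reported MSE, and QWK is quadratic weighted kappa (commonly computed with scikit-learn's `cohen_kappa_score` using `weights="quadratic"`). A quick stdlib-only check that the summary numbers are self-consistent:

```python
import math

# Evaluation-set metrics reported above
mse = 0.9191
rmse = 0.9587

# RMSE is the square root of MSE; the reported values should agree
# to the printed precision.
assert abs(math.sqrt(mse) - rmse) < 1e-3
print(round(math.sqrt(mse), 4))  # 0.9587
```

The near-zero (slightly negative) QWK indicates the model's ordinal predictions on this task agree with the labels no better than chance.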

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
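These values map one-to-one onto `transformers.TrainingArguments` fields. A minimal sketch of the equivalent keyword arguments (model, dataset, and `Trainer` setup are omitted; the per-device batch-size spellings are the standard `TrainingArguments` names):

```python
# Hyperparameters above, expressed as transformers.TrainingArguments
# keyword arguments (a sketch; Trainer/model/dataset setup omitted).
training_kwargs = dict(
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,            # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)

# Would be passed as: TrainingArguments(output_dir="...", **training_kwargs)
```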

Training results

Training Loss  Epoch  Step  Validation Loss  QWK  MSE  RMSE
No log 0.0952 2 3.8182 0.0017 3.8182 1.9540
No log 0.1905 4 1.9722 -0.0076 1.9722 1.4043
No log 0.2857 6 1.5875 -0.0029 1.5875 1.2600
No log 0.3810 8 1.5767 -0.0454 1.5767 1.2557
No log 0.4762 10 0.8234 0.0670 0.8234 0.9074
No log 0.5714 12 0.8053 -0.0264 0.8053 0.8974
No log 0.6667 14 0.8649 -0.0371 0.8649 0.9300
No log 0.7619 16 0.7744 0.0374 0.7744 0.8800
No log 0.8571 18 0.7342 -0.0035 0.7342 0.8568
No log 0.9524 20 0.7259 -0.0035 0.7259 0.8520
No log 1.0476 22 0.7395 -0.0679 0.7395 0.8600
No log 1.1429 24 0.7523 -0.1223 0.7523 0.8674
No log 1.2381 26 0.8442 -0.1682 0.8442 0.9188
No log 1.3333 28 1.0132 -0.0175 1.0132 1.0066
No log 1.4286 30 1.5386 0.0505 1.5386 1.2404
No log 1.5238 32 1.1179 -0.0191 1.1179 1.0573
No log 1.6190 34 0.7562 0.0296 0.7562 0.8696
No log 1.7143 36 0.7298 0.0460 0.7298 0.8543
No log 1.8095 38 0.7594 0.0759 0.7594 0.8714
No log 1.9048 40 0.8958 0.1239 0.8958 0.9464
No log 2.0 42 1.2048 -0.0043 1.2048 1.0976
No log 2.0952 44 0.9750 0.1306 0.9750 0.9874
No log 2.1905 46 0.7247 0.0807 0.7247 0.8513
No log 2.2857 48 0.7194 0.0506 0.7194 0.8482
No log 2.3810 50 0.7163 0.1021 0.7163 0.8464
No log 2.4762 52 0.8031 0.0442 0.8031 0.8962
No log 2.5714 54 0.7989 0.0588 0.7989 0.8938
No log 2.6667 56 0.7475 -0.0035 0.7475 0.8646
No log 2.7619 58 0.7789 -0.0571 0.7789 0.8826
No log 2.8571 60 0.7786 0.0375 0.7786 0.8824
No log 2.9524 62 0.8039 0.1240 0.8039 0.8966
No log 3.0476 64 0.8907 0.1264 0.8907 0.9437
No log 3.1429 66 0.8467 0.1355 0.8467 0.9202
No log 3.2381 68 0.8927 0.0089 0.8927 0.9448
No log 3.3333 70 0.9655 0.0512 0.9655 0.9826
No log 3.4286 72 0.9439 0.0492 0.9439 0.9715
No log 3.5238 74 0.9410 0.0492 0.9410 0.9701
No log 3.6190 76 0.8645 0.0770 0.8645 0.9298
No log 3.7143 78 0.7755 0.0705 0.7755 0.8807
No log 3.8095 80 0.7412 0.0116 0.7412 0.8609
No log 3.9048 82 0.8426 0.1506 0.8426 0.9179
No log 4.0 84 0.8455 0.1458 0.8455 0.9195
No log 4.0952 86 0.8636 0.1785 0.8636 0.9293
No log 4.1905 88 0.8953 0.1412 0.8953 0.9462
No log 4.2857 90 0.8649 0.0947 0.8649 0.9300
No log 4.3810 92 0.8976 0.1443 0.8976 0.9474
No log 4.4762 94 0.9096 0.1034 0.9096 0.9537
No log 4.5714 96 0.9044 0.0993 0.9044 0.9510
No log 4.6667 98 0.9854 0.0030 0.9854 0.9927
No log 4.7619 100 0.9415 -0.0408 0.9415 0.9703
No log 4.8571 102 0.9985 0.0092 0.9985 0.9992
No log 4.9524 104 0.8262 0.1031 0.8262 0.9090
No log 5.0476 106 0.7879 0.1372 0.7879 0.8876
No log 5.1429 108 0.8078 0.0702 0.8078 0.8988
No log 5.2381 110 0.7212 0.1674 0.7212 0.8492
No log 5.3333 112 0.7223 0.1453 0.7223 0.8499
No log 5.4286 114 0.7903 0.1515 0.7903 0.8890
No log 5.5238 116 0.7857 0.1872 0.7857 0.8864
No log 5.6190 118 0.8278 0.1450 0.8278 0.9098
No log 5.7143 120 0.7724 0.1310 0.7724 0.8789
No log 5.8095 122 0.8066 0.1034 0.8066 0.8981
No log 5.9048 124 0.7824 0.1416 0.7824 0.8846
No log 6.0 126 0.7698 0.2087 0.7698 0.8774
No log 6.0952 128 0.7874 0.1310 0.7874 0.8873
No log 6.1905 130 1.0421 0.0226 1.0421 1.0208
No log 6.2857 132 1.1976 0.0293 1.1976 1.0944
No log 6.3810 134 0.9460 0.1078 0.9460 0.9726
No log 6.4762 136 0.8994 0.0665 0.8994 0.9484
No log 6.5714 138 1.0914 -0.0120 1.0914 1.0447
No log 6.6667 140 0.9701 0.0772 0.9701 0.9849
No log 6.7619 142 0.7959 0.1646 0.7959 0.8921
No log 6.8571 144 0.8269 0.1660 0.8269 0.9093
No log 6.9524 146 0.9173 -0.0148 0.9173 0.9577
No log 7.0476 148 1.0522 0.1007 1.0522 1.0258
No log 7.1429 150 0.9030 0.0547 0.9030 0.9503
No log 7.2381 152 0.8068 0.1277 0.8068 0.8982
No log 7.3333 154 0.8602 0.1339 0.8602 0.9275
No log 7.4286 156 0.9085 0.0007 0.9085 0.9532
No log 7.5238 158 0.7554 0.1327 0.7554 0.8692
No log 7.6190 160 0.7361 0.1333 0.7361 0.8580
No log 7.7143 162 0.7359 0.0791 0.7359 0.8578
No log 7.8095 164 0.7792 0.0898 0.7792 0.8827
No log 7.9048 166 0.8572 0.1352 0.8572 0.9259
No log 8.0 168 0.8732 0.1700 0.8732 0.9344
No log 8.0952 170 1.0071 0.1329 1.0071 1.0036
No log 8.1905 172 1.0313 0.1041 1.0313 1.0156
No log 8.2857 174 0.9025 -0.0307 0.9025 0.9500
No log 8.3810 176 0.8967 0.0277 0.8967 0.9469
No log 8.4762 178 0.8913 0.0725 0.8913 0.9441
No log 8.5714 180 1.0022 -0.0036 1.0022 1.0011
No log 8.6667 182 1.1016 0.0196 1.1016 1.0496
No log 8.7619 184 0.9122 -0.0522 0.9122 0.9551
No log 8.8571 186 0.8465 0.0670 0.8465 0.9201
No log 8.9524 188 0.8440 0.0277 0.8440 0.9187
No log 9.0476 190 0.8426 0.1646 0.8426 0.9180
No log 9.1429 192 0.9789 -0.0661 0.9789 0.9894
No log 9.2381 194 0.9569 -0.0322 0.9569 0.9782
No log 9.3333 196 0.8081 0.1287 0.8081 0.8990
No log 9.4286 198 0.7859 0.0834 0.7859 0.8865
No log 9.5238 200 0.7735 0.1395 0.7735 0.8795
No log 9.6190 202 0.7935 0.0585 0.7935 0.8908
No log 9.7143 204 0.8050 0.0141 0.8050 0.8972
No log 9.8095 206 0.7708 0.0030 0.7708 0.8780
No log 9.9048 208 0.8105 0.0095 0.8105 0.9003
No log 10.0 210 0.9169 -0.0425 0.9169 0.9576
No log 10.0952 212 0.8013 0.1095 0.8013 0.8951
No log 10.1905 214 0.8963 0.0747 0.8963 0.9467
No log 10.2857 216 0.9207 0.0125 0.9207 0.9595
No log 10.3810 218 0.8297 0.0159 0.8297 0.9109
No log 10.4762 220 0.7828 0.1240 0.7828 0.8847
No log 10.5714 222 0.7986 0.0226 0.7986 0.8937
No log 10.6667 224 0.7952 0.1630 0.7952 0.8917
No log 10.7619 226 0.8841 -0.0470 0.8841 0.9403
No log 10.8571 228 0.9188 0.0728 0.9188 0.9585
No log 10.9524 230 0.8045 0.1259 0.8045 0.8969
No log 11.0476 232 0.7712 0.1282 0.7712 0.8782
No log 11.1429 234 0.7446 0.0357 0.7446 0.8629
No log 11.2381 236 0.7459 -0.0086 0.7459 0.8637
No log 11.3333 238 0.8159 -0.0440 0.8159 0.9033
No log 11.4286 240 0.7832 0.1687 0.7832 0.8850
No log 11.5238 242 0.7732 0.1189 0.7732 0.8793
No log 11.6190 244 0.7824 0.1674 0.7824 0.8846
No log 11.7143 246 0.8187 0.1941 0.8187 0.9048
No log 11.8095 248 0.9575 0.0799 0.9575 0.9785
No log 11.9048 250 0.8930 0.0783 0.8930 0.9450
No log 12.0 252 0.8440 0.1155 0.8440 0.9187
No log 12.0952 254 0.7630 0.0617 0.7630 0.8735
No log 12.1905 256 0.7537 0.1033 0.7537 0.8681
No log 12.2857 258 0.7436 0.0889 0.7436 0.8623
No log 12.3810 260 0.7730 0.1358 0.7730 0.8792
No log 12.4762 262 0.8504 0.1405 0.8504 0.9222
No log 12.5714 264 0.8504 0.1398 0.8504 0.9222
No log 12.6667 266 0.8503 0.1078 0.8503 0.9221
No log 12.7619 268 0.7981 -0.0166 0.7981 0.8933
No log 12.8571 270 0.7148 0.1311 0.7148 0.8455
No log 12.9524 272 0.7284 0.1148 0.7284 0.8534
No log 13.0476 274 0.7018 0.1828 0.7018 0.8377
No log 13.1429 276 0.7485 0.0663 0.7485 0.8651
No log 13.2381 278 0.8105 0.0710 0.8105 0.9003
No log 13.3333 280 0.7340 0.1080 0.7340 0.8568
No log 13.4286 282 0.7132 0.0513 0.7132 0.8445
No log 13.5238 284 0.7085 0.1379 0.7085 0.8417
No log 13.6190 286 0.7132 0.1379 0.7132 0.8445
No log 13.7143 288 0.7048 -0.0032 0.7048 0.8395
No log 13.8095 290 0.7940 0.0756 0.7940 0.8911
No log 13.9048 292 0.7726 0.0310 0.7726 0.8790
No log 14.0 294 0.7097 0.0524 0.7097 0.8424
No log 14.0952 296 0.6972 0.0964 0.6972 0.8350
No log 14.1905 298 0.7093 0.0964 0.7093 0.8422
No log 14.2857 300 0.7487 0.0061 0.7487 0.8653
No log 14.3810 302 0.7562 -0.0449 0.7562 0.8696
No log 14.4762 304 0.7364 -0.0062 0.7364 0.8581
No log 14.5714 306 0.7382 0.0914 0.7382 0.8592
No log 14.6667 308 0.7496 0.2258 0.7496 0.8658
No log 14.7619 310 0.7558 -0.0513 0.7558 0.8694
No log 14.8571 312 0.7839 -0.0717 0.7839 0.8854
No log 14.9524 314 0.7646 -0.0967 0.7646 0.8744
No log 15.0476 316 0.7613 0.1740 0.7613 0.8725
No log 15.1429 318 0.7905 0.0532 0.7905 0.8891
No log 15.2381 320 0.8694 0.0319 0.8694 0.9324
No log 15.3333 322 0.8184 0.0985 0.8184 0.9046
No log 15.4286 324 0.7888 0.1232 0.7888 0.8881
No log 15.5238 326 0.7809 0.0110 0.7809 0.8837
No log 15.6190 328 0.8215 0.0179 0.8215 0.9064
No log 15.7143 330 0.8393 0.0200 0.8393 0.9161
No log 15.8095 332 0.8726 0.0007 0.8726 0.9342
No log 15.9048 334 0.8138 0.0947 0.8138 0.9021
No log 16.0 336 0.7901 -0.0163 0.7901 0.8889
No log 16.0952 338 0.7608 0.0863 0.7608 0.8722
No log 16.1905 340 0.7490 -0.1018 0.7490 0.8654
No log 16.2857 342 0.7296 -0.1067 0.7296 0.8542
No log 16.3810 344 0.6939 0.0460 0.6939 0.8330
No log 16.4762 346 0.7108 0.1318 0.7108 0.8431
No log 16.5714 348 0.7024 0.1379 0.7024 0.8381
No log 16.6667 350 0.7183 -0.0591 0.7183 0.8475
No log 16.7619 352 0.8654 0.0406 0.8654 0.9303
No log 16.8571 354 0.8880 -0.0616 0.8880 0.9423
No log 16.9524 356 0.8094 -0.0566 0.8094 0.8997
No log 17.0476 358 0.7899 0.1047 0.7899 0.8887
No log 17.1429 360 0.8498 0.1243 0.8498 0.9218
No log 17.2381 362 0.7622 0.1202 0.7622 0.8731
No log 17.3333 364 0.7935 -0.0620 0.7935 0.8908
No log 17.4286 366 0.8669 0.0433 0.8669 0.9311
No log 17.5238 368 0.8004 -0.0173 0.8004 0.8947
No log 17.6190 370 0.7419 0.1740 0.7419 0.8614
No log 17.7143 372 0.7739 0.1565 0.7739 0.8797
No log 17.8095 374 0.7417 0.1740 0.7417 0.8612
No log 17.9048 376 0.7697 -0.0314 0.7697 0.8773
No log 18.0 378 0.8588 0.0433 0.8588 0.9267
No log 18.0952 380 0.7939 -0.0606 0.7939 0.8910
No log 18.1905 382 0.7194 0.0869 0.7194 0.8482
No log 18.2857 384 0.7050 0.1318 0.7050 0.8396
No log 18.3810 386 0.6870 0.1379 0.6870 0.8288
No log 18.4762 388 0.6716 0.0964 0.6716 0.8195
No log 18.5714 390 0.6687 0.0506 0.6687 0.8177
No log 18.6667 392 0.6671 0.0964 0.6671 0.8168
No log 18.7619 394 0.6736 0.0964 0.6736 0.8207
No log 18.8571 396 0.6807 0.0460 0.6807 0.8250
No log 18.9524 398 0.7021 0.0964 0.7021 0.8379
No log 19.0476 400 0.7438 0.1387 0.7438 0.8624
No log 19.1429 402 0.8299 0.0628 0.8299 0.9110
No log 19.2381 404 0.8528 0.1304 0.8528 0.9235
No log 19.3333 406 0.8496 0.0149 0.8496 0.9218
No log 19.4286 408 0.8519 0.0509 0.8519 0.9230
No log 19.5238 410 0.8354 0.0944 0.8354 0.9140
No log 19.6190 412 0.9122 -0.0261 0.9122 0.9551
No log 19.7143 414 0.8675 -0.1027 0.8675 0.9314
No log 19.8095 416 0.7614 0.0863 0.7614 0.8726
No log 19.9048 418 0.8102 -0.0033 0.8102 0.9001
No log 20.0 420 0.7937 0.0953 0.7937 0.8909
No log 20.0952 422 0.7538 0.0857 0.7538 0.8682
No log 20.1905 424 0.7743 -0.0449 0.7743 0.8799
No log 20.2857 426 0.8055 -0.1331 0.8055 0.8975
No log 20.3810 428 0.7972 0.0814 0.7972 0.8928
No log 20.4762 430 0.9152 0.0913 0.9152 0.9567
No log 20.5714 432 0.9014 0.0913 0.9014 0.9494
No log 20.6667 434 0.8588 -0.1205 0.8588 0.9267
No log 20.7619 436 1.0155 -0.0721 1.0155 1.0077
No log 20.8571 438 0.9717 -0.0464 0.9717 0.9858
No log 20.9524 440 0.8055 -0.1538 0.8055 0.8975
No log 21.0476 442 0.7412 0.1758 0.7412 0.8609
No log 21.1429 444 0.8272 0.0953 0.8272 0.9095
No log 21.2381 446 0.8218 0.0953 0.8218 0.9065
No log 21.3333 448 0.7397 0.1758 0.7397 0.8600
No log 21.4286 450 0.7714 -0.0366 0.7714 0.8783
No log 21.5238 452 0.8458 -0.0761 0.8458 0.9197
No log 21.6190 454 0.8297 -0.0039 0.8297 0.9109
No log 21.7143 456 0.7437 0.0428 0.7437 0.8624
No log 21.8095 458 0.7456 0.0357 0.7456 0.8635
No log 21.9048 460 0.7496 0.0444 0.7496 0.8658
No log 22.0 462 0.7565 0.0460 0.7565 0.8698
No log 22.0952 464 0.7292 0.0869 0.7292 0.8540
No log 22.1905 466 0.7111 0.1311 0.7111 0.8433
No log 22.2857 468 0.6974 0.1379 0.6974 0.8351
No log 22.3810 470 0.6881 0.1318 0.6881 0.8295
No log 22.4762 472 0.6849 0.1379 0.6849 0.8276
No log 22.5714 474 0.6885 0.0964 0.6885 0.8297
No log 22.6667 476 0.6849 0.1444 0.6849 0.8276
No log 22.7619 478 0.6807 0.1444 0.6807 0.8250
No log 22.8571 480 0.6808 0.1444 0.6808 0.8251
No log 22.9524 482 0.6844 0.1444 0.6844 0.8273
No log 23.0476 484 0.7044 0.1318 0.7044 0.8393
No log 23.1429 486 0.7304 0.1318 0.7304 0.8546
No log 23.2381 488 0.7225 0.1318 0.7225 0.8500
No log 23.3333 490 0.7025 0.1379 0.7025 0.8382
No log 23.4286 492 0.7213 -0.1074 0.7213 0.8493
No log 23.5238 494 0.7610 -0.1399 0.7610 0.8724
No log 23.6190 496 0.7275 -0.1074 0.7275 0.8530
No log 23.7143 498 0.7030 0.1379 0.7030 0.8385
0.2557 23.8095 500 0.7493 0.1565 0.7493 0.8656
0.2557 23.9048 502 0.7483 0.1565 0.7483 0.8651
0.2557 24.0 504 0.7154 0.1318 0.7154 0.8458
0.2557 24.0952 506 0.7621 -0.0427 0.7621 0.8730
0.2557 24.1905 508 0.8700 -0.0363 0.8700 0.9327
0.2557 24.2857 510 0.9191 -0.0204 0.9191 0.9587
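Note that the summary metrics come from the final checkpoint (epoch ~24.29, step 510), not the best one: validation loss bottoms out at 0.6671 around epoch 18.67 (step 392). A minimal sketch of what `load_best_model_at_end`-style selection would have kept, using a few (epoch, validation loss) pairs copied from the log above:

```python
# (epoch, validation_loss) pairs copied from a few rows of the log above
log = [
    (0.4762, 0.8234),
    (2.2857, 0.7194),
    (13.0476, 0.7018),
    (18.6667, 0.6671),
    (24.2857, 0.9191),  # final checkpoint, reported in the summary
]

# The checkpoint best-model selection would retain
best_epoch, best_loss = min(log, key=lambda r: r[1])
print(best_epoch, best_loss)  # 18.6667 0.6671
```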

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (Safetensors, F32)

Model tree: MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k4_task3_organization, finetuned from aubmindlab/bert-base-arabertv02.