ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k5_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8791
  • Qwk: 0.1537
  • Mse: 0.8791
  • Rmse: 0.9376
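Qwk here is Cohen's kappa with quadratic weights, a standard agreement metric for ordinal essay scores, and Rmse is simply the square root of Mse (√0.8791 ≈ 0.9376). A minimal pure-Python sketch of both metrics follows; the rating range passed in is an illustrative assumption, since the card does not state the label scale:

```python
import math
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred, min_rating, max_rating):
    """Cohen's kappa with quadratic weights over an ordinal label range."""
    n_ratings = max_rating - min_rating + 1
    n = len(y_true)
    # Observed agreement: normalized confusion matrix.
    observed = [[0.0] * n_ratings for _ in range(n_ratings)]
    for t, p in zip(y_true, y_pred):
        observed[t - min_rating][p - min_rating] += 1.0 / n
    # Expected agreement: outer product of the two marginal distributions.
    hist_t = Counter(t - min_rating for t in y_true)
    hist_p = Counter(p - min_rating for p in y_pred)
    numerator = 0.0
    denominator = 0.0
    for i in range(n_ratings):
        for j in range(n_ratings):
            weight = (i - j) ** 2 / (n_ratings - 1) ** 2
            expected = (hist_t[i] / n) * (hist_p[j] / n)
            numerator += weight * observed[i][j]
            denominator += weight * expected
    return 1.0 - numerator / denominator

def rmse(y_true, y_pred):
    """Root mean squared error between two equal-length score lists."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))
```

Perfect agreement yields a kappa of 1.0, while chance-level agreement yields roughly 0.0, which puts the reported Qwk of 0.1537 only slightly above chance.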

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
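With lr_scheduler_type: linear and no warmup steps listed, the learning rate decays linearly from 2e-05 toward 0 over the planned training run (the behavior of transformers' get_linear_schedule_with_warmup with num_warmup_steps=0). A small sketch of that schedule; the total-step count is illustrative:

```python
def linear_lr(step, total_steps, base_lr=2e-05):
    """Linearly decay the learning rate from base_lr at step 0 to 0 at total_steps."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

# Example: halfway through a 1000-step run the rate has halved.
mid_lr = linear_lr(500, 1000)
```

Note that with this schedule the effective rate depends on the planned horizon (num_epochs: 100), so stopping early, as the results table below suggests happened near epoch 19.6, leaves most of the decay unused.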

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0769 2 3.7293 0.0023 3.7293 1.9311
No log 0.1538 4 1.9289 -0.0076 1.9289 1.3888
No log 0.2308 6 1.1958 0.0317 1.1958 1.0935
No log 0.3077 8 1.0305 -0.0253 1.0305 1.0151
No log 0.3846 10 0.7988 0.0628 0.7988 0.8938
No log 0.4615 12 0.7823 0.0759 0.7823 0.8845
No log 0.5385 14 0.8399 0.0159 0.8399 0.9165
No log 0.6154 16 0.7508 -0.0069 0.7508 0.8665
No log 0.6923 18 0.7558 0.1444 0.7558 0.8694
No log 0.7692 20 1.0817 0.0107 1.0817 1.0400
No log 0.8462 22 1.0462 -0.0526 1.0462 1.0229
No log 0.9231 24 0.8758 -0.0295 0.8758 0.9358
No log 1.0 26 0.9619 0.0016 0.9619 0.9808
No log 1.0769 28 1.5279 0.0380 1.5279 1.2361
No log 1.1538 30 1.4758 0.0648 1.4758 1.2148
No log 1.2308 32 0.9413 -0.0008 0.9413 0.9702
No log 1.3077 34 0.8620 -0.1088 0.8620 0.9284
No log 1.3846 36 0.8875 -0.0778 0.8875 0.9421
No log 1.4615 38 1.1423 0.0129 1.1423 1.0688
No log 1.5385 40 0.9366 -0.0031 0.9366 0.9678
No log 1.6154 42 0.9098 -0.0427 0.9098 0.9538
No log 1.6923 44 0.9416 -0.0620 0.9416 0.9704
No log 1.7692 46 0.8799 -0.0389 0.8799 0.9381
No log 1.8462 48 0.8697 0.0709 0.8697 0.9326
No log 1.9231 50 0.8662 0.1093 0.8662 0.9307
No log 2.0 52 0.8787 0.0257 0.8787 0.9374
No log 2.0769 54 0.8613 0.0257 0.8613 0.9280
No log 2.1538 56 0.8536 0.1050 0.8536 0.9239
No log 2.2308 58 0.8810 0.1539 0.8810 0.9386
No log 2.3077 60 0.9940 0.0722 0.9940 0.9970
No log 2.3846 62 1.1150 0.0499 1.1150 1.0559
No log 2.4615 64 1.1575 0.0569 1.1575 1.0759
No log 2.5385 66 1.1780 0.0980 1.1780 1.0854
No log 2.6154 68 1.0716 -0.0204 1.0716 1.0352
No log 2.6923 70 1.0390 0.0294 1.0390 1.0193
No log 2.7692 72 0.9675 0.0483 0.9675 0.9836
No log 2.8462 74 0.9488 0.0641 0.9488 0.9741
No log 2.9231 76 0.8934 0.1558 0.8934 0.9452
No log 3.0 78 1.0468 0.0895 1.0468 1.0231
No log 3.0769 80 0.9820 0.1452 0.9820 0.9909
No log 3.1538 82 0.9849 -0.0119 0.9849 0.9924
No log 3.2308 84 1.0104 0.0214 1.0104 1.0052
No log 3.3077 86 1.0175 0.1051 1.0175 1.0087
No log 3.3846 88 1.0523 0.0121 1.0523 1.0258
No log 3.4615 90 1.4283 -0.0256 1.4283 1.1951
No log 3.5385 92 1.5192 0.0056 1.5192 1.2326
No log 3.6154 94 1.1066 0.1353 1.1066 1.0519
No log 3.6923 96 0.9244 0.1509 0.9244 0.9615
No log 3.7692 98 0.8835 0.1903 0.8835 0.9400
No log 3.8462 100 0.8695 0.1922 0.8695 0.9324
No log 3.9231 102 0.8253 0.1617 0.8253 0.9085
No log 4.0 104 0.8141 0.0898 0.8141 0.9023
No log 4.0769 106 0.7970 0.1236 0.7970 0.8928
No log 4.1538 108 0.8456 0.1263 0.8456 0.9196
No log 4.2308 110 1.0114 0.0952 1.0114 1.0057
No log 4.3077 112 1.0164 0.0498 1.0164 1.0082
No log 4.3846 114 1.2041 0.1375 1.2041 1.0973
No log 4.4615 116 1.1323 0.1192 1.1323 1.0641
No log 4.5385 118 1.1024 0.0546 1.1024 1.0500
No log 4.6154 120 1.0265 0.0215 1.0265 1.0132
No log 4.6923 122 0.9993 0.0569 0.9993 0.9996
No log 4.7692 124 0.9048 0.1509 0.9048 0.9512
No log 4.8462 126 0.8591 0.1500 0.8591 0.9269
No log 4.9231 128 0.9546 0.1042 0.9546 0.9770
No log 5.0 130 0.9664 0.1363 0.9664 0.9830
No log 5.0769 132 0.8276 0.1259 0.8276 0.9097
No log 5.1538 134 0.9022 0.0562 0.9022 0.9498
No log 5.2308 136 0.8877 -0.0171 0.8877 0.9422
No log 5.3077 138 1.0439 0.0587 1.0439 1.0217
No log 5.3846 140 1.4073 0.0026 1.4073 1.1863
No log 5.4615 142 1.3559 -0.0035 1.3559 1.1644
No log 5.5385 144 1.0281 0.1857 1.0281 1.0140
No log 5.6154 146 1.0045 0.0637 1.0045 1.0022
No log 5.6923 148 0.9339 0.1546 0.9339 0.9664
No log 5.7692 150 0.9151 0.1581 0.9151 0.9566
No log 5.8462 152 0.9776 0.1220 0.9776 0.9887
No log 5.9231 154 0.9424 0.1237 0.9424 0.9708
No log 6.0 156 0.9236 0.1887 0.9236 0.9610
No log 6.0769 158 0.9417 0.0226 0.9417 0.9704
No log 6.1538 160 0.8928 0.0214 0.8928 0.9449
No log 6.2308 162 0.8314 0.0749 0.8314 0.9118
No log 6.3077 164 0.7784 0.2087 0.7784 0.8823
No log 6.3846 166 0.8239 0.1991 0.8239 0.9077
No log 6.4615 168 0.9041 0.1701 0.9041 0.9508
No log 6.5385 170 0.8915 0.2776 0.8915 0.9442
No log 6.6154 172 1.0189 0.0974 1.0189 1.0094
No log 6.6923 174 0.8982 0.2430 0.8982 0.9477
No log 6.7692 176 0.8590 0.1539 0.8590 0.9268
No log 6.8462 178 0.9142 0.1422 0.9142 0.9561
No log 6.9231 180 0.8570 0.1539 0.8570 0.9257
No log 7.0 182 0.9482 0.0635 0.9482 0.9738
No log 7.0769 184 0.9599 0.0703 0.9599 0.9797
No log 7.1538 186 0.8656 0.1960 0.8656 0.9304
No log 7.2308 188 0.8491 0.1983 0.8491 0.9215
No log 7.3077 190 0.8546 0.1575 0.8546 0.9244
No log 7.3846 192 1.0230 0.0458 1.0230 1.0114
No log 7.4615 194 0.9316 0.0606 0.9316 0.9652
No log 7.5385 196 0.9156 0.1648 0.9156 0.9569
No log 7.6154 198 1.0611 0.0175 1.0611 1.0301
No log 7.6923 200 0.9256 0.1495 0.9256 0.9621
No log 7.7692 202 0.9225 0.0970 0.9225 0.9605
No log 7.8462 204 1.0074 0.0501 1.0074 1.0037
No log 7.9231 206 0.9099 0.0353 0.9099 0.9539
No log 8.0 208 0.8813 0.1901 0.8813 0.9388
No log 8.0769 210 1.1097 0.0616 1.1097 1.0534
No log 8.1538 212 1.0432 0.0413 1.0432 1.0214
No log 8.2308 214 0.8909 0.1093 0.8909 0.9439
No log 8.3077 216 0.9256 0.0871 0.9256 0.9621
No log 8.3846 218 0.9412 0.0871 0.9412 0.9702
No log 8.4615 220 0.9377 0.1604 0.9377 0.9683
No log 8.5385 222 1.0075 0.0257 1.0075 1.0038
No log 8.6154 224 0.9296 -0.0373 0.9296 0.9642
No log 8.6923 226 0.8422 0.1604 0.8422 0.9177
No log 8.7692 228 0.9254 0.0615 0.9254 0.9620
No log 8.8462 230 0.9263 0.0623 0.9263 0.9624
No log 8.9231 232 0.8193 0.1942 0.8193 0.9051
No log 9.0 234 0.9192 0.1301 0.9192 0.9587
No log 9.0769 236 1.1442 0.1086 1.1442 1.0697
No log 9.1538 238 1.0570 0.1144 1.0570 1.0281
No log 9.2308 240 0.8766 0.0627 0.8766 0.9363
No log 9.3077 242 0.9282 0.1162 0.9282 0.9634
No log 9.3846 244 0.9955 0.0885 0.9955 0.9977
No log 9.4615 246 0.9188 0.0860 0.9188 0.9585
No log 9.5385 248 0.8442 0.2353 0.8442 0.9188
No log 9.6154 250 0.9129 0.1003 0.9129 0.9555
No log 9.6923 252 0.9203 0.0842 0.9203 0.9593
No log 9.7692 254 0.9013 0.1834 0.9013 0.9494
No log 9.8462 256 0.9910 0.1159 0.9910 0.9955
No log 9.9231 258 0.9811 0.1122 0.9811 0.9905
No log 10.0 260 0.9491 0.0417 0.9491 0.9742
No log 10.0769 262 0.8898 0.1176 0.8898 0.9433
No log 10.1538 264 0.8468 0.0600 0.8468 0.9202
No log 10.2308 266 0.8148 0.0723 0.8148 0.9026
No log 10.3077 268 0.7934 0.0869 0.7934 0.8907
No log 10.3846 270 0.8193 0.1304 0.8193 0.9052
No log 10.4615 272 0.9043 0.0091 0.9043 0.9509
No log 10.5385 274 0.9279 0.0432 0.9279 0.9633
No log 10.6154 276 0.8933 0.0977 0.8933 0.9452
No log 10.6923 278 0.8707 0.2264 0.8707 0.9331
No log 10.7692 280 0.8489 0.1846 0.8489 0.9214
No log 10.8462 282 0.8312 0.1979 0.8312 0.9117
No log 10.9231 284 0.8231 0.1687 0.8231 0.9072
No log 11.0 286 0.8549 0.0154 0.8549 0.9246
No log 11.0769 288 0.8974 0.0973 0.8974 0.9473
No log 11.1538 290 0.9102 0.0716 0.9102 0.9541
No log 11.2308 292 0.8835 0.0774 0.8835 0.9400
No log 11.3077 294 0.8566 0.1006 0.8566 0.9255
No log 11.3846 296 0.8374 0.0152 0.8374 0.9151
No log 11.4615 298 0.8238 -0.0274 0.8238 0.9076
No log 11.5385 300 0.8055 0.1196 0.8055 0.8975
No log 11.6154 302 0.8071 0.0214 0.8071 0.8984
No log 11.6923 304 0.8071 0.0723 0.8071 0.8984
No log 11.7692 306 0.8459 0.2511 0.8459 0.9197
No log 11.8462 308 0.9065 0.1906 0.9065 0.9521
No log 11.9231 310 0.9717 0.0893 0.9717 0.9858
No log 12.0 312 0.9836 0.1432 0.9836 0.9918
No log 12.0769 314 0.9862 0.1724 0.9862 0.9931
No log 12.1538 316 0.9516 0.1789 0.9516 0.9755
No log 12.2308 318 0.9803 0.1259 0.9803 0.9901
No log 12.3077 320 0.8806 0.1203 0.8806 0.9384
No log 12.3846 322 0.8074 0.0741 0.8074 0.8986
No log 12.4615 324 0.7986 0.1899 0.7986 0.8936
No log 12.5385 326 0.7518 0.1565 0.7518 0.8671
No log 12.6154 328 0.7263 0.0814 0.7263 0.8522
No log 12.6923 330 0.7358 0.0814 0.7358 0.8578
No log 12.7692 332 0.7618 0.0338 0.7618 0.8728
No log 12.8462 334 0.7982 0.0289 0.7982 0.8934
No log 12.9231 336 0.8498 0.1050 0.8498 0.9218
No log 13.0 338 0.9392 0.0715 0.9392 0.9691
No log 13.0769 340 0.9637 0.1254 0.9637 0.9817
No log 13.1538 342 0.9301 0.0378 0.9301 0.9644
No log 13.2308 344 0.8684 0.1140 0.8684 0.9319
No log 13.3077 346 0.8479 0.1823 0.8479 0.9208
No log 13.3846 348 0.8921 0.1006 0.8921 0.9445
No log 13.4615 350 0.8429 0.1379 0.8429 0.9181
No log 13.5385 352 0.8229 0.1720 0.8229 0.9071
No log 13.6154 354 0.8511 0.1723 0.8511 0.9225
No log 13.6923 356 0.8734 0.0966 0.8734 0.9345
No log 13.7692 358 0.8764 0.0966 0.8764 0.9362
No log 13.8462 360 0.8717 0.1463 0.8717 0.9336
No log 13.9231 362 0.8614 0.1463 0.8614 0.9281
No log 14.0 364 0.8813 0.1500 0.8813 0.9388
No log 14.0769 366 0.8930 0.1130 0.8930 0.9450
No log 14.1538 368 0.8701 0.1379 0.8701 0.9328
No log 14.2308 370 0.8869 0.0407 0.8869 0.9417
No log 14.3077 372 0.8361 0.0123 0.8361 0.9144
No log 14.3846 374 0.8166 0.1569 0.8166 0.9037
No log 14.4615 376 0.9337 0.0701 0.9337 0.9663
No log 14.5385 378 0.9561 0.0707 0.9561 0.9778
No log 14.6154 380 0.9133 0.0900 0.9133 0.9557
No log 14.6923 382 0.9168 0.1935 0.9168 0.9575
No log 14.7692 384 0.9160 0.2349 0.9160 0.9571
No log 14.8462 386 0.9180 0.1161 0.9180 0.9581
No log 14.9231 388 0.8996 0.0334 0.8996 0.9485
No log 15.0 390 0.8606 -0.0149 0.8606 0.9277
No log 15.0769 392 0.8270 0.0226 0.8270 0.9094
No log 15.1538 394 0.8187 0.1495 0.8187 0.9048
No log 15.2308 396 0.8356 0.0227 0.8356 0.9141
No log 15.3077 398 0.8559 0.1424 0.8559 0.9252
No log 15.3846 400 0.8276 0.1424 0.8276 0.9097
No log 15.4615 402 0.8098 0.1050 0.8098 0.8999
No log 15.5385 404 0.8100 0.1050 0.8100 0.9000
No log 15.6154 406 0.7986 0.0688 0.7986 0.8936
No log 15.6923 408 0.7993 0.1189 0.7993 0.8940
No log 15.7692 410 0.8539 0.2057 0.8539 0.9241
No log 15.8462 412 0.8773 0.2036 0.8773 0.9367
No log 15.9231 414 0.8554 0.1143 0.8554 0.9249
No log 16.0 416 0.9071 0.0920 0.9071 0.9524
No log 16.0769 418 0.9126 0.0884 0.9126 0.9553
No log 16.1538 420 0.8866 0.1006 0.8866 0.9416
No log 16.2308 422 0.8841 0.1623 0.8841 0.9403
No log 16.3077 424 0.9301 0.0903 0.9301 0.9644
No log 16.3846 426 0.9095 0.0890 0.9095 0.9537
No log 16.4615 428 0.8823 0.1538 0.8823 0.9393
No log 16.5385 430 0.8724 0.1923 0.8724 0.9340
No log 16.6154 432 0.8929 0.1538 0.8929 0.9449
No log 16.6923 434 0.9201 0.1538 0.9201 0.9592
No log 16.7692 436 0.9131 0.2229 0.9131 0.9556
No log 16.8462 438 0.9044 0.0161 0.9044 0.9510
No log 16.9231 440 0.8846 0.1456 0.8846 0.9405
No log 17.0 442 0.8901 0.2353 0.8901 0.9434
No log 17.0769 444 0.8926 0.1548 0.8926 0.9448
No log 17.1538 446 0.8618 0.2353 0.8618 0.9283
No log 17.2308 448 0.8197 0.2024 0.8197 0.9054
No log 17.3077 450 0.7801 0.1144 0.7801 0.8832
No log 17.3846 452 0.7882 0.0600 0.7882 0.8878
No log 17.4615 454 0.8079 0.1048 0.8079 0.8988
No log 17.5385 456 0.8411 0.2169 0.8411 0.9171
No log 17.6154 458 0.8718 0.1531 0.8718 0.9337
No log 17.6923 460 0.8937 0.0747 0.8937 0.9454
No log 17.7692 462 0.8468 0.1228 0.8468 0.9202
No log 17.8462 464 0.8010 0.0741 0.8010 0.8950
No log 17.9231 466 0.8272 0.1660 0.8272 0.9095
No log 18.0 468 0.8368 0.1181 0.8368 0.9148
No log 18.0769 470 0.8279 0.1095 0.8279 0.9099
No log 18.1538 472 0.8666 0.0799 0.8666 0.9309
No log 18.2308 474 0.8507 0.0512 0.8507 0.9223
No log 18.3077 476 0.8542 0.0574 0.8542 0.9243
No log 18.3846 478 0.8589 0.1184 0.8589 0.9268
No log 18.4615 480 0.9122 0.0909 0.9122 0.9551
No log 18.5385 482 0.8826 0.1941 0.8826 0.9395
No log 18.6154 484 0.8330 0.0749 0.8330 0.9127
No log 18.6923 486 0.8793 0.0769 0.8793 0.9377
No log 18.7692 488 0.9331 0.1027 0.9331 0.9660
No log 18.8462 490 0.8892 0.1065 0.8892 0.9430
No log 18.9231 492 0.8126 0.0913 0.8126 0.9015
No log 19.0 494 0.7833 0.0680 0.7833 0.8850
No log 19.0769 496 0.8360 0.1633 0.8360 0.9143
No log 19.1538 498 0.8764 0.1842 0.8764 0.9362
0.2503 19.2308 500 0.8837 0.2310 0.8837 0.9401
0.2503 19.3077 502 0.9535 0.0644 0.9535 0.9765
0.2503 19.3846 504 1.0329 0.0701 1.0329 1.0163
0.2503 19.4615 506 0.9569 0.1146 0.9569 0.9782
0.2503 19.5385 508 0.8430 0.1006 0.8430 0.9182
0.2503 19.6154 510 0.8791 0.1537 0.8791 0.9376
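Validation loss is non-monotonic across these evaluations, so the final checkpoint (loss 0.8791 at epoch 19.6154) is not the best one observed; the lowest validation loss in the table is 0.7263 at epoch 12.6154 (step 328). A small helper for picking the best row, shown on a few (epoch, step, validation_loss) tuples copied from the table above:

```python
def best_checkpoint(rows):
    """Return the (epoch, step, val_loss) tuple with the lowest validation loss."""
    return min(rows, key=lambda r: r[2])

# A few rows from the table above.
rows = [
    (0.4615, 12, 0.7823),
    (12.6154, 328, 0.7263),  # lowest validation loss in the full table
    (19.6154, 510, 0.8791),  # final evaluation
]
```

Selecting by validation Qwk instead of loss would be equally reasonable for an ordinal scoring task; this sketch only illustrates the loss-based choice.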

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Downloads last month: 2
Model size: 0.1B params (Safetensors, F32)
Model: MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k5_task3_organization (finetuned from aubmindlab/bert-base-arabertv02)