ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k6_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the dataset is not documented in this card). It achieves the following results on the evaluation set:

  • Loss: 0.7132
  • Qwk (quadratic weighted kappa): 0.0524
  • Mse (mean squared error): 0.7132
  • Rmse (root mean squared error): 0.8445
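Note that Rmse is simply the square root of Mse, and Qwk measures ordinal agreement between predicted and gold scores. A minimal sketch of how these metrics can be computed from a regression head's outputs (the arrays below are hypothetical, not from this model's evaluation set):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score

# Hypothetical gold scores and raw regression outputs, for illustration only.
y_true = np.array([0, 1, 2, 1, 0, 2])
y_pred_raw = np.array([0.2, 1.4, 1.6, 0.9, 0.1, 2.3])

mse = float(np.mean((y_pred_raw - y_true) ** 2))
rmse = mse ** 0.5  # Rmse is the square root of Mse
# Qwk compares discrete labels, so raw outputs are rounded first.
qwk = cohen_kappa_score(y_true, np.rint(y_pred_raw).astype(int),
                        weights="quadratic")
print(round(mse, 4), round(rmse, 4), round(qwk, 4))
```

The evaluation Loss equalling Mse above suggests the model was trained with a mean-squared-error objective on a single-score regression head.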

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
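The hyperparameters above can be expressed as Hugging Face `TrainingArguments`; this is a sketch, and the `output_dir` path is an assumption (it is not stated in this card):

```python
from transformers import TrainingArguments

# Mirrors the hyperparameter list above; output_dir is hypothetical.
args = TrainingArguments(
    output_dir="./arabert_task3_organization",  # assumed path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```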

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.1333 2 3.6362 -0.0047 3.6362 1.9069
No log 0.2667 4 2.1430 0.0247 2.1430 1.4639
No log 0.4 6 1.7879 0.0398 1.7879 1.3371
No log 0.5333 8 1.1515 -0.0234 1.1515 1.0731
No log 0.6667 10 1.1172 0.0610 1.1172 1.0570
No log 0.8 12 0.9793 0.1153 0.9793 0.9896
No log 0.9333 14 1.0696 -0.0207 1.0696 1.0342
No log 1.0667 16 0.9217 -0.0617 0.9217 0.9601
No log 1.2 18 0.8039 -0.0331 0.8039 0.8966
No log 1.3333 20 0.8656 -0.0504 0.8656 0.9304
No log 1.4667 22 0.8276 -0.0842 0.8276 0.9097
No log 1.6 24 0.8536 -0.1265 0.8536 0.9239
No log 1.7333 26 1.0474 0.0338 1.0474 1.0234
No log 1.8667 28 1.2418 0.0 1.2418 1.1144
No log 2.0 30 1.3791 0.0 1.3791 1.1744
No log 2.1333 32 1.2663 -0.0247 1.2663 1.1253
No log 2.2667 34 1.2035 0.0083 1.2035 1.0971
No log 2.4 36 0.8726 -0.0878 0.8726 0.9341
No log 2.5333 38 0.7571 -0.1220 0.7571 0.8701
No log 2.6667 40 0.7289 -0.1216 0.7289 0.8538
No log 2.8 42 0.8517 -0.1249 0.8517 0.9229
No log 2.9333 44 1.1466 -0.0982 1.1466 1.0708
No log 3.0667 46 1.3335 0.0083 1.3335 1.1548
No log 3.2 48 1.4895 -0.0247 1.4895 1.2205
No log 3.3333 50 1.4622 -0.0247 1.4622 1.2092
No log 3.4667 52 1.3291 0.0 1.3291 1.1529
No log 3.6 54 1.2690 0.0 1.2690 1.1265
No log 3.7333 56 1.1171 -0.0234 1.1171 1.0569
No log 3.8667 58 0.9108 -0.0200 0.9108 0.9543
No log 4.0 60 0.8024 -0.0790 0.8024 0.8958
No log 4.1333 62 0.7800 -0.0739 0.7800 0.8832
No log 4.2667 64 0.7806 -0.0739 0.7806 0.8835
No log 4.4 66 0.8013 -0.0739 0.8013 0.8952
No log 4.5333 68 0.7833 -0.0739 0.7833 0.8851
No log 4.6667 70 0.8425 -0.1244 0.8425 0.9179
No log 4.8 72 1.0115 0.0111 1.0115 1.0057
No log 4.9333 74 1.1528 0.0493 1.1528 1.0737
No log 5.0667 76 1.2982 -0.0193 1.2982 1.1394
No log 5.2 78 1.0449 -0.0049 1.0449 1.0222
No log 5.3333 80 0.7543 -0.1241 0.7543 0.8685
No log 5.4667 82 0.7242 0.0555 0.7242 0.8510
No log 5.6 84 0.7500 -0.0551 0.7500 0.8660
No log 5.7333 86 0.7645 -0.1168 0.7645 0.8743
No log 5.8667 88 0.9029 0.0642 0.9029 0.9502
No log 6.0 90 0.8932 -0.0008 0.8932 0.9451
No log 6.1333 92 0.8333 0.0191 0.8333 0.9129
No log 6.2667 94 0.8175 -0.0264 0.8175 0.9042
No log 6.4 96 0.8195 -0.0096 0.8195 0.9053
No log 6.5333 98 0.8447 0.0095 0.8447 0.9191
No log 6.6667 100 0.9162 0.0068 0.9162 0.9572
No log 6.8 102 0.9162 0.0129 0.9162 0.9572
No log 6.9333 104 0.9472 -0.0672 0.9472 0.9733
No log 7.0667 106 0.9931 -0.1194 0.9931 0.9965
No log 7.2 108 0.9957 -0.0059 0.9957 0.9978
No log 7.3333 110 0.9079 0.0040 0.9079 0.9529
No log 7.4667 112 0.8916 -0.0778 0.8916 0.9442
No log 7.6 114 0.8779 -0.1106 0.8779 0.9370
No log 7.7333 116 0.8468 -0.0385 0.8468 0.9202
No log 7.8667 118 0.7678 -0.0662 0.7678 0.8762
No log 8.0 120 0.8476 -0.0371 0.8476 0.9206
No log 8.1333 122 0.9752 0.0545 0.9752 0.9875
No log 8.2667 124 0.8632 -0.0425 0.8632 0.9291
No log 8.4 126 0.7734 0.0 0.7734 0.8794
No log 8.5333 128 0.9077 -0.0408 0.9077 0.9527
No log 8.6667 130 0.9463 -0.1054 0.9463 0.9728
No log 8.8 132 0.8201 0.0260 0.8201 0.9056
No log 8.9333 134 0.7494 0.0556 0.7494 0.8657
No log 9.0667 136 0.7499 0.0556 0.7499 0.8659
No log 9.2 138 0.7969 -0.0329 0.7969 0.8927
No log 9.3333 140 0.8579 0.1034 0.8579 0.9262
No log 9.4667 142 0.8497 0.0268 0.8497 0.9218
No log 9.6 144 0.7930 0.0179 0.7930 0.8905
No log 9.7333 146 0.8042 0.0172 0.8042 0.8968
No log 9.8667 148 0.8318 -0.0116 0.8318 0.9120
No log 10.0 150 0.8437 0.0315 0.8437 0.9185
No log 10.1333 152 0.8501 0.1123 0.8501 0.9220
No log 10.2667 154 0.8128 -0.0204 0.8128 0.9015
No log 10.4 156 0.7722 -0.0428 0.7722 0.8788
No log 10.5333 158 0.7547 0.0602 0.7547 0.8687
No log 10.6667 160 0.7481 0.0524 0.7481 0.8649
No log 10.8 162 0.7422 0.0555 0.7422 0.8615
No log 10.9333 164 0.7053 0.0 0.7053 0.8398
No log 11.0667 166 0.7026 -0.0033 0.7026 0.8382
No log 11.2 168 0.7491 -0.0690 0.7491 0.8655
No log 11.3333 170 0.7079 0.0436 0.7079 0.8414
No log 11.4667 172 0.7634 -0.0406 0.7634 0.8737
No log 11.6 174 0.7723 0.0064 0.7723 0.8788
No log 11.7333 176 0.7181 0.0 0.7181 0.8474
No log 11.8667 178 0.7128 0.0479 0.7128 0.8442
No log 12.0 180 0.9118 0.0362 0.9118 0.9549
No log 12.1333 182 0.9417 -0.0076 0.9417 0.9704
No log 12.2667 184 0.7456 0.0183 0.7456 0.8635
No log 12.4 186 0.7023 0.0 0.7023 0.8381
No log 12.5333 188 0.7434 -0.0033 0.7434 0.8622
No log 12.6667 190 0.7635 -0.1074 0.7635 0.8738
No log 12.8 192 0.7246 0.0587 0.7246 0.8512
No log 12.9333 194 0.7137 0.1552 0.7137 0.8448
No log 13.0667 196 0.7318 0.2024 0.7318 0.8555
No log 13.2 198 0.8145 0.0240 0.8145 0.9025
No log 13.3333 200 0.9512 -0.0583 0.9512 0.9753
No log 13.4667 202 0.9840 -0.0253 0.9840 0.9920
No log 13.6 204 0.9454 -0.0334 0.9454 0.9723
No log 13.7333 206 0.8482 0.0260 0.8482 0.9210
No log 13.8667 208 0.8163 -0.0958 0.8163 0.9035
No log 14.0 210 0.7736 -0.0520 0.7736 0.8795
No log 14.1333 212 0.7414 -0.0520 0.7414 0.8611
No log 14.2667 214 0.7277 0.0 0.7277 0.8530
No log 14.4 216 0.7066 0.0 0.7066 0.8406
No log 14.5333 218 0.7059 0.0 0.7059 0.8402
No log 14.6667 220 0.7155 0.0 0.7155 0.8459
No log 14.8 222 0.7294 0.0 0.7294 0.8540
No log 14.9333 224 0.7345 0.0602 0.7345 0.8570
No log 15.0667 226 0.7595 0.0602 0.7595 0.8715
No log 15.2 228 0.7805 0.1135 0.7805 0.8835
No log 15.3333 230 0.8143 0.0155 0.8143 0.9024
No log 15.4667 232 0.8620 0.0733 0.8620 0.9285
No log 15.6 234 0.8287 0.0196 0.8287 0.9103
No log 15.7333 236 0.7613 0.0615 0.7613 0.8725
No log 15.8667 238 0.7426 0.0973 0.7426 0.8617
No log 16.0 240 0.7654 -0.0271 0.7654 0.8749
No log 16.1333 242 0.8348 0.1124 0.8348 0.9137
No log 16.2667 244 0.8389 0.1124 0.8389 0.9159
No log 16.4 246 0.8133 0.1991 0.8133 0.9018
No log 16.5333 248 0.7882 0.0652 0.7882 0.8878
No log 16.6667 250 0.7848 0.0615 0.7848 0.8859
No log 16.8 252 0.7779 0.0602 0.7779 0.8820
No log 16.9333 254 0.7659 0.0602 0.7659 0.8751
No log 17.0667 256 0.7473 0.0033 0.7473 0.8645
No log 17.2 258 0.7207 0.0602 0.7207 0.8490
No log 17.3333 260 0.7297 0.0602 0.7297 0.8542
No log 17.4667 262 0.7748 0.1135 0.7748 0.8802
No log 17.6 264 0.7918 0.0640 0.7918 0.8898
No log 17.7333 266 0.7758 0.0628 0.7758 0.8808
No log 17.8667 268 0.7584 0.0628 0.7584 0.8708
No log 18.0 270 0.7537 0.0602 0.7537 0.8682
No log 18.1333 272 0.7890 0.0628 0.7890 0.8882
No log 18.2667 274 0.8701 0.1122 0.8701 0.9328
No log 18.4 276 0.9303 0.0789 0.9303 0.9645
No log 18.5333 278 0.8772 0.1119 0.8772 0.9366
No log 18.6667 280 0.7947 0.0598 0.7947 0.8915
No log 18.8 282 0.7571 0.0556 0.7571 0.8701
No log 18.9333 284 0.7636 0.0874 0.7636 0.8738
No log 19.0667 286 0.7855 0.1080 0.7855 0.8863
No log 19.2 288 0.8392 0.0315 0.8392 0.9161
No log 19.3333 290 0.8153 0.1130 0.8153 0.9030
No log 19.4667 292 0.7602 0.1137 0.7602 0.8719
No log 19.6 294 0.7072 0.0524 0.7072 0.8409
No log 19.7333 296 0.6931 0.0479 0.6931 0.8325
No log 19.8667 298 0.7086 0.1081 0.7086 0.8418
No log 20.0 300 0.7725 0.0640 0.7725 0.8789
No log 20.1333 302 0.8492 0.1118 0.8492 0.9215
No log 20.2667 304 0.8634 0.0726 0.8634 0.9292
No log 20.4 306 0.7879 0.0640 0.7879 0.8876
No log 20.5333 308 0.6949 0.0 0.6949 0.8336
No log 20.6667 310 0.6942 0.1021 0.6942 0.8332
No log 20.8 312 0.6954 0.1082 0.6954 0.8339
No log 20.9333 314 0.7001 0.0555 0.7001 0.8367
No log 21.0667 316 0.7141 0.0555 0.7141 0.8450
No log 21.2 318 0.7530 0.0587 0.7530 0.8678
No log 21.3333 320 0.8026 0.0122 0.8026 0.8959
No log 21.4667 322 0.8300 0.0279 0.8300 0.9110
No log 21.6 324 0.8699 -0.0015 0.8699 0.9327
No log 21.7333 326 0.8756 -0.0039 0.8756 0.9357
No log 21.8667 328 0.8321 0.0683 0.8321 0.9122
No log 22.0 330 0.7884 0.1974 0.7884 0.8879
No log 22.1333 332 0.7685 0.1538 0.7685 0.8766
No log 22.2667 334 0.7619 0.0064 0.7619 0.8729
No log 22.4 336 0.7756 0.0064 0.7756 0.8807
No log 22.5333 338 0.8027 0.0094 0.8027 0.8959
No log 22.6667 340 0.7848 0.0094 0.7848 0.8859
No log 22.8 342 0.7488 0.0587 0.7488 0.8653
No log 22.9333 344 0.7273 0.1081 0.7273 0.8528
No log 23.0667 346 0.7312 0.1486 0.7312 0.8551
No log 23.2 348 0.7390 0.1486 0.7390 0.8597
No log 23.3333 350 0.7407 0.1081 0.7407 0.8607
No log 23.4667 352 0.7514 0.0587 0.7514 0.8668
No log 23.6 354 0.7776 -0.0451 0.7776 0.8818
No log 23.7333 356 0.7595 0.0587 0.7595 0.8715
No log 23.8667 358 0.7085 0.0 0.7085 0.8417
No log 24.0 360 0.6922 0.0479 0.6922 0.8320
No log 24.1333 362 0.6900 0.0479 0.6900 0.8307
No log 24.2667 364 0.6930 0.0 0.6930 0.8325
No log 24.4 366 0.7262 0.0 0.7262 0.8521
No log 24.5333 368 0.8251 -0.1329 0.8251 0.9084
No log 24.6667 370 0.8833 -0.0279 0.8833 0.9399
No log 24.8 372 0.8903 -0.0279 0.8903 0.9435
No log 24.9333 374 0.8476 0.0364 0.8476 0.9206
No log 25.0667 376 0.8351 0.0364 0.8351 0.9138
No log 25.2 378 0.8347 -0.0015 0.8347 0.9136
No log 25.3333 380 0.8582 0.0050 0.8582 0.9264
No log 25.4667 382 0.8728 -0.0039 0.8728 0.9342
No log 25.6 384 0.8233 0.0240 0.8233 0.9074
No log 25.7333 386 0.7580 0.0064 0.7580 0.8706
No log 25.8667 388 0.7137 0.0555 0.7137 0.8448
No log 26.0 390 0.7053 0.0 0.7053 0.8398
No log 26.1333 392 0.7102 -0.0033 0.7102 0.8427
No log 26.2667 394 0.7041 0.0 0.7041 0.8391
No log 26.4 396 0.7001 0.0 0.7001 0.8367
No log 26.5333 398 0.6985 0.0 0.6985 0.8358
No log 26.6667 400 0.7005 0.0 0.7005 0.8370
No log 26.8 402 0.7183 0.0 0.7183 0.8475
No log 26.9333 404 0.7373 0.0 0.7373 0.8587
No log 27.0667 406 0.7457 0.0587 0.7457 0.8636
No log 27.2 408 0.7421 0.0602 0.7421 0.8614
No log 27.3333 410 0.7342 0.0587 0.7342 0.8569
No log 27.4667 412 0.7430 0.1139 0.7430 0.8620
No log 27.6 414 0.7357 0.0 0.7357 0.8577
No log 27.7333 416 0.7531 0.0064 0.7531 0.8678
No log 27.8667 418 0.7366 -0.0520 0.7366 0.8583
No log 28.0 420 0.7232 0.0 0.7232 0.8504
No log 28.1333 422 0.7347 0.0 0.7347 0.8571
No log 28.2667 424 0.7610 0.0628 0.7610 0.8724
No log 28.4 426 0.8340 0.0240 0.8340 0.9132
No log 28.5333 428 0.8523 0.1126 0.8523 0.9232
No log 28.6667 430 0.8186 0.0196 0.8186 0.9048
No log 28.8 432 0.7613 0.0 0.7613 0.8725
No log 28.9333 434 0.7270 0.0 0.7270 0.8527
No log 29.0667 436 0.7230 0.0 0.7230 0.8503
No log 29.2 438 0.7455 0.0587 0.7455 0.8634
No log 29.3333 440 0.8033 0.0652 0.8033 0.8963
No log 29.4667 442 0.8292 0.0196 0.8292 0.9106
No log 29.6 444 0.8117 0.0652 0.8117 0.9010
No log 29.7333 446 0.7636 0.0 0.7636 0.8738
No log 29.8667 448 0.7253 0.0 0.7253 0.8516
No log 30.0 450 0.7238 0.0524 0.7238 0.8508
No log 30.1333 452 0.7308 0.0 0.7308 0.8549
No log 30.2667 454 0.7683 0.1139 0.7683 0.8765
No log 30.4 456 0.8081 0.1133 0.8081 0.8990
No log 30.5333 458 0.8485 -0.1329 0.8485 0.9211
No log 30.6667 460 0.8685 -0.2071 0.8685 0.9319
No log 30.8 462 0.8827 -0.1145 0.8827 0.9395
No log 30.9333 464 0.8327 -0.0144 0.8327 0.9125
No log 31.0667 466 0.7937 0.1537 0.7937 0.8909
No log 31.2 468 0.7811 0.1562 0.7811 0.8838
No log 31.3333 470 0.7642 0.0970 0.7642 0.8742
No log 31.4667 472 0.7457 0.0970 0.7457 0.8635
No log 31.6 474 0.7360 0.0524 0.7360 0.8579
No log 31.7333 476 0.7230 0.0524 0.7230 0.8503
No log 31.8667 478 0.7167 0.1082 0.7167 0.8466
No log 32.0 480 0.7188 0.0555 0.7188 0.8478
No log 32.1333 482 0.7300 0.0555 0.7300 0.8544
No log 32.2667 484 0.7498 0.1141 0.7498 0.8659
No log 32.4 486 0.7557 0.1137 0.7557 0.8693
No log 32.5333 488 0.7774 0.1135 0.7774 0.8817
No log 32.6667 490 0.7775 0.0640 0.7775 0.8818
No log 32.8 492 0.7658 0.0628 0.7658 0.8751
No log 32.9333 494 0.7616 0.0094 0.7616 0.8727
No log 33.0667 496 0.7573 -0.0520 0.7573 0.8702
No log 33.2 498 0.7517 0.0 0.7517 0.8670
0.3056 33.3333 500 0.7373 0.0555 0.7373 0.8586
0.3056 33.4667 502 0.7326 0.0555 0.7326 0.8559
0.3056 33.6 504 0.7225 0.0 0.7225 0.8500
0.3056 33.7333 506 0.7113 0.0555 0.7113 0.8434
0.3056 33.8667 508 0.7212 0.0 0.7212 0.8492
0.3056 34.0 510 0.7132 0.0524 0.7132 0.8445
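The checkpoint can be loaded for inference with the standard `transformers` auto classes. This is a sketch: the repository id is taken from the model-tree entry below, the single-logit regression head is inferred from the MSE/RMSE metrics reported above, and the input text is a placeholder. Running it requires network access to the Hugging Face Hub.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Repo id from this card; the regression-style scoring head is an assumption.
repo = "MayBashendy/ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k6_task3_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)

# Placeholder Arabic essay text.
inputs = tokenizer("نص المقال هنا", return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(score)
```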

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
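To reproduce the environment, the reported versions can be pinned at install time; the CUDA 11.8 index URL for the PyTorch wheel is the standard one and is an assumption about how the original environment was built:

```shell
pip install "transformers==4.44.2" "datasets==2.21.0" "tokenizers==0.19.1"
pip install "torch==2.4.0" --index-url https://download.pytorch.org/whl/cu118
```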
Model tree: MayBashendy/ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k6_task3_organization, fine-tuned from aubmindlab/bert-base-arabertv02 (one of 4019 fine-tunes of the base model).