ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k8_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7354
  • QWK: 0.4908
  • MSE: 0.7354
  • RMSE: 0.8575
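QWK here is Cohen's quadratic weighted kappa, a standard agreement metric for ordinal labels such as essay-organization scores; RMSE is simply the square root of the reported MSE (0.7354 ** 0.5 ≈ 0.8575, consistent with the numbers above). A minimal pure-Python sketch of the QWK computation follows; the number of score classes is an assumption for illustration, since the card does not state the score scale:

```python
# Quadratic weighted kappa (QWK): agreement between true and predicted
# ordinal labels, penalizing disagreements by squared distance.
def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    n = len(y_true)
    # Observed confusion matrix.
    observed = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # Marginal histograms, used to build the expected (chance) agreement.
    hist_true = [sum(row) for row in observed]
    hist_pred = [sum(observed[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic penalty
            num += w * observed[i][j]
            den += w * hist_true[i] * hist_pred[j] / n
    return 1.0 - num / den

# Perfect agreement gives kappa = 1.0; chance-level agreement gives ~0.
print(quadratic_weighted_kappa([0, 1, 2, 2], [0, 1, 2, 2], 3))  # -> 1.0
```

The same value can be obtained with `sklearn.metrics.cohen_kappa_score(y_true, y_pred, weights="quadratic")`.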

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
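With `lr_scheduler_type: linear` and the transformers default of no warmup, the learning rate decays linearly from 2e-05 to zero over the whole run. Judging from the log below, one epoch is 41 optimizer steps (epoch 2.0 falls at step 82), so 100 epochs is roughly 4100 steps; that step count is inferred, not stated on the card. A minimal sketch of the schedule:

```python
# Linear learning-rate schedule: base_lr at step 0, decaying to 0 at total_steps.
# total_steps = 4100 is inferred from the training log (41 steps/epoch x 100 epochs).
def linear_lr(step, base_lr=2e-05, total_steps=4100, warmup_steps=0):
    if step < warmup_steps:
        # Linear warmup phase (unused here, since warmup_steps defaults to 0).
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

print(linear_lr(0))     # -> 2e-05 (full learning rate at the start)
print(linear_lr(4100))  # -> 0.0  (fully decayed at the end)
```

This mirrors what `transformers.get_linear_schedule_with_warmup` computes per step.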

Training results

Training Loss Epoch Step Validation Loss QWK MSE RMSE
No log 0.0488 2 4.3349 -0.0048 4.3349 2.0820
No log 0.0976 4 2.5809 0.0215 2.5809 1.6065
No log 0.1463 6 2.6657 0.0115 2.6657 1.6327
No log 0.1951 8 1.7227 0.0262 1.7227 1.3125
No log 0.2439 10 1.1310 0.2166 1.1310 1.0635
No log 0.2927 12 1.1152 0.1465 1.1152 1.0560
No log 0.3415 14 1.1752 0.2145 1.1752 1.0841
No log 0.3902 16 1.0516 0.2515 1.0516 1.0255
No log 0.4390 18 1.0284 0.2515 1.0284 1.0141
No log 0.4878 20 1.0279 0.2316 1.0279 1.0138
No log 0.5366 22 1.0837 0.2740 1.0837 1.0410
No log 0.5854 24 1.1572 0.2196 1.1572 1.0757
No log 0.6341 26 1.1135 0.2588 1.1135 1.0552
No log 0.6829 28 1.1359 0.2004 1.1359 1.0658
No log 0.7317 30 1.1456 0.1860 1.1456 1.0703
No log 0.7805 32 1.0554 0.2567 1.0554 1.0273
No log 0.8293 34 1.0013 0.1969 1.0013 1.0007
No log 0.8780 36 1.0281 0.2566 1.0281 1.0139
No log 0.9268 38 1.0838 0.1794 1.0838 1.0410
No log 0.9756 40 1.1489 0.1389 1.1489 1.0719
No log 1.0244 42 1.2867 0.0170 1.2867 1.1343
No log 1.0732 44 1.3547 0.0170 1.3547 1.1639
No log 1.1220 46 1.3516 0.0170 1.3516 1.1626
No log 1.1707 48 1.2932 0.0170 1.2932 1.1372
No log 1.2195 50 1.2715 0.0318 1.2715 1.1276
No log 1.2683 52 1.1398 0.1352 1.1398 1.0676
No log 1.3171 54 1.1169 0.2246 1.1169 1.0568
No log 1.3659 56 1.2849 0.1379 1.2849 1.1335
No log 1.4146 58 1.3202 0.0967 1.3202 1.1490
No log 1.4634 60 1.1805 0.2125 1.1805 1.0865
No log 1.5122 62 1.0983 0.2074 1.0983 1.0480
No log 1.5610 64 1.1160 0.2100 1.1160 1.0564
No log 1.6098 66 1.0629 0.1658 1.0629 1.0310
No log 1.6585 68 0.9769 0.2740 0.9769 0.9884
No log 1.7073 70 0.9826 0.2088 0.9826 0.9913
No log 1.7561 72 0.9966 0.1810 0.9966 0.9983
No log 1.8049 74 0.9954 0.3117 0.9954 0.9977
No log 1.8537 76 1.0197 0.2120 1.0197 1.0098
No log 1.9024 78 1.1407 0.1057 1.1407 1.0681
No log 1.9512 80 1.1379 0.1444 1.1379 1.0667
No log 2.0 82 1.1098 0.1564 1.1098 1.0535
No log 2.0488 84 1.0170 0.2366 1.0170 1.0085
No log 2.0976 86 1.0633 0.1653 1.0633 1.0312
No log 2.1463 88 1.1405 0.0931 1.1405 1.0679
No log 2.1951 90 1.1445 0.0811 1.1445 1.0698
No log 2.2439 92 1.0738 0.1873 1.0738 1.0362
No log 2.2927 94 1.0259 0.2161 1.0259 1.0129
No log 2.3415 96 1.0144 0.1576 1.0144 1.0072
No log 2.3902 98 1.0015 0.1997 1.0015 1.0007
No log 2.4390 100 0.9617 0.2643 0.9617 0.9807
No log 2.4878 102 0.9770 0.2024 0.9770 0.9884
No log 2.5366 104 1.0394 0.1961 1.0394 1.0195
No log 2.5854 106 1.0507 0.2979 1.0507 1.0250
No log 2.6341 108 0.9710 0.3363 0.9710 0.9854
No log 2.6829 110 0.9502 0.3785 0.9502 0.9748
No log 2.7317 112 0.9333 0.3921 0.9333 0.9661
No log 2.7805 114 1.0153 0.3663 1.0153 1.0076
No log 2.8293 116 1.0761 0.2772 1.0761 1.0374
No log 2.8780 118 1.0437 0.2926 1.0437 1.0216
No log 2.9268 120 0.9106 0.3902 0.9106 0.9543
No log 2.9756 122 0.8771 0.4169 0.8771 0.9365
No log 3.0244 124 0.9527 0.2815 0.9527 0.9761
No log 3.0732 126 0.9333 0.3405 0.9333 0.9661
No log 3.1220 128 0.8567 0.3785 0.8567 0.9256
No log 3.1707 130 0.7439 0.4762 0.7439 0.8625
No log 3.2195 132 0.7373 0.4778 0.7373 0.8587
No log 3.2683 134 0.7431 0.4898 0.7431 0.8620
No log 3.3171 136 0.8176 0.4546 0.8176 0.9042
No log 3.3659 138 0.8995 0.3902 0.8995 0.9484
No log 3.4146 140 0.8887 0.2983 0.8887 0.9427
No log 3.4634 142 0.8424 0.4516 0.8424 0.9178
No log 3.5122 144 0.8742 0.4060 0.8742 0.9350
No log 3.5610 146 0.8311 0.4742 0.8311 0.9117
No log 3.6098 148 0.7431 0.5093 0.7431 0.8621
No log 3.6585 150 0.6960 0.4908 0.6960 0.8343
No log 3.7073 152 0.6724 0.5345 0.6724 0.8200
No log 3.7561 154 0.6376 0.5933 0.6376 0.7985
No log 3.8049 156 0.6362 0.6566 0.6362 0.7976
No log 3.8537 158 0.6054 0.6259 0.6054 0.7781
No log 3.9024 160 0.7160 0.5538 0.7160 0.8462
No log 3.9512 162 0.7481 0.4889 0.7481 0.8649
No log 4.0 164 0.6048 0.6054 0.6048 0.7777
No log 4.0488 166 0.6293 0.6631 0.6293 0.7933
No log 4.0976 168 0.6504 0.6603 0.6504 0.8065
No log 4.1463 170 0.6238 0.6398 0.6238 0.7898
No log 4.1951 172 0.8342 0.5072 0.8342 0.9134
No log 4.2439 174 0.8730 0.5283 0.8730 0.9343
No log 4.2927 176 0.6958 0.6147 0.6958 0.8342
No log 4.3415 178 0.6910 0.5955 0.6910 0.8313
No log 4.3902 180 0.7097 0.5633 0.7097 0.8424
No log 4.4390 182 0.6663 0.6028 0.6663 0.8163
No log 4.4878 184 0.7034 0.5919 0.7034 0.8387
No log 4.5366 186 0.7151 0.5809 0.7151 0.8456
No log 4.5854 188 0.7136 0.5657 0.7136 0.8448
No log 4.6341 190 0.7304 0.5009 0.7304 0.8546
No log 4.6829 192 0.7306 0.5030 0.7306 0.8548
No log 4.7317 194 0.7061 0.5428 0.7061 0.8403
No log 4.7805 196 0.7246 0.5368 0.7246 0.8512
No log 4.8293 198 0.7024 0.5217 0.7024 0.8381
No log 4.8780 200 0.6905 0.5660 0.6905 0.8309
No log 4.9268 202 0.8601 0.5198 0.8601 0.9274
No log 4.9756 204 0.8286 0.5220 0.8286 0.9103
No log 5.0244 206 0.6991 0.5809 0.6991 0.8361
No log 5.0732 208 0.6895 0.5748 0.6895 0.8303
No log 5.1220 210 0.7407 0.5024 0.7407 0.8606
No log 5.1707 212 0.6951 0.4853 0.6951 0.8337
No log 5.2195 214 0.7077 0.4675 0.7077 0.8413
No log 5.2683 216 0.7962 0.3372 0.7962 0.8923
No log 5.3171 218 0.8223 0.2942 0.8223 0.9068
No log 5.3659 220 0.8330 0.4000 0.8330 0.9127
No log 5.4146 222 0.8681 0.4303 0.8681 0.9317
No log 5.4634 224 0.8319 0.3896 0.8319 0.9121
No log 5.5122 226 0.7792 0.3979 0.7792 0.8827
No log 5.5610 228 0.7947 0.4466 0.7947 0.8914
No log 5.6098 230 0.8397 0.4575 0.8397 0.9163
No log 5.6585 232 0.7714 0.4710 0.7714 0.8783
No log 5.7073 234 0.7194 0.5774 0.7194 0.8482
No log 5.7561 236 0.7617 0.5370 0.7617 0.8727
No log 5.8049 238 0.7756 0.5033 0.7756 0.8807
No log 5.8537 240 0.7951 0.4695 0.7951 0.8917
No log 5.9024 242 0.7855 0.4450 0.7855 0.8863
No log 5.9512 244 0.7753 0.4711 0.7753 0.8805
No log 6.0 246 0.7696 0.5318 0.7696 0.8772
No log 6.0488 248 0.7322 0.6317 0.7322 0.8557
No log 6.0976 250 0.7408 0.5111 0.7408 0.8607
No log 6.1463 252 0.6938 0.6593 0.6938 0.8329
No log 6.1951 254 0.7479 0.5291 0.7479 0.8648
No log 6.2439 256 0.7225 0.5522 0.7225 0.8500
No log 6.2927 258 0.7184 0.6046 0.7184 0.8476
No log 6.3415 260 0.7272 0.6076 0.7272 0.8528
No log 6.3902 262 0.7765 0.5067 0.7765 0.8812
No log 6.4390 264 0.7976 0.4710 0.7976 0.8931
No log 6.4878 266 0.7173 0.5969 0.7173 0.8469
No log 6.5366 268 0.6986 0.5917 0.6986 0.8358
No log 6.5854 270 0.6979 0.5917 0.6979 0.8354
No log 6.6341 272 0.6827 0.6606 0.6827 0.8262
No log 6.6829 274 0.6861 0.6025 0.6861 0.8283
No log 6.7317 276 0.7116 0.5678 0.7116 0.8436
No log 6.7805 278 0.7063 0.6154 0.7063 0.8404
No log 6.8293 280 0.6971 0.5966 0.6971 0.8349
No log 6.8780 282 0.6965 0.5678 0.6965 0.8345
No log 6.9268 284 0.7340 0.5131 0.7340 0.8567
No log 6.9756 286 0.6762 0.5809 0.6762 0.8223
No log 7.0244 288 0.6704 0.5787 0.6704 0.8188
No log 7.0732 290 0.6956 0.5909 0.6956 0.8340
No log 7.1220 292 0.6581 0.6894 0.6581 0.8112
No log 7.1707 294 0.6620 0.6154 0.6620 0.8136
No log 7.2195 296 0.6663 0.5977 0.6663 0.8162
No log 7.2683 298 0.6947 0.6154 0.6947 0.8335
No log 7.3171 300 0.7129 0.5195 0.7129 0.8443
No log 7.3659 302 0.6629 0.6360 0.6629 0.8142
No log 7.4146 304 0.6295 0.6262 0.6295 0.7934
No log 7.4634 306 0.6163 0.6262 0.6163 0.7850
No log 7.5122 308 0.6211 0.6886 0.6211 0.7881
No log 7.5610 310 0.6172 0.6370 0.6172 0.7856
No log 7.6098 312 0.6147 0.6370 0.6147 0.7840
No log 7.6585 314 0.6030 0.6370 0.6030 0.7765
No log 7.7073 316 0.6252 0.6228 0.6252 0.7907
No log 7.7561 318 0.6162 0.6241 0.6162 0.7850
No log 7.8049 320 0.6293 0.5847 0.6293 0.7933
No log 7.8537 322 0.7186 0.5455 0.7186 0.8477
No log 7.9024 324 0.7024 0.5466 0.7024 0.8381
No log 7.9512 326 0.6044 0.6706 0.6044 0.7774
No log 8.0 328 0.6468 0.6194 0.6468 0.8042
No log 8.0488 330 0.6339 0.6516 0.6339 0.7962
No log 8.0976 332 0.6003 0.7223 0.6003 0.7748
No log 8.1463 334 0.7357 0.5106 0.7357 0.8578
No log 8.1951 336 0.7361 0.5211 0.7361 0.8579
No log 8.2439 338 0.6117 0.6798 0.6117 0.7821
No log 8.2927 340 0.6023 0.6795 0.6023 0.7761
No log 8.3415 342 0.6263 0.6641 0.6263 0.7914
No log 8.3902 344 0.6248 0.6805 0.6248 0.7905
No log 8.4390 346 0.6458 0.6719 0.6458 0.8036
No log 8.4878 348 0.6320 0.6919 0.6320 0.7950
No log 8.5366 350 0.6031 0.6528 0.6031 0.7766
No log 8.5854 352 0.5733 0.6579 0.5733 0.7571
No log 8.6341 354 0.5696 0.6833 0.5696 0.7548
No log 8.6829 356 0.5912 0.6623 0.5912 0.7689
No log 8.7317 358 0.5913 0.6441 0.5913 0.7690
No log 8.7805 360 0.5675 0.6724 0.5675 0.7533
No log 8.8293 362 0.6037 0.6039 0.6037 0.7770
No log 8.8780 364 0.6025 0.6466 0.6025 0.7762
No log 8.9268 366 0.5872 0.6643 0.5872 0.7663
No log 8.9756 368 0.5684 0.6796 0.5684 0.7540
No log 9.0244 370 0.5667 0.6673 0.5667 0.7528
No log 9.0732 372 0.5739 0.6768 0.5739 0.7576
No log 9.1220 374 0.5893 0.6325 0.5893 0.7677
No log 9.1707 376 0.5830 0.6370 0.5830 0.7635
No log 9.2195 378 0.5885 0.6370 0.5885 0.7672
No log 9.2683 380 0.6227 0.6209 0.6227 0.7891
No log 9.3171 382 0.7435 0.5330 0.7435 0.8623
No log 9.3659 384 0.7506 0.5439 0.7506 0.8664
No log 9.4146 386 0.6422 0.6395 0.6422 0.8014
No log 9.4634 388 0.5825 0.6623 0.5825 0.7632
No log 9.5122 390 0.5663 0.6888 0.5663 0.7526
No log 9.5610 392 0.5781 0.6254 0.5781 0.7603
No log 9.6098 394 0.6146 0.6349 0.6146 0.7840
No log 9.6585 396 0.7294 0.5358 0.7294 0.8540
No log 9.7073 398 0.7579 0.5319 0.7579 0.8706
No log 9.7561 400 0.6666 0.6132 0.6666 0.8165
No log 9.8049 402 0.6095 0.5570 0.6095 0.7807
No log 9.8537 404 0.6277 0.5210 0.6277 0.7923
No log 9.9024 406 0.6325 0.5332 0.6325 0.7953
No log 9.9512 408 0.6460 0.5522 0.6460 0.8037
No log 10.0 410 0.6623 0.5949 0.6623 0.8138
No log 10.0488 412 0.6845 0.5572 0.6845 0.8273
No log 10.0976 414 0.6575 0.5823 0.6575 0.8108
No log 10.1463 416 0.6285 0.5051 0.6285 0.7928
No log 10.1951 418 0.6200 0.5317 0.6200 0.7874
No log 10.2439 420 0.6286 0.5153 0.6286 0.7928
No log 10.2927 422 0.7011 0.5433 0.7011 0.8373
No log 10.3415 424 0.7608 0.5463 0.7608 0.8722
No log 10.3902 426 0.7432 0.5372 0.7432 0.8621
No log 10.4390 428 0.6597 0.5823 0.6597 0.8122
No log 10.4878 430 0.6210 0.6232 0.6210 0.7880
No log 10.5366 432 0.6104 0.5596 0.6104 0.7813
No log 10.5854 434 0.6091 0.5596 0.6091 0.7805
No log 10.6341 436 0.5970 0.6370 0.5970 0.7726
No log 10.6829 438 0.6311 0.5975 0.6311 0.7944
No log 10.7317 440 0.6427 0.5975 0.6427 0.8017
No log 10.7805 442 0.6195 0.6871 0.6195 0.7871
No log 10.8293 444 0.6438 0.5093 0.6438 0.8024
No log 10.8780 446 0.7109 0.4771 0.7109 0.8431
No log 10.9268 448 0.7017 0.4771 0.7017 0.8377
No log 10.9756 450 0.6906 0.5346 0.6906 0.8310
No log 11.0244 452 0.6887 0.5210 0.6887 0.8299
No log 11.0732 454 0.6986 0.5361 0.6986 0.8358
No log 11.1220 456 0.7258 0.5292 0.7258 0.8519
No log 11.1707 458 0.6831 0.5740 0.6831 0.8265
No log 11.2195 460 0.6386 0.6993 0.6386 0.7992
No log 11.2683 462 0.6055 0.6950 0.6055 0.7781
No log 11.3171 464 0.6046 0.6805 0.6046 0.7776
No log 11.3659 466 0.6171 0.6327 0.6171 0.7856
No log 11.4146 468 0.6299 0.6118 0.6299 0.7937
No log 11.4634 470 0.6347 0.6118 0.6347 0.7967
No log 11.5122 472 0.6669 0.4983 0.6669 0.8166
No log 11.5610 474 0.7026 0.4883 0.7026 0.8382
No log 11.6098 476 0.6424 0.5441 0.6424 0.8015
No log 11.6585 478 0.6126 0.7019 0.6126 0.7827
No log 11.7073 480 0.6246 0.6983 0.6246 0.7903
No log 11.7561 482 0.6026 0.6921 0.6026 0.7762
No log 11.8049 484 0.5805 0.6875 0.5805 0.7619
No log 11.8537 486 0.5876 0.6627 0.5876 0.7666
No log 11.9024 488 0.6006 0.6627 0.6006 0.7750
No log 11.9512 490 0.6083 0.6627 0.6083 0.7799
No log 12.0 492 0.6135 0.6488 0.6135 0.7833
No log 12.0488 494 0.6114 0.6602 0.6114 0.7819
No log 12.0976 496 0.6031 0.6627 0.6031 0.7766
No log 12.1463 498 0.6064 0.6564 0.6064 0.7787
0.3734 12.1951 500 0.6048 0.6737 0.6048 0.7777
0.3734 12.2439 502 0.6555 0.5548 0.6555 0.8096
0.3734 12.2927 504 0.6544 0.5548 0.6544 0.8089
0.3734 12.3415 506 0.6129 0.6903 0.6129 0.7829
0.3734 12.3902 508 0.6269 0.6341 0.6269 0.7917
0.3734 12.4390 510 0.6537 0.5710 0.6537 0.8085
0.3734 12.4878 512 0.6403 0.6177 0.6403 0.8002
0.3734 12.5366 514 0.6648 0.5674 0.6648 0.8154
0.3734 12.5854 516 0.7354 0.4908 0.7354 0.8575
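Note that the final row above is the last evaluation, not the best one: validation loss bottoms out around 0.5663 (QWK 0.6888) at step 390 before drifting back up to the reported 0.7354. If intermediate checkpoints were saved, the strongest one could be selected by scanning the log, sketched here with a few rows sampled from the table above:

```python
# (step, validation_loss) pairs sampled from the training log above.
log = [
    (390, 0.5663),
    (392, 0.5781),
    (500, 0.6048),
    (514, 0.6648),
    (516, 0.7354),  # final checkpoint, the one reported on the card
]

# Pick the checkpoint with the lowest validation loss.
best_step, best_loss = min(log, key=lambda row: row[1])
print(best_step, best_loss)  # -> 390 0.5663
```

In the Trainer API the same effect comes from `load_best_model_at_end=True` with `metric_for_best_model="eval_loss"`.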

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

0.1B parameters (safetensors, F32)

Model tree for MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k8_task5_organization

Fine-tuned from aubmindlab/bert-base-arabertv02.