ArabicNewSplits8_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k10_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7669
  • Qwk: 0.5949
  • Mse: 0.7669
  • Rmse: 0.8757
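Note that the reported Loss equals the Mse, suggesting an MSE training objective. The three evaluation metrics can be reproduced from label/prediction pairs with a short sketch; the 0–2 label range below is illustrative only, since the actual score scale for the organization trait is not documented:

```python
import numpy as np

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic weighted kappa (QWK) for ordinal labels 0..n_classes-1."""
    O = np.zeros((n_classes, n_classes))
    for t, p in zip(y_true, y_pred):
        O[t, p] += 1  # observed confusion matrix
    idx = np.arange(n_classes)
    w = (idx[:, None] - idx[None, :]) ** 2 / (n_classes - 1) ** 2  # quadratic penalty
    E = np.outer(O.sum(axis=1), O.sum(axis=0)) / O.sum()  # expected under independence
    return 1.0 - (w * O).sum() / (w * E).sum()

y_true = [0, 1, 2, 2]  # illustrative gold scores
y_pred = [0, 1, 1, 2]  # illustrative model outputs
qwk = quadratic_weighted_kappa(y_true, y_pred, n_classes=3)
mse = np.mean((np.array(y_true) - np.array(y_pred)) ** 2)
rmse = np.sqrt(mse)
```

`sklearn.metrics.cohen_kappa_score(..., weights="quadratic")` computes the same QWK value.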

Model description

More information needed

Intended uses & limitations

More information needed
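Since no usage guidance is given, here is a minimal inference sketch. It assumes the checkpoint loads as a standard `AutoModelForSequenceClassification` head; whether the head is regression (single logit = predicted score) or classification (argmax over score bins) is an assumption to verify against the model config:

```python
def score_essay(
    text,
    model_id="MayBashendy/ArabicNewSplits8_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k10_task1_organization",
):
    """Return the raw logits for one essay (assumed scoring head; see note above)."""
    # Lazy imports so the helper can be defined without torch/transformers installed.
    import torch
    from transformers import AutoTokenizer, AutoModelForSequenceClassification

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForSequenceClassification.from_pretrained(model_id)
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits
    # Regression head: the single logit is the score; classification head:
    # take logits.argmax(-1) instead (assumption: check model.config).
    return logits.squeeze().tolist()
```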

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
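The step/epoch columns in the results table imply roughly 51 optimizer steps per epoch (step 102 at epoch 2.0), i.e. about 5,100 scheduled steps over 100 epochs; these counts are inferred from the log, not stated. Under the linear scheduler with no warmup, the learning rate at any step can be sketched as:

```python
def linear_lr(step, base_lr=2e-5, warmup_steps=0, total_steps=5100):
    """Hugging Face style linear schedule: ramp up to base_lr, then decay to 0."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))
```

For example, with these defaults the learning rate starts at 2e-5, halves by the midpoint (step 2550), and reaches 0 at step 5100.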

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0392 2 5.3285 -0.0128 5.3285 2.3084
No log 0.0784 4 2.8951 0.0744 2.8951 1.7015
No log 0.1176 6 2.2651 -0.0383 2.2651 1.5050
No log 0.1569 8 1.6540 0.0936 1.6540 1.2861
No log 0.1961 10 1.2848 0.1588 1.2848 1.1335
No log 0.2353 12 1.1505 0.2580 1.1505 1.0726
No log 0.2745 14 1.3517 0.1595 1.3517 1.1626
No log 0.3137 16 2.0091 0.1068 2.0091 1.4174
No log 0.3529 18 2.1215 0.1323 2.1215 1.4565
No log 0.3922 20 2.5104 0.1173 2.5104 1.5844
No log 0.4314 22 2.4057 0.1527 2.4057 1.5510
No log 0.4706 24 1.4878 0.2944 1.4878 1.2198
No log 0.5098 26 0.9182 0.3567 0.9182 0.9582
No log 0.5490 28 0.9368 0.2974 0.9368 0.9679
No log 0.5882 30 0.8938 0.4342 0.8938 0.9454
No log 0.6275 32 0.8614 0.4771 0.8614 0.9281
No log 0.6667 34 0.8687 0.4751 0.8687 0.9320
No log 0.7059 36 0.9473 0.4564 0.9473 0.9733
No log 0.7451 38 0.9674 0.4433 0.9674 0.9836
No log 0.7843 40 0.9248 0.5100 0.9248 0.9616
No log 0.8235 42 1.1693 0.3806 1.1693 1.0813
No log 0.8627 44 1.0284 0.4614 1.0284 1.0141
No log 0.9020 46 0.8251 0.5750 0.8251 0.9083
No log 0.9412 48 0.7680 0.6502 0.7680 0.8763
No log 0.9804 50 0.8545 0.6168 0.8545 0.9244
No log 1.0196 52 0.9203 0.5838 0.9203 0.9593
No log 1.0588 54 0.7586 0.6781 0.7586 0.8710
No log 1.0980 56 0.7286 0.6908 0.7286 0.8536
No log 1.1373 58 0.9699 0.5552 0.9699 0.9848
No log 1.1765 60 1.2657 0.4403 1.2657 1.1250
No log 1.2157 62 1.1194 0.4942 1.1194 1.0580
No log 1.2549 64 1.1431 0.4134 1.1431 1.0692
No log 1.2941 66 1.1212 0.4668 1.1212 1.0589
No log 1.3333 68 1.0197 0.4867 1.0197 1.0098
No log 1.3725 70 1.0174 0.5256 1.0174 1.0087
No log 1.4118 72 1.0991 0.4769 1.0991 1.0484
No log 1.4510 74 1.2192 0.4222 1.2192 1.1042
No log 1.4902 76 1.1482 0.4333 1.1482 1.0715
No log 1.5294 78 1.1562 0.4096 1.1562 1.0753
No log 1.5686 80 1.1758 0.4434 1.1758 1.0844
No log 1.6078 82 0.9565 0.5944 0.9565 0.9780
No log 1.6471 84 0.7220 0.6713 0.7220 0.8497
No log 1.6863 86 0.8541 0.5579 0.8541 0.9242
No log 1.7255 88 0.7600 0.5592 0.7600 0.8718
No log 1.7647 90 0.7828 0.6684 0.7828 0.8848
No log 1.8039 92 0.9034 0.6313 0.9034 0.9505
No log 1.8431 94 0.6926 0.7073 0.6926 0.8322
No log 1.8824 96 0.6469 0.7152 0.6469 0.8043
No log 1.9216 98 0.6894 0.7306 0.6894 0.8303
No log 1.9608 100 0.6949 0.7349 0.6949 0.8336
No log 2.0 102 0.7578 0.7163 0.7578 0.8705
No log 2.0392 104 0.7622 0.7110 0.7622 0.8730
No log 2.0784 106 0.8247 0.6854 0.8247 0.9081
No log 2.1176 108 0.9674 0.6004 0.9674 0.9835
No log 2.1569 110 0.8190 0.6879 0.8190 0.9050
No log 2.1961 112 0.7758 0.6978 0.7758 0.8808
No log 2.2353 114 0.9044 0.6372 0.9044 0.9510
No log 2.2745 116 0.8756 0.6308 0.8756 0.9357
No log 2.3137 118 0.7344 0.7159 0.7344 0.8570
No log 2.3529 120 0.7096 0.7343 0.7096 0.8424
No log 2.3922 122 0.7402 0.6742 0.7402 0.8604
No log 2.4314 124 0.7157 0.6736 0.7157 0.8460
No log 2.4706 126 0.7093 0.6803 0.7093 0.8422
No log 2.5098 128 0.6824 0.6872 0.6824 0.8261
No log 2.5490 130 0.7289 0.6450 0.7289 0.8537
No log 2.5882 132 0.7813 0.6395 0.7813 0.8839
No log 2.6275 134 0.7701 0.6626 0.7701 0.8775
No log 2.6667 136 0.7392 0.6151 0.7392 0.8598
No log 2.7059 138 0.7573 0.6116 0.7573 0.8702
No log 2.7451 140 0.7850 0.6110 0.7850 0.8860
No log 2.7843 142 0.8126 0.6333 0.8126 0.9014
No log 2.8235 144 0.7815 0.6517 0.7815 0.8840
No log 2.8627 146 0.8353 0.6249 0.8353 0.9139
No log 2.9020 148 0.7913 0.6594 0.7913 0.8896
No log 2.9412 150 0.8112 0.6576 0.8112 0.9007
No log 2.9804 152 0.8999 0.6076 0.8999 0.9486
No log 3.0196 154 0.8757 0.6194 0.8757 0.9358
No log 3.0588 156 0.7858 0.6751 0.7858 0.8865
No log 3.0980 158 0.7312 0.7239 0.7312 0.8551
No log 3.1373 160 0.7279 0.7016 0.7279 0.8532
No log 3.1765 162 0.8031 0.6553 0.8031 0.8961
No log 3.2157 164 0.9895 0.6120 0.9895 0.9947
No log 3.2549 166 0.8541 0.6508 0.8541 0.9242
No log 3.2941 168 0.6998 0.5911 0.6998 0.8365
No log 3.3333 170 0.7496 0.6506 0.7496 0.8658
No log 3.3725 172 0.7546 0.6766 0.7546 0.8687
No log 3.4118 174 0.6954 0.7381 0.6954 0.8339
No log 3.4510 176 0.7661 0.6884 0.7661 0.8753
No log 3.4902 178 0.8604 0.6547 0.8604 0.9276
No log 3.5294 180 0.8205 0.6475 0.8205 0.9058
No log 3.5686 182 0.7663 0.6760 0.7663 0.8754
No log 3.6078 184 0.7061 0.7152 0.7061 0.8403
No log 3.6471 186 0.6715 0.7407 0.6715 0.8194
No log 3.6863 188 0.6913 0.7254 0.6913 0.8315
No log 3.7255 190 0.7312 0.6663 0.7312 0.8551
No log 3.7647 192 0.7542 0.6720 0.7542 0.8684
No log 3.8039 194 0.7944 0.6656 0.7944 0.8913
No log 3.8431 196 0.6956 0.7161 0.6956 0.8340
No log 3.8824 198 0.6909 0.7059 0.6909 0.8312
No log 3.9216 200 0.6909 0.7051 0.6909 0.8312
No log 3.9608 202 0.7054 0.7083 0.7054 0.8399
No log 4.0 204 0.6908 0.7052 0.6908 0.8311
No log 4.0392 206 0.6736 0.7066 0.6736 0.8208
No log 4.0784 208 0.6754 0.6946 0.6754 0.8219
No log 4.1176 210 0.7580 0.6741 0.7580 0.8706
No log 4.1569 212 0.7531 0.6874 0.7531 0.8678
No log 4.1961 214 0.6773 0.6936 0.6773 0.8230
No log 4.2353 216 0.6715 0.7100 0.6715 0.8194
No log 4.2745 218 0.7045 0.6563 0.7045 0.8394
No log 4.3137 220 0.8920 0.6061 0.8920 0.9444
No log 4.3529 222 1.0048 0.5743 1.0048 1.0024
No log 4.3922 224 0.8567 0.6091 0.8567 0.9256
No log 4.4314 226 0.6878 0.7129 0.6878 0.8293
No log 4.4706 228 0.6621 0.7137 0.6621 0.8137
No log 4.5098 230 0.6674 0.6980 0.6674 0.8170
No log 4.5490 232 0.6814 0.7269 0.6814 0.8255
No log 4.5882 234 0.7284 0.6546 0.7284 0.8535
No log 4.6275 236 0.8135 0.6233 0.8135 0.9020
No log 4.6667 238 0.8165 0.6292 0.8165 0.9036
No log 4.7059 240 0.7004 0.6270 0.7004 0.8369
No log 4.7451 242 0.6769 0.6298 0.6769 0.8228
No log 4.7843 244 0.7541 0.5800 0.7541 0.8684
No log 4.8235 246 0.6925 0.6009 0.6925 0.8322
No log 4.8627 248 0.6294 0.7212 0.6294 0.7933
No log 4.9020 250 0.7022 0.6799 0.7022 0.8380
No log 4.9412 252 0.6869 0.6993 0.6869 0.8288
No log 4.9804 254 0.6330 0.7011 0.6330 0.7956
No log 5.0196 256 0.6507 0.7266 0.6507 0.8067
No log 5.0588 258 0.6497 0.7281 0.6497 0.8060
No log 5.0980 260 0.6680 0.7443 0.6680 0.8173
No log 5.1373 262 0.6923 0.7316 0.6923 0.8320
No log 5.1765 264 0.7085 0.7148 0.7085 0.8417
No log 5.2157 266 0.6925 0.7018 0.6925 0.8321
No log 5.2549 268 0.6728 0.7109 0.6728 0.8202
No log 5.2941 270 0.6639 0.6766 0.6639 0.8148
No log 5.3333 272 0.6658 0.6308 0.6658 0.8160
No log 5.3725 274 0.6731 0.6330 0.6731 0.8204
No log 5.4118 276 0.6984 0.6612 0.6984 0.8357
No log 5.4510 278 0.6677 0.6556 0.6677 0.8171
No log 5.4902 280 0.6321 0.6856 0.6321 0.7951
No log 5.5294 282 0.6439 0.6752 0.6439 0.8024
No log 5.5686 284 0.6275 0.7061 0.6275 0.7922
No log 5.6078 286 0.6345 0.7114 0.6345 0.7965
No log 5.6471 288 0.6521 0.7206 0.6521 0.8076
No log 5.6863 290 0.6563 0.7312 0.6563 0.8101
No log 5.7255 292 0.6657 0.7049 0.6657 0.8159
No log 5.7647 294 0.6963 0.6892 0.6963 0.8345
No log 5.8039 296 0.7814 0.6627 0.7814 0.8840
No log 5.8431 298 0.8277 0.6572 0.8277 0.9098
No log 5.8824 300 0.8787 0.6001 0.8787 0.9374
No log 5.9216 302 0.8652 0.6267 0.8652 0.9302
No log 5.9608 304 0.8331 0.6426 0.8331 0.9127
No log 6.0 306 0.9184 0.6139 0.9184 0.9583
No log 6.0392 308 1.0360 0.5497 1.0360 1.0178
No log 6.0784 310 0.9224 0.6281 0.9224 0.9604
No log 6.1176 312 0.7850 0.6719 0.7850 0.8860
No log 6.1569 314 0.7088 0.6905 0.7088 0.8419
No log 6.1961 316 0.7014 0.6343 0.7014 0.8375
No log 6.2353 318 0.7015 0.6259 0.7015 0.8376
No log 6.2745 320 0.7765 0.6317 0.7765 0.8812
No log 6.3137 322 0.8472 0.5771 0.8472 0.9205
No log 6.3529 324 0.8038 0.6040 0.8038 0.8966
No log 6.3922 326 0.7288 0.6443 0.7288 0.8537
No log 6.4314 328 0.6849 0.6438 0.6849 0.8276
No log 6.4706 330 0.6858 0.7004 0.6858 0.8282
No log 6.5098 332 0.7077 0.6808 0.7077 0.8412
No log 6.5490 334 0.7581 0.6895 0.7581 0.8707
No log 6.5882 336 0.7176 0.6935 0.7176 0.8471
No log 6.6275 338 0.6571 0.7339 0.6571 0.8106
No log 6.6667 340 0.6445 0.7300 0.6445 0.8028
No log 6.7059 342 0.6416 0.7339 0.6416 0.8010
No log 6.7451 344 0.6676 0.6976 0.6676 0.8171
No log 6.7843 346 0.7325 0.7005 0.7325 0.8559
No log 6.8235 348 0.7378 0.7320 0.7378 0.8590
No log 6.8627 350 0.7414 0.7274 0.7414 0.8611
No log 6.9020 352 0.7312 0.7222 0.7312 0.8551
No log 6.9412 354 0.7129 0.7163 0.7129 0.8443
No log 6.9804 356 0.6767 0.7298 0.6767 0.8226
No log 7.0196 358 0.6636 0.7149 0.6636 0.8146
No log 7.0588 360 0.6554 0.7065 0.6554 0.8096
No log 7.0980 362 0.6433 0.7193 0.6433 0.8020
No log 7.1373 364 0.6493 0.7328 0.6493 0.8058
No log 7.1765 366 0.6320 0.7479 0.6320 0.7950
No log 7.2157 368 0.6158 0.7383 0.6158 0.7847
No log 7.2549 370 0.6274 0.7252 0.6274 0.7921
No log 7.2941 372 0.6673 0.6999 0.6673 0.8169
No log 7.3333 374 0.6601 0.6984 0.6601 0.8125
No log 7.3725 376 0.6429 0.7273 0.6429 0.8018
No log 7.4118 378 0.6348 0.7221 0.6348 0.7967
No log 7.4510 380 0.6363 0.7110 0.6363 0.7977
No log 7.4902 382 0.6378 0.6790 0.6378 0.7986
No log 7.5294 384 0.6530 0.6760 0.6530 0.8081
No log 7.5686 386 0.6408 0.7029 0.6408 0.8005
No log 7.6078 388 0.6662 0.6832 0.6662 0.8162
No log 7.6471 390 0.6917 0.6664 0.6917 0.8317
No log 7.6863 392 0.6445 0.6956 0.6445 0.8028
No log 7.7255 394 0.6643 0.6976 0.6643 0.8151
No log 7.7647 396 0.7178 0.6796 0.7178 0.8472
No log 7.8039 398 0.6992 0.6921 0.6992 0.8362
No log 7.8431 400 0.6963 0.6936 0.6963 0.8345
No log 7.8824 402 0.6681 0.6733 0.6681 0.8174
No log 7.9216 404 0.6685 0.6756 0.6685 0.8176
No log 7.9608 406 0.6603 0.6241 0.6603 0.8126
No log 8.0 408 0.6667 0.6448 0.6667 0.8165
No log 8.0392 410 0.6649 0.6702 0.6649 0.8154
No log 8.0784 412 0.6502 0.6948 0.6502 0.8063
No log 8.1176 414 0.6520 0.7049 0.6520 0.8074
No log 8.1569 416 0.6536 0.7129 0.6536 0.8085
No log 8.1961 418 0.6717 0.7075 0.6717 0.8196
No log 8.2353 420 0.6616 0.6816 0.6616 0.8134
No log 8.2745 422 0.6516 0.6703 0.6516 0.8072
No log 8.3137 424 0.6554 0.6485 0.6554 0.8095
No log 8.3529 426 0.6599 0.6586 0.6599 0.8123
No log 8.3922 428 0.6440 0.6816 0.6440 0.8025
No log 8.4314 430 0.6147 0.6748 0.6147 0.7840
No log 8.4706 432 0.6362 0.6655 0.6362 0.7976
No log 8.5098 434 0.6521 0.6738 0.6521 0.8076
No log 8.5490 436 0.6662 0.6618 0.6662 0.8162
No log 8.5882 438 0.6101 0.6781 0.6101 0.7811
No log 8.6275 440 0.6364 0.7176 0.6364 0.7978
No log 8.6667 442 0.6626 0.7370 0.6626 0.8140
No log 8.7059 444 0.6227 0.7025 0.6227 0.7891
No log 8.7451 446 0.6295 0.7019 0.6295 0.7934
No log 8.7843 448 0.6583 0.6711 0.6583 0.8114
No log 8.8235 450 0.6512 0.6684 0.6512 0.8070
No log 8.8627 452 0.6354 0.6689 0.6354 0.7971
No log 8.9020 454 0.6575 0.6884 0.6575 0.8108
No log 8.9412 456 0.7300 0.6646 0.7300 0.8544
No log 8.9804 458 0.7549 0.6646 0.7549 0.8689
No log 9.0196 460 0.7299 0.6713 0.7299 0.8543
No log 9.0588 462 0.6702 0.6759 0.6702 0.8187
No log 9.0980 464 0.6703 0.6669 0.6703 0.8187
No log 9.1373 466 0.6862 0.6437 0.6862 0.8284
No log 9.1765 468 0.7253 0.6626 0.7253 0.8517
No log 9.2157 470 0.7887 0.6647 0.7887 0.8881
No log 9.2549 472 0.7896 0.6647 0.7896 0.8886
No log 9.2941 474 0.7247 0.6291 0.7247 0.8513
No log 9.3333 476 0.6811 0.6280 0.6811 0.8253
No log 9.3725 478 0.6958 0.6455 0.6958 0.8342
No log 9.4118 480 0.6847 0.6474 0.6847 0.8275
No log 9.4510 482 0.6862 0.6602 0.6862 0.8284
No log 9.4902 484 0.6938 0.6986 0.6938 0.8329
No log 9.5294 486 0.6990 0.6896 0.6990 0.8361
No log 9.5686 488 0.6974 0.6851 0.6974 0.8351
No log 9.6078 490 0.6879 0.6679 0.6879 0.8294
No log 9.6471 492 0.6779 0.6343 0.6779 0.8234
No log 9.6863 494 0.6927 0.5700 0.6927 0.8323
No log 9.7255 496 0.7232 0.6052 0.7232 0.8504
No log 9.7647 498 0.7029 0.6052 0.7029 0.8384
0.3718 9.8039 500 0.6516 0.6181 0.6516 0.8072
0.3718 9.8431 502 0.6385 0.6752 0.6385 0.7991
0.3718 9.8824 504 0.6638 0.6728 0.6638 0.8147
0.3718 9.9216 506 0.6627 0.7036 0.6627 0.8141
0.3718 9.9608 508 0.6538 0.7276 0.6538 0.8086
0.3718 10.0 510 0.6678 0.6904 0.6678 0.8172
0.3718 10.0392 512 0.6586 0.6961 0.6586 0.8115
0.3718 10.0784 514 0.6470 0.6952 0.6470 0.8044
0.3718 10.1176 516 0.6499 0.6791 0.6499 0.8062
0.3718 10.1569 518 0.6597 0.6261 0.6597 0.8122
0.3718 10.1961 520 0.7226 0.6766 0.7226 0.8500
0.3718 10.2353 522 0.7749 0.6447 0.7749 0.8803
0.3718 10.2745 524 0.7627 0.6617 0.7627 0.8733
0.3718 10.3137 526 0.6962 0.6791 0.6962 0.8344
0.3718 10.3529 528 0.6562 0.6869 0.6562 0.8100
0.3718 10.3922 530 0.6668 0.7047 0.6668 0.8166
0.3718 10.4314 532 0.6985 0.6985 0.6985 0.8358
0.3718 10.4706 534 0.6869 0.7106 0.6869 0.8288
0.3718 10.5098 536 0.6530 0.7106 0.6530 0.8081
0.3718 10.5490 538 0.6301 0.7215 0.6301 0.7938
0.3718 10.5882 540 0.6277 0.6839 0.6277 0.7922
0.3718 10.6275 542 0.6508 0.6704 0.6508 0.8067
0.3718 10.6667 544 0.6634 0.6262 0.6634 0.8145
0.3718 10.7059 546 0.6777 0.6062 0.6777 0.8232
0.3718 10.7451 548 0.6671 0.6204 0.6671 0.8168
0.3718 10.7843 550 0.6559 0.6175 0.6559 0.8099
0.3718 10.8235 552 0.6645 0.6399 0.6645 0.8152
0.3718 10.8627 554 0.6636 0.6540 0.6636 0.8146
0.3718 10.9020 556 0.6539 0.6581 0.6539 0.8086
0.3718 10.9412 558 0.7190 0.6537 0.7190 0.8480
0.3718 10.9804 560 0.8795 0.6454 0.8795 0.9378
0.3718 11.0196 562 0.9935 0.5907 0.9935 0.9967
0.3718 11.0588 564 1.0230 0.5807 1.0230 1.0115
0.3718 11.0980 566 0.9675 0.6074 0.9675 0.9836
0.3718 11.1373 568 0.8449 0.6053 0.8449 0.9192
0.3718 11.1765 570 0.7339 0.6684 0.7339 0.8567
0.3718 11.2157 572 0.7105 0.6547 0.7105 0.8429
0.3718 11.2549 574 0.7215 0.6683 0.7215 0.8494
0.3718 11.2941 576 0.7815 0.6385 0.7815 0.8840
0.3718 11.3333 578 0.8427 0.6547 0.8427 0.9180
0.3718 11.3725 580 0.9493 0.6331 0.9493 0.9743
0.3718 11.4118 582 0.9369 0.6331 0.9369 0.9679
0.3718 11.4510 584 0.8067 0.6376 0.8067 0.8982
0.3718 11.4902 586 0.6457 0.7295 0.6457 0.8036
0.3718 11.5294 588 0.6113 0.7101 0.6113 0.7819
0.3718 11.5686 590 0.6144 0.7243 0.6144 0.7838
0.3718 11.6078 592 0.6130 0.7049 0.6130 0.7830
0.3718 11.6471 594 0.6071 0.6638 0.6071 0.7792
0.3718 11.6863 596 0.6038 0.6606 0.6038 0.7770
0.3718 11.7255 598 0.6313 0.6369 0.6313 0.7946
0.3718 11.7647 600 0.6315 0.6544 0.6315 0.7947
0.3718 11.8039 602 0.5962 0.6811 0.5962 0.7722
0.3718 11.8431 604 0.5965 0.7261 0.5965 0.7724
0.3718 11.8824 606 0.6048 0.7151 0.6048 0.7777
0.3718 11.9216 608 0.5910 0.7198 0.5910 0.7687
0.3718 11.9608 610 0.6053 0.6988 0.6053 0.7780
0.3718 12.0 612 0.6516 0.6351 0.6516 0.8072
0.3718 12.0392 614 0.6509 0.6503 0.6509 0.8068
0.3718 12.0784 616 0.6268 0.6472 0.6268 0.7917
0.3718 12.1176 618 0.6388 0.6898 0.6388 0.7993
0.3718 12.1569 620 0.7023 0.6477 0.7023 0.8381
0.3718 12.1961 622 0.7090 0.6557 0.7090 0.8420
0.3718 12.2353 624 0.6749 0.6852 0.6749 0.8215
0.3718 12.2745 626 0.6228 0.7358 0.6228 0.7892
0.3718 12.3137 628 0.5981 0.7499 0.5981 0.7733
0.3718 12.3529 630 0.5972 0.7786 0.5972 0.7728
0.3718 12.3922 632 0.6243 0.7209 0.6243 0.7901
0.3718 12.4314 634 0.6611 0.7002 0.6611 0.8131
0.3718 12.4706 636 0.7021 0.6818 0.7021 0.8379
0.3718 12.5098 638 0.7381 0.6579 0.7381 0.8591
0.3718 12.5490 640 0.7915 0.6383 0.7915 0.8897
0.3718 12.5882 642 0.8488 0.6151 0.8488 0.9213
0.3718 12.6275 644 0.8248 0.6156 0.8248 0.9082
0.3718 12.6667 646 0.7224 0.6295 0.7224 0.8499
0.3718 12.7059 648 0.6718 0.6861 0.6718 0.8196
0.3718 12.7451 650 0.6677 0.6767 0.6677 0.8171
0.3718 12.7843 652 0.6667 0.6705 0.6667 0.8165
0.3718 12.8235 654 0.6736 0.6550 0.6736 0.8208
0.3718 12.8627 656 0.6868 0.6529 0.6868 0.8287
0.3718 12.9020 658 0.7399 0.6284 0.7399 0.8602
0.3718 12.9412 660 0.8042 0.6322 0.8042 0.8968
0.3718 12.9804 662 0.9045 0.6008 0.9045 0.9511
0.3718 13.0196 664 0.9379 0.5922 0.9379 0.9685
0.3718 13.0588 666 0.8715 0.6132 0.8715 0.9335
0.3718 13.0980 668 0.7858 0.6115 0.7858 0.8865
0.3718 13.1373 670 0.7669 0.5949 0.7669 0.8757

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model tree: MayBashendy/ArabicNewSplits8_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k10_task1_organization is fine-tuned from aubmindlab/bert-base-arabertv02.