ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k4_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02. The training dataset is not documented in this card. It achieves the following results on the evaluation set:

  • Loss: 0.7704
  • QWK: 0.4749
  • MSE: 0.7704
  • RMSE: 0.8777
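The metrics above (quadratic weighted kappa, mean squared error, and its square root) can be reproduced with scikit-learn. This is a minimal sketch; the label arrays below are illustrative placeholders, since the actual evaluation labels are not published with this card.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Placeholder gold scores and predictions (NOT the real eval data).
y_true = np.array([0, 1, 2, 2, 1, 0])
y_pred = np.array([0, 1, 2, 1, 1, 0])

# QWK: Cohen's kappa with quadratic disagreement weights.
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # ≈ 0.8571
mse = mean_squared_error(y_true, y_pred)                      # ≈ 0.1667
rmse = np.sqrt(mse)
print(qwk, mse, rmse)
```

Note that QWK is the natural headline metric here: it treats the organization score as ordinal, penalizing a prediction two score points away more than one score point away.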

Model description

More information needed

Intended uses & limitations

More information needed
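No usage notes are provided. Below is a minimal loading sketch, assuming the checkpoint is hosted on the Hugging Face Hub under this card's repo id and exposes a standard sequence-classification head (the head size and problem type are not documented here).

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo_id = ("MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_"
           "FineTuningAraBERT_run3_AugV5_k4_task2_organization")
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)

# Score a sample Arabic sentence (illustrative input only).
inputs = tokenizer("هذه فقرة من مقال تجريبي.", return_tensors="pt")
logits = model(**inputs).logits
print(logits.shape)
```

Since the card reports MSE/RMSE alongside QWK, the head is likely used for (ordinal) score prediction; inspect `model.config.num_labels` to see how the output should be interpreted.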

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
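The linear scheduler listed above can be sketched in a few lines: with no warmup steps listed, the learning rate is assumed to decay linearly from 2e-05 at step 0 to zero at the final step. The total step count below is hypothetical (the table's epoch/step columns imply about 23 optimizer steps per epoch).

```python
def linear_lr(step: int, total_steps: int, base_lr: float = 2e-05) -> float:
    """Learning rate after `step` optimizer steps under linear decay to zero."""
    return base_lr * max(0.0, (total_steps - step) / total_steps)

total_steps = 2300  # hypothetical: 100 epochs * ~23 steps/epoch
print(linear_lr(0, total_steps))                 # 2e-05
print(linear_lr(total_steps // 2, total_steps))  # 1e-05
print(linear_lr(total_steps, total_steps))       # 0.0
```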

Training results

Training Loss Epoch Step Validation Loss QWK MSE RMSE
No log 0.0870 2 4.5826 0.0042 4.5826 2.1407
No log 0.1739 4 3.2510 -0.0233 3.2510 1.8030
No log 0.2609 6 1.8277 0.0190 1.8277 1.3519
No log 0.3478 8 1.2129 0.1470 1.2129 1.1013
No log 0.4348 10 1.1778 0.1857 1.1778 1.0853
No log 0.5217 12 1.1145 0.2083 1.1145 1.0557
No log 0.6087 14 1.2344 0.1315 1.2344 1.1110
No log 0.6957 16 1.4035 0.0608 1.4035 1.1847
No log 0.7826 18 1.1494 0.2346 1.1494 1.0721
No log 0.8696 20 1.0220 0.3747 1.0220 1.0109
No log 0.9565 22 0.9517 0.3448 0.9517 0.9755
No log 1.0435 24 1.0060 0.3765 1.0060 1.0030
No log 1.1304 26 1.5399 0.2281 1.5399 1.2409
No log 1.2174 28 1.8044 0.2147 1.8044 1.3433
No log 1.3043 30 1.4104 0.3458 1.4104 1.1876
No log 1.3913 32 1.0900 0.3907 1.0900 1.0440
No log 1.4783 34 1.2908 0.4170 1.2908 1.1361
No log 1.5652 36 1.1615 0.4017 1.1615 1.0777
No log 1.6522 38 1.0085 0.5365 1.0085 1.0042
No log 1.7391 40 0.9711 0.4877 0.9711 0.9854
No log 1.8261 42 1.0827 0.3232 1.0827 1.0405
No log 1.9130 44 1.1086 0.3392 1.1086 1.0529
No log 2.0 46 1.0801 0.4075 1.0801 1.0393
No log 2.0870 48 0.8671 0.5597 0.8671 0.9312
No log 2.1739 50 0.8713 0.5203 0.8713 0.9334
No log 2.2609 52 0.8565 0.5362 0.8565 0.9254
No log 2.3478 54 0.8299 0.5693 0.8299 0.9110
No log 2.4348 56 0.8633 0.6218 0.8633 0.9292
No log 2.5217 58 0.8808 0.6250 0.8808 0.9385
No log 2.6087 60 0.8207 0.5896 0.8207 0.9059
No log 2.6957 62 0.8508 0.5038 0.8508 0.9224
No log 2.7826 64 0.8262 0.5624 0.8262 0.9089
No log 2.8696 66 0.8119 0.5565 0.8119 0.9011
No log 2.9565 68 0.8415 0.6515 0.8415 0.9173
No log 3.0435 70 0.9735 0.5589 0.9735 0.9867
No log 3.1304 72 0.9136 0.6312 0.9136 0.9558
No log 3.2174 74 0.9162 0.5943 0.9162 0.9572
No log 3.3043 76 0.9947 0.5566 0.9947 0.9973
No log 3.3913 78 0.8694 0.5806 0.8694 0.9324
No log 3.4783 80 0.8321 0.6372 0.8321 0.9122
No log 3.5652 82 0.8528 0.6075 0.8528 0.9235
No log 3.6522 84 1.0106 0.4577 1.0106 1.0053
No log 3.7391 86 0.9067 0.5617 0.9067 0.9522
No log 3.8261 88 0.7893 0.5684 0.7893 0.8884
No log 3.9130 90 1.1018 0.5051 1.1018 1.0496
No log 4.0 92 1.2440 0.4213 1.2440 1.1154
No log 4.0870 94 1.0950 0.5298 1.0950 1.0464
No log 4.1739 96 0.8751 0.5860 0.8751 0.9355
No log 4.2609 98 0.8367 0.5680 0.8367 0.9147
No log 4.3478 100 0.8275 0.5746 0.8275 0.9097
No log 4.4348 102 0.9239 0.4 0.9239 0.9612
No log 4.5217 104 1.0590 0.3487 1.0590 1.0291
No log 4.6087 106 1.0905 0.3706 1.0905 1.0443
No log 4.6957 108 1.0072 0.3839 1.0072 1.0036
No log 4.7826 110 0.8560 0.4482 0.8560 0.9252
No log 4.8696 112 0.7976 0.5781 0.7976 0.8931
No log 4.9565 114 0.9411 0.5393 0.9411 0.9701
No log 5.0435 116 0.9391 0.5562 0.9391 0.9691
No log 5.1304 118 0.7918 0.5617 0.7918 0.8898
No log 5.2174 120 0.7102 0.6294 0.7102 0.8427
No log 5.3043 122 0.7893 0.6167 0.7893 0.8884
No log 5.3913 124 0.7675 0.6167 0.7675 0.8761
No log 5.4783 126 0.6905 0.6292 0.6905 0.8310
No log 5.5652 128 0.6782 0.5492 0.6782 0.8235
No log 5.6522 130 0.6848 0.5866 0.6848 0.8276
No log 5.7391 132 0.6974 0.6224 0.6974 0.8351
No log 5.8261 134 0.6989 0.6487 0.6989 0.8360
No log 5.9130 136 0.7429 0.5618 0.7429 0.8619
No log 6.0 138 0.7356 0.5618 0.7356 0.8577
No log 6.0870 140 0.7351 0.5161 0.7351 0.8574
No log 6.1739 142 0.7279 0.5322 0.7279 0.8531
No log 6.2609 144 0.7457 0.5684 0.7457 0.8636
No log 6.3478 146 0.8244 0.5775 0.8244 0.9080
No log 6.4348 148 1.0479 0.5597 1.0479 1.0237
No log 6.5217 150 1.1051 0.5178 1.1051 1.0512
No log 6.6087 152 0.9501 0.5679 0.9501 0.9747
No log 6.6957 154 0.8068 0.5828 0.8068 0.8982
No log 6.7826 156 0.8679 0.4915 0.8679 0.9316
No log 6.8696 158 0.9881 0.3872 0.9881 0.9940
No log 6.9565 160 1.0248 0.3706 1.0248 1.0123
No log 7.0435 162 0.9834 0.4431 0.9834 0.9917
No log 7.1304 164 0.8562 0.5645 0.8562 0.9253
No log 7.2174 166 0.9098 0.6119 0.9098 0.9539
No log 7.3043 168 0.9495 0.55 0.9495 0.9744
No log 7.3913 170 0.8905 0.5498 0.8905 0.9437
No log 7.4783 172 0.8078 0.4982 0.8078 0.8988
No log 7.5652 174 0.8027 0.5403 0.8027 0.8959
No log 7.6522 176 0.7994 0.5327 0.7994 0.8941
No log 7.7391 178 0.7676 0.5607 0.7676 0.8761
No log 7.8261 180 0.7654 0.5525 0.7654 0.8749
No log 7.9130 182 0.8276 0.5560 0.8276 0.9097
No log 8.0 184 0.9181 0.5780 0.9181 0.9582
No log 8.0870 186 0.9439 0.5780 0.9439 0.9716
No log 8.1739 188 0.8700 0.6083 0.8700 0.9327
No log 8.2609 190 0.9009 0.6302 0.9009 0.9492
No log 8.3478 192 0.9676 0.5506 0.9676 0.9836
No log 8.4348 194 0.8652 0.6021 0.8652 0.9302
No log 8.5217 196 0.8096 0.5611 0.8096 0.8998
No log 8.6087 198 0.8181 0.5011 0.8181 0.9045
No log 8.6957 200 0.8021 0.4841 0.8021 0.8956
No log 8.7826 202 0.8085 0.5110 0.8085 0.8992
No log 8.8696 204 0.8505 0.5821 0.8505 0.9222
No log 8.9565 206 0.8373 0.5451 0.8373 0.9150
No log 9.0435 208 0.7948 0.4994 0.7948 0.8915
No log 9.1304 210 0.7690 0.5254 0.7690 0.8769
No log 9.2174 212 0.7755 0.5269 0.7755 0.8806
No log 9.3043 214 0.7699 0.5259 0.7699 0.8774
No log 9.3913 216 0.7535 0.5566 0.7535 0.8680
No log 9.4783 218 0.7465 0.5691 0.7465 0.8640
No log 9.5652 220 0.7477 0.6205 0.7477 0.8647
No log 9.6522 222 0.7578 0.6682 0.7578 0.8705
No log 9.7391 224 0.7419 0.6682 0.7419 0.8613
No log 9.8261 226 0.7172 0.6269 0.7172 0.8469
No log 9.9130 228 0.7551 0.6256 0.7551 0.8690
No log 10.0 230 0.8435 0.5888 0.8435 0.9184
No log 10.0870 232 0.8853 0.5888 0.8853 0.9409
No log 10.1739 234 0.8087 0.6111 0.8087 0.8993
No log 10.2609 236 0.7231 0.6369 0.7231 0.8504
No log 10.3478 238 0.6969 0.5856 0.6969 0.8348
No log 10.4348 240 0.6827 0.5349 0.6827 0.8263
No log 10.5217 242 0.6956 0.5925 0.6956 0.8341
No log 10.6087 244 0.6894 0.6412 0.6894 0.8303
No log 10.6957 246 0.7147 0.6534 0.7147 0.8454
No log 10.7826 248 0.7104 0.6809 0.7104 0.8429
No log 10.8696 250 0.7355 0.5881 0.7355 0.8576
No log 10.9565 252 0.7150 0.5908 0.7150 0.8456
No log 11.0435 254 0.7035 0.5329 0.7035 0.8387
No log 11.1304 256 0.7085 0.5329 0.7085 0.8417
No log 11.2174 258 0.7149 0.5329 0.7149 0.8455
No log 11.3043 260 0.7261 0.5171 0.7261 0.8521
No log 11.3913 262 0.7461 0.4979 0.7461 0.8638
No log 11.4783 264 0.7487 0.5403 0.7487 0.8653
No log 11.5652 266 0.7476 0.5868 0.7476 0.8647
No log 11.6522 268 0.7570 0.6005 0.7570 0.8701
No log 11.7391 270 0.7464 0.5657 0.7464 0.8639
No log 11.8261 272 0.7498 0.5743 0.7498 0.8659
No log 11.9130 274 0.7539 0.5072 0.7539 0.8683
No log 12.0 276 0.7614 0.5042 0.7614 0.8726
No log 12.0870 278 0.7487 0.5011 0.7487 0.8653
No log 12.1739 280 0.7465 0.5527 0.7465 0.8640
No log 12.2609 282 0.7891 0.5954 0.7891 0.8883
No log 12.3478 284 0.8052 0.5623 0.8052 0.8973
No log 12.4348 286 0.8157 0.5623 0.8157 0.9032
No log 12.5217 288 0.7956 0.6178 0.7956 0.8920
No log 12.6087 290 0.7698 0.5485 0.7698 0.8774
No log 12.6957 292 0.7429 0.6363 0.7429 0.8619
No log 12.7826 294 0.7507 0.5447 0.7507 0.8665
No log 12.8696 296 0.7688 0.5161 0.7688 0.8768
No log 12.9565 298 0.7754 0.4929 0.7754 0.8805
No log 13.0435 300 0.7653 0.4962 0.7653 0.8748
No log 13.1304 302 0.7573 0.5413 0.7573 0.8702
No log 13.2174 304 0.7720 0.5693 0.7720 0.8786
No log 13.3043 306 0.8130 0.5806 0.8130 0.9017
No log 13.3913 308 0.8166 0.6072 0.8166 0.9037
No log 13.4783 310 0.7861 0.6119 0.7861 0.8866
No log 13.5652 312 0.7563 0.6160 0.7563 0.8697
No log 13.6522 314 0.7948 0.5691 0.7948 0.8915
No log 13.7391 316 0.8041 0.5367 0.8041 0.8967
No log 13.8261 318 0.7505 0.5632 0.7505 0.8663
No log 13.9130 320 0.7548 0.5432 0.7548 0.8688
No log 14.0 322 0.7684 0.5756 0.7684 0.8766
No log 14.0870 324 0.7543 0.6545 0.7543 0.8685
No log 14.1739 326 0.7399 0.6414 0.7399 0.8602
No log 14.2609 328 0.7459 0.6263 0.7459 0.8636
No log 14.3478 330 0.7432 0.6263 0.7432 0.8621
No log 14.4348 332 0.7435 0.5680 0.7435 0.8623
No log 14.5217 334 0.7609 0.6196 0.7609 0.8723
No log 14.6087 336 0.7801 0.5869 0.7801 0.8832
No log 14.6957 338 0.7691 0.5979 0.7691 0.8770
No log 14.7826 340 0.7506 0.6269 0.7506 0.8664
No log 14.8696 342 0.7431 0.6269 0.7431 0.8620
No log 14.9565 344 0.7130 0.6035 0.7130 0.8444
No log 15.0435 346 0.6980 0.6363 0.6980 0.8355
No log 15.1304 348 0.7038 0.6437 0.7038 0.8389
No log 15.2174 350 0.7044 0.6386 0.7044 0.8393
No log 15.3043 352 0.7006 0.6546 0.7006 0.8370
No log 15.3913 354 0.6970 0.6412 0.6970 0.8349
No log 15.4783 356 0.7137 0.6712 0.7137 0.8448
No log 15.5652 358 0.6954 0.6525 0.6954 0.8339
No log 15.6522 360 0.6941 0.6563 0.6941 0.8331
No log 15.7391 362 0.7023 0.6449 0.7023 0.8381
No log 15.8261 364 0.7280 0.6109 0.7280 0.8532
No log 15.9130 366 0.7466 0.5756 0.7466 0.8640
No log 16.0 368 0.7578 0.5385 0.7578 0.8705
No log 16.0870 370 0.7593 0.5385 0.7593 0.8714
No log 16.1739 372 0.7452 0.5324 0.7452 0.8633
No log 16.2609 374 0.7309 0.5746 0.7309 0.8549
No log 16.3478 376 0.7350 0.5658 0.7350 0.8573
No log 16.4348 378 0.7362 0.5550 0.7362 0.8580
No log 16.5217 380 0.7715 0.5942 0.7715 0.8784
No log 16.6087 382 0.8117 0.5714 0.8117 0.9010
No log 16.6957 384 0.7925 0.5624 0.7925 0.8902
No log 16.7826 386 0.7515 0.5937 0.7515 0.8669
No log 16.8696 388 0.7780 0.5347 0.7780 0.8821
No log 16.9565 390 0.8095 0.4645 0.8095 0.8997
No log 17.0435 392 0.7713 0.6085 0.7713 0.8782
No log 17.1304 394 0.7548 0.6171 0.7548 0.8688
No log 17.2174 396 0.7814 0.5954 0.7814 0.8840
No log 17.3043 398 0.7623 0.6241 0.7623 0.8731
No log 17.3913 400 0.7643 0.5703 0.7643 0.8743
No log 17.4783 402 0.7606 0.5727 0.7606 0.8721
No log 17.5652 404 0.7572 0.5730 0.7572 0.8701
No log 17.6522 406 0.7630 0.5534 0.7630 0.8735
No log 17.7391 408 0.7785 0.5426 0.7785 0.8823
No log 17.8261 410 0.7985 0.5426 0.7985 0.8936
No log 17.9130 412 0.8136 0.5249 0.8136 0.9020
No log 18.0 414 0.8111 0.5879 0.8111 0.9006
No log 18.0870 416 0.8037 0.5443 0.8037 0.8965
No log 18.1739 418 0.7972 0.5671 0.7972 0.8928
No log 18.2609 420 0.8268 0.5895 0.8268 0.9093
No log 18.3478 422 0.9317 0.5247 0.9317 0.9652
No log 18.4348 424 0.9566 0.4873 0.9566 0.9781
No log 18.5217 426 0.8917 0.6025 0.8917 0.9443
No log 18.6087 428 0.8329 0.5472 0.8329 0.9127
No log 18.6957 430 0.8128 0.5846 0.8128 0.9016
No log 18.7826 432 0.8421 0.5485 0.8421 0.9177
No log 18.8696 434 0.8741 0.5338 0.8741 0.9349
No log 18.9565 436 0.8578 0.5832 0.8578 0.9262
No log 19.0435 438 0.8309 0.5886 0.8309 0.9115
No log 19.1304 440 0.8130 0.4996 0.8130 0.9017
No log 19.2174 442 0.8143 0.5110 0.8143 0.9024
No log 19.3043 444 0.8311 0.5241 0.8311 0.9117
No log 19.3913 446 0.8188 0.5110 0.8188 0.9049
No log 19.4783 448 0.7979 0.5356 0.7979 0.8932
No log 19.5652 450 0.8173 0.5858 0.8173 0.9040
No log 19.6522 452 0.8664 0.5832 0.8664 0.9308
No log 19.7391 454 0.9497 0.5557 0.9497 0.9745
No log 19.8261 456 0.9674 0.5258 0.9674 0.9835
No log 19.9130 458 0.9161 0.5682 0.9161 0.9571
No log 20.0 460 0.8431 0.5720 0.8431 0.9182
No log 20.0870 462 0.8029 0.5186 0.8029 0.8960
No log 20.1739 464 0.8051 0.4836 0.8051 0.8973
No log 20.2609 466 0.8193 0.5721 0.8193 0.9051
No log 20.3478 468 0.8046 0.5582 0.8046 0.8970
No log 20.4348 470 0.8025 0.5548 0.8025 0.8958
No log 20.5217 472 0.8683 0.5517 0.8683 0.9318
No log 20.6087 474 0.9017 0.5276 0.9017 0.9496
No log 20.6957 476 0.9378 0.5591 0.9378 0.9684
No log 20.7826 478 0.9174 0.5769 0.9174 0.9578
No log 20.8696 480 0.8520 0.5835 0.8520 0.9230
No log 20.9565 482 0.7960 0.5740 0.7960 0.8922
No log 21.0435 484 0.8390 0.5479 0.8390 0.9160
No log 21.1304 486 0.9520 0.5258 0.9520 0.9757
No log 21.2174 488 1.0121 0.5266 1.0121 1.0060
No log 21.3043 490 0.9526 0.5258 0.9526 0.9760
No log 21.3913 492 0.8848 0.5301 0.8848 0.9406
No log 21.4783 494 0.8172 0.5451 0.8172 0.9040
No log 21.5652 496 0.8415 0.5482 0.8415 0.9173
No log 21.6522 498 0.8912 0.5308 0.8912 0.9440
0.2739 21.7391 500 0.9041 0.5094 0.9041 0.9508
0.2739 21.8261 502 0.8885 0.5014 0.8885 0.9426
0.2739 21.9130 504 0.8435 0.4998 0.8435 0.9184
0.2739 22.0 506 0.8132 0.4981 0.8132 0.9018
0.2739 22.0870 508 0.7990 0.5672 0.7990 0.8938
0.2739 22.1739 510 0.7915 0.5447 0.7915 0.8896
0.2739 22.2609 512 0.7903 0.5634 0.7903 0.8890
0.2739 22.3478 514 0.8036 0.5806 0.8036 0.8964
0.2739 22.4348 516 0.7970 0.5806 0.7970 0.8928
0.2739 22.5217 518 0.7851 0.5858 0.7851 0.8861
0.2739 22.6087 520 0.7606 0.5851 0.7606 0.8721
0.2739 22.6957 522 0.7615 0.5321 0.7615 0.8726
0.2739 22.7826 524 0.7747 0.5321 0.7747 0.8802
0.2739 22.8696 526 0.7802 0.5321 0.7802 0.8833
0.2739 22.9565 528 0.7784 0.5321 0.7784 0.8822
0.2739 23.0435 530 0.7808 0.5583 0.7808 0.8836
0.2739 23.1304 532 0.7852 0.4898 0.7852 0.8861
0.2739 23.2174 534 0.7833 0.4889 0.7833 0.8850
0.2739 23.3043 536 0.7802 0.4889 0.7802 0.8833
0.2739 23.3913 538 0.7784 0.5040 0.7784 0.8823
0.2739 23.4783 540 0.7775 0.5451 0.7775 0.8818
0.2739 23.5652 542 0.7752 0.5451 0.7752 0.8805
0.2739 23.6522 544 0.7764 0.5472 0.7764 0.8811
0.2739 23.7391 546 0.7969 0.5634 0.7969 0.8927
0.2739 23.8261 548 0.8140 0.5409 0.8140 0.9022
0.2739 23.9130 550 0.8190 0.5554 0.8190 0.9050
0.2739 24.0 552 0.8157 0.5575 0.8157 0.9032
0.2739 24.0870 554 0.8126 0.5598 0.8126 0.9014
0.2739 24.1739 556 0.8053 0.5807 0.8053 0.8974
0.2739 24.2609 558 0.7906 0.5645 0.7906 0.8891
0.2739 24.3478 560 0.7655 0.5730 0.7655 0.8749
0.2739 24.4348 562 0.7744 0.5136 0.7744 0.8800
0.2739 24.5217 564 0.7921 0.5269 0.7921 0.8900
0.2739 24.6087 566 0.7767 0.6069 0.7767 0.8813
0.2739 24.6957 568 0.7647 0.5646 0.7647 0.8745
0.2739 24.7826 570 0.7508 0.5772 0.7508 0.8665
0.2739 24.8696 572 0.7512 0.5670 0.7512 0.8667
0.2739 24.9565 574 0.7678 0.5784 0.7678 0.8762
0.2739 25.0435 576 0.7604 0.5376 0.7604 0.8720
0.2739 25.1304 578 0.7526 0.5557 0.7526 0.8675
0.2739 25.2174 580 0.7659 0.5382 0.7659 0.8752
0.2739 25.3043 582 0.7747 0.5559 0.7747 0.8802
0.2739 25.3913 584 0.7786 0.5704 0.7786 0.8824
0.2739 25.4783 586 0.7836 0.5343 0.7836 0.8852
0.2739 25.5652 588 0.7752 0.5343 0.7752 0.8805
0.2739 25.6522 590 0.7733 0.5611 0.7733 0.8794
0.2739 25.7391 592 0.7754 0.5956 0.7754 0.8806
0.2739 25.8261 594 0.7648 0.5815 0.7648 0.8745
0.2739 25.9130 596 0.7621 0.5621 0.7621 0.8730
0.2739 26.0 598 0.7981 0.5433 0.7981 0.8934
0.2739 26.0870 600 0.8276 0.5387 0.8276 0.9097
0.2739 26.1739 602 0.8162 0.5367 0.8162 0.9034
0.2739 26.2609 604 0.7938 0.5518 0.7938 0.8909
0.2739 26.3478 606 0.7809 0.5849 0.7809 0.8837
0.2739 26.4348 608 0.8160 0.5647 0.8160 0.9033
0.2739 26.5217 610 0.8866 0.5201 0.8866 0.9416
0.2739 26.6087 612 0.9225 0.5094 0.9225 0.9605
0.2739 26.6957 614 0.9154 0.5094 0.9154 0.9568
0.2739 26.7826 616 0.8816 0.5041 0.8816 0.9389
0.2739 26.8696 618 0.8620 0.4716 0.8620 0.9284
0.2739 26.9565 620 0.8450 0.5176 0.8450 0.9192
0.2739 27.0435 622 0.8265 0.5295 0.8265 0.9091
0.2739 27.1304 624 0.8234 0.5232 0.8234 0.9074
0.2739 27.2174 626 0.8331 0.5158 0.8331 0.9127
0.2739 27.3043 628 0.8522 0.5265 0.8522 0.9232
0.2739 27.3913 630 0.8920 0.5607 0.8920 0.9444
0.2739 27.4783 632 0.9274 0.5821 0.9274 0.9630
0.2739 27.5652 634 0.9144 0.5848 0.9144 0.9563
0.2739 27.6522 636 0.8667 0.5631 0.8667 0.9310
0.2739 27.7391 638 0.8141 0.5476 0.8141 0.9023
0.2739 27.8261 640 0.7801 0.5645 0.7801 0.8832
0.2739 27.9130 642 0.7713 0.5370 0.7713 0.8783
0.2739 28.0 644 0.7680 0.5370 0.7680 0.8764
0.2739 28.0870 646 0.7687 0.4902 0.7687 0.8768
0.2739 28.1739 648 0.7694 0.4902 0.7694 0.8771
0.2739 28.2609 650 0.7679 0.5131 0.7679 0.8763
0.2739 28.3478 652 0.7673 0.5114 0.7673 0.8760
0.2739 28.4348 654 0.7723 0.5024 0.7723 0.8788
0.2739 28.5217 656 0.7696 0.4923 0.7696 0.8773
0.2739 28.6087 658 0.7656 0.4923 0.7656 0.8750
0.2739 28.6957 660 0.7660 0.4749 0.7660 0.8752
0.2739 28.7826 662 0.7704 0.4749 0.7704 0.8777

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (Safetensors, F32)

Model tree for MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k4_task2_organization

  • Fine-tuned from aubmindlab/bert-base-arabertv02 (one of 4019 fine-tunes of that base) → this model