ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run3_AugV5_k12_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6549
  • QWK (Quadratic Weighted Kappa): 0.2807
  • MSE: 0.6549
  • RMSE: 0.8092
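
For reference, these metrics can be reproduced with scikit-learn. The sketch below is an assumption about the evaluation recipe (in particular, rounding continuous predictions to integer score bands before computing kappa), since the card does not document it:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def score_predictions(y_true, y_pred):
    """Return the metrics reported above for gold vs. predicted scores."""
    mse = mean_squared_error(y_true, y_pred)
    # QWK is defined over discrete categories, so continuous regression
    # outputs are rounded first (an assumption; the label scale is not
    # documented in this card).
    qwk = cohen_kappa_score(
        np.rint(y_true).astype(int),
        np.rint(y_pred).astype(int),
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}
```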

Model description

More information needed

Intended uses & limitations

More information needed
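
No official usage notes are provided, but the checkpoint should load like any transformers sequence-classification model. A minimal inference sketch follows; the single-logit regression head is inferred from the metrics above (Loss equals MSE), not documented:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run3_AugV5_k12_task7_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

essay = "..."  # an Arabic essay to score for organization
inputs = tokenizer(essay, return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    logits = model(**inputs).logits
score = logits.squeeze().item()  # assumed single-value organization score
print(score)
```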

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
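
These settings map onto the standard transformers Trainer roughly as follows (a sketch: output_dir is a hypothetical name, and dataset and metric wiring are omitted):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_task7_organization",  # hypothetical name
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,      # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```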

Training results

Training loss was only logged every 500 steps (the Trainer default), so earlier rows show "No log".

Training Loss Epoch Step Validation Loss QWK MSE RMSE
No log 0.0476 2 2.5054 -0.0262 2.5054 1.5828
No log 0.0952 4 1.2308 0.1246 1.2308 1.1094
No log 0.1429 6 0.9636 -0.1062 0.9636 0.9817
No log 0.1905 8 1.0403 -0.0408 1.0403 1.0200
No log 0.2381 10 0.8668 0.0 0.8668 0.9310
No log 0.2857 12 0.8299 0.0 0.8299 0.9110
No log 0.3333 14 0.8982 0.0 0.8982 0.9478
No log 0.3810 16 0.8567 -0.0426 0.8567 0.9256
No log 0.4286 18 0.7435 0.0 0.7435 0.8623
No log 0.4762 20 0.8107 0.0798 0.8107 0.9004
No log 0.5238 22 0.9564 -0.0444 0.9564 0.9780
No log 0.5714 24 1.2190 -0.1983 1.2190 1.1041
No log 0.6190 26 1.0807 0.0185 1.0807 1.0396
No log 0.6667 28 0.8751 0.0053 0.8751 0.9355
No log 0.7143 30 0.8226 0.0481 0.8226 0.9070
No log 0.7619 32 0.8730 0.1770 0.8730 0.9343
No log 0.8095 34 0.9102 0.2435 0.9102 0.9540
No log 0.8571 36 0.9443 0.2008 0.9443 0.9717
No log 0.9048 38 0.8800 0.2066 0.8801 0.9381
No log 0.9524 40 0.7121 0.0495 0.7121 0.8439
No log 1.0 42 0.6591 0.0393 0.6591 0.8118
No log 1.0476 44 0.6764 0.0495 0.6764 0.8224
No log 1.0952 46 0.6723 0.0893 0.6723 0.8199
No log 1.1429 48 0.7218 0.2118 0.7218 0.8496
No log 1.1905 50 0.9872 0.1896 0.9872 0.9936
No log 1.2381 52 1.3580 0.0319 1.3580 1.1653
No log 1.2857 54 1.4695 0.0338 1.4695 1.2122
No log 1.3333 56 1.5113 0.0338 1.5113 1.2294
No log 1.3810 58 1.4487 -0.0329 1.4487 1.2036
No log 1.4286 60 1.1436 0.0126 1.1436 1.0694
No log 1.4762 62 0.9801 0.0949 0.9801 0.9900
No log 1.5238 64 0.8875 0.0481 0.8875 0.9421
No log 1.5714 66 0.9250 0.0937 0.9250 0.9617
No log 1.6190 68 0.9094 0.1136 0.9094 0.9536
No log 1.6667 70 0.8869 0.2476 0.8869 0.9418
No log 1.7143 72 0.9098 0.2041 0.9098 0.9538
No log 1.7619 74 1.0180 0.0609 1.0180 1.0090
No log 1.8095 76 1.0323 0.0342 1.0323 1.0160
No log 1.8571 78 0.9856 0.0615 0.9856 0.9928
No log 1.9048 80 0.8768 0.2283 0.8768 0.9364
No log 1.9524 82 0.8098 0.3348 0.8098 0.8999
No log 2.0 84 0.8148 0.3664 0.8148 0.9027
No log 2.0476 86 0.9264 0.2233 0.9264 0.9625
No log 2.0952 88 1.1726 0.1248 1.1726 1.0829
No log 2.1429 90 1.2155 0.1243 1.2155 1.1025
No log 2.1905 92 1.0328 0.1819 1.0328 1.0163
No log 2.2381 94 0.7820 0.2012 0.7820 0.8843
No log 2.2857 96 0.7466 0.2913 0.7466 0.8640
No log 2.3333 98 0.7617 0.3060 0.7617 0.8728
No log 2.3810 100 0.6897 0.3866 0.6897 0.8305
No log 2.4286 102 0.6596 0.2353 0.6596 0.8122
No log 2.4762 104 0.6998 0.3101 0.6998 0.8366
No log 2.5238 106 0.6732 0.3360 0.6732 0.8205
No log 2.5714 108 0.6864 0.3907 0.6864 0.8285
No log 2.6190 110 0.7713 0.3401 0.7713 0.8782
No log 2.6667 112 1.0233 0.2823 1.0233 1.0116
No log 2.7143 114 1.1628 0.1959 1.1628 1.0783
No log 2.7619 116 1.0135 0.3089 1.0135 1.0067
No log 2.8095 118 0.7742 0.3243 0.7742 0.8799
No log 2.8571 120 0.6786 0.3377 0.6786 0.8237
No log 2.9048 122 0.6862 0.3649 0.6862 0.8284
No log 2.9524 124 0.7222 0.2661 0.7222 0.8498
No log 3.0 126 0.7092 0.3580 0.7092 0.8422
No log 3.0476 128 0.7707 0.1873 0.7707 0.8779
No log 3.0952 130 0.9048 0.0977 0.9048 0.9512
No log 3.1429 132 0.8406 0.2254 0.8406 0.9169
No log 3.1905 134 0.7005 0.2249 0.7005 0.8370
No log 3.2381 136 0.6815 0.3213 0.6815 0.8255
No log 3.2857 138 0.6865 0.3318 0.6865 0.8285
No log 3.3333 140 0.7239 0.2236 0.7239 0.8508
No log 3.3810 142 0.7764 0.2203 0.7764 0.8811
No log 3.4286 144 0.7634 0.1959 0.7634 0.8737
No log 3.4762 146 0.7135 0.2249 0.7135 0.8447
No log 3.5238 148 0.7399 0.4182 0.7399 0.8602
No log 3.5714 150 0.8696 0.3256 0.8696 0.9325
No log 3.6190 152 0.9085 0.3134 0.9085 0.9532
No log 3.6667 154 0.7766 0.3475 0.7766 0.8812
No log 3.7143 156 0.7119 0.3318 0.7119 0.8437
No log 3.7619 158 0.7095 0.2979 0.7095 0.8423
No log 3.8095 160 0.6678 0.3661 0.6678 0.8172
No log 3.8571 162 0.6443 0.4173 0.6443 0.8027
No log 3.9048 164 0.6260 0.3552 0.6260 0.7912
No log 3.9524 166 0.6345 0.3509 0.6345 0.7966
No log 4.0 168 0.7197 0.3976 0.7197 0.8484
No log 4.0476 170 0.7780 0.3140 0.7780 0.8821
No log 4.0952 172 0.6928 0.3659 0.6928 0.8323
No log 4.1429 174 0.6268 0.3703 0.6268 0.7917
No log 4.1905 176 0.6516 0.4795 0.6516 0.8072
No log 4.2381 178 0.6317 0.4136 0.6317 0.7948
No log 4.2857 180 0.6525 0.3146 0.6525 0.8078
No log 4.3333 182 0.6898 0.3060 0.6898 0.8305
No log 4.3810 184 0.7515 0.3113 0.7515 0.8669
No log 4.4286 186 0.7330 0.3163 0.7330 0.8561
No log 4.4762 188 0.7190 0.3667 0.7190 0.8480
No log 4.5238 190 0.8072 0.3365 0.8072 0.8984
No log 4.5714 192 0.8584 0.3074 0.8584 0.9265
No log 4.6190 194 0.7479 0.2975 0.7479 0.8648
No log 4.6667 196 0.6778 0.2947 0.6778 0.8233
No log 4.7143 198 0.7134 0.2963 0.7134 0.8446
No log 4.7619 200 0.7168 0.2963 0.7168 0.8466
No log 4.8095 202 0.7063 0.3462 0.7063 0.8404
No log 4.8571 204 0.7368 0.2204 0.7368 0.8584
No log 4.9048 206 0.7813 0.2521 0.7813 0.8839
No log 4.9524 208 0.7327 0.2261 0.7327 0.8560
No log 5.0 210 0.7220 0.2261 0.7220 0.8497
No log 5.0476 212 0.7044 0.3599 0.7044 0.8393
No log 5.0952 214 0.6983 0.2830 0.6983 0.8356
No log 5.1429 216 0.7205 0.3360 0.7205 0.8488
No log 5.1905 218 0.7226 0.3671 0.7226 0.8501
No log 5.2381 220 0.7111 0.3671 0.7111 0.8433
No log 5.2857 222 0.7009 0.3265 0.7009 0.8372
No log 5.3333 224 0.7289 0.3335 0.7289 0.8538
No log 5.3810 226 0.7216 0.3252 0.7216 0.8495
No log 5.4286 228 0.7273 0.3417 0.7273 0.8528
No log 5.4762 230 0.7264 0.3556 0.7264 0.8523
No log 5.5238 232 0.7176 0.2895 0.7176 0.8471
No log 5.5714 234 0.7081 0.3581 0.7081 0.8415
No log 5.6190 236 0.7149 0.3189 0.7149 0.8455
No log 5.6667 238 0.6922 0.3786 0.6922 0.8320
No log 5.7143 240 0.6887 0.3089 0.6887 0.8299
No log 5.7619 242 0.7261 0.3614 0.7261 0.8521
No log 5.8095 244 0.7623 0.3264 0.7623 0.8731
No log 5.8571 246 0.7062 0.3590 0.7062 0.8403
No log 5.9048 248 0.6672 0.4244 0.6672 0.8168
No log 5.9524 250 0.6650 0.4037 0.6650 0.8155
No log 6.0 252 0.7027 0.4261 0.7027 0.8383
No log 6.0476 254 0.6935 0.4261 0.6935 0.8328
No log 6.0952 256 0.6249 0.4555 0.6249 0.7905
No log 6.1429 258 0.6918 0.4684 0.6918 0.8318
No log 6.1905 260 0.7521 0.4295 0.7521 0.8672
No log 6.2381 262 0.6923 0.5086 0.6923 0.8320
No log 6.2857 264 0.6550 0.3791 0.6550 0.8093
No log 6.3333 266 0.6767 0.4010 0.6767 0.8226
No log 6.3810 268 0.6695 0.3536 0.6695 0.8183
No log 6.4286 270 0.6706 0.3906 0.6706 0.8189
No log 6.4762 272 0.7031 0.4081 0.7031 0.8385
No log 6.5238 274 0.6771 0.4212 0.6771 0.8229
No log 6.5714 276 0.6250 0.4301 0.6250 0.7906
No log 6.6190 278 0.6171 0.3460 0.6171 0.7856
No log 6.6667 280 0.6342 0.4022 0.6342 0.7964
No log 6.7143 282 0.6645 0.4918 0.6645 0.8152
No log 6.7619 284 0.6800 0.4858 0.6800 0.8246
No log 6.8095 286 0.6665 0.4918 0.6665 0.8164
No log 6.8571 288 0.6445 0.4128 0.6445 0.8028
No log 6.9048 290 0.6303 0.3714 0.6303 0.7939
No log 6.9524 292 0.6270 0.3964 0.6270 0.7919
No log 7.0 294 0.6622 0.3566 0.6622 0.8137
No log 7.0476 296 0.6899 0.2737 0.6899 0.8306
No log 7.0952 298 0.7419 0.2706 0.7419 0.8614
No log 7.1429 300 0.7700 0.2984 0.7700 0.8775
No log 7.1905 302 0.6811 0.3291 0.6811 0.8253
No log 7.2381 304 0.6686 0.3279 0.6686 0.8177
No log 7.2857 306 0.6736 0.3628 0.6736 0.8208
No log 7.3333 308 0.7145 0.2030 0.7145 0.8453
No log 7.3810 310 0.7428 0.2911 0.7428 0.8618
No log 7.4286 312 0.7600 0.3457 0.7600 0.8718
No log 7.4762 314 0.7536 0.2960 0.7536 0.8681
No log 7.5238 316 0.7748 0.3166 0.7748 0.8802
No log 7.5714 318 0.7618 0.3455 0.7618 0.8728
No log 7.6190 320 0.7571 0.3541 0.7571 0.8701
No log 7.6667 322 0.7098 0.3259 0.7098 0.8425
No log 7.7143 324 0.6792 0.3645 0.6792 0.8242
No log 7.7619 326 0.6618 0.4338 0.6618 0.8135
No log 7.8095 328 0.6464 0.4569 0.6464 0.8040
No log 7.8571 330 0.6521 0.5577 0.6521 0.8075
No log 7.9048 332 0.6448 0.5362 0.6448 0.8030
No log 7.9524 334 0.6582 0.3739 0.6582 0.8113
No log 8.0 336 0.6950 0.3566 0.6950 0.8337
No log 8.0476 338 0.7163 0.3505 0.7163 0.8464
No log 8.0952 340 0.6630 0.3856 0.6630 0.8142
No log 8.1429 342 0.6417 0.4475 0.6417 0.8010
No log 8.1905 344 0.6526 0.4013 0.6526 0.8078
No log 8.2381 346 0.6729 0.3769 0.6729 0.8203
No log 8.2857 348 0.7203 0.4472 0.7203 0.8487
No log 8.3333 350 0.7251 0.4472 0.7251 0.8516
No log 8.3810 352 0.7338 0.3858 0.7338 0.8566
No log 8.4286 354 0.7459 0.3358 0.7459 0.8636
No log 8.4762 356 0.7619 0.3785 0.7619 0.8728
No log 8.5238 358 0.6928 0.3297 0.6928 0.8324
No log 8.5714 360 0.7117 0.4536 0.7117 0.8436
No log 8.6190 362 0.7622 0.3739 0.7622 0.8730
No log 8.6667 364 0.7220 0.2469 0.7220 0.8497
No log 8.7143 366 0.6957 0.2469 0.6957 0.8341
No log 8.7619 368 0.6579 0.2710 0.6579 0.8111
No log 8.8095 370 0.6247 0.4091 0.6247 0.7904
No log 8.8571 372 0.6599 0.4764 0.6599 0.8123
No log 8.9048 374 0.6525 0.4764 0.6525 0.8077
No log 8.9524 376 0.6038 0.4179 0.6038 0.7770
No log 9.0 378 0.7065 0.4079 0.7065 0.8405
No log 9.0476 380 0.8053 0.3263 0.8053 0.8974
No log 9.0952 382 0.7218 0.3851 0.7218 0.8496
No log 9.1429 384 0.6336 0.3671 0.6336 0.7960
No log 9.1905 386 0.6279 0.3950 0.6279 0.7924
No log 9.2381 388 0.6397 0.4278 0.6397 0.7998
No log 9.2857 390 0.6425 0.5003 0.6425 0.8016
No log 9.3333 392 0.6378 0.3213 0.6378 0.7986
No log 9.3810 394 0.6360 0.4136 0.6360 0.7975
No log 9.4286 396 0.6380 0.3170 0.6380 0.7987
No log 9.4762 398 0.6467 0.3603 0.6467 0.8042
No log 9.5238 400 0.6805 0.3990 0.6805 0.8249
No log 9.5714 402 0.7376 0.4702 0.7376 0.8589
No log 9.6190 404 0.7511 0.4284 0.7511 0.8666
No log 9.6667 406 0.7165 0.4428 0.7165 0.8465
No log 9.7143 408 0.6833 0.3669 0.6833 0.8266
No log 9.7619 410 0.6855 0.3335 0.6855 0.8280
No log 9.8095 412 0.6840 0.3715 0.6840 0.8271
No log 9.8571 414 0.7053 0.4147 0.7053 0.8398
No log 9.9048 416 0.7062 0.4248 0.7062 0.8404
No log 9.9524 418 0.7150 0.4795 0.7150 0.8456
No log 10.0 420 0.6803 0.4547 0.6803 0.8248
No log 10.0476 422 0.6420 0.3385 0.6420 0.8012
No log 10.0952 424 0.6621 0.2944 0.6621 0.8137
No log 10.1429 426 0.6494 0.2978 0.6494 0.8058
No log 10.1905 428 0.6395 0.3320 0.6395 0.7997
No log 10.2381 430 0.6388 0.3808 0.6388 0.7992
No log 10.2857 432 0.6503 0.3239 0.6503 0.8064
No log 10.3333 434 0.6576 0.3239 0.6576 0.8109
No log 10.3810 436 0.6633 0.3978 0.6633 0.8144
No log 10.4286 438 0.6581 0.3939 0.6581 0.8112
No log 10.4762 440 0.6572 0.4291 0.6572 0.8107
No log 10.5238 442 0.6618 0.4291 0.6618 0.8135
No log 10.5714 444 0.6868 0.3862 0.6868 0.8287
No log 10.6190 446 0.7364 0.3235 0.7364 0.8581
No log 10.6667 448 0.8674 0.3527 0.8674 0.9314
No log 10.7143 450 0.9482 0.3475 0.9482 0.9738
No log 10.7619 452 0.8882 0.3089 0.8882 0.9424
No log 10.8095 454 0.7880 0.3793 0.7880 0.8877
No log 10.8571 456 0.7419 0.3703 0.7419 0.8613
No log 10.9048 458 0.7270 0.3481 0.7270 0.8527
No log 10.9524 460 0.7237 0.2884 0.7237 0.8507
No log 11.0 462 0.7396 0.3289 0.7396 0.8600
No log 11.0476 464 0.7565 0.2780 0.7565 0.8697
No log 11.0952 466 0.8072 0.3209 0.8072 0.8985
No log 11.1429 468 0.8383 0.2862 0.8383 0.9156
No log 11.1905 470 0.8183 0.3391 0.8183 0.9046
No log 11.2381 472 0.8075 0.2568 0.8075 0.8986
No log 11.2857 474 0.8347 0.2444 0.8347 0.9136
No log 11.3333 476 0.7931 0.3025 0.7931 0.8906
No log 11.3810 478 0.7556 0.3189 0.7556 0.8692
No log 11.4286 480 0.6984 0.3677 0.6984 0.8357
No log 11.4762 482 0.6720 0.4111 0.6720 0.8198
No log 11.5238 484 0.6556 0.3893 0.6556 0.8097
No log 11.5714 486 0.6417 0.3920 0.6417 0.8011
No log 11.6190 488 0.6302 0.3920 0.6302 0.7938
No log 11.6667 490 0.6266 0.4224 0.6266 0.7916
No log 11.7143 492 0.6199 0.4019 0.6199 0.7874
No log 11.7619 494 0.6191 0.3886 0.6191 0.7868
No log 11.8095 496 0.6368 0.3994 0.6368 0.7980
No log 11.8571 498 0.6909 0.3829 0.6909 0.8312
0.3349 11.9048 500 0.7021 0.3594 0.7021 0.8379
0.3349 11.9524 502 0.6487 0.4575 0.6487 0.8054
0.3349 12.0 504 0.6070 0.5151 0.6070 0.7791
0.3349 12.0476 506 0.6041 0.4700 0.6041 0.7772
0.3349 12.0952 508 0.6109 0.4581 0.6109 0.7816
0.3349 12.1429 510 0.6195 0.5592 0.6195 0.7871
0.3349 12.1905 512 0.6451 0.5524 0.6451 0.8032
0.3349 12.2381 514 0.6710 0.4858 0.6710 0.8191
0.3349 12.2857 516 0.6459 0.4232 0.6459 0.8037
0.3349 12.3333 518 0.6592 0.4493 0.6592 0.8119
0.3349 12.3810 520 0.6703 0.4459 0.6703 0.8187
0.3349 12.4286 522 0.6483 0.4291 0.6483 0.8052
0.3349 12.4762 524 0.6565 0.3243 0.6565 0.8102
0.3349 12.5238 526 0.6944 0.3754 0.6944 0.8333
0.3349 12.5714 528 0.6907 0.3754 0.6907 0.8311
0.3349 12.6190 530 0.6740 0.3964 0.6740 0.8210
0.3349 12.6667 532 0.6786 0.3577 0.6786 0.8238
0.3349 12.7143 534 0.7373 0.3633 0.7373 0.8587
0.3349 12.7619 536 0.8383 0.4177 0.8383 0.9156
0.3349 12.8095 538 0.8382 0.4240 0.8382 0.9156
0.3349 12.8571 540 0.8014 0.4240 0.8014 0.8952
0.3349 12.9048 542 0.7327 0.4260 0.7327 0.8560
0.3349 12.9524 544 0.6898 0.2958 0.6898 0.8305
0.3349 13.0 546 0.7044 0.3201 0.7044 0.8393
0.3349 13.0476 548 0.7035 0.3474 0.7035 0.8387
0.3349 13.0952 550 0.6978 0.2889 0.6978 0.8353
0.3349 13.1429 552 0.7217 0.3862 0.7217 0.8495
0.3349 13.1905 554 0.7847 0.4254 0.7847 0.8858
0.3349 13.2381 556 0.8271 0.4940 0.8271 0.9095
0.3349 13.2857 558 0.7745 0.4556 0.7745 0.8801
0.3349 13.3333 560 0.6768 0.4244 0.6768 0.8227
0.3349 13.3810 562 0.6514 0.4119 0.6514 0.8071
0.3349 13.4286 564 0.6872 0.4729 0.6872 0.8290
0.3349 13.4762 566 0.6584 0.4499 0.6584 0.8114
0.3349 13.5238 568 0.6351 0.3762 0.6351 0.7969
0.3349 13.5714 570 0.6606 0.5076 0.6606 0.8128
0.3349 13.6190 572 0.6723 0.4707 0.6723 0.8199
0.3349 13.6667 574 0.6649 0.4148 0.6649 0.8154
0.3349 13.7143 576 0.6553 0.4729 0.6553 0.8095
0.3349 13.7619 578 0.6393 0.3939 0.6393 0.7996
0.3349 13.8095 580 0.6343 0.4137 0.6343 0.7964
0.3349 13.8571 582 0.6461 0.3467 0.6461 0.8038
0.3349 13.9048 584 0.6594 0.4265 0.6594 0.8120
0.3349 13.9524 586 0.6761 0.4196 0.6761 0.8223
0.3349 14.0 588 0.6866 0.4196 0.6866 0.8286
0.3349 14.0476 590 0.6895 0.3715 0.6895 0.8304
0.3349 14.0952 592 0.6938 0.3738 0.6938 0.8330
0.3349 14.1429 594 0.7179 0.3878 0.7179 0.8473
0.3349 14.1905 596 0.7318 0.4140 0.7318 0.8554
0.3349 14.2381 598 0.7023 0.4099 0.7023 0.8380
0.3349 14.2857 600 0.6865 0.3910 0.6865 0.8286
0.3349 14.3333 602 0.6821 0.4278 0.6821 0.8259
0.3349 14.3810 604 0.6784 0.4278 0.6784 0.8236
0.3349 14.4286 606 0.6682 0.4278 0.6682 0.8175
0.3349 14.4762 608 0.6589 0.3738 0.6589 0.8117
0.3349 14.5238 610 0.6693 0.3541 0.6693 0.8181
0.3349 14.5714 612 0.6757 0.3291 0.6757 0.8220
0.3349 14.6190 614 0.6840 0.3566 0.6840 0.8270
0.3349 14.6667 616 0.6964 0.3899 0.6964 0.8345
0.3349 14.7143 618 0.6837 0.3946 0.6837 0.8269
0.3349 14.7619 620 0.6746 0.3972 0.6746 0.8213
0.3349 14.8095 622 0.6801 0.4342 0.6801 0.8247
0.3349 14.8571 624 0.6884 0.4342 0.6884 0.8297
0.3349 14.9048 626 0.6960 0.3948 0.6960 0.8343
0.3349 14.9524 628 0.7086 0.4353 0.7086 0.8418
0.3349 15.0 630 0.7330 0.4527 0.7330 0.8562
0.3349 15.0476 632 0.7340 0.4474 0.7340 0.8568
0.3349 15.0952 634 0.7072 0.4580 0.7072 0.8410
0.3349 15.1429 636 0.6721 0.3924 0.6721 0.8198
0.3349 15.1905 638 0.6762 0.4576 0.6762 0.8223
0.3349 15.2381 640 0.7048 0.4315 0.7048 0.8395
0.3349 15.2857 642 0.6981 0.4212 0.6981 0.8355
0.3349 15.3333 644 0.6783 0.4292 0.6783 0.8236
0.3349 15.3810 646 0.6576 0.3651 0.6576 0.8109
0.3349 15.4286 648 0.6779 0.3433 0.6779 0.8233
0.3349 15.4762 650 0.7238 0.4230 0.7238 0.8508
0.3349 15.5238 652 0.7311 0.3962 0.7311 0.8550
0.3349 15.5714 654 0.6885 0.4425 0.6885 0.8298
0.3349 15.6190 656 0.6670 0.3946 0.6670 0.8167
0.3349 15.6667 658 0.6576 0.3474 0.6576 0.8109
0.3349 15.7143 660 0.6585 0.3331 0.6585 0.8115
0.3349 15.7619 662 0.6527 0.3042 0.6527 0.8079
0.3349 15.8095 664 0.6521 0.3200 0.6521 0.8075
0.3349 15.8571 666 0.6549 0.2807 0.6549 0.8092

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1