ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run2_AugV5_k14_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7885
  • Qwk: 0.4180
  • Mse: 0.7885
  • Rmse: 0.8880
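
The evaluation metrics above are standard for ordinal scoring tasks. As a hedged sketch (not the actual evaluation code), quadratic weighted kappa (Qwk), Mse, and Rmse can be computed in pure Python as follows; `y_true`/`y_pred` are hypothetical integer scores, not the real evaluation data:

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic disagreement weights."""
    # Observed agreement matrix
    O = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    # Expected matrix from the marginal score histograms
    n = len(y_true)
    hist_t = [sum(row) for row in O]
    hist_p = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]
    E = [[hist_t[i] * hist_p[j] / n for j in range(n_classes)]
         for i in range(n_classes)]
    # Quadratic weights penalize large score disagreements more
    W = [[(i - j) ** 2 / (n_classes - 1) ** 2 for j in range(n_classes)]
         for i in range(n_classes)]
    num = sum(W[i][j] * O[i][j] for i in range(n_classes) for j in range(n_classes))
    den = sum(W[i][j] * E[i][j] for i in range(n_classes) for j in range(n_classes))
    return 1.0 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
```

Note that the reported Loss equals Mse (consistent with an MSE regression objective), and Rmse is just its square root: sqrt(0.7885) ≈ 0.8880.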

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
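
With `lr_scheduler_type: linear`, the learning rate decays linearly from 2e-05 toward 0 over the scheduled number of steps. A minimal sketch of that schedule, assuming zero warmup steps (the values are illustrative; the step count is not stated in this card):

```python
def linear_lr(step, total_steps, base_lr=2e-05):
    # Linear decay from base_lr at step 0 down to 0 at total_steps,
    # matching a "linear" scheduler with no warmup.
    return base_lr * max(0.0, 1.0 - step / total_steps)

# Example: if the full 100-epoch run were scheduled over 4600 steps,
# the learning rate halfway through would be 1e-05.
halfway = linear_lr(2300, 4600)
```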

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0435 2 3.9282 -0.0032 3.9282 1.9820
No log 0.0870 4 1.7856 -0.0144 1.7856 1.3363
No log 0.1304 6 1.2941 0.0 1.2941 1.1376
No log 0.1739 8 1.1610 0.0760 1.1610 1.0775
No log 0.2174 10 1.2165 0.0380 1.2165 1.1030
No log 0.2609 12 1.2399 -0.0148 1.2399 1.1135
No log 0.3043 14 1.0220 0.3104 1.0220 1.0109
No log 0.3478 16 1.0067 0.2787 1.0067 1.0033
No log 0.3913 18 1.0050 0.2515 1.0050 1.0025
No log 0.4348 20 1.0936 0.2611 1.0936 1.0458
No log 0.4783 22 1.5421 -0.0411 1.5421 1.2418
No log 0.5217 24 1.5728 0.0 1.5728 1.2541
No log 0.5652 26 1.2972 -0.0411 1.2972 1.1389
No log 0.6087 28 1.0551 0.1874 1.0551 1.0272
No log 0.6522 30 1.0508 0.2145 1.0508 1.0251
No log 0.6957 32 1.1523 0.1233 1.1523 1.0734
No log 0.7391 34 1.4576 0.0380 1.4576 1.2073
No log 0.7826 36 1.8461 0.1522 1.8461 1.3587
No log 0.8261 38 1.7729 0.1442 1.7729 1.3315
No log 0.8696 40 1.3756 0.1770 1.3756 1.1729
No log 0.9130 42 1.0749 0.1858 1.0749 1.0368
No log 0.9565 44 1.0654 0.1416 1.0654 1.0322
No log 1.0 46 1.3240 0.2661 1.3240 1.1506
No log 1.0435 48 1.3327 0.2489 1.3327 1.1544
No log 1.0870 50 1.0321 0.1667 1.0321 1.0159
No log 1.1304 52 0.9838 0.3663 0.9838 0.9919
No log 1.1739 54 1.1530 0.3394 1.1530 1.0738
No log 1.2174 56 1.0612 0.3414 1.0612 1.0301
No log 1.2609 58 0.8607 0.4544 0.8607 0.9278
No log 1.3043 60 0.8695 0.3891 0.8695 0.9325
No log 1.3478 62 0.8160 0.5185 0.8160 0.9033
No log 1.3913 64 0.9031 0.2991 0.9031 0.9503
No log 1.4348 66 1.0414 0.2878 1.0414 1.0205
No log 1.4783 68 1.0069 0.3491 1.0069 1.0035
No log 1.5217 70 0.8774 0.4123 0.8774 0.9367
No log 1.5652 72 0.9296 0.3890 0.9296 0.9642
No log 1.6087 74 0.9888 0.3049 0.9888 0.9944
No log 1.6522 76 0.8958 0.3890 0.8958 0.9465
No log 1.6957 78 0.8058 0.5131 0.8058 0.8977
No log 1.7391 80 0.8450 0.4359 0.8450 0.9192
No log 1.7826 82 0.8645 0.3861 0.8645 0.9298
No log 1.8261 84 0.8407 0.4231 0.8407 0.9169
No log 1.8696 86 0.7842 0.5485 0.7842 0.8855
No log 1.9130 88 0.7624 0.5972 0.7624 0.8732
No log 1.9565 90 0.7819 0.5234 0.7819 0.8842
No log 2.0 92 0.9685 0.4342 0.9685 0.9841
No log 2.0435 94 1.0420 0.4783 1.0420 1.0208
No log 2.0870 96 0.9170 0.4342 0.9170 0.9576
No log 2.1304 98 0.8072 0.5103 0.8072 0.8984
No log 2.1739 100 0.8326 0.4483 0.8326 0.9125
No log 2.2174 102 0.9884 0.3243 0.9884 0.9942
No log 2.2609 104 1.4484 0.1131 1.4484 1.2035
No log 2.3043 106 1.5666 -0.0347 1.5666 1.2516
No log 2.3478 108 1.2228 0.0951 1.2228 1.1058
No log 2.3913 110 0.9105 0.3494 0.9105 0.9542
No log 2.4348 112 0.7819 0.5116 0.7819 0.8843
No log 2.4783 114 0.7792 0.6057 0.7792 0.8827
No log 2.5217 116 0.7890 0.4646 0.7890 0.8883
No log 2.5652 118 0.8096 0.4631 0.8096 0.8998
No log 2.6087 120 0.7975 0.4630 0.7975 0.8930
No log 2.6522 122 0.8176 0.5380 0.8176 0.9042
No log 2.6957 124 0.7922 0.6263 0.7922 0.8900
No log 2.7391 126 0.8246 0.6075 0.8246 0.9081
No log 2.7826 128 0.9093 0.4938 0.9093 0.9536
No log 2.8261 130 0.9516 0.4930 0.9516 0.9755
No log 2.8696 132 0.9830 0.4836 0.9830 0.9915
No log 2.9130 134 1.0935 0.4634 1.0935 1.0457
No log 2.9565 136 0.9173 0.4729 0.9173 0.9578
No log 3.0 138 0.7769 0.5944 0.7769 0.8814
No log 3.0435 140 0.7706 0.5205 0.7706 0.8778
No log 3.0870 142 0.7559 0.6039 0.7559 0.8694
No log 3.1304 144 0.8034 0.5708 0.8034 0.8963
No log 3.1739 146 0.8195 0.5366 0.8195 0.9052
No log 3.2174 148 0.7366 0.5534 0.7366 0.8582
No log 3.2609 150 0.7459 0.6325 0.7459 0.8637
No log 3.3043 152 0.7461 0.6325 0.7461 0.8638
No log 3.3478 154 0.7279 0.6510 0.7279 0.8532
No log 3.3913 156 0.7361 0.6414 0.7361 0.8580
No log 3.4348 158 0.7536 0.6510 0.7536 0.8681
No log 3.4783 160 0.7654 0.5813 0.7654 0.8749
No log 3.5217 162 0.8675 0.5673 0.8675 0.9314
No log 3.5652 164 0.8775 0.5666 0.8775 0.9367
No log 3.6087 166 0.7833 0.5793 0.7833 0.8850
No log 3.6522 168 0.7636 0.5721 0.7636 0.8739
No log 3.6957 170 0.7639 0.5610 0.7639 0.8740
No log 3.7391 172 0.7530 0.6380 0.7530 0.8678
No log 3.7826 174 0.7795 0.4850 0.7795 0.8829
No log 3.8261 176 0.7539 0.5220 0.7539 0.8682
No log 3.8696 178 0.7130 0.5032 0.7130 0.8444
No log 3.9130 180 0.7070 0.5428 0.7070 0.8409
No log 3.9565 182 0.6882 0.5542 0.6882 0.8296
No log 4.0 184 0.7211 0.5787 0.7211 0.8492
No log 4.0435 186 0.8015 0.5470 0.8015 0.8953
No log 4.0870 188 0.7654 0.5470 0.7654 0.8748
No log 4.1304 190 0.7553 0.5810 0.7553 0.8691
No log 4.1739 192 0.8172 0.5122 0.8172 0.9040
No log 4.2174 194 0.8878 0.4490 0.8878 0.9423
No log 4.2609 196 0.7589 0.5521 0.7589 0.8711
No log 4.3043 198 0.6648 0.6528 0.6648 0.8153
No log 4.3478 200 0.6787 0.6133 0.6787 0.8238
No log 4.3913 202 0.7212 0.5292 0.7212 0.8492
No log 4.4348 204 0.6970 0.6199 0.6970 0.8349
No log 4.4783 206 0.6521 0.6470 0.6521 0.8075
No log 4.5217 208 0.6791 0.6078 0.6791 0.8241
No log 4.5652 210 0.6734 0.5868 0.6734 0.8206
No log 4.6087 212 0.6734 0.6014 0.6734 0.8206
No log 4.6522 214 0.7490 0.5048 0.7490 0.8654
No log 4.6957 216 0.7382 0.5079 0.7382 0.8592
No log 4.7391 218 0.6840 0.5874 0.6840 0.8270
No log 4.7826 220 0.6805 0.5871 0.6805 0.8249
No log 4.8261 222 0.6995 0.5546 0.6995 0.8363
No log 4.8696 224 0.7531 0.5292 0.7531 0.8678
No log 4.9130 226 0.7296 0.5062 0.7296 0.8542
No log 4.9565 228 0.6852 0.6593 0.6852 0.8277
No log 5.0 230 0.6891 0.6564 0.6891 0.8301
No log 5.0435 232 0.7017 0.5839 0.7017 0.8377
No log 5.0870 234 0.6818 0.6606 0.6818 0.8257
No log 5.1304 236 0.7141 0.6053 0.7141 0.8450
No log 5.1739 238 0.7711 0.5041 0.7711 0.8781
No log 5.2174 240 0.7208 0.5305 0.7208 0.8490
No log 5.2609 242 0.7212 0.4849 0.7212 0.8492
No log 5.3043 244 0.7870 0.4513 0.7870 0.8872
No log 5.3478 246 0.7768 0.4493 0.7768 0.8814
No log 5.3913 248 0.7427 0.5180 0.7427 0.8618
No log 5.4348 250 0.7405 0.5408 0.7405 0.8605
No log 5.4783 252 0.7435 0.5066 0.7435 0.8622
No log 5.5217 254 0.7543 0.5313 0.7543 0.8685
No log 5.5652 256 0.7674 0.5625 0.7674 0.8760
No log 5.6087 258 0.7621 0.5625 0.7621 0.8730
No log 5.6522 260 0.7497 0.5633 0.7497 0.8658
No log 5.6957 262 0.7506 0.4842 0.7506 0.8664
No log 5.7391 264 0.7491 0.5060 0.7491 0.8655
No log 5.7826 266 0.7345 0.5174 0.7345 0.8570
No log 5.8261 268 0.7151 0.5606 0.7151 0.8457
No log 5.8696 270 0.7189 0.6007 0.7189 0.8479
No log 5.9130 272 0.7948 0.5439 0.7948 0.8915
No log 5.9565 274 0.9066 0.4693 0.9066 0.9522
No log 6.0 276 0.9171 0.4387 0.9171 0.9576
No log 6.0435 278 0.7814 0.6301 0.7814 0.8839
No log 6.0870 280 0.7281 0.6343 0.7281 0.8533
No log 6.1304 282 0.7256 0.6343 0.7256 0.8518
No log 6.1739 284 0.7284 0.6768 0.7284 0.8535
No log 6.2174 286 0.7363 0.6266 0.7363 0.8581
No log 6.2609 288 0.7468 0.5658 0.7468 0.8642
No log 6.3043 290 0.7377 0.5046 0.7377 0.8589
No log 6.3478 292 0.7124 0.5279 0.7124 0.8440
No log 6.3913 294 0.6989 0.5725 0.6989 0.8360
No log 6.4348 296 0.7357 0.5415 0.7357 0.8577
No log 6.4783 298 0.7509 0.5188 0.7509 0.8665
No log 6.5217 300 0.7294 0.5735 0.7294 0.8540
No log 6.5652 302 0.7522 0.5070 0.7522 0.8673
No log 6.6087 304 0.8062 0.4695 0.8062 0.8979
No log 6.6522 306 0.7754 0.5266 0.7754 0.8805
No log 6.6957 308 0.7215 0.5980 0.7215 0.8494
No log 6.7391 310 0.7302 0.6134 0.7302 0.8545
No log 6.7826 312 0.7513 0.5530 0.7513 0.8668
No log 6.8261 314 0.7173 0.5716 0.7173 0.8469
No log 6.8696 316 0.7276 0.4724 0.7276 0.8530
No log 6.9130 318 0.7669 0.5231 0.7669 0.8757
No log 6.9565 320 0.7318 0.4829 0.7318 0.8555
No log 7.0 322 0.6954 0.5720 0.6954 0.8339
No log 7.0435 324 0.6936 0.5969 0.6936 0.8328
No log 7.0870 326 0.6925 0.6006 0.6925 0.8322
No log 7.1304 328 0.7295 0.4815 0.7295 0.8541
No log 7.1739 330 0.7935 0.4695 0.7935 0.8908
No log 7.2174 332 0.8054 0.4695 0.8054 0.8974
No log 7.2609 334 0.7328 0.5267 0.7328 0.8561
No log 7.3043 336 0.7097 0.6205 0.7097 0.8424
No log 7.3478 338 0.7451 0.5804 0.7451 0.8632
No log 7.3913 340 0.7332 0.5930 0.7332 0.8563
No log 7.4348 342 0.7379 0.5592 0.7379 0.8590
No log 7.4783 344 0.8523 0.4577 0.8523 0.9232
No log 7.5217 346 0.8479 0.4349 0.8479 0.9208
No log 7.5652 348 0.7456 0.4350 0.7456 0.8635
No log 7.6087 350 0.7047 0.5160 0.7047 0.8394
No log 7.6522 352 0.7158 0.4691 0.7158 0.8460
No log 7.6957 354 0.7184 0.4914 0.7184 0.8476
No log 7.7391 356 0.7432 0.3802 0.7432 0.8621
No log 7.7826 358 0.8320 0.4326 0.8320 0.9122
No log 7.8261 360 0.8408 0.4318 0.8408 0.9169
No log 7.8696 362 0.7774 0.4335 0.7774 0.8817
No log 7.9130 364 0.7211 0.4878 0.7211 0.8492
No log 7.9565 366 0.7307 0.5759 0.7307 0.8548
No log 8.0 368 0.7382 0.5748 0.7382 0.8592
No log 8.0435 370 0.7270 0.5748 0.7270 0.8526
No log 8.0870 372 0.7104 0.5393 0.7104 0.8428
No log 8.1304 374 0.7077 0.5809 0.7077 0.8412
No log 8.1739 376 0.6986 0.5402 0.6986 0.8358
No log 8.2174 378 0.6918 0.5847 0.6918 0.8317
No log 8.2609 380 0.6898 0.5025 0.6898 0.8306
No log 8.3043 382 0.7109 0.5054 0.7109 0.8431
No log 8.3478 384 0.7400 0.5215 0.7400 0.8602
No log 8.3913 386 0.7575 0.5686 0.7575 0.8704
No log 8.4348 388 0.7447 0.5653 0.7447 0.8630
No log 8.4783 390 0.7230 0.5261 0.7230 0.8503
No log 8.5217 392 0.7413 0.5089 0.7413 0.8610
No log 8.5652 394 0.7439 0.5340 0.7439 0.8625
No log 8.6087 396 0.7527 0.4828 0.7527 0.8676
No log 8.6522 398 0.7651 0.4705 0.7651 0.8747
No log 8.6957 400 0.7755 0.4612 0.7755 0.8806
No log 8.7391 402 0.8123 0.5056 0.8123 0.9013
No log 8.7826 404 0.8562 0.4579 0.8562 0.9253
No log 8.8261 406 0.8629 0.4579 0.8629 0.9289
No log 8.8696 408 0.8262 0.4916 0.8262 0.9089
No log 8.9130 410 0.7999 0.4926 0.7999 0.8944
No log 8.9565 412 0.7374 0.5333 0.7374 0.8587
No log 9.0 414 0.7206 0.5797 0.7206 0.8489
No log 9.0435 416 0.7283 0.5688 0.7283 0.8534
No log 9.0870 418 0.7694 0.5208 0.7694 0.8772
No log 9.1304 420 0.8470 0.4357 0.8470 0.9203
No log 9.1739 422 0.8975 0.4573 0.8975 0.9474
No log 9.2174 424 0.8399 0.4310 0.8399 0.9165
No log 9.2609 426 0.7948 0.3782 0.7948 0.8915
No log 9.3043 428 0.7554 0.3782 0.7554 0.8692
No log 9.3478 430 0.7208 0.4576 0.7208 0.8490
No log 9.3913 432 0.7269 0.4180 0.7269 0.8526
No log 9.4348 434 0.7516 0.4349 0.7516 0.8670
No log 9.4783 436 0.8484 0.4592 0.8484 0.9211
No log 9.5217 438 0.9102 0.5140 0.9102 0.9541
No log 9.5652 440 0.9029 0.5353 0.9029 0.9502
No log 9.6087 442 0.9067 0.5144 0.9067 0.9522
No log 9.6522 444 0.9061 0.5140 0.9061 0.9519
No log 9.6957 446 0.8271 0.4935 0.8271 0.9094
No log 9.7391 448 0.7879 0.4660 0.7879 0.8876
No log 9.7826 450 0.8005 0.4691 0.8005 0.8947
No log 9.8261 452 0.7850 0.4819 0.7850 0.8860
No log 9.8696 454 0.7732 0.5274 0.7732 0.8793
No log 9.9130 456 0.7777 0.5565 0.7777 0.8818
No log 9.9565 458 0.8225 0.5266 0.8225 0.9069
No log 10.0 460 0.8853 0.4796 0.8853 0.9409
No log 10.0435 462 0.9417 0.5 0.9417 0.9704
No log 10.0870 464 0.8854 0.4349 0.8854 0.9409
No log 10.1304 466 0.7677 0.5083 0.7677 0.8762
No log 10.1739 468 0.7264 0.5259 0.7264 0.8523
No log 10.2174 470 0.7279 0.5160 0.7279 0.8531
No log 10.2609 472 0.7202 0.4987 0.7202 0.8486
No log 10.3043 474 0.7185 0.4987 0.7185 0.8476
No log 10.3478 476 0.7279 0.4593 0.7279 0.8532
No log 10.3913 478 0.7643 0.4932 0.7643 0.8742
No log 10.4348 480 0.8344 0.4579 0.8344 0.9135
No log 10.4783 482 0.9184 0.4794 0.9184 0.9583
No log 10.5217 484 0.9031 0.4807 0.9031 0.9503
No log 10.5652 486 0.8506 0.5279 0.8506 0.9223
No log 10.6087 488 0.8325 0.5198 0.8325 0.9124
No log 10.6522 490 0.7991 0.5134 0.7991 0.8939
No log 10.6957 492 0.7648 0.5570 0.7648 0.8746
No log 10.7391 494 0.7547 0.5319 0.7547 0.8687
No log 10.7826 496 0.7551 0.5098 0.7551 0.8690
No log 10.8261 498 0.7787 0.5074 0.7787 0.8824
0.2813 10.8696 500 0.7888 0.4937 0.7888 0.8882
0.2813 10.9130 502 0.7919 0.5157 0.7919 0.8899
0.2813 10.9565 504 0.7709 0.4921 0.7709 0.8780
0.2813 11.0 506 0.7501 0.5266 0.7501 0.8661
0.2813 11.0435 508 0.7155 0.5509 0.7155 0.8459
0.2813 11.0870 510 0.6830 0.5540 0.6830 0.8265
0.2813 11.1304 512 0.6597 0.5467 0.6597 0.8122
0.2813 11.1739 514 0.6651 0.6393 0.6651 0.8155
0.2813 11.2174 516 0.6656 0.6256 0.6656 0.8159
0.2813 11.2609 518 0.6446 0.6804 0.6446 0.8028
0.2813 11.3043 520 0.6682 0.5688 0.6682 0.8174
0.2813 11.3478 522 0.7457 0.5458 0.7457 0.8635
0.2813 11.3913 524 0.7454 0.5266 0.7454 0.8634
0.2813 11.4348 526 0.6993 0.5509 0.6993 0.8363
0.2813 11.4783 528 0.6720 0.4972 0.6720 0.8197
0.2813 11.5217 530 0.6815 0.5108 0.6815 0.8255
0.2813 11.5652 532 0.6842 0.4972 0.6842 0.8272
0.2813 11.6087 534 0.6980 0.5292 0.6980 0.8355
0.2813 11.6522 536 0.6979 0.5266 0.6979 0.8354
0.2813 11.6957 538 0.6956 0.5443 0.6956 0.8340
0.2813 11.7391 540 0.6547 0.6451 0.6547 0.8092
0.2813 11.7826 542 0.6537 0.6444 0.6537 0.8085
0.2813 11.8261 544 0.6603 0.6976 0.6603 0.8126
0.2813 11.8696 546 0.6511 0.6562 0.6511 0.8069
0.2813 11.9130 548 0.6783 0.5650 0.6783 0.8236
0.2813 11.9565 550 0.7056 0.5639 0.7056 0.8400
0.2813 12.0 552 0.7157 0.5639 0.7157 0.8460
0.2813 12.0435 554 0.6974 0.5677 0.6974 0.8351
0.2813 12.0870 556 0.6716 0.5917 0.6716 0.8195
0.2813 12.1304 558 0.6800 0.5626 0.6800 0.8246
0.2813 12.1739 560 0.6959 0.5945 0.6959 0.8342
0.2813 12.2174 562 0.6892 0.5945 0.6892 0.8302
0.2813 12.2609 564 0.6804 0.6491 0.6804 0.8249
0.2813 12.3043 566 0.7047 0.5983 0.7047 0.8395
0.2813 12.3478 568 0.7454 0.5443 0.7454 0.8634
0.2813 12.3913 570 0.7751 0.5119 0.7751 0.8804
0.2813 12.4348 572 0.7544 0.5475 0.7544 0.8686
0.2813 12.4783 574 0.7027 0.5527 0.7027 0.8383
0.2813 12.5217 576 0.6917 0.5245 0.6917 0.8317
0.2813 12.5652 578 0.6989 0.4730 0.6989 0.8360
0.2813 12.6087 580 0.7113 0.4576 0.7113 0.8434
0.2813 12.6522 582 0.7542 0.5306 0.7542 0.8685
0.2813 12.6957 584 0.7707 0.5527 0.7707 0.8779
0.2813 12.7391 586 0.7326 0.5433 0.7326 0.8559
0.2813 12.7826 588 0.7151 0.5709 0.7151 0.8456
0.2813 12.8261 590 0.7230 0.5686 0.7230 0.8503
0.2813 12.8696 592 0.7575 0.5416 0.7575 0.8703
0.2813 12.9130 594 0.7925 0.5491 0.7925 0.8903
0.2813 12.9565 596 0.8107 0.5242 0.8107 0.9004
0.2813 13.0 598 0.7694 0.5697 0.7694 0.8771
0.2813 13.0435 600 0.7362 0.5872 0.7362 0.8580
0.2813 13.0870 602 0.7147 0.5663 0.7147 0.8454
0.2813 13.1304 604 0.7282 0.5305 0.7282 0.8534
0.2813 13.1739 606 0.7255 0.5319 0.7255 0.8517
0.2813 13.2174 608 0.7060 0.4234 0.7060 0.8402
0.2813 13.2609 610 0.6963 0.4363 0.6963 0.8345
0.2813 13.3043 612 0.6956 0.4714 0.6956 0.8340
0.2813 13.3478 614 0.7135 0.5306 0.7135 0.8447
0.2813 13.3913 616 0.7979 0.5475 0.7979 0.8932
0.2813 13.4348 618 0.9036 0.5318 0.9036 0.9506
0.2813 13.4783 620 0.8763 0.5318 0.8763 0.9361
0.2813 13.5217 622 0.7740 0.5242 0.7740 0.8798
0.2813 13.5652 624 0.6885 0.5992 0.6885 0.8298
0.2813 13.6087 626 0.6731 0.6062 0.6731 0.8204
0.2813 13.6522 628 0.6851 0.5363 0.6851 0.8277
0.2813 13.6957 630 0.6981 0.5898 0.6981 0.8355
0.2813 13.7391 632 0.7184 0.6112 0.7184 0.8476
0.2813 13.7826 634 0.7615 0.5595 0.7615 0.8726
0.2813 13.8261 636 0.7960 0.5788 0.7960 0.8922
0.2813 13.8696 638 0.7998 0.5376 0.7998 0.8943
0.2813 13.9130 640 0.7934 0.5180 0.7934 0.8907
0.2813 13.9565 642 0.8085 0.5144 0.8085 0.8992
0.2813 14.0 644 0.8132 0.5140 0.8132 0.9018
0.2813 14.0435 646 0.8311 0.4695 0.8311 0.9116
0.2813 14.0870 648 0.8138 0.4926 0.8138 0.9021
0.2813 14.1304 650 0.7914 0.4450 0.7914 0.8896
0.2813 14.1739 652 0.7785 0.4576 0.7785 0.8823
0.2813 14.2174 654 0.7790 0.5245 0.7790 0.8826
0.2813 14.2609 656 0.7857 0.4644 0.7857 0.8864
0.2813 14.3043 658 0.7848 0.4746 0.7848 0.8859
0.2813 14.3478 660 0.7947 0.4180 0.7947 0.8915
0.2813 14.3913 662 0.8122 0.4180 0.8122 0.9012
0.2813 14.4348 664 0.7885 0.4180 0.7885 0.8880

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model size

  • 0.1B params (F32, Safetensors)