ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run2_AugV5_k20_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7636
  • QWK: 0.6866
  • MSE: 0.7636
  • RMSE: 0.8738
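For reference, these metrics can be reproduced from predictions with scikit-learn; QWK (quadratic weighted kappa) is `cohen_kappa_score` with `weights="quadratic"`, and RMSE is the square root of MSE (note 0.8738 ≈ √0.7636). The labels below are placeholder examples, not this model's outputs:

```python
# Sketch: computing QWK / MSE / RMSE the way such scores are usually reported.
# `y_true` / `y_pred` are made-up example labels for illustration only.
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = [0, 1, 2, 3, 2, 1]
y_pred = [0, 2, 2, 3, 1, 1]

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # quadratic weighted kappa
mse = mean_squared_error(y_true, y_pred)
rmse = mse ** 0.5  # RMSE is just the square root of MSE

print(qwk, mse, rmse)
```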

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0208 2 6.7320 0.0308 6.7320 2.5946
No log 0.0417 4 4.3092 0.0591 4.3092 2.0758
No log 0.0625 6 3.2533 0.0328 3.2533 1.8037
No log 0.0833 8 2.2486 0.1429 2.2486 1.4995
No log 0.1042 10 1.7912 0.1468 1.7912 1.3383
No log 0.125 12 1.7218 0.1869 1.7218 1.3122
No log 0.1458 14 2.0263 0.0174 2.0263 1.4235
No log 0.1667 16 2.1466 0.1094 2.1466 1.4651
No log 0.1875 18 1.9140 0.2131 1.9140 1.3835
No log 0.2083 20 1.7886 0.2521 1.7886 1.3374
No log 0.2292 22 1.8491 0.3150 1.8491 1.3598
No log 0.25 24 1.9577 0.3453 1.9577 1.3992
No log 0.2708 26 2.1414 0.3268 2.1414 1.4633
No log 0.2917 28 3.3852 0.2143 3.3852 1.8399
No log 0.3125 30 3.3859 0.2383 3.3859 1.8401
No log 0.3333 32 2.0134 0.3117 2.0134 1.4190
No log 0.3542 34 1.7098 0.4476 1.7098 1.3076
No log 0.375 36 1.5460 0.3780 1.5460 1.2434
No log 0.3958 38 1.5239 0.384 1.5239 1.2345
No log 0.4167 40 1.6018 0.3906 1.6018 1.2656
No log 0.4375 42 1.7381 0.3768 1.7381 1.3184
No log 0.4583 44 2.2961 0.3158 2.2961 1.5153
No log 0.4792 46 3.0971 0.1608 3.0971 1.7598
No log 0.5 48 2.5605 0.2674 2.5605 1.6002
No log 0.5208 50 2.2464 0.3171 2.2464 1.4988
No log 0.5417 52 1.7678 0.4133 1.7678 1.3296
No log 0.5625 54 1.7344 0.4133 1.7344 1.3170
No log 0.5833 56 1.6856 0.4 1.6856 1.2983
No log 0.6042 58 1.7329 0.4416 1.7329 1.3164
No log 0.625 60 2.4735 0.3333 2.4735 1.5727
No log 0.6458 62 2.4247 0.3842 2.4247 1.5572
No log 0.6667 64 1.4474 0.5269 1.4474 1.2031
No log 0.6875 66 1.1914 0.5379 1.1914 1.0915
No log 0.7083 68 1.1449 0.4818 1.1449 1.0700
No log 0.7292 70 1.2286 0.4460 1.2286 1.1084
No log 0.75 72 1.4402 0.5318 1.4402 1.2001
No log 0.7708 74 1.9217 0.4880 1.9217 1.3862
No log 0.7917 76 1.8539 0.5096 1.8539 1.3616
No log 0.8125 78 1.3610 0.5311 1.3610 1.1666
No log 0.8333 80 1.1290 0.4755 1.1290 1.0625
No log 0.8542 82 1.0960 0.4627 1.0960 1.0469
No log 0.875 84 1.1931 0.5 1.1931 1.0923
No log 0.8958 86 1.3884 0.4841 1.3884 1.1783
No log 0.9167 88 1.3427 0.5063 1.3427 1.1588
No log 0.9375 90 1.4161 0.525 1.4161 1.1900
No log 0.9583 92 1.1330 0.5588 1.1330 1.0644
No log 0.9792 94 0.9546 0.5649 0.9546 0.9770
No log 1.0 96 0.9085 0.6316 0.9085 0.9532
No log 1.0208 98 0.8779 0.6857 0.8779 0.9369
No log 1.0417 100 0.9655 0.6795 0.9655 0.9826
No log 1.0625 102 1.3379 0.5714 1.3379 1.1567
No log 1.0833 104 1.3867 0.5969 1.3867 1.1776
No log 1.1042 106 1.1207 0.6667 1.1207 1.0586
No log 1.125 108 0.8887 0.7190 0.8887 0.9427
No log 1.1458 110 0.9374 0.7083 0.9374 0.9682
No log 1.1667 112 0.9508 0.6573 0.9508 0.9751
No log 1.1875 114 1.0697 0.5644 1.0697 1.0342
No log 1.2083 116 1.4558 0.5806 1.4558 1.2066
No log 1.2292 118 1.4935 0.6010 1.4935 1.2221
No log 1.25 120 1.1356 0.6257 1.1356 1.0656
No log 1.2708 122 0.9012 0.6914 0.9012 0.9493
No log 1.2917 124 0.8843 0.7179 0.8843 0.9404
No log 1.3125 126 0.8840 0.6763 0.8840 0.9402
No log 1.3333 128 0.9055 0.6667 0.9055 0.9516
No log 1.3542 130 0.9188 0.6883 0.9188 0.9585
No log 1.375 132 1.0341 0.6235 1.0341 1.0169
No log 1.3958 134 1.0592 0.6322 1.0592 1.0292
No log 1.4167 136 0.8342 0.7317 0.8342 0.9133
No log 1.4375 138 0.7771 0.7152 0.7771 0.8815
No log 1.4583 140 0.7940 0.7285 0.7940 0.8910
No log 1.4792 142 0.9073 0.6829 0.9073 0.9525
No log 1.5 144 0.8653 0.6968 0.8653 0.9302
No log 1.5208 146 0.8179 0.7226 0.8179 0.9044
No log 1.5417 148 0.8230 0.6800 0.8230 0.9072
No log 1.5625 150 0.8340 0.6951 0.8340 0.9133
No log 1.5833 152 0.8540 0.7081 0.8540 0.9241
No log 1.6042 154 0.8775 0.6577 0.8775 0.9367
No log 1.625 156 0.8466 0.7134 0.8466 0.9201
No log 1.6458 158 0.8226 0.7607 0.8226 0.9070
No log 1.6667 160 0.8355 0.7730 0.8355 0.9140
No log 1.6875 162 0.8480 0.6980 0.8480 0.9209
No log 1.7083 164 0.8490 0.7586 0.8490 0.9214
No log 1.7292 166 0.9161 0.6479 0.9161 0.9572
No log 1.75 168 0.8879 0.6986 0.8879 0.9423
No log 1.7708 170 0.8293 0.7383 0.8293 0.9106
No log 1.7917 172 0.8458 0.6883 0.8458 0.9197
No log 1.8125 174 0.8388 0.7097 0.8388 0.9158
No log 1.8333 176 0.9362 0.6627 0.9362 0.9676
No log 1.8542 178 0.8441 0.7219 0.8441 0.9188
No log 1.875 180 0.7782 0.7326 0.7782 0.8822
No log 1.8958 182 0.7772 0.7468 0.7772 0.8816
No log 1.9167 184 0.8018 0.7261 0.8018 0.8954
No log 1.9375 186 0.8591 0.6803 0.8591 0.9269
No log 1.9583 188 0.8521 0.6897 0.8521 0.9231
No log 1.9792 190 0.7619 0.7417 0.7619 0.8729
No log 2.0 192 0.7229 0.7529 0.7229 0.8502
No log 2.0208 194 0.7309 0.7529 0.7309 0.8550
No log 2.0417 196 0.7125 0.7805 0.7125 0.8441
No log 2.0625 198 0.7191 0.7771 0.7191 0.8480
No log 2.0833 200 0.7621 0.7027 0.7621 0.8730
No log 2.1042 202 0.7526 0.7347 0.7526 0.8675
No log 2.125 204 0.7474 0.7683 0.7474 0.8645
No log 2.1458 206 0.8186 0.7205 0.8186 0.9047
No log 2.1667 208 0.8104 0.7682 0.8104 0.9002
No log 2.1875 210 0.7910 0.7733 0.7910 0.8894
No log 2.2083 212 0.8723 0.7260 0.8723 0.9340
No log 2.2292 214 0.8825 0.7183 0.8825 0.9394
No log 2.25 216 0.8339 0.7222 0.8339 0.9132
No log 2.2708 218 0.8320 0.7261 0.8320 0.9121
No log 2.2917 220 0.9355 0.6467 0.9355 0.9672
No log 2.3125 222 0.8652 0.7209 0.8652 0.9301
No log 2.3333 224 0.7380 0.7857 0.7380 0.8591
No log 2.3542 226 0.7263 0.7532 0.7263 0.8522
No log 2.375 228 0.7331 0.7211 0.7331 0.8562
No log 2.3958 230 0.7096 0.7451 0.7096 0.8424
No log 2.4167 232 0.7569 0.7738 0.7569 0.8700
No log 2.4375 234 0.7906 0.7574 0.7906 0.8891
No log 2.4583 236 0.7087 0.7927 0.7087 0.8419
No log 2.4792 238 0.6784 0.7925 0.6784 0.8237
No log 2.5 240 0.6877 0.7792 0.6877 0.8293
No log 2.5208 242 0.6695 0.7848 0.6695 0.8182
No log 2.5417 244 0.6704 0.7950 0.6704 0.8188
No log 2.5625 246 0.6748 0.7947 0.6748 0.8215
No log 2.5833 248 0.6959 0.7518 0.6959 0.8342
No log 2.6042 250 0.6945 0.7639 0.6945 0.8334
No log 2.625 252 0.7050 0.7692 0.7050 0.8396
No log 2.6458 254 0.7425 0.7545 0.7425 0.8617
No log 2.6667 256 0.8193 0.7594 0.8193 0.9052
No log 2.6875 258 0.8277 0.7725 0.8277 0.9098
No log 2.7083 260 0.7413 0.7797 0.7413 0.8610
No log 2.7292 262 0.7421 0.7586 0.7421 0.8615
No log 2.75 264 0.7577 0.7836 0.7577 0.8705
No log 2.7708 266 0.7819 0.7836 0.7819 0.8842
No log 2.7917 268 0.8008 0.7619 0.8008 0.8949
No log 2.8125 270 0.8081 0.7296 0.8081 0.8990
No log 2.8333 272 0.8024 0.75 0.8024 0.8958
No log 2.8542 274 0.8002 0.75 0.8002 0.8945
No log 2.875 276 0.7884 0.7329 0.7884 0.8879
No log 2.8958 278 0.8119 0.7515 0.8119 0.9010
No log 2.9167 280 0.8218 0.7515 0.8218 0.9065
No log 2.9375 282 0.7920 0.7329 0.7920 0.8899
No log 2.9583 284 0.7715 0.7114 0.7715 0.8783
No log 2.9792 286 0.7689 0.72 0.7689 0.8769
No log 3.0 288 0.7099 0.7329 0.7099 0.8426
No log 3.0208 290 0.7358 0.7362 0.7358 0.8578
No log 3.0417 292 0.7625 0.7470 0.7625 0.8732
No log 3.0625 294 0.7192 0.75 0.7192 0.8480
No log 3.0833 296 0.7473 0.7170 0.7473 0.8645
No log 3.1042 298 0.7617 0.7170 0.7617 0.8728
No log 3.125 300 0.7585 0.7170 0.7585 0.8709
No log 3.1458 302 0.7683 0.7237 0.7683 0.8765
No log 3.1667 304 0.7871 0.7248 0.7871 0.8872
No log 3.1875 306 0.7773 0.7362 0.7773 0.8816
No log 3.2083 308 0.7738 0.7125 0.7738 0.8796
No log 3.2292 310 0.8045 0.7205 0.8045 0.8969
No log 3.25 312 0.8058 0.7394 0.8058 0.8977
No log 3.2708 314 0.7843 0.7453 0.7843 0.8856
No log 3.2917 316 0.7907 0.7561 0.7907 0.8892
No log 3.3125 318 0.8313 0.7399 0.8313 0.9118
No log 3.3333 320 0.7926 0.7586 0.7926 0.8903
No log 3.3542 322 0.7890 0.7368 0.7890 0.8882
No log 3.375 324 0.7605 0.7485 0.7605 0.8721
No log 3.3958 326 0.8223 0.7314 0.8223 0.9068
No log 3.4167 328 0.7888 0.7442 0.7888 0.8881
No log 3.4375 330 0.7108 0.7296 0.7108 0.8431
No log 3.4583 332 0.7125 0.7636 0.7125 0.8441
No log 3.4792 334 0.7267 0.7329 0.7267 0.8524
No log 3.5 336 0.7538 0.7329 0.7538 0.8682
No log 3.5208 338 0.7460 0.7349 0.7460 0.8637
No log 3.5417 340 0.6937 0.7590 0.6937 0.8329
No log 3.5625 342 0.7430 0.7453 0.7430 0.8620
No log 3.5833 344 0.7065 0.7578 0.7065 0.8405
No log 3.6042 346 0.6598 0.7439 0.6598 0.8123
No log 3.625 348 0.7043 0.7647 0.7043 0.8392
No log 3.6458 350 0.7981 0.7135 0.7981 0.8934
No log 3.6667 352 0.7697 0.7152 0.7697 0.8773
No log 3.6875 354 0.7255 0.7547 0.7255 0.8518
No log 3.7083 356 0.7210 0.7547 0.7210 0.8491
No log 3.7292 358 0.7397 0.7529 0.7397 0.8601
No log 3.75 360 0.7138 0.7630 0.7138 0.8449
No log 3.7708 362 0.6838 0.7654 0.6838 0.8270
No log 3.7917 364 0.6664 0.7545 0.6664 0.8163
No log 3.8125 366 0.6492 0.7545 0.6492 0.8058
No log 3.8333 368 0.6562 0.7816 0.6562 0.8100
No log 3.8542 370 0.6560 0.7929 0.6560 0.8099
No log 3.875 372 0.6559 0.7821 0.6559 0.8099
No log 3.8958 374 0.6774 0.7843 0.6774 0.8231
No log 3.9167 376 0.6966 0.8054 0.6966 0.8346
No log 3.9375 378 0.7314 0.7211 0.7314 0.8552
No log 3.9583 380 0.7121 0.7211 0.7121 0.8439
No log 3.9792 382 0.6643 0.7792 0.6643 0.8151
No log 4.0 384 0.6513 0.7595 0.6513 0.8070
No log 4.0208 386 0.6728 0.7647 0.6728 0.8203
No log 4.0417 388 0.7339 0.7667 0.7339 0.8567
No log 4.0625 390 0.7387 0.7684 0.7387 0.8595
No log 4.0833 392 0.7109 0.7613 0.7109 0.8432
No log 4.1042 394 0.7575 0.7222 0.7575 0.8704
No log 4.125 396 0.7850 0.7273 0.7850 0.8860
No log 4.1458 398 0.7490 0.7347 0.7490 0.8654
No log 4.1667 400 0.7199 0.7516 0.7199 0.8484
No log 4.1875 402 0.7164 0.7425 0.7164 0.8464
No log 4.2083 404 0.6546 0.7590 0.6546 0.8090
No log 4.2292 406 0.6702 0.7662 0.6702 0.8186
No log 4.25 408 0.6788 0.7815 0.6788 0.8239
No log 4.2708 410 0.6526 0.7742 0.6526 0.8078
No log 4.2917 412 0.7491 0.7152 0.7491 0.8655
No log 4.3125 414 0.7744 0.7229 0.7744 0.8800
No log 4.3333 416 0.6849 0.7329 0.6849 0.8276
No log 4.3542 418 0.6576 0.7561 0.6576 0.8109
No log 4.375 420 0.6495 0.75 0.6495 0.8059
No log 4.3958 422 0.6627 0.7558 0.6627 0.8140
No log 4.4167 424 0.7623 0.7826 0.7623 0.8731
No log 4.4375 426 0.7314 0.7598 0.7314 0.8552
No log 4.4583 428 0.6361 0.7692 0.6361 0.7976
No log 4.4792 430 0.6553 0.7547 0.6553 0.8095
No log 4.5 432 0.6785 0.7651 0.6785 0.8237
No log 4.5208 434 0.6890 0.7517 0.6890 0.8301
No log 4.5417 436 0.6654 0.7651 0.6654 0.8157
No log 4.5625 438 0.6642 0.7417 0.6642 0.8150
No log 4.5833 440 0.6713 0.7417 0.6713 0.8194
No log 4.6042 442 0.6647 0.7467 0.6647 0.8153
No log 4.625 444 0.6278 0.7613 0.6278 0.7923
No log 4.6458 446 0.6481 0.7683 0.6481 0.8051
No log 4.6667 448 0.6812 0.7545 0.6812 0.8253
No log 4.6875 450 0.6799 0.7578 0.6799 0.8246
No log 4.7083 452 0.6899 0.7763 0.6899 0.8306
No log 4.7292 454 0.7080 0.7632 0.7080 0.8414
No log 4.75 456 0.7013 0.7712 0.7013 0.8374
No log 4.7708 458 0.6964 0.7778 0.6964 0.8345
No log 4.7917 460 0.6997 0.7841 0.6997 0.8365
No log 4.8125 462 0.7563 0.7869 0.7563 0.8696
No log 4.8333 464 0.7243 0.7753 0.7243 0.8510
No log 4.8542 466 0.6706 0.7574 0.6706 0.8189
No log 4.875 468 0.8067 0.6757 0.8067 0.8982
No log 4.8958 470 0.9504 0.5942 0.9504 0.9749
No log 4.9167 472 0.8927 0.6429 0.8927 0.9448
No log 4.9375 474 0.7833 0.7183 0.7833 0.8850
No log 4.9583 476 0.7430 0.7483 0.7430 0.8620
No log 4.9792 478 0.7108 0.7483 0.7108 0.8431
No log 5.0 480 0.7026 0.6912 0.7026 0.8382
No log 5.0208 482 0.7143 0.6861 0.7143 0.8452
No log 5.0417 484 0.6791 0.7550 0.6791 0.8241
No log 5.0625 486 0.7132 0.7260 0.7132 0.8445
No log 5.0833 488 0.8223 0.6812 0.8223 0.9068
No log 5.1042 490 0.8341 0.6861 0.8341 0.9133
No log 5.125 492 0.7584 0.7260 0.7584 0.8708
No log 5.1458 494 0.6741 0.7799 0.6741 0.8210
No log 5.1667 496 0.6879 0.7711 0.6879 0.8294
No log 5.1875 498 0.7360 0.7251 0.7360 0.8579
0.4192 5.2083 500 0.7285 0.7614 0.7285 0.8535
0.4192 5.2292 502 0.6818 0.7771 0.6818 0.8257
0.4192 5.25 504 0.7394 0.7717 0.7394 0.8599
0.4192 5.2708 506 0.7074 0.7735 0.7074 0.8411
0.4192 5.2917 508 0.6295 0.7765 0.6295 0.7934
0.4192 5.3125 510 0.6210 0.8075 0.6210 0.7880
0.4192 5.3333 512 0.6400 0.7925 0.6400 0.8000
0.4192 5.3542 514 0.6758 0.7848 0.6758 0.8221
0.4192 5.375 516 0.7072 0.7722 0.7072 0.8409
0.4192 5.3958 518 0.6814 0.7758 0.6814 0.8255
0.4192 5.4167 520 0.6292 0.7811 0.6292 0.7932
0.4192 5.4375 522 0.6212 0.8 0.6212 0.7881
0.4192 5.4583 524 0.6385 0.7861 0.6385 0.7991
0.4192 5.4792 526 0.6580 0.7836 0.6580 0.8112
0.4192 5.5 528 0.6929 0.7727 0.6929 0.8324
0.4192 5.5208 530 0.7111 0.7665 0.7111 0.8433
0.4192 5.5417 532 0.7498 0.7294 0.7498 0.8659
0.4192 5.5625 534 0.7549 0.7514 0.7549 0.8689
0.4192 5.5833 536 0.6792 0.7929 0.6792 0.8242
0.4192 5.6042 538 0.6266 0.8098 0.6266 0.7916
0.4192 5.625 540 0.6889 0.7347 0.6889 0.8300
0.4192 5.6458 542 0.7102 0.7397 0.7102 0.8427
0.4192 5.6667 544 0.6709 0.7722 0.6709 0.8191
0.4192 5.6875 546 0.6599 0.775 0.6599 0.8123
0.4192 5.7083 548 0.6562 0.7730 0.6562 0.8101
0.4192 5.7292 550 0.6546 0.8024 0.6546 0.8091
0.4192 5.75 552 0.7010 0.7753 0.7010 0.8373
0.4192 5.7708 554 0.6911 0.7872 0.6911 0.8313
0.4192 5.7917 556 0.6423 0.8128 0.6423 0.8014
0.4192 5.8125 558 0.6234 0.8065 0.6234 0.7896
0.4192 5.8333 560 0.6188 0.7956 0.6188 0.7866
0.4192 5.8542 562 0.6202 0.7953 0.6202 0.7875
0.4192 5.875 564 0.6270 0.7701 0.6270 0.7918
0.4192 5.8958 566 0.6247 0.7746 0.6247 0.7904
0.4192 5.9167 568 0.6257 0.7797 0.6257 0.7910
0.4192 5.9375 570 0.6214 0.7841 0.6214 0.7883
0.4192 5.9583 572 0.6217 0.7910 0.6217 0.7885
0.4192 5.9792 574 0.6182 0.7797 0.6182 0.7862
0.4192 6.0 576 0.6138 0.7910 0.6138 0.7834
0.4192 6.0208 578 0.6190 0.7953 0.6190 0.7868
0.4192 6.0417 580 0.6289 0.7976 0.6289 0.7930
0.4192 6.0625 582 0.6442 0.8049 0.6442 0.8026
0.4192 6.0833 584 0.6393 0.8075 0.6393 0.7996
0.4192 6.1042 586 0.6359 0.8098 0.6359 0.7974
0.4192 6.125 588 0.6782 0.7662 0.6782 0.8235
0.4192 6.1458 590 0.6814 0.7683 0.6814 0.8255
0.4192 6.1667 592 0.6205 0.8121 0.6205 0.7877
0.4192 6.1875 594 0.5840 0.7977 0.5840 0.7642
0.4192 6.2083 596 0.5874 0.7886 0.5874 0.7664
0.4192 6.2292 598 0.6001 0.7841 0.6001 0.7747
0.4192 6.25 600 0.6703 0.7684 0.6703 0.8187
0.4192 6.2708 602 0.7341 0.7614 0.7341 0.8568
0.4192 6.2917 604 0.6940 0.7684 0.6940 0.8331
0.4192 6.3125 606 0.6032 0.7619 0.6032 0.7767
0.4192 6.3333 608 0.5987 0.7811 0.5987 0.7737
0.4192 6.3542 610 0.5937 0.7841 0.5937 0.7705
0.4192 6.375 612 0.6255 0.7802 0.6255 0.7909
0.4192 6.3958 614 0.6919 0.7957 0.6919 0.8318
0.4192 6.4167 616 0.6473 0.7957 0.6473 0.8046
0.4192 6.4375 618 0.5910 0.7978 0.5910 0.7687
0.4192 6.4583 620 0.6595 0.7879 0.6595 0.8121
0.4192 6.4792 622 0.6744 0.7778 0.6744 0.8212
0.4192 6.5 624 0.6374 0.8025 0.6374 0.7984
0.4192 6.5208 626 0.6376 0.8121 0.6376 0.7985
0.4192 6.5417 628 0.6734 0.7976 0.6734 0.8206
0.4192 6.5625 630 0.6862 0.8049 0.6862 0.8284
0.4192 6.5833 632 0.6749 0.8049 0.6749 0.8215
0.4192 6.6042 634 0.6474 0.8047 0.6474 0.8046
0.4192 6.625 636 0.6241 0.8095 0.6241 0.7900
0.4192 6.6458 638 0.6493 0.8193 0.6493 0.8058
0.4192 6.6667 640 0.6619 0.8098 0.6619 0.8136
0.4192 6.6875 642 0.6403 0.8049 0.6403 0.8002
0.4192 6.7083 644 0.6355 0.7799 0.6355 0.7972
0.4192 6.7292 646 0.6663 0.7532 0.6663 0.8163
0.4192 6.75 648 0.7705 0.6620 0.7705 0.8778
0.4192 6.7708 650 0.8472 0.6232 0.8472 0.9204
0.4192 6.7917 652 0.8442 0.6119 0.8442 0.9188
0.4192 6.8125 654 0.7636 0.6866 0.7636 0.8738
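A few things can be read off the table above: "No log" in the Training Loss column means no training loss had been logged yet (the first logged value, 0.4192, appears at step 500), and since epoch 1.0 is reached at step 96 with a batch size of 8, the training set evidently holds roughly 768 examples. A quick sanity check of that arithmetic:

```python
# Back-of-the-envelope estimate of the training-set size from the log table.
steps_per_epoch = 96        # the table reaches epoch 1.0 at step 96
train_batch_size = 8        # from the hyperparameters section
approx_train_examples = steps_per_epoch * train_batch_size
print(approx_train_examples)  # approximate; the final batch may be partial
```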

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B parameters (F32, safetensors)

Model tree for MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run2_AugV5_k20_task1_organization

Finetuned from aubmindlab/bert-base-arabertv02