ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k16_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9797
  • Qwk: 0.5203
  • Mse: 0.9797
  • Rmse: 0.9898
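
Here, Qwk is quadratic weighted kappa (Cohen's kappa with quadratic penalty weights); Loss coincides with Mse, which suggests an MSE regression objective, and Rmse is its square root (√0.9797 ≈ 0.9898). A minimal, dependency-free sketch of these metrics (the labels below are illustrative, not from this model's evaluation set):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights (the Qwk metric above)."""
    # Observed confusion matrix O and its marginal histograms.
    O = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    n = len(y_true)
    hist_true = [sum(row) for row in O]
    hist_pred = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2   # quadratic disagreement penalty
            num += w * O[i][j]                         # observed weighted disagreement
            den += w * hist_true[i] * hist_pred[j] / n # disagreement expected by chance
    return 1.0 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# Illustrative labels (not this model's actual evaluation data).
y_true = [0, 1, 2, 2, 1]
y_pred = [0, 2, 2, 1, 1]
print(quadratic_weighted_kappa(y_true, y_pred, 3))
print(mse(y_true, y_pred), math.sqrt(mse(y_true, y_pred)))
```

With scikit-learn available, the same numbers come from `cohen_kappa_score(y_true, y_pred, weights="quadratic")` and `mean_squared_error`.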

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
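
With lr_scheduler_type set to linear, the learning rate ramps down in a straight line from 2e-05 to zero over the scheduled training steps. A small sketch of that schedule (`warmup_steps` is an assumed knob mirroring transformers' `get_linear_schedule_with_warmup`, not a value reported above):

```python
def linear_lr(step, total_steps, base_lr=2e-5, warmup_steps=0):
    """Linear schedule: optional warmup from 0 up to base_lr,
    then linear decay down to 0 at total_steps."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

# Learning rate at the start, midpoint, and end of training (no warmup).
print(linear_lr(0, 1000), linear_lr(500, 1000), linear_lr(1000, 1000))
```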

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0274 2 7.1735 0.0 7.1735 2.6783
No log 0.0548 4 4.2780 0.0877 4.2780 2.0683
No log 0.0822 6 3.2767 -0.0219 3.2767 1.8102
No log 0.1096 8 2.3004 0.1678 2.3004 1.5167
No log 0.1370 10 2.1986 0.0863 2.1986 1.4828
No log 0.1644 12 2.4038 0.0559 2.4038 1.5504
No log 0.1918 14 2.4547 0.0 2.4547 1.5668
No log 0.2192 16 2.1320 0.0833 2.1320 1.4601
No log 0.2466 18 2.0480 0.1176 2.0480 1.4311
No log 0.2740 20 1.8085 0.2241 1.8085 1.3448
No log 0.3014 22 1.7422 0.2393 1.7422 1.3199
No log 0.3288 24 1.9449 0.2963 1.9449 1.3946
No log 0.3562 26 1.4882 0.3636 1.4882 1.2199
No log 0.3836 28 1.4355 0.2832 1.4355 1.1981
No log 0.4110 30 1.8289 0.2523 1.8289 1.3524
No log 0.4384 32 1.4396 0.2957 1.4396 1.1998
No log 0.4658 34 1.5152 0.4496 1.5152 1.2309
No log 0.4932 36 2.1921 0.3057 2.1921 1.4806
No log 0.5205 38 2.6152 0.1749 2.6152 1.6172
No log 0.5479 40 2.1784 0.2981 2.1784 1.4759
No log 0.5753 42 1.4639 0.3307 1.4639 1.2099
No log 0.6027 44 1.4251 0.2632 1.4251 1.1938
No log 0.6301 46 2.2112 0.0 2.2112 1.4870
No log 0.6575 48 2.1890 -0.0177 2.1890 1.4795
No log 0.6849 50 1.6801 0.3273 1.6801 1.2962
No log 0.7123 52 1.3793 0.4483 1.3793 1.1744
No log 0.7397 54 1.3632 0.4628 1.3632 1.1676
No log 0.7671 56 1.5030 0.3193 1.5030 1.2260
No log 0.7945 58 1.4070 0.3220 1.4070 1.1862
No log 0.8219 60 1.2945 0.4603 1.2945 1.1378
No log 0.8493 62 1.3133 0.5625 1.3133 1.1460
No log 0.8767 64 1.6662 0.3089 1.6662 1.2908
No log 0.9041 66 1.7869 0.2017 1.7869 1.3367
No log 0.9315 68 1.4332 0.3932 1.4332 1.1972
No log 0.9589 70 1.3305 0.4160 1.3305 1.1535
No log 0.9863 72 1.4425 0.3307 1.4425 1.2010
No log 1.0137 74 1.3692 0.3680 1.3692 1.1701
No log 1.0411 76 1.2803 0.4839 1.2803 1.1315
No log 1.0685 78 1.4306 0.4065 1.4306 1.1961
No log 1.0959 80 1.5447 0.3902 1.5447 1.2429
No log 1.1233 82 1.3856 0.4500 1.3856 1.1771
No log 1.1507 84 1.1938 0.4522 1.1938 1.0926
No log 1.1781 86 1.3159 0.3932 1.3159 1.1471
No log 1.2055 88 1.5814 0.4341 1.5814 1.2575
No log 1.2329 90 1.5250 0.4593 1.5250 1.2349
No log 1.2603 92 1.1901 0.4262 1.1901 1.0909
No log 1.2877 94 1.0527 0.4833 1.0527 1.0260
No log 1.3151 96 1.0366 0.5500 1.0366 1.0182
No log 1.3425 98 1.0448 0.6457 1.0448 1.0221
No log 1.3699 100 1.1727 0.5373 1.1727 1.0829
No log 1.3973 102 1.4352 0.3972 1.4352 1.1980
No log 1.4247 104 1.7469 0.3425 1.7469 1.3217
No log 1.4521 106 1.6476 0.3776 1.6476 1.2836
No log 1.4795 108 1.4088 0.4348 1.4088 1.1869
No log 1.5068 110 1.1131 0.5736 1.1131 1.0550
No log 1.5342 112 1.0877 0.5781 1.0877 1.0429
No log 1.5616 114 1.0914 0.5410 1.0914 1.0447
No log 1.5890 116 1.1066 0.5323 1.1066 1.0520
No log 1.6164 118 1.1804 0.5692 1.1804 1.0865
No log 1.6438 120 1.4173 0.4604 1.4173 1.1905
No log 1.6712 122 1.5233 0.4255 1.5233 1.2342
No log 1.6986 124 1.3605 0.4412 1.3605 1.1664
No log 1.7260 126 1.2868 0.5385 1.2868 1.1344
No log 1.7534 128 1.3255 0.5116 1.3255 1.1513
No log 1.7808 130 1.3255 0.4355 1.3255 1.1513
No log 1.8082 132 1.2606 0.4746 1.2606 1.1228
No log 1.8356 134 1.1624 0.5210 1.1624 1.0782
No log 1.8630 136 1.1022 0.5484 1.1022 1.0498
No log 1.8904 138 1.0596 0.5846 1.0596 1.0294
No log 1.9178 140 0.9192 0.6714 0.9192 0.9587
No log 1.9452 142 0.8424 0.6901 0.8424 0.9178
No log 1.9726 144 0.8165 0.6849 0.8165 0.9036
No log 2.0 146 0.7664 0.7200 0.7664 0.8754
No log 2.0274 148 0.8465 0.6974 0.8465 0.9200
No log 2.0548 150 1.0501 0.6434 1.0501 1.0248
No log 2.0822 152 0.9832 0.6621 0.9832 0.9916
No log 2.1096 154 0.8476 0.7211 0.8476 0.9207
No log 2.1370 156 0.8500 0.7075 0.8500 0.9219
No log 2.1644 158 0.8823 0.6853 0.8823 0.9393
No log 2.1918 160 0.8899 0.6765 0.8899 0.9433
No log 2.2192 162 0.9252 0.6190 0.9252 0.9619
No log 2.2466 164 0.9049 0.5528 0.9049 0.9512
No log 2.2740 166 0.9226 0.5397 0.9226 0.9605
No log 2.3014 168 0.8876 0.5484 0.8876 0.9421
No log 2.3288 170 0.8572 0.5366 0.8572 0.9259
No log 2.3562 172 0.7967 0.6457 0.7967 0.8926
No log 2.3836 174 0.8939 0.6074 0.8939 0.9455
No log 2.4110 176 1.0230 0.5882 1.0230 1.0115
No log 2.4384 178 1.0494 0.6043 1.0494 1.0244
No log 2.4658 180 0.8271 0.6759 0.8271 0.9094
No log 2.4932 182 0.6946 0.7568 0.6946 0.8334
No log 2.5205 184 0.7159 0.7467 0.7159 0.8461
No log 2.5479 186 0.7328 0.7619 0.7328 0.8560
No log 2.5753 188 0.8990 0.6861 0.8990 0.9482
No log 2.6027 190 1.1229 0.5736 1.1229 1.0597
No log 2.6301 192 1.2084 0.5124 1.2084 1.0993
No log 2.6575 194 1.1473 0.4746 1.1473 1.0711
No log 2.6849 196 0.9984 0.5124 0.9984 0.9992
No log 2.7123 198 0.9533 0.5373 0.9533 0.9764
No log 2.7397 200 1.0645 0.5874 1.0645 1.0317
No log 2.7671 202 1.0105 0.6111 1.0105 1.0052
No log 2.7945 204 0.8293 0.6479 0.8293 0.9107
No log 2.8219 206 0.8257 0.7286 0.8257 0.9087
No log 2.8493 208 0.9724 0.6418 0.9724 0.9861
No log 2.8767 210 0.9607 0.6617 0.9607 0.9802
No log 2.9041 212 0.8762 0.6015 0.8762 0.9361
No log 2.9315 214 0.8976 0.5778 0.8976 0.9474
No log 2.9589 216 0.9372 0.5630 0.9372 0.9681
No log 2.9863 218 0.9105 0.5778 0.9105 0.9542
No log 3.0137 220 0.8357 0.6331 0.8357 0.9142
No log 3.0411 222 0.7762 0.7260 0.7762 0.8810
No log 3.0685 224 0.7620 0.7297 0.7620 0.8729
No log 3.0959 226 0.7863 0.7297 0.7863 0.8867
No log 3.1233 228 0.8384 0.6806 0.8384 0.9156
No log 3.1507 230 0.8473 0.6809 0.8473 0.9205
No log 3.1781 232 0.8791 0.6815 0.8791 0.9376
No log 3.2055 234 0.8759 0.6815 0.8759 0.9359
No log 3.2329 236 0.8139 0.7448 0.8139 0.9022
No log 3.2603 238 0.7892 0.7034 0.7892 0.8883
No log 3.2877 240 0.8358 0.6713 0.8358 0.9142
No log 3.3151 242 0.8958 0.6143 0.8958 0.9465
No log 3.3425 244 0.8626 0.5985 0.8626 0.9288
No log 3.3699 246 0.8599 0.5985 0.8599 0.9273
No log 3.3973 248 0.8354 0.5985 0.8354 0.9140
No log 3.4247 250 0.7480 0.6620 0.7480 0.8648
No log 3.4521 252 0.7368 0.6853 0.7368 0.8584
No log 3.4795 254 0.7356 0.6434 0.7356 0.8577
No log 3.5068 256 0.7160 0.7211 0.7160 0.8462
No log 3.5342 258 0.7982 0.6944 0.7982 0.8934
No log 3.5616 260 0.8553 0.6853 0.8553 0.9248
No log 3.5890 262 0.8239 0.6944 0.8239 0.9077
No log 3.6164 264 0.7629 0.7483 0.7629 0.8734
No log 3.6438 266 0.7731 0.7483 0.7731 0.8792
No log 3.6712 268 0.7978 0.7273 0.7978 0.8932
No log 3.6986 270 0.8028 0.7083 0.8028 0.8960
No log 3.7260 272 0.8452 0.6277 0.8452 0.9194
No log 3.7534 274 0.9342 0.5797 0.9342 0.9665
No log 3.7808 276 0.9665 0.5630 0.9665 0.9831
No log 3.8082 278 0.9700 0.6370 0.9700 0.9849
No log 3.8356 280 0.9599 0.5954 0.9599 0.9798
No log 3.8630 282 0.9686 0.5873 0.9686 0.9842
No log 3.8904 284 0.9713 0.6119 0.9713 0.9855
No log 3.9178 286 0.9670 0.6154 0.9670 0.9834
No log 3.9452 288 0.9707 0.6202 0.9707 0.9852
No log 3.9726 290 0.9602 0.6154 0.9602 0.9799
No log 4.0 292 1.0365 0.5625 1.0365 1.0181
No log 4.0274 294 1.1076 0.4034 1.1076 1.0524
No log 4.0548 296 1.0356 0.5366 1.0356 1.0177
No log 4.0822 298 0.9835 0.6357 0.9835 0.9917
No log 4.1096 300 1.0150 0.6222 1.0150 1.0075
No log 4.1370 302 0.9372 0.6515 0.9372 0.9681
No log 4.1644 304 0.8470 0.6370 0.8470 0.9203
No log 4.1918 306 0.8743 0.6377 0.8743 0.9350
No log 4.2192 308 0.8808 0.6222 0.8808 0.9385
No log 4.2466 310 0.8794 0.5714 0.8794 0.9378
No log 4.2740 312 0.8859 0.5827 0.8859 0.9412
No log 4.3014 314 0.8483 0.6615 0.8483 0.9210
No log 4.3288 316 0.8222 0.5909 0.8222 0.9068
No log 4.3562 318 0.8232 0.6571 0.8232 0.9073
No log 4.3836 320 0.8256 0.6667 0.8256 0.9086
No log 4.4110 322 0.8074 0.6475 0.8074 0.8986
No log 4.4384 324 0.8050 0.6667 0.8050 0.8972
No log 4.4658 326 0.8246 0.6617 0.8246 0.9081
No log 4.4932 328 0.8247 0.6462 0.8247 0.9081
No log 4.5205 330 0.7882 0.6047 0.7882 0.8878
No log 4.5479 332 0.8900 0.6471 0.8900 0.9434
No log 4.5753 334 0.8935 0.6324 0.8935 0.9452
No log 4.6027 336 0.7922 0.6763 0.7922 0.8901
No log 4.6301 338 0.7251 0.7273 0.7251 0.8515
No log 4.6575 340 0.7112 0.7092 0.7112 0.8433
No log 4.6849 342 0.7028 0.7671 0.7028 0.8383
No log 4.7123 344 0.7243 0.6763 0.7243 0.8510
No log 4.7397 346 0.7175 0.7260 0.7175 0.8471
No log 4.7671 348 0.7218 0.7413 0.7218 0.8496
No log 4.7945 350 0.7274 0.7568 0.7274 0.8529
No log 4.8219 352 0.7789 0.7092 0.7789 0.8825
No log 4.8493 354 0.9082 0.6569 0.9082 0.9530
No log 4.8767 356 0.9749 0.6667 0.9749 0.9874
No log 4.9041 358 0.9996 0.6202 0.9996 0.9998
No log 4.9315 360 1.0167 0.5625 1.0167 1.0083
No log 4.9589 362 0.9928 0.6000 0.9928 0.9964
No log 4.9863 364 0.9310 0.5512 0.9310 0.9649
No log 5.0137 366 0.9248 0.5970 0.9248 0.9617
No log 5.0411 368 0.8866 0.5970 0.8866 0.9416
No log 5.0685 370 0.8290 0.6364 0.8290 0.9105
No log 5.0959 372 0.8232 0.7007 0.8232 0.9073
No log 5.1233 374 0.7824 0.6912 0.7824 0.8846
No log 5.1507 376 0.7478 0.7042 0.7478 0.8647
No log 5.1781 378 0.7695 0.6950 0.7695 0.8772
No log 5.2055 380 0.8531 0.6912 0.8531 0.9236
No log 5.2329 382 0.8917 0.6815 0.8917 0.9443
No log 5.2603 384 0.8832 0.6617 0.8832 0.9398
No log 5.2877 386 0.8455 0.6857 0.8455 0.9195
No log 5.3151 388 0.8075 0.7133 0.8075 0.8986
No log 5.3425 390 0.7723 0.7310 0.7723 0.8788
No log 5.3699 392 0.7255 0.7133 0.7255 0.8517
No log 5.3973 394 0.6970 0.7324 0.6970 0.8349
No log 5.4247 396 0.6835 0.7133 0.6835 0.8268
No log 5.4521 398 0.6698 0.7123 0.6698 0.8184
No log 5.4795 400 0.6555 0.7619 0.6555 0.8096
No log 5.5068 402 0.6482 0.7619 0.6482 0.8051
No log 5.5342 404 0.6835 0.7222 0.6835 0.8267
No log 5.5616 406 0.7681 0.6763 0.7681 0.8764
No log 5.5890 408 0.7547 0.7042 0.7547 0.8687
No log 5.6164 410 0.6984 0.7500 0.6984 0.8357
No log 5.6438 412 0.6946 0.7534 0.6946 0.8334
No log 5.6712 414 0.6852 0.7733 0.6852 0.8278
No log 5.6986 416 0.6602 0.7733 0.6602 0.8125
No log 5.7260 418 0.6152 0.8025 0.6152 0.7844
No log 5.7534 420 0.6117 0.7771 0.6117 0.7821
No log 5.7808 422 0.5667 0.8098 0.5667 0.7528
No log 5.8082 424 0.5199 0.8395 0.5199 0.7210
No log 5.8356 426 0.6214 0.7952 0.6214 0.7883
No log 5.8630 428 0.7663 0.7485 0.7663 0.8754
No log 5.8904 430 0.7686 0.7778 0.7686 0.8767
No log 5.9178 432 0.7581 0.7799 0.7581 0.8707
No log 5.9452 434 0.6792 0.7662 0.6792 0.8241
No log 5.9726 436 0.6116 0.7947 0.6116 0.7821
No log 6.0 438 0.6180 0.7755 0.6180 0.7861
No log 6.0274 440 0.6624 0.7974 0.6624 0.8139
No log 6.0548 442 0.8105 0.6980 0.8105 0.9003
No log 6.0822 444 0.9645 0.6797 0.9645 0.9821
No log 6.1096 446 0.8218 0.7134 0.8218 0.9066
No log 6.1370 448 0.7285 0.7547 0.7285 0.8535
No log 6.1644 450 0.6924 0.7843 0.6924 0.8321
No log 6.1918 452 0.6131 0.7922 0.6131 0.7830
No log 6.2192 454 0.6582 0.7484 0.6582 0.8113
No log 6.2466 456 0.6842 0.7484 0.6842 0.8272
No log 6.2740 458 0.7040 0.75 0.7040 0.8391
No log 6.3014 460 0.7821 0.6939 0.7821 0.8844
No log 6.3288 462 0.8402 0.6712 0.8402 0.9166
No log 6.3562 464 0.8399 0.6897 0.8399 0.9164
No log 6.3836 466 0.8423 0.6897 0.8423 0.9178
No log 6.4110 468 0.8043 0.7123 0.8043 0.8968
No log 6.4384 470 0.7572 0.7190 0.7572 0.8702
No log 6.4658 472 0.7230 0.7273 0.7230 0.8503
No log 6.4932 474 0.7231 0.7564 0.7231 0.8504
No log 6.5205 476 0.7548 0.7355 0.7548 0.8688
No log 6.5479 478 0.8281 0.7020 0.8281 0.9100
No log 6.5753 480 0.8805 0.6667 0.8805 0.9383
No log 6.6027 482 0.8559 0.6577 0.8559 0.9251
No log 6.6301 484 0.8130 0.7172 0.8130 0.9016
No log 6.6575 486 0.8194 0.6812 0.8194 0.9052
No log 6.6849 488 0.8310 0.6812 0.8310 0.9116
No log 6.7123 490 0.8340 0.6715 0.8340 0.9132
No log 6.7397 492 0.9058 0.6429 0.9058 0.9517
No log 6.7671 494 0.9329 0.5522 0.9329 0.9659
No log 6.7945 496 0.9223 0.6222 0.9223 0.9603
No log 6.8219 498 0.9356 0.6906 0.9356 0.9673
0.4752 6.8493 500 1.0122 0.6383 1.0122 1.0061
0.4752 6.8767 502 1.0219 0.6528 1.0219 1.0109
0.4752 6.9041 504 0.8720 0.6892 0.8720 0.9338
0.4752 6.9315 506 0.7272 0.7600 0.7272 0.8528
0.4752 6.9589 508 0.6838 0.7815 0.6838 0.8269
0.4752 6.9863 510 0.7007 0.7755 0.7007 0.8371
0.4752 7.0137 512 0.7529 0.7286 0.7529 0.8677
0.4752 7.0411 514 0.8017 0.7153 0.8017 0.8954
0.4752 7.0685 516 0.7987 0.7059 0.7987 0.8937
0.4752 7.0959 518 0.7469 0.7639 0.7469 0.8642
0.4752 7.1233 520 0.6767 0.7815 0.6767 0.8226
0.4752 7.1507 522 0.6302 0.7975 0.6302 0.7938
0.4752 7.1781 524 0.5943 0.7950 0.5943 0.7709
0.4752 7.2055 526 0.5723 0.8101 0.5723 0.7565
0.4752 7.2329 528 0.6091 0.7922 0.6091 0.7804
0.4752 7.2603 530 0.6471 0.7733 0.6471 0.8044
0.4752 7.2877 532 0.7236 0.7568 0.7236 0.8506
0.4752 7.3151 534 0.8034 0.7286 0.8034 0.8963
0.4752 7.3425 536 0.8453 0.7194 0.8453 0.9194
0.4752 7.3699 538 0.8437 0.7194 0.8437 0.9186
0.4752 7.3973 540 0.7952 0.7101 0.7952 0.8917
0.4752 7.4247 542 0.7435 0.7448 0.7435 0.8623
0.4752 7.4521 544 0.7051 0.7517 0.7051 0.8397
0.4752 7.4795 546 0.7254 0.7568 0.7254 0.8517
0.4752 7.5068 548 0.7397 0.7651 0.7397 0.8601
0.4752 7.5342 550 0.7732 0.7432 0.7732 0.8793
0.4752 7.5616 552 0.7597 0.7682 0.7597 0.8716
0.4752 7.5890 554 0.7744 0.7417 0.7744 0.8800
0.4752 7.6164 556 0.7865 0.7260 0.7865 0.8869
0.4752 7.6438 558 0.7755 0.7333 0.7755 0.8806
0.4752 7.6712 560 0.7352 0.7582 0.7352 0.8575
0.4752 7.6986 562 0.6971 0.8000 0.6971 0.8349
0.4752 7.7260 564 0.6972 0.7417 0.6972 0.8350
0.4752 7.7534 566 0.6968 0.7333 0.6968 0.8347
0.4752 7.7808 568 0.6785 0.7417 0.6785 0.8237
0.4752 7.8082 570 0.6816 0.7922 0.6816 0.8256
0.4752 7.8356 572 0.7425 0.7483 0.7425 0.8617
0.4752 7.8630 574 0.8208 0.7432 0.8208 0.9060
0.4752 7.8904 576 0.8457 0.7361 0.8457 0.9196
0.4752 7.9178 578 0.8498 0.6569 0.8498 0.9218
0.4752 7.9452 580 0.8554 0.6667 0.8554 0.9249
0.4752 7.9726 582 0.8248 0.6906 0.8248 0.9082
0.4752 8.0 584 0.7674 0.6857 0.7674 0.8760
0.4752 8.0274 586 0.7753 0.7211 0.7753 0.8805
0.4752 8.0548 588 0.8221 0.6993 0.8221 0.9067
0.4752 8.0822 590 0.8755 0.6618 0.8755 0.9357
0.4752 8.1096 592 0.8508 0.6519 0.8508 0.9224
0.4752 8.1370 594 0.8496 0.6619 0.8496 0.9217
0.4752 8.1644 596 0.8538 0.6331 0.8538 0.9240
0.4752 8.1918 598 0.8471 0.7123 0.8471 0.9204
0.4752 8.2192 600 0.8033 0.7114 0.8033 0.8963
0.4752 8.2466 602 0.7442 0.6986 0.7442 0.8627
0.4752 8.2740 604 0.7715 0.7467 0.7715 0.8783
0.4752 8.3014 606 0.7787 0.7297 0.7787 0.8824
0.4752 8.3288 608 0.7797 0.7260 0.7797 0.8830
0.4752 8.3562 610 0.8102 0.7234 0.8102 0.9001
0.4752 8.3836 612 0.7866 0.7000 0.7866 0.8869
0.4752 8.4110 614 0.7630 0.75 0.7630 0.8735
0.4752 8.4384 616 0.8101 0.6471 0.8101 0.9001
0.4752 8.4658 618 0.8839 0.6187 0.8839 0.9402
0.4752 8.4932 620 0.9050 0.5693 0.9050 0.9513
0.4752 8.5205 622 0.8843 0.6187 0.8843 0.9404
0.4752 8.5479 624 0.8255 0.6906 0.8255 0.9086
0.4752 8.5753 626 0.8151 0.7397 0.8151 0.9028
0.4752 8.6027 628 0.8415 0.7347 0.8415 0.9173
0.4752 8.6301 630 0.8403 0.7347 0.8403 0.9167
0.4752 8.6575 632 0.8572 0.7347 0.8572 0.9259
0.4752 8.6849 634 0.8682 0.7133 0.8682 0.9317
0.4752 8.7123 636 0.8388 0.7260 0.8388 0.9158
0.4752 8.7397 638 0.8228 0.7273 0.8228 0.9071
0.4752 8.7671 640 0.8253 0.6857 0.8253 0.9085
0.4752 8.7945 642 0.8923 0.6377 0.8923 0.9446
0.4752 8.8219 644 0.9007 0.6475 0.9007 0.9491
0.4752 8.8493 646 0.8452 0.6308 0.8452 0.9193
0.4752 8.8767 648 0.8120 0.6818 0.8120 0.9011
0.4752 8.9041 650 0.8107 0.6963 0.8107 0.9004
0.4752 8.9315 652 0.7479 0.7448 0.7479 0.8648
0.4752 8.9589 654 0.6791 0.7310 0.6791 0.8241
0.4752 8.9863 656 0.6658 0.7467 0.6658 0.8160
0.4752 9.0137 658 0.6635 0.7417 0.6635 0.8146
0.4752 9.0411 660 0.6590 0.7867 0.6590 0.8118
0.4752 9.0685 662 0.7293 0.7682 0.7293 0.8540
0.4752 9.0959 664 0.9347 0.6573 0.9347 0.9668
0.4752 9.1233 666 1.0633 0.5775 1.0633 1.0312
0.4752 9.1507 668 1.0629 0.6377 1.0629 1.0310
0.4752 9.1781 670 1.0203 0.6142 1.0203 1.0101
0.4752 9.2055 672 0.9932 0.5528 0.9932 0.9966
0.4752 9.2329 674 0.9797 0.5203 0.9797 0.9898

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model tree for MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k16_task1_organization: fine-tuned from aubmindlab/bert-base-arabertv02.