ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run1_AugV5_k5_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.1851
  • Qwk: 0.0987
  • Mse: 1.1851
  • Rmse: 1.0886
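Qwk here is Cohen's quadratic weighted kappa, the usual agreement metric for ordinal essay scores, and Rmse is simply the square root of the reported Mse (Loss and Mse coincide, which suggests an MSE training objective). A minimal, dependency-free sketch of both quantities follows; the function name is illustrative and not taken from the training script:

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights over ordinal labels 0..n_classes-1."""
    n = n_classes
    observed = [[0.0] * n for _ in range(n)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1.0
    total = len(y_true)
    hist_true = [sum(row) for row in observed]
    hist_pred = [sum(observed[i][j] for i in range(n)) for j in range(n)]
    num = den = 0.0
    for i in range(n):
        for j in range(n):
            w = (i - j) ** 2 / (n - 1) ** 2      # quadratic disagreement weight
            expected = hist_true[i] * hist_pred[j] / total
            num += w * observed[i][j]
            den += w * expected
    return 1.0 - num / den

# RMSE is the square root of MSE, matching the reported pair above
assert abs(math.sqrt(1.1851) - 1.0886) < 1e-3
```

Perfect agreement yields a kappa of 1.0, and the quadratic weights penalize predictions more the further they land from the true ordinal level.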

Model description

More information needed

Intended uses & limitations

More information needed
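Pending details from the authors, the reported MSE/RMSE alongside Qwk suggests a single regression head whose continuous output is rounded to the nearest rubric level at inference time. A hypothetical post-processing helper, where the 0–4 score range is an assumption and not documented in this card:

```python
def to_ordinal_score(raw: float, low: int = 0, high: int = 4) -> int:
    """Round a continuous regression output to the nearest rubric level,
    clipping to the valid range. The bounds are illustrative assumptions."""
    return max(low, min(high, round(raw)))
```

For example, an out-of-range prediction such as 4.6 would be clipped back to the top rubric level rather than reported as-is.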

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
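The list above can be restated as a plain configuration dictionary. The key names mirror Hugging Face TrainingArguments conventions, but this is only a restatement of the values listed here, not the actual training script:

```python
# Hyperparameters from the list above, in TrainingArguments-style keys
training_config = {
    "learning_rate": 2e-05,
    "per_device_train_batch_size": 8,
    "per_device_eval_batch_size": 8,
    "seed": 42,
    "optimizer": "adam",
    "adam_beta1": 0.9,
    "adam_beta2": 0.999,
    "adam_epsilon": 1e-08,
    "lr_scheduler_type": "linear",
    "num_train_epochs": 100,
}
```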

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0833 2 4.0043 -0.0032 4.0043 2.0011
No log 0.1667 4 2.3116 0.0203 2.3116 1.5204
No log 0.25 6 1.6665 0.0181 1.6665 1.2909
No log 0.3333 8 1.1083 0.2023 1.1083 1.0528
No log 0.4167 10 1.0489 0.2564 1.0489 1.0242
No log 0.5 12 1.0011 0.2515 1.0011 1.0005
No log 0.5833 14 0.9898 0.3310 0.9898 0.9949
No log 0.6667 16 1.0039 0.3310 1.0039 1.0019
No log 0.75 18 1.0304 0.2787 1.0304 1.0151
No log 0.8333 20 1.0928 0.1979 1.0928 1.0454
No log 0.9167 22 1.1337 0.1738 1.1337 1.0647
No log 1.0 24 1.0934 0.1979 1.0934 1.0457
No log 1.0833 26 1.0341 0.2588 1.0341 1.0169
No log 1.1667 28 1.0005 0.3139 1.0005 1.0002
No log 1.25 30 0.9947 0.3139 0.9947 0.9973
No log 1.3333 32 1.0014 0.2967 1.0014 1.0007
No log 1.4167 34 1.0372 0.1997 1.0372 1.0184
No log 1.5 36 1.2356 0.2236 1.2356 1.1116
No log 1.5833 38 1.2561 0.2885 1.2561 1.1208
No log 1.6667 40 1.0594 0.2662 1.0594 1.0293
No log 1.75 42 0.9431 0.3425 0.9431 0.9712
No log 1.8333 44 0.9617 0.3264 0.9617 0.9806
No log 1.9167 46 0.9207 0.3139 0.9207 0.9595
No log 2.0 48 0.9648 0.2341 0.9648 0.9822
No log 2.0833 50 0.9779 0.2594 0.9779 0.9889
No log 2.1667 52 1.0323 0.2359 1.0323 1.0160
No log 2.25 54 0.9395 0.3071 0.9395 0.9693
No log 2.3333 56 0.9222 0.2865 0.9222 0.9603
No log 2.4167 58 0.9296 0.2692 0.9296 0.9641
No log 2.5 60 0.9539 0.2391 0.9539 0.9767
No log 2.5833 62 0.9919 0.2145 0.9919 0.9959
No log 2.6667 64 1.0455 0.3208 1.0455 1.0225
No log 2.75 66 1.1075 0.2471 1.1075 1.0524
No log 2.8333 68 1.3397 -0.0152 1.3397 1.1575
No log 2.9167 70 1.6210 -0.0686 1.6210 1.2732
No log 3.0 72 1.3642 0.0099 1.3642 1.1680
No log 3.0833 74 1.2239 0.0694 1.2239 1.1063
No log 3.1667 76 1.1346 0.1443 1.1346 1.0652
No log 3.25 78 1.0422 0.2739 1.0422 1.0209
No log 3.3333 80 0.9143 0.3915 0.9143 0.9562
No log 3.4167 82 0.9243 0.4223 0.9243 0.9614
No log 3.5 84 0.9283 0.4610 0.9283 0.9635
No log 3.5833 86 0.9753 0.3702 0.9753 0.9876
No log 3.6667 88 0.9743 0.3940 0.9743 0.9871
No log 3.75 90 0.9577 0.3414 0.9577 0.9786
No log 3.8333 92 0.9854 0.3172 0.9854 0.9927
No log 3.9167 94 0.9812 0.2497 0.9812 0.9906
No log 4.0 96 0.9656 0.2133 0.9656 0.9826
No log 4.0833 98 1.0027 0.3028 1.0027 1.0013
No log 4.1667 100 0.9873 0.3277 0.9873 0.9936
No log 4.25 102 0.9949 0.1671 0.9949 0.9975
No log 4.3333 104 0.9786 0.2596 0.9786 0.9892
No log 4.4167 106 0.9319 0.3678 0.9319 0.9654
No log 4.5 108 0.8815 0.3840 0.8815 0.9389
No log 4.5833 110 0.9064 0.4216 0.9064 0.9521
No log 4.6667 112 0.9995 0.3782 0.9995 0.9998
No log 4.75 114 1.1799 0.2726 1.1799 1.0863
No log 4.8333 116 1.1832 0.2448 1.1832 1.0878
No log 4.9167 118 1.0297 0.3920 1.0297 1.0147
No log 5.0 120 0.9677 0.4444 0.9677 0.9837
No log 5.0833 122 0.9733 0.4444 0.9733 0.9865
No log 5.1667 124 0.9863 0.3921 0.9863 0.9931
No log 5.25 126 1.0178 0.3884 1.0178 1.0088
No log 5.3333 128 1.1222 0.3039 1.1222 1.0593
No log 5.4167 130 1.0655 0.3171 1.0655 1.0322
No log 5.5 132 1.0040 0.4023 1.0040 1.0020
No log 5.5833 134 0.9669 0.3523 0.9669 0.9833
No log 5.6667 136 0.9011 0.3959 0.9011 0.9493
No log 5.75 138 0.8935 0.3414 0.8935 0.9452
No log 5.8333 140 0.9254 0.3940 0.9254 0.9620
No log 5.9167 142 0.9848 0.3623 0.9848 0.9924
No log 6.0 144 0.9170 0.4078 0.9170 0.9576
No log 6.0833 146 0.8712 0.3994 0.8712 0.9334
No log 6.1667 148 0.8770 0.4078 0.8770 0.9365
No log 6.25 150 0.8868 0.4078 0.8868 0.9417
No log 6.3333 152 0.8748 0.4078 0.8748 0.9353
No log 6.4167 154 0.9499 0.3531 0.9499 0.9746
No log 6.5 156 1.0693 0.2886 1.0693 1.0341
No log 6.5833 158 1.0168 0.3283 1.0168 1.0084
No log 6.6667 160 0.8919 0.4327 0.8919 0.9444
No log 6.75 162 0.8726 0.3958 0.8726 0.9341
No log 6.8333 164 0.9701 0.4169 0.9701 0.9849
No log 6.9167 166 1.0276 0.4078 1.0276 1.0137
No log 7.0 168 0.9535 0.4444 0.9535 0.9765
No log 7.0833 170 0.8786 0.4075 0.8786 0.9373
No log 7.1667 172 0.9048 0.4444 0.9048 0.9512
No log 7.25 174 0.9980 0.4429 0.9980 0.9990
No log 7.3333 176 1.0858 0.2750 1.0858 1.0420
No log 7.4167 178 1.0588 0.2886 1.0588 1.0290
No log 7.5 180 0.9417 0.4310 0.9417 0.9704
No log 7.5833 182 0.8971 0.4057 0.8971 0.9472
No log 7.6667 184 0.8592 0.4743 0.8592 0.9269
No log 7.75 186 0.8706 0.4461 0.8706 0.9330
No log 7.8333 188 0.9135 0.3957 0.9135 0.9558
No log 7.9167 190 0.9407 0.4681 0.9407 0.9699
No log 8.0 192 0.8615 0.4958 0.8615 0.9282
No log 8.0833 194 0.8547 0.3816 0.8547 0.9245
No log 8.1667 196 0.8423 0.3816 0.8423 0.9178
No log 8.25 198 0.8208 0.4361 0.8208 0.9060
No log 8.3333 200 0.8102 0.5260 0.8102 0.9001
No log 8.4167 202 0.8100 0.5171 0.8100 0.9000
No log 8.5 204 0.8033 0.5246 0.8033 0.8963
No log 8.5833 206 0.9382 0.3957 0.9382 0.9686
No log 8.6667 208 1.0755 0.4318 1.0755 1.0370
No log 8.75 210 0.9839 0.4334 0.9839 0.9919
No log 8.8333 212 0.8343 0.4595 0.8343 0.9134
No log 8.9167 214 0.8214 0.3977 0.8214 0.9063
No log 9.0 216 0.8305 0.3977 0.8305 0.9113
No log 9.0833 218 0.8825 0.4831 0.8825 0.9394
No log 9.1667 220 0.9187 0.4697 0.9187 0.9585
No log 9.25 222 0.9071 0.4697 0.9071 0.9524
No log 9.3333 224 0.8382 0.4576 0.8382 0.9155
No log 9.4167 226 0.8202 0.4792 0.8202 0.9057
No log 9.5 228 0.8106 0.4524 0.8106 0.9003
No log 9.5833 230 0.8159 0.4198 0.8159 0.9033
No log 9.6667 232 0.8913 0.4444 0.8913 0.9441
No log 9.75 234 0.9502 0.3957 0.9502 0.9748
No log 9.8333 236 0.9517 0.4318 0.9517 0.9755
No log 9.9167 238 0.9191 0.4057 0.9191 0.9587
No log 10.0 240 0.9256 0.4433 0.9256 0.9621
No log 10.0833 242 0.9661 0.4301 0.9661 0.9829
No log 10.1667 244 0.9868 0.4301 0.9868 0.9934
No log 10.25 246 0.9658 0.4301 0.9658 0.9827
No log 10.3333 248 0.9518 0.4057 0.9518 0.9756
No log 10.4167 250 0.9399 0.3107 0.9399 0.9695
No log 10.5 252 0.8898 0.2794 0.8898 0.9433
No log 10.5833 254 0.8376 0.2596 0.8376 0.9152
No log 10.6667 256 0.8116 0.3536 0.8116 0.9009
No log 10.75 258 0.8407 0.4828 0.8407 0.9169
No log 10.8333 260 0.9537 0.4389 0.9537 0.9766
No log 10.9167 262 1.1415 0.3336 1.1415 1.0684
No log 11.0 264 1.1497 0.3101 1.1497 1.0723
No log 11.0833 266 0.9767 0.4171 0.9767 0.9883
No log 11.1667 268 0.8096 0.4318 0.8096 0.8998
No log 11.25 270 0.8098 0.4642 0.8098 0.8999
No log 11.3333 272 0.8168 0.3777 0.8168 0.9038
No log 11.4167 274 0.8616 0.4044 0.8616 0.9282
No log 11.5 276 0.9558 0.4421 0.9558 0.9777
No log 11.5833 278 0.9615 0.4681 0.9615 0.9806
No log 11.6667 280 0.8757 0.4044 0.8757 0.9358
No log 11.75 282 0.8264 0.3112 0.8264 0.9091
No log 11.8333 284 0.8170 0.3112 0.8170 0.9039
No log 11.9167 286 0.8230 0.3506 0.8230 0.9072
No log 12.0 288 0.8732 0.3902 0.8732 0.9344
No log 12.0833 290 0.9185 0.4819 0.9185 0.9584
No log 12.1667 292 0.9172 0.4439 0.9172 0.9577
No log 12.25 294 0.8958 0.3902 0.8958 0.9465
No log 12.3333 296 0.8541 0.4044 0.8541 0.9242
No log 12.4167 298 0.8358 0.3556 0.8358 0.9142
No log 12.5 300 0.8310 0.3172 0.8310 0.9116
No log 12.5833 302 0.8401 0.3922 0.8401 0.9166
No log 12.6667 304 0.9095 0.4576 0.9095 0.9537
No log 12.75 306 0.9604 0.4697 0.9604 0.9800
No log 12.8333 308 1.0097 0.3921 1.0097 1.0048
No log 12.9167 310 0.9794 0.4186 0.9794 0.9896
No log 13.0 312 0.8872 0.4318 0.8872 0.9419
No log 13.0833 314 0.8598 0.4044 0.8598 0.9273
No log 13.1667 316 0.9030 0.3611 0.9030 0.9503
No log 13.25 318 1.0049 0.2748 1.0049 1.0024
No log 13.3333 320 1.0524 0.2024 1.0524 1.0259
No log 13.4167 322 0.9894 0.2896 0.9894 0.9947
No log 13.5 324 0.8869 0.3454 0.8869 0.9417
No log 13.5833 326 0.8240 0.3326 0.8240 0.9077
No log 13.6667 328 0.8137 0.3198 0.8137 0.9021
No log 13.75 330 0.8178 0.3631 0.8178 0.9043
No log 13.8333 332 0.9176 0.4576 0.9176 0.9579
No log 13.9167 334 1.0425 0.3938 1.0425 1.0210
No log 14.0 336 1.0485 0.3808 1.0485 1.0240
No log 14.0833 338 1.0253 0.3539 1.0253 1.0126
No log 14.1667 340 0.9983 0.2941 0.9983 0.9992
No log 14.25 342 0.9919 0.2647 0.9919 0.9959
No log 14.3333 344 0.9930 0.3063 0.9930 0.9965
No log 14.4167 346 0.9556 0.2796 0.9556 0.9776
No log 14.5 348 0.9514 0.2796 0.9514 0.9754
No log 14.5833 350 0.9581 0.4036 0.9581 0.9788
No log 14.6667 352 0.8883 0.4060 0.8883 0.9425
No log 14.75 354 0.8396 0.3631 0.8396 0.9163
No log 14.8333 356 0.8498 0.3757 0.8498 0.9219
No log 14.9167 358 0.8764 0.4060 0.8764 0.9362
No log 15.0 360 0.9265 0.4928 0.9265 0.9626
No log 15.0833 362 0.9239 0.4565 0.9239 0.9612
No log 15.1667 364 0.8471 0.4831 0.8471 0.9204
No log 15.25 366 0.7829 0.4378 0.7829 0.8848
No log 15.3333 368 0.7781 0.3979 0.7781 0.8821
No log 15.4167 370 0.7887 0.3797 0.7887 0.8881
No log 15.5 372 0.8143 0.3797 0.8143 0.9024
No log 15.5833 374 0.8437 0.3631 0.8437 0.9185
No log 15.6667 376 0.8571 0.3071 0.8571 0.9258
No log 15.75 378 0.8716 0.3631 0.8716 0.9336
No log 15.8333 380 0.9442 0.3169 0.9442 0.9717
No log 15.9167 382 0.9904 0.2359 0.9904 0.9952
No log 16.0 384 0.9588 0.3169 0.9588 0.9792
No log 16.0833 386 0.9447 0.3044 0.9447 0.9719
No log 16.1667 388 0.8964 0.3485 0.8964 0.9468
No log 16.25 390 0.8755 0.3485 0.8755 0.9357
No log 16.3333 392 0.8434 0.3777 0.8434 0.9184
No log 16.4167 394 0.8131 0.3652 0.8131 0.9017
No log 16.5 396 0.8343 0.3799 0.8343 0.9134
No log 16.5833 398 0.9143 0.4439 0.9143 0.9562
No log 16.6667 400 0.9644 0.3864 0.9644 0.9821
No log 16.75 402 0.9307 0.3474 0.9307 0.9648
No log 16.8333 404 0.8391 0.4044 0.8391 0.9160
No log 16.9167 406 0.7787 0.3777 0.7787 0.8824
No log 17.0 408 0.7661 0.4371 0.7661 0.8753
No log 17.0833 410 0.7977 0.3922 0.7977 0.8932
No log 17.1667 412 0.8517 0.3922 0.8517 0.9229
No log 17.25 414 0.8725 0.3760 0.8725 0.9341
No log 17.3333 416 0.8857 0.3617 0.8857 0.9411
No log 17.4167 418 0.8441 0.3631 0.8441 0.9187
No log 17.5 420 0.8151 0.3631 0.8151 0.9028
No log 17.5833 422 0.7991 0.4503 0.7991 0.8939
No log 17.6667 424 0.8043 0.4230 0.8043 0.8968
No log 17.75 426 0.8116 0.3819 0.8116 0.9009
No log 17.8333 428 0.8591 0.3757 0.8591 0.9269
No log 17.9167 430 0.9426 0.3063 0.9426 0.9709
No log 18.0 432 0.9972 0.2750 0.9972 0.9986
No log 18.0833 434 0.9905 0.2750 0.9905 0.9952
No log 18.1667 436 0.9113 0.3617 0.9113 0.9546
No log 18.25 438 0.8337 0.3922 0.8337 0.9131
No log 18.3333 440 0.7855 0.3977 0.7855 0.8863
No log 18.4167 442 0.7588 0.4251 0.7588 0.8711
No log 18.5 444 0.7560 0.4244 0.7560 0.8695
No log 18.5833 446 0.7645 0.4244 0.7645 0.8744
No log 18.6667 448 0.7966 0.3959 0.7966 0.8925
No log 18.75 450 0.8339 0.4044 0.8339 0.9132
No log 18.8333 452 0.8507 0.3757 0.8507 0.9224
No log 18.9167 454 0.8506 0.3757 0.8506 0.9223
No log 19.0 456 0.8367 0.3631 0.8367 0.9147
No log 19.0833 458 0.8255 0.3631 0.8255 0.9086
No log 19.1667 460 0.8398 0.3757 0.8398 0.9164
No log 19.25 462 0.8703 0.3611 0.8703 0.9329
No log 19.3333 464 0.9283 0.3317 0.9283 0.9635
No log 19.4167 466 0.9530 0.2896 0.9530 0.9762
No log 19.5 468 0.9423 0.3001 0.9423 0.9707
No log 19.5833 470 0.9116 0.3304 0.9116 0.9548
No log 19.6667 472 0.8822 0.3658 0.8822 0.9392
No log 19.75 474 0.8727 0.3658 0.8727 0.9342
No log 19.8333 476 0.8674 0.3940 0.8674 0.9313
No log 19.9167 478 0.9068 0.3902 0.9068 0.9523
No log 20.0 480 0.9105 0.3611 0.9105 0.9542
No log 20.0833 482 0.8693 0.3757 0.8693 0.9324
No log 20.1667 484 0.8307 0.3326 0.8307 0.9114
No log 20.25 486 0.8115 0.3528 0.8115 0.9008
No log 20.3333 488 0.7963 0.3528 0.7963 0.8923
No log 20.4167 490 0.7883 0.3819 0.7883 0.8879
No log 20.5 492 0.8143 0.3678 0.8143 0.9024
No log 20.5833 494 0.8340 0.3631 0.8340 0.9133
No log 20.6667 496 0.8471 0.3631 0.8471 0.9204
No log 20.75 498 0.8842 0.3454 0.8842 0.9403
0.2324 20.8333 500 0.9032 0.3304 0.9032 0.9504
0.2324 20.9167 502 0.8812 0.3454 0.8812 0.9387
0.2324 21.0 504 0.8755 0.3757 0.8755 0.9357
0.2324 21.0833 506 0.9098 0.3611 0.9098 0.9538
0.2324 21.1667 508 0.9785 0.2820 0.9785 0.9892
0.2324 21.25 510 0.9437 0.3643 0.9437 0.9714
0.2324 21.3333 512 0.8592 0.3631 0.8592 0.9270
0.2324 21.4167 514 0.7973 0.3631 0.7973 0.8929
0.2324 21.5 516 0.8006 0.3326 0.8006 0.8948
0.2324 21.5833 518 0.8269 0.3326 0.8269 0.9094
0.2324 21.6667 520 0.8607 0.3757 0.8607 0.9277
0.2324 21.75 522 0.8551 0.3757 0.8551 0.9247
0.2324 21.8333 524 0.8087 0.3631 0.8087 0.8993
0.2324 21.9167 526 0.7643 0.3817 0.7643 0.8742
0.2324 22.0 528 0.7562 0.4019 0.7562 0.8696
0.2324 22.0833 530 0.7509 0.4019 0.7509 0.8665
0.2324 22.1667 532 0.7271 0.4138 0.7271 0.8527
0.2324 22.25 534 0.7343 0.4490 0.7343 0.8569
0.2324 22.3333 536 0.7712 0.4728 0.7712 0.8782
0.2324 22.4167 538 0.8178 0.4824 0.8178 0.9043
0.2324 22.5 540 0.8167 0.4712 0.8167 0.9037
0.2324 22.5833 542 0.7901 0.3922 0.7901 0.8889
0.2324 22.6667 544 0.7691 0.4097 0.7691 0.8770
0.2324 22.75 546 0.7845 0.3797 0.7845 0.8857
0.2324 22.8333 548 0.8187 0.3631 0.8187 0.9048
0.2324 22.9167 550 0.8536 0.3631 0.8536 0.9239
0.2324 23.0 552 0.8963 0.3485 0.8963 0.9467
0.2324 23.0833 554 0.9048 0.3175 0.9048 0.9512
0.2324 23.1667 556 0.9064 0.3485 0.9064 0.9521
0.2324 23.25 558 0.9207 0.3067 0.9207 0.9595
0.2324 23.3333 560 0.9675 0.3531 0.9675 0.9836
0.2324 23.4167 562 0.9941 0.3405 0.9941 0.9970
0.2324 23.5 564 1.0114 0.3434 1.0114 1.0057
0.2324 23.5833 566 0.9966 0.3405 0.9966 0.9983
0.2324 23.6667 568 0.9460 0.3643 0.9460 0.9726
0.2324 23.75 570 0.9170 0.3044 0.9170 0.9576
0.2324 23.8333 572 0.9015 0.3611 0.9015 0.9495
0.2324 23.9167 574 0.8567 0.3631 0.8567 0.9256
0.2324 24.0 576 0.8183 0.3819 0.8183 0.9046
0.2324 24.0833 578 0.7938 0.3314 0.7938 0.8910
0.2324 24.1667 580 0.7976 0.4227 0.7976 0.8931
0.2324 24.25 582 0.8274 0.4227 0.8274 0.9096
0.2324 24.3333 584 0.8865 0.4318 0.8865 0.9415
0.2324 24.4167 586 0.9173 0.4318 0.9173 0.9578
0.2324 24.5 588 0.8991 0.4318 0.8991 0.9482
0.2324 24.5833 590 0.8742 0.3678 0.8742 0.9350
0.2324 24.6667 592 0.8723 0.3382 0.8723 0.9340
0.2324 24.75 594 0.8843 0.3506 0.8843 0.9404
0.2324 24.8333 596 0.8953 0.3757 0.8953 0.9462
0.2324 24.9167 598 0.8908 0.3757 0.8908 0.9438
0.2324 25.0 600 0.8850 0.4044 0.8850 0.9407
0.2324 25.0833 602 0.8775 0.3757 0.8775 0.9368
0.2324 25.1667 604 0.8658 0.3454 0.8658 0.9305
0.2324 25.25 606 0.9051 0.3454 0.9051 0.9514
0.2324 25.3333 608 0.9080 0.3454 0.9080 0.9529
0.2324 25.4167 610 0.9012 0.3454 0.9012 0.9493
0.2324 25.5 612 0.9051 0.3757 0.9051 0.9514
0.2324 25.5833 614 0.9334 0.3757 0.9334 0.9661
0.2324 25.6667 616 0.9178 0.4044 0.9178 0.9580
0.2324 25.75 618 0.8752 0.3757 0.8752 0.9355
0.2324 25.8333 620 0.8453 0.4067 0.8453 0.9194
0.2324 25.9167 622 0.8306 0.3646 0.8306 0.9114
0.2324 26.0 624 0.8232 0.3942 0.8232 0.9073
0.2324 26.0833 626 0.8598 0.4198 0.8598 0.9272
0.2324 26.1667 628 1.0133 0.3695 1.0133 1.0066
0.2324 26.25 630 1.1185 0.3434 1.1185 1.0576
0.2324 26.3333 632 1.1053 0.3042 1.1053 1.0513
0.2324 26.4167 634 1.0077 0.2896 1.0077 1.0038
0.2324 26.5 636 0.8965 0.3631 0.8965 0.9469
0.2324 26.5833 638 0.8311 0.4211 0.8311 0.9116
0.2324 26.6667 640 0.8075 0.4381 0.8075 0.8986
0.2324 26.75 642 0.7987 0.4241 0.7987 0.8937
0.2324 26.8333 644 0.8255 0.4204 0.8255 0.9086
0.2324 26.9167 646 0.9309 0.4824 0.9309 0.9648
0.2324 27.0 648 1.0652 0.4310 1.0652 1.0321
0.2324 27.0833 650 1.1355 0.3578 1.1355 1.0656
0.2324 27.1667 652 1.1214 0.3578 1.1214 1.0590
0.2324 27.25 654 1.0246 0.4081 1.0246 1.0122
0.2324 27.3333 656 0.8978 0.3802 0.8978 0.9475
0.2324 27.4167 658 0.7769 0.4241 0.7769 0.8814
0.2324 27.5 660 0.7549 0.4398 0.7549 0.8688
0.2324 27.5833 662 0.7584 0.4935 0.7584 0.8708
0.2324 27.6667 664 0.7548 0.4520 0.7548 0.8688
0.2324 27.75 666 0.7890 0.4067 0.7890 0.8883
0.2324 27.8333 668 0.8858 0.3631 0.8858 0.9412
0.2324 27.9167 670 1.0069 0.2325 1.0069 1.0034
0.2324 28.0 672 1.1267 0.2500 1.1267 1.0615
0.2324 28.0833 674 1.2323 0.2206 1.2323 1.1101
0.2324 28.1667 676 1.2485 0.1512 1.2485 1.1174
0.2324 28.25 678 1.1851 0.0987 1.1851 1.0886
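Note that the summary metrics at the top of this card correspond to the final step (678), not the best one: validation loss bottoms out at 0.7271 around step 532 (epoch ~22.2) and then climbs, a typical overfitting pattern. A minimal way to pick the best checkpoint from such a log, using a few (step, validation loss) rows copied from the table above:

```python
# (step, validation_loss) pairs taken from a few rows of the table above
eval_log = [
    (368, 0.7781),
    (532, 0.7271),
    (660, 0.7549),
    (678, 1.1851),
]

# Select the checkpoint with the lowest validation loss
best_step, best_loss = min(eval_log, key=lambda row: row[1])
# With these rows, the best checkpoint is step 532 (loss 0.7271),
# well before the final step 678 reported in the summary.
```

In a real run this is what `load_best_model_at_end` in the Trainer would automate; here the final, worse checkpoint appears to be the one reported.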

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model details

  • Model size: 0.1B params
  • Tensor type: F32 (Safetensors)

Model repository: MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run1_AugV5_k5_task5_organization