ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k9_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6686
  • Qwk: 0.4982
  • Mse: 0.6686
  • Rmse: 0.8177
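Qwk here is quadratic weighted kappa (Cohen's kappa with quadratic disagreement weights, commonly used for ordinal essay scores), and Rmse is simply the square root of Mse; note that Loss equals Mse, consistent with an MSE regression objective. A minimal pure-Python sketch of both metrics (the helper names are illustrative, not taken from the training code):

```python
import math
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights for ordinal labels 0..n_classes-1."""
    n = len(y_true)
    # Observed confusion matrix
    observed = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # Expected counts from the marginal histograms
    hist_true = Counter(y_true)
    hist_pred = Counter(y_pred)
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic weight
            num += w * observed[i][j]
            den += w * hist_true[i] * hist_pred[j] / n
    return 1.0 - num / den

def rmse_from_mse(mse):
    """The reported Rmse column is just sqrt of the Mse column."""
    return math.sqrt(mse)
```

scikit-learn's `cohen_kappa_score(y_true, y_pred, weights="quadratic")` computes the same quantity; as a sanity check, `rmse_from_mse(0.6686)` reproduces the reported 0.8177.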

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
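With lr_scheduler_type linear and no warmup listed, the learning rate presumably decays linearly from 2e-05 to 0 over the full run. The results table logs 30 optimizer steps per epoch (step 30 at epoch 1.0), so 100 epochs would mean roughly 3,000 total steps. A small sketch of that schedule (the shape mirrors the Transformers "linear" scheduler, but the total-step count is an inference from the table, not from the training script):

```python
def linear_lr(step, total_steps=3000, base_lr=2e-5, warmup_steps=0):
    """Linear warmup (zero here) followed by linear decay to 0."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0.0, float(total_steps - step))
    return base_lr * remaining / max(1.0, float(total_steps - warmup_steps))

# e.g. halfway through training the learning rate has halved:
# linear_lr(1500) -> 1e-05
```

In practice the Transformers Trainer builds this via `get_linear_schedule_with_warmup`; this standalone version only illustrates the per-step value.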

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0667 2 4.0335 -0.0323 4.0335 2.0084
No log 0.1333 4 2.4316 0.0541 2.4316 1.5594
No log 0.2 6 1.3720 0.0760 1.3720 1.1713
No log 0.2667 8 1.1373 0.3048 1.1373 1.0664
No log 0.3333 10 1.1045 0.3149 1.1045 1.0509
No log 0.4 12 1.0827 0.2834 1.0827 1.0405
No log 0.4667 14 1.0950 0.2857 1.0950 1.0464
No log 0.5333 16 1.2000 0.0731 1.2000 1.0954
No log 0.6 18 1.2127 0.1114 1.2127 1.1012
No log 0.6667 20 1.1629 0.2074 1.1629 1.0784
No log 0.7333 22 1.1030 0.0761 1.1030 1.0503
No log 0.8 24 1.0480 0.1727 1.0480 1.0237
No log 0.8667 26 1.0198 0.2365 1.0198 1.0098
No log 0.9333 28 1.1037 0.2441 1.1037 1.0506
No log 1.0 30 1.2091 0.1057 1.2091 1.0996
No log 1.0667 32 1.4046 0.1474 1.4046 1.1852
No log 1.1333 34 1.5247 0.1000 1.5247 1.2348
No log 1.2 36 1.5022 0.1112 1.5022 1.2256
No log 1.2667 38 1.3497 0.1498 1.3497 1.1618
No log 1.3333 40 1.1094 0.1537 1.1094 1.0533
No log 1.4 42 1.1094 0.1685 1.1094 1.0533
No log 1.4667 44 1.0587 0.1680 1.0587 1.0289
No log 1.5333 46 1.0532 0.2196 1.0532 1.0263
No log 1.6 48 1.0525 0.2998 1.0525 1.0259
No log 1.6667 50 1.0166 0.3278 1.0166 1.0083
No log 1.7333 52 0.9888 0.2549 0.9888 0.9944
No log 1.8 54 0.9914 0.3411 0.9914 0.9957
No log 1.8667 56 0.9693 0.3665 0.9694 0.9846
No log 1.9333 58 0.8077 0.5142 0.8077 0.8987
No log 2.0 60 0.7627 0.6272 0.7627 0.8733
No log 2.0667 62 0.8229 0.4978 0.8229 0.9071
No log 2.1333 64 0.7401 0.5928 0.7401 0.8603
No log 2.2 66 0.7935 0.4712 0.7935 0.8908
No log 2.2667 68 0.7658 0.4839 0.7658 0.8751
No log 2.3333 70 0.8540 0.3678 0.8540 0.9241
No log 2.4 72 0.8900 0.3352 0.8900 0.9434
No log 2.4667 74 0.7834 0.4918 0.7834 0.8851
No log 2.5333 76 0.7734 0.5076 0.7734 0.8795
No log 2.6 78 0.7298 0.5847 0.7298 0.8543
No log 2.6667 80 0.7646 0.4860 0.7646 0.8744
No log 2.7333 82 0.6664 0.6507 0.6664 0.8163
No log 2.8 84 0.6417 0.6857 0.6417 0.8011
No log 2.8667 86 0.6278 0.6606 0.6278 0.7923
No log 2.9333 88 0.6182 0.6715 0.6182 0.7862
No log 3.0 90 0.7030 0.5902 0.7030 0.8384
No log 3.0667 92 0.8959 0.4458 0.8959 0.9465
No log 3.1333 94 0.7467 0.6220 0.7467 0.8641
No log 3.2 96 0.6218 0.6383 0.6218 0.7886
No log 3.2667 98 0.7678 0.4962 0.7678 0.8763
No log 3.3333 100 0.6699 0.5141 0.6699 0.8185
No log 3.4 102 0.6502 0.6221 0.6502 0.8063
No log 3.4667 104 0.6618 0.6305 0.6618 0.8135
No log 3.5333 106 0.6615 0.6816 0.6615 0.8133
No log 3.6 108 0.6437 0.5131 0.6437 0.8023
No log 3.6667 110 0.6501 0.6450 0.6501 0.8063
No log 3.7333 112 0.6842 0.6556 0.6842 0.8272
No log 3.8 114 0.6096 0.6415 0.6096 0.7808
No log 3.8667 116 0.6393 0.5560 0.6393 0.7996
No log 3.9333 118 0.7114 0.4754 0.7114 0.8434
No log 4.0 120 0.6825 0.4754 0.6825 0.8261
No log 4.0667 122 0.5825 0.6537 0.5825 0.7632
No log 4.1333 124 0.6658 0.6773 0.6658 0.8160
No log 4.2 126 0.6646 0.6637 0.6646 0.8152
No log 4.2667 128 0.5945 0.6013 0.5945 0.7710
No log 4.3333 130 0.6129 0.5093 0.6129 0.7829
No log 4.4 132 0.5957 0.6296 0.5957 0.7718
No log 4.4667 134 0.6716 0.6450 0.6716 0.8195
No log 4.5333 136 0.7623 0.5777 0.7623 0.8731
No log 4.6 138 0.7144 0.5601 0.7144 0.8452
No log 4.6667 140 0.6875 0.4914 0.6875 0.8292
No log 4.7333 142 0.6743 0.4938 0.6743 0.8211
No log 4.8 144 0.6664 0.5202 0.6664 0.8163
No log 4.8667 146 0.7156 0.5442 0.7156 0.8460
No log 4.9333 148 0.6733 0.5325 0.6733 0.8205
No log 5.0 150 0.6435 0.6227 0.6435 0.8022
No log 5.0667 152 0.6599 0.5819 0.6599 0.8123
No log 5.1333 154 0.6971 0.5716 0.6971 0.8349
No log 5.2 156 0.7301 0.5610 0.7301 0.8544
No log 5.2667 158 0.7897 0.5536 0.7897 0.8887
No log 5.3333 160 0.8495 0.5542 0.8495 0.9217
No log 5.4 162 0.7434 0.5093 0.7434 0.8622
No log 5.4667 164 0.7140 0.4736 0.7140 0.8450
No log 5.5333 166 0.6892 0.5659 0.6892 0.8302
No log 5.6 168 0.6574 0.5939 0.6574 0.8108
No log 5.6667 170 0.6564 0.6084 0.6564 0.8102
No log 5.7333 172 0.6626 0.6084 0.6626 0.8140
No log 5.8 174 0.6541 0.6345 0.6541 0.8088
No log 5.8667 176 0.6435 0.5820 0.6435 0.8022
No log 5.9333 178 0.6956 0.5763 0.6956 0.8340
No log 6.0 180 0.8544 0.5549 0.8544 0.9244
No log 6.0667 182 0.9655 0.4211 0.9655 0.9826
No log 6.1333 184 0.8524 0.5625 0.8524 0.9233
No log 6.2 186 0.6231 0.6993 0.6231 0.7894
No log 6.2667 188 0.5868 0.6715 0.5868 0.7660
No log 6.3333 190 0.6342 0.6502 0.6342 0.7964
No log 6.4 192 0.8656 0.5251 0.8656 0.9304
No log 6.4667 194 0.8127 0.5991 0.8127 0.9015
No log 6.5333 196 0.5857 0.6946 0.5857 0.7653
No log 6.6 198 0.5626 0.6649 0.5626 0.7501
No log 6.6667 200 0.5650 0.6946 0.5650 0.7516
No log 6.7333 202 0.5675 0.7020 0.5675 0.7533
No log 6.8 204 0.5511 0.7175 0.5511 0.7423
No log 6.8667 206 0.5251 0.6894 0.5251 0.7247
No log 6.9333 208 0.5313 0.6658 0.5313 0.7289
No log 7.0 210 0.5465 0.6865 0.5465 0.7393
No log 7.0667 212 0.5522 0.7059 0.5522 0.7431
No log 7.1333 214 0.5750 0.5434 0.5750 0.7583
No log 7.2 216 0.6556 0.5364 0.6556 0.8097
No log 7.2667 218 0.6095 0.5450 0.6095 0.7807
No log 7.3333 220 0.5807 0.7059 0.5807 0.7620
No log 7.4 222 0.6638 0.6521 0.6638 0.8147
No log 7.4667 224 0.5927 0.7276 0.5927 0.7699
No log 7.5333 226 0.5924 0.5313 0.5924 0.7696
No log 7.6 228 0.6478 0.5439 0.6478 0.8049
No log 7.6667 230 0.6036 0.5524 0.6036 0.7769
No log 7.7333 232 0.6300 0.6573 0.6300 0.7937
No log 7.8 234 0.6139 0.6157 0.6139 0.7835
No log 7.8667 236 0.5940 0.5406 0.5940 0.7707
No log 7.9333 238 0.6094 0.5220 0.6094 0.7806
No log 8.0 240 0.5814 0.5548 0.5814 0.7625
No log 8.0667 242 0.5834 0.6779 0.5834 0.7638
No log 8.1333 244 0.5880 0.7385 0.5880 0.7668
No log 8.2 246 0.5645 0.6916 0.5645 0.7513
No log 8.2667 248 0.5420 0.6699 0.5420 0.7362
No log 8.3333 250 0.5364 0.6753 0.5364 0.7324
No log 8.4 252 0.5391 0.6814 0.5391 0.7342
No log 8.4667 254 0.5420 0.6712 0.5420 0.7362
No log 8.5333 256 0.5649 0.7232 0.5649 0.7516
No log 8.6 258 0.6114 0.6609 0.6114 0.7819
No log 8.6667 260 0.6177 0.6422 0.6177 0.7859
No log 8.7333 262 0.6012 0.6641 0.6012 0.7754
No log 8.8 264 0.6547 0.5242 0.6547 0.8091
No log 8.8667 266 0.7957 0.4714 0.7957 0.8920
No log 8.9333 268 0.7544 0.5242 0.7544 0.8685
No log 9.0 270 0.6929 0.5054 0.6929 0.8324
No log 9.0667 272 0.7116 0.5490 0.7116 0.8436
No log 9.1333 274 0.7324 0.5446 0.7324 0.8558
No log 9.2 276 0.6814 0.6387 0.6814 0.8254
No log 9.2667 278 0.6241 0.5459 0.6241 0.7900
No log 9.3333 280 0.6593 0.5134 0.6593 0.8120
No log 9.4 282 0.6121 0.5450 0.6121 0.7824
No log 9.4667 284 0.5944 0.6798 0.5944 0.7710
No log 9.5333 286 0.6400 0.6208 0.6400 0.8000
No log 9.6 288 0.7014 0.5989 0.7014 0.8375
No log 9.6667 290 0.6573 0.6208 0.6573 0.8108
No log 9.7333 292 0.6093 0.5943 0.6093 0.7806
No log 9.8 294 0.6167 0.5610 0.6167 0.7853
No log 9.8667 296 0.6494 0.6241 0.6494 0.8058
No log 9.9333 298 0.6866 0.6230 0.6866 0.8286
No log 10.0 300 0.6383 0.6328 0.6383 0.7989
No log 10.0667 302 0.5857 0.6756 0.5857 0.7653
No log 10.1333 304 0.5828 0.6609 0.5828 0.7634
No log 10.2 306 0.7225 0.5847 0.7225 0.8500
No log 10.2667 308 0.8238 0.5896 0.8238 0.9076
No log 10.3333 310 0.6991 0.6292 0.6991 0.8361
No log 10.4 312 0.6024 0.6614 0.6024 0.7761
No log 10.4667 314 0.6083 0.6456 0.6083 0.7799
No log 10.5333 316 0.6028 0.6013 0.6028 0.7764
No log 10.6 318 0.5817 0.6597 0.5817 0.7627
No log 10.6667 320 0.6373 0.6464 0.6373 0.7983
No log 10.7333 322 0.7582 0.5516 0.7582 0.8708
No log 10.8 324 0.7184 0.5329 0.7184 0.8476
No log 10.8667 326 0.6387 0.6160 0.6387 0.7992
No log 10.9333 328 0.5693 0.7122 0.5693 0.7545
No log 11.0 330 0.5343 0.7082 0.5343 0.7309
No log 11.0667 332 0.5325 0.6903 0.5325 0.7297
No log 11.1333 334 0.5288 0.7151 0.5288 0.7272
No log 11.2 336 0.5455 0.6519 0.5455 0.7386
No log 11.2667 338 0.6558 0.6647 0.6558 0.8098
No log 11.3333 340 0.7220 0.5905 0.7220 0.8497
No log 11.4 342 0.7222 0.6105 0.7222 0.8498
No log 11.4667 344 0.6464 0.6021 0.6464 0.8040
No log 11.5333 346 0.5892 0.6065 0.5892 0.7676
No log 11.6 348 0.5763 0.5859 0.5763 0.7592
No log 11.6667 350 0.5870 0.6557 0.5870 0.7662
No log 11.7333 352 0.6278 0.6871 0.6278 0.7923
No log 11.8 354 0.6568 0.6979 0.6568 0.8104
No log 11.8667 356 0.6429 0.6720 0.6429 0.8018
No log 11.9333 358 0.5885 0.6921 0.5885 0.7671
No log 12.0 360 0.5784 0.6139 0.5784 0.7605
No log 12.0667 362 0.6560 0.5255 0.6560 0.8100
No log 12.1333 364 0.6810 0.4654 0.6810 0.8252
No log 12.2 366 0.6393 0.5255 0.6393 0.7995
No log 12.2667 368 0.5761 0.5581 0.5761 0.7590
No log 12.3333 370 0.5196 0.7396 0.5196 0.7209
No log 12.4 372 0.5217 0.7184 0.5217 0.7223
No log 12.4667 374 0.5064 0.7148 0.5064 0.7116
No log 12.5333 376 0.5214 0.6990 0.5214 0.7221
No log 12.6 378 0.5040 0.7013 0.5040 0.7099
No log 12.6667 380 0.4910 0.7265 0.4910 0.7007
No log 12.7333 382 0.4833 0.7063 0.4833 0.6952
No log 12.8 384 0.4897 0.7057 0.4897 0.6998
No log 12.8667 386 0.5050 0.7397 0.5050 0.7107
No log 12.9333 388 0.5895 0.6257 0.5895 0.7678
No log 13.0 390 0.7218 0.5417 0.7218 0.8496
No log 13.0667 392 0.7254 0.5479 0.7254 0.8517
No log 13.1333 394 0.6569 0.5123 0.6569 0.8105
No log 13.2 396 0.6205 0.5208 0.6205 0.7877
No log 13.2667 398 0.5946 0.5703 0.5946 0.7711
No log 13.3333 400 0.5384 0.6888 0.5384 0.7338
No log 13.4 402 0.5407 0.6685 0.5407 0.7353
No log 13.4667 404 0.6176 0.6845 0.6176 0.7859
No log 13.5333 406 0.6072 0.7001 0.6072 0.7792
No log 13.6 408 0.5287 0.7036 0.5287 0.7271
No log 13.6667 410 0.4912 0.7019 0.4912 0.7009
No log 13.7333 412 0.4922 0.6850 0.4922 0.7015
No log 13.8 414 0.5044 0.7019 0.5044 0.7102
No log 13.8667 416 0.5878 0.6700 0.5878 0.7667
No log 13.9333 418 0.6009 0.6700 0.6009 0.7752
No log 14.0 420 0.5729 0.6857 0.5729 0.7569
No log 14.0667 422 0.5387 0.7066 0.5387 0.7340
No log 14.1333 424 0.5131 0.6625 0.5131 0.7163
No log 14.2 426 0.4988 0.7348 0.4988 0.7062
No log 14.2667 428 0.5144 0.6886 0.5144 0.7172
No log 14.3333 430 0.5457 0.7280 0.5457 0.7387
No log 14.4 432 0.5315 0.7280 0.5315 0.7290
No log 14.4667 434 0.4964 0.7225 0.4964 0.7046
No log 14.5333 436 0.5091 0.7059 0.5091 0.7135
No log 14.6 438 0.5389 0.7051 0.5389 0.7341
No log 14.6667 440 0.5798 0.6799 0.5798 0.7615
No log 14.7333 442 0.5810 0.7079 0.5810 0.7623
No log 14.8 444 0.5273 0.6789 0.5273 0.7262
No log 14.8667 446 0.5134 0.6593 0.5134 0.7165
No log 14.9333 448 0.5196 0.6593 0.5196 0.7208
No log 15.0 450 0.5115 0.6830 0.5115 0.7152
No log 15.0667 452 0.5391 0.7126 0.5391 0.7342
No log 15.1333 454 0.5464 0.7080 0.5464 0.7392
No log 15.2 456 0.5261 0.7129 0.5261 0.7253
No log 15.2667 458 0.5324 0.6699 0.5324 0.7296
No log 15.3333 460 0.5617 0.5794 0.5617 0.7495
No log 15.4 462 0.5622 0.6337 0.5622 0.7498
No log 15.4667 464 0.5697 0.6118 0.5697 0.7548
No log 15.5333 466 0.5594 0.6307 0.5594 0.7479
No log 15.6 468 0.5413 0.6745 0.5413 0.7358
No log 15.6667 470 0.5603 0.6921 0.5603 0.7486
No log 15.7333 472 0.5585 0.6865 0.5585 0.7473
No log 15.8 474 0.5542 0.6806 0.5542 0.7444
No log 15.8667 476 0.5609 0.6806 0.5609 0.7489
No log 15.9333 478 0.5754 0.6806 0.5754 0.7586
No log 16.0 480 0.6086 0.6547 0.6086 0.7802
No log 16.0667 482 0.6786 0.5938 0.6786 0.8238
No log 16.1333 484 0.7001 0.5584 0.7001 0.8367
No log 16.2 486 0.6520 0.6266 0.6520 0.8075
No log 16.2667 488 0.5709 0.6581 0.5709 0.7556
No log 16.3333 490 0.5471 0.6562 0.5471 0.7397
No log 16.4 492 0.5453 0.6979 0.5453 0.7385
No log 16.4667 494 0.5684 0.7008 0.5684 0.7539
No log 16.5333 496 0.6206 0.6474 0.6206 0.7878
No log 16.6 498 0.7680 0.5731 0.7680 0.8763
0.2415 16.6667 500 0.8709 0.5866 0.8709 0.9332
0.2415 16.7333 502 0.7952 0.5583 0.7952 0.8918
0.2415 16.8 504 0.6515 0.6529 0.6515 0.8072
0.2415 16.8667 506 0.6108 0.5914 0.6108 0.7815
0.2415 16.9333 508 0.6269 0.5472 0.6269 0.7917
0.2415 17.0 510 0.6118 0.5694 0.6118 0.7822
0.2415 17.0667 512 0.5553 0.6517 0.5553 0.7452
0.2415 17.1333 514 0.5245 0.6916 0.5245 0.7242
0.2415 17.2 516 0.5536 0.7093 0.5536 0.7440
0.2415 17.2667 518 0.5456 0.7057 0.5456 0.7387
0.2415 17.3333 520 0.5110 0.7293 0.5110 0.7148
0.2415 17.4 522 0.5021 0.7318 0.5021 0.7086
0.2415 17.4667 524 0.5185 0.6516 0.5185 0.7201
0.2415 17.5333 526 0.5464 0.6427 0.5464 0.7392
0.2415 17.6 528 0.5440 0.6265 0.5440 0.7376
0.2415 17.6667 530 0.5468 0.6470 0.5468 0.7395
0.2415 17.7333 532 0.5723 0.7019 0.5723 0.7565
0.2415 17.8 534 0.5799 0.6973 0.5799 0.7615
0.2415 17.8667 536 0.5537 0.6704 0.5537 0.7441
0.2415 17.9333 538 0.5384 0.6526 0.5384 0.7337
0.2415 18.0 540 0.5436 0.6638 0.5436 0.7373
0.2415 18.0667 542 0.5450 0.6857 0.5450 0.7382
0.2415 18.1333 544 0.5569 0.6986 0.5569 0.7463
0.2415 18.2 546 0.5550 0.6986 0.5550 0.7450
0.2415 18.2667 548 0.5337 0.6703 0.5337 0.7306
0.2415 18.3333 550 0.5311 0.6963 0.5311 0.7287
0.2415 18.4 552 0.5110 0.6843 0.5110 0.7148
0.2415 18.4667 554 0.5029 0.6951 0.5029 0.7092
0.2415 18.5333 556 0.5241 0.7182 0.5241 0.7239
0.2415 18.6 558 0.5182 0.7182 0.5182 0.7198
0.2415 18.6667 560 0.5130 0.6919 0.5130 0.7162
0.2415 18.7333 562 0.5419 0.5966 0.5419 0.7361
0.2415 18.8 564 0.5330 0.5955 0.5330 0.7301
0.2415 18.8667 566 0.5194 0.6788 0.5194 0.7207
0.2415 18.9333 568 0.5737 0.6735 0.5737 0.7574
0.2415 19.0 570 0.6575 0.6019 0.6575 0.8109
0.2415 19.0667 572 0.6847 0.6190 0.6847 0.8275
0.2415 19.1333 574 0.6121 0.6495 0.6121 0.7824
0.2415 19.2 576 0.5473 0.6813 0.5473 0.7398
0.2415 19.2667 578 0.5380 0.6914 0.5380 0.7335
0.2415 19.3333 580 0.5466 0.6813 0.5466 0.7393
0.2415 19.4 582 0.5944 0.7050 0.5944 0.7710
0.2415 19.4667 584 0.6033 0.7195 0.6033 0.7767
0.2415 19.5333 586 0.5755 0.7253 0.5755 0.7586
0.2415 19.6 588 0.5446 0.6813 0.5446 0.7379
0.2415 19.6667 590 0.5386 0.6853 0.5386 0.7339
0.2415 19.7333 592 0.5620 0.6446 0.5620 0.7497
0.2415 19.8 594 0.6428 0.6081 0.6428 0.8017
0.2415 19.8667 596 0.7002 0.5343 0.7002 0.8368
0.2415 19.9333 598 0.6738 0.5864 0.6738 0.8209
0.2415 20.0 600 0.6141 0.6228 0.6141 0.7836
0.2415 20.0667 602 0.5548 0.7186 0.5548 0.7449
0.2415 20.1333 604 0.5339 0.7041 0.5339 0.7307
0.2415 20.2 606 0.5312 0.6995 0.5312 0.7288
0.2415 20.2667 608 0.5429 0.6758 0.5429 0.7368
0.2415 20.3333 610 0.5548 0.6758 0.5548 0.7448
0.2415 20.4 612 0.5608 0.6758 0.5608 0.7489
0.2415 20.4667 614 0.5739 0.7057 0.5739 0.7576
0.2415 20.5333 616 0.5504 0.7063 0.5504 0.7419
0.2415 20.6 618 0.5152 0.7033 0.5152 0.7178
0.2415 20.6667 620 0.5311 0.7033 0.5311 0.7288
0.2415 20.7333 622 0.5416 0.6634 0.5416 0.7359
0.2415 20.8 624 0.5478 0.6958 0.5478 0.7402
0.2415 20.8667 626 0.5787 0.6639 0.5787 0.7607
0.2415 20.9333 628 0.6283 0.6473 0.6283 0.7926
0.2415 21.0 630 0.6672 0.6035 0.6672 0.8168
0.2415 21.0667 632 0.6545 0.6473 0.6545 0.8090
0.2415 21.1333 634 0.6345 0.6473 0.6345 0.7965
0.2415 21.2 636 0.6009 0.6510 0.6009 0.7752
0.2415 21.2667 638 0.5769 0.6771 0.5769 0.7595
0.2415 21.3333 640 0.5731 0.6771 0.5731 0.7570
0.2415 21.4 642 0.5575 0.6664 0.5575 0.7467
0.2415 21.4667 644 0.5310 0.6833 0.5310 0.7287
0.2415 21.5333 646 0.5230 0.5759 0.5230 0.7232
0.2415 21.6 648 0.5233 0.5771 0.5233 0.7234
0.2415 21.6667 650 0.5284 0.6639 0.5284 0.7269
0.2415 21.7333 652 0.5817 0.6687 0.5817 0.7627
0.2415 21.8 654 0.6025 0.6687 0.6025 0.7762
0.2415 21.8667 656 0.6077 0.6664 0.6077 0.7796
0.2415 21.9333 658 0.6183 0.6706 0.6183 0.7863
0.2415 22.0 660 0.5984 0.6359 0.5984 0.7735
0.2415 22.0667 662 0.5896 0.6243 0.5896 0.7678
0.2415 22.1333 664 0.6029 0.6243 0.6029 0.7764
0.2415 22.2 666 0.6242 0.6796 0.6242 0.7901
0.2415 22.2667 668 0.6782 0.6003 0.6782 0.8235
0.2415 22.3333 670 0.7254 0.5413 0.7254 0.8517
0.2415 22.4 672 0.7282 0.5192 0.7282 0.8533
0.2415 22.4667 674 0.6971 0.5273 0.6971 0.8349
0.2415 22.5333 676 0.6698 0.5210 0.6698 0.8184
0.2415 22.6 678 0.6686 0.4982 0.6686 0.8177

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model details

  • Format: Safetensors
  • Model size: 0.1B params
  • Tensor type: F32