ArabicNewSplits8_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k16_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.0711
  • QWK: 0.5251
  • MSE: 1.0711
  • RMSE: 1.0349
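QWK (quadratic weighted kappa) measures agreement between predicted and gold ordinal scores, penalizing large disagreements quadratically; MSE/RMSE treat the scores as numbers. As a reference, here is a minimal pure-Python sketch of both metrics (the label values and class count in the examples are hypothetical, and sklearn's `cohen_kappa_score(..., weights="quadratic")` computes the same quantity):

```python
import math
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic disagreement weights (the 'QWK' metric)."""
    n = len(y_true)
    # Observed confusion matrix
    obs = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        obs[t][p] += 1
    # Expected matrix under independence, from the marginal label histograms
    ht, hp = Counter(y_true), Counter(y_pred)
    exp = [[ht[i] * hp[j] / n for j in range(n_classes)] for i in range(n_classes)]
    # Quadratic penalty grows with the squared distance between scores
    w = [[(i - j) ** 2 / (n_classes - 1) ** 2 for j in range(n_classes)]
         for i in range(n_classes)]
    num = sum(w[i][j] * obs[i][j] for i in range(n_classes) for j in range(n_classes))
    den = sum(w[i][j] * exp[i][j] for i in range(n_classes) for j in range(n_classes))
    return 1.0 - num / den

def rmse(y_true, y_pred):
    """Root mean squared error between score sequences."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))
```

Perfect agreement yields a kappa of 1.0, chance-level agreement yields 0.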

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
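With a linear scheduler, the learning rate decays from 2e-05 toward zero over the configured run. A minimal sketch of that schedule (the total-step count and zero warmup are assumptions: the log below implies roughly 79 optimizer steps per epoch, so 100 epochs is about 7,900 steps, and warmup is not stated in the card):

```python
def linear_lr(step, base_lr=2e-05, total_steps=7900, warmup_steps=0):
    # Mirrors a linear warmup-then-decay schedule; warmup_steps=0 is an
    # assumption (the card does not report a warmup setting).
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))
```

Halfway through the schedule the rate is half of `base_lr`, and it reaches zero at the final step.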

Training results

Training Loss  Epoch  Step  Validation Loss  QWK  MSE  RMSE
No log 0.0253 2 5.5257 -0.0284 5.5257 2.3507
No log 0.0506 4 3.1566 0.0597 3.1567 1.7767
No log 0.0759 6 2.4065 -0.1288 2.4065 1.5513
No log 0.1013 8 2.6228 -0.1596 2.6228 1.6195
No log 0.1266 10 1.8167 -0.0220 1.8167 1.3479
No log 0.1519 12 1.2412 0.2365 1.2412 1.1141
No log 0.1772 14 1.2650 0.2032 1.2650 1.1247
No log 0.2025 16 1.2830 0.2233 1.2830 1.1327
No log 0.2278 18 1.2668 0.2233 1.2668 1.1255
No log 0.2532 20 1.2421 0.2401 1.2421 1.1145
No log 0.2785 22 1.2512 0.2913 1.2512 1.1186
No log 0.3038 24 1.2933 0.2812 1.2933 1.1372
No log 0.3291 26 1.2919 0.3064 1.2919 1.1366
No log 0.3544 28 1.1810 0.2231 1.1810 1.0868
No log 0.3797 30 1.1324 0.2335 1.1324 1.0642
No log 0.4051 32 1.2156 0.2080 1.2156 1.1026
No log 0.4304 34 1.2581 0.2347 1.2581 1.1217
No log 0.4557 36 1.2374 0.2312 1.2374 1.1124
No log 0.4810 38 1.1507 0.2419 1.1507 1.0727
No log 0.5063 40 1.1426 0.1944 1.1426 1.0689
No log 0.5316 42 1.1628 0.2029 1.1628 1.0783
No log 0.5570 44 1.2321 0.2542 1.2321 1.1100
No log 0.5823 46 1.2874 0.1697 1.2874 1.1346
No log 0.6076 48 1.3445 0.1655 1.3445 1.1595
No log 0.6329 50 1.4509 0.0968 1.4509 1.2045
No log 0.6582 52 1.3949 0.1010 1.3949 1.1811
No log 0.6835 54 1.2871 0.2125 1.2871 1.1345
No log 0.7089 56 1.2100 0.2724 1.2100 1.1000
No log 0.7342 58 1.1767 0.2653 1.1767 1.0848
No log 0.7595 60 1.1267 0.2650 1.1267 1.0614
No log 0.7848 62 1.1139 0.2794 1.1139 1.0554
No log 0.8101 64 1.0953 0.2529 1.0953 1.0466
No log 0.8354 66 1.1340 0.2517 1.1340 1.0649
No log 0.8608 68 1.3108 0.2943 1.3108 1.1449
No log 0.8861 70 1.1675 0.2940 1.1675 1.0805
No log 0.9114 72 0.9667 0.4421 0.9667 0.9832
No log 0.9367 74 1.0984 0.3785 1.0984 1.0481
No log 0.9620 76 1.1611 0.2331 1.1611 1.0775
No log 0.9873 78 1.1240 0.1902 1.1240 1.0602
No log 1.0127 80 0.9465 0.3228 0.9465 0.9729
No log 1.0380 82 0.8963 0.4125 0.8963 0.9467
No log 1.0633 84 0.8852 0.5024 0.8852 0.9409
No log 1.0886 86 0.9244 0.5032 0.9244 0.9615
No log 1.1139 88 1.0836 0.4419 1.0836 1.0410
No log 1.1392 90 1.0973 0.4368 1.0973 1.0475
No log 1.1646 92 0.9255 0.5870 0.9255 0.9620
No log 1.1899 94 0.8551 0.5261 0.8551 0.9247
No log 1.2152 96 0.8466 0.5360 0.8466 0.9201
No log 1.2405 98 0.8331 0.5568 0.8331 0.9127
No log 1.2658 100 0.9148 0.5686 0.9148 0.9564
No log 1.2911 102 1.1770 0.4170 1.1770 1.0849
No log 1.3165 104 1.1114 0.4614 1.1114 1.0542
No log 1.3418 106 0.8516 0.5971 0.8516 0.9228
No log 1.3671 108 0.7946 0.5467 0.7946 0.8914
No log 1.3924 110 0.7911 0.5325 0.7911 0.8894
No log 1.4177 112 0.7969 0.5723 0.7969 0.8927
No log 1.4430 114 1.0347 0.4904 1.0347 1.0172
No log 1.4684 116 1.1838 0.4561 1.1838 1.0880
No log 1.4937 118 1.0973 0.4268 1.0973 1.0475
No log 1.5190 120 0.9149 0.3895 0.9149 0.9565
No log 1.5443 122 0.8437 0.4163 0.8437 0.9185
No log 1.5696 124 0.8538 0.4678 0.8538 0.9240
No log 1.5949 126 0.8438 0.5324 0.8438 0.9186
No log 1.6203 128 0.8016 0.5591 0.8016 0.8953
No log 1.6456 130 0.7595 0.5954 0.7595 0.8715
No log 1.6709 132 0.7801 0.6012 0.7801 0.8832
No log 1.6962 134 0.9394 0.5690 0.9394 0.9692
No log 1.7215 136 0.9692 0.5482 0.9692 0.9845
No log 1.7468 138 0.9923 0.5482 0.9923 0.9961
No log 1.7722 140 0.9266 0.5487 0.9266 0.9626
No log 1.7975 142 0.9325 0.5487 0.9325 0.9657
No log 1.8228 144 0.9964 0.5426 0.9964 0.9982
No log 1.8481 146 0.9480 0.5527 0.9480 0.9737
No log 1.8734 148 0.8246 0.6091 0.8246 0.9081
No log 1.8987 150 0.7499 0.6075 0.7499 0.8660
No log 1.9241 152 0.7736 0.5222 0.7736 0.8795
No log 1.9494 154 0.7508 0.6001 0.7508 0.8665
No log 1.9747 156 0.7540 0.5680 0.7540 0.8683
No log 2.0 158 1.0886 0.5257 1.0886 1.0434
No log 2.0253 160 1.4375 0.3769 1.4375 1.1989
No log 2.0506 162 1.3438 0.3787 1.3438 1.1592
No log 2.0759 164 1.0510 0.5339 1.0510 1.0252
No log 2.1013 166 0.9804 0.5766 0.9804 0.9902
No log 2.1266 168 0.9923 0.5557 0.9923 0.9962
No log 2.1519 170 1.0410 0.5495 1.0410 1.0203
No log 2.1772 172 1.2215 0.4705 1.2215 1.1052
No log 2.2025 174 1.3118 0.4574 1.3118 1.1454
No log 2.2278 176 1.0720 0.4674 1.0720 1.0354
No log 2.2532 178 0.8255 0.5009 0.8255 0.9086
No log 2.2785 180 0.7996 0.5061 0.7996 0.8942
No log 2.3038 182 0.7728 0.5362 0.7728 0.8791
No log 2.3291 184 0.9860 0.5576 0.9860 0.9930
No log 2.3544 186 1.2587 0.4787 1.2587 1.1219
No log 2.3797 188 1.1281 0.5391 1.1281 1.0621
No log 2.4051 190 0.8874 0.6057 0.8874 0.9420
No log 2.4304 192 0.8814 0.6069 0.8814 0.9388
No log 2.4557 194 0.8792 0.6069 0.8792 0.9377
No log 2.4810 196 0.9520 0.5718 0.9520 0.9757
No log 2.5063 198 0.9495 0.5664 0.9495 0.9744
No log 2.5316 200 0.9384 0.5896 0.9384 0.9687
No log 2.5570 202 0.8990 0.5899 0.8990 0.9482
No log 2.5823 204 0.8093 0.5756 0.8093 0.8996
No log 2.6076 206 0.7726 0.5911 0.7726 0.8790
No log 2.6329 208 0.8148 0.6430 0.8148 0.9026
No log 2.6582 210 0.8231 0.6353 0.8231 0.9073
No log 2.6835 212 0.8086 0.6465 0.8086 0.8992
No log 2.7089 214 0.8356 0.6336 0.8356 0.9141
No log 2.7342 216 1.0507 0.5562 1.0507 1.0250
No log 2.7595 218 1.1284 0.5306 1.1284 1.0623
No log 2.7848 220 0.9331 0.6023 0.9331 0.9660
No log 2.8101 222 0.7897 0.6354 0.7897 0.8886
No log 2.8354 224 0.7444 0.5991 0.7444 0.8628
No log 2.8608 226 0.7192 0.5072 0.7192 0.8481
No log 2.8861 228 0.7183 0.5303 0.7183 0.8475
No log 2.9114 230 0.7706 0.5801 0.7706 0.8778
No log 2.9367 232 0.8709 0.6014 0.8709 0.9332
No log 2.9620 234 0.8771 0.5849 0.8771 0.9365
No log 2.9873 236 0.7307 0.6278 0.7307 0.8548
No log 3.0127 238 0.6990 0.6526 0.6990 0.8360
No log 3.0380 240 0.7811 0.6112 0.7811 0.8838
No log 3.0633 242 0.9798 0.5501 0.9798 0.9899
No log 3.0886 244 1.1263 0.4887 1.1263 1.0613
No log 3.1139 246 1.0069 0.5154 1.0069 1.0035
No log 3.1392 248 0.8413 0.6132 0.8413 0.9173
No log 3.1646 250 0.8492 0.6191 0.8492 0.9215
No log 3.1899 252 0.9820 0.5471 0.9820 0.9910
No log 3.2152 254 0.9502 0.5607 0.9502 0.9748
No log 3.2405 256 0.9816 0.5306 0.9816 0.9908
No log 3.2658 258 0.9191 0.5806 0.9191 0.9587
No log 3.2911 260 0.7611 0.6566 0.7611 0.8724
No log 3.3165 262 0.6857 0.6187 0.6857 0.8281
No log 3.3418 264 0.6939 0.6100 0.6939 0.8330
No log 3.3671 266 0.7558 0.5857 0.7558 0.8694
No log 3.3924 268 0.8895 0.5979 0.8895 0.9431
No log 3.4177 270 0.9627 0.5418 0.9627 0.9812
No log 3.4430 272 0.9815 0.5216 0.9815 0.9907
No log 3.4684 274 0.8927 0.5756 0.8927 0.9448
No log 3.4937 276 0.7756 0.6419 0.7756 0.8807
No log 3.5190 278 0.7062 0.6310 0.7062 0.8403
No log 3.5443 280 0.7362 0.6062 0.7362 0.8580
No log 3.5696 282 0.8183 0.6006 0.8183 0.9046
No log 3.5949 284 0.7862 0.5923 0.7862 0.8867
No log 3.6203 286 0.7406 0.5693 0.7406 0.8606
No log 3.6456 288 0.7066 0.6023 0.7066 0.8406
No log 3.6709 290 0.7601 0.6422 0.7601 0.8718
No log 3.6962 292 0.8991 0.6003 0.8991 0.9482
No log 3.7215 294 0.8651 0.6235 0.8651 0.9301
No log 3.7468 296 0.9077 0.6056 0.9077 0.9528
No log 3.7722 298 0.7992 0.6607 0.7992 0.8940
No log 3.7975 300 0.6910 0.6630 0.6910 0.8312
No log 3.8228 302 0.6644 0.6411 0.6644 0.8151
No log 3.8481 304 0.6620 0.6458 0.6620 0.8136
No log 3.8734 306 0.7045 0.7064 0.7045 0.8394
No log 3.8987 308 0.8521 0.6501 0.8521 0.9231
No log 3.9241 310 0.8457 0.6257 0.8457 0.9196
No log 3.9494 312 0.7159 0.6922 0.7159 0.8461
No log 3.9747 314 0.6684 0.6320 0.6684 0.8176
No log 4.0 316 0.6600 0.6405 0.6600 0.8124
No log 4.0253 318 0.6480 0.6947 0.6480 0.8050
No log 4.0506 320 0.7264 0.6910 0.7264 0.8523
No log 4.0759 322 0.8107 0.6516 0.8107 0.9004
No log 4.1013 324 0.8716 0.6031 0.8716 0.9336
No log 4.1266 326 0.8140 0.6327 0.8140 0.9022
No log 4.1519 328 0.7816 0.6513 0.7816 0.8841
No log 4.1772 330 0.7386 0.6762 0.7386 0.8594
No log 4.2025 332 0.7881 0.6516 0.7881 0.8878
No log 4.2278 334 0.9617 0.5742 0.9617 0.9807
No log 4.2532 336 1.0193 0.5617 1.0193 1.0096
No log 4.2785 338 0.9113 0.5800 0.9113 0.9546
No log 4.3038 340 0.9373 0.5754 0.9373 0.9682
No log 4.3291 342 0.8481 0.6456 0.8481 0.9209
No log 4.3544 344 0.7480 0.6819 0.7480 0.8649
No log 4.3797 346 0.7104 0.6721 0.7104 0.8428
No log 4.4051 348 0.7054 0.6781 0.7054 0.8399
No log 4.4304 350 0.7696 0.6755 0.7696 0.8773
No log 4.4557 352 0.9425 0.5655 0.9425 0.9708
No log 4.4810 354 0.9342 0.5705 0.9342 0.9665
No log 4.5063 356 0.7615 0.6540 0.7615 0.8726
No log 4.5316 358 0.6296 0.6961 0.6296 0.7934
No log 4.5570 360 0.7224 0.6244 0.7224 0.8499
No log 4.5823 362 0.7297 0.6541 0.7297 0.8542
No log 4.6076 364 0.6553 0.7032 0.6553 0.8095
No log 4.6329 366 0.7017 0.6855 0.7017 0.8377
No log 4.6582 368 0.7928 0.6597 0.7928 0.8904
No log 4.6835 370 0.7915 0.6389 0.7915 0.8897
No log 4.7089 372 0.7663 0.6489 0.7663 0.8754
No log 4.7342 374 0.7643 0.6524 0.7643 0.8742
No log 4.7595 376 0.8162 0.6368 0.8162 0.9034
No log 4.7848 378 0.7897 0.6357 0.7897 0.8886
No log 4.8101 380 0.7275 0.6752 0.7275 0.8529
No log 4.8354 382 0.7312 0.6819 0.7312 0.8551
No log 4.8608 384 0.7978 0.6441 0.7978 0.8932
No log 4.8861 386 0.9281 0.6167 0.9281 0.9634
No log 4.9114 388 1.0447 0.5521 1.0447 1.0221
No log 4.9367 390 1.0431 0.5569 1.0431 1.0213
No log 4.9620 392 0.9319 0.6306 0.9319 0.9653
No log 4.9873 394 0.8273 0.6509 0.8273 0.9095
No log 5.0127 396 0.8245 0.6509 0.8245 0.9080
No log 5.0380 398 0.8703 0.6509 0.8703 0.9329
No log 5.0633 400 0.8842 0.6418 0.8842 0.9403
No log 5.0886 402 0.9514 0.6382 0.9514 0.9754
No log 5.1139 404 1.0410 0.5996 1.0410 1.0203
No log 5.1392 406 1.0141 0.5935 1.0141 1.0070
No log 5.1646 408 0.8526 0.6463 0.8526 0.9234
No log 5.1899 410 0.7092 0.6862 0.7092 0.8422
No log 5.2152 412 0.6872 0.6673 0.6872 0.8290
No log 5.2405 414 0.7088 0.6745 0.7088 0.8419
No log 5.2658 416 0.8355 0.6168 0.8355 0.9141
No log 5.2911 418 1.0864 0.5324 1.0864 1.0423
No log 5.3165 420 1.1367 0.5187 1.1367 1.0662
No log 5.3418 422 1.0593 0.5426 1.0593 1.0292
No log 5.3671 424 0.8965 0.6050 0.8965 0.9468
No log 5.3924 426 0.8765 0.6153 0.8765 0.9362
No log 5.4177 428 0.9431 0.6010 0.9431 0.9711
No log 5.4430 430 0.9898 0.5707 0.9898 0.9949
No log 5.4684 432 0.9342 0.6100 0.9342 0.9665
No log 5.4937 434 0.8317 0.6161 0.8317 0.9120
No log 5.5190 436 0.7647 0.6616 0.7647 0.8745
No log 5.5443 438 0.7567 0.6218 0.7567 0.8699
No log 5.5696 440 0.7900 0.6188 0.7900 0.8888
No log 5.5949 442 0.8990 0.5869 0.8990 0.9482
No log 5.6203 444 0.9907 0.5844 0.9907 0.9953
No log 5.6456 446 0.9957 0.5757 0.9957 0.9979
No log 5.6709 448 0.9121 0.5916 0.9121 0.9550
No log 5.6962 450 0.7962 0.6188 0.7962 0.8923
No log 5.7215 452 0.7704 0.6366 0.7704 0.8777
No log 5.7468 454 0.8030 0.6169 0.8030 0.8961
No log 5.7722 456 0.8734 0.6273 0.8734 0.9346
No log 5.7975 458 0.9334 0.5944 0.9334 0.9661
No log 5.8228 460 0.9907 0.5489 0.9907 0.9954
No log 5.8481 462 0.9461 0.5914 0.9461 0.9727
No log 5.8734 464 0.8355 0.6217 0.8355 0.9140
No log 5.8987 466 0.7307 0.6558 0.7307 0.8548
No log 5.9241 468 0.7107 0.6591 0.7107 0.8430
No log 5.9494 470 0.7290 0.7031 0.7290 0.8538
No log 5.9747 472 0.7985 0.6507 0.7985 0.8936
No log 6.0 474 0.9187 0.5717 0.9187 0.9585
No log 6.0253 476 0.9862 0.5707 0.9862 0.9931
No log 6.0506 478 0.9451 0.5718 0.9451 0.9721
No log 6.0759 480 0.8579 0.5862 0.8579 0.9262
No log 6.1013 482 0.8416 0.5937 0.8416 0.9174
No log 6.1266 484 0.8480 0.6080 0.8480 0.9209
No log 6.1519 486 0.8209 0.6168 0.8209 0.9060
No log 6.1772 488 0.8617 0.5987 0.8617 0.9283
No log 6.2025 490 0.8784 0.5972 0.8784 0.9372
No log 6.2278 492 0.8753 0.5955 0.8753 0.9356
No log 6.2532 494 0.8951 0.6004 0.8951 0.9461
No log 6.2785 496 0.8885 0.5918 0.8885 0.9426
No log 6.3038 498 0.9043 0.5847 0.9043 0.9509
0.4907 6.3291 500 0.8731 0.5839 0.8731 0.9344
0.4907 6.3544 502 0.7985 0.6217 0.7985 0.8936
0.4907 6.3797 504 0.7824 0.6299 0.7824 0.8846
0.4907 6.4051 506 0.7784 0.6427 0.7784 0.8823
0.4907 6.4304 508 0.7470 0.6618 0.7470 0.8643
0.4907 6.4557 510 0.6830 0.7015 0.6830 0.8264
0.4907 6.4810 512 0.6527 0.6958 0.6527 0.8079
0.4907 6.5063 514 0.6687 0.7078 0.6687 0.8178
0.4907 6.5316 516 0.6983 0.7049 0.6983 0.8356
0.4907 6.5570 518 0.7971 0.6261 0.7971 0.8928
0.4907 6.5823 520 0.8286 0.6313 0.8286 0.9103
0.4907 6.6076 522 0.7651 0.6458 0.7651 0.8747
0.4907 6.6329 524 0.6608 0.7328 0.6608 0.8129
0.4907 6.6582 526 0.6507 0.6925 0.6507 0.8066
0.4907 6.6835 528 0.6672 0.7130 0.6672 0.8168
0.4907 6.7089 530 0.7419 0.6647 0.7419 0.8614
0.4907 6.7342 532 0.8427 0.6257 0.8427 0.9180
0.4907 6.7595 534 0.8547 0.6240 0.8547 0.9245
0.4907 6.7848 536 0.7957 0.6250 0.7957 0.8920
0.4907 6.8101 538 0.7347 0.6427 0.7347 0.8571
0.4907 6.8354 540 0.7100 0.6455 0.7100 0.8426
0.4907 6.8608 542 0.7137 0.6633 0.7137 0.8448
0.4907 6.8861 544 0.7729 0.6601 0.7729 0.8792
0.4907 6.9114 546 0.9115 0.6013 0.9115 0.9547
0.4907 6.9367 548 0.9468 0.5910 0.9468 0.9730
0.4907 6.9620 550 0.8676 0.6200 0.8676 0.9315
0.4907 6.9873 552 0.7585 0.6590 0.7585 0.8709
0.4907 7.0127 554 0.7311 0.6827 0.7311 0.8550
0.4907 7.0380 556 0.7955 0.6448 0.7955 0.8919
0.4907 7.0633 558 0.9086 0.5836 0.9086 0.9532
0.4907 7.0886 560 0.9133 0.6059 0.9133 0.9557
0.4907 7.1139 562 0.8547 0.6078 0.8547 0.9245
0.4907 7.1392 564 0.8656 0.6078 0.8656 0.9304
0.4907 7.1646 566 0.8742 0.6140 0.8742 0.9350
0.4907 7.1899 568 0.9125 0.6210 0.9125 0.9553
0.4907 7.2152 570 0.9652 0.5890 0.9652 0.9824
0.4907 7.2405 572 0.9717 0.5968 0.9717 0.9857
0.4907 7.2658 574 0.9069 0.6166 0.9069 0.9523
0.4907 7.2911 576 0.8195 0.6357 0.8195 0.9052
0.4907 7.3165 578 0.7848 0.6207 0.7848 0.8859
0.4907 7.3418 580 0.8009 0.6127 0.8009 0.8949
0.4907 7.3671 582 0.8443 0.6181 0.8443 0.9189
0.4907 7.3924 584 0.8880 0.6251 0.8880 0.9423
0.4907 7.4177 586 0.8598 0.6251 0.8598 0.9273
0.4907 7.4430 588 0.7691 0.6336 0.7691 0.8770
0.4907 7.4684 590 0.7530 0.6338 0.7530 0.8678
0.4907 7.4937 592 0.7663 0.6345 0.7663 0.8754
0.4907 7.5190 594 0.7766 0.6350 0.7766 0.8812
0.4907 7.5443 596 0.8218 0.6245 0.8218 0.9065
0.4907 7.5696 598 0.8327 0.6414 0.8327 0.9125
0.4907 7.5949 600 0.8157 0.6315 0.8157 0.9032
0.4907 7.6203 602 0.8577 0.6314 0.8577 0.9261
0.4907 7.6456 604 0.8307 0.6261 0.8307 0.9114
0.4907 7.6709 606 0.8113 0.6430 0.8113 0.9007
0.4907 7.6962 608 0.8618 0.6352 0.8618 0.9283
0.4907 7.7215 610 0.9340 0.6124 0.9340 0.9664
0.4907 7.7468 612 0.9838 0.6149 0.9838 0.9919
0.4907 7.7722 614 0.9546 0.6191 0.9546 0.9770
0.4907 7.7975 616 0.8205 0.6190 0.8205 0.9058
0.4907 7.8228 618 0.7132 0.7170 0.7132 0.8445
0.4907 7.8481 620 0.6903 0.7278 0.6903 0.8309
0.4907 7.8734 622 0.6804 0.7278 0.6804 0.8249
0.4907 7.8987 624 0.6919 0.7255 0.6919 0.8318
0.4907 7.9241 626 0.8046 0.6667 0.8046 0.8970
0.4907 7.9494 628 1.0403 0.5663 1.0403 1.0199
0.4907 7.9747 630 1.1062 0.5539 1.1062 1.0518
0.4907 8.0 632 0.9922 0.5707 0.9922 0.9961
0.4907 8.0253 634 0.8504 0.6556 0.8504 0.9222
0.4907 8.0506 636 0.7733 0.6629 0.7733 0.8794
0.4907 8.0759 638 0.7138 0.6790 0.7138 0.8449
0.4907 8.1013 640 0.6831 0.7388 0.6831 0.8265
0.4907 8.1266 642 0.6840 0.7418 0.6840 0.8270
0.4907 8.1519 644 0.7097 0.7163 0.7097 0.8424
0.4907 8.1772 646 0.8018 0.6583 0.8018 0.8954
0.4907 8.2025 648 0.8388 0.6215 0.8388 0.9159
0.4907 8.2278 650 0.9170 0.5613 0.9170 0.9576
0.4907 8.2532 652 0.9969 0.5695 0.9969 0.9984
0.4907 8.2785 654 0.9606 0.5613 0.9606 0.9801
0.4907 8.3038 656 0.8336 0.6499 0.8336 0.9130
0.4907 8.3291 658 0.7395 0.6972 0.7395 0.8599
0.4907 8.3544 660 0.7128 0.7043 0.7128 0.8443
0.4907 8.3797 662 0.6928 0.7071 0.6928 0.8324
0.4907 8.4051 664 0.7382 0.6972 0.7382 0.8592
0.4907 8.4304 666 0.8750 0.6024 0.8750 0.9354
0.4907 8.4557 668 1.1542 0.5170 1.1542 1.0743
0.4907 8.4810 670 1.3885 0.4895 1.3885 1.1783
0.4907 8.5063 672 1.3724 0.4672 1.3724 1.1715
0.4907 8.5316 674 1.2611 0.4962 1.2611 1.1230
0.4907 8.5570 676 1.0711 0.5251 1.0711 1.0349
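The step/epoch ratio in the log also pins down the training-set size: epoch 2.0 falls at step 158, i.e. about 79 optimizer steps per epoch, which at batch size 8 implies on the order of 630 training examples (an inference from the log, not stated in the card). Note also that logging stops at epoch 8.56 (step 676) although 100 epochs were configured, which is consistent with early stopping on the validation loss. The arithmetic, for reference:

```python
# Back-of-the-envelope check derived from the training log (an inference,
# not stated in the card): epoch 2.0 corresponds to optimizer step 158.
steps_per_epoch = 158 // 2             # 79 optimizer steps per epoch
train_batch_size = 8
approx_train_examples = steps_per_epoch * train_batch_size
print(approx_train_examples)           # at most ~632; the last batch may be partial
```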

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (Safetensors, F32)
