ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k20_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8331
  • Qwk (quadratic weighted kappa): -0.1268
  • Mse (mean squared error): 0.8331
  • Rmse (root mean squared error): 0.9128
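The three derived metrics above can be recomputed from raw predictions. Below is a minimal, dependency-free sketch (the function names are illustrative, not taken from the training script). Note that Loss and Mse coincide (0.8331) throughout this card, which suggests the model is trained as a regressor with an MSE objective:

```python
import math

def mse(y_true, y_pred):
    """Mean squared error over paired predictions."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    """Root mean squared error: the square root of MSE."""
    return math.sqrt(mse(y_true, y_pred))

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic penalty weights.

    Disagreements are penalised by (i - j)^2, so scores far from the
    true label cost more. 1.0 is perfect agreement, 0.0 is chance
    level, and negative values (as in the evaluation above) mean
    worse-than-chance agreement.
    """
    n = len(y_true)
    # Observed co-occurrence matrix and marginal histograms.
    observed = [[0.0] * n_classes for _ in range(n_classes)]
    hist_t = [0.0] * n_classes
    hist_p = [0.0] * n_classes
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
        hist_t[t] += 1
        hist_p[p] += 1
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic weight
            expected = hist_t[i] * hist_p[j] / n      # chance agreement
            num += w * observed[i][j]
            den += w * expected
    return 1.0 - num / den
```

The same quantities are available as `sklearn.metrics.mean_squared_error` and `sklearn.metrics.cohen_kappa_score(..., weights="quadratic")` if scikit-learn is an option.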

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
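With the linear scheduler and no warmup steps listed, the learning rate decays linearly from 2e-05 toward zero over the run. A small sketch of that decay rule (`transformers.get_linear_schedule_with_warmup` implements the same shape); the step count is an inference from the results table, not a value stated in the card:

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Linear schedule: ramp up over warmup_steps, then decay to 0."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

# The results table logs 50 optimizer steps per epoch, so a full
# 100-epoch run would span roughly 5000 steps.
total = 5000
print(linear_lr(0, total))     # base_lr at the start
print(linear_lr(2500, total))  # half of base_lr at the halfway point
print(linear_lr(5000, total))  # 0.0 at the end
```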

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.04 2 3.6257 -0.0057 3.6257 1.9041
No log 0.08 4 1.9928 0.0704 1.9928 1.4117
No log 0.12 6 1.7116 0.0150 1.7116 1.3083
No log 0.16 8 1.1726 -0.0704 1.1726 1.0829
No log 0.2 10 1.6868 -0.0199 1.6868 1.2988
No log 0.24 12 1.4820 -0.0164 1.4820 1.2174
No log 0.28 14 0.6937 0.0460 0.6937 0.8329
No log 0.32 16 0.6603 -0.0035 0.6603 0.8126
No log 0.36 18 0.6810 -0.0069 0.6810 0.8252
No log 0.4 20 0.8099 0.1342 0.8099 0.9000
No log 0.44 22 1.3260 -0.0736 1.3260 1.1515
No log 0.48 24 1.4067 -0.0253 1.4067 1.1860
No log 0.52 26 1.0163 -0.0117 1.0163 1.0081
No log 0.56 28 0.7820 -0.1230 0.7820 0.8843
No log 0.6 30 0.8128 -0.0287 0.8128 0.9015
No log 0.64 32 0.9257 -0.0490 0.9257 0.9621
No log 0.68 34 1.0670 -0.0997 1.0670 1.0330
No log 0.72 36 1.1205 -0.1288 1.1205 1.0586
No log 0.76 38 1.0598 -0.0955 1.0598 1.0294
No log 0.8 40 0.9949 -0.0200 0.9949 0.9974
No log 0.84 42 0.9422 -0.1265 0.9422 0.9706
No log 0.88 44 0.8977 -0.1259 0.8977 0.9475
No log 0.92 46 0.9070 -0.1261 0.9070 0.9524
No log 0.96 48 0.9594 -0.2000 0.9594 0.9795
No log 1.0 50 0.9379 -0.1261 0.9379 0.9684
No log 1.04 52 0.9579 -0.2380 0.9579 0.9787
No log 1.08 54 0.8348 0.0129 0.8348 0.9137
No log 1.12 56 0.8257 0.0129 0.8257 0.9087
No log 1.16 58 0.8459 0.0512 0.8459 0.9197
No log 1.2 60 0.8921 0.0867 0.8921 0.9445
No log 1.24 62 1.2661 -0.1286 1.2661 1.1252
No log 1.28 64 1.1348 -0.0930 1.1348 1.0653
No log 1.32 66 0.7749 0.0714 0.7749 0.8803
No log 1.36 68 0.6757 0.0 0.6757 0.8220
No log 1.4 70 0.6784 0.0 0.6784 0.8237
No log 1.44 72 0.7835 0.0374 0.7835 0.8851
No log 1.48 74 1.0483 -0.0861 1.0483 1.0239
No log 1.52 76 1.2567 -0.0936 1.2567 1.1210
No log 1.56 78 0.8483 0.0159 0.8483 0.9210
No log 1.6 80 0.7692 -0.0101 0.7692 0.8770
No log 1.64 82 0.7945 0.0759 0.7945 0.8914
No log 1.68 84 0.7830 0.0857 0.7830 0.8848
No log 1.72 86 0.9830 -0.0500 0.9830 0.9915
No log 1.76 88 0.9487 -0.0441 0.9487 0.9740
No log 1.8 90 0.7849 0.0416 0.7849 0.8859
No log 1.84 92 0.8495 0.0 0.8495 0.9217
No log 1.88 94 0.8485 -0.1635 0.8485 0.9211
No log 1.92 96 0.7695 0.0857 0.7695 0.8772
No log 1.96 98 1.1630 0.0147 1.1630 1.0784
No log 2.0 100 1.1231 -0.0500 1.1231 1.0598
No log 2.04 102 0.8717 -0.0351 0.8717 0.9336
No log 2.08 104 0.8717 -0.1354 0.8717 0.9336
No log 2.12 106 0.9177 -0.0228 0.9177 0.9580
No log 2.16 108 0.8312 0.0893 0.8312 0.9117
No log 2.2 110 0.9955 -0.1162 0.9955 0.9978
No log 2.24 112 1.0305 -0.0787 1.0305 1.0151
No log 2.28 114 0.9000 -0.0757 0.9000 0.9487
No log 2.32 116 0.8786 0.0062 0.8786 0.9374
No log 2.36 118 0.7940 0.0303 0.7940 0.8911
No log 2.4 120 0.7783 -0.0086 0.7783 0.8822
No log 2.44 122 0.7941 0.0600 0.7941 0.8911
No log 2.48 124 1.0235 0.0778 1.0235 1.0117
No log 2.52 126 1.3900 -0.0047 1.3900 1.1790
No log 2.56 128 0.9775 0.1348 0.9775 0.9887
No log 2.6 130 0.7952 -0.1466 0.7952 0.8918
No log 2.64 132 0.8700 -0.1547 0.8700 0.9327
No log 2.68 134 0.8662 -0.0366 0.8662 0.9307
No log 2.72 136 0.8586 0.0081 0.8586 0.9266
No log 2.76 138 1.2339 0.0502 1.2339 1.1108
No log 2.8 140 1.4590 0.0022 1.4590 1.2079
No log 2.84 142 1.0689 -0.1644 1.0689 1.0339
No log 2.88 144 1.0655 0.0451 1.0655 1.0322
No log 2.92 146 1.0885 -0.0210 1.0885 1.0433
No log 2.96 148 0.9345 0.0776 0.9345 0.9667
No log 3.0 150 0.7736 0.0 0.7736 0.8796
No log 3.04 152 0.9066 0.0277 0.9066 0.9522
No log 3.08 154 1.2719 -0.0063 1.2719 1.1278
No log 3.12 156 1.1787 -0.0029 1.1787 1.0857
No log 3.16 158 0.8887 0.0786 0.8887 0.9427
No log 3.2 160 0.7484 0.0964 0.7484 0.8651
No log 3.24 162 0.7313 0.0506 0.7313 0.8552
No log 3.28 164 0.7594 -0.0541 0.7594 0.8714
No log 3.32 166 0.7962 0.0375 0.7962 0.8923
No log 3.36 168 0.8972 0.0456 0.8972 0.9472
No log 3.4 170 0.8280 0.0414 0.8280 0.9099
No log 3.44 172 0.8096 -0.0541 0.8096 0.8998
No log 3.48 174 0.7968 0.0460 0.7968 0.8926
No log 3.52 176 0.8415 -0.0160 0.8415 0.9173
No log 3.56 178 0.9841 -0.0441 0.9841 0.9920
No log 3.6 180 0.9745 -0.1259 0.9745 0.9872
No log 3.64 182 0.8751 -0.1094 0.8751 0.9355
No log 3.68 184 0.8670 -0.0513 0.8670 0.9311
No log 3.72 186 0.9062 -0.0690 0.9062 0.9520
No log 3.76 188 1.0034 -0.1261 1.0034 1.0017
No log 3.8 190 0.9312 0.0639 0.9312 0.9650
No log 3.84 192 0.8867 -0.0163 0.8867 0.9417
No log 3.88 194 0.8404 -0.1542 0.8404 0.9168
No log 3.92 196 0.8503 -0.0118 0.8503 0.9221
No log 3.96 198 0.8541 -0.0578 0.8541 0.9242
No log 4.0 200 0.8678 0.0914 0.8678 0.9316
No log 4.04 202 0.8391 -0.0560 0.8391 0.9160
No log 4.08 204 0.8636 -0.0550 0.8636 0.9293
No log 4.12 206 0.8857 -0.1176 0.8857 0.9411
No log 4.16 208 0.9801 -0.0283 0.9801 0.9900
No log 4.2 210 0.9104 -0.0739 0.9104 0.9542
No log 4.24 212 0.8312 -0.0035 0.8312 0.9117
No log 4.28 214 0.8655 -0.1628 0.8655 0.9303
No log 4.32 216 0.8861 -0.1266 0.8861 0.9413
No log 4.36 218 0.8736 -0.0541 0.8736 0.9347
No log 4.4 220 0.8697 -0.0131 0.8697 0.9326
No log 4.44 222 0.8544 -0.0609 0.8544 0.9243
No log 4.48 224 0.8615 -0.0096 0.8615 0.9281
No log 4.52 226 0.8824 -0.1172 0.8824 0.9394
No log 4.56 228 0.9058 -0.1331 0.9058 0.9517
No log 4.6 230 0.9543 -0.1671 0.9543 0.9769
No log 4.64 232 0.9540 -0.1643 0.9540 0.9767
No log 4.68 234 0.9554 -0.1229 0.9554 0.9775
No log 4.72 236 0.9515 -0.1200 0.9515 0.9754
No log 4.76 238 0.9355 -0.0820 0.9355 0.9672
No log 4.8 240 0.8618 -0.1665 0.8618 0.9283
No log 4.84 242 0.8379 -0.0849 0.8379 0.9154
No log 4.88 244 0.8347 0.0964 0.8347 0.9136
No log 4.92 246 0.8368 -0.0215 0.8368 0.9148
No log 4.96 248 0.9370 -0.0823 0.9370 0.9680
No log 5.0 250 0.8841 -0.1244 0.8841 0.9403
No log 5.04 252 0.8218 -0.0560 0.8218 0.9065
No log 5.08 254 0.8522 -0.1397 0.8522 0.9231
No log 5.12 256 0.8744 -0.2538 0.8744 0.9351
No log 5.16 258 0.8771 -0.2022 0.8771 0.9365
No log 5.2 260 0.9530 -0.1191 0.9530 0.9762
No log 5.24 262 0.9585 -0.0755 0.9585 0.9790
No log 5.28 264 0.8851 -0.2720 0.8851 0.9408
No log 5.32 266 0.9677 -0.0393 0.9677 0.9837
No log 5.36 268 0.9480 -0.0425 0.9480 0.9736
No log 5.4 270 0.8236 -0.0967 0.8236 0.9075
No log 5.44 272 0.8250 -0.1094 0.8250 0.9083
No log 5.48 274 0.8447 -0.1466 0.8447 0.9190
No log 5.52 276 0.8517 -0.1531 0.8517 0.9229
No log 5.56 278 0.8969 -0.2489 0.8969 0.9470
No log 5.6 280 1.0477 -0.0143 1.0477 1.0236
No log 5.64 282 1.0985 0.0458 1.0985 1.0481
No log 5.68 284 0.9231 -0.1633 0.9231 0.9608
No log 5.72 286 0.8717 -0.1180 0.8717 0.9336
No log 5.76 288 0.8516 -0.0753 0.8516 0.9228
No log 5.8 290 0.9171 -0.1246 0.9171 0.9576
No log 5.84 292 0.8727 -0.0753 0.8727 0.9342
No log 5.88 294 0.7906 0.0460 0.7906 0.8891
No log 5.92 296 0.7880 -0.0065 0.7880 0.8877
No log 5.96 298 0.7945 -0.0560 0.7945 0.8914
No log 6.0 300 0.8164 -0.0984 0.8164 0.9036
No log 6.04 302 0.8317 -0.0984 0.8317 0.9120
No log 6.08 304 0.8550 -0.1273 0.8550 0.9247
No log 6.12 306 0.8956 -0.0949 0.8956 0.9464
No log 6.16 308 0.8662 -0.0426 0.8662 0.9307
No log 6.2 310 0.9217 -0.0262 0.9217 0.9600
No log 6.24 312 0.8966 -0.0262 0.8966 0.9469
No log 6.28 314 0.8765 -0.0849 0.8765 0.9362
No log 6.32 316 0.9213 -0.0941 0.9213 0.9599
No log 6.36 318 0.8665 -0.2056 0.8665 0.9309
No log 6.4 320 0.8149 0.1024 0.8149 0.9027
No log 6.44 322 0.8413 -0.0274 0.8413 0.9172
No log 6.48 324 0.8233 0.1024 0.8233 0.9074
No log 6.52 326 0.8755 -0.1745 0.8755 0.9357
No log 6.56 328 0.8700 -0.1823 0.8700 0.9328
No log 6.6 330 0.8362 -0.1026 0.8362 0.9144
No log 6.64 332 0.8349 -0.1026 0.8349 0.9137
No log 6.68 334 0.8396 -0.0612 0.8396 0.9163
No log 6.72 336 0.8392 -0.1527 0.8392 0.9161
No log 6.76 338 0.8509 -0.0967 0.8509 0.9224
No log 6.8 340 0.8670 -0.1204 0.8670 0.9311
No log 6.84 342 0.8415 -0.0912 0.8415 0.9173
No log 6.88 344 0.8481 -0.0524 0.8481 0.9209
No log 6.92 346 0.8784 -0.0563 0.8784 0.9372
No log 6.96 348 0.8684 -0.0921 0.8684 0.9319
No log 7.0 350 0.8741 -0.0859 0.8741 0.9350
No log 7.04 352 0.8552 -0.0859 0.8552 0.9248
No log 7.08 354 0.8181 -0.1026 0.8181 0.9045
No log 7.12 356 0.8011 -0.0595 0.8011 0.8950
No log 7.16 358 0.7839 -0.0578 0.7839 0.8854
No log 7.2 360 0.7854 -0.1026 0.7854 0.8862
No log 7.24 362 0.8008 -0.0967 0.8008 0.8949
No log 7.28 364 0.8053 -0.0976 0.8053 0.8974
No log 7.32 366 0.8119 -0.0488 0.8119 0.9010
No log 7.36 368 0.8221 -0.0488 0.8221 0.9067
No log 7.4 370 0.8419 -0.0488 0.8419 0.9175
No log 7.44 372 0.8706 -0.0999 0.8706 0.9331
No log 7.48 374 0.8807 -0.0870 0.8807 0.9385
No log 7.52 376 0.9601 -0.1156 0.9601 0.9798
No log 7.56 378 0.9551 -0.0838 0.9551 0.9773
No log 7.6 380 0.9284 -0.0238 0.9284 0.9635
No log 7.64 382 0.9099 -0.0116 0.9099 0.9539
No log 7.68 384 0.8460 -0.0488 0.8460 0.9198
No log 7.72 386 0.8067 0.0964 0.8067 0.8981
No log 7.76 388 0.8111 -0.0711 0.8111 0.9006
No log 7.8 390 0.7890 0.0964 0.7890 0.8882
No log 7.84 392 0.7815 -0.0069 0.7815 0.8840
No log 7.88 394 0.7757 -0.0035 0.7757 0.8808
No log 7.92 396 0.7609 0.0460 0.7609 0.8723
No log 7.96 398 0.7699 0.0964 0.7699 0.8774
No log 8.0 400 0.8083 0.1259 0.8083 0.8991
No log 8.04 402 0.8041 0.0394 0.8041 0.8967
No log 8.08 404 0.8281 -0.1331 0.8281 0.9100
No log 8.12 406 0.8805 -0.0996 0.8805 0.9384
No log 8.16 408 0.8833 -0.1648 0.8833 0.9398
No log 8.2 410 0.9010 -0.0963 0.9010 0.9492
No log 8.24 412 0.9135 -0.1026 0.9135 0.9558
No log 8.28 414 0.8902 -0.0588 0.8902 0.9435
No log 8.32 416 0.8425 -0.0992 0.8425 0.9179
No log 8.36 418 0.8382 -0.0879 0.8382 0.9155
No log 8.4 420 0.8179 -0.0506 0.8179 0.9044
No log 8.44 422 0.8108 -0.0506 0.8108 0.9004
No log 8.48 424 0.8037 -0.0992 0.8037 0.8965
No log 8.52 426 0.8135 -0.0152 0.8135 0.9019
No log 8.56 428 0.8456 0.1148 0.8456 0.9196
No log 8.6 430 0.8883 0.0017 0.8883 0.9425
No log 8.64 432 0.8130 0.1202 0.8130 0.9016
No log 8.68 434 0.7716 0.0395 0.7716 0.8784
No log 8.72 436 0.7889 -0.0488 0.7889 0.8882
No log 8.76 438 0.8187 -0.0488 0.8187 0.9048
No log 8.8 440 0.8886 -0.0735 0.8886 0.9427
No log 8.84 442 0.9028 -0.0735 0.9028 0.9502
No log 8.88 444 0.8719 -0.0669 0.8719 0.9338
No log 8.92 446 0.7837 -0.0550 0.7837 0.8853
No log 8.96 448 0.7561 -0.0032 0.7561 0.8695
No log 9.0 450 0.7462 0.0964 0.7462 0.8638
No log 9.04 452 0.8041 0.1202 0.8041 0.8967
No log 9.08 454 0.8836 0.0909 0.8836 0.9400
No log 9.12 456 0.8029 -0.0612 0.8029 0.8960
No log 9.16 458 0.8104 0.0030 0.8104 0.9002
No log 9.2 460 0.8186 0.0030 0.8186 0.9048
No log 9.24 462 0.8116 -0.0567 0.8116 0.9009
No log 9.28 464 0.9346 0.0909 0.9346 0.9667
No log 9.32 466 0.8911 0.0549 0.8911 0.9440
No log 9.36 468 0.7826 0.0414 0.7826 0.8846
No log 9.4 470 0.7792 0.0414 0.7792 0.8827
No log 9.44 472 0.7842 0.0471 0.7842 0.8856
No log 9.48 474 0.7865 0.0471 0.7865 0.8868
No log 9.52 476 0.7914 0.0454 0.7914 0.8896
No log 9.56 478 0.8689 -0.0264 0.8689 0.9322
No log 9.6 480 0.8855 -0.0766 0.8855 0.9410
No log 9.64 482 0.8322 -0.0096 0.8322 0.9122
No log 9.68 484 0.8126 -0.1074 0.8126 0.9014
No log 9.72 486 0.8097 -0.0541 0.8097 0.8999
No log 9.76 488 0.8465 -0.0215 0.8465 0.9201
No log 9.8 490 0.9987 -0.0878 0.9987 0.9993
No log 9.84 492 0.9381 -0.0033 0.9381 0.9686
No log 9.88 494 0.7774 0.1379 0.7774 0.8817
No log 9.92 496 0.7573 0.1021 0.7573 0.8702
No log 9.96 498 0.7903 0.0 0.7903 0.8890
0.3216 10.0 500 0.8344 -0.0859 0.8344 0.9135
0.3216 10.04 502 0.8788 -0.1916 0.8788 0.9374
0.3216 10.08 504 0.9099 -0.0699 0.9099 0.9539
0.3216 10.12 506 0.9053 -0.0271 0.9053 0.9515
0.3216 10.16 508 0.8860 -0.0462 0.8860 0.9413
0.3216 10.2 510 0.8876 -0.0370 0.8876 0.9421
0.3216 10.24 512 0.8830 -0.0407 0.8830 0.9397
0.3216 10.28 514 0.8790 -0.0027 0.8790 0.9376
0.3216 10.32 516 0.8731 -0.0567 0.8731 0.9344
0.3216 10.36 518 0.8793 -0.1982 0.8793 0.9377
0.3216 10.4 520 0.8897 -0.0731 0.8897 0.9433
0.3216 10.44 522 0.9038 0.0071 0.9038 0.9507
0.3216 10.48 524 0.8634 -0.0264 0.8634 0.9292
0.3216 10.52 526 0.8112 -0.0513 0.8112 0.9006
0.3216 10.56 528 0.8245 -0.1468 0.8245 0.9080
0.3216 10.6 530 0.8200 -0.1893 0.8200 0.9055
0.3216 10.64 532 0.7900 0.1021 0.7900 0.8888
0.3216 10.68 534 0.8718 0.0191 0.8718 0.9337
0.3216 10.72 536 0.8742 0.0191 0.8742 0.9350
0.3216 10.76 538 0.7926 0.0964 0.7926 0.8903
0.3216 10.8 540 0.7922 -0.0033 0.7922 0.8900
0.3216 10.84 542 0.8437 -0.1329 0.8437 0.9186
0.3216 10.88 544 0.8569 -0.1329 0.8569 0.9257
0.3216 10.92 546 0.8228 -0.0949 0.8228 0.9071
0.3216 10.96 548 0.7785 -0.0065 0.7785 0.8823
0.3216 11.0 550 0.9217 -0.0425 0.9217 0.9600
0.3216 11.04 552 0.9372 -0.0442 0.9372 0.9681
0.3216 11.08 554 0.8567 -0.0096 0.8567 0.9256
0.3216 11.12 556 0.8481 -0.1397 0.8481 0.9209
0.3216 11.16 558 0.8444 -0.0967 0.8444 0.9189
0.3216 11.2 560 0.8329 -0.0030 0.8329 0.9126
0.3216 11.24 562 0.8243 0.0964 0.8243 0.9079
0.3216 11.28 564 0.8197 0.0964 0.8197 0.9054
0.3216 11.32 566 0.8070 0.0964 0.8070 0.8983
0.3216 11.36 568 0.7952 0.0964 0.7952 0.8918
0.3216 11.4 570 0.7942 0.1021 0.7942 0.8912
0.3216 11.44 572 0.8047 -0.0065 0.8047 0.8970
0.3216 11.48 574 0.8270 0.0395 0.8270 0.9094
0.3216 11.52 576 0.8446 -0.0488 0.8446 0.9190
0.3216 11.56 578 0.8552 -0.0912 0.8552 0.9248
0.3216 11.6 580 0.8577 -0.0859 0.8577 0.9261
0.3216 11.64 582 0.8385 -0.1268 0.8385 0.9157
0.3216 11.68 584 0.8126 -0.0912 0.8126 0.9014
0.3216 11.72 586 0.8025 0.0479 0.8025 0.8958
0.3216 11.76 588 0.8195 0.0964 0.8195 0.9053
0.3216 11.8 590 0.8008 0.1021 0.8008 0.8949
0.3216 11.84 592 0.8113 0.0479 0.8113 0.9007
0.3216 11.88 594 0.8293 0.0479 0.8293 0.9107
0.3216 11.92 596 0.8415 -0.0644 0.8415 0.9173
0.3216 11.96 598 0.8781 -0.1172 0.8781 0.9371
0.3216 12.0 600 0.9347 -0.1197 0.9347 0.9668
0.3216 12.04 602 0.9247 -0.1197 0.9247 0.9616
0.3216 12.08 604 0.8911 -0.2354 0.8911 0.9440
0.3216 12.12 606 0.8869 0.0085 0.8869 0.9418
0.3216 12.16 608 0.8795 0.0085 0.8795 0.9378
0.3216 12.2 610 0.8643 -0.0513 0.8643 0.9297
0.3216 12.24 612 0.8470 -0.0595 0.8470 0.9203
0.3216 12.28 614 0.9059 -0.0778 0.9059 0.9518
0.3216 12.32 616 1.0279 -0.1261 1.0279 1.0138
0.3216 12.36 618 1.0674 -0.0886 1.0674 1.0331
0.3216 12.4 620 1.0061 -0.1261 1.0061 1.0030
0.3216 12.44 622 0.9239 -0.0823 0.9239 0.9612
0.3216 12.48 624 0.8156 0.0395 0.8156 0.9031
0.3216 12.52 626 0.8320 0.0496 0.8320 0.9121
0.3216 12.56 628 0.8646 0.0031 0.8646 0.9298
0.3216 12.6 630 0.8433 -0.1018 0.8433 0.9183
0.3216 12.64 632 0.8277 -0.1397 0.8277 0.9098
0.3216 12.68 634 0.8231 -0.1397 0.8231 0.9073
0.3216 12.72 636 0.8432 0.0061 0.8432 0.9183
0.3216 12.76 638 0.8310 -0.0939 0.8310 0.9116
0.3216 12.8 640 0.8053 -0.0033 0.8053 0.8974
0.3216 12.84 642 0.8376 0.1318 0.8376 0.9152
0.3216 12.88 644 0.8507 0.0191 0.8507 0.9223
0.3216 12.92 646 0.8666 0.0628 0.8666 0.9309
0.3216 12.96 648 0.8475 0.0247 0.8475 0.9206
0.3216 13.0 650 0.8257 0.0768 0.8257 0.9087
0.3216 13.04 652 0.8219 0.1259 0.8219 0.9066
0.3216 13.08 654 0.8117 0.0967 0.8117 0.9009
0.3216 13.12 656 0.8413 0.1148 0.8413 0.9172
0.3216 13.16 658 0.8216 0.0449 0.8216 0.9064
0.3216 13.2 660 0.8061 -0.1397 0.8061 0.8978
0.3216 13.24 662 0.8530 -0.1535 0.8530 0.9236
0.3216 13.28 664 0.8822 -0.1709 0.8822 0.9393
0.3216 13.32 666 0.8673 -0.1905 0.8673 0.9313
0.3216 13.36 668 0.8331 -0.1268 0.8331 0.9128
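Training ran well past the best validation loss (0.6603 at epoch 0.32), and the final checkpoint's Qwk is negative, so checkpoint selection matters for this run. A minimal sketch for picking the best row from logs shaped like the table above, using a few rows copied from it:

```python
# Each row: (epoch, step, val_loss, qwk) parsed from the trainer log.
rows = [
    (0.28, 14, 0.6937, 0.0460),
    (0.32, 16, 0.6603, -0.0035),
    (1.36, 68, 0.6757, 0.0),
    (13.36, 668, 0.8331, -0.1268),  # final checkpoint
]

# Lowest validation loss and highest agreement need not be the
# same checkpoint, as these rows show.
best_loss = min(rows, key=lambda r: r[2])
best_qwk = max(rows, key=lambda r: r[3])
print(best_loss)  # (0.32, 16, 0.6603, -0.0035)
print(best_qwk)   # (0.28, 14, 0.6937, 0.046)
```

With `transformers.Trainer`, the same effect comes from `load_best_model_at_end=True` together with `metric_for_best_model` in `TrainingArguments`.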

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model size: 0.1B params (F32, Safetensors)