ArabicNewSplits6_WithDuplicationsForScore5_FineTuningAraBERT_run3_AugV5_k17_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7337
  • Qwk: 0.7328
  • Mse: 0.7337
  • Rmse: 0.8566
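Qwk here denotes quadratic weighted Cohen's kappa, which scores agreement between predicted and reference ordinal labels while penalizing large disagreements more heavily. As a reference point for reading the metric, here is a minimal from-scratch sketch (the function name and toy labels below are illustrative, not taken from this training run):

```python
def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic weighted Cohen's kappa for ordinal labels 0..n_classes-1."""
    n = n_classes
    total = len(y_true)
    # Observed confusion matrix.
    observed = [[0] * n for _ in range(n)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # Expected matrix under chance agreement (outer product of marginals).
    hist_true = [sum(row) for row in observed]
    hist_pred = [sum(observed[i][j] for i in range(n)) for j in range(n)]
    expected = [[hist_true[i] * hist_pred[j] / total for j in range(n)]
                for i in range(n)]
    # Quadratic disagreement weights: w(i, j) = (i - j)^2 / (n - 1)^2.
    weight = lambda i, j: ((i - j) ** 2) / ((n - 1) ** 2)
    num = sum(weight(i, j) * observed[i][j] for i in range(n) for j in range(n))
    den = sum(weight(i, j) * expected[i][j] for i in range(n) for j in range(n))
    return 1.0 - num / den

# Perfect agreement gives 1.0; chance-level agreement gives 0.0.
print(quadratic_weighted_kappa([0, 1, 2, 1], [0, 1, 2, 1], 3))  # → 1.0
```

A Qwk of 0.7328, as reported above, therefore indicates substantial (but not perfect) agreement with the reference scores.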

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
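With lr_scheduler_type set to linear and no warmup steps listed, the learning rate presumably decays linearly from 2e-05 at the start of training to 0 at the final optimizer step (660 steps, per the results table). A minimal sketch of that schedule (the function name and total-step count are assumptions, not read from the training script):

```python
def linear_lr(step, total_steps=660, base_lr=2e-05):
    """Linearly decay the learning rate from base_lr at step 0 to 0 at total_steps."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

print(linear_lr(0))    # → 2e-05  (initial learning rate)
print(linear_lr(330))  # halfway through training: 1e-05
```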

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0303 2 2.1616 0.0658 2.1616 1.4702
No log 0.0606 4 1.4343 0.2297 1.4343 1.1976
No log 0.0909 6 1.4777 0.0873 1.4777 1.2156
No log 0.1212 8 1.4885 0.2143 1.4885 1.2200
No log 0.1515 10 1.6966 0.3023 1.6966 1.3025
No log 0.1818 12 1.7578 0.2930 1.7578 1.3258
No log 0.2121 14 1.6058 0.2877 1.6058 1.2672
No log 0.2424 16 1.5810 0.2877 1.5810 1.2574
No log 0.2727 18 1.4457 0.3501 1.4457 1.2024
No log 0.3030 20 1.2507 0.2821 1.2507 1.1184
No log 0.3333 22 1.2109 0.3081 1.2109 1.1004
No log 0.3636 24 1.1562 0.2889 1.1562 1.0753
No log 0.3939 26 1.1105 0.3582 1.1105 1.0538
No log 0.4242 28 1.0481 0.3874 1.0481 1.0238
No log 0.4545 30 1.0194 0.4444 1.0194 1.0097
No log 0.4848 32 1.0131 0.4499 1.0131 1.0065
No log 0.5152 34 1.0285 0.4459 1.0285 1.0141
No log 0.5455 36 0.9911 0.4080 0.9911 0.9955
No log 0.5758 38 1.1408 0.4700 1.1408 1.0681
No log 0.6061 40 1.1388 0.4679 1.1388 1.0671
No log 0.6364 42 1.0445 0.4739 1.0445 1.0220
No log 0.6667 44 1.0246 0.4796 1.0246 1.0122
No log 0.6970 46 0.9745 0.4572 0.9745 0.9872
No log 0.7273 48 0.9274 0.5182 0.9274 0.9630
No log 0.7576 50 1.0005 0.4850 1.0005 1.0003
No log 0.7879 52 1.0397 0.5486 1.0397 1.0196
No log 0.8182 54 0.9299 0.6049 0.9299 0.9643
No log 0.8485 56 0.9138 0.6437 0.9138 0.9559
No log 0.8788 58 1.1328 0.5977 1.1328 1.0643
No log 0.9091 60 0.9955 0.6356 0.9955 0.9978
No log 0.9394 62 0.7158 0.7143 0.7158 0.8461
No log 0.9697 64 0.6811 0.6851 0.6811 0.8253
No log 1.0 66 0.7649 0.6871 0.7649 0.8746
No log 1.0303 68 0.9231 0.6829 0.9231 0.9608
No log 1.0606 70 0.9880 0.6369 0.9880 0.9940
No log 1.0909 72 0.9943 0.6294 0.9943 0.9972
No log 1.1212 74 1.0174 0.6127 1.0174 1.0086
No log 1.1515 76 1.1187 0.6193 1.1187 1.0577
No log 1.1818 78 0.9760 0.6144 0.9760 0.9879
No log 1.2121 80 0.7338 0.7044 0.7338 0.8566
No log 1.2424 82 0.6299 0.7234 0.6299 0.7937
No log 1.2727 84 0.6307 0.7382 0.6307 0.7942
No log 1.3030 86 0.6982 0.7547 0.6982 0.8356
No log 1.3333 88 0.6416 0.7586 0.6416 0.8010
No log 1.3636 90 0.6506 0.7600 0.6506 0.8066
No log 1.3939 92 0.6354 0.7380 0.6354 0.7971
No log 1.4242 94 0.6378 0.7310 0.6378 0.7986
No log 1.4545 96 0.6180 0.6835 0.6180 0.7861
No log 1.4848 98 0.6743 0.6373 0.6743 0.8211
No log 1.5152 100 0.6509 0.6765 0.6509 0.8068
No log 1.5455 102 0.7904 0.7061 0.7904 0.8890
No log 1.5758 104 1.3744 0.5408 1.3744 1.1723
No log 1.6061 106 1.6337 0.4861 1.6337 1.2782
No log 1.6364 108 1.3277 0.5102 1.3277 1.1523
No log 1.6667 110 0.9183 0.5734 0.9183 0.9583
No log 1.6970 112 0.7899 0.6388 0.7899 0.8888
No log 1.7273 114 0.8017 0.6026 0.8017 0.8954
No log 1.7576 116 0.8231 0.6014 0.8231 0.9072
No log 1.7879 118 0.9939 0.5399 0.9939 0.9970
No log 1.8182 120 1.4725 0.5075 1.4725 1.2135
No log 1.8485 122 1.6551 0.4631 1.6551 1.2865
No log 1.8788 124 1.4356 0.5146 1.4356 1.1982
No log 1.9091 126 1.0958 0.5569 1.0958 1.0468
No log 1.9394 128 0.8795 0.6632 0.8795 0.9378
No log 1.9697 130 0.8625 0.6632 0.8625 0.9287
No log 2.0 132 1.0121 0.5885 1.0121 1.0060
No log 2.0303 134 1.2751 0.4889 1.2751 1.1292
No log 2.0606 136 1.3044 0.5476 1.3044 1.1421
No log 2.0909 138 1.0364 0.5821 1.0364 1.0181
No log 2.1212 140 0.7519 0.6994 0.7519 0.8671
No log 2.1515 142 0.7024 0.6647 0.7024 0.8381
No log 2.1818 144 0.7002 0.6606 0.7002 0.8368
No log 2.2121 146 0.7016 0.6773 0.7016 0.8376
No log 2.2424 148 0.7581 0.7255 0.7581 0.8707
No log 2.2727 150 0.8309 0.7105 0.8309 0.9115
No log 2.3030 152 0.8325 0.7141 0.8325 0.9124
No log 2.3333 154 0.7754 0.7058 0.7754 0.8805
No log 2.3636 156 0.7414 0.7186 0.7414 0.8610
No log 2.3939 158 0.7312 0.7272 0.7312 0.8551
No log 2.4242 160 0.8404 0.6986 0.8404 0.9167
No log 2.4545 162 0.8659 0.6867 0.8659 0.9306
No log 2.4848 164 0.8722 0.6945 0.8722 0.9339
No log 2.5152 166 0.8544 0.6718 0.8544 0.9243
No log 2.5455 168 0.9795 0.6747 0.9795 0.9897
No log 2.5758 170 1.1496 0.6355 1.1496 1.0722
No log 2.6061 172 1.1095 0.6547 1.1095 1.0534
No log 2.6364 174 1.1320 0.6361 1.1320 1.0640
No log 2.6667 176 1.1274 0.6499 1.1274 1.0618
No log 2.6970 178 1.0475 0.6658 1.0475 1.0235
No log 2.7273 180 1.0095 0.6801 1.0095 1.0047
No log 2.7576 182 1.0104 0.6768 1.0104 1.0052
No log 2.7879 184 1.0622 0.6659 1.0622 1.0306
No log 2.8182 186 1.0261 0.6781 1.0261 1.0130
No log 2.8485 188 1.0708 0.6644 1.0708 1.0348
No log 2.8788 190 1.1160 0.6652 1.1160 1.0564
No log 2.9091 192 0.9488 0.7024 0.9488 0.9741
No log 2.9394 194 0.7372 0.7227 0.7372 0.8586
No log 2.9697 196 0.6980 0.7574 0.6980 0.8354
No log 3.0 198 0.7782 0.7092 0.7782 0.8822
No log 3.0303 200 1.0433 0.6462 1.0433 1.0214
No log 3.0606 202 1.3473 0.5917 1.3473 1.1607
No log 3.0909 204 1.4392 0.5601 1.4392 1.1997
No log 3.1212 206 1.2599 0.5812 1.2599 1.1224
No log 3.1515 208 0.9600 0.6711 0.9600 0.9798
No log 3.1818 210 0.6993 0.7465 0.6993 0.8363
No log 3.2121 212 0.6378 0.7101 0.6378 0.7986
No log 3.2424 214 0.6373 0.7127 0.6373 0.7983
No log 3.2727 216 0.6743 0.7417 0.6743 0.8211
No log 3.3030 218 0.8714 0.7062 0.8714 0.9335
No log 3.3333 220 1.1712 0.6421 1.1712 1.0822
No log 3.3636 222 1.3487 0.6005 1.3487 1.1613
No log 3.3939 224 1.2784 0.6040 1.2784 1.1306
No log 3.4242 226 1.0594 0.6412 1.0594 1.0293
No log 3.4545 228 0.8495 0.7024 0.8495 0.9217
No log 3.4848 230 0.7902 0.7139 0.7902 0.8890
No log 3.5152 232 0.8445 0.7024 0.8445 0.9190
No log 3.5455 234 0.8732 0.6791 0.8732 0.9345
No log 3.5758 236 0.8594 0.6869 0.8594 0.9270
No log 3.6061 238 0.8480 0.6869 0.8480 0.9209
No log 3.6364 240 0.8344 0.7122 0.8344 0.9134
No log 3.6667 242 0.8495 0.6959 0.8495 0.9217
No log 3.6970 244 0.7909 0.7274 0.7909 0.8893
No log 3.7273 246 0.7712 0.7354 0.7712 0.8782
No log 3.7576 248 0.7946 0.7354 0.7946 0.8914
No log 3.7879 250 0.8615 0.7163 0.8615 0.9282
No log 3.8182 252 0.8174 0.7354 0.8174 0.9041
No log 3.8485 254 0.7488 0.7363 0.7488 0.8653
No log 3.8788 256 0.7869 0.7392 0.7869 0.8871
No log 3.9091 258 0.8452 0.7017 0.8452 0.9193
No log 3.9394 260 0.8464 0.7017 0.8464 0.9200
No log 3.9697 262 0.7826 0.7335 0.7826 0.8847
No log 4.0 264 0.6848 0.7307 0.6848 0.8275
No log 4.0303 266 0.6143 0.7514 0.6143 0.7838
No log 4.0606 268 0.5962 0.7439 0.5962 0.7721
No log 4.0909 270 0.6192 0.7591 0.6192 0.7869
No log 4.1212 272 0.6398 0.7476 0.6398 0.7999
No log 4.1515 274 0.6674 0.7432 0.6674 0.8169
No log 4.1818 276 0.6490 0.7531 0.6490 0.8056
No log 4.2121 278 0.5796 0.7527 0.5796 0.7613
No log 4.2424 280 0.5698 0.7322 0.5698 0.7549
No log 4.2727 282 0.5699 0.7283 0.5699 0.7549
No log 4.3030 284 0.5862 0.7418 0.5862 0.7656
No log 4.3333 286 0.6511 0.7489 0.6511 0.8069
No log 4.3636 288 0.6663 0.7363 0.6663 0.8163
No log 4.3939 290 0.6562 0.7421 0.6562 0.8101
No log 4.4242 292 0.6373 0.7489 0.6373 0.7983
No log 4.4545 294 0.6777 0.7378 0.6777 0.8232
No log 4.4848 296 0.7230 0.7371 0.7230 0.8503
No log 4.5152 298 0.8112 0.7143 0.8112 0.9006
No log 4.5455 300 0.8428 0.7022 0.8428 0.9180
No log 4.5758 302 0.8351 0.7022 0.8351 0.9138
No log 4.6061 304 0.7876 0.7170 0.7876 0.8875
No log 4.6364 306 0.7619 0.7329 0.7619 0.8729
No log 4.6667 308 0.7262 0.7453 0.7262 0.8522
No log 4.6970 310 0.7380 0.7329 0.7380 0.8591
No log 4.7273 312 0.8048 0.7079 0.8048 0.8971
No log 4.7576 314 0.8271 0.7082 0.8271 0.9094
No log 4.7879 316 0.7781 0.7288 0.7781 0.8821
No log 4.8182 318 0.7775 0.7247 0.7775 0.8818
No log 4.8485 320 0.8271 0.6962 0.8271 0.9094
No log 4.8788 322 0.8829 0.6868 0.8829 0.9396
No log 4.9091 324 0.9732 0.6803 0.9732 0.9865
No log 4.9394 326 0.9963 0.6794 0.9963 0.9982
No log 4.9697 328 0.9177 0.6955 0.9177 0.9580
No log 5.0 330 0.7753 0.7261 0.7753 0.8805
No log 5.0303 332 0.7198 0.7361 0.7198 0.8484
No log 5.0606 334 0.7489 0.7237 0.7489 0.8654
No log 5.0909 336 0.8046 0.6661 0.8046 0.8970
No log 5.1212 338 0.7660 0.7050 0.7660 0.8752
No log 5.1515 340 0.7362 0.7339 0.7362 0.8580
No log 5.1818 342 0.7332 0.7259 0.7332 0.8563
No log 5.2121 344 0.7152 0.7259 0.7152 0.8457
No log 5.2424 346 0.6741 0.7294 0.6741 0.8210
No log 5.2727 348 0.6526 0.7406 0.6526 0.8078
No log 5.3030 350 0.6558 0.7465 0.6558 0.8098
No log 5.3333 352 0.6866 0.7316 0.6866 0.8286
No log 5.3636 354 0.7367 0.7501 0.7367 0.8583
No log 5.3939 356 0.7401 0.7427 0.7401 0.8603
No log 5.4242 358 0.6952 0.7541 0.6952 0.8338
No log 5.4545 360 0.6652 0.7609 0.6652 0.8156
No log 5.4848 362 0.7083 0.7608 0.7083 0.8416
No log 5.5152 364 0.7124 0.7604 0.7124 0.8440
No log 5.5455 366 0.6837 0.7675 0.6837 0.8269
No log 5.5758 368 0.6641 0.7623 0.6641 0.8149
No log 5.6061 370 0.6762 0.7635 0.6762 0.8223
No log 5.6364 372 0.7602 0.7526 0.7602 0.8719
No log 5.6667 374 0.8616 0.6869 0.8616 0.9282
No log 5.6970 376 0.9061 0.6680 0.9061 0.9519
No log 5.7273 378 0.8439 0.7057 0.8439 0.9187
No log 5.7576 380 0.7219 0.7659 0.7219 0.8496
No log 5.7879 382 0.6719 0.7618 0.6719 0.8197
No log 5.8182 384 0.6313 0.7555 0.6313 0.7946
No log 5.8485 386 0.6104 0.7747 0.6104 0.7813
No log 5.8788 388 0.5909 0.7652 0.5909 0.7687
No log 5.9091 390 0.5988 0.7622 0.5988 0.7738
No log 5.9394 392 0.6558 0.7633 0.6558 0.8098
No log 5.9697 394 0.7751 0.7628 0.7751 0.8804
No log 6.0 396 0.8654 0.7127 0.8654 0.9303
No log 6.0303 398 0.8796 0.7086 0.8796 0.9379
No log 6.0606 400 0.8338 0.7212 0.8338 0.9132
No log 6.0909 402 0.7468 0.7650 0.7468 0.8642
No log 6.1212 404 0.6747 0.7461 0.6747 0.8214
No log 6.1515 406 0.6392 0.7560 0.6392 0.7995
No log 6.1818 408 0.6425 0.7564 0.6425 0.8016
No log 6.2121 410 0.6543 0.7560 0.6543 0.8089
No log 6.2424 412 0.6848 0.7321 0.6848 0.8276
No log 6.2727 414 0.6865 0.7433 0.6865 0.8286
No log 6.3030 416 0.7030 0.7433 0.7030 0.8384
No log 6.3333 418 0.6759 0.7529 0.6759 0.8221
No log 6.3636 420 0.6312 0.7625 0.6312 0.7945
No log 6.3939 422 0.6027 0.7672 0.6027 0.7763
No log 6.4242 424 0.5980 0.7580 0.5980 0.7733
No log 6.4545 426 0.6103 0.7672 0.6103 0.7812
No log 6.4848 428 0.6272 0.7625 0.6272 0.7920
No log 6.5152 430 0.6383 0.7583 0.6383 0.7989
No log 6.5455 432 0.6377 0.7601 0.6377 0.7986
No log 6.5758 434 0.6082 0.7659 0.6082 0.7799
No log 6.6061 436 0.6059 0.7659 0.6059 0.7784
No log 6.6364 438 0.6260 0.7611 0.6260 0.7912
No log 6.6667 440 0.6746 0.7356 0.6746 0.8214
No log 6.6970 442 0.7377 0.7408 0.7377 0.8589
No log 6.7273 444 0.8524 0.7150 0.8524 0.9233
No log 6.7576 446 0.8987 0.7146 0.8987 0.9480
No log 6.7879 448 0.8784 0.7074 0.8784 0.9372
No log 6.8182 450 0.8479 0.7180 0.8479 0.9208
No log 6.8485 452 0.8414 0.7219 0.8414 0.9173
No log 6.8788 454 0.8317 0.7295 0.8317 0.9120
No log 6.9091 456 0.8201 0.7352 0.8201 0.9056
No log 6.9394 458 0.7761 0.7327 0.7761 0.8810
No log 6.9697 460 0.7443 0.7387 0.7443 0.8627
No log 7.0 462 0.7402 0.7387 0.7402 0.8603
No log 7.0303 464 0.7490 0.7348 0.7490 0.8655
No log 7.0606 466 0.7462 0.7342 0.7462 0.8638
No log 7.0909 468 0.7283 0.7342 0.7283 0.8534
No log 7.1212 470 0.7240 0.7260 0.7240 0.8509
No log 7.1515 472 0.7448 0.7145 0.7448 0.8630
No log 7.1818 474 0.7942 0.7044 0.7942 0.8912
No log 7.2121 476 0.8080 0.7044 0.8080 0.8989
No log 7.2424 478 0.8018 0.7044 0.8018 0.8954
No log 7.2727 480 0.7673 0.7166 0.7673 0.8759
No log 7.3030 482 0.7231 0.7348 0.7231 0.8503
No log 7.3333 484 0.6862 0.7605 0.6862 0.8284
No log 7.3636 486 0.6792 0.7569 0.6792 0.8241
No log 7.3939 488 0.6664 0.7569 0.6664 0.8163
No log 7.4242 490 0.6808 0.7569 0.6808 0.8251
No log 7.4545 492 0.7219 0.7653 0.7219 0.8496
No log 7.4848 494 0.7682 0.7327 0.7682 0.8765
No log 7.5152 496 0.7904 0.7398 0.7904 0.8890
No log 7.5455 498 0.7872 0.7398 0.7872 0.8872
0.3479 7.5758 500 0.7765 0.7436 0.7765 0.8812
0.3479 7.6061 502 0.7461 0.7600 0.7461 0.8638
0.3479 7.6364 504 0.7158 0.7596 0.7158 0.8461
0.3479 7.6667 506 0.7281 0.7665 0.7281 0.8533
0.3479 7.6970 508 0.7433 0.7699 0.7433 0.8622
0.3479 7.7273 510 0.7394 0.7699 0.7394 0.8599
0.3479 7.7576 512 0.7237 0.7596 0.7237 0.8507
0.3479 7.7879 514 0.6857 0.7544 0.6857 0.8281
0.3479 7.8182 516 0.6537 0.7519 0.6537 0.8085
0.3479 7.8485 518 0.6432 0.7541 0.6432 0.8020
0.3479 7.8788 520 0.6536 0.7519 0.6536 0.8084
0.3479 7.9091 522 0.6748 0.7496 0.6748 0.8215
0.3479 7.9394 524 0.7119 0.7605 0.7119 0.8437
0.3479 7.9697 526 0.7694 0.7421 0.7694 0.8772
0.3479 8.0 528 0.7968 0.7209 0.7968 0.8927
0.3479 8.0303 530 0.7848 0.7174 0.7848 0.8859
0.3479 8.0606 532 0.7530 0.7538 0.7530 0.8677
0.3479 8.0909 534 0.7026 0.7605 0.7026 0.8382
0.3479 8.1212 536 0.6724 0.7551 0.6724 0.8200
0.3479 8.1515 538 0.6446 0.7502 0.6446 0.8029
0.3479 8.1818 540 0.6289 0.7629 0.6289 0.7930
0.3479 8.2121 542 0.6276 0.7629 0.6276 0.7922
0.3479 8.2424 544 0.6426 0.7656 0.6426 0.8016
0.3479 8.2727 546 0.6741 0.7551 0.6741 0.8211
0.3479 8.3030 548 0.7027 0.7635 0.7027 0.8383
0.3479 8.3333 550 0.7488 0.7535 0.7488 0.8653
0.3479 8.3636 552 0.7789 0.7174 0.7789 0.8825
0.3479 8.3939 554 0.7777 0.7174 0.7777 0.8819
0.3479 8.4242 556 0.7643 0.7254 0.7643 0.8742
0.3479 8.4545 558 0.7662 0.7174 0.7662 0.8753
0.3479 8.4848 560 0.7682 0.7174 0.7682 0.8765
0.3479 8.5152 562 0.7652 0.7174 0.7652 0.8748
0.3479 8.5455 564 0.7566 0.7174 0.7566 0.8698
0.3479 8.5758 566 0.7412 0.7381 0.7412 0.8609
0.3479 8.6061 568 0.7358 0.7384 0.7358 0.8578
0.3479 8.6364 570 0.7332 0.7384 0.7332 0.8563
0.3479 8.6667 572 0.7377 0.7383 0.7377 0.8589
0.3479 8.6970 574 0.7282 0.7384 0.7282 0.8533
0.3479 8.7273 576 0.7120 0.7541 0.7120 0.8438
0.3479 8.7576 578 0.7017 0.7541 0.7017 0.8377
0.3479 8.7879 580 0.7035 0.7541 0.7035 0.8388
0.3479 8.8182 582 0.7115 0.7541 0.7115 0.8435
0.3479 8.8485 584 0.7213 0.7423 0.7213 0.8493
0.3479 8.8788 586 0.7405 0.7383 0.7405 0.8605
0.3479 8.9091 588 0.7735 0.7134 0.7735 0.8795
0.3479 8.9394 590 0.7901 0.7094 0.7901 0.8889
0.3479 8.9697 592 0.7926 0.7128 0.7926 0.8903
0.3479 9.0 594 0.7804 0.7094 0.7804 0.8834
0.3479 9.0303 596 0.7599 0.7174 0.7599 0.8717
0.3479 9.0606 598 0.7510 0.7214 0.7510 0.8666
0.3479 9.0909 600 0.7453 0.7254 0.7453 0.8633
0.3479 9.1212 602 0.7480 0.7254 0.7480 0.8649
0.3479 9.1515 604 0.7512 0.7254 0.7512 0.8667
0.3479 9.1818 606 0.7559 0.7174 0.7559 0.8694
0.3479 9.2121 608 0.7609 0.7174 0.7609 0.8723
0.3479 9.2424 610 0.7586 0.7174 0.7586 0.8709
0.3479 9.2727 612 0.7537 0.7174 0.7537 0.8682
0.3479 9.3030 614 0.7422 0.7341 0.7422 0.8615
0.3479 9.3333 616 0.7413 0.7341 0.7413 0.8610
0.3479 9.3636 618 0.7346 0.7341 0.7346 0.8571
0.3479 9.3939 620 0.7249 0.7404 0.7249 0.8514
0.3479 9.4242 622 0.7168 0.7349 0.7168 0.8467
0.3479 9.4545 624 0.7101 0.7449 0.7101 0.8427
0.3479 9.4848 626 0.7120 0.7449 0.7120 0.8438
0.3479 9.5152 628 0.7122 0.7449 0.7122 0.8439
0.3479 9.5455 630 0.7128 0.7449 0.7128 0.8443
0.3479 9.5758 632 0.7168 0.7350 0.7168 0.8467
0.3479 9.6061 634 0.7221 0.7350 0.7221 0.8498
0.3479 9.6364 636 0.7249 0.7350 0.7249 0.8514
0.3479 9.6667 638 0.7292 0.7329 0.7292 0.8540
0.3479 9.6970 640 0.7350 0.7342 0.7350 0.8573
0.3479 9.7273 642 0.7380 0.7301 0.7380 0.8591
0.3479 9.7576 644 0.7369 0.7381 0.7369 0.8585
0.3479 9.7879 646 0.7366 0.7381 0.7366 0.8582
0.3479 9.8182 648 0.7363 0.7381 0.7363 0.8581
0.3479 9.8485 650 0.7358 0.7381 0.7358 0.8578
0.3479 9.8788 652 0.7356 0.7381 0.7356 0.8577
0.3479 9.9091 654 0.7345 0.7328 0.7345 0.8570
0.3479 9.9394 656 0.7336 0.7328 0.7336 0.8565
0.3479 9.9697 658 0.7337 0.7328 0.7337 0.8566
0.3479 10.0 660 0.7337 0.7328 0.7337 0.8566
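Throughout the table, the Rmse column is simply the square root of the Mse column; for example, the final evaluation row can be checked with:

```python
import math

# Final-epoch evaluation metrics reported above.
mse = 0.7337
rmse = math.sqrt(mse)
print(round(rmse, 4))  # → 0.8566, matching the reported Rmse
```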

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Safetensors

  • Model size: 0.1B params
  • Tensor type: F32

Model tree

MayBashendy/ArabicNewSplits6_WithDuplicationsForScore5_FineTuningAraBERT_run3_AugV5_k17_task5_organization is fine-tuned from aubmindlab/bert-base-arabertv02.