ArabicNewSplits6_FineTuningAraBERT_run1_AugV5_k16_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8044
  • Qwk (quadratic weighted kappa): 0.6805
  • Mse (mean squared error): 0.8044
  • Rmse (root mean squared error): 0.8969
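Qwk here is Cohen's kappa with quadratic weights, the standard agreement metric for ordinal scores such as essay grades. Loss equals Mse, which suggests the model was trained as a regressor with an MSE objective, so Rmse is simply its square root. A from-scratch sketch of both metrics (pure Python, illustrative only):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights (the Qwk metric above).
    Labels are assumed to be integer scores in [0, n_classes)."""
    n = len(y_true)
    observed = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    hist_true = [y_true.count(c) for c in range(n_classes)]
    hist_pred = [y_pred.count(c) for c in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            weight = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic penalty
            expected = hist_true[i] * hist_pred[j] / n    # chance agreement
            num += weight * observed[i][j]
            den += weight * expected
    return 1.0 - num / den

# Perfect agreement gives kappa = 1; Rmse is the square root of Mse.
print(quadratic_weighted_kappa([0, 1, 2, 3], [0, 1, 2, 3], 4))  # 1.0
print(round(math.sqrt(0.8044), 4))  # 0.8969
```

In practice the same number can be obtained with `sklearn.metrics.cohen_kappa_score(y_true, y_pred, weights="quadratic")`.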

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
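The table below shows 80 optimizer steps per epoch (epoch 1.0 falls at step 80), so 10 epochs correspond to 800 steps in total. With a linear scheduler and no warmup, the learning rate decays from 2e-5 to 0 over that span. A minimal sketch of the schedule, assuming zero warmup steps:

```python
def linear_lr(step, base_lr=2e-5, total_steps=800, warmup_steps=0):
    """Linear schedule as used by Transformers' linear lr_scheduler_type:
    ramp up over warmup_steps, then decay linearly to 0 at total_steps."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

# Learning rate at the start, midpoint, and end of training
print(linear_lr(0), linear_lr(400), linear_lr(800))  # 2e-05 1e-05 0.0
```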

Training results

Training loss is only logged every 500 steps, so earlier rows show "No log" in the first column.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.025 2 5.5310 -0.0138 5.5310 2.3518
No log 0.05 4 3.5016 0.0518 3.5016 1.8713
No log 0.075 6 2.8900 -0.0593 2.8900 1.7000
No log 0.1 8 2.2691 -0.0619 2.2691 1.5064
No log 0.125 10 1.6277 0.1532 1.6277 1.2758
No log 0.15 12 1.3307 0.1418 1.3307 1.1536
No log 0.175 14 1.3928 0.1350 1.3928 1.1802
No log 0.2 16 1.4725 0.0437 1.4725 1.2135
No log 0.225 18 1.5866 0.0100 1.5866 1.2596
No log 0.25 20 1.4985 0.1109 1.4985 1.2241
No log 0.275 22 1.5004 0.1145 1.5004 1.2249
No log 0.3 24 1.4650 0.1390 1.4650 1.2104
No log 0.325 26 1.4430 0.0868 1.4430 1.2013
No log 0.35 28 1.3677 0.0770 1.3677 1.1695
No log 0.375 30 1.2048 0.2462 1.2048 1.0976
No log 0.4 32 1.1270 0.3672 1.1270 1.0616
No log 0.425 34 1.0865 0.3697 1.0865 1.0423
No log 0.45 36 1.1714 0.3480 1.1714 1.0823
No log 0.475 38 1.4668 0.2388 1.4668 1.2111
No log 0.5 40 1.6955 0.2624 1.6955 1.3021
No log 0.525 42 1.8924 0.2489 1.8924 1.3757
No log 0.55 44 2.5605 0.1706 2.5605 1.6001
No log 0.575 46 2.6101 0.1838 2.6101 1.6156
No log 0.6 48 1.8402 0.3102 1.8402 1.3565
No log 0.625 50 1.1935 0.4659 1.1935 1.0925
No log 0.65 52 1.1117 0.4846 1.1117 1.0544
No log 0.675 54 1.2441 0.4500 1.2441 1.1154
No log 0.7 56 1.3019 0.4210 1.3019 1.1410
No log 0.725 58 1.3203 0.4142 1.3203 1.1490
No log 0.75 60 1.0606 0.5095 1.0606 1.0298
No log 0.775 62 0.8920 0.5708 0.8920 0.9445
No log 0.8 64 0.8245 0.4924 0.8245 0.9080
No log 0.825 66 0.8121 0.4695 0.8121 0.9011
No log 0.85 68 0.8490 0.5095 0.8490 0.9214
No log 0.875 70 0.9348 0.4734 0.9348 0.9669
No log 0.9 72 1.1510 0.4330 1.1510 1.0729
No log 0.925 74 1.2766 0.4177 1.2766 1.1299
No log 0.95 76 1.3365 0.4034 1.3365 1.1561
No log 0.975 78 1.1325 0.4267 1.1325 1.0642
No log 1.0 80 0.9745 0.5007 0.9745 0.9872
No log 1.025 82 0.9864 0.5652 0.9864 0.9932
No log 1.05 84 0.9963 0.5587 0.9963 0.9981
No log 1.075 86 0.9228 0.5617 0.9228 0.9606
No log 1.1 88 0.8856 0.5479 0.8856 0.9411
No log 1.125 90 0.9382 0.5479 0.9382 0.9686
No log 1.15 92 1.0485 0.5348 1.0485 1.0240
No log 1.175 94 1.2301 0.5356 1.2301 1.1091
No log 1.2 96 1.2714 0.5179 1.2714 1.1276
No log 1.225 98 1.2882 0.5002 1.2882 1.1350
No log 1.25 100 1.1125 0.5262 1.1125 1.0547
No log 1.275 102 0.9783 0.5293 0.9783 0.9891
No log 1.3 104 0.8974 0.5207 0.8974 0.9473
No log 1.325 106 0.9882 0.5051 0.9882 0.9941
No log 1.35 108 1.0790 0.4614 1.0790 1.0388
No log 1.375 110 0.9950 0.4868 0.9950 0.9975
No log 1.4 112 0.8442 0.5397 0.8442 0.9188
No log 1.425 114 0.7504 0.5819 0.7504 0.8663
No log 1.45 116 0.7481 0.5836 0.7481 0.8649
No log 1.475 118 0.8217 0.5781 0.8217 0.9065
No log 1.5 120 0.9672 0.5545 0.9672 0.9834
No log 1.525 122 1.0599 0.5352 1.0599 1.0295
No log 1.55 124 0.9402 0.5830 0.9402 0.9696
No log 1.575 126 0.8148 0.6604 0.8148 0.9027
No log 1.6 128 0.8173 0.6440 0.8173 0.9040
No log 1.625 130 0.7971 0.6697 0.7971 0.8928
No log 1.65 132 0.7742 0.7160 0.7742 0.8799
No log 1.675 134 0.8168 0.6329 0.8168 0.9038
No log 1.7 136 0.8007 0.6284 0.8007 0.8948
No log 1.725 138 0.7871 0.6820 0.7871 0.8872
No log 1.75 140 0.8690 0.5671 0.8690 0.9322
No log 1.775 142 0.9912 0.5328 0.9912 0.9956
No log 1.8 144 0.8513 0.6308 0.8513 0.9226
No log 1.825 146 0.7823 0.6978 0.7823 0.8845
No log 1.85 148 0.7289 0.7039 0.7289 0.8537
No log 1.875 150 0.6808 0.6643 0.6808 0.8251
No log 1.9 152 0.6568 0.6844 0.6568 0.8104
No log 1.925 154 0.6567 0.6902 0.6567 0.8104
No log 1.95 156 0.6557 0.6832 0.6557 0.8097
No log 1.975 158 0.6539 0.5990 0.6539 0.8086
No log 2.0 160 0.6488 0.6528 0.6488 0.8055
No log 2.025 162 0.6716 0.6063 0.6716 0.8195
No log 2.05 164 0.7307 0.6167 0.7307 0.8548
No log 2.075 166 0.7042 0.6374 0.7042 0.8392
No log 2.1 168 0.6714 0.6702 0.6714 0.8194
No log 2.125 170 0.6758 0.6837 0.6758 0.8221
No log 2.15 172 0.7859 0.6793 0.7859 0.8865
No log 2.175 174 0.8367 0.6348 0.8367 0.9147
No log 2.2 176 0.7928 0.6799 0.7928 0.8904
No log 2.225 178 0.7351 0.6896 0.7351 0.8574
No log 2.25 180 0.7237 0.7159 0.7237 0.8507
No log 2.275 182 0.7346 0.7065 0.7346 0.8571
No log 2.3 184 0.7320 0.7073 0.7320 0.8556
No log 2.325 186 0.7568 0.7001 0.7568 0.8699
No log 2.35 188 0.7849 0.6736 0.7849 0.8859
No log 2.375 190 0.7951 0.6831 0.7951 0.8917
No log 2.4 192 0.7730 0.6726 0.7730 0.8792
No log 2.425 194 0.7501 0.6807 0.7501 0.8661
No log 2.45 196 0.7708 0.6552 0.7708 0.8779
No log 2.475 198 0.7726 0.6447 0.7726 0.8790
No log 2.5 200 0.7261 0.7010 0.7261 0.8521
No log 2.525 202 0.7604 0.6841 0.7604 0.8720
No log 2.55 204 0.8452 0.6342 0.8452 0.9194
No log 2.575 206 0.9807 0.6159 0.9807 0.9903
No log 2.6 208 0.9811 0.6103 0.9811 0.9905
No log 2.625 210 0.8717 0.6409 0.8717 0.9337
No log 2.65 212 0.7590 0.7075 0.7590 0.8712
No log 2.675 214 0.7209 0.7054 0.7209 0.8490
No log 2.7 216 0.7074 0.7046 0.7074 0.8411
No log 2.725 218 0.7178 0.6964 0.7178 0.8472
No log 2.75 220 0.7549 0.7018 0.7549 0.8689
No log 2.775 222 0.7763 0.6574 0.7763 0.8811
No log 2.8 224 0.7654 0.6402 0.7654 0.8749
No log 2.825 226 0.7599 0.6462 0.7599 0.8717
No log 2.85 228 0.7482 0.6351 0.7482 0.8650
No log 2.875 230 0.7307 0.6600 0.7307 0.8548
No log 2.9 232 0.7388 0.6594 0.7388 0.8596
No log 2.925 234 0.8223 0.6281 0.8223 0.9068
No log 2.95 236 0.9122 0.5977 0.9122 0.9551
No log 2.975 238 1.0221 0.5394 1.0221 1.0110
No log 3.0 240 1.0149 0.5500 1.0149 1.0074
No log 3.025 242 0.8601 0.6134 0.8601 0.9274
No log 3.05 244 0.7356 0.6682 0.7356 0.8577
No log 3.075 246 0.7015 0.6964 0.7015 0.8376
No log 3.1 248 0.7525 0.6317 0.7525 0.8675
No log 3.125 250 0.8034 0.6358 0.8034 0.8963
No log 3.15 252 0.7667 0.6645 0.7667 0.8756
No log 3.175 254 0.7679 0.6929 0.7679 0.8763
No log 3.2 256 0.8820 0.6056 0.8820 0.9392
No log 3.225 258 0.9623 0.6022 0.9623 0.9810
No log 3.25 260 0.9536 0.5350 0.9536 0.9765
No log 3.275 262 0.9036 0.5816 0.9036 0.9506
No log 3.3 264 0.8314 0.6067 0.8314 0.9118
No log 3.325 266 0.7724 0.6451 0.7724 0.8789
No log 3.35 268 0.7373 0.6539 0.7373 0.8586
No log 3.375 270 0.7866 0.6608 0.7866 0.8869
No log 3.4 272 0.9152 0.5946 0.9152 0.9567
No log 3.425 274 1.0856 0.5443 1.0856 1.0419
No log 3.45 276 1.2012 0.5215 1.2012 1.0960
No log 3.475 278 1.1381 0.5372 1.1381 1.0668
No log 3.5 280 1.0111 0.5784 1.0111 1.0056
No log 3.525 282 0.8564 0.6513 0.8564 0.9254
No log 3.55 284 0.7361 0.6818 0.7361 0.8580
No log 3.575 286 0.6981 0.7029 0.6981 0.8356
No log 3.6 288 0.7115 0.6820 0.7115 0.8435
No log 3.625 290 0.7854 0.6783 0.7854 0.8862
No log 3.65 292 0.8331 0.6348 0.8331 0.9127
No log 3.675 294 0.8201 0.6528 0.8201 0.9056
No log 3.7 296 0.7783 0.6598 0.7783 0.8822
No log 3.725 298 0.7453 0.6735 0.7453 0.8633
No log 3.75 300 0.7091 0.6760 0.7091 0.8421
No log 3.775 302 0.6877 0.6882 0.6877 0.8293
No log 3.8 304 0.6861 0.6773 0.6861 0.8283
No log 3.825 306 0.7198 0.6661 0.7198 0.8484
No log 3.85 308 0.8392 0.6411 0.8392 0.9161
No log 3.875 310 0.9409 0.6140 0.9409 0.9700
No log 3.9 312 0.9098 0.6339 0.9098 0.9538
No log 3.925 314 0.8459 0.6245 0.8459 0.9197
No log 3.95 316 0.8505 0.6245 0.8505 0.9222
No log 3.975 318 0.8588 0.6327 0.8588 0.9267
No log 4.0 320 0.8404 0.6602 0.8404 0.9167
No log 4.025 322 0.8368 0.6120 0.8368 0.9148
No log 4.05 324 0.8322 0.6177 0.8322 0.9123
No log 4.075 326 0.8217 0.6214 0.8217 0.9065
No log 4.1 328 0.7919 0.6365 0.7919 0.8899
No log 4.125 330 0.7727 0.6572 0.7727 0.8790
No log 4.15 332 0.7640 0.6601 0.7640 0.8741
No log 4.175 334 0.7450 0.6833 0.7450 0.8631
No log 4.2 336 0.7231 0.6585 0.7231 0.8504
No log 4.225 338 0.7168 0.6589 0.7168 0.8466
No log 4.25 340 0.7083 0.6623 0.7083 0.8416
No log 4.275 342 0.7165 0.6469 0.7165 0.8464
No log 4.3 344 0.7536 0.6841 0.7536 0.8681
No log 4.325 346 0.7444 0.6796 0.7444 0.8628
No log 4.35 348 0.7178 0.6796 0.7178 0.8472
No log 4.375 350 0.7237 0.6820 0.7237 0.8507
No log 4.4 352 0.7064 0.6838 0.7064 0.8404
No log 4.425 354 0.7088 0.6852 0.7088 0.8419
No log 4.45 356 0.7263 0.6774 0.7263 0.8522
No log 4.475 358 0.7622 0.6714 0.7622 0.8730
No log 4.5 360 0.8248 0.6539 0.8248 0.9082
No log 4.525 362 0.9033 0.6263 0.9033 0.9504
No log 4.55 364 0.9862 0.5776 0.9862 0.9931
No log 4.575 366 0.9964 0.5742 0.9964 0.9982
No log 4.6 368 0.9482 0.5829 0.9482 0.9737
No log 4.625 370 0.9243 0.5894 0.9243 0.9614
No log 4.65 372 0.8586 0.6178 0.8586 0.9266
No log 4.675 374 0.8149 0.6635 0.8149 0.9027
No log 4.7 376 0.7726 0.6754 0.7726 0.8790
No log 4.725 378 0.7317 0.6729 0.7317 0.8554
No log 4.75 380 0.7305 0.6923 0.7305 0.8547
No log 4.775 382 0.7870 0.6707 0.7870 0.8871
No log 4.8 384 0.8004 0.6707 0.8004 0.8947
No log 4.825 386 0.7591 0.6847 0.7591 0.8712
No log 4.85 388 0.7908 0.6738 0.7908 0.8893
No log 4.875 390 0.8317 0.6673 0.8317 0.9120
No log 4.9 392 0.9165 0.6262 0.9165 0.9573
No log 4.925 394 1.0323 0.6063 1.0323 1.0160
No log 4.95 396 1.1249 0.5989 1.1249 1.0606
No log 4.975 398 1.1501 0.6049 1.1501 1.0724
No log 5.0 400 1.1931 0.5756 1.1931 1.0923
No log 5.025 402 1.2604 0.5709 1.2604 1.1227
No log 5.05 404 1.2200 0.5615 1.2200 1.1045
No log 5.075 406 1.0950 0.5782 1.0950 1.0464
No log 5.1 408 0.9898 0.6066 0.9898 0.9949
No log 5.125 410 0.8661 0.6368 0.8661 0.9306
No log 5.15 412 0.7259 0.6764 0.7259 0.8520
No log 5.175 414 0.6918 0.6988 0.6918 0.8317
No log 5.2 416 0.6965 0.6969 0.6965 0.8346
No log 5.225 418 0.7294 0.6633 0.7294 0.8541
No log 5.25 420 0.7505 0.6616 0.7505 0.8663
No log 5.275 422 0.7306 0.6764 0.7306 0.8548
No log 5.3 424 0.7310 0.6669 0.7310 0.8550
No log 5.325 426 0.7058 0.6686 0.7058 0.8401
No log 5.35 428 0.6719 0.6820 0.6719 0.8197
No log 5.375 430 0.6701 0.7050 0.6701 0.8186
No log 5.4 432 0.6711 0.6912 0.6711 0.8192
No log 5.425 434 0.6911 0.6709 0.6911 0.8313
No log 5.45 436 0.7122 0.6629 0.7122 0.8439
No log 5.475 438 0.7485 0.6700 0.7485 0.8652
No log 5.5 440 0.8120 0.6640 0.8120 0.9011
No log 5.525 442 0.8847 0.6493 0.8847 0.9406
No log 5.55 444 0.9260 0.6442 0.9260 0.9623
No log 5.575 446 0.9046 0.6461 0.9046 0.9511
No log 5.6 448 0.8494 0.6641 0.8494 0.9216
No log 5.625 450 0.8228 0.6829 0.8228 0.9071
No log 5.65 452 0.8279 0.6769 0.8279 0.9099
No log 5.675 454 0.8335 0.6769 0.8335 0.9129
No log 5.7 456 0.8291 0.6709 0.8291 0.9106
No log 5.725 458 0.8318 0.6709 0.8318 0.9120
No log 5.75 460 0.7746 0.6645 0.7746 0.8801
No log 5.775 462 0.7378 0.6780 0.7378 0.8590
No log 5.8 464 0.7103 0.7063 0.7103 0.8428
No log 5.825 466 0.7094 0.7107 0.7094 0.8422
No log 5.85 468 0.6955 0.6910 0.6955 0.8340
No log 5.875 470 0.6882 0.7095 0.6882 0.8296
No log 5.9 472 0.6807 0.7220 0.6807 0.8250
No log 5.925 474 0.6740 0.6966 0.6740 0.8210
No log 5.95 476 0.7075 0.6949 0.7075 0.8411
No log 5.975 478 0.8032 0.6640 0.8032 0.8962
No log 6.0 480 0.8936 0.6253 0.8936 0.9453
No log 6.025 482 0.9336 0.6098 0.9336 0.9662
No log 6.05 484 0.9801 0.5918 0.9801 0.9900
No log 6.075 486 0.9877 0.6212 0.9877 0.9938
No log 6.1 488 0.9611 0.6226 0.9611 0.9804
No log 6.125 490 0.8919 0.6352 0.8919 0.9444
No log 6.15 492 0.7855 0.6659 0.7855 0.8863
No log 6.175 494 0.7175 0.6575 0.7175 0.8471
No log 6.2 496 0.6812 0.6831 0.6812 0.8254
No log 6.225 498 0.6663 0.7187 0.6663 0.8163
0.438 6.25 500 0.6812 0.7048 0.6812 0.8253
0.438 6.275 502 0.6980 0.7225 0.6980 0.8355
0.438 6.3 504 0.7034 0.7066 0.7034 0.8387
0.438 6.325 506 0.7060 0.7033 0.7060 0.8402
0.438 6.35 508 0.7358 0.6872 0.7358 0.8578
0.438 6.375 510 0.7942 0.6579 0.7942 0.8912
0.438 6.4 512 0.7925 0.6579 0.7925 0.8902
0.438 6.425 514 0.7500 0.6713 0.7500 0.8660
0.438 6.45 516 0.7035 0.7045 0.7035 0.8388
0.438 6.475 518 0.6740 0.7110 0.6740 0.8210
0.438 6.5 520 0.6704 0.6789 0.6704 0.8187
0.438 6.525 522 0.6723 0.7013 0.6723 0.8199
0.438 6.55 524 0.6724 0.6928 0.6724 0.8200
0.438 6.575 526 0.6880 0.7007 0.6880 0.8295
0.438 6.6 528 0.7330 0.6796 0.7330 0.8562
0.438 6.625 530 0.7644 0.6630 0.7644 0.8743
0.438 6.65 532 0.7461 0.6630 0.7461 0.8638
0.438 6.675 534 0.7141 0.6796 0.7141 0.8450
0.438 6.7 536 0.6749 0.6945 0.6749 0.8215
0.438 6.725 538 0.6616 0.7136 0.6616 0.8134
0.438 6.75 540 0.6664 0.6854 0.6664 0.8163
0.438 6.775 542 0.6676 0.6810 0.6676 0.8170
0.438 6.8 544 0.6626 0.6977 0.6626 0.8140
0.438 6.825 546 0.6542 0.7096 0.6542 0.8088
0.438 6.85 548 0.6644 0.7114 0.6644 0.8151
0.438 6.875 550 0.6682 0.6877 0.6682 0.8174
0.438 6.9 552 0.6908 0.6913 0.6908 0.8311
0.438 6.925 554 0.7299 0.6989 0.7299 0.8544
0.438 6.95 556 0.7633 0.6828 0.7633 0.8736
0.438 6.975 558 0.7714 0.6828 0.7714 0.8783
0.438 7.0 560 0.7324 0.6906 0.7324 0.8558
0.438 7.025 562 0.7049 0.6924 0.7049 0.8396
0.438 7.05 564 0.6903 0.6924 0.6903 0.8308
0.438 7.075 566 0.6739 0.6832 0.6739 0.8209
0.438 7.1 568 0.6621 0.6832 0.6621 0.8137
0.438 7.125 570 0.6721 0.6832 0.6721 0.8198
0.438 7.15 572 0.6993 0.6868 0.6993 0.8362
0.438 7.175 574 0.7419 0.6789 0.7419 0.8613
0.438 7.2 576 0.7641 0.6771 0.7641 0.8742
0.438 7.225 578 0.7760 0.6771 0.7760 0.8809
0.438 7.25 580 0.7823 0.6771 0.7823 0.8845
0.438 7.275 582 0.7972 0.6582 0.7972 0.8928
0.438 7.3 584 0.7802 0.6676 0.7802 0.8833
0.438 7.325 586 0.7354 0.6735 0.7354 0.8576
0.438 7.35 588 0.7098 0.6930 0.7098 0.8425
0.438 7.375 590 0.7062 0.6930 0.7062 0.8404
0.438 7.4 592 0.7008 0.6912 0.7008 0.8371
0.438 7.425 594 0.7123 0.6620 0.7123 0.8440
0.438 7.45 596 0.7500 0.6730 0.7500 0.8660
0.438 7.475 598 0.7969 0.6603 0.7969 0.8927
0.438 7.5 600 0.8432 0.6529 0.8432 0.9182
0.438 7.525 602 0.8739 0.6444 0.8739 0.9348
0.438 7.55 604 0.8666 0.6444 0.8666 0.9309
0.438 7.575 606 0.8512 0.6402 0.8512 0.9226
0.438 7.6 608 0.8118 0.6603 0.8118 0.9010
0.438 7.625 610 0.7738 0.6740 0.7738 0.8796
0.438 7.65 612 0.7608 0.6713 0.7608 0.8723
0.438 7.675 614 0.7484 0.6666 0.7484 0.8651
0.438 7.7 616 0.7550 0.6676 0.7550 0.8689
0.438 7.725 618 0.7554 0.6676 0.7554 0.8691
0.438 7.75 620 0.7545 0.6630 0.7545 0.8686
0.438 7.775 622 0.7479 0.6594 0.7479 0.8648
0.438 7.8 624 0.7496 0.6708 0.7496 0.8658
0.438 7.825 626 0.7427 0.6607 0.7427 0.8618
0.438 7.85 628 0.7431 0.6644 0.7431 0.8620
0.438 7.875 630 0.7468 0.6594 0.7468 0.8642
0.438 7.9 632 0.7439 0.6558 0.7439 0.8625
0.438 7.925 634 0.7279 0.6550 0.7279 0.8532
0.438 7.95 636 0.7114 0.6593 0.7114 0.8435
0.438 7.975 638 0.7104 0.6609 0.7104 0.8428
0.438 8.0 640 0.7214 0.6593 0.7214 0.8494
0.438 8.025 642 0.7405 0.6593 0.7405 0.8605
0.438 8.05 644 0.7665 0.6593 0.7665 0.8755
0.438 8.075 646 0.7977 0.6527 0.7977 0.8931
0.438 8.1 648 0.8239 0.6377 0.8239 0.9077
0.438 8.125 650 0.8323 0.6419 0.8323 0.9123
0.438 8.15 652 0.8250 0.6512 0.8250 0.9083
0.438 8.175 654 0.7990 0.6571 0.7990 0.8938
0.438 8.2 656 0.7753 0.6666 0.7753 0.8805
0.438 8.225 658 0.7490 0.6621 0.7490 0.8654
0.438 8.25 660 0.7293 0.6502 0.7293 0.8540
0.438 8.275 662 0.7187 0.6454 0.7187 0.8478
0.438 8.3 664 0.7162 0.6632 0.7162 0.8463
0.438 8.325 666 0.7204 0.6454 0.7204 0.8488
0.438 8.35 668 0.7379 0.6550 0.7379 0.8590
0.438 8.375 670 0.7609 0.6630 0.7609 0.8723
0.438 8.4 672 0.7945 0.6607 0.7945 0.8913
0.438 8.425 674 0.8333 0.6485 0.8333 0.9128
0.438 8.45 676 0.8589 0.6553 0.8589 0.9267
0.438 8.475 678 0.8684 0.6553 0.8684 0.9319
0.438 8.5 680 0.8632 0.6553 0.8632 0.9291
0.438 8.525 682 0.8432 0.6517 0.8432 0.9182
0.438 8.55 684 0.8277 0.6671 0.8277 0.9098
0.438 8.575 686 0.8107 0.6678 0.8107 0.9004
0.438 8.6 688 0.8047 0.6678 0.8047 0.8971
0.438 8.625 690 0.8120 0.6608 0.8120 0.9011
0.438 8.65 692 0.8294 0.6744 0.8294 0.9107
0.438 8.675 694 0.8410 0.6647 0.8410 0.9170
0.438 8.7 696 0.8352 0.6647 0.8352 0.9139
0.438 8.725 698 0.8199 0.6645 0.8199 0.9055
0.438 8.75 700 0.8020 0.6632 0.8020 0.8956
0.438 8.775 702 0.7977 0.6703 0.7977 0.8932
0.438 8.8 704 0.7917 0.6703 0.7917 0.8898
0.438 8.825 706 0.7942 0.6632 0.7942 0.8912
0.438 8.85 708 0.8081 0.6705 0.8081 0.8990
0.438 8.875 710 0.8286 0.6772 0.8286 0.9103
0.438 8.9 712 0.8491 0.6630 0.8491 0.9215
0.438 8.925 714 0.8558 0.6614 0.8558 0.9251
0.438 8.95 716 0.8525 0.6518 0.8525 0.9233
0.438 8.975 718 0.8456 0.6614 0.8456 0.9195
0.438 9.0 720 0.8328 0.6710 0.8328 0.9126
0.438 9.025 722 0.8133 0.6772 0.8133 0.9018
0.438 9.05 724 0.8046 0.6735 0.8046 0.8970
0.438 9.075 726 0.8084 0.6772 0.8084 0.8991
0.438 9.1 728 0.8229 0.6772 0.8229 0.9072
0.438 9.125 730 0.8355 0.6710 0.8355 0.9141
0.438 9.15 732 0.8488 0.6518 0.8488 0.9213
0.438 9.175 734 0.8579 0.6331 0.8579 0.9262
0.438 9.2 736 0.8567 0.6331 0.8567 0.9256
0.438 9.225 738 0.8466 0.6518 0.8466 0.9201
0.438 9.25 740 0.8322 0.6710 0.8322 0.9122
0.438 9.275 742 0.8133 0.6728 0.8133 0.9018
0.438 9.3 744 0.8000 0.6691 0.8000 0.8944
0.438 9.325 746 0.8000 0.6728 0.8000 0.8944
0.438 9.35 748 0.8021 0.6728 0.8021 0.8956
0.438 9.375 750 0.8099 0.6728 0.8099 0.8999
0.438 9.4 752 0.8161 0.6728 0.8161 0.9034
0.438 9.425 754 0.8211 0.6728 0.8211 0.9061
0.438 9.45 756 0.8219 0.6728 0.8219 0.9066
0.438 9.475 758 0.8209 0.6728 0.8209 0.9061
0.438 9.5 760 0.8185 0.6728 0.8185 0.9047
0.438 9.525 762 0.8177 0.6728 0.8177 0.9043
0.438 9.55 764 0.8174 0.6728 0.8174 0.9041
0.438 9.575 766 0.8149 0.6728 0.8149 0.9027
0.438 9.6 768 0.8102 0.6772 0.8102 0.9001
0.438 9.625 770 0.8067 0.6735 0.8067 0.8982
0.438 9.65 772 0.8031 0.6735 0.8031 0.8962
0.438 9.675 774 0.8021 0.6735 0.8021 0.8956
0.438 9.7 776 0.8020 0.6805 0.8020 0.8955
0.438 9.725 778 0.8001 0.6805 0.8001 0.8945
0.438 9.75 780 0.8005 0.6805 0.8005 0.8947
0.438 9.775 782 0.7996 0.6805 0.7996 0.8942
0.438 9.8 784 0.7997 0.6805 0.7997 0.8943
0.438 9.825 786 0.8011 0.6805 0.8011 0.8950
0.438 9.85 788 0.8032 0.6805 0.8032 0.8962
0.438 9.875 790 0.8039 0.6805 0.8039 0.8966
0.438 9.9 792 0.8048 0.6805 0.8048 0.8971
0.438 9.925 794 0.8052 0.6805 0.8052 0.8973
0.438 9.95 796 0.8050 0.6805 0.8050 0.8972
0.438 9.975 798 0.8046 0.6805 0.8046 0.8970
0.438 10.0 800 0.8044 0.6805 0.8044 0.8969
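Note that the best validation Qwk (0.7225, at step 502) is reached well before the final checkpoint, whose Qwk is 0.6805. A small sketch of selecting the best checkpoint from results like these, with only a few representative rows reproduced:

```python
# A few (epoch, step, qwk) rows sampled from the training results above
rows = [
    (1.65, 132, 0.7160),
    (5.9, 472, 0.7220),
    (6.275, 502, 0.7225),
    (10.0, 800, 0.6805),
]

# Pick the checkpoint with the highest validation Qwk
best = max(rows, key=lambda r: r[2])
print(best)  # (6.275, 502, 0.7225)
```

The Trainer can do this automatically via `load_best_model_at_end=True` with `metric_for_best_model` set to the Qwk metric.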

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (Safetensors, F32)

Model tree for MayBashendy/ArabicNewSplits6_FineTuningAraBERT_run1_AugV5_k16_task1_organization

Fine-tuned from aubmindlab/bert-base-arabertv02.