ArabicNewSplits6_FineTuningAraBERT_run2_AugV5_k18_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 (the fine-tuning dataset is not documented). It achieves the following results on the evaluation set:

  • Loss: 0.6746
  • QWK (quadratic weighted kappa): 0.7668
  • MSE: 0.6746
  • RMSE: 0.8213
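The headline metric above is quadratic weighted kappa (QWK), which measures agreement between predicted and true ordinal labels, penalizing disagreements by their squared distance. A minimal from-scratch sketch (not the authors' evaluation code; scikit-learn's cohen_kappa_score with weights="quadratic" computes the same quantity):

```python
def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    # Observed confusion matrix O[i][j]: true label i, predicted label j.
    O = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    n = len(y_true)
    row = [sum(O[i]) for i in range(n_classes)]                       # true-label marginals
    col = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]  # predicted marginals
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2   # quadratic disagreement penalty
            num += w * O[i][j]                        # observed weighted disagreement
            den += w * row[i] * col[j] / n            # expected under chance agreement
    return 1.0 - num / den

# Perfect predictions give kappa = 1.0.
print(quadratic_weighted_kappa([0, 1, 2, 3], [0, 1, 2, 3], 4))  # -> 1.0
```

A QWK of 0.7668, as reported here, indicates substantial agreement well above chance.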

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
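The "linear" scheduler listed above decays the learning rate linearly from its initial value to zero over the total number of training steps (the shape of transformers' linear schedule with zero warmup). A sketch, using the final step count from the training-results table as an illustrative total:

```python
def linear_lr(step, total_steps, base_lr=2e-05):
    # Linear decay from base_lr at step 0 to 0 at total_steps.
    return base_lr * max(0.0, 1.0 - step / total_steps)

total = 670  # final global step shown in the training-results table
print(linear_lr(0, total))    # 2e-05 at the start
print(linear_lr(335, total))  # 1e-05 halfway through
print(linear_lr(670, total))  # 0.0 at the end
```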

Training results

"No log" in the first column indicates the training loss had not yet been logged at that step; the first (and only) logged value, 0.2956, appears at step 500.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0299 2 2.2272 0.0250 2.2272 1.4924
No log 0.0597 4 1.4805 0.1885 1.4805 1.2167
No log 0.0896 6 1.4383 0.1323 1.4383 1.1993
No log 0.1194 8 1.4751 0.3037 1.4751 1.2145
No log 0.1493 10 1.4252 0.2921 1.4252 1.1938
No log 0.1791 12 1.3408 0.1802 1.3408 1.1579
No log 0.2090 14 1.3469 0.1529 1.3469 1.1606
No log 0.2388 16 1.3959 0.1886 1.3959 1.1815
No log 0.2687 18 1.3986 0.1739 1.3986 1.1826
No log 0.2985 20 1.3671 0.1569 1.3671 1.1692
No log 0.3284 22 1.3896 0.2152 1.3896 1.1788
No log 0.3582 24 1.3833 0.2755 1.3833 1.1761
No log 0.3881 26 1.4608 0.3714 1.4608 1.2086
No log 0.4179 28 1.6721 0.3012 1.6721 1.2931
No log 0.4478 30 1.4111 0.3495 1.4111 1.1879
No log 0.4776 32 1.1927 0.4405 1.1927 1.0921
No log 0.5075 34 1.0766 0.4455 1.0766 1.0376
No log 0.5373 36 1.0270 0.4657 1.0270 1.0134
No log 0.5672 38 1.0651 0.4880 1.0651 1.0321
No log 0.5970 40 1.0537 0.4734 1.0537 1.0265
No log 0.6269 42 0.9526 0.5317 0.9526 0.9760
No log 0.6567 44 0.9757 0.4828 0.9757 0.9878
No log 0.6866 46 0.9206 0.5331 0.9206 0.9595
No log 0.7164 48 0.9773 0.5276 0.9773 0.9886
No log 0.7463 50 0.9897 0.5161 0.9897 0.9948
No log 0.7761 52 0.8608 0.5649 0.8608 0.9278
No log 0.8060 54 0.8515 0.5451 0.8515 0.9228
No log 0.8358 56 0.8226 0.5721 0.8226 0.9069
No log 0.8657 58 0.8959 0.5875 0.8959 0.9465
No log 0.8955 60 0.9025 0.5692 0.9025 0.9500
No log 0.9254 62 0.7992 0.6063 0.7992 0.8940
No log 0.9552 64 0.7760 0.6113 0.7760 0.8809
No log 0.9851 66 0.7232 0.6372 0.7232 0.8504
No log 1.0149 68 0.6879 0.6298 0.6879 0.8294
No log 1.0448 70 0.7305 0.6971 0.7305 0.8547
No log 1.0746 72 0.6805 0.6723 0.6805 0.8249
No log 1.1045 74 0.6404 0.6785 0.6404 0.8003
No log 1.1343 76 0.7779 0.5658 0.7779 0.8820
No log 1.1642 78 0.7971 0.5873 0.7971 0.8928
No log 1.1940 80 0.6595 0.6479 0.6595 0.8121
No log 1.2239 82 0.6736 0.6802 0.6736 0.8207
No log 1.2537 84 1.0350 0.5691 1.0350 1.0174
No log 1.2836 86 1.1651 0.5015 1.1651 1.0794
No log 1.3134 88 1.0311 0.5185 1.0311 1.0154
No log 1.3433 90 0.8551 0.5444 0.8551 0.9247
No log 1.3731 92 0.8136 0.5962 0.8136 0.9020
No log 1.4030 94 0.8264 0.5787 0.8264 0.9091
No log 1.4328 96 0.9944 0.5376 0.9944 0.9972
No log 1.4627 98 1.1929 0.5024 1.1929 1.0922
No log 1.4925 100 1.1554 0.5070 1.1554 1.0749
No log 1.5224 102 0.8900 0.6110 0.8900 0.9434
No log 1.5522 104 0.8052 0.6230 0.8052 0.8973
No log 1.5821 106 0.8540 0.6178 0.8540 0.9241
No log 1.6119 108 1.0379 0.5956 1.0379 1.0188
No log 1.6418 110 1.1008 0.6028 1.1008 1.0492
No log 1.6716 112 1.0422 0.6163 1.0422 1.0209
No log 1.7015 114 0.8338 0.6790 0.8338 0.9131
No log 1.7313 116 0.7210 0.6843 0.7210 0.8491
No log 1.7612 118 0.6958 0.6988 0.6958 0.8341
No log 1.7910 120 0.7913 0.6885 0.7913 0.8895
No log 1.8209 122 1.0288 0.6312 1.0288 1.0143
No log 1.8507 124 1.1056 0.6206 1.1056 1.0515
No log 1.8806 126 0.9797 0.6591 0.9797 0.9898
No log 1.9104 128 0.8037 0.7198 0.8037 0.8965
No log 1.9403 130 0.8595 0.7078 0.8595 0.9271
No log 1.9701 132 0.9703 0.6571 0.9703 0.9850
No log 2.0 134 0.9670 0.6689 0.9670 0.9834
No log 2.0299 136 1.0759 0.6075 1.0759 1.0372
No log 2.0597 138 0.8876 0.6790 0.8876 0.9421
No log 2.0896 140 0.7429 0.7054 0.7429 0.8619
No log 2.1194 142 0.7619 0.6959 0.7619 0.8728
No log 2.1493 144 0.7829 0.7003 0.7829 0.8848
No log 2.1791 146 0.8554 0.6706 0.8554 0.9249
No log 2.2090 148 0.9482 0.5781 0.9482 0.9737
No log 2.2388 150 0.9421 0.6302 0.9421 0.9706
No log 2.2687 152 0.8000 0.6906 0.8000 0.8944
No log 2.2985 154 0.7446 0.7158 0.7446 0.8629
No log 2.3284 156 0.8203 0.6555 0.8203 0.9057
No log 2.3582 158 0.8522 0.6227 0.8522 0.9232
No log 2.3881 160 0.7350 0.7152 0.7350 0.8573
No log 2.4179 162 0.6702 0.7131 0.6702 0.8187
No log 2.4478 164 0.6834 0.7286 0.6834 0.8267
No log 2.4776 166 0.8228 0.6569 0.8228 0.9071
No log 2.5075 168 0.9311 0.6275 0.9311 0.9649
No log 2.5373 170 0.8330 0.6514 0.8330 0.9127
No log 2.5672 172 0.6723 0.7544 0.6723 0.8199
No log 2.5970 174 0.6312 0.6953 0.6312 0.7945
No log 2.6269 176 0.6239 0.7067 0.6239 0.7899
No log 2.6567 178 0.6316 0.7442 0.6316 0.7947
No log 2.6866 180 0.7592 0.6994 0.7592 0.8713
No log 2.7164 182 0.9938 0.6029 0.9938 0.9969
No log 2.7463 184 1.0141 0.5946 1.0141 1.0070
No log 2.7761 186 0.8743 0.6504 0.8743 0.9350
No log 2.8060 188 0.7312 0.6890 0.7312 0.8551
No log 2.8358 190 0.6623 0.7303 0.6623 0.8138
No log 2.8657 192 0.6718 0.7253 0.6718 0.8196
No log 2.8955 194 0.7079 0.7072 0.7079 0.8414
No log 2.9254 196 0.7468 0.6716 0.7468 0.8642
No log 2.9552 198 0.8274 0.6616 0.8274 0.9096
No log 2.9851 200 0.8061 0.6882 0.8061 0.8978
No log 3.0149 202 0.8023 0.7010 0.8023 0.8957
No log 3.0448 204 0.8036 0.7041 0.8036 0.8964
No log 3.0746 206 0.7543 0.7045 0.7543 0.8685
No log 3.1045 208 0.7794 0.6927 0.7794 0.8828
No log 3.1343 210 0.7709 0.6971 0.7709 0.8780
No log 3.1642 212 0.7436 0.7059 0.7436 0.8623
No log 3.1940 214 0.6878 0.7549 0.6878 0.8293
No log 3.2239 216 0.6906 0.7339 0.6906 0.8310
No log 3.2537 218 0.6986 0.7293 0.6986 0.8358
No log 3.2836 220 0.6535 0.6936 0.6535 0.8084
No log 3.3134 222 0.6502 0.7085 0.6502 0.8063
No log 3.3433 224 0.7050 0.7154 0.7050 0.8397
No log 3.3731 226 0.8169 0.6860 0.8169 0.9038
No log 3.4030 228 0.8452 0.6402 0.8452 0.9194
No log 3.4328 230 0.7884 0.6882 0.7884 0.8879
No log 3.4627 232 0.7077 0.7192 0.7077 0.8413
No log 3.4925 234 0.7039 0.7192 0.7039 0.8390
No log 3.5224 236 0.7900 0.6956 0.7900 0.8888
No log 3.5522 238 0.9439 0.6655 0.9439 0.9716
No log 3.5821 240 1.0336 0.6657 1.0336 1.0167
No log 3.6119 242 0.9460 0.6655 0.9460 0.9726
No log 3.6418 244 0.7822 0.7210 0.7822 0.8844
No log 3.6716 246 0.7382 0.7576 0.7382 0.8592
No log 3.7015 248 0.7854 0.7210 0.7854 0.8862
No log 3.7313 250 0.8576 0.6877 0.8576 0.9261
No log 3.7612 252 0.9246 0.6600 0.9246 0.9616
No log 3.7910 254 0.8501 0.7010 0.8501 0.9220
No log 3.8209 256 0.7737 0.7621 0.7737 0.8796
No log 3.8507 258 0.7066 0.7308 0.7066 0.8406
No log 3.8806 260 0.6892 0.7140 0.6892 0.8302
No log 3.9104 262 0.7016 0.7240 0.7016 0.8376
No log 3.9403 264 0.7929 0.7075 0.7929 0.8904
No log 3.9701 266 0.8265 0.7260 0.8265 0.9091
No log 4.0 268 0.8367 0.7233 0.8367 0.9147
No log 4.0299 270 0.7457 0.7398 0.7457 0.8635
No log 4.0597 272 0.7158 0.7442 0.7158 0.8461
No log 4.0896 274 0.7367 0.7533 0.7367 0.8583
No log 4.1194 276 0.8077 0.7414 0.8077 0.8987
No log 4.1493 278 0.8865 0.6754 0.8865 0.9416
No log 4.1791 280 0.8445 0.6979 0.8445 0.9190
No log 4.2090 282 0.7152 0.7188 0.7152 0.8457
No log 4.2388 284 0.6319 0.7274 0.6319 0.7949
No log 4.2687 286 0.6231 0.7198 0.6231 0.7894
No log 4.2985 288 0.6400 0.7348 0.6400 0.8000
No log 4.3284 290 0.6818 0.7295 0.6818 0.8257
No log 4.3582 292 0.7022 0.7120 0.7022 0.8380
No log 4.3881 294 0.6925 0.7517 0.6925 0.8322
No log 4.4179 296 0.6766 0.7661 0.6766 0.8226
No log 4.4478 298 0.7307 0.7456 0.7307 0.8548
No log 4.4776 300 0.8698 0.7062 0.8698 0.9326
No log 4.5075 302 0.9700 0.6518 0.9700 0.9849
No log 4.5373 304 0.9385 0.6618 0.9385 0.9688
No log 4.5672 306 0.7811 0.7414 0.7811 0.8838
No log 4.5970 308 0.7169 0.7554 0.7169 0.8467
No log 4.6269 310 0.6958 0.7467 0.6958 0.8342
No log 4.6567 312 0.7231 0.7225 0.7231 0.8503
No log 4.6866 314 0.6947 0.7480 0.6947 0.8335
No log 4.7164 316 0.6625 0.7487 0.6625 0.8140
No log 4.7463 318 0.6418 0.7703 0.6418 0.8011
No log 4.7761 320 0.6570 0.7583 0.6570 0.8106
No log 4.8060 322 0.6958 0.7745 0.6958 0.8341
No log 4.8358 324 0.7469 0.7383 0.7469 0.8642
No log 4.8657 326 0.7488 0.7393 0.7488 0.8653
No log 4.8955 328 0.6929 0.7482 0.6929 0.8324
No log 4.9254 330 0.6580 0.7255 0.6580 0.8112
No log 4.9552 332 0.6523 0.7303 0.6523 0.8076
No log 4.9851 334 0.6322 0.7407 0.6322 0.7951
No log 5.0149 336 0.6034 0.7250 0.6034 0.7768
No log 5.0448 338 0.6053 0.7292 0.6053 0.7780
No log 5.0746 340 0.6334 0.7483 0.6334 0.7959
No log 5.1045 342 0.7057 0.7510 0.7057 0.8400
No log 5.1343 344 0.6903 0.7642 0.6903 0.8309
No log 5.1642 346 0.6146 0.7408 0.6146 0.7839
No log 5.1940 348 0.5967 0.7393 0.5967 0.7725
No log 5.2239 350 0.5998 0.7470 0.5998 0.7745
No log 5.2537 352 0.6101 0.7528 0.6101 0.7811
No log 5.2836 354 0.6642 0.7617 0.6642 0.8150
No log 5.3134 356 0.7504 0.7422 0.7504 0.8663
No log 5.3433 358 0.7591 0.7213 0.7591 0.8713
No log 5.3731 360 0.7063 0.7448 0.7063 0.8404
No log 5.4030 362 0.6657 0.7630 0.6657 0.8159
No log 5.4328 364 0.6625 0.7587 0.6625 0.8140
No log 5.4627 366 0.6318 0.7607 0.6318 0.7948
No log 5.4925 368 0.6340 0.7607 0.6340 0.7962
No log 5.5224 370 0.6642 0.7534 0.6642 0.8150
No log 5.5522 372 0.6934 0.7491 0.6934 0.8327
No log 5.5821 374 0.7856 0.7127 0.7856 0.8864
No log 5.6119 376 0.9227 0.6872 0.9227 0.9606
No log 5.6418 378 1.0324 0.6439 1.0324 1.0161
No log 5.6716 380 1.0013 0.6530 1.0013 1.0006
No log 5.7015 382 0.8579 0.6868 0.8579 0.9262
No log 5.7313 384 0.7074 0.7018 0.7074 0.8411
No log 5.7612 386 0.5983 0.7503 0.5983 0.7735
No log 5.7910 388 0.5590 0.7607 0.5590 0.7477
No log 5.8209 390 0.5558 0.7572 0.5558 0.7455
No log 5.8507 392 0.5861 0.7820 0.5861 0.7656
No log 5.8806 394 0.6540 0.7501 0.6540 0.8087
No log 5.9104 396 0.6944 0.7489 0.6944 0.8333
No log 5.9403 398 0.6730 0.7576 0.6730 0.8204
No log 5.9701 400 0.6132 0.7851 0.6132 0.7831
No log 6.0 402 0.5614 0.7782 0.5614 0.7493
No log 6.0299 404 0.5548 0.7442 0.5548 0.7448
No log 6.0597 406 0.5591 0.7562 0.5591 0.7477
No log 6.0896 408 0.5829 0.7764 0.5829 0.7635
No log 6.1194 410 0.6557 0.7639 0.6557 0.8097
No log 6.1493 412 0.7148 0.7407 0.7148 0.8455
No log 6.1791 414 0.7109 0.7351 0.7109 0.8431
No log 6.2090 416 0.6779 0.7550 0.6779 0.8233
No log 6.2388 418 0.6247 0.7692 0.6247 0.7904
No log 6.2687 420 0.6020 0.7309 0.6020 0.7759
No log 6.2985 422 0.5898 0.7316 0.5898 0.7680
No log 6.3284 424 0.6086 0.7629 0.6086 0.7802
No log 6.3582 426 0.6133 0.7685 0.6133 0.7831
No log 6.3881 428 0.6014 0.7555 0.6014 0.7755
No log 6.4179 430 0.6173 0.7592 0.6173 0.7857
No log 6.4478 432 0.6585 0.7631 0.6585 0.8115
No log 6.4776 434 0.7419 0.7240 0.7419 0.8613
No log 6.5075 436 0.7936 0.7022 0.7936 0.8908
No log 6.5373 438 0.8175 0.6962 0.8175 0.9042
No log 6.5672 440 0.7797 0.7041 0.7797 0.8830
No log 6.5970 442 0.7135 0.7523 0.7135 0.8447
No log 6.6269 444 0.6953 0.7570 0.6953 0.8338
No log 6.6567 446 0.7179 0.7583 0.7179 0.8473
No log 6.6866 448 0.7351 0.7152 0.7351 0.8574
No log 6.7164 450 0.7776 0.7037 0.7776 0.8818
No log 6.7463 452 0.8042 0.7095 0.8042 0.8967
No log 6.7761 454 0.8391 0.7022 0.8391 0.9160
No log 6.8060 456 0.8452 0.7022 0.8452 0.9194
No log 6.8358 458 0.8418 0.7022 0.8418 0.9175
No log 6.8657 460 0.8418 0.7041 0.8418 0.9175
No log 6.8955 462 0.8187 0.7041 0.8187 0.9048
No log 6.9254 464 0.8336 0.7041 0.8336 0.9130
No log 6.9552 466 0.8939 0.6752 0.8939 0.9455
No log 6.9851 468 0.9784 0.6823 0.9784 0.9891
No log 7.0149 470 0.9971 0.6743 0.9971 0.9985
No log 7.0448 472 0.9387 0.6712 0.9387 0.9689
No log 7.0746 474 0.8641 0.6908 0.8641 0.9296
No log 7.1045 476 0.7824 0.7269 0.7824 0.8845
No log 7.1343 478 0.7408 0.7391 0.7408 0.8607
No log 7.1642 480 0.7387 0.7254 0.7387 0.8595
No log 7.1940 482 0.7487 0.7329 0.7487 0.8653
No log 7.2239 484 0.7145 0.7484 0.7145 0.8453
No log 7.2537 486 0.6941 0.7566 0.6941 0.8331
No log 7.2836 488 0.6960 0.7566 0.6960 0.8343
No log 7.3134 490 0.7006 0.7558 0.7006 0.8370
No log 7.3433 492 0.6920 0.7599 0.6920 0.8318
No log 7.3731 494 0.6827 0.7739 0.6827 0.8263
No log 7.4030 496 0.6697 0.7739 0.6697 0.8184
No log 7.4328 498 0.6730 0.7739 0.6730 0.8204
0.2956 7.4627 500 0.6811 0.7626 0.6811 0.8253
0.2956 7.4925 502 0.6985 0.7585 0.6985 0.8358
0.2956 7.5224 504 0.7207 0.7484 0.7207 0.8489
0.2956 7.5522 506 0.7292 0.7337 0.7292 0.8539
0.2956 7.5821 508 0.7086 0.7585 0.7086 0.8418
0.2956 7.6119 510 0.6758 0.7593 0.6758 0.8221
0.2956 7.6418 512 0.6422 0.7642 0.6422 0.8014
0.2956 7.6716 514 0.6180 0.7598 0.6180 0.7861
0.2956 7.7015 516 0.6224 0.7654 0.6224 0.7889
0.2956 7.7313 518 0.6448 0.7674 0.6448 0.8030
0.2956 7.7612 520 0.6642 0.7710 0.6642 0.8150
0.2956 7.7910 522 0.6882 0.7710 0.6882 0.8296
0.2956 7.8209 524 0.7381 0.7274 0.7381 0.8591
0.2956 7.8507 526 0.7926 0.7191 0.7926 0.8903
0.2956 7.8806 528 0.8030 0.7191 0.8030 0.8961
0.2956 7.9104 530 0.7937 0.7191 0.7937 0.8909
0.2956 7.9403 532 0.7948 0.7191 0.7948 0.8915
0.2956 7.9701 534 0.7750 0.7289 0.7750 0.8803
0.2956 8.0 536 0.7596 0.7232 0.7596 0.8716
0.2956 8.0299 538 0.7550 0.7232 0.7550 0.8689
0.2956 8.0597 540 0.7570 0.7232 0.7570 0.8701
0.2956 8.0896 542 0.7641 0.7191 0.7641 0.8741
0.2956 8.1194 544 0.7889 0.7174 0.7889 0.8882
0.2956 8.1493 546 0.8049 0.7097 0.8049 0.8972
0.2956 8.1791 548 0.7885 0.7041 0.7885 0.8880
0.2956 8.2090 550 0.7870 0.7041 0.7870 0.8871
0.2956 8.2388 552 0.7644 0.6960 0.7644 0.8743
0.2956 8.2687 554 0.7298 0.7317 0.7298 0.8543
0.2956 8.2985 556 0.7002 0.7418 0.7002 0.8368
0.2956 8.3284 558 0.6617 0.7661 0.6617 0.8134
0.2956 8.3582 560 0.6351 0.7591 0.6351 0.7970
0.2956 8.3881 562 0.6165 0.7445 0.6165 0.7852
0.2956 8.4179 564 0.6135 0.7569 0.6135 0.7832
0.2956 8.4478 566 0.6168 0.7565 0.6168 0.7854
0.2956 8.4776 568 0.6262 0.7498 0.6262 0.7913
0.2956 8.5075 570 0.6431 0.7624 0.6431 0.8019
0.2956 8.5373 572 0.6614 0.7770 0.6614 0.8133
0.2956 8.5672 574 0.6777 0.7782 0.6777 0.8232
0.2956 8.5970 576 0.6776 0.7782 0.6776 0.8231
0.2956 8.6269 578 0.6644 0.7824 0.6644 0.8151
0.2956 8.6567 580 0.6567 0.7824 0.6567 0.8104
0.2956 8.6866 582 0.6464 0.7655 0.6464 0.8040
0.2956 8.7164 584 0.6413 0.7604 0.6413 0.8008
0.2956 8.7463 586 0.6437 0.7526 0.6437 0.8023
0.2956 8.7761 588 0.6446 0.7446 0.6446 0.8029
0.2956 8.8060 590 0.6492 0.7401 0.6492 0.8057
0.2956 8.8358 592 0.6477 0.7401 0.6477 0.8048
0.2956 8.8657 594 0.6518 0.7540 0.6518 0.8073
0.2956 8.8955 596 0.6525 0.7617 0.6525 0.8078
0.2956 8.9254 598 0.6543 0.7717 0.6543 0.8089
0.2956 8.9552 600 0.6564 0.7710 0.6564 0.8102
0.2956 8.9851 602 0.6658 0.7710 0.6658 0.8159
0.2956 9.0149 604 0.6775 0.7710 0.6775 0.8231
0.2956 9.0448 606 0.6767 0.7710 0.6767 0.8226
0.2956 9.0746 608 0.6781 0.7710 0.6781 0.8235
0.2956 9.1045 610 0.6708 0.7710 0.6708 0.8190
0.2956 9.1343 612 0.6610 0.7710 0.6610 0.8130
0.2956 9.1642 614 0.6544 0.7655 0.6544 0.8090
0.2956 9.1940 616 0.6538 0.7655 0.6538 0.8085
0.2956 9.2239 618 0.6539 0.7655 0.6539 0.8086
0.2956 9.2537 620 0.6551 0.7655 0.6551 0.8094
0.2956 9.2836 622 0.6552 0.7655 0.6552 0.8094
0.2956 9.3134 624 0.6545 0.7655 0.6545 0.8090
0.2956 9.3433 626 0.6578 0.7710 0.6578 0.8111
0.2956 9.3731 628 0.6668 0.7710 0.6668 0.8166
0.2956 9.4030 630 0.6735 0.7710 0.6735 0.8207
0.2956 9.4328 632 0.6810 0.7668 0.6810 0.8252
0.2956 9.4627 634 0.6801 0.7668 0.6801 0.8247
0.2956 9.4925 636 0.6715 0.7668 0.6715 0.8195
0.2956 9.5224 638 0.6631 0.7710 0.6631 0.8143
0.2956 9.5522 640 0.6617 0.7710 0.6617 0.8134
0.2956 9.5821 642 0.6633 0.7710 0.6633 0.8144
0.2956 9.6119 644 0.6652 0.7710 0.6652 0.8156
0.2956 9.6418 646 0.6680 0.7710 0.6680 0.8173
0.2956 9.6716 648 0.6706 0.7710 0.6706 0.8189
0.2956 9.7015 650 0.6686 0.7710 0.6686 0.8177
0.2956 9.7313 652 0.6649 0.7710 0.6649 0.8154
0.2956 9.7612 654 0.6647 0.7710 0.6647 0.8153
0.2956 9.7910 656 0.6652 0.7710 0.6652 0.8156
0.2956 9.8209 658 0.6679 0.7710 0.6679 0.8172
0.2956 9.8507 660 0.6712 0.7710 0.6712 0.8193
0.2956 9.8806 662 0.6731 0.7668 0.6731 0.8204
0.2956 9.9104 664 0.6737 0.7668 0.6737 0.8208
0.2956 9.9403 666 0.6741 0.7668 0.6741 0.8210
0.2956 9.9701 668 0.6746 0.7668 0.6746 0.8213
0.2956 10.0 670 0.6746 0.7668 0.6746 0.8213
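In the table above, the validation loss equals the Mse column (the model is trained with a mean-squared-error objective), and Rmse is simply its square root. For the final row:

```python
import math

mse = 0.6746              # final validation MSE from the table above
rmse = math.sqrt(mse)     # RMSE is the square root of MSE
print(round(rmse, 4))     # -> 0.8213, matching the table's Rmse column
```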

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model details

  • Full model id: MayBashendy/ArabicNewSplits6_FineTuningAraBERT_run2_AugV5_k18_task5_organization
  • Model size: 0.1B params (Safetensors, F32 tensors)
  • Fine-tuned from: aubmindlab/bert-base-arabertv02