ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k16_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the card does not record which). It achieves the following results on the evaluation set:

  • Loss: 0.8416
  • QWK: 0.3819
  • MSE: 0.8416
  • RMSE: 0.9174

Model description

More information needed

Intended uses & limitations

More information needed
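Since the card does not describe the task, any usage example is necessarily a guess. A hedged sketch, assuming a standard text-classification head (the pipeline task and label handling are assumptions):

```python
def score_essay(
    text: str,
    model_id: str = "MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k16_task7_organization",
):
    """Load the fine-tuned checkpoint and return its prediction for one text.

    The "text-classification" task is an assumption; the card does not state
    how the model's head is configured.
    """
    from transformers import pipeline  # deferred import; downloads weights on first call

    clf = pipeline("text-classification", model=model_id)
    return clf(text)
```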

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
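Under standard `Trainer` defaults, these settings correspond roughly to the following `TrainingArguments` configuration (a sketch; `output_dir` is an assumption, and the card does not state whether best-model selection or early stopping was used):

```python
from transformers import TrainingArguments

# Configuration implied by the hyperparameters above.
# `output_dir` is an assumption not stated in the card.
training_args = TrainingArguments(
    output_dir="./results",           # assumed
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    # Adam betas/epsilon match the card; Trainer uses AdamW by default.
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```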

Training results

Training Loss  Epoch  Step  Validation Loss  QWK  MSE  RMSE
No log 0.0488 2 2.4244 -0.0646 2.4244 1.5570
No log 0.0976 4 1.0756 0.1941 1.0756 1.0371
No log 0.1463 6 0.9139 -0.1462 0.9139 0.9560
No log 0.1951 8 0.8039 0.1184 0.8039 0.8966
No log 0.2439 10 0.7619 0.1184 0.7619 0.8729
No log 0.2927 12 0.7183 0.0481 0.7183 0.8475
No log 0.3415 14 0.7614 0.0937 0.7614 0.8726
No log 0.3902 16 0.7980 0.1372 0.7980 0.8933
No log 0.4390 18 0.9443 0.2381 0.9443 0.9717
No log 0.4878 20 0.8694 0.2526 0.8694 0.9324
No log 0.5366 22 0.7240 0.0481 0.7240 0.8509
No log 0.5854 24 0.7275 0.1561 0.7275 0.8529
No log 0.6341 26 0.8879 0.1822 0.8879 0.9423
No log 0.6829 28 1.0007 0.0338 1.0007 1.0003
No log 0.7317 30 1.1734 -0.2407 1.1734 1.0833
No log 0.7805 32 1.2328 -0.2346 1.2328 1.1103
No log 0.8293 34 1.0013 0.0 1.0013 1.0006
No log 0.8780 36 0.8515 0.0 0.8515 0.9227
No log 0.9268 38 0.7939 0.0295 0.7939 0.8910
No log 0.9756 40 0.7659 0.0798 0.7659 0.8751
No log 1.0244 42 0.7480 0.0840 0.7480 0.8649
No log 1.0732 44 0.7429 0.0840 0.7429 0.8619
No log 1.1220 46 0.7236 0.2046 0.7236 0.8507
No log 1.1707 48 0.7094 0.2046 0.7094 0.8423
No log 1.2195 50 0.7057 0.1184 0.7057 0.8401
No log 1.2683 52 0.6879 0.1184 0.6879 0.8294
No log 1.3171 54 0.6804 0.1184 0.6804 0.8249
No log 1.3659 56 0.6690 0.2467 0.6690 0.8179
No log 1.4146 58 0.7436 0.2726 0.7436 0.8623
No log 1.4634 60 0.8258 0.3231 0.8258 0.9088
No log 1.5122 62 0.8954 0.3347 0.8954 0.9463
No log 1.5610 64 0.8647 0.3169 0.8647 0.9299
No log 1.6098 66 0.7623 0.2632 0.7623 0.8731
No log 1.6585 68 0.6864 0.1972 0.6864 0.8285
No log 1.7073 70 0.6656 0.1699 0.6656 0.8158
No log 1.7561 72 0.7241 0.1918 0.7241 0.8509
No log 1.8049 74 0.8622 0.1672 0.8622 0.9286
No log 1.8537 76 0.7737 0.2227 0.7737 0.8796
No log 1.9024 78 0.7345 0.1264 0.7345 0.8570
No log 1.9512 80 0.6902 0.2360 0.6902 0.8308
No log 2.0 82 0.8190 0.3313 0.8190 0.9050
No log 2.0488 84 0.9237 0.3579 0.9237 0.9611
No log 2.0976 86 0.8017 0.2984 0.8017 0.8954
No log 2.1463 88 0.6763 0.3738 0.6763 0.8224
No log 2.1951 90 0.8838 0.3347 0.8838 0.9401
No log 2.2439 92 1.0225 0.3264 1.0225 1.0112
No log 2.2927 94 0.9398 0.3890 0.9398 0.9694
No log 2.3415 96 0.7722 0.3869 0.7722 0.8788
No log 2.3902 98 0.6839 0.2862 0.6839 0.8270
No log 2.4390 100 0.7836 0.3252 0.7836 0.8852
No log 2.4878 102 0.8038 0.3570 0.8038 0.8965
No log 2.5366 104 0.7555 0.3214 0.7555 0.8692
No log 2.5854 106 0.6975 0.3022 0.6975 0.8351
No log 2.6341 108 0.6642 0.2379 0.6642 0.8150
No log 2.6829 110 0.6692 0.3050 0.6692 0.8181
No log 2.7317 112 0.6893 0.3545 0.6893 0.8302
No log 2.7805 114 0.7828 0.3918 0.7828 0.8848
No log 2.8293 116 0.8135 0.3060 0.8135 0.9020
No log 2.8780 118 0.7680 0.2784 0.7680 0.8764
No log 2.9268 120 0.7706 0.2558 0.7706 0.8778
No log 2.9756 122 0.8605 0.1727 0.8605 0.9276
No log 3.0244 124 0.9593 0.1175 0.9593 0.9794
No log 3.0732 126 0.9430 0.2239 0.9430 0.9711
No log 3.1220 128 0.8969 0.2728 0.8969 0.9471
No log 3.1707 130 0.8468 0.3193 0.8468 0.9202
No log 3.2195 132 0.7677 0.3333 0.7677 0.8762
No log 3.2683 134 0.7595 0.3126 0.7595 0.8715
No log 3.3171 136 0.8471 0.2154 0.8471 0.9204
No log 3.3659 138 0.8241 0.1538 0.8241 0.9078
No log 3.4146 140 0.7455 0.2688 0.7455 0.8634
No log 3.4634 142 0.7146 0.3285 0.7146 0.8453
No log 3.5122 144 0.7458 0.2095 0.7458 0.8636
No log 3.5610 146 0.7666 0.2508 0.7666 0.8756
No log 3.6098 148 0.7629 0.2777 0.7629 0.8734
No log 3.6585 150 0.7652 0.2551 0.7652 0.8747
No log 3.7073 152 0.8005 0.2835 0.8005 0.8947
No log 3.7561 154 0.8629 0.3379 0.8629 0.9289
No log 3.8049 156 0.9012 0.3699 0.9012 0.9493
No log 3.8537 158 0.8787 0.3483 0.8787 0.9374
No log 3.9024 160 0.7907 0.3329 0.7907 0.8892
No log 3.9512 162 0.7795 0.3441 0.7795 0.8829
No log 4.0 164 0.7685 0.3299 0.7685 0.8766
No log 4.0488 166 0.7615 0.3209 0.7615 0.8727
No log 4.0976 168 0.7668 0.3770 0.7668 0.8757
No log 4.1463 170 0.8832 0.4080 0.8832 0.9398
No log 4.1951 172 0.8982 0.4080 0.8982 0.9478
No log 4.2439 174 0.7574 0.4624 0.7574 0.8703
No log 4.2927 176 0.6620 0.2813 0.6620 0.8136
No log 4.3415 178 0.6500 0.2561 0.6500 0.8062
No log 4.3902 180 0.6535 0.3088 0.6535 0.8084
No log 4.4390 182 0.6948 0.4350 0.6948 0.8335
No log 4.4878 184 0.6847 0.4112 0.6847 0.8274
No log 4.5366 186 0.7315 0.4880 0.7315 0.8553
No log 4.5854 188 0.7698 0.5009 0.7698 0.8774
No log 4.6341 190 0.7363 0.4947 0.7363 0.8581
No log 4.6829 192 0.7379 0.4531 0.7379 0.8590
No log 4.7317 194 0.7141 0.3700 0.7141 0.8450
No log 4.7805 196 0.7017 0.4315 0.7017 0.8377
No log 4.8293 198 0.7002 0.4247 0.7002 0.8368
No log 4.8780 200 0.7047 0.3972 0.7047 0.8395
No log 4.9268 202 0.6512 0.3572 0.6512 0.8070
No log 4.9756 204 0.6530 0.3603 0.6530 0.8081
No log 5.0244 206 0.7088 0.4005 0.7088 0.8419
No log 5.0732 208 0.7161 0.3635 0.7161 0.8462
No log 5.1220 210 0.6684 0.3129 0.6684 0.8176
No log 5.1707 212 0.7013 0.3372 0.7013 0.8374
No log 5.2195 214 0.8378 0.4230 0.8378 0.9153
No log 5.2683 216 0.8296 0.4230 0.8296 0.9108
No log 5.3171 218 0.7363 0.4708 0.7363 0.8581
No log 5.3659 220 0.6994 0.4224 0.6994 0.8363
No log 5.4146 222 0.7122 0.4387 0.7122 0.8439
No log 5.4634 224 0.7870 0.4624 0.7870 0.8871
No log 5.5122 226 0.7667 0.4624 0.7668 0.8756
No log 5.5610 228 0.7127 0.4387 0.7127 0.8442
No log 5.6098 230 0.7166 0.4873 0.7166 0.8465
No log 5.6585 232 0.6970 0.4644 0.6970 0.8348
No log 5.7073 234 0.7268 0.4424 0.7268 0.8525
No log 5.7561 236 0.7598 0.4404 0.7598 0.8716
No log 5.8049 238 0.7381 0.4624 0.7381 0.8591
No log 5.8537 240 0.6619 0.3127 0.6619 0.8136
No log 5.9024 242 0.6454 0.2622 0.6454 0.8034
No log 5.9512 244 0.6367 0.2327 0.6367 0.7979
No log 6.0 246 0.6408 0.2981 0.6408 0.8005
No log 6.0488 248 0.7028 0.2967 0.7028 0.8384
No log 6.0976 250 0.8322 0.4255 0.8322 0.9122
No log 6.1463 252 0.8348 0.4228 0.8348 0.9137
No log 6.1951 254 0.7376 0.3630 0.7376 0.8588
No log 6.2439 256 0.6440 0.3267 0.6440 0.8025
No log 6.2927 258 0.6347 0.3267 0.6347 0.7967
No log 6.3415 260 0.6578 0.3399 0.6578 0.8111
No log 6.3902 262 0.6830 0.3399 0.6830 0.8264
No log 6.4390 264 0.7414 0.3157 0.7414 0.8610
No log 6.4878 266 0.7371 0.3157 0.7371 0.8585
No log 6.5366 268 0.7039 0.2161 0.7039 0.8390
No log 6.5854 270 0.6923 0.2161 0.6923 0.8320
No log 6.6341 272 0.7336 0.2960 0.7336 0.8565
No log 6.6829 274 0.8286 0.3754 0.8286 0.9103
No log 6.7317 276 0.9132 0.4133 0.9132 0.9556
No log 6.7805 278 0.8854 0.4366 0.8854 0.9410
No log 6.8293 280 0.7979 0.3344 0.7979 0.8932
No log 6.8780 282 0.8003 0.3402 0.8003 0.8946
No log 6.9268 284 0.8219 0.4173 0.8219 0.9066
No log 6.9756 286 0.7442 0.4464 0.7442 0.8627
No log 7.0244 288 0.6699 0.4464 0.6699 0.8185
No log 7.0732 290 0.6292 0.4377 0.6292 0.7932
No log 7.1220 292 0.6303 0.4377 0.6303 0.7939
No log 7.1707 294 0.6702 0.3817 0.6702 0.8186
No log 7.2195 296 0.7174 0.3569 0.7174 0.8470
No log 7.2683 298 0.7569 0.3402 0.7569 0.8700
No log 7.3171 300 0.7708 0.3376 0.7708 0.8780
No log 7.3659 302 0.7686 0.2946 0.7686 0.8767
No log 7.4146 304 0.8543 0.3635 0.8543 0.9243
No log 7.4634 306 1.0950 0.3753 1.0950 1.0464
No log 7.5122 308 1.2151 0.3433 1.2151 1.1023
No log 7.5610 310 1.0574 0.3753 1.0574 1.0283
No log 7.6098 312 0.8546 0.4735 0.8546 0.9244
No log 7.6585 314 0.7613 0.3918 0.7613 0.8725
No log 7.7073 316 0.6763 0.3287 0.6763 0.8224
No log 7.7561 318 0.6459 0.4134 0.6459 0.8037
No log 7.8049 320 0.6452 0.4134 0.6452 0.8033
No log 7.8537 322 0.6541 0.4393 0.6541 0.8088
No log 7.9024 324 0.6390 0.4336 0.6390 0.7994
No log 7.9512 326 0.6261 0.4819 0.6261 0.7913
No log 8.0 328 0.6277 0.4819 0.6277 0.7923
No log 8.0488 330 0.6365 0.4819 0.6365 0.7978
No log 8.0976 332 0.6800 0.4393 0.6800 0.8246
No log 8.1463 334 0.7328 0.4389 0.7328 0.8560
No log 8.1951 336 0.7275 0.4389 0.7275 0.8529
No log 8.2439 338 0.7379 0.4389 0.7379 0.8590
No log 8.2927 340 0.7008 0.4134 0.7008 0.8372
No log 8.3415 342 0.6674 0.4342 0.6674 0.8169
No log 8.3902 344 0.6619 0.4234 0.6619 0.8136
No log 8.4390 346 0.6853 0.3942 0.6853 0.8278
No log 8.4878 348 0.7383 0.4272 0.7383 0.8592
No log 8.5366 350 0.7516 0.4272 0.7516 0.8669
No log 8.5854 352 0.6970 0.4190 0.6970 0.8349
No log 8.6341 354 0.6631 0.3382 0.6631 0.8143
No log 8.6829 356 0.6682 0.3382 0.6682 0.8174
No log 8.7317 358 0.6917 0.4190 0.6917 0.8317
No log 8.7805 360 0.7191 0.4167 0.7191 0.8480
No log 8.8293 362 0.7096 0.4167 0.7096 0.8424
No log 8.8780 364 0.7020 0.4190 0.7020 0.8379
No log 8.9268 366 0.7603 0.4144 0.7603 0.8719
No log 8.9756 368 0.7931 0.4387 0.7931 0.8905
No log 9.0244 370 0.7845 0.4745 0.7845 0.8857
No log 9.0732 372 0.7302 0.4294 0.7302 0.8545
No log 9.1220 374 0.6749 0.3937 0.6749 0.8215
No log 9.1707 376 0.6750 0.4287 0.6750 0.8216
No log 9.2195 378 0.6896 0.4287 0.6896 0.8304
No log 9.2683 380 0.6971 0.4591 0.6971 0.8349
No log 9.3171 382 0.7161 0.4447 0.7161 0.8462
No log 9.3659 384 0.7629 0.4633 0.7629 0.8735
No log 9.4146 386 0.7531 0.4821 0.7531 0.8678
No log 9.4634 388 0.7112 0.4093 0.7112 0.8433
No log 9.5122 390 0.6746 0.4294 0.6746 0.8214
No log 9.5610 392 0.6616 0.3763 0.6616 0.8134
No log 9.6098 394 0.6606 0.4019 0.6606 0.8127
No log 9.6585 396 0.6597 0.3787 0.6597 0.8122
No log 9.7073 398 0.6598 0.4582 0.6598 0.8123
No log 9.7561 400 0.7032 0.4606 0.7032 0.8386
No log 9.8049 402 0.7078 0.4606 0.7078 0.8413
No log 9.8537 404 0.6678 0.5103 0.6678 0.8172
No log 9.9024 406 0.6446 0.5142 0.6446 0.8029
No log 9.9512 408 0.6313 0.5142 0.6313 0.7946
No log 10.0 410 0.6343 0.5142 0.6343 0.7964
No log 10.0488 412 0.6433 0.5125 0.6433 0.8021
No log 10.0976 414 0.6462 0.5125 0.6462 0.8038
No log 10.1463 416 0.6529 0.5125 0.6529 0.8080
No log 10.1951 418 0.6633 0.5032 0.6633 0.8144
No log 10.2439 420 0.6942 0.4795 0.6942 0.8332
No log 10.2927 422 0.7618 0.3786 0.7618 0.8728
No log 10.3415 424 0.8066 0.5146 0.8066 0.8981
No log 10.3902 426 0.7732 0.4698 0.7732 0.8793
No log 10.4390 428 0.7073 0.4529 0.7073 0.8410
No log 10.4878 430 0.6764 0.4484 0.6764 0.8224
No log 10.5366 432 0.6719 0.4664 0.6719 0.8197
No log 10.5854 434 0.6322 0.4430 0.6322 0.7951
No log 10.6341 436 0.6038 0.4430 0.6038 0.7770
No log 10.6829 438 0.5846 0.4134 0.5846 0.7646
No log 10.7317 440 0.5956 0.4134 0.5956 0.7718
No log 10.7805 442 0.6161 0.4681 0.6161 0.7849
No log 10.8293 444 0.6285 0.4758 0.6285 0.7928
No log 10.8780 446 0.6623 0.4236 0.6623 0.8138
No log 10.9268 448 0.7239 0.4217 0.7239 0.8508
No log 10.9756 450 0.7897 0.5124 0.7897 0.8887
No log 11.0244 452 0.8287 0.4844 0.8287 0.9103
No log 11.0732 454 0.8626 0.4844 0.8626 0.9288
No log 11.1220 456 0.9764 0.4013 0.9764 0.9882
No log 11.1707 458 1.0279 0.3183 1.0279 1.0139
No log 11.2195 460 0.9938 0.3484 0.9938 0.9969
No log 11.2683 462 0.9067 0.4444 0.9067 0.9522
No log 11.3171 464 0.7267 0.4808 0.7267 0.8525
No log 11.3659 466 0.6214 0.4081 0.6214 0.7883
No log 11.4146 468 0.6072 0.4601 0.6072 0.7792
No log 11.4634 470 0.6107 0.4601 0.6107 0.7815
No log 11.5122 472 0.6405 0.4044 0.6405 0.8003
No log 11.5610 474 0.7165 0.4877 0.7165 0.8465
No log 11.6098 476 0.7354 0.5065 0.7354 0.8576
No log 11.6585 478 0.6753 0.4812 0.6753 0.8217
No log 11.7073 480 0.6014 0.4212 0.6014 0.7755
No log 11.7561 482 0.5724 0.4493 0.5724 0.7565
No log 11.8049 484 0.5754 0.4895 0.5754 0.7586
No log 11.8537 486 0.5906 0.4618 0.5906 0.7685
No log 11.9024 488 0.6483 0.4294 0.6483 0.8052
No log 11.9512 490 0.7821 0.4844 0.7821 0.8844
No log 12.0 492 0.8545 0.4519 0.8545 0.9244
No log 12.0488 494 0.8543 0.4519 0.8543 0.9243
No log 12.0976 496 0.7594 0.4667 0.7594 0.8714
No log 12.1463 498 0.6558 0.4190 0.6558 0.8098
0.3622 12.1951 500 0.5980 0.4036 0.5980 0.7733
0.3622 12.2439 502 0.5923 0.4441 0.5923 0.7696
0.3622 12.2927 504 0.5870 0.3862 0.5870 0.7661
0.3622 12.3415 506 0.5769 0.4229 0.5769 0.7595
0.3622 12.3902 508 0.5955 0.4190 0.5955 0.7717
0.3622 12.4390 510 0.7123 0.4404 0.7123 0.8440
0.3622 12.4878 512 0.8719 0.3804 0.8719 0.9338
0.3622 12.5366 514 0.9130 0.3804 0.9130 0.9555
0.3622 12.5854 516 0.8333 0.4247 0.8333 0.9128
0.3622 12.6341 518 0.7560 0.3976 0.7560 0.8695
0.3622 12.6829 520 0.7264 0.3888 0.7264 0.8523
0.3622 12.7317 522 0.7175 0.3888 0.7175 0.8470
0.3622 12.7805 524 0.7276 0.3997 0.7276 0.8530
0.3622 12.8293 526 0.7575 0.4531 0.7575 0.8704
0.3622 12.8780 528 0.7410 0.4072 0.7410 0.8608
0.3622 12.9268 530 0.7070 0.3985 0.7070 0.8409
0.3622 12.9756 532 0.7005 0.3985 0.7005 0.8370
0.3622 13.0244 534 0.6997 0.4193 0.6997 0.8365
0.3622 13.0732 536 0.6698 0.4044 0.6698 0.8184
0.3622 13.1220 538 0.6305 0.4534 0.6305 0.7940
0.3622 13.1707 540 0.6029 0.4291 0.6029 0.7765
0.3622 13.2195 542 0.6008 0.4291 0.6008 0.7751
0.3622 13.2683 544 0.6079 0.4291 0.6079 0.7797
0.3622 13.3171 546 0.6160 0.4291 0.6160 0.7848
0.3622 13.3659 548 0.6300 0.4808 0.6300 0.7938
0.3622 13.4146 550 0.6616 0.4754 0.6616 0.8134
0.3622 13.4634 552 0.6731 0.4587 0.6731 0.8205
0.3622 13.5122 554 0.6960 0.4436 0.6960 0.8343
0.3622 13.5610 556 0.6916 0.4353 0.6916 0.8316
0.3622 13.6098 558 0.6464 0.4888 0.6464 0.8040
0.3622 13.6585 560 0.6011 0.5076 0.6011 0.7753
0.3622 13.7073 562 0.5812 0.4972 0.5812 0.7624
0.3622 13.7561 564 0.5868 0.5075 0.5868 0.7660
0.3622 13.8049 566 0.5973 0.5619 0.5973 0.7728
0.3622 13.8537 568 0.6034 0.5528 0.6034 0.7768
0.3622 13.9024 570 0.5912 0.5729 0.5912 0.7689
0.3622 13.9512 572 0.5779 0.6359 0.5779 0.7602
0.3622 14.0 574 0.5783 0.5953 0.5783 0.7605
0.3622 14.0488 576 0.5697 0.5692 0.5697 0.7548
0.3622 14.0976 578 0.5624 0.5937 0.5624 0.7499
0.3622 14.1463 580 0.5603 0.4801 0.5603 0.7486
0.3622 14.1951 582 0.5613 0.4911 0.5613 0.7492
0.3622 14.2439 584 0.5911 0.4522 0.5911 0.7688
0.3622 14.2927 586 0.5994 0.4606 0.5994 0.7742
0.3622 14.3415 588 0.5756 0.4954 0.5756 0.7587
0.3622 14.3902 590 0.5748 0.5831 0.5748 0.7582
0.3622 14.4390 592 0.5870 0.5421 0.5870 0.7662
0.3622 14.4878 594 0.5851 0.4972 0.5851 0.7649
0.3622 14.5366 596 0.5863 0.5379 0.5863 0.7657
0.3622 14.5854 598 0.6178 0.4542 0.6178 0.7860
0.3622 14.6341 600 0.6392 0.4625 0.6392 0.7995
0.3622 14.6829 602 0.6197 0.4929 0.6197 0.7872
0.3622 14.7317 604 0.5994 0.4354 0.5994 0.7742
0.3622 14.7805 606 0.5852 0.4637 0.5852 0.7650
0.3622 14.8293 608 0.5833 0.4194 0.5833 0.7637
0.3622 14.8780 610 0.5917 0.4908 0.5917 0.7692
0.3622 14.9268 612 0.5949 0.4451 0.5949 0.7713
0.3622 14.9756 614 0.5944 0.4684 0.5944 0.7710
0.3622 15.0244 616 0.5865 0.4684 0.5865 0.7658
0.3622 15.0732 618 0.5843 0.4684 0.5843 0.7644
0.3622 15.1220 620 0.6126 0.4330 0.6126 0.7827
0.3622 15.1707 622 0.6344 0.4576 0.6344 0.7965
0.3622 15.2195 624 0.6530 0.4576 0.6530 0.8081
0.3622 15.2683 626 0.6808 0.4491 0.6808 0.8251
0.3622 15.3171 628 0.6873 0.4491 0.6873 0.8290
0.3622 15.3659 630 0.6761 0.4247 0.6761 0.8223
0.3622 15.4146 632 0.6859 0.4491 0.6859 0.8282
0.3622 15.4634 634 0.7390 0.3819 0.7390 0.8596
0.3622 15.5122 636 0.7658 0.4307 0.7658 0.8751
0.3622 15.5610 638 0.7212 0.4067 0.7212 0.8493
0.3622 15.6098 640 0.6537 0.4576 0.6537 0.8085
0.3622 15.6585 642 0.6144 0.4375 0.6144 0.7838
0.3622 15.7073 644 0.6187 0.4253 0.6187 0.7866
0.3622 15.7561 646 0.6317 0.3551 0.6317 0.7948
0.3622 15.8049 648 0.6750 0.3713 0.6750 0.8216
0.3622 15.8537 650 0.7736 0.4089 0.7736 0.8795
0.3622 15.9024 652 0.9603 0.3193 0.9603 0.9800
0.3622 15.9512 654 1.0979 0.2650 1.0979 1.0478
0.3622 16.0 656 1.1080 0.2815 1.1080 1.0526
0.3622 16.0488 658 0.9947 0.3517 0.9947 0.9973
0.3622 16.0976 660 0.8416 0.3819 0.8416 0.9174

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B parameters (F32, Safetensors)