ArabicNewSplits4_WithDuplicationsForScore5_FineTuningAraBERT_run2_AugV5_k13_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.0024
  • Qwk: 0.3676
  • Mse: 1.0024
  • Rmse: 1.0012
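Here Qwk is the quadratic weighted kappa between predicted and gold integer scores, and Mse/Rmse are computed on the same labels (note Rmse ≈ √Mse: 1.0012 ≈ √1.0024). A minimal pure-Python sketch of these metrics (an illustration, not the card's actual evaluation code):

```python
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred):
    # QWK between two lists of integer ratings on a shared scale.
    lo = min(min(y_true), min(y_pred))
    hi = max(max(y_true), max(y_pred))
    num_ratings = hi - lo + 1
    n = len(y_true)
    # Observed confusion matrix.
    O = [[0] * num_ratings for _ in range(num_ratings)]
    for t, p in zip(y_true, y_pred):
        O[t - lo][p - lo] += 1
    # Expected matrix from the outer product of the marginal histograms.
    hist_true = Counter(t - lo for t in y_true)
    hist_pred = Counter(p - lo for p in y_pred)
    num = den = 0.0
    for i in range(num_ratings):
        for j in range(num_ratings):
            w = ((i - j) ** 2) / ((num_ratings - 1) ** 2)  # quadratic weight
            num += w * O[i][j]
            den += w * hist_true[i] * hist_pred[j] / n
    return 1.0 - num / den

def mse_rmse(y_true, y_pred):
    mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
    return mse, mse ** 0.5
```

Perfect agreement gives QWK = 1.0, chance-level agreement gives 0.0, and the quadratic weights penalize predictions proportionally to the squared distance from the true score.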

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
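The hyperparameters above map directly onto a standard Hugging Face `TrainingArguments` configuration. A hedged sketch of that mapping (the `output_dir` name is a placeholder, not from the original card):

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="arabert-task2-organization",
    learning_rate=2e-05,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```

These arguments would then be passed to a `Trainer` together with the model, datasets, and a metric function.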

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0299 2 3.9378 -0.0039 3.9378 1.9844
No log 0.0597 4 2.0427 0.0305 2.0427 1.4292
No log 0.0896 6 1.2016 0.0569 1.2016 1.0962
No log 0.1194 8 0.9414 -0.0728 0.9414 0.9703
No log 0.1493 10 0.7593 0.1746 0.7593 0.8714
No log 0.1791 12 0.6904 0.2650 0.6904 0.8309
No log 0.2090 14 0.8712 0.0565 0.8712 0.9334
No log 0.2388 16 0.9602 0.0607 0.9602 0.9799
No log 0.2687 18 1.0893 0.0593 1.0893 1.0437
No log 0.2985 20 0.7727 0.0775 0.7727 0.8791
No log 0.3284 22 0.7903 0.1819 0.7903 0.8890
No log 0.3582 24 0.9670 0.2523 0.9670 0.9834
No log 0.3881 26 0.8572 0.1525 0.8572 0.9259
No log 0.4179 28 0.6791 0.1063 0.6791 0.8241
No log 0.4478 30 0.6787 0.1694 0.6787 0.8238
No log 0.4776 32 0.7541 0.1981 0.7541 0.8684
No log 0.5075 34 0.7115 0.2532 0.7115 0.8435
No log 0.5373 36 0.6172 0.3011 0.6172 0.7856
No log 0.5672 38 0.6740 0.2355 0.6740 0.8210
No log 0.5970 40 0.7309 0.3346 0.7309 0.8549
No log 0.6269 42 0.8974 0.2944 0.8974 0.9473
No log 0.6567 44 1.0912 0.2740 1.0912 1.0446
No log 0.6866 46 1.2895 0.1755 1.2895 1.1355
No log 0.7164 48 1.0240 0.2885 1.0240 1.0119
No log 0.7463 50 0.7687 0.2846 0.7687 0.8768
No log 0.7761 52 0.7065 0.2069 0.7065 0.8406
No log 0.8060 54 0.7406 0.3245 0.7406 0.8606
No log 0.8358 56 0.9884 0.3496 0.9884 0.9942
No log 0.8657 58 1.3583 0.2992 1.3583 1.1654
No log 0.8955 60 1.1363 0.2992 1.1363 1.0660
No log 0.9254 62 0.8255 0.4250 0.8255 0.9086
No log 0.9552 64 0.8597 0.3995 0.8597 0.9272
No log 0.9851 66 1.1764 0.3605 1.1764 1.0846
No log 1.0149 68 1.4387 0.3262 1.4387 1.1994
No log 1.0448 70 1.5367 0.3143 1.5367 1.2397
No log 1.0746 72 1.5401 0.3283 1.5401 1.2410
No log 1.1045 74 1.2111 0.4085 1.2111 1.1005
No log 1.1343 76 0.9527 0.4324 0.9527 0.9761
No log 1.1642 78 0.9849 0.4491 0.9849 0.9924
No log 1.1940 80 1.0706 0.4065 1.0706 1.0347
No log 1.2239 82 1.1432 0.3606 1.1432 1.0692
No log 1.2537 84 1.2528 0.3600 1.2528 1.1193
No log 1.2836 86 1.2184 0.3855 1.2184 1.1038
No log 1.3134 88 0.9533 0.3951 0.9533 0.9764
No log 1.3433 90 0.9159 0.3998 0.9159 0.9571
No log 1.3731 92 0.9197 0.4195 0.9197 0.9590
No log 1.4030 94 0.9828 0.4042 0.9828 0.9914
No log 1.4328 96 1.4526 0.3171 1.4526 1.2053
No log 1.4627 98 1.4825 0.3137 1.4825 1.2176
No log 1.4925 100 1.1785 0.3679 1.1785 1.0856
No log 1.5224 102 1.0456 0.4147 1.0456 1.0226
No log 1.5522 104 1.1328 0.3699 1.1328 1.0643
No log 1.5821 106 1.2181 0.3707 1.2181 1.1037
No log 1.6119 108 1.1815 0.3873 1.1815 1.0869
No log 1.6418 110 0.8697 0.4075 0.8697 0.9326
No log 1.6716 112 0.7543 0.4146 0.7543 0.8685
No log 1.7015 114 0.7776 0.3839 0.7776 0.8818
No log 1.7313 116 1.0288 0.3864 1.0288 1.0143
No log 1.7612 118 1.1935 0.3717 1.1935 1.0925
No log 1.7910 120 1.1278 0.3356 1.1278 1.0620
No log 1.8209 122 0.9896 0.3729 0.9896 0.9948
No log 1.8507 124 0.9884 0.3573 0.9884 0.9942
No log 1.8806 126 0.9481 0.4181 0.9481 0.9737
No log 1.9104 128 0.9435 0.4107 0.9435 0.9713
No log 1.9403 130 0.9556 0.3917 0.9556 0.9775
No log 1.9701 132 1.0130 0.4132 1.0130 1.0065
No log 2.0 134 1.3015 0.3759 1.3015 1.1408
No log 2.0299 136 1.2607 0.3755 1.2607 1.1228
No log 2.0597 138 1.0114 0.4118 1.0114 1.0057
No log 2.0896 140 0.9557 0.4084 0.9557 0.9776
No log 2.1194 142 0.9703 0.4133 0.9703 0.9851
No log 2.1493 144 1.0624 0.3819 1.0624 1.0307
No log 2.1791 146 1.0376 0.4061 1.0376 1.0186
No log 2.2090 148 1.1088 0.3972 1.1088 1.0530
No log 2.2388 150 1.0322 0.4385 1.0322 1.0160
No log 2.2687 152 0.9981 0.4122 0.9981 0.9991
No log 2.2985 154 0.9799 0.3901 0.9799 0.9899
No log 2.3284 156 1.0118 0.4398 1.0118 1.0059
No log 2.3582 158 1.0929 0.3746 1.0929 1.0454
No log 2.3881 160 1.0464 0.3926 1.0464 1.0229
No log 2.4179 162 0.9612 0.4217 0.9612 0.9804
No log 2.4478 164 0.8985 0.4332 0.8985 0.9479
No log 2.4776 166 0.8981 0.3951 0.8981 0.9477
No log 2.5075 168 1.0069 0.4076 1.0069 1.0034
No log 2.5373 170 1.1598 0.3697 1.1598 1.0770
No log 2.5672 172 1.1521 0.3896 1.1521 1.0734
No log 2.5970 174 1.1574 0.4023 1.1574 1.0758
No log 2.6269 176 1.1164 0.3832 1.1164 1.0566
No log 2.6567 178 0.9707 0.4065 0.9707 0.9852
No log 2.6866 180 0.9582 0.4101 0.9582 0.9789
No log 2.7164 182 1.0299 0.3700 1.0299 1.0148
No log 2.7463 184 1.0953 0.3759 1.0953 1.0466
No log 2.7761 186 1.2534 0.3998 1.2534 1.1196
No log 2.8060 188 1.4549 0.3683 1.4549 1.2062
No log 2.8358 190 1.3925 0.3648 1.3925 1.1801
No log 2.8657 192 1.1489 0.3923 1.1489 1.0719
No log 2.8955 194 0.9832 0.4563 0.9832 0.9915
No log 2.9254 196 0.9365 0.4702 0.9365 0.9677
No log 2.9552 198 0.9498 0.4483 0.9498 0.9746
No log 2.9851 200 1.0625 0.4107 1.0625 1.0308
No log 3.0149 202 1.2256 0.4034 1.2256 1.1071
No log 3.0448 204 1.1698 0.3983 1.1698 1.0816
No log 3.0746 206 1.0841 0.4018 1.0841 1.0412
No log 3.1045 208 1.0683 0.4218 1.0683 1.0336
No log 3.1343 210 1.2088 0.3414 1.2088 1.0994
No log 3.1642 212 1.3120 0.3350 1.3120 1.1454
No log 3.1940 214 1.3170 0.3375 1.3170 1.1476
No log 3.2239 216 1.2067 0.3778 1.2067 1.0985
No log 3.2537 218 1.1003 0.4164 1.1003 1.0489
No log 3.2836 220 1.0509 0.4122 1.0509 1.0251
No log 3.3134 222 1.0151 0.3789 1.0151 1.0075
No log 3.3433 224 0.9855 0.3604 0.9855 0.9927
No log 3.3731 226 1.0717 0.3417 1.0717 1.0352
No log 3.4030 228 1.0829 0.3179 1.0829 1.0406
No log 3.4328 230 1.0283 0.4082 1.0283 1.0141
No log 3.4627 232 1.0062 0.4098 1.0062 1.0031
No log 3.4925 234 1.0621 0.4130 1.0621 1.0306
No log 3.5224 236 1.2293 0.3591 1.2293 1.1087
No log 3.5522 238 1.3475 0.3853 1.3475 1.1608
No log 3.5821 240 1.2418 0.3865 1.2418 1.1144
No log 3.6119 242 1.1279 0.4176 1.1279 1.0620
No log 3.6418 244 1.0127 0.4441 1.0127 1.0063
No log 3.6716 246 1.0043 0.4417 1.0043 1.0021
No log 3.7015 248 1.0931 0.3672 1.0931 1.0455
No log 3.7313 250 1.1018 0.3835 1.1018 1.0497
No log 3.7612 252 0.9733 0.4171 0.9733 0.9865
No log 3.7910 254 0.9207 0.4457 0.9207 0.9595
No log 3.8209 256 0.9163 0.4364 0.9163 0.9573
No log 3.8507 258 0.9315 0.4364 0.9315 0.9651
No log 3.8806 260 1.0241 0.4202 1.0241 1.0120
No log 3.9104 262 1.1219 0.4269 1.1219 1.0592
No log 3.9403 264 1.1529 0.4162 1.1529 1.0737
No log 3.9701 266 1.0981 0.4255 1.0981 1.0479
No log 4.0 268 1.0641 0.4110 1.0641 1.0315
No log 4.0299 270 1.1209 0.4086 1.1209 1.0587
No log 4.0597 272 1.1637 0.3730 1.1637 1.0788
No log 4.0896 274 1.1383 0.3921 1.1383 1.0669
No log 4.1194 276 1.1793 0.3381 1.1793 1.0859
No log 4.1493 278 1.1400 0.3358 1.1400 1.0677
No log 4.1791 280 1.0812 0.3742 1.0812 1.0398
No log 4.2090 282 0.9961 0.3908 0.9961 0.9981
No log 4.2388 284 0.9467 0.3884 0.9467 0.9730
No log 4.2687 286 0.8975 0.4157 0.8975 0.9474
No log 4.2985 288 0.9029 0.3922 0.9029 0.9502
No log 4.3284 290 0.9274 0.3930 0.9274 0.9630
No log 4.3582 292 0.9448 0.4040 0.9448 0.9720
No log 4.3881 294 0.9253 0.4311 0.9253 0.9620
No log 4.4179 296 0.9446 0.4015 0.9446 0.9719
No log 4.4478 298 0.9643 0.3829 0.9643 0.9820
No log 4.4776 300 0.9312 0.4121 0.9312 0.9650
No log 4.5075 302 0.9586 0.4001 0.9586 0.9791
No log 4.5373 304 0.9586 0.3947 0.9586 0.9791
No log 4.5672 306 0.9412 0.4114 0.9412 0.9702
No log 4.5970 308 0.9415 0.4052 0.9415 0.9703
No log 4.6269 310 0.9279 0.3998 0.9279 0.9633
No log 4.6567 312 0.8888 0.4123 0.8888 0.9428
No log 4.6866 314 0.8724 0.4273 0.8724 0.9340
No log 4.7164 316 0.8942 0.4366 0.8942 0.9456
No log 4.7463 318 0.9114 0.4467 0.9114 0.9547
No log 4.7761 320 0.9638 0.4191 0.9638 0.9817
No log 4.8060 322 1.0484 0.4073 1.0484 1.0239
No log 4.8358 324 1.0913 0.4123 1.0913 1.0446
No log 4.8657 326 1.0948 0.4068 1.0948 1.0463
No log 4.8955 328 1.0848 0.4125 1.0848 1.0415
No log 4.9254 330 1.0838 0.4097 1.0838 1.0411
No log 4.9552 332 1.1028 0.4054 1.1028 1.0501
No log 4.9851 334 1.1350 0.4286 1.1350 1.0654
No log 5.0149 336 1.1976 0.4073 1.1976 1.0944
No log 5.0448 338 1.2287 0.4146 1.2287 1.1085
No log 5.0746 340 1.2620 0.3693 1.2620 1.1234
No log 5.1045 342 1.2859 0.3496 1.2859 1.1340
No log 5.1343 344 1.1860 0.4062 1.1860 1.0890
No log 5.1642 346 1.1096 0.4015 1.1096 1.0534
No log 5.1940 348 1.0948 0.3890 1.0948 1.0463
No log 5.2239 350 1.0861 0.4474 1.0861 1.0422
No log 5.2537 352 1.1075 0.3918 1.1075 1.0524
No log 5.2836 354 1.1499 0.3944 1.1499 1.0723
No log 5.3134 356 1.1852 0.3602 1.1852 1.0887
No log 5.3433 358 1.1290 0.3984 1.1290 1.0625
No log 5.3731 360 1.0445 0.4265 1.0445 1.0220
No log 5.4030 362 1.0537 0.4138 1.0537 1.0265
No log 5.4328 364 1.1319 0.4396 1.1319 1.0639
No log 5.4627 366 1.0936 0.4259 1.0936 1.0458
No log 5.4925 368 1.0191 0.4356 1.0191 1.0095
No log 5.5224 370 0.9586 0.4043 0.9586 0.9791
No log 5.5522 372 0.9543 0.4026 0.9543 0.9769
No log 5.5821 374 0.9349 0.4343 0.9349 0.9669
No log 5.6119 376 0.9685 0.4289 0.9685 0.9841
No log 5.6418 378 1.0503 0.3969 1.0503 1.0248
No log 5.6716 380 1.0817 0.3704 1.0817 1.0401
No log 5.7015 382 1.0901 0.3864 1.0901 1.0441
No log 5.7313 384 1.0196 0.3906 1.0196 1.0098
No log 5.7612 386 0.9856 0.3891 0.9856 0.9928
No log 5.7910 388 0.9073 0.4749 0.9073 0.9525
No log 5.8209 390 0.8662 0.4817 0.8662 0.9307
No log 5.8507 392 0.8371 0.4468 0.8371 0.9149
No log 5.8806 394 0.8621 0.4106 0.8621 0.9285
No log 5.9104 396 0.8747 0.4006 0.8747 0.9352
No log 5.9403 398 0.8729 0.4006 0.8729 0.9343
No log 5.9701 400 0.8499 0.3962 0.8499 0.9219
No log 6.0 402 0.8906 0.3969 0.8906 0.9437
No log 6.0299 404 0.9654 0.3934 0.9654 0.9826
No log 6.0597 406 1.0332 0.3833 1.0332 1.0165
No log 6.0896 408 1.1122 0.3905 1.1122 1.0546
No log 6.1194 410 1.2084 0.3834 1.2084 1.0993
No log 6.1493 412 1.1849 0.3936 1.1849 1.0885
No log 6.1791 414 1.1361 0.3832 1.1361 1.0659
No log 6.2090 416 1.1004 0.3879 1.1004 1.0490
No log 6.2388 418 1.0658 0.3900 1.0658 1.0324
No log 6.2687 420 1.0488 0.4187 1.0488 1.0241
No log 6.2985 422 1.0452 0.4187 1.0452 1.0223
No log 6.3284 424 1.1057 0.4002 1.1057 1.0515
No log 6.3582 426 1.1754 0.3879 1.1754 1.0842
No log 6.3881 428 1.1351 0.3845 1.1351 1.0654
No log 6.4179 430 1.0544 0.4189 1.0544 1.0268
No log 6.4478 432 0.9959 0.4351 0.9959 0.9980
No log 6.4776 434 0.9023 0.4305 0.9023 0.9499
No log 6.5075 436 0.8486 0.4289 0.8486 0.9212
No log 6.5373 438 0.8222 0.3965 0.8222 0.9068
No log 6.5672 440 0.8313 0.4173 0.8313 0.9117
No log 6.5970 442 0.8516 0.4250 0.8516 0.9228
No log 6.6269 444 0.8750 0.4509 0.8750 0.9354
No log 6.6567 446 0.9225 0.4187 0.9225 0.9605
No log 6.6866 448 1.0101 0.4035 1.0101 1.0051
No log 6.7164 450 1.0764 0.4149 1.0764 1.0375
No log 6.7463 452 1.0794 0.4061 1.0794 1.0390
No log 6.7761 454 1.0473 0.4242 1.0473 1.0234
No log 6.8060 456 1.0011 0.4247 1.0011 1.0006
No log 6.8358 458 0.9814 0.4107 0.9814 0.9907
No log 6.8657 460 0.9849 0.4247 0.9849 0.9924
No log 6.8955 462 1.0060 0.4175 1.0060 1.0030
No log 6.9254 464 1.0157 0.4337 1.0157 1.0078
No log 6.9552 466 1.0201 0.4305 1.0201 1.0100
No log 6.9851 468 1.0680 0.4135 1.0680 1.0334
No log 7.0149 470 1.1025 0.4007 1.1025 1.0500
No log 7.0448 472 1.1045 0.4109 1.1045 1.0510
No log 7.0746 474 1.0792 0.4148 1.0792 1.0389
No log 7.1045 476 1.0582 0.4147 1.0582 1.0287
No log 7.1343 478 1.0219 0.4192 1.0219 1.0109
No log 7.1642 480 0.9993 0.4256 0.9993 0.9997
No log 7.1940 482 0.9943 0.4174 0.9943 0.9971
No log 7.2239 484 0.9928 0.4043 0.9928 0.9964
No log 7.2537 486 0.9842 0.4127 0.9842 0.9921
No log 7.2836 488 1.0073 0.4064 1.0073 1.0037
No log 7.3134 490 1.0743 0.3520 1.0743 1.0365
No log 7.3433 492 1.1618 0.3486 1.1618 1.0779
No log 7.3731 494 1.2243 0.3542 1.2243 1.1065
No log 7.4030 496 1.2551 0.3693 1.2551 1.1203
No log 7.4328 498 1.2493 0.3693 1.2493 1.1177
0.4293 7.4627 500 1.1809 0.3577 1.1809 1.0867
0.4293 7.4925 502 1.0875 0.3676 1.0875 1.0428
0.4293 7.5224 504 0.9993 0.4147 0.9993 0.9997
0.4293 7.5522 506 0.9495 0.4171 0.9495 0.9744
0.4293 7.5821 508 0.9355 0.4220 0.9355 0.9672
0.4293 7.6119 510 0.9398 0.4270 0.9398 0.9694
0.4293 7.6418 512 0.9730 0.3921 0.9730 0.9864
0.4293 7.6716 514 1.0542 0.3627 1.0542 1.0267
0.4293 7.7015 516 1.1510 0.4201 1.1510 1.0728
0.4293 7.7313 518 1.1957 0.3999 1.1957 1.0935
0.4293 7.7612 520 1.1744 0.4157 1.1744 1.0837
0.4293 7.7910 522 1.0999 0.3748 1.0999 1.0487
0.4293 7.8209 524 1.0593 0.3651 1.0593 1.0292
0.4293 7.8507 526 1.0384 0.3675 1.0384 1.0190
0.4293 7.8806 528 1.0364 0.3675 1.0364 1.0180
0.4293 7.9104 530 1.0222 0.3629 1.0222 1.0110
0.4293 7.9403 532 1.0224 0.3629 1.0224 1.0112
0.4293 7.9701 534 0.9907 0.3861 0.9907 0.9954
0.4293 8.0 536 0.9658 0.3861 0.9658 0.9827
0.4293 8.0299 538 0.9572 0.3664 0.9572 0.9784
0.4293 8.0597 540 0.9578 0.3639 0.9578 0.9787
0.4293 8.0896 542 0.9348 0.3633 0.9348 0.9668
0.4293 8.1194 544 0.9305 0.3633 0.9305 0.9646
0.4293 8.1493 546 0.9210 0.3784 0.9210 0.9597
0.4293 8.1791 548 0.9160 0.3784 0.9160 0.9571
0.4293 8.2090 550 0.9239 0.3784 0.9239 0.9612
0.4293 8.2388 552 0.9544 0.3857 0.9544 0.9769
0.4293 8.2687 554 0.9789 0.3658 0.9789 0.9894
0.4293 8.2985 556 1.0165 0.3599 1.0165 1.0082
0.4293 8.3284 558 1.0299 0.3599 1.0299 1.0148
0.4293 8.3582 560 1.0185 0.3687 1.0185 1.0092
0.4293 8.3881 562 0.9972 0.3819 0.9972 0.9986
0.4293 8.4179 564 0.9838 0.4092 0.9838 0.9919
0.4293 8.4478 566 0.9743 0.4287 0.9743 0.9871
0.4293 8.4776 568 0.9827 0.4303 0.9827 0.9913
0.4293 8.5075 570 0.9836 0.4270 0.9836 0.9918
0.4293 8.5373 572 0.9875 0.4205 0.9875 0.9937
0.4293 8.5672 574 1.0071 0.3811 1.0071 1.0035
0.4293 8.5970 576 1.0264 0.3571 1.0264 1.0131
0.4293 8.6269 578 1.0557 0.3653 1.0557 1.0275
0.4293 8.6567 580 1.0888 0.3611 1.0888 1.0435
0.4293 8.6866 582 1.0952 0.3766 1.0952 1.0465
0.4293 8.7164 584 1.0852 0.3629 1.0852 1.0417
0.4293 8.7463 586 1.0728 0.3653 1.0728 1.0358
0.4293 8.7761 588 1.0626 0.3539 1.0626 1.0308
0.4293 8.8060 590 1.0485 0.3539 1.0485 1.0240
0.4293 8.8358 592 1.0231 0.3676 1.0231 1.0115
0.4293 8.8657 594 0.9890 0.4027 0.9890 0.9945
0.4293 8.8955 596 0.9657 0.4088 0.9657 0.9827
0.4293 8.9254 598 0.9532 0.4239 0.9532 0.9763
0.4293 8.9552 600 0.9539 0.4088 0.9539 0.9767
0.4293 8.9851 602 0.9595 0.4088 0.9595 0.9796
0.4293 9.0149 604 0.9731 0.3837 0.9731 0.9865
0.4293 9.0448 606 0.9796 0.3823 0.9796 0.9897
0.4293 9.0746 608 0.9923 0.3713 0.9923 0.9961
0.4293 9.1045 610 1.0139 0.3676 1.0139 1.0069
0.4293 9.1343 612 1.0301 0.3527 1.0301 1.0150
0.4293 9.1642 614 1.0431 0.3527 1.0431 1.0213
0.4293 9.1940 616 1.0458 0.3527 1.0458 1.0226
0.4293 9.2239 618 1.0359 0.3527 1.0359 1.0178
0.4293 9.2537 620 1.0176 0.3676 1.0176 1.0088
0.4293 9.2836 622 1.0024 0.3713 1.0024 1.0012
0.4293 9.3134 624 0.9829 0.3713 0.9829 0.9914
0.4293 9.3433 626 0.9678 0.3823 0.9678 0.9838
0.4293 9.3731 628 0.9492 0.3740 0.9492 0.9743
0.4293 9.4030 630 0.9359 0.3997 0.9359 0.9674
0.4293 9.4328 632 0.9268 0.4184 0.9268 0.9627
0.4293 9.4627 634 0.9244 0.4184 0.9244 0.9615
0.4293 9.4925 636 0.9264 0.4133 0.9264 0.9625
0.4293 9.5224 638 0.9295 0.4133 0.9295 0.9641
0.4293 9.5522 640 0.9377 0.4066 0.9377 0.9684
0.4293 9.5821 642 0.9473 0.3997 0.9473 0.9733
0.4293 9.6119 644 0.9522 0.3997 0.9522 0.9758
0.4293 9.6418 646 0.9549 0.3997 0.9549 0.9772
0.4293 9.6716 648 0.9604 0.3997 0.9604 0.9800
0.4293 9.7015 650 0.9678 0.4070 0.9678 0.9838
0.4293 9.7313 652 0.9767 0.4070 0.9767 0.9883
0.4293 9.7612 654 0.9832 0.4072 0.9832 0.9916
0.4293 9.7910 656 0.9874 0.4072 0.9874 0.9937
0.4293 9.8209 658 0.9917 0.4072 0.9917 0.9958
0.4293 9.8507 660 0.9969 0.3893 0.9969 0.9985
0.4293 9.8806 662 1.0009 0.3713 1.0009 1.0004
0.4293 9.9104 664 1.0026 0.3676 1.0026 1.0013
0.4293 9.9403 666 1.0028 0.3676 1.0028 1.0014
0.4293 9.9701 668 1.0025 0.3676 1.0025 1.0012
0.4293 10.0 670 1.0024 0.3676 1.0024 1.0012
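The log above spans 670 optimizer steps over 10 epochs (67 steps per epoch); with the linear scheduler and no warmup, the learning rate decays linearly from 2e-05 to 0. A small sketch of that schedule (the total step count and zero warmup are inferred from the table, not stated in the card):

```python
def linear_lr(step, total_steps=670, base_lr=2e-05, warmup_steps=0):
    # Linear decay with optional warmup, in the style of
    # transformers' get_linear_schedule_with_warmup.
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = total_steps - step
    return base_lr * max(0.0, remaining / max(1, total_steps - warmup_steps))
```

Halfway through training (step 335) the learning rate is half the base rate, and it reaches zero at step 670.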

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (F32, Safetensors format)

Model tree for MayBashendy/ArabicNewSplits4_WithDuplicationsForScore5_FineTuningAraBERT_run2_AugV5_k13_task2_organization

Fine-tuned from aubmindlab/bert-base-arabertv02.