ArabicNewSplits7_FineTuningAraBERT_run3_AugV5_k19_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02; the fine-tuning dataset is not specified. It achieves the following results on the evaluation set:

  • Loss: 0.8561
  • Qwk (quadratic weighted kappa): 0.3838
  • Mse (mean squared error): 0.8561
  • Rmse (root mean squared error): 0.9252
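For reference, the Qwk metric above is the quadratic weighted kappa, an agreement score commonly used for ordinal scoring tasks such as this one, and Rmse is simply the square root of Mse (√0.8561 ≈ 0.9252, matching the values reported here). A minimal pure-Python sketch of both metrics (function names are illustrative, not from the training code):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Agreement between two ratings, penalizing large ordinal disagreements quadratically."""
    n = len(y_true)
    # Observed confusion matrix.
    conf = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        conf[t][p] += 1
    hist_true = [sum(row) for row in conf]
    hist_pred = [sum(conf[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2      # quadratic weight
            expected = hist_true[i] * hist_pred[j] / n   # chance agreement
            num += w * conf[i][j]
            den += w * expected
    return 1.0 - num / den

def rmse(y_true, y_pred):
    """Root mean squared error between true and predicted scores."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))
```

Perfect agreement yields a kappa of 1.0; systematic maximal disagreement can drive it negative.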

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
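The hyperparameters above can be expressed with the 🤗 Transformers `TrainingArguments` API; a sketch, assuming a standard `Trainer`-based setup (`output_dir` is a placeholder, not from this card):

```python
from transformers import TrainingArguments

# Sketch of the hyperparameters listed above; "./results" is a placeholder path.
training_args = TrainingArguments(
    output_dir="./results",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```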

Training results

The training loss is logged every 500 steps, so the first column reads "No log" until step 500.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0345 2 4.8087 0.0010 4.8087 2.1929
No log 0.0690 4 2.7275 -0.0233 2.7275 1.6515
No log 0.1034 6 1.7450 0.0062 1.7450 1.3210
No log 0.1379 8 1.4470 -0.0211 1.4470 1.2029
No log 0.1724 10 1.2554 0.1273 1.2554 1.1204
No log 0.2069 12 1.2659 0.0677 1.2659 1.1251
No log 0.2414 14 1.4700 0.0169 1.4700 1.2124
No log 0.2759 16 1.7470 0.0 1.7470 1.3218
No log 0.3103 18 1.6484 0.0 1.6484 1.2839
No log 0.3448 20 1.3762 0.0317 1.3762 1.1731
No log 0.3793 22 1.2286 0.1865 1.2286 1.1084
No log 0.4138 24 1.1692 0.3195 1.1692 1.0813
No log 0.4483 26 1.1450 0.2360 1.1450 1.0700
No log 0.4828 28 1.0983 0.2200 1.0983 1.0480
No log 0.5172 30 1.1247 0.2936 1.1247 1.0605
No log 0.5517 32 1.2490 0.1346 1.2490 1.1176
No log 0.5862 34 1.1800 0.1715 1.1800 1.0863
No log 0.6207 36 1.2158 0.1622 1.2158 1.1026
No log 0.6552 38 1.2783 0.1404 1.2783 1.1306
No log 0.6897 40 1.3105 0.0898 1.3105 1.1448
No log 0.7241 42 1.3316 0.0898 1.3316 1.1539
No log 0.7586 44 1.2605 0.1314 1.2605 1.1227
No log 0.7931 46 1.3090 0.1314 1.3090 1.1441
No log 0.8276 48 1.4590 0.0254 1.4590 1.2079
No log 0.8621 50 1.5439 -0.0149 1.5439 1.2425
No log 0.8966 52 1.4195 -0.0066 1.4195 1.1914
No log 0.9310 54 1.3090 0.0575 1.3090 1.1441
No log 0.9655 56 1.1950 0.2298 1.1950 1.0932
No log 1.0 58 1.0729 0.2782 1.0729 1.0358
No log 1.0345 60 1.0644 0.3374 1.0644 1.0317
No log 1.0690 62 1.0245 0.2916 1.0245 1.0122
No log 1.1034 64 1.0276 0.3115 1.0276 1.0137
No log 1.1379 66 1.1065 0.2395 1.1065 1.0519
No log 1.1724 68 1.1391 0.2690 1.1391 1.0673
No log 1.2069 70 1.0760 0.2432 1.0760 1.0373
No log 1.2414 72 0.9918 0.3646 0.9918 0.9959
No log 1.2759 74 0.9426 0.4180 0.9426 0.9709
No log 1.3103 76 0.9560 0.4236 0.9560 0.9778
No log 1.3448 78 0.9403 0.4772 0.9403 0.9697
No log 1.3793 80 0.9074 0.4976 0.9074 0.9526
No log 1.4138 82 0.8773 0.5131 0.8773 0.9366
No log 1.4483 84 0.8644 0.4411 0.8644 0.9298
No log 1.4828 86 0.9528 0.4198 0.9528 0.9761
No log 1.5172 88 0.9795 0.3946 0.9795 0.9897
No log 1.5517 90 0.8816 0.4479 0.8816 0.9389
No log 1.5862 92 0.8419 0.3478 0.8419 0.9176
No log 1.6207 94 0.9078 0.3503 0.9078 0.9528
No log 1.6552 96 0.8782 0.3686 0.8782 0.9371
No log 1.6897 98 0.8247 0.4715 0.8247 0.9081
No log 1.7241 100 1.1604 0.3323 1.1604 1.0772
No log 1.7586 102 1.0020 0.3672 1.0020 1.0010
No log 1.7931 104 0.7911 0.4736 0.7911 0.8894
No log 1.8276 106 0.8697 0.4 0.8697 0.9326
No log 1.8621 108 0.9050 0.3778 0.9050 0.9513
No log 1.8966 110 0.9133 0.3885 0.9133 0.9557
No log 1.9310 112 0.9515 0.3996 0.9515 0.9754
No log 1.9655 114 0.9788 0.3409 0.9788 0.9893
No log 2.0 116 1.0258 0.4508 1.0258 1.0128
No log 2.0345 118 1.0138 0.3219 1.0138 1.0069
No log 2.0690 120 1.0092 0.3509 1.0092 1.0046
No log 2.1034 122 0.9923 0.3387 0.9923 0.9961
No log 2.1379 124 0.9858 0.3948 0.9858 0.9928
No log 2.1724 126 0.9805 0.4352 0.9805 0.9902
No log 2.2069 128 0.9745 0.4548 0.9745 0.9872
No log 2.2414 130 0.9926 0.4258 0.9926 0.9963
No log 2.2759 132 1.1147 0.2560 1.1147 1.0558
No log 2.3103 134 1.1092 0.3602 1.1092 1.0532
No log 2.3448 136 1.0027 0.4158 1.0027 1.0013
No log 2.3793 138 0.9816 0.2850 0.9816 0.9908
No log 2.4138 140 0.9850 0.2877 0.9850 0.9925
No log 2.4483 142 0.9794 0.3396 0.9794 0.9897
No log 2.4828 144 0.9876 0.3886 0.9876 0.9938
No log 2.5172 146 0.9418 0.3804 0.9418 0.9704
No log 2.5517 148 0.9602 0.4358 0.9602 0.9799
No log 2.5862 150 0.9097 0.4593 0.9097 0.9538
No log 2.6207 152 0.9409 0.4515 0.9409 0.9700
No log 2.6552 154 0.8925 0.4798 0.8925 0.9447
No log 2.6897 156 0.9503 0.4492 0.9503 0.9748
No log 2.7241 158 1.0224 0.4645 1.0224 1.0111
No log 2.7586 160 0.8753 0.5125 0.8753 0.9356
No log 2.7931 162 0.8793 0.3961 0.8793 0.9377
No log 2.8276 164 0.9409 0.3747 0.9409 0.9700
No log 2.8621 166 0.9400 0.3931 0.9400 0.9695
No log 2.8966 168 0.9650 0.4672 0.9650 0.9823
No log 2.9310 170 1.0290 0.4219 1.0290 1.0144
No log 2.9655 172 0.9402 0.5246 0.9402 0.9696
No log 3.0 174 0.9191 0.5374 0.9191 0.9587
No log 3.0345 176 0.8639 0.3891 0.8639 0.9294
No log 3.0690 178 0.8692 0.5176 0.8692 0.9323
No log 3.1034 180 0.8848 0.5219 0.8848 0.9407
No log 3.1379 182 0.8192 0.4993 0.8192 0.9051
No log 3.1724 184 0.8970 0.5303 0.8970 0.9471
No log 3.2069 186 1.0781 0.4298 1.0781 1.0383
No log 3.2414 188 1.2194 0.4048 1.2194 1.1042
No log 3.2759 190 1.1160 0.3884 1.1160 1.0564
No log 3.3103 192 1.0671 0.3605 1.0671 1.0330
No log 3.3448 194 1.0511 0.4497 1.0511 1.0252
No log 3.3793 196 0.9530 0.4598 0.9530 0.9762
No log 3.4138 198 0.9221 0.4394 0.9221 0.9603
No log 3.4483 200 0.9018 0.3827 0.9018 0.9496
No log 3.4828 202 0.8715 0.3762 0.8715 0.9335
No log 3.5172 204 0.8949 0.3827 0.8949 0.9460
No log 3.5517 206 0.8996 0.3687 0.8996 0.9485
No log 3.5862 208 0.8959 0.3687 0.8959 0.9465
No log 3.6207 210 0.8726 0.3663 0.8726 0.9341
No log 3.6552 212 0.8889 0.3908 0.8889 0.9428
No log 3.6897 214 0.9465 0.3902 0.9465 0.9729
No log 3.7241 216 1.0210 0.4099 1.0210 1.0105
No log 3.7586 218 1.0191 0.4007 1.0191 1.0095
No log 3.7931 220 0.9548 0.4378 0.9548 0.9772
No log 3.8276 222 0.8683 0.3919 0.8683 0.9318
No log 3.8621 224 0.8365 0.4772 0.8365 0.9146
No log 3.8966 226 1.0201 0.4083 1.0201 1.0100
No log 3.9310 228 1.0473 0.4328 1.0473 1.0234
No log 3.9655 230 0.8592 0.5036 0.8592 0.9269
No log 4.0 232 0.7911 0.5413 0.7911 0.8894
No log 4.0345 234 0.8047 0.5746 0.8047 0.8970
No log 4.0690 236 0.8064 0.5719 0.8064 0.8980
No log 4.1034 238 0.9095 0.5060 0.9095 0.9537
No log 4.1379 240 1.0755 0.4003 1.0755 1.0370
No log 4.1724 242 1.2308 0.4055 1.2308 1.1094
No log 4.2069 244 1.2364 0.4124 1.2364 1.1119
No log 4.2414 246 0.9643 0.4603 0.9643 0.9820
No log 4.2759 248 0.8927 0.4710 0.8927 0.9448
No log 4.3103 250 0.8113 0.5332 0.8113 0.9007
No log 4.3448 252 0.7615 0.5009 0.7615 0.8726
No log 4.3793 254 0.7595 0.4318 0.7595 0.8715
No log 4.4138 256 0.7775 0.4042 0.7775 0.8817
No log 4.4483 258 0.8373 0.4429 0.8373 0.9151
No log 4.4828 260 0.8851 0.3389 0.8851 0.9408
No log 4.5172 262 0.9002 0.3069 0.9002 0.9488
No log 4.5517 264 0.9249 0.2772 0.9249 0.9617
No log 4.5862 266 0.9449 0.3045 0.9449 0.9721
No log 4.6207 268 1.0018 0.2844 1.0018 1.0009
No log 4.6552 270 0.9919 0.3975 0.9919 0.9959
No log 4.6897 272 0.9304 0.5016 0.9304 0.9646
No log 4.7241 274 0.8355 0.3804 0.8355 0.9141
No log 4.7586 276 0.8250 0.3920 0.8250 0.9083
No log 4.7931 278 0.8230 0.3829 0.8230 0.9072
No log 4.8276 280 0.8367 0.4409 0.8367 0.9147
No log 4.8621 282 0.9646 0.4839 0.9646 0.9821
No log 4.8966 284 0.9831 0.5390 0.9831 0.9915
No log 4.9310 286 0.8632 0.5700 0.8632 0.9291
No log 4.9655 288 0.8139 0.4254 0.8139 0.9022
No log 5.0 290 0.8629 0.4234 0.8629 0.9289
No log 5.0345 292 0.8945 0.4606 0.8945 0.9458
No log 5.0690 294 0.8508 0.3925 0.8508 0.9224
No log 5.1034 296 0.8838 0.4369 0.8838 0.9401
No log 5.1379 298 0.9596 0.4382 0.9596 0.9796
No log 5.1724 300 0.9458 0.4513 0.9458 0.9725
No log 5.2069 302 0.8457 0.3771 0.8457 0.9196
No log 5.2414 304 0.8049 0.3987 0.8049 0.8972
No log 5.2759 306 0.7931 0.4527 0.7931 0.8906
No log 5.3103 308 0.8043 0.5356 0.8043 0.8968
No log 5.3448 310 0.7763 0.6085 0.7763 0.8811
No log 5.3793 312 0.8391 0.5732 0.8391 0.9160
No log 5.4138 314 0.9130 0.4598 0.9130 0.9555
No log 5.4483 316 0.8781 0.4991 0.8781 0.9371
No log 5.4828 318 0.8294 0.4877 0.8294 0.9107
No log 5.5172 320 0.8491 0.4541 0.8491 0.9215
No log 5.5517 322 0.8830 0.4159 0.8830 0.9397
No log 5.5862 324 0.9209 0.3652 0.9209 0.9596
No log 5.6207 326 1.0213 0.3141 1.0213 1.0106
No log 5.6552 328 1.1249 0.2132 1.1249 1.0606
No log 5.6897 330 1.1646 0.2390 1.1646 1.0792
No log 5.7241 332 1.1571 0.1844 1.1571 1.0757
No log 5.7586 334 1.1686 0.1795 1.1686 1.0810
No log 5.7931 336 1.1229 0.1498 1.1229 1.0597
No log 5.8276 338 1.0534 0.1500 1.0534 1.0264
No log 5.8621 340 1.0010 0.1860 1.0010 1.0005
No log 5.8966 342 0.9729 0.3493 0.9729 0.9864
No log 5.9310 344 0.9664 0.3933 0.9664 0.9831
No log 5.9655 346 0.9720 0.2574 0.9720 0.9859
No log 6.0 348 1.0570 0.2417 1.0570 1.0281
No log 6.0345 350 1.2012 0.3512 1.2012 1.0960
No log 6.0690 352 1.2250 0.3833 1.2250 1.1068
No log 6.1034 354 1.1470 0.3621 1.1470 1.0710
No log 6.1379 356 1.0140 0.3299 1.0140 1.0070
No log 6.1724 358 0.9402 0.3139 0.9402 0.9696
No log 6.2069 360 0.9267 0.3336 0.9267 0.9626
No log 6.2414 362 0.9225 0.3336 0.9225 0.9605
No log 6.2759 364 0.9134 0.3430 0.9134 0.9557
No log 6.3103 366 0.9355 0.3200 0.9355 0.9672
No log 6.3448 368 0.9201 0.3261 0.9201 0.9592
No log 6.3793 370 0.8745 0.4069 0.8745 0.9351
No log 6.4138 372 0.8687 0.4267 0.8687 0.9320
No log 6.4483 374 0.8738 0.5247 0.8738 0.9348
No log 6.4828 376 0.8319 0.4690 0.8319 0.9121
No log 6.5172 378 0.8329 0.4496 0.8329 0.9126
No log 6.5517 380 0.8475 0.4125 0.8475 0.9206
No log 6.5862 382 0.8358 0.4125 0.8358 0.9142
No log 6.6207 384 0.8266 0.3811 0.8266 0.9092
No log 6.6552 386 0.8378 0.4048 0.8378 0.9153
No log 6.6897 388 0.9086 0.4164 0.9086 0.9532
No log 6.7241 390 1.0117 0.4221 1.0117 1.0058
No log 6.7586 392 1.0285 0.4003 1.0285 1.0142
No log 6.7931 394 0.9536 0.3931 0.9536 0.9765
No log 6.8276 396 0.9015 0.3493 0.9015 0.9495
No log 6.8621 398 0.8975 0.3493 0.8975 0.9473
No log 6.8966 400 0.9031 0.3298 0.9031 0.9503
No log 6.9310 402 0.9406 0.4165 0.9406 0.9698
No log 6.9655 404 0.9438 0.4165 0.9438 0.9715
No log 7.0 406 0.8973 0.3335 0.8973 0.9473
No log 7.0345 408 0.8934 0.3380 0.8934 0.9452
No log 7.0690 410 0.8936 0.3522 0.8936 0.9453
No log 7.1034 412 0.9066 0.3493 0.9066 0.9521
No log 7.1379 414 0.8932 0.3632 0.8932 0.9451
No log 7.1724 416 0.8878 0.3609 0.8878 0.9423
No log 7.2069 418 0.9058 0.3920 0.9058 0.9517
No log 7.2414 420 0.9374 0.3802 0.9374 0.9682
No log 7.2759 422 0.9100 0.4164 0.9100 0.9540
No log 7.3103 424 0.9008 0.3861 0.9008 0.9491
No log 7.3448 426 0.9286 0.3249 0.9286 0.9637
No log 7.3793 428 0.9545 0.3595 0.9545 0.9770
No log 7.4138 430 0.9787 0.4641 0.9787 0.9893
No log 7.4483 432 0.9601 0.4633 0.9601 0.9798
No log 7.4828 434 0.9715 0.4397 0.9715 0.9856
No log 7.5172 436 0.9234 0.3908 0.9234 0.9609
No log 7.5517 438 0.8890 0.3941 0.8890 0.9429
No log 7.5862 440 0.8748 0.3886 0.8748 0.9353
No log 7.6207 442 0.9145 0.4532 0.9145 0.9563
No log 7.6552 444 1.0361 0.4065 1.0361 1.0179
No log 7.6897 446 1.1170 0.3833 1.1170 1.0569
No log 7.7241 448 1.1068 0.3697 1.1068 1.0521
No log 7.7586 450 0.9965 0.3119 0.9965 0.9982
No log 7.7931 452 0.9315 0.3478 0.9315 0.9652
No log 7.8276 454 0.9365 0.3478 0.9365 0.9677
No log 7.8621 456 0.9852 0.2721 0.9852 0.9926
No log 7.8966 458 1.1383 0.3220 1.1383 1.0669
No log 7.9310 460 1.2363 0.2886 1.2363 1.1119
No log 7.9655 462 1.1863 0.3205 1.1863 1.0892
No log 8.0 464 1.0549 0.4096 1.0549 1.0271
No log 8.0345 466 0.9169 0.4700 0.9169 0.9576
No log 8.0690 468 0.8831 0.4450 0.8831 0.9397
No log 8.1034 470 0.8753 0.5107 0.8753 0.9356
No log 8.1379 472 0.9218 0.4681 0.9218 0.9601
No log 8.1724 474 0.9704 0.4539 0.9704 0.9851
No log 8.2069 476 0.9523 0.4886 0.9523 0.9758
No log 8.2414 478 0.8853 0.5291 0.8853 0.9409
No log 8.2759 480 0.8338 0.4308 0.8338 0.9131
No log 8.3103 482 0.8123 0.3979 0.8123 0.9013
No log 8.3448 484 0.8061 0.4119 0.8061 0.8978
No log 8.3793 486 0.8129 0.3979 0.8129 0.9016
No log 8.4138 488 0.8772 0.4983 0.8772 0.9366
No log 8.4483 490 0.9686 0.4426 0.9686 0.9842
No log 8.4828 492 0.9658 0.4751 0.9658 0.9827
No log 8.5172 494 0.9407 0.4663 0.9407 0.9699
No log 8.5517 496 0.9486 0.4131 0.9486 0.9740
No log 8.5862 498 0.9829 0.3528 0.9829 0.9914
0.3818 8.6207 500 1.0342 0.4065 1.0342 1.0170
0.3818 8.6552 502 1.0405 0.4186 1.0405 1.0201
0.3818 8.6897 504 0.9492 0.3785 0.9492 0.9743
0.3818 8.7241 506 0.8683 0.4104 0.8683 0.9318
0.3818 8.7586 508 0.8297 0.3943 0.8297 0.9109
0.3818 8.7931 510 0.8249 0.3838 0.8249 0.9082
0.3818 8.8276 512 0.8636 0.4045 0.8636 0.9293
0.3818 8.8621 514 0.8727 0.3652 0.8727 0.9342
0.3818 8.8966 516 0.8561 0.3838 0.8561 0.9252

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
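A minimal usage sketch with the pinned Transformers version above (the model id is taken from this card; downloading the weights requires network access, and the sample input text is illustrative):

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run3_AugV5_k19_task2_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# Illustrative Arabic input; replace with the text to be scored.
inputs = tokenizer("نص تجريبي", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
print(outputs.logits)
```

Given the MSE/RMSE-based evaluation, the head is likely a regression-style scorer, so the raw logits are the predicted score rather than class probabilities.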
Model size: 0.1B params (Safetensors, tensor type F32)
Model tree for MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run3_AugV5_k19_task2_organization

  • Finetuned from aubmindlab/bert-base-arabertv02