ArabicNewSplits6_WithDuplicationsForScore5_FineTuningAraBERT_run2_AugV5_k8_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the dataset name was not recorded when this card was generated). It achieves the following results on the evaluation set:

  • Loss: 0.9505
  • Qwk (quadratic weighted kappa): 0.4596
  • Mse (mean squared error): 0.9505
  • Rmse (root mean squared error): 0.9749
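For reference, QWK here is Cohen's kappa with quadratic disagreement weights (as computed by, e.g., `sklearn.metrics.cohen_kappa_score(..., weights="quadratic")`). A minimal pure-Python sketch of the three evaluation metrics — the function names are illustrative, not taken from the training code:

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic disagreement weights."""
    # Observed confusion matrix
    obs = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        obs[t][p] += 1
    n = len(y_true)
    # Marginal label histograms
    hist_true = [sum(row) for row in obs]
    hist_pred = [sum(obs[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2   # quadratic penalty
            expected = hist_true[i] * hist_pred[j] / n  # chance agreement
            num += w * obs[i][j]
            den += w * expected
    return 1.0 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    return math.sqrt(mse(y_true, y_pred))
```

Note that Loss and Mse are identical in every reported row (e.g. both 0.9505 on the final evaluation), which suggests the model is trained as a regression head with an MSE objective.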

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
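With `lr_scheduler_type: linear` and no warmup, the learning rate decays linearly from 2e-05 to 0 over the run (470 optimizer steps, per the final row of the results table below). A sketch of that schedule, in the spirit of transformers' `get_linear_schedule_with_warmup` — the function name and the inferred `total_steps` value are assumptions, not from the training code:

```python
def linear_lr(step, base_lr=2e-5, total_steps=470, warmup_steps=0):
    """Linear warmup followed by linear decay to zero."""
    if step < warmup_steps:
        # Ramp up from 0 to base_lr during warmup
        return base_lr * step / max(1, warmup_steps)
    # Decay linearly from base_lr at the end of warmup to 0 at total_steps
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))
```

Halfway through training (step 235) the learning rate has therefore dropped to 1e-05.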

Training results

The training-loss column reads "No log" throughout because the run's 470 optimizer steps never reached the Trainer's default logging interval (logging_steps=500).

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0426 2 3.9761 -0.0187 3.9761 1.9940
No log 0.0851 4 2.0464 0.0737 2.0464 1.4305
No log 0.1277 6 0.9765 0.0569 0.9765 0.9882
No log 0.1702 8 0.7144 0.2110 0.7144 0.8452
No log 0.2128 10 0.7571 0.1225 0.7571 0.8701
No log 0.2553 12 0.8195 0.1698 0.8195 0.9053
No log 0.2979 14 1.1133 0.1308 1.1133 1.0551
No log 0.3404 16 1.0867 0.1675 1.0867 1.0425
No log 0.3830 18 0.8013 0.2408 0.8013 0.8952
No log 0.4255 20 0.8185 0.1118 0.8185 0.9047
No log 0.4681 22 0.9813 0.2443 0.9813 0.9906
No log 0.5106 24 0.6403 0.3449 0.6403 0.8002
No log 0.5532 26 0.6346 0.2189 0.6346 0.7966
No log 0.5957 28 0.6872 0.1826 0.6872 0.8290
No log 0.6383 30 0.6202 0.2526 0.6202 0.7875
No log 0.6809 32 0.5650 0.4336 0.5650 0.7516
No log 0.7234 34 0.6857 0.4448 0.6857 0.8281
No log 0.7660 36 0.9922 0.2855 0.9922 0.9961
No log 0.8085 38 1.0016 0.2736 1.0016 1.0008
No log 0.8511 40 0.7039 0.4392 0.7039 0.8390
No log 0.8936 42 0.5592 0.4671 0.5592 0.7478
No log 0.9362 44 0.5489 0.4882 0.5489 0.7409
No log 0.9787 46 0.5497 0.4882 0.5497 0.7414
No log 1.0213 48 0.5542 0.4623 0.5542 0.7444
No log 1.0638 50 0.5491 0.4983 0.5491 0.7410
No log 1.1064 52 0.5622 0.4691 0.5622 0.7498
No log 1.1489 54 0.5614 0.4393 0.5614 0.7493
No log 1.1915 56 0.5722 0.4343 0.5722 0.7564
No log 1.2340 58 0.5395 0.4972 0.5395 0.7345
No log 1.2766 60 0.5979 0.5052 0.5979 0.7733
No log 1.3191 62 0.6815 0.4315 0.6815 0.8256
No log 1.3617 64 0.6887 0.4200 0.6887 0.8299
No log 1.4043 66 0.6243 0.5278 0.6243 0.7901
No log 1.4468 68 0.5869 0.5000 0.5869 0.7661
No log 1.4894 70 0.6193 0.4958 0.6193 0.7869
No log 1.5319 72 0.5765 0.5063 0.5765 0.7593
No log 1.5745 74 0.5867 0.5287 0.5867 0.7660
No log 1.6170 76 0.7417 0.4724 0.7417 0.8612
No log 1.6596 78 0.7467 0.4719 0.7467 0.8641
No log 1.7021 80 0.6583 0.5482 0.6583 0.8113
No log 1.7447 82 0.6298 0.5391 0.6298 0.7936
No log 1.7872 84 0.6774 0.5637 0.6774 0.8230
No log 1.8298 86 0.7556 0.5628 0.7556 0.8693
No log 1.8723 88 0.8246 0.5472 0.8246 0.9081
No log 1.9149 90 0.9036 0.4737 0.9036 0.9506
No log 1.9574 92 0.9621 0.5000 0.9621 0.9809
No log 2.0000 94 1.0507 0.4957 1.0507 1.0250
No log 2.0426 96 0.9224 0.4904 0.9224 0.9604
No log 2.0851 98 0.8412 0.5230 0.8412 0.9172
No log 2.1277 100 0.8124 0.4998 0.8124 0.9013
No log 2.1702 102 1.0391 0.4738 1.0391 1.0194
No log 2.2128 104 1.2918 0.4021 1.2918 1.1366
No log 2.2553 106 1.1339 0.4194 1.1339 1.0649
No log 2.2979 108 0.7759 0.5369 0.7759 0.8809
No log 2.3404 110 0.7200 0.5286 0.7200 0.8485
No log 2.3830 112 0.6786 0.4848 0.6786 0.8238
No log 2.4255 114 0.7063 0.5116 0.7063 0.8404
No log 2.4681 116 0.7715 0.5293 0.7715 0.8783
No log 2.5106 118 0.9963 0.4496 0.9963 0.9982
No log 2.5532 120 1.2310 0.4276 1.2310 1.1095
No log 2.5957 122 1.1213 0.4397 1.1213 1.0589
No log 2.6383 124 0.9661 0.4799 0.9661 0.9829
No log 2.6809 126 0.7975 0.5709 0.7975 0.8930
No log 2.7234 128 0.7648 0.4898 0.7648 0.8745
No log 2.7660 130 0.7818 0.5367 0.7818 0.8842
No log 2.8085 132 0.8752 0.5510 0.8752 0.9355
No log 2.8511 134 0.9244 0.5362 0.9244 0.9615
No log 2.8936 136 0.9019 0.4907 0.9019 0.9497
No log 2.9362 138 1.0200 0.4521 1.0200 1.0099
No log 2.9787 140 1.0408 0.4773 1.0408 1.0202
No log 3.0213 142 1.1227 0.4684 1.1227 1.0596
No log 3.0638 144 1.4000 0.4178 1.4000 1.1832
No log 3.1064 146 1.3764 0.4092 1.3764 1.1732
No log 3.1489 148 1.1932 0.4607 1.1932 1.0923
No log 3.1915 150 1.0287 0.4906 1.0287 1.0143
No log 3.2340 152 0.9923 0.4919 0.9923 0.9961
No log 3.2766 154 0.9661 0.4975 0.9661 0.9829
No log 3.3191 156 1.0551 0.4698 1.0551 1.0272
No log 3.3617 158 1.1023 0.4594 1.1023 1.0499
No log 3.4043 160 1.0040 0.4443 1.0040 1.0020
No log 3.4468 162 0.8784 0.5404 0.8784 0.9372
No log 3.4894 164 0.8427 0.5159 0.8427 0.9180
No log 3.5319 166 0.8594 0.5085 0.8594 0.9270
No log 3.5745 168 0.9049 0.5086 0.9049 0.9513
No log 3.6170 170 1.0355 0.4524 1.0355 1.0176
No log 3.6596 172 1.1184 0.4491 1.1184 1.0576
No log 3.7021 174 1.0766 0.4584 1.0766 1.0376
No log 3.7447 176 1.0200 0.4637 1.0200 1.0099
No log 3.7872 178 0.9403 0.4581 0.9403 0.9697
No log 3.8298 180 0.9396 0.4550 0.9396 0.9693
No log 3.8723 182 0.9805 0.4758 0.9805 0.9902
No log 3.9149 184 1.0591 0.4611 1.0591 1.0291
No log 3.9574 186 1.0787 0.4612 1.0787 1.0386
No log 4.0000 188 1.0403 0.4517 1.0403 1.0199
No log 4.0426 190 1.0193 0.4547 1.0193 1.0096
No log 4.0851 192 0.9925 0.4550 0.9925 0.9962
No log 4.1277 194 1.0205 0.4524 1.0205 1.0102
No log 4.1702 196 0.9690 0.4593 0.9690 0.9844
No log 4.2128 198 0.9060 0.4601 0.9060 0.9518
No log 4.2553 200 0.8301 0.5023 0.8301 0.9111
No log 4.2979 202 0.8531 0.5023 0.8531 0.9236
No log 4.3404 204 0.9610 0.4613 0.9610 0.9803
No log 4.3830 206 1.1537 0.4362 1.1537 1.0741
No log 4.4255 208 1.2413 0.4168 1.2413 1.1142
No log 4.4681 210 1.2276 0.4380 1.2276 1.1080
No log 4.5106 212 1.1127 0.4265 1.1127 1.0548
No log 4.5532 214 1.0481 0.4516 1.0481 1.0237
No log 4.5957 216 1.0330 0.4663 1.0330 1.0163
No log 4.6383 218 1.1415 0.4586 1.1415 1.0684
No log 4.6809 220 1.3317 0.4360 1.3317 1.1540
No log 4.7234 222 1.3412 0.4336 1.3412 1.1581
No log 4.7660 224 1.0853 0.4146 1.0853 1.0418
No log 4.8085 226 0.7964 0.4946 0.7964 0.8924
No log 4.8511 228 0.6745 0.5803 0.6745 0.8213
No log 4.8936 230 0.6675 0.6080 0.6675 0.8170
No log 4.9362 232 0.7032 0.5656 0.7032 0.8385
No log 4.9787 234 0.7784 0.5282 0.7784 0.8823
No log 5.0213 236 0.8963 0.5105 0.8963 0.9467
No log 5.0638 238 0.9883 0.4558 0.9883 0.9941
No log 5.1064 240 1.1432 0.4528 1.1432 1.0692
No log 5.1489 242 1.1547 0.4483 1.1547 1.0746
No log 5.1915 244 0.9952 0.4919 0.9952 0.9976
No log 5.2340 246 0.9250 0.5123 0.9250 0.9618
No log 5.2766 248 0.9716 0.4828 0.9716 0.9857
No log 5.3191 250 1.1316 0.4453 1.1316 1.0638
No log 5.3617 252 1.2507 0.4426 1.2507 1.1184
No log 5.4043 254 1.2496 0.4052 1.2496 1.1178
No log 5.4468 256 1.1207 0.4338 1.1207 1.0586
No log 5.4894 258 0.9241 0.4916 0.9241 0.9613
No log 5.5319 260 0.8076 0.5155 0.8076 0.8987
No log 5.5745 262 0.7813 0.5155 0.7813 0.8839
No log 5.6170 264 0.8169 0.4881 0.8169 0.9038
No log 5.6596 266 0.8781 0.4615 0.8781 0.9371
No log 5.7021 268 1.0031 0.4169 1.0031 1.0016
No log 5.7447 270 1.0959 0.4234 1.0959 1.0469
No log 5.7872 272 1.0838 0.4368 1.0838 1.0411
No log 5.8298 274 1.0073 0.4531 1.0073 1.0036
No log 5.8723 276 1.0181 0.4388 1.0181 1.0090
No log 5.9149 278 1.0434 0.4483 1.0434 1.0215
No log 5.9574 280 0.9614 0.4386 0.9614 0.9805
No log 6.0000 282 0.8534 0.4787 0.8534 0.9238
No log 6.0426 284 0.8220 0.4875 0.8220 0.9066
No log 6.0851 286 0.8428 0.4885 0.8428 0.9180
No log 6.1277 288 0.8862 0.4572 0.8862 0.9414
No log 6.1702 290 0.9910 0.4411 0.9910 0.9955
No log 6.2128 292 1.0369 0.4461 1.0369 1.0183
No log 6.2553 294 1.0495 0.4517 1.0495 1.0245
No log 6.2979 296 1.0582 0.4505 1.0582 1.0287
No log 6.3404 298 1.0148 0.4624 1.0148 1.0074
No log 6.3830 300 0.9392 0.4658 0.9392 0.9691
No log 6.4255 302 0.9024 0.5014 0.9024 0.9500
No log 6.4681 304 0.8969 0.5007 0.8969 0.9470
No log 6.5106 306 0.9152 0.4460 0.9152 0.9567
No log 6.5532 308 0.9782 0.4374 0.9782 0.9890
No log 6.5957 310 0.9517 0.4315 0.9517 0.9755
No log 6.6383 312 0.8706 0.4868 0.8706 0.9331
No log 6.6809 314 0.8487 0.5029 0.8487 0.9212
No log 6.7234 316 0.8719 0.4877 0.8719 0.9338
No log 6.7660 318 0.9346 0.4456 0.9346 0.9667
No log 6.8085 320 0.9939 0.4511 0.9939 0.9969
No log 6.8511 322 1.0438 0.4566 1.0438 1.0217
No log 6.8936 324 1.0487 0.4619 1.0487 1.0241
No log 6.9362 326 1.0938 0.4564 1.0938 1.0458
No log 6.9787 328 1.0629 0.4569 1.0629 1.0310
No log 7.0213 330 0.9847 0.4581 0.9847 0.9923
No log 7.0638 332 0.9045 0.4985 0.9045 0.9510
No log 7.1064 334 0.8603 0.5159 0.8603 0.9275
No log 7.1489 336 0.8193 0.5300 0.8193 0.9052
No log 7.1915 338 0.8187 0.5300 0.8187 0.9048
No log 7.2340 340 0.8521 0.5286 0.8521 0.9231
No log 7.2766 342 0.9143 0.4993 0.9143 0.9562
No log 7.3191 344 0.9822 0.4464 0.9822 0.9911
No log 7.3617 346 1.0626 0.4564 1.0626 1.0308
No log 7.4043 348 1.0925 0.4550 1.0925 1.0452
No log 7.4468 350 1.0837 0.4515 1.0837 1.0410
No log 7.4894 352 1.0146 0.4623 1.0146 1.0073
No log 7.5319 354 0.9632 0.4633 0.9632 0.9814
No log 7.5745 356 0.9326 0.4659 0.9326 0.9657
No log 7.6170 358 0.9147 0.4740 0.9147 0.9564
No log 7.6596 360 0.9271 0.4677 0.9271 0.9628
No log 7.7021 362 0.9271 0.4677 0.9271 0.9629
No log 7.7447 364 0.9457 0.4677 0.9457 0.9725
No log 7.7872 366 0.9681 0.4708 0.9681 0.9839
No log 7.8298 368 1.0093 0.4798 1.0093 1.0046
No log 7.8723 370 1.0644 0.4631 1.0644 1.0317
No log 7.9149 372 1.1132 0.4429 1.1132 1.0551
No log 7.9574 374 1.1353 0.4428 1.1353 1.0655
No log 8.0000 376 1.1192 0.4428 1.1192 1.0579
No log 8.0426 378 1.0477 0.4627 1.0477 1.0236
No log 8.0851 380 0.9715 0.4695 0.9715 0.9857
No log 8.1277 382 0.9348 0.4684 0.9348 0.9669
No log 8.1702 384 0.9388 0.4625 0.9388 0.9689
No log 8.2128 386 0.9691 0.4640 0.9691 0.9844
No log 8.2553 388 0.9967 0.4637 0.9967 0.9984
No log 8.2979 390 1.0265 0.4688 1.0265 1.0132
No log 8.3404 392 1.0302 0.4688 1.0302 1.0150
No log 8.3830 394 1.0185 0.4637 1.0185 1.0092
No log 8.4255 396 0.9902 0.4637 0.9902 0.9951
No log 8.4681 398 0.9702 0.4584 0.9702 0.9850
No log 8.5106 400 0.9716 0.4584 0.9716 0.9857
No log 8.5532 402 0.9535 0.4587 0.9535 0.9765
No log 8.5957 404 0.9491 0.4587 0.9491 0.9742
No log 8.6383 406 0.9562 0.4587 0.9562 0.9779
No log 8.6809 408 0.9607 0.4587 0.9607 0.9801
No log 8.7234 410 0.9713 0.4637 0.9713 0.9855
No log 8.7660 412 0.9766 0.4637 0.9766 0.9882
No log 8.8085 414 0.9719 0.4637 0.9719 0.9859
No log 8.8511 416 0.9572 0.4584 0.9572 0.9784
No log 8.8936 418 0.9640 0.4637 0.9640 0.9818
No log 8.9362 420 0.9934 0.4586 0.9934 0.9967
No log 8.9787 422 1.0164 0.4586 1.0164 1.0082
No log 9.0213 424 1.0406 0.4637 1.0406 1.0201
No log 9.0638 426 1.0487 0.4634 1.0487 1.0241
No log 9.1064 428 1.0409 0.4637 1.0409 1.0203
No log 9.1489 430 1.0257 0.4637 1.0257 1.0128
No log 9.1915 432 0.9999 0.4598 0.9999 1.0000
No log 9.2340 434 0.9866 0.4598 0.9866 0.9933
No log 9.2766 436 0.9826 0.4598 0.9826 0.9913
No log 9.3191 438 0.9717 0.4598 0.9717 0.9857
No log 9.3617 440 0.9689 0.4598 0.9689 0.9843
No log 9.4043 442 0.9792 0.4598 0.9792 0.9896
No log 9.4468 444 0.9829 0.4598 0.9829 0.9914
No log 9.4894 446 0.9775 0.4586 0.9775 0.9887
No log 9.5319 448 0.9743 0.4637 0.9743 0.9871
No log 9.5745 450 0.9692 0.4648 0.9692 0.9845
No log 9.6170 452 0.9634 0.4648 0.9634 0.9815
No log 9.6596 454 0.9590 0.4648 0.9590 0.9793
No log 9.7021 456 0.9573 0.4596 0.9573 0.9784
No log 9.7447 458 0.9554 0.4596 0.9554 0.9774
No log 9.7872 460 0.9508 0.4596 0.9508 0.9751
No log 9.8298 462 0.9485 0.4596 0.9485 0.9739
No log 9.8723 464 0.9490 0.4596 0.9490 0.9742
No log 9.9149 466 0.9492 0.4596 0.9492 0.9743
No log 9.9574 468 0.9502 0.4596 0.9502 0.9748
No log 10.0000 470 0.9505 0.4596 0.9505 0.9749

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
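Assuming a CUDA 11.8 environment to match the `+cu118` build, the pinned stack above can be installed with (the index URL is the standard PyTorch wheel index; adjust for your CUDA version):

```shell
pip install --index-url https://download.pytorch.org/whl/cu118 "torch==2.4.0"
pip install "transformers==4.44.2" "datasets==2.21.0" "tokenizers==0.19.1"
```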