ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k3_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 (the training dataset is not specified in this card). It achieves the following results on the evaluation set:

  • Loss: 1.0832
  • Qwk: 0.6466
  • Mse: 1.0832
  • Rmse: 1.0408
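The metric names suggest the task is scored as ordinal prediction: Qwk is quadratic weighted kappa, and Loss equaling Mse hints at an MSE regression objective, though the card does not include the evaluation code. A minimal stdlib sketch of how these three metrics are computed (scikit-learn's `cohen_kappa_score(..., weights="quadratic")` yields the same kappa):

```python
from math import sqrt

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic weighted kappa (the Qwk metric above); 1.0 = perfect agreement."""
    n = len(y_true)
    obs = [[0.0] * n_classes for _ in range(n_classes)]   # observed (true, pred) counts
    for t, p in zip(y_true, y_pred):
        obs[t][p] += 1
    hist_t = [sum(row) for row in obs]                    # true-label histogram
    hist_p = [sum(obs[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2       # quadratic disagreement penalty
            num += w * obs[i][j]                          # observed weighted disagreement
            den += w * hist_t[i] * hist_p[j] / n          # chance-expected disagreement
    return 1.0 - num / den

def mse_rmse(y_true, y_pred):
    """Mean squared error and its square root (the Mse and Rmse columns)."""
    mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
    return mse, sqrt(mse)
```

Note that Rmse is always the square root of Mse, which matches every row of the results table below.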

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
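These hyperparameters correspond to a standard Transformers `TrainingArguments` setup. The linear scheduler decays the learning rate from 2e-05 down to 0 over the total number of optimizer steps; a minimal stdlib sketch of that decay (zero warmup assumed, and 512 total steps is taken from the last row of the results table below, not stated explicitly in the card):

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Learning rate at a given optimizer step under a linear schedule
    (a sketch of Transformers' get_linear_schedule_with_warmup)."""
    if step < warmup_steps:
        # linear ramp up during warmup
        return base_lr * step / max(1, warmup_steps)
    # linear decay from base_lr at end of warmup to 0 at total_steps
    remaining = (total_steps - step) / max(1, total_steps - warmup_steps)
    return base_lr * max(0.0, remaining)
```

For example, with 512 total steps the rate starts at 2e-05, is 1e-05 halfway through, and reaches 0 at the final step.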

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.1176 2 7.2918 -0.0164 7.2918 2.7003
No log 0.2353 4 4.5873 0.0750 4.5873 2.1418
No log 0.3529 6 2.9617 0.0476 2.9617 1.7210
No log 0.4706 8 2.6630 0.0680 2.6630 1.6319
No log 0.5882 10 1.9982 0.1639 1.9982 1.4136
No log 0.7059 12 1.7065 0.2018 1.7065 1.3063
No log 0.8235 14 2.0658 0.1864 2.0658 1.4373
No log 0.9412 16 1.9612 0.1273 1.9612 1.4004
No log 1.0588 18 2.1019 0.1197 2.1019 1.4498
No log 1.1765 20 1.9555 0.1404 1.9555 1.3984
No log 1.2941 22 1.6144 0.1682 1.6145 1.2706
No log 1.4118 24 1.4594 0.3333 1.4594 1.2080
No log 1.5294 26 1.5919 0.2241 1.5919 1.2617
No log 1.6471 28 1.9184 0.24 1.9184 1.3851
No log 1.7647 30 1.8522 0.2791 1.8522 1.3610
No log 1.8824 32 1.3590 0.4065 1.3590 1.1658
No log 2.0 34 1.3395 0.3761 1.3395 1.1574
No log 2.1176 36 1.3804 0.3802 1.3804 1.1749
No log 2.2353 38 1.4725 0.368 1.4725 1.2135
No log 2.3529 40 1.6013 0.3256 1.6013 1.2654
No log 2.4706 42 1.5477 0.3411 1.5477 1.2441
No log 2.5882 44 1.7183 0.2857 1.7183 1.3109
No log 2.7059 46 1.7798 0.2370 1.7798 1.3341
No log 2.8235 48 1.6415 0.3101 1.6415 1.2812
No log 2.9412 50 1.7638 0.2645 1.7638 1.3281
No log 3.0588 52 2.2968 0.1844 2.2968 1.5155
No log 3.1765 54 2.0177 0.1639 2.0177 1.4205
No log 3.2941 56 1.6028 0.2810 1.6028 1.2660
No log 3.4118 58 1.8068 0.2609 1.8068 1.3442
No log 3.5294 60 1.8180 0.2734 1.8180 1.3483
No log 3.6471 62 1.6505 0.3000 1.6505 1.2847
No log 3.7647 64 1.4764 0.4060 1.4764 1.2151
No log 3.8824 66 1.2998 0.4252 1.2998 1.1401
No log 4.0 68 1.2250 0.4320 1.2250 1.1068
No log 4.1176 70 1.2173 0.4444 1.2173 1.1033
No log 4.2353 72 1.2222 0.5197 1.2222 1.1055
No log 4.3529 74 1.3776 0.4567 1.3776 1.1737
No log 4.4706 76 1.3263 0.4769 1.3263 1.1517
No log 4.5882 78 1.1896 0.5385 1.1896 1.0907
No log 4.7059 80 1.2164 0.5294 1.2164 1.1029
No log 4.8235 82 1.2688 0.5797 1.2688 1.1264
No log 4.9412 84 1.1520 0.5185 1.1520 1.0733
No log 5.0588 86 1.1681 0.4769 1.1681 1.0808
No log 5.1765 88 1.2303 0.4806 1.2303 1.1092
No log 5.2941 90 1.1714 0.5 1.1714 1.0823
No log 5.4118 92 1.1519 0.5224 1.1519 1.0732
No log 5.5294 94 1.1936 0.5672 1.1936 1.0925
No log 5.6471 96 1.2529 0.5735 1.2529 1.1193
No log 5.7647 98 1.0895 0.5714 1.0895 1.0438
No log 5.8824 100 1.0624 0.5303 1.0624 1.0307
No log 6.0 102 1.1182 0.5890 1.1182 1.0575
No log 6.1176 104 1.0238 0.6029 1.0238 1.0118
No log 6.2353 106 0.9703 0.5970 0.9703 0.9850
No log 6.3529 108 0.9715 0.6074 0.9715 0.9857
No log 6.4706 110 0.9265 0.6364 0.9265 0.9625
No log 6.5882 112 0.9003 0.6567 0.9003 0.9488
No log 6.7059 114 0.9540 0.6716 0.9540 0.9767
No log 6.8235 116 0.8992 0.6567 0.8992 0.9482
No log 6.9412 118 0.9010 0.6567 0.9010 0.9492
No log 7.0588 120 0.9300 0.6466 0.9300 0.9643
No log 7.1765 122 1.1290 0.5674 1.1290 1.0625
No log 7.2941 124 1.1887 0.4853 1.1887 1.0903
No log 7.4118 126 1.0994 0.5385 1.0994 1.0485
No log 7.5294 128 1.0085 0.5714 1.0085 1.0042
No log 7.6471 130 1.0138 0.5469 1.0138 1.0069
No log 7.7647 132 1.0193 0.5440 1.0193 1.0096
No log 7.8824 134 1.1863 0.5113 1.1863 1.0892
No log 8.0 136 1.3003 0.5395 1.3003 1.1403
No log 8.1176 138 1.1796 0.5395 1.1796 1.0861
No log 8.2353 140 0.9868 0.6577 0.9868 0.9934
No log 8.3529 142 0.8680 0.6269 0.8680 0.9317
No log 8.4706 144 0.8777 0.6212 0.8777 0.9368
No log 8.5882 146 0.9205 0.5873 0.9205 0.9594
No log 8.7059 148 0.9912 0.5806 0.9912 0.9956
No log 8.8235 150 0.9882 0.5691 0.9882 0.9941
No log 8.9412 152 1.0008 0.5397 1.0008 1.0004
No log 9.0588 154 1.0177 0.5891 1.0177 1.0088
No log 9.1765 156 1.0642 0.5760 1.0642 1.0316
No log 9.2941 158 1.1601 0.5429 1.1601 1.0771
No log 9.4118 160 1.1563 0.5429 1.1563 1.0753
No log 9.5294 162 1.0605 0.5556 1.0605 1.0298
No log 9.6471 164 1.0680 0.5854 1.0680 1.0334
No log 9.7647 166 1.0850 0.5500 1.0850 1.0416
No log 9.8824 168 1.0623 0.5854 1.0623 1.0307
No log 10.0 170 1.0643 0.5760 1.0643 1.0317
No log 10.1176 172 1.0801 0.5760 1.0801 1.0393
No log 10.2353 174 1.0305 0.6107 1.0305 1.0152
No log 10.3529 176 1.0461 0.6107 1.0461 1.0228
No log 10.4706 178 1.1066 0.5714 1.1066 1.0519
No log 10.5882 180 1.0170 0.6061 1.0170 1.0085
No log 10.7059 182 0.9684 0.6260 0.9684 0.9841
No log 10.8235 184 0.9598 0.6565 0.9598 0.9797
No log 10.9412 186 0.9610 0.6154 0.9610 0.9803
No log 11.0588 188 1.0093 0.6061 1.0093 1.0047
No log 11.1765 190 1.1914 0.5333 1.1914 1.0915
No log 11.2941 192 1.2011 0.4923 1.2011 1.0960
No log 11.4118 194 1.1292 0.5714 1.1292 1.0626
No log 11.5294 196 1.1022 0.56 1.1022 1.0499
No log 11.6471 198 1.0818 0.5827 1.0818 1.0401
No log 11.7647 200 1.0731 0.5827 1.0731 1.0359
No log 11.8824 202 1.0674 0.5938 1.0674 1.0331
No log 12.0 204 1.1905 0.5469 1.1905 1.0911
No log 12.1176 206 1.3360 0.4762 1.3360 1.1558
No log 12.2353 208 1.3209 0.5276 1.3209 1.1493
No log 12.3529 210 1.1225 0.5753 1.1225 1.0595
No log 12.4706 212 0.9917 0.5692 0.9917 0.9959
No log 12.5882 214 1.0049 0.5116 1.0049 1.0024
No log 12.7059 216 0.9792 0.6 0.9792 0.9895
No log 12.8235 218 1.1219 0.5985 1.1219 1.0592
No log 12.9412 220 1.2241 0.5676 1.2241 1.1064
No log 13.0588 222 1.2303 0.5493 1.2303 1.1092
No log 13.1765 224 1.2111 0.5867 1.2111 1.1005
No log 13.2941 226 1.0722 0.5385 1.0722 1.0355
No log 13.4118 228 1.1172 0.5373 1.1172 1.0570
No log 13.5294 230 1.1158 0.4762 1.1158 1.0563
No log 13.6471 232 1.1140 0.5203 1.1140 1.0555
No log 13.7647 234 1.2277 0.4812 1.2277 1.1080
No log 13.8824 236 1.3015 0.4812 1.3015 1.1408
No log 14.0 238 1.3677 0.4638 1.3677 1.1695
No log 14.1176 240 1.3031 0.4714 1.3031 1.1415
No log 14.2353 242 1.1650 0.5077 1.1650 1.0794
No log 14.3529 244 1.1258 0.5865 1.1258 1.0611
No log 14.4706 246 1.0491 0.625 1.0491 1.0242
No log 14.5882 248 1.0034 0.5891 1.0034 1.0017
No log 14.7059 250 1.0094 0.625 1.0094 1.0047
No log 14.8235 252 0.9776 0.6142 0.9776 0.9887
No log 14.9412 254 0.9553 0.625 0.9553 0.9774
No log 15.0588 256 0.9173 0.5984 0.9173 0.9578
No log 15.1765 258 0.9145 0.6142 0.9145 0.9563
No log 15.2941 260 0.9196 0.6412 0.9196 0.9590
No log 15.4118 262 0.9424 0.6667 0.9424 0.9708
No log 15.5294 264 0.9657 0.6667 0.9657 0.9827
No log 15.6471 266 0.9607 0.6269 0.9607 0.9802
No log 15.7647 268 0.9623 0.6316 0.9623 0.9810
No log 15.8824 270 1.0213 0.6345 1.0213 1.0106
No log 16.0 272 1.0280 0.6584 1.0280 1.0139
No log 16.1176 274 0.9808 0.6748 0.9808 0.9903
No log 16.2353 276 0.9866 0.6709 0.9866 0.9933
No log 16.3529 278 0.9733 0.6301 0.9733 0.9865
No log 16.4706 280 1.0226 0.6029 1.0226 1.0112
No log 16.5882 282 1.0454 0.6107 1.0454 1.0225
No log 16.7059 284 1.0686 0.5846 1.0686 1.0337
No log 16.8235 286 1.1157 0.6143 1.1157 1.0562
No log 16.9412 288 1.0256 0.6174 1.0256 1.0127
No log 17.0588 290 0.9650 0.6447 0.9650 0.9824
No log 17.1765 292 0.9013 0.6757 0.9013 0.9493
No log 17.2941 294 0.9603 0.6309 0.9603 0.9800
No log 17.4118 296 1.0380 0.6184 1.0380 1.0188
No log 17.5294 298 1.1287 0.5692 1.1287 1.0624
No log 17.6471 300 1.1174 0.5484 1.1174 1.0571
No log 17.7647 302 1.0946 0.5484 1.0946 1.0462
No log 17.8824 304 1.1016 0.5873 1.1016 1.0496
No log 18.0 306 1.1564 0.5588 1.1564 1.0754
No log 18.1176 308 1.2270 0.5068 1.2270 1.1077
No log 18.2353 310 1.2418 0.5570 1.2418 1.1143
No log 18.3529 312 1.0489 0.5248 1.0489 1.0241
No log 18.4706 314 0.9364 0.6667 0.9364 0.9677
No log 18.5882 316 0.9190 0.6716 0.9190 0.9586
No log 18.7059 318 0.8726 0.6963 0.8726 0.9341
No log 18.8235 320 0.8721 0.6716 0.8721 0.9338
No log 18.9412 322 0.8889 0.5512 0.8889 0.9428
No log 19.0588 324 0.8753 0.6719 0.8753 0.9356
No log 19.1765 326 0.8979 0.6508 0.8979 0.9476
No log 19.2941 328 0.8837 0.6457 0.8837 0.9400
No log 19.4118 330 0.8585 0.6870 0.8585 0.9265
No log 19.5294 332 0.8537 0.6412 0.8537 0.9239
No log 19.6471 334 0.8703 0.6617 0.8703 0.9329
No log 19.7647 336 0.8851 0.6466 0.8851 0.9408
No log 19.8824 338 0.9602 0.6260 0.9602 0.9799
No log 20.0 340 1.0616 0.675 1.0616 1.0303
No log 20.1176 342 1.1243 0.6463 1.1243 1.0604
No log 20.2353 344 1.1231 0.6709 1.1231 1.0597
No log 20.3529 346 1.1208 0.6131 1.1208 1.0587
No log 20.4706 348 1.0950 0.5891 1.0950 1.0464
No log 20.5882 350 1.0466 0.5984 1.0466 1.0231
No log 20.7059 352 1.0479 0.6032 1.0479 1.0237
No log 20.8235 354 1.0548 0.5984 1.0548 1.0271
No log 20.9412 356 1.0560 0.5984 1.0560 1.0276
No log 21.0588 358 1.0624 0.5865 1.0624 1.0307
No log 21.1765 360 1.0885 0.6014 1.0885 1.0433
No log 21.2941 362 1.1100 0.5517 1.1100 1.0535
No log 21.4118 364 1.1377 0.5417 1.1377 1.0666
No log 21.5294 366 1.1966 0.5455 1.1966 1.0939
No log 21.6471 368 1.2602 0.5139 1.2602 1.1226
No log 21.7647 370 1.2402 0.5077 1.2402 1.1137
No log 21.8824 372 1.1912 0.5827 1.1912 1.0914
No log 22.0 374 1.1494 0.5873 1.1494 1.0721
No log 22.1176 376 1.0949 0.5873 1.0949 1.0464
No log 22.2353 378 1.0453 0.5873 1.0453 1.0224
No log 22.3529 380 1.0180 0.5873 1.0180 1.0089
No log 22.4706 382 1.0452 0.6364 1.0452 1.0223
No log 22.5882 384 1.1045 0.6452 1.1045 1.0510
No log 22.7059 386 1.0655 0.6452 1.0655 1.0322
No log 22.8235 388 0.9960 0.6795 0.9960 0.9980
No log 22.9412 390 0.8753 0.6714 0.8753 0.9356
No log 23.0588 392 0.8406 0.6617 0.8406 0.9168
No log 23.1765 394 0.8330 0.6714 0.8330 0.9127
No log 23.2941 396 0.8446 0.6471 0.8446 0.9190
No log 23.4118 398 0.9240 0.64 0.9240 0.9613
No log 23.5294 400 0.9758 0.64 0.9758 0.9878
No log 23.6471 402 0.9854 0.6061 0.9854 0.9927
No log 23.7647 404 0.9308 0.625 0.9308 0.9648
No log 23.8824 406 0.9328 0.6142 0.9328 0.9658
No log 24.0 408 0.9495 0.625 0.9495 0.9744
No log 24.1176 410 0.9929 0.6483 0.9929 0.9965
No log 24.2353 412 1.0456 0.6626 1.0456 1.0226
No log 24.3529 414 1.0840 0.6747 1.0840 1.0412
No log 24.4706 416 1.0304 0.6747 1.0304 1.0151
No log 24.5882 418 1.0017 0.6829 1.0017 1.0009
No log 24.7059 420 0.8985 0.6438 0.8985 0.9479
No log 24.8235 422 0.8461 0.6154 0.8461 0.9198
No log 24.9412 424 0.8274 0.6565 0.8274 0.9096
No log 25.0588 426 0.8400 0.6260 0.8400 0.9165
No log 25.1765 428 0.8693 0.6316 0.8693 0.9324
No log 25.2941 430 0.8861 0.6277 0.8861 0.9413
No log 25.4118 432 0.9249 0.6438 0.9249 0.9617
No log 25.5294 434 1.0084 0.6708 1.0084 1.0042
No log 25.6471 436 1.0464 0.6829 1.0464 1.0230
No log 25.7647 438 0.9784 0.6708 0.9784 0.9892
No log 25.8824 440 0.9172 0.6143 0.9172 0.9577
No log 26.0 442 0.9116 0.6324 0.9116 0.9548
No log 26.1176 444 0.9264 0.6462 0.9264 0.9625
No log 26.2353 446 0.9742 0.6364 0.9742 0.9870
No log 26.3529 448 1.0935 0.5797 1.0935 1.0457
No log 26.4706 450 1.1899 0.5972 1.1899 1.0908
No log 26.5882 452 1.2538 0.5912 1.2538 1.1197
No log 26.7059 454 1.1907 0.5638 1.1907 1.0912
No log 26.8235 456 1.0960 0.5899 1.0960 1.0469
No log 26.9412 458 1.0541 0.5899 1.0541 1.0267
No log 27.0588 460 0.9928 0.6029 0.9928 0.9964
No log 27.1765 462 0.9983 0.6029 0.9983 0.9992
No log 27.2941 464 0.9955 0.5926 0.9955 0.9977
No log 27.4118 466 0.9356 0.6466 0.9356 0.9673
No log 27.5294 468 0.9074 0.6512 0.9074 0.9526
No log 27.6471 470 0.9170 0.6512 0.9170 0.9576
No log 27.7647 472 0.9655 0.6165 0.9655 0.9826
No log 27.8824 474 1.0776 0.6711 1.0776 1.0381
No log 28.0 476 1.2213 0.6115 1.2213 1.1051
No log 28.1176 478 1.2280 0.5714 1.2280 1.1082
No log 28.2353 480 1.1552 0.64 1.1552 1.0748
No log 28.3529 482 1.0458 0.6466 1.0458 1.0226
No log 28.4706 484 0.9797 0.6357 0.9797 0.9898
No log 28.5882 486 0.9739 0.6032 0.9739 0.9869
No log 28.7059 488 1.0004 0.5806 1.0004 1.0002
No log 28.8235 490 1.0259 0.6562 1.0259 1.0129
No log 28.9412 492 1.0554 0.6202 1.0554 1.0273
No log 29.0588 494 1.1021 0.6621 1.1021 1.0498
No log 29.1765 496 1.0900 0.6452 1.0900 1.0440
No log 29.2941 498 0.9971 0.6395 0.9971 0.9985
0.301 29.4118 500 0.9257 0.6479 0.9257 0.9621
0.301 29.5294 502 0.8687 0.6512 0.8687 0.9320
0.301 29.6471 504 0.8713 0.6406 0.8713 0.9334
0.301 29.7647 506 0.9002 0.6406 0.9002 0.9488
0.301 29.8824 508 0.9539 0.6299 0.9539 0.9767
0.301 30.0 510 1.0283 0.6412 1.0283 1.0141
0.301 30.1176 512 1.0832 0.6466 1.0832 1.0408
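The card does not state the dataset size, but the log allows a rough inference: step 2 corresponds to epoch 0.1176, i.e. about 17 optimizer steps per epoch, which at train_batch_size 8 implies roughly 130 to 136 training examples (the last batch may be partial). The "No log" entries likely mean the logging interval was 500 steps, so the training loss (0.301) first appears at step 500; and although num_epochs was 100, logging stops near epoch 30 (step 512), which suggests early stopping, though the card does not say. A sketch of that arithmetic (all numbers inferred, not stated):

```python
def infer_steps_per_epoch(step, epoch):
    """Back out optimizer steps per epoch from one (step, epoch) log row."""
    return round(step / epoch)

def approx_train_size(steps_per_epoch, batch_size):
    """Upper bound on training-set size; the final batch may be partial."""
    return steps_per_epoch * batch_size
```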

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (Safetensors, F32 tensors)
Model: MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k3_task1_organization, fine-tuned from aubmindlab/bert-base-arabertv02.