ArabicNewSplits6_FineTuningAraBERT_run1_AugV5_k12_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a metric-recomputation sketch follows the list):

  • Loss: 1.0017
  • Qwk: 0.6118
  • Mse: 1.0017
  • Rmse: 1.0008
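Here, Qwk is the quadratic weighted kappa and Rmse is the square root of Mse. A minimal sketch for recomputing these metrics with scikit-learn, assuming integer ordinal labels; the arrays below are hypothetical placeholders, not the actual evaluation data:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Hypothetical gold labels and model predictions on an ordinal scale.
y_true = np.array([0, 1, 2, 3, 2, 1])
y_pred = np.array([0, 1, 2, 2, 3, 1])

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # Qwk
mse = mean_squared_error(y_true, y_pred)                      # Mse
rmse = np.sqrt(mse)                                           # Rmse
print(f"QWK={qwk:.4f}  MSE={mse:.4f}  RMSE={rmse:.4f}")
```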

Model description

More information needed

Intended uses & limitations

More information needed
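Pending fuller documentation, the checkpoint can be loaded like any Hugging Face model. A minimal loading sketch, assuming the uploaded config exposes a sequence-classification head; the Arabic input below is a placeholder:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo_id = "MayBashendy/ArabicNewSplits6_FineTuningAraBERT_run1_AugV5_k12_task5_organization"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)

# Placeholder input; replace with real text to score.
inputs = tokenizer("نص تجريبي للتقييم", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits)
```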

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a reproduction sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
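A sketch of this configuration as transformers.TrainingArguments; output_dir is a placeholder, and the model/dataset wiring is omitted since the training data is not documented:

```python
from transformers import TrainingArguments

# Mirrors the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="ArabicNewSplits6_FineTuningAraBERT_run1_AugV5_k12_task5_organization",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```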

Training results

| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.0435 | 2 | 2.6343 | -0.0023 | 2.6343 | 1.6230 |
| No log | 0.0870 | 4 | 1.9071 | 0.0319 | 1.9071 | 1.3810 |
| No log | 0.1304 | 6 | 1.3330 | 0.2457 | 1.3330 | 1.1545 |
| No log | 0.1739 | 8 | 1.5848 | 0.0731 | 1.5848 | 1.2589 |
| No log | 0.2174 | 10 | 1.6723 | 0.1350 | 1.6723 | 1.2932 |
| No log | 0.2609 | 12 | 1.6348 | 0.1205 | 1.6348 | 1.2786 |
| No log | 0.3043 | 14 | 1.5169 | 0.1449 | 1.5169 | 1.2316 |
| No log | 0.3478 | 16 | 1.4132 | 0.2119 | 1.4132 | 1.1888 |
| No log | 0.3913 | 18 | 1.3751 | 0.2047 | 1.3751 | 1.1727 |
| No log | 0.4348 | 20 | 1.5382 | 0.2934 | 1.5382 | 1.2402 |
| No log | 0.4783 | 22 | 1.6814 | 0.3288 | 1.6814 | 1.2967 |
| No log | 0.5217 | 24 | 1.5474 | 0.3448 | 1.5474 | 1.2440 |
| No log | 0.5652 | 26 | 1.3368 | 0.3299 | 1.3368 | 1.1562 |
| No log | 0.6087 | 28 | 1.2498 | 0.2087 | 1.2498 | 1.1179 |
| No log | 0.6522 | 30 | 1.2237 | 0.2574 | 1.2237 | 1.1062 |
| No log | 0.6957 | 32 | 1.1837 | 0.2991 | 1.1837 | 1.0880 |
| No log | 0.7391 | 34 | 1.1779 | 0.3319 | 1.1779 | 1.0853 |
| No log | 0.7826 | 36 | 1.1731 | 0.3271 | 1.1731 | 1.0831 |
| No log | 0.8261 | 38 | 1.1784 | 0.2984 | 1.1784 | 1.0855 |
| No log | 0.8696 | 40 | 1.3366 | 0.4204 | 1.3366 | 1.1561 |
| No log | 0.9130 | 42 | 1.5053 | 0.3490 | 1.5053 | 1.2269 |
| No log | 0.9565 | 44 | 1.5329 | 0.2578 | 1.5329 | 1.2381 |
| No log | 1.0 | 46 | 1.4395 | 0.2616 | 1.4395 | 1.1998 |
| No log | 1.0435 | 48 | 1.3583 | 0.3873 | 1.3583 | 1.1655 |
| No log | 1.0870 | 50 | 1.1998 | 0.4106 | 1.1998 | 1.0954 |
| No log | 1.1304 | 52 | 1.0725 | 0.4209 | 1.0725 | 1.0356 |
| No log | 1.1739 | 54 | 1.0921 | 0.4109 | 1.0921 | 1.0450 |
| No log | 1.2174 | 56 | 1.1455 | 0.4174 | 1.1455 | 1.0703 |
| No log | 1.2609 | 58 | 1.1643 | 0.4310 | 1.1643 | 1.0790 |
| No log | 1.3043 | 60 | 1.1335 | 0.4310 | 1.1335 | 1.0647 |
| No log | 1.3478 | 62 | 1.0375 | 0.4275 | 1.0375 | 1.0186 |
| No log | 1.3913 | 64 | 0.9800 | 0.4847 | 0.9800 | 0.9899 |
| No log | 1.4348 | 66 | 1.0441 | 0.5041 | 1.0441 | 1.0218 |
| No log | 1.4783 | 68 | 1.1818 | 0.4739 | 1.1818 | 1.0871 |
| No log | 1.5217 | 70 | 1.2031 | 0.4626 | 1.2031 | 1.0969 |
| No log | 1.5652 | 72 | 1.2234 | 0.4321 | 1.2234 | 1.1061 |
| No log | 1.6087 | 74 | 1.1554 | 0.4841 | 1.1554 | 1.0749 |
| No log | 1.6522 | 76 | 1.0766 | 0.4545 | 1.0766 | 1.0376 |
| No log | 1.6957 | 78 | 1.0497 | 0.4617 | 1.0497 | 1.0246 |
| No log | 1.7391 | 80 | 1.0450 | 0.4922 | 1.0450 | 1.0223 |
| No log | 1.7826 | 82 | 1.0629 | 0.5178 | 1.0629 | 1.0310 |
| No log | 1.8261 | 84 | 1.0588 | 0.4867 | 1.0588 | 1.0290 |
| No log | 1.8696 | 86 | 1.1203 | 0.4686 | 1.1203 | 1.0585 |
| No log | 1.9130 | 88 | 1.1615 | 0.4762 | 1.1615 | 1.0777 |
| No log | 1.9565 | 90 | 1.1543 | 0.4888 | 1.1543 | 1.0744 |
| No log | 2.0 | 92 | 1.1483 | 0.4374 | 1.1483 | 1.0716 |
| No log | 2.0435 | 94 | 1.1472 | 0.3892 | 1.1472 | 1.0711 |
| No log | 2.0870 | 96 | 1.1343 | 0.3892 | 1.1343 | 1.0650 |
| No log | 2.1304 | 98 | 1.1101 | 0.4612 | 1.1101 | 1.0536 |
| No log | 2.1739 | 100 | 1.0591 | 0.5313 | 1.0591 | 1.0291 |
| No log | 2.2174 | 102 | 1.0286 | 0.5162 | 1.0286 | 1.0142 |
| No log | 2.2609 | 104 | 0.9748 | 0.5280 | 0.9748 | 0.9873 |
| No log | 2.3043 | 106 | 0.9561 | 0.5386 | 0.9561 | 0.9778 |
| No log | 2.3478 | 108 | 0.9149 | 0.5656 | 0.9149 | 0.9565 |
| No log | 2.3913 | 110 | 0.9026 | 0.5765 | 0.9026 | 0.9501 |
| No log | 2.4348 | 112 | 0.8983 | 0.5711 | 0.8983 | 0.9478 |
| No log | 2.4783 | 114 | 0.9456 | 0.5961 | 0.9456 | 0.9724 |
| No log | 2.5217 | 116 | 1.1052 | 0.5421 | 1.1052 | 1.0513 |
| No log | 2.5652 | 118 | 1.1915 | 0.5205 | 1.1915 | 1.0916 |
| No log | 2.6087 | 120 | 1.2662 | 0.5327 | 1.2662 | 1.1253 |
| No log | 2.6522 | 122 | 1.3058 | 0.5394 | 1.3058 | 1.1427 |
| No log | 2.6957 | 124 | 1.3581 | 0.5541 | 1.3581 | 1.1654 |
| No log | 2.7391 | 126 | 1.1937 | 0.5544 | 1.1937 | 1.0926 |
| No log | 2.7826 | 128 | 1.1089 | 0.5523 | 1.1089 | 1.0530 |
| No log | 2.8261 | 130 | 1.0750 | 0.5331 | 1.0750 | 1.0368 |
| No log | 2.8696 | 132 | 1.1430 | 0.5564 | 1.1430 | 1.0691 |
| No log | 2.9130 | 134 | 1.1607 | 0.5485 | 1.1607 | 1.0774 |
| No log | 2.9565 | 136 | 1.1974 | 0.5520 | 1.1974 | 1.0943 |
| No log | 3.0 | 138 | 1.3035 | 0.5475 | 1.3035 | 1.1417 |
| No log | 3.0435 | 140 | 1.4987 | 0.5409 | 1.4987 | 1.2242 |
| No log | 3.0870 | 142 | 1.5221 | 0.5185 | 1.5221 | 1.2337 |
| No log | 3.1304 | 144 | 1.4033 | 0.5321 | 1.4033 | 1.1846 |
| No log | 3.1739 | 146 | 1.2258 | 0.5293 | 1.2258 | 1.1071 |
| No log | 3.2174 | 148 | 1.2458 | 0.5365 | 1.2458 | 1.1162 |
| No log | 3.2609 | 150 | 1.2477 | 0.5220 | 1.2477 | 1.1170 |
| No log | 3.3043 | 152 | 1.1396 | 0.5439 | 1.1396 | 1.0675 |
| No log | 3.3478 | 154 | 0.9846 | 0.5907 | 0.9846 | 0.9923 |
| No log | 3.3913 | 156 | 0.9126 | 0.6535 | 0.9126 | 0.9553 |
| No log | 3.4348 | 158 | 0.9217 | 0.6148 | 0.9217 | 0.9600 |
| No log | 3.4783 | 160 | 1.0141 | 0.6289 | 1.0141 | 1.0070 |
| No log | 3.5217 | 162 | 1.1707 | 0.5827 | 1.1707 | 1.0820 |
| No log | 3.5652 | 164 | 1.2505 | 0.5835 | 1.2505 | 1.1183 |
| No log | 3.6087 | 166 | 1.1556 | 0.5924 | 1.1556 | 1.0750 |
| No log | 3.6522 | 168 | 1.0462 | 0.6113 | 1.0462 | 1.0229 |
| No log | 3.6957 | 170 | 1.0780 | 0.6095 | 1.0780 | 1.0383 |
| No log | 3.7391 | 172 | 1.0647 | 0.5974 | 1.0647 | 1.0319 |
| No log | 3.7826 | 174 | 0.9915 | 0.6414 | 0.9915 | 0.9957 |
| No log | 3.8261 | 176 | 0.9114 | 0.6302 | 0.9114 | 0.9546 |
| No log | 3.8696 | 178 | 0.8766 | 0.6232 | 0.8766 | 0.9363 |
| No log | 3.9130 | 180 | 0.9117 | 0.6338 | 0.9117 | 0.9548 |
| No log | 3.9565 | 182 | 1.0319 | 0.5846 | 1.0319 | 1.0158 |
| No log | 4.0 | 184 | 1.0441 | 0.5913 | 1.0441 | 1.0218 |
| No log | 4.0435 | 186 | 0.9823 | 0.6055 | 0.9823 | 0.9911 |
| No log | 4.0870 | 188 | 0.9558 | 0.6168 | 0.9558 | 0.9776 |
| No log | 4.1304 | 190 | 0.9141 | 0.6410 | 0.9141 | 0.9561 |
| No log | 4.1739 | 192 | 0.9306 | 0.6474 | 0.9306 | 0.9647 |
| No log | 4.2174 | 194 | 0.9582 | 0.6421 | 0.9582 | 0.9789 |
| No log | 4.2609 | 196 | 0.9283 | 0.6332 | 0.9283 | 0.9635 |
| No log | 4.3043 | 198 | 0.9261 | 0.6348 | 0.9261 | 0.9624 |
| No log | 4.3478 | 200 | 0.9504 | 0.6364 | 0.9504 | 0.9749 |
| No log | 4.3913 | 202 | 0.9402 | 0.6312 | 0.9402 | 0.9696 |
| No log | 4.4348 | 204 | 0.9603 | 0.6165 | 0.9603 | 0.9800 |
| No log | 4.4783 | 206 | 1.0090 | 0.6015 | 1.0090 | 1.0045 |
| No log | 4.5217 | 208 | 1.0256 | 0.6034 | 1.0256 | 1.0127 |
| No log | 4.5652 | 210 | 1.0471 | 0.6021 | 1.0471 | 1.0233 |
| No log | 4.6087 | 212 | 0.9806 | 0.6362 | 0.9806 | 0.9902 |
| No log | 4.6522 | 214 | 0.8731 | 0.6715 | 0.8731 | 0.9344 |
| No log | 4.6957 | 216 | 0.8732 | 0.6600 | 0.8732 | 0.9345 |
| No log | 4.7391 | 218 | 0.9695 | 0.6179 | 0.9695 | 0.9847 |
| No log | 4.7826 | 220 | 1.0959 | 0.5765 | 1.0959 | 1.0469 |
| No log | 4.8261 | 222 | 1.2624 | 0.5679 | 1.2624 | 1.1235 |
| No log | 4.8696 | 224 | 1.2471 | 0.5601 | 1.2471 | 1.1168 |
| No log | 4.9130 | 226 | 1.1333 | 0.5869 | 1.1333 | 1.0646 |
| No log | 4.9565 | 228 | 1.0470 | 0.5961 | 1.0470 | 1.0232 |
| No log | 5.0 | 230 | 1.0332 | 0.5961 | 1.0332 | 1.0165 |
| No log | 5.0435 | 232 | 0.9842 | 0.5901 | 0.9842 | 0.9921 |
| No log | 5.0870 | 234 | 0.9795 | 0.6085 | 0.9795 | 0.9897 |
| No log | 5.1304 | 236 | 1.0627 | 0.5977 | 1.0627 | 1.0309 |
| No log | 5.1739 | 238 | 1.1373 | 0.6071 | 1.1373 | 1.0664 |
| No log | 5.2174 | 240 | 1.1370 | 0.5844 | 1.1370 | 1.0663 |
| No log | 5.2609 | 242 | 1.1170 | 0.6059 | 1.1170 | 1.0569 |
| No log | 5.3043 | 244 | 1.0821 | 0.6227 | 1.0821 | 1.0402 |
| No log | 5.3478 | 246 | 1.0233 | 0.6587 | 1.0233 | 1.0116 |
| No log | 5.3913 | 248 | 0.9085 | 0.6877 | 0.9085 | 0.9532 |
| No log | 5.4348 | 250 | 0.9015 | 0.6896 | 0.9015 | 0.9495 |
| No log | 5.4783 | 252 | 0.9811 | 0.6540 | 0.9811 | 0.9905 |
| No log | 5.5217 | 254 | 1.1389 | 0.6211 | 1.1389 | 1.0672 |
| No log | 5.5652 | 256 | 1.2743 | 0.5804 | 1.2743 | 1.1288 |
| No log | 5.6087 | 258 | 1.2705 | 0.5804 | 1.2705 | 1.1272 |
| No log | 5.6522 | 260 | 1.2029 | 0.5862 | 1.2029 | 1.0968 |
| No log | 5.6957 | 262 | 1.1438 | 0.5659 | 1.1438 | 1.0695 |
| No log | 5.7391 | 264 | 1.0615 | 0.5992 | 1.0615 | 1.0303 |
| No log | 5.7826 | 266 | 1.0956 | 0.5905 | 1.0956 | 1.0467 |
| No log | 5.8261 | 268 | 1.2409 | 0.5835 | 1.2409 | 1.1139 |
| No log | 5.8696 | 270 | 1.4355 | 0.5679 | 1.4355 | 1.1981 |
| No log | 5.9130 | 272 | 1.4562 | 0.5429 | 1.4562 | 1.2067 |
| No log | 5.9565 | 274 | 1.3131 | 0.5552 | 1.3131 | 1.1459 |
| No log | 6.0 | 276 | 1.1502 | 0.5692 | 1.1502 | 1.0725 |
| No log | 6.0435 | 278 | 1.0930 | 0.5302 | 1.0930 | 1.0455 |
| No log | 6.0870 | 280 | 1.0379 | 0.5636 | 1.0379 | 1.0188 |
| No log | 6.1304 | 282 | 1.0255 | 0.5861 | 1.0255 | 1.0127 |
| No log | 6.1739 | 284 | 1.0211 | 0.5970 | 1.0211 | 1.0105 |
| No log | 6.2174 | 286 | 0.9777 | 0.6156 | 0.9777 | 0.9888 |
| No log | 6.2609 | 288 | 0.9732 | 0.6295 | 0.9732 | 0.9865 |
| No log | 6.3043 | 290 | 1.0289 | 0.6028 | 1.0289 | 1.0143 |
| No log | 6.3478 | 292 | 1.0138 | 0.6280 | 1.0138 | 1.0069 |
| No log | 6.3913 | 294 | 0.9489 | 0.6354 | 0.9489 | 0.9741 |
| No log | 6.4348 | 296 | 0.9146 | 0.6449 | 0.9146 | 0.9563 |
| No log | 6.4783 | 298 | 0.8894 | 0.6543 | 0.8894 | 0.9431 |
| No log | 6.5217 | 300 | 0.9164 | 0.6449 | 0.9164 | 0.9573 |
| No log | 6.5652 | 302 | 1.0034 | 0.6275 | 1.0034 | 1.0017 |
| No log | 6.6087 | 304 | 1.1277 | 0.5919 | 1.1277 | 1.0619 |
| No log | 6.6522 | 306 | 1.1600 | 0.5836 | 1.1600 | 1.0770 |
| No log | 6.6957 | 308 | 1.1127 | 0.6010 | 1.1127 | 1.0549 |
| No log | 6.7391 | 310 | 1.0024 | 0.6174 | 1.0024 | 1.0012 |
| No log | 6.7826 | 312 | 0.8872 | 0.6356 | 0.8872 | 0.9419 |
| No log | 6.8261 | 314 | 0.8681 | 0.6371 | 0.8681 | 0.9317 |
| No log | 6.8696 | 316 | 0.9008 | 0.6310 | 0.9008 | 0.9491 |
| No log | 6.9130 | 318 | 0.9920 | 0.6172 | 0.9920 | 0.9960 |
| No log | 6.9565 | 320 | 1.0706 | 0.6138 | 1.0706 | 1.0347 |
| No log | 7.0 | 322 | 1.0602 | 0.6257 | 1.0602 | 1.0297 |
| No log | 7.0435 | 324 | 0.9928 | 0.6080 | 0.9928 | 0.9964 |
| No log | 7.0870 | 326 | 0.9652 | 0.6107 | 0.9652 | 0.9824 |
| No log | 7.1304 | 328 | 0.9718 | 0.6107 | 0.9718 | 0.9858 |
| No log | 7.1739 | 330 | 0.9604 | 0.6107 | 0.9604 | 0.9800 |
| No log | 7.2174 | 332 | 0.9542 | 0.6151 | 0.9542 | 0.9768 |
| No log | 7.2609 | 334 | 0.9804 | 0.6309 | 0.9804 | 0.9901 |
| No log | 7.3043 | 336 | 1.0306 | 0.6137 | 1.0306 | 1.0152 |
| No log | 7.3478 | 338 | 1.0801 | 0.6144 | 1.0801 | 1.0393 |
| No log | 7.3913 | 340 | 1.0620 | 0.6170 | 1.0620 | 1.0305 |
| No log | 7.4348 | 342 | 1.0208 | 0.6147 | 1.0208 | 1.0103 |
| No log | 7.4783 | 344 | 1.0539 | 0.6108 | 1.0539 | 1.0266 |
| No log | 7.5217 | 346 | 1.0854 | 0.6024 | 1.0854 | 1.0418 |
| No log | 7.5652 | 348 | 1.0713 | 0.6082 | 1.0713 | 1.0350 |
| No log | 7.6087 | 350 | 1.0196 | 0.6156 | 1.0196 | 1.0097 |
| No log | 7.6522 | 352 | 0.9942 | 0.6156 | 0.9942 | 0.9971 |
| No log | 7.6957 | 354 | 0.9870 | 0.6156 | 0.9870 | 0.9935 |
| No log | 7.7391 | 356 | 1.0232 | 0.6156 | 1.0232 | 1.0115 |
| No log | 7.7826 | 358 | 1.0485 | 0.6082 | 1.0485 | 1.0239 |
| No log | 7.8261 | 360 | 1.0829 | 0.6145 | 1.0829 | 1.0406 |
| No log | 7.8696 | 362 | 1.1572 | 0.5984 | 1.1572 | 1.0757 |
| No log | 7.9130 | 364 | 1.1866 | 0.6002 | 1.1866 | 1.0893 |
| No log | 7.9565 | 366 | 1.1462 | 0.6113 | 1.1462 | 1.0706 |
| No log | 8.0 | 368 | 1.0547 | 0.6128 | 1.0547 | 1.0270 |
| No log | 8.0435 | 370 | 0.9577 | 0.6446 | 0.9577 | 0.9786 |
| No log | 8.0870 | 372 | 0.9177 | 0.6385 | 0.9177 | 0.9580 |
| No log | 8.1304 | 374 | 0.9135 | 0.6385 | 0.9135 | 0.9558 |
| No log | 8.1739 | 376 | 0.9276 | 0.6290 | 0.9276 | 0.9631 |
| No log | 8.2174 | 378 | 0.9570 | 0.6446 | 0.9570 | 0.9783 |
| No log | 8.2609 | 380 | 0.9615 | 0.6446 | 0.9615 | 0.9806 |
| No log | 8.3043 | 382 | 0.9763 | 0.6508 | 0.9763 | 0.9881 |
| No log | 8.3478 | 384 | 0.9772 | 0.6446 | 0.9772 | 0.9886 |
| No log | 8.3913 | 386 | 0.9676 | 0.6446 | 0.9676 | 0.9837 |
| No log | 8.4348 | 388 | 0.9451 | 0.6449 | 0.9451 | 0.9721 |
| No log | 8.4783 | 390 | 0.9155 | 0.6385 | 0.9155 | 0.9568 |
| No log | 8.5217 | 392 | 0.8981 | 0.6482 | 0.8981 | 0.9477 |
| No log | 8.5652 | 394 | 0.9060 | 0.6385 | 0.9060 | 0.9518 |
| No log | 8.6087 | 396 | 0.9354 | 0.6385 | 0.9354 | 0.9672 |
| No log | 8.6522 | 398 | 0.9757 | 0.6387 | 0.9757 | 0.9878 |
| No log | 8.6957 | 400 | 1.0207 | 0.6334 | 1.0207 | 1.0103 |
| No log | 8.7391 | 402 | 1.0534 | 0.6231 | 1.0534 | 1.0263 |
| No log | 8.7826 | 404 | 1.0759 | 0.6232 | 1.0759 | 1.0372 |
| No log | 8.8261 | 406 | 1.0958 | 0.6178 | 1.0958 | 1.0468 |
| No log | 8.8696 | 408 | 1.0861 | 0.6178 | 1.0861 | 1.0422 |
| No log | 8.9130 | 410 | 1.0504 | 0.6291 | 1.0504 | 1.0249 |
| No log | 8.9565 | 412 | 1.0075 | 0.6196 | 1.0075 | 1.0037 |
| No log | 9.0 | 414 | 0.9860 | 0.6161 | 0.9860 | 0.9930 |
| No log | 9.0435 | 416 | 0.9560 | 0.6324 | 0.9560 | 0.9777 |
| No log | 9.0870 | 418 | 0.9200 | 0.6290 | 0.9200 | 0.9592 |
| No log | 9.1304 | 420 | 0.9063 | 0.6385 | 0.9063 | 0.9520 |
| No log | 9.1739 | 422 | 0.9121 | 0.6290 | 0.9121 | 0.9550 |
| No log | 9.2174 | 424 | 0.9226 | 0.6290 | 0.9226 | 0.9605 |
| No log | 9.2609 | 426 | 0.9337 | 0.6290 | 0.9337 | 0.9663 |
| No log | 9.3043 | 428 | 0.9573 | 0.6040 | 0.9573 | 0.9784 |
| No log | 9.3478 | 430 | 0.9801 | 0.6062 | 0.9801 | 0.9900 |
| No log | 9.3913 | 432 | 0.9955 | 0.6101 | 0.9955 | 0.9977 |
| No log | 9.4348 | 434 | 1.0141 | 0.6075 | 1.0141 | 1.0070 |
| No log | 9.4783 | 436 | 1.0324 | 0.6172 | 1.0324 | 1.0161 |
| No log | 9.5217 | 438 | 1.0415 | 0.6221 | 1.0415 | 1.0206 |
| No log | 9.5652 | 440 | 1.0512 | 0.6221 | 1.0512 | 1.0253 |
| No log | 9.6087 | 442 | 1.0560 | 0.6221 | 1.0560 | 1.0276 |
| No log | 9.6522 | 444 | 1.0523 | 0.6221 | 1.0523 | 1.0258 |
| No log | 9.6957 | 446 | 1.0440 | 0.6221 | 1.0440 | 1.0218 |
| No log | 9.7391 | 448 | 1.0322 | 0.6125 | 1.0322 | 1.0160 |
| No log | 9.7826 | 450 | 1.0222 | 0.6138 | 1.0222 | 1.0110 |
| No log | 9.8261 | 452 | 1.0127 | 0.6138 | 1.0127 | 1.0063 |
| No log | 9.8696 | 454 | 1.0082 | 0.6105 | 1.0082 | 1.0041 |
| No log | 9.9130 | 456 | 1.0052 | 0.6118 | 1.0052 | 1.0026 |
| No log | 9.9565 | 458 | 1.0029 | 0.6118 | 1.0029 | 1.0014 |
| No log | 10.0 | 460 | 1.0017 | 0.6118 | 1.0017 | 1.0008 |

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1