ArabicNewSplits6_FineTuningAraBERT_run1_AugV5_k9_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5807
  • Qwk (quadratic weighted kappa): 0.7226
  • Mse (mean squared error): 0.5807
  • Rmse (root mean squared error): 0.7620
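The reported metrics can be reproduced from integer score predictions with scikit-learn; this is a minimal sketch, where `y_true` and `y_pred` are made-up placeholder arrays, not the model's actual evaluation outputs:

```python
# Sketch: computing Qwk, Mse, and Rmse from integer score predictions.
# y_true / y_pred are hypothetical placeholders for illustration only.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([0, 1, 2, 2, 3, 1])
y_pred = np.array([0, 1, 1, 2, 3, 2])

# Quadratic weighted kappa penalizes large disagreements more heavily.
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = float(np.sqrt(mse))
```

Note that Mse and Loss coincide in the table above, which suggests the model was trained with a mean-squared-error regression objective.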

Model description

More information needed

Intended uses & limitations

More information needed
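Although the intended use is not documented, the checkpoint can be loaded like any Hugging Face sequence-classification model. A hypothetical inference sketch (it assumes the checkpoint is published on the Hub under the repo id below and that the task is single-label score classification; the function name is illustrative):

```python
# Hypothetical inference sketch; assumes the checkpoint is on the Hub
# under REPO_ID and that the head is a single-label classifier.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

REPO_ID = "MayBashendy/ArabicNewSplits6_FineTuningAraBERT_run1_AugV5_k9_task1_organization"

def predict_score(text: str) -> int:
    """Return the argmax class index for one Arabic text."""
    tokenizer = AutoTokenizer.from_pretrained(REPO_ID)
    model = AutoModelForSequenceClassification.from_pretrained(REPO_ID)
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits
    return int(logits.argmax(dim=-1).item())
```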

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
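The optimizer and scheduler these hyperparameters imply can be sketched as follows, using a tiny stand-in module rather than the full AraBERT model (the 460 total optimizer steps come from the training-results table below: 46 steps per epoch for 10 epochs):

```python
# Sketch of the optimizer/scheduler setup implied by the hyperparameters.
import torch
from transformers import get_linear_schedule_with_warmup

model = torch.nn.Linear(4, 2)  # stand-in for the fine-tuned encoder
optimizer = torch.optim.Adam(
    model.parameters(), lr=2e-5, betas=(0.9, 0.999), eps=1e-8
)
# Linear decay from 2e-5 to 0 over the 460 optimizer steps of this run.
scheduler = get_linear_schedule_with_warmup(
    optimizer, num_warmup_steps=0, num_training_steps=460
)
```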

Training results

("No log" in the Training Loss column means the training loss was never recorded: the logging interval is longer than the 460 optimizer steps in this run.)

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0435 2 5.4231 -0.0547 5.4231 2.3288
No log 0.0870 4 3.1344 0.0646 3.1344 1.7704
No log 0.1304 6 2.6817 -0.1405 2.6817 1.6376
No log 0.1739 8 1.7092 0.0626 1.7092 1.3074
No log 0.2174 10 1.2110 0.2434 1.2110 1.1005
No log 0.2609 12 1.2852 0.1631 1.2852 1.1337
No log 0.3043 14 1.6344 -0.0031 1.6344 1.2785
No log 0.3478 16 1.5775 -0.0177 1.5775 1.2560
No log 0.3913 18 1.3756 0.0736 1.3756 1.1728
No log 0.4348 20 1.2518 0.2284 1.2518 1.1189
No log 0.4783 22 1.2467 0.2412 1.2467 1.1166
No log 0.5217 24 1.2216 0.3051 1.2216 1.1053
No log 0.5652 26 1.2783 0.1975 1.2783 1.1306
No log 0.6087 28 1.4053 0.0592 1.4053 1.1854
No log 0.6522 30 1.5717 0.0432 1.5717 1.2537
No log 0.6957 32 1.6154 0.0614 1.6154 1.2710
No log 0.7391 34 1.4057 0.1291 1.4057 1.1856
No log 0.7826 36 1.2254 0.2983 1.2254 1.1070
No log 0.8261 38 0.9666 0.3439 0.9666 0.9831
No log 0.8696 40 0.8897 0.4266 0.8897 0.9432
No log 0.9130 42 0.9725 0.3657 0.9725 0.9861
No log 0.9565 44 1.1594 0.2844 1.1594 1.0768
No log 1.0 46 1.3529 0.2139 1.3529 1.1631
No log 1.0435 48 1.5621 0.2036 1.5621 1.2499
No log 1.0870 50 1.4640 0.2301 1.4640 1.2099
No log 1.1304 52 1.1710 0.2690 1.1710 1.0821
No log 1.1739 54 1.1268 0.3003 1.1268 1.0615
No log 1.2174 56 1.2561 0.3099 1.2561 1.1207
No log 1.2609 58 1.2930 0.3284 1.2930 1.1371
No log 1.3043 60 1.4161 0.3424 1.4161 1.1900
No log 1.3478 62 1.1555 0.4301 1.1555 1.0749
No log 1.3913 64 0.9918 0.4818 0.9918 0.9959
No log 1.4348 66 1.1120 0.4243 1.1120 1.0545
No log 1.4783 68 1.0545 0.4744 1.0545 1.0269
No log 1.5217 70 1.1433 0.4700 1.1433 1.0693
No log 1.5652 72 1.0755 0.4894 1.0755 1.0370
No log 1.6087 74 0.8809 0.5979 0.8809 0.9386
No log 1.6522 76 0.7119 0.6420 0.7119 0.8437
No log 1.6957 78 0.6945 0.6616 0.6945 0.8334
No log 1.7391 80 0.6736 0.6598 0.6736 0.8207
No log 1.7826 82 0.6768 0.6531 0.6768 0.8227
No log 1.8261 84 0.6769 0.6508 0.6769 0.8227
No log 1.8696 86 0.6724 0.6666 0.6724 0.8200
No log 1.9130 88 0.8138 0.6100 0.8138 0.9021
No log 1.9565 90 1.3033 0.4489 1.3033 1.1416
No log 2.0 92 1.6163 0.3547 1.6163 1.2714
No log 2.0435 94 1.3582 0.4381 1.3582 1.1654
No log 2.0870 96 1.1797 0.5007 1.1797 1.0862
No log 2.1304 98 1.1746 0.5295 1.1746 1.0838
No log 2.1739 100 1.5061 0.3854 1.5061 1.2272
No log 2.2174 102 1.6984 0.3350 1.6984 1.3032
No log 2.2609 104 1.8089 0.3263 1.8089 1.3450
No log 2.3043 106 1.4700 0.3929 1.4700 1.2124
No log 2.3478 108 0.9388 0.6284 0.9388 0.9689
No log 2.3913 110 0.6435 0.6779 0.6435 0.8022
No log 2.4348 112 0.6180 0.6905 0.6180 0.7862
No log 2.4783 114 0.5849 0.6913 0.5849 0.7648
No log 2.5217 116 0.6105 0.7251 0.6105 0.7813
No log 2.5652 118 0.6489 0.6836 0.6489 0.8055
No log 2.6087 120 0.6330 0.6887 0.6330 0.7956
No log 2.6522 122 0.7053 0.6750 0.7053 0.8398
No log 2.6957 124 0.7735 0.6341 0.7735 0.8795
No log 2.7391 126 0.6372 0.6923 0.6372 0.7983
No log 2.7826 128 0.5806 0.7394 0.5806 0.7620
No log 2.8261 130 0.6022 0.7613 0.6022 0.7760
No log 2.8696 132 0.5982 0.7564 0.5982 0.7734
No log 2.9130 134 0.6078 0.7550 0.6078 0.7796
No log 2.9565 136 0.6086 0.7708 0.6086 0.7801
No log 3.0 138 0.6335 0.7431 0.6335 0.7959
No log 3.0435 140 0.7056 0.6516 0.7056 0.8400
No log 3.0870 142 0.6684 0.6827 0.6684 0.8176
No log 3.1304 144 0.6457 0.6963 0.6457 0.8035
No log 3.1739 146 0.5944 0.7503 0.5944 0.7710
No log 3.2174 148 0.5732 0.7376 0.5732 0.7571
No log 3.2609 150 0.5685 0.7333 0.5685 0.7540
No log 3.3043 152 0.5732 0.7457 0.5732 0.7571
No log 3.3478 154 0.7028 0.6525 0.7028 0.8383
No log 3.3913 156 0.7233 0.6472 0.7233 0.8505
No log 3.4348 158 0.6239 0.6744 0.6239 0.7899
No log 3.4783 160 0.5800 0.7301 0.5800 0.7616
No log 3.5217 162 0.5866 0.7356 0.5866 0.7659
No log 3.5652 164 0.5936 0.7247 0.5936 0.7704
No log 3.6087 166 0.6346 0.6937 0.6346 0.7966
No log 3.6522 168 0.6799 0.6608 0.6799 0.8246
No log 3.6957 170 0.6773 0.6722 0.6773 0.8230
No log 3.7391 172 0.6622 0.6994 0.6622 0.8138
No log 3.7826 174 0.5883 0.7326 0.5883 0.7670
No log 3.8261 176 0.5670 0.7456 0.5670 0.7530
No log 3.8696 178 0.5585 0.7596 0.5585 0.7473
No log 3.9130 180 0.5636 0.7473 0.5636 0.7507
No log 3.9565 182 0.5570 0.7482 0.5570 0.7464
No log 4.0 184 0.5661 0.7427 0.5661 0.7524
No log 4.0435 186 0.6022 0.7094 0.6022 0.7760
No log 4.0870 188 0.5710 0.7372 0.5710 0.7556
No log 4.1304 190 0.5565 0.7690 0.5565 0.7460
No log 4.1739 192 0.5696 0.7564 0.5696 0.7547
No log 4.2174 194 0.5547 0.7607 0.5547 0.7448
No log 4.2609 196 0.5603 0.7562 0.5603 0.7485
No log 4.3043 198 0.5710 0.7487 0.5710 0.7557
No log 4.3478 200 0.6397 0.6919 0.6397 0.7998
No log 4.3913 202 0.7336 0.6488 0.7336 0.8565
No log 4.4348 204 0.7237 0.6497 0.7237 0.8507
No log 4.4783 206 0.6242 0.7087 0.6242 0.7901
No log 4.5217 208 0.5382 0.7595 0.5382 0.7337
No log 4.5652 210 0.5385 0.7526 0.5385 0.7339
No log 4.6087 212 0.5387 0.7537 0.5387 0.7340
No log 4.6522 214 0.5535 0.7650 0.5535 0.7440
No log 4.6957 216 0.5751 0.7493 0.5751 0.7584
No log 4.7391 218 0.5676 0.7560 0.5676 0.7534
No log 4.7826 220 0.5516 0.7797 0.5516 0.7427
No log 4.8261 222 0.5487 0.7573 0.5487 0.7407
No log 4.8696 224 0.5498 0.7718 0.5498 0.7415
No log 4.9130 226 0.5690 0.7262 0.5690 0.7544
No log 4.9565 228 0.5733 0.7439 0.5733 0.7571
No log 5.0 230 0.5573 0.7609 0.5573 0.7465
No log 5.0435 232 0.5573 0.7785 0.5573 0.7465
No log 5.0870 234 0.5817 0.7714 0.5817 0.7627
No log 5.1304 236 0.6135 0.7642 0.6135 0.7833
No log 5.1739 238 0.6348 0.7527 0.6348 0.7968
No log 5.2174 240 0.6578 0.7285 0.6578 0.8110
No log 5.2609 242 0.6539 0.7285 0.6539 0.8086
No log 5.3043 244 0.5961 0.7664 0.5961 0.7721
No log 5.3478 246 0.5407 0.7733 0.5407 0.7353
No log 5.3913 248 0.5629 0.7255 0.5629 0.7503
No log 5.4348 250 0.5775 0.7184 0.5775 0.7600
No log 5.4783 252 0.5313 0.7382 0.5313 0.7289
No log 5.5217 254 0.4873 0.7711 0.4873 0.6981
No log 5.5652 256 0.5204 0.7635 0.5204 0.7214
No log 5.6087 258 0.5575 0.7570 0.5575 0.7467
No log 5.6522 260 0.5370 0.7569 0.5370 0.7328
No log 5.6957 262 0.5120 0.7879 0.5120 0.7155
No log 5.7391 264 0.5519 0.7228 0.5519 0.7429
No log 5.7826 266 0.6061 0.7287 0.6061 0.7785
No log 5.8261 268 0.5943 0.7174 0.5943 0.7709
No log 5.8696 270 0.5586 0.7452 0.5586 0.7474
No log 5.9130 272 0.5523 0.7699 0.5523 0.7432
No log 5.9565 274 0.5630 0.7628 0.5630 0.7504
No log 6.0 276 0.5820 0.7531 0.5820 0.7629
No log 6.0435 278 0.5724 0.7360 0.5724 0.7566
No log 6.0870 280 0.5693 0.7432 0.5693 0.7545
No log 6.1304 282 0.5571 0.7425 0.5571 0.7464
No log 6.1739 284 0.5648 0.7555 0.5648 0.7516
No log 6.2174 286 0.5876 0.7226 0.5876 0.7665
No log 6.2609 288 0.5893 0.7364 0.5893 0.7677
No log 6.3043 290 0.5914 0.7482 0.5914 0.7690
No log 6.3478 292 0.5814 0.7447 0.5814 0.7625
No log 6.3913 294 0.5851 0.7579 0.5851 0.7649
No log 6.4348 296 0.6172 0.7346 0.6172 0.7856
No log 6.4783 298 0.6331 0.7379 0.6331 0.7957
No log 6.5217 300 0.6150 0.7517 0.6150 0.7842
No log 6.5652 302 0.5876 0.7509 0.5876 0.7666
No log 6.6087 304 0.5863 0.7339 0.5863 0.7657
No log 6.6522 306 0.5919 0.7339 0.5919 0.7694
No log 6.6957 308 0.5970 0.7593 0.5970 0.7727
No log 6.7391 310 0.6113 0.7450 0.6113 0.7819
No log 6.7826 312 0.6274 0.7374 0.6274 0.7921
No log 6.8261 314 0.6129 0.7421 0.6129 0.7829
No log 6.8696 316 0.5937 0.7452 0.5937 0.7705
No log 6.9130 318 0.5826 0.7211 0.5826 0.7633
No log 6.9565 320 0.5856 0.7138 0.5856 0.7652
No log 7.0 322 0.5941 0.7316 0.5941 0.7708
No log 7.0435 324 0.6083 0.7413 0.6083 0.7799
No log 7.0870 326 0.6417 0.7208 0.6417 0.8011
No log 7.1304 328 0.6882 0.6811 0.6882 0.8296
No log 7.1739 330 0.7080 0.6464 0.7080 0.8414
No log 7.2174 332 0.6877 0.6713 0.6877 0.8293
No log 7.2609 334 0.6469 0.7167 0.6469 0.8043
No log 7.3043 336 0.6208 0.7370 0.6208 0.7879
No log 7.3478 338 0.6098 0.7388 0.6098 0.7809
No log 7.3913 340 0.6137 0.7385 0.6137 0.7834
No log 7.4348 342 0.6202 0.7536 0.6202 0.7876
No log 7.4783 344 0.6140 0.7536 0.6140 0.7836
No log 7.5217 346 0.6108 0.7385 0.6108 0.7815
No log 7.5652 348 0.6052 0.7284 0.6052 0.7780
No log 7.6087 350 0.6050 0.7219 0.6050 0.7778
No log 7.6522 352 0.6035 0.7392 0.6035 0.7768
No log 7.6957 354 0.6060 0.7550 0.6060 0.7784
No log 7.7391 356 0.6071 0.7483 0.6071 0.7792
No log 7.7826 358 0.6053 0.7439 0.6053 0.7780
No log 7.8261 360 0.6095 0.7457 0.6095 0.7807
No log 7.8696 362 0.6101 0.7457 0.6101 0.7811
No log 7.9130 364 0.6022 0.7380 0.6022 0.7760
No log 7.9565 366 0.5941 0.7476 0.5941 0.7708
No log 8.0 368 0.5900 0.7306 0.5900 0.7681
No log 8.0435 370 0.5900 0.7324 0.5900 0.7681
No log 8.0870 372 0.5868 0.7324 0.5868 0.7660
No log 8.1304 374 0.5845 0.7324 0.5845 0.7645
No log 8.1739 376 0.5833 0.7324 0.5833 0.7637
No log 8.2174 378 0.5795 0.7443 0.5795 0.7612
No log 8.2609 380 0.5757 0.7501 0.5757 0.7587
No log 8.3043 382 0.5738 0.7473 0.5738 0.7575
No log 8.3478 384 0.5728 0.7246 0.5728 0.7569
No log 8.3913 386 0.5720 0.7520 0.5720 0.7563
No log 8.4348 388 0.5712 0.7520 0.5712 0.7558
No log 8.4783 390 0.5705 0.7386 0.5705 0.7553
No log 8.5217 392 0.5717 0.7473 0.5717 0.7561
No log 8.5652 394 0.5739 0.7427 0.5739 0.7575
No log 8.6087 396 0.5775 0.7415 0.5775 0.7599
No log 8.6522 398 0.5807 0.7443 0.5807 0.7621
No log 8.6957 400 0.5886 0.7451 0.5886 0.7672
No log 8.7391 402 0.5928 0.7388 0.5928 0.7699
No log 8.7826 404 0.5885 0.7388 0.5885 0.7671
No log 8.8261 406 0.5815 0.7399 0.5815 0.7625
No log 8.8696 408 0.5770 0.7520 0.5770 0.7596
No log 8.9130 410 0.5762 0.7380 0.5762 0.7591
No log 8.9565 412 0.5783 0.7273 0.5783 0.7604
No log 9.0 414 0.5802 0.7344 0.5802 0.7617
No log 9.0435 416 0.5806 0.7373 0.5806 0.7620
No log 9.0870 418 0.5817 0.7397 0.5817 0.7627
No log 9.1304 420 0.5850 0.7443 0.5850 0.7649
No log 9.1739 422 0.5879 0.7443 0.5879 0.7668
No log 9.2174 424 0.5901 0.7443 0.5901 0.7682
No log 9.2609 426 0.5899 0.7443 0.5899 0.7681
No log 9.3043 428 0.5889 0.7443 0.5889 0.7674
No log 9.3478 430 0.5856 0.7443 0.5856 0.7652
No log 9.3913 432 0.5826 0.7397 0.5826 0.7633
No log 9.4348 434 0.5810 0.7397 0.5810 0.7622
No log 9.4783 436 0.5807 0.7413 0.5807 0.7620
No log 9.5217 438 0.5806 0.7384 0.5806 0.7620
No log 9.5652 440 0.5807 0.7416 0.5807 0.7621
No log 9.6087 442 0.5813 0.7243 0.5813 0.7624
No log 9.6522 444 0.5811 0.7243 0.5811 0.7623
No log 9.6957 446 0.5809 0.7343 0.5809 0.7621
No log 9.7391 448 0.5809 0.7343 0.5809 0.7621
No log 9.7826 450 0.5807 0.7343 0.5807 0.7620
No log 9.8261 452 0.5807 0.7343 0.5807 0.7621
No log 9.8696 454 0.5807 0.7181 0.5807 0.7620
No log 9.9130 456 0.5807 0.7181 0.5807 0.7620
No log 9.9565 458 0.5807 0.7226 0.5807 0.7620
No log 10.0 460 0.5807 0.7226 0.5807 0.7620

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (Safetensors, F32 tensors)

Model tree

MayBashendy/ArabicNewSplits6_FineTuningAraBERT_run1_AugV5_k9_task1_organization, fine-tuned from aubmindlab/bert-base-arabertv02.