ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k19_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7343
  • Qwk: 0.6757
  • Mse: 0.7343
  • Rmse: 0.8569
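For reference, Qwk is Cohen's quadratically weighted kappa and Rmse is the square root of Mse (sqrt(0.7343) ≈ 0.8569, matching the values above; Loss equaling Mse suggests a regression head trained with MSE loss). A minimal pure-Python sketch of both metrics, assuming integer labels 0..N-1:

```python
from math import sqrt

def quadratic_weighted_kappa(y_true, y_pred, num_classes):
    """Cohen's kappa with quadratic weights over integer labels 0..num_classes-1."""
    n = len(y_true)
    # Observed co-occurrence matrix of (true, predicted) labels
    observed = [[0.0] * num_classes for _ in range(num_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # Expected matrix under independence, from the marginal label histograms
    hist_t = [y_true.count(c) for c in range(num_classes)]
    hist_p = [y_pred.count(c) for c in range(num_classes)]
    expected = [[hist_t[i] * hist_p[j] / n for j in range(num_classes)]
                for i in range(num_classes)]
    # Quadratic disagreement weights: 0 on the diagonal, growing with distance
    w = [[(i - j) ** 2 / (num_classes - 1) ** 2 for j in range(num_classes)]
         for i in range(num_classes)]
    num = sum(w[i][j] * observed[i][j]
              for i in range(num_classes) for j in range(num_classes))
    den = sum(w[i][j] * expected[i][j]
              for i in range(num_classes) for j in range(num_classes))
    return 1.0 - num / den

def rmse(y_true, y_pred):
    """Root mean squared error."""
    return sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

print(quadratic_weighted_kappa([0, 1, 2, 2], [0, 1, 1, 2], 3))  # 0.8
print(rmse([0, 1, 2, 2], [0, 1, 1, 2]))  # 0.5
```

This matches scikit-learn's `cohen_kappa_score(..., weights="quadratic")` on the same inputs, which is the more common way to compute Qwk in practice.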

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
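With a linear scheduler and no warmup steps listed, the learning rate decays from 2e-05 toward 0 over the whole run. The log below also implies roughly 86 optimizer steps per epoch (epoch 1.0 falls at step 86); at batch size 8 that would suggest a training set of around 688 examples, though this is only inferred from the logs. A sketch of the schedule, assuming 0 warmup steps:

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Linear schedule: ramp up over warmup_steps, then decay linearly to 0."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

# ~86 steps/epoch (inferred from the log below) * 100 epochs
total = 86 * 100
print(linear_lr(0, total))           # 2e-05
print(linear_lr(total // 2, total))  # 1e-05
print(linear_lr(total, total))       # 0.0
```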

Training results

Validation metrics were logged every 2 optimizer steps; the training loss is logged only every 500 steps, so rows before step 500 show "No log".

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0233 2 6.7832 0.0178 6.7832 2.6045
No log 0.0465 4 4.5055 0.0500 4.5055 2.1226
No log 0.0698 6 3.9788 -0.0490 3.9788 1.9947
No log 0.0930 8 2.7239 0.0952 2.7239 1.6504
No log 0.1163 10 1.8879 0.1008 1.8879 1.3740
No log 0.1395 12 1.6620 0.1947 1.6620 1.2892
No log 0.1628 14 1.8049 0.1833 1.8049 1.3435
No log 0.1860 16 1.6896 0.1880 1.6896 1.2998
No log 0.2093 18 1.5811 0.2393 1.5811 1.2574
No log 0.2326 20 1.4835 0.3840 1.4835 1.2180
No log 0.2558 22 1.4570 0.4375 1.4570 1.2071
No log 0.2791 24 1.4739 0.3103 1.4739 1.2140
No log 0.3023 26 2.4233 0.1538 2.4233 1.5567
No log 0.3256 28 2.6763 0.1125 2.6763 1.6359
No log 0.3488 30 2.2197 0.2000 2.2197 1.4899
No log 0.3721 32 1.6305 0.3833 1.6305 1.2769
No log 0.3953 34 1.4128 0.2500 1.4128 1.1886
No log 0.4186 36 1.4444 0.3636 1.4444 1.2018
No log 0.4419 38 1.6262 0.3810 1.6262 1.2752
No log 0.4651 40 1.7028 0.4203 1.7028 1.3049
No log 0.4884 42 1.6783 0.4225 1.6783 1.2955
No log 0.5116 44 1.4788 0.4615 1.4788 1.2160
No log 0.5349 46 1.1361 0.5455 1.1361 1.0659
No log 0.5581 48 0.9916 0.5821 0.9916 0.9958
No log 0.5814 50 0.9319 0.6269 0.9319 0.9653
No log 0.6047 52 0.9514 0.6618 0.9514 0.9754
No log 0.6279 54 0.9795 0.6370 0.9795 0.9897
No log 0.6512 56 0.9655 0.6074 0.9655 0.9826
No log 0.6744 58 0.9329 0.5970 0.9329 0.9659
No log 0.6977 60 0.8561 0.6567 0.8561 0.9253
No log 0.7209 62 0.8385 0.6316 0.8385 0.9157
No log 0.7442 64 0.7946 0.6515 0.7946 0.8914
No log 0.7674 66 0.7523 0.6667 0.7523 0.8673
No log 0.7907 68 0.7472 0.7050 0.7472 0.8644
No log 0.8140 70 0.7306 0.7518 0.7306 0.8548
No log 0.8372 72 0.6717 0.7552 0.6717 0.8196
No log 0.8605 74 0.5876 0.7671 0.5876 0.7666
No log 0.8837 76 0.6144 0.7891 0.6144 0.7839
No log 0.9070 78 0.6267 0.7891 0.6267 0.7916
No log 0.9302 80 0.6516 0.7973 0.6516 0.8072
No log 0.9535 82 0.6286 0.7397 0.6286 0.7928
No log 0.9767 84 0.8436 0.6950 0.8436 0.9185
No log 1.0 86 0.8950 0.6761 0.8950 0.9461
No log 1.0233 88 0.7791 0.7042 0.7791 0.8827
No log 1.0465 90 0.7920 0.6715 0.7920 0.8900
No log 1.0698 92 0.8618 0.6715 0.8618 0.9283
No log 1.0930 94 1.0090 0.6620 1.0090 1.0045
No log 1.1163 96 1.1583 0.6099 1.1583 1.0762
No log 1.1395 98 1.0931 0.6331 1.0931 1.0455
No log 1.1628 100 0.9636 0.6370 0.9636 0.9816
No log 1.1860 102 0.9391 0.6119 0.9391 0.9691
No log 1.2093 104 0.9039 0.6569 0.9039 0.9507
No log 1.2326 106 0.9325 0.6667 0.9325 0.9657
No log 1.2558 108 0.9535 0.6809 0.9535 0.9765
No log 1.2791 110 0.8193 0.6667 0.8193 0.9052
No log 1.3023 112 0.8000 0.6471 0.8000 0.8944
No log 1.3256 114 0.7674 0.7273 0.7674 0.8760
No log 1.3488 116 0.7336 0.7133 0.7336 0.8565
No log 1.3721 118 0.9204 0.6383 0.9204 0.9594
No log 1.3953 120 0.9674 0.6383 0.9674 0.9836
No log 1.4186 122 0.7418 0.6892 0.7418 0.8613
No log 1.4419 124 0.5801 0.7763 0.5801 0.7616
No log 1.4651 126 0.5656 0.8052 0.5656 0.7521
No log 1.4884 128 0.5651 0.8052 0.5651 0.7518
No log 1.5116 130 0.6530 0.7383 0.6530 0.8081
No log 1.5349 132 0.7176 0.7361 0.7176 0.8471
No log 1.5581 134 0.6775 0.7429 0.6775 0.8231
No log 1.5814 136 0.7759 0.6620 0.7759 0.8808
No log 1.6047 138 0.8246 0.6573 0.8246 0.9081
No log 1.6279 140 0.7298 0.7682 0.7298 0.8543
No log 1.6512 142 0.6904 0.7682 0.6904 0.8309
No log 1.6744 144 0.6289 0.7843 0.6289 0.7930
No log 1.6977 146 0.6337 0.7922 0.6337 0.7961
No log 1.7209 148 0.6359 0.7347 0.6359 0.7974
No log 1.7442 150 0.6481 0.7347 0.6481 0.8051
No log 1.7674 152 0.6565 0.7448 0.6565 0.8102
No log 1.7907 154 0.7482 0.6809 0.7482 0.8650
No log 1.8140 156 0.9968 0.6197 0.9968 0.9984
No log 1.8372 158 0.8583 0.7162 0.8583 0.9264
No log 1.8605 160 0.6612 0.7871 0.6612 0.8131
No log 1.8837 162 0.6473 0.7662 0.6473 0.8046
No log 1.9070 164 0.8292 0.7190 0.8292 0.9106
No log 1.9302 166 1.1878 0.5811 1.1878 1.0899
No log 1.9535 168 1.5523 0.4626 1.5523 1.2459
No log 1.9767 170 1.6267 0.3194 1.6267 1.2754
No log 2.0 172 1.3902 0.4444 1.3902 1.1791
No log 2.0233 174 1.3370 0.4722 1.3370 1.1563
No log 2.0465 176 1.4212 0.4225 1.4212 1.1921
No log 2.0698 178 1.4402 0.4722 1.4402 1.2001
No log 2.0930 180 1.2899 0.5241 1.2899 1.1357
No log 2.1163 182 1.0196 0.6014 1.0196 1.0098
No log 2.1395 184 0.8981 0.6853 0.8981 0.9477
No log 2.1628 186 1.0477 0.6154 1.0477 1.0236
No log 2.1860 188 0.9744 0.6345 0.9744 0.9871
No log 2.2093 190 0.7346 0.7123 0.7346 0.8571
No log 2.2326 192 0.9405 0.6528 0.9405 0.9698
No log 2.2558 194 1.0887 0.5874 1.0887 1.0434
No log 2.2791 196 0.9007 0.6713 0.9007 0.9490
No log 2.3023 198 0.7545 0.7222 0.7545 0.8686
No log 2.3256 200 0.7578 0.7532 0.7578 0.8705
No log 2.3488 202 0.7309 0.7763 0.7309 0.8549
No log 2.3721 204 0.9177 0.6667 0.9177 0.9579
No log 2.3953 206 1.3086 0.5205 1.3086 1.1440
No log 2.4186 208 1.3943 0.5035 1.3943 1.1808
No log 2.4419 210 1.2243 0.5594 1.2243 1.1065
No log 2.4651 212 1.0197 0.6861 1.0197 1.0098
No log 2.4884 214 0.9320 0.6087 0.9320 0.9654
No log 2.5116 216 0.9189 0.6043 0.9189 0.9586
No log 2.5349 218 0.8917 0.6944 0.8917 0.9443
No log 2.5581 220 0.9711 0.6803 0.9711 0.9854
No log 2.5814 222 1.0776 0.5931 1.0776 1.0381
No log 2.6047 224 1.1506 0.5342 1.1506 1.0727
No log 2.6279 226 1.1670 0.5342 1.1670 1.0803
No log 2.6512 228 1.0307 0.6207 1.0307 1.0152
No log 2.6744 230 0.9500 0.6857 0.9500 0.9747
No log 2.6977 232 0.9940 0.5802 0.9940 0.9970
No log 2.7209 234 0.9901 0.6418 0.9901 0.9950
No log 2.7442 236 0.9838 0.6571 0.9838 0.9918
No log 2.7674 238 1.1362 0.5594 1.1362 1.0659
No log 2.7907 240 1.3689 0.5068 1.3689 1.1700
No log 2.8140 242 1.3378 0.5067 1.3378 1.1566
No log 2.8372 244 1.0310 0.6184 1.0310 1.0154
No log 2.8605 246 0.9062 0.6800 0.9062 0.9520
No log 2.8837 248 0.8734 0.6755 0.8734 0.9346
No log 2.9070 250 0.8127 0.7383 0.8127 0.9015
No log 2.9302 252 0.8128 0.7413 0.8128 0.9016
No log 2.9535 254 0.8599 0.6074 0.8599 0.9273
No log 2.9767 256 0.8965 0.5821 0.8965 0.9469
No log 3.0 258 0.8851 0.6074 0.8851 0.9408
No log 3.0233 260 0.7561 0.7050 0.7561 0.8695
No log 3.0465 262 0.6579 0.7206 0.6579 0.8111
No log 3.0698 264 0.6355 0.7724 0.6355 0.7972
No log 3.0930 266 0.6466 0.7785 0.6466 0.8041
No log 3.1163 268 0.6876 0.7619 0.6876 0.8292
No log 3.1395 270 0.7060 0.7703 0.7060 0.8402
No log 3.1628 272 0.7471 0.7083 0.7471 0.8644
No log 3.1860 274 0.7738 0.7273 0.7738 0.8797
No log 3.2093 276 0.8053 0.7552 0.8053 0.8974
No log 3.2326 278 0.8560 0.6963 0.8560 0.9252
No log 3.2558 280 0.8197 0.6906 0.8197 0.9054
No log 3.2791 282 0.7588 0.7347 0.7588 0.8711
No log 3.3023 284 0.7226 0.7808 0.7226 0.8501
No log 3.3256 286 0.7264 0.7724 0.7264 0.8523
No log 3.3488 288 0.7572 0.6715 0.7572 0.8702
No log 3.3721 290 0.7930 0.6567 0.7930 0.8905
No log 3.3953 292 0.7773 0.7111 0.7773 0.8817
No log 3.4186 294 0.7170 0.7286 0.7170 0.8468
No log 3.4419 296 0.6653 0.7361 0.6653 0.8157
No log 3.4651 298 0.6343 0.7413 0.6343 0.7964
No log 3.4884 300 0.7099 0.6853 0.7099 0.8425
No log 3.5116 302 0.7479 0.6849 0.7479 0.8648
No log 3.5349 304 0.7686 0.6853 0.7686 0.8767
No log 3.5581 306 0.6950 0.7286 0.6950 0.8337
No log 3.5814 308 0.6905 0.7273 0.6905 0.8310
No log 3.6047 310 0.7194 0.7324 0.7194 0.8482
No log 3.6279 312 0.7322 0.7183 0.7322 0.8557
No log 3.6512 314 0.7205 0.7310 0.7205 0.8488
No log 3.6744 316 0.8078 0.7432 0.8078 0.8988
No log 3.6977 318 0.7897 0.7763 0.7897 0.8887
No log 3.7209 320 0.6797 0.7843 0.6797 0.8244
No log 3.7442 322 0.6781 0.7692 0.6781 0.8235
No log 3.7674 324 0.7208 0.7215 0.7208 0.8490
No log 3.7907 326 0.6904 0.7125 0.6904 0.8309
No log 3.8140 328 0.5982 0.7547 0.5982 0.7734
No log 3.8372 330 0.5355 0.8049 0.5355 0.7317
No log 3.8605 332 0.5179 0.8221 0.5179 0.7197
No log 3.8837 334 0.5328 0.8485 0.5328 0.7300
No log 3.9070 336 0.5817 0.8176 0.5817 0.7627
No log 3.9302 338 0.5813 0.8258 0.5813 0.7624
No log 3.9535 340 0.6180 0.8052 0.6180 0.7861
No log 3.9767 342 0.6651 0.7703 0.6651 0.8155
No log 4.0 344 0.6658 0.7703 0.6658 0.8160
No log 4.0233 346 0.6168 0.7867 0.6168 0.7854
No log 4.0465 348 0.5608 0.8428 0.5608 0.7488
No log 4.0698 350 0.5714 0.8199 0.5714 0.7559
No log 4.0930 352 0.5551 0.8302 0.5551 0.7451
No log 4.1163 354 0.6226 0.7785 0.6226 0.7891
No log 4.1395 356 0.7506 0.6621 0.7506 0.8664
No log 4.1628 358 0.7496 0.6621 0.7496 0.8658
No log 4.1860 360 0.6784 0.7671 0.6784 0.8237
No log 4.2093 362 0.6440 0.7755 0.6440 0.8025
No log 4.2326 364 0.6400 0.7867 0.6400 0.8000
No log 4.2558 366 0.6231 0.7815 0.6231 0.7894
No log 4.2791 368 0.5944 0.8026 0.5944 0.7710
No log 4.3023 370 0.5833 0.8182 0.5833 0.7638
No log 4.3256 372 0.6092 0.7682 0.6092 0.7805
No log 4.3488 374 0.6660 0.7483 0.6660 0.8161
No log 4.3721 376 0.7327 0.7248 0.7327 0.8560
No log 4.3953 378 0.6524 0.7619 0.6524 0.8077
No log 4.4186 380 0.6196 0.7919 0.6196 0.7872
No log 4.4419 382 0.6200 0.7703 0.6200 0.7874
No log 4.4651 384 0.6000 0.7703 0.6000 0.7746
No log 4.4884 386 0.5872 0.7871 0.5872 0.7663
No log 4.5116 388 0.5617 0.7871 0.5617 0.7495
No log 4.5349 390 0.5459 0.7867 0.5459 0.7388
No log 4.5581 392 0.5811 0.7879 0.5811 0.7623
No log 4.5814 394 0.5791 0.7929 0.5791 0.7610
No log 4.6047 396 0.5327 0.8344 0.5327 0.7298
No log 4.6279 398 0.5924 0.8052 0.5924 0.7697
No log 4.6512 400 0.6680 0.7867 0.6680 0.8173
No log 4.6744 402 0.6552 0.7286 0.6552 0.8095
No log 4.6977 404 0.6389 0.7376 0.6389 0.7993
No log 4.7209 406 0.6064 0.7862 0.6064 0.7787
No log 4.7442 408 0.6037 0.7919 0.6037 0.7770
No log 4.7674 410 0.6059 0.8000 0.6059 0.7784
No log 4.7907 412 0.6293 0.7947 0.6293 0.7933
No log 4.8140 414 0.6330 0.7552 0.6330 0.7956
No log 4.8372 416 0.6493 0.7586 0.6493 0.8058
No log 4.8605 418 0.7115 0.6993 0.7115 0.8435
No log 4.8837 420 0.7110 0.7083 0.7110 0.8432
No log 4.9070 422 0.6604 0.7260 0.6604 0.8127
No log 4.9302 424 0.5937 0.7922 0.5937 0.7705
No log 4.9535 426 0.6098 0.7742 0.6098 0.7809
No log 4.9767 428 0.6741 0.7260 0.6741 0.8211
No log 5.0 430 0.8096 0.6667 0.8096 0.8998
No log 5.0233 432 0.9079 0.6383 0.9079 0.9528
No log 5.0465 434 0.8405 0.6377 0.8405 0.9168
No log 5.0698 436 0.7755 0.7015 0.7755 0.8806
No log 5.0930 438 0.7325 0.7153 0.7325 0.8558
No log 5.1163 440 0.6812 0.7376 0.6812 0.8254
No log 5.1395 442 0.6943 0.7413 0.6943 0.8333
No log 5.1628 444 0.6921 0.7211 0.6921 0.8319
No log 5.1860 446 0.6091 0.7785 0.6091 0.7805
No log 5.2093 448 0.5739 0.8129 0.5739 0.7576
No log 5.2326 450 0.6027 0.7815 0.6027 0.7764
No log 5.2558 452 0.6331 0.7733 0.6331 0.7957
No log 5.2791 454 0.6423 0.7919 0.6423 0.8015
No log 5.3023 456 0.6395 0.7703 0.6395 0.7997
No log 5.3256 458 0.6782 0.7619 0.6782 0.8236
No log 5.3488 460 0.6907 0.7500 0.6907 0.8311
No log 5.3721 462 0.6804 0.7639 0.6804 0.8249
No log 5.3953 464 0.7123 0.7534 0.7123 0.8440
No log 5.4186 466 0.7236 0.7534 0.7236 0.8507
No log 5.4419 468 0.7558 0.7194 0.7558 0.8694
No log 5.4651 470 0.7816 0.6861 0.7816 0.8841
No log 5.4884 472 0.7699 0.6906 0.7699 0.8774
No log 5.5116 474 0.7249 0.7483 0.7249 0.8514
No log 5.5349 476 0.6943 0.7448 0.6943 0.8333
No log 5.5581 478 0.6692 0.7133 0.6692 0.8180
No log 5.5814 480 0.6792 0.7361 0.6792 0.8241
No log 5.6047 482 0.7119 0.7194 0.7119 0.8437
No log 5.6279 484 0.7010 0.7639 0.7010 0.8373
No log 5.6512 486 0.7227 0.7194 0.7227 0.8501
No log 5.6744 488 0.7546 0.7007 0.7546 0.8687
No log 5.6977 490 0.7740 0.7376 0.7740 0.8798
No log 5.7209 492 0.6892 0.7552 0.6892 0.8302
No log 5.7442 494 0.6552 0.7733 0.6552 0.8095
No log 5.7674 496 0.6573 0.7550 0.6573 0.8107
No log 5.7907 498 0.6843 0.7550 0.6843 0.8272
0.4345 5.8140 500 0.7081 0.7397 0.7081 0.8415
0.4345 5.8372 502 0.7371 0.7260 0.7371 0.8585
0.4345 5.8605 504 0.7418 0.7361 0.7418 0.8613
0.4345 5.8837 506 0.7406 0.7361 0.7406 0.8606
0.4345 5.9070 508 0.6760 0.7619 0.6760 0.8222
0.4345 5.9302 510 0.6532 0.7703 0.6532 0.8082
0.4345 5.9535 512 0.6567 0.7500 0.6567 0.8104
0.4345 5.9767 514 0.6411 0.7662 0.6411 0.8007
0.4345 6.0 516 0.6012 0.8105 0.6012 0.7753
0.4345 6.0233 518 0.6038 0.8105 0.6038 0.7771
0.4345 6.0465 520 0.6210 0.8105 0.6210 0.7880
0.4345 6.0698 522 0.6726 0.7582 0.6726 0.8201
0.4345 6.0930 524 0.7562 0.7320 0.7562 0.8696
0.4345 6.1163 526 0.7593 0.7320 0.7593 0.8714
0.4345 6.1395 528 0.6713 0.7632 0.6713 0.8194
0.4345 6.1628 530 0.6414 0.7550 0.6414 0.8009
0.4345 6.1860 532 0.6456 0.7919 0.6456 0.8035
0.4345 6.2093 534 0.7141 0.7534 0.7141 0.8450
0.4345 6.2326 536 0.7119 0.7534 0.7119 0.8437
0.4345 6.2558 538 0.6409 0.7838 0.6409 0.8006
0.4345 6.2791 540 0.6684 0.7568 0.6684 0.8175
0.4345 6.3023 542 0.7500 0.7162 0.7500 0.8660
0.4345 6.3256 544 0.7136 0.7403 0.7136 0.8447
0.4345 6.3488 546 0.6133 0.7632 0.6133 0.7832
0.4345 6.3721 548 0.5371 0.8153 0.5371 0.7329
0.4345 6.3953 550 0.5565 0.8077 0.5565 0.7460
0.4345 6.4186 552 0.5892 0.7815 0.5892 0.7676
0.4345 6.4419 554 0.6511 0.7755 0.6511 0.8069
0.4345 6.4651 556 0.7702 0.7042 0.7702 0.8776
0.4345 6.4884 558 0.7832 0.6993 0.7832 0.8850
0.4345 6.5116 560 0.6953 0.7361 0.6953 0.8338
0.4345 6.5349 562 0.6462 0.7815 0.6462 0.8038
0.4345 6.5581 564 0.6215 0.7947 0.6215 0.7884
0.4345 6.5814 566 0.6064 0.8158 0.6064 0.7787
0.4345 6.6047 568 0.5808 0.8171 0.5808 0.7621
0.4345 6.6279 570 0.5416 0.8235 0.5416 0.7359
0.4345 6.6512 572 0.5270 0.8214 0.5270 0.7259
0.4345 6.6744 574 0.5872 0.8024 0.5872 0.7663
0.4345 6.6977 576 0.6549 0.7515 0.6549 0.8093
0.4345 6.7209 578 0.6084 0.7871 0.6084 0.7800
0.4345 6.7442 580 0.5866 0.8077 0.5866 0.7659
0.4345 6.7674 582 0.6427 0.8000 0.6427 0.8017
0.4345 6.7907 584 0.6734 0.7843 0.6734 0.8206
0.4345 6.8140 586 0.6946 0.7310 0.6946 0.8334
0.4345 6.8372 588 0.7555 0.6906 0.7555 0.8692
0.4345 6.8605 590 0.8579 0.6479 0.8579 0.9262
0.4345 6.8837 592 0.8474 0.6621 0.8474 0.9205
0.4345 6.9070 594 0.7343 0.6757 0.7343 0.8569
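The final row (validation loss 0.7343, Qwk 0.6757) matches the summary at the top, so the reported model appears to be the last checkpoint rather than the best one: the peak validation Qwk in the log is 0.8485 at epoch 3.8837, with the lowest validation loss (0.5179) one eval step earlier. A small sketch of selecting the best checkpoint from (epoch, loss, qwk) tuples, using a few rows copied from the table above:

```python
# (epoch, validation_loss, qwk) — a few rows copied from the training log above
rows = [
    (3.8605, 0.5179, 0.8221),
    (3.8837, 0.5328, 0.8485),
    (4.0698, 0.5714, 0.8199),
    (6.9070, 0.7343, 0.6757),  # final checkpoint, reported in the summary
]

best_by_qwk = max(rows, key=lambda r: r[2])   # highest agreement
best_by_loss = min(rows, key=lambda r: r[1])  # lowest validation loss
print(best_by_qwk)   # (3.8837, 0.5328, 0.8485)
print(best_by_loss)  # (3.8605, 0.5179, 0.8221)
```

In the Transformers Trainer this kind of selection is what `load_best_model_at_end` with `metric_for_best_model` automates; it does not appear to have been used for this run.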

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model tree for MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k19_task1_organization

Finetuned from aubmindlab/bert-base-arabertv02