ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run1_AugV5_k19_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8508
  • Qwk (quadratic weighted kappa): 0.7429
  • Mse: 0.8508
  • Rmse: 0.9224
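For reference, the quadratic weighted kappa (Qwk) is the agreement metric commonly reported for ordinal essay-scoring tasks like this one. A minimal pure-Python sketch of how Qwk, Mse, and Rmse can be computed from integer labels (function names are illustrative, not part of this repository):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic disagreement weights over integer labels."""
    # Observed confusion matrix.
    O = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    # Quadratic penalty: disagreements far apart on the scale cost more.
    W = [[(i - j) ** 2 / (n_classes - 1) ** 2 for j in range(n_classes)]
         for i in range(n_classes)]
    # Expected matrix from the outer product of the marginals.
    n = len(y_true)
    row = [sum(O[i]) for i in range(n_classes)]
    col = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]
    E = [[row[i] * col[j] / n for j in range(n_classes)] for i in range(n_classes)]
    num = sum(W[i][j] * O[i][j] for i in range(n_classes) for j in range(n_classes))
    den = sum(W[i][j] * E[i][j] for i in range(n_classes) for j in range(n_classes))
    return 1.0 - num / den

def mse_rmse(y_true, y_pred):
    """Mean squared error and its square root."""
    mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
    return mse, math.sqrt(mse)
```

With scikit-learn available, `cohen_kappa_score(y_true, y_pred, weights="quadratic")` computes the same kappa.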

Model description

More information needed

Intended uses & limitations

More information needed
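In the absence of documented usage, a hedged inference sketch is shown below. It assumes the checkpoint exposes a single regression output, which the matching Loss and Mse values above suggest (MSE training on one logit); the example text is a placeholder.

```python
# Sketch only: loads this checkpoint for scoring a single Arabic essay.
# Assumes a single-logit regression head; verify against the actual config.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo = "MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run1_AugV5_k19_task1_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)
model.eval()

inputs = tokenizer("نص المقال هنا", return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(score)
```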

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
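The hyperparameters above can be reconstructed as a Hugging Face `TrainingArguments` configuration. This is a config sketch only; `output_dir` is an assumption, and the original training script may have set additional options not listed in this card.

```python
# Config sketch mirroring the listed hyperparameters (output_dir is assumed).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./results",          # assumption: not stated in the card
    learning_rate=2e-05,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```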

Training results

In the table below, "No log" means the training loss had not yet been logged at that step; with the logging interval of 500 steps, the first logged training loss (0.4158) appears at step 500.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0222 2 6.8397 0.0057 6.8397 2.6153
No log 0.0444 4 4.8184 0.0253 4.8184 2.1951
No log 0.0667 6 3.8380 -0.0821 3.8380 1.9591
No log 0.0889 8 2.5361 0.0839 2.5361 1.5925
No log 0.1111 10 2.0340 0.1550 2.0340 1.4262
No log 0.1333 12 1.8689 0.2131 1.8689 1.3671
No log 0.1556 14 1.9832 0.2462 1.9832 1.4082
No log 0.1778 16 2.0567 0.2446 2.0567 1.4341
No log 0.2 18 2.0568 0.2319 2.0568 1.4342
No log 0.2222 20 2.1305 0.2590 2.1305 1.4596
No log 0.2444 22 2.2513 0.1739 2.2513 1.5004
No log 0.2667 24 2.8071 0.0142 2.8071 1.6754
No log 0.2889 26 2.9039 -0.0282 2.9039 1.7041
No log 0.3111 28 2.6135 0.0286 2.6135 1.6166
No log 0.3333 30 2.1347 0.1343 2.1347 1.4611
No log 0.3556 32 1.8681 0.3088 1.8681 1.3668
No log 0.3778 34 2.0725 0.2207 2.0725 1.4396
No log 0.4 36 2.2357 0.1761 2.2357 1.4952
No log 0.4222 38 2.3616 0.1905 2.3616 1.5368
No log 0.4444 40 2.4887 0.2556 2.4887 1.5776
No log 0.4667 42 1.8198 0.3179 1.8198 1.3490
No log 0.4889 44 1.3987 0.4394 1.3987 1.1827
No log 0.5111 46 1.2469 0.5152 1.2469 1.1167
No log 0.5333 48 1.1390 0.6029 1.1390 1.0672
No log 0.5556 50 1.1581 0.5778 1.1581 1.0761
No log 0.5778 52 1.1810 0.5755 1.1810 1.0868
No log 0.6 54 1.3014 0.5342 1.3014 1.1408
No log 0.6222 56 1.3731 0.5256 1.3731 1.1718
No log 0.6444 58 1.9281 0.2361 1.9281 1.3886
No log 0.6667 60 2.4420 0.1286 2.4420 1.5627
No log 0.6889 62 2.9279 -0.0458 2.9279 1.7111
No log 0.7111 64 2.4608 0.0611 2.4608 1.5687
No log 0.7333 66 1.7474 0.3239 1.7474 1.3219
No log 0.7556 68 1.7393 0.4393 1.7393 1.3188
No log 0.7778 70 2.2811 0.3610 2.2811 1.5103
No log 0.8 72 2.0202 0.4221 2.0202 1.4213
No log 0.8222 74 1.5095 0.5 1.5095 1.2286
No log 0.8444 76 1.5785 0.3235 1.5785 1.2564
No log 0.8667 78 1.6785 0.3030 1.6785 1.2956
No log 0.8889 80 1.7838 0.2154 1.7838 1.3356
No log 0.9111 82 1.7710 0.3066 1.7710 1.3308
No log 0.9333 84 1.6104 0.4218 1.6104 1.2690
No log 0.9556 86 1.4246 0.4416 1.4246 1.1936
No log 0.9778 88 1.3592 0.5309 1.3592 1.1658
No log 1.0 90 1.2741 0.5570 1.2741 1.1288
No log 1.0222 92 1.1087 0.6832 1.1087 1.0530
No log 1.0444 94 0.8934 0.7152 0.8934 0.9452
No log 1.0667 96 0.7439 0.7451 0.7439 0.8625
No log 1.0889 98 0.8301 0.7105 0.8301 0.9111
No log 1.1111 100 1.0698 0.6743 1.0698 1.0343
No log 1.1333 102 1.0857 0.6977 1.0857 1.0420
No log 1.1556 104 0.6820 0.7333 0.6820 0.8258
No log 1.1778 106 0.6718 0.7194 0.6718 0.8196
No log 1.2 108 0.6942 0.6765 0.6942 0.8332
No log 1.2222 110 0.6562 0.7432 0.6562 0.8100
No log 1.2444 112 0.8895 0.6887 0.8895 0.9431
No log 1.2667 114 0.8144 0.6757 0.8144 0.9024
No log 1.2889 116 0.6899 0.7324 0.6899 0.8306
No log 1.3111 118 0.7806 0.6519 0.7806 0.8835
No log 1.3333 120 0.8190 0.7 0.8190 0.9050
No log 1.3556 122 1.0202 0.6797 1.0202 1.0101
No log 1.3778 124 1.3250 0.6061 1.3250 1.1511
No log 1.4 126 1.4195 0.5965 1.4195 1.1914
No log 1.4222 128 1.1413 0.6242 1.1413 1.0683
No log 1.4444 130 1.0974 0.6623 1.0974 1.0476
No log 1.4667 132 1.2825 0.4925 1.2825 1.1325
No log 1.4889 134 1.3999 0.5034 1.3999 1.1832
No log 1.5111 136 1.6063 0.5116 1.6063 1.2674
No log 1.5333 138 1.7600 0.5 1.7600 1.3267
No log 1.5556 140 1.5277 0.5185 1.5277 1.2360
No log 1.5778 142 1.2497 0.5263 1.2497 1.1179
No log 1.6 144 1.0716 0.5984 1.0716 1.0352
No log 1.6222 146 1.0001 0.625 1.0001 1.0001
No log 1.6444 148 0.8794 0.6667 0.8794 0.9378
No log 1.6667 150 0.8201 0.72 0.8201 0.9056
No log 1.6889 152 0.9730 0.7151 0.9730 0.9864
No log 1.7111 154 1.0623 0.7 1.0623 1.0307
No log 1.7333 156 0.9425 0.7168 0.9425 0.9708
No log 1.7556 158 0.9351 0.7320 0.9351 0.9670
No log 1.7778 160 0.9242 0.7320 0.9242 0.9614
No log 1.8 162 0.7984 0.7273 0.7984 0.8935
No log 1.8222 164 0.7691 0.7097 0.7691 0.8770
No log 1.8444 166 0.8172 0.7097 0.8172 0.9040
No log 1.8667 168 0.8165 0.6621 0.8165 0.9036
No log 1.8889 170 0.7751 0.6475 0.7751 0.8804
No log 1.9111 172 0.7556 0.6912 0.7556 0.8693
No log 1.9333 174 0.7413 0.7183 0.7413 0.8610
No log 1.9556 176 0.7792 0.7083 0.7792 0.8827
No log 1.9778 178 0.7579 0.7417 0.7579 0.8706
No log 2.0 180 0.7707 0.7114 0.7707 0.8779
No log 2.0222 182 0.7378 0.7342 0.7378 0.8589
No log 2.0444 184 0.7044 0.7547 0.7044 0.8393
No log 2.0667 186 0.6551 0.7432 0.6551 0.8094
No log 2.0889 188 0.6805 0.7273 0.6805 0.8249
No log 2.1111 190 0.6536 0.7297 0.6536 0.8085
No log 2.1333 192 0.6803 0.7226 0.6803 0.8248
No log 2.1556 194 0.7901 0.7284 0.7901 0.8889
No log 2.1778 196 0.7204 0.7261 0.7204 0.8488
No log 2.2 198 0.6470 0.7260 0.6470 0.8044
No log 2.2222 200 0.7933 0.6567 0.7933 0.8907
No log 2.2444 202 0.7819 0.6815 0.7819 0.8843
No log 2.2667 204 0.7011 0.7172 0.7011 0.8373
No log 2.2889 206 0.8143 0.6883 0.8143 0.9024
No log 2.3111 208 0.8267 0.7114 0.8267 0.9092
No log 2.3333 210 0.8709 0.6667 0.8709 0.9332
No log 2.3556 212 1.2496 0.4964 1.2496 1.1178
No log 2.3778 214 1.4787 0.3768 1.4787 1.2160
No log 2.4 216 1.5677 0.3830 1.5677 1.2521
No log 2.4222 218 1.5212 0.4384 1.5212 1.2334
No log 2.4444 220 1.6205 0.3913 1.6205 1.2730
No log 2.4667 222 1.3490 0.4627 1.3490 1.1615
No log 2.4889 224 0.9593 0.6423 0.9593 0.9794
No log 2.5111 226 0.7779 0.6809 0.7779 0.8820
No log 2.5333 228 0.7272 0.6759 0.7272 0.8528
No log 2.5556 230 0.7159 0.6759 0.7159 0.8461
No log 2.5778 232 0.7181 0.6667 0.7181 0.8474
No log 2.6 234 0.7487 0.6712 0.7487 0.8653
No log 2.6222 236 0.7586 0.7152 0.7586 0.8710
No log 2.6444 238 0.8076 0.725 0.8076 0.8987
No log 2.6667 240 0.8401 0.7101 0.8401 0.9166
No log 2.6889 242 0.8174 0.7326 0.8174 0.9041
No log 2.7111 244 0.7187 0.7407 0.7187 0.8478
No log 2.7333 246 0.6197 0.7848 0.6197 0.7872
No log 2.7556 248 0.5935 0.7564 0.5935 0.7704
No log 2.7778 250 0.6233 0.7673 0.6233 0.7895
No log 2.8 252 0.5661 0.775 0.5661 0.7524
No log 2.8222 254 0.5634 0.7606 0.5634 0.7506
No log 2.8444 256 0.7888 0.6667 0.7888 0.8882
No log 2.8667 258 0.8381 0.6618 0.8381 0.9155
No log 2.8889 260 0.6461 0.6963 0.6461 0.8038
No log 2.9111 262 0.5008 0.7895 0.5008 0.7077
No log 2.9333 264 0.5978 0.7595 0.5978 0.7731
No log 2.9556 266 0.6731 0.75 0.6731 0.8204
No log 2.9778 268 0.6844 0.7516 0.6844 0.8273
No log 3.0 270 0.6440 0.7436 0.6440 0.8025
No log 3.0222 272 0.6000 0.7448 0.6000 0.7746
No log 3.0444 274 0.6411 0.7299 0.6411 0.8007
No log 3.0667 276 0.5969 0.7536 0.5969 0.7726
No log 3.0889 278 0.6233 0.6569 0.6233 0.7895
No log 3.1111 280 0.6244 0.7034 0.6244 0.7902
No log 3.1333 282 0.6047 0.7436 0.6047 0.7776
No log 3.1556 284 0.4994 0.8221 0.4994 0.7067
No log 3.1778 286 0.5219 0.7871 0.5219 0.7224
No log 3.2 288 0.5178 0.7712 0.5178 0.7196
No log 3.2222 290 0.4775 0.8434 0.4775 0.6910
No log 3.2444 292 0.5296 0.7778 0.5296 0.7278
No log 3.2667 294 0.5370 0.7702 0.5370 0.7328
No log 3.2889 296 0.4889 0.8129 0.4889 0.6992
No log 3.3111 298 0.5163 0.7972 0.5163 0.7185
No log 3.3333 300 0.5270 0.7972 0.5270 0.7260
No log 3.3556 302 0.6007 0.7582 0.6007 0.7751
No log 3.3778 304 0.7968 0.7547 0.7968 0.8926
No log 3.4 306 0.7955 0.7342 0.7955 0.8919
No log 3.4222 308 0.6419 0.7417 0.6419 0.8012
No log 3.4444 310 0.5495 0.7391 0.5495 0.7413
No log 3.4667 312 0.5690 0.7626 0.5690 0.7543
No log 3.4889 314 0.5546 0.75 0.5546 0.7447
No log 3.5111 316 0.6107 0.7742 0.6107 0.7815
No log 3.5333 318 0.8662 0.7337 0.8662 0.9307
No log 3.5556 320 1.1163 0.6742 1.1163 1.0566
No log 3.5778 322 1.1190 0.6744 1.1190 1.0578
No log 3.6 324 0.9522 0.6826 0.9522 0.9758
No log 3.6222 326 0.7375 0.7607 0.7375 0.8588
No log 3.6444 328 0.6501 0.7632 0.6501 0.8063
No log 3.6667 330 0.6129 0.7682 0.6129 0.7829
No log 3.6889 332 0.6108 0.7692 0.6108 0.7815
No log 3.7111 334 0.6616 0.7826 0.6616 0.8134
No log 3.7333 336 0.7405 0.7407 0.7405 0.8605
No log 3.7556 338 0.7021 0.7547 0.7021 0.8379
No log 3.7778 340 0.6025 0.7682 0.6025 0.7762
No log 3.8 342 0.5689 0.7606 0.5689 0.7543
No log 3.8222 344 0.6128 0.7391 0.6128 0.7828
No log 3.8444 346 0.6135 0.7391 0.6135 0.7833
No log 3.8667 348 0.6495 0.7712 0.6495 0.8059
No log 3.8889 350 0.8208 0.6968 0.8208 0.9060
No log 3.9111 352 0.9215 0.6792 0.9215 0.9600
No log 3.9333 354 0.8825 0.6835 0.8825 0.9394
No log 3.9556 356 0.8252 0.6939 0.8252 0.9084
No log 3.9778 358 0.7430 0.6573 0.7430 0.8620
No log 4.0 360 0.6993 0.7133 0.6993 0.8362
No log 4.0222 362 0.6638 0.7143 0.6638 0.8147
No log 4.0444 364 0.7013 0.7042 0.7013 0.8374
No log 4.0667 366 0.9067 0.6575 0.9067 0.9522
No log 4.0889 368 0.9584 0.6331 0.9584 0.9790
No log 4.1111 370 0.8224 0.6331 0.8224 0.9069
No log 4.1333 372 0.6439 0.7222 0.6439 0.8024
No log 4.1556 374 0.5841 0.7517 0.5841 0.7642
No log 4.1778 376 0.5549 0.7712 0.5549 0.7449
No log 4.2 378 0.5296 0.7838 0.5296 0.7277
No log 4.2222 380 0.5441 0.7552 0.5441 0.7376
No log 4.2444 382 0.6028 0.7639 0.6028 0.7764
No log 4.2667 384 0.6626 0.7518 0.6626 0.8140
No log 4.2889 386 0.6771 0.7619 0.6771 0.8229
No log 4.3111 388 0.6797 0.7417 0.6797 0.8244
No log 4.3333 390 0.6680 0.7451 0.6680 0.8173
No log 4.3556 392 0.6612 0.7532 0.6612 0.8132
No log 4.3778 394 0.6157 0.7532 0.6157 0.7846
No log 4.4 396 0.5383 0.8025 0.5383 0.7337
No log 4.4222 398 0.5310 0.7922 0.5310 0.7287
No log 4.4444 400 0.5678 0.8077 0.5678 0.7535
No log 4.4667 402 0.5892 0.7843 0.5892 0.7676
No log 4.4889 404 0.6253 0.8025 0.6253 0.7908
No log 4.5111 406 0.7131 0.7485 0.7131 0.8445
No log 4.5333 408 0.8311 0.6871 0.8311 0.9117
No log 4.5556 410 0.8402 0.6795 0.8402 0.9166
No log 4.5778 412 0.7967 0.6993 0.7967 0.8926
No log 4.6 414 0.7648 0.7101 0.7648 0.8746
No log 4.6222 416 0.7403 0.7324 0.7403 0.8604
No log 4.6444 418 0.7177 0.7552 0.7177 0.8472
No log 4.6667 420 0.6404 0.7785 0.6404 0.8002
No log 4.6889 422 0.6173 0.7619 0.6173 0.7857
No log 4.7111 424 0.6245 0.7619 0.6245 0.7903
No log 4.7333 426 0.6442 0.7324 0.6442 0.8026
No log 4.7556 428 0.6572 0.7286 0.6572 0.8106
No log 4.7778 430 0.6641 0.7050 0.6641 0.8149
No log 4.8 432 0.7558 0.7361 0.7558 0.8693
No log 4.8222 434 0.9420 0.6538 0.9420 0.9706
No log 4.8444 436 1.0341 0.6026 1.0341 1.0169
No log 4.8667 438 0.9388 0.6282 0.9388 0.9689
No log 4.8889 440 0.7434 0.7432 0.7434 0.8622
No log 4.9111 442 0.5902 0.8105 0.5902 0.7683
No log 4.9333 444 0.5663 0.7838 0.5663 0.7525
No log 4.9556 446 0.5598 0.8105 0.5598 0.7482
No log 4.9778 448 0.5918 0.8302 0.5918 0.7693
No log 5.0 450 0.6479 0.7875 0.6479 0.8049
No log 5.0222 452 0.7877 0.7125 0.7877 0.8875
No log 5.0444 454 0.8642 0.6957 0.8642 0.9296
No log 5.0667 456 0.7846 0.7097 0.7846 0.8858
No log 5.0889 458 0.6743 0.7552 0.6743 0.8211
No log 5.1111 460 0.6888 0.7338 0.6888 0.8300
No log 5.1333 462 0.6898 0.7246 0.6898 0.8306
No log 5.1556 464 0.6759 0.7746 0.6759 0.8221
No log 5.1778 466 0.7431 0.6897 0.7431 0.8620
No log 5.2 468 0.7679 0.7342 0.7679 0.8763
No log 5.2222 470 0.7039 0.7421 0.7039 0.8390
No log 5.2444 472 0.5827 0.7950 0.5827 0.7634
No log 5.2667 474 0.5490 0.8105 0.5490 0.7409
No log 5.2889 476 0.5914 0.7919 0.5914 0.7690
No log 5.3111 478 0.6353 0.7724 0.6353 0.7971
No log 5.3333 480 0.6382 0.7785 0.6382 0.7989
No log 5.3556 482 0.6299 0.7763 0.6299 0.7937
No log 5.3778 484 0.6179 0.7722 0.6179 0.7861
No log 5.4 486 0.5935 0.7799 0.5935 0.7704
No log 5.4222 488 0.5735 0.8 0.5735 0.7573
No log 5.4444 490 0.5608 0.8077 0.5608 0.7489
No log 5.4667 492 0.5475 0.7895 0.5475 0.7400
No log 5.4889 494 0.5585 0.7785 0.5585 0.7473
No log 5.5111 496 0.5839 0.7703 0.5839 0.7641
No log 5.5333 498 0.6259 0.7763 0.6259 0.7912
0.4158 5.5556 500 0.6197 0.7815 0.6197 0.7872
0.4158 5.5778 502 0.6098 0.7703 0.6098 0.7809
0.4158 5.6 504 0.6112 0.7785 0.6112 0.7818
0.4158 5.6222 506 0.6280 0.7949 0.6280 0.7925
0.4158 5.6444 508 0.6745 0.7590 0.6745 0.8213
0.4158 5.6667 510 0.7943 0.7657 0.7943 0.8912
0.4158 5.6889 512 0.8981 0.7391 0.8981 0.9477
0.4158 5.7111 514 0.9187 0.7459 0.9187 0.9585
0.4158 5.7333 516 0.8508 0.7429 0.8508 0.9224

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params · Tensor type: F32 · Format: Safetensors

Model tree for MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run1_AugV5_k19_task1_organization

Fine-tuned from aubmindlab/bert-base-arabertv02.