ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k9_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the auto-generated card lists it as "None"). It achieves the following results on the evaluation set:

  • Loss: 0.6390
  • Qwk (quadratic weighted kappa): 0.4224
  • Mse (mean squared error): 0.6390
  • Rmse (root mean squared error): 0.7994
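The reported metrics are internally consistent: Rmse is the square root of Mse (sqrt(0.6390) ≈ 0.7994), and Qwk is Cohen's kappa with quadratic weights, the standard agreement metric for ordinal scoring tasks. A self-contained sketch of these metrics in pure Python (the score values below are illustrative, not from the card's data):

```python
import math
from collections import Counter

def mse(y_true, y_pred):
    """Mean squared error between gold and predicted scores."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    """Root mean squared error: sqrt of MSE, as in the card's Mse/Rmse pair."""
    return math.sqrt(mse(y_true, y_pred))

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights over integer classes 0..n_classes-1."""
    n = len(y_true)
    observed = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    hist_true, hist_pred = Counter(y_true), Counter(y_pred)
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2
            expected = hist_true[i] * hist_pred[j] / n  # independence baseline
            num += w * observed[i][j]
            den += w * expected
    return 1.0 - num / den

# Illustrative gold/predicted scores (NOT from the card's data):
gold, pred = [0, 1, 2, 1], [0, 2, 2, 1]
print(mse(gold, pred), rmse(gold, pred), quadratic_weighted_kappa(gold, pred, 3))
```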

Model description

More information needed

Intended uses & limitations

More information needed
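The card does not document usage, but the checkpoint can presumably be loaded through the standard transformers API. The sketch below assumes a sequence-classification (regression-style scoring) head, which is consistent with the MSE/QWK metrics above but is not stated in the card; the example sentence is likewise only illustrative.

```python
# Hedged sketch: head type and usage are assumptions, not documented in the card.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo = "MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k9_task7_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)

# "An Arabic text to score" -- placeholder input
inputs = tokenizer("نص عربي للتقييم", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits)
```

Running this requires network access to download the checkpoint from the Hub.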

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
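The hyperparameters above map directly onto transformers TrainingArguments. This is a configuration-only sketch: the output directory is a hypothetical name, and the dataset, model, and metric wiring are omitted because the card does not document them.

```python
# Sketch mirroring the listed hyperparameters; train/eval data are not
# documented in the card, so only the arguments object is shown.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="arabert-task7-organization",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```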

Training results

A "No log" entry in the Training Loss column means the training loss had not yet been logged at that evaluation step; the first logged value (0.3138) appears at step 500.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0833 2 2.3861 -0.0262 2.3861 1.5447
No log 0.1667 4 1.0688 0.2508 1.0688 1.0338
No log 0.25 6 1.2133 -0.1162 1.2133 1.1015
No log 0.3333 8 1.5554 -0.1924 1.5554 1.2471
No log 0.4167 10 1.1353 -0.0622 1.1353 1.0655
No log 0.5 12 0.9150 0.0 0.9150 0.9566
No log 0.5833 14 0.9082 0.0541 0.9082 0.9530
No log 0.6667 16 0.9328 0.1416 0.9328 0.9658
No log 0.75 18 1.0116 0.2412 1.0116 1.0058
No log 0.8333 20 0.9747 0.1277 0.9747 0.9873
No log 0.9167 22 0.8724 0.0 0.8724 0.9340
No log 1.0 24 0.9296 0.0 0.9296 0.9641
No log 1.0833 26 0.9671 0.1345 0.9671 0.9834
No log 1.1667 28 1.0301 0.2533 1.0301 1.0149
No log 1.25 30 0.9275 0.2408 0.9275 0.9631
No log 1.3333 32 0.7627 0.0 0.7627 0.8733
No log 1.4167 34 0.7109 0.0 0.7109 0.8432
No log 1.5 36 0.7444 0.0 0.7444 0.8628
No log 1.5833 38 0.7567 0.0 0.7567 0.8699
No log 1.6667 40 0.7563 0.0481 0.7563 0.8697
No log 1.75 42 0.7223 -0.0027 0.7223 0.8499
No log 1.8333 44 0.7282 0.0444 0.7282 0.8533
No log 1.9167 46 0.7845 0.2132 0.7845 0.8857
No log 2.0 48 0.8331 0.3131 0.8331 0.9128
No log 2.0833 50 0.8155 0.2736 0.8155 0.9031
No log 2.1667 52 0.7107 0.1729 0.7107 0.8430
No log 2.25 54 0.6970 0.1365 0.6970 0.8349
No log 2.3333 56 0.6822 0.1321 0.6822 0.8260
No log 2.4167 58 0.6988 0.2118 0.6988 0.8360
No log 2.5 60 0.7431 0.2087 0.7431 0.8620
No log 2.5833 62 0.8569 0.3192 0.8569 0.9257
No log 2.6667 64 0.8842 0.3601 0.8842 0.9403
No log 2.75 66 0.7330 0.2992 0.7330 0.8561
No log 2.8333 68 0.6165 0.3862 0.6165 0.7852
No log 2.9167 70 0.7260 0.3637 0.7260 0.8521
No log 3.0 72 0.7028 0.3637 0.7028 0.8384
No log 3.0833 74 0.6228 0.2744 0.6228 0.7892
No log 3.1667 76 0.9057 0.3892 0.9057 0.9517
No log 3.25 78 0.9123 0.3892 0.9123 0.9551
No log 3.3333 80 0.6551 0.3135 0.6551 0.8094
No log 3.4167 82 0.6258 0.4103 0.6258 0.7911
No log 3.5 84 0.6210 0.3864 0.6210 0.7880
No log 3.5833 86 0.6178 0.3581 0.6178 0.7860
No log 3.6667 88 0.6578 0.3492 0.6578 0.8110
No log 3.75 90 0.6617 0.2973 0.6617 0.8135
No log 3.8333 92 0.7579 0.3384 0.7579 0.8706
No log 3.9167 94 0.9450 0.3538 0.9450 0.9721
No log 4.0 96 0.8618 0.4064 0.8618 0.9283
No log 4.0833 98 0.9660 0.3274 0.9660 0.9828
No log 4.1667 100 1.2206 0.3257 1.2206 1.1048
No log 4.25 102 0.8915 0.3377 0.8915 0.9442
No log 4.3333 104 0.7829 0.3343 0.7829 0.8848
No log 4.4167 106 0.6971 0.4640 0.6971 0.8349
No log 4.5 108 0.6775 0.4137 0.6775 0.8231
No log 4.5833 110 0.7188 0.3746 0.7188 0.8478
No log 4.6667 112 1.0206 0.3460 1.0206 1.0102
No log 4.75 114 0.9865 0.3517 0.9865 0.9932
No log 4.8333 116 0.7478 0.3606 0.7478 0.8647
No log 4.9167 118 0.6217 0.3713 0.6217 0.7885
No log 5.0 120 0.6146 0.4229 0.6146 0.7840
No log 5.0833 122 0.6523 0.3032 0.6523 0.8077
No log 5.1667 124 0.7586 0.4154 0.7586 0.8710
No log 5.25 126 0.7891 0.3287 0.7891 0.8883
No log 5.3333 128 0.8265 0.3228 0.8265 0.9091
No log 5.4167 130 0.7152 0.3473 0.7152 0.8457
No log 5.5 132 0.6211 0.3950 0.6211 0.7881
No log 5.5833 134 0.6231 0.3950 0.6231 0.7893
No log 5.6667 136 0.6130 0.4919 0.6130 0.7829
No log 5.75 138 0.6447 0.4764 0.6447 0.8029
No log 5.8333 140 0.5739 0.4763 0.5739 0.7575
No log 5.9167 142 0.5840 0.4349 0.5840 0.7642
No log 6.0 144 0.5841 0.4486 0.5841 0.7642
No log 6.0833 146 0.5589 0.4934 0.5589 0.7476
No log 6.1667 148 0.5758 0.4234 0.5758 0.7588
No log 6.25 150 0.5744 0.4314 0.5744 0.7579
No log 6.3333 152 0.5930 0.4762 0.5930 0.7700
No log 6.4167 154 0.5972 0.4747 0.5972 0.7728
No log 6.5 156 0.6934 0.3746 0.6934 0.8327
No log 6.5833 158 0.6785 0.3494 0.6785 0.8237
No log 6.6667 160 0.6313 0.3865 0.6313 0.7945
No log 6.75 162 0.6418 0.3475 0.6418 0.8011
No log 6.8333 164 0.6381 0.4482 0.6381 0.7988
No log 6.9167 166 0.6555 0.3833 0.6555 0.8096
No log 7.0 168 0.6561 0.3994 0.6561 0.8100
No log 7.0833 170 0.6339 0.4423 0.6339 0.7962
No log 7.1667 172 0.6704 0.4186 0.6704 0.8188
No log 7.25 174 0.7030 0.4464 0.7030 0.8384
No log 7.3333 176 0.6455 0.4044 0.6455 0.8034
No log 7.4167 178 0.6422 0.4362 0.6422 0.8014
No log 7.5 180 0.6252 0.3859 0.6252 0.7907
No log 7.5833 182 0.6244 0.4724 0.6244 0.7902
No log 7.6667 184 0.6981 0.4502 0.6981 0.8355
No log 7.75 186 0.7872 0.4080 0.7872 0.8872
No log 7.8333 188 0.7234 0.4562 0.7234 0.8505
No log 7.9167 190 0.5980 0.5115 0.5980 0.7733
No log 8.0 192 0.5941 0.4283 0.5941 0.7708
No log 8.0833 194 0.6624 0.5251 0.6624 0.8139
No log 8.1667 196 0.6827 0.5251 0.6827 0.8263
No log 8.25 198 0.6207 0.4892 0.6207 0.7879
No log 8.3333 200 0.5781 0.4753 0.5781 0.7603
No log 8.4167 202 0.5687 0.5201 0.5687 0.7541
No log 8.5 204 0.5809 0.4370 0.5809 0.7622
No log 8.5833 206 0.7074 0.4076 0.7074 0.8411
No log 8.6667 208 0.8151 0.3151 0.8151 0.9028
No log 8.75 210 0.7371 0.3657 0.7371 0.8585
No log 8.8333 212 0.5940 0.4315 0.5940 0.7707
No log 8.9167 214 0.6194 0.4749 0.6194 0.7870
No log 9.0 216 0.8320 0.3657 0.8320 0.9122
No log 9.0833 218 0.9181 0.3963 0.9181 0.9582
No log 9.1667 220 0.7738 0.3665 0.7738 0.8797
No log 9.25 222 0.6380 0.4375 0.6380 0.7987
No log 9.3333 224 0.6211 0.3787 0.6211 0.7881
No log 9.4167 226 0.6129 0.3787 0.6129 0.7829
No log 9.5 228 0.6008 0.3738 0.6008 0.7751
No log 9.5833 230 0.5968 0.4547 0.5968 0.7725
No log 9.6667 232 0.5890 0.4547 0.5890 0.7674
No log 9.75 234 0.5940 0.5079 0.5940 0.7707
No log 9.8333 236 0.5920 0.4516 0.5920 0.7694
No log 9.9167 238 0.6036 0.4942 0.6036 0.7769
No log 10.0 240 0.6257 0.4895 0.6257 0.7910
No log 10.0833 242 0.6088 0.4655 0.6088 0.7802
No log 10.1667 244 0.5913 0.4314 0.5913 0.7690
No log 10.25 246 0.6086 0.4059 0.6086 0.7802
No log 10.3333 248 0.5933 0.4534 0.5933 0.7703
No log 10.4167 250 0.6159 0.3746 0.6159 0.7848
No log 10.5 252 0.6718 0.3918 0.6718 0.8197
No log 10.5833 254 0.7078 0.4030 0.7078 0.8413
No log 10.6667 256 0.6874 0.4030 0.6874 0.8291
No log 10.75 258 0.6249 0.3789 0.6249 0.7905
No log 10.8333 260 0.6110 0.2652 0.6110 0.7817
No log 10.9167 262 0.6381 0.3712 0.6381 0.7988
No log 11.0 264 0.7081 0.3918 0.7081 0.8415
No log 11.0833 266 0.6925 0.4067 0.6925 0.8322
No log 11.1667 268 0.6269 0.3789 0.6269 0.7918
No log 11.25 270 0.6165 0.3196 0.6165 0.7852
No log 11.3333 272 0.6283 0.3355 0.6283 0.7927
No log 11.4167 274 0.6267 0.3942 0.6267 0.7916
No log 11.5 276 0.6127 0.3640 0.6127 0.7828
No log 11.5833 278 0.5887 0.4970 0.5887 0.7672
No log 11.6667 280 0.5749 0.5208 0.5749 0.7582
No log 11.75 282 0.5695 0.4402 0.5695 0.7546
No log 11.8333 284 0.5763 0.5397 0.5763 0.7591
No log 11.9167 286 0.5796 0.5397 0.5796 0.7613
No log 12.0 288 0.5694 0.5208 0.5694 0.7546
No log 12.0833 290 0.5710 0.4949 0.5710 0.7556
No log 12.1667 292 0.5736 0.4949 0.5736 0.7573
No log 12.25 294 0.5827 0.5386 0.5827 0.7634
No log 12.3333 296 0.5881 0.4949 0.5881 0.7669
No log 12.4167 298 0.6423 0.3127 0.6423 0.8014
No log 12.5 300 0.6896 0.3963 0.6896 0.8304
No log 12.5833 302 0.6525 0.3544 0.6525 0.8077
No log 12.6667 304 0.6958 0.3723 0.6958 0.8342
No log 12.75 306 0.8065 0.4366 0.8065 0.8980
No log 12.8333 308 0.7472 0.4123 0.7472 0.8644
No log 12.9167 310 0.6422 0.3261 0.6422 0.8014
No log 13.0 312 0.5945 0.3387 0.5945 0.7710
No log 13.0833 314 0.5943 0.3701 0.5943 0.7709
No log 13.1667 316 0.6153 0.3399 0.6153 0.7844
No log 13.25 318 0.6532 0.4808 0.6532 0.8082
No log 13.3333 320 0.6448 0.4409 0.6448 0.8030
No log 13.4167 322 0.5870 0.5086 0.5870 0.7661
No log 13.5 324 0.5503 0.4934 0.5503 0.7418
No log 13.5833 326 0.5474 0.4701 0.5474 0.7398
No log 13.6667 328 0.5688 0.5468 0.5688 0.7542
No log 13.75 330 0.5852 0.5149 0.5852 0.7650
No log 13.8333 332 0.6089 0.4582 0.6089 0.7803
No log 13.9167 334 0.6444 0.4329 0.6444 0.8028
No log 14.0 336 0.6500 0.4067 0.6500 0.8062
No log 14.0833 338 0.5936 0.4144 0.5936 0.7705
No log 14.1667 340 0.5563 0.4596 0.5563 0.7459
No log 14.25 342 0.5538 0.5422 0.5538 0.7442
No log 14.3333 344 0.5623 0.5307 0.5623 0.7499
No log 14.4167 346 0.6049 0.4247 0.6049 0.7778
No log 14.5 348 0.6294 0.4067 0.6294 0.7934
No log 14.5833 350 0.5853 0.3814 0.5853 0.7651
No log 14.6667 352 0.5728 0.4524 0.5728 0.7569
No log 14.75 354 0.5683 0.5307 0.5683 0.7539
No log 14.8333 356 0.5774 0.4639 0.5774 0.7599
No log 14.9167 358 0.6541 0.4582 0.6541 0.8088
No log 15.0 360 0.6730 0.4880 0.6730 0.8204
No log 15.0833 362 0.5899 0.5254 0.5899 0.7681
No log 15.1667 364 0.5496 0.5800 0.5496 0.7413
No log 15.25 366 0.5654 0.5092 0.5654 0.7519
No log 15.3333 368 0.5579 0.5361 0.5579 0.7470
No log 15.4167 370 0.5783 0.4292 0.5783 0.7605
No log 15.5 372 0.6137 0.4251 0.6137 0.7834
No log 15.5833 374 0.5868 0.4020 0.5868 0.7661
No log 15.6667 376 0.5551 0.4206 0.5551 0.7451
No log 15.75 378 0.5494 0.4837 0.5494 0.7412
No log 15.8333 380 0.5556 0.4124 0.5556 0.7454
No log 15.9167 382 0.5950 0.4190 0.5950 0.7713
No log 16.0 384 0.5972 0.4190 0.5972 0.7728
No log 16.0833 386 0.5874 0.4076 0.5874 0.7665
No log 16.1667 388 0.5807 0.3471 0.5807 0.7620
No log 16.25 390 0.5708 0.3572 0.5708 0.7555
No log 16.3333 392 0.5905 0.3996 0.5905 0.7684
No log 16.4167 394 0.6001 0.4491 0.6001 0.7747
No log 16.5 396 0.5735 0.4444 0.5735 0.7573
No log 16.5833 398 0.5721 0.4703 0.5721 0.7564
No log 16.6667 400 0.5745 0.4970 0.5745 0.7580
No log 16.75 402 0.5787 0.4918 0.5787 0.7607
No log 16.8333 404 0.5834 0.4849 0.5834 0.7638
No log 16.9167 406 0.5647 0.4949 0.5647 0.7515
No log 17.0 408 0.5651 0.4729 0.5651 0.7517
No log 17.0833 410 0.5682 0.4444 0.5682 0.7538
No log 17.1667 412 0.5546 0.4729 0.5546 0.7447
No log 17.25 414 0.5430 0.4402 0.5430 0.7369
No log 17.3333 416 0.5482 0.5305 0.5482 0.7404
No log 17.4167 418 0.5754 0.4918 0.5754 0.7585
No log 17.5 420 0.5550 0.5065 0.5550 0.7450
No log 17.5833 422 0.5254 0.5122 0.5254 0.7249
No log 17.6667 424 0.5447 0.4867 0.5447 0.7380
No log 17.75 426 0.5390 0.5329 0.5390 0.7341
No log 17.8333 428 0.5321 0.4825 0.5321 0.7295
No log 17.9167 430 0.5353 0.5386 0.5353 0.7316
No log 18.0 432 0.5315 0.5539 0.5315 0.7291
No log 18.0833 434 0.5380 0.5151 0.5380 0.7335
No log 18.1667 436 0.5621 0.5272 0.5621 0.7497
No log 18.25 438 0.5617 0.5272 0.5617 0.7494
No log 18.3333 440 0.5368 0.5042 0.5368 0.7327
No log 18.4167 442 0.5330 0.5042 0.5330 0.7301
No log 18.5 444 0.5371 0.5521 0.5371 0.7329
No log 18.5833 446 0.5754 0.4437 0.5754 0.7586
No log 18.6667 448 0.6138 0.4089 0.6138 0.7835
No log 18.75 450 0.6158 0.3819 0.6158 0.7847
No log 18.8333 452 0.6007 0.4414 0.6007 0.7751
No log 18.9167 454 0.5751 0.4945 0.5751 0.7584
No log 19.0 456 0.5829 0.4684 0.5829 0.7635
No log 19.0833 458 0.6289 0.4144 0.6289 0.7931
No log 19.1667 460 0.7113 0.3918 0.7113 0.8434
No log 19.25 462 0.7137 0.3918 0.7137 0.8448
No log 19.3333 464 0.7073 0.3918 0.7073 0.8410
No log 19.4167 466 0.6710 0.4224 0.6710 0.8191
No log 19.5 468 0.6414 0.3637 0.6414 0.8009
No log 19.5833 470 0.6724 0.4224 0.6724 0.8200
No log 19.6667 472 0.6465 0.3637 0.6465 0.8040
No log 19.75 474 0.5702 0.4306 0.5702 0.7551
No log 19.8333 476 0.5260 0.4264 0.5260 0.7252
No log 19.9167 478 0.5410 0.3661 0.5410 0.7355
No log 20.0 480 0.5377 0.3633 0.5377 0.7333
No log 20.0833 482 0.5276 0.4444 0.5276 0.7264
No log 20.1667 484 0.5896 0.3712 0.5896 0.7678
No log 20.25 486 0.6450 0.3564 0.6450 0.8031
No log 20.3333 488 0.6560 0.3746 0.6560 0.8099
No log 20.4167 490 0.6347 0.3564 0.6347 0.7967
No log 20.5 492 0.5774 0.3518 0.5774 0.7599
No log 20.5833 494 0.5588 0.4330 0.5588 0.7475
No log 20.6667 496 0.5481 0.4614 0.5481 0.7403
No log 20.75 498 0.5390 0.5587 0.5390 0.7342
0.3138 20.8333 500 0.5481 0.5307 0.5481 0.7404
0.3138 20.9167 502 0.5560 0.5053 0.5560 0.7456
0.3138 21.0 504 0.5494 0.5141 0.5494 0.7412
0.3138 21.0833 506 0.5528 0.4895 0.5528 0.7435
0.3138 21.1667 508 0.5518 0.4639 0.5518 0.7428
0.3138 21.25 510 0.5456 0.5550 0.5456 0.7386
0.3138 21.3333 512 0.5448 0.5379 0.5448 0.7381
0.3138 21.4167 514 0.5265 0.6111 0.5265 0.7256
0.3138 21.5 516 0.5239 0.5782 0.5239 0.7238
0.3138 21.5833 518 0.5338 0.5141 0.5338 0.7306
0.3138 21.6667 520 0.5280 0.5853 0.5280 0.7266
0.3138 21.75 522 0.5249 0.5044 0.5249 0.7245
0.3138 21.8333 524 0.5265 0.5305 0.5265 0.7256
0.3138 21.9167 526 0.5167 0.5798 0.5167 0.7188
0.3138 22.0 528 0.5127 0.5853 0.5127 0.7160
0.3138 22.0833 530 0.5846 0.4749 0.5846 0.7646
0.3138 22.1667 532 0.6323 0.4329 0.6323 0.7952
0.3138 22.25 534 0.6166 0.4224 0.6166 0.7852
0.3138 22.3333 536 0.5869 0.4835 0.5869 0.7661
0.3138 22.4167 538 0.5446 0.5362 0.5446 0.7379
0.3138 22.5 540 0.5320 0.5926 0.5320 0.7294
0.3138 22.5833 542 0.5337 0.5926 0.5337 0.7305
0.3138 22.6667 544 0.5442 0.6214 0.5442 0.7377
0.3138 22.75 546 0.5458 0.6214 0.5458 0.7388
0.3138 22.8333 548 0.5250 0.5593 0.5250 0.7246
0.3138 22.9167 550 0.5171 0.5784 0.5171 0.7191
0.3138 23.0 552 0.5131 0.6091 0.5131 0.7163
0.3138 23.0833 554 0.5175 0.5539 0.5175 0.7194
0.3138 23.1667 556 0.5207 0.5286 0.5207 0.7216
0.3138 23.25 558 0.5310 0.4569 0.5310 0.7287
0.3138 23.3333 560 0.5671 0.3996 0.5671 0.7530
0.3138 23.4167 562 0.6023 0.3918 0.6023 0.7761
0.3138 23.5 564 0.6390 0.4224 0.6390 0.7994

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
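To reproduce this environment, the versions above can be pinned with pip. Note that the +cu118 build of PyTorch ships from the PyTorch wheel index rather than PyPI, so it is installed separately here:

```shell
# Pin the framework versions listed above (CUDA 11.8 build of PyTorch assumed)
pip install transformers==4.44.2 datasets==2.21.0 tokenizers==0.19.1
pip install torch==2.4.0 --index-url https://download.pytorch.org/whl/cu118
```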
Model size: 0.1B params (Safetensors, tensor type F32)

Model tree for MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k9_task7_organization

Finetuned from aubmindlab/bert-base-arabertv02