ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k15_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02; the training dataset is not specified in the card metadata. It achieves the following results on the evaluation set:

  • Loss: 0.7803
  • QWK (quadratic weighted kappa): 0.4955
  • MSE: 0.7803
  • RMSE: 0.8834
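
The QWK score above is quadratic weighted kappa, a standard agreement metric for ordinal labels (such as essay or organization scores): disagreements between predicted and true classes are penalized by the squared distance between them. As a minimal pure-Python sketch of the metric (the toy ratings below are hypothetical, not from this model's evaluation set):

```python
def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic weighted kappa: chance-corrected agreement between two
    integer ratings, penalizing errors by squared class distance."""
    n = len(y_true)
    # Observed confusion matrix
    O = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    # Marginal histograms of true and predicted ratings
    hist_t = [sum(row) for row in O]
    hist_p = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2   # quadratic weight
            expected = hist_t[i] * hist_p[j] / n       # chance agreement
            num += w * O[i][j]
            den += w * expected
    return 1.0 - num / den

# Toy ratings on a 0-4 scale (hypothetical example data)
true = [0, 1, 2, 3, 4, 2, 1]
pred = [0, 1, 2, 2, 4, 3, 0]
print(quadratic_weighted_kappa(true, pred, 5))
```

A QWK of 1.0 is perfect agreement, 0.0 is chance-level; the 0.4955 reported here indicates moderate agreement with the reference scores.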

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
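
As a sketch only, the hyperparameters above would map onto `transformers.TrainingArguments` roughly as follows (the `output_dir` value is an assumption; it is not stated in the card):

```python
from transformers import TrainingArguments

# Hedged reconstruction of the listed hyperparameters; the Adam betas and
# epsilon are set explicitly to mirror the values named above.
training_args = TrainingArguments(
    output_dir="./results",           # assumption: not specified in the card
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```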

Training results

| Training Loss | Epoch | Step | Validation Loss | QWK | MSE | RMSE |
|:-------------:|:-----:|:----:|:---------------:|:-----:|:-----:|:-----:|
| No log | 0.0513 | 2 | 4.6165 | -0.0179 | 4.6165 | 2.1486 |
| No log | 0.1026 | 4 | 2.8710 | -0.0231 | 2.8710 | 1.6944 |
| No log | 0.1538 | 6 | 2.1203 | -0.0647 | 2.1203 | 1.4561 |
| No log | 0.2051 | 8 | 1.4524 | 0.0279 | 1.4524 | 1.2051 |
| No log | 0.2564 | 10 | 1.6575 | 0.0300 | 1.6575 | 1.2874 |
| No log | 0.3077 | 12 | 1.5431 | 0.0371 | 1.5431 | 1.2422 |
| No log | 0.3590 | 14 | 1.3230 | -0.0511 | 1.3230 | 1.1502 |
| No log | 0.4103 | 16 | 1.1595 | 0.0882 | 1.1595 | 1.0768 |
| No log | 0.4615 | 18 | 1.2108 | 0.0909 | 1.2108 | 1.1004 |
| No log | 0.5128 | 20 | 1.1741 | 0.1154 | 1.1741 | 1.0836 |
| No log | 0.5641 | 22 | 1.1535 | 0.0792 | 1.1535 | 1.0740 |
| No log | 0.6154 | 24 | 1.1449 | 0.1408 | 1.1449 | 1.0700 |
| No log | 0.6667 | 26 | 1.1799 | 0.0970 | 1.1799 | 1.0862 |
| No log | 0.7179 | 28 | 1.2803 | 0.0232 | 1.2803 | 1.1315 |
| No log | 0.7692 | 30 | 1.3531 | 0.0 | 1.3531 | 1.1632 |
| No log | 0.8205 | 32 | 1.2932 | 0.0380 | 1.2932 | 1.1372 |
| No log | 0.8718 | 34 | 1.1336 | 0.2074 | 1.1336 | 1.0647 |
| No log | 0.9231 | 36 | 1.0738 | 0.1218 | 1.0738 | 1.0362 |
| No log | 0.9744 | 38 | 1.0661 | 0.1370 | 1.0661 | 1.0325 |
| No log | 1.0256 | 40 | 1.0737 | 0.1848 | 1.0737 | 1.0362 |
| No log | 1.0769 | 42 | 1.0947 | 0.2074 | 1.0947 | 1.0463 |
| No log | 1.1282 | 44 | 1.1012 | 0.2125 | 1.1012 | 1.0494 |
| No log | 1.1795 | 46 | 1.0271 | 0.1725 | 1.0271 | 1.0135 |
| No log | 1.2308 | 48 | 0.9669 | 0.2944 | 0.9669 | 0.9833 |
| No log | 1.2821 | 50 | 0.9679 | 0.3288 | 0.9679 | 0.9838 |
| No log | 1.3333 | 52 | 0.9473 | 0.2944 | 0.9473 | 0.9733 |
| No log | 1.3846 | 54 | 0.9451 | 0.2842 | 0.9451 | 0.9722 |
| No log | 1.4359 | 56 | 0.9414 | 0.2865 | 0.9414 | 0.9703 |
| No log | 1.4872 | 58 | 0.9470 | 0.3979 | 0.9470 | 0.9731 |
| No log | 1.5385 | 60 | 1.0563 | 0.2441 | 1.0563 | 1.0278 |
| No log | 1.5897 | 62 | 1.0513 | 0.2834 | 1.0513 | 1.0253 |
| No log | 1.6410 | 64 | 1.0556 | 0.3108 | 1.0556 | 1.0274 |
| No log | 1.6923 | 66 | 1.0786 | 0.3547 | 1.0786 | 1.0385 |
| No log | 1.7436 | 68 | 1.1077 | 0.2835 | 1.1077 | 1.0525 |
| No log | 1.7949 | 70 | 1.1438 | 0.2669 | 1.1438 | 1.0695 |
| No log | 1.8462 | 72 | 1.0474 | 0.2551 | 1.0474 | 1.0234 |
| No log | 1.8974 | 74 | 1.0008 | 0.3414 | 1.0008 | 1.0004 |
| No log | 1.9487 | 76 | 0.9496 | 0.3819 | 0.9496 | 0.9745 |
| No log | 2.0 | 78 | 0.9969 | 0.3063 | 0.9969 | 0.9985 |
| No log | 2.0513 | 80 | 1.0100 | 0.2474 | 1.0100 | 1.0050 |
| No log | 2.1026 | 82 | 0.9360 | 0.3153 | 0.9360 | 0.9675 |
| No log | 2.1538 | 84 | 0.9227 | 0.3304 | 0.9227 | 0.9606 |
| No log | 2.2051 | 86 | 0.8863 | 0.3214 | 0.8863 | 0.9414 |
| No log | 2.2564 | 88 | 0.8880 | 0.3175 | 0.8880 | 0.9423 |
| No log | 2.3077 | 90 | 0.8734 | 0.3744 | 0.8734 | 0.9346 |
| No log | 2.3590 | 92 | 0.8972 | 0.3976 | 0.8972 | 0.9472 |
| No log | 2.4103 | 94 | 0.9447 | 0.4466 | 0.9447 | 0.9720 |
| No log | 2.4615 | 96 | 0.9657 | 0.4231 | 0.9657 | 0.9827 |
| No log | 2.5128 | 98 | 0.9785 | 0.4231 | 0.9785 | 0.9892 |
| No log | 2.5641 | 100 | 0.9767 | 0.4404 | 0.9767 | 0.9883 |
| No log | 2.6154 | 102 | 1.0072 | 0.4711 | 1.0072 | 1.0036 |
| No log | 2.6667 | 104 | 0.9995 | 0.4662 | 0.9995 | 0.9997 |
| No log | 2.7179 | 106 | 1.1354 | 0.3130 | 1.1354 | 1.0655 |
| No log | 2.7692 | 108 | 1.0171 | 0.4211 | 1.0171 | 1.0085 |
| No log | 2.8205 | 110 | 0.9990 | 0.4211 | 0.9990 | 0.9995 |
| No log | 2.8718 | 112 | 1.0314 | 0.3787 | 1.0314 | 1.0156 |
| No log | 2.9231 | 114 | 1.0224 | 0.3787 | 1.0224 | 1.0111 |
| No log | 2.9744 | 116 | 0.9335 | 0.3335 | 0.9335 | 0.9662 |
| No log | 3.0256 | 118 | 0.9375 | 0.4709 | 0.9375 | 0.9682 |
| No log | 3.0769 | 120 | 1.0067 | 0.4369 | 1.0067 | 1.0034 |
| No log | 3.1282 | 122 | 0.8711 | 0.4879 | 0.8711 | 0.9333 |
| No log | 3.1795 | 124 | 1.0941 | 0.4515 | 1.0941 | 1.0460 |
| No log | 3.2308 | 126 | 1.1544 | 0.4471 | 1.1544 | 1.0744 |
| No log | 3.2821 | 128 | 0.9587 | 0.5106 | 0.9587 | 0.9791 |
| No log | 3.3333 | 130 | 0.8255 | 0.5618 | 0.8255 | 0.9086 |
| No log | 3.3846 | 132 | 1.0152 | 0.3878 | 1.0152 | 1.0076 |
| No log | 3.4359 | 134 | 1.1338 | 0.3666 | 1.1338 | 1.0648 |
| No log | 3.4872 | 136 | 0.9193 | 0.4161 | 0.9193 | 0.9588 |
| No log | 3.5385 | 138 | 0.8026 | 0.4792 | 0.8026 | 0.8959 |
| No log | 3.5897 | 140 | 0.9448 | 0.2543 | 0.9448 | 0.9720 |
| No log | 3.6410 | 142 | 0.8931 | 0.3988 | 0.8931 | 0.9451 |
| No log | 3.6923 | 144 | 0.7539 | 0.5492 | 0.7539 | 0.8683 |
| No log | 3.7436 | 146 | 0.8108 | 0.4943 | 0.8108 | 0.9005 |
| No log | 3.7949 | 148 | 1.0066 | 0.4898 | 1.0066 | 1.0033 |
| No log | 3.8462 | 150 | 0.9984 | 0.4579 | 0.9984 | 0.9992 |
| No log | 3.8974 | 152 | 0.8072 | 0.5291 | 0.8072 | 0.8985 |
| No log | 3.9487 | 154 | 0.8034 | 0.6089 | 0.8034 | 0.8963 |
| No log | 4.0 | 156 | 0.8058 | 0.6050 | 0.8058 | 0.8977 |
| No log | 4.0513 | 158 | 0.7590 | 0.5485 | 0.7590 | 0.8712 |
| No log | 4.1026 | 160 | 0.8928 | 0.3001 | 0.8928 | 0.9449 |
| No log | 4.1538 | 162 | 1.0286 | 0.1487 | 1.0286 | 1.0142 |
| No log | 4.2051 | 164 | 0.9408 | 0.3743 | 0.9408 | 0.9699 |
| No log | 4.2564 | 166 | 0.7911 | 0.4988 | 0.7911 | 0.8895 |
| No log | 4.3077 | 168 | 0.8690 | 0.3566 | 0.8690 | 0.9322 |
| No log | 4.3590 | 170 | 0.8536 | 0.3704 | 0.8536 | 0.9239 |
| No log | 4.4103 | 172 | 0.8642 | 0.5192 | 0.8642 | 0.9296 |
| No log | 4.4615 | 174 | 0.9413 | 0.4568 | 0.9413 | 0.9702 |
| No log | 4.5128 | 176 | 0.8599 | 0.4824 | 0.8599 | 0.9273 |
| No log | 4.5641 | 178 | 0.7842 | 0.4119 | 0.7842 | 0.8855 |
| No log | 4.6154 | 180 | 0.8721 | 0.3926 | 0.8721 | 0.9339 |
| No log | 4.6667 | 182 | 0.8499 | 0.4308 | 0.8499 | 0.9219 |
| No log | 4.7179 | 184 | 0.7779 | 0.5176 | 0.7779 | 0.8820 |
| No log | 4.7692 | 186 | 0.8383 | 0.4724 | 0.8383 | 0.9156 |
| No log | 4.8205 | 188 | 0.9667 | 0.3972 | 0.9667 | 0.9832 |
| No log | 4.8718 | 190 | 0.9227 | 0.4268 | 0.9227 | 0.9606 |
| No log | 4.9231 | 192 | 0.8471 | 0.4550 | 0.8471 | 0.9204 |
| No log | 4.9744 | 194 | 0.8421 | 0.4550 | 0.8421 | 0.9177 |
| No log | 5.0256 | 196 | 0.8040 | 0.4385 | 0.8040 | 0.8966 |
| No log | 5.0769 | 198 | 0.7935 | 0.4078 | 0.7935 | 0.8908 |
| No log | 5.1282 | 200 | 0.7886 | 0.4473 | 0.7886 | 0.8880 |
| No log | 5.1795 | 202 | 0.8235 | 0.4433 | 0.8235 | 0.9075 |
| No log | 5.2308 | 204 | 0.8667 | 0.4291 | 0.8667 | 0.9310 |
| No log | 5.2821 | 206 | 0.8313 | 0.4097 | 0.8313 | 0.9118 |
| No log | 5.3333 | 208 | 0.8407 | 0.4676 | 0.8407 | 0.9169 |
| No log | 5.3846 | 210 | 0.8525 | 0.4368 | 0.8525 | 0.9233 |
| No log | 5.4359 | 212 | 0.9009 | 0.4220 | 0.9009 | 0.9492 |
| No log | 5.4872 | 214 | 0.9234 | 0.4143 | 0.9234 | 0.9609 |
| No log | 5.5385 | 216 | 0.9535 | 0.4285 | 0.9535 | 0.9765 |
| No log | 5.5897 | 218 | 0.8889 | 0.4633 | 0.8889 | 0.9428 |
| No log | 5.6410 | 220 | 0.9011 | 0.3945 | 0.9011 | 0.9492 |
| No log | 5.6923 | 222 | 0.8453 | 0.4858 | 0.8453 | 0.9194 |
| No log | 5.7436 | 224 | 0.8058 | 0.3979 | 0.8058 | 0.8977 |
| No log | 5.7949 | 226 | 0.8207 | 0.3702 | 0.8207 | 0.9060 |
| No log | 5.8462 | 228 | 0.9398 | 0.4036 | 0.9398 | 0.9695 |
| No log | 5.8974 | 230 | 0.9640 | 0.3236 | 0.9640 | 0.9819 |
| No log | 5.9487 | 232 | 0.8711 | 0.3782 | 0.8711 | 0.9333 |
| No log | 6.0 | 234 | 0.8008 | 0.4428 | 0.8008 | 0.8949 |
| No log | 6.0513 | 236 | 0.8288 | 0.4354 | 0.8288 | 0.9104 |
| No log | 6.1026 | 238 | 0.8043 | 0.5368 | 0.8043 | 0.8968 |
| No log | 6.1538 | 240 | 0.9108 | 0.5138 | 0.9108 | 0.9544 |
| No log | 6.2051 | 242 | 1.2842 | 0.3513 | 1.2842 | 1.1332 |
| No log | 6.2564 | 244 | 1.4479 | 0.3154 | 1.4479 | 1.2033 |
| No log | 6.3077 | 246 | 1.3135 | 0.3811 | 1.3135 | 1.1461 |
| No log | 6.3590 | 248 | 1.0261 | 0.5098 | 1.0261 | 1.0130 |
| No log | 6.4103 | 250 | 0.8697 | 0.4859 | 0.8697 | 0.9326 |
| No log | 6.4615 | 252 | 0.7952 | 0.4450 | 0.7952 | 0.8918 |
| No log | 6.5128 | 254 | 0.7701 | 0.4434 | 0.7701 | 0.8776 |
| No log | 6.5641 | 256 | 0.7707 | 0.5025 | 0.7707 | 0.8779 |
| No log | 6.6154 | 258 | 0.7526 | 0.5377 | 0.7526 | 0.8675 |
| No log | 6.6667 | 260 | 0.7373 | 0.5986 | 0.7373 | 0.8587 |
| No log | 6.7179 | 262 | 0.7187 | 0.5990 | 0.7187 | 0.8478 |
| No log | 6.7692 | 264 | 0.7296 | 0.6148 | 0.7296 | 0.8541 |
| No log | 6.8205 | 266 | 0.7313 | 0.6120 | 0.7313 | 0.8551 |
| No log | 6.8718 | 268 | 0.7603 | 0.6253 | 0.7603 | 0.8719 |
| No log | 6.9231 | 270 | 0.7963 | 0.6083 | 0.7963 | 0.8924 |
| No log | 6.9744 | 272 | 0.7534 | 0.6423 | 0.7534 | 0.8680 |
| No log | 7.0256 | 274 | 0.7571 | 0.5618 | 0.7571 | 0.8701 |
| No log | 7.0769 | 276 | 0.8005 | 0.5366 | 0.8005 | 0.8947 |
| No log | 7.1282 | 278 | 0.7461 | 0.5905 | 0.7461 | 0.8638 |
| No log | 7.1795 | 280 | 0.7250 | 0.5485 | 0.7250 | 0.8515 |
| No log | 7.2308 | 282 | 0.7529 | 0.5301 | 0.7529 | 0.8677 |
| No log | 7.2821 | 284 | 0.7620 | 0.5301 | 0.7620 | 0.8729 |
| No log | 7.3333 | 286 | 0.7358 | 0.5833 | 0.7358 | 0.8578 |
| No log | 7.3846 | 288 | 0.8247 | 0.5163 | 0.8247 | 0.9081 |
| No log | 7.4359 | 290 | 0.9558 | 0.4373 | 0.9558 | 0.9777 |
| No log | 7.4872 | 292 | 0.9064 | 0.4570 | 0.9064 | 0.9521 |
| No log | 7.5385 | 294 | 0.7819 | 0.5676 | 0.7819 | 0.8843 |
| No log | 7.5897 | 296 | 0.7385 | 0.4644 | 0.7385 | 0.8593 |
| No log | 7.6410 | 298 | 0.7454 | 0.5131 | 0.7454 | 0.8634 |
| No log | 7.6923 | 300 | 0.7812 | 0.5540 | 0.7812 | 0.8839 |
| No log | 7.7436 | 302 | 0.8641 | 0.4911 | 0.8641 | 0.9296 |
| No log | 7.7949 | 304 | 0.9851 | 0.4767 | 0.9851 | 0.9925 |
| No log | 7.8462 | 306 | 0.9480 | 0.4551 | 0.9480 | 0.9736 |
| No log | 7.8974 | 308 | 0.8190 | 0.4958 | 0.8190 | 0.9050 |
| No log | 7.9487 | 310 | 0.7567 | 0.5688 | 0.7567 | 0.8699 |
| No log | 8.0 | 312 | 0.7723 | 0.5706 | 0.7723 | 0.8788 |
| No log | 8.0513 | 314 | 0.7919 | 0.5587 | 0.7919 | 0.8899 |
| No log | 8.1026 | 316 | 0.7684 | 0.6354 | 0.7684 | 0.8766 |
| No log | 8.1538 | 318 | 0.7397 | 0.5859 | 0.7397 | 0.8600 |
| No log | 8.2051 | 320 | 0.7447 | 0.5657 | 0.7447 | 0.8630 |
| No log | 8.2564 | 322 | 0.7546 | 0.5666 | 0.7546 | 0.8687 |
| No log | 8.3077 | 324 | 0.7340 | 0.5098 | 0.7340 | 0.8567 |
| No log | 8.3590 | 326 | 0.7190 | 0.5485 | 0.7190 | 0.8480 |
| No log | 8.4103 | 328 | 0.7318 | 0.5017 | 0.7318 | 0.8555 |
| No log | 8.4615 | 330 | 0.7497 | 0.5485 | 0.7497 | 0.8659 |
| No log | 8.5128 | 332 | 0.7926 | 0.5304 | 0.7926 | 0.8903 |
| No log | 8.5641 | 334 | 0.8152 | 0.5484 | 0.8152 | 0.9029 |
| No log | 8.6154 | 336 | 0.7987 | 0.5098 | 0.7987 | 0.8937 |
| No log | 8.6667 | 338 | 0.7844 | 0.5359 | 0.7844 | 0.8857 |
| No log | 8.7179 | 340 | 0.7932 | 0.4995 | 0.7932 | 0.8906 |
| No log | 8.7692 | 342 | 0.7971 | 0.4082 | 0.7971 | 0.8928 |
| No log | 8.8205 | 344 | 0.8018 | 0.3797 | 0.8018 | 0.8954 |
| No log | 8.8718 | 346 | 0.7933 | 0.3797 | 0.7933 | 0.8907 |
| No log | 8.9231 | 348 | 0.7856 | 0.4082 | 0.7856 | 0.8863 |
| No log | 8.9744 | 350 | 0.7822 | 0.4082 | 0.7822 | 0.8844 |
| No log | 9.0256 | 352 | 0.7804 | 0.4995 | 0.7804 | 0.8834 |
| No log | 9.0769 | 354 | 0.8031 | 0.4730 | 0.8031 | 0.8961 |
| No log | 9.1282 | 356 | 0.8454 | 0.5088 | 0.8454 | 0.9195 |
| No log | 9.1795 | 358 | 0.8773 | 0.4946 | 0.8773 | 0.9366 |
| No log | 9.2308 | 360 | 0.8159 | 0.4593 | 0.8159 | 0.9033 |
| No log | 9.2821 | 362 | 0.7757 | 0.4527 | 0.7757 | 0.8807 |
| No log | 9.3333 | 364 | 0.7877 | 0.4802 | 0.7877 | 0.8875 |
| No log | 9.3846 | 366 | 0.7808 | 0.4789 | 0.7808 | 0.8836 |
| No log | 9.4359 | 368 | 0.8586 | 0.5079 | 0.8586 | 0.9266 |
| No log | 9.4872 | 370 | 0.9658 | 0.3761 | 0.9658 | 0.9827 |
| No log | 9.5385 | 372 | 0.9177 | 0.4192 | 0.9177 | 0.9580 |
| No log | 9.5897 | 374 | 0.7977 | 0.4730 | 0.7977 | 0.8931 |
| No log | 9.6410 | 376 | 0.7787 | 0.5163 | 0.7787 | 0.8824 |
| No log | 9.6923 | 378 | 0.8348 | 0.5548 | 0.8348 | 0.9137 |
| No log | 9.7436 | 380 | 0.7987 | 0.5852 | 0.7987 | 0.8937 |
| No log | 9.7949 | 382 | 0.7373 | 0.5166 | 0.7373 | 0.8587 |
| No log | 9.8462 | 384 | 0.7357 | 0.5582 | 0.7357 | 0.8577 |
| No log | 9.8974 | 386 | 0.7747 | 0.5959 | 0.7747 | 0.8802 |
| No log | 9.9487 | 388 | 0.8030 | 0.5888 | 0.8030 | 0.8961 |
| No log | 10.0 | 390 | 0.7391 | 0.5986 | 0.7391 | 0.8597 |
| No log | 10.0513 | 392 | 0.6888 | 0.5602 | 0.6888 | 0.8300 |
| No log | 10.1026 | 394 | 0.6826 | 0.5939 | 0.6826 | 0.8262 |
| No log | 10.1538 | 396 | 0.6859 | 0.5724 | 0.6859 | 0.8282 |
| No log | 10.2051 | 398 | 0.7079 | 0.6014 | 0.7079 | 0.8413 |
| No log | 10.2564 | 400 | 0.7163 | 0.6043 | 0.7163 | 0.8463 |
| No log | 10.3077 | 402 | 0.7463 | 0.5663 | 0.7463 | 0.8639 |
| No log | 10.3590 | 404 | 0.8317 | 0.5266 | 0.8317 | 0.9120 |
| No log | 10.4103 | 406 | 0.8553 | 0.5447 | 0.8553 | 0.9248 |
| No log | 10.4615 | 408 | 0.7723 | 0.5416 | 0.7723 | 0.8788 |
| No log | 10.5128 | 410 | 0.7297 | 0.4851 | 0.7297 | 0.8542 |
| No log | 10.5641 | 412 | 0.7225 | 0.4610 | 0.7225 | 0.8500 |
| No log | 10.6154 | 414 | 0.7194 | 0.4490 | 0.7194 | 0.8482 |
| No log | 10.6667 | 416 | 0.7009 | 0.4776 | 0.7009 | 0.8372 |
| No log | 10.7179 | 418 | 0.6902 | 0.4927 | 0.6902 | 0.8308 |
| No log | 10.7692 | 420 | 0.7027 | 0.5375 | 0.7027 | 0.8383 |
| No log | 10.8205 | 422 | 0.7299 | 0.4980 | 0.7299 | 0.8544 |
| No log | 10.8718 | 424 | 0.7393 | 0.4980 | 0.7393 | 0.8599 |
| No log | 10.9231 | 426 | 0.7246 | 0.4626 | 0.7246 | 0.8512 |
| No log | 10.9744 | 428 | 0.7143 | 0.4388 | 0.7143 | 0.8452 |
| No log | 11.0256 | 430 | 0.7245 | 0.4507 | 0.7245 | 0.8512 |
| No log | 11.0769 | 432 | 0.7809 | 0.4713 | 0.7809 | 0.8837 |
| No log | 11.1282 | 434 | 0.8390 | 0.4824 | 0.8390 | 0.9159 |
| No log | 11.1795 | 436 | 0.8219 | 0.4713 | 0.8219 | 0.9066 |
| No log | 11.2308 | 438 | 0.7664 | 0.4729 | 0.7664 | 0.8755 |
| No log | 11.2821 | 440 | 0.7507 | 0.4660 | 0.7507 | 0.8664 |
| No log | 11.3333 | 442 | 0.7509 | 0.5500 | 0.7509 | 0.8665 |
| No log | 11.3846 | 444 | 0.7462 | 0.5155 | 0.7462 | 0.8638 |
| No log | 11.4359 | 446 | 0.7547 | 0.4729 | 0.7547 | 0.8687 |
| No log | 11.4872 | 448 | 0.7578 | 0.4593 | 0.7578 | 0.8705 |
| No log | 11.5385 | 450 | 0.7631 | 0.4456 | 0.7631 | 0.8736 |
| No log | 11.5897 | 452 | 0.7340 | 0.5375 | 0.7340 | 0.8567 |
| No log | 11.6410 | 454 | 0.7250 | 0.5500 | 0.7250 | 0.8515 |
| No log | 11.6923 | 456 | 0.7178 | 0.5261 | 0.7178 | 0.8472 |
| No log | 11.7436 | 458 | 0.7181 | 0.5373 | 0.7181 | 0.8474 |
| No log | 11.7949 | 460 | 0.7519 | 0.5103 | 0.7519 | 0.8671 |
| No log | 11.8462 | 462 | 0.7508 | 0.4965 | 0.7508 | 0.8665 |
| No log | 11.8974 | 464 | 0.7226 | 0.5570 | 0.7226 | 0.8500 |
| No log | 11.9487 | 466 | 0.7263 | 0.5704 | 0.7263 | 0.8523 |
| No log | 12.0 | 468 | 0.7321 | 0.5817 | 0.7321 | 0.8556 |
| No log | 12.0513 | 470 | 0.7307 | 0.5806 | 0.7307 | 0.8548 |
| No log | 12.1026 | 472 | 0.7972 | 0.6109 | 0.7972 | 0.8929 |
| No log | 12.1538 | 474 | 0.8657 | 0.5611 | 0.8657 | 0.9304 |
| No log | 12.2051 | 476 | 0.8259 | 0.5647 | 0.8259 | 0.9088 |
| No log | 12.2564 | 478 | 0.7378 | 0.5932 | 0.7378 | 0.8590 |
| No log | 12.3077 | 480 | 0.7360 | 0.5817 | 0.7360 | 0.8579 |
| No log | 12.3590 | 482 | 0.7400 | 0.5393 | 0.7400 | 0.8603 |
| No log | 12.4103 | 484 | 0.7374 | 0.5024 | 0.7374 | 0.8587 |
| No log | 12.4615 | 486 | 0.7972 | 0.5540 | 0.7972 | 0.8929 |
| No log | 12.5128 | 488 | 0.8276 | 0.5877 | 0.8276 | 0.9097 |
| No log | 12.5641 | 490 | 0.7995 | 0.5622 | 0.7995 | 0.8942 |
| No log | 12.6154 | 492 | 0.7616 | 0.5024 | 0.7616 | 0.8727 |
| No log | 12.6667 | 494 | 0.7566 | 0.5024 | 0.7566 | 0.8698 |
| No log | 12.7179 | 496 | 0.7587 | 0.5024 | 0.7587 | 0.8711 |
| No log | 12.7692 | 498 | 0.7602 | 0.5024 | 0.7602 | 0.8719 |
| 0.3181 | 12.8205 | 500 | 0.7703 | 0.5024 | 0.7703 | 0.8777 |
| 0.3181 | 12.8718 | 502 | 0.7925 | 0.5262 | 0.7925 | 0.8902 |
| 0.3181 | 12.9231 | 504 | 0.8273 | 0.4877 | 0.8273 | 0.9096 |
| 0.3181 | 12.9744 | 506 | 0.8236 | 0.5253 | 0.8236 | 0.9075 |
| 0.3181 | 13.0256 | 508 | 0.8109 | 0.5720 | 0.8109 | 0.9005 |
| 0.3181 | 13.0769 | 510 | 0.8614 | 0.5368 | 0.8614 | 0.9281 |
| 0.3181 | 13.1282 | 512 | 0.9166 | 0.4356 | 0.9166 | 0.9574 |
| 0.3181 | 13.1795 | 514 | 0.8557 | 0.5183 | 0.8557 | 0.9250 |
| 0.3181 | 13.2308 | 516 | 0.7551 | 0.5359 | 0.7551 | 0.8690 |
| 0.3181 | 13.2821 | 518 | 0.7553 | 0.5638 | 0.7553 | 0.8691 |
| 0.3181 | 13.3333 | 520 | 0.8573 | 0.5032 | 0.8573 | 0.9259 |
| 0.3181 | 13.3846 | 522 | 0.8743 | 0.5137 | 0.8743 | 0.9350 |
| 0.3181 | 13.4359 | 524 | 0.7999 | 0.6057 | 0.7999 | 0.8944 |
| 0.3181 | 13.4872 | 526 | 0.8007 | 0.5548 | 0.8007 | 0.8948 |
| 0.3181 | 13.5385 | 528 | 0.8910 | 0.5185 | 0.8910 | 0.9439 |
| 0.3181 | 13.5897 | 530 | 0.9247 | 0.5505 | 0.9247 | 0.9616 |
| 0.3181 | 13.6410 | 532 | 0.8705 | 0.5455 | 0.8705 | 0.9330 |
| 0.3181 | 13.6923 | 534 | 0.8034 | 0.5046 | 0.8034 | 0.8963 |
| 0.3181 | 13.7436 | 536 | 0.7497 | 0.5438 | 0.7497 | 0.8659 |
| 0.3181 | 13.7949 | 538 | 0.7395 | 0.5485 | 0.7395 | 0.8600 |
| 0.3181 | 13.8462 | 540 | 0.7559 | 0.5528 | 0.7559 | 0.8694 |
| 0.3181 | 13.8974 | 542 | 0.7763 | 0.5504 | 0.7763 | 0.8811 |
| 0.3181 | 13.9487 | 544 | 0.7690 | 0.5504 | 0.7690 | 0.8769 |
| 0.3181 | 14.0 | 546 | 0.7992 | 0.5591 | 0.7992 | 0.8940 |
| 0.3181 | 14.0513 | 548 | 0.8103 | 0.5491 | 0.8103 | 0.9001 |
| 0.3181 | 14.1026 | 550 | 0.7833 | 0.5192 | 0.7833 | 0.8851 |
| 0.3181 | 14.1538 | 552 | 0.7533 | 0.5334 | 0.7533 | 0.8679 |
| 0.3181 | 14.2051 | 554 | 0.7421 | 0.4873 | 0.7421 | 0.8615 |
| 0.3181 | 14.2564 | 556 | 0.7475 | 0.4776 | 0.7475 | 0.8646 |
| 0.3181 | 14.3077 | 558 | 0.7600 | 0.4918 | 0.7600 | 0.8718 |
| 0.3181 | 14.3590 | 560 | 0.7719 | 0.4932 | 0.7719 | 0.8786 |
| 0.3181 | 14.4103 | 562 | 0.7803 | 0.4955 | 0.7803 | 0.8834 |

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (Safetensors, tensor type F32)

Model tree for MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k15_task5_organization

This model is fine-tuned from aubmindlab/bert-base-arabertv02.