ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run1_AugV5_k20_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8586
  • Qwk (quadratic weighted kappa): 0.7215
  • Mse (mean squared error): 0.8586
  • Rmse (root mean squared error): 0.9266
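The evaluation code for this run is not published; the sketch below shows, with pure NumPy and hypothetical score vectors, how the three metrics relate: Rmse is simply the square root of Mse, and Qwk is Cohen's kappa with quadratic disagreement weights.

```python
import numpy as np

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights, the usual 'Qwk' for essay scoring."""
    # Observed confusion matrix
    O = np.zeros((n_classes, n_classes))
    for t, p in zip(y_true, y_pred):
        O[t, p] += 1
    # Expected matrix under independent marginals
    E = np.outer(np.bincount(y_true, minlength=n_classes),
                 np.bincount(y_pred, minlength=n_classes)) / len(y_true)
    # Quadratic disagreement weights: far-apart scores are penalized more
    idx = np.arange(n_classes)
    W = (idx[:, None] - idx[None, :]) ** 2 / (n_classes - 1) ** 2
    return 1.0 - (W * O).sum() / (W * E).sum()

# Hypothetical gold scores and predictions on a 0-2 scale
y_true = np.array([0, 1, 2, 1])
y_pred = np.array([0, 2, 2, 1])

mse = np.mean((y_true - y_pred) ** 2)
rmse = np.sqrt(mse)                      # Rmse = sqrt(Mse)
qwk = quadratic_weighted_kappa(y_true, y_pred, n_classes=3)
print(f"Qwk={qwk:.4f}  Mse={mse:.4f}  Rmse={rmse:.4f}")
# → Qwk=0.8000  Mse=0.2500  Rmse=0.5000
```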

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
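With lr_scheduler_type: linear and no warmup listed, the learning rate decays from 2e-05 toward 0 over the scheduled steps. A minimal sketch of that schedule, assuming 0 warmup steps and the 96 optimizer steps per epoch implied by the evaluation log below (both are assumptions, not stated in the card):

```python
base_lr = 2e-5          # learning_rate
num_epochs = 100        # num_epochs
steps_per_epoch = 96    # assumption: epoch 1.0 lands on step 96 in the eval log
total_steps = num_epochs * steps_per_epoch

def linear_lr(step):
    """Linear decay from base_lr to 0, as in a 0-warmup linear schedule."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

print(linear_lr(0))                  # base rate at the start
print(linear_lr(total_steps // 2))   # half the base rate midway
```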

Training results

Validation metrics below were logged every two steps; "No log" in the first column means the training loss had not yet been reported (it first appears at step 500).

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0208 2 6.6894 0.0303 6.6894 2.5864
No log 0.0417 4 4.2317 0.0905 4.2317 2.0571
No log 0.0625 6 3.5108 -0.0333 3.5108 1.8737
No log 0.0833 8 2.5096 0.0986 2.5096 1.5842
No log 0.1042 10 1.9339 0.1167 1.9339 1.3906
No log 0.125 12 1.8958 0.1081 1.8958 1.3769
No log 0.1458 14 2.4817 -0.0813 2.4817 1.5753
No log 0.1667 16 2.6096 -0.1538 2.6096 1.6154
No log 0.1875 18 2.4568 0.0312 2.4568 1.5674
No log 0.2083 20 2.4408 0.0896 2.4408 1.5623
No log 0.2292 22 2.2365 0.1778 2.2365 1.4955
No log 0.25 24 2.3310 0.1594 2.3310 1.5268
No log 0.2708 26 2.7365 0.0822 2.7365 1.6542
No log 0.2917 28 3.1516 0.0364 3.1516 1.7753
No log 0.3125 30 3.8848 0.0 3.8848 1.9710
No log 0.3333 32 3.8007 0.0278 3.8007 1.9495
No log 0.3542 34 2.9360 0.1287 2.9360 1.7135
No log 0.375 36 2.3493 0.2041 2.3493 1.5328
No log 0.3958 38 1.5026 0.3636 1.5026 1.2258
No log 0.4167 40 1.4831 0.3802 1.4831 1.2178
No log 0.4375 42 1.6433 0.3968 1.6433 1.2819
No log 0.4583 44 1.6417 0.4211 1.6417 1.2813
No log 0.4792 46 1.8044 0.3885 1.8044 1.3433
No log 0.5 48 2.3810 0.2194 2.3810 1.5431
No log 0.5208 50 3.1069 0.1124 3.1069 1.7626
No log 0.5417 52 3.2112 0.1087 3.2112 1.7920
No log 0.5625 54 2.6623 0.2118 2.6623 1.6317
No log 0.5833 56 1.8963 0.2968 1.8963 1.3770
No log 0.6042 58 1.6655 0.4354 1.6655 1.2905
No log 0.625 60 1.7288 0.4625 1.7288 1.3148
No log 0.6458 62 1.8299 0.4393 1.8299 1.3528
No log 0.6667 64 2.2745 0.3553 2.2745 1.5081
No log 0.6875 66 2.6396 0.3365 2.6396 1.6247
No log 0.7083 68 2.1756 0.4 2.1756 1.4750
No log 0.7292 70 1.5348 0.4968 1.5348 1.2389
No log 0.75 72 1.1817 0.4928 1.1817 1.0871
No log 0.7708 74 1.2221 0.3968 1.2221 1.1055
No log 0.7917 76 1.3301 0.4545 1.3301 1.1533
No log 0.8125 78 1.4392 0.4444 1.4392 1.1997
No log 0.8333 80 1.5281 0.4460 1.5281 1.2362
No log 0.8542 82 1.4408 0.4559 1.4408 1.2003
No log 0.875 84 1.4237 0.4559 1.4237 1.1932
No log 0.8958 86 1.4040 0.4706 1.4040 1.1849
No log 0.9167 88 1.4791 0.4306 1.4791 1.2162
No log 0.9375 90 1.6523 0.4286 1.6523 1.2854
No log 0.9583 92 1.9849 0.3436 1.9849 1.4089
No log 0.9792 94 2.2841 0.3390 2.2841 1.5113
No log 1.0 96 2.0551 0.3616 2.0551 1.4335
No log 1.0208 98 1.6980 0.4615 1.6980 1.3031
No log 1.0417 100 1.5484 0.5269 1.5484 1.2444
No log 1.0625 102 1.4722 0.5476 1.4722 1.2133
No log 1.0833 104 1.5778 0.5380 1.5778 1.2561
No log 1.1042 106 1.5345 0.5402 1.5345 1.2388
No log 1.125 108 1.1592 0.6098 1.1592 1.0766
No log 1.1458 110 1.0152 0.5634 1.0152 1.0076
No log 1.1667 112 1.0167 0.5874 1.0167 1.0083
No log 1.1875 114 1.1188 0.5906 1.1188 1.0577
No log 1.2083 116 1.3603 0.5576 1.3603 1.1663
No log 1.2292 118 1.6719 0.4819 1.6719 1.2930
No log 1.25 120 1.7320 0.4671 1.7320 1.3161
No log 1.2708 122 2.0217 0.3548 2.0217 1.4219
No log 1.2917 124 1.9517 0.4066 1.9517 1.3970
No log 1.3125 126 1.9631 0.4153 1.9631 1.4011
No log 1.3333 128 1.4576 0.5595 1.4576 1.2073
No log 1.3542 130 1.2372 0.6145 1.2372 1.1123
No log 1.375 132 1.2779 0.6145 1.2779 1.1304
No log 1.3958 134 1.1604 0.6429 1.1604 1.0772
No log 1.4167 136 0.9845 0.6069 0.9845 0.9922
No log 1.4375 138 0.9800 0.6143 0.9800 0.9900
No log 1.4583 140 0.9918 0.6383 0.9918 0.9959
No log 1.4792 142 1.0923 0.6443 1.0923 1.0451
No log 1.5 144 1.2333 0.5906 1.2333 1.1106
No log 1.5208 146 1.2414 0.5850 1.2414 1.1142
No log 1.5417 148 1.1654 0.6164 1.1654 1.0795
No log 1.5625 150 1.1853 0.5986 1.1853 1.0887
No log 1.5833 152 1.2588 0.52 1.2588 1.1220
No log 1.6042 154 1.4784 0.4875 1.4784 1.2159
No log 1.625 156 1.3308 0.5490 1.3308 1.1536
No log 1.6458 158 1.0345 0.6358 1.0345 1.0171
No log 1.6667 160 0.9883 0.6323 0.9883 0.9941
No log 1.6875 162 1.0503 0.6076 1.0503 1.0248
No log 1.7083 164 1.1519 0.5814 1.1519 1.0733
No log 1.7292 166 1.0085 0.6234 1.0085 1.0042
No log 1.75 168 0.9663 0.6667 0.9663 0.9830
No log 1.7708 170 0.9470 0.6667 0.9470 0.9731
No log 1.7917 172 0.9402 0.6887 0.9402 0.9697
No log 1.8125 174 1.0324 0.6184 1.0324 1.0161
No log 1.8333 176 1.1337 0.6322 1.1337 1.0647
No log 1.8542 178 1.3890 0.6064 1.3890 1.1786
No log 1.875 180 1.7638 0.4975 1.7638 1.3281
No log 1.8958 182 1.5731 0.5622 1.5731 1.2542
No log 1.9167 184 1.1813 0.6173 1.1813 1.0869
No log 1.9375 186 0.9648 0.6709 0.9648 0.9822
No log 1.9583 188 0.8860 0.6069 0.8860 0.9413
No log 1.9792 190 0.8952 0.6486 0.8952 0.9461
No log 2.0 192 0.9704 0.6133 0.9704 0.9851
No log 2.0208 194 1.0296 0.6585 1.0296 1.0147
No log 2.0417 196 1.1578 0.6289 1.1578 1.0760
No log 2.0625 198 1.2506 0.5912 1.2506 1.1183
No log 2.0833 200 1.1219 0.6211 1.1219 1.0592
No log 2.1042 202 0.9449 0.6389 0.9449 0.9720
No log 2.125 204 0.9060 0.6154 0.9060 0.9519
No log 2.1458 206 0.8750 0.6667 0.8750 0.9354
No log 2.1667 208 0.8772 0.6901 0.8772 0.9366
No log 2.1875 210 0.8858 0.6531 0.8858 0.9412
No log 2.2083 212 0.9699 0.6460 0.9699 0.9848
No log 2.2292 214 1.0475 0.6548 1.0475 1.0235
No log 2.25 216 1.0421 0.6747 1.0421 1.0208
No log 2.2708 218 1.0423 0.6824 1.0423 1.0209
No log 2.2917 220 1.0078 0.7018 1.0078 1.0039
No log 2.3125 222 1.0000 0.6826 1.0000 1.0000
No log 2.3333 224 0.9733 0.6627 0.9733 0.9866
No log 2.3542 226 1.0391 0.6243 1.0391 1.0194
No log 2.375 228 1.2288 0.6444 1.2288 1.1085
No log 2.3958 230 1.5775 0.5833 1.5775 1.2560
No log 2.4167 232 2.0541 0.4608 2.0541 1.4332
No log 2.4375 234 2.0816 0.4314 2.0816 1.4428
No log 2.4583 236 1.7885 0.4923 1.7885 1.3374
No log 2.4792 238 1.3498 0.5977 1.3498 1.1618
No log 2.5 240 1.1826 0.6076 1.1826 1.0875
No log 2.5208 242 1.1518 0.5974 1.1518 1.0732
No log 2.5417 244 1.1791 0.5806 1.1791 1.0859
No log 2.5625 246 1.3878 0.5848 1.3878 1.1781
No log 2.5833 248 1.5780 0.4914 1.5780 1.2562
No log 2.6042 250 1.5971 0.4633 1.5971 1.2638
No log 2.625 252 1.4222 0.5444 1.4222 1.1926
No log 2.6458 254 1.1527 0.6154 1.1527 1.0736
No log 2.6667 256 1.0384 0.65 1.0384 1.0190
No log 2.6875 258 1.1367 0.6821 1.1367 1.0662
No log 2.7083 260 1.4279 0.5625 1.4279 1.1949
No log 2.7292 262 1.7328 0.4949 1.7328 1.3164
No log 2.75 264 1.7675 0.5025 1.7675 1.3295
No log 2.7708 266 1.3688 0.6224 1.3688 1.1699
No log 2.7917 268 1.0085 0.7111 1.0085 1.0042
No log 2.8125 270 0.7756 0.7073 0.7756 0.8807
No log 2.8333 272 0.7683 0.6944 0.7683 0.8765
No log 2.8542 274 0.8265 0.6761 0.8265 0.9091
No log 2.875 276 0.8262 0.6761 0.8262 0.9090
No log 2.8958 278 0.7931 0.7234 0.7931 0.8906
No log 2.9167 280 0.7452 0.7286 0.7452 0.8633
No log 2.9375 282 0.7314 0.7092 0.7314 0.8552
No log 2.9583 284 0.7140 0.7183 0.7140 0.8450
No log 2.9792 286 0.7013 0.7211 0.7013 0.8374
No log 3.0 288 0.6948 0.7595 0.6948 0.8335
No log 3.0208 290 0.6899 0.7625 0.6899 0.8306
No log 3.0417 292 0.6883 0.7625 0.6883 0.8296
No log 3.0625 294 0.6997 0.7578 0.6997 0.8365
No log 3.0833 296 0.7236 0.7531 0.7236 0.8506
No log 3.1042 298 0.7937 0.7229 0.7937 0.8909
No log 3.125 300 0.9003 0.7052 0.9003 0.9488
No log 3.1458 302 1.0741 0.6743 1.0741 1.0364
No log 3.1667 304 1.0858 0.6860 1.0858 1.0420
No log 3.1875 306 0.9384 0.7018 0.9384 0.9687
No log 3.2083 308 0.8790 0.6988 0.8790 0.9375
No log 3.2292 310 0.8098 0.6962 0.8098 0.8999
No log 3.25 312 0.7615 0.7152 0.7615 0.8726
No log 3.2708 314 0.7365 0.7516 0.7365 0.8582
No log 3.2917 316 0.7222 0.7595 0.7222 0.8498
No log 3.3125 318 0.6975 0.75 0.6975 0.8351
No log 3.3333 320 0.7102 0.7297 0.7102 0.8427
No log 3.3542 322 0.7379 0.7211 0.7379 0.8590
No log 3.375 324 0.7295 0.6939 0.7295 0.8541
No log 3.3958 326 0.7416 0.7013 0.7416 0.8611
No log 3.4167 328 0.8174 0.725 0.8174 0.9041
No log 3.4375 330 1.1442 0.6851 1.1442 1.0697
No log 3.4583 332 1.4967 0.5625 1.4967 1.2234
No log 3.4792 334 1.5028 0.5625 1.5028 1.2259
No log 3.5 336 1.0758 0.7101 1.0758 1.0372
No log 3.5208 338 0.8698 0.7381 0.8698 0.9326
No log 3.5417 340 0.8121 0.7296 0.8121 0.9012
No log 3.5625 342 0.8129 0.7296 0.8129 0.9016
No log 3.5833 344 0.7943 0.7296 0.7943 0.8912
No log 3.6042 346 0.7696 0.6957 0.7696 0.8773
No log 3.625 348 0.7810 0.6748 0.7810 0.8837
No log 3.6458 350 0.7949 0.6748 0.7949 0.8916
No log 3.6667 352 0.8084 0.6748 0.8084 0.8991
No log 3.6875 354 0.8396 0.6871 0.8396 0.9163
No log 3.7083 356 0.9127 0.7241 0.9127 0.9554
No log 3.7292 358 0.9672 0.7241 0.9672 0.9835
No log 3.75 360 0.9473 0.7241 0.9473 0.9733
No log 3.7708 362 0.9171 0.7186 0.9171 0.9577
No log 3.7917 364 0.9339 0.6994 0.9339 0.9664
No log 3.8125 366 0.9822 0.6748 0.9822 0.9911
No log 3.8333 368 0.9852 0.6748 0.9852 0.9926
No log 3.8542 370 0.9539 0.6994 0.9539 0.9767
No log 3.875 372 0.8738 0.7117 0.8738 0.9348
No log 3.8958 374 0.7656 0.7368 0.7656 0.8750
No log 3.9167 376 0.7589 0.6567 0.7589 0.8711
No log 3.9375 378 0.7985 0.6667 0.7985 0.8936
No log 3.9583 380 0.8051 0.6565 0.8051 0.8973
No log 3.9792 382 0.8048 0.6519 0.8048 0.8971
No log 4.0 384 0.9194 0.7034 0.9194 0.9588
No log 4.0208 386 1.0398 0.6842 1.0398 1.0197
No log 4.0417 388 1.0740 0.6506 1.0740 1.0364
No log 4.0625 390 0.8906 0.7097 0.8906 0.9437
No log 4.0833 392 0.7675 0.7020 0.7675 0.8761
No log 4.1042 394 0.7333 0.7020 0.7333 0.8564
No log 4.125 396 0.7251 0.7067 0.7251 0.8515
No log 4.1458 398 0.7374 0.7123 0.7374 0.8587
No log 4.1667 400 0.7392 0.7172 0.7392 0.8597
No log 4.1875 402 0.7195 0.7133 0.7195 0.8482
No log 4.2083 404 0.7212 0.6857 0.7212 0.8492
No log 4.2292 406 0.7341 0.6857 0.7341 0.8568
No log 4.25 408 0.7700 0.7034 0.7700 0.8775
No log 4.2708 410 0.8472 0.7333 0.8472 0.9204
No log 4.2917 412 0.9308 0.7059 0.9308 0.9648
No log 4.3125 414 0.9527 0.7051 0.9527 0.9760
No log 4.3333 416 0.8275 0.7451 0.8275 0.9097
No log 4.3542 418 0.7456 0.7403 0.7456 0.8635
No log 4.375 420 0.6880 0.7162 0.6880 0.8294
No log 4.3958 422 0.6948 0.7355 0.6948 0.8336
No log 4.4167 424 0.7594 0.7355 0.7594 0.8715
No log 4.4375 426 0.9120 0.7273 0.9120 0.9550
No log 4.4583 428 1.1861 0.6190 1.1861 1.0891
No log 4.4792 430 1.1335 0.6552 1.1335 1.0647
No log 4.5 432 0.8770 0.7209 0.8770 0.9365
No log 4.5208 434 0.6971 0.7375 0.6971 0.8349
No log 4.5417 436 0.6918 0.7297 0.6918 0.8317
No log 4.5625 438 0.7637 0.6667 0.7637 0.8739
No log 4.5833 440 0.7389 0.6897 0.7389 0.8596
No log 4.6042 442 0.6940 0.7403 0.6940 0.8331
No log 4.625 444 0.7026 0.7558 0.7026 0.8382
No log 4.6458 446 0.7792 0.7727 0.7792 0.8827
No log 4.6667 448 0.8219 0.7709 0.8219 0.9066
No log 4.6875 450 0.8001 0.7657 0.8001 0.8945
No log 4.7083 452 0.7929 0.7640 0.7929 0.8904
No log 4.7292 454 0.8411 0.7241 0.8411 0.9171
No log 4.75 456 0.8771 0.7326 0.8771 0.9365
No log 4.7708 458 0.7852 0.7176 0.7852 0.8861
No log 4.7917 460 0.7271 0.7329 0.7271 0.8527
No log 4.8125 462 0.7268 0.7211 0.7268 0.8525
No log 4.8333 464 0.7419 0.7222 0.7419 0.8613
No log 4.8542 466 0.7509 0.7211 0.7509 0.8665
No log 4.875 468 0.7542 0.7211 0.7542 0.8685
No log 4.8958 470 0.7447 0.7027 0.7447 0.8630
No log 4.9167 472 0.7104 0.75 0.7104 0.8428
No log 4.9375 474 0.6679 0.7516 0.6679 0.8173
No log 4.9583 476 0.6489 0.7578 0.6489 0.8055
No log 4.9792 478 0.6523 0.7673 0.6523 0.8077
No log 5.0 480 0.6641 0.7607 0.6641 0.8149
No log 5.0208 482 0.6477 0.7925 0.6477 0.8048
No log 5.0417 484 0.6378 0.775 0.6378 0.7986
No log 5.0625 486 0.6450 0.7799 0.6450 0.8031
No log 5.0833 488 0.6490 0.7799 0.6490 0.8056
No log 5.1042 490 0.6600 0.7712 0.6600 0.8124
No log 5.125 492 0.6857 0.7451 0.6857 0.8281
No log 5.1458 494 0.7042 0.7211 0.7042 0.8392
No log 5.1667 496 0.7227 0.7297 0.7227 0.8501
No log 5.1875 498 0.7381 0.7632 0.7381 0.8591
0.4533 5.2083 500 0.7361 0.7397 0.7361 0.8580
0.4533 5.2292 502 0.7333 0.7114 0.7333 0.8563
0.4533 5.25 504 0.7742 0.7013 0.7742 0.8799
0.4533 5.2708 506 0.8059 0.7006 0.8059 0.8977
0.4533 5.2917 508 0.8504 0.7215 0.8504 0.9222
0.4533 5.3125 510 0.8586 0.7215 0.8586 0.9266

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (Safetensors, F32 tensor type)

Model tree for MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run1_AugV5_k20_task1_organization

Fine-tuned from aubmindlab/bert-base-arabertv02 (one of 4019 fine-tunes of the base model).