ArabicNewSplits6_FineTuningAraBERT_run1_AugV5_k10_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how these metrics can be computed follows the list):

  • Loss: 0.7400
  • Qwk: 0.7168
  • Mse: 0.7400
  • Rmse: 0.8602
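
Since the card does not document how these metrics were computed, below is a hedged scikit-learn sketch. The sample scores and the rounding of continuous predictions before computing QWK (quadratic weighted kappa) are illustrative assumptions; note that the reported Loss equals the Mse, which suggests an MSE training objective.

```python
# Hedged sketch: computing MSE, RMSE, and QWK with scikit-learn.
# The gold scores and predictions below are made-up placeholders.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([2, 3, 1, 4, 2])            # hypothetical integer gold scores
y_pred = np.array([2.2, 2.8, 1.4, 3.6, 2.1])  # hypothetical model outputs

mse = mean_squared_error(y_true, y_pred)      # also the eval loss, if MSE-trained
rmse = np.sqrt(mse)
# QWK compares discrete labels, so continuous outputs are rounded first
# (an assumption; the card does not document the exact procedure).
qwk = cohen_kappa_score(y_true, np.rint(y_pred).astype(int), weights="quadratic")
print(f"MSE={mse:.4f}  RMSE={rmse:.4f}  QWK={qwk:.4f}")
```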

Model description

More information needed. From the repository metadata: the checkpoint is a ~0.1B-parameter BERT encoder stored as F32 safetensors.

Intended uses & limitations

More information needed
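
Pending details from the authors, the snippet below is a minimal inference sketch. The single-logit regression head (inferred from the MSE/RMSE metrics above) and the example sentence are assumptions; the score scale is undocumented.

```python
# Hedged inference sketch; the regression head is an assumption, not
# something this card documents.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo = "MayBashendy/ArabicNewSplits6_FineTuningAraBERT_run1_AugV5_k10_task5_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)
model.eval()

inputs = tokenizer("هذا نص عربي تجريبي.", return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()  # predicted score (scale unknown)
print(f"predicted score: {score:.2f}")
```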

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training; a sketch of an equivalent Trainer setup follows the list:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
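
Below is a minimal sketch of an equivalent setup with the Hugging Face Trainer, not the authors' actual script. The regression head (num_labels=1, consistent with the MSE/RMSE metrics), the tiny dummy dataset, and the output directory name are illustrative assumptions; the Adam betas/epsilon listed above match the Trainer's defaults.

```python
# Hedged sketch of a Trainer configuration matching the listed hyperparameters.
from datasets import Dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

model_name = "aubmindlab/bert-base-arabertv02"
tokenizer = AutoTokenizer.from_pretrained(model_name)
# num_labels=1 assumes a regression objective (float labels -> MSE loss).
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=1)

# Dummy stand-in for the undocumented training data.
raw = Dataset.from_dict({
    "text": ["مثال أول", "مثال ثان", "مثال ثالث", "مثال رابع"],
    "label": [1.0, 2.0, 3.0, 4.0],
})
ds = raw.map(
    lambda ex: tokenizer(ex["text"], truncation=True, padding="max_length", max_length=64),
    remove_columns=["text"],
)

args = TrainingArguments(
    output_dir="arabert-organization",  # assumed name
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    # Adam settings from the card; these are also the Trainer defaults.
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)

trainer = Trainer(model=model, args=args, train_dataset=ds, eval_dataset=ds)
trainer.train()
```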

Training results

| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|
| No log | 0.0513 | 2 | 2.1832 | 0.0379 | 2.1832 | 1.4776 |
| No log | 0.1026 | 4 | 1.6201 | 0.1595 | 1.6201 | 1.2728 |
| No log | 0.1538 | 6 | 1.2888 | 0.2264 | 1.2888 | 1.1352 |
| No log | 0.2051 | 8 | 1.4053 | 0.2423 | 1.4053 | 1.1855 |
| No log | 0.2564 | 10 | 1.4438 | 0.3318 | 1.4438 | 1.2016 |
| No log | 0.3077 | 12 | 1.4594 | 0.3145 | 1.4594 | 1.2081 |
| No log | 0.3590 | 14 | 1.4014 | 0.2445 | 1.4014 | 1.1838 |
| No log | 0.4103 | 16 | 1.3826 | 0.1672 | 1.3826 | 1.1758 |
| No log | 0.4615 | 18 | 1.4265 | 0.2256 | 1.4265 | 1.1944 |
| No log | 0.5128 | 20 | 1.3873 | 0.1523 | 1.3873 | 1.1779 |
| No log | 0.5641 | 22 | 1.3365 | 0.1216 | 1.3365 | 1.1561 |
| No log | 0.6154 | 24 | 1.3018 | 0.1216 | 1.3018 | 1.1409 |
| No log | 0.6667 | 26 | 1.3047 | 0.1523 | 1.3047 | 1.1422 |
| No log | 0.7179 | 28 | 1.2884 | 0.1961 | 1.2884 | 1.1351 |
| No log | 0.7692 | 30 | 1.2581 | 0.2256 | 1.2581 | 1.1217 |
| No log | 0.8205 | 32 | 1.2769 | 0.2736 | 1.2769 | 1.1300 |
| No log | 0.8718 | 34 | 1.2657 | 0.3645 | 1.2657 | 1.1250 |
| No log | 0.9231 | 36 | 1.2289 | 0.4031 | 1.2289 | 1.1086 |
| No log | 0.9744 | 38 | 1.1289 | 0.4112 | 1.1289 | 1.0625 |
| No log | 1.0256 | 40 | 1.0606 | 0.3820 | 1.0606 | 1.0299 |
| No log | 1.0769 | 42 | 1.0274 | 0.4033 | 1.0274 | 1.0136 |
| No log | 1.1282 | 44 | 1.0035 | 0.4162 | 1.0035 | 1.0017 |
| No log | 1.1795 | 46 | 0.9672 | 0.4847 | 0.9672 | 0.9835 |
| No log | 1.2308 | 48 | 0.9318 | 0.5453 | 0.9318 | 0.9653 |
| No log | 1.2821 | 50 | 0.9294 | 0.5562 | 0.9294 | 0.9640 |
| No log | 1.3333 | 52 | 0.8441 | 0.6111 | 0.8441 | 0.9188 |
| No log | 1.3846 | 54 | 0.8103 | 0.6552 | 0.8103 | 0.9002 |
| No log | 1.4359 | 56 | 0.7674 | 0.6735 | 0.7674 | 0.8760 |
| No log | 1.4872 | 58 | 0.7425 | 0.6847 | 0.7425 | 0.8617 |
| No log | 1.5385 | 60 | 0.7187 | 0.6922 | 0.7187 | 0.8478 |
| No log | 1.5897 | 62 | 0.7116 | 0.7157 | 0.7116 | 0.8436 |
| No log | 1.6410 | 64 | 0.7102 | 0.7249 | 0.7102 | 0.8427 |
| No log | 1.6923 | 66 | 0.6935 | 0.7213 | 0.6935 | 0.8328 |
| No log | 1.7436 | 68 | 0.7133 | 0.7121 | 0.7133 | 0.8445 |
| No log | 1.7949 | 70 | 0.7720 | 0.7098 | 0.7720 | 0.8786 |
| No log | 1.8462 | 72 | 1.0255 | 0.6403 | 1.0255 | 1.0127 |
| No log | 1.8974 | 74 | 1.1067 | 0.5470 | 1.1067 | 1.0520 |
| No log | 1.9487 | 76 | 0.9338 | 0.5944 | 0.9338 | 0.9664 |
| No log | 2.0 | 78 | 0.7586 | 0.6788 | 0.7586 | 0.8710 |
| No log | 2.0513 | 80 | 0.7481 | 0.6767 | 0.7481 | 0.8650 |
| No log | 2.1026 | 82 | 0.7948 | 0.6473 | 0.7948 | 0.8915 |
| No log | 2.1538 | 84 | 0.9945 | 0.5106 | 0.9945 | 0.9972 |
| No log | 2.2051 | 86 | 1.1933 | 0.5089 | 1.1933 | 1.0924 |
| No log | 2.2564 | 88 | 1.1281 | 0.5474 | 1.1281 | 1.0621 |
| No log | 2.3077 | 90 | 0.9199 | 0.5534 | 0.9199 | 0.9591 |
| No log | 2.3590 | 92 | 0.7691 | 0.6538 | 0.7691 | 0.8770 |
| No log | 2.4103 | 94 | 0.7461 | 0.6227 | 0.7461 | 0.8638 |
| No log | 2.4615 | 96 | 0.7423 | 0.6532 | 0.7423 | 0.8616 |
| No log | 2.5128 | 98 | 0.7370 | 0.6695 | 0.7370 | 0.8585 |
| No log | 2.5641 | 100 | 0.7197 | 0.6573 | 0.7197 | 0.8484 |
| No log | 2.6154 | 102 | 0.7617 | 0.6706 | 0.7617 | 0.8727 |
| No log | 2.6667 | 104 | 0.7371 | 0.7020 | 0.7371 | 0.8585 |
| No log | 2.7179 | 106 | 0.7541 | 0.7030 | 0.7541 | 0.8684 |
| No log | 2.7692 | 108 | 0.7398 | 0.7035 | 0.7398 | 0.8601 |
| No log | 2.8205 | 110 | 0.6906 | 0.7174 | 0.6906 | 0.8310 |
| No log | 2.8718 | 112 | 0.6856 | 0.7042 | 0.6856 | 0.8280 |
| No log | 2.9231 | 114 | 0.6862 | 0.7237 | 0.6862 | 0.8284 |
| No log | 2.9744 | 116 | 0.8663 | 0.6526 | 0.8663 | 0.9308 |
| No log | 3.0256 | 118 | 1.0808 | 0.5897 | 1.0808 | 1.0396 |
| No log | 3.0769 | 120 | 1.0175 | 0.6074 | 1.0175 | 1.0087 |
| No log | 3.1282 | 122 | 0.8164 | 0.6722 | 0.8164 | 0.9036 |
| No log | 3.1795 | 124 | 0.6849 | 0.7531 | 0.6849 | 0.8276 |
| No log | 3.2308 | 126 | 0.6777 | 0.7569 | 0.6777 | 0.8233 |
| No log | 3.2821 | 128 | 0.7734 | 0.7062 | 0.7734 | 0.8795 |
| No log | 3.3333 | 130 | 0.9628 | 0.6163 | 0.9628 | 0.9812 |
| No log | 3.3846 | 132 | 0.9503 | 0.6104 | 0.9503 | 0.9748 |
| No log | 3.4359 | 134 | 0.7939 | 0.6985 | 0.7939 | 0.8910 |
| No log | 3.4872 | 136 | 0.6977 | 0.7743 | 0.6977 | 0.8353 |
| No log | 3.5385 | 138 | 0.7155 | 0.7644 | 0.7155 | 0.8459 |
| No log | 3.5897 | 140 | 0.7868 | 0.6796 | 0.7868 | 0.8870 |
| No log | 3.6410 | 142 | 0.8441 | 0.6521 | 0.8441 | 0.9188 |
| No log | 3.6923 | 144 | 0.9031 | 0.6465 | 0.9031 | 0.9503 |
| No log | 3.7436 | 146 | 0.8537 | 0.6609 | 0.8537 | 0.9240 |
| No log | 3.7949 | 148 | 0.7894 | 0.6965 | 0.7894 | 0.8885 |
| No log | 3.8462 | 150 | 0.7144 | 0.7517 | 0.7144 | 0.8453 |
| No log | 3.8974 | 152 | 0.6597 | 0.7747 | 0.6597 | 0.8122 |
| No log | 3.9487 | 154 | 0.6561 | 0.7715 | 0.6561 | 0.8100 |
| No log | 4.0 | 156 | 0.7499 | 0.7423 | 0.7499 | 0.8660 |
| No log | 4.0513 | 158 | 0.7807 | 0.7106 | 0.7807 | 0.8835 |
| No log | 4.1026 | 160 | 0.7160 | 0.7570 | 0.7160 | 0.8462 |
| No log | 4.1538 | 162 | 0.6387 | 0.7815 | 0.6387 | 0.7992 |
| No log | 4.2051 | 164 | 0.6363 | 0.7815 | 0.6363 | 0.7977 |
| No log | 4.2564 | 166 | 0.6342 | 0.7858 | 0.6342 | 0.7964 |
| No log | 4.3077 | 168 | 0.6129 | 0.7589 | 0.6129 | 0.7829 |
| No log | 4.3590 | 170 | 0.6084 | 0.7548 | 0.6084 | 0.7800 |
| No log | 4.4103 | 172 | 0.6608 | 0.7528 | 0.6608 | 0.8129 |
| No log | 4.4615 | 174 | 0.7203 | 0.7371 | 0.7203 | 0.8487 |
| No log | 4.5128 | 176 | 0.6645 | 0.7601 | 0.6645 | 0.8151 |
| No log | 4.5641 | 178 | 0.6059 | 0.7565 | 0.6059 | 0.7784 |
| No log | 4.6154 | 180 | 0.5893 | 0.7614 | 0.5893 | 0.7677 |
| No log | 4.6667 | 182 | 0.6331 | 0.7411 | 0.6331 | 0.7956 |
| No log | 4.7179 | 184 | 0.6585 | 0.7279 | 0.6585 | 0.8115 |
| No log | 4.7692 | 186 | 0.6894 | 0.7102 | 0.6894 | 0.8303 |
| No log | 4.8205 | 188 | 0.7779 | 0.7099 | 0.7779 | 0.8820 |
| No log | 4.8718 | 190 | 0.7884 | 0.7099 | 0.7884 | 0.8879 |
| No log | 4.9231 | 192 | 0.7104 | 0.7166 | 0.7104 | 0.8429 |
| No log | 4.9744 | 194 | 0.6529 | 0.7332 | 0.6529 | 0.8080 |
| No log | 5.0256 | 196 | 0.6365 | 0.7379 | 0.6365 | 0.7978 |
| No log | 5.0769 | 198 | 0.6413 | 0.7454 | 0.6413 | 0.8008 |
| No log | 5.1282 | 200 | 0.6602 | 0.7337 | 0.6602 | 0.8125 |
| No log | 5.1795 | 202 | 0.7373 | 0.7153 | 0.7373 | 0.8587 |
| No log | 5.2308 | 204 | 0.8444 | 0.7075 | 0.8444 | 0.9189 |
| No log | 5.2821 | 206 | 0.8433 | 0.7224 | 0.8433 | 0.9183 |
| No log | 5.3333 | 208 | 0.7503 | 0.7288 | 0.7503 | 0.8662 |
| No log | 5.3846 | 210 | 0.6782 | 0.7751 | 0.6782 | 0.8235 |
| No log | 5.4359 | 212 | 0.6446 | 0.7671 | 0.6446 | 0.8029 |
| No log | 5.4872 | 214 | 0.6599 | 0.7563 | 0.6599 | 0.8123 |
| No log | 5.5385 | 216 | 0.7319 | 0.7229 | 0.7319 | 0.8555 |
| No log | 5.5897 | 218 | 0.7961 | 0.7050 | 0.7961 | 0.8922 |
| No log | 5.6410 | 220 | 0.8166 | 0.7014 | 0.8166 | 0.9036 |
| No log | 5.6923 | 222 | 0.8122 | 0.7095 | 0.8122 | 0.9012 |
| No log | 5.7436 | 224 | 0.8005 | 0.7092 | 0.8005 | 0.8947 |
| No log | 5.7949 | 226 | 0.7724 | 0.7116 | 0.7724 | 0.8789 |
| No log | 5.8462 | 228 | 0.7154 | 0.7493 | 0.7154 | 0.8458 |
| No log | 5.8974 | 230 | 0.6990 | 0.7666 | 0.6990 | 0.8360 |
| No log | 5.9487 | 232 | 0.7268 | 0.7608 | 0.7268 | 0.8525 |
| No log | 6.0 | 234 | 0.7916 | 0.6998 | 0.7916 | 0.8897 |
| No log | 6.0513 | 236 | 0.8817 | 0.6736 | 0.8817 | 0.9390 |
| No log | 6.1026 | 238 | 0.8925 | 0.6568 | 0.8925 | 0.9447 |
| No log | 6.1538 | 240 | 0.8301 | 0.7018 | 0.8301 | 0.9111 |
| No log | 6.2051 | 242 | 0.7195 | 0.7490 | 0.7195 | 0.8482 |
| No log | 6.2564 | 244 | 0.6564 | 0.7817 | 0.6564 | 0.8102 |
| No log | 6.3077 | 246 | 0.6537 | 0.7784 | 0.6537 | 0.8085 |
| No log | 6.3590 | 248 | 0.6828 | 0.7615 | 0.6828 | 0.8263 |
| No log | 6.4103 | 250 | 0.7492 | 0.7199 | 0.7492 | 0.8656 |
| No log | 6.4615 | 252 | 0.8064 | 0.7190 | 0.8064 | 0.8980 |
| No log | 6.5128 | 254 | 0.7885 | 0.7252 | 0.7885 | 0.8880 |
| No log | 6.5641 | 256 | 0.7226 | 0.7331 | 0.7226 | 0.8501 |
| No log | 6.6154 | 258 | 0.6653 | 0.7607 | 0.6653 | 0.8157 |
| No log | 6.6667 | 260 | 0.6559 | 0.7652 | 0.6559 | 0.8099 |
| No log | 6.7179 | 262 | 0.6533 | 0.7696 | 0.6533 | 0.8083 |
| No log | 6.7692 | 264 | 0.6722 | 0.7601 | 0.6722 | 0.8199 |
| No log | 6.8205 | 266 | 0.7000 | 0.7462 | 0.7000 | 0.8367 |
| No log | 6.8718 | 268 | 0.7361 | 0.7352 | 0.7361 | 0.8579 |
| No log | 6.9231 | 270 | 0.7745 | 0.7288 | 0.7745 | 0.8801 |
| No log | 6.9744 | 272 | 0.8029 | 0.7210 | 0.8029 | 0.8961 |
| No log | 7.0256 | 274 | 0.8552 | 0.6986 | 0.8552 | 0.9247 |
| No log | 7.0769 | 276 | 0.8398 | 0.7039 | 0.8398 | 0.9164 |
| No log | 7.1282 | 278 | 0.7690 | 0.7103 | 0.7690 | 0.8769 |
| No log | 7.1795 | 280 | 0.7346 | 0.7336 | 0.7346 | 0.8571 |
| No log | 7.2308 | 282 | 0.7332 | 0.7294 | 0.7332 | 0.8563 |
| No log | 7.2821 | 284 | 0.7440 | 0.7130 | 0.7440 | 0.8625 |
| No log | 7.3333 | 286 | 0.7685 | 0.7180 | 0.7685 | 0.8766 |
| No log | 7.3846 | 288 | 0.7881 | 0.7093 | 0.7881 | 0.8877 |
| No log | 7.4359 | 290 | 0.7708 | 0.7157 | 0.7708 | 0.8779 |
| No log | 7.4872 | 292 | 0.7165 | 0.7368 | 0.7165 | 0.8465 |
| No log | 7.5385 | 294 | 0.6626 | 0.7678 | 0.6626 | 0.8140 |
| No log | 7.5897 | 296 | 0.6313 | 0.7811 | 0.6313 | 0.7945 |
| No log | 7.6410 | 298 | 0.6297 | 0.7811 | 0.6297 | 0.7935 |
| No log | 7.6923 | 300 | 0.6527 | 0.7760 | 0.6527 | 0.8079 |
| No log | 7.7436 | 302 | 0.6773 | 0.7798 | 0.6773 | 0.8230 |
| No log | 7.7949 | 304 | 0.7113 | 0.7580 | 0.7113 | 0.8434 |
| No log | 7.8462 | 306 | 0.7456 | 0.7159 | 0.7456 | 0.8635 |
| No log | 7.8974 | 308 | 0.7496 | 0.7239 | 0.7496 | 0.8658 |
| No log | 7.9487 | 310 | 0.7433 | 0.7338 | 0.7433 | 0.8621 |
| No log | 8.0 | 312 | 0.7300 | 0.7602 | 0.7300 | 0.8544 |
| No log | 8.0513 | 314 | 0.7568 | 0.7218 | 0.7568 | 0.8700 |
| No log | 8.1026 | 316 | 0.7712 | 0.6925 | 0.7712 | 0.8782 |
| No log | 8.1538 | 318 | 0.7631 | 0.7086 | 0.7631 | 0.8736 |
| No log | 8.2051 | 320 | 0.7336 | 0.7543 | 0.7336 | 0.8565 |
| No log | 8.2564 | 322 | 0.6894 | 0.7703 | 0.6894 | 0.8303 |
| No log | 8.3077 | 324 | 0.6642 | 0.7833 | 0.6642 | 0.8150 |
| No log | 8.3590 | 326 | 0.6615 | 0.7833 | 0.6615 | 0.8133 |
| No log | 8.4103 | 328 | 0.6711 | 0.7833 | 0.6711 | 0.8192 |
| No log | 8.4615 | 330 | 0.6843 | 0.7754 | 0.6843 | 0.8272 |
| No log | 8.5128 | 332 | 0.6994 | 0.7666 | 0.6994 | 0.8363 |
| No log | 8.5641 | 334 | 0.6981 | 0.7540 | 0.6981 | 0.8355 |
| No log | 8.6154 | 336 | 0.6886 | 0.7540 | 0.6886 | 0.8298 |
| No log | 8.6667 | 338 | 0.6795 | 0.7506 | 0.6795 | 0.8243 |
| No log | 8.7179 | 340 | 0.6788 | 0.7506 | 0.6788 | 0.8239 |
| No log | 8.7692 | 342 | 0.6841 | 0.7461 | 0.6841 | 0.8271 |
| No log | 8.8205 | 344 | 0.6964 | 0.7540 | 0.6964 | 0.8345 |
| No log | 8.8718 | 346 | 0.7096 | 0.7287 | 0.7096 | 0.8424 |
| No log | 8.9231 | 348 | 0.7057 | 0.7331 | 0.7057 | 0.8400 |
| No log | 8.9744 | 350 | 0.7014 | 0.7395 | 0.7014 | 0.8375 |
| No log | 9.0256 | 352 | 0.6952 | 0.7558 | 0.6952 | 0.8338 |
| No log | 9.0769 | 354 | 0.6988 | 0.7558 | 0.6988 | 0.8360 |
| No log | 9.1282 | 356 | 0.7056 | 0.7558 | 0.7056 | 0.8400 |
| No log | 9.1795 | 358 | 0.7185 | 0.7415 | 0.7185 | 0.8477 |
| No log | 9.2308 | 360 | 0.7286 | 0.7209 | 0.7286 | 0.8536 |
| No log | 9.2821 | 362 | 0.7380 | 0.7225 | 0.7380 | 0.8591 |
| No log | 9.3333 | 364 | 0.7489 | 0.7225 | 0.7489 | 0.8654 |
| No log | 9.3846 | 366 | 0.7585 | 0.7225 | 0.7585 | 0.8709 |
| No log | 9.4359 | 368 | 0.7613 | 0.7225 | 0.7613 | 0.8726 |
| No log | 9.4872 | 370 | 0.7595 | 0.7225 | 0.7595 | 0.8715 |
| No log | 9.5385 | 372 | 0.7600 | 0.7225 | 0.7600 | 0.8718 |
| No log | 9.5897 | 374 | 0.7635 | 0.7225 | 0.7635 | 0.8738 |
| No log | 9.6410 | 376 | 0.7651 | 0.7183 | 0.7651 | 0.8747 |
| No log | 9.6923 | 378 | 0.7602 | 0.7225 | 0.7602 | 0.8719 |
| No log | 9.7436 | 380 | 0.7556 | 0.7225 | 0.7556 | 0.8693 |
| No log | 9.7949 | 382 | 0.7509 | 0.7225 | 0.7509 | 0.8666 |
| No log | 9.8462 | 384 | 0.7466 | 0.7225 | 0.7466 | 0.8641 |
| No log | 9.8974 | 386 | 0.7434 | 0.7225 | 0.7434 | 0.8622 |
| No log | 9.9487 | 388 | 0.7410 | 0.7225 | 0.7410 | 0.8608 |
| No log | 10.0 | 390 | 0.7400 | 0.7168 | 0.7400 | 0.8602 |

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1