ArabicNewSplits6_FineTuningAraBERT_run2_AugV5_k8_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (recorded as "None" in the auto-generated card). It achieves the following results on the evaluation set (a sketch of how these metrics can be computed follows the list):

  • Loss: 0.5306
  • Qwk (Quadratic Weighted Kappa): 0.3023
  • Mse (Mean Squared Error): 0.5306
  • Rmse (Root Mean Squared Error): 0.7284
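
The card does not say how these metrics were computed. Below is a minimal sketch using scikit-learn; this is an assumption for illustration, not necessarily the original evaluation code, and the score arrays are placeholders rather than this model's actual predictions.

```python
# Hypothetical sketch: computing Qwk, Mse, and Rmse with scikit-learn.
# y_true / y_pred are illustrative placeholder score arrays only.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([2, 1, 3, 2, 0])  # gold organization scores (illustrative)
y_pred = np.array([2, 2, 3, 1, 0])  # model predictions (illustrative)

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # Quadratic Weighted Kappa
mse = mean_squared_error(y_true, y_pred)                      # Mean Squared Error
rmse = np.sqrt(mse)                                           # Root Mean Squared Error

print(f"Qwk: {qwk:.4f}  Mse: {mse:.4f}  Rmse: {rmse:.4f}")
```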

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a matching Trainer configuration sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
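
As a rough guide, the configuration below reproduces the listed hyperparameters with the transformers Trainer API. The single-label regression head (num_labels=1) is an assumption inferred from the MSE/RMSE metrics, and the datasets are left as placeholders because the card does not document them.

```python
# Hedged sketch of a Trainer setup matching the hyperparameters above.
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

model = AutoModelForSequenceClassification.from_pretrained(
    "aubmindlab/bert-base-arabertv02",
    num_labels=1,  # assumption: single regression head, since MSE/RMSE are reported
)
tokenizer = AutoTokenizer.from_pretrained("aubmindlab/bert-base-arabertv02")

args = TrainingArguments(
    output_dir="ArabicNewSplits6_FineTuningAraBERT_run2_AugV5_k8_task3_organization",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer default optimizer.
)

# The training/evaluation datasets are not documented ("None" above);
# substitute your own tokenized datasets.Dataset objects before training:
# trainer = Trainer(model=model, args=args,
#                   train_dataset=train_dataset, eval_dataset=eval_dataset)
# trainer.train()
```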

Training results

| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.0513 | 2 | 3.3364 | -0.0066 | 3.3364 | 1.8266 |
| No log | 0.1026 | 4 | 1.7436 | -0.0070 | 1.7436 | 1.3204 |
| No log | 0.1538 | 6 | 1.3009 | 0.0294 | 1.3009 | 1.1406 |
| No log | 0.2051 | 8 | 0.6288 | 0.1795 | 0.6288 | 0.7930 |
| No log | 0.2564 | 10 | 0.5897 | 0.0476 | 0.5897 | 0.7679 |
| No log | 0.3077 | 12 | 1.2716 | 0.0929 | 1.2716 | 1.1277 |
| No log | 0.3590 | 14 | 0.9003 | 0.0698 | 0.9003 | 0.9489 |
| No log | 0.4103 | 16 | 0.5536 | 0.0 | 0.5536 | 0.7440 |
| No log | 0.4615 | 18 | 0.5851 | 0.0 | 0.5851 | 0.7649 |
| No log | 0.5128 | 20 | 0.6577 | -0.0081 | 0.6577 | 0.8110 |
| No log | 0.5641 | 22 | 0.7544 | 0.0647 | 0.7544 | 0.8685 |
| No log | 0.6154 | 24 | 0.8336 | 0.0078 | 0.8336 | 0.9130 |
| No log | 0.6667 | 26 | 0.8256 | 0.0698 | 0.8256 | 0.9086 |
| No log | 0.7179 | 28 | 0.7625 | 0.1605 | 0.7625 | 0.8732 |
| No log | 0.7692 | 30 | 0.6825 | 0.4167 | 0.6825 | 0.8261 |
| No log | 0.8205 | 32 | 0.7592 | 0.2263 | 0.7592 | 0.8713 |
| No log | 0.8718 | 34 | 0.6697 | 0.3535 | 0.6697 | 0.8184 |
| No log | 0.9231 | 36 | 0.5725 | -0.0081 | 0.5725 | 0.7567 |
| No log | 0.9744 | 38 | 0.5576 | 0.0 | 0.5576 | 0.7467 |
| No log | 1.0256 | 40 | 0.5471 | -0.0081 | 0.5471 | 0.7397 |
| No log | 1.0769 | 42 | 0.6616 | 0.2464 | 0.6616 | 0.8134 |
| No log | 1.1282 | 44 | 0.6593 | 0.2381 | 0.6593 | 0.8120 |
| No log | 1.1795 | 46 | 0.6753 | 0.2381 | 0.6753 | 0.8218 |
| No log | 1.2308 | 48 | 0.7559 | 0.2281 | 0.7559 | 0.8694 |
| No log | 1.2821 | 50 | 0.6564 | 0.2464 | 0.6564 | 0.8102 |
| No log | 1.3333 | 52 | 0.5206 | 0.0909 | 0.5206 | 0.7215 |
| No log | 1.3846 | 54 | 0.5451 | 0.0 | 0.5451 | 0.7383 |
| No log | 1.4359 | 56 | 0.5343 | -0.0081 | 0.5343 | 0.7309 |
| No log | 1.4872 | 58 | 0.5481 | 0.3464 | 0.5481 | 0.7403 |
| No log | 1.5385 | 60 | 0.7829 | 0.1712 | 0.7829 | 0.8848 |
| No log | 1.5897 | 62 | 0.8628 | 0.1453 | 0.8628 | 0.9289 |
| No log | 1.6410 | 64 | 0.5412 | 0.3333 | 0.5412 | 0.7356 |
| No log | 1.6923 | 66 | 0.4876 | 0.0815 | 0.4876 | 0.6983 |
| No log | 1.7436 | 68 | 0.5756 | 0.3333 | 0.5756 | 0.7587 |
| No log | 1.7949 | 70 | 0.8552 | 0.1864 | 0.8552 | 0.9248 |
| No log | 1.8462 | 72 | 0.6174 | 0.3016 | 0.6174 | 0.7858 |
| No log | 1.8974 | 74 | 0.4937 | 0.0909 | 0.4937 | 0.7026 |
| No log | 1.9487 | 76 | 0.5451 | 0.1765 | 0.5451 | 0.7383 |
| No log | 2.0 | 78 | 0.5165 | 0.1642 | 0.5165 | 0.7187 |
| No log | 2.0513 | 80 | 0.7536 | 0.2554 | 0.7536 | 0.8681 |
| No log | 2.1026 | 82 | 0.8214 | 0.2263 | 0.8214 | 0.9063 |
| No log | 2.1538 | 84 | 0.5261 | 0.1888 | 0.5261 | 0.7254 |
| No log | 2.2051 | 86 | 0.5775 | 0.1206 | 0.5775 | 0.7599 |
| No log | 2.2564 | 88 | 0.5140 | 0.2318 | 0.5140 | 0.7169 |
| No log | 2.3077 | 90 | 0.6680 | 0.2607 | 0.6680 | 0.8173 |
| No log | 2.3590 | 92 | 0.6941 | 0.2637 | 0.6941 | 0.8331 |
| No log | 2.4103 | 94 | 0.4842 | 0.3007 | 0.4842 | 0.6958 |
| No log | 2.4615 | 96 | 0.5143 | 0.2208 | 0.5143 | 0.7171 |
| No log | 2.5128 | 98 | 0.4838 | 0.3766 | 0.4838 | 0.6956 |
| No log | 2.5641 | 100 | 0.6593 | 0.3271 | 0.6593 | 0.8120 |
| No log | 2.6154 | 102 | 0.5159 | 0.4894 | 0.5159 | 0.7183 |
| No log | 2.6667 | 104 | 0.8087 | 0.2258 | 0.8087 | 0.8993 |
| No log | 2.7179 | 106 | 1.0627 | 0.1126 | 1.0627 | 1.0309 |
| No log | 2.7692 | 108 | 0.9623 | 0.1378 | 0.9623 | 0.9809 |
| No log | 2.8205 | 110 | 0.5297 | 0.3878 | 0.5297 | 0.7278 |
| No log | 2.8718 | 112 | 0.5850 | 0.3623 | 0.5850 | 0.7649 |
| No log | 2.9231 | 114 | 0.5198 | 0.4051 | 0.5198 | 0.7210 |
| No log | 2.9744 | 116 | 0.4905 | 0.4146 | 0.4905 | 0.7003 |
| No log | 3.0256 | 118 | 0.5505 | 0.2393 | 0.5505 | 0.7420 |
| No log | 3.0769 | 120 | 0.5063 | 0.2105 | 0.5063 | 0.7115 |
| No log | 3.1282 | 122 | 0.5298 | 0.36 | 0.5298 | 0.7279 |
| No log | 3.1795 | 124 | 0.5995 | 0.2000 | 0.5995 | 0.7743 |
| No log | 3.2308 | 126 | 0.7198 | 0.2577 | 0.7198 | 0.8484 |
| No log | 3.2821 | 128 | 0.5329 | 0.2865 | 0.5329 | 0.7300 |
| No log | 3.3333 | 130 | 0.5976 | 0.3561 | 0.5976 | 0.7730 |
| No log | 3.3846 | 132 | 0.5505 | 0.3469 | 0.5505 | 0.7419 |
| No log | 3.4359 | 134 | 0.5723 | 0.2289 | 0.5723 | 0.7565 |
| No log | 3.4872 | 136 | 0.5584 | 0.2485 | 0.5584 | 0.7473 |
| No log | 3.5385 | 138 | 0.6117 | 0.3814 | 0.6117 | 0.7821 |
| No log | 3.5897 | 140 | 0.5720 | 0.3182 | 0.5720 | 0.7563 |
| No log | 3.6410 | 142 | 0.5964 | 0.3297 | 0.5964 | 0.7722 |
| No log | 3.6923 | 144 | 0.6051 | 0.1917 | 0.6051 | 0.7779 |
| No log | 3.7436 | 146 | 0.7445 | 0.2653 | 0.7445 | 0.8628 |
| No log | 3.7949 | 148 | 0.5849 | 0.3052 | 0.5849 | 0.7648 |
| No log | 3.8462 | 150 | 0.5779 | 0.3462 | 0.5779 | 0.7602 |
| No log | 3.8974 | 152 | 0.8398 | 0.2281 | 0.8398 | 0.9164 |
| No log | 3.9487 | 154 | 0.8749 | 0.2275 | 0.8749 | 0.9354 |
| No log | 4.0 | 156 | 0.5431 | 0.3367 | 0.5431 | 0.7370 |
| No log | 4.0513 | 158 | 0.5265 | 0.2889 | 0.5265 | 0.7256 |
| No log | 4.1026 | 160 | 0.5356 | 0.2914 | 0.5356 | 0.7319 |
| No log | 4.1538 | 162 | 0.5829 | 0.2174 | 0.5829 | 0.7635 |
| No log | 4.2051 | 164 | 0.5487 | 0.3149 | 0.5487 | 0.7407 |
| No log | 4.2564 | 166 | 0.5911 | 0.2088 | 0.5911 | 0.7688 |
| No log | 4.3077 | 168 | 0.5719 | 0.2340 | 0.5719 | 0.7562 |
| No log | 4.3590 | 170 | 0.6576 | 0.3214 | 0.6576 | 0.8109 |
| No log | 4.4103 | 172 | 0.7102 | 0.2900 | 0.7102 | 0.8427 |
| No log | 4.4615 | 174 | 0.5599 | 0.3520 | 0.5599 | 0.7483 |
| No log | 4.5128 | 176 | 0.8254 | 0.2287 | 0.8254 | 0.9085 |
| No log | 4.5641 | 178 | 0.8380 | 0.2281 | 0.8380 | 0.9154 |
| No log | 4.6154 | 180 | 0.5665 | 0.2418 | 0.5665 | 0.7526 |
| No log | 4.6667 | 182 | 0.6345 | 0.2593 | 0.6345 | 0.7966 |
| No log | 4.7179 | 184 | 0.8270 | 0.1736 | 0.8270 | 0.9094 |
| No log | 4.7692 | 186 | 0.7337 | 0.2857 | 0.7337 | 0.8565 |
| No log | 4.8205 | 188 | 0.5567 | 0.1020 | 0.5567 | 0.7461 |
| No log | 4.8718 | 190 | 0.5412 | 0.2432 | 0.5412 | 0.7356 |
| No log | 4.9231 | 192 | 0.6186 | 0.2809 | 0.6186 | 0.7865 |
| No log | 4.9744 | 194 | 0.5463 | 0.3054 | 0.5463 | 0.7391 |
| No log | 5.0256 | 196 | 0.5743 | 0.3663 | 0.5743 | 0.7579 |
| No log | 5.0769 | 198 | 0.5990 | 0.3462 | 0.5990 | 0.7740 |
| No log | 5.1282 | 200 | 0.5764 | 0.3684 | 0.5764 | 0.7592 |
| No log | 5.1795 | 202 | 0.6914 | 0.3684 | 0.6914 | 0.8315 |
| No log | 5.2308 | 204 | 0.6459 | 0.3242 | 0.6459 | 0.8037 |
| No log | 5.2821 | 206 | 0.6272 | 0.3043 | 0.6272 | 0.7920 |
| No log | 5.3333 | 208 | 0.6074 | 0.3363 | 0.6074 | 0.7793 |
| No log | 5.3846 | 210 | 0.5901 | 0.3171 | 0.5901 | 0.7682 |
| No log | 5.4359 | 212 | 0.5776 | 0.2340 | 0.5776 | 0.7600 |
| No log | 5.4872 | 214 | 0.5933 | 0.2527 | 0.5933 | 0.7703 |
| No log | 5.5385 | 216 | 0.5584 | 0.2766 | 0.5584 | 0.7472 |
| No log | 5.5897 | 218 | 0.5567 | 0.2093 | 0.5567 | 0.7462 |
| No log | 5.6410 | 220 | 0.6042 | 0.2273 | 0.6042 | 0.7773 |
| No log | 5.6923 | 222 | 0.6333 | 0.2258 | 0.6333 | 0.7958 |
| No log | 5.7436 | 224 | 0.6430 | 0.2258 | 0.6430 | 0.8019 |
| No log | 5.7949 | 226 | 0.6190 | 0.1739 | 0.6190 | 0.7867 |
| No log | 5.8462 | 228 | 0.6124 | 0.3161 | 0.6124 | 0.7825 |
| No log | 5.8974 | 230 | 0.8424 | 0.2653 | 0.8424 | 0.9178 |
| No log | 5.9487 | 232 | 0.9167 | 0.2756 | 0.9167 | 0.9574 |
| No log | 6.0 | 234 | 0.7262 | 0.2857 | 0.7262 | 0.8522 |
| No log | 6.0513 | 236 | 0.5851 | 0.1515 | 0.5851 | 0.7649 |
| No log | 6.1026 | 238 | 0.6395 | 0.2577 | 0.6395 | 0.7997 |
| No log | 6.1538 | 240 | 0.6226 | 0.1209 | 0.6226 | 0.7891 |
| No log | 6.2051 | 242 | 0.6106 | 0.1186 | 0.6106 | 0.7814 |
| No log | 6.2564 | 244 | 0.6076 | 0.2941 | 0.6076 | 0.7795 |
| No log | 6.3077 | 246 | 0.6287 | 0.2653 | 0.6287 | 0.7929 |
| No log | 6.3590 | 248 | 0.6004 | 0.1345 | 0.6004 | 0.7748 |
| No log | 6.4103 | 250 | 0.6204 | 0.1648 | 0.6204 | 0.7876 |
| No log | 6.4615 | 252 | 0.6730 | 0.2079 | 0.6730 | 0.8204 |
| No log | 6.5128 | 254 | 0.6689 | 0.2079 | 0.6689 | 0.8179 |
| No log | 6.5641 | 256 | 0.6154 | 0.1765 | 0.6154 | 0.7845 |
| No log | 6.6154 | 258 | 0.6903 | 0.4286 | 0.6903 | 0.8308 |
| No log | 6.6667 | 260 | 0.6738 | 0.4286 | 0.6738 | 0.8208 |
| No log | 6.7179 | 262 | 0.5892 | 0.2607 | 0.5892 | 0.7676 |
| No log | 6.7692 | 264 | 0.5986 | 0.2941 | 0.5986 | 0.7737 |
| No log | 6.8205 | 266 | 0.6382 | 0.2917 | 0.6382 | 0.7989 |
| No log | 6.8718 | 268 | 0.5861 | 0.2865 | 0.5861 | 0.7655 |
| No log | 6.9231 | 270 | 0.5543 | 0.25 | 0.5543 | 0.7445 |
| No log | 6.9744 | 272 | 0.5451 | 0.25 | 0.5451 | 0.7383 |
| No log | 7.0256 | 274 | 0.5310 | 0.2457 | 0.5310 | 0.7287 |
| No log | 7.0769 | 276 | 0.5244 | 0.2457 | 0.5244 | 0.7241 |
| No log | 7.1282 | 278 | 0.5187 | 0.1807 | 0.5187 | 0.7202 |
| No log | 7.1795 | 280 | 0.5180 | 0.1716 | 0.5180 | 0.7197 |
| No log | 7.2308 | 282 | 0.5516 | 0.3103 | 0.5516 | 0.7427 |
| No log | 7.2821 | 284 | 0.5943 | 0.3585 | 0.5943 | 0.7709 |
| No log | 7.3333 | 286 | 0.5729 | 0.3010 | 0.5729 | 0.7569 |
| No log | 7.3846 | 288 | 0.5407 | 0.1813 | 0.5407 | 0.7353 |
| No log | 7.4359 | 290 | 0.5842 | 0.2967 | 0.5842 | 0.7643 |
| No log | 7.4872 | 292 | 0.5986 | 0.2542 | 0.5986 | 0.7737 |
| No log | 7.5385 | 294 | 0.5822 | 0.2457 | 0.5822 | 0.7630 |
| No log | 7.5897 | 296 | 0.5944 | 0.2323 | 0.5944 | 0.7710 |
| No log | 7.6410 | 298 | 0.6361 | 0.3077 | 0.6361 | 0.7975 |
| No log | 7.6923 | 300 | 0.6059 | 0.2653 | 0.6059 | 0.7784 |
| No log | 7.7436 | 302 | 0.5832 | 0.2265 | 0.5832 | 0.7637 |
| No log | 7.7949 | 304 | 0.6126 | 0.2090 | 0.6126 | 0.7827 |
| No log | 7.8462 | 306 | 0.6148 | 0.2090 | 0.6148 | 0.7841 |
| No log | 7.8974 | 308 | 0.5860 | 0.2096 | 0.5860 | 0.7655 |
| No log | 7.9487 | 310 | 0.5615 | 0.2516 | 0.5615 | 0.7493 |
| No log | 8.0 | 312 | 0.5529 | 0.1899 | 0.5529 | 0.7436 |
| No log | 8.0513 | 314 | 0.5474 | 0.1373 | 0.5474 | 0.7399 |
| No log | 8.1026 | 316 | 0.5444 | 0.1467 | 0.5444 | 0.7379 |
| No log | 8.1538 | 318 | 0.5491 | 0.2096 | 0.5491 | 0.7410 |
| No log | 8.2051 | 320 | 0.5569 | 0.2096 | 0.5569 | 0.7462 |
| No log | 8.2564 | 322 | 0.5648 | 0.2093 | 0.5648 | 0.7516 |
| No log | 8.3077 | 324 | 0.5772 | 0.2542 | 0.5772 | 0.7597 |
| No log | 8.3590 | 326 | 0.5554 | 0.1543 | 0.5554 | 0.7453 |
| No log | 8.4103 | 328 | 0.5378 | 0.2000 | 0.5378 | 0.7333 |
| No log | 8.4615 | 330 | 0.5313 | 0.2000 | 0.5313 | 0.7289 |
| No log | 8.5128 | 332 | 0.5275 | 0.2393 | 0.5275 | 0.7263 |
| No log | 8.5641 | 334 | 0.5230 | 0.2393 | 0.5230 | 0.7232 |
| No log | 8.6154 | 336 | 0.5204 | 0.2000 | 0.5204 | 0.7214 |
| No log | 8.6667 | 338 | 0.5315 | 0.2593 | 0.5315 | 0.7290 |
| No log | 8.7179 | 340 | 0.5650 | 0.2626 | 0.5650 | 0.7517 |
| No log | 8.7692 | 342 | 0.5702 | 0.2727 | 0.5702 | 0.7551 |
| No log | 8.8205 | 344 | 0.5455 | 0.2542 | 0.5455 | 0.7386 |
| No log | 8.8718 | 346 | 0.5204 | 0.2000 | 0.5204 | 0.7214 |
| No log | 8.9231 | 348 | 0.5247 | 0.3446 | 0.5247 | 0.7244 |
| No log | 8.9744 | 350 | 0.5338 | 0.3016 | 0.5338 | 0.7306 |
| No log | 9.0256 | 352 | 0.5323 | 0.3016 | 0.5323 | 0.7296 |
| No log | 9.0769 | 354 | 0.5244 | 0.3446 | 0.5244 | 0.7241 |
| No log | 9.1282 | 356 | 0.5184 | 0.3216 | 0.5184 | 0.7200 |
| No log | 9.1795 | 358 | 0.5222 | 0.2593 | 0.5222 | 0.7226 |
| No log | 9.2308 | 360 | 0.5274 | 0.3023 | 0.5274 | 0.7262 |
| No log | 9.2821 | 362 | 0.5339 | 0.3023 | 0.5339 | 0.7307 |
| No log | 9.3333 | 364 | 0.5372 | 0.3023 | 0.5372 | 0.7330 |
| No log | 9.3846 | 366 | 0.5442 | 0.2542 | 0.5442 | 0.7377 |
| No log | 9.4359 | 368 | 0.5485 | 0.2542 | 0.5485 | 0.7406 |
| No log | 9.4872 | 370 | 0.5454 | 0.2542 | 0.5454 | 0.7385 |
| No log | 9.5385 | 372 | 0.5345 | 0.3023 | 0.5345 | 0.7311 |
| No log | 9.5897 | 374 | 0.5280 | 0.3023 | 0.5280 | 0.7266 |
| No log | 9.6410 | 376 | 0.5253 | 0.2593 | 0.5253 | 0.7248 |
| No log | 9.6923 | 378 | 0.5244 | 0.3455 | 0.5244 | 0.7241 |
| No log | 9.7436 | 380 | 0.5250 | 0.3455 | 0.5250 | 0.7246 |
| No log | 9.7949 | 382 | 0.5260 | 0.3455 | 0.5260 | 0.7253 |
| No log | 9.8462 | 384 | 0.5264 | 0.2593 | 0.5264 | 0.7256 |
| No log | 9.8974 | 386 | 0.5280 | 0.3054 | 0.5280 | 0.7266 |
| No log | 9.9487 | 388 | 0.5298 | 0.3023 | 0.5298 | 0.7279 |
| No log | 10.0 | 390 | 0.5306 | 0.3023 | 0.5306 | 0.7284 |

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
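
For reference, a hedged usage sketch with the framework versions above is shown below. It assumes the checkpoint loads as a sequence-classification model with a single-logit regression head (inferred from the MSE/RMSE metrics); the example sentence is an arbitrary Arabic placeholder.

```python
# Hedged inference sketch; the regression-head assumption and the
# example input are illustrative, not confirmed by the model card.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo = "MayBashendy/ArabicNewSplits6_FineTuningAraBERT_run2_AugV5_k8_task3_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)
model.eval()

# "An example of an Arabic sentence." (placeholder input)
inputs = tokenizer("مثال على جملة عربية.", return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()  # predicted organization score
print(score)
```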