ArabicNewSplits6_WithDuplicationsForScore5_FineTuningAraBERT_run1_AugV5_k7_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how these metrics can be reproduced follows the list):

  • Loss: 0.8799
  • Qwk: 0.5295
  • Mse: 0.8799
  • Rmse: 0.9380

Model description

More information needed

Intended uses & limitations

More information needed
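
Although no usage guidance is given, the identical Loss and Mse values above suggest a single-logit regression head, so the model can presumably be loaded for inference as sketched below. This is an assumption, not documented by the card; any task-specific preprocessing (e.g. the Farasa segmentation some AraBERT variants expect) is likewise undocumented and omitted:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo = "MayBashendy/ArabicNewSplits6_WithDuplicationsForScore5_FineTuningAraBERT_run1_AugV5_k7_task2_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)  # assumed to be bundled with the checkpoint
model = AutoModelForSequenceClassification.from_pretrained(repo)
model.eval()

text = "..."  # an Arabic essay to score for organization
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()  # continuous organization score
print(score)
```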

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a Trainer configuration sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
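
These values map directly onto a standard Hugging Face `TrainingArguments` configuration; the Adam betas/epsilon and the linear schedule listed above are the library defaults, shown explicitly here. A minimal sketch, assuming the stock `Trainer` was used (the output directory is a placeholder):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="outputs",           # placeholder; not documented in the card
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,                 # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```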

Training results

| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:------:|:------:|
| No log | 0.05 | 2 | 3.9185 | -0.0188 | 3.9185 | 1.9795 |
| No log | 0.1 | 4 | 2.1508 | 0.1087 | 2.1508 | 1.4666 |
| No log | 0.15 | 6 | 1.0710 | 0.0773 | 1.0710 | 1.0349 |
| No log | 0.2 | 8 | 0.8727 | -0.0270 | 0.8727 | 0.9342 |
| No log | 0.25 | 10 | 0.7158 | 0.1737 | 0.7158 | 0.8460 |
| No log | 0.3 | 12 | 0.6735 | 0.2287 | 0.6735 | 0.8207 |
| No log | 0.35 | 14 | 0.6730 | 0.2242 | 0.6730 | 0.8203 |
| No log | 0.4 | 16 | 0.6932 | 0.1819 | 0.6932 | 0.8326 |
| No log | 0.45 | 18 | 0.7006 | 0.1687 | 0.7006 | 0.8370 |
| No log | 0.5 | 20 | 0.6701 | 0.2242 | 0.6701 | 0.8186 |
| No log | 0.55 | 22 | 0.6604 | 0.2598 | 0.6604 | 0.8127 |
| No log | 0.6 | 24 | 0.7019 | 0.2414 | 0.7019 | 0.8378 |
| No log | 0.65 | 26 | 0.6455 | 0.3011 | 0.6455 | 0.8034 |
| No log | 0.7 | 28 | 0.6087 | 0.2243 | 0.6087 | 0.7802 |
| No log | 0.75 | 30 | 0.6406 | 0.2011 | 0.6406 | 0.8004 |
| No log | 0.8 | 32 | 0.6838 | 0.3008 | 0.6838 | 0.8269 |
| No log | 0.85 | 34 | 0.6613 | 0.2641 | 0.6613 | 0.8132 |
| No log | 0.9 | 36 | 0.7120 | 0.2001 | 0.7120 | 0.8438 |
| No log | 0.95 | 38 | 0.8552 | 0.1792 | 0.8552 | 0.9248 |
| No log | 1.0 | 40 | 0.7594 | 0.2358 | 0.7594 | 0.8714 |
| No log | 1.05 | 42 | 0.6820 | 0.3347 | 0.6820 | 0.8259 |
| No log | 1.1 | 44 | 0.5949 | 0.4294 | 0.5949 | 0.7713 |
| No log | 1.15 | 46 | 0.5369 | 0.4642 | 0.5369 | 0.7327 |
| No log | 1.2 | 48 | 0.5274 | 0.4199 | 0.5274 | 0.7262 |
| No log | 1.25 | 50 | 0.5501 | 0.4219 | 0.5501 | 0.7417 |
| No log | 1.3 | 52 | 0.5920 | 0.4571 | 0.5920 | 0.7694 |
| No log | 1.35 | 54 | 0.5958 | 0.5039 | 0.5958 | 0.7719 |
| No log | 1.4 | 56 | 0.5225 | 0.4227 | 0.5225 | 0.7228 |
| No log | 1.45 | 58 | 0.6481 | 0.3933 | 0.6481 | 0.8051 |
| No log | 1.5 | 60 | 0.7828 | 0.4528 | 0.7828 | 0.8847 |
| No log | 1.55 | 62 | 0.7080 | 0.3925 | 0.7080 | 0.8414 |
| No log | 1.6 | 64 | 0.5850 | 0.4944 | 0.5850 | 0.7649 |
| No log | 1.65 | 66 | 0.5671 | 0.5428 | 0.5671 | 0.7530 |
| No log | 1.7 | 68 | 0.5620 | 0.5400 | 0.5620 | 0.7497 |
| No log | 1.75 | 70 | 0.5420 | 0.5310 | 0.5420 | 0.7362 |
| No log | 1.8 | 72 | 0.5787 | 0.4812 | 0.5787 | 0.7607 |
| No log | 1.85 | 74 | 0.6853 | 0.4555 | 0.6853 | 0.8278 |
| No log | 1.9 | 76 | 0.6915 | 0.4514 | 0.6915 | 0.8315 |
| No log | 1.95 | 78 | 0.6171 | 0.5161 | 0.6171 | 0.7856 |
| No log | 2.0 | 80 | 0.6330 | 0.5352 | 0.6330 | 0.7956 |
| No log | 2.05 | 82 | 0.6769 | 0.5356 | 0.6769 | 0.8228 |
| No log | 2.1 | 84 | 0.6994 | 0.5813 | 0.6994 | 0.8363 |
| No log | 2.15 | 86 | 0.7551 | 0.4841 | 0.7551 | 0.8689 |
| No log | 2.2 | 88 | 0.8752 | 0.4268 | 0.8752 | 0.9355 |
| No log | 2.25 | 90 | 0.9943 | 0.4500 | 0.9943 | 0.9972 |
| No log | 2.3 | 92 | 1.0307 | 0.4674 | 1.0307 | 1.0152 |
| No log | 2.35 | 94 | 0.8568 | 0.4998 | 0.8568 | 0.9256 |
| No log | 2.4 | 96 | 0.7686 | 0.5203 | 0.7686 | 0.8767 |
| No log | 2.45 | 98 | 0.7452 | 0.5410 | 0.7452 | 0.8633 |
| No log | 2.5 | 100 | 0.7344 | 0.5043 | 0.7344 | 0.8570 |
| No log | 2.55 | 102 | 0.8165 | 0.4409 | 0.8165 | 0.9036 |
| No log | 2.6 | 104 | 1.0046 | 0.4442 | 1.0046 | 1.0023 |
| No log | 2.65 | 106 | 1.0594 | 0.4342 | 1.0594 | 1.0293 |
| No log | 2.7 | 108 | 0.9331 | 0.4285 | 0.9331 | 0.9660 |
| No log | 2.75 | 110 | 0.7676 | 0.4954 | 0.7676 | 0.8761 |
| No log | 2.8 | 112 | 0.7245 | 0.5751 | 0.7245 | 0.8511 |
| No log | 2.85 | 114 | 0.7318 | 0.4968 | 0.7318 | 0.8555 |
| No log | 2.9 | 116 | 0.8225 | 0.4589 | 0.8225 | 0.9069 |
| No log | 2.95 | 118 | 1.0257 | 0.4373 | 1.0257 | 1.0128 |
| No log | 3.0 | 120 | 1.0772 | 0.4487 | 1.0772 | 1.0379 |
| No log | 3.05 | 122 | 1.0473 | 0.4516 | 1.0473 | 1.0234 |
| No log | 3.1 | 124 | 1.0938 | 0.4569 | 1.0938 | 1.0458 |
| No log | 3.15 | 126 | 1.1155 | 0.4381 | 1.1155 | 1.0562 |
| No log | 3.2 | 128 | 1.1861 | 0.4481 | 1.1861 | 1.0891 |
| No log | 3.25 | 130 | 1.1134 | 0.4961 | 1.1134 | 1.0552 |
| No log | 3.3 | 132 | 1.0816 | 0.4935 | 1.0816 | 1.0400 |
| No log | 3.35 | 134 | 1.0504 | 0.4347 | 1.0504 | 1.0249 |
| No log | 3.4 | 136 | 1.0916 | 0.4055 | 1.0916 | 1.0448 |
| No log | 3.45 | 138 | 1.0960 | 0.3987 | 1.0960 | 1.0469 |
| No log | 3.5 | 140 | 1.1371 | 0.4188 | 1.1371 | 1.0663 |
| No log | 3.55 | 142 | 0.9978 | 0.4267 | 0.9978 | 0.9989 |
| No log | 3.6 | 144 | 1.0332 | 0.4125 | 1.0332 | 1.0165 |
| No log | 3.65 | 146 | 1.1530 | 0.4268 | 1.1530 | 1.0738 |
| No log | 3.7 | 148 | 1.1491 | 0.4267 | 1.1491 | 1.0720 |
| No log | 3.75 | 150 | 0.9310 | 0.4040 | 0.9310 | 0.9649 |
| No log | 3.8 | 152 | 0.8187 | 0.4542 | 0.8187 | 0.9048 |
| No log | 3.85 | 154 | 0.7883 | 0.4941 | 0.7883 | 0.8878 |
| No log | 3.9 | 156 | 0.7454 | 0.4654 | 0.7454 | 0.8634 |
| No log | 3.95 | 158 | 0.7567 | 0.4691 | 0.7567 | 0.8699 |
| No log | 4.0 | 160 | 0.8048 | 0.4467 | 0.8048 | 0.8971 |
| No log | 4.05 | 162 | 0.9154 | 0.4858 | 0.9154 | 0.9567 |
| No log | 4.1 | 164 | 1.0442 | 0.4617 | 1.0442 | 1.0219 |
| No log | 4.15 | 166 | 1.0674 | 0.4712 | 1.0674 | 1.0332 |
| No log | 4.2 | 168 | 1.1179 | 0.4639 | 1.1179 | 1.0573 |
| No log | 4.25 | 170 | 1.2077 | 0.4762 | 1.2077 | 1.0989 |
| No log | 4.3 | 172 | 1.1992 | 0.4888 | 1.1992 | 1.0951 |
| No log | 4.35 | 174 | 1.2418 | 0.4537 | 1.2418 | 1.1144 |
| No log | 4.4 | 176 | 1.0994 | 0.4707 | 1.0994 | 1.0485 |
| No log | 4.45 | 178 | 0.8958 | 0.4468 | 0.8958 | 0.9465 |
| No log | 4.5 | 180 | 0.8244 | 0.4483 | 0.8244 | 0.9080 |
| No log | 4.55 | 182 | 0.8533 | 0.4404 | 0.8533 | 0.9237 |
| No log | 4.6 | 184 | 0.8856 | 0.4439 | 0.8856 | 0.9411 |
| No log | 4.65 | 186 | 0.9009 | 0.4449 | 0.9009 | 0.9491 |
| No log | 4.7 | 188 | 0.9093 | 0.4621 | 0.9093 | 0.9536 |
| No log | 4.75 | 190 | 0.9127 | 0.4627 | 0.9127 | 0.9554 |
| No log | 4.8 | 192 | 0.9144 | 0.5017 | 0.9144 | 0.9562 |
| No log | 4.85 | 194 | 0.9177 | 0.4830 | 0.9177 | 0.9580 |
| No log | 4.9 | 196 | 0.9575 | 0.4685 | 0.9575 | 0.9785 |
| No log | 4.95 | 198 | 1.0899 | 0.4442 | 1.0899 | 1.0440 |
| No log | 5.0 | 200 | 1.1448 | 0.4507 | 1.1448 | 1.0699 |
| No log | 5.05 | 202 | 1.0722 | 0.4430 | 1.0722 | 1.0355 |
| No log | 5.1 | 204 | 1.0174 | 0.4492 | 1.0174 | 1.0087 |
| No log | 5.15 | 206 | 1.0028 | 0.4568 | 1.0028 | 1.0014 |
| No log | 5.2 | 208 | 1.0147 | 0.4524 | 1.0147 | 1.0073 |
| No log | 5.25 | 210 | 1.0904 | 0.4469 | 1.0904 | 1.0442 |
| No log | 5.3 | 212 | 1.1277 | 0.4297 | 1.1277 | 1.0619 |
| No log | 5.35 | 214 | 1.0965 | 0.4396 | 1.0965 | 1.0472 |
| No log | 5.4 | 216 | 1.0296 | 0.4393 | 1.0296 | 1.0147 |
| No log | 5.45 | 218 | 0.9526 | 0.5248 | 0.9526 | 0.9760 |
| No log | 5.5 | 220 | 0.9570 | 0.4599 | 0.9570 | 0.9783 |
| No log | 5.55 | 222 | 0.9493 | 0.4599 | 0.9493 | 0.9743 |
| No log | 5.6 | 224 | 0.9349 | 0.5069 | 0.9349 | 0.9669 |
| No log | 5.65 | 226 | 0.9856 | 0.4685 | 0.9856 | 0.9928 |
| No log | 5.7 | 228 | 1.0140 | 0.4285 | 1.0140 | 1.0070 |
| No log | 5.75 | 230 | 0.9917 | 0.4284 | 0.9917 | 0.9959 |
| No log | 5.8 | 232 | 0.9682 | 0.4457 | 0.9682 | 0.9840 |
| No log | 5.85 | 234 | 0.9417 | 0.4529 | 0.9417 | 0.9704 |
| No log | 5.9 | 236 | 0.9152 | 0.4900 | 0.9152 | 0.9567 |
| No log | 5.95 | 238 | 0.9199 | 0.4998 | 0.9199 | 0.9591 |
| No log | 6.0 | 240 | 0.9434 | 0.4934 | 0.9434 | 0.9713 |
| No log | 6.05 | 242 | 0.9620 | 0.5012 | 0.9620 | 0.9808 |
| No log | 6.1 | 244 | 0.9556 | 0.5359 | 0.9556 | 0.9776 |
| No log | 6.15 | 246 | 0.9630 | 0.5156 | 0.9630 | 0.9813 |
| No log | 6.2 | 248 | 0.9795 | 0.4773 | 0.9795 | 0.9897 |
| No log | 6.25 | 250 | 0.9398 | 0.4837 | 0.9398 | 0.9695 |
| No log | 6.3 | 252 | 0.8764 | 0.5277 | 0.8764 | 0.9361 |
| No log | 6.35 | 254 | 0.8306 | 0.5234 | 0.8306 | 0.9114 |
| No log | 6.4 | 256 | 0.8131 | 0.5494 | 0.8131 | 0.9017 |
| No log | 6.45 | 258 | 0.8215 | 0.5440 | 0.8215 | 0.9064 |
| No log | 6.5 | 260 | 0.8564 | 0.5277 | 0.8564 | 0.9254 |
| No log | 6.55 | 262 | 0.9370 | 0.4737 | 0.9370 | 0.9680 |
| No log | 6.6 | 264 | 1.0115 | 0.4404 | 1.0115 | 1.0058 |
| No log | 6.65 | 266 | 1.0214 | 0.4458 | 1.0214 | 1.0107 |
| No log | 6.7 | 268 | 0.9487 | 0.4455 | 0.9487 | 0.9740 |
| No log | 6.75 | 270 | 0.9188 | 0.4837 | 0.9188 | 0.9585 |
| No log | 6.8 | 272 | 0.8919 | 0.5077 | 0.8919 | 0.9444 |
| No log | 6.85 | 274 | 0.8759 | 0.5342 | 0.8759 | 0.9359 |
| No log | 6.9 | 276 | 0.8860 | 0.5216 | 0.8860 | 0.9413 |
| No log | 6.95 | 278 | 0.9130 | 0.5053 | 0.9130 | 0.9555 |
| No log | 7.0 | 280 | 0.9217 | 0.4637 | 0.9217 | 0.9601 |
| No log | 7.05 | 282 | 0.9432 | 0.4443 | 0.9432 | 0.9712 |
| No log | 7.1 | 284 | 0.9674 | 0.4536 | 0.9674 | 0.9836 |
| No log | 7.15 | 286 | 0.9724 | 0.4534 | 0.9724 | 0.9861 |
| No log | 7.2 | 288 | 0.9540 | 0.4423 | 0.9540 | 0.9767 |
| No log | 7.25 | 290 | 0.8900 | 0.4840 | 0.8900 | 0.9434 |
| No log | 7.3 | 292 | 0.8736 | 0.5106 | 0.8736 | 0.9347 |
| No log | 7.35 | 294 | 0.8503 | 0.5407 | 0.8503 | 0.9221 |
| No log | 7.4 | 296 | 0.8677 | 0.5305 | 0.8677 | 0.9315 |
| No log | 7.45 | 298 | 0.9091 | 0.5144 | 0.9091 | 0.9535 |
| No log | 7.5 | 300 | 0.9583 | 0.4809 | 0.9583 | 0.9789 |
| No log | 7.55 | 302 | 1.0082 | 0.4609 | 1.0082 | 1.0041 |
| No log | 7.6 | 304 | 1.0396 | 0.4583 | 1.0396 | 1.0196 |
| No log | 7.65 | 306 | 1.0052 | 0.4595 | 1.0052 | 1.0026 |
| No log | 7.7 | 308 | 0.9409 | 0.5157 | 0.9409 | 0.9700 |
| No log | 7.75 | 310 | 0.9042 | 0.5130 | 0.9042 | 0.9509 |
| No log | 7.8 | 312 | 0.9025 | 0.5130 | 0.9025 | 0.9500 |
| No log | 7.85 | 314 | 0.9361 | 0.5156 | 0.9361 | 0.9675 |
| No log | 7.9 | 316 | 0.9855 | 0.4553 | 0.9855 | 0.9927 |
| No log | 7.95 | 318 | 0.9869 | 0.4598 | 0.9869 | 0.9934 |
| No log | 8.0 | 320 | 1.0103 | 0.4595 | 1.0103 | 1.0051 |
| No log | 8.05 | 322 | 1.0527 | 0.4473 | 1.0527 | 1.0260 |
| No log | 8.1 | 324 | 1.0790 | 0.4467 | 1.0790 | 1.0387 |
| No log | 8.15 | 326 | 1.0533 | 0.4473 | 1.0533 | 1.0263 |
| No log | 8.2 | 328 | 0.9977 | 0.4475 | 0.9977 | 0.9988 |
| No log | 8.25 | 330 | 0.9316 | 0.4731 | 0.9316 | 0.9652 |
| No log | 8.3 | 332 | 0.8778 | 0.5324 | 0.8778 | 0.9369 |
| No log | 8.35 | 334 | 0.8555 | 0.5164 | 0.8555 | 0.9249 |
| No log | 8.4 | 336 | 0.8364 | 0.5193 | 0.8364 | 0.9145 |
| No log | 8.45 | 338 | 0.8363 | 0.5247 | 0.8363 | 0.9145 |
| No log | 8.5 | 340 | 0.8341 | 0.5247 | 0.8341 | 0.9133 |
| No log | 8.55 | 342 | 0.8400 | 0.5376 | 0.8400 | 0.9165 |
| No log | 8.6 | 344 | 0.8572 | 0.5428 | 0.8572 | 0.9258 |
| No log | 8.65 | 346 | 0.8769 | 0.5109 | 0.8769 | 0.9364 |
| No log | 8.7 | 348 | 0.9010 | 0.4662 | 0.9010 | 0.9492 |
| No log | 8.75 | 350 | 0.9173 | 0.4600 | 0.9173 | 0.9578 |
| No log | 8.8 | 352 | 0.9249 | 0.4600 | 0.9249 | 0.9617 |
| No log | 8.85 | 354 | 0.9324 | 0.4600 | 0.9324 | 0.9656 |
| No log | 8.9 | 356 | 0.9263 | 0.4982 | 0.9263 | 0.9624 |
| No log | 8.95 | 358 | 0.9129 | 0.5181 | 0.9129 | 0.9554 |
| No log | 9.0 | 360 | 0.9003 | 0.5272 | 0.9003 | 0.9488 |
| No log | 9.05 | 362 | 0.8916 | 0.5291 | 0.8916 | 0.9442 |
| No log | 9.1 | 364 | 0.8888 | 0.5291 | 0.8888 | 0.9428 |
| No log | 9.15 | 366 | 0.8862 | 0.5276 | 0.8862 | 0.9414 |
| No log | 9.2 | 368 | 0.8886 | 0.5376 | 0.8886 | 0.9427 |
| No log | 9.25 | 370 | 0.8932 | 0.5362 | 0.8932 | 0.9451 |
| No log | 9.3 | 372 | 0.8929 | 0.5362 | 0.8929 | 0.9449 |
| No log | 9.35 | 374 | 0.8906 | 0.5042 | 0.8906 | 0.9437 |
| No log | 9.4 | 376 | 0.8972 | 0.4670 | 0.8972 | 0.9472 |
| No log | 9.45 | 378 | 0.9006 | 0.4613 | 0.9006 | 0.9490 |
| No log | 9.5 | 380 | 0.9032 | 0.4662 | 0.9032 | 0.9504 |
| No log | 9.55 | 382 | 0.8988 | 0.4613 | 0.8988 | 0.9480 |
| No log | 9.6 | 384 | 0.8893 | 0.4730 | 0.8893 | 0.9430 |
| No log | 9.65 | 386 | 0.8808 | 0.5042 | 0.8808 | 0.9385 |
| No log | 9.7 | 388 | 0.8761 | 0.5295 | 0.8761 | 0.9360 |
| No log | 9.75 | 390 | 0.8743 | 0.5362 | 0.8743 | 0.9350 |
| No log | 9.8 | 392 | 0.8763 | 0.5295 | 0.8763 | 0.9361 |
| No log | 9.85 | 394 | 0.8779 | 0.5295 | 0.8779 | 0.9369 |
| No log | 9.9 | 396 | 0.8788 | 0.5295 | 0.8788 | 0.9374 |
| No log | 9.95 | 398 | 0.8797 | 0.5295 | 0.8797 | 0.9379 |
| No log | 10.0 | 400 | 0.8799 | 0.5295 | 0.8799 | 0.9380 |

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1