ArabicNewSplits6_WithDuplicationsForScore5_FineTuningAraBERT_run1_AugV5_k7_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6993
  • Qwk: 0.7333
  • Mse: 0.6993
  • Rmse: 0.8363
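
The card ships no usage instructions, so here is a minimal loading sketch, assuming the checkpoint is hosted on the Hugging Face Hub under the repo id in the title and carries a sequence-classification/regression head (the single Mse/Rmse figures suggest a one-dimensional score output):

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Repo id taken from the title of this card.
repo_id = (
    "MayBashendy/ArabicNewSplits6_WithDuplicationsForScore5_"
    "FineTuningAraBERT_run1_AugV5_k7_task1_organization"
)

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

# Placeholder Arabic input; in practice this would be a text to score
# on the task1 "organization" dimension.
inputs = tokenizer("نص تجريبي للتقييم", return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze()
print(score)
```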

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
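
For reproducibility, these values map onto `transformers` `TrainingArguments` as in the sketch below; the `output_dir`, dataset pipeline, and `Trainer` wiring are assumptions, since the card does not specify them:

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="arabert-task1-organization",  # assumption: not stated in the card
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,    # Adam settings match the values reported above
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```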

Training results

| Training Loss | Epoch  | Step | Validation Loss | Qwk     | Mse    | Rmse   |
|:-------------:|:------:|:----:|:---------------:|:-------:|:------:|:------:|
| No log        | 0.0488 | 2    | 5.2289          | -0.0125 | 5.2289 | 2.2867 |
| No log        | 0.0976 | 4    | 3.3366          | 0.0724  | 3.3366 | 1.8266 |
| No log        | 0.1463 | 6    | 1.9019          | 0.1067  | 1.9019 | 1.3791 |
| No log        | 0.1951 | 8    | 1.2326          | 0.2638  | 1.2326 | 1.1102 |
| No log        | 0.2439 | 10   | 1.1014          | 0.2716  | 1.1014 | 1.0495 |
| No log        | 0.2927 | 12   | 1.1617          | 0.2767  | 1.1617 | 1.0778 |
| No log        | 0.3415 | 14   | 1.1759          | 0.2220  | 1.1759 | 1.0844 |
| No log        | 0.3902 | 16   | 1.2008          | 0.2169  | 1.2008 | 1.0958 |
| No log        | 0.4390 | 18   | 1.1861          | 0.2405  | 1.1861 | 1.0891 |
| No log        | 0.4878 | 20   | 1.1501          | 0.3122  | 1.1501 | 1.0724 |
| No log        | 0.5366 | 22   | 1.1938          | 0.2894  | 1.1938 | 1.0926 |
| No log        | 0.5854 | 24   | 1.0897          | 0.3483  | 1.0897 | 1.0439 |
| No log        | 0.6341 | 26   | 0.8300          | 0.5858  | 0.8300 | 0.9110 |
| No log        | 0.6829 | 28   | 0.7933          | 0.5494  | 0.7933 | 0.8907 |
| No log        | 0.7317 | 30   | 0.7421          | 0.6020  | 0.7421 | 0.8615 |
| No log        | 0.7805 | 32   | 1.0604          | 0.4503  | 1.0604 | 1.0298 |
| No log        | 0.8293 | 34   | 1.2389          | 0.4077  | 1.2389 | 1.1131 |
| No log        | 0.8780 | 36   | 1.0027          | 0.4537  | 1.0027 | 1.0013 |
| No log        | 0.9268 | 38   | 0.7117          | 0.5932  | 0.7117 | 0.8436 |
| No log        | 0.9756 | 40   | 0.6602          | 0.6660  | 0.6602 | 0.8125 |
| No log        | 1.0244 | 42   | 0.6716          | 0.6878  | 0.6716 | 0.8195 |
| No log        | 1.0732 | 44   | 0.8267          | 0.6125  | 0.8267 | 0.9092 |
| No log        | 1.1220 | 46   | 1.3064          | 0.4873  | 1.3064 | 1.1430 |
| No log        | 1.1707 | 48   | 1.8600          | 0.3346  | 1.8600 | 1.3638 |
| No log        | 1.2195 | 50   | 1.8232          | 0.3528  | 1.8232 | 1.3503 |
| No log        | 1.2683 | 52   | 1.3077          | 0.4704  | 1.3077 | 1.1436 |
| No log        | 1.3171 | 54   | 0.7983          | 0.6885  | 0.7983 | 0.8935 |
| No log        | 1.3659 | 56   | 0.8301          | 0.6830  | 0.8301 | 0.9111 |
| No log        | 1.4146 | 58   | 0.8312          | 0.6774  | 0.8312 | 0.9117 |
| No log        | 1.4634 | 60   | 0.7897          | 0.6751  | 0.7897 | 0.8887 |
| No log        | 1.5122 | 62   | 0.8280          | 0.6393  | 0.8280 | 0.9100 |
| No log        | 1.5610 | 64   | 0.8927          | 0.5765  | 0.8927 | 0.9448 |
| No log        | 1.6098 | 66   | 0.8939          | 0.5901  | 0.8939 | 0.9455 |
| No log        | 1.6585 | 68   | 0.9149          | 0.5832  | 0.9149 | 0.9565 |
| No log        | 1.7073 | 70   | 0.8583          | 0.6406  | 0.8583 | 0.9265 |
| No log        | 1.7561 | 72   | 0.8638          | 0.6665  | 0.8638 | 0.9294 |
| No log        | 1.8049 | 74   | 0.9655          | 0.6185  | 0.9655 | 0.9826 |
| No log        | 1.8537 | 76   | 1.0856          | 0.5497  | 1.0856 | 1.0419 |
| No log        | 1.9024 | 78   | 1.0901          | 0.5603  | 1.0901 | 1.0441 |
| No log        | 1.9512 | 80   | 1.1505          | 0.5361  | 1.1505 | 1.0726 |
| No log        | 2.0    | 82   | 1.1015          | 0.5402  | 1.1015 | 1.0495 |
| No log        | 2.0488 | 84   | 0.9548          | 0.5818  | 0.9548 | 0.9771 |
| No log        | 2.0976 | 86   | 0.7507          | 0.6905  | 0.7507 | 0.8665 |
| No log        | 2.1463 | 88   | 0.7073          | 0.7033  | 0.7073 | 0.8410 |
| No log        | 2.1951 | 90   | 0.7086          | 0.7039  | 0.7086 | 0.8418 |
| No log        | 2.2439 | 92   | 0.7258          | 0.6814  | 0.7258 | 0.8519 |
| No log        | 2.2927 | 94   | 0.8829          | 0.5728  | 0.8829 | 0.9397 |
| No log        | 2.3415 | 96   | 1.0924          | 0.5633  | 1.0924 | 1.0452 |
| No log        | 2.3902 | 98   | 1.1703          | 0.5377  | 1.1703 | 1.0818 |
| No log        | 2.4390 | 100  | 0.9919          | 0.5857  | 0.9919 | 0.9959 |
| No log        | 2.4878 | 102  | 0.7314          | 0.6929  | 0.7314 | 0.8552 |
| No log        | 2.5366 | 104  | 0.6743          | 0.7395  | 0.6743 | 0.8212 |
| No log        | 2.5854 | 106  | 0.7476          | 0.7243  | 0.7476 | 0.8647 |
| No log        | 2.6341 | 108  | 0.7656          | 0.7095  | 0.7656 | 0.8750 |
| No log        | 2.6829 | 110  | 0.7190          | 0.7433  | 0.7190 | 0.8479 |
| No log        | 2.7317 | 112  | 0.6343          | 0.7305  | 0.6343 | 0.7964 |
| No log        | 2.7805 | 114  | 0.7408          | 0.7112  | 0.7408 | 0.8607 |
| No log        | 2.8293 | 116  | 0.8921          | 0.6291  | 0.8921 | 0.9445 |
| No log        | 2.8780 | 118  | 0.8427          | 0.6484  | 0.8427 | 0.9180 |
| No log        | 2.9268 | 120  | 0.6771          | 0.6697  | 0.6771 | 0.8228 |
| No log        | 2.9756 | 122  | 0.6220          | 0.7669  | 0.6220 | 0.7887 |
| No log        | 3.0244 | 124  | 0.8668          | 0.6905  | 0.8668 | 0.9310 |
| No log        | 3.0732 | 126  | 0.9807          | 0.6391  | 0.9807 | 0.9903 |
| No log        | 3.1220 | 128  | 0.9739          | 0.6493  | 0.9739 | 0.9869 |
| No log        | 3.1707 | 130  | 0.8511          | 0.7152  | 0.8511 | 0.9225 |
| No log        | 3.2195 | 132  | 0.7180          | 0.7249  | 0.7180 | 0.8474 |
| No log        | 3.2683 | 134  | 0.7927          | 0.6559  | 0.7927 | 0.8903 |
| No log        | 3.3171 | 136  | 1.0274          | 0.5724  | 1.0274 | 1.0136 |
| No log        | 3.3659 | 138  | 1.0364          | 0.5375  | 1.0364 | 1.0180 |
| No log        | 3.4146 | 140  | 0.9210          | 0.6134  | 0.9210 | 0.9597 |
| No log        | 3.4634 | 142  | 0.7659          | 0.6422  | 0.7659 | 0.8751 |
| No log        | 3.5122 | 144  | 0.7148          | 0.6551  | 0.7148 | 0.8455 |
| No log        | 3.5610 | 146  | 0.7176          | 0.6753  | 0.7176 | 0.8471 |
| No log        | 3.6098 | 148  | 0.6952          | 0.6929  | 0.6952 | 0.8338 |
| No log        | 3.6585 | 150  | 0.7105          | 0.6731  | 0.7105 | 0.8429 |
| No log        | 3.7073 | 152  | 0.7812          | 0.6346  | 0.7812 | 0.8839 |
| No log        | 3.7561 | 154  | 0.7642          | 0.6644  | 0.7642 | 0.8742 |
| No log        | 3.8049 | 156  | 0.6576          | 0.7030  | 0.6576 | 0.8109 |
| No log        | 3.8537 | 158  | 0.6358          | 0.7382  | 0.6358 | 0.7974 |
| No log        | 3.9024 | 160  | 0.7822          | 0.7081  | 0.7822 | 0.8844 |
| No log        | 3.9512 | 162  | 0.7813          | 0.7025  | 0.7813 | 0.8839 |
| No log        | 4.0    | 164  | 0.7911          | 0.7029  | 0.7911 | 0.8895 |
| No log        | 4.0488 | 166  | 0.7057          | 0.7251  | 0.7057 | 0.8401 |
| No log        | 4.0976 | 168  | 0.6398          | 0.7409  | 0.6398 | 0.7998 |
| No log        | 4.1463 | 170  | 0.6392          | 0.7561  | 0.6392 | 0.7995 |
| No log        | 4.1951 | 172  | 0.6914          | 0.7493  | 0.6914 | 0.8315 |
| No log        | 4.2439 | 174  | 0.7621          | 0.7250  | 0.7621 | 0.8730 |
| No log        | 4.2927 | 176  | 0.7840          | 0.7231  | 0.7840 | 0.8854 |
| No log        | 4.3415 | 178  | 0.7226          | 0.7475  | 0.7226 | 0.8500 |
| No log        | 4.3902 | 180  | 0.6933          | 0.7285  | 0.6933 | 0.8326 |
| No log        | 4.4390 | 182  | 0.6646          | 0.7280  | 0.6646 | 0.8152 |
| No log        | 4.4878 | 184  | 0.6532          | 0.7425  | 0.6532 | 0.8082 |
| No log        | 4.5366 | 186  | 0.6797          | 0.6860  | 0.6797 | 0.8244 |
| No log        | 4.5854 | 188  | 0.6745          | 0.6827  | 0.6745 | 0.8213 |
| No log        | 4.6341 | 190  | 0.6470          | 0.6994  | 0.6470 | 0.8044 |
| No log        | 4.6829 | 192  | 0.6636          | 0.7157  | 0.6636 | 0.8146 |
| No log        | 4.7317 | 194  | 0.7728          | 0.7181  | 0.7728 | 0.8791 |
| No log        | 4.7805 | 196  | 0.7882          | 0.6822  | 0.7882 | 0.8878 |
| No log        | 4.8293 | 198  | 0.7037          | 0.7349  | 0.7037 | 0.8388 |
| No log        | 4.8780 | 200  | 0.6391          | 0.7340  | 0.6391 | 0.7994 |
| No log        | 4.9268 | 202  | 0.7140          | 0.6642  | 0.7140 | 0.8450 |
| No log        | 4.9756 | 204  | 0.7874          | 0.6550  | 0.7874 | 0.8874 |
| No log        | 5.0244 | 206  | 0.7636          | 0.6542  | 0.7636 | 0.8738 |
| No log        | 5.0732 | 208  | 0.7094          | 0.6884  | 0.7094 | 0.8422 |
| No log        | 5.1220 | 210  | 0.6746          | 0.7337  | 0.6746 | 0.8213 |
| No log        | 5.1707 | 212  | 0.6769          | 0.7544  | 0.6769 | 0.8227 |
| No log        | 5.2195 | 214  | 0.6881          | 0.7452  | 0.6881 | 0.8295 |
| No log        | 5.2683 | 216  | 0.7212          | 0.7514  | 0.7212 | 0.8493 |
| No log        | 5.3171 | 218  | 0.7442          | 0.7217  | 0.7442 | 0.8627 |
| No log        | 5.3659 | 220  | 0.7226          | 0.7090  | 0.7226 | 0.8500 |
| No log        | 5.4146 | 222  | 0.6680          | 0.7477  | 0.6680 | 0.8173 |
| No log        | 5.4634 | 224  | 0.6383          | 0.7673  | 0.6383 | 0.7990 |
| No log        | 5.5122 | 226  | 0.6453          | 0.7497  | 0.6453 | 0.8033 |
| No log        | 5.5610 | 228  | 0.6542          | 0.7566  | 0.6542 | 0.8088 |
| No log        | 5.6098 | 230  | 0.6535          | 0.7677  | 0.6535 | 0.8084 |
| No log        | 5.6585 | 232  | 0.6661          | 0.7449  | 0.6661 | 0.8161 |
| No log        | 5.7073 | 234  | 0.6771          | 0.7427  | 0.6771 | 0.8229 |
| No log        | 5.7561 | 236  | 0.7021          | 0.7412  | 0.7021 | 0.8379 |
| No log        | 5.8049 | 238  | 0.7134          | 0.7369  | 0.7134 | 0.8446 |
| No log        | 5.8537 | 240  | 0.7247          | 0.7155  | 0.7247 | 0.8513 |
| No log        | 5.9024 | 242  | 0.6962          | 0.7369  | 0.6962 | 0.8344 |
| No log        | 5.9512 | 244  | 0.6691          | 0.7489  | 0.6691 | 0.8180 |
| No log        | 6.0    | 246  | 0.6716          | 0.7300  | 0.6716 | 0.8195 |
| No log        | 6.0488 | 248  | 0.7009          | 0.7156  | 0.7009 | 0.8372 |
| No log        | 6.0976 | 250  | 0.7345          | 0.6971  | 0.7345 | 0.8571 |
| No log        | 6.1463 | 252  | 0.7350          | 0.7142  | 0.7350 | 0.8573 |
| No log        | 6.1951 | 254  | 0.7342          | 0.7263  | 0.7342 | 0.8569 |
| No log        | 6.2439 | 256  | 0.7295          | 0.7560  | 0.7295 | 0.8541 |
| No log        | 6.2927 | 258  | 0.7178          | 0.7550  | 0.7178 | 0.8472 |
| No log        | 6.3415 | 260  | 0.7206          | 0.7571  | 0.7206 | 0.8489 |
| No log        | 6.3902 | 262  | 0.7163          | 0.7519  | 0.7163 | 0.8463 |
| No log        | 6.4390 | 264  | 0.7088          | 0.7566  | 0.7088 | 0.8419 |
| No log        | 6.4878 | 266  | 0.7067          | 0.7475  | 0.7067 | 0.8407 |
| No log        | 6.5366 | 268  | 0.7075          | 0.7600  | 0.7075 | 0.8411 |
| No log        | 6.5854 | 270  | 0.7109          | 0.7630  | 0.7109 | 0.8432 |
| No log        | 6.6341 | 272  | 0.7152          | 0.7423  | 0.7152 | 0.8457 |
| No log        | 6.6829 | 274  | 0.7574          | 0.7116  | 0.7574 | 0.8703 |
| No log        | 6.7317 | 276  | 0.8081          | 0.6700  | 0.8081 | 0.8989 |
| No log        | 6.7805 | 278  | 0.8260          | 0.6739  | 0.8260 | 0.9089 |
| No log        | 6.8293 | 280  | 0.7989          | 0.6853  | 0.7989 | 0.8938 |
| No log        | 6.8780 | 282  | 0.7467          | 0.7090  | 0.7467 | 0.8641 |
| No log        | 6.9268 | 284  | 0.7295          | 0.7327  | 0.7295 | 0.8541 |
| No log        | 6.9756 | 286  | 0.7309          | 0.7365  | 0.7309 | 0.8549 |
| No log        | 7.0244 | 288  | 0.7267          | 0.7365  | 0.7267 | 0.8524 |
| No log        | 7.0732 | 290  | 0.7316          | 0.7365  | 0.7316 | 0.8553 |
| No log        | 7.1220 | 292  | 0.7443          | 0.7148  | 0.7443 | 0.8627 |
| No log        | 7.1707 | 294  | 0.7605          | 0.7072  | 0.7605 | 0.8721 |
| No log        | 7.2195 | 296  | 0.7963          | 0.7008  | 0.7963 | 0.8923 |
| No log        | 7.2683 | 298  | 0.8257          | 0.6861  | 0.8257 | 0.9087 |
| No log        | 7.3171 | 300  | 0.8010          | 0.7174  | 0.8010 | 0.8950 |
| No log        | 7.3659 | 302  | 0.7716          | 0.7072  | 0.7716 | 0.8784 |
| No log        | 7.4146 | 304  | 0.7395          | 0.7391  | 0.7395 | 0.8599 |
| No log        | 7.4634 | 306  | 0.7208          | 0.7364  | 0.7208 | 0.8490 |
| No log        | 7.5122 | 308  | 0.7202          | 0.7009  | 0.7202 | 0.8486 |
| No log        | 7.5610 | 310  | 0.7192          | 0.6775  | 0.7192 | 0.8481 |
| No log        | 7.6098 | 312  | 0.7195          | 0.7068  | 0.7195 | 0.8482 |
| No log        | 7.6585 | 314  | 0.7121          | 0.7184  | 0.7121 | 0.8439 |
| No log        | 7.7073 | 316  | 0.7064          | 0.7527  | 0.7064 | 0.8405 |
| No log        | 7.7561 | 318  | 0.7013          | 0.7527  | 0.7013 | 0.8374 |
| No log        | 7.8049 | 320  | 0.6983          | 0.7441  | 0.6983 | 0.8356 |
| No log        | 7.8537 | 322  | 0.6993          | 0.7441  | 0.6993 | 0.8363 |
| No log        | 7.9024 | 324  | 0.7014          | 0.7441  | 0.7014 | 0.8375 |
| No log        | 7.9512 | 326  | 0.6968          | 0.7441  | 0.6968 | 0.8347 |
| No log        | 8.0    | 328  | 0.6905          | 0.7566  | 0.6905 | 0.8310 |
| No log        | 8.0488 | 330  | 0.6849          | 0.7600  | 0.6849 | 0.8276 |
| No log        | 8.0976 | 332  | 0.6840          | 0.7484  | 0.6840 | 0.8270 |
| No log        | 8.1463 | 334  | 0.6950          | 0.7322  | 0.6950 | 0.8337 |
| No log        | 8.1951 | 336  | 0.7045          | 0.7365  | 0.7045 | 0.8393 |
| No log        | 8.2439 | 338  | 0.7114          | 0.7266  | 0.7114 | 0.8435 |
| No log        | 8.2927 | 340  | 0.7125          | 0.7297  | 0.7125 | 0.8441 |
| No log        | 8.3415 | 342  | 0.7076          | 0.7322  | 0.7076 | 0.8412 |
| No log        | 8.3902 | 344  | 0.7079          | 0.7322  | 0.7079 | 0.8414 |
| No log        | 8.4390 | 346  | 0.7063          | 0.7308  | 0.7063 | 0.8404 |
| No log        | 8.4878 | 348  | 0.6983          | 0.7219  | 0.6983 | 0.8356 |
| No log        | 8.5366 | 350  | 0.6962          | 0.7333  | 0.6962 | 0.8344 |
| No log        | 8.5854 | 352  | 0.6913          | 0.7452  | 0.6913 | 0.8314 |
| No log        | 8.6341 | 354  | 0.6872          | 0.7547  | 0.6872 | 0.8290 |
| No log        | 8.6829 | 356  | 0.6870          | 0.7512  | 0.6870 | 0.8288 |
| No log        | 8.7317 | 358  | 0.6873          | 0.7443  | 0.6873 | 0.8291 |
| No log        | 8.7805 | 360  | 0.6851          | 0.7443  | 0.6851 | 0.8277 |
| No log        | 8.8293 | 362  | 0.6827          | 0.7459  | 0.6827 | 0.8262 |
| No log        | 8.8780 | 364  | 0.6807          | 0.7493  | 0.6807 | 0.8250 |
| No log        | 8.9268 | 366  | 0.6826          | 0.7547  | 0.6826 | 0.8262 |
| No log        | 8.9756 | 368  | 0.6860          | 0.7372  | 0.6860 | 0.8282 |
| No log        | 9.0244 | 370  | 0.6910          | 0.7333  | 0.6910 | 0.8313 |
| No log        | 9.0732 | 372  | 0.6949          | 0.7293  | 0.6949 | 0.8336 |
| No log        | 9.1220 | 374  | 0.6979          | 0.7293  | 0.6979 | 0.8354 |
| No log        | 9.1707 | 376  | 0.6989          | 0.7293  | 0.6989 | 0.8360 |
| No log        | 9.2195 | 378  | 0.7011          | 0.7139  | 0.7011 | 0.8373 |
| No log        | 9.2683 | 380  | 0.7035          | 0.7139  | 0.7035 | 0.8387 |
| No log        | 9.3171 | 382  | 0.7037          | 0.7139  | 0.7037 | 0.8389 |
| No log        | 9.3659 | 384  | 0.7037          | 0.7139  | 0.7037 | 0.8389 |
| No log        | 9.4146 | 386  | 0.7001          | 0.7293  | 0.7001 | 0.8367 |
| No log        | 9.4634 | 388  | 0.6982          | 0.7293  | 0.6982 | 0.8356 |
| No log        | 9.5122 | 390  | 0.6980          | 0.7293  | 0.6980 | 0.8355 |
| No log        | 9.5610 | 392  | 0.6975          | 0.7293  | 0.6975 | 0.8352 |
| No log        | 9.6098 | 394  | 0.6957          | 0.7333  | 0.6957 | 0.8341 |
| No log        | 9.6585 | 396  | 0.6961          | 0.7333  | 0.6961 | 0.8343 |
| No log        | 9.7073 | 398  | 0.6967          | 0.7333  | 0.6967 | 0.8347 |
| No log        | 9.7561 | 400  | 0.6971          | 0.7333  | 0.6971 | 0.8349 |
| No log        | 9.8049 | 402  | 0.6976          | 0.7333  | 0.6976 | 0.8352 |
| No log        | 9.8537 | 404  | 0.6988          | 0.7333  | 0.6988 | 0.8360 |
| No log        | 9.9024 | 406  | 0.6992          | 0.7333  | 0.6992 | 0.8362 |
| No log        | 9.9512 | 408  | 0.6993          | 0.7333  | 0.6993 | 0.8362 |
| No log        | 10.0   | 410  | 0.6993          | 0.7333  | 0.6993 | 0.8363 |
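
In this context, Qwk is presumably Cohen's quadratic weighted kappa, the standard agreement metric for ordinal scoring tasks; the Validation Loss and Mse columns coincide, consistent with an MSE training objective. A minimal sketch of how these metrics could be computed with scikit-learn, assuming continuous predictions are rounded to integer scores for the kappa:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def eval_metrics(preds, labels):
    """Reported metrics, under the assumed definitions described above."""
    mse = mean_squared_error(labels, preds)
    # Quadratic weighted kappa needs discrete ordinal labels, so continuous
    # predictions are rounded to the nearest integer score first.
    qwk = cohen_kappa_score(
        np.rint(labels).astype(int),
        np.rint(preds).astype(int),
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}

# Toy example with made-up scores:
print(eval_metrics(np.array([2.8, 4.1, 3.0, 1.2]), np.array([3, 4, 3, 2])))
```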

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1