ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k16_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8447
  • QWK (Quadratic Weighted Kappa): 0.6269
  • MSE: 0.8447
  • RMSE: 0.9191
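For reference, QWK is the Quadratic Weighted Kappa commonly used to evaluate essay-scoring models. Below is a minimal pure-Python sketch of the three reported metrics, assuming integer score labels; the function names are illustrative, not taken from the training code:

```python
import math

def quadratic_weighted_kappa(y_true, y_pred):
    """Quadratic Weighted Kappa (QWK) between two lists of integer scores."""
    lo = min(min(y_true), min(y_pred))
    hi = max(max(y_true), max(y_pred))
    n = hi - lo + 1
    # Observed confusion matrix over the score range.
    observed = [[0] * n for _ in range(n)]
    for t, p in zip(y_true, y_pred):
        observed[t - lo][p - lo] += 1
    hist_true = [sum(row) for row in observed]
    hist_pred = [sum(observed[i][j] for i in range(n)) for j in range(n)]
    total = len(y_true)
    num = den = 0.0
    for i in range(n):
        for j in range(n):
            # Quadratic disagreement weight: 0 on the diagonal, 1 at the extremes.
            w = (i - j) ** 2 / (n - 1) ** 2 if n > 1 else 0.0
            num += w * observed[i][j]
            den += w * hist_true[i] * hist_pred[j] / total
    return 1.0 - num / den if den else 1.0

def mse(y_true, y_pred):
    """Mean squared error of predicted vs. gold scores."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    """Root mean squared error."""
    return math.sqrt(mse(y_true, y_pred))
```

Perfect agreement yields QWK = 1.0, while chance-level agreement yields 0; unlike plain accuracy, QWK penalizes predictions more the further they land from the gold score.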

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
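The linear schedule decays the learning rate from 2e-05 at the start of training to zero at the final step. A minimal sketch of that decay is below; the 120 steps/epoch figure is inferred from the results table (step 2 corresponds to epoch 0.0167), and warmup is assumed to be zero:

```python
def linear_lr(step, base_lr=2e-05, total_steps=100 * 120, warmup_steps=0):
    """Linear schedule: optional warmup to base_lr, then linear decay to 0."""
    if warmup_steps and step < warmup_steps:
        return base_lr * step / warmup_steps
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

lr_start = linear_lr(0)       # 2e-05 at the first step
lr_mid = linear_lr(6000)      # 1e-05 halfway through
lr_end = linear_lr(12000)     # 0.0 at the final step
```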

Training results

Training Loss | Epoch | Step | Validation Loss | QWK | MSE | RMSE
No log 0.0167 2 6.5622 0.0311 6.5622 2.5617
No log 0.0333 4 4.1621 0.1140 4.1621 2.0401
No log 0.05 6 3.0189 0.1149 3.0189 1.7375
No log 0.0667 8 1.8859 0.3252 1.8859 1.3733
No log 0.0833 10 1.8615 0.2500 1.8615 1.3644
No log 0.1 12 1.5567 0.1538 1.5567 1.2477
No log 0.1167 14 1.6013 0.2243 1.6013 1.2654
No log 0.1333 16 1.5373 0.4068 1.5373 1.2399
No log 0.15 18 1.5366 0.4032 1.5366 1.2396
No log 0.1667 20 1.5099 0.3871 1.5099 1.2288
No log 0.1833 22 1.3957 0.3652 1.3957 1.1814
No log 0.2 24 1.4703 0.2752 1.4703 1.2125
No log 0.2167 26 1.3227 0.3540 1.3227 1.1501
No log 0.2333 28 1.1471 0.5039 1.1471 1.0710
No log 0.25 30 1.1312 0.5 1.1312 1.0636
No log 0.2667 32 1.1976 0.4138 1.1976 1.0943
No log 0.2833 34 1.1998 0.4407 1.1998 1.0953
No log 0.3 36 1.1419 0.5167 1.1419 1.0686
No log 0.3167 38 1.0943 0.5366 1.0943 1.0461
No log 0.3333 40 1.1097 0.544 1.1097 1.0534
No log 0.35 42 1.0677 0.6154 1.0677 1.0333
No log 0.3667 44 1.0824 0.6047 1.0824 1.0404
No log 0.3833 46 1.2087 0.5312 1.2087 1.0994
No log 0.4 48 1.1130 0.5873 1.1130 1.0550
No log 0.4167 50 1.1269 0.5736 1.1269 1.0615
No log 0.4333 52 0.9953 0.5581 0.9953 0.9977
No log 0.45 54 0.9561 0.5802 0.9561 0.9778
No log 0.4667 56 0.9283 0.6165 0.9283 0.9635
No log 0.4833 58 1.0866 0.5891 1.0866 1.0424
No log 0.5 60 1.4343 0.3529 1.4343 1.1976
No log 0.5167 62 1.6466 0.2321 1.6466 1.2832
No log 0.5333 64 1.3136 0.4034 1.3136 1.1461
No log 0.55 66 1.1314 0.5397 1.1314 1.0637
No log 0.5667 68 1.1096 0.5397 1.1096 1.0534
No log 0.5833 70 1.0585 0.512 1.0585 1.0288
No log 0.6 72 0.9702 0.5846 0.9702 0.9850
No log 0.6167 74 0.9226 0.6165 0.9226 0.9605
No log 0.6333 76 0.9340 0.6212 0.9340 0.9665
No log 0.65 78 0.9471 0.5970 0.9471 0.9732
No log 0.6667 80 0.8161 0.6522 0.8161 0.9034
No log 0.6833 82 0.8460 0.6277 0.8460 0.9198
No log 0.7 84 1.0712 0.6197 1.0712 1.0350
No log 0.7167 86 1.2216 0.5755 1.2216 1.1053
No log 0.7333 88 1.3992 0.5350 1.3992 1.1829
No log 0.75 90 1.6677 0.5304 1.6677 1.2914
No log 0.7667 92 1.6272 0.5638 1.6272 1.2756
No log 0.7833 94 1.5396 0.5495 1.5396 1.2408
No log 0.8 96 1.3559 0.6127 1.3559 1.1644
No log 0.8167 98 1.1656 0.6347 1.1656 1.0796
No log 0.8333 100 1.0102 0.6497 1.0102 1.0051
No log 0.85 102 0.9059 0.6667 0.9059 0.9518
No log 0.8667 104 0.9219 0.6761 0.9219 0.9602
No log 0.8833 106 0.9202 0.6667 0.9202 0.9593
No log 0.9 108 1.0733 0.5588 1.0733 1.0360
No log 0.9167 110 1.2094 0.6115 1.2094 1.0997
No log 0.9333 112 1.4149 0.5780 1.4149 1.1895
No log 0.95 114 1.2575 0.6543 1.2575 1.1214
No log 0.9667 116 0.8783 0.6483 0.8783 0.9372
No log 0.9833 118 0.8389 0.7162 0.8389 0.9159
No log 1.0 120 0.8621 0.6846 0.8621 0.9285
No log 1.0167 122 0.9045 0.7013 0.9045 0.9510
No log 1.0333 124 1.1320 0.5912 1.1320 1.0640
No log 1.05 126 1.6684 0.5538 1.6684 1.2917
No log 1.0667 128 1.8147 0.5672 1.8147 1.3471
No log 1.0833 130 1.5328 0.5537 1.5328 1.2381
No log 1.1 132 1.0318 0.6122 1.0318 1.0158
No log 1.1167 134 0.8884 0.6370 0.8884 0.9426
No log 1.1333 136 0.8811 0.6667 0.8811 0.9387
No log 1.15 138 1.0434 0.6338 1.0434 1.0215
No log 1.1667 140 1.2837 0.5677 1.2837 1.1330
No log 1.1833 142 1.2481 0.5769 1.2481 1.1172
No log 1.2 144 1.0881 0.5806 1.0881 1.0431
No log 1.2167 146 0.9890 0.6667 0.9890 0.9945
No log 1.2333 148 1.0528 0.6623 1.0528 1.0261
No log 1.25 150 1.1060 0.5714 1.1060 1.0517
No log 1.2667 152 1.0412 0.5578 1.0412 1.0204
No log 1.2833 154 1.0211 0.6 1.0211 1.0105
No log 1.3 156 0.9914 0.5850 0.9914 0.9957
No log 1.3167 158 1.1197 0.5917 1.1197 1.0581
No log 1.3333 160 1.1355 0.5988 1.1355 1.0656
No log 1.35 162 1.0045 0.6316 1.0045 1.0022
No log 1.3667 164 0.8607 0.6619 0.8607 0.9277
No log 1.3833 166 0.8311 0.6232 0.8311 0.9117
No log 1.4 168 0.7551 0.7101 0.7551 0.8689
No log 1.4167 170 0.6978 0.7626 0.6978 0.8353
No log 1.4333 172 0.7078 0.75 0.7078 0.8413
No log 1.45 174 0.7376 0.7042 0.7376 0.8588
No log 1.4667 176 0.8512 0.6389 0.8512 0.9226
No log 1.4833 178 1.1001 0.6104 1.1001 1.0489
No log 1.5 180 1.1621 0.6076 1.1621 1.0780
No log 1.5167 182 1.1316 0.6832 1.1316 1.0638
No log 1.5333 184 0.9902 0.6829 0.9902 0.9951
No log 1.55 186 0.8968 0.6842 0.8968 0.9470
No log 1.5667 188 0.9058 0.6928 0.9058 0.9517
No log 1.5833 190 1.0034 0.6389 1.0034 1.0017
No log 1.6 192 0.9545 0.5899 0.9545 0.9770
No log 1.6167 194 0.9154 0.5970 0.9154 0.9568
No log 1.6333 196 0.9787 0.6043 0.9787 0.9893
No log 1.65 198 0.9595 0.6294 0.9595 0.9796
No log 1.6667 200 0.8392 0.7092 0.8392 0.9161
No log 1.6833 202 0.8122 0.6857 0.8122 0.9012
No log 1.7 204 0.9915 0.6536 0.9915 0.9957
No log 1.7167 206 1.4476 0.5031 1.4476 1.2032
No log 1.7333 208 1.5306 0.4654 1.5306 1.2372
No log 1.75 210 1.3036 0.5417 1.3036 1.1417
No log 1.7667 212 1.0239 0.6056 1.0239 1.0119
No log 1.7833 214 0.9946 0.6626 0.9946 0.9973
No log 1.8 216 1.0509 0.6092 1.0509 1.0251
No log 1.8167 218 0.9910 0.6289 0.9910 0.9955
No log 1.8333 220 0.8435 0.6809 0.8435 0.9184
No log 1.85 222 0.7831 0.6957 0.7831 0.8849
No log 1.8667 224 0.8078 0.6522 0.8078 0.8988
No log 1.8833 226 0.8267 0.6763 0.8267 0.9092
No log 1.9 228 1.0266 0.6056 1.0266 1.0132
No log 1.9167 230 1.2806 0.5270 1.2806 1.1316
No log 1.9333 232 1.2029 0.5734 1.2029 1.0968
No log 1.95 234 1.0261 0.5954 1.0261 1.0130
No log 1.9667 236 0.8786 0.6462 0.8786 0.9374
No log 1.9833 238 0.8415 0.6565 0.8415 0.9173
No log 2.0 240 0.9720 0.6494 0.9720 0.9859
No log 2.0167 242 1.3089 0.6022 1.3089 1.1441
No log 2.0333 244 1.6027 0.5911 1.6027 1.2660
No log 2.05 246 1.4523 0.5949 1.4523 1.2051
No log 2.0667 248 1.0249 0.6536 1.0249 1.0124
No log 2.0833 250 0.7670 0.6866 0.7670 0.8758
No log 2.1 252 0.7440 0.6866 0.7440 0.8625
No log 2.1167 254 0.7728 0.6866 0.7728 0.8791
No log 2.1333 256 0.8584 0.6286 0.8584 0.9265
No log 2.15 258 1.0364 0.625 1.0364 1.0180
No log 2.1667 260 1.2312 0.5333 1.2312 1.1096
No log 2.1833 262 1.0741 0.5811 1.0741 1.0364
No log 2.2 264 0.8696 0.6475 0.8696 0.9325
No log 2.2167 266 0.8558 0.6571 0.8558 0.9251
No log 2.2333 268 1.0251 0.6111 1.0251 1.0125
No log 2.25 270 1.1515 0.5578 1.1515 1.0731
No log 2.2667 272 1.0653 0.5797 1.0653 1.0321
No log 2.2833 274 0.8844 0.6515 0.8844 0.9404
No log 2.3 276 0.8493 0.6462 0.8493 0.9216
No log 2.3167 278 0.8506 0.6515 0.8506 0.9223
No log 2.3333 280 0.8830 0.6061 0.8830 0.9397
No log 2.35 282 0.9582 0.6061 0.9582 0.9789
No log 2.3667 284 1.0469 0.6029 1.0469 1.0232
No log 2.3833 286 1.1682 0.5732 1.1682 1.0808
No log 2.4 288 1.1935 0.6087 1.1935 1.0925
No log 2.4167 290 1.2444 0.6098 1.2444 1.1155
No log 2.4333 292 1.0428 0.6115 1.0428 1.0212
No log 2.45 294 0.8808 0.6232 0.8808 0.9385
No log 2.4667 296 0.9347 0.6301 0.9347 0.9668
No log 2.4833 298 1.0499 0.5946 1.0499 1.0246
No log 2.5 300 1.0385 0.5816 1.0385 1.0191
No log 2.5167 302 1.0786 0.5926 1.0786 1.0386
No log 2.5333 304 0.9741 0.6212 0.9741 0.9869
No log 2.55 306 0.8639 0.6412 0.8639 0.9294
No log 2.5667 308 0.7975 0.6767 0.7975 0.8930
No log 2.5833 310 0.8949 0.6483 0.8949 0.9460
No log 2.6 312 1.0362 0.6545 1.0362 1.0179
No log 2.6167 314 1.0246 0.6543 1.0246 1.0122
No log 2.6333 316 1.1401 0.6341 1.1401 1.0677
No log 2.65 318 1.0877 0.5960 1.0877 1.0429
No log 2.6667 320 0.9036 0.6418 0.9036 0.9506
No log 2.6833 322 0.8038 0.6471 0.8038 0.8965
No log 2.7 324 0.8278 0.6577 0.8278 0.9099
No log 2.7167 326 0.9919 0.6706 0.9919 0.9959
No log 2.7333 328 1.0920 0.6742 1.0920 1.0450
No log 2.75 330 0.9999 0.6667 0.9999 1.0000
No log 2.7667 332 0.8411 0.6618 0.8411 0.9171
No log 2.7833 334 0.7595 0.6718 0.7595 0.8715
No log 2.8 336 0.7577 0.6718 0.7577 0.8704
No log 2.8167 338 0.8404 0.6269 0.8404 0.9167
No log 2.8333 340 0.8846 0.6286 0.8846 0.9405
No log 2.85 342 0.9791 0.625 0.9791 0.9895
No log 2.8667 344 1.0689 0.5915 1.0689 1.0339
No log 2.8833 346 1.0461 0.6087 1.0461 1.0228
No log 2.9 348 0.9489 0.6212 0.9489 0.9741
No log 2.9167 350 0.8517 0.6718 0.8517 0.9229
No log 2.9333 352 0.8484 0.6406 0.8484 0.9211
No log 2.95 354 0.8165 0.6406 0.8165 0.9036
No log 2.9667 356 0.8354 0.6567 0.8354 0.9140
No log 2.9833 358 0.8801 0.6710 0.8801 0.9382
No log 3.0 360 0.7870 0.6923 0.7870 0.8872
No log 3.0167 362 0.7448 0.7215 0.7448 0.8630
No log 3.0333 364 0.8160 0.6883 0.8160 0.9033
No log 3.05 366 1.0478 0.6543 1.0478 1.0236
No log 3.0667 368 1.1944 0.5987 1.1944 1.0929
No log 3.0833 370 1.4228 0.5862 1.4228 1.1928
No log 3.1 372 1.4937 0.5153 1.4937 1.2222
No log 3.1167 374 1.3706 0.4521 1.3706 1.1707
No log 3.1333 376 1.1946 0.5714 1.1946 1.0930
No log 3.15 378 1.0801 0.6202 1.0801 1.0393
No log 3.1667 380 1.0460 0.6107 1.0460 1.0227
No log 3.1833 382 1.1108 0.5865 1.1108 1.0540
No log 3.2 384 1.2570 0.5714 1.2570 1.1211
No log 3.2167 386 1.2704 0.5677 1.2704 1.1271
No log 3.2333 388 1.1098 0.6 1.1098 1.0535
No log 3.25 390 0.9604 0.6324 0.9604 0.9800
No log 3.2667 392 0.9064 0.6165 0.9064 0.9520
No log 3.2833 394 0.8443 0.6357 0.8443 0.9189
No log 3.3 396 0.7773 0.7246 0.7773 0.8817
No log 3.3167 398 0.7365 0.7310 0.7365 0.8582
No log 3.3333 400 0.7493 0.72 0.7493 0.8656
No log 3.35 402 0.7268 0.72 0.7268 0.8525
No log 3.3667 404 0.7535 0.72 0.7535 0.8680
No log 3.3833 406 0.8617 0.6582 0.8617 0.9283
No log 3.4 408 0.9753 0.6708 0.9753 0.9876
No log 3.4167 410 0.9063 0.6752 0.9063 0.9520
No log 3.4333 412 0.8139 0.6759 0.8139 0.9022
No log 3.45 414 0.8205 0.6765 0.8205 0.9058
No log 3.4667 416 0.9548 0.6061 0.9548 0.9771
No log 3.4833 418 1.1264 0.6197 1.1264 1.0613
No log 3.5 420 1.1517 0.5882 1.1517 1.0732
No log 3.5167 422 1.0709 0.6107 1.0709 1.0348
No log 3.5333 424 1.0224 0.6202 1.0224 1.0111
No log 3.55 426 1.0382 0.6094 1.0382 1.0189
No log 3.5667 428 1.1218 0.5954 1.1218 1.0592
No log 3.5833 430 1.2701 0.5490 1.2701 1.1270
No log 3.6 432 1.2966 0.5786 1.2966 1.1387
No log 3.6167 434 1.2091 0.5490 1.2091 1.0996
No log 3.6333 436 1.0836 0.6040 1.0836 1.0409
No log 3.65 438 0.9739 0.6212 0.9739 0.9869
No log 3.6667 440 0.9451 0.6260 0.9451 0.9721
No log 3.6833 442 1.0301 0.5890 1.0301 1.0149
No log 3.7 444 1.1541 0.6027 1.1541 1.0743
No log 3.7167 446 1.2086 0.5931 1.2086 1.0993
No log 3.7333 448 1.2705 0.5490 1.2705 1.1272
No log 3.75 450 1.1183 0.6 1.1183 1.0575
No log 3.7667 452 0.9461 0.6412 0.9461 0.9727
No log 3.7833 454 0.9204 0.6412 0.9204 0.9594
No log 3.8 456 0.8971 0.6412 0.8971 0.9472
No log 3.8167 458 0.9235 0.6277 0.9235 0.9610
No log 3.8333 460 1.0825 0.5867 1.0825 1.0404
No log 3.85 462 1.1590 0.6258 1.1590 1.0765
No log 3.8667 464 1.0170 0.5811 1.0170 1.0085
No log 3.8833 466 0.8915 0.6364 0.8915 0.9442
No log 3.9 468 0.8336 0.6667 0.8336 0.9130
No log 3.9167 470 0.7940 0.6815 0.7940 0.8911
No log 3.9333 472 0.8176 0.6815 0.8176 0.9042
No log 3.95 474 0.8866 0.6667 0.8866 0.9416
No log 3.9667 476 1.1141 0.5493 1.1141 1.0555
No log 3.9833 478 1.4001 0.5031 1.4001 1.1833
No log 4.0 480 1.5148 0.4512 1.5148 1.2308
No log 4.0167 482 1.4556 0.4815 1.4556 1.2065
No log 4.0333 484 1.3366 0.4414 1.3366 1.1561
No log 4.05 486 1.1935 0.5373 1.1935 1.0925
No log 4.0667 488 1.1111 0.6667 1.1111 1.0541
No log 4.0833 490 1.1178 0.6475 1.1178 1.0573
No log 4.1 492 1.0735 0.6475 1.0735 1.0361
No log 4.1167 494 1.0300 0.6711 1.0300 1.0149
No log 4.1333 496 0.9835 0.6709 0.9835 0.9917
No log 4.15 498 0.8893 0.6711 0.8893 0.9430
0.4466 4.1667 500 0.8422 0.6759 0.8422 0.9177
0.4466 4.1833 502 0.8714 0.6475 0.8714 0.9335
0.4466 4.2 504 0.9356 0.6324 0.9356 0.9673
0.4466 4.2167 506 0.8969 0.6370 0.8969 0.9471
0.4466 4.2333 508 0.8509 0.6269 0.8509 0.9225
0.4466 4.25 510 0.8447 0.6269 0.8447 0.9191

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
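A matching environment can be pinned in a requirements file; note that the `+cu118` build of PyTorch is distributed from PyTorch's CUDA 11.8 wheel index rather than PyPI:

```text
transformers==4.44.2
torch==2.4.0+cu118
datasets==2.21.0
tokenizers==0.19.1
```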

Model size: 0.1B params (F32, Safetensors)

Model tree for MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k16_task1_organization

Finetuned from aubmindlab/bert-base-arabertv02