ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k7_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02; the dataset used for fine-tuning is not recorded in this card. It achieves the following results on the evaluation set:

  • Loss: 1.3276
  • QWK: 0.4964
  • MSE: 1.3276
  • RMSE: 1.1522
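For reference: QWK (quadratic weighted kappa) measures agreement between predicted and true ordinal scores, penalizing large disagreements more than small ones; RMSE is the square root of MSE (here 1.1522 ≈ √1.3276, and the loss equals the MSE because the model is trained with an MSE objective). A minimal NumPy sketch of both metrics — the function names are illustrative, not taken from the training code:

```python
import numpy as np

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights, for ordinal labels 0..n_classes-1."""
    O = np.zeros((n_classes, n_classes))  # observed rating matrix
    for t, p in zip(y_true, y_pred):
        O[t, p] += 1
    idx = np.arange(n_classes)
    # quadratic disagreement penalty: 0 on the diagonal, 1 in the far corners
    W = (idx[:, None] - idx[None, :]) ** 2 / (n_classes - 1) ** 2
    # expected-by-chance matrix from the row/column marginals
    E = np.outer(O.sum(axis=1), O.sum(axis=0)) / O.sum()
    return 1.0 - (W * O).sum() / (W * E).sum()

def rmse(y_true, y_pred):
    """Root mean squared error between two score vectors."""
    diff = np.asarray(y_true, dtype=float) - np.asarray(y_pred, dtype=float)
    return float(np.sqrt(np.mean(diff ** 2)))
```

Perfect agreement gives a QWK of 1.0, while on a 4-point scale a prediction one step off is penalized far less than one three steps off.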

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
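With the linear schedule above, the learning rate decays from 2e-05 toward zero over the course of training (no warmup is listed for this run). A rough sketch of the decay rule — an approximation of the behavior, not the exact `transformers` implementation:

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Linear warmup (if any) followed by linear decay to zero."""
    if step < warmup_steps:
        # ramp up from 0 to base_lr during warmup
        return base_lr * step / max(1, warmup_steps)
    # decay linearly from base_lr at the end of warmup to 0 at total_steps
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))
```

At step 0 the rate is the full 2e-05, halfway through training it is 1e-05, and at the final step it reaches zero.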

Training results

| Training Loss | Epoch | Step | Validation Loss | QWK | MSE | RMSE |
|:-------------:|:-----:|:----:|:---------------:|:---:|:---:|:----:|
| No log | 0.0645 | 2 | 6.8299 | -0.0058 | 6.8299 | 2.6134 |
| No log | 0.1290 | 4 | 4.0761 | 0.0702 | 4.0761 | 2.0189 |
| No log | 0.1935 | 6 | 2.7042 | 0.0870 | 2.7042 | 1.6444 |
| No log | 0.2581 | 8 | 2.4076 | 0.0465 | 2.4076 | 1.5516 |
| No log | 0.3226 | 10 | 2.2492 | 0.0 | 2.2492 | 1.4997 |
| No log | 0.3871 | 12 | 1.7992 | 0.1887 | 1.7992 | 1.3413 |
| No log | 0.4516 | 14 | 1.7834 | 0.1667 | 1.7834 | 1.3354 |
| No log | 0.5161 | 16 | 2.1138 | 0.1260 | 2.1138 | 1.4539 |
| No log | 0.5806 | 18 | 2.7349 | 0.0732 | 2.7349 | 1.6538 |
| No log | 0.6452 | 20 | 2.4102 | 0.0822 | 2.4102 | 1.5525 |
| No log | 0.7097 | 22 | 2.2349 | 0.1727 | 2.2349 | 1.4950 |
| No log | 0.7742 | 24 | 2.1084 | 0.1926 | 2.1084 | 1.4520 |
| No log | 0.8387 | 26 | 1.9125 | 0.3175 | 1.9125 | 1.3829 |
| No log | 0.9032 | 28 | 1.6030 | 0.2975 | 1.6030 | 1.2661 |
| No log | 0.9677 | 30 | 1.4858 | 0.3415 | 1.4858 | 1.2189 |
| No log | 1.0323 | 32 | 1.4291 | 0.3793 | 1.4291 | 1.1954 |
| No log | 1.0968 | 34 | 1.5476 | 0.3478 | 1.5476 | 1.2440 |
| No log | 1.1613 | 36 | 1.7870 | 0.1754 | 1.7870 | 1.3368 |
| No log | 1.2258 | 38 | 2.1780 | 0.1197 | 2.1780 | 1.4758 |
| No log | 1.2903 | 40 | 2.6099 | -0.0159 | 2.6099 | 1.6155 |
| No log | 1.3548 | 42 | 2.0002 | 0.0348 | 2.0002 | 1.4143 |
| No log | 1.4194 | 44 | 1.3974 | 0.4375 | 1.3974 | 1.1821 |
| No log | 1.4839 | 46 | 1.3521 | 0.4928 | 1.3521 | 1.1628 |
| No log | 1.5484 | 48 | 1.2181 | 0.5547 | 1.2181 | 1.1037 |
| No log | 1.6129 | 50 | 1.2717 | 0.5203 | 1.2717 | 1.1277 |
| No log | 1.6774 | 52 | 1.7783 | 0.2149 | 1.7783 | 1.3335 |
| No log | 1.7419 | 54 | 2.0509 | 0.0806 | 2.0509 | 1.4321 |
| No log | 1.8065 | 56 | 1.8040 | 0.2167 | 1.8040 | 1.3431 |
| No log | 1.8710 | 58 | 1.4702 | 0.3115 | 1.4702 | 1.2125 |
| No log | 1.9355 | 60 | 1.4054 | 0.3810 | 1.4054 | 1.1855 |
| No log | 2.0 | 62 | 1.4685 | 0.3465 | 1.4685 | 1.2118 |
| No log | 2.0645 | 64 | 1.3355 | 0.4219 | 1.3355 | 1.1556 |
| No log | 2.1290 | 66 | 1.4787 | 0.3333 | 1.4787 | 1.2160 |
| No log | 2.1935 | 68 | 1.5903 | 0.3053 | 1.5903 | 1.2611 |
| No log | 2.2581 | 70 | 1.4959 | 0.3182 | 1.4959 | 1.2231 |
| No log | 2.3226 | 72 | 1.7673 | 0.2290 | 1.7673 | 1.3294 |
| No log | 2.3871 | 74 | 1.7291 | 0.2481 | 1.7291 | 1.3149 |
| No log | 2.4516 | 76 | 1.3836 | 0.4211 | 1.3836 | 1.1763 |
| No log | 2.5161 | 78 | 1.4954 | 0.3817 | 1.4954 | 1.2229 |
| No log | 2.5806 | 80 | 1.3361 | 0.4478 | 1.3361 | 1.1559 |
| No log | 2.6452 | 82 | 0.9916 | 0.6277 | 0.9916 | 0.9958 |
| No log | 2.7097 | 84 | 0.9893 | 0.6029 | 0.9893 | 0.9946 |
| No log | 2.7742 | 86 | 1.2865 | 0.4697 | 1.2865 | 1.1342 |
| No log | 2.8387 | 88 | 1.6490 | 0.2857 | 1.6490 | 1.2841 |
| No log | 2.9032 | 90 | 1.5524 | 0.3385 | 1.5524 | 1.2460 |
| No log | 2.9677 | 92 | 1.1565 | 0.5185 | 1.1565 | 1.0754 |
| No log | 3.0323 | 94 | 1.0172 | 0.5625 | 1.0172 | 1.0085 |
| No log | 3.0968 | 96 | 1.1453 | 0.5333 | 1.1453 | 1.0702 |
| No log | 3.1613 | 98 | 1.4953 | 0.3688 | 1.4953 | 1.2228 |
| No log | 3.2258 | 100 | 1.4288 | 0.4414 | 1.4288 | 1.1953 |
| No log | 3.2903 | 102 | 1.4045 | 0.4490 | 1.4045 | 1.1851 |
| No log | 3.3548 | 104 | 1.5871 | 0.4133 | 1.5871 | 1.2598 |
| No log | 3.4194 | 106 | 1.7216 | 0.3448 | 1.7216 | 1.3121 |
| No log | 3.4839 | 108 | 1.6849 | 0.375 | 1.6849 | 1.2981 |
| No log | 3.5484 | 110 | 1.4363 | 0.4276 | 1.4363 | 1.1984 |
| No log | 3.6129 | 112 | 1.1554 | 0.5839 | 1.1554 | 1.0749 |
| No log | 3.6774 | 114 | 0.9317 | 0.5802 | 0.9317 | 0.9652 |
| No log | 3.7419 | 116 | 0.9417 | 0.5581 | 0.9417 | 0.9704 |
| No log | 3.8065 | 118 | 0.9474 | 0.6 | 0.9474 | 0.9734 |
| No log | 3.8710 | 120 | 1.2107 | 0.5606 | 1.2107 | 1.1003 |
| No log | 3.9355 | 122 | 1.3952 | 0.4604 | 1.3952 | 1.1812 |
| No log | 4.0 | 124 | 1.4061 | 0.4783 | 1.4061 | 1.1858 |
| No log | 4.0645 | 126 | 1.2628 | 0.5294 | 1.2628 | 1.1238 |
| No log | 4.1290 | 128 | 1.1222 | 0.6423 | 1.1222 | 1.0594 |
| No log | 4.1935 | 130 | 1.1299 | 0.6131 | 1.1299 | 1.0630 |
| No log | 4.2581 | 132 | 1.1203 | 0.5942 | 1.1203 | 1.0585 |
| No log | 4.3226 | 134 | 1.1430 | 0.5755 | 1.1430 | 1.0691 |
| No log | 4.3871 | 136 | 1.1241 | 0.5970 | 1.1241 | 1.0602 |
| No log | 4.4516 | 138 | 1.2208 | 0.6015 | 1.2208 | 1.1049 |
| No log | 4.5161 | 140 | 1.1995 | 0.5846 | 1.1995 | 1.0952 |
| No log | 4.5806 | 142 | 1.1552 | 0.4068 | 1.1552 | 1.0748 |
| No log | 4.6452 | 144 | 1.1251 | 0.5440 | 1.1251 | 1.0607 |
| No log | 4.7097 | 146 | 1.1643 | 0.5882 | 1.1643 | 1.0790 |
| No log | 4.7742 | 148 | 1.1984 | 0.5468 | 1.1984 | 1.0947 |
| No log | 4.8387 | 150 | 1.4139 | 0.4476 | 1.4139 | 1.1891 |
| No log | 4.9032 | 152 | 1.4375 | 0.4286 | 1.4375 | 1.1989 |
| No log | 4.9677 | 154 | 1.2433 | 0.5217 | 1.2433 | 1.1150 |
| No log | 5.0323 | 156 | 1.1259 | 0.6222 | 1.1259 | 1.0611 |
| No log | 5.0968 | 158 | 1.1521 | 0.6015 | 1.1521 | 1.0734 |
| No log | 5.1613 | 160 | 1.3107 | 0.4776 | 1.3107 | 1.1449 |
| No log | 5.2258 | 162 | 1.7045 | 0.3404 | 1.7045 | 1.3056 |
| No log | 5.2903 | 164 | 1.7371 | 0.3239 | 1.7371 | 1.3180 |
| No log | 5.3548 | 166 | 1.6978 | 0.3239 | 1.6978 | 1.3030 |
| No log | 5.4194 | 168 | 1.3087 | 0.5103 | 1.3087 | 1.1440 |
| No log | 5.4839 | 170 | 0.9015 | 0.6950 | 0.9015 | 0.9495 |
| No log | 5.5484 | 172 | 0.8789 | 0.6154 | 0.8789 | 0.9375 |
| No log | 5.6129 | 174 | 0.8582 | 0.625 | 0.8582 | 0.9264 |
| No log | 5.6774 | 176 | 0.9004 | 0.6857 | 0.9004 | 0.9489 |
| No log | 5.7419 | 178 | 1.0234 | 0.6377 | 1.0234 | 1.0116 |
| No log | 5.8065 | 180 | 1.1565 | 0.6131 | 1.1565 | 1.0754 |
| No log | 5.8710 | 182 | 1.1271 | 0.6165 | 1.1271 | 1.0616 |
| No log | 5.9355 | 184 | 1.1363 | 0.5781 | 1.1363 | 1.0660 |
| No log | 6.0 | 186 | 1.1634 | 0.5538 | 1.1634 | 1.0786 |
| No log | 6.0645 | 188 | 1.1228 | 0.5891 | 1.1228 | 1.0596 |
| No log | 6.1290 | 190 | 1.1181 | 0.5556 | 1.1181 | 1.0574 |
| No log | 6.1935 | 192 | 1.0886 | 0.5827 | 1.0886 | 1.0434 |
| No log | 6.2581 | 194 | 1.0702 | 0.5827 | 1.0702 | 1.0345 |
| No log | 6.3226 | 196 | 1.0684 | 0.5891 | 1.0684 | 1.0336 |
| No log | 6.3871 | 198 | 1.0629 | 0.6165 | 1.0629 | 1.0310 |
| No log | 6.4516 | 200 | 1.0485 | 0.6165 | 1.0485 | 1.0239 |
| No log | 6.5161 | 202 | 1.0650 | 0.6029 | 1.0650 | 1.0320 |
| No log | 6.5806 | 204 | 1.0734 | 0.6074 | 1.0734 | 1.0361 |
| No log | 6.6452 | 206 | 0.9462 | 0.6316 | 0.9462 | 0.9727 |
| No log | 6.7097 | 208 | 0.9247 | 0.6412 | 0.9247 | 0.9616 |
| No log | 6.7742 | 210 | 0.9109 | 0.6515 | 0.9109 | 0.9544 |
| No log | 6.8387 | 212 | 0.9849 | 0.6567 | 0.9849 | 0.9924 |
| No log | 6.9032 | 214 | 0.9461 | 0.6567 | 0.9461 | 0.9727 |
| No log | 6.9677 | 216 | 0.8912 | 0.6466 | 0.8912 | 0.9440 |
| No log | 7.0323 | 218 | 0.9160 | 0.6567 | 0.9160 | 0.9571 |
| No log | 7.0968 | 220 | 1.0918 | 0.5942 | 1.0918 | 1.0449 |
| No log | 7.1613 | 222 | 1.2841 | 0.5217 | 1.2841 | 1.1332 |
| No log | 7.2258 | 224 | 1.2197 | 0.5401 | 1.2197 | 1.1044 |
| No log | 7.2903 | 226 | 1.0573 | 0.6074 | 1.0573 | 1.0283 |
| No log | 7.3548 | 228 | 1.0376 | 0.6212 | 1.0376 | 1.0186 |
| No log | 7.4194 | 230 | 1.1882 | 0.5909 | 1.1882 | 1.0900 |
| No log | 7.4839 | 232 | 1.2295 | 0.5802 | 1.2295 | 1.1088 |
| No log | 7.5484 | 234 | 1.2772 | 0.5522 | 1.2772 | 1.1301 |
| No log | 7.6129 | 236 | 1.3730 | 0.5037 | 1.3730 | 1.1718 |
| No log | 7.6774 | 238 | 1.3150 | 0.5224 | 1.3150 | 1.1467 |
| No log | 7.7419 | 240 | 1.2587 | 0.5481 | 1.2587 | 1.1219 |
| No log | 7.8065 | 242 | 1.1580 | 0.5882 | 1.1580 | 1.0761 |
| No log | 7.8710 | 244 | 1.0791 | 0.6074 | 1.0791 | 1.0388 |
| No log | 7.9355 | 246 | 1.1183 | 0.5693 | 1.1183 | 1.0575 |
| No log | 8.0 | 248 | 1.0514 | 0.5926 | 1.0514 | 1.0254 |
| No log | 8.0645 | 250 | 0.9807 | 0.6015 | 0.9807 | 0.9903 |
| No log | 8.1290 | 252 | 0.9733 | 0.5846 | 0.9733 | 0.9866 |
| No log | 8.1935 | 254 | 1.0098 | 0.6061 | 1.0098 | 1.0049 |
| No log | 8.2581 | 256 | 0.9890 | 0.6061 | 0.9890 | 0.9945 |
| No log | 8.3226 | 258 | 0.9277 | 0.5954 | 0.9277 | 0.9632 |
| No log | 8.3871 | 260 | 0.8741 | 0.6667 | 0.8741 | 0.9349 |
| No log | 8.4516 | 262 | 0.9279 | 0.6119 | 0.9279 | 0.9633 |
| No log | 8.5161 | 264 | 1.1479 | 0.5839 | 1.1479 | 1.0714 |
| No log | 8.5806 | 266 | 1.2487 | 0.5333 | 1.2487 | 1.1174 |
| No log | 8.6452 | 268 | 1.1345 | 0.5441 | 1.1345 | 1.0651 |
| No log | 8.7097 | 270 | 1.1532 | 0.5373 | 1.1532 | 1.0739 |
| No log | 8.7742 | 272 | 1.0791 | 0.5672 | 1.0791 | 1.0388 |
| No log | 8.8387 | 274 | 1.0102 | 0.5926 | 1.0102 | 1.0051 |
| No log | 8.9032 | 276 | 1.1476 | 0.5455 | 1.1476 | 1.0713 |
| No log | 8.9677 | 278 | 1.3486 | 0.5037 | 1.3486 | 1.1613 |
| No log | 9.0323 | 280 | 1.3873 | 0.5037 | 1.3873 | 1.1778 |
| No log | 9.0968 | 282 | 1.3009 | 0.5075 | 1.3009 | 1.1406 |
| No log | 9.1613 | 284 | 1.2518 | 0.5507 | 1.2518 | 1.1189 |
| No log | 9.2258 | 286 | 1.2262 | 0.5401 | 1.2262 | 1.1073 |
| No log | 9.2903 | 288 | 1.2832 | 0.5401 | 1.2832 | 1.1328 |
| No log | 9.3548 | 290 | 1.3988 | 0.4962 | 1.3988 | 1.1827 |
| No log | 9.4194 | 292 | 1.3307 | 0.4962 | 1.3307 | 1.1536 |
| No log | 9.4839 | 294 | 1.1749 | 0.5827 | 1.1749 | 1.0839 |
| No log | 9.5484 | 296 | 1.0538 | 0.5781 | 1.0538 | 1.0266 |
| No log | 9.6129 | 298 | 0.9662 | 0.6364 | 0.9662 | 0.9829 |
| No log | 9.6774 | 300 | 0.9636 | 0.5821 | 0.9636 | 0.9817 |
| No log | 9.7419 | 302 | 1.1765 | 0.5468 | 1.1765 | 1.0847 |
| No log | 9.8065 | 304 | 1.3295 | 0.4965 | 1.3295 | 1.1530 |
| No log | 9.8710 | 306 | 1.5422 | 0.3857 | 1.5422 | 1.2418 |
| No log | 9.9355 | 308 | 1.5370 | 0.4029 | 1.5370 | 1.2398 |
| No log | 10.0 | 310 | 1.4076 | 0.4593 | 1.4076 | 1.1864 |
| No log | 10.0645 | 312 | 1.2012 | 0.5385 | 1.2012 | 1.0960 |
| No log | 10.1290 | 314 | 1.0841 | 0.5714 | 1.0841 | 1.0412 |
| No log | 10.1935 | 316 | 1.1161 | 0.5821 | 1.1161 | 1.0565 |
| No log | 10.2581 | 318 | 1.1172 | 0.5821 | 1.1172 | 1.0570 |
| No log | 10.3226 | 320 | 1.2048 | 0.5344 | 1.2048 | 1.0976 |
| No log | 10.3871 | 322 | 1.2019 | 0.5344 | 1.2019 | 1.0963 |
| No log | 10.4516 | 324 | 1.0804 | 0.5397 | 1.0804 | 1.0394 |
| No log | 10.5161 | 326 | 1.0665 | 0.5669 | 1.0665 | 1.0327 |
| No log | 10.5806 | 328 | 1.0476 | 0.6094 | 1.0476 | 1.0235 |
| No log | 10.6452 | 330 | 1.0447 | 0.5891 | 1.0447 | 1.0221 |
| No log | 10.7097 | 332 | 1.0699 | 0.5581 | 1.0699 | 1.0343 |
| No log | 10.7742 | 334 | 1.1913 | 0.5303 | 1.1913 | 1.0915 |
| No log | 10.8387 | 336 | 1.2524 | 0.5303 | 1.2524 | 1.1191 |
| No log | 10.9032 | 338 | 1.1429 | 0.5231 | 1.1429 | 1.0691 |
| No log | 10.9677 | 340 | 1.0525 | 0.5625 | 1.0525 | 1.0259 |
| No log | 11.0323 | 342 | 1.0785 | 0.5426 | 1.0785 | 1.0385 |
| No log | 11.0968 | 344 | 1.0780 | 0.5426 | 1.0780 | 1.0383 |
| No log | 11.1613 | 346 | 1.0821 | 0.5426 | 1.0821 | 1.0403 |
| No log | 11.2258 | 348 | 1.0376 | 0.6047 | 1.0376 | 1.0186 |
| No log | 11.2903 | 350 | 1.0683 | 0.5606 | 1.0683 | 1.0336 |
| No log | 11.3548 | 352 | 1.2543 | 0.5652 | 1.2543 | 1.1200 |
| No log | 11.4194 | 354 | 1.3989 | 0.4203 | 1.3989 | 1.1827 |
| No log | 11.4839 | 356 | 1.4157 | 0.4648 | 1.4157 | 1.1899 |
| No log | 11.5484 | 358 | 1.2669 | 0.5652 | 1.2669 | 1.1256 |
| No log | 11.6129 | 360 | 1.1961 | 0.5652 | 1.1961 | 1.0937 |
| No log | 11.6774 | 362 | 1.1275 | 0.5821 | 1.1275 | 1.0618 |
| No log | 11.7419 | 364 | 1.1310 | 0.5758 | 1.1310 | 1.0635 |
| No log | 11.8065 | 366 | 1.1959 | 0.5581 | 1.1959 | 1.0936 |
| No log | 11.8710 | 368 | 1.2182 | 0.4961 | 1.2182 | 1.1037 |
| No log | 11.9355 | 370 | 1.1999 | 0.5 | 1.1999 | 1.0954 |
| No log | 12.0 | 372 | 1.1925 | 0.4961 | 1.1925 | 1.0920 |
| No log | 12.0645 | 374 | 1.1082 | 0.5538 | 1.1082 | 1.0527 |
| No log | 12.1290 | 376 | 1.0390 | 0.5714 | 1.0390 | 1.0193 |
| No log | 12.1935 | 378 | 1.0038 | 0.5827 | 1.0038 | 1.0019 |
| No log | 12.2581 | 380 | 1.0371 | 0.5714 | 1.0371 | 1.0184 |
| No log | 12.3226 | 382 | 1.1973 | 0.5778 | 1.1973 | 1.0942 |
| No log | 12.3871 | 384 | 1.2183 | 0.5606 | 1.2183 | 1.1038 |
| No log | 12.4516 | 386 | 1.0960 | 0.5821 | 1.0960 | 1.0469 |
| No log | 12.5161 | 388 | 1.0106 | 0.6202 | 1.0106 | 1.0053 |
| No log | 12.5806 | 390 | 1.0449 | 0.6 | 1.0449 | 1.0222 |
| No log | 12.6452 | 392 | 1.1287 | 0.5692 | 1.1287 | 1.0624 |
| No log | 12.7097 | 394 | 1.1217 | 0.5802 | 1.1217 | 1.0591 |
| No log | 12.7742 | 396 | 1.0332 | 0.5865 | 1.0332 | 1.0164 |
| No log | 12.8387 | 398 | 0.9501 | 0.6 | 0.9501 | 0.9747 |
| No log | 12.9032 | 400 | 0.9397 | 0.6 | 0.9397 | 0.9694 |
| No log | 12.9677 | 402 | 0.9522 | 0.6061 | 0.9522 | 0.9758 |
| No log | 13.0323 | 404 | 1.0178 | 0.5778 | 1.0178 | 1.0088 |
| No log | 13.0968 | 406 | 1.0014 | 0.5821 | 1.0014 | 1.0007 |
| No log | 13.1613 | 408 | 0.9611 | 0.6061 | 0.9611 | 0.9804 |
| No log | 13.2258 | 410 | 0.9399 | 0.6061 | 0.9399 | 0.9695 |
| No log | 13.2903 | 412 | 0.9310 | 0.6061 | 0.9310 | 0.9649 |
| No log | 13.3548 | 414 | 0.8946 | 0.5846 | 0.8946 | 0.9458 |
| No log | 13.4194 | 416 | 0.8789 | 0.6212 | 0.8789 | 0.9375 |
| No log | 13.4839 | 418 | 0.8782 | 0.6370 | 0.8782 | 0.9371 |
| No log | 13.5484 | 420 | 0.9152 | 0.6212 | 0.9152 | 0.9567 |
| No log | 13.6129 | 422 | 0.9485 | 0.6316 | 0.9485 | 0.9739 |
| No log | 13.6774 | 424 | 0.9502 | 0.6316 | 0.9502 | 0.9748 |
| No log | 13.7419 | 426 | 0.9084 | 0.6567 | 0.9084 | 0.9531 |
| No log | 13.8065 | 428 | 0.8024 | 0.6714 | 0.8024 | 0.8958 |
| No log | 13.8710 | 430 | 0.7747 | 0.7042 | 0.7747 | 0.8801 |
| No log | 13.9355 | 432 | 0.8314 | 0.6383 | 0.8314 | 0.9118 |
| No log | 14.0 | 434 | 1.0355 | 0.5778 | 1.0355 | 1.0176 |
| No log | 14.0645 | 436 | 1.2555 | 0.4964 | 1.2555 | 1.1205 |
| No log | 14.1290 | 438 | 1.2601 | 0.4964 | 1.2601 | 1.1225 |
| No log | 14.1935 | 440 | 1.1120 | 0.5714 | 1.1120 | 1.0545 |
| No log | 14.2581 | 442 | 0.9341 | 0.6107 | 0.9341 | 0.9665 |
| No log | 14.3226 | 444 | 0.8584 | 0.625 | 0.8584 | 0.9265 |
| No log | 14.3871 | 446 | 0.8591 | 0.6202 | 0.8591 | 0.9269 |
| No log | 14.4516 | 448 | 0.8936 | 0.6202 | 0.8936 | 0.9453 |
| No log | 14.5161 | 450 | 0.9120 | 0.6107 | 0.9120 | 0.9550 |
| No log | 14.5806 | 452 | 0.9308 | 0.5846 | 0.9308 | 0.9648 |
| No log | 14.6452 | 454 | 0.9379 | 0.5821 | 0.9379 | 0.9684 |
| No log | 14.7097 | 456 | 1.0179 | 0.5606 | 1.0179 | 1.0089 |
| No log | 14.7742 | 458 | 1.0298 | 0.5564 | 1.0298 | 1.0148 |
| No log | 14.8387 | 460 | 1.0654 | 0.5693 | 1.0654 | 1.0322 |
| No log | 14.9032 | 462 | 0.9774 | 0.6269 | 0.9774 | 0.9887 |
| No log | 14.9677 | 464 | 0.9593 | 0.6269 | 0.9593 | 0.9794 |
| No log | 15.0323 | 466 | 1.0469 | 0.5672 | 1.0469 | 1.0232 |
| No log | 15.0968 | 468 | 1.1406 | 0.5606 | 1.1406 | 1.0680 |
| No log | 15.1613 | 470 | 1.2219 | 0.5414 | 1.2219 | 1.1054 |
| No log | 15.2258 | 472 | 1.2092 | 0.5414 | 1.2092 | 1.0996 |
| No log | 15.2903 | 474 | 1.1006 | 0.5271 | 1.1006 | 1.0491 |
| No log | 15.3548 | 476 | 1.0252 | 0.5669 | 1.0252 | 1.0125 |
| No log | 15.4194 | 478 | 1.0500 | 0.5781 | 1.0500 | 1.0247 |
| No log | 15.4839 | 480 | 1.0501 | 0.5736 | 1.0501 | 1.0248 |
| No log | 15.5484 | 482 | 1.0484 | 0.5736 | 1.0484 | 1.0239 |
| No log | 15.6129 | 484 | 1.0105 | 0.5736 | 1.0105 | 1.0052 |
| No log | 15.6774 | 486 | 0.9995 | 0.5827 | 0.9995 | 0.9997 |
| No log | 15.7419 | 488 | 0.9973 | 0.5827 | 0.9973 | 0.9987 |
| No log | 15.8065 | 490 | 1.0301 | 0.5954 | 1.0301 | 1.0149 |
| No log | 15.8710 | 492 | 1.0024 | 0.5891 | 1.0024 | 1.0012 |
| No log | 15.9355 | 494 | 0.9952 | 0.5938 | 0.9952 | 0.9976 |
| No log | 16.0 | 496 | 0.9727 | 0.6107 | 0.9727 | 0.9863 |
| No log | 16.0645 | 498 | 0.9224 | 0.5891 | 0.9224 | 0.9604 |
| 0.3439 | 16.1290 | 500 | 0.8925 | 0.5891 | 0.8925 | 0.9447 |
| 0.3439 | 16.1935 | 502 | 0.8863 | 0.5938 | 0.8863 | 0.9414 |
| 0.3439 | 16.2581 | 504 | 0.9155 | 0.5891 | 0.9155 | 0.9568 |
| 0.3439 | 16.3226 | 506 | 0.9998 | 0.5781 | 0.9998 | 0.9999 |
| 0.3439 | 16.3871 | 508 | 1.1038 | 0.5758 | 1.1038 | 1.0506 |
| 0.3439 | 16.4516 | 510 | 1.2708 | 0.5481 | 1.2708 | 1.1273 |
| 0.3439 | 16.5161 | 512 | 1.3276 | 0.4964 | 1.3276 | 1.1522 |

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (Safetensors, F32 tensors)

Full model id: MayBashendy/ArabicNewSplits7_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k7_task1_organization (fine-tuned from aubmindlab/bert-base-arabertv02)