ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k2_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how these metrics can be computed follows the list):

  • Loss: 0.7436
  • Qwk: -0.0473
  • Mse: 0.7436
  • Rmse: 0.8623
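
Below is a minimal, hedged sketch of how metrics of this kind are commonly computed with scikit-learn: Qwk presumably denotes quadratic weighted Cohen's kappa, and Mse/Rmse the (root) mean squared error. The label and prediction arrays are made-up placeholders, not the actual evaluation data behind the numbers above.

```python
# Sketch: computing QWK, MSE, and RMSE with scikit-learn.
# y_true / y_pred are hypothetical placeholders, not this model's eval data.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([0, 1, 2, 1, 3])   # hypothetical gold organization scores
y_pred = np.array([0, 2, 2, 1, 1])   # hypothetical (rounded) model predictions

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # Qwk
mse = mean_squared_error(y_true, y_pred)                      # Mse
rmse = np.sqrt(mse)                                           # Rmse

print(f"Qwk: {qwk:.4f}  Mse: {mse:.4f}  Rmse: {rmse:.4f}")
```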

Model description

More information needed

Intended uses & limitations

More information needed
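
Since the card does not document the intended usage, the following is only a hedged sketch of loading the checkpoint from the Hub for inference. It assumes the model was saved with a sequence-classification head; the head configuration, label mapping, and score interpretation are not stated in the card.

```python
# Sketch: loading this checkpoint for inference with transformers.
# Assumes a sequence-classification head was saved (not documented in the card).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k2_task3_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "..."  # placeholder: an Arabic essay to score
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
print(logits)  # raw output; interpretation depends on the (undocumented) training setup
```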

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (see the reproduction sketch after the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
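
As a rough guide, the hyperparameters above map onto transformers TrainingArguments as sketched below. This is an assumption-laden reconstruction, not the published training script: the dataset objects are placeholders, and the single-output regression head (num_labels=1) is inferred only from the MSE/RMSE metrics.

```python
# Sketch: TrainingArguments mirroring the listed hyperparameters.
# Dataset objects are placeholders; the actual training script is not published.
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          TrainingArguments, Trainer)

base = "aubmindlab/bert-base-arabertv02"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForSequenceClassification.from_pretrained(base, num_labels=1)  # assumed regression head

args = TrainingArguments(
    output_dir="arabert-task3-organization",  # hypothetical output path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    eval_strategy="steps",  # the results table reports evaluation every 2 steps
    eval_steps=2,
)

# trainer = Trainer(model=model, args=args,
#                   train_dataset=train_dataset,   # placeholder dataset
#                   eval_dataset=eval_dataset,     # placeholder dataset
#                   tokenizer=tokenizer)
# trainer.train()
```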

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.1818 2 3.7095 -0.0068 3.7095 1.9260
No log 0.3636 4 1.9807 -0.0696 1.9807 1.4074
No log 0.5455 6 1.2060 0.0271 1.2060 1.0982
No log 0.7273 8 1.1043 0.0032 1.1043 1.0508
No log 0.9091 10 0.9576 0.0446 0.9576 0.9786
No log 1.0909 12 0.6860 0.0964 0.6860 0.8282
No log 1.2727 14 0.7024 0.0334 0.7024 0.8381
No log 1.4545 16 0.7762 0.0786 0.7762 0.8810
No log 1.6364 18 0.7125 0.0296 0.7125 0.8441
No log 1.8182 20 0.6924 0.0506 0.6924 0.8321
No log 2.0 22 0.7851 0.0512 0.7851 0.8861
No log 2.1818 24 0.9486 -0.0253 0.9486 0.9739
No log 2.3636 26 1.0451 0.0680 1.0451 1.0223
No log 2.5455 28 0.8068 -0.0033 0.8068 0.8982
No log 2.7273 30 0.6938 -0.0644 0.6938 0.8330
No log 2.9091 32 0.7124 -0.0131 0.7124 0.8440
No log 3.0909 34 0.7706 0.1943 0.7706 0.8778
No log 3.2727 36 0.7825 0.1722 0.7825 0.8846
No log 3.4545 38 0.8565 0.0408 0.8565 0.9254
No log 3.6364 40 0.9775 0.1176 0.9775 0.9887
No log 3.8182 42 1.0302 -0.1281 1.0302 1.0150
No log 4.0 44 1.0880 -0.0023 1.0880 1.0431
No log 4.1818 46 1.1343 -0.0283 1.1343 1.0650
No log 4.3636 48 1.0969 -0.1727 1.0969 1.0473
No log 4.5455 50 1.0995 -0.0806 1.0995 1.0486
No log 4.7273 52 1.0507 -0.1821 1.0507 1.0250
No log 4.9091 54 1.0167 -0.1371 1.0167 1.0083
No log 5.0909 56 0.9539 0.0423 0.9539 0.9767
No log 5.2727 58 0.9939 -0.0148 0.9939 0.9970
No log 5.4545 60 0.9999 0.0022 0.9999 1.0000
No log 5.6364 62 1.2060 -0.0611 1.2060 1.0982
No log 5.8182 64 1.0724 -0.0756 1.0724 1.0356
No log 6.0 66 1.3070 -0.0895 1.3070 1.1432
No log 6.1818 68 1.3435 -0.0694 1.3435 1.1591
No log 6.3636 70 1.1743 -0.0170 1.1743 1.0836
No log 6.5455 72 1.2419 0.0642 1.2419 1.1144
No log 6.7273 74 1.1199 0.0151 1.1199 1.0582
No log 6.9091 76 1.0537 0.0067 1.0537 1.0265
No log 7.0909 78 1.0311 -0.0504 1.0311 1.0154
No log 7.2727 80 0.9036 -0.0247 0.9036 0.9506
No log 7.4545 82 0.9796 0.0432 0.9796 0.9898
No log 7.6364 84 0.9313 -0.0364 0.9313 0.9650
No log 7.8182 86 0.9607 -0.0790 0.9607 0.9802
No log 8.0 88 0.9444 -0.0765 0.9444 0.9718
No log 8.1818 90 0.9777 -0.0727 0.9777 0.9888
No log 8.3636 92 0.9853 -0.0454 0.9853 0.9926
No log 8.5455 94 1.0636 0.0086 1.0636 1.0313
No log 8.7273 96 0.8817 0.1379 0.8817 0.9390
No log 8.9091 98 0.8525 0.1386 0.8525 0.9233
No log 9.0909 100 0.9022 0.0029 0.9022 0.9499
No log 9.2727 102 0.8850 0.0377 0.8850 0.9407
No log 9.4545 104 0.8576 0.0884 0.8576 0.9261
No log 9.6364 106 0.9149 0.0309 0.9149 0.9565
No log 9.8182 108 0.7914 0.1031 0.7914 0.8896
No log 10.0 110 0.9802 0.0458 0.9802 0.9901
No log 10.1818 112 0.9037 -0.0379 0.9037 0.9506
No log 10.3636 114 0.9433 0.0692 0.9433 0.9712
No log 10.5455 116 1.0657 -0.0194 1.0657 1.0323
No log 10.7273 118 0.8252 0.0087 0.8252 0.9084
No log 10.9091 120 0.8767 -0.0492 0.8767 0.9363
No log 11.0909 122 0.9045 -0.0393 0.9045 0.9511
No log 11.2727 124 0.8582 0.0968 0.8582 0.9264
No log 11.4545 126 1.0339 -0.0138 1.0339 1.0168
No log 11.6364 128 1.0014 -0.0118 1.0014 1.0007
No log 11.8182 130 0.8534 0.1010 0.8534 0.9238
No log 12.0 132 0.8428 0.0095 0.8428 0.9180
No log 12.1818 134 0.7837 0.0670 0.7837 0.8853
No log 12.3636 136 0.7448 0.2483 0.7448 0.8630
No log 12.5455 138 0.7714 0.1480 0.7714 0.8783
No log 12.7273 140 0.7925 0.1224 0.7925 0.8902
No log 12.9091 142 0.8836 -0.0107 0.8836 0.9400
No log 13.0909 144 1.1281 -0.0099 1.1281 1.0621
No log 13.2727 146 0.9617 -0.0733 0.9617 0.9807
No log 13.4545 148 0.8536 0.1918 0.8536 0.9239
No log 13.6364 150 0.9409 0.0320 0.9409 0.9700
No log 13.8182 152 0.7644 0.2298 0.7644 0.8743
No log 14.0 154 0.7872 0.0633 0.7872 0.8872
No log 14.1818 156 0.7522 0.1080 0.7522 0.8673
No log 14.3636 158 0.7204 0.0214 0.7204 0.8487
No log 14.5455 160 0.7546 -0.0336 0.7546 0.8687
No log 14.7273 162 0.7189 0.0869 0.7189 0.8479
No log 14.9091 164 0.7829 0.1480 0.7829 0.8848
No log 15.0909 166 0.9220 -0.0059 0.9220 0.9602
No log 15.2727 168 1.0222 0.0703 1.0222 1.0111
No log 15.4545 170 1.0371 0.0689 1.0371 1.0184
No log 15.6364 172 0.9058 0.0995 0.9058 0.9517
No log 15.8182 174 0.9635 0.0320 0.9635 0.9816
No log 16.0 176 1.1448 0.0025 1.1448 1.0700
No log 16.1818 178 0.9059 -0.0500 0.9059 0.9518
No log 16.3636 180 0.7384 0.2522 0.7384 0.8593
No log 16.5455 182 1.0767 -0.0120 1.0767 1.0376
No log 16.7273 184 1.1305 0.0531 1.1305 1.0632
No log 16.9091 186 0.8463 0.0275 0.8463 0.9199
No log 17.0909 188 0.8847 -0.0094 0.8847 0.9406
No log 17.2727 190 1.1830 0.0063 1.1830 1.0877
No log 17.4545 192 1.0306 -0.0508 1.0306 1.0152
No log 17.6364 194 0.8099 0.2443 0.8099 0.8999
No log 17.8182 196 0.9062 -0.0743 0.9062 0.9519
No log 18.0 198 0.9288 -0.0696 0.9288 0.9637
No log 18.1818 200 0.8240 0.1321 0.8240 0.9077
No log 18.3636 202 0.8740 -0.0076 0.8740 0.9349
No log 18.5455 204 0.9064 -0.0097 0.9064 0.9521
No log 18.7273 206 0.8360 0.0793 0.8360 0.9143
No log 18.9091 208 0.7923 0.1359 0.7923 0.8901
No log 19.0909 210 0.7997 0.1456 0.7997 0.8942
No log 19.2727 212 0.8222 0.1007 0.8222 0.9067
No log 19.4545 214 0.8539 0.0987 0.8539 0.9240
No log 19.6364 216 0.8974 0.0268 0.8974 0.9473
No log 19.8182 218 0.8528 0.0644 0.8528 0.9235
No log 20.0 220 0.7903 0.1769 0.7903 0.8890
No log 20.1818 222 0.7927 0.0660 0.7927 0.8903
No log 20.3636 224 0.7717 0.1244 0.7717 0.8785
No log 20.5455 226 0.7751 0.1463 0.7751 0.8804
No log 20.7273 228 0.7804 0.1474 0.7804 0.8834
No log 20.9091 230 0.8260 0.0490 0.8260 0.9088
No log 21.0909 232 0.8915 -0.0486 0.8915 0.9442
No log 21.2727 234 0.9404 -0.0500 0.9404 0.9697
No log 21.4545 236 0.8347 0.1281 0.8347 0.9136
No log 21.6364 238 0.8432 0.0229 0.8432 0.9183
No log 21.8182 240 0.8992 -0.0363 0.8992 0.9483
No log 22.0 242 0.8161 0.0654 0.8161 0.9034
No log 22.1818 244 0.7666 0.2166 0.7666 0.8755
No log 22.3636 246 0.8305 -0.0818 0.8305 0.9113
No log 22.5455 248 0.8906 -0.0486 0.8906 0.9437
No log 22.7273 250 0.7759 0.0639 0.7759 0.8809
No log 22.9091 252 0.7952 0.0905 0.7952 0.8918
No log 23.0909 254 0.9918 -0.0120 0.9918 0.9959
No log 23.2727 256 0.9824 -0.0120 0.9824 0.9911
No log 23.4545 258 0.8273 0.0514 0.8273 0.9095
No log 23.6364 260 0.8313 -0.0008 0.8313 0.9118
No log 23.8182 262 0.9198 -0.0504 0.9198 0.9591
No log 24.0 264 0.8223 -0.0079 0.8223 0.9068
No log 24.1818 266 0.6873 0.0914 0.6873 0.8291
No log 24.3636 268 0.6974 -0.0451 0.6974 0.8351
No log 24.5455 270 0.7342 -0.0451 0.7342 0.8569
No log 24.7273 272 0.7286 -0.0451 0.7286 0.8536
No log 24.9091 274 0.7391 0.0768 0.7391 0.8597
No log 25.0909 276 0.9213 0.0545 0.9213 0.9598
No log 25.2727 278 0.9979 0.0111 0.9979 0.9990
No log 25.4545 280 0.8773 0.0304 0.8773 0.9366
No log 25.6364 282 0.7898 0.0733 0.7898 0.8887
No log 25.8182 284 0.8263 0.1400 0.8263 0.9090
No log 26.0 286 0.8130 0.1386 0.8130 0.9016
No log 26.1818 288 0.7326 0.1646 0.7326 0.8559
No log 26.3636 290 0.7998 0.0793 0.7998 0.8943
No log 26.5455 292 0.9154 0.0169 0.9154 0.9568
No log 26.7273 294 0.8262 0.0684 0.8262 0.9089
No log 26.9091 296 0.7272 0.1529 0.7272 0.8528
No log 27.0909 298 0.7557 0.1460 0.7557 0.8693
No log 27.2727 300 0.7814 0.1079 0.7814 0.8840
No log 27.4545 302 0.7284 0.2906 0.7284 0.8535
No log 27.6364 304 0.8584 0.0200 0.8584 0.9265
No log 27.8182 306 0.9884 0.0046 0.9884 0.9942
No log 28.0 308 0.9213 0.0431 0.9213 0.9598
No log 28.1818 310 0.7317 0.0562 0.7317 0.8554
No log 28.3636 312 0.6923 0.0513 0.6923 0.8321
No log 28.5455 314 0.7160 0.1135 0.7160 0.8462
No log 28.7273 316 0.6866 0.1024 0.6866 0.8286
No log 28.9091 318 0.7110 0.1440 0.7110 0.8432
No log 29.0909 320 0.8614 0.0304 0.8614 0.9281
No log 29.2727 322 0.9913 -0.0204 0.9913 0.9957
No log 29.4545 324 0.9119 0.0217 0.9119 0.9549
No log 29.6364 326 0.7845 0.1846 0.7845 0.8857
No log 29.8182 328 0.8761 0.1349 0.8761 0.9360
No log 30.0 330 1.0006 0.0472 1.0006 1.0003
No log 30.1818 332 0.9758 0.0769 0.9758 0.9878
No log 30.3636 334 0.8052 0.0239 0.8052 0.8973
No log 30.5455 336 0.7905 0.0525 0.7905 0.8891
No log 30.7273 338 0.9130 0.0304 0.9130 0.9555
No log 30.9091 340 0.8888 0.0250 0.8888 0.9427
No log 31.0909 342 0.7964 -0.0031 0.7964 0.8924
No log 31.2727 344 0.7546 0.1495 0.7546 0.8687
No log 31.4545 346 0.7574 0.1495 0.7574 0.8703
No log 31.6364 348 0.7592 0.1495 0.7592 0.8713
No log 31.8182 350 0.7596 0.1495 0.7596 0.8715
No log 32.0 352 0.7898 0.0755 0.7898 0.8887
No log 32.1818 354 0.8418 -0.0076 0.8418 0.9175
No log 32.3636 356 0.8925 -0.0114 0.8925 0.9447
No log 32.5455 358 0.8196 0.1475 0.8196 0.9053
No log 32.7273 360 0.7552 0.2128 0.7552 0.8690
No log 32.9091 362 0.7487 0.1803 0.7487 0.8653
No log 33.0909 364 0.7522 0.0218 0.7522 0.8673
No log 33.2727 366 0.6954 0.1576 0.6954 0.8339
No log 33.4545 368 0.6760 0.0914 0.6760 0.8222
No log 33.6364 370 0.6999 0.2225 0.6999 0.8366
No log 33.8182 372 0.7291 0.2838 0.7291 0.8538
No log 34.0 374 0.7814 0.0793 0.7814 0.8840
No log 34.1818 376 0.8004 0.1239 0.8004 0.8946
No log 34.3636 378 0.8129 0.0517 0.8129 0.9016
No log 34.5455 380 0.8248 0.0551 0.8248 0.9082
No log 34.7273 382 0.8015 0.1465 0.8015 0.8953
No log 34.9091 384 0.7995 0.0562 0.7995 0.8942
No log 35.0909 386 0.8097 0.0831 0.8097 0.8998
No log 35.2727 388 0.7551 0.1096 0.7551 0.8690
No log 35.4545 390 0.7159 0.1740 0.7159 0.8461
No log 35.6364 392 0.7112 0.1362 0.7112 0.8434
No log 35.8182 394 0.7275 0.0814 0.7275 0.8529
No log 36.0 396 0.7653 0.1879 0.7653 0.8748
No log 36.1818 398 0.8095 0.0304 0.8095 0.8997
No log 36.3636 400 0.7892 0.1431 0.7892 0.8884
No log 36.5455 402 0.7855 0.1796 0.7855 0.8863
No log 36.7273 404 0.8168 0.0952 0.8168 0.9038
No log 36.9091 406 0.8046 0.1714 0.8046 0.8970
No log 37.0909 408 0.7921 0.1983 0.7921 0.8900
No log 37.2727 410 0.7930 0.0562 0.7930 0.8905
No log 37.4545 412 0.7863 0.0490 0.7863 0.8867
No log 37.6364 414 0.7506 0.0600 0.7506 0.8664
No log 37.8182 416 0.7286 0.1856 0.7286 0.8536
No log 38.0 418 0.7208 0.1787 0.7208 0.8490
No log 38.1818 420 0.7248 0.1553 0.7248 0.8513
No log 38.3636 422 0.7619 0.0755 0.7619 0.8729
No log 38.5455 424 0.7783 0.0333 0.7783 0.8822
No log 38.7273 426 0.7331 0.1879 0.7331 0.8562
No log 38.9091 428 0.7550 0.1501 0.7550 0.8689
No log 39.0909 430 0.7756 0.1079 0.7756 0.8807
No log 39.2727 432 0.7819 0.1079 0.7819 0.8842
No log 39.4545 434 0.7511 0.1080 0.7511 0.8667
No log 39.6364 436 0.7259 0.1553 0.7259 0.8520
No log 39.8182 438 0.8096 0.0676 0.8096 0.8998
No log 40.0 440 0.8322 0.0676 0.8322 0.9123
No log 40.1818 442 0.7576 0.1150 0.7576 0.8704
No log 40.3636 444 0.6932 0.1199 0.6932 0.8326
No log 40.5455 446 0.7284 0.1080 0.7284 0.8535
No log 40.7273 448 0.7654 0.0633 0.7654 0.8749
No log 40.9091 450 0.7576 0.1080 0.7576 0.8704
No log 41.0909 452 0.7470 0.1778 0.7470 0.8643
No log 41.2727 454 0.7313 0.1249 0.7313 0.8551
No log 41.4545 456 0.7280 0.1249 0.7280 0.8532
No log 41.6364 458 0.7518 0.0562 0.7518 0.8671
No log 41.8182 460 0.8107 0.0793 0.8107 0.9004
No log 42.0 462 0.8564 -0.0076 0.8564 0.9254
No log 42.1818 464 0.8175 0.0392 0.8175 0.9042
No log 42.3636 466 0.7679 0.2195 0.7679 0.8763
No log 42.5455 468 0.7898 0.1537 0.7898 0.8887
No log 42.7273 470 0.8561 -0.0037 0.8561 0.9253
No log 42.9091 472 0.8291 0.0285 0.8291 0.9105
No log 43.0909 474 0.7781 0.2294 0.7781 0.8821
No log 43.2727 476 0.8225 0.0016 0.8225 0.9069
No log 43.4545 478 0.8500 0.0755 0.8500 0.9220
No log 43.6364 480 0.8204 0.0793 0.8204 0.9058
No log 43.8182 482 0.7731 0.0574 0.7731 0.8793
No log 44.0 484 0.7507 0.1906 0.7507 0.8664
No log 44.1818 486 0.7577 0.1537 0.7577 0.8705
No log 44.3636 488 0.7378 0.1131 0.7378 0.8590
No log 44.5455 490 0.6991 -0.0541 0.6991 0.8361
No log 44.7273 492 0.6972 0.0436 0.6972 0.8350
No log 44.9091 494 0.7251 0.0247 0.7251 0.8515
No log 45.0909 496 0.7551 0.1387 0.7551 0.8690
No log 45.2727 498 0.7622 0.1001 0.7622 0.8730
0.2431 45.4545 500 0.7573 0.1612 0.7573 0.8702
0.2431 45.6364 502 0.7535 0.1612 0.7535 0.8681
0.2431 45.8182 504 0.7421 0.0723 0.7421 0.8614
0.2431 46.0 506 0.7307 0.0967 0.7307 0.8548
0.2431 46.1818 508 0.7268 -0.0541 0.7268 0.8525
0.2431 46.3636 510 0.7436 -0.0473 0.7436 0.8623

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1