ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k8_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7949
  • Qwk: 0.0308
  • Mse: 0.7949
  • Rmse: 0.8916
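
Qwk here is the quadratically weighted Cohen's kappa between predicted and gold scores, and Mse/Rmse are the (root) mean squared error. Below is a minimal sketch of how these metrics can be computed with scikit-learn; the example scores and the rounding of predictions to integers are assumptions for illustration, not details taken from this card.

```python
# Hedged sketch: compute QWK, MSE and RMSE for hypothetical essay scores.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([2, 3, 1, 4, 2])  # hypothetical gold organization scores
y_pred = np.array([2, 2, 1, 3, 3])  # hypothetical model predictions (rounded to integers)

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # Qwk
mse = mean_squared_error(y_true, y_pred)                      # Mse
rmse = np.sqrt(mse)                                           # Rmse
print(f"QWK={qwk:.4f}  MSE={mse:.4f}  RMSE={rmse:.4f}")
```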

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
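
As a rough illustration, the hyperparameters above map onto a Hugging Face TrainingArguments configuration as in the sketch below; the output directory, the single-logit regression head, and the dataset objects are assumptions, not details taken from this card (the Adam betas and epsilon listed above match the Trainer's defaults).

```python
# Hedged sketch: reproduce the listed hyperparameters with the Trainer API.
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

model_name = "aubmindlab/bert-base-arabertv02"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(
    model_name, num_labels=1, problem_type="regression")  # assumed scoring head

args = TrainingArguments(
    output_dir="arabert-task3-organization",  # hypothetical output path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)

# trainer = Trainer(model=model, args=args,
#                   train_dataset=train_dataset,  # assumed dataset objects
#                   eval_dataset=eval_dataset)
# trainer.train()
```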

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0488 2 3.9204 0.0103 3.9204 1.9800
No log 0.0976 4 2.1228 -0.0302 2.1228 1.4570
No log 0.1463 6 1.6873 0.0172 1.6873 1.2989
No log 0.1951 8 1.0503 -0.0117 1.0503 1.0249
No log 0.2439 10 0.7613 -0.0725 0.7613 0.8725
No log 0.2927 12 0.7422 -0.0160 0.7422 0.8615
No log 0.3415 14 0.9588 0.0819 0.9588 0.9792
No log 0.3902 16 1.3408 0.0 1.3408 1.1579
No log 0.4390 18 1.0502 0.0683 1.0502 1.0248
No log 0.4878 20 0.7083 -0.0035 0.7083 0.8416
No log 0.5366 22 0.7302 0.0 0.7302 0.8545
No log 0.5854 24 0.7299 0.0460 0.7299 0.8544
No log 0.6341 26 0.7901 -0.0801 0.7901 0.8889
No log 0.6829 28 0.8688 -0.1682 0.8688 0.9321
No log 0.7317 30 0.9056 -0.0532 0.9056 0.9517
No log 0.7805 32 0.9327 -0.0821 0.9327 0.9658
No log 0.8293 34 0.9791 -0.0517 0.9791 0.9895
No log 0.8780 36 0.9307 -0.1077 0.9307 0.9647
No log 0.9268 38 0.8751 -0.1051 0.8751 0.9355
No log 0.9756 40 0.7779 -0.0118 0.7779 0.8820
No log 1.0244 42 0.8974 0.0946 0.8974 0.9473
No log 1.0732 44 0.8219 0.0748 0.8219 0.9066
No log 1.1220 46 0.7469 -0.0032 0.7469 0.8643
No log 1.1707 48 0.8664 -0.0016 0.8664 0.9308
No log 1.2195 50 0.8570 0.1428 0.8570 0.9257
No log 1.2683 52 0.8146 -0.0391 0.8146 0.9025
No log 1.3171 54 0.9526 0.0946 0.9526 0.9760
No log 1.3659 56 0.8327 0.1051 0.8327 0.9125
No log 1.4146 58 1.1236 0.0525 1.1236 1.0600
No log 1.4634 60 0.9710 0.0415 0.9710 0.9854
No log 1.5122 62 0.8158 0.0465 0.8158 0.9032
No log 1.5610 64 0.9742 0.0576 0.9742 0.9870
No log 1.6098 66 0.8163 0.1324 0.8163 0.9035
No log 1.6585 68 1.0304 0.0101 1.0304 1.0151
No log 1.7073 70 1.2324 0.0293 1.2324 1.1101
No log 1.7561 72 0.9874 0.0027 0.9874 0.9937
No log 1.8049 74 0.8805 0.1432 0.8805 0.9384
No log 1.8537 76 0.9393 0.1789 0.9393 0.9692
No log 1.9024 78 1.2647 0.1046 1.2647 1.1246
No log 1.9512 80 1.2750 0.0496 1.2750 1.1292
No log 2.0 82 0.9076 0.0494 0.9076 0.9527
No log 2.0488 84 0.9160 0.0927 0.9160 0.9571
No log 2.0976 86 0.9233 0.0927 0.9233 0.9609
No log 2.1463 88 0.9051 0.1339 0.9051 0.9514
No log 2.1951 90 1.0406 0.0111 1.0406 1.0201
No log 2.2439 92 0.8911 0.0623 0.8911 0.9440
No log 2.2927 94 0.8249 -0.0241 0.8249 0.9082
No log 2.3415 96 0.8139 -0.0686 0.8139 0.9022
No log 2.3902 98 0.8316 0.0833 0.8316 0.9119
No log 2.4390 100 0.8689 -0.0251 0.8689 0.9321
No log 2.4878 102 0.8949 -0.0271 0.8949 0.9460
No log 2.5366 104 0.8995 0.0549 0.8995 0.9484
No log 2.5854 106 0.9465 0.0483 0.9465 0.9729
No log 2.6341 108 1.0255 0.1005 1.0255 1.0127
No log 2.6829 110 0.9665 0.0494 0.9665 0.9831
No log 2.7317 112 1.0715 0.0639 1.0715 1.0352
No log 2.7805 114 0.9997 0.1179 0.9997 0.9999
No log 2.8293 116 0.9612 0.1834 0.9612 0.9804
No log 2.8780 118 1.0231 0.1192 1.0231 1.0115
No log 2.9268 120 0.9629 0.1733 0.9629 0.9813
No log 2.9756 122 1.0101 0.0120 1.0101 1.0051
No log 3.0244 124 0.9298 0.1091 0.9298 0.9643
No log 3.0732 126 1.0431 0.0723 1.0431 1.0213
No log 3.1220 128 1.0024 0.0428 1.0024 1.0012
No log 3.1707 130 0.9044 0.0681 0.9044 0.9510
No log 3.2195 132 0.8436 0.0538 0.8436 0.9185
No log 3.2683 134 0.8761 0.0041 0.8761 0.9360
No log 3.3171 136 0.7466 0.2252 0.7466 0.8641
No log 3.3659 138 0.8965 -0.0187 0.8965 0.9468
No log 3.4146 140 0.8934 0.0767 0.8934 0.9452
No log 3.4634 142 0.7473 0.1951 0.7473 0.8645
No log 3.5122 144 0.7629 0.1196 0.7629 0.8734
No log 3.5610 146 0.8093 0.1868 0.8093 0.8996
No log 3.6098 148 0.8588 0.0699 0.8588 0.9267
No log 3.6585 150 0.7927 0.1372 0.7927 0.8904
No log 3.7073 152 0.7865 -0.0145 0.7865 0.8868
No log 3.7561 154 0.7809 0.0412 0.7809 0.8837
No log 3.8049 156 0.7881 -0.0118 0.7881 0.8878
No log 3.8537 158 0.7714 -0.0059 0.7714 0.8783
No log 3.9024 160 0.8704 0.0303 0.8704 0.9330
No log 3.9512 162 0.8582 0.1036 0.8582 0.9264
No log 4.0 164 0.8292 0.1561 0.8292 0.9106
No log 4.0488 166 0.9894 0.0627 0.9894 0.9947
No log 4.0976 168 0.8616 0.0538 0.8616 0.9282
No log 4.1463 170 0.9188 0.1078 0.9188 0.9586
No log 4.1951 172 0.8963 -0.0039 0.8963 0.9467
No log 4.2439 174 0.7523 -0.0033 0.7523 0.8674
No log 4.2927 176 0.8618 0.0442 0.8618 0.9283
No log 4.3415 178 0.8303 0.0512 0.8303 0.9112
No log 4.3902 180 0.7855 0.0814 0.7855 0.8863
No log 4.4390 182 0.7817 0.1453 0.7817 0.8841
No log 4.4878 184 0.8512 0.1048 0.8512 0.9226
No log 4.5366 186 0.8865 0.0016 0.8865 0.9415
No log 4.5854 188 0.8198 0.1144 0.8198 0.9054
No log 4.6341 190 0.7969 0.1722 0.7969 0.8927
No log 4.6829 192 0.7595 0.1304 0.7595 0.8715
No log 4.7317 194 0.7159 0.0814 0.7159 0.8461
No log 4.7805 196 0.6953 0.0814 0.6953 0.8338
No log 4.8293 198 0.7126 0.1691 0.7126 0.8441
No log 4.8780 200 0.6955 0.0814 0.6955 0.8340
No log 4.9268 202 0.7509 0.0639 0.7509 0.8665
No log 4.9756 204 0.7863 0.0768 0.7863 0.8868
No log 5.0244 206 0.9284 0.0391 0.9284 0.9635
No log 5.0732 208 1.0255 0.0175 1.0255 1.0127
No log 5.1220 210 0.8544 0.1379 0.8544 0.9243
No log 5.1707 212 0.8036 0.0680 0.8036 0.8964
No log 5.2195 214 0.7847 0.0247 0.7847 0.8858
No log 5.2683 216 0.8540 0.1079 0.8540 0.9241
No log 5.3171 218 0.9255 0.1450 0.9255 0.9620
No log 5.3659 220 0.7980 0.0893 0.7980 0.8933
No log 5.4146 222 0.7915 0.1146 0.7915 0.8897
No log 5.4634 224 0.8475 0.0525 0.8475 0.9206
No log 5.5122 226 0.8041 0.0776 0.8041 0.8967
No log 5.5610 228 0.8093 0.0074 0.8093 0.8996
No log 5.6098 230 0.7762 0.0814 0.7762 0.8810
No log 5.6585 232 0.7829 0.0639 0.7829 0.8848
No log 5.7073 234 0.7400 0.1254 0.7400 0.8603
No log 5.7561 236 0.7600 0.0622 0.7600 0.8718
No log 5.8049 238 0.7645 0.0622 0.7645 0.8743
No log 5.8537 240 0.7433 -0.0091 0.7433 0.8622
No log 5.9024 242 0.7829 0.1146 0.7829 0.8848
No log 5.9512 244 0.7854 0.1001 0.7854 0.8862
No log 6.0 246 0.7333 -0.0091 0.7333 0.8563
No log 6.0488 248 0.7930 -0.0195 0.7930 0.8905
No log 6.0976 250 0.7443 -0.0406 0.7443 0.8627
No log 6.1463 252 0.7107 0.0460 0.7107 0.8431
No log 6.1951 254 0.7220 0.1828 0.7220 0.8497
No log 6.2439 256 0.7046 -0.0065 0.7046 0.8394
No log 6.2927 258 0.7803 -0.0226 0.7803 0.8833
No log 6.3415 260 0.7589 -0.0226 0.7589 0.8711
No log 6.3902 262 0.7330 0.0723 0.7330 0.8562
No log 6.4390 264 0.8688 0.0316 0.8688 0.9321
No log 6.4878 266 0.8217 0.0409 0.8217 0.9065
No log 6.5366 268 0.7218 0.1304 0.7218 0.8496
No log 6.5854 270 0.8336 0.1426 0.8336 0.9130
No log 6.6341 272 0.8388 0.1426 0.8388 0.9159
No log 6.6829 274 0.7924 0.0930 0.7924 0.8902
No log 6.7317 276 0.8379 0.1449 0.8379 0.9154
No log 6.7805 278 0.9352 0.0566 0.9352 0.9671
No log 6.8293 280 0.7943 0.1449 0.7943 0.8912
No log 6.8780 282 0.7090 -0.0065 0.7090 0.8420
No log 6.9268 284 0.7464 0.0094 0.7464 0.8639
No log 6.9756 286 0.7202 -0.0033 0.7202 0.8487
No log 7.0244 288 0.7273 0.0334 0.7273 0.8528
No log 7.0732 290 0.7571 0.0807 0.7571 0.8701
No log 7.1220 292 0.7642 0.0395 0.7642 0.8742
No log 7.1707 294 0.8176 -0.0248 0.8176 0.9042
No log 7.2195 296 0.9045 0.0405 0.9045 0.9511
No log 7.2683 298 0.8367 0.0710 0.8367 0.9147
No log 7.3171 300 0.7601 0.0926 0.7601 0.8719
No log 7.3659 302 0.8460 0.0525 0.8460 0.9198
No log 7.4146 304 0.9186 0.0016 0.9186 0.9584
No log 7.4634 306 0.8642 0.2254 0.8642 0.9296
No log 7.5122 308 0.9830 0.0365 0.9830 0.9914
No log 7.5610 310 1.0046 0.0365 1.0046 1.0023
No log 7.6098 312 0.8949 0.1037 0.8949 0.9460
No log 7.6585 314 0.8527 0.1867 0.8527 0.9234
No log 7.7073 316 0.8347 0.1048 0.8347 0.9136
No log 7.7561 318 0.8284 0.0690 0.8284 0.9102
No log 7.8049 320 0.8184 0.0081 0.8184 0.9046
No log 7.8537 322 0.8399 -0.0730 0.8399 0.9164
No log 7.9024 324 0.8671 -0.0268 0.8671 0.9312
No log 7.9512 326 0.8945 0.0301 0.8945 0.9458
No log 8.0 328 0.9285 0.0490 0.9285 0.9636
No log 8.0488 330 0.8679 0.1001 0.8679 0.9316
No log 8.0976 332 0.7825 -0.0096 0.7825 0.8846
No log 8.1463 334 0.7433 -0.0062 0.7433 0.8621
No log 8.1951 336 0.7433 0.0814 0.7433 0.8621
No log 8.2439 338 0.7975 0.1001 0.7975 0.8930
No log 8.2927 340 0.8732 0.0525 0.8732 0.9345
No log 8.3415 342 0.8920 -0.0138 0.8920 0.9445
No log 8.3902 344 1.0341 0.0635 1.0341 1.0169
No log 8.4390 346 1.0277 0.0903 1.0277 1.0137
No log 8.4878 348 0.9470 0.0336 0.9470 0.9731
No log 8.5366 350 0.9156 0.0407 0.9156 0.9569
No log 8.5854 352 0.8513 0.0476 0.8513 0.9227
No log 8.6341 354 0.7730 -0.0125 0.7730 0.8792
No log 8.6829 356 0.7891 -0.0350 0.7891 0.8883
No log 8.7317 358 0.7952 -0.0350 0.7952 0.8917
No log 8.7805 360 0.7782 0.0414 0.7782 0.8822
No log 8.8293 362 0.7954 0.0600 0.7954 0.8918
No log 8.8780 364 0.8157 0.0611 0.8157 0.9031
No log 8.9268 366 0.7952 0.1189 0.7952 0.8917
No log 8.9756 368 0.7923 0.1031 0.7923 0.8901
No log 9.0244 370 0.7747 0.0889 0.7747 0.8802
No log 9.0732 372 0.7983 0.1003 0.7983 0.8935
No log 9.1220 374 0.7981 0.1001 0.7981 0.8934
No log 9.1707 376 0.7763 0.0732 0.7763 0.8811
No log 9.2195 378 0.7927 0.0889 0.7927 0.8903
No log 9.2683 380 0.7813 0.0749 0.7813 0.8839
No log 9.3171 382 0.8295 0.1449 0.8295 0.9108
No log 9.3659 384 0.8197 0.1449 0.8197 0.9054
No log 9.4146 386 0.7819 0.1553 0.7819 0.8842
No log 9.4634 388 0.7873 0.1449 0.7873 0.8873
No log 9.5122 390 0.7483 0.1199 0.7483 0.8650
No log 9.5610 392 0.7364 -0.0532 0.7364 0.8581
No log 9.6098 394 0.7595 -0.0469 0.7595 0.8715
No log 9.6585 396 0.7808 0.0412 0.7808 0.8836
No log 9.7073 398 0.8406 0.1329 0.8406 0.9168
No log 9.7561 400 0.8952 0.1105 0.8952 0.9462
No log 9.8049 402 0.8286 0.1001 0.8286 0.9103
No log 9.8537 404 0.7935 0.0884 0.7935 0.8908
No log 9.9024 406 0.8267 -0.0187 0.8267 0.9092
No log 9.9512 408 0.7809 0.0412 0.7809 0.8837
No log 10.0 410 0.7960 0.1001 0.7960 0.8922
No log 10.0488 412 0.7764 0.1553 0.7764 0.8811
No log 10.0976 414 0.7422 0.0318 0.7422 0.8615
No log 10.1463 416 0.7331 0.0768 0.7331 0.8562
No log 10.1951 418 0.7480 0.2105 0.7480 0.8648
No log 10.2439 420 0.7492 0.1199 0.7492 0.8655
No log 10.2927 422 0.7760 0.0840 0.7760 0.8809
No log 10.3415 424 0.8186 0.1358 0.8186 0.9047
No log 10.3902 426 0.7833 0.0851 0.7833 0.8850
No log 10.4390 428 0.7664 0.0851 0.7664 0.8754
No log 10.4878 430 0.7801 0.1440 0.7801 0.8833
No log 10.5366 432 0.7952 0.1899 0.7952 0.8917
No log 10.5854 434 0.7470 0.1199 0.7470 0.8643
No log 10.6341 436 0.7425 -0.0446 0.7425 0.8617
No log 10.6829 438 0.7479 -0.0506 0.7479 0.8648
No log 10.7317 440 0.7429 0.0768 0.7429 0.8619
No log 10.7805 442 0.7676 0.1899 0.7676 0.8762
No log 10.8293 444 0.8440 0.0409 0.8440 0.9187
No log 10.8780 446 0.7930 0.0909 0.7930 0.8905
No log 10.9268 448 0.7141 0.1259 0.7141 0.8451
No log 10.9756 450 0.7737 -0.0350 0.7737 0.8796
No log 11.0244 452 0.8325 0.0249 0.8325 0.9124
No log 11.0732 454 0.7769 -0.0446 0.7769 0.8814
No log 11.1220 456 0.7964 0.1449 0.7964 0.8924
No log 11.1707 458 0.8461 0.1196 0.8461 0.9199
No log 11.2195 460 0.8499 0.1196 0.8499 0.9219
No log 11.2683 462 0.8061 0.1001 0.8061 0.8978
No log 11.3171 464 0.7702 0.0791 0.7702 0.8776
No log 11.3659 466 0.7578 0.0828 0.7578 0.8705
No log 11.4146 468 0.7449 0.1722 0.7449 0.8631
No log 11.4634 470 0.7796 0.1001 0.7796 0.8829
No log 11.5122 472 0.8041 0.0909 0.8041 0.8967
No log 11.5610 474 0.7677 0.1599 0.7677 0.8762
No log 11.6098 476 0.7599 0.0828 0.7599 0.8717
No log 11.6585 478 0.7852 -0.0113 0.7852 0.8861
No log 11.7073 480 0.8185 0.0700 0.8185 0.9047
No log 11.7561 482 0.8435 0.0959 0.8435 0.9184
No log 11.8049 484 0.8123 0.1049 0.8123 0.9013
No log 11.8537 486 0.7737 -0.0113 0.7737 0.8796
No log 11.9024 488 0.7887 -0.0427 0.7887 0.8881
No log 11.9512 490 0.8205 -0.0280 0.8205 0.9058
No log 12.0 492 0.8518 0.0239 0.8518 0.9229
No log 12.0488 494 0.8272 -0.0025 0.8272 0.9095
No log 12.0976 496 0.8310 0.1475 0.8310 0.9116
No log 12.1463 498 0.8219 0.0670 0.8219 0.9066
0.2793 12.1951 500 0.8213 0.0861 0.8213 0.9063
0.2793 12.2439 502 0.8049 0.0101 0.8049 0.8972
0.2793 12.2927 504 0.7685 -0.0465 0.7685 0.8766
0.2793 12.3415 506 0.7526 0.1146 0.7526 0.8675
0.2793 12.3902 508 0.8200 0.0909 0.8200 0.9056
0.2793 12.4390 510 0.8441 0.0909 0.8441 0.9187
0.2793 12.4878 512 0.7901 0.1449 0.7901 0.8889
0.2793 12.5366 514 0.7844 0.0449 0.7844 0.8857
0.2793 12.5854 516 0.8288 0.0192 0.8288 0.9104
0.2793 12.6341 518 0.8122 -0.0238 0.8122 0.9012
0.2793 12.6829 520 0.7747 0.0289 0.7747 0.8802
0.2793 12.7317 522 0.7680 0.1495 0.7680 0.8764
0.2793 12.7805 524 0.7831 0.1001 0.7831 0.8849
0.2793 12.8293 526 0.8375 0.1147 0.8375 0.9152
0.2793 12.8780 528 0.8569 0.1065 0.8569 0.9257
0.2793 12.9268 530 0.8078 0.0236 0.8078 0.8988
0.2793 12.9756 532 0.8185 0.1646 0.8185 0.9047
0.2793 13.0244 534 0.8541 0.1597 0.8541 0.9242
0.2793 13.0732 536 0.8706 0.1194 0.8706 0.9331
0.2793 13.1220 538 0.8447 0.0506 0.8447 0.9191
0.2793 13.1707 540 0.7860 0.1249 0.7860 0.8866
0.2793 13.2195 542 0.8487 0.0867 0.8487 0.9212
0.2793 13.2683 544 0.8837 0.0711 0.8837 0.9401
0.2793 13.3171 546 0.7814 0.1449 0.7814 0.8840
0.2793 13.3659 548 0.7038 0.1311 0.7038 0.8389
0.2793 13.4146 550 0.7618 -0.0912 0.7618 0.8728
0.2793 13.4634 552 0.8070 -0.0280 0.8070 0.8983
0.2793 13.5122 554 0.7695 0.0376 0.7695 0.8772
0.2793 13.5610 556 0.7851 0.0741 0.7851 0.8860
0.2793 13.6098 558 0.7949 0.0308 0.7949 0.8916

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
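
For reference, a minimal sketch of loading this checkpoint for inference with the Transformers version listed above; the repo id is taken from the model name, while the single-logit scoring head and the example text are assumptions.

```python
# Hedged sketch: load the fine-tuned checkpoint and score one essay.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k8_task3_organization"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)

text = "نص مقال تجريبي لتقييم التنظيم."  # example Arabic essay text
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.squeeze())  # assumed to be a single organization score
```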