ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k15_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (an illustrative metric-computation sketch follows the list):

  • Loss: 0.7278
  • QWK (quadratic weighted kappa): 0.1199
  • MSE: 0.7278
  • RMSE: 0.8531
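
The card does not include the evaluation code, so the following is only a minimal sketch of how QWK, MSE, and RMSE could be computed with scikit-learn, assuming the organization scores are (or can be rounded to) integer labels:

```python
# Hypothetical metric computation; not the exact evaluation code used for this run.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(y_true, y_pred):
    """Compute QWK, MSE, and RMSE for predicted essay-organization scores."""
    mse = mean_squared_error(y_true, y_pred)
    rmse = np.sqrt(mse)
    # QWK expects discrete labels, so continuous predictions are rounded first.
    qwk = cohen_kappa_score(
        np.rint(y_true).astype(int),
        np.rint(y_pred).astype(int),
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse, "rmse": rmse}

# Example usage with made-up scores:
print(compute_metrics([2, 3, 1, 4], [2.2, 2.8, 1.4, 3.6]))
```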

Model description

More information needed

Intended uses & limitations

More information needed
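
No usage guidance is provided. As an unverified sketch, the checkpoint could be loaded as a sequence-classification/regression head on top of AraBERT; the head type, number of output labels, and score scale are assumptions, since the card does not state them.

```python
# Hypothetical usage sketch; head type and label count are assumptions.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k15_task3_organization"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)

essay = "..."  # an Arabic essay to score for organization
inputs = tokenizer(essay, truncation=True, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# With a single-output regression head the score is the raw logit;
# with a classification head it would be the argmax over logits instead.
print(outputs.logits)
```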

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (an illustrative TrainingArguments sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
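
As an approximate illustration only, these settings map roughly onto the Hugging Face TrainingArguments below; the original training script is not part of this card, the output directory is a placeholder, and the evaluation/logging intervals are inferred from the results table (evaluations every 2 steps, training loss first logged at step 500).

```python
# Approximate reconstruction of the reported hyperparameters; not the original script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./results",          # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    # Adam betas/epsilon are the values reported above (the Trainer defaults).
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="steps",     # assumption, based on the per-step evals in the table
    eval_steps=2,                    # assumption, based on the 2-step eval interval in the table
    logging_steps=500,               # assumption, based on the 500-step logging interval in the table
)
```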

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0260 2 3.7693 -0.0073 3.7693 1.9415
No log 0.0519 4 2.1966 0.0478 2.1966 1.4821
No log 0.0779 6 2.1994 0.0104 2.1994 1.4830
No log 0.1039 8 1.8004 0.0183 1.8004 1.3418
No log 0.1299 10 0.9326 -0.0122 0.9326 0.9657
No log 0.1558 12 0.9274 0.0486 0.9274 0.9630
No log 0.1818 14 0.7443 0.0296 0.7443 0.8627
No log 0.2078 16 0.6984 0.0374 0.6984 0.8357
No log 0.2338 18 0.9853 0.0596 0.9853 0.9926
No log 0.2597 20 1.4307 0.0815 1.4307 1.1961
No log 0.2857 22 1.1820 -0.0065 1.1820 1.0872
No log 0.3117 24 0.8184 0.1144 0.8184 0.9046
No log 0.3377 26 0.8525 0.0700 0.8525 0.9233
No log 0.3636 28 0.8241 0.0488 0.8241 0.9078
No log 0.3896 30 0.8773 0.1953 0.8773 0.9367
No log 0.4156 32 1.2139 -0.0288 1.2139 1.1018
No log 0.4416 34 0.9032 0.0652 0.9032 0.9504
No log 0.4675 36 0.9627 -0.0682 0.9627 0.9812
No log 0.4935 38 0.9185 -0.0470 0.9185 0.9584
No log 0.5195 40 0.8900 -0.0288 0.8900 0.9434
No log 0.5455 42 0.9178 0.0221 0.9178 0.9580
No log 0.5714 44 0.8936 0.0764 0.8936 0.9453
No log 0.5974 46 0.8580 0.1218 0.8580 0.9263
No log 0.6234 48 0.9096 0.1296 0.9096 0.9537
No log 0.6494 50 0.8199 0.1979 0.8199 0.9055
No log 0.6753 52 0.8587 0.0084 0.8587 0.9266
No log 0.7013 54 0.8239 0.1519 0.8239 0.9077
No log 0.7273 56 0.8295 0.1700 0.8295 0.9108
No log 0.7532 58 0.8032 0.0757 0.8032 0.8962
No log 0.7792 60 0.9924 -0.0832 0.9924 0.9962
No log 0.8052 62 0.8504 0.0517 0.8504 0.9222
No log 0.8312 64 0.9745 0.0929 0.9745 0.9872
No log 0.8571 66 0.9987 0.2809 0.9987 0.9993
No log 0.8831 68 1.1035 0.1445 1.1035 1.0505
No log 0.9091 70 1.1091 0.1240 1.1091 1.0532
No log 0.9351 72 1.4581 0.1020 1.4581 1.2075
No log 0.9610 74 1.4477 0.1319 1.4477 1.2032
No log 0.9870 76 1.2330 0.1770 1.2330 1.1104
No log 1.0130 78 1.0334 0.1641 1.0334 1.0166
No log 1.0390 80 1.1090 0.1076 1.1090 1.0531
No log 1.0649 82 0.8038 0.2345 0.8038 0.8965
No log 1.0909 84 0.7764 0.2526 0.7764 0.8811
No log 1.1169 86 0.8892 0.1150 0.8892 0.9430
No log 1.1429 88 0.6675 0.2706 0.6675 0.8170
No log 1.1688 90 0.8580 0.0576 0.8580 0.9263
No log 1.1948 92 1.1474 0.1277 1.1474 1.0712
No log 1.2208 94 0.9044 0.0576 0.9044 0.9510
No log 1.2468 96 0.6828 0.1304 0.6828 0.8263
No log 1.2727 98 0.7809 0.2087 0.7809 0.8837
No log 1.2987 100 0.8487 0.1615 0.8487 0.9212
No log 1.3247 102 0.9326 0.1318 0.9326 0.9657
No log 1.3506 104 1.0213 0.1472 1.0213 1.0106
No log 1.3766 106 1.1984 0.0989 1.1984 1.0947
No log 1.4026 108 1.4057 0.0292 1.4057 1.1856
No log 1.4286 110 1.1573 0.0160 1.1573 1.0758
No log 1.4545 112 1.0524 -0.0358 1.0524 1.0259
No log 1.4805 114 1.2078 0.0401 1.2078 1.0990
No log 1.5065 116 0.8905 -0.0345 0.8905 0.9436
No log 1.5325 118 0.7819 0.0978 0.7819 0.8843
No log 1.5584 120 0.7962 0.1974 0.7962 0.8923
No log 1.5844 122 0.8665 0.1426 0.8665 0.9309
No log 1.6104 124 0.8955 0.0962 0.8955 0.9463
No log 1.6364 126 0.9838 -0.0424 0.9838 0.9919
No log 1.6623 128 1.1312 0.0078 1.1312 1.0636
No log 1.6883 130 0.9540 0.1331 0.9540 0.9767
No log 1.7143 132 1.1042 0.0468 1.1042 1.0508
No log 1.7403 134 1.1628 0.0551 1.1628 1.0783
No log 1.7662 136 0.9938 0.1006 0.9938 0.9969
No log 1.7922 138 0.9029 0.1432 0.9029 0.9502
No log 1.8182 140 0.9263 0.0520 0.9263 0.9624
No log 1.8442 142 0.9545 0.0014 0.9545 0.9770
No log 1.8701 144 0.8689 0.1558 0.8689 0.9321
No log 1.8961 146 0.8797 0.0842 0.8797 0.9379
No log 1.9221 148 0.8820 0.1608 0.8820 0.9391
No log 1.9481 150 0.9448 0.0405 0.9448 0.9720
No log 1.9740 152 1.2164 0.0307 1.2164 1.1029
No log 2.0 154 1.0910 0.0757 1.0910 1.0445
No log 2.0260 156 0.9479 0.2001 0.9479 0.9736
No log 2.0519 158 0.9182 0.2280 0.9182 0.9582
No log 2.0779 160 0.9035 0.1007 0.9035 0.9505
No log 2.1039 162 0.8689 0.1550 0.8689 0.9322
No log 2.1299 164 0.9566 -0.0797 0.9566 0.9781
No log 2.1558 166 0.9714 -0.0797 0.9714 0.9856
No log 2.1818 168 0.8660 0.1141 0.8660 0.9306
No log 2.2078 170 0.8465 0.0522 0.8465 0.9201
No log 2.2338 172 0.8376 0.0522 0.8376 0.9152
No log 2.2597 174 0.8733 0.1144 0.8733 0.9345
No log 2.2857 176 0.9788 -0.0029 0.9788 0.9894
No log 2.3117 178 0.9431 0.2170 0.9431 0.9711
No log 2.3377 180 0.9658 0.0541 0.9658 0.9828
No log 2.3636 182 0.9039 0.1882 0.9039 0.9507
No log 2.3896 184 0.8205 0.1304 0.8205 0.9058
No log 2.4156 186 0.8198 0.1254 0.8198 0.9054
No log 2.4416 188 0.9698 -0.0094 0.9698 0.9848
No log 2.4675 190 0.8014 0.1199 0.8014 0.8952
No log 2.4935 192 0.7837 0.0532 0.7837 0.8853
No log 2.5195 194 0.7583 -0.0029 0.7583 0.8708
No log 2.5455 196 0.8939 0.0377 0.8939 0.9455
No log 2.5714 198 0.9000 0.0348 0.9000 0.9487
No log 2.5974 200 0.7564 0.0869 0.7564 0.8697
No log 2.6234 202 0.8756 0.0319 0.8756 0.9357
No log 2.6494 204 0.8450 0.1434 0.8450 0.9192
No log 2.6753 206 0.7773 0.1659 0.7773 0.8816
No log 2.7013 208 0.8560 0.0959 0.8560 0.9252
No log 2.7273 210 0.8094 0.1541 0.8094 0.8997
No log 2.7532 212 0.7609 0.0869 0.7609 0.8723
No log 2.7792 214 0.7585 -0.0465 0.7585 0.8709
No log 2.8052 216 0.7554 0.0394 0.7554 0.8692
No log 2.8312 218 0.8611 0.0956 0.8611 0.9279
No log 2.8571 220 1.1415 0.0979 1.1415 1.0684
No log 2.8831 222 1.0515 0.0101 1.0515 1.0254
No log 2.9091 224 0.8287 0.0639 0.8287 0.9103
No log 2.9351 226 0.8029 0.0101 0.8029 0.8961
No log 2.9610 228 0.8217 -0.0257 0.8217 0.9065
No log 2.9870 230 0.7665 0.1199 0.7665 0.8755
No log 3.0130 232 0.7557 0.1199 0.7557 0.8693
No log 3.0390 234 0.8069 0.0680 0.8069 0.8983
No log 3.0649 236 0.8539 0.0123 0.8539 0.9240
No log 3.0909 238 0.7676 0.0814 0.7676 0.8761
No log 3.1169 240 0.7667 -0.0059 0.7667 0.8756
No log 3.1429 242 0.8001 0.1541 0.8001 0.8945
No log 3.1688 244 0.8185 0.1943 0.8185 0.9047
No log 3.1948 246 0.7814 0.2009 0.7814 0.8839
No log 3.2208 248 0.7968 0.2009 0.7968 0.8926
No log 3.2468 250 0.7419 0.2122 0.7419 0.8613
No log 3.2727 252 0.7509 0.0436 0.7509 0.8665
No log 3.2987 254 0.7764 0.2999 0.7764 0.8812
No log 3.3247 256 0.8448 0.0920 0.8448 0.9191
No log 3.3506 258 0.9258 0.0805 0.9258 0.9622
No log 3.3766 260 0.8724 0.0484 0.8724 0.9340
No log 3.4026 262 0.8486 0.1857 0.8486 0.9212
No log 3.4286 264 0.7704 0.3199 0.7704 0.8778
No log 3.4545 266 0.7606 0.1004 0.7606 0.8721
No log 3.4805 268 0.7449 0.1599 0.7449 0.8631
No log 3.5065 270 0.7900 0.1882 0.7900 0.8888
No log 3.5325 272 0.8043 0.1882 0.8043 0.8968
No log 3.5584 274 0.8797 0.0016 0.8797 0.9379
No log 3.5844 276 0.8230 0.1379 0.8230 0.9072
No log 3.6104 278 0.7519 0.0414 0.7519 0.8672
No log 3.6364 280 0.7311 0.0541 0.7311 0.8551
No log 3.6623 282 0.7078 0.0374 0.7078 0.8413
No log 3.6883 284 0.7342 0.1675 0.7342 0.8569
No log 3.7143 286 0.7391 0.2078 0.7391 0.8597
No log 3.7403 288 0.7069 0.1318 0.7069 0.8407
No log 3.7662 290 0.7126 0.0355 0.7126 0.8442
No log 3.7922 292 0.7496 0.2009 0.7496 0.8658
No log 3.8182 294 0.7719 0.1001 0.7719 0.8786
No log 3.8442 296 0.7533 0.1196 0.7533 0.8679
No log 3.8701 298 0.7961 0.1282 0.7961 0.8922
No log 3.8961 300 0.7873 0.0205 0.7873 0.8873
No log 3.9221 302 0.8637 0.0692 0.8637 0.9293
No log 3.9481 304 0.7526 0.0628 0.7526 0.8675
No log 3.9740 306 0.7004 0.0814 0.7004 0.8369
No log 4.0 308 0.6845 0.0814 0.6845 0.8273
No log 4.0260 310 0.7146 0.0214 0.7146 0.8454
No log 4.0519 312 0.8731 0.1024 0.8731 0.9344
No log 4.0779 314 0.8705 0.1147 0.8705 0.9330
No log 4.1039 316 0.8422 0.1580 0.8422 0.9177
No log 4.1299 318 0.9585 0.1537 0.9585 0.9790
No log 4.1558 320 0.8779 0.1215 0.8779 0.9370
No log 4.1818 322 0.9346 0.0762 0.9346 0.9667
No log 4.2078 324 0.9542 -0.0114 0.9542 0.9768
No log 4.2338 326 0.8054 0.0639 0.8054 0.8974
No log 4.2597 328 0.7475 0.0 0.7475 0.8646
No log 4.2857 330 0.7178 0.0967 0.7178 0.8472
No log 4.3117 332 0.7116 0.1758 0.7116 0.8436
No log 4.3377 334 0.7508 0.1148 0.7508 0.8665
No log 4.3636 336 0.7790 0.0588 0.7790 0.8826
No log 4.3896 338 0.7818 0.0680 0.7818 0.8842
No log 4.4156 340 0.8105 0.1095 0.8105 0.9003
No log 4.4416 342 0.8242 0.1094 0.8242 0.9078
No log 4.4675 344 0.9132 -0.0008 0.9132 0.9556
No log 4.4935 346 0.9017 0.0837 0.9017 0.9496
No log 4.5195 348 0.8376 0.0690 0.8376 0.9152
No log 4.5455 350 0.8415 0.0152 0.8415 0.9173
No log 4.5714 352 0.8225 0.0183 0.8225 0.9069
No log 4.5974 354 0.7716 -0.0125 0.7716 0.8784
No log 4.6234 356 0.7471 0.0296 0.7471 0.8643
No log 4.6494 358 0.8384 0.0549 0.8384 0.9156
No log 4.6753 360 0.8609 0.0409 0.8609 0.9279
No log 4.7013 362 0.7701 0.0714 0.7701 0.8775
No log 4.7273 364 0.7816 0.0821 0.7816 0.8841
No log 4.7532 366 0.8673 -0.0608 0.8673 0.9313
No log 4.7792 368 0.7995 0.0376 0.7995 0.8941
No log 4.8052 370 0.8691 0.1286 0.8691 0.9323
No log 4.8312 372 0.8577 0.0504 0.8577 0.9261
No log 4.8571 374 0.8143 0.0723 0.8143 0.9024
No log 4.8831 376 0.7971 0.0783 0.7971 0.8928
No log 4.9091 378 0.7607 0.1254 0.7607 0.8722
No log 4.9351 380 0.7645 0.1097 0.7645 0.8744
No log 4.9610 382 0.8147 0.0953 0.8147 0.9026
No log 4.9870 384 0.8272 0.0953 0.8272 0.9095
No log 5.0130 386 0.7746 0.1047 0.7746 0.8801
No log 5.0390 388 0.7685 0.1148 0.7685 0.8767
No log 5.0649 390 0.7914 -0.0446 0.7914 0.8896
No log 5.0909 392 0.8728 0.0710 0.8728 0.9342
No log 5.1169 394 0.8198 0.0123 0.8198 0.9054
No log 5.1429 396 0.8055 0.1095 0.8055 0.8975
No log 5.1688 398 0.8541 0.1701 0.8541 0.9242
No log 5.1948 400 0.7826 0.2078 0.7826 0.8846
No log 5.2208 402 0.7377 0.1318 0.7377 0.8589
No log 5.2468 404 0.7469 0.1196 0.7469 0.8642
No log 5.2727 406 0.8017 0.2078 0.8017 0.8954
No log 5.2987 408 0.8101 0.1599 0.8101 0.9000
No log 5.3247 410 0.8168 0.1141 0.8168 0.9038
No log 5.3506 412 0.8298 0.1141 0.8298 0.9109
No log 5.3766 414 0.8450 0.0920 0.8450 0.9193
No log 5.4026 416 0.8522 0.1633 0.8522 0.9231
No log 5.4286 418 0.7591 0.1495 0.7591 0.8713
No log 5.4545 420 0.6806 0.1444 0.6806 0.8250
No log 5.4805 422 0.6720 0.0914 0.6720 0.8198
No log 5.5065 424 0.6749 0.1444 0.6749 0.8215
No log 5.5325 426 0.7927 0.1440 0.7927 0.8903
No log 5.5584 428 0.9471 0.0267 0.9471 0.9732
No log 5.5844 430 0.8615 0.0793 0.8615 0.9281
No log 5.6104 432 0.7252 0.1318 0.7252 0.8516
No log 5.6364 434 0.7529 0.0918 0.7529 0.8677
No log 5.6623 436 0.7398 0.0814 0.7398 0.8601
No log 5.6883 438 0.7479 0.1254 0.7479 0.8648
No log 5.7143 440 0.8040 0.1097 0.8040 0.8967
No log 5.7403 442 0.9161 0.0304 0.9161 0.9571
No log 5.7662 444 0.8733 -0.0425 0.8733 0.9345
No log 5.7922 446 0.8234 0.0600 0.8234 0.9074
No log 5.8182 448 0.8411 0.0041 0.8411 0.9171
No log 5.8442 450 0.8925 0.0304 0.8925 0.9447
No log 5.8701 452 0.8024 0.0793 0.8024 0.8958
No log 5.8961 454 0.7581 0.1097 0.7581 0.8707
No log 5.9221 456 0.7628 0.1387 0.7628 0.8734
No log 5.9481 458 0.7405 0.1612 0.7405 0.8605
No log 5.9740 460 0.7337 0.1612 0.7337 0.8566
No log 6.0 462 0.7057 0.1318 0.7057 0.8401
No log 6.0260 464 0.7182 0.1259 0.7182 0.8475
No log 6.0519 466 0.7841 0.0913 0.7841 0.8855
No log 6.0779 468 0.7755 0.0549 0.7755 0.8806
No log 6.1039 470 0.7459 0.1146 0.7459 0.8637
No log 6.1299 472 0.7796 0.1096 0.7796 0.8829
No log 6.1558 474 0.8066 0.0123 0.8066 0.8981
No log 6.1818 476 0.8804 -0.0033 0.8804 0.9383
No log 6.2078 478 0.8755 -0.0033 0.8755 0.9357
No log 6.2338 480 0.8207 0.0909 0.8207 0.9059
No log 6.2597 482 0.7485 0.0680 0.7485 0.8652
No log 6.2857 484 0.7420 0.1047 0.7420 0.8614
No log 6.3117 486 0.8140 0.0442 0.8140 0.9022
No log 6.3377 488 0.8488 0.0362 0.8488 0.9213
No log 6.3636 490 0.7881 0.0152 0.7881 0.8878
No log 6.3896 492 0.8071 0.2103 0.8071 0.8984
No log 6.4156 494 0.8559 0.2046 0.8559 0.9252
No log 6.4416 496 0.7844 0.0428 0.7844 0.8857
No log 6.4675 498 0.7873 0.0999 0.7873 0.8873
0.2916 6.4935 500 0.9954 0.1246 0.9954 0.9977
0.2916 6.5195 502 1.1260 0.1719 1.1260 1.0611
0.2916 6.5455 504 0.9580 0.0557 0.9580 0.9788
0.2916 6.5714 506 0.7656 0.1943 0.7656 0.8750
0.2916 6.5974 508 0.7135 0.0814 0.7135 0.8447
0.2916 6.6234 510 0.7248 0.0914 0.7248 0.8513
0.2916 6.6494 512 0.7539 0.0414 0.7539 0.8682
0.2916 6.6753 514 0.7716 0.1675 0.7716 0.8784
0.2916 6.7013 516 0.8989 0.1027 0.8989 0.9481
0.2916 6.7273 518 0.9168 0.1027 0.9168 0.9575
0.2916 6.7532 520 0.8035 0.2009 0.8035 0.8964
0.2916 6.7792 522 0.7390 0.1807 0.7390 0.8597
0.2916 6.8052 524 0.7127 0.1371 0.7127 0.8442
0.2916 6.8312 526 0.6955 0.2339 0.6955 0.8340
0.2916 6.8571 528 0.6948 0.1758 0.6948 0.8335
0.2916 6.8831 530 0.7293 0.1440 0.7293 0.8540
0.2916 6.9091 532 0.7448 0.1336 0.7448 0.8630
0.2916 6.9351 534 0.7640 0.1336 0.7640 0.8740
0.2916 6.9610 536 0.8355 0.0842 0.8355 0.9140
0.2916 6.9870 538 0.8811 0.2053 0.8811 0.9387
0.2916 7.0130 540 0.9087 0.1641 0.9087 0.9533
0.2916 7.0390 542 0.8794 0.1455 0.8794 0.9378
0.2916 7.0649 544 0.8743 0.0799 0.8743 0.9351
0.2916 7.0909 546 0.8672 0.0409 0.8672 0.9312
0.2916 7.1169 548 0.8147 0.0512 0.8147 0.9026
0.2916 7.1429 550 0.7474 0.1146 0.7474 0.8645
0.2916 7.1688 552 0.7278 0.1199 0.7278 0.8531

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
