ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k17_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 (the training dataset is not specified in this card). It achieves the following results on the evaluation set:

  • Loss: 0.8553
  • Qwk (quadratic weighted kappa): 0.6619
  • Mse (mean squared error): 0.8553
  • Rmse (root mean squared error): 0.9248
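Note that Rmse is simply the square root of Mse (0.9248² ≈ 0.8553). The Qwk and Rmse values can be recomputed from per-example predictions; below is a minimal pure-Python sketch (the function names and the assumption of integer labels in 0..n_classes-1 are mine, not taken from the training script):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights, for ordinal labels 0..n_classes-1."""
    n = len(y_true)
    # observed confusion matrix
    observed = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # quadratic disagreement weights: 0 on the diagonal, growing with distance
    w = [[(i - j) ** 2 / (n_classes - 1) ** 2 for j in range(n_classes)]
         for i in range(n_classes)]
    # expected matrix under chance agreement (outer product of the marginals)
    row_marg = [sum(row) for row in observed]
    col_marg = [sum(observed[i][j] for i in range(n_classes)) for j in range(n_classes)]
    expected = [[row_marg[i] * col_marg[j] / n for j in range(n_classes)]
                for i in range(n_classes)]
    num = sum(w[i][j] * observed[i][j] for i in range(n_classes) for j in range(n_classes))
    den = sum(w[i][j] * expected[i][j] for i in range(n_classes) for j in range(n_classes))
    return 1.0 - num / den

def rmse(y_true, y_pred):
    """Root mean squared error over paired label lists."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))
```

This is equivalent to scikit-learn's `cohen_kappa_score(y_true, y_pred, weights="quadratic")`; perfect agreement gives 1.0 and chance-level agreement gives 0.0.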

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
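These settings map onto Transformers `TrainingArguments` roughly as follows. This is a sketch, not the actual training script; `output_dir` is a hypothetical name, and everything not listed above is left at library defaults:

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="arabert-task1-organization",  # assumed name, not from the card
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,       # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```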

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0157 2 6.9445 0.0057 6.9445 2.6352
No log 0.0315 4 5.0021 0.0833 5.0021 2.2365
No log 0.0472 6 3.0473 0.1034 3.0473 1.7457
No log 0.0630 8 2.1131 0.1871 2.1131 1.4537
No log 0.0787 10 1.7332 0.1982 1.7332 1.3165
No log 0.0945 12 1.6750 0.1905 1.6750 1.2942
No log 0.1102 14 1.7330 0.0943 1.7330 1.3164
No log 0.1260 16 2.1817 0.1417 2.1817 1.4770
No log 0.1417 18 2.3116 0.0876 2.3116 1.5204
No log 0.1575 20 1.7692 0.2143 1.7692 1.3301
No log 0.1732 22 1.5361 0.2037 1.5361 1.2394
No log 0.1890 24 1.4764 0.2281 1.4764 1.2151
No log 0.2047 26 1.6298 0.2261 1.6298 1.2767
No log 0.2205 28 1.5408 0.2931 1.5408 1.2413
No log 0.2362 30 1.3119 0.2569 1.3119 1.1454
No log 0.2520 32 1.2846 0.2883 1.2846 1.1334
No log 0.2677 34 1.5542 0.2712 1.5542 1.2467
No log 0.2835 36 1.8154 0.2595 1.8154 1.3474
No log 0.2992 38 1.9134 0.2571 1.9134 1.3833
No log 0.3150 40 1.4558 0.3478 1.4558 1.2066
No log 0.3307 42 1.2673 0.3393 1.2673 1.1257
No log 0.3465 44 1.2734 0.2430 1.2734 1.1284
No log 0.3622 46 1.3433 0.3393 1.3433 1.1590
No log 0.3780 48 1.3866 0.4762 1.3866 1.1776
No log 0.3937 50 1.3774 0.5079 1.3774 1.1736
No log 0.4094 52 1.3861 0.4882 1.3861 1.1773
No log 0.4252 54 1.4644 0.4113 1.4644 1.2101
No log 0.4409 56 1.6506 0.4 1.6506 1.2847
No log 0.4567 58 1.4375 0.3731 1.4375 1.1989
No log 0.4724 60 1.1421 0.5303 1.1421 1.0687
No log 0.4882 62 1.1395 0.5324 1.1395 1.0675
No log 0.5039 64 1.0970 0.5147 1.0970 1.0474
No log 0.5197 66 1.0931 0.5714 1.0931 1.0455
No log 0.5354 68 1.2828 0.4545 1.2828 1.1326
No log 0.5512 70 1.3654 0.4211 1.3654 1.1685
No log 0.5669 72 1.3108 0.4348 1.3108 1.1449
No log 0.5827 74 1.1368 0.5401 1.1368 1.0662
No log 0.5984 76 0.9938 0.6479 0.9938 0.9969
No log 0.6142 78 1.0204 0.6389 1.0204 1.0102
No log 0.6299 80 1.1463 0.5612 1.1463 1.0706
No log 0.6457 82 1.0833 0.6043 1.0833 1.0408
No log 0.6614 84 0.9395 0.6713 0.9395 0.9693
No log 0.6772 86 1.0365 0.6486 1.0365 1.0181
No log 0.6929 88 1.1942 0.5479 1.1942 1.0928
No log 0.7087 90 1.2532 0.5170 1.2532 1.1195
No log 0.7244 92 1.2254 0.5541 1.2254 1.1070
No log 0.7402 94 1.1240 0.5931 1.1240 1.0602
No log 0.7559 96 1.0873 0.5714 1.0873 1.0428
No log 0.7717 98 1.0942 0.5734 1.0942 1.0460
No log 0.7874 100 1.0836 0.5806 1.0836 1.0410
No log 0.8031 102 1.3358 0.5556 1.3358 1.1558
No log 0.8189 104 1.2922 0.5590 1.2922 1.1367
No log 0.8346 106 1.0446 0.6460 1.0446 1.0221
No log 0.8504 108 1.1190 0.6705 1.1190 1.0578
No log 0.8661 110 1.2395 0.5848 1.2395 1.1133
No log 0.8819 112 1.0972 0.6282 1.0972 1.0475
No log 0.8976 114 0.9768 0.6043 0.9768 0.9883
No log 0.9134 116 0.9798 0.6434 0.9798 0.9898
No log 0.9291 118 1.0813 0.6788 1.0813 1.0399
No log 0.9449 120 1.1775 0.5570 1.1775 1.0851
No log 0.9606 122 1.2053 0.5478 1.2053 1.0979
No log 0.9764 124 0.9980 0.6282 0.9980 0.9990
No log 0.9921 126 1.0048 0.6667 1.0048 1.0024
No log 1.0079 128 1.1806 0.5714 1.1806 1.0865
No log 1.0236 130 1.2001 0.5605 1.2001 1.0955
No log 1.0394 132 0.9930 0.7051 0.9930 0.9965
No log 1.0551 134 0.8773 0.7027 0.8773 0.9367
No log 1.0709 136 0.8478 0.7162 0.8478 0.9207
No log 1.0866 138 0.8715 0.7133 0.8715 0.9336
No log 1.1024 140 1.0763 0.6490 1.0763 1.0374
No log 1.1181 142 1.0898 0.6364 1.0898 1.0439
No log 1.1339 144 0.9280 0.6939 0.9280 0.9633
No log 1.1496 146 0.8444 0.75 0.8444 0.9189
No log 1.1654 148 0.8311 0.6950 0.8311 0.9116
No log 1.1811 150 0.8759 0.7248 0.8759 0.9359
No log 1.1969 152 0.9580 0.6792 0.9580 0.9788
No log 1.2126 154 1.2001 0.5614 1.2001 1.0955
No log 1.2283 156 1.4766 0.4485 1.4766 1.2151
No log 1.2441 158 1.3626 0.4545 1.3626 1.1673
No log 1.2598 160 1.1325 0.5735 1.1325 1.0642
No log 1.2756 162 0.9743 0.6107 0.9743 0.9870
No log 1.2913 164 0.9164 0.6715 0.9164 0.9573
No log 1.3071 166 0.9304 0.6475 0.9304 0.9646
No log 1.3228 168 1.0512 0.6203 1.0512 1.0253
No log 1.3386 170 1.0744 0.6282 1.0744 1.0365
No log 1.3543 172 0.9831 0.6241 0.9831 0.9915
No log 1.3701 174 0.9297 0.6429 0.9297 0.9642
No log 1.3858 176 0.9855 0.6525 0.9855 0.9927
No log 1.4016 178 1.0279 0.6621 1.0279 1.0139
No log 1.4173 180 0.9614 0.6531 0.9614 0.9805
No log 1.4331 182 1.0692 0.6582 1.0692 1.0340
No log 1.4488 184 1.2380 0.5882 1.2380 1.1126
No log 1.4646 186 1.1400 0.6588 1.1400 1.0677
No log 1.4803 188 1.0030 0.6746 1.0030 1.0015
No log 1.4961 190 0.9435 0.7239 0.9435 0.9713
No log 1.5118 192 1.0345 0.6667 1.0345 1.0171
No log 1.5276 194 1.0044 0.6345 1.0044 1.0022
No log 1.5433 196 1.0015 0.6131 1.0015 1.0008
No log 1.5591 198 1.0628 0.5970 1.0628 1.0309
No log 1.5748 200 1.2446 0.5563 1.2446 1.1156
No log 1.5906 202 1.3594 0.4634 1.3594 1.1659
No log 1.6063 204 1.3186 0.5176 1.3186 1.1483
No log 1.6220 206 1.0399 0.6842 1.0399 1.0198
No log 1.6378 208 0.9047 0.7237 0.9047 0.9512
No log 1.6535 210 0.9242 0.6974 0.9242 0.9614
No log 1.6693 212 1.0827 0.6164 1.0827 1.0405
No log 1.6850 214 1.1602 0.6163 1.1602 1.0771
No log 1.7008 216 1.0458 0.6303 1.0458 1.0227
No log 1.7165 218 0.9434 0.6933 0.9434 0.9713
No log 1.7323 220 0.9024 0.6806 0.9024 0.9499
No log 1.7480 222 0.9309 0.6806 0.9309 0.9648
No log 1.7638 224 0.9766 0.6525 0.9766 0.9882
No log 1.7795 226 0.9066 0.6803 0.9066 0.9521
No log 1.7953 228 0.7965 0.7451 0.7965 0.8925
No log 1.8110 230 0.7465 0.7532 0.7465 0.8640
No log 1.8268 232 0.7062 0.7821 0.7062 0.8404
No log 1.8425 234 0.7565 0.7355 0.7565 0.8698
No log 1.8583 236 0.9306 0.6928 0.9306 0.9647
No log 1.8740 238 1.2181 0.5786 1.2181 1.1037
No log 1.8898 240 1.1086 0.6316 1.1086 1.0529
No log 1.9055 242 0.9376 0.6667 0.9376 0.9683
No log 1.9213 244 0.7848 0.6812 0.7848 0.8859
No log 1.9370 246 0.7587 0.7338 0.7587 0.8710
No log 1.9528 248 0.6990 0.7347 0.6990 0.8360
No log 1.9685 250 0.8194 0.7564 0.8194 0.9052
No log 1.9843 252 1.2218 0.6023 1.2218 1.1053
No log 2.0 254 1.3332 0.6105 1.3332 1.1546
No log 2.0157 256 1.0863 0.6437 1.0863 1.0423
No log 2.0315 258 0.8642 0.7532 0.8642 0.9296
No log 2.0472 260 0.8692 0.7075 0.8692 0.9323
No log 2.0630 262 0.9436 0.7075 0.9436 0.9714
No log 2.0787 264 0.9421 0.6763 0.9421 0.9706
No log 2.0945 266 0.9459 0.6714 0.9459 0.9726
No log 2.1102 268 0.9394 0.7273 0.9394 0.9692
No log 2.1260 270 0.9458 0.7176 0.9458 0.9725
No log 2.1417 272 0.8302 0.7528 0.8302 0.9112
No log 2.1575 274 0.7111 0.7955 0.7111 0.8433
No log 2.1732 276 0.7614 0.7442 0.7614 0.8726
No log 2.1890 278 0.8900 0.7239 0.8900 0.9434
No log 2.2047 280 1.0731 0.6182 1.0731 1.0359
No log 2.2205 282 1.1527 0.5839 1.1527 1.0736
No log 2.2362 284 1.0383 0.6579 1.0383 1.0189
No log 2.2520 286 0.8380 0.7183 0.8380 0.9154
No log 2.2677 288 0.7871 0.7324 0.7871 0.8872
No log 2.2835 290 0.8325 0.7083 0.8325 0.9124
No log 2.2992 292 0.9439 0.7027 0.9439 0.9716
No log 2.3150 294 0.9091 0.7273 0.9091 0.9535
No log 2.3307 296 0.8926 0.7320 0.8926 0.9448
No log 2.3465 298 0.8343 0.7211 0.8343 0.9134
No log 2.3622 300 0.8410 0.7397 0.8410 0.9170
No log 2.3780 302 0.8656 0.6993 0.8656 0.9304
No log 2.3937 304 0.9260 0.6944 0.9260 0.9623
No log 2.4094 306 0.9286 0.6887 0.9286 0.9636
No log 2.4252 308 0.9132 0.6918 0.9132 0.9556
No log 2.4409 310 0.8428 0.7362 0.8428 0.9180
No log 2.4567 312 0.8417 0.7205 0.8417 0.9174
No log 2.4724 314 0.9036 0.6707 0.9036 0.9506
No log 2.4882 316 0.9900 0.6875 0.9900 0.9950
No log 2.5039 318 0.9683 0.6803 0.9683 0.9840
No log 2.5197 320 0.9334 0.7027 0.9334 0.9661
No log 2.5354 322 1.0240 0.6835 1.0240 1.0119
No log 2.5512 324 1.1845 0.5409 1.1845 1.0883
No log 2.5669 326 1.1205 0.5882 1.1205 1.0585
No log 2.5827 328 0.9152 0.6667 0.9152 0.9567
No log 2.5984 330 0.8096 0.6812 0.8096 0.8998
No log 2.6142 332 0.8056 0.7092 0.8056 0.8975
No log 2.6299 334 0.9253 0.6980 0.9253 0.9619
No log 2.6457 336 1.1053 0.6460 1.1053 1.0514
No log 2.6614 338 1.1096 0.6242 1.1096 1.0534
No log 2.6772 340 1.0024 0.6939 1.0024 1.0012
No log 2.6929 342 0.9265 0.7083 0.9265 0.9626
No log 2.7087 344 0.9021 0.6667 0.9021 0.9498
No log 2.7244 346 0.8755 0.6986 0.8755 0.9357
No log 2.7402 348 0.8953 0.6849 0.8953 0.9462
No log 2.7559 350 0.9220 0.6207 0.9220 0.9602
No log 2.7717 352 0.9484 0.6351 0.9484 0.9739
No log 2.7874 354 1.0279 0.6405 1.0279 1.0138
No log 2.8031 356 1.0910 0.6087 1.0910 1.0445
No log 2.8189 358 1.0663 0.6242 1.0663 1.0326
No log 2.8346 360 0.9549 0.6792 0.9549 0.9772
No log 2.8504 362 0.8354 0.7436 0.8354 0.9140
No log 2.8661 364 0.8490 0.7179 0.8490 0.9214
No log 2.8819 366 1.0306 0.6503 1.0306 1.0152
No log 2.8976 368 1.2196 0.5714 1.2196 1.1044
No log 2.9134 370 1.2035 0.6012 1.2035 1.0970
No log 2.9291 372 1.2898 0.5089 1.2898 1.1357
No log 2.9449 374 1.3188 0.5233 1.3188 1.1484
No log 2.9606 376 1.1215 0.5844 1.1215 1.0590
No log 2.9764 378 1.0083 0.6667 1.0083 1.0041
No log 2.9921 380 1.0470 0.6438 1.0470 1.0232
No log 3.0079 382 1.1649 0.5974 1.1649 1.0793
No log 3.0236 384 1.1227 0.6323 1.1227 1.0596
No log 3.0394 386 1.0025 0.7105 1.0025 1.0012
No log 3.0551 388 0.9155 0.7190 0.9155 0.9568
No log 3.0709 390 0.8995 0.7355 0.8995 0.9484
No log 3.0866 392 0.8873 0.7355 0.8873 0.9420
No log 3.1024 394 0.9320 0.6950 0.9320 0.9654
No log 3.1181 396 0.9994 0.6809 0.9994 0.9997
No log 3.1339 398 1.0812 0.6 1.0812 1.0398
No log 3.1496 400 1.1083 0.6104 1.1083 1.0527
No log 3.1654 402 0.9931 0.6755 0.9931 0.9966
No log 3.1811 404 0.9528 0.6795 0.9528 0.9761
No log 3.1969 406 0.8908 0.7261 0.8908 0.9438
No log 3.2126 408 0.9055 0.72 0.9055 0.9516
No log 3.2283 410 0.9361 0.7134 0.9361 0.9675
No log 3.2441 412 0.9880 0.6708 0.9880 0.9940
No log 3.2598 414 0.9557 0.7089 0.9557 0.9776
No log 3.2756 416 0.9359 0.7152 0.9359 0.9674
No log 3.2913 418 0.9033 0.6763 0.9033 0.9504
No log 3.3071 420 0.9194 0.6522 0.9194 0.9589
No log 3.3228 422 0.9284 0.6619 0.9284 0.9635
No log 3.3386 424 1.0333 0.6331 1.0333 1.0165
No log 3.3543 426 1.1086 0.6395 1.1086 1.0529
No log 3.3701 428 1.0628 0.6351 1.0628 1.0309
No log 3.3858 430 0.9519 0.6763 0.9519 0.9756
No log 3.4016 432 0.9053 0.6861 0.9053 0.9515
No log 3.4173 434 0.9314 0.6531 0.9314 0.9651
No log 3.4331 436 0.9945 0.6490 0.9945 0.9972
No log 3.4488 438 1.0495 0.6405 1.0495 1.0244
No log 3.4646 440 1.0951 0.6234 1.0951 1.0465
No log 3.4803 442 1.0434 0.6259 1.0434 1.0215
No log 3.4961 444 0.9499 0.6857 0.9499 0.9746
No log 3.5118 446 0.8922 0.6812 0.8922 0.9446
No log 3.5276 448 0.9072 0.6950 0.9072 0.9525
No log 3.5433 450 0.9576 0.6267 0.9576 0.9786
No log 3.5591 452 0.9543 0.6338 0.9543 0.9769
No log 3.5748 454 0.9304 0.6475 0.9304 0.9646
No log 3.5906 456 0.8784 0.6763 0.8784 0.9372
No log 3.6063 458 0.8504 0.6957 0.8504 0.9222
No log 3.6220 460 0.8312 0.6950 0.8312 0.9117
No log 3.6378 462 0.8838 0.7083 0.8838 0.9401
No log 3.6535 464 1.0235 0.6145 1.0235 1.0117
No log 3.6693 466 1.0274 0.6456 1.0274 1.0136
No log 3.6850 468 0.9083 0.7067 0.9083 0.9530
No log 3.7008 470 0.8010 0.7183 0.8010 0.8950
No log 3.7165 472 0.7675 0.7286 0.7675 0.8761
No log 3.7323 474 0.7528 0.7671 0.7528 0.8676
No log 3.7480 476 0.7965 0.7027 0.7965 0.8925
No log 3.7638 478 0.8843 0.7089 0.8843 0.9403
No log 3.7795 480 0.9104 0.6795 0.9104 0.9542
No log 3.7953 482 0.9245 0.6536 0.9245 0.9615
No log 3.8110 484 0.8945 0.6803 0.8945 0.9458
No log 3.8268 486 0.8196 0.6714 0.8196 0.9053
No log 3.8425 488 0.7788 0.6957 0.7788 0.8825
No log 3.8583 490 0.7579 0.7123 0.7579 0.8706
No log 3.8740 492 0.8616 0.6962 0.8616 0.9282
No log 3.8898 494 1.0859 0.6272 1.0859 1.0420
No log 3.9055 496 1.1643 0.5862 1.1643 1.0790
No log 3.9213 498 1.0577 0.6627 1.0577 1.0284
0.4533 3.9370 500 0.9088 0.6622 0.9088 0.9533
0.4533 3.9528 502 0.8835 0.6667 0.8835 0.9400
0.4533 3.9685 504 0.7970 0.7273 0.7970 0.8927
0.4533 3.9843 506 0.7270 0.7692 0.7270 0.8527
0.4533 4.0 508 0.7092 0.7891 0.7092 0.8422
0.4533 4.0157 510 0.7678 0.7320 0.7678 0.8763
0.4533 4.0315 512 0.8876 0.6829 0.8876 0.9421
0.4533 4.0472 514 1.0318 0.6429 1.0318 1.0158
0.4533 4.0630 516 0.9432 0.6536 0.9432 0.9712
0.4533 4.0787 518 0.8396 0.6471 0.8396 0.9163
0.4533 4.0945 520 0.8000 0.6769 0.8000 0.8944
0.4533 4.1102 522 0.8147 0.6769 0.8147 0.9026
0.4533 4.1260 524 0.8868 0.6377 0.8868 0.9417
0.4533 4.1417 526 0.9620 0.6486 0.9620 0.9808
0.4533 4.1575 528 0.9001 0.6667 0.9001 0.9487
0.4533 4.1732 530 0.8569 0.6525 0.8569 0.9257
0.4533 4.1890 532 0.7570 0.7465 0.7570 0.8701
0.4533 4.2047 534 0.7552 0.7338 0.7552 0.8690
0.4533 4.2205 536 0.7845 0.7338 0.7845 0.8857
0.4533 4.2362 538 0.8045 0.7183 0.8045 0.8969
0.4533 4.2520 540 0.8654 0.6857 0.8654 0.9303
0.4533 4.2677 542 0.9621 0.6483 0.9621 0.9809
0.4533 4.2835 544 0.9347 0.6667 0.9347 0.9668
0.4533 4.2992 546 0.8593 0.6812 0.8593 0.9270
0.4533 4.3150 548 0.7999 0.7007 0.7999 0.8944
0.4533 4.3307 550 0.8113 0.7007 0.8113 0.9007
0.4533 4.3465 552 0.8712 0.7050 0.8712 0.9334
0.4533 4.3622 554 0.9608 0.6525 0.9608 0.9802
0.4533 4.3780 556 1.1445 0.5696 1.1445 1.0698
0.4533 4.3937 558 1.1559 0.5556 1.1559 1.0751
0.4533 4.4094 560 1.0146 0.6494 1.0146 1.0073
0.4533 4.4252 562 0.8553 0.6619 0.8553 0.9248

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1