ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k13_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch for reproducing these metrics follows the list):

  • Loss: 0.9430
  • Qwk: 0.5954
  • Mse: 0.9430
  • Rmse: 0.9711
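
For reference, the sketch below shows one way Qwk, Mse, and Rmse values like these can be computed from gold scores and model predictions. The example values, the integer score scale, and the rounding of predictions before computing QWK are assumptions of this sketch, not details documented on the card.

```python
# Hedged sketch: reproduces QWK / MSE / RMSE from gold scores and predictions.
# The example values and the rounding step are illustrative assumptions.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([3, 2, 4, 1, 3])            # gold organization scores (illustrative)
y_pred = np.array([2.6, 2.1, 3.8, 1.4, 3.2])  # model outputs (illustrative)

mse = mean_squared_error(y_true, y_pred)
rmse = float(np.sqrt(mse))
qwk = cohen_kappa_score(y_true, np.rint(y_pred).astype(int), weights="quadratic")

print(f"Qwk: {qwk:.4f}  Mse: {mse:.4f}  Rmse: {rmse:.4f}")
```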

Model description

More information needed

Intended uses & limitations

More information needed
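
No usage guidance is documented, but a minimal inference sketch is given below. It assumes the checkpoint loads as a single-output sequence-classification (regression) head, which is consistent with the MSE/RMSE metrics reported above; the score range and label semantics are not documented and remain assumptions.

```python
# Hedged inference sketch: assumes a single-output regression head.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k13_task1_organization"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

essay = "..."  # the Arabic essay text to score
inputs = tokenizer(essay, return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()

print(f"Predicted organization score: {score:.2f}")
```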

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
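
The sketch below mirrors these values in a transformers Trainer configuration. Only the hyperparameters listed above come from the card; the regression head (num_labels=1), the output path, and the dataset wiring are assumptions.

```python
# Hedged configuration sketch: only the hyperparameter values listed above
# come from this card; everything marked as an assumption is illustrative.
from transformers import AutoModelForSequenceClassification, AutoTokenizer, TrainingArguments

base = "aubmindlab/bert-base-arabertv02"
tokenizer = AutoTokenizer.from_pretrained(base)
model = AutoModelForSequenceClassification.from_pretrained(base, num_labels=1)  # regression head: assumption

args = TrainingArguments(
    output_dir="arabert_task1_organization",  # assumption: output path not stated on the card
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 matches the default AdamW settings.
)

# A Trainer would wrap `model` and `args` together with the (undocumented) train/eval datasets:
# Trainer(model=model, args=args, train_dataset=..., eval_dataset=...).train()
```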

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0204 2 6.9612 0.0176 6.9612 2.6384
No log 0.0408 4 4.5401 0.0488 4.5401 2.1308
No log 0.0612 6 3.0486 0.0702 3.0486 1.7460
No log 0.0816 8 2.0827 0.2137 2.0827 1.4432
No log 0.1020 10 1.6611 0.1698 1.6611 1.2888
No log 0.1224 12 1.5564 0.2593 1.5564 1.2475
No log 0.1429 14 1.5734 0.3652 1.5734 1.2544
No log 0.1633 16 1.5668 0.2222 1.5668 1.2517
No log 0.1837 18 1.5652 0.2222 1.5652 1.2511
No log 0.2041 20 1.5641 0.2342 1.5641 1.2506
No log 0.2245 22 1.5827 0.2632 1.5827 1.2581
No log 0.2449 24 1.5734 0.3009 1.5734 1.2544
No log 0.2653 26 1.5317 0.3932 1.5317 1.2376
No log 0.2857 28 1.3038 0.4480 1.3038 1.1419
No log 0.3061 30 1.4560 0.4872 1.4560 1.2067
No log 0.3265 32 2.0050 0.4024 2.0050 1.4160
No log 0.3469 34 2.0634 0.3735 2.0634 1.4364
No log 0.3673 36 1.9221 0.3949 1.9221 1.3864
No log 0.3878 38 1.3034 0.5106 1.3034 1.1417
No log 0.4082 40 1.1561 0.4390 1.1561 1.0752
No log 0.4286 42 1.2599 0.4065 1.2599 1.1225
No log 0.4490 44 1.2299 0.3934 1.2299 1.1090
No log 0.4694 46 1.1355 0.3761 1.1355 1.0656
No log 0.4898 48 1.1389 0.4959 1.1389 1.0672
No log 0.5102 50 1.1653 0.5645 1.1653 1.0795
No log 0.5306 52 1.1411 0.5645 1.1411 1.0682
No log 0.5510 54 1.1291 0.5440 1.1291 1.0626
No log 0.5714 56 1.2735 0.4918 1.2735 1.1285
No log 0.5918 58 1.1338 0.4603 1.1338 1.0648
No log 0.6122 60 1.0101 0.6131 1.0101 1.0050
No log 0.6327 62 0.9779 0.6197 0.9779 0.9889
No log 0.6531 64 0.8922 0.6087 0.8922 0.9446
No log 0.6735 66 0.8660 0.6176 0.8660 0.9306
No log 0.6939 68 0.8297 0.6423 0.8297 0.9109
No log 0.7143 70 0.8751 0.6571 0.8751 0.9355
No log 0.7347 72 1.1365 0.5882 1.1365 1.0661
No log 0.7551 74 1.1834 0.5352 1.1834 1.0879
No log 0.7755 76 0.9651 0.6331 0.9651 0.9824
No log 0.7959 78 0.8819 0.6757 0.8819 0.9391
No log 0.8163 80 0.8421 0.7013 0.8421 0.9177
No log 0.8367 82 0.8380 0.6883 0.8380 0.9154
No log 0.8571 84 0.9141 0.6331 0.9141 0.9561
No log 0.8776 86 0.9848 0.6131 0.9848 0.9924
No log 0.8980 88 0.8942 0.6389 0.8942 0.9456
No log 0.9184 90 0.8995 0.6933 0.8995 0.9484
No log 0.9388 92 0.9211 0.6846 0.9211 0.9597
No log 0.9592 94 0.8101 0.6933 0.8101 0.9001
No log 0.9796 96 0.7871 0.7123 0.7871 0.8872
No log 1.0 98 0.9230 0.6405 0.9230 0.9607
No log 1.0204 100 1.0321 0.6203 1.0321 1.0159
No log 1.0408 102 0.9544 0.6241 0.9544 0.9769
No log 1.0612 104 0.9500 0.6438 0.9500 0.9747
No log 1.0816 106 0.9104 0.6711 0.9104 0.9541
No log 1.1020 108 0.9188 0.6579 0.9188 0.9585
No log 1.1224 110 0.8077 0.6842 0.8077 0.8987
No log 1.1429 112 0.7655 0.7179 0.7655 0.8749
No log 1.1633 114 0.7982 0.7436 0.7982 0.8934
No log 1.1837 116 0.8701 0.6941 0.8701 0.9328
No log 1.2041 118 1.1069 0.5988 1.1069 1.0521
No log 1.2245 120 1.2199 0.5789 1.2199 1.1045
No log 1.2449 122 1.1496 0.5547 1.1496 1.0722
No log 1.2653 124 1.0237 0.5693 1.0237 1.0118
No log 1.2857 126 0.9331 0.6447 0.9331 0.9660
No log 1.3061 128 0.8174 0.7226 0.8174 0.9041
No log 1.3265 130 0.8487 0.6914 0.8487 0.9213
No log 1.3469 132 0.8560 0.7051 0.8560 0.9252
No log 1.3673 134 0.8003 0.7297 0.8003 0.8946
No log 1.3878 136 0.7137 0.7724 0.7137 0.8448
No log 1.4082 138 0.6627 0.7397 0.6627 0.8141
No log 1.4286 140 0.7616 0.7950 0.7616 0.8727
No log 1.4490 142 1.0208 0.6471 1.0208 1.0104
No log 1.4694 144 1.2032 0.6145 1.2032 1.0969
No log 1.4898 146 1.0772 0.6234 1.0772 1.0379
No log 1.5102 148 0.8892 0.6620 0.8892 0.9430
No log 1.5306 150 0.8636 0.6377 0.8636 0.9293
No log 1.5510 152 0.7937 0.6809 0.7937 0.8909
No log 1.5714 154 0.6631 0.7733 0.6631 0.8143
No log 1.5918 156 0.6899 0.7871 0.6899 0.8306
No log 1.6122 158 0.8681 0.7081 0.8681 0.9317
No log 1.6327 160 1.3667 0.5746 1.3667 1.1690
No log 1.6531 162 1.6387 0.5532 1.6387 1.2801
No log 1.6735 164 1.3808 0.5714 1.3808 1.1751
No log 1.6939 166 1.0348 0.6154 1.0348 1.0172
No log 1.7143 168 0.7911 0.7162 0.7911 0.8894
No log 1.7347 170 0.7281 0.7260 0.7281 0.8533
No log 1.7551 172 0.7719 0.7211 0.7719 0.8786
No log 1.7755 174 0.9439 0.6405 0.9439 0.9716
No log 1.7959 176 0.9992 0.6081 0.9992 0.9996
No log 1.8163 178 0.9017 0.6331 0.9017 0.9496
No log 1.8367 180 0.8720 0.6857 0.8720 0.9338
No log 1.8571 182 0.7916 0.7211 0.7916 0.8897
No log 1.8776 184 0.7245 0.7417 0.7245 0.8512
No log 1.8980 186 0.7080 0.7673 0.7080 0.8414
No log 1.9184 188 0.8380 0.7375 0.8380 0.9154
No log 1.9388 190 0.9404 0.6790 0.9404 0.9698
No log 1.9592 192 1.0152 0.6234 1.0152 1.0076
No log 1.9796 194 0.8915 0.6887 0.8915 0.9442
No log 2.0 196 0.8540 0.7067 0.8540 0.9241
No log 2.0204 198 0.8711 0.6974 0.8711 0.9333
No log 2.0408 200 0.9902 0.6433 0.9902 0.9951
No log 2.0612 202 1.1130 0.6207 1.1130 1.0550
No log 2.0816 204 1.0085 0.6474 1.0085 1.0043
No log 2.1020 206 0.7421 0.7432 0.7421 0.8615
No log 2.1224 208 0.6972 0.6761 0.6972 0.8350
No log 2.1429 210 0.7589 0.6892 0.7589 0.8712
No log 2.1633 212 0.7482 0.6944 0.7482 0.8650
No log 2.1837 214 0.8024 0.7051 0.8024 0.8958
No log 2.2041 216 1.0201 0.6824 1.0201 1.0100
No log 2.2245 218 1.2287 0.5882 1.2287 1.1085
No log 2.2449 220 1.2047 0.5789 1.2047 1.0976
No log 2.2653 222 1.1684 0.5113 1.1684 1.0809
No log 2.2857 224 1.1539 0.5000 1.1539 1.0742
No log 2.3061 226 1.0818 0.5634 1.0818 1.0401
No log 2.3265 228 1.1329 0.5926 1.1329 1.0644
No log 2.3469 230 1.0674 0.5926 1.0674 1.0332
No log 2.3673 232 0.8650 0.6795 0.8650 0.9301
No log 2.3878 234 0.7556 0.7534 0.7556 0.8692
No log 2.4082 236 0.7767 0.7273 0.7767 0.8813
No log 2.4286 238 0.8178 0.7042 0.8178 0.9043
No log 2.4490 240 0.8483 0.7310 0.8483 0.9210
No log 2.4694 242 0.8318 0.7297 0.8318 0.9120
No log 2.4898 244 0.7939 0.7682 0.7939 0.8910
No log 2.5102 246 0.7405 0.7821 0.7405 0.8605
No log 2.5306 248 0.7857 0.7632 0.7857 0.8864
No log 2.5510 250 0.8861 0.7226 0.8861 0.9413
No log 2.5714 252 0.9303 0.6939 0.9303 0.9645
No log 2.5918 254 0.9091 0.6939 0.9091 0.9535
No log 2.6122 256 0.8475 0.7383 0.8475 0.9206
No log 2.6327 258 0.7823 0.7389 0.7823 0.8845
No log 2.6531 260 0.7544 0.7683 0.7544 0.8686
No log 2.6735 262 0.8414 0.7429 0.8414 0.9173
No log 2.6939 264 0.9694 0.7065 0.9694 0.9846
No log 2.7143 266 0.9391 0.7033 0.9391 0.9690
No log 2.7347 268 0.7694 0.7564 0.7694 0.8771
No log 2.7551 270 0.6646 0.7703 0.6646 0.8152
No log 2.7755 272 0.7449 0.7361 0.7449 0.8631
No log 2.7959 274 0.8486 0.6993 0.8486 0.9212
No log 2.8163 276 0.9207 0.6619 0.9207 0.9596
No log 2.8367 278 0.9257 0.6619 0.9257 0.9622
No log 2.8571 280 0.8923 0.6434 0.8923 0.9446
No log 2.8776 282 0.8111 0.6968 0.8111 0.9006
No log 2.8980 284 0.7023 0.7432 0.7023 0.8380
No log 2.9184 286 0.7217 0.7211 0.7217 0.8495
No log 2.9388 288 0.8154 0.7261 0.8154 0.9030
No log 2.9592 290 0.8263 0.6892 0.8263 0.9090
No log 2.9796 292 0.9050 0.6525 0.9050 0.9513
No log 3.0 294 0.9476 0.6429 0.9476 0.9734
No log 3.0204 296 0.9127 0.6567 0.9127 0.9554
No log 3.0408 298 0.8764 0.6567 0.8764 0.9362
No log 3.0612 300 0.8049 0.6857 0.8049 0.8972
No log 3.0816 302 0.6718 0.7692 0.6718 0.8196
No log 3.1020 304 0.6337 0.7692 0.6337 0.7961
No log 3.1224 306 0.6824 0.7799 0.6824 0.8261
No log 3.1429 308 0.7088 0.8095 0.7088 0.8419
No log 3.1633 310 0.7065 0.8000 0.7065 0.8405
No log 3.1837 312 0.6571 0.7974 0.6571 0.8106
No log 3.2041 314 0.6449 0.7755 0.6449 0.8030
No log 3.2245 316 0.7105 0.7361 0.7105 0.8429
No log 3.2449 318 0.8515 0.7020 0.8515 0.9228
No log 3.2653 320 0.8143 0.7020 0.8143 0.9024
No log 3.2857 322 0.6912 0.7361 0.6912 0.8314
No log 3.3061 324 0.6331 0.7671 0.6331 0.7957
No log 3.3265 326 0.6765 0.7516 0.6765 0.8225
No log 3.3469 328 0.7499 0.7125 0.7499 0.8660
No log 3.3673 330 0.7946 0.7134 0.7946 0.8914
No log 3.3878 332 0.6751 0.7771 0.6751 0.8216
No log 3.4082 334 0.5955 0.7947 0.5955 0.7717
No log 3.4286 336 0.5851 0.7947 0.5851 0.7649
No log 3.4490 338 0.6426 0.7838 0.6426 0.8016
No log 3.4694 340 0.8118 0.6944 0.8118 0.9010
No log 3.4898 342 0.8178 0.6974 0.8178 0.9043
No log 3.5102 344 0.7388 0.7692 0.7388 0.8595
No log 3.5306 346 0.6135 0.8000 0.6135 0.7832
No log 3.5510 348 0.6104 0.7947 0.6104 0.7812
No log 3.5714 350 0.6448 0.7919 0.6448 0.8030
No log 3.5918 352 0.8214 0.6968 0.8214 0.9063
No log 3.6122 354 1.0266 0.6782 1.0266 1.0132
No log 3.6327 356 1.0393 0.6629 1.0393 1.0195
No log 3.6531 358 0.8695 0.7125 0.8695 0.9325
No log 3.6735 360 0.8426 0.7125 0.8426 0.9179
No log 3.6939 362 0.7647 0.7273 0.7647 0.8745
No log 3.7143 364 0.7697 0.7143 0.7697 0.8774
No log 3.7347 366 0.8302 0.6906 0.8302 0.9112
No log 3.7551 368 0.8883 0.6443 0.8883 0.9425
No log 3.7755 370 0.9054 0.6711 0.9054 0.9515
No log 3.7959 372 0.8888 0.6853 0.8888 0.9427
No log 3.8163 374 0.9093 0.6761 0.9093 0.9536
No log 3.8367 376 0.8358 0.6906 0.8358 0.9142
No log 3.8571 378 0.7997 0.7007 0.7997 0.8943
No log 3.8776 380 0.7903 0.7007 0.7903 0.8890
No log 3.8980 382 0.8118 0.6765 0.8118 0.9010
No log 3.9184 384 0.8543 0.6812 0.8543 0.9243
No log 3.9388 386 0.8862 0.6579 0.8862 0.9414
No log 3.9592 388 0.8641 0.7073 0.8641 0.9296
No log 3.9796 390 0.7432 0.7205 0.7432 0.8621
No log 4.0 392 0.6634 0.7799 0.6634 0.8145
No log 4.0204 394 0.6095 0.7821 0.6095 0.7807
No log 4.0408 396 0.6337 0.7871 0.6337 0.7961
No log 4.0612 398 0.7219 0.7975 0.7219 0.8496
No log 4.0816 400 0.9140 0.6795 0.9140 0.9560
No log 4.1020 402 1.0206 0.6289 1.0206 1.0102
No log 4.1224 404 1.0069 0.6447 1.0069 1.0034
No log 4.1429 406 0.8783 0.6567 0.8783 0.9372
No log 4.1633 408 0.7987 0.7007 0.7987 0.8937
No log 4.1837 410 0.7280 0.7286 0.7280 0.8532
No log 4.2041 412 0.6893 0.7397 0.6893 0.8302
No log 4.2245 414 0.6860 0.7578 0.6860 0.8282
No log 4.2449 416 0.6517 0.7929 0.6517 0.8073
No log 4.2653 418 0.5759 0.8092 0.5759 0.7588
No log 4.2857 420 0.5605 0.8049 0.5605 0.7487
No log 4.3061 422 0.5945 0.8000 0.5945 0.7711
No log 4.3265 424 0.6992 0.7746 0.6992 0.8362
No log 4.3469 426 0.7166 0.7701 0.7166 0.8465
No log 4.3673 428 0.6286 0.7901 0.6286 0.7928
No log 4.3878 430 0.5584 0.8101 0.5584 0.7473
No log 4.4082 432 0.5682 0.8101 0.5682 0.7538
No log 4.4286 434 0.6413 0.7671 0.6413 0.8008
No log 4.4490 436 0.8391 0.7006 0.8391 0.9160
No log 4.4694 438 1.0285 0.6289 1.0285 1.0142
No log 4.4898 440 1.0086 0.5833 1.0086 1.0043
No log 4.5102 442 0.9575 0.6286 0.9575 0.9785
No log 4.5306 444 0.8866 0.6515 0.8866 0.9416
No log 4.5510 446 0.8789 0.6525 0.8789 0.9375
No log 4.5714 448 0.8167 0.7000 0.8167 0.9037
No log 4.5918 450 0.7550 0.7376 0.7550 0.8689
No log 4.6122 452 0.7397 0.7246 0.7397 0.8601
No log 4.6327 454 0.7314 0.7376 0.7314 0.8552
No log 4.6531 456 0.7094 0.7376 0.7094 0.8422
No log 4.6735 458 0.7450 0.7376 0.7450 0.8631
No log 4.6939 460 0.8629 0.6522 0.8629 0.9289
No log 4.7143 462 0.9611 0.6434 0.9611 0.9803
No log 4.7347 464 0.9882 0.6259 0.9882 0.9941
No log 4.7551 466 0.8937 0.6567 0.8937 0.9454
No log 4.7755 468 0.7981 0.6812 0.7981 0.8934
No log 4.7959 470 0.7591 0.6812 0.7591 0.8713
No log 4.8163 472 0.7563 0.6906 0.7563 0.8697
No log 4.8367 474 0.7957 0.7320 0.7957 0.8920
No log 4.8571 476 0.7940 0.7059 0.7940 0.8910
No log 4.8776 478 0.8145 0.7051 0.8145 0.9025
No log 4.8980 480 0.8204 0.7059 0.8204 0.9057
No log 4.9184 482 0.8118 0.7179 0.8118 0.9010
No log 4.9388 484 0.8693 0.7051 0.8693 0.9324
No log 4.9592 486 0.8986 0.6842 0.8986 0.9479
No log 4.9796 488 0.8359 0.7105 0.8359 0.9143
No log 5.0 490 0.7456 0.6993 0.7456 0.8635
No log 5.0204 492 0.7232 0.7234 0.7232 0.8504
No log 5.0408 494 0.7646 0.7007 0.7646 0.8744
No log 5.0612 496 0.8214 0.6866 0.8214 0.9063
No log 5.0816 498 0.9186 0.6528 0.9186 0.9584
0.4533 5.1020 500 0.9157 0.6528 0.9157 0.9569
0.4533 5.1224 502 0.8487 0.6897 0.8487 0.9212
0.4533 5.1429 504 0.8215 0.7007 0.8215 0.9064
0.4533 5.1633 506 0.8065 0.7007 0.8065 0.8980
0.4533 5.1837 508 0.8424 0.6993 0.8424 0.9178
0.4533 5.2041 510 0.9089 0.6525 0.9089 0.9533
0.4533 5.2245 512 0.8877 0.6525 0.8877 0.9422
0.4533 5.2449 514 0.8230 0.6986 0.8230 0.9072
0.4533 5.2653 516 0.7959 0.7059 0.7959 0.8921
0.4533 5.2857 518 0.8210 0.7215 0.8210 0.9061
0.4533 5.3061 520 0.7836 0.6755 0.7836 0.8852
0.4533 5.3265 522 0.8222 0.6250 0.8222 0.9068
0.4533 5.3469 524 0.9226 0.6623 0.9226 0.9605
0.4533 5.3673 526 0.9479 0.6164 0.9479 0.9736
0.4533 5.3878 528 0.9076 0.6277 0.9076 0.9527
0.4533 5.4082 530 0.8738 0.6119 0.8738 0.9347
0.4533 5.4286 532 0.8935 0.6165 0.8935 0.9453
0.4533 5.4490 534 0.9430 0.5954 0.9430 0.9711

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
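
A small, optional helper (not part of the original training setup) to check whether a local environment matches the versions listed above:

```python
# Compare locally installed package versions against those listed above.
from importlib.metadata import PackageNotFoundError, version

expected = {
    "transformers": "4.44.2",
    "torch": "2.4.0",       # the card lists 2.4.0+cu118 (a CUDA 11.8 build)
    "datasets": "2.21.0",
    "tokenizers": "0.19.1",
}

for pkg, want in expected.items():
    try:
        have = version(pkg)
    except PackageNotFoundError:
        have = "not installed"
    flag = "OK" if have.startswith(want) else "differs"
    print(f"{pkg:12s} installed={have:15s} card={want} ({flag})")
```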