ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k3_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4923
  • Qwk (quadratic weighted kappa): 0.3911
  • Mse (mean squared error): 0.4923
  • Rmse (root mean squared error): 0.7017
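
The card does not state how these metrics were computed. Below is a minimal sketch of the standard definitions using scikit-learn and NumPy; the example arrays are hypothetical, since the task's score scale is undocumented.

```python
# Minimal sketch of the reported metrics; the data here is hypothetical.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([2, 3, 1, 4, 2])  # hypothetical gold scores
y_pred = np.array([2, 2, 1, 4, 3])  # hypothetical (rounded) predictions

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # Qwk
mse = mean_squared_error(y_true, y_pred)                      # Mse
rmse = np.sqrt(mse)                                           # Rmse
print(f"Qwk={qwk:.4f} Mse={mse:.4f} Rmse={rmse:.4f}")
```

Note that the reported Loss equals the reported Mse (here and in every row of the training table below), which is consistent with a mean-squared-error training objective, e.g. a single-logit regression head.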

Model description

More information needed

Intended uses & limitations

More information needed
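
Pending a fuller description, here is a minimal inference sketch. It assumes the checkpoint exposes a standard sequence-classification head with a single regression logit; that assumption is ours, not the card's.

```python
# Minimal usage sketch; the single-logit regression head is an assumption.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo = "MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k3_task7_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)
model.eval()

inputs = tokenizer("هذا نص تجريبي.", return_tensors="pt")  # "This is a sample text."
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(score)
```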

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
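
These settings map directly onto transformers' TrainingArguments; a minimal sketch (not the author's actual script) follows. The tiny in-memory dataset and the single-logit regression head are illustrative assumptions, and the Adam betas/epsilon listed above are the transformers defaults, so they need not be set explicitly.

```python
# Minimal training sketch mirroring the hyperparameters listed above.
from datasets import Dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

base = "aubmindlab/bert-base-arabertv02"
tokenizer = AutoTokenizer.from_pretrained(base)
# num_labels=1 (regression) is an assumption consistent with Loss == Mse.
model = AutoModelForSequenceClassification.from_pretrained(base, num_labels=1)

# Hypothetical two-example dataset standing in for the undocumented data.
raw = Dataset.from_dict({"text": ["مثال أول", "مثال ثان"], "label": [1.0, 2.0]})
ds = raw.map(lambda ex: tokenizer(ex["text"], truncation=True,
                                  padding="max_length", max_length=64))

args = TrainingArguments(
    output_dir="out",
    learning_rate=2e-5,             # learning_rate
    per_device_train_batch_size=8,  # train_batch_size
    per_device_eval_batch_size=8,   # eval_batch_size
    seed=42,                        # seed
    lr_scheduler_type="linear",     # lr_scheduler_type
    num_train_epochs=100,           # num_epochs
)

trainer = Trainer(model=model, args=args, train_dataset=ds, eval_dataset=ds)
trainer.train()
```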

Training results

Validation metrics were recorded every two steps; the training loss column reads "No log" until its first logged value at step 500.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.1176 2 2.6204 -0.0449 2.6204 1.6187
No log 0.2353 4 1.2976 0.1268 1.2976 1.1391
No log 0.3529 6 0.9967 -0.1408 0.9967 0.9984
No log 0.4706 8 0.9020 0.1504 0.9020 0.9498
No log 0.5882 10 0.9526 0.2271 0.9526 0.9760
No log 0.7059 12 0.8932 0.3076 0.8932 0.9451
No log 0.8235 14 0.8360 0.2813 0.8360 0.9144
No log 0.9412 16 0.8225 0.2045 0.8225 0.9069
No log 1.0588 18 0.7802 0.2045 0.7802 0.8833
No log 1.1765 20 0.8251 0.3409 0.8251 0.9084
No log 1.2941 22 0.7573 0.3494 0.7573 0.8702
No log 1.4118 24 0.7180 0.1313 0.7180 0.8474
No log 1.5294 26 0.7218 0.1529 0.7218 0.8496
No log 1.6471 28 0.7276 0.1580 0.7276 0.8530
No log 1.7647 30 0.7315 0.1729 0.7315 0.8553
No log 1.8824 32 0.7012 0.0359 0.7012 0.8374
No log 2.0 34 0.7021 0.0428 0.7021 0.8379
No log 2.1176 36 0.7206 0.0481 0.7206 0.8489
No log 2.2353 38 0.7540 0.1365 0.7540 0.8684
No log 2.3529 40 0.7047 0.0893 0.7047 0.8395
No log 2.4706 42 0.6704 0.1660 0.6704 0.8188
No log 2.5882 44 0.6456 0.1903 0.6456 0.8035
No log 2.7059 46 0.6149 0.2121 0.6149 0.7842
No log 2.8235 48 0.6179 0.3494 0.6179 0.7861
No log 2.9412 50 0.6111 0.2290 0.6111 0.7817
No log 3.0588 52 0.6532 0.1608 0.6532 0.8082
No log 3.1765 54 0.6776 0.2706 0.6776 0.8231
No log 3.2941 56 0.5891 0.3809 0.5891 0.7676
No log 3.4118 58 0.6235 0.3649 0.6235 0.7896
No log 3.5294 60 0.6283 0.3572 0.6283 0.7927
No log 3.6471 62 0.5786 0.4354 0.5786 0.7606
No log 3.7647 64 0.5614 0.3754 0.5614 0.7493
No log 3.8824 66 0.5767 0.5111 0.5767 0.7594
No log 4.0 68 0.6977 0.4624 0.6977 0.8353
No log 4.1176 70 0.6505 0.5183 0.6505 0.8066
No log 4.2353 72 0.5278 0.4019 0.5278 0.7265
No log 4.3529 74 0.5004 0.4547 0.5004 0.7074
No log 4.4706 76 0.4736 0.4788 0.4736 0.6882
No log 4.5882 78 0.4537 0.5437 0.4537 0.6736
No log 4.7059 80 0.4556 0.5141 0.4556 0.6750
No log 4.8235 82 0.5201 0.4308 0.5201 0.7212
No log 4.9412 84 0.5003 0.5549 0.5003 0.7073
No log 5.0588 86 0.5893 0.5984 0.5893 0.7676
No log 5.1765 88 0.5473 0.5420 0.5473 0.7398
No log 5.2941 90 0.5449 0.3363 0.5449 0.7382
No log 5.4118 92 0.5603 0.3603 0.5603 0.7485
No log 5.5294 94 0.6346 0.5606 0.6346 0.7966
No log 5.6471 96 0.9051 0.3370 0.9051 0.9514
No log 5.7647 98 0.9008 0.3316 0.9008 0.9491
No log 5.8824 100 0.6404 0.5385 0.6404 0.8003
No log 6.0 102 0.6145 0.2205 0.6145 0.7839
No log 6.1176 104 0.5768 0.2513 0.5768 0.7595
No log 6.2353 106 0.6020 0.5606 0.6020 0.7759
No log 6.3529 108 0.7893 0.3538 0.7893 0.8884
No log 6.4706 110 0.7951 0.3597 0.7951 0.8917
No log 6.5882 112 0.5657 0.6022 0.5657 0.7521
No log 6.7059 114 0.5109 0.4419 0.5109 0.7148
No log 6.8235 116 0.5114 0.4253 0.5114 0.7151
No log 6.9412 118 0.5547 0.5820 0.5547 0.7448
No log 7.0588 120 0.7513 0.4032 0.7513 0.8668
No log 7.1765 122 0.7332 0.4444 0.7332 0.8562
No log 7.2941 124 0.4953 0.4984 0.4953 0.7038
No log 7.4118 126 0.6617 0.3617 0.6617 0.8134
No log 7.5294 128 0.6838 0.3556 0.6838 0.8269
No log 7.6471 130 0.4739 0.5633 0.4739 0.6884
No log 7.7647 132 0.4541 0.5815 0.4541 0.6739
No log 7.8824 134 0.4594 0.4681 0.4594 0.6778
No log 8.0 136 0.4622 0.5214 0.4622 0.6798
No log 8.1176 138 0.4575 0.4984 0.4575 0.6764
No log 8.2353 140 0.4659 0.5736 0.4659 0.6825
No log 8.3529 142 0.4393 0.5232 0.4393 0.6628
No log 8.4706 144 0.4311 0.5979 0.4311 0.6566
No log 8.5882 146 0.4794 0.5867 0.4794 0.6924
No log 8.7059 148 0.6273 0.5402 0.6273 0.7920
No log 8.8235 150 0.6476 0.5339 0.6476 0.8048
No log 8.9412 152 0.5863 0.5661 0.5863 0.7657
No log 9.0588 154 0.6576 0.4400 0.6576 0.8109
No log 9.1765 156 0.8444 0.4116 0.8444 0.9189
No log 9.2941 158 0.8481 0.4116 0.8481 0.9209
No log 9.4118 160 0.6429 0.5506 0.6429 0.8018
No log 9.5294 162 0.5183 0.5989 0.5183 0.7199
No log 9.6471 164 0.4379 0.6839 0.4379 0.6618
No log 9.7647 166 0.4671 0.6514 0.4671 0.6834
No log 9.8824 168 0.4253 0.6828 0.4253 0.6521
No log 10.0 170 0.4806 0.6325 0.4806 0.6932
No log 10.1176 172 0.4376 0.6541 0.4376 0.6615
No log 10.2353 174 0.5058 0.6271 0.5058 0.7112
No log 10.3529 176 0.5526 0.5460 0.5526 0.7434
No log 10.4706 178 0.4556 0.5812 0.4556 0.6750
No log 10.5882 180 0.4529 0.6032 0.4529 0.6730
No log 10.7059 182 0.4561 0.6018 0.4561 0.6754
No log 10.8235 184 0.4767 0.6187 0.4767 0.6905
No log 10.9412 186 0.4545 0.5413 0.4545 0.6742
No log 11.0588 188 0.4722 0.4527 0.4722 0.6872
No log 11.1765 190 0.5170 0.4610 0.5170 0.7190
No log 11.2941 192 0.5019 0.4674 0.5019 0.7084
No log 11.4118 194 0.4641 0.5307 0.4641 0.6813
No log 11.5294 196 0.5652 0.5787 0.5652 0.7518
No log 11.6471 198 0.5671 0.5252 0.5671 0.7531
No log 11.7647 200 0.5182 0.5036 0.5182 0.7198
No log 11.8824 202 0.5274 0.3141 0.5274 0.7262
No log 12.0 204 0.5398 0.2815 0.5398 0.7347
No log 12.1176 206 0.5544 0.4524 0.5544 0.7446
No log 12.2353 208 0.5781 0.4134 0.5781 0.7603
No log 12.3529 210 0.5315 0.4524 0.5315 0.7291
No log 12.4706 212 0.5422 0.4632 0.5422 0.7363
No log 12.5882 214 0.7185 0.4519 0.7185 0.8476
No log 12.7059 216 0.8113 0.3499 0.8113 0.9007
No log 12.8235 218 0.8663 0.3521 0.8663 0.9307
No log 12.9412 220 0.6803 0.4751 0.6803 0.8248
No log 13.0588 222 0.5493 0.4211 0.5493 0.7412
No log 13.1765 224 0.5911 0.3585 0.5911 0.7689
No log 13.2941 226 0.6652 0.3001 0.6652 0.8156
No log 13.4118 228 0.6552 0.2979 0.6552 0.8094
No log 13.5294 230 0.6213 0.2641 0.6213 0.7882
No log 13.6471 232 0.6461 0.3372 0.6461 0.8038
No log 13.7647 234 0.8287 0.3761 0.8287 0.9103
No log 13.8824 236 0.7959 0.3938 0.7959 0.8922
No log 14.0 238 0.6206 0.3518 0.6206 0.7878
No log 14.1176 240 0.5955 0.3552 0.5955 0.7717
No log 14.2353 242 0.5855 0.3836 0.5855 0.7652
No log 14.3529 244 0.6070 0.4662 0.6070 0.7791
No log 14.4706 246 0.6485 0.5545 0.6485 0.8053
No log 14.5882 248 0.6417 0.5120 0.6417 0.8011
No log 14.7059 250 0.5725 0.4592 0.5725 0.7566
No log 14.8235 252 0.5431 0.4492 0.5431 0.7369
No log 14.9412 254 0.6240 0.3215 0.6240 0.7899
No log 15.0588 256 0.6165 0.2940 0.6165 0.7852
No log 15.1765 258 0.5739 0.2955 0.5739 0.7576
No log 15.2941 260 0.6123 0.4845 0.6123 0.7825
No log 15.4118 262 0.7485 0.3777 0.7485 0.8652
No log 15.5294 264 0.7446 0.3777 0.7446 0.8629
No log 15.6471 266 0.6643 0.4624 0.6643 0.8150
No log 15.7647 268 0.6238 0.5215 0.6238 0.7898
No log 15.8824 270 0.5549 0.5111 0.5549 0.7449
No log 16.0 272 0.5199 0.5209 0.5199 0.7210
No log 16.1176 274 0.4927 0.4962 0.4927 0.7019
No log 16.2353 276 0.5093 0.5111 0.5093 0.7136
No log 16.3529 278 0.5280 0.5430 0.5280 0.7266
No log 16.4706 280 0.4851 0.4962 0.4851 0.6965
No log 16.5882 282 0.5022 0.4314 0.5022 0.7087
No log 16.7059 284 0.5447 0.3426 0.5447 0.7381
No log 16.8235 286 0.5209 0.3396 0.5209 0.7217
No log 16.9412 288 0.4857 0.3604 0.4857 0.6969
No log 17.0588 290 0.4755 0.5060 0.4755 0.6895
No log 17.1765 292 0.4927 0.5039 0.4927 0.7019
No log 17.2941 294 0.4937 0.5039 0.4937 0.7027
No log 17.4118 296 0.4961 0.2607 0.4961 0.7043
No log 17.5294 298 0.5228 0.2345 0.5228 0.7231
No log 17.6471 300 0.5248 0.2345 0.5248 0.7244
No log 17.7647 302 0.5123 0.3228 0.5123 0.7157
No log 17.8824 304 0.6304 0.5538 0.6304 0.7940
No log 18.0 306 0.8163 0.4961 0.8163 0.9035
No log 18.1176 308 0.7968 0.4961 0.7968 0.8926
No log 18.2353 310 0.6099 0.5646 0.6099 0.7810
No log 18.3529 312 0.5030 0.4378 0.5030 0.7092
No log 18.4706 314 0.4883 0.4482 0.4883 0.6988
No log 18.5882 316 0.4928 0.5437 0.4928 0.7020
No log 18.7059 318 0.5897 0.5481 0.5897 0.7679
No log 18.8235 320 0.7103 0.5492 0.7103 0.8428
No log 18.9412 322 0.6852 0.5533 0.6852 0.8278
No log 19.0588 324 0.5359 0.5570 0.5359 0.7320
No log 19.1765 326 0.4737 0.5533 0.4737 0.6883
No log 19.2941 328 0.4694 0.5732 0.4694 0.6851
No log 19.4118 330 0.4734 0.5732 0.4734 0.6880
No log 19.5294 332 0.5448 0.5773 0.5448 0.7381
No log 19.6471 334 0.7258 0.5139 0.7258 0.8520
No log 19.7647 336 0.7438 0.5139 0.7438 0.8624
No log 19.8824 338 0.6238 0.5068 0.6238 0.7898
No log 20.0 340 0.4956 0.4983 0.4956 0.7040
No log 20.1176 342 0.4834 0.5122 0.4834 0.6952
No log 20.2353 344 0.4859 0.4724 0.4859 0.6971
No log 20.3529 346 0.4639 0.5003 0.4639 0.6811
No log 20.4706 348 0.4975 0.5149 0.4975 0.7053
No log 20.5882 350 0.5232 0.5730 0.5232 0.7233
No log 20.7059 352 0.4897 0.5497 0.4897 0.6998
No log 20.8235 354 0.4775 0.5497 0.4775 0.6910
No log 20.9412 356 0.4887 0.5497 0.4887 0.6991
No log 21.0588 358 0.5242 0.5275 0.5242 0.7240
No log 21.1765 360 0.5136 0.5158 0.5136 0.7167
No log 21.2941 362 0.5218 0.5158 0.5218 0.7223
No log 21.4118 364 0.5195 0.5349 0.5195 0.7208
No log 21.5294 366 0.5113 0.5580 0.5113 0.7151
No log 21.6471 368 0.5314 0.5158 0.5314 0.7290
No log 21.7647 370 0.5877 0.5851 0.5877 0.7666
No log 21.8824 372 0.5989 0.5851 0.5989 0.7739
No log 22.0 374 0.6206 0.5494 0.6206 0.7878
No log 22.1176 376 0.5548 0.4925 0.5548 0.7448
No log 22.2353 378 0.4918 0.5060 0.4918 0.7013
No log 22.3529 380 0.4884 0.4657 0.4884 0.6988
No log 22.4706 382 0.4876 0.4657 0.4876 0.6983
No log 22.5882 384 0.4844 0.4634 0.4844 0.6960
No log 22.7059 386 0.4793 0.4634 0.4793 0.6923
No log 22.8235 388 0.4803 0.4634 0.4803 0.6931
No log 22.9412 390 0.4791 0.4902 0.4791 0.6922
No log 23.0588 392 0.4961 0.4962 0.4961 0.7043
No log 23.1765 394 0.5035 0.5018 0.5035 0.7095
No log 23.2941 396 0.4976 0.4780 0.4976 0.7054
No log 23.4118 398 0.4921 0.4803 0.4921 0.7015
No log 23.5294 400 0.4943 0.4224 0.4943 0.7030
No log 23.6471 402 0.4966 0.4752 0.4966 0.7047
No log 23.7647 404 0.5195 0.5111 0.5195 0.7208
No log 23.8824 406 0.5934 0.4884 0.5934 0.7703
No log 24.0 408 0.6155 0.5183 0.6155 0.7845
No log 24.1176 410 0.5880 0.4815 0.5880 0.7668
No log 24.2353 412 0.5325 0.5119 0.5325 0.7297
No log 24.3529 414 0.5204 0.3552 0.5204 0.7214
No log 24.4706 416 0.5223 0.3243 0.5223 0.7227
No log 24.5882 418 0.5193 0.2930 0.5193 0.7206
No log 24.7059 420 0.5159 0.3243 0.5159 0.7183
No log 24.8235 422 0.5225 0.4857 0.5225 0.7229
No log 24.9412 424 0.5569 0.4569 0.5569 0.7463
No log 25.0588 426 0.5645 0.4569 0.5645 0.7513
No log 25.1765 428 0.5444 0.4857 0.5444 0.7378
No log 25.2941 430 0.5452 0.4586 0.5452 0.7384
No log 25.4118 432 0.5604 0.4758 0.5604 0.7486
No log 25.5294 434 0.5855 0.4758 0.5855 0.7652
No log 25.6471 436 0.5904 0.5270 0.5904 0.7684
No log 25.7647 438 0.5633 0.4758 0.5633 0.7505
No log 25.8824 440 0.5725 0.5018 0.5725 0.7566
No log 26.0 442 0.6112 0.5215 0.6112 0.7818
No log 26.1176 444 0.6172 0.5120 0.6172 0.7856
No log 26.2353 446 0.5880 0.5545 0.5880 0.7668
No log 26.3529 448 0.5140 0.5939 0.5140 0.7169
No log 26.4706 450 0.4818 0.5503 0.4818 0.6941
No log 26.5882 452 0.4803 0.4361 0.4803 0.6931
No log 26.7059 454 0.5052 0.4285 0.5052 0.7108
No log 26.8235 456 0.4946 0.4569 0.4946 0.7033
No log 26.9412 458 0.4888 0.4402 0.4888 0.6991
No log 27.0588 460 0.5513 0.5681 0.5513 0.7425
No log 27.1765 462 0.5705 0.5330 0.5705 0.7553
No log 27.2941 464 0.6151 0.5120 0.6151 0.7843
No log 27.4118 466 0.6172 0.4795 0.6172 0.7856
No log 27.5294 468 0.5798 0.5411 0.5798 0.7615
No log 27.6471 470 0.5332 0.4941 0.5332 0.7302
No log 27.7647 472 0.5256 0.5039 0.5256 0.7250
No log 27.8824 474 0.5206 0.5309 0.5206 0.7215
No log 28.0 476 0.5173 0.4173 0.5173 0.7192
No log 28.1176 478 0.5210 0.3552 0.5210 0.7218
No log 28.2353 480 0.5248 0.4199 0.5248 0.7245
No log 28.3529 482 0.5268 0.3258 0.5268 0.7258
No log 28.4706 484 0.5290 0.3258 0.5290 0.7273
No log 28.5882 486 0.5321 0.3920 0.5321 0.7295
No log 28.7059 488 0.5293 0.4111 0.5293 0.7275
No log 28.8235 490 0.5524 0.5189 0.5524 0.7432
No log 28.9412 492 0.6242 0.4482 0.6242 0.7901
No log 29.0588 494 0.6512 0.5108 0.6512 0.8069
No log 29.1765 496 0.5952 0.5085 0.5952 0.7715
No log 29.2941 498 0.5018 0.5237 0.5018 0.7084
0.2928 29.4118 500 0.4679 0.5228 0.4679 0.6840
0.2928 29.5294 502 0.4952 0.4429 0.4952 0.7037
0.2928 29.6471 504 0.5046 0.4429 0.5046 0.7104
0.2928 29.7647 506 0.4954 0.4211 0.4954 0.7038
0.2928 29.8824 508 0.4815 0.4314 0.4815 0.6939
0.2928 30.0 510 0.4887 0.5467 0.4887 0.6991
0.2928 30.1176 512 0.5286 0.5252 0.5286 0.7270
0.2928 30.2353 514 0.5332 0.5252 0.5332 0.7302
0.2928 30.3529 516 0.5072 0.5111 0.5072 0.7122
0.2928 30.4706 518 0.5036 0.5209 0.5036 0.7097
0.2928 30.5882 520 0.5030 0.5209 0.5030 0.7093
0.2928 30.7059 522 0.5065 0.5209 0.5065 0.7117
0.2928 30.8235 524 0.5252 0.5349 0.5252 0.7247
0.2928 30.9412 526 0.5231 0.5349 0.5231 0.7232
0.2928 31.0588 528 0.4987 0.5467 0.4987 0.7062
0.2928 31.1765 530 0.4870 0.4729 0.4870 0.6978
0.2928 31.2941 532 0.4799 0.4468 0.4799 0.6928
0.2928 31.4118 534 0.4762 0.4703 0.4762 0.6901
0.2928 31.5294 536 0.4898 0.4174 0.4898 0.6998
0.2928 31.6471 538 0.4923 0.3911 0.4923 0.7017

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1