ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k11_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02; the training dataset is not specified. It achieves the following results on the evaluation set:

  • Loss: 0.8296
  • QWK: 0.4391
  • MSE: 0.8296
  • RMSE: 0.9108
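QWK (quadratic weighted kappa) measures agreement between predicted and reference ordinal scores, penalizing larger disagreements quadratically; MSE and RMSE are the usual squared-error metrics. As a minimal sketch (not taken from this model's evaluation code), QWK can be computed directly with NumPy:

```python
import numpy as np

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights, for ordinal labels 0..n_classes-1."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    # Observed label co-occurrence matrix.
    observed = np.zeros((n_classes, n_classes))
    for t, p in zip(y_true, y_pred):
        observed[t, p] += 1
    # Expected co-occurrence under chance agreement (outer product of marginals).
    expected = np.outer(np.bincount(y_true, minlength=n_classes),
                        np.bincount(y_pred, minlength=n_classes)) / len(y_true)
    # Quadratic disagreement weights: larger score gaps cost more.
    idx = np.arange(n_classes)
    weights = (idx[:, None] - idx[None, :]) ** 2 / (n_classes - 1) ** 2
    return 1.0 - (weights * observed).sum() / (weights * expected).sum()
```

Perfect agreement gives 1.0, chance-level agreement gives roughly 0, so the final QWK of 0.4391 reflects moderate agreement with the reference scores.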

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
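With lr_scheduler_type: linear and no warmup steps listed, the learning rate decays linearly from 2e-05 toward 0 over the scheduled run. A small sketch of that schedule (the helper name is mine, not from the training code):

```python
def linear_lr(step, total_steps, base_lr=2e-05):
    """Linearly decay the learning rate from base_lr at step 0 to 0 at total_steps."""
    return base_lr * max(0.0, (total_steps - step) / total_steps)
```

Note that although num_epochs is set to 100, the training log below ends at epoch ≈9.11, so training evidently stopped early and the schedule never reached its final steps.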

Training results

Training loss is logged every 500 steps, so rows before step 500 show "No log".

Training Loss Epoch Step Validation Loss QWK MSE RMSE
No log 0.0357 2 4.1429 0.0176 4.1429 2.0354
No log 0.0714 4 2.3361 -0.0406 2.3361 1.5284
No log 0.1071 6 1.5948 0.0181 1.5948 1.2628
No log 0.1429 8 1.3657 0.1036 1.3657 1.1686
No log 0.1786 10 1.1540 0.0977 1.1540 1.0742
No log 0.2143 12 1.2016 0.1637 1.2016 1.0962
No log 0.25 14 2.1014 0.0956 2.1014 1.4496
No log 0.2857 16 1.6307 0.1591 1.6307 1.2770
No log 0.3214 18 1.0219 0.2716 1.0219 1.0109
No log 0.3571 20 1.0443 0.1545 1.0443 1.0219
No log 0.3929 22 0.9824 0.2818 0.9824 0.9912
No log 0.4286 24 1.0268 0.2834 1.0268 1.0133
No log 0.4643 26 1.0783 0.2125 1.0783 1.0384
No log 0.5 28 1.0843 0.2004 1.0843 1.0413
No log 0.5357 30 1.0504 0.2903 1.0504 1.0249
No log 0.5714 32 1.0077 0.2564 1.0077 1.0038
No log 0.6071 34 0.9438 0.3156 0.9438 0.9715
No log 0.6429 36 0.9443 0.3414 0.9443 0.9717
No log 0.6786 38 0.9482 0.3464 0.9482 0.9737
No log 0.7143 40 0.9492 0.3139 0.9492 0.9743
No log 0.75 42 1.1674 0.3295 1.1674 1.0805
No log 0.7857 44 1.2814 0.2143 1.2814 1.1320
No log 0.8214 46 1.0454 0.2191 1.0454 1.0224
No log 0.8571 48 0.9529 0.3045 0.9529 0.9762
No log 0.8929 50 0.9702 0.4224 0.9702 0.9850
No log 0.9286 52 0.9403 0.3332 0.9403 0.9697
No log 0.9643 54 1.0985 0.2534 1.0985 1.0481
No log 1.0 56 1.2341 0.2024 1.2341 1.1109
No log 1.0357 58 1.0992 0.2905 1.0992 1.0484
No log 1.0714 60 0.9018 0.3316 0.9018 0.9496
No log 1.1071 62 0.9556 0.4696 0.9556 0.9775
No log 1.1429 64 0.8953 0.3367 0.8953 0.9462
No log 1.1786 66 0.9648 0.2880 0.9648 0.9822
No log 1.2143 68 1.0194 0.2659 1.0194 1.0097
No log 1.25 70 1.0244 0.2659 1.0244 1.0121
No log 1.2857 72 0.9216 0.2981 0.9216 0.9600
No log 1.3214 74 0.8599 0.3680 0.8599 0.9273
No log 1.3571 76 0.8407 0.3711 0.8407 0.9169
No log 1.3929 78 0.8567 0.4734 0.8567 0.9256
No log 1.4286 80 0.8627 0.4343 0.8627 0.9288
No log 1.4643 82 1.0388 0.4513 1.0388 1.0192
No log 1.5 84 0.9897 0.4306 0.9897 0.9949
No log 1.5357 86 0.8985 0.4102 0.8985 0.9479
No log 1.5714 88 0.9143 0.3775 0.9143 0.9562
No log 1.6071 90 0.9918 0.4161 0.9918 0.9959
No log 1.6429 92 0.8743 0.3839 0.8743 0.9350
No log 1.6786 94 0.9939 0.3498 0.9939 0.9969
No log 1.7143 96 0.9600 0.3105 0.9600 0.9798
No log 1.75 98 0.8586 0.4027 0.8586 0.9266
No log 1.7857 100 0.9295 0.4349 0.9295 0.9641
No log 1.8214 102 0.8683 0.3897 0.8683 0.9318
No log 1.8571 104 0.9043 0.3627 0.9043 0.9509
No log 1.8929 106 1.0408 0.5145 1.0408 1.0202
No log 1.9286 108 0.8593 0.4277 0.8593 0.9270
No log 1.9643 110 0.8822 0.4076 0.8822 0.9393
No log 2.0 112 0.9303 0.4247 0.9303 0.9645
No log 2.0357 114 0.8145 0.4754 0.8145 0.9025
No log 2.0714 116 0.9274 0.4084 0.9274 0.9630
No log 2.1071 118 0.8748 0.4370 0.8748 0.9353
No log 2.1429 120 0.9070 0.4575 0.9070 0.9524
No log 2.1786 122 1.0193 0.2989 1.0193 1.0096
No log 2.2143 124 0.8638 0.4601 0.8638 0.9294
No log 2.25 126 0.9087 0.4067 0.9087 0.9533
No log 2.2857 128 0.9142 0.4067 0.9142 0.9561
No log 2.3214 130 0.8464 0.4336 0.8464 0.9200
No log 2.3571 132 0.9892 0.3391 0.9892 0.9946
No log 2.3929 134 0.9641 0.3532 0.9641 0.9819
No log 2.4286 136 0.8262 0.3740 0.8262 0.9089
No log 2.4643 138 0.8860 0.4215 0.8860 0.9413
No log 2.5 140 0.9917 0.4307 0.9917 0.9959
No log 2.5357 142 0.9322 0.4296 0.9322 0.9655
No log 2.5714 144 1.1807 0.2729 1.1807 1.0866
No log 2.6071 146 1.5997 0.2430 1.5997 1.2648
No log 2.6429 148 1.5724 0.2293 1.5724 1.2540
No log 2.6786 150 1.2478 0.3254 1.2478 1.1171
No log 2.7143 152 1.1994 0.3095 1.1994 1.0952
No log 2.75 154 1.1197 0.3908 1.1197 1.0581
No log 2.7857 156 1.0026 0.3875 1.0026 1.0013
No log 2.8214 158 0.9354 0.3296 0.9354 0.9672
No log 2.8571 160 0.9040 0.3445 0.9040 0.9508
No log 2.8929 162 0.9157 0.4597 0.9157 0.9569
No log 2.9286 164 0.8751 0.3151 0.8751 0.9355
No log 2.9643 166 0.9786 0.3086 0.9786 0.9893
No log 3.0 168 1.0613 0.3967 1.0613 1.0302
No log 3.0357 170 0.9018 0.4031 0.9018 0.9496
No log 3.0714 172 0.9595 0.4496 0.9595 0.9795
No log 3.1071 174 1.1137 0.4449 1.1137 1.0553
No log 3.1429 176 0.9745 0.4234 0.9745 0.9871
No log 3.1786 178 0.8820 0.3154 0.8820 0.9391
No log 3.2143 180 0.9786 0.5351 0.9786 0.9893
No log 3.25 182 0.9197 0.4147 0.9197 0.9590
No log 3.2857 184 0.8435 0.4073 0.8435 0.9184
No log 3.3214 186 0.8417 0.3804 0.8417 0.9174
No log 3.3571 188 0.8324 0.4073 0.8324 0.9124
No log 3.3929 190 0.8379 0.4076 0.8379 0.9154
No log 3.4286 192 0.8510 0.4110 0.8510 0.9225
No log 3.4643 194 0.9005 0.4250 0.9005 0.9489
No log 3.5 196 0.8671 0.4131 0.8671 0.9312
No log 3.5357 198 0.8466 0.4247 0.8466 0.9201
No log 3.5714 200 0.8532 0.4373 0.8532 0.9237
No log 3.6071 202 0.9059 0.4310 0.9059 0.9518
No log 3.6429 204 0.8987 0.4302 0.8987 0.9480
No log 3.6786 206 0.8544 0.4696 0.8544 0.9243
No log 3.7143 208 0.8967 0.4202 0.8967 0.9469
No log 3.75 210 0.9057 0.4091 0.9057 0.9517
No log 3.7857 212 0.8472 0.4940 0.8472 0.9204
No log 3.8214 214 0.7850 0.4475 0.7850 0.8860
No log 3.8571 216 0.7884 0.4475 0.7884 0.8879
No log 3.8929 218 0.8032 0.4264 0.8032 0.8962
No log 3.9286 220 0.8096 0.4615 0.8096 0.8998
No log 3.9643 222 0.8060 0.4581 0.8060 0.8978
No log 4.0 224 0.7609 0.4062 0.7609 0.8723
No log 4.0357 226 0.7945 0.4396 0.7945 0.8914
No log 4.0714 228 0.7727 0.5066 0.7727 0.8790
No log 4.1071 230 0.7846 0.4159 0.7846 0.8858
No log 4.1429 232 0.7841 0.5009 0.7841 0.8855
No log 4.1786 234 0.8061 0.4563 0.8061 0.8978
No log 4.2143 236 0.8930 0.4804 0.8930 0.9450
No log 4.25 238 0.8560 0.4599 0.8560 0.9252
No log 4.2857 240 0.9050 0.3705 0.9050 0.9513
No log 4.3214 242 1.1262 0.3584 1.1262 1.0612
No log 4.3571 244 1.0591 0.3451 1.0591 1.0291
No log 4.3929 246 0.9161 0.2761 0.9161 0.9572
No log 4.4286 248 1.0338 0.3462 1.0338 1.0168
No log 4.4643 250 1.0749 0.3443 1.0749 1.0368
No log 4.5 252 1.0175 0.3695 1.0175 1.0087
No log 4.5357 254 0.9957 0.2696 0.9957 0.9979
No log 4.5714 256 1.0269 0.3607 1.0269 1.0133
No log 4.6071 258 1.0039 0.2675 1.0039 1.0019
No log 4.6429 260 1.0550 0.2117 1.0550 1.0271
No log 4.6786 262 1.1086 0.2572 1.1086 1.0529
No log 4.7143 264 1.0548 0.3042 1.0548 1.0270
No log 4.75 266 0.9213 0.2986 0.9213 0.9598
No log 4.7857 268 0.8644 0.3250 0.8644 0.9298
No log 4.8214 270 0.8695 0.4524 0.8695 0.9324
No log 4.8571 272 0.8979 0.4984 0.8979 0.9476
No log 4.8929 274 0.8148 0.4628 0.8148 0.9026
No log 4.9286 276 0.8143 0.5392 0.8143 0.9024
No log 4.9643 278 0.8505 0.5554 0.8505 0.9222
No log 5.0 280 0.8089 0.5909 0.8089 0.8994
No log 5.0357 282 0.7717 0.4505 0.7717 0.8785
No log 5.0714 284 0.7505 0.5002 0.7505 0.8663
No log 5.1071 286 0.7409 0.5305 0.7409 0.8608
No log 5.1429 288 0.7201 0.5305 0.7201 0.8486
No log 5.1786 290 0.7191 0.5396 0.7191 0.8480
No log 5.2143 292 0.7192 0.5146 0.7192 0.8480
No log 5.25 294 0.7601 0.5070 0.7601 0.8718
No log 5.2857 296 0.8818 0.4667 0.8818 0.9390
No log 5.3214 298 0.9076 0.4885 0.9076 0.9527
No log 5.3571 300 0.9603 0.4554 0.9603 0.9800
No log 5.3929 302 1.0148 0.3913 1.0148 1.0074
No log 5.4286 304 0.9392 0.4435 0.9392 0.9691
No log 5.4643 306 0.8811 0.3970 0.8811 0.9387
No log 5.5 308 0.8229 0.4119 0.8229 0.9072
No log 5.5357 310 0.8156 0.4102 0.8156 0.9031
No log 5.5714 312 0.9037 0.4326 0.9037 0.9506
No log 5.6071 314 0.8417 0.4570 0.8417 0.9174
No log 5.6429 316 0.7985 0.3973 0.7985 0.8936
No log 5.6786 318 0.7429 0.4966 0.7429 0.8619
No log 5.7143 320 0.7531 0.4599 0.7531 0.8678
No log 5.75 322 0.7774 0.5048 0.7774 0.8817
No log 5.7857 324 0.7286 0.5076 0.7286 0.8536
No log 5.8214 326 0.6967 0.5304 0.6967 0.8347
No log 5.8571 328 0.7155 0.5494 0.7155 0.8459
No log 5.8929 330 0.6688 0.5763 0.6688 0.8178
No log 5.9286 332 0.6761 0.4988 0.6761 0.8223
No log 5.9643 334 0.7307 0.5279 0.7307 0.8548
No log 6.0 336 0.9319 0.4444 0.9319 0.9654
No log 6.0357 338 0.9976 0.4133 0.9976 0.9988
No log 6.0714 340 0.8215 0.5229 0.8215 0.9064
No log 6.1071 342 0.6771 0.5288 0.6771 0.8229
No log 6.1429 344 0.7446 0.4883 0.7446 0.8629
No log 6.1786 346 0.7577 0.5256 0.7577 0.8705
No log 6.2143 348 0.6950 0.5428 0.6950 0.8337
No log 6.25 350 0.7138 0.5318 0.7138 0.8449
No log 6.2857 352 0.9476 0.4226 0.9476 0.9735
No log 6.3214 354 1.0735 0.3897 1.0735 1.0361
No log 6.3571 356 0.9512 0.3780 0.9512 0.9753
No log 6.3929 358 0.8126 0.5563 0.8126 0.9015
No log 6.4286 360 0.8964 0.4796 0.8964 0.9468
No log 6.4643 362 0.9859 0.4430 0.9859 0.9929
No log 6.5 364 0.9316 0.4699 0.9316 0.9652
No log 6.5357 366 0.8228 0.4370 0.8228 0.9071
No log 6.5714 368 0.8173 0.3878 0.8173 0.9041
No log 6.6071 370 0.8448 0.4712 0.8448 0.9191
No log 6.6429 372 0.8398 0.5140 0.8398 0.9164
No log 6.6786 374 0.7832 0.5111 0.7832 0.8850
No log 6.7143 376 0.7721 0.4971 0.7721 0.8787
No log 6.75 378 0.7960 0.5446 0.7960 0.8922
No log 6.7857 380 0.7612 0.5057 0.7612 0.8725
No log 6.8214 382 0.8094 0.4954 0.8094 0.8997
No log 6.8571 384 0.9503 0.4667 0.9503 0.9748
No log 6.8929 386 0.9460 0.4667 0.9460 0.9726
No log 6.9286 388 0.8467 0.4696 0.8467 0.9201
No log 6.9643 390 0.8109 0.4599 0.8109 0.9005
No log 7.0 392 0.8584 0.3939 0.8584 0.9265
No log 7.0357 394 0.9413 0.3677 0.9413 0.9702
No log 7.0714 396 0.9151 0.3654 0.9151 0.9566
No log 7.1071 398 0.8765 0.3677 0.8765 0.9362
No log 7.1429 400 0.8425 0.3879 0.8425 0.9179
No log 7.1786 402 0.9219 0.4781 0.9219 0.9602
No log 7.2143 404 1.0438 0.4332 1.0438 1.0216
No log 7.25 406 1.0076 0.4873 1.0076 1.0038
No log 7.2857 408 0.9602 0.4681 0.9602 0.9799
No log 7.3214 410 0.8405 0.5039 0.8405 0.9168
No log 7.3571 412 0.7831 0.4511 0.7831 0.8850
No log 7.3929 414 0.7967 0.4588 0.7967 0.8926
No log 7.4286 416 0.8017 0.4303 0.8017 0.8954
No log 7.4643 418 0.7978 0.3837 0.7978 0.8932
No log 7.5 420 0.8079 0.3979 0.8079 0.8988
No log 7.5357 422 0.8053 0.4660 0.8053 0.8974
No log 7.5714 424 0.8421 0.4977 0.8421 0.9176
No log 7.6071 426 0.8395 0.5403 0.8395 0.9162
No log 7.6429 428 0.8222 0.5275 0.8222 0.9067
No log 7.6786 430 0.8261 0.5391 0.8261 0.9089
No log 7.7143 432 0.8125 0.5158 0.8125 0.9014
No log 7.75 434 0.8009 0.5163 0.8009 0.8949
No log 7.7857 436 0.8056 0.4838 0.8056 0.8975
No log 7.8214 438 0.7850 0.5156 0.7850 0.8860
No log 7.8571 440 0.7913 0.5260 0.7913 0.8895
No log 7.8929 442 0.8194 0.4947 0.8194 0.9052
No log 7.9286 444 0.8598 0.4067 0.8598 0.9273
No log 7.9643 446 0.8478 0.4691 0.8478 0.9208
No log 8.0 448 0.8396 0.4138 0.8396 0.9163
No log 8.0357 450 0.8366 0.3998 0.8366 0.9147
No log 8.0714 452 0.8242 0.3817 0.8242 0.9078
No log 8.1071 454 0.8071 0.4119 0.8071 0.8984
No log 8.1429 456 0.7991 0.5171 0.7991 0.8939
No log 8.1786 458 0.7929 0.5498 0.7929 0.8904
No log 8.2143 460 0.7964 0.5523 0.7964 0.8924
No log 8.25 462 0.8294 0.4608 0.8294 0.9107
No log 8.2857 464 0.8351 0.4608 0.8351 0.9139
No log 8.3214 466 0.8209 0.5052 0.8209 0.9060
No log 8.3571 468 0.8415 0.4128 0.8415 0.9173
No log 8.3929 470 0.8355 0.4119 0.8355 0.9141
No log 8.4286 472 0.8383 0.3817 0.8383 0.9156
No log 8.4643 474 0.8279 0.4277 0.8279 0.9099
No log 8.5 476 0.8140 0.4433 0.8140 0.9022
No log 8.5357 478 0.8213 0.4338 0.8213 0.9062
No log 8.5714 480 0.9415 0.4290 0.9415 0.9703
No log 8.6071 482 0.9363 0.4290 0.9363 0.9676
No log 8.6429 484 0.8093 0.4706 0.8093 0.8996
No log 8.6786 486 0.9127 0.4510 0.9127 0.9553
No log 8.7143 488 1.0016 0.5122 1.0016 1.0008
No log 8.75 490 0.9040 0.5044 0.9040 0.9508
No log 8.7857 492 0.7990 0.4706 0.7990 0.8939
No log 8.8214 494 0.8238 0.4976 0.8238 0.9076
No log 8.8571 496 0.8044 0.4706 0.8044 0.8969
No log 8.8929 498 0.8473 0.4772 0.8473 0.9205
0.3297 8.9286 500 0.9223 0.5267 0.9223 0.9604
0.3297 8.9643 502 0.8687 0.4419 0.8687 0.9321
0.3297 9.0 504 0.8171 0.4676 0.8171 0.9039
0.3297 9.0357 506 0.8214 0.4550 0.8214 0.9063
0.3297 9.0714 508 0.8190 0.4429 0.8190 0.9050
0.3297 9.1071 510 0.8296 0.4391 0.8296 0.9108
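In every row above, the validation loss equals the MSE and the RMSE equals √MSE, which indicates the model is trained as a regressor with a mean-squared-error objective. A quick check against the final row:

```python
import math

mse = 0.8296  # final validation MSE from the table above
rmse = math.sqrt(mse)
print(round(rmse, 4))  # matches the reported RMSE of 0.9108
```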

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (Safetensors, F32)

Model tree for MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k11_task5_organization

Fine-tuned from aubmindlab/bert-base-arabertv02 (one of 4019 fine-tuned models of that base).