ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k19_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5532
  • QWK: 0.4505
  • MSE: 0.5532
  • RMSE: 0.7438
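The reported loss equals the MSE, which suggests (though the card does not state it) that the model was trained as a regressor with an MSE objective; QWK is quadratic weighted kappa. Below is a sketch of how these metrics can be computed with scikit-learn; the 0–4 integer score scale in the toy example is hypothetical, since the card does not state the label range.

```python
# Sketch of computing the card's evaluation metrics (QWK, MSE, RMSE)
# with scikit-learn. The 0-4 score scale below is a hypothetical example.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def evaluate(y_true, y_pred):
    # Quadratic weighted kappa: penalizes disagreements by squared distance
    qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
    mse = mean_squared_error(y_true, y_pred)
    rmse = float(np.sqrt(mse))
    return {"qwk": qwk, "mse": mse, "rmse": rmse}

# Toy predictions: two off-by-one errors out of six essays
y_true = [0, 1, 2, 3, 4, 2]
y_pred = [0, 1, 2, 2, 4, 3]
metrics = evaluate(y_true, y_pred)
```

Note that RMSE is simply the square root of MSE, which is why the table's RMSE column can always be derived from the validation-loss column.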

Model description

More information needed

Intended uses & limitations

More information needed
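Although the card gives no usage details, the checkpoint can presumably be loaded like any sequence-classification model from the Hub. The sketch below assumes a single regression logit (essay organization score); the repo id is taken from this card's title, and the scoring helper is a hypothetical convenience wrapper, not part of the released model.

```python
# Hedged inference sketch (assumption: single regression logit per essay).
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

REPO_ID = (
    "MayBashendy/"
    "ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k19_task7_organization"
)

def score_essay(text, model, tokenizer):
    """Return the scalar organization score for one essay."""
    inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
    with torch.no_grad():
        logits = model(**inputs).logits
    # Assumes the head outputs one regression logit
    return logits.squeeze().item()

# Downloading the checkpoint requires network access:
# tokenizer = AutoTokenizer.from_pretrained(REPO_ID)
# model = AutoModelForSequenceClassification.from_pretrained(REPO_ID)
# print(score_essay("...", model, tokenizer))
```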

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100

Training results

The table below reports validation metrics over the course of training. Training loss was first logged at step 500, so earlier rows show "No log".

Training Loss Epoch Step Validation Loss QWK MSE RMSE
No log 0.0206 2 2.6693 -0.1213 2.6693 1.6338
No log 0.0412 4 1.3713 0.0111 1.3713 1.1710
No log 0.0619 6 1.0520 -0.0681 1.0520 1.0257
No log 0.0825 8 0.8570 0.1347 0.8570 0.9257
No log 0.1031 10 1.0218 0.1535 1.0218 1.0108
No log 0.1237 12 0.7916 0.2204 0.7916 0.8897
No log 0.1443 14 0.8251 0.4129 0.8251 0.9083
No log 0.1649 16 0.7227 0.3213 0.7227 0.8501
No log 0.1856 18 0.8165 0.1962 0.8165 0.9036
No log 0.2062 20 0.7809 0.3032 0.7809 0.8837
No log 0.2268 22 0.7220 0.3238 0.7220 0.8497
No log 0.2474 24 0.6803 0.2506 0.6803 0.8248
No log 0.2680 26 0.6816 0.2085 0.6816 0.8256
No log 0.2887 28 0.8104 0.2904 0.8104 0.9002
No log 0.3093 30 0.7834 0.3032 0.7834 0.8851
No log 0.3299 32 0.6716 0.2085 0.6716 0.8195
No log 0.3505 34 0.6433 0.3774 0.6433 0.8020
No log 0.3711 36 0.7268 0.4674 0.7268 0.8526
No log 0.3918 38 0.6480 0.3363 0.6480 0.8050
No log 0.4124 40 0.7301 0.3355 0.7301 0.8545
No log 0.4330 42 0.9711 0.3137 0.9711 0.9855
No log 0.4536 44 0.8982 0.3042 0.8982 0.9477
No log 0.4742 46 0.7074 0.2590 0.7074 0.8411
No log 0.4948 48 0.7264 0.2182 0.7264 0.8523
No log 0.5155 50 0.8223 0.2812 0.8223 0.9068
No log 0.5361 52 0.7711 0.3095 0.7711 0.8781
No log 0.5567 54 0.8653 0.3359 0.8653 0.9302
No log 0.5773 56 0.8291 0.3359 0.8291 0.9106
No log 0.5979 58 0.8267 0.3606 0.8267 0.9092
No log 0.6186 60 0.7145 0.2995 0.7145 0.8453
No log 0.6392 62 0.6564 0.2981 0.6564 0.8102
No log 0.6598 64 0.6982 0.3127 0.6982 0.8356
No log 0.6804 66 0.6403 0.3324 0.6403 0.8002
No log 0.7010 68 0.7080 0.3355 0.7080 0.8414
No log 0.7216 70 0.8440 0.2518 0.8440 0.9187
No log 0.7423 72 1.4476 0.1327 1.4476 1.2032
No log 0.7629 74 1.6323 0.0665 1.6323 1.2776
No log 0.7835 76 1.1951 0.2055 1.1951 1.0932
No log 0.8041 78 0.7252 0.4697 0.7252 0.8516
No log 0.8247 80 0.8447 0.4340 0.8447 0.9190
No log 0.8454 82 0.8605 0.4038 0.8605 0.9276
No log 0.8660 84 0.7144 0.4000 0.7144 0.8452
No log 0.8866 86 0.7776 0.3450 0.7776 0.8818
No log 0.9072 88 1.0060 0.3051 1.0060 1.0030
No log 0.9278 90 1.0804 0.2886 1.0804 1.0394
No log 0.9485 92 1.0399 0.2886 1.0399 1.0198
No log 0.9691 94 1.0783 0.2392 1.0783 1.0384
No log 0.9897 96 1.1094 0.2084 1.1094 1.0533
No log 1.0103 98 1.1500 0.2084 1.1500 1.0724
No log 1.0309 100 1.3225 0.1805 1.3225 1.1500
No log 1.0515 102 1.0531 0.2999 1.0531 1.0262
No log 1.0722 104 0.8173 0.1541 0.8173 0.9040
No log 1.0928 106 0.7715 0.2121 0.7715 0.8784
No log 1.1134 108 0.7361 0.3258 0.7361 0.8579
No log 1.1340 110 0.8103 0.3440 0.8103 0.9002
No log 1.1546 112 1.0390 0.3052 1.0390 1.0193
No log 1.1753 114 1.0237 0.2824 1.0237 1.0118
No log 1.1959 116 0.9767 0.3031 0.9767 0.9883
No log 1.2165 118 0.8939 0.3381 0.8939 0.9454
No log 1.2371 120 0.8098 0.3754 0.8098 0.8999
No log 1.2577 122 0.7120 0.4704 0.7120 0.8438
No log 1.2784 124 0.8298 0.3710 0.8298 0.9110
No log 1.2990 126 0.9349 0.3411 0.9349 0.9669
No log 1.3196 128 1.0383 0.3203 1.0383 1.0190
No log 1.3402 130 0.8010 0.3320 0.8010 0.8950
No log 1.3608 132 0.7419 0.4513 0.7419 0.8613
No log 1.3814 134 0.7376 0.4513 0.7376 0.8588
No log 1.4021 136 0.9012 0.3740 0.9012 0.9493
No log 1.4227 138 0.8403 0.3803 0.8403 0.9167
No log 1.4433 140 0.8236 0.3803 0.8236 0.9075
No log 1.4639 142 0.7159 0.4684 0.7159 0.8461
No log 1.4845 144 0.8336 0.3579 0.8336 0.9130
No log 1.5052 146 0.8220 0.3638 0.8220 0.9066
No log 1.5258 148 0.6905 0.3936 0.6905 0.8310
No log 1.5464 150 0.6307 0.3561 0.6307 0.7942
No log 1.5670 152 0.6242 0.3910 0.6242 0.7901
No log 1.5876 154 0.6268 0.3586 0.6268 0.7917
No log 1.6082 156 0.6331 0.4229 0.6331 0.7957
No log 1.6289 158 0.6648 0.4414 0.6648 0.8153
No log 1.6495 160 0.9026 0.3618 0.9026 0.9501
No log 1.6701 162 0.9658 0.3305 0.9658 0.9827
No log 1.6907 164 0.7497 0.3976 0.7497 0.8659
No log 1.7113 166 0.7172 0.4260 0.7172 0.8469
No log 1.7320 168 0.7615 0.4144 0.7615 0.8727
No log 1.7526 170 0.7400 0.4163 0.7400 0.8602
No log 1.7732 172 0.7296 0.4163 0.7296 0.8541
No log 1.7938 174 0.6909 0.4091 0.6909 0.8312
No log 1.8144 176 0.6833 0.4091 0.6833 0.8266
No log 1.8351 178 0.7483 0.3909 0.7483 0.8650
No log 1.8557 180 0.8904 0.3074 0.8904 0.9436
No log 1.8763 182 0.7663 0.4409 0.7663 0.8754
No log 1.8969 184 0.6519 0.3572 0.6519 0.8074
No log 1.9175 186 0.6627 0.3894 0.6627 0.8141
No log 1.9381 188 0.6985 0.3518 0.6985 0.8357
No log 1.9588 190 0.7401 0.4330 0.7401 0.8603
No log 1.9794 192 0.8194 0.4102 0.8194 0.9052
No log 2.0 194 0.7861 0.4424 0.7861 0.8866
No log 2.0206 196 0.6676 0.5770 0.6676 0.8170
No log 2.0412 198 0.6282 0.4315 0.6282 0.7926
No log 2.0619 200 0.6818 0.4444 0.6818 0.8257
No log 2.0825 202 0.6068 0.3457 0.6068 0.7790
No log 2.1031 204 0.6000 0.5036 0.6000 0.7746
No log 2.1237 206 0.5949 0.4945 0.5949 0.7713
No log 2.1443 208 0.5643 0.4776 0.5643 0.7512
No log 2.1649 210 0.7613 0.3439 0.7613 0.8726
No log 2.1856 212 1.0503 0.2746 1.0503 1.0249
No log 2.2062 214 0.9777 0.3278 0.9777 0.9888
No log 2.2268 216 0.7014 0.4193 0.7014 0.8375
No log 2.2474 218 0.5543 0.4160 0.5543 0.7445
No log 2.2680 220 0.5851 0.4684 0.5851 0.7649
No log 2.2887 222 0.5883 0.4945 0.5883 0.7670
No log 2.3093 224 0.5528 0.4575 0.5528 0.7435
No log 2.3299 226 0.5756 0.4737 0.5756 0.7587
No log 2.3505 228 0.5676 0.5092 0.5676 0.7534
No log 2.3711 230 0.5738 0.4358 0.5738 0.7575
No log 2.3918 232 0.6157 0.5655 0.6157 0.7847
No log 2.4124 234 0.5793 0.4555 0.5793 0.7611
No log 2.4330 236 0.5645 0.4287 0.5645 0.7513
No log 2.4536 238 0.6147 0.4391 0.6147 0.7840
No log 2.4742 240 0.6576 0.4230 0.6576 0.8109
No log 2.4948 242 0.5951 0.4427 0.5951 0.7714
No log 2.5155 244 0.5930 0.4914 0.5930 0.7701
No log 2.5361 246 0.6730 0.4684 0.6730 0.8204
No log 2.5567 248 0.6468 0.4964 0.6468 0.8042
No log 2.5773 250 0.5927 0.5092 0.5927 0.7699
No log 2.5979 252 0.7676 0.3204 0.7676 0.8761
No log 2.6186 254 0.7714 0.3432 0.7714 0.8783
No log 2.6392 256 0.6751 0.4179 0.6751 0.8216
No log 2.6598 258 0.6501 0.4556 0.6501 0.8063
No log 2.6804 260 0.5863 0.5234 0.5863 0.7657
No log 2.7010 262 0.5896 0.4660 0.5896 0.7679
No log 2.7216 264 0.5905 0.4660 0.5905 0.7684
No log 2.7423 266 0.5869 0.3100 0.5869 0.7661
No log 2.7629 268 0.5881 0.3837 0.5881 0.7669
No log 2.7835 270 0.6168 0.4660 0.6168 0.7853
No log 2.8041 272 0.6106 0.4114 0.6106 0.7814
No log 2.8247 274 0.6222 0.4322 0.6222 0.7888
No log 2.8454 276 0.6247 0.4322 0.6247 0.7904
No log 2.8660 278 0.6296 0.4114 0.6296 0.7935
No log 2.8866 280 0.6958 0.4212 0.6958 0.8341
No log 2.9072 282 0.8636 0.3538 0.8636 0.9293
No log 2.9278 284 0.9035 0.3417 0.9035 0.9505
No log 2.9485 286 0.9675 0.3359 0.9675 0.9836
No log 2.9691 288 0.8758 0.3359 0.8758 0.9358
No log 2.9897 290 0.8096 0.4404 0.8096 0.8998
No log 3.0103 292 0.7228 0.3843 0.7228 0.8502
No log 3.0309 294 0.6914 0.4091 0.6914 0.8315
No log 3.0515 296 0.6851 0.4036 0.6851 0.8277
No log 3.0722 298 0.7518 0.4190 0.7518 0.8670
No log 3.0928 300 0.8846 0.3204 0.8846 0.9406
No log 3.1134 302 0.8844 0.2830 0.8844 0.9404
No log 3.1340 304 0.7855 0.3630 0.7855 0.8863
No log 3.1546 306 0.6453 0.3945 0.6453 0.8033
No log 3.1753 308 0.6364 0.3781 0.6364 0.7977
No log 3.1959 310 0.6383 0.3675 0.6383 0.7989
No log 3.2165 312 0.7177 0.3444 0.7177 0.8472
No log 3.2371 314 0.7696 0.3637 0.7696 0.8772
No log 3.2577 316 0.6842 0.4100 0.6842 0.8272
No log 3.2784 318 0.6357 0.3964 0.6357 0.7973
No log 3.2990 320 0.6367 0.4762 0.6367 0.7979
No log 3.3196 322 0.6382 0.3945 0.6382 0.7989
No log 3.3402 324 0.6922 0.3615 0.6922 0.8320
No log 3.3608 326 0.6819 0.3060 0.6819 0.8258
No log 3.3814 328 0.6182 0.3649 0.6182 0.7863
No log 3.4021 330 0.5926 0.3781 0.5926 0.7698
No log 3.4227 332 0.5903 0.4077 0.5903 0.7683
No log 3.4433 334 0.5987 0.3945 0.5987 0.7738
No log 3.4639 336 0.6541 0.4020 0.6541 0.8088
No log 3.4845 338 0.7074 0.4294 0.7074 0.8411
No log 3.5052 340 0.6359 0.4020 0.6359 0.7974
No log 3.5258 342 0.6135 0.4337 0.6135 0.7833
No log 3.5464 344 0.6172 0.4697 0.6172 0.7856
No log 3.5670 346 0.6278 0.4876 0.6278 0.7923
No log 3.5876 348 0.6131 0.4505 0.6131 0.7830
No log 3.6082 350 0.6833 0.4020 0.6833 0.8266
No log 3.6289 352 0.9161 0.3324 0.9161 0.9571
No log 3.6495 354 1.0016 0.2824 1.0016 1.0008
No log 3.6701 356 0.9607 0.2613 0.9607 0.9802
No log 3.6907 358 0.7637 0.3843 0.7637 0.8739
No log 3.7113 360 0.6042 0.3702 0.6042 0.7773
No log 3.7320 362 0.6163 0.4486 0.6163 0.7850
No log 3.7526 364 0.5990 0.4101 0.5990 0.7739
No log 3.7732 366 0.6377 0.4206 0.6377 0.7986
No log 3.7938 368 0.8873 0.3403 0.8873 0.9420
No log 3.8144 370 1.0346 0.3399 1.0346 1.0172
No log 3.8351 372 0.9559 0.3501 0.9559 0.9777
No log 3.8557 374 0.7309 0.3544 0.7309 0.8549
No log 3.8763 376 0.6121 0.3474 0.6121 0.7824
No log 3.8969 378 0.6119 0.3454 0.6119 0.7823
No log 3.9175 380 0.6113 0.3183 0.6113 0.7819
No log 3.9381 382 0.6522 0.3341 0.6522 0.8076
No log 3.9588 384 0.7227 0.2817 0.7227 0.8501
No log 3.9794 386 0.7156 0.3737 0.7156 0.8459
No log 4.0 388 0.6949 0.4158 0.6949 0.8336
No log 4.0206 390 0.6294 0.3545 0.6294 0.7934
No log 4.0412 392 0.6007 0.3324 0.6007 0.7750
No log 4.0619 394 0.5946 0.4276 0.5946 0.7711
No log 4.0825 396 0.6063 0.3649 0.6063 0.7787
No log 4.1031 398 0.6617 0.3894 0.6617 0.8134
No log 4.1237 400 0.6900 0.4076 0.6900 0.8307
No log 4.1443 402 0.6723 0.4076 0.6723 0.8199
No log 4.1649 404 0.6272 0.4001 0.6272 0.7920
No log 4.1856 406 0.6120 0.4441 0.6120 0.7823
No log 4.2062 408 0.6130 0.4659 0.6130 0.7829
No log 4.2268 410 0.6378 0.4517 0.6378 0.7986
No log 4.2474 412 0.6321 0.4734 0.6321 0.7950
No log 4.2680 414 0.6175 0.3984 0.6175 0.7858
No log 4.2887 416 0.7276 0.4491 0.7276 0.8530
No log 4.3093 418 0.8278 0.3473 0.8278 0.9099
No log 4.3299 420 0.8214 0.3538 0.8214 0.9063
No log 4.3505 422 0.8310 0.3538 0.8310 0.9116
No log 4.3711 424 0.8344 0.3538 0.8344 0.9135
No log 4.3918 426 0.9034 0.3110 0.9034 0.9505
No log 4.4124 428 0.9032 0.3251 0.9032 0.9504
No log 4.4330 430 0.8245 0.3134 0.8245 0.9080
No log 4.4536 432 0.7398 0.3737 0.7398 0.8601
No log 4.4742 434 0.6623 0.3545 0.6623 0.8138
No log 4.4948 436 0.6576 0.3545 0.6576 0.8109
No log 4.5155 438 0.6699 0.3518 0.6699 0.8185
No log 4.5361 440 0.7514 0.3972 0.7514 0.8668
No log 4.5567 442 0.7734 0.3894 0.7734 0.8794
No log 4.5773 444 0.8181 0.4230 0.8181 0.9045
No log 4.5979 446 0.8129 0.4387 0.8129 0.9016
No log 4.6186 448 0.7579 0.4307 0.7579 0.8706
No log 4.6392 450 0.6667 0.3238 0.6667 0.8165
No log 4.6598 452 0.6201 0.3701 0.6201 0.7875
No log 4.6804 454 0.6215 0.3701 0.6215 0.7884
No log 4.7010 456 0.6331 0.3267 0.6331 0.7957
No log 4.7216 458 0.6802 0.3518 0.6802 0.8248
No log 4.7423 460 0.7853 0.4624 0.7853 0.8862
No log 4.7629 462 0.8497 0.3846 0.8497 0.9218
No log 4.7835 464 0.7931 0.4624 0.7931 0.8906
No log 4.8041 466 0.6924 0.3814 0.6924 0.8321
No log 4.8247 468 0.6362 0.4100 0.6362 0.7976
No log 4.8454 470 0.6174 0.4100 0.6174 0.7858
No log 4.8660 472 0.5892 0.4514 0.5892 0.7676
No log 4.8866 474 0.5826 0.4576 0.5826 0.7633
No log 4.9072 476 0.6230 0.5385 0.6230 0.7893
No log 4.9278 478 0.6714 0.5345 0.6714 0.8194
No log 4.9485 480 0.6151 0.5158 0.6151 0.7843
No log 4.9691 482 0.5515 0.4591 0.5515 0.7426
No log 4.9897 484 0.5437 0.4547 0.5437 0.7374
No log 5.0103 486 0.5496 0.4634 0.5496 0.7414
No log 5.0309 488 0.5578 0.4613 0.5578 0.7469
No log 5.0515 490 0.5525 0.4505 0.5525 0.7433
No log 5.0722 492 0.5498 0.4991 0.5498 0.7415
No log 5.0928 494 0.5745 0.4576 0.5745 0.7580
No log 5.1134 496 0.6153 0.4534 0.6153 0.7844
No log 5.1340 498 0.6599 0.5219 0.6599 0.8124
0.31 5.1546 500 0.6303 0.5560 0.6303 0.7939
0.31 5.1753 502 0.6293 0.5560 0.6293 0.7933
0.31 5.1959 504 0.5941 0.4660 0.5941 0.7708
0.31 5.2165 506 0.5772 0.4419 0.5772 0.7597
0.31 5.2371 508 0.5577 0.4505 0.5577 0.7468
0.31 5.2577 510 0.5532 0.4505 0.5532 0.7438

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (F32, Safetensors)

Model tree for MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k19_task7_organization

Finetuned from aubmindlab/bert-base-arabertv02.