ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k20_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9566
  • Qwk (quadratic weighted kappa): 0.3173
  • Mse: 0.9566
  • Rmse: 0.9781
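
For reference, these metrics can be reproduced with scikit-learn. A minimal sketch, assuming integer organization scores; `y_true` and `y_pred` are hypothetical placeholders, not data from this card:

```python
# Minimal sketch of the reported metrics, assuming integer organization scores.
# y_true / y_pred are hypothetical placeholders, not data from this model card.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([3, 2, 4, 1, 3])  # hypothetical gold scores
y_pred = np.array([2, 2, 3, 1, 4])  # hypothetical (rounded) model predictions

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # Qwk
mse = mean_squared_error(y_true, y_pred)                      # Mse
rmse = np.sqrt(mse)                                           # Rmse
print(f"QWK={qwk:.4f}  MSE={mse:.4f}  RMSE={rmse:.4f}")
```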

Model description

More information needed

Intended uses & limitations

More information needed
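
Pending fuller documentation, here is a minimal inference sketch. It assumes the checkpoint loads with a single-output regression head (the identical Loss and Mse values above suggest an MSE training objective); this is an assumption, not something the card confirms. The repository id is taken from the model page.

```python
# Hedged inference sketch -- assumes a single-output regression head;
# the card does not document the exact head or the label scale.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k20_task7_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

essay = "..."  # an Arabic essay to score for organization
inputs = tokenizer(essay, truncation=True, return_tensors="pt")
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()  # predicted organization score
print(score)
```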

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
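
These settings map onto transformers `TrainingArguments` roughly as follows. This is a sketch, not the published training script: dataset preparation and metric computation are omitted, and `output_dir` is a hypothetical placeholder.

```python
# Sketch: the listed hyperparameters expressed as transformers TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_task7_organization",  # hypothetical path, not from the card
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",  # linear decay; Trainer's default warmup is 0 steps
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```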

Training results

Validation metrics were logged every two training steps; the training loss is logged only every 500 steps, so rows before step 500 show "No log". Although num_epochs was set to 100, the logged results end at epoch 5.0 (step 510), whose metrics match the evaluation results reported above.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0196 2 2.7080 -0.0568 2.7080 1.6456
No log 0.0392 4 1.5615 0.0732 1.5615 1.2496
No log 0.0588 6 1.3630 -0.2311 1.3630 1.1675
No log 0.0784 8 1.1584 -0.1841 1.1584 1.0763
No log 0.0980 10 1.0091 0.0532 1.0091 1.0045
No log 0.1176 12 0.8963 0.0200 0.8963 0.9468
No log 0.1373 14 0.8177 0.0 0.8177 0.9042
No log 0.1569 16 0.8042 -0.0027 0.8042 0.8968
No log 0.1765 18 0.8571 0.0679 0.8571 0.9258
No log 0.1961 20 0.8795 0.0679 0.8795 0.9378
No log 0.2157 22 0.8593 0.0236 0.8593 0.9270
No log 0.2353 24 0.8303 0.0608 0.8303 0.9112
No log 0.2549 26 0.8235 0.1407 0.8235 0.9075
No log 0.2745 28 0.8417 0.1699 0.8417 0.9174
No log 0.2941 30 0.8571 0.0816 0.8571 0.9258
No log 0.3137 32 0.8588 0.1183 0.8588 0.9267
No log 0.3333 34 0.8308 0.1359 0.8308 0.9115
No log 0.3529 36 0.8310 0.0652 0.8310 0.9116
No log 0.3725 38 0.8971 0.1174 0.8971 0.9471
No log 0.3922 40 0.9141 0.1548 0.9141 0.9561
No log 0.4118 42 0.8249 0.0330 0.8249 0.9082
No log 0.4314 44 0.8294 0.0968 0.8294 0.9107
No log 0.4510 46 0.9166 0.0520 0.9166 0.9574
No log 0.4706 48 0.9445 -0.0112 0.9445 0.9719
No log 0.4902 50 0.9087 0.0200 0.9087 0.9533
No log 0.5098 52 0.8897 0.0627 0.8897 0.9433
No log 0.5294 54 0.9268 -0.0112 0.9268 0.9627
No log 0.5490 56 0.8846 0.0627 0.8846 0.9405
No log 0.5686 58 0.8812 0.1308 0.8812 0.9387
No log 0.5882 60 0.8685 0.0968 0.8685 0.9319
No log 0.6078 62 0.9796 0.0994 0.9796 0.9897
No log 0.6275 64 1.0392 0.1672 1.0392 1.0194
No log 0.6471 66 1.0396 0.1251 1.0396 1.0196
No log 0.6667 68 0.9813 0.1029 0.9813 0.9906
No log 0.6863 70 0.9264 0.0520 0.9264 0.9625
No log 0.7059 72 0.9153 0.0856 0.9153 0.9567
No log 0.7255 74 0.9366 0.0821 0.9366 0.9678
No log 0.7451 76 0.9526 0.0563 0.9526 0.9760
No log 0.7647 78 0.9425 0.1010 0.9425 0.9708
No log 0.7843 80 0.9164 0.1093 0.9164 0.9573
No log 0.8039 82 0.9192 0.0532 0.9192 0.9587
No log 0.8235 84 1.0088 0.0668 1.0088 1.0044
No log 0.8431 86 1.0957 0.1334 1.0957 1.0467
No log 0.8627 88 1.0767 0.1178 1.0767 1.0376
No log 0.8824 90 0.9867 0.0691 0.9867 0.9933
No log 0.9020 92 1.0048 0.0279 1.0048 1.0024
No log 0.9216 94 0.9892 0.0302 0.9892 0.9946
No log 0.9412 96 0.9323 0.0124 0.9323 0.9656
No log 0.9608 98 0.9469 0.1065 0.9469 0.9731
No log 0.9804 100 0.9644 0.0279 0.9644 0.9820
No log 1.0 102 0.9589 0.0279 0.9589 0.9792
No log 1.0196 104 0.9271 0.0994 0.9271 0.9629
No log 1.0392 106 0.8880 0.0541 0.8880 0.9423
No log 1.0588 108 0.8871 -0.0955 0.8871 0.9419
No log 1.0784 110 0.8739 -0.0533 0.8739 0.9348
No log 1.0980 112 0.8730 0.0888 0.8730 0.9344
No log 1.1176 114 0.8942 0.1141 0.8942 0.9456
No log 1.1373 116 0.9039 0.1694 0.9039 0.9508
No log 1.1569 118 0.9315 0.1694 0.9315 0.9652
No log 1.1765 120 0.9368 0.2662 0.9368 0.9679
No log 1.1961 122 0.9529 0.2173 0.9529 0.9761
No log 1.2157 124 0.9231 0.2899 0.9231 0.9608
No log 1.2353 126 0.8629 0.2305 0.8629 0.9289
No log 1.2549 128 0.8277 0.1806 0.8277 0.9098
No log 1.2745 130 0.8328 0.2279 0.8328 0.9126
No log 1.2941 132 0.8517 0.1853 0.8517 0.9229
No log 1.3137 134 0.8757 0.1806 0.8757 0.9358
No log 1.3333 136 0.9133 0.2027 0.9133 0.9557
No log 1.3529 138 0.9132 0.2335 0.9132 0.9556
No log 1.3725 140 0.8931 0.2090 0.8931 0.9451
No log 1.3922 142 0.9015 0.1236 0.9015 0.9495
No log 1.4118 144 0.8989 0.0980 0.8989 0.9481
No log 1.4314 146 0.8810 0.1457 0.8810 0.9386
No log 1.4510 148 0.8880 0.2335 0.8880 0.9423
No log 1.4706 150 0.9016 0.2027 0.9016 0.9495
No log 1.4902 152 0.8995 0.2077 0.8995 0.9484
No log 1.5098 154 0.8659 0.1558 0.8659 0.9305
No log 1.5294 156 0.8866 0.0699 0.8866 0.9416
No log 1.5490 158 0.9011 0.1592 0.9011 0.9493
No log 1.5686 160 0.9264 0.2063 0.9264 0.9625
No log 1.5882 162 0.8751 0.2796 0.8751 0.9354
No log 1.6078 164 0.8356 0.2203 0.8356 0.9141
No log 1.6275 166 0.8343 0.1773 0.8343 0.9134
No log 1.6471 168 0.8741 0.2832 0.8741 0.9349
No log 1.6667 170 0.8720 0.3207 0.8720 0.9338
No log 1.6863 172 0.8270 0.2835 0.8270 0.9094
No log 1.7059 174 0.8312 0.3149 0.8312 0.9117
No log 1.7255 176 0.8679 0.2013 0.8679 0.9316
No log 1.7451 178 0.8501 0.2048 0.8501 0.9220
No log 1.7647 180 0.8444 0.2023 0.8444 0.9189
No log 1.7843 182 0.8699 0.2805 0.8699 0.9327
No log 1.8039 184 0.8886 0.2643 0.8886 0.9426
No log 1.8235 186 0.8826 0.2354 0.8826 0.9395
No log 1.8431 188 0.8609 0.2717 0.8609 0.9278
No log 1.8627 190 0.8553 0.0941 0.8553 0.9248
No log 1.8824 192 0.8562 0.1624 0.8562 0.9253
No log 1.9020 194 0.8861 0.2691 0.8861 0.9413
No log 1.9216 196 0.9713 0.3320 0.9713 0.9855
No log 1.9412 198 1.0736 0.2555 1.0736 1.0361
No log 1.9608 200 1.0763 0.2555 1.0763 1.0375
No log 1.9804 202 0.9574 0.3125 0.9574 0.9785
No log 2.0 204 0.8996 0.2577 0.8996 0.9485
No log 2.0196 206 0.8775 0.1364 0.8775 0.9367
No log 2.0392 208 0.8845 0.1816 0.8845 0.9405
No log 2.0588 210 0.8860 0.2066 0.8860 0.9413
No log 2.0784 212 0.8957 0.2173 0.8957 0.9464
No log 2.0980 214 0.9990 0.3068 0.9990 0.9995
No log 2.1176 216 1.0438 0.3183 1.0438 1.0217
No log 2.1373 218 1.0972 0.2175 1.0972 1.0475
No log 2.1569 220 1.0493 0.2881 1.0493 1.0243
No log 2.1765 222 0.9512 0.3068 0.9512 0.9753
No log 2.1961 224 0.9188 0.3125 0.9188 0.9586
No log 2.2157 226 0.8987 0.2887 0.8987 0.9480
No log 2.2353 228 0.8713 0.2145 0.8713 0.9334
No log 2.2549 230 0.8968 0.2099 0.8968 0.9470
No log 2.2745 232 0.8991 0.2643 0.8991 0.9482
No log 2.2941 234 0.9074 0.2882 0.9074 0.9526
No log 2.3137 236 0.9074 0.2776 0.9074 0.9526
No log 2.3333 238 0.8643 0.1464 0.8643 0.9297
No log 2.3529 240 0.8869 0.2325 0.8869 0.9417
No log 2.3725 242 0.9222 0.1966 0.9222 0.9603
No log 2.3922 244 0.9196 0.2595 0.9196 0.9589
No log 2.4118 246 0.9407 0.2210 0.9407 0.9699
No log 2.4314 248 0.9837 0.2375 0.9837 0.9918
No log 2.4510 250 0.9951 0.2375 0.9951 0.9976
No log 2.4706 252 0.9585 0.2451 0.9585 0.9790
No log 2.4902 254 0.9178 0.1859 0.9178 0.9580
No log 2.5098 256 0.8875 0.1582 0.8875 0.9421
No log 2.5294 258 0.8886 0.2016 0.8886 0.9427
No log 2.5490 260 0.8830 0.2016 0.8830 0.9397
No log 2.5686 262 0.9076 0.2397 0.9076 0.9527
No log 2.5882 264 0.9375 0.3115 0.9375 0.9682
No log 2.6078 266 0.9432 0.3320 0.9432 0.9712
No log 2.6275 268 0.9305 0.3544 0.9305 0.9646
No log 2.6471 270 0.9410 0.3544 0.9410 0.9701
No log 2.6667 272 0.9942 0.3544 0.9942 0.9971
No log 2.6863 274 1.0056 0.3320 1.0056 1.0028
No log 2.7059 276 0.9695 0.2825 0.9695 0.9846
No log 2.7255 278 0.9222 0.2057 0.9222 0.9603
No log 2.7451 280 0.8799 0.1791 0.8799 0.9380
No log 2.7647 282 0.8644 0.2442 0.8644 0.9297
No log 2.7843 284 0.8596 0.2164 0.8596 0.9271
No log 2.8039 286 0.8594 0.1732 0.8594 0.9271
No log 2.8235 288 0.8642 0.2555 0.8642 0.9296
No log 2.8431 290 0.8573 0.2504 0.8573 0.9259
No log 2.8627 292 0.8481 0.3035 0.8481 0.9209
No log 2.8824 294 0.8317 0.3018 0.8317 0.9120
No log 2.9020 296 0.7896 0.2988 0.7896 0.8886
No log 2.9216 298 0.7695 0.3051 0.7695 0.8772
No log 2.9412 300 0.7534 0.2561 0.7534 0.8680
No log 2.9608 302 0.7453 0.2715 0.7453 0.8633
No log 2.9804 304 0.7578 0.2965 0.7578 0.8705
No log 3.0 306 0.7595 0.2713 0.7595 0.8715
No log 3.0196 308 0.7769 0.3031 0.7769 0.8814
No log 3.0392 310 0.8212 0.3340 0.8212 0.9062
No log 3.0588 312 0.8927 0.3183 0.8927 0.9448
No log 3.0784 314 0.9290 0.3320 0.9290 0.9639
No log 3.0980 316 0.9194 0.3157 0.9194 0.9588
No log 3.1176 318 0.8941 0.3157 0.8941 0.9456
No log 3.1373 320 0.8627 0.2633 0.8627 0.9288
No log 3.1569 322 0.8108 0.2445 0.8108 0.9004
No log 3.1765 324 0.8091 0.2872 0.8091 0.8995
No log 3.1961 326 0.8020 0.3144 0.8020 0.8955
No log 3.2157 328 0.7921 0.2872 0.7921 0.8900
No log 3.2353 330 0.8153 0.2576 0.8153 0.9029
No log 3.2549 332 0.8468 0.2781 0.8468 0.9202
No log 3.2745 334 0.9425 0.2928 0.9425 0.9708
No log 3.2941 336 1.0430 0.3044 1.0430 1.0213
No log 3.3137 338 1.0041 0.2826 1.0041 1.0020
No log 3.3333 340 0.8920 0.3379 0.8920 0.9445
No log 3.3529 342 0.8682 0.3379 0.8682 0.9318
No log 3.3725 344 0.8271 0.3526 0.8271 0.9094
No log 3.3922 346 0.8431 0.3544 0.8431 0.9182
No log 3.4118 348 0.9033 0.3012 0.9033 0.9504
No log 3.4314 350 0.8964 0.2861 0.8964 0.9468
No log 3.4510 352 0.8697 0.2861 0.8697 0.9326
No log 3.4706 354 0.8832 0.3159 0.8832 0.9398
No log 3.4902 356 0.8826 0.2315 0.8826 0.9394
No log 3.5098 358 0.8435 0.2781 0.8435 0.9184
No log 3.5294 360 0.8336 0.3569 0.8336 0.9130
No log 3.5490 362 0.8430 0.3042 0.8430 0.9182
No log 3.5686 364 0.8718 0.3209 0.8718 0.9337
No log 3.5882 366 0.9371 0.2590 0.9371 0.9681
No log 3.6078 368 0.9566 0.2539 0.9566 0.9781
No log 3.6275 370 0.9009 0.2857 0.9009 0.9492
No log 3.6471 372 0.9150 0.2857 0.9150 0.9566
No log 3.6667 374 1.0028 0.3206 1.0028 1.0014
No log 3.6863 376 1.0232 0.2826 1.0232 1.0115
No log 3.7059 378 0.9529 0.3068 0.9529 0.9762
No log 3.7255 380 0.9374 0.3068 0.9374 0.9682
No log 3.7451 382 0.9314 0.2804 0.9314 0.9651
No log 3.7647 384 0.8602 0.3146 0.8602 0.9275
No log 3.7843 386 0.8379 0.2479 0.8379 0.9154
No log 3.8039 388 0.8379 0.2843 0.8379 0.9154
No log 3.8235 390 0.8464 0.3314 0.8464 0.9200
No log 3.8431 392 0.8530 0.2722 0.8530 0.9236
No log 3.8627 394 0.9168 0.2779 0.9168 0.9575
No log 3.8824 396 1.0031 0.2727 1.0031 1.0015
No log 3.9020 398 1.0279 0.3239 1.0279 1.0138
No log 3.9216 400 0.9826 0.3379 0.9826 0.9913
No log 3.9412 402 0.8767 0.3231 0.8767 0.9363
No log 3.9608 404 0.8261 0.2445 0.8261 0.9089
No log 3.9804 406 0.8155 0.2445 0.8155 0.9030
No log 4.0 408 0.8281 0.2862 0.8281 0.9100
No log 4.0196 410 0.8634 0.3085 0.8634 0.9292
No log 4.0392 412 0.8808 0.3368 0.8808 0.9385
No log 4.0588 414 0.9137 0.2670 0.9137 0.9559
No log 4.0784 416 0.9028 0.2670 0.9028 0.9502
No log 4.0980 418 0.8765 0.2415 0.8765 0.9362
No log 4.1176 420 0.8317 0.2988 0.8317 0.9120
No log 4.1373 422 0.8085 0.2621 0.8085 0.8992
No log 4.1569 424 0.7984 0.1986 0.7984 0.8935
No log 4.1765 426 0.7988 0.1616 0.7988 0.8938
No log 4.1961 428 0.8033 0.1351 0.8033 0.8963
No log 4.2157 430 0.7997 0.1597 0.7997 0.8942
No log 4.2353 432 0.8060 0.1811 0.8060 0.8978
No log 4.2549 434 0.8184 0.2249 0.8184 0.9047
No log 4.2745 436 0.8283 0.2862 0.8283 0.9101
No log 4.2941 438 0.8510 0.3353 0.8510 0.9225
No log 4.3137 440 0.8436 0.1713 0.8436 0.9185
No log 4.3333 442 0.8462 0.1164 0.8462 0.9199
No log 4.3529 444 0.8710 0.1975 0.8710 0.9333
No log 4.3725 446 0.8917 0.3018 0.8917 0.9443
No log 4.3922 448 0.8661 0.1960 0.8661 0.9306
No log 4.4118 450 0.8222 0.2475 0.8222 0.9067
No log 4.4314 452 0.8121 0.2247 0.8121 0.9012
No log 4.4510 454 0.8243 0.2502 0.8243 0.9079
No log 4.4706 456 0.8535 0.2988 0.8535 0.9239
No log 4.4902 458 0.8654 0.3018 0.8654 0.9303
No log 4.5098 460 0.8597 0.2526 0.8597 0.9272
No log 4.5294 462 0.8447 0.2066 0.8447 0.9191
No log 4.5490 464 0.8298 0.2458 0.8298 0.9110
No log 4.5686 466 0.8272 0.2462 0.8272 0.9095
No log 4.5882 468 0.8300 0.2514 0.8300 0.9110
No log 4.6078 470 0.8139 0.2514 0.8139 0.9021
No log 4.6275 472 0.8106 0.2744 0.8106 0.9003
No log 4.6471 474 0.8187 0.2312 0.8187 0.9048
No log 4.6667 476 0.8217 0.2020 0.8217 0.9065
No log 4.6863 478 0.8353 0.2862 0.8353 0.9140
No log 4.7059 480 0.8945 0.3036 0.8945 0.9458
No log 4.7255 482 0.9075 0.2616 0.9075 0.9526
No log 4.7451 484 0.8495 0.2751 0.8495 0.9217
No log 4.7647 486 0.8087 0.2838 0.8087 0.8993
No log 4.7843 488 0.8123 0.2806 0.8123 0.9013
No log 4.8039 490 0.8206 0.2744 0.8206 0.9059
No log 4.8235 492 0.8278 0.2285 0.8278 0.9098
No log 4.8431 494 0.8363 0.2257 0.8363 0.9145
No log 4.8627 496 0.8325 0.2857 0.8325 0.9124
No log 4.8824 498 0.8214 0.3260 0.8214 0.9063
0.3458 4.9020 500 0.8199 0.3382 0.8199 0.9055
0.3458 4.9216 502 0.8601 0.3268 0.8601 0.9274
0.3458 4.9412 504 0.9328 0.2958 0.9328 0.9658
0.3458 4.9608 506 1.0059 0.2853 1.0059 1.0030
0.3458 4.9804 508 0.9630 0.2928 0.9630 0.9813
0.3458 5.0 510 0.9566 0.3173 0.9566 0.9781

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model size: 0.1B parameters (F32, Safetensors)