ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k20_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7526
  • Qwk: 0.1789
  • Mse: 0.7526
  • Rmse: 0.8675
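
The original evaluation script is not included in the card. As a minimal sketch of how the reported numbers can be reproduced from gold scores and model predictions, the snippet below computes quadratic weighted kappa (Qwk), MSE, and RMSE; it assumes the organization scores are (roughly) integer-valued so that QWK is well defined, which is an assumption rather than something stated in the card.

```python
# Minimal metric sketch (not the card author's evaluation code).
# Assumes gold labels and predictions are essay organization scores on an integer scale.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(y_true, y_pred):
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    mse = mean_squared_error(y_true, y_pred)
    return {
        # QWK compares rounded predictions against the gold integer scores.
        "qwk": cohen_kappa_score(
            y_true.astype(int), np.rint(y_pred).astype(int), weights="quadratic"
        ),
        "mse": mse,
        "rmse": float(np.sqrt(mse)),
    }

print(compute_metrics([0, 1, 2, 3], [0.2, 1.4, 1.8, 2.6]))
```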

Model description

More information needed

Intended uses & limitations

More information needed
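
No usage details are documented. As a hedged illustration only, the checkpoint can be loaded like any AraBERT-based sequence-classification model; the MSE/RMSE metrics above suggest a single-output (regression-style) scoring head, but that is an assumption, not something confirmed by the card.

```python
# Hedged loading/inference sketch. Assumes a single-logit scoring head,
# consistent with the regression-style metrics reported above.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "MayBashendy/ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k20_task7_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

inputs = tokenizer("نص المقال هنا", return_tensors="pt", truncation=True)  # "essay text here"
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()  # predicted organization score
print(score)
```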

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
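
The training script itself is not published. The following is only a sketch of TrainingArguments that mirrors the hyperparameters listed above; the output directory is hypothetical, and the evaluation/logging cadence is inferred from the results table below (evaluation every 2 steps, training loss first logged at step 500).

```python
# Sketch of TrainingArguments matching the listed hyperparameters; dataset handling,
# the model head, and metric callbacks are omitted because they are not documented.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="ArabicNewSplits7_task7_organization",  # hypothetical output path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",   # Adam betas=(0.9, 0.999), epsilon=1e-8 are Trainer defaults
    eval_strategy="steps",
    eval_steps=2,                 # the table below reports validation metrics every 2 steps
    logging_steps=500,            # training loss first appears at step 500 in the table
)
```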

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0299 2 2.7005 -0.0262 2.7005 1.6433
No log 0.0597 4 1.3803 0.0547 1.3803 1.1748
No log 0.0896 6 1.1456 -0.0846 1.1456 1.0703
No log 0.1194 8 1.4492 -0.1509 1.4492 1.2038
No log 0.1493 10 1.1327 -0.1608 1.1327 1.0643
No log 0.1791 12 1.0874 -0.0726 1.0874 1.0428
No log 0.2090 14 1.0838 -0.0753 1.0838 1.0411
No log 0.2388 16 1.0025 -0.0526 1.0025 1.0012
No log 0.2687 18 1.0177 -0.0700 1.0177 1.0088
No log 0.2985 20 0.9439 -0.0320 0.9439 0.9715
No log 0.3284 22 0.8799 0.0 0.8799 0.9380
No log 0.3582 24 0.8766 0.0 0.8766 0.9363
No log 0.3881 26 0.8740 0.0 0.8740 0.9349
No log 0.4179 28 0.8105 0.0 0.8105 0.9003
No log 0.4478 30 0.7800 0.0 0.7800 0.8832
No log 0.4776 32 0.7581 0.0 0.7581 0.8707
No log 0.5075 34 0.7741 0.0 0.7741 0.8799
No log 0.5373 36 0.8452 0.0 0.8452 0.9193
No log 0.5672 38 0.9679 0.0952 0.9679 0.9838
No log 0.5970 40 0.8806 0.0078 0.8806 0.9384
No log 0.6269 42 0.7805 -0.0027 0.7805 0.8835
No log 0.6567 44 0.7794 -0.0027 0.7794 0.8829
No log 0.6866 46 0.7893 0.0937 0.7893 0.8884
No log 0.7164 48 0.8328 0.1770 0.8328 0.9126
No log 0.7463 50 0.8052 0.1372 0.8052 0.8973
No log 0.7761 52 0.7823 0.1580 0.7823 0.8845
No log 0.8060 54 0.7812 0.2280 0.7812 0.8839
No log 0.8358 56 0.8036 0.2218 0.8036 0.8964
No log 0.8657 58 0.8623 0.1132 0.8623 0.9286
No log 0.8955 60 0.9422 0.1263 0.9422 0.9706
No log 0.9254 62 0.9503 0.0535 0.9503 0.9749
No log 0.9552 64 0.9043 0.1263 0.9043 0.9509
No log 0.9851 66 0.8570 0.1176 0.8570 0.9257
No log 1.0149 68 0.8342 0.1829 0.8342 0.9133
No log 1.0448 70 0.8308 0.1456 0.8308 0.9115
No log 1.0746 72 0.8668 0.1569 0.8668 0.9310
No log 1.1045 74 1.0012 0.0590 1.0012 1.0006
No log 1.1343 76 0.9536 0.0952 0.9536 0.9765
No log 1.1642 78 0.8479 0.1133 0.8479 0.9208
No log 1.1940 80 0.8630 0.1569 0.8630 0.9290
No log 1.2239 82 0.9713 0.1339 0.9713 0.9855
No log 1.2537 84 0.9383 0.0535 0.9383 0.9686
No log 1.2836 86 0.8667 -0.0051 0.8667 0.9310
No log 1.3134 88 0.8649 0.1359 0.8649 0.9300
No log 1.3433 90 0.8879 0.3020 0.8879 0.9423
No log 1.3731 92 0.8725 0.0327 0.8725 0.9341
No log 1.4030 94 0.8971 -0.0389 0.8971 0.9472
No log 1.4328 96 0.9802 -0.0354 0.9802 0.9901
No log 1.4627 98 0.9689 0.0051 0.9689 0.9843
No log 1.4925 100 0.9189 0.0218 0.9189 0.9586
No log 1.5224 102 0.9162 0.2621 0.9162 0.9572
No log 1.5522 104 0.9688 0.0790 0.9688 0.9843
No log 1.5821 106 1.0187 0.0134 1.0187 1.0093
No log 1.6119 108 0.9051 0.0437 0.9051 0.9514
No log 1.6418 110 0.8272 -0.0099 0.8272 0.9095
No log 1.6716 112 0.8160 0.1181 0.8160 0.9033
No log 1.7015 114 0.8571 0.1714 0.8571 0.9258
No log 1.7313 116 0.8643 0.1673 0.8643 0.9297
No log 1.7612 118 0.7843 0.1136 0.7843 0.8856
No log 1.7910 120 0.7821 0.2379 0.7821 0.8843
No log 1.8209 122 0.8781 0.3261 0.8781 0.9371
No log 1.8507 124 0.9370 0.3384 0.9370 0.9680
No log 1.8806 126 0.8679 0.3287 0.8679 0.9316
No log 1.9104 128 0.8135 0.2591 0.8135 0.9019
No log 1.9403 130 0.8920 0.0836 0.8920 0.9444
No log 1.9701 132 0.9980 0.0947 0.9980 0.9990
No log 2.0 134 1.0124 0.0980 1.0124 1.0062
No log 2.0299 136 0.9662 0.0936 0.9662 0.9830
No log 2.0597 138 0.9007 0.1924 0.9007 0.9490
No log 2.0896 140 0.8877 0.1953 0.8877 0.9422
No log 2.1194 142 0.9271 0.1459 0.9271 0.9629
No log 2.1493 144 0.9125 0.0520 0.9125 0.9553
No log 2.1791 146 0.9045 0.0652 0.9045 0.9511
No log 2.2090 148 0.9335 0.1646 0.9335 0.9662
No log 2.2388 150 0.9938 0.1259 0.9938 0.9969
No log 2.2687 152 1.0200 0.0741 1.0200 1.0099
No log 2.2985 154 0.9976 0.0210 0.9976 0.9988
No log 2.3284 156 0.9250 0.2129 0.9250 0.9618
No log 2.3582 158 0.9117 0.2471 0.9117 0.9548
No log 2.3881 160 0.9306 0.1525 0.9306 0.9647
No log 2.4179 162 1.0096 0.1602 1.0096 1.0048
No log 2.4478 164 1.0510 0.1518 1.0510 1.0252
No log 2.4776 166 1.1144 0.0830 1.1144 1.0556
No log 2.5075 168 1.1368 0.0456 1.1368 1.0662
No log 2.5373 170 1.0519 0.0354 1.0519 1.0256
No log 2.5672 172 1.0222 0.1523 1.0222 1.0110
No log 2.5970 174 1.0373 0.2140 1.0373 1.0185
No log 2.6269 176 0.9934 0.2256 0.9934 0.9967
No log 2.6567 178 0.9679 0.1373 0.9679 0.9838
No log 2.6866 180 0.9794 0.1179 0.9794 0.9897
No log 2.7164 182 1.1865 0.1317 1.1865 1.0893
No log 2.7463 184 1.2624 0.1363 1.2624 1.1236
No log 2.7761 186 1.0602 0.1912 1.0602 1.0297
No log 2.8060 188 0.9340 0.1639 0.9340 0.9664
No log 2.8358 190 0.9208 0.1555 0.9208 0.9596
No log 2.8657 192 0.9856 0.2887 0.9856 0.9928
No log 2.8955 194 0.9800 0.2724 0.9800 0.9900
No log 2.9254 196 0.9047 0.2132 0.9047 0.9512
No log 2.9552 198 0.9414 0.0779 0.9414 0.9702
No log 2.9851 200 1.0697 0.2097 1.0697 1.0342
No log 3.0149 202 0.9411 0.1126 0.9411 0.9701
No log 3.0448 204 0.9151 0.2071 0.9151 0.9566
No log 3.0746 206 1.1034 0.1499 1.1034 1.0504
No log 3.1045 208 1.1477 0.1057 1.1477 1.0713
No log 3.1343 210 1.0540 0.1243 1.0540 1.0267
No log 3.1642 212 0.9027 0.2784 0.9027 0.9501
No log 3.1940 214 0.8513 0.2535 0.8513 0.9227
No log 3.2239 216 0.8441 0.2163 0.8441 0.9187
No log 3.2537 218 0.8417 0.2243 0.8417 0.9174
No log 3.2836 220 0.8677 0.2634 0.8677 0.9315
No log 3.3134 222 0.8712 0.3117 0.8712 0.9334
No log 3.3433 224 0.8950 0.2465 0.8950 0.9460
No log 3.3731 226 0.9290 0.2403 0.9290 0.9639
No log 3.4030 228 0.8882 0.2410 0.8882 0.9425
No log 3.4328 230 1.0325 0.2857 1.0325 1.0161
No log 3.4627 232 1.1228 0.2550 1.1228 1.0596
No log 3.4925 234 0.9604 0.2570 0.9604 0.9800
No log 3.5224 236 0.8895 0.1629 0.8895 0.9431
No log 3.5522 238 0.9035 0.1982 0.9035 0.9505
No log 3.5821 240 0.8692 0.1629 0.8692 0.9323
No log 3.6119 242 0.9012 0.2498 0.9012 0.9493
No log 3.6418 244 0.9534 0.3402 0.9534 0.9764
No log 3.6716 246 0.8491 0.2291 0.8491 0.9215
No log 3.7015 248 0.8018 0.1490 0.8018 0.8954
No log 3.7313 250 0.8070 0.2140 0.8070 0.8984
No log 3.7612 252 0.8196 0.2633 0.8196 0.9053
No log 3.7910 254 0.8638 0.3393 0.8638 0.9294
No log 3.8209 256 0.8258 0.3018 0.8258 0.9087
No log 3.8507 258 0.7979 0.3060 0.7979 0.8932
No log 3.8806 260 0.8048 0.3462 0.8048 0.8971
No log 3.9104 262 0.7999 0.3403 0.7999 0.8944
No log 3.9403 264 0.8165 0.2980 0.8165 0.9036
No log 3.9701 266 0.8741 0.3130 0.8741 0.9349
No log 4.0 268 0.8514 0.3417 0.8514 0.9227
No log 4.0299 270 0.8555 0.3043 0.8555 0.9249
No log 4.0597 272 0.8105 0.2661 0.8105 0.9003
No log 4.0896 274 0.8310 0.2793 0.8310 0.9116
No log 4.1194 276 0.8256 0.3725 0.8256 0.9086
No log 4.1493 278 0.8590 0.3573 0.8590 0.9268
No log 4.1791 280 1.0536 0.2837 1.0536 1.0265
No log 4.2090 282 1.1786 0.2501 1.1786 1.0856
No log 4.2388 284 1.0355 0.2926 1.0355 1.0176
No log 4.2687 286 0.8577 0.3396 0.8577 0.9261
No log 4.2985 288 0.8293 0.1888 0.8293 0.9106
No log 4.3284 290 0.8306 0.1888 0.8306 0.9114
No log 4.3582 292 0.9088 0.2544 0.9088 0.9533
No log 4.3881 294 1.1156 0.2484 1.1156 1.0562
No log 4.4179 296 1.2192 0.2462 1.2192 1.1042
No log 4.4478 298 1.0725 0.2119 1.0725 1.0356
No log 4.4776 300 0.8830 0.2249 0.8830 0.9397
No log 4.5075 302 0.8928 0.2414 0.8928 0.9449
No log 4.5373 304 0.9325 0.2857 0.9325 0.9657
No log 4.5672 306 0.8736 0.1468 0.8736 0.9347
No log 4.5970 308 0.8651 0.2883 0.8651 0.9301
No log 4.6269 310 0.8977 0.1886 0.8977 0.9475
No log 4.6567 312 0.8876 0.2145 0.8876 0.9421
No log 4.6866 314 0.8350 0.2834 0.8350 0.9138
No log 4.7164 316 0.8135 0.1361 0.8135 0.9020
No log 4.7463 318 0.8088 0.1285 0.8088 0.8993
No log 4.7761 320 0.8075 0.1606 0.8075 0.8986
No log 4.8060 322 0.8084 0.2213 0.8084 0.8991
No log 4.8358 324 0.8308 0.3209 0.8308 0.9115
No log 4.8657 326 0.8137 0.3211 0.8137 0.9021
No log 4.8955 328 0.8050 0.2192 0.8050 0.8972
No log 4.9254 330 0.7983 0.2192 0.7983 0.8935
No log 4.9552 332 0.8080 0.3247 0.8080 0.8989
No log 4.9851 334 0.8352 0.3292 0.8352 0.9139
No log 5.0149 336 0.8740 0.3005 0.8740 0.9349
No log 5.0448 338 0.8007 0.3458 0.8007 0.8948
No log 5.0746 340 0.7606 0.2965 0.7606 0.8721
No log 5.1045 342 0.7671 0.3171 0.7671 0.8758
No log 5.1343 344 0.8344 0.2828 0.8344 0.9135
No log 5.1642 346 0.9559 0.3451 0.9559 0.9777
No log 5.1940 348 0.9040 0.3228 0.9040 0.9508
No log 5.2239 350 0.8100 0.2824 0.8100 0.9000
No log 5.2537 352 0.8258 0.2355 0.8258 0.9088
No log 5.2836 354 0.8058 0.2566 0.8058 0.8977
No log 5.3134 356 0.8484 0.2776 0.8484 0.9211
No log 5.3433 358 0.9371 0.36 0.9371 0.9680
No log 5.3731 360 0.8494 0.3320 0.8494 0.9216
No log 5.4030 362 0.7617 0.2936 0.7617 0.8728
No log 5.4328 364 0.7854 0.1555 0.7854 0.8862
No log 5.4627 366 0.7734 0.1555 0.7734 0.8794
No log 5.4925 368 0.7439 0.1741 0.7439 0.8625
No log 5.5224 370 0.7652 0.2661 0.7652 0.8747
No log 5.5522 372 0.8713 0.2779 0.8713 0.9335
No log 5.5821 374 0.9442 0.2877 0.9442 0.9717
No log 5.6119 376 0.8853 0.3005 0.8853 0.9409
No log 5.6418 378 0.8018 0.3326 0.8018 0.8955
No log 5.6716 380 0.7966 0.3155 0.7966 0.8925
No log 5.7015 382 0.8066 0.3212 0.8066 0.8981
No log 5.7313 384 0.8195 0.3326 0.8195 0.9053
No log 5.7612 386 0.8630 0.2669 0.8630 0.9290
No log 5.7910 388 0.8268 0.2988 0.8268 0.9093
No log 5.8209 390 0.8246 0.3111 0.8246 0.9081
No log 5.8507 392 0.8239 0.1825 0.8239 0.9077
No log 5.8806 394 0.8369 0.1829 0.8369 0.9148
No log 5.9104 396 0.8396 0.2032 0.8396 0.9163
No log 5.9403 398 0.8716 0.2155 0.8716 0.9336
No log 5.9701 400 1.0018 0.3094 1.0018 1.0009
No log 6.0 402 1.0827 0.2821 1.0827 1.0405
No log 6.0299 404 1.0216 0.3094 1.0216 1.0108
No log 6.0597 406 0.8525 0.2593 0.8525 0.9233
No log 6.0896 408 0.7818 0.1874 0.7818 0.8842
No log 6.1194 410 0.7698 0.1749 0.7698 0.8774
No log 6.1493 412 0.7620 0.1432 0.7620 0.8729
No log 6.1791 414 0.7845 0.2809 0.7845 0.8857
No log 6.2090 416 0.8222 0.2749 0.8222 0.9067
No log 6.2388 418 0.8012 0.2605 0.8012 0.8951
No log 6.2687 420 0.7975 0.1432 0.7975 0.8931
No log 6.2985 422 0.8533 0.2991 0.8533 0.9237
No log 6.3284 424 0.8908 0.2651 0.8908 0.9438
No log 6.3582 426 0.8604 0.2385 0.8604 0.9276
No log 6.3881 428 0.8421 0.2921 0.8421 0.9177
No log 6.4179 430 0.8886 0.3330 0.8886 0.9426
No log 6.4478 432 0.9936 0.3097 0.9936 0.9968
No log 6.4776 434 1.0116 0.2440 1.0116 1.0058
No log 6.5075 436 0.9296 0.2877 0.9296 0.9641
No log 6.5373 438 0.8485 0.2721 0.8485 0.9212
No log 6.5672 440 0.8248 0.1967 0.8248 0.9082
No log 6.5970 442 0.8359 0.1432 0.8359 0.9143
No log 6.6269 444 0.8200 0.1741 0.8200 0.9056
No log 6.6567 446 0.8004 0.1404 0.8004 0.8946
No log 6.6866 448 0.7875 0.2078 0.7875 0.8874
No log 6.7164 450 0.8005 0.2809 0.8005 0.8947
No log 6.7463 452 0.8217 0.2283 0.8217 0.9065
No log 6.7761 454 0.7964 0.3078 0.7964 0.8924
No log 6.8060 456 0.7809 0.2872 0.7809 0.8837
No log 6.8358 458 0.7823 0.2872 0.7823 0.8845
No log 6.8657 460 0.8167 0.2805 0.8167 0.9037
No log 6.8955 462 0.9234 0.36 0.9234 0.9609
No log 6.9254 464 0.9799 0.3451 0.9799 0.9899
No log 6.9552 466 0.9043 0.3446 0.9043 0.9509
No log 6.9851 468 0.8152 0.2980 0.8152 0.9029
No log 7.0149 470 0.8007 0.2798 0.8007 0.8948
No log 7.0448 472 0.7875 0.1829 0.7875 0.8874
No log 7.0746 474 0.7852 0.2048 0.7852 0.8861
No log 7.1045 476 0.7827 0.2034 0.7827 0.8847
No log 7.1343 478 0.8168 0.2202 0.8168 0.9038
No log 7.1642 480 0.9748 0.3015 0.9748 0.9873
No log 7.1940 482 1.0732 0.2930 1.0732 1.0359
No log 7.2239 484 1.0581 0.2977 1.0581 1.0287
No log 7.2537 486 0.9293 0.2981 0.9293 0.9640
No log 7.2836 488 0.8477 0.3240 0.8477 0.9207
No log 7.3134 490 0.8227 0.2210 0.8227 0.9070
No log 7.3433 492 0.8257 0.2643 0.8257 0.9087
No log 7.3731 494 0.8717 0.3207 0.8717 0.9336
No log 7.4030 496 0.9502 0.2952 0.9502 0.9748
No log 7.4328 498 0.9212 0.2701 0.9212 0.9598
0.3692 7.4627 500 0.8390 0.3458 0.8390 0.9160
0.3692 7.4925 502 0.7798 0.1935 0.7798 0.8831
0.3692 7.5224 504 0.7498 0.0661 0.7498 0.8659
0.3692 7.5522 506 0.7591 0.1854 0.7591 0.8712
0.3692 7.5821 508 0.7684 0.2833 0.7684 0.8766
0.3692 7.6119 510 0.7526 0.1789 0.7526 0.8675

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
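
To check that a local environment matches the versions listed above, a small sanity-check script like the following can be used (the PyTorch build string includes the cu118 CUDA suffix, which will differ on CPU-only installs).

```python
# Quick check of installed versions against those listed in this card.
import datasets, tokenizers, torch, transformers

expected = {
    "transformers": ("4.44.2", transformers.__version__),
    "torch": ("2.4.0+cu118", torch.__version__),
    "datasets": ("2.21.0", datasets.__version__),
    "tokenizers": ("0.19.1", tokenizers.__version__),
}
for name, (want, have) in expected.items():
    print(f"{name}: expected {want}, found {have}")
```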