ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run2_AugV5_k18_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 (the fine-tuning dataset is not documented). It achieves the following results on the evaluation set:

  • Loss: 0.7188
  • QWK: 0.2609
  • MSE: 0.7188
  • RMSE: 0.8478
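
Loss and MSE are identical here and throughout the training log, which suggests a regression head trained with an MSE objective; QWK (Quadratic Weighted Kappa) measures agreement between predicted and reference scores on an ordinal scale. Below is a minimal sketch of how these metrics are commonly computed with scikit-learn; the actual evaluation code is not provided, and the arrays are placeholders:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Placeholder predictions/labels; in practice these come from the evaluation set.
y_true = np.array([0, 1, 2, 2, 3])
y_pred = np.array([0.2, 1.9, 2.1, 1.2, 2.8])  # continuous regression outputs

mse = mean_squared_error(y_true, y_pred)
rmse = float(np.sqrt(mse))

# QWK needs discrete labels, so continuous predictions are typically
# rounded and clipped to the score range before scoring.
y_pred_discrete = np.clip(np.rint(y_pred), y_true.min(), y_true.max()).astype(int)
qwk = cohen_kappa_score(y_true, y_pred_discrete, weights="quadratic")

print(f"QWK: {qwk:.4f}  MSE: {mse:.4f}  RMSE: {rmse:.4f}")
```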

Model description

More information needed

Intended uses & limitations

More information needed
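
No usage details are given. A minimal inference sketch, assuming the checkpoint loads as a standard sequence-classification/regression head (the task name suggests scoring the organization of Arabic essays; the head configuration and score range are not documented):

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run2_AugV5_k18_task7_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "..."  # placeholder: an Arabic essay to score
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    logits = model(**inputs).logits  # shape depends on the (undocumented) head config
print(logits)
```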

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
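
These values map directly onto Hugging Face TrainingArguments; the following is a sketch of an equivalent configuration (output_dir and the use of the library's default AdamW-style optimizer are assumptions, since the training script is not provided):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./results",    # assumption: not documented
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    adam_beta1=0.9,            # betas=(0.9, 0.999) from the card
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```

Note that although num_epochs is set to 100, the log below ends at epoch 8.36 (step 510), so the run appears to have been stopped early (e.g., via early stopping or manual interruption).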

Training results

Training Loss Epoch Step Validation Loss QWK MSE RMSE
No log 0.0328 2 2.5138 -0.0593 2.5138 1.5855
No log 0.0656 4 1.2007 0.1910 1.2007 1.0958
No log 0.0984 6 1.1381 -0.2238 1.1381 1.0668
No log 0.1311 8 1.4513 -0.2523 1.4513 1.2047
No log 0.1639 10 1.3730 -0.2682 1.3730 1.1717
No log 0.1967 12 1.2567 -0.1242 1.2567 1.1210
No log 0.2295 14 1.1562 -0.1336 1.1562 1.0753
No log 0.2623 16 1.0523 -0.1335 1.0523 1.0258
No log 0.2951 18 0.9615 -0.0426 0.9615 0.9805
No log 0.3279 20 0.9595 -0.0354 0.9595 0.9795
No log 0.3607 22 0.9936 -0.0123 0.9936 0.9968
No log 0.3934 24 0.9245 -0.0173 0.9245 0.9615
No log 0.4262 26 0.8521 0.1620 0.8521 0.9231
No log 0.4590 28 0.9186 0.0518 0.9186 0.9584
No log 0.4918 30 1.0713 0.1259 1.0713 1.0350
No log 0.5246 32 1.0056 0.1225 1.0056 1.0028
No log 0.5574 34 0.9449 0.1228 0.9449 0.9721
No log 0.5902 36 0.8093 0.2226 0.8093 0.8996
No log 0.6230 38 0.7159 0.2486 0.7159 0.8461
No log 0.6557 40 0.7326 0.1232 0.7326 0.8559
No log 0.6885 42 0.7639 0.1232 0.7639 0.8740
No log 0.7213 44 0.8214 0.1604 0.8214 0.9063
No log 0.7541 46 0.8788 0.1904 0.8788 0.9375
No log 0.7869 48 0.9109 0.2447 0.9109 0.9544
No log 0.8197 50 0.9183 0.2386 0.9183 0.9583
No log 0.8525 52 0.9568 0.2264 0.9568 0.9782
No log 0.8852 54 0.9807 0.0925 0.9807 0.9903
No log 0.9180 56 1.0102 0.1323 1.0102 1.0051
No log 0.9508 58 1.0825 0.1304 1.0825 1.0404
No log 0.9836 60 0.9731 0.1352 0.9731 0.9865
No log 1.0164 62 0.8875 -0.0079 0.8875 0.9421
No log 1.0492 64 0.9524 0.0448 0.9524 0.9759
No log 1.0820 66 0.9303 0.0208 0.9303 0.9645
No log 1.1148 68 0.9806 0.0898 0.9806 0.9902
No log 1.1475 70 1.0462 -0.1015 1.0462 1.0229
No log 1.1803 72 0.9861 0.1228 0.9861 0.9930
No log 1.2131 74 0.9795 0.0764 0.9795 0.9897
No log 1.2459 76 0.9832 0.1009 0.9832 0.9916
No log 1.2787 78 1.0245 0.0022 1.0245 1.0122
No log 1.3115 80 1.1186 -0.0797 1.1186 1.0577
No log 1.3443 82 1.1033 -0.0438 1.1033 1.0504
No log 1.3770 84 0.9289 -0.0025 0.9289 0.9638
No log 1.4098 86 0.9569 0.1373 0.9569 0.9782
No log 1.4426 88 1.0567 0.1145 1.0567 1.0279
No log 1.4754 90 1.0414 0.0924 1.0414 1.0205
No log 1.5082 92 0.9923 -0.1004 0.9923 0.9962
No log 1.5410 94 1.0407 -0.0853 1.0407 1.0201
No log 1.5738 96 1.0776 0.0513 1.0776 1.0381
No log 1.6066 98 1.1861 0.1014 1.1861 1.0891
No log 1.6393 100 1.1690 0.0379 1.1690 1.0812
No log 1.6721 102 1.0521 0.0774 1.0521 1.0257
No log 1.7049 104 0.9502 0.1256 0.9502 0.9748
No log 1.7377 106 0.9747 -0.0550 0.9747 0.9873
No log 1.7705 108 1.0608 -0.0368 1.0608 1.0299
No log 1.8033 110 1.0490 -0.0368 1.0490 1.0242
No log 1.8361 112 0.9503 0.0802 0.9503 0.9749
No log 1.8689 114 0.9153 0.1012 0.9153 0.9567
No log 1.9016 116 0.9221 0.0975 0.9221 0.9603
No log 1.9344 118 0.9332 0.1361 0.9332 0.9660
No log 1.9672 120 0.9694 0.0483 0.9694 0.9846
No log 2.0 122 0.9559 0.0518 0.9559 0.9777
No log 2.0328 124 1.0383 0.0540 1.0383 1.0190
No log 2.0656 126 1.0442 0.0172 1.0442 1.0219
No log 2.0984 128 1.0765 0.1100 1.0765 1.0376
No log 2.1311 130 1.1738 0.1693 1.1738 1.0834
No log 2.1639 132 1.2534 0.0830 1.2534 1.1195
No log 2.1967 134 1.5033 -0.1207 1.5033 1.2261
No log 2.2295 136 1.5694 -0.0853 1.5694 1.2528
No log 2.2623 138 1.2705 -0.0882 1.2705 1.1272
No log 2.2951 140 1.0973 0.1464 1.0973 1.0475
No log 2.3279 142 1.1498 0.1448 1.1498 1.0723
No log 2.3607 144 1.0727 0.1368 1.0727 1.0357
No log 2.3934 146 0.9449 0.0200 0.9449 0.9721
No log 2.4262 148 0.9167 0.1212 0.9167 0.9574
No log 2.4590 150 1.0105 -0.0123 1.0105 1.0053
No log 2.4918 152 1.0537 0.0214 1.0537 1.0265
No log 2.5246 154 0.9875 0.1511 0.9875 0.9937
No log 2.5574 156 0.9961 0.2071 0.9961 0.9980
No log 2.5902 158 1.1000 0.1808 1.1000 1.0488
No log 2.6230 160 1.0504 0.1900 1.0504 1.0249
No log 2.6557 162 0.9904 0.0954 0.9904 0.9952
No log 2.6885 164 1.1195 0.0885 1.1195 1.0581
No log 2.7213 166 1.1348 0.0041 1.1348 1.0653
No log 2.7541 168 0.9617 0.0890 0.9617 0.9806
No log 2.7869 170 0.8580 0.1697 0.8580 0.9263
No log 2.8197 172 0.9903 0.1946 0.9903 0.9951
No log 2.8525 174 1.0453 0.1682 1.0453 1.0224
No log 2.8852 176 0.9288 0.2414 0.9288 0.9637
No log 2.9180 178 0.8509 0.1569 0.8509 0.9224
No log 2.9508 180 0.9495 0.1294 0.9495 0.9744
No log 2.9836 182 0.9725 0.0529 0.9725 0.9861
No log 3.0164 184 0.9040 0.1091 0.9040 0.9508
No log 3.0492 186 0.9289 0.2471 0.9289 0.9638
No log 3.0820 188 1.0289 0.2059 1.0289 1.0144
No log 3.1148 190 1.0156 0.2781 1.0156 1.0078
No log 3.1475 192 0.8899 0.2419 0.8899 0.9433
No log 3.1803 194 0.8470 0.2126 0.8470 0.9203
No log 3.2131 196 0.9015 0.1089 0.9015 0.9495
No log 3.2459 198 0.9267 0.1828 0.9267 0.9627
No log 3.2787 200 0.8567 0.2325 0.8567 0.9256
No log 3.3115 202 0.8617 0.2914 0.8617 0.9283
No log 3.3443 204 0.8611 0.3350 0.8611 0.9280
No log 3.3770 206 0.8477 0.2482 0.8477 0.9207
No log 3.4098 208 0.8755 0.1729 0.8755 0.9357
No log 3.4426 210 0.8656 0.1951 0.8656 0.9304
No log 3.4754 212 0.8530 0.1951 0.8530 0.9236
No log 3.5082 214 0.8206 0.2775 0.8206 0.9059
No log 3.5410 216 0.7934 0.2479 0.7934 0.8908
No log 3.5738 218 0.7737 0.2256 0.7737 0.8796
No log 3.6066 220 0.7704 0.1970 0.7704 0.8777
No log 3.6393 222 0.7650 0.2947 0.7650 0.8746
No log 3.6721 224 0.7793 0.2264 0.7793 0.8828
No log 3.7049 226 0.8057 0.1558 0.8057 0.8976
No log 3.7377 228 0.7960 0.2264 0.7960 0.8922
No log 3.7705 230 0.8091 0.2553 0.8091 0.8995
No log 3.8033 232 0.8321 0.2694 0.8321 0.9122
No log 3.8361 234 0.8256 0.2720 0.8256 0.9086
No log 3.8689 236 0.8599 0.2429 0.8599 0.9273
No log 3.9016 238 0.8984 0.2130 0.8984 0.9479
No log 3.9344 240 0.8478 0.1695 0.8478 0.9208
No log 3.9672 242 0.8710 0.3207 0.8710 0.9333
No log 4.0 244 1.0144 0.3294 1.0144 1.0072
No log 4.0328 246 0.9797 0.3632 0.9797 0.9898
No log 4.0656 248 0.8191 0.2968 0.8191 0.9051
No log 4.0984 250 0.7475 0.2965 0.7475 0.8646
No log 4.1311 252 0.7737 0.3273 0.7737 0.8796
No log 4.1639 254 0.7624 0.3273 0.7624 0.8731
No log 4.1967 256 0.7372 0.1649 0.7372 0.8586
No log 4.2295 258 0.7652 0.2605 0.7652 0.8747
No log 4.2623 260 0.7646 0.2605 0.7646 0.8744
No log 4.2951 262 0.7402 0.2980 0.7402 0.8604
No log 4.3279 264 0.7116 0.3380 0.7116 0.8435
No log 4.3607 266 0.7005 0.3808 0.7005 0.8370
No log 4.3934 268 0.6984 0.4101 0.6984 0.8357
No log 4.4262 270 0.6959 0.3481 0.6959 0.8342
No log 4.4590 272 0.6833 0.4217 0.6833 0.8266
No log 4.4918 274 0.6696 0.3811 0.6696 0.8183
No log 4.5246 276 0.6639 0.3703 0.6639 0.8148
No log 4.5574 278 0.7095 0.2611 0.7095 0.8423
No log 4.5902 280 0.7292 0.2581 0.7292 0.8540
No log 4.6230 282 0.6727 0.2979 0.6727 0.8202
No log 4.6557 284 0.6958 0.2777 0.6958 0.8341
No log 4.6885 286 0.8826 0.4026 0.8826 0.9395
No log 4.7213 288 1.0118 0.4305 1.0118 1.0059
No log 4.7541 290 0.9081 0.3709 0.9081 0.9529
No log 4.7869 292 0.7853 0.2607 0.7853 0.8862
No log 4.8197 294 1.0320 0.2063 1.0320 1.0159
No log 4.8525 296 1.2654 0.0707 1.2654 1.1249
No log 4.8852 298 1.1967 -0.0141 1.1967 1.0939
No log 4.9180 300 0.9861 0.0936 0.9861 0.9930
No log 4.9508 302 0.8036 0.1979 0.8036 0.8964
No log 4.9836 304 0.7791 0.1902 0.7791 0.8827
No log 5.0164 306 0.8194 0.2171 0.8194 0.9052
No log 5.0492 308 0.8203 0.1820 0.8203 0.9057
No log 5.0820 310 0.8045 0.2152 0.8045 0.8969
No log 5.1148 312 0.8379 0.1857 0.8379 0.9153
No log 5.1475 314 0.8774 0.1828 0.8774 0.9367
No log 5.1803 316 0.8463 0.1419 0.8463 0.9199
No log 5.2131 318 0.8306 0.2857 0.8306 0.9114
No log 5.2459 320 0.8562 0.2857 0.8562 0.9253
No log 5.2787 322 0.8297 0.2605 0.8297 0.9109
No log 5.3115 324 0.7590 0.3160 0.7590 0.8712
No log 5.3443 326 0.7868 0.2283 0.7868 0.8870
No log 5.3770 328 0.8669 0.1281 0.8669 0.9311
No log 5.4098 330 0.8447 0.1281 0.8447 0.9191
No log 5.4426 332 0.7826 0.2249 0.7826 0.8847
No log 5.4754 334 0.8065 0.2040 0.8065 0.8980
No log 5.5082 336 0.8909 0.3060 0.8909 0.9439
No log 5.5410 338 0.9166 0.3173 0.9166 0.9574
No log 5.5738 340 0.9358 0.3611 0.9358 0.9673
No log 5.6066 342 0.9357 0.2531 0.9357 0.9673
No log 5.6393 344 0.9702 0.1421 0.9702 0.9850
No log 5.6721 346 1.0812 0.2665 1.0812 1.0398
No log 5.7049 348 1.1310 0.2273 1.1310 1.0635
No log 5.7377 350 1.0385 0.1360 1.0385 1.0191
No log 5.7705 352 0.9129 0.1690 0.9129 0.9554
No log 5.8033 354 0.9020 0.2481 0.9020 0.9497
No log 5.8361 356 0.8888 0.2481 0.8888 0.9428
No log 5.8689 358 0.8767 0.2661 0.8767 0.9363
No log 5.9016 360 0.8674 0.2566 0.8674 0.9313
No log 5.9344 362 0.8712 0.1705 0.8712 0.9334
No log 5.9672 364 0.8546 0.1729 0.8546 0.9244
No log 6.0 366 0.8446 0.1397 0.8446 0.9190
No log 6.0328 368 0.8678 0.1356 0.8678 0.9315
No log 6.0656 370 0.8778 0.1606 0.8778 0.9369
No log 6.0984 372 0.8972 0.0982 0.8972 0.9472
No log 6.1311 374 0.9082 0.1356 0.9082 0.9530
No log 6.1639 376 0.9066 0.1397 0.9066 0.9522
No log 6.1967 378 0.8769 0.0982 0.8769 0.9364
No log 6.2295 380 0.8724 0.1592 0.8724 0.9340
No log 6.2623 382 0.8638 0.2424 0.8638 0.9294
No log 6.2951 384 0.8504 0.0623 0.8504 0.9222
No log 6.3279 386 0.8477 0.1052 0.8477 0.9207
No log 6.3607 388 0.8457 0.1052 0.8457 0.9196
No log 6.3934 390 0.8351 0.0909 0.8351 0.9138
No log 6.4262 392 0.8297 0.0912 0.8297 0.9109
No log 6.4590 394 0.8191 0.2085 0.8191 0.9051
No log 6.4918 396 0.8121 0.2488 0.8121 0.9012
No log 6.5246 398 0.8343 0.2488 0.8343 0.9134
No log 6.5574 400 0.8109 0.2034 0.8109 0.9005
No log 6.5902 402 0.7789 0.2172 0.7789 0.8826
No log 6.6230 404 0.7616 0.3042 0.7616 0.8727
No log 6.6557 406 0.7275 0.3336 0.7275 0.8529
No log 6.6885 408 0.7438 0.2274 0.7438 0.8624
No log 6.7213 410 0.7289 0.2578 0.7289 0.8538
No log 6.7541 412 0.7058 0.3293 0.7058 0.8401
No log 6.7869 414 0.6921 0.3837 0.6921 0.8319
No log 6.8197 416 0.6710 0.3336 0.6710 0.8191
No log 6.8525 418 0.7082 0.4118 0.7082 0.8416
No log 6.8852 420 0.7684 0.3180 0.7684 0.8766
No log 6.9180 422 0.7745 0.3276 0.7745 0.8801
No log 6.9508 424 0.7206 0.2642 0.7206 0.8489
No log 6.9836 426 0.6913 0.3129 0.6913 0.8315
No log 7.0164 428 0.6945 0.3129 0.6944 0.8333
No log 7.0492 430 0.7472 0.2835 0.7472 0.8644
No log 7.0820 432 0.7859 0.2549 0.7859 0.8865
No log 7.1148 434 0.7584 0.2342 0.7584 0.8709
No log 7.1475 436 0.7685 0.2526 0.7685 0.8767
No log 7.1803 438 0.7700 0.3512 0.7700 0.8775
No log 7.2131 440 0.7681 0.3186 0.7681 0.8764
No log 7.2459 442 0.7634 0.2371 0.7634 0.8737
No log 7.2787 444 0.7469 0.2535 0.7469 0.8642
No log 7.3115 446 0.7209 0.2713 0.7209 0.8491
No log 7.3443 448 0.7201 0.1818 0.7201 0.8486
No log 7.3770 450 0.7190 0.1558 0.7190 0.8479
No log 7.4098 452 0.7146 0.1529 0.7146 0.8453
No log 7.4426 454 0.7250 0.3474 0.7250 0.8515
No log 7.4754 456 0.7364 0.3170 0.7364 0.8582
No log 7.5082 458 0.7729 0.2488 0.7729 0.8792
No log 7.5410 460 0.8089 0.2171 0.8089 0.8994
No log 7.5738 462 0.8200 0.2491 0.8200 0.9056
No log 7.6066 464 0.7858 0.2085 0.7858 0.8864
No log 7.6393 466 0.7603 0.2424 0.7603 0.8719
No log 7.6721 468 0.7502 0.2683 0.7502 0.8662
No log 7.7049 470 0.7547 0.2684 0.7547 0.8687
No log 7.7377 472 0.7495 0.2310 0.7495 0.8657
No log 7.7705 474 0.7909 0.2710 0.7909 0.8893
No log 7.8033 476 0.8330 0.2681 0.8330 0.9127
No log 7.8361 478 0.8037 0.3051 0.8037 0.8965
No log 7.8689 480 0.7901 0.1969 0.7901 0.8889
No log 7.9016 482 0.8020 0.1522 0.8020 0.8956
No log 7.9344 484 0.8015 0.1522 0.8015 0.8953
No log 7.9672 486 0.7872 0.2424 0.7872 0.8872
No log 8.0 488 0.7773 0.2424 0.7773 0.8816
No log 8.0328 490 0.7687 0.2424 0.7687 0.8768
No log 8.0656 492 0.7681 0.2451 0.7681 0.8764
No log 8.0984 494 0.8089 0.2419 0.8089 0.8994
No log 8.1311 496 0.8151 0.2947 0.8151 0.9028
No log 8.1639 498 0.7960 0.3225 0.7960 0.8922
0.3696 8.1967 500 0.7964 0.3225 0.7964 0.8924
0.3696 8.2295 502 0.8069 0.2947 0.8069 0.8983
0.3696 8.2623 504 0.8439 0.2127 0.8439 0.9186
0.3696 8.2951 506 0.8138 0.2474 0.8138 0.9021
0.3696 8.3279 508 0.7472 0.2747 0.7472 0.8644
0.3696 8.3607 510 0.7188 0.2609 0.7188 0.8478

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model details

  • Format: Safetensors
  • Model size: 0.1B params
  • Tensor type: F32