ArabicNewSplits7_FineTuningAraBERT_run1_AugV5_k12_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8691
  • QWK: 0.0071
  • MSE: 0.8691
  • RMSE: 0.9322
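
A minimal usage sketch follows. The checkpoint's config determines the head, but given the MSE/RMSE/QWK metrics above it is most likely a single-output regression (scoring) head; treating the logit as a continuous score is an assumption, and the example text is a placeholder.

```python
# Minimal inference sketch; assumes a single-output regression (scoring) head,
# which the reported MSE/QWK metrics suggest but the card does not confirm.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo = "MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run1_AugV5_k12_task3_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)
model.eval()

text = "..."  # placeholder: an Arabic text to score
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.squeeze().tolist())  # continuous score(s), depending on the head
```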

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
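
For reference, the hyperparameters above map onto Hugging Face `TrainingArguments` roughly as follows. This is a sketch, not the original training script: the dataset and model wiring are omitted, and `output_dir` is a placeholder.

```python
from transformers import TrainingArguments

# Sketch of the listed hyperparameters expressed as TrainingArguments.
training_args = TrainingArguments(
    output_dir="./results",          # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,                  # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```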

Training results

Training Loss | Epoch | Step | Validation Loss | QWK | MSE | RMSE
No log 0.0606 2 3.6352 -0.0047 3.6352 1.9066
No log 0.1212 4 1.9707 0.0643 1.9707 1.4038
No log 0.1818 6 1.9928 0.0260 1.9928 1.4117
No log 0.2424 8 1.7523 -0.0591 1.7523 1.3237
No log 0.3030 10 1.8084 0.0063 1.8084 1.3448
No log 0.3636 12 1.3776 -0.1121 1.3776 1.1737
No log 0.4242 14 1.4047 -0.0887 1.4047 1.1852
No log 0.4848 16 1.7791 -0.0039 1.7791 1.3338
No log 0.5455 18 1.9808 0.0102 1.9808 1.4074
No log 0.6061 20 1.2194 0.0433 1.2194 1.1043
No log 0.6667 22 0.7463 0.0506 0.7463 0.8639
No log 0.7273 24 0.6972 -0.1216 0.6972 0.8350
No log 0.7879 26 0.7066 -0.1216 0.7066 0.8406
No log 0.8485 28 0.7771 -0.1239 0.7771 0.8816
No log 0.9091 30 0.9173 0.0157 0.9173 0.9578
No log 0.9697 32 1.0391 -0.0423 1.0391 1.0194
No log 1.0303 34 1.1360 -0.0751 1.1360 1.0658
No log 1.0909 36 1.1778 -0.0500 1.1778 1.0853
No log 1.1515 38 1.2544 0.0 1.2544 1.1200
No log 1.2121 40 1.1191 0.0 1.1191 1.0579
No log 1.2727 42 0.8834 -0.0658 0.8834 0.9399
No log 1.3333 44 0.8219 0.0549 0.8219 0.9066
No log 1.3939 46 0.8705 0.0404 0.8705 0.9330
No log 1.4545 48 0.8313 0.0826 0.8313 0.9117
No log 1.5152 50 0.8040 0.1627 0.8040 0.8967
No log 1.5758 52 0.8243 0.0786 0.8243 0.9079
No log 1.6364 54 0.7956 0.2669 0.7956 0.8919
No log 1.6970 56 0.8559 -0.0545 0.8559 0.9251
No log 1.7576 58 0.8759 0.0576 0.8759 0.9359
No log 1.8182 60 0.9055 0.0089 0.9055 0.9516
No log 1.8788 62 0.8405 -0.0056 0.8405 0.9168
No log 1.9394 64 0.8714 -0.0079 0.8714 0.9335
No log 2.0 66 0.9966 0.0026 0.9966 0.9983
No log 2.0606 68 1.0383 -0.0628 1.0383 1.0190
No log 2.1212 70 0.9072 0.0576 0.9072 0.9525
No log 2.1818 72 0.7767 -0.1244 0.7767 0.8813
No log 2.2424 74 0.7588 -0.0240 0.7588 0.8711
No log 2.3030 76 0.7799 -0.0062 0.7799 0.8831
No log 2.3636 78 0.8930 -0.0442 0.8930 0.9450
No log 2.4242 80 1.6048 -0.0484 1.6048 1.2668
No log 2.4848 82 1.9756 0.0136 1.9756 1.4056
No log 2.5455 84 1.4111 -0.0720 1.4111 1.1879
No log 2.6061 86 0.7770 -0.0056 0.7770 0.8815
No log 2.6667 88 0.7191 0.1379 0.7191 0.8480
No log 2.7273 90 0.8336 -0.0459 0.8336 0.9130
No log 2.7879 92 1.1151 0.0431 1.1151 1.0560
No log 2.8485 94 1.2771 0.0305 1.2771 1.1301
No log 2.9091 96 1.0364 0.0458 1.0364 1.0180
No log 2.9697 98 0.7591 0.0512 0.7591 0.8713
No log 3.0303 100 0.7423 0.0857 0.7423 0.8616
No log 3.0909 102 0.9997 0.0134 0.9997 0.9999
No log 3.1515 104 1.0254 0.0515 1.0254 1.0126
No log 3.2121 106 0.9484 -0.0163 0.9484 0.9739
No log 3.2727 108 1.0687 0.0107 1.0687 1.0338
No log 3.3333 110 1.0109 0.0428 1.0109 1.0055
No log 3.3939 112 1.2465 0.0619 1.2465 1.1165
No log 3.4545 114 0.9165 0.0946 0.9165 0.9573
No log 3.5152 116 0.7486 0.1318 0.7486 0.8652
No log 3.5758 118 0.7329 0.0914 0.7329 0.8561
No log 3.6364 120 0.8343 0.0043 0.8343 0.9134
No log 3.6970 122 0.9967 0.0609 0.9967 0.9983
No log 3.7576 124 0.8907 0.0068 0.8907 0.9438
No log 3.8182 126 0.8567 0.1096 0.8567 0.9256
No log 3.8788 128 0.7818 -0.0062 0.7818 0.8842
No log 3.9394 130 0.7978 -0.1601 0.7978 0.8932
No log 4.0 132 0.7969 0.0338 0.7969 0.8927
No log 4.0606 134 1.3262 -0.0320 1.3262 1.1516
No log 4.1212 136 1.4315 0.0199 1.4315 1.1965
No log 4.1818 138 0.9092 0.1107 0.9092 0.9535
No log 4.2424 140 0.7340 0.0030 0.7340 0.8567
No log 4.3030 142 0.7448 0.0922 0.7448 0.8630
No log 4.3636 144 0.9551 0.0793 0.9551 0.9773
No log 4.4242 146 1.0222 0.0233 1.0222 1.0110
No log 4.4848 148 0.7702 0.0749 0.7702 0.8776
No log 4.5455 150 0.7024 0.1552 0.7024 0.8381
No log 4.6061 152 0.7267 0.1202 0.7267 0.8525
No log 4.6667 154 0.8869 0.0946 0.8869 0.9418
No log 4.7273 156 0.8946 0.0287 0.8946 0.9459
No log 4.7879 158 0.8022 0.1449 0.8022 0.8957
No log 4.8485 160 0.8178 0.1395 0.8178 0.9043
No log 4.9091 162 0.7944 0.1627 0.7944 0.8913
No log 4.9697 164 0.7852 0.0432 0.7852 0.8861
No log 5.0303 166 0.8530 0.1187 0.8530 0.9236
No log 5.0909 168 1.0866 -0.0094 1.0866 1.0424
No log 5.1515 170 1.3127 0.0544 1.3127 1.1457
No log 5.2121 172 0.9377 0.0452 0.9377 0.9684
No log 5.2727 174 0.8113 0.0301 0.8113 0.9007
No log 5.3333 176 0.8154 0.0551 0.8154 0.9030
No log 5.3939 178 0.9475 0.1841 0.9475 0.9734
No log 5.4545 180 0.9993 0.0946 0.9993 0.9996
No log 5.5152 182 0.8623 0.1235 0.8623 0.9286
No log 5.5758 184 0.7991 0.0923 0.7991 0.8939
No log 5.6364 186 0.8440 0.0917 0.8440 0.9187
No log 5.6970 188 0.8392 0.0913 0.8392 0.9161
No log 5.7576 190 0.9912 0.0839 0.9912 0.9956
No log 5.8182 192 0.9956 0.1114 0.9956 0.9978
No log 5.8788 194 0.8302 0.0826 0.8302 0.9111
No log 5.9394 196 0.7266 -0.0059 0.7266 0.8524
No log 6.0 198 0.7460 0.0303 0.7460 0.8637
No log 6.0606 200 0.9876 0.0684 0.9876 0.9938
No log 6.1212 202 1.0250 -0.0054 1.0250 1.0124
No log 6.1818 204 0.7844 0.0863 0.7844 0.8857
No log 6.2424 206 0.8678 -0.2705 0.8678 0.9316
No log 6.3030 208 0.7778 -0.0939 0.7778 0.8819
No log 6.3636 210 0.8550 0.0549 0.8550 0.9247
No log 6.4242 212 1.2085 -0.1920 1.2085 1.0993
No log 6.4848 214 1.0278 -0.0909 1.0278 1.0138
No log 6.5455 216 0.8310 0.1553 0.8310 0.9116
No log 6.6061 218 0.7784 -0.0427 0.7784 0.8822
No log 6.6667 220 0.7846 0.0393 0.7846 0.8858
No log 6.7273 222 0.8970 0.0920 0.8970 0.9471
No log 6.7879 224 0.8433 0.0920 0.8433 0.9183
No log 6.8485 226 0.8114 0.1094 0.8114 0.9008
No log 6.9091 228 0.8099 0.1095 0.8099 0.8999
No log 6.9697 230 0.8538 0.1003 0.8538 0.9240
No log 7.0303 232 1.1092 -0.0500 1.1092 1.0532
No log 7.0909 234 0.9862 -0.0425 0.9862 0.9931
No log 7.1515 236 0.7949 0.1298 0.7949 0.8916
No log 7.2121 238 0.7748 -0.0892 0.7748 0.8802
No log 7.2727 240 0.7250 0.0460 0.7250 0.8515
No log 7.3333 242 0.8260 0.1047 0.8260 0.9088
No log 7.3939 244 0.9861 -0.0122 0.9861 0.9930
No log 7.4545 246 0.8791 0.0909 0.8791 0.9376
No log 7.5152 248 0.9349 0.0651 0.9349 0.9669
No log 7.5758 250 0.9720 0.0651 0.9720 0.9859
No log 7.6364 252 1.0898 -0.1572 1.0898 1.0439
No log 7.6970 254 0.9603 0.0250 0.9603 0.9799
No log 7.7576 256 0.7898 0.0680 0.7898 0.8887
No log 7.8182 258 0.7651 0.1254 0.7651 0.8747
No log 7.8788 260 0.7660 0.0481 0.7660 0.8752
No log 7.9394 262 0.7943 0.0214 0.7943 0.8912
No log 8.0 264 0.9580 0.0986 0.9580 0.9788
No log 8.0606 266 0.8677 0.1001 0.8677 0.9315
No log 8.1212 268 0.8103 0.0236 0.8103 0.9002
No log 8.1818 270 0.8169 -0.0226 0.8169 0.9038
No log 8.2424 272 0.7980 -0.0741 0.7980 0.8933
No log 8.3030 274 0.8524 0.0205 0.8524 0.9232
No log 8.3636 276 0.8539 0.2208 0.8539 0.9241
No log 8.4242 278 0.7738 0.0670 0.7738 0.8796
No log 8.4848 280 0.7390 0.1379 0.7390 0.8597
No log 8.5455 282 0.7493 0.1828 0.7493 0.8656
No log 8.6061 284 0.7602 0.1691 0.7602 0.8719
No log 8.6667 286 0.8247 0.0549 0.8247 0.9081
No log 8.7273 288 0.7785 0.1097 0.7785 0.8823
No log 8.7879 290 0.7459 0.0030 0.7459 0.8636
No log 8.8485 292 0.7333 0.0030 0.7333 0.8563
No log 8.9091 294 0.7635 0.2105 0.7635 0.8738
No log 8.9697 296 0.9748 0.0111 0.9748 0.9873
No log 9.0303 298 0.9551 -0.0532 0.9551 0.9773
No log 9.0909 300 0.7580 0.2105 0.7580 0.8707
No log 9.1515 302 0.7084 0.1021 0.7084 0.8416
No log 9.2121 304 0.7728 -0.1473 0.7728 0.8791
No log 9.2727 306 0.7516 -0.1470 0.7516 0.8669
No log 9.3333 308 0.7481 0.2258 0.7481 0.8649
No log 9.3939 310 0.8891 -0.0861 0.8891 0.9429
No log 9.4545 312 0.8275 0.0017 0.8275 0.9096
No log 9.5152 314 0.6938 0.1512 0.6938 0.8329
No log 9.5758 316 0.6974 0.1023 0.6974 0.8351
No log 9.6364 318 0.6995 0.1512 0.6995 0.8363
No log 9.6970 320 0.7350 0.2180 0.7350 0.8573
No log 9.7576 322 0.7497 0.2122 0.7497 0.8659
No log 9.8182 324 0.7403 0.1354 0.7403 0.8604
No log 9.8788 326 0.8034 0.1612 0.8034 0.8964
No log 9.9394 328 0.7810 0.1298 0.7810 0.8838
No log 10.0 330 0.7963 0.0393 0.7963 0.8923
No log 10.0606 332 0.8254 -0.0079 0.8254 0.9085
No log 10.1212 334 0.8470 0.0327 0.8470 0.9203
No log 10.1818 336 0.9669 -0.0030 0.9669 0.9833
No log 10.2424 338 0.9733 0.0793 0.9733 0.9865
No log 10.3030 340 0.8776 -0.0810 0.8776 0.9368
No log 10.3636 342 0.9233 -0.1111 0.9233 0.9609
No log 10.4242 344 0.9517 -0.1530 0.9517 0.9756
No log 10.4848 346 0.8616 -0.1905 0.8616 0.9282
No log 10.5455 348 0.8397 -0.0215 0.8397 0.9163
No log 10.6061 350 0.9877 -0.0878 0.9877 0.9938
No log 10.6667 352 0.9885 -0.0122 0.9885 0.9943
No log 10.7273 354 0.8849 0.0476 0.8849 0.9407
No log 10.7879 356 0.7675 -0.0096 0.7675 0.8761
No log 10.8485 358 0.8210 -0.1823 0.8210 0.9061
No log 10.9091 360 0.8534 -0.0941 0.8534 0.9238
No log 10.9697 362 0.8514 0.0359 0.8514 0.9227
No log 11.0303 364 1.0568 0.0267 1.0568 1.0280
No log 11.0909 366 1.1179 -0.0539 1.1179 1.0573
No log 11.1515 368 0.8779 0.0068 0.8779 0.9369
No log 11.2121 370 0.8102 0.1096 0.8102 0.9001
No log 11.2727 372 0.8036 0.1565 0.8036 0.8964
No log 11.3333 374 0.8059 0.1506 0.8059 0.8977
No log 11.3939 376 0.7592 0.0909 0.7592 0.8713
No log 11.4545 378 0.7529 0.0857 0.7529 0.8677
No log 11.5152 380 0.7611 0.0863 0.7611 0.8724
No log 11.5758 382 0.7756 0.0814 0.7756 0.8807
No log 11.6364 384 0.7973 0.0814 0.7973 0.8929
No log 11.6970 386 0.8482 -0.0686 0.8482 0.9210
No log 11.7576 388 0.8319 -0.0541 0.8319 0.9121
No log 11.8182 390 0.8233 -0.0500 0.8233 0.9074
No log 11.8788 392 0.8364 0.0690 0.8364 0.9145
No log 11.9394 394 0.8533 0.0146 0.8533 0.9237
No log 12.0 396 0.8558 0.0167 0.8558 0.9251
No log 12.0606 398 0.8522 0.0118 0.8522 0.9231
No log 12.1212 400 0.8639 0.0525 0.8639 0.9294
No log 12.1818 402 0.8873 0.0476 0.8873 0.9420
No log 12.2424 404 0.9420 0.0409 0.9420 0.9706
No log 12.3030 406 0.8724 -0.0113 0.8724 0.9340
No log 12.3636 408 0.8851 -0.1588 0.8851 0.9408
No log 12.4242 410 0.8852 -0.0462 0.8852 0.9409
No log 12.4848 412 0.8719 -0.0103 0.8719 0.9338
No log 12.5455 414 0.8939 0.0452 0.8939 0.9455
No log 12.6061 416 0.8387 -0.0320 0.8387 0.9158
No log 12.6667 418 0.8689 -0.1472 0.8689 0.9322
No log 12.7273 420 0.8512 0.1292 0.8512 0.9226
No log 12.7879 422 0.9155 0.0065 0.9155 0.9568
No log 12.8485 424 0.9709 0.0409 0.9709 0.9853
No log 12.9091 426 0.8505 0.1003 0.8505 0.9222
No log 12.9697 428 0.8174 -0.0675 0.8174 0.9041
No log 13.0303 430 0.8120 -0.1535 0.8120 0.9011
No log 13.0909 432 0.7810 0.0611 0.7810 0.8837
No log 13.1515 434 0.9323 0.0346 0.9323 0.9655
No log 13.2121 436 1.0265 -0.0122 1.0265 1.0132
No log 13.2727 438 0.8733 0.1387 0.8733 0.9345
No log 13.3333 440 0.8345 0.0257 0.8345 0.9135
No log 13.3939 442 0.8750 0.0622 0.8750 0.9354
No log 13.4545 444 0.8897 0.0586 0.8897 0.9432
No log 13.5152 446 0.9906 -0.0441 0.9906 0.9953
No log 13.5758 448 1.0321 0.0260 1.0321 1.0159
No log 13.6364 450 0.9040 0.0377 0.9040 0.9508
No log 13.6970 452 0.8256 0.1047 0.8256 0.9086
No log 13.7576 454 0.8473 0.1449 0.8473 0.9205
No log 13.8182 456 0.9123 0.0867 0.9123 0.9551
No log 13.8788 458 0.9551 0.0867 0.9551 0.9773
No log 13.9394 460 0.9425 0.0476 0.9425 0.9708
No log 14.0 462 0.8927 0.1047 0.8927 0.9448
No log 14.0606 464 0.8663 0.2078 0.8663 0.9307
No log 14.1212 466 0.8744 0.0490 0.8744 0.9351
No log 14.1818 468 0.8080 0.2009 0.8080 0.8989
No log 14.2424 470 0.7510 0.1675 0.7510 0.8666
No log 14.3030 472 0.7212 0.0967 0.7212 0.8492
No log 14.3636 474 0.7203 0.0479 0.7203 0.8487
No log 14.4242 476 0.7466 0.1691 0.7466 0.8640
No log 14.4848 478 0.8537 0.0512 0.8537 0.9240
No log 14.5455 480 0.8457 0.0512 0.8457 0.9196
No log 14.6061 482 0.7863 0.1758 0.7863 0.8868
No log 14.6667 484 0.7971 -0.0967 0.7971 0.8928
No log 14.7273 486 0.8431 -0.1737 0.8431 0.9182
No log 14.7879 488 0.8517 -0.1271 0.8517 0.9229
No log 14.8485 490 0.9008 0.0628 0.9008 0.9491
No log 14.9091 492 1.0345 0.0346 1.0345 1.0171
No log 14.9697 494 0.9992 0.0409 0.9992 0.9996
No log 15.0303 496 0.8705 -0.0170 0.8705 0.9330
No log 15.0909 498 0.8466 0.0465 0.8466 0.9201
0.3451 15.1515 500 0.8302 0.0926 0.8302 0.9112
0.3451 15.2121 502 0.8231 0.1371 0.8231 0.9072
0.3451 15.2727 504 0.8743 0.0588 0.8743 0.9351
0.3451 15.3333 506 0.8381 0.0628 0.8381 0.9155
0.3451 15.3939 508 0.8339 0.0588 0.8339 0.9132
0.3451 15.4545 510 0.8502 0.0549 0.8502 0.9220
0.3451 15.5152 512 0.8691 0.0071 0.8691 0.9322
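
The QWK column is quadratic weighted kappa, and its consistently near-zero values indicate close-to-chance agreement with the reference labels. A sketch of how these metrics can be computed from model outputs is below; it assumes integer reference scores and rounds continuous predictions before computing kappa, which the card does not confirm.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def eval_metrics(y_true, y_pred):
    # QWK compares discretized predictions against integer gold scores;
    # rounding continuous outputs to the nearest integer is an assumption.
    qwk = cohen_kappa_score(y_true, np.rint(y_pred).astype(int), weights="quadratic")
    mse = mean_squared_error(y_true, y_pred)
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}
```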

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1