ArabicNewSplits7_FineTuningAraBERT_run3_AugV5_k2_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8524
  • Qwk: 0.6892
  • Mse: 0.8524
  • Rmse: 0.9233
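
Qwk here denotes quadratic weighted kappa (Cohen's kappa with quadratic distance weights), a standard agreement metric for ordinal scoring tasks; Rmse is simply the square root of Mse (0.9233 ≈ √0.8524). A minimal pure-Python sketch of these metrics on hypothetical integer labels (not the actual evaluation code):

```python
import math
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic weighted kappa for integer labels in [0, n_classes)."""
    # Observed confusion counts
    O = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    hist_t, hist_p = Counter(y_true), Counter(y_pred)
    n = len(y_true)
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic weight
            E = hist_t[i] * hist_p[j] / n            # expected count under chance
            num += w * O[i][j]
            den += w * E
    return 1.0 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# Hypothetical 4-class example
y_true = [0, 1, 2, 3, 2, 1]
y_pred = [0, 1, 2, 2, 2, 0]
qwk = quadratic_weighted_kappa(y_true, y_pred, n_classes=4)
m = mse(y_true, y_pred)
rmse = math.sqrt(m)
```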

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
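
With a linear scheduler, the learning rate decays from 2e-05 toward 0 over training. The results table below shows epoch 1.0 at step 12, so there were 12 optimizer steps per epoch (roughly 96 training examples at batch size 8) and a nominal 1200 total steps over 100 epochs. A small sketch of the implied schedule (assuming zero warmup, which the listed hyperparameters suggest):

```python
# Hyperparameters from the card
learning_rate = 2e-05
train_batch_size = 8
num_epochs = 100

# Inferred from the results table: epoch 1.0 corresponds to step 12,
# so ~12 optimizer steps per epoch (~96 training examples at batch size 8).
steps_per_epoch = 12
total_steps = steps_per_epoch * num_epochs  # 1200

def linear_lr(step, base_lr=learning_rate, total=total_steps, warmup=0):
    """Linear warmup (optional) then linear decay to 0 at `total` steps."""
    if step < warmup:
        return base_lr * step / max(1, warmup)
    return base_lr * max(0.0, (total - step) / max(1, total - warmup))

# Learning rate at the last logged step of this run (step 510):
lr_at_510 = linear_lr(510)  # 2e-05 * (1200 - 510) / 1200
```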

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.1667 2 7.1108 0.0055 7.1108 2.6666
No log 0.3333 4 4.5070 0.0620 4.5070 2.1230
No log 0.5 6 2.8718 0.0952 2.8718 1.6946
No log 0.6667 8 2.2613 0.1037 2.2613 1.5038
No log 0.8333 10 1.7092 0.1869 1.7092 1.3073
No log 1.0 12 1.6019 0.1905 1.6019 1.2657
No log 1.1667 14 1.6486 0.1905 1.6486 1.2840
No log 1.3333 16 1.7564 0.1905 1.7564 1.3253
No log 1.5 18 1.8860 0.1869 1.8860 1.3733
No log 1.6667 20 2.2138 0.1185 2.2138 1.4879
No log 1.8333 22 3.1032 0.0503 3.1032 1.7616
No log 2.0 24 3.1156 0.0503 3.1156 1.7651
No log 2.1667 26 2.4323 0.0667 2.4323 1.5596
No log 2.3333 28 1.7025 0.1964 1.7025 1.3048
No log 2.5 30 1.4533 0.2778 1.4533 1.2055
No log 2.6667 32 1.4861 0.1714 1.4861 1.2191
No log 2.8333 34 1.7352 0.1495 1.7352 1.3173
No log 3.0 36 1.7615 0.2018 1.7615 1.3272
No log 3.1667 38 1.5839 0.2909 1.5839 1.2585
No log 3.3333 40 1.5560 0.3009 1.5560 1.2474
No log 3.5 42 1.5358 0.3333 1.5358 1.2393
No log 3.6667 44 1.4825 0.3186 1.4825 1.2176
No log 3.8333 46 1.3540 0.3571 1.3540 1.1636
No log 4.0 48 1.3249 0.3966 1.3249 1.1510
No log 4.1667 50 1.2535 0.3966 1.2535 1.1196
No log 4.3333 52 1.2704 0.5079 1.2704 1.1271
No log 4.5 54 1.1770 0.5354 1.1770 1.0849
No log 4.6667 56 0.9817 0.5645 0.9817 0.9908
No log 4.8333 58 1.1473 0.5289 1.1473 1.0711
No log 5.0 60 1.4538 0.3684 1.4538 1.2057
No log 5.1667 62 1.2028 0.3684 1.2028 1.0967
No log 5.3333 64 1.0119 0.5865 1.0119 1.0059
No log 5.5 66 1.0073 0.6383 1.0073 1.0036
No log 5.6667 68 0.9126 0.6759 0.9126 0.9553
No log 5.8333 70 0.9060 0.6714 0.9060 0.9518
No log 6.0 72 1.1046 0.5344 1.1046 1.0510
No log 6.1667 74 1.0891 0.5344 1.0891 1.0436
No log 6.3333 76 0.9279 0.5954 0.9279 0.9633
No log 6.5 78 0.9581 0.6345 0.9581 0.9788
No log 6.6667 80 0.9612 0.6111 0.9612 0.9804
No log 6.8333 82 0.8609 0.6901 0.8609 0.9279
No log 7.0 84 1.0278 0.5816 1.0278 1.0138
No log 7.1667 86 1.1624 0.5379 1.1624 1.0781
No log 7.3333 88 1.0888 0.5594 1.0888 1.0434
No log 7.5 90 0.8447 0.6389 0.8447 0.9191
No log 7.6667 92 0.7885 0.7075 0.7885 0.8880
No log 7.8333 94 0.7921 0.7297 0.7921 0.8900
No log 8.0 96 0.7635 0.7733 0.7635 0.8738
No log 8.1667 98 0.8383 0.6577 0.8383 0.9156
No log 8.3333 100 0.9808 0.6267 0.9808 0.9903
No log 8.5 102 0.9644 0.6309 0.9644 0.9820
No log 8.6667 104 0.8345 0.6622 0.8345 0.9135
No log 8.8333 106 0.7991 0.7429 0.7991 0.8939
No log 9.0 108 0.8347 0.7246 0.8347 0.9136
No log 9.1667 110 0.9443 0.7059 0.9443 0.9718
No log 9.3333 112 1.1809 0.5238 1.1809 1.0867
No log 9.5 114 1.2504 0.4355 1.2504 1.1182
No log 9.6667 116 1.0457 0.6043 1.0457 1.0226
No log 9.8333 118 0.9005 0.6993 0.9005 0.9489
No log 10.0 120 0.8576 0.72 0.8576 0.9261
No log 10.1667 122 0.8399 0.6753 0.8399 0.9165
No log 10.3333 124 0.9124 0.6452 0.9124 0.9552
No log 10.5 126 1.1035 0.6164 1.1035 1.0505
No log 10.6667 128 1.0596 0.6013 1.0596 1.0294
No log 10.8333 130 0.9122 0.6438 0.9122 0.9551
No log 11.0 132 0.8426 0.6901 0.8426 0.9179
No log 11.1667 134 0.8593 0.6806 0.8593 0.9270
No log 11.3333 136 0.9642 0.625 0.9642 0.9819
No log 11.5 138 0.9668 0.6164 0.9668 0.9833
No log 11.6667 140 0.8974 0.6486 0.8974 0.9473
No log 11.8333 142 0.8076 0.7059 0.8076 0.8987
No log 12.0 144 0.7660 0.7483 0.7660 0.8752
No log 12.1667 146 0.7792 0.7483 0.7792 0.8827
No log 12.3333 148 0.7778 0.7260 0.7778 0.8819
No log 12.5 150 0.8543 0.6853 0.8543 0.9243
No log 12.6667 152 0.9816 0.5890 0.9816 0.9908
No log 12.8333 154 1.0607 0.5890 1.0607 1.0299
No log 13.0 156 1.1253 0.5556 1.1253 1.0608
No log 13.1667 158 1.1924 0.5068 1.1924 1.0920
No log 13.3333 160 1.0567 0.6275 1.0567 1.0279
No log 13.5 162 0.8389 0.7125 0.8389 0.9159
No log 13.6667 164 0.8215 0.7051 0.8215 0.9064
No log 13.8333 166 0.8614 0.6497 0.8614 0.9281
No log 14.0 168 0.9410 0.6203 0.9410 0.9701
No log 14.1667 170 1.0563 0.6211 1.0563 1.0277
No log 14.3333 172 0.9903 0.6267 0.9903 0.9951
No log 14.5 174 0.8057 0.6866 0.8057 0.8976
No log 14.6667 176 0.7826 0.7059 0.7826 0.8847
No log 14.8333 178 0.7602 0.7376 0.7602 0.8719
No log 15.0 180 0.7341 0.7517 0.7341 0.8568
No log 15.1667 182 0.8595 0.7093 0.8595 0.9271
No log 15.3333 184 0.9486 0.7119 0.9486 0.9740
No log 15.5 186 0.9357 0.6824 0.9357 0.9673
No log 15.6667 188 0.9810 0.6585 0.9810 0.9905
No log 15.8333 190 0.9151 0.6241 0.9151 0.9566
No log 16.0 192 0.8550 0.6154 0.8550 0.9247
No log 16.1667 194 0.8589 0.6260 0.8589 0.9268
No log 16.3333 196 0.8128 0.6667 0.8128 0.9016
No log 16.5 198 0.8631 0.6486 0.8631 0.9290
No log 16.6667 200 0.9815 0.6818 0.9815 0.9907
No log 16.8333 202 0.8811 0.7073 0.8811 0.9387
No log 17.0 204 0.6586 0.7651 0.6586 0.8115
No log 17.1667 206 0.6408 0.7534 0.6408 0.8005
No log 17.3333 208 0.7442 0.6759 0.7442 0.8627
No log 17.5 210 0.8482 0.6573 0.8482 0.9210
No log 17.6667 212 0.8615 0.6370 0.8615 0.9282
No log 17.8333 214 0.8236 0.6715 0.8236 0.9075
No log 18.0 216 0.8150 0.6715 0.8150 0.9028
No log 18.1667 218 0.7462 0.7206 0.7462 0.8638
No log 18.3333 220 0.6725 0.7586 0.6725 0.8201
No log 18.5 222 0.6464 0.7703 0.6464 0.8040
No log 18.6667 224 0.6719 0.7568 0.6719 0.8197
No log 18.8333 226 0.8381 0.6753 0.8381 0.9155
No log 19.0 228 0.9414 0.5986 0.9414 0.9702
No log 19.1667 230 0.8698 0.6475 0.8698 0.9326
No log 19.3333 232 0.8325 0.7111 0.8325 0.9124
No log 19.5 234 0.8142 0.7299 0.8142 0.9023
No log 19.6667 236 0.8461 0.6870 0.8461 0.9198
No log 19.8333 238 0.9333 0.6667 0.9333 0.9661
No log 20.0 240 0.9887 0.6099 0.9887 0.9943
No log 20.1667 242 0.9326 0.6667 0.9326 0.9657
No log 20.3333 244 0.8998 0.6912 0.8998 0.9486
No log 20.5 246 0.8705 0.6866 0.8705 0.9330
No log 20.6667 248 0.8412 0.7007 0.8412 0.9171
No log 20.8333 250 0.8468 0.7034 0.8468 0.9202
No log 21.0 252 0.8037 0.7123 0.8037 0.8965
No log 21.1667 254 0.7463 0.7550 0.7463 0.8639
No log 21.3333 256 0.7430 0.7564 0.7430 0.8619
No log 21.5 258 0.7862 0.7195 0.7862 0.8867
No log 21.6667 260 0.7274 0.75 0.7274 0.8529
No log 21.8333 262 0.7165 0.7651 0.7165 0.8465
No log 22.0 264 0.7859 0.7162 0.7859 0.8865
No log 22.1667 266 0.8773 0.6531 0.8773 0.9366
No log 22.3333 268 0.8909 0.6712 0.8909 0.9439
No log 22.5 270 0.9299 0.6438 0.9299 0.9643
No log 22.6667 272 0.8884 0.6712 0.8884 0.9425
No log 22.8333 274 0.8097 0.7162 0.8097 0.8998
No log 23.0 276 0.7891 0.6429 0.7891 0.8883
No log 23.1667 278 0.8115 0.7260 0.8115 0.9008
No log 23.3333 280 0.8470 0.6993 0.8470 0.9203
No log 23.5 282 0.9293 0.6338 0.9293 0.9640
No log 23.6667 284 0.9399 0.6338 0.9399 0.9695
No log 23.8333 286 0.8569 0.6667 0.8569 0.9257
No log 24.0 288 0.8045 0.6857 0.8045 0.8969
No log 24.1667 290 0.8007 0.6857 0.8007 0.8948
No log 24.3333 292 0.7841 0.7042 0.7841 0.8855
No log 24.5 294 0.8146 0.7285 0.8146 0.9026
No log 24.6667 296 0.8265 0.7059 0.8265 0.9091
No log 24.8333 298 0.7984 0.7285 0.7984 0.8935
No log 25.0 300 0.7734 0.7172 0.7734 0.8794
No log 25.1667 302 0.8073 0.72 0.8073 0.8985
No log 25.3333 304 0.8920 0.6667 0.8920 0.9444
No log 25.5 306 0.9488 0.6241 0.9488 0.9741
No log 25.6667 308 0.9099 0.6165 0.9099 0.9539
No log 25.8333 310 0.8758 0.6107 0.8758 0.9358
No log 26.0 312 0.8291 0.7050 0.8291 0.9105
No log 26.1667 314 0.8214 0.6944 0.8214 0.9063
No log 26.3333 316 0.8883 0.6475 0.8883 0.9425
No log 26.5 318 0.9865 0.6405 0.9865 0.9932
No log 26.6667 320 1.0269 0.6242 1.0269 1.0134
No log 26.8333 322 0.9218 0.6575 0.9218 0.9601
No log 27.0 324 0.8419 0.6853 0.8419 0.9175
No log 27.1667 326 0.8149 0.7183 0.8149 0.9027
No log 27.3333 328 0.8705 0.6522 0.8705 0.9330
No log 27.5 330 0.9574 0.6475 0.9574 0.9785
No log 27.6667 332 1.0478 0.6053 1.0478 1.0236
No log 27.8333 334 0.9814 0.6395 0.9814 0.9906
No log 28.0 336 0.8358 0.6667 0.8358 0.9142
No log 28.1667 338 0.6990 0.7586 0.6990 0.8360
No log 28.3333 340 0.6579 0.7632 0.6579 0.8111
No log 28.5 342 0.6492 0.7742 0.6492 0.8057
No log 28.6667 344 0.6711 0.7922 0.6711 0.8192
No log 28.8333 346 0.7993 0.7073 0.7993 0.8940
No log 29.0 348 0.9587 0.6667 0.9587 0.9791
No log 29.1667 350 0.9402 0.6584 0.9402 0.9696
No log 29.3333 352 0.8431 0.6351 0.8431 0.9182
No log 29.5 354 0.8009 0.7050 0.8009 0.8950
No log 29.6667 356 0.8172 0.6714 0.8172 0.9040
No log 29.8333 358 0.8783 0.6351 0.8783 0.9372
No log 30.0 360 0.8592 0.6522 0.8592 0.9269
No log 30.1667 362 0.8181 0.6522 0.8181 0.9045
No log 30.3333 364 0.7788 0.6957 0.7788 0.8825
No log 30.5 366 0.7530 0.7194 0.7530 0.8677
No log 30.6667 368 0.7778 0.6857 0.7778 0.8819
No log 30.8333 370 0.8354 0.6892 0.8354 0.9140
No log 31.0 372 0.8698 0.6483 0.8698 0.9326
No log 31.1667 374 0.8845 0.6479 0.8845 0.9405
No log 31.3333 376 0.8580 0.6842 0.8580 0.9263
No log 31.5 378 0.8187 0.7083 0.8187 0.9048
No log 31.6667 380 0.7432 0.6763 0.7432 0.8621
No log 31.8333 382 0.7087 0.7273 0.7087 0.8418
No log 32.0 384 0.7167 0.7413 0.7167 0.8466
No log 32.1667 386 0.7629 0.7020 0.7629 0.8735
No log 32.3333 388 0.7664 0.7273 0.7664 0.8754
No log 32.5 390 0.7270 0.7333 0.7270 0.8527
No log 32.6667 392 0.7101 0.7517 0.7101 0.8427
No log 32.8333 394 0.7158 0.7568 0.7158 0.8461
No log 33.0 396 0.7240 0.75 0.7240 0.8509
No log 33.1667 398 0.7743 0.7183 0.7743 0.8800
No log 33.3333 400 0.8415 0.6897 0.8415 0.9173
No log 33.5 402 0.8268 0.7083 0.8268 0.9093
No log 33.6667 404 0.7831 0.7183 0.7831 0.8849
No log 33.8333 406 0.7566 0.7183 0.7566 0.8698
No log 34.0 408 0.7954 0.7310 0.7954 0.8918
No log 34.1667 410 0.9323 0.6974 0.9323 0.9655
No log 34.3333 412 0.9951 0.6797 0.9951 0.9975
No log 34.5 414 0.9178 0.7020 0.9178 0.9580
No log 34.6667 416 0.7972 0.7083 0.7972 0.8929
No log 34.8333 418 0.7481 0.7042 0.7481 0.8649
No log 35.0 420 0.7540 0.7260 0.7540 0.8683
No log 35.1667 422 0.7464 0.7234 0.7464 0.8639
No log 35.3333 424 0.7680 0.7383 0.7680 0.8763
No log 35.5 426 0.7652 0.7383 0.7652 0.8748
No log 35.6667 428 0.7354 0.7333 0.7354 0.8575
No log 35.8333 430 0.6921 0.7682 0.6921 0.8319
No log 36.0 432 0.6928 0.7682 0.6928 0.8323
No log 36.1667 434 0.7341 0.7692 0.7341 0.8568
No log 36.3333 436 0.7624 0.7451 0.7624 0.8731
No log 36.5 438 0.7873 0.7432 0.7873 0.8873
No log 36.6667 440 0.7908 0.7432 0.7908 0.8893
No log 36.8333 442 0.7920 0.7432 0.7920 0.8900
No log 37.0 444 0.7966 0.7550 0.7966 0.8925
No log 37.1667 446 0.7960 0.7532 0.7960 0.8922
No log 37.3333 448 0.8214 0.7389 0.8214 0.9063
No log 37.5 450 0.8331 0.7389 0.8331 0.9127
No log 37.6667 452 0.8287 0.7389 0.8287 0.9103
No log 37.8333 454 0.7600 0.7383 0.7600 0.8718
No log 38.0 456 0.7372 0.7183 0.7372 0.8586
No log 38.1667 458 0.7650 0.7092 0.7650 0.8746
No log 38.3333 460 0.7802 0.7183 0.7802 0.8833
No log 38.5 462 0.7609 0.7050 0.7609 0.8723
No log 38.6667 464 0.7502 0.7211 0.7502 0.8661
No log 38.8333 466 0.7603 0.7320 0.7603 0.8720
No log 39.0 468 0.7638 0.75 0.7638 0.8739
No log 39.1667 470 0.7234 0.7619 0.7234 0.8506
No log 39.3333 472 0.6924 0.7515 0.6924 0.8321
No log 39.5 474 0.7035 0.7407 0.7035 0.8387
No log 39.6667 476 0.7316 0.7308 0.7316 0.8554
No log 39.8333 478 0.7464 0.7114 0.7464 0.8639
No log 40.0 480 0.7714 0.7211 0.7714 0.8783
No log 40.1667 482 0.7673 0.6993 0.7673 0.8759
No log 40.3333 484 0.7465 0.7222 0.7465 0.8640
No log 40.5 486 0.7327 0.7222 0.7327 0.8560
No log 40.6667 488 0.7344 0.7310 0.7344 0.8570
No log 40.8333 490 0.7961 0.75 0.7961 0.8923
No log 41.0 492 0.8810 0.7059 0.8810 0.9386
No log 41.1667 494 0.8970 0.6577 0.8970 0.9471
No log 41.3333 496 0.8680 0.6759 0.8680 0.9317
No log 41.5 498 0.7871 0.7050 0.7871 0.8872
0.3795 41.6667 500 0.7426 0.7429 0.7426 0.8618
0.3795 41.8333 502 0.7465 0.7286 0.7465 0.8640
0.3795 42.0 504 0.7505 0.7234 0.7505 0.8663
0.3795 42.1667 506 0.7884 0.6912 0.7884 0.8879
0.3795 42.3333 508 0.8212 0.7162 0.8212 0.9062
0.3795 42.5 510 0.8524 0.6892 0.8524 0.9233

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model details

  • Format: Safetensors
  • Model size: 0.1B params
  • Tensor type: F32

Model tree for MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run3_AugV5_k2_task1_organization

  • Fine-tuned from aubmindlab/bert-base-arabertv02