ArabicNewSplits7_FineTuningAraBERT_run1_AugV5_k18_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch for recomputing these metrics follows the list):

  • Loss: 0.7389
  • Qwk: 0.6944
  • Mse: 0.7389
  • Rmse: 0.8596
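
Qwk is the quadratic weighted kappa between predicted and reference scores; Mse and Rmse are the mean squared error and its square root. As a minimal sketch (not the exact evaluation code used for this run), these metrics can be recomputed from discrete score predictions and labels with scikit-learn and NumPy; the compute_metrics helper below is illustrative:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(preds, labels):
    # Illustrative helper: assumes both arrays hold discrete rubric scores.
    preds = np.asarray(preds)
    labels = np.asarray(labels)
    qwk = cohen_kappa_score(labels, preds, weights="quadratic")   # Qwk
    mse = mean_squared_error(labels, preds)                       # Mse
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}  # Rmse

print(compute_metrics([1, 2, 3, 2], [1, 3, 3, 2]))
```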

Model description

More information needed

Intended uses & limitations

More information needed
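
Pending a fuller description, the sketch below shows one plausible way to load the checkpoint for inference with the Transformers Auto classes. It assumes the model carries a sequence-classification (scoring) head and uses the published repository id; the sample text and the interpretation of the logits are illustrative, not documented:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run1_AugV5_k18_task1_organization"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# Score a sample Arabic passage; label/score semantics are not documented in this card.
inputs = tokenizer("هذا نص تجريبي لتقييم تنظيم المقال.", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
print(logits)
```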

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (an equivalent TrainingArguments sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
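
These settings map directly onto a Hugging Face TrainingArguments object. The sketch below reproduces them under the assumption that the standard Trainer API was used; the output directory is a placeholder, and anything not listed above is left at its default:

```python
from transformers import TrainingArguments

# Hyperparameters as listed above; output_dir is a placeholder.
training_args = TrainingArguments(
    output_dir="arabert_task1_organization",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```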

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0244 2 7.1110 0.0056 7.1110 2.6666
No log 0.0488 4 4.5651 0.1138 4.5651 2.1366
No log 0.0732 6 3.8657 -0.0847 3.8657 1.9661
No log 0.0976 8 3.2204 0.0241 3.2204 1.7945
No log 0.1220 10 2.0434 0.1429 2.0434 1.4295
No log 0.1463 12 2.0285 0.2069 2.0285 1.4243
No log 0.1707 14 2.5115 0.0276 2.5115 1.5848
No log 0.1951 16 2.5544 -0.0135 2.5544 1.5982
No log 0.2195 18 2.1723 0.1102 2.1723 1.4739
No log 0.2439 20 1.8919 0.1416 1.8919 1.3755
No log 0.2683 22 1.8656 0.1982 1.8656 1.3659
No log 0.2927 24 1.9930 0.3051 1.9930 1.4118
No log 0.3171 26 2.0065 0.2381 2.0065 1.4165
No log 0.3415 28 1.7498 0.3761 1.7498 1.3228
No log 0.3659 30 1.4276 0.2752 1.4276 1.1948
No log 0.3902 32 1.4126 0.1887 1.4126 1.1885
No log 0.4146 34 1.3500 0.3684 1.3500 1.1619
No log 0.4390 36 1.2649 0.4071 1.2649 1.1247
No log 0.4634 38 1.3406 0.4 1.3406 1.1578
No log 0.4878 40 1.6808 0.3824 1.6808 1.2965
No log 0.5122 42 1.6269 0.3556 1.6269 1.2755
No log 0.5366 44 1.2605 0.4262 1.2605 1.1227
No log 0.5610 46 1.2032 0.3717 1.2032 1.0969
No log 0.5854 48 1.2667 0.3684 1.2667 1.1255
No log 0.6098 50 1.2362 0.3860 1.2362 1.1118
No log 0.6341 52 1.1536 0.5082 1.1536 1.0741
No log 0.6585 54 1.1366 0.5821 1.1366 1.0661
No log 0.6829 56 1.1889 0.5802 1.1889 1.0904
No log 0.7073 58 1.5023 0.3559 1.5023 1.2257
No log 0.7317 60 1.8838 0.2017 1.8838 1.3725
No log 0.7561 62 1.7050 0.2564 1.7050 1.3058
No log 0.7805 64 1.4599 0.3248 1.4599 1.2082
No log 0.8049 66 1.0821 0.5238 1.0821 1.0402
No log 0.8293 68 1.0747 0.5116 1.0747 1.0367
No log 0.8537 70 1.3319 0.4211 1.3319 1.1541
No log 0.8780 72 1.5333 0.4118 1.5333 1.2383
No log 0.9024 74 1.6248 0.3741 1.6248 1.2747
No log 0.9268 76 1.2982 0.5038 1.2982 1.1394
No log 0.9512 78 1.1542 0.4651 1.1542 1.0743
No log 0.9756 80 1.4440 0.4286 1.4440 1.2017
No log 1.0 82 1.6466 0.3597 1.6466 1.2832
No log 1.0244 84 1.2930 0.5072 1.2930 1.1371
No log 1.0488 86 1.0478 0.5714 1.0478 1.0236
No log 1.0732 88 1.6025 0.2602 1.6025 1.2659
No log 1.0976 90 1.9527 0.1935 1.9527 1.3974
No log 1.1220 92 1.6006 0.3485 1.6006 1.2651
No log 1.1463 94 1.1882 0.4687 1.1882 1.0900
No log 1.1707 96 0.9791 0.5909 0.9791 0.9895
No log 1.1951 98 0.9613 0.6212 0.9613 0.9805
No log 1.2195 100 1.1204 0.5455 1.1204 1.0585
No log 1.2439 102 1.4708 0.4627 1.4708 1.2128
No log 1.2683 104 1.7952 0.2029 1.7952 1.3398
No log 1.2927 106 1.7640 0.2899 1.7640 1.3281
No log 1.3171 108 1.4257 0.4889 1.4257 1.1940
No log 1.3415 110 1.2611 0.5401 1.2611 1.1230
No log 1.3659 112 1.2312 0.5507 1.2312 1.1096
No log 1.3902 114 1.4698 0.4889 1.4698 1.2124
No log 1.4146 116 1.6202 0.3830 1.6202 1.2729
No log 1.4390 118 1.4179 0.5109 1.4179 1.1907
No log 1.4634 120 1.0093 0.6475 1.0093 1.0046
No log 1.4878 122 0.9048 0.6569 0.9048 0.9512
No log 1.5122 124 0.8961 0.6423 0.8961 0.9466
No log 1.5366 126 0.9413 0.6370 0.9413 0.9702
No log 1.5610 128 1.0815 0.6277 1.0815 1.0399
No log 1.5854 130 1.3402 0.4889 1.3402 1.1577
No log 1.6098 132 1.3589 0.4511 1.3589 1.1657
No log 1.6341 134 1.2093 0.5588 1.2093 1.0997
No log 1.6585 136 1.1372 0.5839 1.1372 1.0664
No log 1.6829 138 1.0943 0.5942 1.0943 1.0461
No log 1.7073 140 1.1732 0.5985 1.1732 1.0831
No log 1.7317 142 1.0890 0.6232 1.0890 1.0435
No log 1.7561 144 0.9062 0.6176 0.9062 0.9519
No log 1.7805 146 0.9023 0.6331 0.9023 0.9499
No log 1.8049 148 1.0906 0.5882 1.0906 1.0443
No log 1.8293 150 1.2435 0.5547 1.2435 1.1151
No log 1.8537 152 1.2023 0.5778 1.2023 1.0965
No log 1.8780 154 0.9663 0.5781 0.9663 0.9830
No log 1.9024 156 0.8442 0.6718 0.8442 0.9188
No log 1.9268 158 0.8435 0.6615 0.8435 0.9184
No log 1.9512 160 0.7857 0.7246 0.7857 0.8864
No log 1.9756 162 0.7300 0.7552 0.7300 0.8544
No log 2.0 164 0.7194 0.7448 0.7194 0.8482
No log 2.0244 166 0.7625 0.6571 0.7625 0.8732
No log 2.0488 168 0.7519 0.7310 0.7519 0.8671
No log 2.0732 170 0.7613 0.7552 0.7613 0.8725
No log 2.0976 172 0.7557 0.7324 0.7557 0.8693
No log 2.1220 174 0.7492 0.7222 0.7492 0.8656
No log 2.1463 176 0.7686 0.6901 0.7686 0.8767
No log 2.1707 178 0.7888 0.6667 0.7888 0.8881
No log 2.1951 180 0.8752 0.6471 0.8752 0.9355
No log 2.2195 182 0.9004 0.6471 0.9004 0.9489
No log 2.2439 184 0.8596 0.6423 0.8596 0.9272
No log 2.2683 186 0.8269 0.6522 0.8269 0.9093
No log 2.2927 188 0.8756 0.6525 0.8756 0.9358
No log 2.3171 190 1.0616 0.6197 1.0616 1.0303
No log 2.3415 192 1.0268 0.6345 1.0268 1.0133
No log 2.3659 194 0.8827 0.6528 0.8827 0.9395
No log 2.3902 196 0.8274 0.6241 0.8274 0.9096
No log 2.4146 198 0.7723 0.6338 0.7723 0.8788
No log 2.4390 200 0.8025 0.6479 0.8025 0.8958
No log 2.4634 202 0.7608 0.6429 0.7608 0.8723
No log 2.4878 204 0.7043 0.7448 0.7043 0.8392
No log 2.5122 206 0.6660 0.7785 0.6660 0.8161
No log 2.5366 208 0.6395 0.7703 0.6395 0.7997
No log 2.5610 210 0.6658 0.7260 0.6658 0.8159
No log 2.5854 212 0.7856 0.6389 0.7856 0.8864
No log 2.6098 214 0.7521 0.6389 0.7521 0.8672
No log 2.6341 216 0.7330 0.6667 0.7330 0.8561
No log 2.6585 218 0.8169 0.6667 0.8169 0.9038
No log 2.6829 220 0.8237 0.6569 0.8237 0.9076
No log 2.7073 222 0.8159 0.6569 0.8159 0.9033
No log 2.7317 224 0.8354 0.6569 0.8354 0.9140
No log 2.7561 226 0.7792 0.6569 0.7792 0.8827
No log 2.7805 228 0.6944 0.7808 0.6944 0.8333
No log 2.8049 230 0.7659 0.6714 0.7659 0.8752
No log 2.8293 232 0.7838 0.6522 0.7838 0.8853
No log 2.8537 234 0.8843 0.6475 0.8843 0.9404
No log 2.8780 236 0.8826 0.6475 0.8826 0.9395
No log 2.9024 238 0.8782 0.6475 0.8782 0.9371
No log 2.9268 240 0.8373 0.6522 0.8373 0.9151
No log 2.9512 242 0.6917 0.7586 0.6917 0.8317
No log 2.9756 244 0.6764 0.7586 0.6764 0.8225
No log 3.0 246 0.6881 0.7606 0.6881 0.8295
No log 3.0244 248 0.6941 0.6857 0.6941 0.8331
No log 3.0488 250 0.6814 0.7042 0.6814 0.8255
No log 3.0732 252 0.6458 0.7123 0.6458 0.8036
No log 3.0976 254 0.6436 0.7211 0.6436 0.8023
No log 3.1220 256 0.6363 0.7211 0.6363 0.7977
No log 3.1463 258 0.7085 0.6759 0.7085 0.8417
No log 3.1707 260 0.8003 0.6667 0.8003 0.8946
No log 3.1951 262 0.8868 0.6522 0.8868 0.9417
No log 3.2195 264 1.0035 0.6475 1.0035 1.0018
No log 3.2439 266 0.9810 0.6522 0.9810 0.9905
No log 3.2683 268 0.9310 0.6222 0.9310 0.9649
No log 3.2927 270 0.8936 0.6515 0.8936 0.9453
No log 3.3171 272 0.8474 0.6515 0.8474 0.9205
No log 3.3415 274 0.8947 0.6522 0.8947 0.9459
No log 3.3659 276 1.0835 0.6667 1.0835 1.0409
No log 3.3902 278 1.2089 0.6494 1.2089 1.0995
No log 3.4146 280 1.1598 0.6494 1.1598 1.0769
No log 3.4390 282 0.9008 0.6438 0.9008 0.9491
No log 3.4634 284 0.7348 0.6892 0.7348 0.8572
No log 3.4878 286 0.6706 0.6809 0.6706 0.8189
No log 3.5122 288 0.6886 0.6809 0.6886 0.8298
No log 3.5366 290 0.7492 0.6763 0.7492 0.8655
No log 3.5610 292 0.7612 0.6857 0.7612 0.8725
No log 3.5854 294 0.7749 0.6714 0.7749 0.8803
No log 3.6098 296 0.7451 0.6944 0.7451 0.8632
No log 3.6341 298 0.7218 0.7075 0.7218 0.8496
No log 3.6585 300 0.6373 0.7516 0.6373 0.7983
No log 3.6829 302 0.6484 0.7898 0.6484 0.8052
No log 3.7073 304 0.6768 0.7550 0.6768 0.8226
No log 3.7317 306 0.7276 0.6667 0.7276 0.8530
No log 3.7561 308 0.7824 0.6573 0.7824 0.8845
No log 3.7805 310 0.7905 0.6993 0.7905 0.8891
No log 3.8049 312 0.7912 0.6993 0.7912 0.8895
No log 3.8293 314 0.7594 0.7222 0.7594 0.8714
No log 3.8537 316 0.7418 0.7310 0.7418 0.8613
No log 3.8780 318 0.7264 0.7083 0.7264 0.8523
No log 3.9024 320 0.7122 0.7034 0.7122 0.8439
No log 3.9268 322 0.7673 0.6621 0.7673 0.8760
No log 3.9512 324 0.7773 0.6575 0.7773 0.8816
No log 3.9756 326 0.6926 0.7075 0.6926 0.8322
No log 4.0 328 0.7126 0.7517 0.7126 0.8442
No log 4.0244 330 0.7731 0.7172 0.7731 0.8792
No log 4.0488 332 0.7778 0.6857 0.7778 0.8819
No log 4.0732 334 0.7978 0.7153 0.7978 0.8932
No log 4.0976 336 0.8843 0.5954 0.8843 0.9404
No log 4.1220 338 0.9686 0.5909 0.9686 0.9842
No log 4.1463 340 0.9679 0.6015 0.9679 0.9838
No log 4.1707 342 0.8893 0.5909 0.8893 0.9430
No log 4.1951 344 0.8310 0.6232 0.8310 0.9116
No log 4.2195 346 0.8263 0.6763 0.8263 0.9090
No log 4.2439 348 0.8584 0.6763 0.8584 0.9265
No log 4.2683 350 0.8279 0.6812 0.8279 0.9099
No log 4.2927 352 0.7734 0.7092 0.7734 0.8794
No log 4.3171 354 0.7639 0.6714 0.7639 0.8740
No log 4.3415 356 0.8199 0.6577 0.8199 0.9055
No log 4.3659 358 0.9355 0.6792 0.9355 0.9672
No log 4.3902 360 0.8874 0.6755 0.8874 0.9420
No log 4.4146 362 0.7926 0.6345 0.7926 0.8903
No log 4.4390 364 0.7424 0.6714 0.7424 0.8616
No log 4.4634 366 0.7409 0.7338 0.7409 0.8608
No log 4.4878 368 0.7467 0.7714 0.7467 0.8641
No log 4.5122 370 0.7454 0.7391 0.7454 0.8633
No log 4.5366 372 0.7147 0.7299 0.7147 0.8454
No log 4.5610 374 0.6886 0.6763 0.6886 0.8298
No log 4.5854 376 0.6333 0.75 0.6333 0.7958
No log 4.6098 378 0.6097 0.7862 0.6097 0.7808
No log 4.6341 380 0.6111 0.7895 0.6111 0.7818
No log 4.6585 382 0.6056 0.7785 0.6056 0.7782
No log 4.6829 384 0.6822 0.6528 0.6822 0.8260
No log 4.7073 386 0.7357 0.6438 0.7357 0.8577
No log 4.7317 388 0.7062 0.6483 0.7062 0.8403
No log 4.7561 390 0.6760 0.6950 0.6760 0.8222
No log 4.7805 392 0.6790 0.7376 0.6790 0.8240
No log 4.8049 394 0.6926 0.7234 0.6926 0.8322
No log 4.8293 396 0.6609 0.7763 0.6609 0.8129
No log 4.8537 398 0.5987 0.7712 0.5987 0.7738
No log 4.8780 400 0.6614 0.7273 0.6614 0.8133
No log 4.9024 402 0.7621 0.7114 0.7621 0.8730
No log 4.9268 404 0.8380 0.6757 0.8380 0.9154
No log 4.9512 406 0.8031 0.6939 0.8031 0.8962
No log 4.9756 408 0.7265 0.7075 0.7265 0.8523
No log 5.0 410 0.7028 0.7273 0.7028 0.8383
No log 5.0244 412 0.6667 0.7324 0.6667 0.8165
No log 5.0488 414 0.6755 0.7310 0.6755 0.8219
No log 5.0732 416 0.8015 0.7417 0.8015 0.8953
No log 5.0976 418 0.8438 0.7417 0.8438 0.9186
No log 5.1220 420 0.7503 0.7297 0.7503 0.8662
No log 5.1463 422 0.6573 0.7619 0.6573 0.8107
No log 5.1707 424 0.6920 0.7042 0.6920 0.8319
No log 5.1951 426 0.7681 0.6853 0.7681 0.8764
No log 5.2195 428 0.7669 0.6901 0.7669 0.8757
No log 5.2439 430 0.7218 0.7234 0.7218 0.8496
No log 5.2683 432 0.6641 0.7692 0.6641 0.8149
No log 5.2927 434 0.6158 0.7947 0.6158 0.7847
No log 5.3171 436 0.5937 0.7949 0.5937 0.7706
No log 5.3415 438 0.6090 0.7848 0.6090 0.7804
No log 5.3659 440 0.6687 0.7347 0.6687 0.8177
No log 5.3902 442 0.6829 0.7550 0.6829 0.8264
No log 5.4146 444 0.6967 0.7662 0.6967 0.8347
No log 5.4390 446 0.7011 0.7662 0.7011 0.8373
No log 5.4634 448 0.6948 0.7484 0.6948 0.8336
No log 5.4878 450 0.6781 0.7484 0.6781 0.8235
No log 5.5122 452 0.6294 0.7692 0.6294 0.7934
No log 5.5366 454 0.6356 0.7692 0.6356 0.7973
No log 5.5610 456 0.6889 0.7226 0.6889 0.8300
No log 5.5854 458 0.7123 0.7059 0.7123 0.8440
No log 5.6098 460 0.6461 0.7484 0.6461 0.8038
No log 5.6341 462 0.5705 0.7950 0.5705 0.7553
No log 5.6585 464 0.5191 0.825 0.5191 0.7205
No log 5.6829 466 0.5530 0.7901 0.5530 0.7437
No log 5.7073 468 0.5525 0.8101 0.5525 0.7433
No log 5.7317 470 0.5486 0.8077 0.5486 0.7407
No log 5.7561 472 0.6888 0.7134 0.6888 0.8299
No log 5.7805 474 0.7566 0.6846 0.7566 0.8698
No log 5.8049 476 0.6837 0.6667 0.6837 0.8269
No log 5.8293 478 0.6201 0.7895 0.6201 0.7875
No log 5.8537 480 0.6039 0.7974 0.6039 0.7771
No log 5.8780 482 0.5570 0.7871 0.5570 0.7463
No log 5.9024 484 0.5936 0.7742 0.5936 0.7705
No log 5.9268 486 0.7879 0.6573 0.7879 0.8877
No log 5.9512 488 1.0587 0.6164 1.0587 1.0289
No log 5.9756 490 1.0907 0.6164 1.0907 1.0444
No log 6.0 492 0.9870 0.6434 0.9870 0.9935
No log 6.0244 494 0.7858 0.6944 0.7858 0.8865
No log 6.0488 496 0.6341 0.7467 0.6341 0.7963
No log 6.0732 498 0.6049 0.7792 0.6049 0.7778
0.4186 6.0976 500 0.6336 0.7397 0.6336 0.7960
0.4186 6.1220 502 0.7450 0.6944 0.7450 0.8631
0.4186 6.1463 504 0.8595 0.7134 0.8595 0.9271
0.4186 6.1707 506 0.8750 0.6797 0.8750 0.9354
0.4186 6.1951 508 0.8338 0.6667 0.8338 0.9131
0.4186 6.2195 510 0.7389 0.6944 0.7389 0.8596

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1