ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k17_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7522
  • Qwk (quadratic weighted kappa): 0.7114
  • Mse (mean squared error): 0.7522
  • Rmse (root mean squared error): 0.8673
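
Loss and Mse are identical (0.7522), which suggests a single-logit regression head trained with an MSE objective. Under that assumption, a minimal inference sketch (the repository ID is taken from the model tree at the end of this card):

  # Minimal sketch: load the checkpoint and score one text.
  # Assumes a single-logit regression head; this card does not confirm it.
  import torch
  from transformers import AutoTokenizer, AutoModelForSequenceClassification

  repo_id = "MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k17_task1_organization"
  tokenizer = AutoTokenizer.from_pretrained(repo_id)
  model = AutoModelForSequenceClassification.from_pretrained(repo_id)
  model.eval()

  text = "..."  # placeholder: an Arabic passage to score
  inputs = tokenizer(text, return_tensors="pt", truncation=True)
  with torch.no_grad():
      score = model(**inputs).logits.squeeze().item()
  print(score)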

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
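
These settings map onto a transformers TrainingArguments configuration roughly as shown below; this is a sketch under the framework versions listed at the end of the card, and output_dir (plus anything not listed above) is an assumption:

  # Sketch: TrainingArguments mirroring the listed hyperparameters.
  # output_dir is a hypothetical path, not taken from the card.
  from transformers import TrainingArguments

  args = TrainingArguments(
      output_dir="arabert_task1_organization",
      learning_rate=2e-5,
      per_device_train_batch_size=8,
      per_device_eval_batch_size=8,
      seed=42,
      adam_beta1=0.9,
      adam_beta2=0.999,
      adam_epsilon=1e-8,
      lr_scheduler_type="linear",
      num_train_epochs=100,
  )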

Training results

Training loss is logged every 500 steps, so "No log" entries mean it had not yet been recorded at that evaluation step.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0260 2 7.1412 -0.0056 7.1412 2.6723
No log 0.0519 4 4.4280 0.0598 4.4280 2.1043
No log 0.0779 6 3.0605 0.0952 3.0605 1.7494
No log 0.1039 8 2.2310 0.1690 2.2310 1.4937
No log 0.1299 10 2.1725 0.1955 2.1725 1.4739
No log 0.1558 12 2.5903 0.1342 2.5903 1.6094
No log 0.1818 14 2.4156 0.1449 2.4156 1.5542
No log 0.2078 16 2.0221 0.1475 2.0221 1.4220
No log 0.2338 18 1.8709 0.1579 1.8709 1.3678
No log 0.2597 20 1.6971 0.1905 1.6971 1.3027
No log 0.2857 22 1.5562 0.1905 1.5562 1.2475
No log 0.3117 24 1.4569 0.2478 1.4569 1.2070
No log 0.3377 26 1.3605 0.3390 1.3605 1.1664
No log 0.3636 28 1.1813 0.4202 1.1813 1.0869
No log 0.3896 30 1.0448 0.5000 1.0448 1.0221
No log 0.4156 32 0.9924 0.6202 0.9924 0.9962
No log 0.4416 34 1.0340 0.6131 1.0340 1.0169
No log 0.4675 36 0.8972 0.6047 0.8972 0.9472
No log 0.4935 38 0.9764 0.6202 0.9764 0.9882
No log 0.5195 40 0.8869 0.6418 0.8869 0.9417
No log 0.5455 42 0.8595 0.6950 0.8595 0.9271
No log 0.5714 44 0.8409 0.7114 0.8409 0.9170
No log 0.5974 46 0.9415 0.6074 0.9415 0.9703
No log 0.6234 48 1.0074 0.5778 1.0074 1.0037
No log 0.6494 50 0.9940 0.6099 0.9940 0.9970
No log 0.6753 52 0.9002 0.6531 0.9002 0.9488
No log 0.7013 54 0.9178 0.6483 0.9178 0.9580
No log 0.7273 56 0.9665 0.5972 0.9665 0.9831
No log 0.7532 58 0.9459 0.6575 0.9459 0.9726
No log 0.7792 60 0.9467 0.6575 0.9467 0.9730
No log 0.8052 62 0.9840 0.6250 0.9840 0.9920
No log 0.8312 64 0.9262 0.6712 0.9262 0.9624
No log 0.8571 66 0.8842 0.6712 0.8842 0.9403
No log 0.8831 68 0.8578 0.6621 0.8578 0.9262
No log 0.9091 70 0.9097 0.6575 0.9097 0.9538
No log 0.9351 72 1.0595 0.6111 1.0595 1.0293
No log 0.9610 74 0.8914 0.6759 0.8914 0.9441
No log 0.9870 76 0.8060 0.7285 0.8060 0.8978
No log 1.0130 78 1.4278 0.5478 1.4278 1.1949
No log 1.0390 80 1.9052 0.5056 1.9052 1.3803
No log 1.0649 82 1.3146 0.5478 1.3146 1.1466
No log 1.0909 84 0.9154 0.5714 0.9154 0.9568
No log 1.1169 86 1.2383 0.5000 1.2383 1.1128
No log 1.1429 88 1.1186 0.5224 1.1186 1.0577
No log 1.1688 90 0.8443 0.6471 0.8443 0.9189
No log 1.1948 92 0.9732 0.6222 0.9732 0.9865
No log 1.2208 94 0.8967 0.6667 0.8967 0.9469
No log 1.2468 96 0.8723 0.6423 0.8723 0.9340
No log 1.2727 98 0.9472 0.6345 0.9472 0.9732
No log 1.2987 100 0.9481 0.6294 0.9481 0.9737
No log 1.3247 102 0.9454 0.6143 0.9454 0.9723
No log 1.3506 104 1.0226 0.6345 1.0226 1.0112
No log 1.3766 106 1.1718 0.6111 1.1718 1.0825
No log 1.4026 108 1.1259 0.5915 1.1259 1.0611
No log 1.4286 110 0.9739 0.6370 0.9739 0.9869
No log 1.4545 112 1.1666 0.5816 1.1666 1.0801
No log 1.4805 114 1.5941 0.5031 1.5941 1.2626
No log 1.5065 116 1.4860 0.5283 1.4860 1.2190
No log 1.5325 118 1.0267 0.5693 1.0267 1.0132
No log 1.5584 120 0.8224 0.6765 0.8224 0.9069
No log 1.5844 122 0.8353 0.7133 0.8353 0.9139
No log 1.6104 124 0.7914 0.7083 0.7914 0.8896
No log 1.6364 126 0.7478 0.7500 0.7478 0.8648
No log 1.6623 128 0.7287 0.7712 0.7287 0.8537
No log 1.6883 130 0.8569 0.6795 0.8569 0.9257
No log 1.7143 132 0.7969 0.7468 0.7969 0.8927
No log 1.7403 134 0.7884 0.6980 0.7884 0.8879
No log 1.7662 136 0.8611 0.6620 0.8611 0.9280
No log 1.7922 138 0.8322 0.6667 0.8322 0.9123
No log 1.8182 140 0.9635 0.5693 0.9635 0.9816
No log 1.8442 142 1.0384 0.5441 1.0384 1.0190
No log 1.8701 144 0.9067 0.6061 0.9067 0.9522
No log 1.8961 146 0.8702 0.6466 0.8702 0.9328
No log 1.9221 148 0.8862 0.6260 0.8862 0.9414
No log 1.9481 150 0.9254 0.6212 0.9254 0.9620
No log 1.9740 152 0.9695 0.5625 0.9695 0.9846
No log 2.0000 154 0.9591 0.5846 0.9591 0.9793
No log 2.0260 156 0.9191 0.6308 0.9191 0.9587
No log 2.0519 158 0.8893 0.6569 0.8893 0.9430
No log 2.0779 160 0.8135 0.7059 0.8135 0.9019
No log 2.1039 162 0.7740 0.7194 0.7740 0.8798
No log 2.1299 164 0.7511 0.7133 0.7511 0.8666
No log 2.1558 166 0.8305 0.6759 0.8305 0.9113
No log 2.1818 168 0.8307 0.6575 0.8307 0.9114
No log 2.2078 170 0.8223 0.6849 0.8223 0.9068
No log 2.2338 172 0.7405 0.7619 0.7405 0.8605
No log 2.2597 174 0.8126 0.6849 0.8126 0.9015
No log 2.2857 176 0.7964 0.7075 0.7964 0.8924
No log 2.3117 178 0.7714 0.7211 0.7714 0.8783
No log 2.3377 180 0.7295 0.7297 0.7295 0.8541
No log 2.3636 182 0.6524 0.7600 0.6524 0.8077
No log 2.3896 184 0.5965 0.7671 0.5965 0.7724
No log 2.4156 186 0.6164 0.7703 0.6164 0.7851
No log 2.4416 188 0.8974 0.6887 0.8974 0.9473
No log 2.4675 190 1.1498 0.6203 1.1498 1.0723
No log 2.4935 192 0.9606 0.6892 0.9606 0.9801
No log 2.5195 194 0.6621 0.7483 0.6621 0.8137
No log 2.5455 196 0.6383 0.7586 0.6383 0.7990
No log 2.5714 198 0.6274 0.7586 0.6274 0.7921
No log 2.5974 200 0.6746 0.7483 0.6746 0.8213
No log 2.6234 202 0.7683 0.7183 0.7683 0.8765
No log 2.6494 204 0.8679 0.7007 0.8679 0.9316
No log 2.6753 206 0.9090 0.5556 0.9090 0.9534
No log 2.7013 208 0.9871 0.4921 0.9871 0.9935
No log 2.7273 210 0.9434 0.5197 0.9434 0.9713
No log 2.7532 212 0.8244 0.6074 0.8244 0.9080
No log 2.7792 214 0.7439 0.7133 0.7439 0.8625
No log 2.8052 216 0.6974 0.7619 0.6974 0.8351
No log 2.8312 218 0.7137 0.7172 0.7137 0.8448
No log 2.8571 220 0.6712 0.7534 0.6712 0.8192
No log 2.8831 222 0.6663 0.7183 0.6663 0.8163
No log 2.9091 224 0.6881 0.7338 0.6881 0.8295
No log 2.9351 226 0.6984 0.7206 0.6984 0.8357
No log 2.9610 228 0.7299 0.6957 0.7299 0.8543
No log 2.9870 230 0.7209 0.6906 0.7209 0.8490
No log 3.0130 232 0.6122 0.7586 0.6122 0.7824
No log 3.0390 234 0.5846 0.7724 0.5846 0.7646
No log 3.0649 236 0.6528 0.7297 0.6528 0.8080
No log 3.0909 238 0.6886 0.7297 0.6886 0.8298
No log 3.1169 240 0.7466 0.7273 0.7466 0.8641
No log 3.1429 242 0.8285 0.6418 0.8285 0.9102
No log 3.1688 244 0.8452 0.6418 0.8452 0.9193
No log 3.1948 246 0.7909 0.7183 0.7909 0.8893
No log 3.2208 248 0.7198 0.7651 0.7198 0.8484
No log 3.2468 250 0.6477 0.7742 0.6477 0.8048
No log 3.2727 252 0.5883 0.7712 0.5883 0.7670
No log 3.2987 254 0.7034 0.7105 0.7034 0.8387
No log 3.3247 256 0.9850 0.6259 0.9850 0.9925
No log 3.3506 258 0.9428 0.6259 0.9428 0.9710
No log 3.3766 260 0.7209 0.7143 0.7209 0.8490
No log 3.4026 262 0.6730 0.7871 0.6730 0.8204
No log 3.4286 264 0.7806 0.7595 0.7806 0.8835
No log 3.4545 266 0.7992 0.7179 0.7992 0.8940
No log 3.4805 268 0.7603 0.7763 0.7603 0.8720
No log 3.5065 270 0.8248 0.6619 0.8248 0.9082
No log 3.5325 272 0.8408 0.6809 0.8408 0.9170
No log 3.5584 274 0.7423 0.7143 0.7423 0.8616
No log 3.5844 276 0.6883 0.7724 0.6883 0.8296
No log 3.6104 278 0.6576 0.7919 0.6576 0.8109
No log 3.6364 280 0.6052 0.8079 0.6052 0.7780
No log 3.6623 282 0.5835 0.8025 0.5835 0.7639
No log 3.6883 284 0.5259 0.8302 0.5259 0.7252
No log 3.7143 286 0.5340 0.8258 0.5340 0.7307
No log 3.7403 288 0.5389 0.8129 0.5389 0.7341
No log 3.7662 290 0.5436 0.8129 0.5436 0.7373
No log 3.7922 292 0.5414 0.8354 0.5414 0.7358
No log 3.8182 294 0.5650 0.8052 0.5650 0.7517
No log 3.8442 296 0.5488 0.8333 0.5488 0.7408
No log 3.8701 298 0.7027 0.7453 0.7027 0.8383
No log 3.8961 300 1.1297 0.6628 1.1297 1.0629
No log 3.9221 302 1.1556 0.6433 1.1556 1.0750
No log 3.9481 304 0.8633 0.7170 0.8633 0.9291
No log 3.9740 306 0.6048 0.8182 0.6048 0.7777
No log 4.0000 308 0.6757 0.7586 0.6757 0.8220
No log 4.0260 310 0.7863 0.6571 0.7863 0.8868
No log 4.0519 312 0.7692 0.6906 0.7692 0.8770
No log 4.0779 314 0.7373 0.7552 0.7373 0.8587
No log 4.1039 316 0.7596 0.6950 0.7596 0.8715
No log 4.1299 318 0.7174 0.6950 0.7174 0.8470
No log 4.1558 320 0.6706 0.7619 0.6706 0.8189
No log 4.1818 322 0.7797 0.6575 0.7797 0.8830
No log 4.2078 324 0.9144 0.6621 0.9144 0.9563
No log 4.2338 326 0.8567 0.6621 0.8567 0.9256
No log 4.2597 328 0.7236 0.7172 0.7236 0.8507
No log 4.2857 330 0.6457 0.7763 0.6457 0.8036
No log 4.3117 332 0.6337 0.7843 0.6337 0.7961
No log 4.3377 334 0.6542 0.7467 0.6542 0.8088
No log 4.3636 336 0.7243 0.7034 0.7243 0.8510
No log 4.3896 338 0.6903 0.7260 0.6903 0.8309
No log 4.4156 340 0.6536 0.7919 0.6536 0.8084
No log 4.4416 342 0.6327 0.8228 0.6327 0.7954
No log 4.4675 344 0.5859 0.8375 0.5859 0.7655
No log 4.4935 346 0.5545 0.8375 0.5545 0.7446
No log 4.5195 348 0.5510 0.8125 0.5510 0.7423
No log 4.5455 350 0.5719 0.8125 0.5719 0.7562
No log 4.5714 352 0.6165 0.8077 0.6165 0.7852
No log 4.5974 354 0.7036 0.7632 0.7036 0.8388
No log 4.6234 356 0.7484 0.7042 0.7484 0.8651
No log 4.6494 358 0.7192 0.7376 0.7192 0.8481
No log 4.6753 360 0.7523 0.6809 0.7523 0.8673
No log 4.7013 362 0.7457 0.7083 0.7457 0.8635
No log 4.7273 364 0.7162 0.7273 0.7162 0.8463
No log 4.7532 366 0.7160 0.7273 0.7160 0.8461
No log 4.7792 368 0.7307 0.7310 0.7307 0.8548
No log 4.8052 370 0.7485 0.6939 0.7485 0.8651
No log 4.8312 372 0.7637 0.6939 0.7637 0.8739
No log 4.8571 374 0.8116 0.6621 0.8116 0.9009
No log 4.8831 376 0.7950 0.6944 0.7950 0.8916
No log 4.9091 378 0.7614 0.7273 0.7614 0.8726
No log 4.9351 380 0.7608 0.7133 0.7608 0.8722
No log 4.9610 382 0.7484 0.7133 0.7484 0.8651
No log 4.9870 384 0.7089 0.7133 0.7089 0.8420
No log 5.0130 386 0.6625 0.7260 0.6625 0.8139
No log 5.0390 388 0.6929 0.7260 0.6929 0.8324
No log 5.0649 390 0.7467 0.6974 0.7467 0.8641
No log 5.0909 392 0.6914 0.7383 0.6914 0.8315
No log 5.1169 394 0.7017 0.7383 0.7017 0.8377
No log 5.1429 396 0.6988 0.7162 0.6988 0.8359
No log 5.1688 398 0.6983 0.7568 0.6983 0.8356
No log 5.1948 400 0.7447 0.7133 0.7447 0.8629
No log 5.2208 402 0.7777 0.7092 0.7777 0.8819
No log 5.2468 404 0.7751 0.6715 0.7751 0.8804
No log 5.2727 406 0.7828 0.7143 0.7828 0.8848
No log 5.2987 408 0.8061 0.6571 0.8061 0.8978
No log 5.3247 410 0.7798 0.6713 0.7798 0.8830
No log 5.3506 412 0.7318 0.7172 0.7318 0.8555
No log 5.3766 414 0.7183 0.6812 0.7183 0.8475
No log 5.4026 416 0.7408 0.7338 0.7408 0.8607
No log 5.4286 418 0.7449 0.7338 0.7449 0.8631
No log 5.4545 420 0.7304 0.7552 0.7304 0.8547
No log 5.4805 422 0.7120 0.7552 0.7120 0.8438
No log 5.5065 424 0.6796 0.7755 0.6796 0.8244
No log 5.5325 426 0.6476 0.7895 0.6476 0.8047
No log 5.5584 428 0.6506 0.7568 0.6506 0.8066
No log 5.5844 430 0.6451 0.7448 0.6451 0.8032
No log 5.6104 432 0.6635 0.7651 0.6635 0.8145
No log 5.6364 434 0.6582 0.7947 0.6582 0.8113
No log 5.6623 436 0.6778 0.7763 0.6778 0.8233
No log 5.6883 438 0.7535 0.7162 0.7535 0.8680
No log 5.7143 440 0.7085 0.7383 0.7085 0.8417
No log 5.7403 442 0.5990 0.8026 0.5990 0.7739
No log 5.7662 444 0.5977 0.7919 0.5977 0.7731
No log 5.7922 446 0.6564 0.7361 0.6564 0.8102
No log 5.8182 448 0.8093 0.6993 0.8093 0.8996
No log 5.8442 450 0.8551 0.6939 0.8551 0.9247
No log 5.8701 452 0.7587 0.7162 0.7587 0.8710
No log 5.8961 454 0.6280 0.7974 0.6280 0.7925
No log 5.9221 456 0.6253 0.8129 0.6253 0.7907
No log 5.9481 458 0.6112 0.8105 0.6112 0.7818
No log 5.9740 460 0.6265 0.7843 0.6265 0.7915
No log 6.0000 462 0.6420 0.8026 0.6420 0.8013
No log 6.0260 464 0.7135 0.7613 0.7135 0.8447
No log 6.0519 466 0.7712 0.7105 0.7712 0.8782
No log 6.0779 468 0.7249 0.7320 0.7249 0.8514
No log 6.1039 470 0.6299 0.8077 0.6299 0.7937
No log 6.1299 472 0.7034 0.7013 0.7034 0.8387
No log 6.1558 474 0.7415 0.6887 0.7415 0.8611
No log 6.1818 476 0.6721 0.7226 0.6721 0.8198
No log 6.2078 478 0.6244 0.8125 0.6244 0.7902
No log 6.2338 480 0.6374 0.8025 0.6374 0.7984
No log 6.2597 482 0.6460 0.7925 0.6460 0.8037
No log 6.2857 484 0.7518 0.6753 0.7518 0.8671
No log 6.3117 486 0.8871 0.6531 0.8871 0.9419
No log 6.3377 488 0.8685 0.6667 0.8685 0.9320
No log 6.3636 490 0.7460 0.7237 0.7460 0.8637
No log 6.3896 492 0.7446 0.7826 0.7446 0.8629
No log 6.4156 494 0.8014 0.7394 0.8014 0.8952
No log 6.4416 496 0.8113 0.7394 0.8113 0.9007
No log 6.4675 498 0.7631 0.7952 0.7631 0.8736
0.4032 6.4935 500 0.7327 0.7673 0.7327 0.8560
0.4032 6.5195 502 0.7398 0.7285 0.7398 0.8601
0.4032 6.5455 504 0.7487 0.7347 0.7487 0.8653
0.4032 6.5714 506 0.7512 0.7183 0.7512 0.8667
0.4032 6.5974 508 0.7397 0.7448 0.7397 0.8600
0.4032 6.6234 510 0.7522 0.7114 0.7522 0.8673

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
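
To approximate this environment, a pinned requirements file along these lines should work (the cu118 build assumes CUDA 11.8; adjust the index for your platform):

  # requirements.txt (versions pinned to match the list above)
  --extra-index-url https://download.pytorch.org/whl/cu118
  transformers==4.44.2
  torch==2.4.0+cu118
  datasets==2.21.0
  tokenizers==0.19.1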

Model details

  • Format: Safetensors
  • Model size: 0.1B params
  • Tensor type: F32

Model tree

  • Base model: aubmindlab/bert-base-arabertv02
  • This model: MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k17_task1_organization