ArabicNewSplits7_FineTuningAraBERT_run1_AugV5_k19_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how these metrics can be computed appears after the list):

  • Loss: 0.8857
  • Qwk: 0.4809
  • Mse: 0.8857
  • Rmse: 0.9411
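
Loss and Mse are identical here, which is consistent with training under an MSE objective (a regression-style head); Rmse is simply the square root of Mse. A minimal sketch of how these metrics can be computed, assuming scikit-learn and purely illustrative label arrays:

    import numpy as np
    from sklearn.metrics import cohen_kappa_score, mean_squared_error

    # Hypothetical gold labels and rounded model predictions on an ordinal scale.
    y_true = np.array([3, 1, 2, 4, 0])
    y_pred = np.array([2, 1, 2, 3, 1])

    # Qwk: Cohen's kappa with quadratic weights, the standard choice for ordinal scoring.
    qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
    mse = mean_squared_error(y_true, y_pred)
    rmse = np.sqrt(mse)  # Rmse in this card is always sqrt(Mse)
    print(f"Qwk: {qwk:.4f}  Mse: {mse:.4f}  Rmse: {rmse:.4f}")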

Model description

More information needed

Intended uses & limitations

More information needed
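
While the card leaves usage details blank, the checkpoint loads with the standard Transformers API. A minimal inference sketch, assuming a sequence-classification head (the equal Loss and Mse values above hint at a single-output regression head, but the exact head and label setup are not documented):

    import torch
    from transformers import AutoTokenizer, AutoModelForSequenceClassification

    model_id = "MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run1_AugV5_k19_task5_organization"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForSequenceClassification.from_pretrained(model_id)

    # Hypothetical Arabic input ("an Arabic text to score"); the real task format is undocumented.
    inputs = tokenizer("نص عربي للتقييم", return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    print(logits)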

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
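
These settings map directly onto Transformers TrainingArguments; a hedged reconstruction (the output_dir and anything not listed above are assumptions, not taken from the card):

    from transformers import TrainingArguments

    training_args = TrainingArguments(
        output_dir="arabert_task5_organization",  # assumed; not stated in the card
        learning_rate=2e-05,
        per_device_train_batch_size=8,
        per_device_eval_batch_size=8,
        seed=42,
        lr_scheduler_type="linear",
        num_train_epochs=100,
        # Adam betas and epsilon as stated above (also the library defaults).
        adam_beta1=0.9,
        adam_beta2=0.999,
        adam_epsilon=1e-08,
    )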

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0417 2 4.3322 -0.0048 4.3322 2.0814
No log 0.0833 4 2.5281 -0.0340 2.5281 1.5900
No log 0.125 6 1.4897 0.0185 1.4897 1.2205
No log 0.1667 8 1.1472 0.2343 1.1472 1.0711
No log 0.2083 10 1.4258 0.0343 1.4258 1.1941
No log 0.25 12 1.4676 0.0568 1.4676 1.2114
No log 0.2917 14 1.1440 0.1848 1.1440 1.0696
No log 0.3333 16 1.0720 0.1313 1.0720 1.0354
No log 0.375 18 1.0918 0.1864 1.0918 1.0449
No log 0.4167 20 1.0645 0.0422 1.0645 1.0318
No log 0.4583 22 1.1121 0.1076 1.1121 1.0546
No log 0.5 24 1.1023 0.0824 1.1023 1.0499
No log 0.5417 26 1.1543 0.1203 1.1543 1.0744
No log 0.5833 28 1.0460 0.1263 1.0460 1.0228
No log 0.625 30 0.9569 0.2865 0.9569 0.9782
No log 0.6667 32 0.9375 0.2865 0.9375 0.9682
No log 0.7083 34 0.9329 0.2671 0.9329 0.9659
No log 0.75 36 1.0419 0.1881 1.0419 1.0207
No log 0.7917 38 1.4336 -0.0270 1.4336 1.1973
No log 0.8333 40 1.3413 -0.0112 1.3413 1.1581
No log 0.875 42 0.8825 0.3221 0.8825 0.9394
No log 0.9167 44 0.9190 0.4406 0.9190 0.9587
No log 0.9583 46 0.9979 0.2956 0.9979 0.9989
No log 1.0 48 0.9045 0.4357 0.9045 0.9510
No log 1.0417 50 0.8200 0.4210 0.8200 0.9055
No log 1.0833 52 0.8961 0.3815 0.8961 0.9466
No log 1.125 54 0.9969 0.2956 0.9969 0.9984
No log 1.1667 56 1.1032 0.2038 1.1032 1.0503
No log 1.2083 58 1.1889 0.1426 1.1889 1.0904
No log 1.25 60 0.9552 0.3958 0.9552 0.9773
No log 1.2917 62 0.8357 0.5472 0.8357 0.9142
No log 1.3333 64 0.8712 0.5062 0.8712 0.9334
No log 1.375 66 0.8265 0.5195 0.8265 0.9091
No log 1.4167 68 0.7830 0.5107 0.7830 0.8849
No log 1.4583 70 0.7917 0.5528 0.7917 0.8898
No log 1.5 72 0.8603 0.5279 0.8603 0.9275
No log 1.5417 74 0.8340 0.5183 0.8340 0.9132
No log 1.5833 76 0.7311 0.5329 0.7311 0.8550
No log 1.625 78 0.7282 0.5748 0.7282 0.8533
No log 1.6667 80 0.7276 0.5650 0.7276 0.8530
No log 1.7083 82 0.7115 0.5797 0.7115 0.8435
No log 1.75 84 0.8166 0.4921 0.8166 0.9037
No log 1.7917 86 0.7273 0.5654 0.7273 0.8528
No log 1.8333 88 0.7993 0.5339 0.7993 0.8941
No log 1.875 90 1.4456 0.3099 1.4456 1.2023
No log 1.9167 92 1.4957 0.3138 1.4957 1.2230
No log 1.9583 94 1.1015 0.3539 1.1015 1.0495
No log 2.0 96 0.7592 0.4838 0.7592 0.8713
No log 2.0417 98 0.8233 0.4962 0.8233 0.9073
No log 2.0833 100 1.0542 0.3040 1.0542 1.0268
No log 2.125 102 1.0384 0.3424 1.0384 1.0190
No log 2.1667 104 0.8155 0.5245 0.8155 0.9031
No log 2.2083 106 0.7862 0.5575 0.7862 0.8867
No log 2.25 108 0.8147 0.4810 0.8147 0.9026
No log 2.2917 110 0.8344 0.3800 0.8344 0.9134
No log 2.3333 112 0.8142 0.5342 0.8142 0.9023
No log 2.375 114 0.9228 0.4482 0.9228 0.9606
No log 2.4167 116 0.9353 0.4577 0.9353 0.9671
No log 2.4583 118 0.8401 0.4966 0.8401 0.9166
No log 2.5 120 0.7965 0.5450 0.7965 0.8924
No log 2.5417 122 0.8057 0.4996 0.8057 0.8976
No log 2.5833 124 0.7873 0.5124 0.7873 0.8873
No log 2.625 126 0.8080 0.4964 0.8080 0.8989
No log 2.6667 128 0.8035 0.4969 0.8035 0.8964
No log 2.7083 130 0.7780 0.5451 0.7780 0.8821
No log 2.75 132 0.8098 0.5300 0.8098 0.8999
No log 2.7917 134 0.7938 0.5038 0.7938 0.8909
No log 2.8333 136 0.8025 0.5261 0.8025 0.8958
No log 2.875 138 0.8010 0.4903 0.8010 0.8950
No log 2.9167 140 0.8132 0.5463 0.8132 0.9018
No log 2.9583 142 0.9190 0.4521 0.9190 0.9587
No log 3.0 144 1.0141 0.3959 1.0141 1.0070
No log 3.0417 146 0.9964 0.4150 0.9964 0.9982
No log 3.0833 148 0.8910 0.3993 0.8910 0.9439
No log 3.125 150 0.8635 0.4910 0.8635 0.9292
No log 3.1667 152 0.8781 0.4956 0.8781 0.9371
No log 3.2083 154 0.8687 0.5002 0.8687 0.9320
No log 3.25 156 0.9479 0.4244 0.9479 0.9736
No log 3.2917 158 0.9392 0.4098 0.9392 0.9691
No log 3.3333 160 0.8499 0.5135 0.8499 0.9219
No log 3.375 162 0.8727 0.3908 0.8727 0.9342
No log 3.4167 164 0.8658 0.3908 0.8658 0.9305
No log 3.4583 166 0.8206 0.4676 0.8206 0.9059
No log 3.5 168 0.8539 0.4599 0.8539 0.9241
No log 3.5417 170 0.9221 0.4695 0.9221 0.9603
No log 3.5833 172 0.8504 0.4510 0.8504 0.9222
No log 3.625 174 0.7662 0.5621 0.7662 0.8753
No log 3.6667 176 0.8112 0.5098 0.8112 0.9007
No log 3.7083 178 0.7759 0.5253 0.7759 0.8809
No log 3.75 180 0.7376 0.5752 0.7376 0.8589
No log 3.7917 182 0.7213 0.5822 0.7213 0.8493
No log 3.8333 184 0.7138 0.5572 0.7138 0.8449
No log 3.875 186 0.6895 0.5247 0.6895 0.8303
No log 3.9167 188 0.6717 0.5747 0.6717 0.8195
No log 3.9583 190 0.6500 0.5871 0.6500 0.8062
No log 4.0 192 0.6237 0.6076 0.6237 0.7898
No log 4.0417 194 0.6414 0.6179 0.6414 0.8009
No log 4.0833 196 0.6532 0.6512 0.6532 0.8082
No log 4.125 198 0.6099 0.5742 0.6099 0.7810
No log 4.1667 200 0.6747 0.6082 0.6747 0.8214
No log 4.2083 202 0.7341 0.5857 0.7341 0.8568
No log 4.25 204 0.6794 0.5548 0.6794 0.8243
No log 4.2917 206 0.6475 0.5577 0.6475 0.8047
No log 4.3333 208 0.7071 0.5837 0.7071 0.8409
No log 4.375 210 0.7083 0.5746 0.7083 0.8416
No log 4.4167 212 0.6769 0.5391 0.6769 0.8227
No log 4.4583 214 0.6887 0.5214 0.6887 0.8299
No log 4.5 216 0.8266 0.5271 0.8266 0.9092
No log 4.5417 218 0.8963 0.5070 0.8963 0.9467
No log 4.5833 220 0.7722 0.5345 0.7722 0.8788
No log 4.625 222 0.6847 0.5396 0.6847 0.8275
No log 4.6667 224 0.7353 0.5585 0.7353 0.8575
No log 4.7083 226 0.7818 0.4560 0.7818 0.8842
No log 4.75 228 0.7743 0.3882 0.7743 0.8799
No log 4.7917 230 0.7755 0.4373 0.7755 0.8806
No log 4.8333 232 0.7595 0.4090 0.7595 0.8715
No log 4.875 234 0.7377 0.4124 0.7377 0.8589
No log 4.9167 236 0.7292 0.5025 0.7292 0.8539
No log 4.9583 238 0.7676 0.4973 0.7676 0.8761
No log 5.0 240 0.8075 0.4952 0.8075 0.8986
No log 5.0417 242 0.8578 0.5458 0.8578 0.9262
No log 5.0833 244 0.7872 0.4597 0.7872 0.8872
No log 5.125 246 0.7191 0.5734 0.7191 0.8480
No log 5.1667 248 0.7071 0.5171 0.7071 0.8409
No log 5.2083 250 0.8157 0.4578 0.8157 0.9032
No log 5.25 252 0.9052 0.4280 0.9052 0.9514
No log 5.2917 254 0.8428 0.4489 0.8428 0.9180
No log 5.3333 256 0.7293 0.5329 0.7293 0.8540
No log 5.375 258 0.7361 0.5232 0.7361 0.8580
No log 5.4167 260 0.8102 0.4697 0.8102 0.9001
No log 5.4583 262 0.7904 0.5088 0.7904 0.8890
No log 5.5 264 0.7732 0.5304 0.7732 0.8793
No log 5.5417 266 0.7664 0.5303 0.7664 0.8754
No log 5.5833 268 0.7687 0.4835 0.7687 0.8768
No log 5.625 270 0.8093 0.4513 0.8093 0.8996
No log 5.6667 272 0.7795 0.4069 0.7795 0.8829
No log 5.7083 274 0.8003 0.4714 0.8003 0.8946
No log 5.75 276 0.8880 0.4310 0.8880 0.9423
No log 5.7917 278 0.8889 0.4318 0.8889 0.9428
No log 5.8333 280 0.8459 0.4439 0.8459 0.9197
No log 5.875 282 0.8710 0.4310 0.8710 0.9333
No log 5.9167 284 0.8661 0.4310 0.8661 0.9306
No log 5.9583 286 0.8013 0.4714 0.8013 0.8952
No log 6.0 288 0.7552 0.4748 0.7552 0.8690
No log 6.0417 290 0.7615 0.4748 0.7615 0.8726
No log 6.0833 292 0.7995 0.4461 0.7995 0.8941
No log 6.125 294 0.8797 0.4054 0.8797 0.9379
No log 6.1667 296 0.9766 0.4197 0.9766 0.9882
No log 6.2083 298 0.9645 0.3953 0.9645 0.9821
No log 6.25 300 0.8797 0.4558 0.8797 0.9379
No log 6.2917 302 0.7774 0.5305 0.7774 0.8817
No log 6.3333 304 0.7073 0.4883 0.7073 0.8410
No log 6.375 306 0.6890 0.5023 0.6890 0.8300
No log 6.4167 308 0.6824 0.5129 0.6824 0.8261
No log 6.4583 310 0.7034 0.5676 0.7034 0.8387
No log 6.5 312 0.7455 0.5663 0.7455 0.8634
No log 6.5417 314 0.7167 0.5880 0.7167 0.8466
No log 6.5833 316 0.6949 0.5510 0.6949 0.8336
No log 6.625 318 0.7036 0.5950 0.7036 0.8388
No log 6.6667 320 0.7000 0.5510 0.7000 0.8367
No log 6.7083 322 0.7009 0.5510 0.7009 0.8372
No log 6.75 324 0.7073 0.5342 0.7073 0.8410
No log 6.7917 326 0.7462 0.5599 0.7462 0.8638
No log 6.8333 328 0.8446 0.5735 0.8446 0.9190
No log 6.875 330 0.8554 0.4794 0.8554 0.9249
No log 6.9167 332 0.7694 0.5964 0.7694 0.8772
No log 6.9583 334 0.7019 0.4903 0.7019 0.8378
No log 7.0 336 0.7305 0.4765 0.7305 0.8547
No log 7.0417 338 0.7227 0.4641 0.7227 0.8501
No log 7.0833 340 0.6865 0.5033 0.6865 0.8286
No log 7.125 342 0.7616 0.4958 0.7616 0.8727
No log 7.1667 344 0.8859 0.4216 0.8859 0.9412
No log 7.2083 346 0.9132 0.4216 0.9132 0.9556
No log 7.25 348 0.8983 0.3539 0.8983 0.9478
No log 7.2917 350 0.8823 0.3222 0.8823 0.9393
No log 7.3333 352 0.8383 0.3169 0.8383 0.9156
No log 7.375 354 0.7845 0.4309 0.7845 0.8857
No log 7.4167 356 0.7686 0.4576 0.7686 0.8767
No log 7.4583 358 0.7825 0.5242 0.7825 0.8846
No log 7.5 360 0.7832 0.5666 0.7832 0.8850
No log 7.5417 362 0.8219 0.5636 0.8219 0.9066
No log 7.5833 364 0.7890 0.6071 0.7890 0.8883
No log 7.625 366 0.7070 0.5923 0.7070 0.8408
No log 7.6667 368 0.6793 0.5678 0.6793 0.8242
No log 7.7083 370 0.6742 0.5833 0.6742 0.8211
No log 7.75 372 0.6843 0.5033 0.6843 0.8272
No log 7.7917 374 0.6911 0.5018 0.6911 0.8313
No log 7.8333 376 0.7138 0.5865 0.7138 0.8449
No log 7.875 378 0.7463 0.5601 0.7463 0.8639
No log 7.9167 380 0.7405 0.5601 0.7405 0.8605
No log 7.9583 382 0.7043 0.6082 0.7043 0.8392
No log 8.0 384 0.6661 0.5835 0.6661 0.8161
No log 8.0417 386 0.6714 0.5629 0.6714 0.8194
No log 8.0833 388 0.6763 0.5629 0.6763 0.8224
No log 8.125 390 0.6741 0.5629 0.6741 0.8210
No log 8.1667 392 0.6892 0.5304 0.6892 0.8302
No log 8.2083 394 0.6887 0.5442 0.6887 0.8299
No log 8.25 396 0.6785 0.5171 0.6785 0.8237
No log 8.2917 398 0.6890 0.5549 0.6890 0.8300
No log 8.3333 400 0.7006 0.5003 0.7006 0.8370
No log 8.375 402 0.7508 0.5429 0.7508 0.8665
No log 8.4167 404 0.7498 0.5429 0.7498 0.8659
No log 8.4583 406 0.7334 0.5245 0.7334 0.8564
No log 8.5 408 0.7353 0.5127 0.7353 0.8575
No log 8.5417 410 0.7294 0.4660 0.7294 0.8541
No log 8.5833 412 0.7272 0.4660 0.7272 0.8527
No log 8.625 414 0.7272 0.4660 0.7272 0.8527
No log 8.6667 416 0.7308 0.4882 0.7308 0.8549
No log 8.7083 418 0.7712 0.5728 0.7712 0.8782
No log 8.75 420 0.7667 0.5666 0.7667 0.8756
No log 8.7917 422 0.7521 0.5451 0.7521 0.8672
No log 8.8333 424 0.7278 0.5002 0.7278 0.8531
No log 8.875 426 0.7227 0.5247 0.7227 0.8501
No log 8.9167 428 0.7265 0.5117 0.7265 0.8524
No log 8.9583 430 0.7262 0.5131 0.7262 0.8522
No log 9.0 432 0.7351 0.5033 0.7351 0.8574
No log 9.0417 434 0.7263 0.4450 0.7263 0.8523
No log 9.0833 436 0.7124 0.5050 0.7124 0.8440
No log 9.125 438 0.7116 0.5050 0.7116 0.8436
No log 9.1667 440 0.7240 0.5731 0.7240 0.8509
No log 9.2083 442 0.7957 0.5650 0.7957 0.8920
No log 9.25 444 0.8021 0.5447 0.8021 0.8956
No log 9.2917 446 0.7347 0.5572 0.7347 0.8572
No log 9.3333 448 0.6936 0.5622 0.6936 0.8328
No log 9.375 450 0.6997 0.4118 0.6997 0.8365
No log 9.4167 452 0.6979 0.4493 0.6979 0.8354
No log 9.4583 454 0.6768 0.5288 0.6768 0.8227
No log 9.5 456 0.6791 0.5259 0.6791 0.8241
No log 9.5417 458 0.7070 0.6051 0.7070 0.8408
No log 9.5833 460 0.7505 0.5938 0.7505 0.8663
No log 9.625 462 0.8078 0.5628 0.8078 0.8988
No log 9.6667 464 0.8080 0.5647 0.8080 0.8989
No log 9.7083 466 0.7644 0.5487 0.7644 0.8743
No log 9.75 468 0.7235 0.5823 0.7235 0.8506
No log 9.7917 470 0.7142 0.5495 0.7142 0.8451
No log 9.8333 472 0.7044 0.5627 0.7044 0.8393
No log 9.875 474 0.6942 0.5862 0.6942 0.8332
No log 9.9167 476 0.7021 0.4581 0.7021 0.8379
No log 9.9583 478 0.7070 0.4949 0.7070 0.8408
No log 10.0 480 0.7005 0.5169 0.7005 0.8369
No log 10.0417 482 0.6892 0.4922 0.6892 0.8302
No log 10.0833 484 0.7208 0.5380 0.7208 0.8490
No log 10.125 486 0.7798 0.5170 0.7798 0.8830
No log 10.1667 488 0.7857 0.5385 0.7857 0.8864
No log 10.2083 490 0.7407 0.5482 0.7407 0.8606
No log 10.25 492 0.6870 0.5025 0.6870 0.8288
No log 10.2917 494 0.6793 0.5419 0.6793 0.8242
No log 10.3333 496 0.6878 0.5405 0.6878 0.8293
No log 10.375 498 0.7346 0.5380 0.7346 0.8571
0.3005 10.4167 500 0.8043 0.5349 0.8043 0.8968
0.3005 10.4583 502 0.8452 0.5167 0.8452 0.9194
0.3005 10.5 504 0.8923 0.4681 0.8923 0.9446
0.3005 10.5417 506 0.9333 0.3738 0.9333 0.9661
0.3005 10.5833 508 0.9559 0.4539 0.9559 0.9777
0.3005 10.625 510 0.8857 0.4809 0.8857 0.9411

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
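
To reproduce this environment, pinning the versions above should suffice (e.g. transformers==4.44.2, datasets==2.21.0, tokenizers==0.19.1, plus the CUDA 11.8 build of PyTorch 2.4.0). The quick check below, a convenience sketch rather than anything from the original card, prints what is actually installed:

    import datasets
    import tokenizers
    import torch
    import transformers

    # Compare against the versions pinned in this card.
    for name, module in [("Transformers", transformers), ("Pytorch", torch),
                         ("Datasets", datasets), ("Tokenizers", tokenizers)]:
        print(f"{name}: {module.__version__}")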