ArabicNewSplits6_FineTuningAraBERTFreeze_run1_AugV5_k8_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7392
  • Qwk: 0.5546
  • Mse: 0.7392
  • Rmse: 0.8598
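
Here, Qwk denotes quadratic weighted kappa (Cohen's kappa with quadratic weights), and Rmse is the square root of Mse (sqrt(0.7392) ≈ 0.8598). The reported Loss equals Mse, suggesting an MSE regression objective. A minimal sketch of how these metrics could be recomputed from model outputs, assuming scikit-learn is available and that continuous predictions are rounded to integer score labels for kappa (the exact post-processing used in this run is not documented):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(predictions, labels):
    """Recompute Qwk/Mse/Rmse as reported above (rounding step is an assumption)."""
    predictions = np.asarray(predictions, dtype=float)
    labels = np.asarray(labels, dtype=float)
    mse = mean_squared_error(labels, predictions)
    rmse = float(np.sqrt(mse))  # e.g. sqrt(0.7392) ≈ 0.8598, matching the card
    # Kappa requires discrete labels, so continuous scores are rounded here.
    qwk = cohen_kappa_score(labels.round().astype(int),
                            predictions.round().astype(int),
                            weights="quadratic")
    return {"qwk": qwk, "mse": mse, "rmse": rmse}
```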

Model description

More information needed. The repository metadata reports a safetensors checkpoint of roughly 0.1B parameters stored in F32 precision.

Intended uses & limitations

More information needed
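
Since no usage details were provided, here is a minimal inference sketch. It assumes the checkpoint loads with a sequence-classification head under the repository id MayBashendy/ArabicNewSplits6_FineTuningAraBERTFreeze_run1_AugV5_k8_task2_organization; how the output should be interpreted (regression score vs. class logits) is not documented:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits6_FineTuningAraBERTFreeze_run1_AugV5_k8_task2_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# Hypothetical input: an Arabic text to be scored on the "organization" dimension.
text = "..."
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    output = model(**inputs).logits
print(output)
```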

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
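
As a rough reconstruction, these settings correspond to the transformers TrainingArguments sketched below; the Adam betas and epsilon listed above are the library defaults. The parameter freeze implied by the run name ("...AraBERTFreeze...") is shown as freezing the whole encoder, which is an assumption:

```python
from transformers import AutoModelForSequenceClassification, TrainingArguments

# num_labels for the scoring head is not documented; the default is used here.
model = AutoModelForSequenceClassification.from_pretrained("aubmindlab/bert-base-arabertv02")

# Freeze the AraBERT encoder (assumed scope of the "Freeze" in the run name),
# leaving only the classification head trainable.
for param in model.base_model.parameters():
    param.requires_grad = False

training_args = TrainingArguments(
    output_dir="out",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",  # linear decay with the default warmup of 0
    num_train_epochs=100,
)
```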

Training results

In the table below, "No log" in the Training Loss column means no training loss had been logged yet (logging occurs every 500 steps).

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0952 2 6.3823 -0.0278 6.3823 2.5263
No log 0.1905 4 4.2905 -0.0269 4.2905 2.0713
No log 0.2857 6 2.9754 0.0179 2.9754 1.7249
No log 0.3810 8 2.1280 0.0168 2.1280 1.4588
No log 0.4762 10 1.5407 0.0602 1.5407 1.2412
No log 0.5714 12 1.1863 0.0417 1.1863 1.0892
No log 0.6667 14 1.1069 -0.0132 1.1069 1.0521
No log 0.7619 16 1.2557 0.0292 1.2557 1.1206
No log 0.8571 18 1.3092 0.0351 1.3092 1.1442
No log 0.9524 20 1.0609 0.0438 1.0609 1.0300
No log 1.0476 22 0.9162 0.0280 0.9162 0.9572
No log 1.1429 24 0.7561 0.2166 0.7561 0.8696
No log 1.2381 26 0.6744 0.3098 0.6744 0.8212
No log 1.3333 28 0.6397 0.2806 0.6397 0.7998
No log 1.4286 30 0.6387 0.2886 0.6387 0.7992
No log 1.5238 32 0.6678 0.2171 0.6678 0.8172
No log 1.6190 34 0.7234 0.2404 0.7234 0.8505
No log 1.7143 36 0.8149 0.1914 0.8149 0.9027
No log 1.8095 38 0.8723 0.2691 0.8723 0.9340
No log 1.9048 40 0.8933 0.2953 0.8933 0.9451
No log 2.0 42 0.8633 0.3157 0.8633 0.9291
No log 2.0952 44 0.7725 0.2839 0.7725 0.8789
No log 2.1905 46 0.6663 0.3575 0.6663 0.8163
No log 2.2857 48 0.6106 0.4371 0.6106 0.7814
No log 2.3810 50 0.5919 0.4068 0.5919 0.7694
No log 2.4762 52 0.5872 0.4151 0.5872 0.7663
No log 2.5714 54 0.5996 0.4313 0.5996 0.7743
No log 2.6667 56 0.6340 0.4433 0.6340 0.7963
No log 2.7619 58 0.6586 0.4649 0.6586 0.8115
No log 2.8571 60 0.7269 0.3795 0.7269 0.8526
No log 2.9524 62 0.8532 0.3544 0.8532 0.9237
No log 3.0476 64 0.8770 0.3316 0.8770 0.9365
No log 3.1429 66 0.8051 0.4098 0.8051 0.8973
No log 3.2381 68 0.7222 0.4551 0.7222 0.8498
No log 3.3333 70 0.7405 0.4688 0.7405 0.8605
No log 3.4286 72 0.7901 0.4608 0.7901 0.8889
No log 3.5238 74 0.7377 0.4293 0.7377 0.8589
No log 3.6190 76 0.6380 0.4936 0.6380 0.7987
No log 3.7143 78 0.5633 0.5128 0.5633 0.7505
No log 3.8095 80 0.5539 0.5403 0.5539 0.7443
No log 3.9048 82 0.5675 0.5085 0.5675 0.7533
No log 4.0 84 0.5900 0.4945 0.5900 0.7681
No log 4.0952 86 0.6390 0.5103 0.6390 0.7994
No log 4.1905 88 0.6814 0.4990 0.6814 0.8255
No log 4.2857 90 0.7045 0.4653 0.7045 0.8393
No log 4.3810 92 0.6778 0.5056 0.6778 0.8233
No log 4.4762 94 0.5901 0.5239 0.5901 0.7682
No log 4.5714 96 0.5171 0.4912 0.5171 0.7191
No log 4.6667 98 0.5045 0.4813 0.5045 0.7103
No log 4.7619 100 0.5059 0.4813 0.5059 0.7113
No log 4.8571 102 0.5033 0.4744 0.5033 0.7094
No log 4.9524 104 0.5063 0.4757 0.5063 0.7115
No log 5.0476 106 0.5249 0.5254 0.5249 0.7245
No log 5.1429 108 0.5866 0.5646 0.5866 0.7659
No log 5.2381 110 0.6912 0.4836 0.6912 0.8314
No log 5.3333 112 0.7357 0.4848 0.7357 0.8577
No log 5.4286 114 0.6889 0.5018 0.6889 0.8300
No log 5.5238 116 0.6115 0.5643 0.6115 0.7820
No log 5.6190 118 0.5688 0.5722 0.5688 0.7542
No log 5.7143 120 0.5543 0.5504 0.5543 0.7445
No log 5.8095 122 0.5349 0.5246 0.5349 0.7314
No log 5.9048 124 0.5298 0.4928 0.5298 0.7278
No log 6.0 126 0.5245 0.4918 0.5245 0.7242
No log 6.0952 128 0.5161 0.5367 0.5161 0.7184
No log 6.1905 130 0.5406 0.5700 0.5406 0.7353
No log 6.2857 132 0.5754 0.5985 0.5754 0.7586
No log 6.3810 134 0.6277 0.5310 0.6277 0.7923
No log 6.4762 136 0.6598 0.5590 0.6598 0.8123
No log 6.5714 138 0.6582 0.5371 0.6582 0.8113
No log 6.6667 140 0.6235 0.5885 0.6235 0.7896
No log 6.7619 142 0.5936 0.5527 0.5936 0.7704
No log 6.8571 144 0.6013 0.6136 0.6013 0.7755
No log 6.9524 146 0.5985 0.5682 0.5985 0.7736
No log 7.0476 148 0.5683 0.5939 0.5683 0.7539
No log 7.1429 150 0.5367 0.5178 0.5367 0.7326
No log 7.2381 152 0.5577 0.5400 0.5577 0.7468
No log 7.3333 154 0.5772 0.4983 0.5772 0.7597
No log 7.4286 156 0.5667 0.5403 0.5667 0.7528
No log 7.5238 158 0.5645 0.5552 0.5645 0.7514
No log 7.6190 160 0.6085 0.5820 0.6085 0.7801
No log 7.7143 162 0.6524 0.5557 0.6524 0.8077
No log 7.8095 164 0.6701 0.5474 0.6701 0.8186
No log 7.9048 166 0.6417 0.5543 0.6417 0.8010
No log 8.0 168 0.6227 0.5350 0.6227 0.7891
No log 8.0952 170 0.6039 0.5585 0.6039 0.7771
No log 8.1905 172 0.5959 0.5688 0.5959 0.7720
No log 8.2857 174 0.5973 0.5302 0.5973 0.7728
No log 8.3810 176 0.5970 0.5320 0.5970 0.7727
No log 8.4762 178 0.6103 0.5441 0.6103 0.7812
No log 8.5714 180 0.6350 0.5861 0.6350 0.7969
No log 8.6667 182 0.6503 0.5888 0.6503 0.8064
No log 8.7619 184 0.6679 0.5787 0.6679 0.8172
No log 8.8571 186 0.7047 0.5267 0.7047 0.8395
No log 8.9524 188 0.6992 0.5430 0.6992 0.8362
No log 9.0476 190 0.6849 0.5536 0.6849 0.8276
No log 9.1429 192 0.7055 0.5400 0.7055 0.8399
No log 9.2381 194 0.7023 0.5246 0.7023 0.8380
No log 9.3333 196 0.6690 0.5519 0.6690 0.8179
No log 9.4286 198 0.6583 0.5405 0.6583 0.8114
No log 9.5238 200 0.6982 0.5234 0.6982 0.8356
No log 9.6190 202 0.7263 0.5248 0.7263 0.8522
No log 9.7143 204 0.7260 0.5248 0.7260 0.8520
No log 9.8095 206 0.7074 0.5413 0.7074 0.8411
No log 9.9048 208 0.6811 0.5272 0.6811 0.8253
No log 10.0 210 0.6760 0.5354 0.6760 0.8222
No log 10.0952 212 0.6828 0.5363 0.6828 0.8263
No log 10.1905 214 0.7025 0.5135 0.7025 0.8382
No log 10.2857 216 0.7021 0.5170 0.7021 0.8379
No log 10.3810 218 0.6957 0.5673 0.6957 0.8341
No log 10.4762 220 0.6908 0.5264 0.6908 0.8312
No log 10.5714 222 0.6835 0.5315 0.6835 0.8268
No log 10.6667 224 0.7029 0.5214 0.7029 0.8384
No log 10.7619 226 0.7081 0.5091 0.7081 0.8415
No log 10.8571 228 0.6950 0.5192 0.6950 0.8337
No log 10.9524 230 0.6985 0.5187 0.6985 0.8357
No log 11.0476 232 0.6983 0.5287 0.6983 0.8356
No log 11.1429 234 0.7049 0.5302 0.7049 0.8396
No log 11.2381 236 0.7168 0.5233 0.7168 0.8466
No log 11.3333 238 0.6974 0.5186 0.6974 0.8351
No log 11.4286 240 0.6860 0.5579 0.6860 0.8283
No log 11.5238 242 0.7026 0.5405 0.7026 0.8382
No log 11.6190 244 0.7156 0.5356 0.7156 0.8459
No log 11.7143 246 0.7102 0.5534 0.7102 0.8428
No log 11.8095 248 0.7114 0.5861 0.7114 0.8434
No log 11.9048 250 0.7415 0.5419 0.7415 0.8611
No log 12.0 252 0.7476 0.5554 0.7476 0.8647
No log 12.0952 254 0.7332 0.5419 0.7332 0.8563
No log 12.1905 256 0.7368 0.5421 0.7368 0.8584
No log 12.2857 258 0.7496 0.5889 0.7496 0.8658
No log 12.3810 260 0.7682 0.5550 0.7682 0.8765
No log 12.4762 262 0.7850 0.5271 0.7850 0.8860
No log 12.5714 264 0.7740 0.5425 0.7740 0.8798
No log 12.6667 266 0.7405 0.5257 0.7405 0.8605
No log 12.7619 268 0.7147 0.5113 0.7147 0.8454
No log 12.8571 270 0.7161 0.5087 0.7161 0.8462
No log 12.9524 272 0.7252 0.4773 0.7252 0.8516
No log 13.0476 274 0.7432 0.5006 0.7432 0.8621
No log 13.1429 276 0.7506 0.5589 0.7506 0.8664
No log 13.2381 278 0.7540 0.5591 0.7540 0.8683
No log 13.3333 280 0.7802 0.5643 0.7802 0.8833
No log 13.4286 282 0.8127 0.5263 0.8127 0.9015
No log 13.5238 284 0.8167 0.5654 0.8167 0.9037
No log 13.6190 286 0.8284 0.5865 0.8284 0.9102
No log 13.7143 288 0.8305 0.5714 0.8305 0.9113
No log 13.8095 290 0.8217 0.5265 0.8217 0.9065
No log 13.9048 292 0.8017 0.5189 0.8017 0.8954
No log 14.0 294 0.7896 0.5180 0.7896 0.8886
No log 14.0952 296 0.8009 0.5306 0.8009 0.8949
No log 14.1905 298 0.8387 0.5228 0.8387 0.9158
No log 14.2857 300 0.8306 0.5395 0.8306 0.9114
No log 14.3810 302 0.7899 0.5654 0.7899 0.8887
No log 14.4762 304 0.7734 0.5105 0.7734 0.8794
No log 14.5714 306 0.8229 0.4767 0.8229 0.9072
No log 14.6667 308 0.8724 0.4785 0.8724 0.9340
No log 14.7619 310 0.8609 0.4812 0.8609 0.9278
No log 14.8571 312 0.8121 0.5175 0.8121 0.9012
No log 14.9524 314 0.7948 0.5527 0.7948 0.8915
No log 15.0476 316 0.7959 0.5400 0.7959 0.8922
No log 15.1429 318 0.7769 0.5411 0.7769 0.8814
No log 15.2381 320 0.7637 0.5040 0.7637 0.8739
No log 15.3333 322 0.7606 0.5292 0.7606 0.8721
No log 15.4286 324 0.7691 0.5105 0.7691 0.8770
No log 15.5238 326 0.7944 0.5540 0.7944 0.8913
No log 15.6190 328 0.8170 0.5668 0.8170 0.9039
No log 15.7143 330 0.8292 0.5548 0.8292 0.9106
No log 15.8095 332 0.8312 0.5623 0.8312 0.9117
No log 15.9048 334 0.8097 0.5407 0.8097 0.8999
No log 16.0 336 0.8006 0.5040 0.8005 0.8947
No log 16.0952 338 0.8129 0.5105 0.8129 0.9016
No log 16.1905 340 0.8228 0.5331 0.8228 0.9071
No log 16.2857 342 0.8206 0.5515 0.8206 0.9059
No log 16.3810 344 0.8130 0.5477 0.8130 0.9017
No log 16.4762 346 0.8020 0.5477 0.8020 0.8955
No log 16.5714 348 0.7853 0.5489 0.7853 0.8862
No log 16.6667 350 0.7577 0.5347 0.7577 0.8704
No log 16.7619 352 0.7347 0.5289 0.7347 0.8572
No log 16.8571 354 0.7257 0.5049 0.7257 0.8519
No log 16.9524 356 0.7229 0.5008 0.7229 0.8503
No log 17.0476 358 0.7199 0.5166 0.7199 0.8485
No log 17.1429 360 0.7249 0.5390 0.7249 0.8514
No log 17.2381 362 0.7295 0.5458 0.7295 0.8541
No log 17.3333 364 0.7426 0.5494 0.7426 0.8618
No log 17.4286 366 0.7751 0.5338 0.7751 0.8804
No log 17.5238 368 0.8010 0.5435 0.8010 0.8950
No log 17.6190 370 0.7867 0.5344 0.7867 0.8869
No log 17.7143 372 0.7710 0.5188 0.7710 0.8780
No log 17.8095 374 0.7734 0.5508 0.7734 0.8794
No log 17.9048 376 0.7780 0.5719 0.7780 0.8820
No log 18.0 378 0.7727 0.5499 0.7727 0.8790
No log 18.0952 380 0.7769 0.5691 0.7769 0.8814
No log 18.1905 382 0.7686 0.5575 0.7686 0.8767
No log 18.2857 384 0.7585 0.5590 0.7585 0.8709
No log 18.3810 386 0.7553 0.5797 0.7553 0.8691
No log 18.4762 388 0.7592 0.5803 0.7592 0.8713
No log 18.5714 390 0.7587 0.5672 0.7587 0.8711
No log 18.6667 392 0.7592 0.5519 0.7592 0.8713
No log 18.7619 394 0.7505 0.5776 0.7505 0.8663
No log 18.8571 396 0.7432 0.5590 0.7432 0.8621
No log 18.9524 398 0.7522 0.5667 0.7522 0.8673
No log 19.0476 400 0.7618 0.5703 0.7618 0.8728
No log 19.1429 402 0.7708 0.5575 0.7708 0.8780
No log 19.2381 404 0.7873 0.5812 0.7873 0.8873
No log 19.3333 406 0.8064 0.5784 0.8064 0.8980
No log 19.4286 408 0.7943 0.5787 0.7943 0.8912
No log 19.5238 410 0.7795 0.5849 0.7795 0.8829
No log 19.6190 412 0.7626 0.5577 0.7626 0.8733
No log 19.7143 414 0.7659 0.5666 0.7659 0.8751
No log 19.8095 416 0.7762 0.5598 0.7762 0.8810
No log 19.9048 418 0.7655 0.5696 0.7655 0.8749
No log 20.0 420 0.7479 0.5515 0.7479 0.8648
No log 20.0952 422 0.7391 0.5324 0.7391 0.8597
No log 20.1905 424 0.7266 0.5658 0.7266 0.8524
No log 20.2857 426 0.7138 0.5838 0.7138 0.8449
No log 20.3810 428 0.7202 0.6037 0.7202 0.8487
No log 20.4762 430 0.7237 0.5916 0.7237 0.8507
No log 20.5714 432 0.7025 0.6072 0.7025 0.8381
No log 20.6667 434 0.6979 0.5882 0.6979 0.8354
No log 20.7619 436 0.7044 0.5541 0.7044 0.8393
No log 20.8571 438 0.7343 0.5595 0.7343 0.8569
No log 20.9524 440 0.7675 0.5148 0.7675 0.8761
No log 21.0476 442 0.7886 0.5063 0.7886 0.8880
No log 21.1429 444 0.7982 0.5602 0.7982 0.8934
No log 21.2381 446 0.8085 0.5701 0.8085 0.8992
No log 21.3333 448 0.8160 0.5650 0.8160 0.9033
No log 21.4286 450 0.8132 0.5324 0.8132 0.9018
No log 21.5238 452 0.7877 0.5375 0.7877 0.8875
No log 21.6190 454 0.7508 0.5254 0.7508 0.8665
No log 21.7143 456 0.7270 0.5133 0.7270 0.8526
No log 21.8095 458 0.7169 0.5274 0.7169 0.8467
No log 21.9048 460 0.7186 0.5251 0.7186 0.8477
No log 22.0 462 0.7353 0.5327 0.7353 0.8575
No log 22.0952 464 0.7596 0.5615 0.7596 0.8715
No log 22.1905 466 0.7517 0.5926 0.7517 0.8670
No log 22.2857 468 0.7420 0.5868 0.7420 0.8614
No log 22.3810 470 0.7405 0.5710 0.7405 0.8605
No log 22.4762 472 0.7454 0.5398 0.7454 0.8634
No log 22.5714 474 0.7527 0.5020 0.7527 0.8676
No log 22.6667 476 0.7459 0.5272 0.7459 0.8637
No log 22.7619 478 0.7478 0.5037 0.7478 0.8647
No log 22.8571 480 0.7586 0.4922 0.7586 0.8710
No log 22.9524 482 0.7715 0.4770 0.7715 0.8784
No log 23.0476 484 0.7607 0.5127 0.7607 0.8722
No log 23.1429 486 0.7412 0.5152 0.7412 0.8609
No log 23.2381 488 0.7342 0.5471 0.7342 0.8569
No log 23.3333 490 0.7449 0.5325 0.7449 0.8631
No log 23.4286 492 0.7636 0.5154 0.7636 0.8738
No log 23.5238 494 0.7518 0.5275 0.7518 0.8670
No log 23.6190 496 0.7302 0.5511 0.7302 0.8545
No log 23.7143 498 0.7355 0.5743 0.7355 0.8576
0.539 23.8095 500 0.7394 0.5561 0.7394 0.8599
0.539 23.9048 502 0.7242 0.5694 0.7242 0.8510
0.539 24.0 504 0.7105 0.5603 0.7105 0.8429
0.539 24.0952 506 0.7102 0.5491 0.7102 0.8428
0.539 24.1905 508 0.7265 0.5291 0.7265 0.8523
0.539 24.2857 510 0.7605 0.5368 0.7605 0.8721
0.539 24.3810 512 0.7777 0.5505 0.7777 0.8819
0.539 24.4762 514 0.7653 0.5575 0.7653 0.8748
0.539 24.5714 516 0.7648 0.5564 0.7648 0.8746
0.539 24.6667 518 0.7684 0.5615 0.7684 0.8766
0.539 24.7619 520 0.7920 0.5623 0.7920 0.8899
0.539 24.8571 522 0.8379 0.5388 0.8379 0.9154
0.539 24.9524 524 0.8389 0.5345 0.8389 0.9159
0.539 25.0476 526 0.8114 0.5479 0.8114 0.9008
0.539 25.1429 528 0.7869 0.5640 0.7869 0.8871
0.539 25.2381 530 0.7722 0.5324 0.7722 0.8787
0.539 25.3333 532 0.7719 0.5548 0.7719 0.8786
0.539 25.4286 534 0.7588 0.5550 0.7588 0.8711
0.539 25.5238 536 0.7553 0.5734 0.7553 0.8691
0.539 25.6190 538 0.7529 0.5800 0.7529 0.8677
0.539 25.7143 540 0.7545 0.5800 0.7545 0.8686
0.539 25.8095 542 0.7542 0.5476 0.7542 0.8685
0.539 25.9048 544 0.7464 0.5604 0.7464 0.8639
0.539 26.0 546 0.7325 0.5437 0.7325 0.8558
0.539 26.0952 548 0.7215 0.5433 0.7215 0.8494
0.539 26.1905 550 0.7169 0.5376 0.7169 0.8467
0.539 26.2857 552 0.7312 0.5546 0.7312 0.8551
0.539 26.3810 554 0.7392 0.5546 0.7392 0.8598

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.1+cu121
  • Datasets 3.2.0
  • Tokenizers 0.19.1
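
These can be installed together, e.g. `pip install transformers==4.44.2 datasets==3.2.0 tokenizers==0.19.1 torch==2.4.1` (the reported `2.4.1+cu121` is the CUDA 12.1 build of PyTorch; the exact wheel depends on your platform).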