ArabicNewSplits6_FineTuningAraBERT_run2_AugV5_k11_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7066
  • Qwk: 0.5020
  • Mse: 0.7066
  • Rmse: 0.8406
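
For reference, these metrics can be reproduced from model predictions and gold labels with scikit-learn. The snippet below is a minimal sketch, not the actual evaluation code: the arrays are placeholders, and rounding predictions to integers before computing the quadratic weighted kappa is an assumption about the evaluation setup.

```python
# Minimal sketch of how Qwk, Mse, and Rmse are typically computed;
# y_true / y_pred are placeholders, not the actual evaluation data.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([2, 3, 1, 4])          # gold scores (placeholder)
y_pred = np.array([2.2, 2.8, 1.4, 3.9])  # model outputs (placeholder)

mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
# Quadratic weighted kappa expects discrete labels, so predictions are
# rounded first -- an assumption about this model's evaluation setup.
qwk = cohen_kappa_score(y_true, np.rint(y_pred).astype(int), weights="quadratic")
print(f"Mse: {mse:.4f}  Rmse: {rmse:.4f}  Qwk: {qwk:.4f}")
```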

Model description

More information needed

Intended uses & limitations

More information needed
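
Although no usage details are provided, the snippet below is a minimal inference sketch. The single-output regression head (num_labels=1) is an assumption, inferred from the evaluation Loss being identical to the Mse above; the input text is a placeholder.

```python
# Minimal inference sketch; the regression head (num_labels=1) is an
# assumption inferred from Loss == Mse on the evaluation set.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo = "MayBashendy/ArabicNewSplits6_FineTuningAraBERT_run2_AugV5_k11_task2_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)

inputs = tokenizer("نص عربي للتقييم", return_tensors="pt")  # placeholder input
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()  # predicted organization score
print(score)
```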

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
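
These settings map directly onto transformers TrainingArguments; the sketch below shows one way to reproduce them. output_dir is a placeholder, and the evaluation/logging cadence is inferred from the results table (evaluation every 2 steps, training loss first logged at step 500).

```python
# Sketch of TrainingArguments matching the hyperparameters above; output_dir
# and the eval/logging cadence are assumptions based on the results table.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="./results",        # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    eval_strategy="steps",         # validation metrics reported every 2 steps
    eval_steps=2,
    logging_steps=500,             # training loss first appears at step 500
)
```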

Training results

Qwk is the quadratic weighted kappa; Mse and Rmse are the mean squared error and its root. "No log" in the training-loss column means the running training loss had not yet been logged (it is first logged at step 500).

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0364 2 3.8207 -0.0092 3.8207 1.9546
No log 0.0727 4 1.9298 0.0875 1.9298 1.3892
No log 0.1091 6 1.3077 0.0959 1.3077 1.1436
No log 0.1455 8 0.8584 -0.0316 0.8584 0.9265
No log 0.1818 10 0.7669 0.1532 0.7669 0.8757
No log 0.2182 12 0.7217 0.1521 0.7217 0.8495
No log 0.2545 14 0.8524 0.0846 0.8524 0.9232
No log 0.2909 16 0.9304 0.0534 0.9304 0.9646
No log 0.3273 18 0.8206 0.0583 0.8206 0.9059
No log 0.3636 20 0.8070 0.0586 0.8070 0.8983
No log 0.4 22 0.7666 0.0441 0.7666 0.8756
No log 0.4364 24 0.7004 0.3002 0.7004 0.8369
No log 0.4727 26 0.6503 0.2602 0.6503 0.8064
No log 0.5091 28 0.6570 0.2424 0.6570 0.8106
No log 0.5455 30 0.6589 0.3109 0.6589 0.8117
No log 0.5818 32 0.6924 0.2210 0.6924 0.8321
No log 0.6182 34 0.8741 0.2018 0.8741 0.9349
No log 0.6545 36 0.9918 0.0526 0.9918 0.9959
No log 0.6909 38 0.8361 0.1899 0.8361 0.9144
No log 0.7273 40 0.7574 0.2373 0.7574 0.8703
No log 0.7636 42 0.8476 0.1904 0.8476 0.9206
No log 0.8 44 0.8055 0.1981 0.8055 0.8975
No log 0.8364 46 0.6656 0.2947 0.6656 0.8159
No log 0.8727 48 0.6461 0.3750 0.6461 0.8038
No log 0.9091 50 0.6177 0.4046 0.6177 0.7860
No log 0.9455 52 0.6442 0.3663 0.6442 0.8026
No log 0.9818 54 0.6883 0.2775 0.6883 0.8296
No log 1.0182 56 0.7642 0.2215 0.7642 0.8742
No log 1.0545 58 0.7281 0.2254 0.7281 0.8533
No log 1.0909 60 0.6428 0.3475 0.6428 0.8018
No log 1.1273 62 0.5821 0.3958 0.5821 0.7630
No log 1.1636 64 0.5556 0.3523 0.5556 0.7454
No log 1.2 66 0.5463 0.4395 0.5463 0.7391
No log 1.2364 68 0.5354 0.4825 0.5354 0.7317
No log 1.2727 70 0.5480 0.4790 0.5480 0.7403
No log 1.3091 72 0.5524 0.4584 0.5524 0.7433
No log 1.3455 74 0.5675 0.4012 0.5675 0.7533
No log 1.3818 76 0.5619 0.4734 0.5619 0.7496
No log 1.4182 78 0.5823 0.4038 0.5823 0.7631
No log 1.4545 80 0.5980 0.3754 0.5980 0.7733
No log 1.4909 82 0.5616 0.4379 0.5616 0.7494
No log 1.5273 84 0.5857 0.4727 0.5857 0.7653
No log 1.5636 86 0.6309 0.4033 0.6309 0.7943
No log 1.6 88 0.5912 0.4695 0.5912 0.7689
No log 1.6364 90 0.5857 0.4658 0.5857 0.7653
No log 1.6727 92 0.6088 0.4312 0.6088 0.7803
No log 1.7091 94 0.6812 0.3738 0.6812 0.8253
No log 1.7455 96 0.8896 0.3931 0.8896 0.9432
No log 1.7818 98 0.8543 0.3854 0.8543 0.9243
No log 1.8182 100 0.6757 0.4433 0.6757 0.8220
No log 1.8545 102 0.6084 0.4058 0.6084 0.7800
No log 1.8909 104 0.6242 0.5110 0.6242 0.7900
No log 1.9273 106 0.6529 0.5064 0.6529 0.8080
No log 1.9636 108 0.6753 0.4967 0.6753 0.8218
No log 2.0 110 0.6427 0.4836 0.6427 0.8017
No log 2.0364 112 0.6260 0.4058 0.6260 0.7912
No log 2.0727 114 0.6440 0.4169 0.6440 0.8025
No log 2.1091 116 0.6547 0.4607 0.6547 0.8092
No log 2.1455 118 0.6466 0.4520 0.6466 0.8041
No log 2.1818 120 0.6354 0.4605 0.6354 0.7971
No log 2.2182 122 0.6979 0.4587 0.6979 0.8354
No log 2.2545 124 0.7740 0.4803 0.7740 0.8798
No log 2.2909 126 0.8124 0.4451 0.8124 0.9013
No log 2.3273 128 0.7170 0.4712 0.7170 0.8468
No log 2.3636 130 0.6490 0.4674 0.6490 0.8056
No log 2.4 132 0.6323 0.4234 0.6323 0.7952
No log 2.4364 134 0.6231 0.4234 0.6231 0.7894
No log 2.4727 136 0.6302 0.5126 0.6302 0.7939
No log 2.5091 138 0.7228 0.4203 0.7228 0.8502
No log 2.5455 140 0.7586 0.4099 0.7586 0.8710
No log 2.5818 142 0.6794 0.4308 0.6794 0.8243
No log 2.6182 144 0.5984 0.5364 0.5984 0.7736
No log 2.6545 146 0.6232 0.4682 0.6232 0.7894
No log 2.6909 148 0.6475 0.5190 0.6475 0.8047
No log 2.7273 150 0.5905 0.5174 0.5905 0.7684
No log 2.7636 152 0.5687 0.5385 0.5687 0.7541
No log 2.8 154 0.6035 0.5263 0.6035 0.7768
No log 2.8364 156 0.6957 0.5022 0.6957 0.8341
No log 2.8727 158 0.7787 0.4922 0.7787 0.8824
No log 2.9091 160 0.7987 0.4838 0.7987 0.8937
No log 2.9455 162 0.7655 0.5390 0.7655 0.8749
No log 2.9818 164 0.8613 0.4762 0.8613 0.9281
No log 3.0182 166 0.9183 0.4390 0.9183 0.9583
No log 3.0545 168 0.8511 0.4912 0.8511 0.9225
No log 3.0909 170 0.7511 0.5271 0.7511 0.8666
No log 3.1273 172 0.7423 0.4883 0.7423 0.8616
No log 3.1636 174 0.7622 0.4995 0.7622 0.8731
No log 3.2 176 0.7961 0.4740 0.7961 0.8922
No log 3.2364 178 0.8095 0.4613 0.8095 0.8997
No log 3.2727 180 0.7973 0.4509 0.7973 0.8929
No log 3.3091 182 0.7496 0.4700 0.7496 0.8658
No log 3.3455 184 0.7126 0.4466 0.7126 0.8442
No log 3.3818 186 0.6897 0.4594 0.6897 0.8305
No log 3.4182 188 0.6702 0.4705 0.6702 0.8186
No log 3.4545 190 0.6706 0.4989 0.6706 0.8189
No log 3.4909 192 0.6732 0.4924 0.6732 0.8205
No log 3.5273 194 0.6978 0.5096 0.6978 0.8353
No log 3.5636 196 0.7238 0.5096 0.7238 0.8508
No log 3.6 198 0.7819 0.4862 0.7819 0.8843
No log 3.6364 200 0.7625 0.4922 0.7625 0.8732
No log 3.6727 202 0.7818 0.5104 0.7818 0.8842
No log 3.7091 204 0.7569 0.5163 0.7569 0.8700
No log 3.7455 206 0.7327 0.5035 0.7327 0.8560
No log 3.7818 208 0.7434 0.5313 0.7434 0.8622
No log 3.8182 210 0.7440 0.5077 0.7440 0.8626
No log 3.8545 212 0.7843 0.4938 0.7843 0.8856
No log 3.8909 214 0.8050 0.5348 0.8050 0.8972
No log 3.9273 216 0.8018 0.5056 0.8018 0.8954
No log 3.9636 218 0.7344 0.4930 0.7344 0.8570
No log 4.0 220 0.7249 0.5335 0.7249 0.8514
No log 4.0364 222 0.7298 0.5349 0.7298 0.8543
No log 4.0727 224 0.7557 0.4914 0.7557 0.8693
No log 4.1091 226 0.8261 0.4886 0.8261 0.9089
No log 4.1455 228 0.8427 0.4666 0.8427 0.9180
No log 4.1818 230 0.7785 0.4847 0.7785 0.8823
No log 4.2182 232 0.7196 0.5239 0.7196 0.8483
No log 4.2545 234 0.7023 0.5068 0.7023 0.8380
No log 4.2909 236 0.7424 0.4714 0.7424 0.8616
No log 4.3273 238 0.7541 0.4677 0.7541 0.8684
No log 4.3636 240 0.7687 0.5049 0.7687 0.8768
No log 4.4 242 0.7514 0.4840 0.7514 0.8668
No log 4.4364 244 0.7558 0.4886 0.7558 0.8693
No log 4.4727 246 0.7221 0.4871 0.7221 0.8498
No log 4.5091 248 0.6961 0.4745 0.6961 0.8343
No log 4.5455 250 0.6831 0.4691 0.6831 0.8265
No log 4.5818 252 0.6929 0.5259 0.6929 0.8324
No log 4.6182 254 0.7682 0.5015 0.7682 0.8765
No log 4.6545 256 0.8388 0.4767 0.8388 0.9159
No log 4.6909 258 0.8158 0.4761 0.8158 0.9032
No log 4.7273 260 0.7093 0.4898 0.7093 0.8422
No log 4.7636 262 0.6223 0.5317 0.6223 0.7889
No log 4.8 264 0.6101 0.5208 0.6101 0.7811
No log 4.8364 266 0.6035 0.5221 0.6035 0.7768
No log 4.8727 268 0.5998 0.5173 0.5998 0.7745
No log 4.9091 270 0.6727 0.4746 0.6727 0.8202
No log 4.9455 272 0.7681 0.4383 0.7681 0.8764
No log 4.9818 274 0.7571 0.4387 0.7571 0.8701
No log 5.0182 276 0.6727 0.4593 0.6727 0.8202
No log 5.0545 278 0.6142 0.5580 0.6142 0.7837
No log 5.0909 280 0.6167 0.5335 0.6167 0.7853
No log 5.1273 282 0.6362 0.5321 0.6362 0.7976
No log 5.1636 284 0.6667 0.5068 0.6667 0.8165
No log 5.2 286 0.7075 0.4935 0.7075 0.8412
No log 5.2364 288 0.7447 0.4951 0.7447 0.8630
No log 5.2727 290 0.7464 0.5026 0.7464 0.8639
No log 5.3091 292 0.7157 0.4980 0.7157 0.8460
No log 5.3455 294 0.7093 0.4974 0.7093 0.8422
No log 5.3818 296 0.7218 0.5080 0.7218 0.8496
No log 5.4182 298 0.7434 0.5060 0.7434 0.8622
No log 5.4545 300 0.7825 0.4868 0.7825 0.8846
No log 5.4909 302 0.7892 0.4742 0.7892 0.8884
No log 5.5273 304 0.7801 0.4913 0.7801 0.8832
No log 5.5636 306 0.7578 0.5124 0.7578 0.8705
No log 5.6 308 0.7342 0.4847 0.7342 0.8569
No log 5.6364 310 0.7246 0.5082 0.7246 0.8512
No log 5.6727 312 0.7247 0.5060 0.7247 0.8513
No log 5.7091 314 0.7459 0.4695 0.7459 0.8636
No log 5.7455 316 0.7261 0.4817 0.7261 0.8521
No log 5.7818 318 0.6626 0.4747 0.6626 0.8140
No log 5.8182 320 0.6028 0.5934 0.6028 0.7764
No log 5.8545 322 0.5977 0.5695 0.5977 0.7731
No log 5.8909 324 0.6116 0.5582 0.6116 0.7821
No log 5.9273 326 0.6194 0.5613 0.6194 0.7870
No log 5.9636 328 0.6357 0.5340 0.6357 0.7973
No log 6.0 330 0.6996 0.4760 0.6996 0.8364
No log 6.0364 332 0.7684 0.4602 0.7684 0.8766
No log 6.0727 334 0.7929 0.4594 0.7929 0.8905
No log 6.1091 336 0.7535 0.5017 0.7535 0.8680
No log 6.1455 338 0.7037 0.5290 0.7037 0.8389
No log 6.1818 340 0.6913 0.5335 0.6913 0.8315
No log 6.2182 342 0.6893 0.5378 0.6893 0.8302
No log 6.2545 344 0.6806 0.5261 0.6806 0.8250
No log 6.2909 346 0.6688 0.5261 0.6688 0.8178
No log 6.3273 348 0.6797 0.4951 0.6797 0.8245
No log 6.3636 350 0.6874 0.4864 0.6874 0.8291
No log 6.4 352 0.6785 0.5244 0.6785 0.8237
No log 6.4364 354 0.6618 0.5814 0.6618 0.8135
No log 6.4727 356 0.6627 0.5735 0.6627 0.8141
No log 6.5091 358 0.6538 0.5830 0.6538 0.8086
No log 6.5455 360 0.6550 0.5757 0.6550 0.8093
No log 6.5818 362 0.6737 0.5560 0.6737 0.8208
No log 6.6182 364 0.6811 0.5434 0.6811 0.8253
No log 6.6545 366 0.7009 0.4898 0.7009 0.8372
No log 6.6909 368 0.6853 0.5389 0.6853 0.8278
No log 6.7273 370 0.6814 0.5392 0.6814 0.8255
No log 6.7636 372 0.6754 0.5873 0.6754 0.8219
No log 6.8 374 0.6803 0.5580 0.6803 0.8248
No log 6.8364 376 0.6912 0.5642 0.6912 0.8314
No log 6.8727 378 0.7190 0.5124 0.7190 0.8479
No log 6.9091 380 0.7620 0.4903 0.7620 0.8729
No log 6.9455 382 0.8094 0.4487 0.8094 0.8996
No log 6.9818 384 0.7999 0.4487 0.7999 0.8944
No log 7.0182 386 0.7632 0.4903 0.7632 0.8736
No log 7.0545 388 0.7280 0.5012 0.7280 0.8533
No log 7.0909 390 0.7284 0.5012 0.7284 0.8535
No log 7.1273 392 0.7499 0.4986 0.7499 0.8659
No log 7.1636 394 0.7814 0.4816 0.7814 0.8840
No log 7.2 396 0.7991 0.4738 0.7991 0.8940
No log 7.2364 398 0.7807 0.4761 0.7807 0.8836
No log 7.2727 400 0.7660 0.4977 0.7660 0.8752
No log 7.3091 402 0.7295 0.5241 0.7295 0.8541
No log 7.3455 404 0.7127 0.4977 0.7127 0.8442
No log 7.3818 406 0.7072 0.5217 0.7072 0.8410
No log 7.4182 408 0.6954 0.5217 0.6954 0.8339
No log 7.4545 410 0.6833 0.5469 0.6833 0.8266
No log 7.4909 412 0.6835 0.5451 0.6835 0.8267
No log 7.5273 414 0.6981 0.5254 0.6981 0.8355
No log 7.5636 416 0.7312 0.4893 0.7312 0.8551
No log 7.6 418 0.7292 0.4847 0.7292 0.8539
No log 7.6364 420 0.7060 0.4948 0.7060 0.8402
No log 7.6727 422 0.6682 0.5135 0.6682 0.8174
No log 7.7091 424 0.6368 0.5821 0.6368 0.7980
No log 7.7455 426 0.6310 0.5825 0.6310 0.7943
No log 7.7818 428 0.6271 0.5776 0.6271 0.7919
No log 7.8182 430 0.6322 0.5776 0.6322 0.7951
No log 7.8545 432 0.6446 0.5678 0.6446 0.8029
No log 7.8909 434 0.6691 0.5167 0.6691 0.8180
No log 7.9273 436 0.6923 0.4961 0.6923 0.8320
No log 7.9636 438 0.7022 0.4970 0.7022 0.8380
No log 8.0 440 0.7062 0.4725 0.7062 0.8404
No log 8.0364 442 0.6975 0.4979 0.6975 0.8351
No log 8.0727 444 0.7016 0.4912 0.7016 0.8376
No log 8.1091 446 0.7110 0.4901 0.7110 0.8432
No log 8.1455 448 0.7253 0.4809 0.7253 0.8516
No log 8.1818 450 0.7371 0.4919 0.7371 0.8585
No log 8.2182 452 0.7370 0.4919 0.7370 0.8585
No log 8.2545 454 0.7357 0.4868 0.7357 0.8577
No log 8.2909 456 0.7419 0.4861 0.7419 0.8613
No log 8.3273 458 0.7403 0.4868 0.7403 0.8604
No log 8.3636 460 0.7241 0.4944 0.7241 0.8509
No log 8.4 462 0.7099 0.4896 0.7099 0.8425
No log 8.4364 464 0.7044 0.4904 0.7044 0.8393
No log 8.4727 466 0.6973 0.4912 0.6973 0.8351
No log 8.5091 468 0.7016 0.4912 0.7016 0.8376
No log 8.5455 470 0.7170 0.4798 0.7170 0.8468
No log 8.5818 472 0.7419 0.5009 0.7419 0.8613
No log 8.6182 474 0.7579 0.4982 0.7579 0.8706
No log 8.6545 476 0.7546 0.4913 0.7546 0.8687
No log 8.6909 478 0.7294 0.5000 0.7294 0.8540
No log 8.7273 480 0.6920 0.5046 0.6920 0.8319
No log 8.7636 482 0.6605 0.4925 0.6605 0.8127
No log 8.8 484 0.6460 0.5254 0.6460 0.8037
No log 8.8364 486 0.6466 0.5267 0.6466 0.8041
No log 8.8727 488 0.6555 0.5138 0.6555 0.8097
No log 8.9091 490 0.6730 0.4883 0.6730 0.8204
No log 8.9455 492 0.6918 0.4935 0.6918 0.8318
No log 8.9818 494 0.7012 0.5046 0.7012 0.8374
No log 9.0182 496 0.7027 0.5046 0.7027 0.8383
No log 9.0545 498 0.7105 0.5046 0.7105 0.8429
0.4264 9.0909 500 0.7134 0.5046 0.7134 0.8446
0.4264 9.1273 502 0.7074 0.4994 0.7074 0.8411
0.4264 9.1636 504 0.7037 0.4994 0.7037 0.8389
0.4264 9.2 506 0.6977 0.4944 0.6977 0.8353
0.4264 9.2364 508 0.6933 0.5047 0.6933 0.8326
0.4264 9.2727 510 0.6916 0.5047 0.6916 0.8316
0.4264 9.3091 512 0.6942 0.5055 0.6942 0.8332
0.4264 9.3455 514 0.6977 0.5055 0.6977 0.8353
0.4264 9.3818 516 0.6979 0.5055 0.6979 0.8354
0.4264 9.4182 518 0.6992 0.5038 0.6992 0.8362
0.4264 9.4545 520 0.7058 0.5096 0.7058 0.8401
0.4264 9.4909 522 0.7149 0.5020 0.7149 0.8455
0.4264 9.5273 524 0.7233 0.5071 0.7233 0.8505
0.4264 9.5636 526 0.7281 0.5046 0.7281 0.8533
0.4264 9.6 528 0.7313 0.4950 0.7313 0.8552
0.4264 9.6364 530 0.7332 0.4950 0.7332 0.8562
0.4264 9.6727 532 0.7325 0.4950 0.7325 0.8558
0.4264 9.7091 534 0.7283 0.5036 0.7283 0.8534
0.4264 9.7455 536 0.7224 0.5124 0.7224 0.8499
0.4264 9.7818 538 0.7174 0.5203 0.7174 0.8470
0.4264 9.8182 540 0.7139 0.5149 0.7139 0.8449
0.4264 9.8545 542 0.7116 0.5149 0.7116 0.8436
0.4264 9.8909 544 0.7095 0.5149 0.7095 0.8423
0.4264 9.9273 546 0.7076 0.5020 0.7076 0.8412
0.4264 9.9636 548 0.7070 0.5020 0.7070 0.8408
0.4264 10.0 550 0.7066 0.5020 0.7066 0.8406

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model size: 0.1B params (Safetensors, F32 tensors)

Model tree for MayBashendy/ArabicNewSplits6_FineTuningAraBERT_run2_AugV5_k11_task2_organization

This model is fine-tuned from aubmindlab/bert-base-arabertv02.