ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k15_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5918
  • Qwk (quadratic weighted kappa): 0.4997
  • Mse (mean squared error): 0.5918
  • Rmse (root mean squared error): 0.7693
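Qwk here is the quadratic weighted kappa, a standard agreement metric for ordinal scoring tasks such as essay trait grading. A minimal pure-Python sketch of the metric, assuming labels are integer scores in 0..n_classes-1 (the function name and toy inputs are illustrative, not from the training code):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights: 1.0 means perfect agreement,
    0.0 means chance-level agreement between predicted and true scores."""
    n = len(y_true)
    # Observed confusion counts and marginal histograms
    observed = [[0] * n_classes for _ in range(n_classes)]
    hist_true = [0] * n_classes
    hist_pred = [0] * n_classes
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
        hist_true[t] += 1
        hist_pred[p] += 1
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic disagreement penalty
            num += w * observed[i][j]
            den += w * hist_true[i] * hist_pred[j] / n
    return 1.0 - num / den

# Sanity check on the card's own numbers: RMSE is just the square root of MSE
assert round(math.sqrt(0.5918), 4) == 0.7693
```

scikit-learn's `cohen_kappa_score(y_true, y_pred, weights="quadratic")` computes the same quantity, which is handy for cross-checking.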

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
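With lr_scheduler_type linear and no warmup configured, the learning rate decays linearly from 2e-05 toward zero over training. A minimal sketch of that schedule, where the total of ~3800 optimizer steps is an inference from the training log (epoch 1.0 is reached at step 38, over 100 epochs), not a value stated by the card:

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Learning rate at a given optimizer step under a linear schedule
    with optional warmup (mirrors transformers' linear scheduler)."""
    if step < warmup_steps:
        # Ramp up from 0 to base_lr during warmup
        return base_lr * step / max(1, warmup_steps)
    # Decay linearly from base_lr to 0 over the remaining steps
    remaining = max(0.0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

# 38 steps per epoch x 100 epochs -> roughly 3800 steps total
```

At step 0 this returns the full 2e-05; halfway through (step 1900 of 3800) it has decayed to 1e-05, and it reaches zero at the final step.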

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0526 2 2.4742 -0.0262 2.4742 1.5730
No log 0.1053 4 1.3467 0.0423 1.3467 1.1605
No log 0.1579 6 0.9517 -0.0622 0.9517 0.9756
No log 0.2105 8 0.9130 0.1508 0.9130 0.9555
No log 0.2632 10 0.9536 0.1180 0.9536 0.9765
No log 0.3158 12 0.9793 0.1584 0.9793 0.9896
No log 0.3684 14 0.7882 -0.0149 0.7882 0.8878
No log 0.4211 16 0.7247 0.0 0.7247 0.8513
No log 0.4737 18 0.8137 0.2156 0.8137 0.9020
No log 0.5263 20 0.8123 0.1786 0.8123 0.9013
No log 0.5789 22 0.8021 0.0 0.8021 0.8956
No log 0.6316 24 0.7675 -0.0027 0.7675 0.8761
No log 0.6842 26 0.6940 0.1287 0.6940 0.8331
No log 0.7368 28 0.7868 0.1372 0.7868 0.8870
No log 0.7895 30 0.8928 0.3444 0.8928 0.9449
No log 0.8421 32 0.7854 0.0944 0.7854 0.8862
No log 0.8947 34 0.7197 0.1327 0.7197 0.8483
No log 0.9474 36 0.7623 0.1327 0.7623 0.8731
No log 1.0 38 0.7829 0.2128 0.7829 0.8848
No log 1.0526 40 0.7623 0.2412 0.7623 0.8731
No log 1.1053 42 0.7603 0.1942 0.7603 0.8720
No log 1.1579 44 0.7794 0.3019 0.7794 0.8829
No log 1.2105 46 0.9561 0.1300 0.9561 0.9778
No log 1.2632 48 1.0133 0.0390 1.0133 1.0066
No log 1.3158 50 0.8660 0.2642 0.8660 0.9306
No log 1.3684 52 0.7905 0.3050 0.7905 0.8891
No log 1.4211 54 0.7617 0.1710 0.7617 0.8728
No log 1.4737 56 0.7881 0.3477 0.7881 0.8878
No log 1.5263 58 0.7931 0.3125 0.7931 0.8906
No log 1.5789 60 0.7634 0.2815 0.7634 0.8737
No log 1.6316 62 0.7977 0.1988 0.7977 0.8932
No log 1.6842 64 0.7782 0.2540 0.7782 0.8822
No log 1.7368 66 0.7562 0.2711 0.7562 0.8696
No log 1.7895 68 0.7735 0.1935 0.7735 0.8795
No log 1.8421 70 0.7375 0.1935 0.7375 0.8588
No log 1.8947 72 0.6690 0.2325 0.6690 0.8179
No log 1.9474 74 0.7948 0.2584 0.7948 0.8915
No log 2.0 76 0.7145 0.3299 0.7145 0.8453
No log 2.0526 78 0.6207 0.4111 0.6207 0.7878
No log 2.1053 80 0.8465 0.3134 0.8465 0.9200
No log 2.1579 82 1.0387 0.4186 1.0387 1.0192
No log 2.2105 84 0.9571 0.3782 0.9571 0.9783
No log 2.2632 86 0.6300 0.4124 0.6300 0.7937
No log 2.3158 88 0.6868 0.4677 0.6868 0.8287
No log 2.3684 90 0.7406 0.4078 0.7406 0.8606
No log 2.4211 92 0.6830 0.3125 0.6830 0.8264
No log 2.4737 94 0.6364 0.3228 0.6364 0.7977
No log 2.5263 96 0.7200 0.3167 0.7200 0.8486
No log 2.5789 98 0.8432 0.3234 0.8432 0.9183
No log 2.6316 100 0.7633 0.3167 0.7633 0.8736
No log 2.6842 102 0.7309 0.2923 0.7309 0.8549
No log 2.7368 104 0.8185 0.2402 0.8185 0.9047
No log 2.7895 106 0.8356 0.2435 0.8356 0.9141
No log 2.8421 108 0.7709 0.2738 0.7709 0.8780
No log 2.8947 110 0.7136 0.3001 0.7136 0.8448
No log 2.9474 112 0.6290 0.3029 0.6290 0.7931
No log 3.0 114 0.5911 0.3717 0.5911 0.7688
No log 3.0526 116 0.5821 0.3661 0.5821 0.7629
No log 3.1053 118 0.5815 0.3509 0.5815 0.7626
No log 3.1579 120 0.5986 0.4524 0.5986 0.7737
No log 3.2105 122 0.6030 0.4524 0.6030 0.7766
No log 3.2632 124 0.6077 0.4249 0.6077 0.7796
No log 3.3158 126 0.7531 0.4556 0.7531 0.8678
No log 3.3684 128 0.7409 0.4341 0.7409 0.8607
No log 3.4211 130 0.5828 0.4677 0.5828 0.7634
No log 3.4737 132 0.6658 0.4513 0.6658 0.8160
No log 3.5263 134 0.7540 0.4197 0.7540 0.8683
No log 3.5789 136 0.6749 0.4218 0.6749 0.8215
No log 3.6316 138 0.5742 0.4300 0.5742 0.7577
No log 3.6842 140 0.5686 0.4876 0.5686 0.7540
No log 3.7368 142 0.5589 0.4127 0.5589 0.7476
No log 3.7895 144 0.5672 0.4292 0.5672 0.7531
No log 3.8421 146 0.6264 0.4247 0.6264 0.7914
No log 3.8947 148 0.7403 0.4424 0.7403 0.8604
No log 3.9474 150 0.6883 0.4197 0.6883 0.8296
No log 4.0 152 0.5715 0.4212 0.5715 0.7559
No log 4.0526 154 0.5813 0.4190 0.5813 0.7624
No log 4.1053 156 0.5703 0.4270 0.5703 0.7552
No log 4.1579 158 0.5649 0.4352 0.5649 0.7516
No log 4.2105 160 0.5107 0.5623 0.5107 0.7147
No log 4.2632 162 0.5246 0.4902 0.5246 0.7243
No log 4.3158 164 0.5683 0.3840 0.5683 0.7539
No log 4.3684 166 0.6582 0.3480 0.6582 0.8113
No log 4.4211 168 0.5556 0.4267 0.5556 0.7454
No log 4.4737 170 0.5546 0.5479 0.5546 0.7447
No log 4.5263 172 0.5440 0.5463 0.5440 0.7375
No log 4.5789 174 0.5930 0.4437 0.5930 0.7700
No log 4.6316 176 0.6979 0.4315 0.6979 0.8354
No log 4.6842 178 0.6182 0.3914 0.6182 0.7862
No log 4.7368 180 0.5305 0.5304 0.5305 0.7284
No log 4.7895 182 0.5348 0.5304 0.5348 0.7313
No log 4.8421 184 0.5679 0.4007 0.5679 0.7536
No log 4.8947 186 0.5817 0.4007 0.5817 0.7627
No log 4.9474 188 0.5516 0.4413 0.5516 0.7427
No log 5.0 190 0.5172 0.5703 0.5172 0.7192
No log 5.0526 192 0.5893 0.4463 0.5893 0.7677
No log 5.1053 194 0.8252 0.3786 0.8252 0.9084
No log 5.1579 196 0.8355 0.3786 0.8355 0.9141
No log 5.2105 198 0.6419 0.3761 0.6419 0.8012
No log 5.2632 200 0.5213 0.5801 0.5213 0.7220
No log 5.3158 202 0.5496 0.4867 0.5496 0.7414
No log 5.3684 204 0.5489 0.4437 0.5489 0.7409
No log 5.4211 206 0.5283 0.5675 0.5283 0.7268
No log 5.4737 208 0.5314 0.4938 0.5314 0.7289
No log 5.5263 210 0.5220 0.4869 0.5220 0.7225
No log 5.5789 212 0.5123 0.4828 0.5123 0.7157
No log 5.6316 214 0.5335 0.4292 0.5335 0.7304
No log 5.6842 216 0.5656 0.4371 0.5656 0.7521
No log 5.7368 218 0.5555 0.4371 0.5555 0.7453
No log 5.7895 220 0.4966 0.5232 0.4966 0.7047
No log 5.8421 222 0.5011 0.5796 0.5011 0.7079
No log 5.8947 224 0.5582 0.6030 0.5582 0.7472
No log 5.9474 226 0.5958 0.5664 0.5958 0.7719
No log 6.0 228 0.5356 0.5569 0.5356 0.7318
No log 6.0526 230 0.4995 0.6530 0.4995 0.7068
No log 6.1053 232 0.5159 0.4883 0.5159 0.7183
No log 6.1579 234 0.5146 0.5605 0.5146 0.7174
No log 6.2105 236 0.5385 0.5438 0.5385 0.7338
No log 6.2632 238 0.5703 0.5161 0.5703 0.7552
No log 6.3158 240 0.5370 0.5283 0.5370 0.7328
No log 6.3684 242 0.5047 0.5457 0.5047 0.7104
No log 6.4211 244 0.5251 0.4524 0.5251 0.7246
No log 6.4737 246 0.5243 0.4052 0.5243 0.7241
No log 6.5263 248 0.5345 0.4795 0.5345 0.7311
No log 6.5789 250 0.5387 0.5152 0.5387 0.7339
No log 6.6316 252 0.5729 0.5368 0.5729 0.7569
No log 6.6842 254 0.5901 0.5721 0.5901 0.7682
No log 6.7368 256 0.5387 0.5252 0.5387 0.7340
No log 6.7895 258 0.5034 0.5739 0.5034 0.7095
No log 6.8421 260 0.5435 0.5291 0.5435 0.7372
No log 6.8947 262 0.6099 0.5215 0.6099 0.7809
No log 6.9474 264 0.5547 0.5107 0.5547 0.7448
No log 7.0 266 0.4939 0.5289 0.4939 0.7028
No log 7.0526 268 0.5009 0.5071 0.5009 0.7077
No log 7.1053 270 0.5180 0.5141 0.5180 0.7198
No log 7.1579 272 0.5547 0.4875 0.5547 0.7447
No log 7.2105 274 0.5821 0.4702 0.5821 0.7630
No log 7.2632 276 0.5706 0.5030 0.5706 0.7553
No log 7.3158 278 0.5622 0.5383 0.5622 0.7498
No log 7.3684 280 0.5435 0.5515 0.5435 0.7372
No log 7.4211 282 0.5574 0.4954 0.5574 0.7466
No log 7.4737 284 0.6301 0.4819 0.6301 0.7938
No log 7.5263 286 0.5940 0.5219 0.5940 0.7707
No log 7.5789 288 0.5182 0.5323 0.5182 0.7198
No log 7.6316 290 0.5355 0.5250 0.5355 0.7318
No log 7.6842 292 0.5407 0.5335 0.5407 0.7353
No log 7.7368 294 0.5244 0.6073 0.5244 0.7242
No log 7.7895 296 0.5300 0.5460 0.5300 0.7280
No log 7.8421 298 0.5318 0.5142 0.5318 0.7292
No log 7.8947 300 0.5174 0.5406 0.5174 0.7193
No log 7.9474 302 0.6188 0.5200 0.6188 0.7866
No log 8.0 304 0.6952 0.5343 0.6952 0.8338
No log 8.0526 306 0.5854 0.5317 0.5854 0.7651
No log 8.1053 308 0.5078 0.4829 0.5078 0.7126
No log 8.1579 310 0.5762 0.4430 0.5762 0.7591
No log 8.2105 312 0.6103 0.4350 0.6103 0.7812
No log 8.2632 314 0.5538 0.4243 0.5538 0.7442
No log 8.3158 316 0.5434 0.4019 0.5434 0.7372
No log 8.3684 318 0.5885 0.5512 0.5885 0.7671
No log 8.4211 320 0.5941 0.5933 0.5941 0.7708
No log 8.4737 322 0.5445 0.5009 0.5445 0.7379
No log 8.5263 324 0.5772 0.4597 0.5772 0.7597
No log 8.5789 326 0.6576 0.3913 0.6576 0.8109
No log 8.6316 328 0.6099 0.3891 0.6099 0.7809
No log 8.6842 330 0.5710 0.4414 0.5710 0.7557
No log 8.7368 332 0.5100 0.4816 0.5100 0.7141
No log 8.7895 334 0.4986 0.5003 0.4986 0.7061
No log 8.8421 336 0.5188 0.5228 0.5188 0.7203
No log 8.8947 338 0.5206 0.5228 0.5206 0.7215
No log 8.9474 340 0.4914 0.5538 0.4914 0.7010
No log 9.0 342 0.5456 0.5528 0.5456 0.7386
No log 9.0526 344 0.5837 0.5544 0.5837 0.7640
No log 9.1053 346 0.5562 0.5390 0.5562 0.7458
No log 9.1579 348 0.5311 0.5574 0.5311 0.7288
No log 9.2105 350 0.5312 0.5379 0.5312 0.7288
No log 9.2632 352 0.5321 0.5396 0.5321 0.7294
No log 9.3158 354 0.5381 0.5493 0.5381 0.7336
No log 9.3684 356 0.5856 0.5317 0.5856 0.7652
No log 9.4211 358 0.6108 0.5247 0.6108 0.7815
No log 9.4737 360 0.5593 0.5065 0.5593 0.7478
No log 9.5263 362 0.5335 0.4468 0.5335 0.7304
No log 9.5789 364 0.5720 0.3840 0.5720 0.7563
No log 9.6316 366 0.5746 0.3840 0.5746 0.7580
No log 9.6842 368 0.5303 0.5081 0.5303 0.7282
No log 9.7368 370 0.5184 0.4724 0.5184 0.7200
No log 9.7895 372 0.5173 0.5246 0.5173 0.7193
No log 9.8421 374 0.5219 0.4504 0.5219 0.7225
No log 9.8947 376 0.5283 0.3887 0.5283 0.7268
No log 9.9474 378 0.5363 0.3887 0.5363 0.7323
No log 10.0 380 0.5394 0.3738 0.5394 0.7344
No log 10.0526 382 0.5300 0.4561 0.5300 0.7280
No log 10.1053 384 0.5403 0.4561 0.5403 0.7350
No log 10.1579 386 0.5523 0.4111 0.5523 0.7432
No log 10.2105 388 0.6062 0.4212 0.6062 0.7786
No log 10.2632 390 0.6456 0.4684 0.6456 0.8035
No log 10.3158 392 0.6039 0.4997 0.6039 0.7771
No log 10.3684 394 0.5580 0.4352 0.5580 0.7470
No log 10.4211 396 0.5466 0.4828 0.5466 0.7393
No log 10.4737 398 0.5464 0.5397 0.5464 0.7392
No log 10.5263 400 0.5796 0.5498 0.5796 0.7613
No log 10.5789 402 0.5928 0.5095 0.5928 0.7699
No log 10.6316 404 0.5464 0.4918 0.5464 0.7392
No log 10.6842 406 0.5486 0.4086 0.5486 0.7407
No log 10.7368 408 0.6251 0.4663 0.6251 0.7906
No log 10.7895 410 0.6499 0.4663 0.6499 0.8062
No log 10.8421 412 0.6379 0.4663 0.6379 0.7987
No log 10.8947 414 0.5804 0.3809 0.5804 0.7619
No log 10.9474 416 0.5501 0.3920 0.5501 0.7417
No log 11.0 418 0.5381 0.4448 0.5381 0.7335
No log 11.0526 420 0.5329 0.4448 0.5329 0.7300
No log 11.1053 422 0.5343 0.4591 0.5343 0.7310
No log 11.1579 424 0.5567 0.4774 0.5567 0.7461
No log 11.2105 426 0.5875 0.4522 0.5875 0.7665
No log 11.2632 428 0.5271 0.5036 0.5271 0.7260
No log 11.3158 430 0.5001 0.4990 0.5001 0.7072
No log 11.3684 432 0.5363 0.5649 0.5363 0.7323
No log 11.4211 434 0.5570 0.5587 0.5570 0.7463
No log 11.4737 436 0.5185 0.5414 0.5185 0.7201
No log 11.5263 438 0.4972 0.4934 0.4972 0.7051
No log 11.5789 440 0.5613 0.4764 0.5613 0.7492
No log 11.6316 442 0.5970 0.4764 0.5970 0.7727
No log 11.6842 444 0.5405 0.4081 0.5405 0.7352
No log 11.7368 446 0.5138 0.4878 0.5138 0.7168
No log 11.7895 448 0.5207 0.4991 0.5207 0.7216
No log 11.8421 450 0.5100 0.5440 0.5100 0.7141
No log 11.8947 452 0.5355 0.4622 0.5355 0.7318
No log 11.9474 454 0.5639 0.4684 0.5639 0.7509
No log 12.0 456 0.5370 0.3990 0.5370 0.7328
No log 12.0526 458 0.5296 0.4147 0.5296 0.7277
No log 12.1053 460 0.5412 0.4067 0.5412 0.7356
No log 12.1579 462 0.5332 0.3754 0.5332 0.7302
No log 12.2105 464 0.5430 0.4677 0.5430 0.7369
No log 12.2632 466 0.5524 0.4429 0.5524 0.7432
No log 12.3158 468 0.5381 0.3947 0.5381 0.7336
No log 12.3684 470 0.5363 0.4378 0.5363 0.7323
No log 12.4211 472 0.5399 0.4354 0.5399 0.7348
No log 12.4737 474 0.5463 0.4437 0.5463 0.7391
No log 12.5263 476 0.5432 0.4437 0.5432 0.7370
No log 12.5789 478 0.5236 0.4867 0.5236 0.7236
No log 12.6316 480 0.5083 0.5440 0.5083 0.7130
No log 12.6842 482 0.5126 0.5390 0.5126 0.7160
No log 12.7368 484 0.5293 0.5352 0.5293 0.7275
No log 12.7895 486 0.5629 0.5997 0.5629 0.7503
No log 12.8421 488 0.5812 0.5328 0.5812 0.7624
No log 12.8947 490 0.5474 0.5997 0.5474 0.7398
No log 12.9474 492 0.5049 0.5127 0.5049 0.7106
No log 13.0 494 0.4787 0.5522 0.4787 0.6919
No log 13.0526 496 0.5091 0.6087 0.5091 0.7135
No log 13.1053 498 0.5909 0.5219 0.5909 0.7687
0.3675 13.1579 500 0.6440 0.5008 0.6440 0.8025
0.3675 13.2105 502 0.5974 0.4997 0.5974 0.7729
0.3675 13.2632 504 0.5570 0.3622 0.5570 0.7463
0.3675 13.3158 506 0.5531 0.3416 0.5531 0.7437
0.3675 13.3684 508 0.5640 0.3622 0.5640 0.7510
0.3675 13.4211 510 0.5918 0.4997 0.5918 0.7693

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1