ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k10_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02; the training dataset is not specified in the card. It achieves the following results on the evaluation set:

  • Loss: 0.5601
  • Qwk (quadratic weighted kappa): 0.5512
  • Mse (mean squared error): 0.5601
  • Rmse (root mean squared error): 0.7484
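Qwk is quadratic weighted kappa, the usual agreement metric for ordinal essay-scoring tasks, and Rmse is simply the square root of Mse (sqrt(0.5601) ≈ 0.7484, matching the numbers above). A minimal pure-Python sketch of how these three metrics are computed; the `n_classes` parameter is an assumption for illustration, since the card does not state the label range of the organization-scoring task:

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic disagreement weights (the Qwk column)."""
    n = n_classes
    observed = [[0] * n for _ in range(n)]      # confusion matrix of counts
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    hist_true = [sum(row) for row in observed]  # marginal counts, true labels
    hist_pred = [sum(observed[i][j] for i in range(n)) for j in range(n)]
    total = len(y_true)
    num = den = 0.0
    for i in range(n):
        for j in range(n):
            w = (i - j) ** 2 / (n - 1) ** 2     # quadratic penalty by distance
            expected = hist_true[i] * hist_pred[j] / total  # chance agreement
            num += w * observed[i][j]
            den += w * expected
    return 1.0 - num / den

def mse(y_true, y_pred):
    """Mean squared error (the Mse column)."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    """Root mean squared error (the Rmse column); Rmse = sqrt(Mse)."""
    return math.sqrt(mse(y_true, y_pred))
```

Perfect agreement yields a kappa of 1.0, chance-level agreement 0.0, and systematic disagreement negative values, so the final Qwk of 0.5512 indicates moderate agreement with the human scores.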

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
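With `lr_scheduler_type: linear` and no warmup listed, the learning rate presumably decays linearly from 2e-05 toward zero over the scheduled steps. The log below implies 35 optimizer steps per epoch (step 140 at epoch 4.0), so 100 epochs would correspond to 3500 scheduled steps; both figures are inferred from the table, not stated in the card. A minimal sketch of that schedule, assuming zero warmup:

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Linear schedule: optional linear warmup, then linear decay to zero."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0.0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

# With the inferred 3500 total steps, the rate halves at step 1750
# and reaches zero at the final scheduled step.
```

Note that the training log stops around epoch 14.57 (step 510), well short of the configured 100 epochs, so the run appears to have ended early and the later portion of the schedule was never reached.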

Training results

In the table below, "No log" means the training loss had not yet been reported at that evaluation step; the first logged value (0.2682) appears at step 500.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0571 2 2.4600 -0.0788 2.4600 1.5684
No log 0.1143 4 1.1620 0.0715 1.1620 1.0780
No log 0.1714 6 0.7674 0.0937 0.7674 0.8760
No log 0.2286 8 0.8405 -0.0149 0.8405 0.9168
No log 0.2857 10 0.8139 0.0236 0.8139 0.9022
No log 0.3429 12 0.7363 -0.0027 0.7363 0.8581
No log 0.4 14 0.8171 0.0937 0.8171 0.9039
No log 0.4571 16 1.0162 0.0643 1.0162 1.0081
No log 0.5143 18 1.0817 -0.0201 1.0817 1.0400
No log 0.5714 20 0.9878 0.1661 0.9878 0.9939
No log 0.6286 22 0.7874 0.0889 0.7874 0.8873
No log 0.6857 24 0.7664 0.3839 0.7664 0.8754
No log 0.7429 26 0.8259 0.3637 0.8259 0.9088
No log 0.8 28 0.7702 0.2471 0.7702 0.8776
No log 0.8571 30 0.7336 0.1561 0.7336 0.8565
No log 0.9143 32 0.7545 0.0840 0.7545 0.8686
No log 0.9714 34 0.7518 0.1236 0.7518 0.8671
No log 1.0286 36 0.7147 0.1508 0.7147 0.8454
No log 1.0857 38 0.7108 0.1807 0.7108 0.8431
No log 1.1429 40 0.7053 0.2145 0.7053 0.8398
No log 1.2 42 0.7063 0.2374 0.7063 0.8404
No log 1.2571 44 0.7035 0.2374 0.7035 0.8387
No log 1.3143 46 0.7172 0.2748 0.7172 0.8469
No log 1.3714 48 0.9109 0.0966 0.9109 0.9544
No log 1.4286 50 1.0176 0.1856 1.0176 1.0087
No log 1.4857 52 0.9017 0.0970 0.9017 0.9496
No log 1.5429 54 0.7405 0.2454 0.7405 0.8605
No log 1.6 56 0.6835 0.3169 0.6835 0.8268
No log 1.6571 58 0.6600 0.3137 0.6600 0.8124
No log 1.7143 60 0.6595 0.2987 0.6595 0.8121
No log 1.7714 62 0.6572 0.3050 0.6572 0.8106
No log 1.8286 64 0.6999 0.2981 0.6999 0.8366
No log 1.8857 66 0.6729 0.2441 0.6729 0.8203
No log 1.9429 68 0.6839 0.2572 0.6839 0.8270
No log 2.0 70 0.6673 0.2121 0.6673 0.8169
No log 2.0571 72 0.6793 0.3416 0.6793 0.8242
No log 2.1143 74 0.6962 0.3341 0.6962 0.8344
No log 2.1714 76 0.7138 0.3267 0.7138 0.8449
No log 2.2286 78 0.7027 0.2923 0.7027 0.8383
No log 2.2857 80 0.9024 0.2504 0.9024 0.9500
No log 2.3429 82 1.0164 0.1654 1.0164 1.0082
No log 2.4 84 0.8819 0.2837 0.8819 0.9391
No log 2.4571 86 0.7336 0.3481 0.7336 0.8565
No log 2.5143 88 0.6905 0.3296 0.6905 0.8310
No log 2.5714 90 0.8044 0.3953 0.8044 0.8969
No log 2.6286 92 0.6588 0.5271 0.6588 0.8117
No log 2.6857 94 0.7086 0.3807 0.7086 0.8418
No log 2.7429 96 0.9867 0.2932 0.9867 0.9933
No log 2.8 98 1.0568 0.2886 1.0568 1.0280
No log 2.8571 100 0.9008 0.3144 0.9008 0.9491
No log 2.9143 102 0.7200 0.4671 0.7200 0.8485
No log 2.9714 104 0.6187 0.4859 0.6187 0.7866
No log 3.0286 106 0.5686 0.4425 0.5686 0.7540
No log 3.0857 108 0.5543 0.4425 0.5543 0.7445
No log 3.1429 110 0.5782 0.4211 0.5782 0.7604
No log 3.2 112 0.6052 0.4958 0.6052 0.7779
No log 3.2571 114 0.6956 0.3921 0.6956 0.8341
No log 3.3143 116 0.6842 0.5112 0.6842 0.8272
No log 3.3714 118 0.5814 0.5028 0.5814 0.7625
No log 3.4286 120 0.5685 0.4762 0.5685 0.7540
No log 3.4857 122 0.5652 0.4847 0.5652 0.7518
No log 3.5429 124 0.5891 0.5577 0.5891 0.7675
No log 3.6 126 0.5969 0.5345 0.5969 0.7726
No log 3.6571 128 0.5712 0.4768 0.5712 0.7558
No log 3.7143 130 0.6246 0.4459 0.6246 0.7903
No log 3.7714 132 0.6562 0.3688 0.6562 0.8101
No log 3.8286 134 0.5892 0.3915 0.5892 0.7676
No log 3.8857 136 0.6042 0.3958 0.6042 0.7773
No log 3.9429 138 0.6199 0.3958 0.6199 0.7874
No log 4.0 140 0.6155 0.3808 0.6155 0.7845
No log 4.0571 142 0.6652 0.3081 0.6652 0.8156
No log 4.1143 144 0.6862 0.3296 0.6862 0.8284
No log 4.1714 146 0.6511 0.2748 0.6511 0.8069
No log 4.2286 148 0.6536 0.3754 0.6536 0.8084
No log 4.2857 150 0.6504 0.4345 0.6504 0.8065
No log 4.3429 152 0.6582 0.4044 0.6582 0.8113
No log 4.4 154 0.6374 0.4762 0.6374 0.7984
No log 4.4571 156 0.6405 0.5231 0.6405 0.8003
No log 4.5143 158 0.6637 0.5092 0.6637 0.8147
No log 4.5714 160 0.6696 0.5286 0.6696 0.8183
No log 4.6286 162 0.7368 0.3365 0.7368 0.8584
No log 4.6857 164 0.8008 0.3615 0.8008 0.8949
No log 4.7429 166 0.8468 0.4110 0.8468 0.9202
No log 4.8 168 0.6906 0.5335 0.6906 0.8310
No log 4.8571 170 0.6134 0.4748 0.6134 0.7832
No log 4.9143 172 0.6411 0.5429 0.6411 0.8007
No log 4.9714 174 0.5892 0.4965 0.5892 0.7676
No log 5.0286 176 0.5785 0.5025 0.5785 0.7606
No log 5.0857 178 0.5932 0.5025 0.5932 0.7702
No log 5.1429 180 0.6113 0.4695 0.6113 0.7819
No log 5.2 182 0.6440 0.5525 0.6440 0.8025
No log 5.2571 184 0.6223 0.5411 0.6223 0.7889
No log 5.3143 186 0.5636 0.4094 0.5636 0.7507
No log 5.3714 188 0.5518 0.4923 0.5518 0.7428
No log 5.4286 190 0.5725 0.4595 0.5725 0.7566
No log 5.4857 192 0.5794 0.5483 0.5794 0.7612
No log 5.5429 194 0.5810 0.5373 0.5810 0.7622
No log 5.6 196 0.6052 0.5720 0.6052 0.7780
No log 5.6571 198 0.6653 0.5672 0.6653 0.8157
No log 5.7143 200 0.7492 0.4637 0.7492 0.8656
No log 5.7714 202 0.7073 0.4987 0.7073 0.8410
No log 5.8286 204 0.5837 0.5648 0.5837 0.7640
No log 5.8857 206 0.6527 0.4982 0.6527 0.8079
No log 5.9429 208 0.7039 0.4217 0.7039 0.8390
No log 6.0 210 0.6914 0.4315 0.6914 0.8315
No log 6.0571 212 0.5748 0.5362 0.5748 0.7581
No log 6.1143 214 0.5734 0.5422 0.5734 0.7573
No log 6.1714 216 0.6763 0.4708 0.6763 0.8224
No log 6.2286 218 0.7155 0.4545 0.7155 0.8458
No log 6.2857 220 0.6555 0.5393 0.6555 0.8096
No log 6.3429 222 0.5719 0.5498 0.5719 0.7562
No log 6.4 224 0.5470 0.5826 0.5470 0.7396
No log 6.4571 226 0.5354 0.5797 0.5354 0.7317
No log 6.5143 228 0.5343 0.5304 0.5343 0.7310
No log 6.5714 230 0.5331 0.5533 0.5331 0.7301
No log 6.6286 232 0.5324 0.5444 0.5324 0.7297
No log 6.6857 234 0.5304 0.4562 0.5304 0.7283
No log 6.7429 236 0.5458 0.5543 0.5458 0.7388
No log 6.8 238 0.5551 0.5395 0.5551 0.7451
No log 6.8571 240 0.5424 0.5485 0.5424 0.7365
No log 6.9143 242 0.5398 0.4938 0.5398 0.7347
No log 6.9714 244 0.5642 0.5368 0.5642 0.7511
No log 7.0286 246 0.5549 0.4997 0.5549 0.7449
No log 7.0857 248 0.5370 0.5390 0.5370 0.7328
No log 7.1429 250 0.5421 0.5167 0.5421 0.7363
No log 7.2 252 0.5416 0.5549 0.5416 0.7359
No log 7.2571 254 0.5372 0.5390 0.5372 0.7329
No log 7.3143 256 0.6791 0.4783 0.6791 0.8241
No log 7.3714 258 0.8381 0.3472 0.8381 0.9155
No log 7.4286 260 0.8090 0.4161 0.8090 0.8994
No log 7.4857 262 0.6400 0.5195 0.6400 0.8000
No log 7.5429 264 0.5237 0.5707 0.5237 0.7237
No log 7.6 266 0.5503 0.5222 0.5503 0.7418
No log 7.6571 268 0.5554 0.4735 0.5554 0.7452
No log 7.7143 270 0.5244 0.5939 0.5244 0.7242
No log 7.7714 272 0.5477 0.5953 0.5477 0.7401
No log 7.8286 274 0.5664 0.5943 0.5664 0.7526
No log 7.8857 276 0.5259 0.5867 0.5259 0.7252
No log 7.9429 278 0.5188 0.5707 0.5188 0.7203
No log 8.0 280 0.5184 0.5707 0.5184 0.7200
No log 8.0571 282 0.5205 0.5336 0.5205 0.7214
No log 8.1143 284 0.5424 0.5518 0.5424 0.7365
No log 8.1714 286 0.5427 0.5853 0.5427 0.7367
No log 8.2286 288 0.5655 0.5678 0.5655 0.7520
No log 8.2857 290 0.5948 0.5706 0.5948 0.7712
No log 8.3429 292 0.5559 0.5855 0.5559 0.7456
No log 8.4 294 0.5177 0.5483 0.5177 0.7195
No log 8.4571 296 0.4920 0.5563 0.4920 0.7015
No log 8.5143 298 0.4949 0.6092 0.4949 0.7035
No log 8.5714 300 0.4948 0.6108 0.4948 0.7034
No log 8.6286 302 0.5000 0.6108 0.5000 0.7071
No log 8.6857 304 0.5064 0.5413 0.5064 0.7116
No log 8.7429 306 0.5138 0.5663 0.5138 0.7168
No log 8.8 308 0.5130 0.5736 0.5130 0.7162
No log 8.8571 310 0.5091 0.5563 0.5091 0.7135
No log 8.9143 312 0.5204 0.6143 0.5204 0.7214
No log 8.9714 314 0.5147 0.5943 0.5147 0.7174
No log 9.0286 316 0.4945 0.5184 0.4945 0.7032
No log 9.0857 318 0.5151 0.5801 0.5151 0.7177
No log 9.1429 320 0.4955 0.5021 0.4955 0.7039
No log 9.2 322 0.5008 0.5861 0.5008 0.7077
No log 9.2571 324 0.4939 0.5861 0.4939 0.7028
No log 9.3143 326 0.4881 0.5463 0.4881 0.6987
No log 9.3714 328 0.4995 0.5463 0.4995 0.7067
No log 9.4286 330 0.5167 0.5538 0.5167 0.7188
No log 9.4857 332 0.5181 0.5767 0.5181 0.7198
No log 9.5429 334 0.5325 0.5718 0.5325 0.7297
No log 9.6 336 0.5290 0.5160 0.5290 0.7273
No log 9.6571 338 0.5248 0.5131 0.5248 0.7244
No log 9.7143 340 0.5293 0.4867 0.5293 0.7275
No log 9.7714 342 0.5178 0.4547 0.5178 0.7196
No log 9.8286 344 0.5210 0.4338 0.5210 0.7218
No log 9.8857 346 0.5448 0.4635 0.5448 0.7381
No log 9.9429 348 0.5482 0.4575 0.5482 0.7404
No log 10.0 350 0.5253 0.5143 0.5253 0.7248
No log 10.0571 352 0.5289 0.5131 0.5289 0.7272
No log 10.1143 354 0.6116 0.5045 0.6116 0.7820
No log 10.1714 356 0.6448 0.4947 0.6448 0.8030
No log 10.2286 358 0.5796 0.4916 0.5796 0.7613
No log 10.2857 360 0.5278 0.4866 0.5278 0.7265
No log 10.3429 362 0.5280 0.5421 0.5280 0.7266
No log 10.4 364 0.5255 0.4934 0.5255 0.7249
No log 10.4571 366 0.5264 0.5003 0.5264 0.7255
No log 10.5143 368 0.5293 0.5071 0.5293 0.7275
No log 10.5714 370 0.5524 0.5324 0.5524 0.7432
No log 10.6286 372 0.5432 0.5708 0.5432 0.7371
No log 10.6857 374 0.5292 0.5708 0.5292 0.7275
No log 10.7429 376 0.4984 0.5114 0.4984 0.7060
No log 10.8 378 0.5265 0.5110 0.5265 0.7256
No log 10.8571 380 0.6454 0.5003 0.6454 0.8034
No log 10.9143 382 0.6635 0.5003 0.6635 0.8146
No log 10.9714 384 0.5806 0.5445 0.5806 0.7619
No log 11.0286 386 0.4910 0.5373 0.4910 0.7007
No log 11.0857 388 0.4925 0.5326 0.4925 0.7018
No log 11.1429 390 0.4935 0.5250 0.4935 0.7025
No log 11.2 392 0.5021 0.5523 0.5021 0.7086
No log 11.2571 394 0.5118 0.5523 0.5118 0.7154
No log 11.3143 396 0.5260 0.5250 0.5260 0.7253
No log 11.3714 398 0.5192 0.5781 0.5192 0.7205
No log 11.4286 400 0.5121 0.4681 0.5121 0.7156
No log 11.4857 402 0.5148 0.4378 0.5148 0.7175
No log 11.5429 404 0.5244 0.3945 0.5244 0.7242
No log 11.6 406 0.5273 0.4795 0.5273 0.7261
No log 11.6571 408 0.4990 0.4171 0.4990 0.7064
No log 11.7143 410 0.5150 0.5110 0.5150 0.7176
No log 11.7714 412 0.5476 0.5961 0.5476 0.7400
No log 11.8286 414 0.5749 0.5400 0.5749 0.7582
No log 11.8857 416 0.5318 0.5855 0.5318 0.7292
No log 11.9429 418 0.5023 0.4762 0.5023 0.7087
No log 12.0 420 0.5099 0.4762 0.5099 0.7140
No log 12.0571 422 0.5227 0.4942 0.5227 0.7230
No log 12.1143 424 0.5247 0.4942 0.5247 0.7244
No log 12.1714 426 0.5189 0.4991 0.5189 0.7204
No log 12.2286 428 0.5300 0.4562 0.5300 0.7280
No log 12.2857 430 0.5543 0.3914 0.5543 0.7445
No log 12.3429 432 0.5973 0.4464 0.5973 0.7729
No log 12.4 434 0.5873 0.4464 0.5873 0.7664
No log 12.4571 436 0.5468 0.3599 0.5468 0.7395
No log 12.5143 438 0.5297 0.4972 0.5297 0.7278
No log 12.5714 440 0.5280 0.4972 0.5280 0.7267
No log 12.6286 442 0.5269 0.4722 0.5269 0.7259
No log 12.6857 444 0.5197 0.4788 0.5197 0.7209
No log 12.7429 446 0.5167 0.4972 0.5167 0.7188
No log 12.8 448 0.5230 0.5213 0.5230 0.7232
No log 12.8571 450 0.5208 0.4972 0.5208 0.7216
No log 12.9143 452 0.5219 0.4742 0.5219 0.7224
No log 12.9714 454 0.5327 0.4527 0.5327 0.7299
No log 13.0286 456 0.5473 0.5266 0.5473 0.7398
No log 13.0857 458 0.5535 0.5495 0.5535 0.7440
No log 13.1429 460 0.5489 0.5283 0.5489 0.7409
No log 13.2 462 0.5295 0.5046 0.5295 0.7277
No log 13.2571 464 0.5140 0.4677 0.5140 0.7169
No log 13.3143 466 0.5136 0.4746 0.5136 0.7166
No log 13.3714 468 0.5270 0.4821 0.5270 0.7259
No log 13.4286 470 0.5289 0.4821 0.5289 0.7273
No log 13.4857 472 0.5377 0.4821 0.5377 0.7333
No log 13.5429 474 0.5770 0.5171 0.5770 0.7596
No log 13.6 476 0.6415 0.4904 0.6415 0.8010
No log 13.6571 478 0.6099 0.5101 0.6099 0.7809
No log 13.7143 480 0.5345 0.4809 0.5345 0.7311
No log 13.7714 482 0.5161 0.4838 0.5161 0.7184
No log 13.8286 484 0.5196 0.3552 0.5196 0.7208
No log 13.8857 486 0.5301 0.4111 0.5301 0.7281
No log 13.9429 488 0.5460 0.4067 0.5460 0.7389
No log 14.0 490 0.5468 0.4418 0.5468 0.7394
No log 14.0571 492 0.5540 0.5286 0.5540 0.7443
No log 14.1143 494 0.5593 0.5070 0.5593 0.7479
No log 14.1714 496 0.5595 0.5070 0.5595 0.7480
No log 14.2286 498 0.5705 0.5267 0.5705 0.7553
0.2682 14.2857 500 0.6016 0.5869 0.6016 0.7756
0.2682 14.3429 502 0.6016 0.5869 0.6016 0.7756
0.2682 14.4 504 0.6032 0.5869 0.6032 0.7767
0.2682 14.4571 506 0.5727 0.5733 0.5727 0.7568
0.2682 14.5143 508 0.5654 0.5283 0.5654 0.7519
0.2682 14.5714 510 0.5601 0.5512 0.5601 0.7484

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B parameters (Safetensors, F32 tensors)

Model repository: MayBashendy/ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k10_task7_organization (fine-tuned from aubmindlab/bert-base-arabertv02)