ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run2_AugV5_k7_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02. The fine-tuning dataset is not specified in this card. It achieves the following results on the evaluation set:

  • Loss: 0.5135
  • Qwk (quadratic weighted kappa): 0.4958
  • Mse (mean squared error): 0.5135
  • Rmse (root mean squared error): 0.7166
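The reported Qwk is the quadratic weighted kappa between predicted and gold scores, and Rmse is the square root of Mse. A minimal sketch of how these metrics are computed, using scikit-learn and made-up toy labels (not data from this model's evaluation set):

```python
# Toy illustration of the evaluation metrics reported above; the label
# values here are hypothetical, not taken from the model's evaluation set.
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = [0, 1, 2, 2, 1, 0, 3, 2]  # hypothetical gold scores
y_pred = [0, 1, 1, 2, 1, 0, 2, 2]  # hypothetical model predictions

# Quadratic weighting penalizes large disagreements more than small ones.
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = mse ** 0.5
```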

Model description

More information needed

Intended uses & limitations

More information needed
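No usage guidance is given, but since the card reports identical Loss and Mse values, the checkpoint most likely carries a regression-style sequence-classification head. A hedged sketch of scoring a single essay (the function name and the interpretation of the output as an "organization" score are assumptions inferred from the model name):

```python
# Hypothetical usage sketch: load the checkpoint and return its raw output
# for one Arabic essay. That the output is a single organization score is
# inferred from the model name, not confirmed by the card.
def score_essay(
    text: str,
    model_id: str = "MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run2_AugV5_k7_task7_organization",
):
    import torch
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForSequenceClassification.from_pretrained(model_id)
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**inputs).logits
    return logits
```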

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
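If the run used the Hugging Face Trainer (which the card's format suggests), the hyperparameters above map onto transformers.TrainingArguments roughly as follows; the output_dir name is a hypothetical placeholder:

```python
# Hyperparameter values taken from the list above; only output_dir is
# invented. These kwargs would be passed to transformers.TrainingArguments.
training_kwargs = dict(
    output_dir="arabert_task7_organization",  # hypothetical
    learning_rate=2e-05,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```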

Training results

Training loss is reported as "No log" until it is first logged at step 500.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0541 2 2.5123 -0.0788 2.5123 1.5850
No log 0.1081 4 1.1927 0.0979 1.1927 1.0921
No log 0.1622 6 0.9192 -0.0354 0.9192 0.9587
No log 0.2162 8 0.9520 0.1867 0.9520 0.9757
No log 0.2703 10 0.9182 0.2460 0.9182 0.9583
No log 0.3243 12 0.9697 0.2363 0.9697 0.9848
No log 0.3784 14 0.8141 0.2193 0.8141 0.9022
No log 0.4324 16 0.7218 0.2361 0.7218 0.8496
No log 0.4865 18 0.9577 0.2784 0.9577 0.9786
No log 0.5405 20 0.7337 0.3719 0.7338 0.8566
No log 0.5946 22 0.5626 0.3608 0.5626 0.7501
No log 0.6486 24 0.5668 0.3348 0.5668 0.7529
No log 0.7027 26 0.7287 0.3553 0.7287 0.8536
No log 0.7568 28 1.2923 0.1453 1.2923 1.1368
No log 0.8108 30 1.4796 0.0595 1.4796 1.2164
No log 0.8649 32 1.3667 0.0799 1.3667 1.1691
No log 0.9189 34 1.0556 0.3217 1.0556 1.0274
No log 0.9730 36 0.6825 0.3553 0.6825 0.8262
No log 1.0270 38 0.5780 0.4437 0.5780 0.7603
No log 1.0811 40 0.6759 0.4144 0.6759 0.8221
No log 1.1351 42 0.5624 0.3494 0.5624 0.7499
No log 1.1892 44 0.6317 0.3173 0.6317 0.7948
No log 1.2432 46 0.7228 0.3492 0.7228 0.8502
No log 1.2973 48 0.8540 0.4088 0.8540 0.9241
No log 1.3514 50 0.9364 0.3719 0.9364 0.9677
No log 1.4054 52 0.9209 0.3239 0.9209 0.9596
No log 1.4595 54 0.7420 0.3090 0.7420 0.8614
No log 1.5135 56 0.6365 0.2841 0.6365 0.7978
No log 1.5676 58 0.5533 0.0444 0.5533 0.7438
No log 1.6216 60 0.5694 0.4758 0.5694 0.7546
No log 1.6757 62 0.5841 0.4919 0.5841 0.7643
No log 1.7297 64 0.5711 0.2336 0.5711 0.7557
No log 1.7838 66 0.7300 0.1358 0.7300 0.8544
No log 1.8378 68 0.9738 0.3483 0.9738 0.9868
No log 1.8919 70 1.0381 0.3603 1.0381 1.0189
No log 1.9459 72 0.9198 0.3601 0.9198 0.9591
No log 2.0 74 0.7377 0.1685 0.7377 0.8589
No log 2.0541 76 0.5541 0.4264 0.5541 0.7444
No log 2.1081 78 0.5542 0.4264 0.5542 0.7444
No log 2.1622 80 0.6085 0.3146 0.6085 0.7801
No log 2.2162 82 0.8764 0.2846 0.8764 0.9361
No log 2.2703 84 1.1718 0.2403 1.1718 1.0825
No log 2.3243 86 1.1522 0.2403 1.1522 1.0734
No log 2.3784 88 0.8890 0.2516 0.8890 0.9429
No log 2.4324 90 0.6335 0.3097 0.6335 0.7959
No log 2.4865 92 0.5748 0.4249 0.5748 0.7582
No log 2.5405 94 0.5382 0.5161 0.5382 0.7336
No log 2.5946 96 0.5344 0.4685 0.5344 0.7310
No log 2.6486 98 0.5285 0.4468 0.5285 0.7270
No log 2.7027 100 0.5315 0.3836 0.5315 0.7290
No log 2.7568 102 0.5254 0.4291 0.5254 0.7248
No log 2.8108 104 0.5279 0.5209 0.5279 0.7266
No log 2.8649 106 0.5091 0.4816 0.5091 0.7135
No log 2.9189 108 0.5853 0.3706 0.5853 0.7650
No log 2.9730 110 0.7720 0.3953 0.7720 0.8786
No log 3.0270 112 0.9994 0.3411 0.9994 0.9997
No log 3.0811 114 0.9980 0.3411 0.9980 0.9990
No log 3.1351 116 0.8056 0.4152 0.8056 0.8976
No log 3.1892 118 0.5418 0.4575 0.5418 0.7361
No log 3.2432 120 0.5137 0.5232 0.5137 0.7168
No log 3.2973 122 0.5135 0.4774 0.5135 0.7166
No log 3.3514 124 0.4966 0.5485 0.4966 0.7047
No log 3.4054 126 0.5219 0.4190 0.5219 0.7224
No log 3.4595 128 0.4987 0.4444 0.4987 0.7062
No log 3.5135 130 0.5225 0.5571 0.5225 0.7228
No log 3.5676 132 0.5669 0.5161 0.5669 0.7529
No log 3.6216 134 0.5596 0.5368 0.5596 0.7481
No log 3.6757 136 0.5066 0.5397 0.5066 0.7118
No log 3.7297 138 0.5635 0.4597 0.5635 0.7507
No log 3.7838 140 0.5868 0.4684 0.5868 0.7660
No log 3.8378 142 0.5241 0.5306 0.5241 0.7239
No log 3.8919 144 0.5501 0.5104 0.5501 0.7417
No log 3.9459 146 0.5611 0.4480 0.5611 0.7491
No log 4.0 148 0.5512 0.4752 0.5512 0.7424
No log 4.0541 150 0.5508 0.4752 0.5508 0.7422
No log 4.1081 152 0.5615 0.4800 0.5615 0.7493
No log 4.1622 154 0.5488 0.4800 0.5488 0.7408
No log 4.2162 156 0.5554 0.4955 0.5554 0.7453
No log 4.2703 158 0.5382 0.5160 0.5382 0.7336
No log 4.3243 160 0.5137 0.5205 0.5137 0.7167
No log 4.3784 162 0.5270 0.5407 0.5270 0.7259
No log 4.4324 164 0.5254 0.5840 0.5254 0.7248
No log 4.4865 166 0.5393 0.6242 0.5393 0.7344
No log 4.5405 168 0.5301 0.5318 0.5301 0.7281
No log 4.5946 170 0.5350 0.4675 0.5350 0.7314
No log 4.6486 172 0.5183 0.5812 0.5183 0.7199
No log 4.7027 174 0.5498 0.5980 0.5498 0.7415
No log 4.7568 176 0.5143 0.5951 0.5143 0.7171
No log 4.8108 178 0.5065 0.5722 0.5065 0.7117
No log 4.8649 180 0.5495 0.4952 0.5495 0.7413
No log 4.9189 182 0.5258 0.4935 0.5258 0.7251
No log 4.9730 184 0.5503 0.5106 0.5503 0.7418
No log 5.0270 186 0.6134 0.5046 0.6134 0.7832
No log 5.0811 188 0.6761 0.5205 0.6761 0.8222
No log 5.1351 190 0.6294 0.5046 0.6294 0.7934
No log 5.1892 192 0.5338 0.5030 0.5338 0.7306
No log 5.2432 194 0.5162 0.5090 0.5162 0.7185
No log 5.2973 196 0.5164 0.4892 0.5164 0.7186
No log 5.3514 198 0.6080 0.5046 0.6080 0.7797
No log 5.4054 200 0.7186 0.5047 0.7186 0.8477
No log 5.4595 202 0.7098 0.5047 0.7098 0.8425
No log 5.5135 204 0.6619 0.5047 0.6619 0.8135
No log 5.5676 206 0.7029 0.4867 0.7029 0.8384
No log 5.6216 208 0.6948 0.4867 0.6948 0.8335
No log 5.6757 210 0.6654 0.4867 0.6654 0.8157
No log 5.7297 212 0.5637 0.4598 0.5637 0.7508
No log 5.7838 214 0.5267 0.5249 0.5267 0.7257
No log 5.8378 216 0.5830 0.4909 0.5830 0.7636
No log 5.8919 218 0.5226 0.5794 0.5226 0.7229
No log 5.9459 220 0.4739 0.6210 0.4739 0.6884
No log 6.0 222 0.4974 0.5956 0.4974 0.7053
No log 6.0541 224 0.5892 0.5116 0.5892 0.7676
No log 6.1081 226 0.6235 0.5116 0.6235 0.7896
No log 6.1622 228 0.6574 0.5131 0.6574 0.8108
No log 6.2162 230 0.7315 0.4683 0.7315 0.8553
No log 6.2703 232 0.5902 0.5101 0.5902 0.7682
No log 6.3243 234 0.5172 0.5655 0.5172 0.7192
No log 6.3784 236 0.5206 0.5655 0.5206 0.7215
No log 6.4324 238 0.5374 0.5383 0.5374 0.7331
No log 6.4865 240 0.5946 0.4957 0.5946 0.7711
No log 6.5405 242 0.6325 0.5152 0.6325 0.7953
No log 6.5946 244 0.5916 0.4957 0.5916 0.7691
No log 6.6486 246 0.5370 0.6298 0.5370 0.7328
No log 6.7027 248 0.5498 0.6028 0.5498 0.7415
No log 6.7568 250 0.5435 0.5442 0.5435 0.7372
No log 6.8108 252 0.4899 0.6240 0.4899 0.7000
No log 6.8649 254 0.4682 0.6351 0.4682 0.6843
No log 6.9189 256 0.4803 0.6701 0.4803 0.6930
No log 6.9730 258 0.4973 0.5779 0.4973 0.7052
No log 7.0270 260 0.5029 0.5779 0.5029 0.7092
No log 7.0811 262 0.5486 0.4851 0.5486 0.7407
No log 7.1351 264 0.5349 0.5455 0.5349 0.7314
No log 7.1892 266 0.4916 0.5960 0.4916 0.7012
No log 7.2432 268 0.4769 0.6242 0.4769 0.6906
No log 7.2973 270 0.4780 0.6243 0.4780 0.6914
No log 7.3514 272 0.5320 0.5352 0.5320 0.7294
No log 7.4054 274 0.5359 0.5352 0.5359 0.7320
No log 7.4595 276 0.4975 0.5902 0.4975 0.7054
No log 7.5135 278 0.5265 0.5437 0.5265 0.7256
No log 7.5676 280 0.5312 0.5437 0.5312 0.7288
No log 7.6216 282 0.5215 0.5195 0.5215 0.7221
No log 7.6757 284 0.5215 0.5213 0.5215 0.7222
No log 7.7297 286 0.5285 0.5479 0.5285 0.7270
No log 7.7838 288 0.6330 0.4009 0.6330 0.7956
No log 7.8378 290 0.6862 0.3827 0.6862 0.8284
No log 7.8919 292 0.6269 0.3754 0.6269 0.7918
No log 7.9459 294 0.5536 0.5675 0.5536 0.7441
No log 8.0 296 0.5445 0.4795 0.5445 0.7379
No log 8.0541 298 0.5613 0.4753 0.5613 0.7492
No log 8.1081 300 0.5338 0.4681 0.5338 0.7306
No log 8.1622 302 0.5444 0.5591 0.5444 0.7378
No log 8.2162 304 0.6010 0.4931 0.6010 0.7752
No log 8.2703 306 0.6489 0.4747 0.6489 0.8055
No log 8.3243 308 0.6271 0.4747 0.6271 0.7919
No log 8.3784 310 0.6107 0.4690 0.6107 0.7815
No log 8.4324 312 0.5352 0.5283 0.5352 0.7316
No log 8.4865 314 0.5221 0.5731 0.5221 0.7225
No log 8.5405 316 0.5337 0.5510 0.5337 0.7305
No log 8.5946 318 0.5839 0.4589 0.5839 0.7641
No log 8.6486 320 0.6330 0.2559 0.6330 0.7956
No log 8.7027 322 0.6030 0.3661 0.6030 0.7765
No log 8.7568 324 0.5487 0.5283 0.5487 0.7408
No log 8.8108 326 0.5384 0.5283 0.5384 0.7338
No log 8.8649 328 0.5071 0.5248 0.5071 0.7121
No log 8.9189 330 0.5037 0.5890 0.5037 0.7097
No log 8.9730 332 0.5192 0.5110 0.5192 0.7206
No log 9.0270 334 0.5213 0.5110 0.5213 0.7220
No log 9.0811 336 0.4975 0.5463 0.4975 0.7053
No log 9.1351 338 0.4985 0.5765 0.4985 0.7060
No log 9.1892 340 0.4957 0.5321 0.4957 0.7040
No log 9.2432 342 0.5028 0.5521 0.5028 0.7091
No log 9.2973 344 0.5058 0.5422 0.5058 0.7112
No log 9.3514 346 0.4890 0.5890 0.4890 0.6993
No log 9.4054 348 0.5240 0.4997 0.5240 0.7239
No log 9.4595 350 0.6008 0.5205 0.6008 0.7751
No log 9.5135 352 0.5903 0.5077 0.5903 0.7683
No log 9.5676 354 0.5736 0.5131 0.5736 0.7574
No log 9.6216 356 0.5971 0.5131 0.5971 0.7727
No log 9.6757 358 0.5874 0.5131 0.5874 0.7664
No log 9.7297 360 0.5200 0.5726 0.5200 0.7211
No log 9.7838 362 0.4724 0.6900 0.4724 0.6873
No log 9.8378 364 0.4657 0.6914 0.4657 0.6824
No log 9.8919 366 0.4857 0.5849 0.4857 0.6969
No log 9.9459 368 0.5658 0.5077 0.5658 0.7522
No log 10.0 370 0.7004 0.4799 0.7004 0.8369
No log 10.0541 372 0.7230 0.4799 0.7230 0.8503
No log 10.1081 374 0.6398 0.4906 0.6398 0.7999
No log 10.1622 376 0.5610 0.5205 0.5610 0.7490
No log 10.2162 378 0.5485 0.5061 0.5485 0.7406
No log 10.2703 380 0.5190 0.5513 0.5190 0.7204
No log 10.3243 382 0.5057 0.5283 0.5057 0.7111
No log 10.3784 384 0.5012 0.5283 0.5012 0.7080
No log 10.4324 386 0.5359 0.4694 0.5359 0.7320
No log 10.4865 388 0.5845 0.5112 0.5845 0.7646
No log 10.5405 390 0.6362 0.4396 0.6362 0.7976
No log 10.5946 392 0.6614 0.4396 0.6614 0.8133
No log 10.6486 394 0.6030 0.5112 0.6030 0.7765
No log 10.7027 396 0.5268 0.5065 0.5268 0.7258
No log 10.7568 398 0.5082 0.5189 0.5082 0.7129
No log 10.8108 400 0.5225 0.5076 0.5225 0.7229
No log 10.8649 402 0.5229 0.5658 0.5229 0.7231
No log 10.9189 404 0.5385 0.4913 0.5385 0.7338
No log 10.9730 406 0.5455 0.5061 0.5455 0.7386
No log 11.0270 408 0.5348 0.5127 0.5348 0.7313
No log 11.0811 410 0.5413 0.4850 0.5413 0.7357
No log 11.1351 412 0.5269 0.4850 0.5269 0.7258
No log 11.1892 414 0.5179 0.5283 0.5179 0.7197
No log 11.2432 416 0.5124 0.5432 0.5124 0.7158
No log 11.2973 418 0.5102 0.5356 0.5102 0.7143
No log 11.3514 420 0.5458 0.4411 0.5458 0.7388
No log 11.4054 422 0.6420 0.4811 0.6420 0.8013
No log 11.4595 424 0.6587 0.4615 0.6587 0.8116
No log 11.5135 426 0.6570 0.4811 0.6570 0.8106
No log 11.5676 428 0.6021 0.4482 0.6021 0.7759
No log 11.6216 430 0.5261 0.4633 0.5261 0.7254
No log 11.6757 432 0.5056 0.5815 0.5056 0.7111
No log 11.7297 434 0.5039 0.5902 0.5039 0.7099
No log 11.7838 436 0.5162 0.5283 0.5162 0.7185
No log 11.8378 438 0.5506 0.4411 0.5506 0.7420
No log 11.8919 440 0.5518 0.4411 0.5518 0.7428
No log 11.9459 442 0.5569 0.4411 0.5569 0.7462
No log 12.0 444 0.5217 0.5432 0.5217 0.7223
No log 12.0541 446 0.5100 0.5649 0.5100 0.7142
No log 12.1081 448 0.5067 0.5649 0.5067 0.7118
No log 12.1622 450 0.5011 0.5979 0.5011 0.7079
No log 12.2162 452 0.4918 0.5750 0.4918 0.7013
No log 12.2703 454 0.4876 0.5649 0.4876 0.6983
No log 12.3243 456 0.4933 0.5432 0.4933 0.7023
No log 12.3784 458 0.4867 0.5432 0.4867 0.6976
No log 12.4324 460 0.4781 0.5714 0.4781 0.6914
No log 12.4865 462 0.4813 0.5550 0.4813 0.6938
No log 12.5405 464 0.4855 0.5765 0.4855 0.6968
No log 12.5946 466 0.5342 0.4886 0.5342 0.7309
No log 12.6486 468 0.5888 0.4613 0.5888 0.7673
No log 12.7027 470 0.6179 0.5003 0.6179 0.7861
No log 12.7568 472 0.5956 0.5200 0.5956 0.7718
No log 12.8108 474 0.5223 0.5152 0.5223 0.7227
No log 12.8649 476 0.4694 0.5507 0.4694 0.6851
No log 12.9189 478 0.4778 0.6326 0.4778 0.6912
No log 12.9730 480 0.4694 0.6326 0.4694 0.6851
No log 13.0270 482 0.4591 0.6542 0.4591 0.6776
No log 13.0811 484 0.4656 0.5571 0.4656 0.6824
No log 13.1351 486 0.5011 0.5603 0.5011 0.7079
No log 13.1892 488 0.5092 0.5748 0.5092 0.7136
No log 13.2432 490 0.5502 0.5331 0.5502 0.7417
No log 13.2973 492 0.5735 0.5265 0.5735 0.7573
No log 13.3514 494 0.5980 0.5281 0.5980 0.7733
No log 13.4054 496 0.5493 0.5112 0.5493 0.7412
No log 13.4595 498 0.4908 0.5702 0.4908 0.7006
0.3193 13.5135 500 0.4888 0.5405 0.4888 0.6991
0.3193 13.5676 502 0.5048 0.4364 0.5048 0.7105
0.3193 13.6216 504 0.5125 0.4124 0.5125 0.7159
0.3193 13.6757 506 0.5151 0.4704 0.5151 0.7177
0.3193 13.7297 508 0.5124 0.4968 0.5124 0.7158
0.3193 13.7838 510 0.5135 0.4958 0.5135 0.7166
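Note that the final checkpoint (Qwk 0.4958 at step 510) is not the run's best by Qwk; the table peaks at 0.6914 around epoch 9.84. A small sketch of selecting the best evaluation step, using a few rows copied from the table:

```python
# (epoch, step, qwk) triples copied from three rows of the table above.
rows = [
    (9.7838, 362, 0.6900),
    (9.8378, 364, 0.6914),
    (13.7838, 510, 0.4958),
]
# Pick the evaluation step with the highest Qwk.
best_epoch, best_step, best_qwk = max(rows, key=lambda r: r[2])
```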

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
The checkpoint is stored in safetensors format (about 0.1B parameters, F32 tensors).
Model tree for MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run2_AugV5_k7_task7_organization

Finetuned from aubmindlab/bert-base-arabertv02