ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run2_AugV5_k6_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4968
  • Qwk: 0.5283
  • Mse: 0.4968
  • Rmse: 0.7048
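The reported Loss equals the Mse, suggesting the model is trained with a mean-squared-error objective and evaluated with quadratic weighted kappa (Qwk) for agreement on ordinal scores. As a minimal sketch (the toy labels below are hypothetical, not from this model's data), these metrics can be computed in pure Python:

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights over integer class labels."""
    # Observed confusion matrix
    O = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    # Quadratic disagreement weights: (i - j)^2 scaled to [0, 1]
    W = [[(i - j) ** 2 / (n_classes - 1) ** 2 for j in range(n_classes)]
         for i in range(n_classes)]
    # Expected matrix under independence (outer product of marginals / n)
    n = len(y_true)
    hist_t = [sum(row) for row in O]
    hist_p = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]
    E = [[hist_t[i] * hist_p[j] / n for j in range(n_classes)]
         for i in range(n_classes)]
    num = sum(W[i][j] * O[i][j] for i in range(n_classes) for j in range(n_classes))
    den = sum(W[i][j] * E[i][j] for i in range(n_classes) for j in range(n_classes))
    return 1.0 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# Hypothetical toy labels for illustration only
y_true = [0, 1, 2, 2, 1]
y_pred = [0, 1, 1, 2, 1]
qwk = quadratic_weighted_kappa(y_true, y_pred, n_classes=3)  # 0.8
m = mse(y_true, y_pred)                                      # 0.2
rmse = math.sqrt(m)
```

In practice `sklearn.metrics.cohen_kappa_score(y_true, y_pred, weights="quadratic")` and `mean_squared_error` give the same values; the hand-rolled version above just makes the definitions explicit.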

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
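As a sketch, these hyperparameters map directly onto `transformers.TrainingArguments` (the `output_dir` below is a placeholder, not the actual training path):

```python
from transformers import TrainingArguments

# A minimal reconstruction of the listed hyperparameters;
# output_dir is hypothetical.
training_args = TrainingArguments(
    output_dir="arabert-task7-organization",
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",  # LR decays linearly to 0 over training
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```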

Training results

Training loss was logged every 500 steps, so rows before step 500 show "No log" in that column.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0645 2 2.6453 -0.1213 2.6453 1.6264
No log 0.1290 4 1.3149 0.0745 1.3149 1.1467
No log 0.1935 6 0.9635 -0.1523 0.9635 0.9816
No log 0.2581 8 1.0538 0.0863 1.0538 1.0265
No log 0.3226 10 0.8104 0.1914 0.8104 0.9002
No log 0.3871 12 0.7235 0.2708 0.7235 0.8506
No log 0.4516 14 0.9186 0.2784 0.9186 0.9584
No log 0.5161 16 0.7134 0.2036 0.7134 0.8446
No log 0.5806 18 0.7205 0.2036 0.7205 0.8488
No log 0.6452 20 0.7412 0.2992 0.7412 0.8609
No log 0.7097 22 0.6635 0.2903 0.6635 0.8146
No log 0.7742 24 0.7356 0.2825 0.7356 0.8576
No log 0.8387 26 1.1203 0.2679 1.1203 1.0584
No log 0.9032 28 1.0421 0.2996 1.0421 1.0209
No log 0.9677 30 0.7833 0.3978 0.7833 0.8850
No log 1.0323 32 0.8048 0.4208 0.8048 0.8971
No log 1.0968 34 1.0642 0.2701 1.0642 1.0316
No log 1.1613 36 1.0606 0.2701 1.0606 1.0298
No log 1.2258 38 0.8389 0.3760 0.8389 0.9159
No log 1.2903 40 0.6636 0.4301 0.6636 0.8146
No log 1.3548 42 0.5516 0.4005 0.5516 0.7427
No log 1.4194 44 0.5212 0.3530 0.5212 0.7220
No log 1.4839 46 0.5805 0.4695 0.5805 0.7619
No log 1.5484 48 1.0091 0.2477 1.0091 1.0046
No log 1.6129 50 1.1341 0.1694 1.1341 1.0649
No log 1.6774 52 0.8695 0.3679 0.8695 0.9325
No log 1.7419 54 0.5206 0.4724 0.5206 0.7215
No log 1.8065 56 0.6353 0.5027 0.6353 0.7971
No log 1.8710 58 0.7724 0.3998 0.7724 0.8788
No log 1.9355 60 0.6294 0.4491 0.6294 0.7934
No log 2.0 62 0.5446 0.3039 0.5446 0.7380
No log 2.0645 64 0.6065 0.4562 0.6065 0.7788
No log 2.1290 66 0.6053 0.4695 0.6053 0.7780
No log 2.1935 68 0.5364 0.5152 0.5364 0.7324
No log 2.2581 70 0.5457 0.4984 0.5457 0.7387
No log 2.3226 72 0.5548 0.5232 0.5548 0.7449
No log 2.3871 74 0.5703 0.5074 0.5703 0.7552
No log 2.4516 76 0.8040 0.3527 0.8040 0.8967
No log 2.5161 78 0.8144 0.3973 0.8144 0.9025
No log 2.5806 80 0.7965 0.3973 0.7965 0.8925
No log 2.6452 82 0.6464 0.3166 0.6464 0.8040
No log 2.7097 84 0.5287 0.4829 0.5287 0.7271
No log 2.7742 86 0.5065 0.4575 0.5065 0.7117
No log 2.8387 88 0.5160 0.5117 0.5160 0.7184
No log 2.9032 90 0.5464 0.5907 0.5464 0.7392
No log 2.9677 92 0.5125 0.4839 0.5125 0.7159
No log 3.0323 94 0.5228 0.4438 0.5228 0.7230
No log 3.0968 96 0.6003 0.5603 0.6003 0.7748
No log 3.1613 98 0.6688 0.5470 0.6688 0.8178
No log 3.2258 100 0.5451 0.4635 0.5451 0.7383
No log 3.2903 102 0.5887 0.5543 0.5887 0.7672
No log 3.3548 104 0.5578 0.5357 0.5578 0.7469
No log 3.4194 106 0.5486 0.5266 0.5486 0.7407
No log 3.4839 108 0.7052 0.5232 0.7052 0.8398
No log 3.5484 110 0.6479 0.5363 0.6479 0.8049
No log 3.6129 112 0.6242 0.5543 0.6242 0.7900
No log 3.6774 114 0.5008 0.5702 0.5008 0.7077
No log 3.7419 116 0.4958 0.5723 0.4958 0.7041
No log 3.8065 118 0.4900 0.5723 0.4900 0.7000
No log 3.8710 120 0.4875 0.5904 0.4875 0.6982
No log 3.9355 122 0.5509 0.5603 0.5509 0.7422
No log 4.0 124 0.6305 0.5252 0.6305 0.7941
No log 4.0645 126 0.5647 0.5603 0.5647 0.7514
No log 4.1290 128 0.5367 0.5669 0.5367 0.7326
No log 4.1935 130 0.5084 0.6431 0.5084 0.7131
No log 4.2581 132 0.5069 0.5904 0.5069 0.7120
No log 4.3226 134 0.5622 0.5486 0.5622 0.7498
No log 4.3871 136 0.5339 0.5918 0.5339 0.7307
No log 4.4516 138 0.5153 0.6114 0.5153 0.7179
No log 4.5161 140 0.4844 0.5596 0.4844 0.6960
No log 4.5806 142 0.5064 0.6114 0.5064 0.7116
No log 4.6452 144 0.5109 0.5596 0.5109 0.7148
No log 4.7097 146 0.5266 0.5574 0.5266 0.7257
No log 4.7742 148 0.5459 0.5745 0.5459 0.7388
No log 4.8387 150 0.5488 0.5084 0.5488 0.7408
No log 4.9032 152 0.5620 0.5234 0.5620 0.7497
No log 4.9677 154 0.5773 0.4858 0.5773 0.7598
No log 5.0323 156 0.5413 0.5596 0.5413 0.7357
No log 5.0968 158 0.5262 0.5517 0.5262 0.7254
No log 5.1613 160 0.4968 0.5625 0.4968 0.7049
No log 5.2258 162 0.4967 0.6678 0.4967 0.7048
No log 5.2903 164 0.6427 0.5077 0.6427 0.8017
No log 5.3548 166 0.6656 0.4852 0.6656 0.8158
No log 5.4194 168 0.5608 0.4892 0.5608 0.7489
No log 5.4839 170 0.5137 0.5222 0.5137 0.7167
No log 5.5484 172 0.5345 0.5488 0.5345 0.7311
No log 5.6129 174 0.5441 0.5210 0.5441 0.7376
No log 5.6774 176 0.7109 0.5195 0.7109 0.8431
No log 5.7419 178 0.7286 0.5240 0.7286 0.8536
No log 5.8065 180 0.7453 0.5240 0.7453 0.8633
No log 5.8710 182 0.6859 0.5355 0.6859 0.8282
No log 5.9355 184 0.5831 0.5239 0.5831 0.7636
No log 6.0 186 0.5749 0.5994 0.5749 0.7582
No log 6.0645 188 0.5846 0.5392 0.5846 0.7646
No log 6.1290 190 0.6317 0.5028 0.6317 0.7948
No log 6.1935 192 0.5881 0.5058 0.5881 0.7669
No log 6.2581 194 0.5676 0.5039 0.5676 0.7534
No log 6.3226 196 0.5651 0.4866 0.5651 0.7517
No log 6.3871 198 0.5597 0.5042 0.5597 0.7481
No log 6.4516 200 0.5911 0.5356 0.5911 0.7688
No log 6.5161 202 0.6567 0.4594 0.6567 0.8104
No log 6.5806 204 0.6192 0.5368 0.6192 0.7869
No log 6.6452 206 0.5751 0.5335 0.5751 0.7584
No log 6.7097 208 0.5602 0.4904 0.5602 0.7485
No log 6.7742 210 0.5449 0.5009 0.5449 0.7382
No log 6.8387 212 0.5779 0.5283 0.5779 0.7602
No log 6.9032 214 0.6415 0.5058 0.6415 0.8009
No log 6.9677 216 0.6030 0.4875 0.6030 0.7765
No log 7.0323 218 0.5676 0.5483 0.5676 0.7534
No log 7.0968 220 0.6969 0.4784 0.6969 0.8348
No log 7.1613 222 0.9819 0.3092 0.9819 0.9909
No log 7.2258 224 0.9501 0.2903 0.9501 0.9747
No log 7.2903 226 0.8591 0.4383 0.8591 0.9269
No log 7.3548 228 0.6072 0.5106 0.6072 0.7792
No log 7.4194 230 0.5426 0.5335 0.5426 0.7366
No log 7.4839 232 0.5663 0.5061 0.5663 0.7526
No log 7.5484 234 0.5973 0.5090 0.5973 0.7728
No log 7.6129 236 0.7318 0.4836 0.7318 0.8555
No log 7.6774 238 0.7249 0.4768 0.7249 0.8514
No log 7.7419 240 0.6004 0.5144 0.6004 0.7748
No log 7.8065 242 0.5592 0.5692 0.5592 0.7478
No log 7.8710 244 0.5545 0.5476 0.5545 0.7446
No log 7.9355 246 0.5539 0.5213 0.5539 0.7442
No log 8.0 248 0.5754 0.4980 0.5754 0.7586
No log 8.0645 250 0.6357 0.4875 0.6357 0.7973
No log 8.1290 252 0.5986 0.4832 0.5986 0.7737
No log 8.1935 254 0.5818 0.5321 0.5818 0.7628
No log 8.2581 256 0.5893 0.4655 0.5893 0.7676
No log 8.3226 258 0.6466 0.4997 0.6466 0.8041
No log 8.3871 260 0.6774 0.4672 0.6774 0.8231
No log 8.4516 262 0.7532 0.4362 0.7532 0.8678
No log 8.5161 264 0.7618 0.4039 0.7618 0.8728
No log 8.5806 266 0.6438 0.4672 0.6438 0.8024
No log 8.6452 268 0.5634 0.4762 0.5634 0.7506
No log 8.7097 270 0.5606 0.5250 0.5606 0.7487
No log 8.7742 272 0.5353 0.4829 0.5353 0.7317
No log 8.8387 274 0.5476 0.4895 0.5476 0.7400
No log 8.9032 276 0.6681 0.4527 0.6681 0.8174
No log 8.9677 278 0.7700 0.4601 0.7700 0.8775
No log 9.0323 280 0.7203 0.4890 0.7203 0.8487
No log 9.0968 282 0.5978 0.4892 0.5978 0.7732
No log 9.1613 284 0.5516 0.5074 0.5516 0.7427
No log 9.2258 286 0.5478 0.5136 0.5478 0.7401
No log 9.2903 288 0.5784 0.4952 0.5784 0.7605
No log 9.3548 290 0.5733 0.5014 0.5733 0.7571
No log 9.4194 292 0.5810 0.5014 0.5810 0.7622
No log 9.4839 294 0.5528 0.4770 0.5528 0.7435
No log 9.5484 296 0.5461 0.4849 0.5461 0.7390
No log 9.6129 298 0.5413 0.4898 0.5413 0.7358
No log 9.6774 300 0.5457 0.4898 0.5457 0.7387
No log 9.7419 302 0.5662 0.4406 0.5662 0.7525
No log 9.8065 304 0.5621 0.5152 0.5621 0.7497
No log 9.8710 306 0.5815 0.5084 0.5815 0.7625
No log 9.9355 308 0.5839 0.5379 0.5839 0.7641
No log 10.0 310 0.5916 0.4876 0.5916 0.7691
No log 10.0645 312 0.5996 0.4832 0.5996 0.7744
No log 10.1290 314 0.5857 0.4895 0.5857 0.7653
No log 10.1935 316 0.5562 0.5213 0.5562 0.7458
No log 10.2581 318 0.5542 0.4700 0.5542 0.7445
No log 10.3226 320 0.5515 0.5039 0.5515 0.7426
No log 10.3871 322 0.5583 0.5722 0.5583 0.7472
No log 10.4516 324 0.5908 0.5127 0.5908 0.7687
No log 10.5161 326 0.6104 0.4575 0.6104 0.7813
No log 10.5806 328 0.5673 0.5195 0.5673 0.7532
No log 10.6452 330 0.5449 0.5687 0.5449 0.7382
No log 10.7097 332 0.5400 0.5915 0.5400 0.7349
No log 10.7742 334 0.5347 0.5768 0.5347 0.7312
No log 10.8387 336 0.5239 0.5611 0.5239 0.7238
No log 10.9032 338 0.5266 0.5538 0.5266 0.7257
No log 10.9677 340 0.5469 0.5061 0.5469 0.7395
No log 11.0323 342 0.5479 0.5114 0.5479 0.7402
No log 11.0968 344 0.5748 0.5254 0.5748 0.7582
No log 11.1613 346 0.5771 0.5254 0.5771 0.7597
No log 11.2258 348 0.5383 0.5697 0.5383 0.7337
No log 11.2903 350 0.5327 0.6092 0.5327 0.7299
No log 11.3548 352 0.5299 0.5902 0.5299 0.7279
No log 11.4194 354 0.5302 0.5707 0.5302 0.7282
No log 11.4839 356 0.5686 0.5070 0.5686 0.7541
No log 11.5484 358 0.6146 0.4652 0.6146 0.7839
No log 11.6129 360 0.5798 0.5152 0.5798 0.7615
No log 11.6774 362 0.5514 0.4841 0.5514 0.7425
No log 11.7419 364 0.5596 0.5450 0.5596 0.7481
No log 11.8065 366 0.6013 0.5142 0.6013 0.7754
No log 11.8710 368 0.6382 0.4488 0.6382 0.7989
No log 11.9355 370 0.6206 0.4847 0.6206 0.7878
No log 12.0 372 0.5956 0.5037 0.5956 0.7718
No log 12.0645 374 0.6333 0.4353 0.6333 0.7958
No log 12.1290 376 0.6408 0.4353 0.6408 0.8005
No log 12.1935 378 0.6040 0.4615 0.6040 0.7772
No log 12.2581 380 0.5985 0.3915 0.5985 0.7736
No log 12.3226 382 0.6456 0.4413 0.6456 0.8035
No log 12.3871 384 0.6738 0.4114 0.6738 0.8209
No log 12.4516 386 0.6312 0.4397 0.6312 0.7945
No log 12.5161 388 0.5903 0.3915 0.5903 0.7683
No log 12.5806 390 0.6136 0.4635 0.6136 0.7833
No log 12.6452 392 0.6256 0.5110 0.6256 0.7910
No log 12.7097 394 0.6139 0.5110 0.6139 0.7835
No log 12.7742 396 0.6090 0.5368 0.6090 0.7804
No log 12.8387 398 0.6072 0.4941 0.6072 0.7792
No log 12.9032 400 0.5860 0.5152 0.5860 0.7655
No log 12.9677 402 0.5834 0.5028 0.5834 0.7638
No log 13.0323 404 0.6111 0.4832 0.6111 0.7817
No log 13.0968 406 0.5999 0.4832 0.5999 0.7745
No log 13.1613 408 0.5761 0.4527 0.5761 0.7590
No log 13.2258 410 0.5746 0.4656 0.5746 0.7580
No log 13.2903 412 0.5827 0.4527 0.5827 0.7633
No log 13.3548 414 0.6332 0.4770 0.6332 0.7958
No log 13.4194 416 0.6709 0.3988 0.6709 0.8191
No log 13.4839 418 0.6633 0.4241 0.6633 0.8144
No log 13.5484 420 0.6175 0.4517 0.6175 0.7858
No log 13.6129 422 0.5926 0.5133 0.5926 0.7698
No log 13.6774 424 0.5815 0.5133 0.5815 0.7626
No log 13.7419 426 0.5746 0.5133 0.5746 0.7580
No log 13.8065 428 0.5738 0.4527 0.5738 0.7575
No log 13.8710 430 0.5883 0.4480 0.5883 0.7670
No log 13.9355 432 0.6276 0.4711 0.6276 0.7922
No log 14.0 434 0.6067 0.4711 0.6067 0.7789
No log 14.0645 436 0.5552 0.4918 0.5552 0.7451
No log 14.1290 438 0.5426 0.4857 0.5426 0.7366
No log 14.1935 440 0.5628 0.4740 0.5628 0.7502
No log 14.2581 442 0.5418 0.5114 0.5418 0.7361
No log 14.3226 444 0.5291 0.4656 0.5291 0.7274
No log 14.3871 446 0.6117 0.4949 0.6117 0.7821
No log 14.4516 448 0.7146 0.4615 0.7146 0.8453
No log 14.5161 450 0.6954 0.4811 0.6954 0.8339
No log 14.5806 452 0.5945 0.5078 0.5945 0.7711
No log 14.6452 454 0.5558 0.5563 0.5558 0.7455
No log 14.7097 456 0.5610 0.4937 0.5610 0.7490
No log 14.7742 458 0.5511 0.4575 0.5511 0.7424
No log 14.8387 460 0.5453 0.5213 0.5453 0.7384
No log 14.9032 462 0.5416 0.4918 0.5416 0.7359
No log 14.9677 464 0.5437 0.4918 0.5437 0.7373
No log 15.0323 466 0.5604 0.5339 0.5604 0.7486
No log 15.0968 468 0.5661 0.5339 0.5661 0.7524
No log 15.1613 470 0.5592 0.5339 0.5592 0.7478
No log 15.2258 472 0.5587 0.5231 0.5587 0.7474
No log 15.2903 474 0.5681 0.5390 0.5681 0.7537
No log 15.3548 476 0.5712 0.5250 0.5712 0.7558
No log 15.4194 478 0.5965 0.4499 0.5965 0.7723
No log 15.4839 480 0.6035 0.4499 0.6035 0.7768
No log 15.5484 482 0.5762 0.4556 0.5762 0.7591
No log 15.6129 484 0.5458 0.5414 0.5458 0.7388
No log 15.6774 486 0.5368 0.4918 0.5368 0.7327
No log 15.7419 488 0.5314 0.4918 0.5314 0.7290
No log 15.8065 490 0.5335 0.4991 0.5335 0.7304
No log 15.8710 492 0.5388 0.5463 0.5388 0.7340
No log 15.9355 494 0.5277 0.5446 0.5277 0.7264
No log 16.0 496 0.5208 0.5213 0.5208 0.7217
No log 16.0645 498 0.5165 0.5812 0.5165 0.7187
0.2803 16.1290 500 0.5316 0.5736 0.5316 0.7291
0.2803 16.1935 502 0.5625 0.5524 0.5625 0.7500
0.2803 16.2581 504 0.5927 0.5471 0.5927 0.7698
0.2803 16.3226 506 0.5694 0.5332 0.5694 0.7546
0.2803 16.3871 508 0.5218 0.5868 0.5218 0.7224
0.2803 16.4516 510 0.4987 0.5929 0.4987 0.7062
0.2803 16.5161 512 0.4903 0.5929 0.4903 0.7002
0.2803 16.5806 514 0.5066 0.5841 0.5066 0.7118
0.2803 16.6452 516 0.5312 0.5438 0.5312 0.7288
0.2803 16.7097 518 0.5476 0.5438 0.5476 0.7400
0.2803 16.7742 520 0.5056 0.5642 0.5056 0.7111
0.2803 16.8387 522 0.5092 0.5642 0.5092 0.7136
0.2803 16.9032 524 0.5089 0.5841 0.5089 0.7133
0.2803 16.9677 526 0.4852 0.6503 0.4852 0.6966
0.2803 17.0323 528 0.4851 0.6298 0.4851 0.6965
0.2803 17.0968 530 0.4860 0.6492 0.4860 0.6971
0.2803 17.1613 532 0.5073 0.5642 0.5073 0.7123
0.2803 17.2258 534 0.5733 0.5299 0.5733 0.7571
0.2803 17.2903 536 0.6068 0.5031 0.6068 0.7790
0.2803 17.3548 538 0.5650 0.5299 0.5650 0.7517
0.2803 17.4194 540 0.4968 0.5283 0.4968 0.7048

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1