ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run2_AugV5_k6_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6793
  • Qwk: 0.5368
  • Mse: 0.6793
  • Rmse: 0.8242
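These metrics are consistent with each other: RMSE is the square root of MSE (sqrt(0.6793) ≈ 0.8242). The card does not include the evaluation script, but a minimal sketch of how such metrics are typically computed for an integer-scored essay task (using scikit-learn; the rounding step is an assumption) is:

```python
# Sketch only: assumed metric computation, not the author's actual script.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def evaluate(y_true, y_pred):
    # QWK (Quadratic Weighted Kappa) compares integer scores with
    # quadratically weighted disagreement penalties.
    qwk = cohen_kappa_score(
        np.rint(y_true).astype(int),
        np.rint(y_pred).astype(int),
        weights="quadratic",
    )
    # MSE / RMSE are computed on the raw predictions.
    mse = mean_squared_error(y_true, y_pred)
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}
```

Note that with this setup Mse and the validation loss coincide, which matches the table below (the model is presumably trained with an MSE objective on a single regression head).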

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
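The hyperparameters above map onto Hugging Face `TrainingArguments` roughly as follows. This is a sketch, not the author's configuration: the output directory is a placeholder, and anything not in the list above is left at its default.

```python
from transformers import TrainingArguments

# Sketch: the card's listed hyperparameters expressed as TrainingArguments.
args = TrainingArguments(
    output_dir="arabert-task5-organization",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```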

Training results

Training loss is only logged every 500 steps (the Trainer default), so rows before step 500 show "No log" in the Training Loss column.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0952 2 4.5712 -0.0445 4.5712 2.1380
No log 0.1905 4 2.5380 -0.0206 2.5380 1.5931
No log 0.2857 6 1.9607 -0.0055 1.9607 1.4003
No log 0.3810 8 1.1118 0.2196 1.1118 1.0544
No log 0.4762 10 1.1568 0.2196 1.1568 1.0755
No log 0.5714 12 1.1649 0.1268 1.1649 1.0793
No log 0.6667 14 1.2003 0.1205 1.2003 1.0956
No log 0.7619 16 1.3377 -0.0560 1.3377 1.1566
No log 0.8571 18 1.5157 -0.0148 1.5157 1.2311
No log 0.9524 20 1.3080 0.0232 1.3080 1.1437
No log 1.0476 22 1.0724 0.2171 1.0724 1.0356
No log 1.1429 24 1.0225 0.3435 1.0225 1.0112
No log 1.2381 26 1.0527 0.1779 1.0527 1.0260
No log 1.3333 28 1.3476 0.0436 1.3476 1.1609
No log 1.4286 30 1.6078 -0.0359 1.6078 1.2680
No log 1.5238 32 1.7108 0.0408 1.7108 1.3080
No log 1.6190 34 1.3471 0.0701 1.3471 1.1607
No log 1.7143 36 1.0244 0.2449 1.0244 1.0121
No log 1.8095 38 1.0658 0.2505 1.0658 1.0324
No log 1.9048 40 1.0232 0.2449 1.0232 1.0115
No log 2.0 42 1.1340 0.1296 1.1340 1.0649
No log 2.0952 44 1.4590 0.0380 1.4590 1.2079
No log 2.1905 46 1.4488 0.0760 1.4488 1.2037
No log 2.2857 48 1.1578 0.1296 1.1578 1.0760
No log 2.3810 50 1.0181 0.2351 1.0181 1.0090
No log 2.4762 52 1.0385 0.3067 1.0385 1.0191
No log 2.5714 54 1.0491 0.3045 1.0491 1.0243
No log 2.6667 56 1.0156 0.3958 1.0156 1.0077
No log 2.7619 58 1.0757 0.3196 1.0757 1.0372
No log 2.8571 60 1.0430 0.3196 1.0430 1.0213
No log 2.9524 62 0.9450 0.3879 0.9450 0.9721
No log 3.0476 64 0.9179 0.3377 0.9179 0.9581
No log 3.1429 66 0.9253 0.4232 0.9253 0.9619
No log 3.2381 68 1.0054 0.2837 1.0054 1.0027
No log 3.3333 70 0.9698 0.3049 0.9698 0.9848
No log 3.4286 72 0.8844 0.5139 0.8844 0.9404
No log 3.5238 74 1.0781 0.3477 1.0781 1.0383
No log 3.6190 76 1.0306 0.4013 1.0306 1.0152
No log 3.7143 78 0.8825 0.5392 0.8825 0.9394
No log 3.8095 80 0.8436 0.4794 0.8436 0.9185
No log 3.9048 82 0.9172 0.4723 0.9172 0.9577
No log 4.0 84 0.9996 0.4082 0.9996 0.9998
No log 4.0952 86 0.8279 0.5089 0.8279 0.9099
No log 4.1905 88 1.0098 0.3642 1.0098 1.0049
No log 4.2857 90 1.2032 0.3831 1.2032 1.0969
No log 4.3810 92 0.9814 0.3981 0.9814 0.9907
No log 4.4762 94 0.7864 0.4526 0.7864 0.8868
No log 4.5714 96 1.0324 0.4347 1.0324 1.0161
No log 4.6667 98 1.1164 0.3396 1.1164 1.0566
No log 4.7619 100 0.9071 0.4068 0.9071 0.9524
No log 4.8571 102 0.7933 0.5063 0.7933 0.8907
No log 4.9524 104 0.9029 0.4470 0.9029 0.9502
No log 5.0476 106 0.8812 0.4560 0.8812 0.9387
No log 5.1429 108 0.8508 0.4560 0.8508 0.9224
No log 5.2381 110 0.7888 0.4998 0.7888 0.8881
No log 5.3333 112 0.7661 0.4644 0.7661 0.8753
No log 5.4286 114 0.7639 0.4511 0.7639 0.8740
No log 5.5238 116 0.7510 0.4776 0.7510 0.8666
No log 5.6190 118 0.7666 0.5446 0.7666 0.8755
No log 5.7143 120 0.7959 0.5046 0.7959 0.8921
No log 5.8095 122 0.8298 0.4285 0.8298 0.9109
No log 5.9048 124 0.7804 0.5366 0.7804 0.8834
No log 6.0 126 0.7895 0.5135 0.7895 0.8886
No log 6.0952 128 0.8359 0.5110 0.8359 0.9143
No log 6.1905 130 0.6832 0.6032 0.6832 0.8266
No log 6.2857 132 0.6644 0.4722 0.6644 0.8151
No log 6.3810 134 0.7379 0.4671 0.7379 0.8590
No log 6.4762 136 0.7870 0.5018 0.7870 0.8871
No log 6.5714 138 0.6979 0.4737 0.6979 0.8354
No log 6.6667 140 0.6968 0.5710 0.6968 0.8347
No log 6.7619 142 0.7098 0.4660 0.7098 0.8425
No log 6.8571 144 0.7188 0.4660 0.7188 0.8478
No log 6.9524 146 0.7270 0.4644 0.7270 0.8527
No log 7.0476 148 0.7379 0.5112 0.7379 0.8590
No log 7.1429 150 0.7611 0.4982 0.7611 0.8724
No log 7.2381 152 0.8045 0.4734 0.8045 0.8969
No log 7.3333 154 0.8798 0.4910 0.8798 0.9380
No log 7.4286 156 0.8313 0.5018 0.8313 0.9117
No log 7.5238 158 0.7253 0.5866 0.7253 0.8517
No log 7.6190 160 0.7500 0.5350 0.7500 0.8660
No log 7.7143 162 0.8117 0.5190 0.8117 0.9010
No log 7.8095 164 0.7050 0.6432 0.7050 0.8397
No log 7.9048 166 0.7178 0.5803 0.7178 0.8473
No log 8.0 168 0.8213 0.5440 0.8213 0.9063
No log 8.0952 170 0.7211 0.5728 0.7211 0.8492
No log 8.1905 172 0.6739 0.6229 0.6739 0.8209
No log 8.2857 174 0.7821 0.5375 0.7821 0.8843
No log 8.3810 176 0.7514 0.5515 0.7514 0.8669
No log 8.4762 178 0.6754 0.5845 0.6754 0.8218
No log 8.5714 180 0.7114 0.5325 0.7114 0.8434
No log 8.6667 182 0.7073 0.5115 0.7073 0.8410
No log 8.7619 184 0.6499 0.6282 0.6499 0.8062
No log 8.8571 186 0.7586 0.5447 0.7586 0.8710
No log 8.9524 188 0.9381 0.5075 0.9381 0.9685
No log 9.0476 190 0.9187 0.5328 0.9187 0.9585
No log 9.1429 192 0.7271 0.6245 0.7271 0.8527
No log 9.2381 194 0.6878 0.6112 0.6878 0.8293
No log 9.3333 196 0.7028 0.6166 0.7028 0.8383
No log 9.4286 198 0.6806 0.6032 0.6806 0.8250
No log 9.5238 200 0.7884 0.5766 0.7884 0.8879
No log 9.6190 202 0.8689 0.5510 0.8689 0.9321
No log 9.7143 204 0.7787 0.5579 0.7787 0.8824
No log 9.8095 206 0.6961 0.6121 0.6961 0.8343
No log 9.9048 208 0.6613 0.5412 0.6613 0.8132
No log 10.0 210 0.7041 0.5314 0.7041 0.8391
No log 10.0952 212 0.6909 0.5314 0.6909 0.8312
No log 10.1905 214 0.6895 0.5044 0.6895 0.8303
No log 10.2857 216 0.7073 0.5671 0.7073 0.8410
No log 10.3810 218 0.7386 0.5500 0.7386 0.8594
No log 10.4762 220 0.7395 0.5842 0.7395 0.8599
No log 10.5714 222 0.7517 0.5450 0.7517 0.8670
No log 10.6667 224 0.7842 0.5167 0.7842 0.8855
No log 10.7619 226 0.7515 0.4973 0.7515 0.8669
No log 10.8571 228 0.7484 0.4824 0.7484 0.8651
No log 10.9524 230 0.7874 0.5508 0.7874 0.8874
No log 11.0476 232 0.7569 0.5055 0.7569 0.8700
No log 11.1429 234 0.8257 0.4929 0.8257 0.9087
No log 11.2381 236 0.9382 0.4885 0.9382 0.9686
No log 11.3333 238 0.8618 0.4799 0.8618 0.9283
No log 11.4286 240 0.7513 0.5232 0.7513 0.8668
No log 11.5238 242 0.7321 0.4544 0.7321 0.8556
No log 11.6190 244 0.7237 0.5194 0.7237 0.8507
No log 11.7143 246 0.7186 0.5596 0.7186 0.8477
No log 11.8095 248 0.7113 0.5959 0.7113 0.8434
No log 11.9048 250 0.7643 0.5668 0.7643 0.8742
No log 12.0 252 0.7405 0.5977 0.7405 0.8605
No log 12.0952 254 0.6960 0.5985 0.6960 0.8342
No log 12.1905 256 0.7113 0.5300 0.7113 0.8434
No log 12.2857 258 0.7361 0.4860 0.7361 0.8579
No log 12.3810 260 0.7270 0.5406 0.7270 0.8526
No log 12.4762 262 0.7216 0.5146 0.7216 0.8495
No log 12.5714 264 0.7267 0.5680 0.7267 0.8525
No log 12.6667 266 0.7307 0.5016 0.7307 0.8548
No log 12.7619 268 0.7325 0.5152 0.7325 0.8558
No log 12.8571 270 0.7350 0.5155 0.7350 0.8573
No log 12.9524 272 0.7498 0.5570 0.7498 0.8659
No log 13.0476 274 0.7938 0.5697 0.7938 0.8910
No log 13.1429 276 0.7859 0.5279 0.7859 0.8865
No log 13.2381 278 0.7320 0.5774 0.7320 0.8556
No log 13.3333 280 0.6819 0.5261 0.6819 0.8258
No log 13.4286 282 0.7004 0.5548 0.7004 0.8369
No log 13.5238 284 0.7061 0.5548 0.7061 0.8403
No log 13.6190 286 0.6909 0.5524 0.6909 0.8312
No log 13.7143 288 0.6785 0.5063 0.6785 0.8237
No log 13.8095 290 0.6786 0.4938 0.6786 0.8238
No log 13.9048 292 0.6908 0.5060 0.6908 0.8312
No log 14.0 294 0.7504 0.5922 0.7504 0.8662
No log 14.0952 296 0.8281 0.5229 0.8281 0.9100
No log 14.1905 298 0.7722 0.5432 0.7722 0.8787
No log 14.2857 300 0.7146 0.4971 0.7146 0.8454
No log 14.3810 302 0.7448 0.5610 0.7448 0.8630
No log 14.4762 304 0.8543 0.5344 0.8543 0.9243
No log 14.5714 306 0.8623 0.5344 0.8623 0.9286
No log 14.6667 308 0.7674 0.5279 0.7674 0.8760
No log 14.7619 310 0.7121 0.5622 0.7121 0.8439
No log 14.8571 312 0.6690 0.5500 0.6690 0.8179
No log 14.9524 314 0.6599 0.4957 0.6599 0.8124
No log 15.0476 316 0.6609 0.4957 0.6609 0.8129
No log 15.1429 318 0.6608 0.4957 0.6608 0.8129
No log 15.2381 320 0.6624 0.6104 0.6624 0.8139
No log 15.3333 322 0.6901 0.5821 0.6901 0.8307
No log 15.4286 324 0.7039 0.6137 0.7039 0.8390
No log 15.5238 326 0.7328 0.6025 0.7328 0.8561
No log 15.6190 328 0.7102 0.5546 0.7102 0.8427
No log 15.7143 330 0.6853 0.5163 0.6853 0.8279
No log 15.8095 332 0.7165 0.5339 0.7165 0.8465
No log 15.9048 334 0.7110 0.5446 0.7110 0.8432
No log 16.0 336 0.6993 0.5163 0.6993 0.8362
No log 16.0952 338 0.7695 0.4450 0.7695 0.8772
No log 16.1905 340 0.8393 0.5021 0.8393 0.9161
No log 16.2857 342 0.8007 0.5167 0.8007 0.8948
No log 16.3810 344 0.7235 0.5098 0.7235 0.8506
No log 16.4762 346 0.6993 0.5726 0.6993 0.8362
No log 16.5714 348 0.7308 0.5759 0.7308 0.8548
No log 16.6667 350 0.7275 0.5622 0.7275 0.8529
No log 16.7619 352 0.7099 0.5036 0.7099 0.8426
No log 16.8571 354 0.7799 0.5292 0.7799 0.8831
No log 16.9524 356 0.8024 0.5167 0.8024 0.8958
No log 17.0476 358 0.7632 0.4952 0.7632 0.8736
No log 17.1429 360 0.7198 0.5038 0.7198 0.8484
No log 17.2381 362 0.7407 0.5737 0.7407 0.8607
No log 17.3333 364 0.7522 0.5633 0.7522 0.8673
No log 17.4286 366 0.7202 0.5182 0.7202 0.8486
No log 17.5238 368 0.7294 0.5660 0.7294 0.8541
No log 17.6190 370 0.8534 0.5208 0.8534 0.9238
No log 17.7143 372 0.9864 0.4681 0.9864 0.9932
No log 17.8095 374 0.9710 0.4681 0.9710 0.9854
No log 17.9048 376 0.8389 0.4796 0.8389 0.9159
No log 18.0 378 0.7311 0.5752 0.7311 0.8551
No log 18.0952 380 0.7189 0.5169 0.7189 0.8479
No log 18.1905 382 0.7455 0.5076 0.7455 0.8634
No log 18.2857 384 0.7331 0.5076 0.7331 0.8562
No log 18.3810 386 0.7084 0.4927 0.7084 0.8416
No log 18.4762 388 0.7344 0.4743 0.7344 0.8570
No log 18.5714 390 0.7839 0.5305 0.7839 0.8854
No log 18.6667 392 0.8044 0.5056 0.8044 0.8969
No log 18.7619 394 0.7770 0.4361 0.7770 0.8815
No log 18.8571 396 0.7428 0.4527 0.7428 0.8619
No log 18.9524 398 0.7373 0.5199 0.7373 0.8587
No log 19.0476 400 0.7421 0.5434 0.7421 0.8614
No log 19.1429 402 0.7141 0.5076 0.7141 0.8450
No log 19.2381 404 0.6929 0.5773 0.6929 0.8324
No log 19.3333 406 0.7604 0.5766 0.7604 0.8720
No log 19.4286 408 0.8382 0.5211 0.8382 0.9155
No log 19.5238 410 0.7975 0.5628 0.7975 0.8930
No log 19.6190 412 0.7450 0.5549 0.7450 0.8631
No log 19.7143 414 0.6811 0.5929 0.6811 0.8253
No log 19.8095 416 0.6692 0.5671 0.6692 0.8181
No log 19.9048 418 0.6747 0.5057 0.6747 0.8214
No log 20.0 420 0.6774 0.4804 0.6774 0.8231
No log 20.0952 422 0.6826 0.5577 0.6826 0.8262
No log 20.1905 424 0.7112 0.5482 0.7112 0.8433
No log 20.2857 426 0.7722 0.5756 0.7722 0.8787
No log 20.3810 428 0.7604 0.5756 0.7604 0.8720
No log 20.4762 430 0.7172 0.5127 0.7172 0.8468
No log 20.5714 432 0.7103 0.5483 0.7103 0.8428
No log 20.6667 434 0.7210 0.5287 0.7210 0.8491
No log 20.7619 436 0.7072 0.5366 0.7072 0.8409
No log 20.8571 438 0.7122 0.5678 0.7122 0.8439
No log 20.9524 440 0.7511 0.5147 0.7511 0.8667
No log 21.0476 442 0.8032 0.5124 0.8032 0.8962
No log 21.1429 444 0.8290 0.5319 0.8290 0.9105
No log 21.2381 446 0.7784 0.5644 0.7784 0.8823
No log 21.3333 448 0.6875 0.5872 0.6875 0.8292
No log 21.4286 450 0.6568 0.5742 0.6568 0.8105
No log 21.5238 452 0.6833 0.5698 0.6833 0.8266
No log 21.6190 454 0.6707 0.5902 0.6707 0.8190
No log 21.7143 456 0.6551 0.5935 0.6551 0.8094
No log 21.8095 458 0.6949 0.5787 0.6949 0.8336
No log 21.9048 460 0.7435 0.5243 0.7435 0.8622
No log 22.0 462 0.7499 0.5243 0.7499 0.8659
No log 22.0952 464 0.7247 0.5039 0.7247 0.8513
No log 22.1905 466 0.6862 0.5666 0.6862 0.8284
No log 22.2857 468 0.6737 0.5249 0.6737 0.8208
No log 22.3810 470 0.6804 0.5125 0.6804 0.8248
No log 22.4762 472 0.6985 0.5558 0.6985 0.8358
No log 22.5714 474 0.7614 0.5046 0.7614 0.8726
No log 22.6667 476 0.8385 0.4910 0.8385 0.9157
No log 22.7619 478 0.8302 0.5330 0.8302 0.9112
No log 22.8571 480 0.7574 0.5163 0.7574 0.8703
No log 22.9524 482 0.7118 0.4995 0.7118 0.8437
No log 23.0476 484 0.7187 0.5402 0.7187 0.8478
No log 23.1429 486 0.7256 0.4819 0.7256 0.8518
No log 23.2381 488 0.7201 0.5386 0.7201 0.8486
No log 23.3333 490 0.7239 0.5002 0.7239 0.8508
No log 23.4286 492 0.7167 0.5036 0.7167 0.8466
No log 23.5238 494 0.7142 0.5029 0.7142 0.8451
No log 23.6190 496 0.7133 0.5463 0.7133 0.8446
No log 23.7143 498 0.7161 0.5894 0.7161 0.8462
0.2502 23.8095 500 0.7110 0.5786 0.7110 0.8432
0.2502 23.9048 502 0.6917 0.5155 0.6917 0.8317
0.2502 24.0 504 0.6810 0.5076 0.6810 0.8252
0.2502 24.0952 506 0.6803 0.5076 0.6803 0.8248
0.2502 24.1905 508 0.6777 0.4706 0.6777 0.8232
0.2502 24.2857 510 0.6793 0.5368 0.6793 0.8242

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B parameters (F32, safetensors)

Model tree for MayBashendy/ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run2_AugV5_k6_task5_organization

Finetuned from aubmindlab/bert-base-arabertv02