ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k9_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (see the metric sketch after the list):

  • Loss: 0.5637
  • Qwk: 0.5626
  • Mse: 0.5637
  • Rmse: 0.7508
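
The QWK value above is Cohen's kappa with quadratic weights, and RMSE is the square root of the reported MSE. Below is a minimal sketch of how such metrics can be computed with scikit-learn; rounding the model outputs to integer scores before the kappa computation is an assumption, not something stated in this card.

```python
# Minimal sketch of the evaluation metrics reported above (QWK, MSE, RMSE).
# Rounding predictions/labels to integer scores is an assumption.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(predictions, labels):
    predictions = np.asarray(predictions).squeeze()
    labels = np.asarray(labels).squeeze()
    mse = mean_squared_error(labels, predictions)
    qwk = cohen_kappa_score(
        np.rint(labels).astype(int),
        np.rint(predictions).astype(int),
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}
```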

Model description

More information needed

Intended uses & limitations

More information needed
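
No usage guidance is provided, but the checkpoint can be loaded with the standard transformers API. The sketch below assumes a single-output regression head (num_labels=1) for scoring essay organization; that assumption is inferred from the MSE/RMSE metrics reported above rather than stated in this card.

```python
# Hedged usage sketch: load the fine-tuned checkpoint and score one essay.
# The single-output regression head is an assumption, not confirmed by this card.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits7_B_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k9_task7_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

essay = "النص العربي المراد تقييم تنظيمه يوضع هنا."  # placeholder Arabic essay
inputs = tokenizer(essay, truncation=True, return_tensors="pt")
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(f"Predicted organization score: {score:.2f}")
```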

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a hedged TrainingArguments sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
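
For reference, the sketch below maps these values onto Hugging Face TrainingArguments; the output path, evaluation schedule, and logging interval are assumptions inferred from the results table, not taken verbatim from the original training script.

```python
# Hedged sketch of TrainingArguments matching the hyperparameters listed above.
# output_dir, eval_steps, and logging_steps are assumptions for illustration.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="./results",        # assumed output path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    eval_strategy="steps",         # assumed: the table evaluates every 2 steps
    eval_steps=2,
    logging_steps=500,             # assumed: training loss first appears at step 500
)
```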

Training results

Training loss is shown as "No log" until the first logging step (step 500), the Trainer's default logging interval.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0426 2 2.5941 -0.0262 2.5941 1.6106
No log 0.0851 4 1.3576 0.0763 1.3576 1.1651
No log 0.1277 6 1.0694 -0.0970 1.0694 1.0341
No log 0.1702 8 1.0324 0.1010 1.0324 1.0161
No log 0.2128 10 1.0958 0.0962 1.0958 1.0468
No log 0.2553 12 1.0792 0.0316 1.0792 1.0388
No log 0.2979 14 0.9871 0.0502 0.9871 0.9935
No log 0.3404 16 0.8492 -0.0841 0.8492 0.9215
No log 0.3830 18 0.8462 0.0053 0.8462 0.9199
No log 0.4255 20 0.8377 -0.0426 0.8377 0.9152
No log 0.4681 22 0.8142 -0.0444 0.8142 0.9024
No log 0.5106 24 0.8629 0.0520 0.8629 0.9289
No log 0.5532 26 1.0068 0.0821 1.0068 1.0034
No log 0.5957 28 1.0999 -0.0506 1.0999 1.0488
No log 0.6383 30 1.0279 0.1962 1.0279 1.0138
No log 0.6809 32 0.9916 0.2670 0.9916 0.9958
No log 0.7234 34 0.8221 0.2574 0.8221 0.9067
No log 0.7660 36 0.7184 0.1504 0.7184 0.8476
No log 0.8085 38 0.7145 0.1699 0.7145 0.8453
No log 0.8511 40 0.7009 0.2145 0.7009 0.8372
No log 0.8936 42 0.6942 0.1953 0.6942 0.8332
No log 0.9362 44 0.7318 0.0851 0.7318 0.8555
No log 0.9787 46 0.7641 0.0851 0.7641 0.8742
No log 1.0213 48 0.7368 0.1065 0.7368 0.8584
No log 1.0638 50 0.7124 0.3211 0.7124 0.8440
No log 1.1064 52 0.7351 0.2987 0.7351 0.8574
No log 1.1489 54 0.7803 0.2779 0.7803 0.8833
No log 1.1915 56 0.7786 0.2353 0.7786 0.8824
No log 1.2340 58 0.7380 0.2607 0.7380 0.8591
No log 1.2766 60 0.7500 0.2171 0.7500 0.8660
No log 1.3191 62 0.8305 0.2244 0.8305 0.9113
No log 1.3617 64 0.8599 0.2982 0.8599 0.9273
No log 1.4043 66 0.8661 0.2244 0.8661 0.9307
No log 1.4468 68 0.8028 0.2285 0.8028 0.8960
No log 1.4894 70 0.7258 0.2409 0.7258 0.8519
No log 1.5319 72 0.7280 0.1766 0.7280 0.8532
No log 1.5745 74 0.7670 0.2574 0.7670 0.8758
No log 1.6170 76 0.8158 0.2784 0.8158 0.9032
No log 1.6596 78 0.8246 0.2518 0.8246 0.9081
No log 1.7021 80 0.7694 0.1142 0.7694 0.8771
No log 1.7447 82 0.8049 0.0757 0.8049 0.8971
No log 1.7872 84 0.8303 0.0884 0.8303 0.9112
No log 1.8298 86 0.8213 0.0840 0.8213 0.9063
No log 1.8723 88 0.7807 0.1983 0.7807 0.8835
No log 1.9149 90 0.7656 0.1225 0.7656 0.8750
No log 1.9574 92 0.8224 0.2904 0.8224 0.9069
No log 2.0 94 0.8824 0.3740 0.8824 0.9394
No log 2.0426 96 0.8892 0.4186 0.8892 0.9430
No log 2.0851 98 0.6930 0.3609 0.6930 0.8324
No log 2.1277 100 0.5975 0.3267 0.5975 0.7730
No log 2.1702 102 0.6261 0.3329 0.6261 0.7913
No log 2.2128 104 0.7237 0.3675 0.7237 0.8507
No log 2.2553 106 0.7709 0.2518 0.7709 0.8780
No log 2.2979 108 0.7603 0.1268 0.7603 0.8719
No log 2.3404 110 0.6772 0.1268 0.6772 0.8229
No log 2.3830 112 0.6552 0.3167 0.6552 0.8095
No log 2.4255 114 0.7271 0.3294 0.7271 0.8527
No log 2.4681 116 0.7665 0.3294 0.7665 0.8755
No log 2.5106 118 0.6852 0.2297 0.6852 0.8278
No log 2.5532 120 0.5937 0.3865 0.5937 0.7705
No log 2.5957 122 0.5892 0.4007 0.5892 0.7676
No log 2.6383 124 0.6072 0.4608 0.6072 0.7792
No log 2.6809 126 0.6225 0.4034 0.6225 0.7890
No log 2.7234 128 0.6193 0.3781 0.6193 0.7870
No log 2.7660 130 0.6158 0.3280 0.6158 0.7847
No log 2.8085 132 0.6724 0.4186 0.6724 0.8200
No log 2.8511 134 0.8887 0.3794 0.8887 0.9427
No log 2.8936 136 0.9160 0.3608 0.9160 0.9571
No log 2.9362 138 0.7681 0.4243 0.7681 0.8764
No log 2.9787 140 0.6488 0.3816 0.6488 0.8055
No log 3.0213 142 0.6504 0.4397 0.6504 0.8065
No log 3.0638 144 0.6801 0.4356 0.6801 0.8247
No log 3.1064 146 0.7273 0.4077 0.7273 0.8528
No log 3.1489 148 0.7698 0.3906 0.7698 0.8774
No log 3.1915 150 0.7253 0.3950 0.7253 0.8516
No log 3.2340 152 0.7219 0.3818 0.7219 0.8497
No log 3.2766 154 0.7065 0.4085 0.7065 0.8405
No log 3.3191 156 0.6931 0.4428 0.6931 0.8325
No log 3.3617 158 0.7270 0.3906 0.7270 0.8526
No log 3.4043 160 0.7512 0.3583 0.7512 0.8667
No log 3.4468 162 0.6972 0.4734 0.6972 0.8350
No log 3.4894 164 0.6369 0.5586 0.6369 0.7981
No log 3.5319 166 0.6908 0.4484 0.6908 0.8311
No log 3.5745 168 0.7057 0.4343 0.7057 0.8401
No log 3.6170 170 0.6716 0.5093 0.6716 0.8195
No log 3.6596 172 0.6635 0.6136 0.6635 0.8146
No log 3.7021 174 0.6536 0.6040 0.6536 0.8085
No log 3.7447 176 0.6308 0.5891 0.6308 0.7943
No log 3.7872 178 0.6167 0.5714 0.6167 0.7853
No log 3.8298 180 0.6036 0.5586 0.6036 0.7769
No log 3.8723 182 0.6183 0.5206 0.6183 0.7863
No log 3.9149 184 0.7173 0.5022 0.7173 0.8469
No log 3.9574 186 0.8021 0.3663 0.8021 0.8956
No log 4.0 188 0.7237 0.4529 0.7237 0.8507
No log 4.0426 190 0.6742 0.4986 0.6742 0.8211
No log 4.0851 192 0.6808 0.4789 0.6808 0.8251
No log 4.1277 194 0.7075 0.3051 0.7075 0.8411
No log 4.1702 196 0.7055 0.3085 0.7055 0.8399
No log 4.2128 198 0.6883 0.2988 0.6883 0.8297
No log 4.2553 200 0.6568 0.3837 0.6568 0.8104
No log 4.2979 202 0.6498 0.3837 0.6498 0.8061
No log 4.3404 204 0.6572 0.3887 0.6572 0.8107
No log 4.3830 206 0.7345 0.4334 0.7345 0.8570
No log 4.4255 208 0.9112 0.3645 0.9112 0.9546
No log 4.4681 210 1.0480 0.3987 1.0480 1.0237
No log 4.5106 212 1.0304 0.3622 1.0304 1.0151
No log 4.5532 214 0.8678 0.3909 0.8678 0.9316
No log 4.5957 216 0.6602 0.3769 0.6602 0.8125
No log 4.6383 218 0.6236 0.4526 0.6236 0.7897
No log 4.6809 220 0.6339 0.3762 0.6339 0.7962
No log 4.7234 222 0.7026 0.3292 0.7026 0.8382
No log 4.7660 224 0.7925 0.3824 0.7925 0.8902
No log 4.8085 226 0.7974 0.3867 0.7974 0.8930
No log 4.8511 228 0.7111 0.3548 0.7111 0.8433
No log 4.8936 230 0.6528 0.4874 0.6528 0.8079
No log 4.9362 232 0.6511 0.4349 0.6511 0.8069
No log 4.9787 234 0.6572 0.3811 0.6572 0.8107
No log 5.0213 236 0.7295 0.3209 0.7295 0.8541
No log 5.0638 238 0.8017 0.2521 0.8017 0.8954
No log 5.1064 240 0.7435 0.3482 0.7435 0.8623
No log 5.1489 242 0.7023 0.4067 0.7023 0.8380
No log 5.1915 244 0.6892 0.4067 0.6892 0.8302
No log 5.2340 246 0.7288 0.3990 0.7288 0.8537
No log 5.2766 248 0.8634 0.4114 0.8634 0.9292
No log 5.3191 250 1.0270 0.3126 1.0270 1.0134
No log 5.3617 252 0.9447 0.3470 0.9447 0.9720
No log 5.4043 254 0.7564 0.4550 0.7564 0.8697
No log 5.4468 256 0.6616 0.4434 0.6616 0.8134
No log 5.4894 258 0.6687 0.4281 0.6687 0.8177
No log 5.5319 260 0.7727 0.3667 0.7727 0.8791
No log 5.5745 262 0.9323 0.3830 0.9323 0.9656
No log 5.6170 264 0.9897 0.3288 0.9897 0.9948
No log 5.6596 266 1.0170 0.3193 1.0170 1.0084
No log 5.7021 268 0.9260 0.3470 0.9260 0.9623
No log 5.7447 270 0.7355 0.4218 0.7355 0.8576
No log 5.7872 272 0.6607 0.4393 0.6607 0.8128
No log 5.8298 274 0.6371 0.4103 0.6371 0.7982
No log 5.8723 276 0.6394 0.4428 0.6394 0.7996
No log 5.9149 278 0.6496 0.4428 0.6496 0.8060
No log 5.9574 280 0.6135 0.5206 0.6135 0.7833
No log 6.0 282 0.5943 0.5826 0.5943 0.7709
No log 6.0426 284 0.5902 0.6027 0.5902 0.7683
No log 6.0851 286 0.5859 0.5742 0.5859 0.7654
No log 6.1277 288 0.6180 0.5109 0.6180 0.7861
No log 6.1702 290 0.6725 0.4315 0.6725 0.8200
No log 6.2128 292 0.6413 0.4428 0.6413 0.8008
No log 6.2553 294 0.5997 0.5419 0.5997 0.7744
No log 6.2979 296 0.6078 0.5419 0.6078 0.7796
No log 6.3404 298 0.5999 0.5640 0.5999 0.7745
No log 6.3830 300 0.5997 0.5505 0.5997 0.7744
No log 6.4255 302 0.6126 0.5906 0.6126 0.7827
No log 6.4681 304 0.5983 0.5190 0.5983 0.7735
No log 6.5106 306 0.6016 0.5357 0.6016 0.7756
No log 6.5532 308 0.6476 0.4428 0.6476 0.8047
No log 6.5957 310 0.7106 0.4404 0.7106 0.8430
No log 6.6383 312 0.7218 0.4353 0.7218 0.8496
No log 6.6809 314 0.6656 0.4644 0.6656 0.8158
No log 6.7234 316 0.6490 0.5015 0.6490 0.8056
No log 6.7660 318 0.6638 0.4353 0.6638 0.8147
No log 6.8085 320 0.6546 0.4023 0.6546 0.8091
No log 6.8511 322 0.6837 0.4124 0.6837 0.8269
No log 6.8936 324 0.8042 0.4351 0.8042 0.8968
No log 6.9362 326 0.8351 0.4222 0.8351 0.9138
No log 6.9787 328 0.7771 0.4821 0.7771 0.8816
No log 7.0213 330 0.6619 0.5040 0.6619 0.8136
No log 7.0638 332 0.5765 0.4576 0.5765 0.7593
No log 7.1064 334 0.5773 0.4314 0.5773 0.7598
No log 7.1489 336 0.5865 0.4555 0.5865 0.7658
No log 7.1915 338 0.6469 0.4144 0.6469 0.8043
No log 7.2340 340 0.6434 0.4504 0.6434 0.8021
No log 7.2766 342 0.6096 0.4504 0.6096 0.7807
No log 7.3191 344 0.5811 0.4968 0.5811 0.7623
No log 7.3617 346 0.6231 0.4845 0.6231 0.7894
No log 7.4043 348 0.6029 0.4997 0.6029 0.7765
No log 7.4468 350 0.5601 0.5337 0.5601 0.7484
No log 7.4894 352 0.5348 0.5742 0.5348 0.7313
No log 7.5319 354 0.5321 0.5479 0.5321 0.7295
No log 7.5745 356 0.5439 0.5868 0.5439 0.7375
No log 7.6170 358 0.5278 0.5827 0.5278 0.7265
No log 7.6596 360 0.5485 0.4970 0.5485 0.7406
No log 7.7021 362 0.5592 0.5242 0.5592 0.7478
No log 7.7447 364 0.5213 0.5697 0.5213 0.7220
No log 7.7872 366 0.5094 0.5812 0.5094 0.7137
No log 7.8298 368 0.5096 0.6053 0.5096 0.7139
No log 7.8723 370 0.5145 0.5930 0.5145 0.7173
No log 7.9149 372 0.5491 0.4587 0.5491 0.7410
No log 7.9574 374 0.6429 0.4512 0.6429 0.8018
No log 8.0 376 0.6940 0.4592 0.6940 0.8331
No log 8.0426 378 0.6371 0.4539 0.6371 0.7982
No log 8.0851 380 0.5721 0.4257 0.5721 0.7564
No log 8.1277 382 0.5777 0.4955 0.5777 0.7601
No log 8.1702 384 0.5833 0.4248 0.5833 0.7638
No log 8.2128 386 0.6094 0.4165 0.6094 0.7806
No log 8.2553 388 0.7149 0.4199 0.7149 0.8455
No log 8.2979 390 0.7377 0.4247 0.7377 0.8589
No log 8.3404 392 0.6401 0.3700 0.6401 0.8000
No log 8.3830 394 0.5751 0.4569 0.5751 0.7583
No log 8.4255 396 0.5530 0.4314 0.5530 0.7436
No log 8.4681 398 0.5499 0.4291 0.5499 0.7416
No log 8.5106 400 0.5545 0.4895 0.5545 0.7446
No log 8.5532 402 0.5621 0.4951 0.5621 0.7498
No log 8.5957 404 0.5683 0.5173 0.5683 0.7539
No log 8.6383 406 0.6010 0.5246 0.6010 0.7753
No log 8.6809 408 0.6052 0.4980 0.6052 0.7779
No log 8.7234 410 0.5883 0.4662 0.5883 0.7670
No log 8.7660 412 0.5812 0.4806 0.5812 0.7623
No log 8.8085 414 0.6136 0.5195 0.6136 0.7833
No log 8.8511 416 0.6965 0.4684 0.6965 0.8346
No log 8.8936 418 0.7253 0.4315 0.7253 0.8516
No log 8.9362 420 0.6632 0.4531 0.6632 0.8144
No log 8.9787 422 0.5779 0.5015 0.5779 0.7602
No log 9.0213 424 0.5677 0.5076 0.5677 0.7535
No log 9.0638 426 0.5775 0.5499 0.5775 0.7599
No log 9.1064 428 0.5767 0.5418 0.5767 0.7594
No log 9.1489 430 0.5556 0.5961 0.5556 0.7454
No log 9.1915 432 0.5445 0.5362 0.5445 0.7379
No log 9.2340 434 0.5440 0.5131 0.5440 0.7375
No log 9.2766 436 0.5540 0.5036 0.5540 0.7443
No log 9.3191 438 0.5406 0.4547 0.5406 0.7353
No log 9.3617 440 0.5410 0.4816 0.5410 0.7356
No log 9.4043 442 0.5558 0.5324 0.5558 0.7455
No log 9.4468 444 0.5702 0.5419 0.5702 0.7551
No log 9.4894 446 0.6000 0.5469 0.6000 0.7746
No log 9.5319 448 0.6067 0.5469 0.6067 0.7789
No log 9.5745 450 0.5950 0.5519 0.5950 0.7713
No log 9.6170 452 0.5901 0.5586 0.5901 0.7682
No log 9.6596 454 0.5832 0.5428 0.5832 0.7637
No log 9.7021 456 0.5944 0.4721 0.5944 0.7710
No log 9.7447 458 0.6026 0.4260 0.6026 0.7762
No log 9.7872 460 0.6194 0.3888 0.6194 0.7870
No log 9.8298 462 0.6026 0.4413 0.6026 0.7762
No log 9.8723 464 0.5915 0.4681 0.5915 0.7691
No log 9.9149 466 0.6136 0.4384 0.6136 0.7833
No log 9.9574 468 0.6173 0.4101 0.6173 0.7857
No log 10.0 470 0.6118 0.3667 0.6118 0.7821
No log 10.0426 472 0.6397 0.4704 0.6397 0.7998
No log 10.0851 474 0.6612 0.4622 0.6612 0.8131
No log 10.1277 476 0.6305 0.4534 0.6305 0.7940
No log 10.1702 478 0.5943 0.3865 0.5943 0.7709
No log 10.2128 480 0.5579 0.3552 0.5579 0.7469
No log 10.2553 482 0.5408 0.4101 0.5408 0.7354
No log 10.2979 484 0.5424 0.4898 0.5424 0.7365
No log 10.3404 486 0.5457 0.5373 0.5457 0.7387
No log 10.3830 488 0.5925 0.5211 0.5925 0.7697
No log 10.4255 490 0.6843 0.4468 0.6843 0.8272
No log 10.4681 492 0.6967 0.4805 0.6967 0.8347
No log 10.5106 494 0.6277 0.4574 0.6277 0.7923
No log 10.5532 496 0.5569 0.4681 0.5569 0.7462
No log 10.5957 498 0.5373 0.5476 0.5373 0.7330
0.4196 10.6383 500 0.5301 0.5655 0.5301 0.7280
0.4196 10.6809 502 0.5338 0.5655 0.5338 0.7306
0.4196 10.7234 504 0.5479 0.5491 0.5479 0.7402
0.4196 10.7660 506 0.5769 0.4830 0.5769 0.7595
0.4196 10.8085 508 0.5823 0.4550 0.5823 0.7631
0.4196 10.8511 510 0.5637 0.5626 0.5637 0.7508

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1