ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run2_AugV5_k3_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7050
  • QWK: 0.4502
  • MSE: 0.7050
  • RMSE: 0.8396
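QWK is Cohen's kappa with quadratic weights, and RMSE is simply the square root of the MSE (note that 0.8396 ≈ √0.7050). A minimal, self-contained sketch of both metrics; the toy labels below are illustrative only, not drawn from the evaluation set:

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights (the QWK reported above)."""
    n = len(y_true)
    # Observed confusion matrix.
    observed = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # Expected matrix under independence, scaled to n items.
    hist_t = [sum(row) for row in observed]
    hist_p = [sum(observed[i][j] for i in range(n_classes)) for j in range(n_classes)]
    expected = [[hist_t[i] * hist_p[j] / n for j in range(n_classes)]
                for i in range(n_classes)]
    # Quadratic disagreement weights: (i - j)^2 / (n_classes - 1)^2.
    w = [[(i - j) ** 2 / (n_classes - 1) ** 2 for j in range(n_classes)]
         for i in range(n_classes)]
    num = sum(w[i][j] * observed[i][j] for i in range(n_classes) for j in range(n_classes))
    den = sum(w[i][j] * expected[i][j] for i in range(n_classes) for j in range(n_classes))
    return 1.0 - num / den

def rmse(y_true, y_pred):
    """Root mean squared error."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

# Toy example with 3 score levels.
y_true, y_pred = [0, 1, 2, 2], [0, 1, 1, 2]
print(quadratic_weighted_kappa(y_true, y_pred, 3))  # → 0.8
print(rmse(y_true, y_pred))                         # → 0.5
```

The same values can be obtained from scikit-learn's `cohen_kappa_score(..., weights="quadratic")`; the hand-rolled version is shown only to make the weighting explicit.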

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
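With the linear scheduler and no warmup (the Trainer default), the learning rate decays from 2e-05 at step 0 to 0 at the final step. The log below implies 12 optimizer steps per epoch (epoch 0.1667 at step 2), so 100 epochs ≈ 1200 steps; that step count is inferred from the log, not stated explicitly. A sketch of the decay:

```python
BASE_LR = 2e-05      # learning_rate above
TOTAL_STEPS = 1200   # ~12 steps/epoch x 100 epochs, inferred from the training log

def linear_lr(step, total_steps=TOTAL_STEPS, base_lr=BASE_LR):
    """Linearly decayed learning rate, assuming zero warmup steps."""
    return base_lr * max(0.0, (total_steps - step) / total_steps)

print(linear_lr(0))     # → 2e-05 (start of training)
print(linear_lr(600))   # → 1e-05 (halfway)
print(linear_lr(1200))  # → 0.0  (end of schedule)
```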

Training results

The training loss is logged every 500 steps, so it appears as "No log" for all evaluations before step 500.

Training Loss Epoch Step Validation Loss QWK MSE RMSE
No log 0.1667 2 2.4974 -0.0136 2.4974 1.5803
No log 0.3333 4 1.1604 0.2416 1.1604 1.0772
No log 0.5 6 0.8188 0.0509 0.8188 0.9049
No log 0.6667 8 0.9587 -0.0127 0.9587 0.9791
No log 0.8333 10 1.0244 0.1277 1.0244 1.0121
No log 1.0 12 1.0326 0.0196 1.0326 1.0162
No log 1.1667 14 0.9183 0.0295 0.9183 0.9583
No log 1.3333 16 0.9432 0.0888 0.9432 0.9712
No log 1.5 18 0.9869 -0.0066 0.9869 0.9934
No log 1.6667 20 0.9178 0.0781 0.9178 0.9580
No log 1.8333 22 0.8602 -0.0027 0.8602 0.9274
No log 2.0 24 0.8925 0.0481 0.8925 0.9447
No log 2.1667 26 0.8971 0.1352 0.8971 0.9471
No log 2.3333 28 0.7851 0.0944 0.7851 0.8860
No log 2.5 30 0.7378 0.1282 0.7378 0.8590
No log 2.6667 32 0.7277 0.2085 0.7277 0.8531
No log 2.8333 34 0.7266 0.2085 0.7266 0.8524
No log 3.0 36 0.7989 0.2409 0.7989 0.8938
No log 3.1667 38 0.8607 0.2142 0.8607 0.9278
No log 3.3333 40 0.8728 0.2463 0.8728 0.9342
No log 3.5 42 0.8386 0.1672 0.8386 0.9158
No log 3.6667 44 0.8117 0.2950 0.8117 0.9010
No log 3.8333 46 0.8426 0.1580 0.8426 0.9179
No log 4.0 48 0.8542 0.1923 0.8542 0.9242
No log 4.1667 50 0.7393 0.1135 0.7393 0.8598
No log 4.3333 52 0.7175 0.2979 0.7175 0.8471
No log 4.5 54 0.7751 0.1635 0.7751 0.8804
No log 4.6667 56 0.7498 0.1299 0.7498 0.8659
No log 4.8333 58 0.7065 0.1139 0.7065 0.8406
No log 5.0 60 0.7439 0.3387 0.7439 0.8625
No log 5.1667 62 0.7912 0.2632 0.7912 0.8895
No log 5.3333 64 0.7926 0.3471 0.7926 0.8903
No log 5.5 66 0.8489 0.2643 0.8489 0.9213
No log 5.6667 68 0.9439 0.2810 0.9439 0.9715
No log 5.8333 70 0.9398 0.2756 0.9398 0.9695
No log 6.0 72 0.8776 0.2836 0.8776 0.9368
No log 6.1667 74 0.8080 0.3475 0.8080 0.8989
No log 6.3333 76 0.7597 0.1569 0.7597 0.8716
No log 6.5 78 0.7358 0.1221 0.7358 0.8578
No log 6.6667 80 0.7528 0.4430 0.7528 0.8676
No log 6.8333 82 1.0108 0.2886 1.0108 1.0054
No log 7.0 84 1.0090 0.2833 1.0090 1.0045
No log 7.1667 86 0.8184 0.3302 0.8184 0.9046
No log 7.3333 88 0.7144 0.0679 0.7144 0.8452
No log 7.5 90 0.7203 0.1509 0.7203 0.8487
No log 7.6667 92 0.7329 0.1131 0.7329 0.8561
No log 7.8333 94 0.7320 0.3599 0.7320 0.8555
No log 8.0 96 0.8187 0.3976 0.8187 0.9048
No log 8.1667 98 0.8674 0.3376 0.8674 0.9313
No log 8.3333 100 0.8971 0.3886 0.8971 0.9471
No log 8.5 102 0.8165 0.3526 0.8165 0.9036
No log 8.6667 104 0.7744 0.3486 0.7744 0.8800
No log 8.8333 106 0.8007 0.3700 0.8007 0.8948
No log 9.0 108 0.7383 0.4135 0.7383 0.8592
No log 9.1667 110 0.6993 0.4581 0.6993 0.8362
No log 9.3333 112 0.7313 0.4515 0.7313 0.8552
No log 9.5 114 0.7901 0.3822 0.7901 0.8889
No log 9.6667 116 0.9373 0.4304 0.9373 0.9681
No log 9.8333 118 0.9976 0.4028 0.9976 0.9988
No log 10.0 120 0.8052 0.4284 0.8052 0.8973
No log 10.1667 122 0.7509 0.4135 0.7509 0.8665
No log 10.3333 124 0.8163 0.3544 0.8163 0.9035
No log 10.5 126 0.8808 0.3520 0.8808 0.9385
No log 10.6667 128 0.7600 0.3723 0.7600 0.8718
No log 10.8333 130 0.6872 0.3762 0.6872 0.8290
No log 11.0 132 0.6829 0.3762 0.6829 0.8264
No log 11.1667 134 0.6868 0.4160 0.6868 0.8287
No log 11.3333 136 0.6995 0.3787 0.6995 0.8363
No log 11.5 138 0.8349 0.3609 0.8349 0.9137
No log 11.6667 140 0.9786 0.4243 0.9786 0.9892
No log 11.8333 142 1.2436 0.3148 1.2436 1.1152
No log 12.0 144 1.3583 0.3276 1.3583 1.1655
No log 12.1667 146 1.2070 0.3647 1.2070 1.0986
No log 12.3333 148 1.2440 0.3603 1.2440 1.1153
No log 12.5 150 1.0871 0.3466 1.0871 1.0426
No log 12.6667 152 0.9552 0.3206 0.9552 0.9774
No log 12.8333 154 0.9782 0.3456 0.9782 0.9890
No log 13.0 156 1.0200 0.3326 1.0200 1.0100
No log 13.1667 158 0.9558 0.3345 0.9558 0.9776
No log 13.3333 160 0.8059 0.3285 0.8059 0.8977
No log 13.5 162 0.6912 0.4486 0.6912 0.8314
No log 13.6667 164 0.6987 0.4486 0.6987 0.8359
No log 13.8333 166 0.7075 0.4248 0.7075 0.8411
No log 14.0 168 0.7812 0.3906 0.7812 0.8839
No log 14.1667 170 0.9361 0.3919 0.9361 0.9675
No log 14.3333 172 0.8587 0.4158 0.8587 0.9267
No log 14.5 174 0.7253 0.4393 0.7253 0.8516
No log 14.6667 176 0.6971 0.4248 0.6971 0.8349
No log 14.8333 178 0.6723 0.4288 0.6723 0.8199
No log 15.0 180 0.6771 0.4248 0.6771 0.8229
No log 15.1667 182 0.7240 0.3910 0.7240 0.8509
No log 15.3333 184 0.8499 0.3600 0.8499 0.9219
No log 15.5 186 0.9179 0.2775 0.9179 0.9581
No log 15.6667 188 0.7852 0.3985 0.7852 0.8861
No log 15.8333 190 0.7105 0.3931 0.7105 0.8429
No log 16.0 192 0.7448 0.4085 0.7448 0.8630
No log 16.1667 194 0.8068 0.3586 0.8068 0.8982
No log 16.3333 196 0.9306 0.4321 0.9306 0.9647
No log 16.5 198 0.8385 0.3410 0.8385 0.9157
No log 16.6667 200 0.7317 0.4478 0.7317 0.8554
No log 16.8333 202 0.6959 0.4486 0.6959 0.8342
No log 17.0 204 0.7448 0.3711 0.7448 0.8630
No log 17.1667 206 0.8091 0.3771 0.8091 0.8995
No log 17.3333 208 0.9182 0.3510 0.9182 0.9582
No log 17.5 210 0.8679 0.2971 0.8679 0.9316
No log 17.6667 212 0.8226 0.3151 0.8226 0.9070
No log 17.8333 214 0.8179 0.3389 0.8179 0.9044
No log 18.0 216 0.7163 0.3863 0.7163 0.8463
No log 18.1667 218 0.7148 0.3991 0.7148 0.8455
No log 18.3333 220 0.7931 0.3885 0.7931 0.8906
No log 18.5 222 0.9404 0.3728 0.9404 0.9698
No log 18.6667 224 0.8385 0.3844 0.8385 0.9157
No log 18.8333 226 0.7428 0.3492 0.7428 0.8619
No log 19.0 228 0.7789 0.4282 0.7789 0.8826
No log 19.1667 230 0.7934 0.4212 0.7934 0.8908
No log 19.3333 232 0.7631 0.4186 0.7631 0.8735
No log 19.5 234 0.7905 0.4023 0.7905 0.8891
No log 19.6667 236 0.8118 0.3868 0.8118 0.9010
No log 19.8333 238 0.8516 0.3909 0.8516 0.9228
No log 20.0 240 0.9890 0.3523 0.9890 0.9945
No log 20.1667 242 0.8981 0.4074 0.8981 0.9477
No log 20.3333 244 0.8185 0.4369 0.8185 0.9047
No log 20.5 246 0.7162 0.3732 0.7162 0.8463
No log 20.6667 248 0.6743 0.4624 0.6743 0.8211
No log 20.8333 250 0.7640 0.4504 0.7640 0.8741
No log 21.0 252 0.9420 0.4072 0.9420 0.9706
No log 21.1667 254 0.9525 0.4221 0.9525 0.9759
No log 21.3333 256 0.8272 0.4353 0.8272 0.9095
No log 21.5 258 0.7009 0.3931 0.7009 0.8372
No log 21.6667 260 0.6922 0.3885 0.6922 0.8320
No log 21.8333 262 0.7532 0.3388 0.7532 0.8679
No log 22.0 264 0.8648 0.4152 0.8648 0.9300
No log 22.1667 266 0.8321 0.3740 0.8321 0.9122
No log 22.3333 268 0.8241 0.3586 0.8241 0.9078
No log 22.5 270 0.7694 0.3648 0.7694 0.8772
No log 22.6667 272 0.7848 0.3586 0.7848 0.8859
No log 22.8333 274 0.7574 0.3207 0.7574 0.8703
No log 23.0 276 0.7124 0.3268 0.7124 0.8440
No log 23.1667 278 0.7215 0.3567 0.7215 0.8494
No log 23.3333 280 0.7008 0.3820 0.7008 0.8371
No log 23.5 282 0.7816 0.4124 0.7816 0.8841
No log 23.6667 284 0.8569 0.4286 0.8569 0.9257
No log 23.8333 286 0.8518 0.3866 0.8518 0.9229
No log 24.0 288 0.7182 0.3950 0.7182 0.8475
No log 24.1667 290 0.6512 0.4455 0.6512 0.8070
No log 24.3333 292 0.6705 0.5034 0.6705 0.8188
No log 24.5 294 0.6771 0.5310 0.6771 0.8229
No log 24.6667 296 0.7420 0.3754 0.7420 0.8614
No log 24.8333 298 0.9207 0.3307 0.9207 0.9595
No log 25.0 300 0.9937 0.3721 0.9937 0.9969
No log 25.1667 302 0.9851 0.3731 0.9851 0.9925
No log 25.3333 304 0.8436 0.3679 0.8436 0.9185
No log 25.5 306 0.7461 0.4106 0.7461 0.8637
No log 25.6667 308 0.7932 0.3586 0.7932 0.8906
No log 25.8333 310 0.9155 0.3263 0.9155 0.9568
No log 26.0 312 0.8223 0.4124 0.8223 0.9068
No log 26.1667 314 0.7287 0.3669 0.7287 0.8536
No log 26.3333 316 0.7090 0.4260 0.7090 0.8420
No log 26.5 318 0.7667 0.3456 0.7667 0.8756
No log 26.6667 320 0.8381 0.3461 0.8381 0.9155
No log 26.8333 322 0.8113 0.3461 0.8113 0.9007
No log 27.0 324 0.6859 0.4835 0.6859 0.8282
No log 27.1667 326 0.6496 0.5034 0.6496 0.8060
No log 27.3333 328 0.6916 0.5206 0.6916 0.8317
No log 27.5 330 0.8089 0.4056 0.8089 0.8994
No log 27.6667 332 0.7721 0.4265 0.7721 0.8787
No log 27.8333 334 0.7483 0.4173 0.7483 0.8650
No log 28.0 336 0.7724 0.4173 0.7724 0.8789
No log 28.1667 338 0.7562 0.4072 0.7562 0.8696
No log 28.3333 340 0.8237 0.3867 0.8237 0.9076
No log 28.5 342 0.7959 0.3934 0.7959 0.8921
No log 28.6667 344 0.6921 0.4622 0.6921 0.8319
No log 28.8333 346 0.6596 0.4788 0.6596 0.8121
No log 29.0 348 0.6585 0.5420 0.6585 0.8115
No log 29.1667 350 0.7087 0.4173 0.7087 0.8418
No log 29.3333 352 0.7023 0.3934 0.7023 0.8380
No log 29.5 354 0.6648 0.4371 0.6648 0.8153
No log 29.6667 356 0.6286 0.4314 0.6286 0.7928
No log 29.8333 358 0.6409 0.4482 0.6409 0.8006
No log 30.0 360 0.7219 0.4072 0.7219 0.8496
No log 30.1667 362 0.8451 0.4125 0.8451 0.9193
No log 30.3333 364 0.7980 0.4504 0.7980 0.8933
No log 30.5 366 0.6687 0.5254 0.6687 0.8178
No log 30.6667 368 0.6071 0.4801 0.6071 0.7792
No log 30.8333 370 0.6149 0.4722 0.6149 0.7842
No log 31.0 372 0.6553 0.4828 0.6553 0.8095
No log 31.1667 374 0.8039 0.4265 0.8039 0.8966
No log 31.3333 376 1.0133 0.3900 1.0133 1.0066
No log 31.5 378 1.0334 0.3634 1.0334 1.0166
No log 31.6667 380 0.8744 0.4038 0.8744 0.9351
No log 31.8333 382 0.7209 0.3985 0.7209 0.8491
No log 32.0 384 0.6778 0.3665 0.6778 0.8233
No log 32.1667 386 0.6771 0.4100 0.6771 0.8229
No log 32.3333 388 0.7236 0.3985 0.7236 0.8507
No log 32.5 390 0.8154 0.3934 0.8154 0.9030
No log 32.6667 392 0.8068 0.4173 0.8068 0.8982
No log 32.8333 394 0.6976 0.3985 0.6976 0.8352
No log 33.0 396 0.6193 0.4463 0.6193 0.7870
No log 33.1667 398 0.6197 0.4160 0.6197 0.7872
No log 33.3333 400 0.6217 0.4555 0.6217 0.7885
No log 33.5 402 0.6528 0.5272 0.6528 0.8080
No log 33.6667 404 0.7671 0.4104 0.7671 0.8759
No log 33.8333 406 0.8265 0.3973 0.8265 0.9091
No log 34.0 408 0.7948 0.3973 0.7948 0.8915
No log 34.1667 410 0.6886 0.4911 0.6886 0.8298
No log 34.3333 412 0.6340 0.4459 0.6340 0.7962
No log 34.5 414 0.6402 0.4375 0.6402 0.8001
No log 34.6667 416 0.7015 0.4582 0.7015 0.8376
No log 34.8333 418 0.8555 0.3499 0.8555 0.9249
No log 35.0 420 1.0103 0.2840 1.0103 1.0051
No log 35.1667 422 1.0181 0.3077 1.0181 1.0090
No log 35.3333 424 0.8803 0.3290 0.8803 0.9383
No log 35.5 426 0.6958 0.4587 0.6958 0.8341
No log 35.6667 428 0.6086 0.5327 0.6086 0.7801
No log 35.8333 430 0.5942 0.5422 0.5942 0.7708
No log 36.0 432 0.5968 0.5687 0.5968 0.7725
No log 36.1667 434 0.6065 0.5752 0.6065 0.7788
No log 36.3333 436 0.6352 0.5655 0.6352 0.7970
No log 36.5 438 0.6981 0.4745 0.6981 0.8355
No log 36.6667 440 0.7102 0.4123 0.7102 0.8427
No log 36.8333 442 0.6785 0.4513 0.6785 0.8237
No log 37.0 444 0.6454 0.3782 0.6454 0.8034
No log 37.1667 446 0.6463 0.3809 0.6463 0.8039
No log 37.3333 448 0.6787 0.3425 0.6787 0.8238
No log 37.5 450 0.7761 0.3497 0.7761 0.8809
No log 37.6667 452 0.8284 0.3497 0.8284 0.9102
No log 37.8333 454 0.8006 0.3497 0.8006 0.8947
No log 38.0 456 0.7588 0.4444 0.7588 0.8711
No log 38.1667 458 0.6934 0.3990 0.6934 0.8327
No log 38.3333 460 0.6780 0.2872 0.6780 0.8234
No log 38.5 462 0.6991 0.3594 0.6991 0.8361
No log 38.6667 464 0.7614 0.4218 0.7614 0.8726
No log 38.8333 466 0.7904 0.3562 0.7904 0.8890
No log 39.0 468 0.7994 0.3562 0.7994 0.8941
No log 39.1667 470 0.7319 0.3544 0.7319 0.8555
No log 39.3333 472 0.6794 0.3966 0.6794 0.8242
No log 39.5 474 0.6455 0.3945 0.6455 0.8034
No log 39.6667 476 0.6358 0.3890 0.6358 0.7974
No log 39.8333 478 0.6479 0.4618 0.6479 0.8049
No log 40.0 480 0.6857 0.4602 0.6857 0.8281
No log 40.1667 482 0.7440 0.4827 0.7440 0.8626
No log 40.3333 484 0.7056 0.4602 0.7056 0.8400
No log 40.5 486 0.6529 0.5016 0.6529 0.8080
No log 40.6667 488 0.6286 0.4743 0.6286 0.7928
No log 40.8333 490 0.6377 0.5104 0.6377 0.7986
No log 41.0 492 0.6305 0.5123 0.6305 0.7940
No log 41.1667 494 0.6370 0.4875 0.6370 0.7981
No log 41.3333 496 0.6514 0.4618 0.6514 0.8071
No log 41.5 498 0.6447 0.4618 0.6447 0.8030
0.2704 41.6667 500 0.6356 0.4962 0.6356 0.7972
0.2704 41.8333 502 0.6461 0.4867 0.6461 0.8038
0.2704 42.0 504 0.6997 0.4144 0.6997 0.8365
0.2704 42.1667 506 0.7895 0.3754 0.7895 0.8885
0.2704 42.3333 508 0.7769 0.3754 0.7769 0.8814
0.2704 42.5 510 0.7050 0.4502 0.7050 0.8396

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1