MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k17_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8427
  • Qwk: 0.4012
  • Mse: 0.8427
  • Rmse: 0.9180
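The three metrics are related: RMSE is the square root of MSE, and QWK (quadratic weighted kappa) measures agreement on an ordinal score scale. A sketch of how such metrics are typically computed with scikit-learn — the score values below are illustrative, not drawn from this model's actual evaluation set:

```python
# Illustrative metric computation for an ordinal essay-scoring task.
# y_true / y_pred values are hypothetical examples, not real data.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([0, 1, 2, 2, 3, 1])   # hypothetical gold scores
y_pred = np.array([0, 1, 1, 2, 3, 2])   # hypothetical model predictions

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)                      # RMSE is always sqrt(MSE)
print(f"QWK={qwk:.4f}  MSE={mse:.4f}  RMSE={rmse:.4f}")
```

Note that in the results above Mse equals Loss, which suggests the model was trained with an MSE objective (a regression head over the score).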

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0208 2 4.5533 -0.0132 4.5533 2.1339
No log 0.0417 4 2.5587 0.0798 2.5587 1.5996
No log 0.0625 6 1.6392 0.0372 1.6392 1.2803
No log 0.0833 8 1.7576 0.0372 1.7576 1.3257
No log 0.1042 10 1.5533 0.0372 1.5533 1.2463
No log 0.125 12 1.5625 0.0227 1.5625 1.2500
No log 0.1458 14 1.4936 0.0227 1.4936 1.2221
No log 0.1667 16 1.2980 0.0647 1.2980 1.1393
No log 0.1875 18 1.1409 0.1814 1.1409 1.0681
No log 0.2083 20 1.1547 0.0802 1.1547 1.0745
No log 0.2292 22 1.2373 0.0454 1.2373 1.1124
No log 0.25 24 1.1545 0.1314 1.1545 1.0745
No log 0.2708 26 1.0951 0.1918 1.0951 1.0465
No log 0.2917 28 1.0577 0.2246 1.0577 1.0284
No log 0.3125 30 1.0751 0.2513 1.0751 1.0369
No log 0.3333 32 1.0700 0.3307 1.0700 1.0344
No log 0.3542 34 1.1021 0.2669 1.1021 1.0498
No log 0.375 36 1.0899 0.3009 1.0899 1.0440
No log 0.3958 38 1.0798 0.3056 1.0798 1.0391
No log 0.4167 40 1.0159 0.3973 1.0159 1.0079
No log 0.4375 42 0.9581 0.4237 0.9581 0.9788
No log 0.4583 44 0.9272 0.4511 0.9272 0.9629
No log 0.4792 46 0.9314 0.3624 0.9314 0.9651
No log 0.5 48 0.9723 0.3349 0.9723 0.9861
No log 0.5208 50 0.9146 0.3045 0.9146 0.9563
No log 0.5417 52 0.8576 0.5010 0.8576 0.9261
No log 0.5625 54 1.0038 0.3705 1.0038 1.0019
No log 0.5833 56 1.1176 0.3392 1.1176 1.0572
No log 0.6042 58 1.1665 0.3163 1.1665 1.0800
No log 0.625 60 1.0796 0.3842 1.0796 1.0390
No log 0.6458 62 0.9277 0.5443 0.9277 0.9631
No log 0.6667 64 0.9816 0.3617 0.9816 0.9907
No log 0.6875 66 1.0086 0.3667 1.0086 1.0043
No log 0.7083 68 0.9792 0.3617 0.9792 0.9895
No log 0.7292 70 0.9007 0.5530 0.9007 0.9490
No log 0.75 72 0.8990 0.5462 0.8990 0.9481
No log 0.7708 74 0.8800 0.5530 0.8800 0.9381
No log 0.7917 76 0.8523 0.5580 0.8523 0.9232
No log 0.8125 78 0.8199 0.5458 0.8199 0.9055
No log 0.8333 80 0.8231 0.5291 0.8231 0.9072
No log 0.8542 82 0.8348 0.5291 0.8348 0.9137
No log 0.875 84 0.8268 0.5826 0.8268 0.9093
No log 0.8958 86 0.7905 0.5504 0.7905 0.8891
No log 0.9167 88 0.8163 0.5125 0.8163 0.9035
No log 0.9375 90 0.9194 0.4898 0.9194 0.9589
No log 0.9583 92 0.8372 0.5933 0.8372 0.9150
No log 0.9792 94 0.6967 0.6163 0.6967 0.8347
No log 1.0 96 0.6989 0.6567 0.6989 0.8360
No log 1.0208 98 0.7677 0.5956 0.7677 0.8762
No log 1.0417 100 0.9095 0.4843 0.9095 0.9537
No log 1.0625 102 0.9424 0.4836 0.9424 0.9707
No log 1.0833 104 0.8906 0.5007 0.8906 0.9437
No log 1.1042 106 0.7794 0.6002 0.7794 0.8828
No log 1.125 108 0.8550 0.4808 0.8550 0.9247
No log 1.1458 110 1.0677 0.4806 1.0677 1.0333
No log 1.1667 112 0.9845 0.4898 0.9845 0.9922
No log 1.1875 114 0.8667 0.5241 0.8667 0.9310
No log 1.2083 116 0.7654 0.6196 0.7654 0.8749
No log 1.2292 118 0.7233 0.6360 0.7233 0.8505
No log 1.25 120 0.7631 0.5564 0.7631 0.8736
No log 1.2708 122 0.9733 0.5214 0.9733 0.9866
No log 1.2917 124 1.0454 0.5214 1.0454 1.0224
No log 1.3125 126 0.9719 0.5214 0.9719 0.9858
No log 1.3333 128 0.7747 0.6140 0.7747 0.8802
No log 1.3542 130 0.6787 0.6833 0.6787 0.8239
No log 1.375 132 0.6835 0.6311 0.6835 0.8267
No log 1.3958 134 0.6824 0.6702 0.6824 0.8260
No log 1.4167 136 0.7160 0.5783 0.7160 0.8462
No log 1.4375 138 0.7541 0.6170 0.7541 0.8684
No log 1.4583 140 0.8917 0.5627 0.8917 0.9443
No log 1.4792 142 1.0165 0.5526 1.0165 1.0082
No log 1.5 144 1.0597 0.5410 1.0597 1.0294
No log 1.5208 146 0.8627 0.6330 0.8627 0.9288
No log 1.5417 148 0.6719 0.6629 0.6719 0.8197
No log 1.5625 150 0.7220 0.6117 0.7220 0.8497
No log 1.5833 152 0.6983 0.6424 0.6983 0.8357
No log 1.6042 154 0.6702 0.6517 0.6702 0.8186
No log 1.625 156 0.9420 0.4775 0.9420 0.9706
No log 1.6458 158 1.1089 0.4790 1.1089 1.0531
No log 1.6667 160 0.9811 0.4505 0.9811 0.9905
No log 1.6875 162 0.7432 0.5451 0.7432 0.8621
No log 1.7083 164 0.7233 0.6310 0.7233 0.8504
No log 1.7292 166 0.7486 0.5229 0.7486 0.8652
No log 1.75 168 0.7101 0.6310 0.7101 0.8427
No log 1.7708 170 0.7437 0.6056 0.7437 0.8624
No log 1.7917 172 0.8621 0.4973 0.8621 0.9285
No log 1.8125 174 0.8090 0.5818 0.8090 0.8994
No log 1.8333 176 0.7061 0.6393 0.7061 0.8403
No log 1.8542 178 0.7070 0.6363 0.7070 0.8408
No log 1.875 180 0.7025 0.6790 0.7025 0.8382
No log 1.8958 182 0.7105 0.6393 0.7105 0.8429
No log 1.9167 184 0.7682 0.5671 0.7682 0.8765
No log 1.9375 186 0.8703 0.5412 0.8703 0.9329
No log 1.9583 188 0.8754 0.5427 0.8754 0.9357
No log 1.9792 190 0.7606 0.5572 0.7606 0.8721
No log 2.0 192 0.7335 0.5969 0.7335 0.8565
No log 2.0208 194 0.7523 0.5938 0.7523 0.8674
No log 2.0417 196 0.8484 0.4794 0.8484 0.9211
No log 2.0625 198 1.0176 0.4449 1.0176 1.0088
No log 2.0833 200 0.9448 0.4105 0.9448 0.9720
No log 2.1042 202 0.7927 0.5194 0.7927 0.8903
No log 2.125 204 0.7855 0.5178 0.7855 0.8863
No log 2.1458 206 0.7880 0.5320 0.7880 0.8877
No log 2.1667 208 0.8057 0.4440 0.8057 0.8976
No log 2.1875 210 0.8754 0.3686 0.8754 0.9356
No log 2.2083 212 1.0086 0.4739 1.0086 1.0043
No log 2.2292 214 0.9916 0.4749 0.9916 0.9958
No log 2.25 216 0.9153 0.4533 0.9153 0.9567
No log 2.2708 218 0.8479 0.4559 0.8479 0.9208
No log 2.2917 220 0.8679 0.3713 0.8679 0.9316
No log 2.3125 222 0.8828 0.3713 0.8828 0.9396
No log 2.3333 224 0.7932 0.4604 0.7932 0.8906
No log 2.3542 226 0.7603 0.5579 0.7603 0.8719
No log 2.375 228 0.8008 0.5176 0.8008 0.8949
No log 2.3958 230 0.8154 0.5075 0.8154 0.9030
No log 2.4167 232 0.7661 0.5992 0.7661 0.8753
No log 2.4375 234 0.7754 0.5534 0.7754 0.8806
No log 2.4583 236 0.7862 0.5770 0.7862 0.8867
No log 2.4792 238 0.7768 0.5579 0.7768 0.8814
No log 2.5 240 0.7949 0.5528 0.7949 0.8916
No log 2.5208 242 0.8046 0.5455 0.8046 0.8970
No log 2.5417 244 0.7981 0.5044 0.7981 0.8933
No log 2.5625 246 0.7971 0.5195 0.7971 0.8928
No log 2.5833 248 0.8237 0.4840 0.8237 0.9076
No log 2.6042 250 0.9006 0.4132 0.9006 0.9490
No log 2.625 252 0.8368 0.4774 0.8368 0.9148
No log 2.6458 254 0.7913 0.5184 0.7913 0.8896
No log 2.6667 256 0.7720 0.5143 0.7720 0.8786
No log 2.6875 258 0.8023 0.5043 0.8023 0.8957
No log 2.7083 260 0.8115 0.4894 0.8115 0.9008
No log 2.7292 262 0.8121 0.5192 0.8121 0.9012
No log 2.75 264 0.8048 0.5059 0.8048 0.8971
No log 2.7708 266 0.8109 0.5259 0.8109 0.9005
No log 2.7917 268 0.8561 0.4501 0.8561 0.9253
No log 2.8125 270 0.8429 0.4833 0.8429 0.9181
No log 2.8333 272 0.7739 0.5474 0.7739 0.8797
No log 2.8542 274 0.7692 0.5474 0.7692 0.8770
No log 2.875 276 0.8897 0.4874 0.8897 0.9433
No log 2.8958 278 0.9565 0.4617 0.9565 0.9780
No log 2.9167 280 0.8531 0.4874 0.8531 0.9236
No log 2.9375 282 0.8139 0.4876 0.8139 0.9022
No log 2.9583 284 0.8321 0.4420 0.8321 0.9122
No log 2.9792 286 0.8254 0.4546 0.8254 0.9085
No log 3.0 288 0.8666 0.4289 0.8666 0.9309
No log 3.0208 290 0.8781 0.4607 0.8781 0.9371
No log 3.0417 292 0.8190 0.4694 0.8190 0.9050
No log 3.0625 294 0.7626 0.5220 0.7626 0.8733
No log 3.0833 296 0.7480 0.5220 0.7480 0.8649
No log 3.1042 298 0.8005 0.5844 0.8005 0.8947
No log 3.125 300 0.8488 0.5554 0.8488 0.9213
No log 3.1458 302 0.8006 0.5660 0.8006 0.8948
No log 3.1667 304 0.7681 0.5610 0.7681 0.8764
No log 3.1875 306 0.7592 0.5517 0.7592 0.8713
No log 3.2083 308 0.7266 0.5244 0.7266 0.8524
No log 3.2292 310 0.7409 0.4893 0.7409 0.8608
No log 3.25 312 0.7368 0.4976 0.7368 0.8584
No log 3.2708 314 0.7414 0.5046 0.7414 0.8610
No log 3.2917 316 0.7966 0.4924 0.7966 0.8925
No log 3.3125 318 0.8308 0.4790 0.8308 0.9115
No log 3.3333 320 0.8644 0.4862 0.8644 0.9297
No log 3.3542 322 0.7882 0.5585 0.7882 0.8878
No log 3.375 324 0.7669 0.5451 0.7669 0.8758
No log 3.3958 326 0.7883 0.5045 0.7883 0.8879
No log 3.4167 328 0.7936 0.4799 0.7936 0.8908
No log 3.4375 330 0.7983 0.4343 0.7983 0.8935
No log 3.4583 332 0.7635 0.4923 0.7635 0.8738
No log 3.4792 334 0.7466 0.5343 0.7466 0.8641
No log 3.5 336 0.7418 0.5310 0.7418 0.8613
No log 3.5208 338 0.7180 0.5562 0.7180 0.8473
No log 3.5417 340 0.7460 0.5662 0.7460 0.8637
No log 3.5625 342 0.8720 0.4639 0.8720 0.9338
No log 3.5833 344 0.9938 0.4487 0.9938 0.9969
No log 3.6042 346 0.9364 0.4304 0.9364 0.9677
No log 3.625 348 0.8148 0.4799 0.8148 0.9026
No log 3.6458 350 0.8103 0.4671 0.8103 0.9002
No log 3.6667 352 0.8106 0.4893 0.8106 0.9003
No log 3.6875 354 0.8132 0.5521 0.8132 0.9018
No log 3.7083 356 0.9011 0.3975 0.9011 0.9493
No log 3.7292 358 1.0300 0.4235 1.0300 1.0149
No log 3.75 360 0.9601 0.4502 0.9601 0.9798
No log 3.7708 362 0.8186 0.4513 0.8186 0.9048
No log 3.7917 364 0.7819 0.4853 0.7819 0.8842
No log 3.8125 366 0.8011 0.5010 0.8011 0.8950
No log 3.8333 368 0.8057 0.4745 0.8057 0.8976
No log 3.8542 370 0.8494 0.4513 0.8494 0.9216
No log 3.875 372 0.8546 0.4131 0.8546 0.9244
No log 3.8958 374 0.8552 0.4255 0.8552 0.9248
No log 3.9167 376 0.8203 0.4722 0.8203 0.9057
No log 3.9375 378 0.7701 0.5471 0.7701 0.8775
No log 3.9583 380 0.7666 0.5471 0.7666 0.8756
No log 3.9792 382 0.7388 0.6144 0.7388 0.8595
No log 4.0 384 0.7206 0.5312 0.7206 0.8489
No log 4.0208 386 0.7404 0.4794 0.7404 0.8605
No log 4.0417 388 0.7380 0.5866 0.7380 0.8591
No log 4.0625 390 0.7946 0.5077 0.7946 0.8914
No log 4.0833 392 0.8093 0.5118 0.8093 0.8996
No log 4.1042 394 0.7433 0.5692 0.7433 0.8622
No log 4.125 396 0.7080 0.6120 0.7080 0.8414
No log 4.1458 398 0.7054 0.6242 0.7054 0.8399
No log 4.1667 400 0.7524 0.5875 0.7524 0.8674
No log 4.1875 402 0.8449 0.4845 0.8449 0.9192
No log 4.2083 404 0.9494 0.4902 0.9494 0.9744
No log 4.2292 406 1.0022 0.4787 1.0022 1.0011
No log 4.25 408 0.9706 0.4740 0.9706 0.9852
No log 4.2708 410 0.8601 0.2587 0.8601 0.9274
No log 4.2917 412 0.8020 0.4102 0.8020 0.8955
No log 4.3125 414 0.7973 0.4614 0.7973 0.8929
No log 4.3333 416 0.7873 0.5060 0.7873 0.8873
No log 4.3542 418 0.8360 0.5102 0.8360 0.9143
No log 4.375 420 0.8258 0.5086 0.8258 0.9088
No log 4.3958 422 0.7385 0.5881 0.7385 0.8594
No log 4.4167 424 0.7179 0.5759 0.7179 0.8473
No log 4.4375 426 0.7059 0.5521 0.7059 0.8402
No log 4.4583 428 0.7345 0.5567 0.7345 0.8570
No log 4.4792 430 0.7392 0.5392 0.7392 0.8598
No log 4.5 432 0.7343 0.5028 0.7343 0.8569
No log 4.5208 434 0.7606 0.5250 0.7606 0.8721
No log 4.5417 436 0.8595 0.4930 0.8595 0.9271
No log 4.5625 438 0.9297 0.4693 0.9297 0.9642
No log 4.5833 440 0.9698 0.4580 0.9698 0.9848
No log 4.6042 442 0.9131 0.4601 0.9131 0.9556
No log 4.625 444 0.8338 0.4681 0.8338 0.9131
No log 4.6458 446 0.8082 0.4840 0.8082 0.8990
No log 4.6667 448 0.7763 0.5271 0.7763 0.8811
No log 4.6875 450 0.7703 0.5271 0.7703 0.8777
No log 4.7083 452 0.7775 0.4902 0.7775 0.8817
No log 4.7292 454 0.8772 0.5218 0.8772 0.9366
No log 4.75 456 0.9536 0.4988 0.9536 0.9765
No log 4.7708 458 0.9041 0.5030 0.9041 0.9509
No log 4.7917 460 0.8668 0.5060 0.8668 0.9310
No log 4.8125 462 0.8003 0.4540 0.8003 0.8946
No log 4.8333 464 0.7978 0.4691 0.7978 0.8932
No log 4.8542 466 0.8073 0.4220 0.8073 0.8985
No log 4.875 468 0.8279 0.4575 0.8279 0.9099
No log 4.8958 470 0.8782 0.4754 0.8782 0.9371
No log 4.9167 472 0.8692 0.4845 0.8692 0.9323
No log 4.9375 474 0.8207 0.4700 0.8207 0.9059
No log 4.9583 476 0.8004 0.4527 0.8004 0.8946
No log 4.9792 478 0.8173 0.4444 0.8173 0.9040
No log 5.0 480 0.8071 0.4591 0.8071 0.8984
No log 5.0208 482 0.8474 0.4657 0.8474 0.9205
No log 5.0417 484 0.9453 0.4153 0.9453 0.9723
No log 5.0625 486 0.9722 0.4153 0.9722 0.9860
No log 5.0833 488 0.9018 0.4224 0.9018 0.9496
No log 5.1042 490 0.8839 0.4125 0.8839 0.9401
No log 5.125 492 0.8845 0.3372 0.8845 0.9405
No log 5.1458 494 0.8683 0.4100 0.8683 0.9318
No log 5.1667 496 0.8477 0.4852 0.8477 0.9207
No log 5.1875 498 0.8080 0.4297 0.8080 0.8989
0.3482 5.2083 500 0.7888 0.5208 0.7888 0.8881
0.3482 5.2292 502 0.7963 0.5417 0.7963 0.8924
0.3482 5.25 504 0.7828 0.5811 0.7828 0.8848
0.3482 5.2708 506 0.7891 0.4668 0.7891 0.8883
0.3482 5.2917 508 0.8390 0.5192 0.8390 0.9160
0.3482 5.3125 510 0.8259 0.4401 0.8259 0.9088
0.3482 5.3333 512 0.8170 0.4626 0.8170 0.9039
0.3482 5.3542 514 0.8427 0.4012 0.8427 0.9180
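The checkpoint can be loaded for inference along these lines. Since the card does not document the task head, the single-logit regression head below is an assumption (consistent with Mse equaling Loss in the results above); adjust the postprocessing if the checkpoint stores a classification head instead:

```python
# Sketch of scoring an essay's organization with this checkpoint.
# Assumes a regression head; verify against the actual config.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo_id = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k17_task2_organization"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

essay = "..."  # an Arabic essay to score
inputs = tokenizer(essay, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
# For a regression head the raw logit is the predicted score;
# round it to land on the discrete rubric scale.
score = round(logits.squeeze().item())
print(score)
```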

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1