ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k19_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (see the metric-computation sketch after this list):

  • Loss: 0.8590
  • Qwk: 0.4278
  • Mse: 0.8590
  • Rmse: 0.9268
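
Here, Qwk is the quadratic weighted kappa between predicted and gold scores, and Rmse is the square root of Mse (Loss and Mse coincide in every row below, consistent with an MSE training objective). A minimal sketch of how such values can be computed with scikit-learn follows; the `y_true` and `y_pred` arrays are hypothetical placeholders, since the evaluation data is not published.

```python
# Hedged sketch: computing Qwk / Mse / Rmse-style metrics with scikit-learn.
# The y_true / y_pred values are hypothetical placeholders, not the actual eval set.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([3, 2, 4, 1, 3])             # hypothetical gold organization scores
y_pred = np.array([2.6, 2.1, 3.4, 1.8, 3.2])   # hypothetical model predictions

mse = mean_squared_error(y_true, y_pred)
rmse = float(np.sqrt(mse))

# QWK compares discrete labels, so continuous predictions are rounded first.
qwk = cohen_kappa_score(y_true, np.rint(y_pred).astype(int), weights="quadratic")

print(f"Mse: {mse:.4f}  Rmse: {rmse:.4f}  Qwk: {qwk:.4f}")
```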

Model description

More information needed

Intended uses & limitations

More information needed
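
Pending proper documentation, below is a minimal, hedged sketch of loading this checkpoint for inference. The single-logit regression head, the Arabic example text, and the 512-token truncation are assumptions inferred from the metrics above, not documented properties of the model.

```python
# Hedged sketch: loading the checkpoint for inference, assuming a single-output
# regression head (suggested by the MSE/RMSE metrics); the head type is not documented.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k19_task2_organization"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

essay = "نص مقال تجريبي بالعربية"  # hypothetical input essay
inputs = tokenizer(essay, return_tensors="pt", truncation=True, max_length=512)

with torch.no_grad():
    logits = model(**inputs).logits

# With a regression head, logits has shape (1, 1) and holds the predicted score.
print(logits.squeeze().item())
```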

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (an equivalent Trainer setup is sketched after this list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
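
For reference, here is a hedged sketch of an equivalent transformers Trainer configuration. Only the hyperparameters listed above are taken from this card; the output directory, the regression head, the evaluation/logging intervals (inferred from the results table below), and the datasets are assumptions or placeholders, since the original training script and data are not published.

```python
# Hedged sketch: a TrainingArguments setup matching the hyperparameters listed above.
# Output dir, regression head, and datasets are placeholders/assumptions; the original
# training script is not published.
from transformers import AutoModelForSequenceClassification, AutoTokenizer, TrainingArguments

base_model = "aubmindlab/bert-base-arabertv02"
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForSequenceClassification.from_pretrained(base_model, num_labels=1)  # regression head assumed

args = TrainingArguments(
    output_dir="arabert_task2_organization",  # hypothetical path
    learning_rate=2e-05,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    eval_strategy="steps",   # inferred: the results table evaluates every 2 steps
    eval_steps=2,
    logging_steps=500,       # inferred from the "No log" entries before step 500
)

# These arguments would then be passed to a Trainer together with tokenized train/eval
# datasets (not published), e.g. Trainer(model=model, args=args, train_dataset=..., eval_dataset=...).
```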

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0187 2 4.1324 0.0018 4.1324 2.0328
No log 0.0374 4 2.9786 -0.0193 2.9786 1.7259
No log 0.0561 6 1.8373 0.1273 1.8373 1.3555
No log 0.0748 8 1.3982 0.0082 1.3982 1.1825
No log 0.0935 10 1.2240 0.1140 1.2240 1.1063
No log 0.1121 12 1.1479 0.1857 1.1479 1.0714
No log 0.1308 14 1.1280 0.1857 1.1280 1.0621
No log 0.1495 16 1.1043 0.1875 1.1043 1.0508
No log 0.1682 18 1.0671 0.2300 1.0671 1.0330
No log 0.1869 20 1.2814 0.2065 1.2814 1.1320
No log 0.2056 22 1.2555 0.2015 1.2555 1.1205
No log 0.2243 24 1.0915 0.2697 1.0915 1.0448
No log 0.2430 26 1.1210 0.2115 1.1210 1.0588
No log 0.2617 28 1.0726 0.3090 1.0726 1.0357
No log 0.2804 30 1.1005 0.2454 1.1005 1.0491
No log 0.2991 32 1.4587 0.2635 1.4587 1.2078
No log 0.3178 34 1.6666 0.2519 1.6666 1.2910
No log 0.3364 36 1.1943 0.2344 1.1943 1.0928
No log 0.3551 38 1.1555 0.3671 1.1555 1.0750
No log 0.3738 40 1.2732 0.3496 1.2732 1.1284
No log 0.3925 42 1.0942 0.3590 1.0942 1.0460
No log 0.4112 44 1.1496 0.2398 1.1496 1.0722
No log 0.4299 46 1.2297 0.2270 1.2297 1.1089
No log 0.4486 48 1.0041 0.5559 1.0041 1.0020
No log 0.4673 50 1.0135 0.4541 1.0135 1.0067
No log 0.4860 52 1.0003 0.5356 1.0003 1.0001
No log 0.5047 54 1.0223 0.5559 1.0223 1.0111
No log 0.5234 56 1.0017 0.5298 1.0017 1.0009
No log 0.5421 58 1.0844 0.4670 1.0844 1.0413
No log 0.5607 60 1.1740 0.5585 1.1740 1.0835
No log 0.5794 62 1.1596 0.5072 1.1596 1.0768
No log 0.5981 64 1.0692 0.5 1.0692 1.0340
No log 0.6168 66 1.0940 0.4974 1.0940 1.0459
No log 0.6355 68 0.9992 0.4716 0.9992 0.9996
No log 0.6542 70 0.9613 0.4421 0.9613 0.9805
No log 0.6729 72 0.9095 0.4061 0.9095 0.9537
No log 0.6916 74 0.9471 0.3947 0.9471 0.9732
No log 0.7103 76 0.9104 0.4648 0.9104 0.9542
No log 0.7290 78 0.8937 0.4510 0.8937 0.9454
No log 0.7477 80 0.8701 0.5702 0.8701 0.9328
No log 0.7664 82 0.9481 0.5179 0.9481 0.9737
No log 0.7850 84 1.0435 0.5458 1.0435 1.0215
No log 0.8037 86 0.9743 0.4848 0.9743 0.9870
No log 0.8224 88 1.1064 0.5514 1.1064 1.0519
No log 0.8411 90 1.0641 0.5491 1.0641 1.0316
No log 0.8598 92 1.0041 0.4996 1.0041 1.0020
No log 0.8785 94 1.1609 0.4954 1.1609 1.0775
No log 0.8972 96 1.2288 0.4728 1.2288 1.1085
No log 0.9159 98 1.0448 0.4760 1.0448 1.0222
No log 0.9346 100 0.9374 0.5182 0.9374 0.9682
No log 0.9533 102 1.1021 0.4424 1.1021 1.0498
No log 0.9720 104 1.1129 0.4432 1.1129 1.0549
No log 0.9907 106 0.8829 0.4817 0.8829 0.9396
No log 1.0093 108 0.9592 0.5402 0.9592 0.9794
No log 1.0280 110 1.0835 0.5308 1.0835 1.0409
No log 1.0467 112 0.9734 0.5158 0.9734 0.9866
No log 1.0654 114 0.8974 0.5354 0.8974 0.9473
No log 1.0841 116 0.8910 0.5432 0.8910 0.9439
No log 1.1028 118 0.9335 0.5362 0.9335 0.9662
No log 1.1215 120 1.1412 0.4658 1.1412 1.0683
No log 1.1402 122 1.2625 0.3633 1.2625 1.1236
No log 1.1589 124 1.1311 0.4380 1.1311 1.0635
No log 1.1776 126 0.8989 0.4823 0.8989 0.9481
No log 1.1963 128 1.0228 0.4191 1.0228 1.0113
No log 1.2150 130 1.1108 0.3784 1.1108 1.0539
No log 1.2336 132 0.9623 0.4638 0.9623 0.9810
No log 1.2523 134 0.8378 0.5729 0.8378 0.9153
No log 1.2710 136 0.8683 0.4503 0.8683 0.9319
No log 1.2897 138 1.0159 0.4665 1.0159 1.0079
No log 1.3084 140 1.0691 0.5002 1.0691 1.0340
No log 1.3271 142 0.9907 0.5130 0.9907 0.9953
No log 1.3458 144 0.9913 0.5299 0.9913 0.9956
No log 1.3645 146 0.9389 0.5455 0.9389 0.9690
No log 1.3832 148 0.8802 0.5303 0.8802 0.9382
No log 1.4019 150 0.8714 0.5140 0.8714 0.9335
No log 1.4206 152 0.9040 0.5126 0.9040 0.9508
No log 1.4393 154 0.9245 0.5286 0.9245 0.9615
No log 1.4579 156 0.9449 0.5286 0.9449 0.9720
No log 1.4766 158 0.8711 0.4837 0.8711 0.9333
No log 1.4953 160 0.8741 0.4489 0.8741 0.9349
No log 1.5140 162 0.8575 0.4195 0.8575 0.9260
No log 1.5327 164 0.9523 0.5431 0.9523 0.9758
No log 1.5514 166 1.0663 0.4645 1.0663 1.0326
No log 1.5701 168 0.9782 0.4867 0.9782 0.9890
No log 1.5888 170 0.8558 0.5498 0.8558 0.9251
No log 1.6075 172 0.8299 0.4910 0.8299 0.9110
No log 1.6262 174 0.8694 0.4388 0.8694 0.9324
No log 1.6449 176 0.9222 0.3924 0.9222 0.9603
No log 1.6636 178 0.9191 0.3945 0.9191 0.9587
No log 1.6822 180 1.0713 0.5474 1.0713 1.0350
No log 1.7009 182 1.1053 0.5493 1.1053 1.0514
No log 1.7196 184 0.9597 0.5091 0.9597 0.9796
No log 1.7383 186 0.9027 0.4668 0.9027 0.9501
No log 1.7570 188 0.8962 0.4764 0.8962 0.9467
No log 1.7757 190 0.8520 0.4859 0.8520 0.9230
No log 1.7944 192 0.8374 0.4069 0.8374 0.9151
No log 1.8131 194 0.8637 0.4998 0.8637 0.9294
No log 1.8318 196 1.0575 0.5105 1.0575 1.0284
No log 1.8505 198 1.1165 0.4810 1.1165 1.0567
No log 1.8692 200 0.9475 0.4921 0.9475 0.9734
No log 1.8879 202 0.8583 0.4335 0.8583 0.9264
No log 1.9065 204 0.8680 0.4728 0.8680 0.9317
No log 1.9252 206 0.9059 0.5515 0.9059 0.9518
No log 1.9439 208 0.9264 0.4685 0.9264 0.9625
No log 1.9626 210 1.0017 0.4935 1.0017 1.0009
No log 1.9813 212 0.9736 0.4935 0.9736 0.9867
No log 2.0 214 0.9167 0.4782 0.9167 0.9575
No log 2.0187 216 0.8843 0.5102 0.8843 0.9404
No log 2.0374 218 0.8841 0.4926 0.8841 0.9402
No log 2.0561 220 0.9079 0.4874 0.9079 0.9528
No log 2.0748 222 0.9825 0.5256 0.9825 0.9912
No log 2.0935 224 0.9588 0.4773 0.9588 0.9792
No log 2.1121 226 0.9238 0.3379 0.9238 0.9611
No log 2.1308 228 0.9257 0.3130 0.9257 0.9621
No log 2.1495 230 0.9364 0.4509 0.9364 0.9677
No log 2.1682 232 1.0019 0.5054 1.0019 1.0009
No log 2.1869 234 0.9995 0.5054 0.9995 0.9998
No log 2.2056 236 0.9068 0.4763 0.9068 0.9523
No log 2.2243 238 0.8679 0.4859 0.8679 0.9316
No log 2.2430 240 0.8734 0.4835 0.8734 0.9345
No log 2.2617 242 0.9639 0.5249 0.9639 0.9818
No log 2.2804 244 1.1116 0.4736 1.1116 1.0543
No log 2.2991 246 1.1228 0.4979 1.1228 1.0596
No log 2.3178 248 1.0296 0.5217 1.0296 1.0147
No log 2.3364 250 0.9507 0.5420 0.9507 0.9751
No log 2.3551 252 0.9444 0.5263 0.9444 0.9718
No log 2.3738 254 0.8953 0.4553 0.8953 0.9462
No log 2.3925 256 0.8565 0.4779 0.8565 0.9255
No log 2.4112 258 0.8548 0.4656 0.8548 0.9246
No log 2.4299 260 0.8884 0.4870 0.8884 0.9426
No log 2.4486 262 0.9674 0.5578 0.9674 0.9835
No log 2.4673 264 0.9588 0.5430 0.9588 0.9792
No log 2.4860 266 0.9492 0.5308 0.9492 0.9743
No log 2.5047 268 0.9429 0.5308 0.9429 0.9711
No log 2.5234 270 0.8562 0.5121 0.8562 0.9253
No log 2.5421 272 0.8283 0.5841 0.8283 0.9101
No log 2.5607 274 0.7993 0.5884 0.7993 0.8940
No log 2.5794 276 0.8100 0.6009 0.8100 0.9000
No log 2.5981 278 0.8273 0.5575 0.8273 0.9096
No log 2.6168 280 0.8330 0.4920 0.8330 0.9127
No log 2.6355 282 0.8664 0.4604 0.8664 0.9308
No log 2.6542 284 0.9391 0.4987 0.9391 0.9691
No log 2.6729 286 0.9678 0.5493 0.9678 0.9838
No log 2.6916 288 0.8829 0.5210 0.8829 0.9396
No log 2.7103 290 0.8308 0.5356 0.8308 0.9115
No log 2.7290 292 0.8150 0.5010 0.8150 0.9028
No log 2.7477 294 0.8066 0.4499 0.8066 0.8981
No log 2.7664 296 0.8094 0.4616 0.8094 0.8997
No log 2.7850 298 0.8116 0.5223 0.8116 0.9009
No log 2.8037 300 0.8089 0.5316 0.8089 0.8994
No log 2.8224 302 0.8078 0.5304 0.8078 0.8988
No log 2.8411 304 0.7935 0.5276 0.7935 0.8908
No log 2.8598 306 0.7817 0.4772 0.7817 0.8841
No log 2.8785 308 0.7748 0.4902 0.7748 0.8802
No log 2.8972 310 0.7786 0.5802 0.7786 0.8824
No log 2.9159 312 0.8149 0.5601 0.8149 0.9027
No log 2.9346 314 0.9237 0.5627 0.9237 0.9611
No log 2.9533 316 0.9322 0.5280 0.9322 0.9655
No log 2.9720 318 0.8562 0.4593 0.8562 0.9253
No log 2.9907 320 0.8208 0.4108 0.8208 0.9060
No log 3.0093 322 0.7929 0.4841 0.7929 0.8905
No log 3.0280 324 0.7928 0.5197 0.7928 0.8904
No log 3.0467 326 0.8045 0.5362 0.8045 0.8969
No log 3.0654 328 0.9661 0.5794 0.9661 0.9829
No log 3.0841 330 1.4085 0.5311 1.4085 1.1868
No log 3.1028 332 1.5909 0.4531 1.5909 1.2613
No log 3.1215 334 1.4706 0.4197 1.4706 1.2127
No log 3.1402 336 1.2009 0.4968 1.2009 1.0958
No log 3.1589 338 1.0140 0.5605 1.0140 1.0070
No log 3.1776 340 0.9072 0.5261 0.9072 0.9525
No log 3.1963 342 0.8510 0.5264 0.8510 0.9225
No log 3.2150 344 0.8395 0.4923 0.8395 0.9163
No log 3.2336 346 0.8485 0.5235 0.8485 0.9212
No log 3.2523 348 0.8078 0.5038 0.8078 0.8988
No log 3.2710 350 0.7962 0.5152 0.7962 0.8923
No log 3.2897 352 0.8123 0.5416 0.8123 0.9013
No log 3.3084 354 0.8122 0.5114 0.8122 0.9012
No log 3.3271 356 0.8500 0.4321 0.8500 0.9220
No log 3.3458 358 0.8884 0.4375 0.8884 0.9425
No log 3.3645 360 0.8882 0.4608 0.8882 0.9425
No log 3.3832 362 0.9043 0.4795 0.9043 0.9510
No log 3.4019 364 0.9514 0.5260 0.9514 0.9754
No log 3.4206 366 0.9134 0.4938 0.9134 0.9557
No log 3.4393 368 0.9214 0.5054 0.9214 0.9599
No log 3.4579 370 0.8789 0.5365 0.8789 0.9375
No log 3.4766 372 0.8645 0.5028 0.8645 0.9298
No log 3.4953 374 0.9029 0.5395 0.9029 0.9502
No log 3.5140 376 0.9215 0.5106 0.9215 0.9599
No log 3.5327 378 0.8970 0.4614 0.8970 0.9471
No log 3.5514 380 0.8577 0.4787 0.8577 0.9261
No log 3.5701 382 0.8620 0.5201 0.8620 0.9284
No log 3.5888 384 0.9246 0.4834 0.9246 0.9616
No log 3.6075 386 0.9870 0.4824 0.9870 0.9935
No log 3.6262 388 0.9216 0.4834 0.9216 0.9600
No log 3.6449 390 0.8242 0.4363 0.8242 0.9078
No log 3.6636 392 0.8417 0.4560 0.8417 0.9174
No log 3.6822 394 0.9691 0.4276 0.9691 0.9844
No log 3.7009 396 1.0400 0.4355 1.0400 1.0198
No log 3.7196 398 1.0286 0.4152 1.0286 1.0142
No log 3.7383 400 0.9769 0.4309 0.9769 0.9884
No log 3.7570 402 0.9145 0.4166 0.9145 0.9563
No log 3.7757 404 0.8425 0.4369 0.8425 0.9179
No log 3.7944 406 0.8528 0.4237 0.8528 0.9235
No log 3.8131 408 0.9147 0.4545 0.9147 0.9564
No log 3.8318 410 0.9491 0.4533 0.9491 0.9742
No log 3.8505 412 0.9815 0.4533 0.9815 0.9907
No log 3.8692 414 0.8878 0.5028 0.8878 0.9422
No log 3.8879 416 0.8775 0.5261 0.8775 0.9368
No log 3.9065 418 0.8421 0.4711 0.8421 0.9176
No log 3.9252 420 0.8523 0.4720 0.8523 0.9232
No log 3.9439 422 0.8965 0.4898 0.8965 0.9468
No log 3.9626 424 0.9273 0.4811 0.9273 0.9629
No log 3.9813 426 0.9694 0.4167 0.9694 0.9846
No log 4.0 428 0.9411 0.3065 0.9411 0.9701
No log 4.0187 430 0.9831 0.4339 0.9831 0.9915
No log 4.0374 432 0.9968 0.4337 0.9968 0.9984
No log 4.0561 434 1.0225 0.4521 1.0225 1.0112
No log 4.0748 436 1.1246 0.5206 1.1246 1.0605
No log 4.0935 438 1.0400 0.4697 1.0400 1.0198
No log 4.1121 440 0.9246 0.4252 0.9246 0.9616
No log 4.1308 442 0.9210 0.4453 0.9210 0.9597
No log 4.1495 444 0.9545 0.4527 0.9545 0.9770
No log 4.1682 446 0.9575 0.4533 0.9575 0.9785
No log 4.1869 448 0.9253 0.4539 0.9253 0.9619
No log 4.2056 450 0.8661 0.4940 0.8661 0.9307
No log 4.2243 452 0.8290 0.4637 0.8290 0.9105
No log 4.2430 454 0.8380 0.4808 0.8380 0.9154
No log 4.2617 456 0.9125 0.4166 0.9125 0.9553
No log 4.2804 458 0.9688 0.3556 0.9688 0.9843
No log 4.2991 460 0.9398 0.3465 0.9398 0.9694
No log 4.3178 462 0.9405 0.3734 0.9405 0.9698
No log 4.3364 464 0.9618 0.4082 0.9618 0.9807
No log 4.3551 466 0.9951 0.4083 0.9951 0.9976
No log 4.3738 468 1.0250 0.4359 1.0250 1.0124
No log 4.3925 470 0.9957 0.4770 0.9957 0.9979
No log 4.4112 472 0.9142 0.4623 0.9142 0.9561
No log 4.4299 474 0.8493 0.4996 0.8493 0.9216
No log 4.4486 476 0.8341 0.5164 0.8341 0.9133
No log 4.4673 478 0.8401 0.5164 0.8401 0.9166
No log 4.4860 480 0.8693 0.4237 0.8693 0.9324
No log 4.5047 482 0.9137 0.4607 0.9137 0.9559
No log 4.5234 484 0.9471 0.3590 0.9471 0.9732
No log 4.5421 486 0.9381 0.3543 0.9381 0.9685
No log 4.5607 488 0.9303 0.3656 0.9303 0.9645
No log 4.5794 490 0.9380 0.3897 0.9380 0.9685
No log 4.5981 492 0.9701 0.4545 0.9701 0.9850
No log 4.6168 494 0.9737 0.4741 0.9737 0.9867
No log 4.6355 496 0.9299 0.4946 0.9299 0.9643
No log 4.6542 498 0.8582 0.5077 0.8582 0.9264
0.337 4.6729 500 0.8413 0.5172 0.8413 0.9172
0.337 4.6916 502 0.8460 0.5106 0.8460 0.9198
0.337 4.7103 504 0.8423 0.4592 0.8423 0.9178
0.337 4.7290 506 0.8415 0.4548 0.8415 0.9173
0.337 4.7477 508 0.8534 0.4804 0.8534 0.9238
0.337 4.7664 510 0.8590 0.4278 0.8590 0.9268

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1