ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k19_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8109
  • Qwk: 0.0053
  • Mse: 0.8109
  • Rmse: 0.9005
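QWK here is Cohen's kappa with quadratic weights, the standard agreement metric for ordinal essay scores; MSE and RMSE measure squared error between predicted and gold scores. As a reference for how these numbers are computed, here is a minimal pure-Python sketch (the rating range and sample labels below are illustrative, not from this model's data). Note that a QWK near 0, like the 0.0053 reported above, indicates roughly chance-level agreement:

```python
import math
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred, min_rating, max_rating):
    """Cohen's kappa with quadratic weights over integer ratings."""
    n_ratings = max_rating - min_rating + 1
    # observed co-occurrence counts of (true, predicted) ratings
    observed = [[0] * n_ratings for _ in range(n_ratings)]
    for t, p in zip(y_true, y_pred):
        observed[t - min_rating][p - min_rating] += 1
    n = len(y_true)
    hist_true = Counter(t - min_rating for t in y_true)
    hist_pred = Counter(p - min_rating for p in y_pred)
    num = den = 0.0
    for i in range(n_ratings):
        for j in range(n_ratings):
            w = ((i - j) ** 2) / ((n_ratings - 1) ** 2)  # quadratic penalty
            expected = hist_true[i] * hist_pred[j] / n   # chance agreement
            num += w * observed[i][j]
            den += w * expected
    return 1.0 - num / den if den else 0.0

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# toy example with ratings in {1, 2, 3}
y_true = [1, 2, 3, 2, 1]
y_pred = [1, 2, 2, 2, 3]
print(quadratic_weighted_kappa(y_true, y_pred, 1, 3))  # → 0.0 (chance level)
print(mse(y_true, y_pred), math.sqrt(mse(y_true, y_pred)))  # → 1.0 1.0
```

In practice the same values can be obtained from `sklearn.metrics.cohen_kappa_score(..., weights="quadratic")` and `sklearn.metrics.mean_squared_error`.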

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
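With `lr_scheduler_type: linear`, the learning rate decays linearly from 2e-05 to 0 over the full run. A small sketch of that schedule (the total step count of 9800 is an inference from the log below, which shows about 98 optimizer steps per epoch over 100 epochs; it is not stated explicitly):

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Transformers-style linear schedule: optional warmup, then linear decay to 0."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

total = 9800  # assumed: ~98 steps/epoch × 100 epochs, inferred from the training log
print(linear_lr(0, total))        # 2e-05 at the start
print(linear_lr(total // 2, total))  # 1e-05 halfway through
print(linear_lr(total, total))    # 0.0 at the end
```

In the actual training run this behavior comes from `transformers.get_linear_schedule_with_warmup`; the sketch above only mirrors its decay shape.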

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0204 2 3.6301 -0.0154 3.6301 1.9053
No log 0.0408 4 2.2012 0.0104 2.2012 1.4836
No log 0.0612 6 1.4617 0.0 1.4617 1.2090
No log 0.0816 8 0.9773 -0.0595 0.9773 0.9886
No log 0.1020 10 0.8943 -0.0833 0.8943 0.9457
No log 0.1224 12 0.7657 0.0296 0.7657 0.8750
No log 0.1429 14 0.9090 -0.0504 0.9090 0.9534
No log 0.1633 16 0.7861 -0.0215 0.7861 0.8866
No log 0.1837 18 0.7693 -0.0101 0.7693 0.8771
No log 0.2041 20 0.8006 -0.0551 0.8006 0.8948
No log 0.2245 22 0.8111 0.1143 0.8111 0.9006
No log 0.2449 24 0.8006 -0.0499 0.8006 0.8948
No log 0.2653 26 0.8812 -0.0852 0.8812 0.9387
No log 0.2857 28 0.8942 -0.0828 0.8942 0.9456
No log 0.3061 30 0.7921 -0.0033 0.7921 0.8900
No log 0.3265 32 0.8254 0.0615 0.8254 0.9085
No log 0.3469 34 0.8554 -0.2257 0.8554 0.9249
No log 0.3673 36 0.8866 -0.1159 0.8866 0.9416
No log 0.3878 38 0.8746 -0.0007 0.8746 0.9352
No log 0.4082 40 0.9182 0.0431 0.9182 0.9582
No log 0.4286 42 1.0225 -0.0099 1.0225 1.0112
No log 0.4490 44 0.8271 0.1176 0.8271 0.9094
No log 0.4694 46 1.1607 -0.0516 1.1607 1.0774
No log 0.4898 48 0.9400 0.0421 0.9400 0.9695
No log 0.5102 50 1.0419 -0.0137 1.0419 1.0207
No log 0.5306 52 1.0881 0.0219 1.0881 1.0431
No log 0.5510 54 0.9833 0.0747 0.9833 0.9916
No log 0.5714 56 1.0343 -0.0049 1.0343 1.0170
No log 0.5918 58 1.0415 -0.0106 1.0415 1.0206
No log 0.6122 60 0.8647 0.1209 0.8647 0.9299
No log 0.6327 62 0.9434 0.1009 0.9434 0.9713
No log 0.6531 64 0.8843 0.0635 0.8843 0.9404
No log 0.6735 66 0.8432 -0.0025 0.8432 0.9183
No log 0.6939 68 0.8686 0.0643 0.8686 0.9320
No log 0.7143 70 0.8951 -0.0049 0.8951 0.9461
No log 0.7347 72 0.9445 0.0618 0.9445 0.9718
No log 0.7551 74 0.9888 -0.0465 0.9888 0.9944
No log 0.7755 76 0.9388 -0.0284 0.9388 0.9689
No log 0.7959 78 0.9256 -0.0164 0.9256 0.9621
No log 0.8163 80 1.1280 0.0193 1.1280 1.0621
No log 0.8367 82 1.0987 0.0267 1.0987 1.0482
No log 0.8571 84 0.9313 -0.0425 0.9313 0.9650
No log 0.8776 86 0.9423 0.0856 0.9423 0.9707
No log 0.8980 88 0.9587 0.1304 0.9587 0.9791
No log 0.9184 90 0.8187 0.2019 0.8187 0.9048
No log 0.9388 92 0.8772 -0.0778 0.8772 0.9366
No log 0.9592 94 0.8736 -0.0336 0.8736 0.9347
No log 0.9796 96 0.8964 0.1398 0.8964 0.9468
No log 1.0 98 1.2100 -0.0077 1.2100 1.1000
No log 1.0204 100 1.2068 0.0482 1.2068 1.0985
No log 1.0408 102 1.0526 0.0984 1.0526 1.0259
No log 1.0612 104 1.1381 0.0091 1.1381 1.0668
No log 1.0816 106 1.2921 0.0076 1.2921 1.1367
No log 1.1020 108 1.6969 0.0581 1.6969 1.3027
No log 1.1224 110 1.3210 0.0379 1.3210 1.1493
No log 1.1429 112 1.2045 -0.0631 1.2045 1.0975
No log 1.1633 114 1.1326 -0.0631 1.1326 1.0643
No log 1.1837 116 1.0052 0.0755 1.0052 1.0026
No log 1.2041 118 0.8924 -0.0200 0.8924 0.9447
No log 1.2245 120 0.8439 -0.0316 0.8439 0.9186
No log 1.2449 122 0.9408 0.0316 0.9408 0.9699
No log 1.2653 124 0.8549 -0.0008 0.8549 0.9246
No log 1.2857 126 0.7527 0.0869 0.7527 0.8676
No log 1.3061 128 1.0232 0.0240 1.0232 1.0116
No log 1.3265 130 1.1548 0.0574 1.1548 1.0746
No log 1.3469 132 0.8692 0.1196 0.8692 0.9323
No log 1.3673 134 1.1709 0.0415 1.1709 1.0821
No log 1.3878 136 1.0850 0.0815 1.0850 1.0416
No log 1.4082 138 0.8066 -0.0426 0.8066 0.8981
No log 1.4286 140 1.0759 0.0267 1.0759 1.0372
No log 1.4490 142 0.9745 0.0492 0.9745 0.9872
No log 1.4694 144 0.7421 -0.0612 0.7421 0.8614
No log 1.4898 146 1.0850 0.1077 1.0850 1.0417
No log 1.5102 148 1.1509 0.1008 1.1509 1.0728
No log 1.5306 150 0.7986 0.0068 0.7986 0.8937
No log 1.5510 152 0.9560 0.0789 0.9560 0.9778
No log 1.5714 154 0.9028 0.1442 0.9028 0.9502
No log 1.5918 156 0.7638 0.0183 0.7638 0.8740
No log 1.6122 158 0.7829 0.0600 0.7829 0.8848
No log 1.6327 160 0.7771 -0.0658 0.7771 0.8815
No log 1.6531 162 0.8550 -0.0407 0.8550 0.9246
No log 1.6735 164 1.0031 0.0794 1.0031 1.0016
No log 1.6939 166 0.8884 -0.0271 0.8884 0.9426
No log 1.7143 168 0.8783 0.0183 0.8783 0.9372
No log 1.7347 170 1.1865 -0.0229 1.1865 1.0893
No log 1.7551 172 1.0678 0.0537 1.0678 1.0334
No log 1.7755 174 0.7961 0.0303 0.7961 0.8923
No log 1.7959 176 0.8365 0.0089 0.8365 0.9146
No log 1.8163 178 0.8803 -0.0259 0.8803 0.9382
No log 1.8367 180 0.8290 -0.0145 0.8290 0.9105
No log 1.8571 182 0.8396 0.0269 0.8396 0.9163
No log 1.8776 184 0.8081 -0.0595 0.8081 0.8989
No log 1.8980 186 0.7610 -0.1088 0.7610 0.8724
No log 1.9184 188 0.7411 -0.0033 0.7411 0.8609
No log 1.9388 190 0.7335 -0.0033 0.7335 0.8564
No log 1.9592 192 0.7810 -0.1018 0.7810 0.8837
No log 1.9796 194 0.7806 -0.1018 0.7806 0.8835
No log 2.0 196 0.7487 0.0814 0.7487 0.8653
No log 2.0204 198 0.7339 -0.0125 0.7339 0.8567
No log 2.0408 200 0.7777 -0.1018 0.7777 0.8819
No log 2.0612 202 0.8118 -0.1018 0.8118 0.9010
No log 2.0816 204 0.8481 -0.0984 0.8481 0.9209
No log 2.1020 206 0.8995 -0.0444 0.8995 0.9484
No log 2.1224 208 0.9613 -0.0622 0.9613 0.9805
No log 2.1429 210 1.2315 0.0789 1.2315 1.1097
No log 2.1633 212 1.1620 0.0784 1.1620 1.0780
No log 2.1837 214 0.8534 -0.0082 0.8534 0.9238
No log 2.2041 216 0.8765 -0.0336 0.8765 0.9362
No log 2.2245 218 0.8723 -0.0573 0.8723 0.9340
No log 2.2449 220 1.0485 -0.1064 1.0485 1.0240
No log 2.2653 222 0.9984 -0.0927 0.9984 0.9992
No log 2.2857 224 0.9408 -0.0099 0.9408 0.9700
No log 2.3061 226 0.9228 0.0146 0.9228 0.9606
No log 2.3265 228 0.9150 -0.0378 0.9150 0.9566
No log 2.3469 230 0.8485 0.0183 0.8485 0.9211
No log 2.3673 232 0.8244 -0.1397 0.8244 0.9080
No log 2.3878 234 0.9506 0.1078 0.9506 0.9750
No log 2.4082 236 0.8979 -0.0661 0.8979 0.9476
No log 2.4286 238 0.8842 0.0146 0.8842 0.9403
No log 2.4490 240 1.0812 -0.0090 1.0812 1.0398
No log 2.4694 242 0.9891 -0.0690 0.9891 0.9945
No log 2.4898 244 1.0485 -0.0167 1.0485 1.0239
No log 2.5102 246 1.2585 -0.0309 1.2585 1.1218
No log 2.5306 248 1.1158 0.0365 1.1158 1.0563
No log 2.5510 250 0.9379 -0.0209 0.9379 0.9685
No log 2.5714 252 1.0953 -0.0122 1.0953 1.0465
No log 2.5918 254 1.0521 -0.0122 1.0521 1.0257
No log 2.6122 256 0.8425 0.0257 0.8425 0.9179
No log 2.6327 258 0.8377 -0.0870 0.8377 0.9153
No log 2.6531 260 0.8443 -0.0859 0.8443 0.9189
No log 2.6735 262 0.8279 -0.0108 0.8279 0.9099
No log 2.6939 264 0.8368 -0.0859 0.8368 0.9148
No log 2.7143 266 0.8017 -0.0849 0.8017 0.8954
No log 2.7347 268 0.7650 -0.1026 0.7650 0.8746
No log 2.7551 270 0.7915 -0.0612 0.7915 0.8897
No log 2.7755 272 0.9262 0.0920 0.9262 0.9624
No log 2.7959 274 0.8888 0.0709 0.8888 0.9427
No log 2.8163 276 0.9863 0.0364 0.9863 0.9931
No log 2.8367 278 1.0188 0.0416 1.0188 1.0093
No log 2.8571 280 0.9120 -0.0647 0.9120 0.9550
No log 2.8776 282 0.8662 -0.0462 0.8662 0.9307
No log 2.8980 284 0.8108 0.0323 0.8108 0.9004
No log 2.9184 286 0.7623 -0.0170 0.7623 0.8731
No log 2.9388 288 0.7448 -0.0204 0.7448 0.8630
No log 2.9592 290 0.7229 -0.0065 0.7229 0.8502
No log 2.9796 292 0.7237 -0.0660 0.7237 0.8507
No log 3.0 294 0.7507 0.0513 0.7507 0.8664
No log 3.0204 296 0.8105 0.0432 0.8105 0.9003
No log 3.0408 298 0.8398 0.1465 0.8398 0.9164
No log 3.0612 300 0.8454 0.1465 0.8454 0.9194
No log 3.0816 302 0.8210 0.1143 0.8210 0.9061
No log 3.1020 304 0.8158 0.1003 0.8158 0.9032
No log 3.1224 306 0.7747 0.0639 0.7747 0.8802
No log 3.1429 308 0.7821 -0.0912 0.7821 0.8844
No log 3.1633 310 0.7868 -0.1396 0.7868 0.8870
No log 3.1837 312 0.7957 0.1047 0.7957 0.8920
No log 3.2041 314 1.1241 -0.0306 1.1241 1.0603
No log 3.2245 316 1.1258 -0.0306 1.1258 1.0610
No log 3.2449 318 0.8530 0.1449 0.8530 0.9236
No log 3.2653 320 0.8698 -0.0687 0.8698 0.9327
No log 3.2857 322 0.9706 -0.0036 0.9706 0.9852
No log 3.3061 324 0.9065 0.0 0.9065 0.9521
No log 3.3265 326 0.9164 0.0208 0.9164 0.9573
No log 3.3469 328 0.9240 -0.0079 0.9240 0.9612
No log 3.3673 330 0.8824 0.0265 0.8824 0.9394
No log 3.3878 332 0.8340 -0.0108 0.8340 0.9132
No log 3.4082 334 0.7773 -0.0170 0.7773 0.8816
No log 3.4286 336 0.7448 -0.0560 0.7448 0.8630
No log 3.4490 338 0.7454 -0.0125 0.7454 0.8634
No log 3.4694 340 0.7860 0.0723 0.7860 0.8866
No log 3.4898 342 0.8030 -0.0426 0.8030 0.8961
No log 3.5102 344 0.9840 0.0104 0.9840 0.9919
No log 3.5306 346 0.9370 0.1039 0.9370 0.9680
No log 3.5510 348 0.9371 0.0851 0.9371 0.9680
No log 3.5714 350 1.1414 -0.0164 1.1414 1.0684
No log 3.5918 352 1.1038 -0.0146 1.1038 1.0506
No log 3.6122 354 0.9108 0.0851 0.9108 0.9544
No log 3.6327 356 0.8865 0.0024 0.8865 0.9416
No log 3.6531 358 0.8494 0.0393 0.8494 0.9216
No log 3.6735 360 0.8969 0.1281 0.8969 0.9471
No log 3.6939 362 0.9159 0.1239 0.9159 0.9571
No log 3.7143 364 0.7947 0.1097 0.7947 0.8915
No log 3.7347 366 0.7841 -0.0444 0.7841 0.8855
No log 3.7551 368 0.7815 -0.0483 0.7815 0.8841
No log 3.7755 370 0.8217 0.1440 0.8217 0.9065
No log 3.7959 372 0.8523 0.1281 0.8523 0.9232
No log 3.8163 374 0.8400 0.0917 0.8400 0.9165
No log 3.8367 376 0.8026 0.0828 0.8026 0.8959
No log 3.8571 378 0.8243 -0.0373 0.8243 0.9079
No log 3.8776 380 0.7963 0.0749 0.7963 0.8924
No log 3.8980 382 0.8081 0.0709 0.8081 0.8989
No log 3.9184 384 0.8207 0.0709 0.8207 0.9059
No log 3.9388 386 0.8487 0.0725 0.8487 0.9212
No log 3.9592 388 0.8664 0.0330 0.8664 0.9308
No log 3.9796 390 0.9001 0.1183 0.9001 0.9488
No log 4.0 392 0.9496 0.1188 0.9496 0.9745
No log 4.0204 394 0.8585 0.0295 0.8585 0.9266
No log 4.0408 396 0.8557 0.0940 0.8557 0.9250
No log 4.0612 398 0.8230 -0.0350 0.8230 0.9072
No log 4.0816 400 0.7860 0.0303 0.7860 0.8866
No log 4.1020 402 0.9129 0.0409 0.9129 0.9554
No log 4.1224 404 0.9842 -0.0496 0.9842 0.9921
No log 4.1429 406 0.8628 0.1395 0.8628 0.9289
No log 4.1633 408 0.7777 -0.1040 0.7777 0.8819
No log 4.1837 410 0.7951 -0.0350 0.7951 0.8917
No log 4.2041 412 0.7746 -0.1040 0.7746 0.8801
No log 4.2245 414 0.8389 0.1495 0.8389 0.9159
No log 4.2449 416 1.0612 -0.0164 1.0612 1.0302
No log 4.2653 418 1.0567 -0.0164 1.0567 1.0280
No log 4.2857 420 0.8579 0.1758 0.8579 0.9262
No log 4.3061 422 0.7487 0.0355 0.7487 0.8653
No log 4.3265 424 0.7531 -0.0488 0.7531 0.8678
No log 4.3469 426 0.7796 -0.0446 0.7796 0.8829
No log 4.3673 428 0.8281 0.0081 0.8281 0.9100
No log 4.3878 430 0.8129 -0.0026 0.8129 0.9016
No log 4.4082 432 0.8717 0.1001 0.8717 0.9337
No log 4.4286 434 0.8954 0.0504 0.8954 0.9463
No log 4.4490 436 0.8548 -0.0573 0.8548 0.9246
No log 4.4694 438 0.8560 -0.0573 0.8560 0.9252
No log 4.4898 440 0.8502 -0.0156 0.8502 0.9220
No log 4.5102 442 0.8465 -0.0156 0.8465 0.9201
No log 4.5306 444 0.8344 -0.0156 0.8344 0.9135
No log 4.5510 446 0.8892 0.0504 0.8892 0.9430
No log 4.5714 448 0.9410 0.0748 0.9410 0.9701
No log 4.5918 450 0.8913 0.0504 0.8913 0.9441
No log 4.6122 452 0.8610 0.0771 0.8610 0.9279
No log 4.6327 454 0.9032 0.0540 0.9032 0.9504
No log 4.6531 456 0.8528 0.0393 0.8528 0.9235
No log 4.6735 458 0.8326 0.0226 0.8326 0.9125
No log 4.6939 460 0.8660 0.0538 0.8660 0.9306
No log 4.7143 462 0.8383 0.0574 0.8383 0.9156
No log 4.7347 464 0.8199 -0.0108 0.8199 0.9055
No log 4.7551 466 0.8184 -0.0108 0.8184 0.9046
No log 4.7755 468 0.8348 0.0690 0.8348 0.9137
No log 4.7959 470 0.8445 -0.0108 0.8445 0.9189
No log 4.8163 472 0.8374 0.0308 0.8374 0.9151
No log 4.8367 474 0.8161 -0.0079 0.8161 0.9034
No log 4.8571 476 0.8035 -0.0426 0.8035 0.8964
No log 4.8776 478 0.7766 0.0357 0.7766 0.8813
No log 4.8980 480 0.7540 0.1199 0.7540 0.8683
No log 4.9184 482 0.7373 0.0768 0.7373 0.8587
No log 4.9388 484 0.7470 0.1199 0.7470 0.8643
No log 4.9592 486 0.7489 0.0732 0.7489 0.8654
No log 4.9796 488 0.7581 0.0821 0.7581 0.8707
No log 5.0 490 0.7691 0.0690 0.7691 0.8770
No log 5.0204 492 0.8657 0.0909 0.8657 0.9304
No log 5.0408 494 0.8795 0.0909 0.8795 0.9378
No log 5.0612 496 0.8089 0.1449 0.8089 0.8994
No log 5.0816 498 0.7533 0.1627 0.7533 0.8679
0.3063 5.1020 500 0.7263 0.1311 0.7263 0.8522
0.3063 5.1224 502 0.7447 0.1311 0.7447 0.8630
0.3063 5.1429 504 0.7876 0.1506 0.7876 0.8875
0.3063 5.1633 506 0.9705 0.1105 0.9705 0.9851
0.3063 5.1837 508 1.2004 0.0291 1.2004 1.0956
0.3063 5.2041 510 1.0935 -0.0528 1.0935 1.0457
0.3063 5.2245 512 0.8487 0.0913 0.8487 0.9212
0.3063 5.2449 514 0.8109 0.0053 0.8109 0.9005
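Note that the summary metrics at the top of this card match the final logged row (epoch 5.2449), not the best checkpoint: validation loss dipped as low as 0.7229 at epoch 2.9592, and QWK peaked well above the final 0.0053. A quick sketch of selecting the best row, over a handful of rows copied from the table above:

```python
# (epoch, step, validation_loss, qwk) — sample rows copied from the training log
log = [
    (0.1224, 12, 0.7657, 0.0296),
    (1.4694, 144, 0.7421, -0.0612),
    (2.9592, 290, 0.7229, -0.0065),
    (4.9184, 482, 0.7373, 0.0768),
    (5.2449, 514, 0.8109, 0.0053),  # final row = reported summary metrics
]
best_by_loss = min(log, key=lambda r: r[2])  # lowest validation loss
best_by_qwk = max(log, key=lambda r: r[3])   # highest agreement
print(best_by_loss)  # → (2.9592, 290, 0.7229, -0.0065)
print(best_by_qwk)   # → (4.9184, 482, 0.7373, 0.0768)
```

When retraining, `load_best_model_at_end=True` with an appropriate `metric_for_best_model` in the Transformers `TrainingArguments` would keep the best checkpoint instead of the last one.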

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (F32 safetensors)
Model tree: MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k19_task3_organization, fine-tuned from aubmindlab/bert-base-arabertv02.