ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k8_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9202
  • Qwk: 0.4731
  • Mse: 0.9202
  • Rmse: 0.9592
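
The sketch below shows one way to load this checkpoint for inference. It is only an illustration: the card does not document the head type, score scale, or any Arabic preprocessing used during training, so the use of AutoModelForSequenceClassification and the single-score reading are assumptions. The repository id is taken from the Hub listing for this model.

```python
# Minimal inference sketch. Assumptions: the checkpoint loads with a standard
# AutoModelForSequenceClassification head and produces a single organization
# score per essay; the score scale is not documented in this card.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run2_AugV5_k8_task2_organization"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

essay = "..."  # an Arabic essay to be scored for organization
inputs = tokenizer(essay, truncation=True, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# A single-logit head is read as a regression score; a multi-logit head is
# read as a class prediction.
score = logits.squeeze().item() if logits.shape[-1] == 1 else logits.argmax(dim=-1).item()
print(score)
```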

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
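
For reference, these settings could be expressed with transformers.TrainingArguments roughly as in the sketch below. The output directory, evaluation cadence, and logging interval are assumptions inferred from the results table (metrics reported every 2 steps, training loss first logged at step 500); the original training script is not published in this card.

```python
# Illustrative TrainingArguments matching the hyperparameters listed above.
# output_dir, eval_strategy, eval_steps, and logging_steps are assumptions,
# not values taken from the original (unpublished) training script.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_task2_organization",  # hypothetical path
    learning_rate=2e-05,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    eval_strategy="steps",
    eval_steps=2,        # the table below reports metrics every 2 steps
    logging_steps=500,   # training loss first appears at step 500
)
```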

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0444 2 4.7103 0.0010 4.7103 2.1703
No log 0.0889 4 3.6259 -0.0013 3.6259 1.9042
No log 0.1333 6 1.7591 0.0504 1.7591 1.3263
No log 0.1778 8 1.2955 0.1080 1.2955 1.1382
No log 0.2222 10 1.2273 0.1417 1.2273 1.1078
No log 0.2667 12 1.2596 0.0454 1.2596 1.1223
No log 0.3111 14 1.2362 0.1076 1.2362 1.1118
No log 0.3556 16 1.2930 0.0860 1.2930 1.1371
No log 0.4 18 1.4058 -0.0066 1.4058 1.1856
No log 0.4444 20 1.4248 0.0 1.4248 1.1936
No log 0.4889 22 1.4966 0.0 1.4966 1.2233
No log 0.5333 24 1.6575 0.0 1.6575 1.2874
No log 0.5778 26 1.5064 0.0 1.5064 1.2274
No log 0.6222 28 1.2063 0.1593 1.2063 1.0983
No log 0.6667 30 1.1513 0.1962 1.1513 1.0730
No log 0.7111 32 1.1767 0.1962 1.1767 1.0847
No log 0.7556 34 1.2182 0.0454 1.2182 1.1037
No log 0.8 36 1.2162 0.0700 1.2162 1.1028
No log 0.8444 38 1.3141 0.0627 1.3141 1.1463
No log 0.8889 40 1.3501 0.0600 1.3501 1.1620
No log 0.9333 42 1.1879 0.1314 1.1879 1.0899
No log 0.9778 44 1.1832 0.1351 1.1832 1.0877
No log 1.0222 46 1.2570 0.0979 1.2570 1.1212
No log 1.0667 48 1.3473 0.0232 1.3473 1.1607
No log 1.1111 50 1.2601 0.1753 1.2601 1.1226
No log 1.1556 52 1.2273 0.1288 1.2273 1.1078
No log 1.2 54 1.2460 0.0449 1.2460 1.1163
No log 1.2444 56 1.1969 0.1809 1.1969 1.0940
No log 1.2889 58 1.1185 0.3648 1.1185 1.0576
No log 1.3333 60 1.1243 0.2574 1.1243 1.0603
No log 1.3778 62 1.1859 0.1852 1.1859 1.0890
No log 1.4222 64 1.2016 0.1952 1.2016 1.0962
No log 1.4667 66 1.1577 0.2043 1.1577 1.0760
No log 1.5111 68 1.0515 0.1343 1.0515 1.0254
No log 1.5556 70 0.9962 0.2969 0.9962 0.9981
No log 1.6 72 0.9498 0.3298 0.9498 0.9746
No log 1.6444 74 0.9498 0.3578 0.9498 0.9746
No log 1.6889 76 0.9378 0.3631 0.9378 0.9684
No log 1.7333 78 0.8913 0.3537 0.8913 0.9441
No log 1.7778 80 0.8960 0.4087 0.8960 0.9466
No log 1.8222 82 0.8836 0.3787 0.8836 0.9400
No log 1.8667 84 0.9146 0.4898 0.9146 0.9564
No log 1.9111 86 0.9351 0.5304 0.9351 0.9670
No log 1.9556 88 1.0180 0.4536 1.0180 1.0090
No log 2.0 90 1.0627 0.3913 1.0627 1.0309
No log 2.0444 92 0.9990 0.3514 0.9990 0.9995
No log 2.0889 94 0.9144 0.3953 0.9144 0.9563
No log 2.1333 96 0.8840 0.5262 0.8840 0.9402
No log 2.1778 98 0.9001 0.5916 0.9001 0.9488
No log 2.2222 100 0.8879 0.4393 0.8879 0.9423
No log 2.2667 102 0.9597 0.4201 0.9597 0.9796
No log 2.3111 104 0.9830 0.3711 0.9830 0.9915
No log 2.3556 106 1.1186 0.3184 1.1186 1.0577
No log 2.4 108 1.1997 0.3991 1.1997 1.0953
No log 2.4444 110 1.0094 0.4410 1.0094 1.0047
No log 2.4889 112 0.8769 0.5023 0.8769 0.9364
No log 2.5333 114 0.8695 0.4939 0.8695 0.9325
No log 2.5778 116 0.8802 0.4805 0.8802 0.9382
No log 2.6222 118 0.8671 0.4794 0.8671 0.9312
No log 2.6667 120 0.8540 0.4805 0.8540 0.9241
No log 2.7111 122 0.8706 0.4546 0.8706 0.9331
No log 2.7556 124 1.0324 0.4545 1.0324 1.0161
No log 2.8 126 1.0976 0.4845 1.0976 1.0477
No log 2.8444 128 0.9972 0.4824 0.9972 0.9986
No log 2.8889 130 0.9115 0.4851 0.9115 0.9547
No log 2.9333 132 0.8015 0.4328 0.8015 0.8953
No log 2.9778 134 0.7437 0.5336 0.7437 0.8624
No log 3.0222 136 0.7348 0.6048 0.7348 0.8572
No log 3.0667 138 0.7343 0.5159 0.7343 0.8569
No log 3.1111 140 0.8919 0.5286 0.8919 0.9444
No log 3.1556 142 1.0766 0.5246 1.0766 1.0376
No log 3.2 144 0.9437 0.5270 0.9437 0.9714
No log 3.2444 146 0.7683 0.5418 0.7683 0.8765
No log 3.2889 148 0.8948 0.4783 0.8948 0.9459
No log 3.3333 150 0.8720 0.5133 0.8720 0.9338
No log 3.3778 152 0.7787 0.4676 0.7787 0.8824
No log 3.4222 154 0.8187 0.4852 0.8187 0.9048
No log 3.4667 156 1.0655 0.5043 1.0655 1.0322
No log 3.5111 158 1.1923 0.4809 1.1923 1.0919
No log 3.5556 160 1.1139 0.4567 1.1139 1.0554
No log 3.6 162 0.9206 0.4681 0.9206 0.9595
No log 3.6444 164 0.8665 0.4775 0.8665 0.9309
No log 3.6889 166 0.8862 0.4468 0.8862 0.9414
No log 3.7333 168 0.8832 0.4166 0.8832 0.9398
No log 3.7778 170 0.9308 0.4958 0.9308 0.9648
No log 3.8222 172 0.9768 0.5083 0.9768 0.9883
No log 3.8667 174 0.9647 0.4958 0.9647 0.9822
No log 3.9111 176 0.8788 0.4043 0.8788 0.9375
No log 3.9556 178 0.8227 0.4282 0.8227 0.9070
No log 4.0 180 0.8285 0.4737 0.8285 0.9102
No log 4.0444 182 0.8546 0.4976 0.8546 0.9244
No log 4.0889 184 0.8640 0.4976 0.8640 0.9295
No log 4.1333 186 0.8621 0.5098 0.8621 0.9285
No log 4.1778 188 0.8799 0.4784 0.8799 0.9381
No log 4.2222 190 0.8784 0.5351 0.8784 0.9372
No log 4.2667 192 0.8634 0.3902 0.8634 0.9292
No log 4.3111 194 0.9161 0.3936 0.9161 0.9571
No log 4.3556 196 1.1319 0.4186 1.1319 1.0639
No log 4.4 198 1.3075 0.3986 1.3075 1.1435
No log 4.4444 200 1.2915 0.3643 1.2915 1.1364
No log 4.4889 202 1.0995 0.4071 1.0995 1.0486
No log 4.5333 204 0.9185 0.3778 0.9185 0.9584
No log 4.5778 206 0.8669 0.4488 0.8669 0.9311
No log 4.6222 208 0.8741 0.4282 0.8741 0.9349
No log 4.6667 210 0.8984 0.4381 0.8984 0.9479
No log 4.7111 212 0.9114 0.4381 0.9114 0.9547
No log 4.7556 214 0.9082 0.3868 0.9082 0.9530
No log 4.8 216 0.8895 0.3762 0.8895 0.9431
No log 4.8444 218 0.8875 0.3705 0.8875 0.9420
No log 4.8889 220 0.8884 0.4122 0.8884 0.9425
No log 4.9333 222 0.9317 0.5046 0.9317 0.9652
No log 4.9778 224 1.0396 0.4219 1.0396 1.0196
No log 5.0222 226 1.0557 0.3907 1.0557 1.0275
No log 5.0667 228 0.9512 0.4261 0.9512 0.9753
No log 5.1111 230 0.9101 0.4164 0.9101 0.9540
No log 5.1556 232 0.9360 0.4224 0.9360 0.9675
No log 5.2 234 0.9702 0.4955 0.9702 0.9850
No log 5.2444 236 0.9250 0.4839 0.9250 0.9617
No log 5.2889 238 0.9242 0.4422 0.9242 0.9614
No log 5.3333 240 0.9289 0.4663 0.9289 0.9638
No log 5.3778 242 0.9235 0.4254 0.9235 0.9610
No log 5.4222 244 0.8812 0.3816 0.8812 0.9387
No log 5.4667 246 0.8530 0.4455 0.8530 0.9236
No log 5.5111 248 0.8507 0.4352 0.8507 0.9223
No log 5.5556 250 0.8777 0.4654 0.8777 0.9368
No log 5.6 252 0.9979 0.4436 0.9979 0.9990
No log 5.6444 254 1.0424 0.4618 1.0424 1.0210
No log 5.6889 256 0.9256 0.4733 0.9256 0.9621
No log 5.7333 258 0.8315 0.4676 0.8315 0.9119
No log 5.7778 260 0.8427 0.4933 0.8427 0.9180
No log 5.8222 262 0.8469 0.4951 0.8469 0.9202
No log 5.8667 264 0.9267 0.4742 0.9267 0.9627
No log 5.9111 266 0.9961 0.4786 0.9961 0.9980
No log 5.9556 268 0.9989 0.4685 0.9989 0.9995
No log 6.0 270 0.9082 0.4857 0.9082 0.9530
No log 6.0444 272 0.8281 0.4519 0.8281 0.9100
No log 6.0889 274 0.8090 0.4690 0.8090 0.8994
No log 6.1333 276 0.7957 0.4512 0.7957 0.8920
No log 6.1778 278 0.8065 0.4998 0.8065 0.8980
No log 6.2222 280 0.8287 0.5295 0.8287 0.9103
No log 6.2667 282 0.8806 0.4898 0.8806 0.9384
No log 6.3111 284 0.9206 0.4886 0.9206 0.9595
No log 6.3556 286 0.9876 0.4332 0.9876 0.9938
No log 6.4 288 0.9089 0.4886 0.9089 0.9533
No log 6.4444 290 0.8579 0.5240 0.8579 0.9262
No log 6.4889 292 0.8542 0.4799 0.8542 0.9242
No log 6.5333 294 0.8389 0.4343 0.8389 0.9159
No log 6.5778 296 0.8317 0.4646 0.8317 0.9120
No log 6.6222 298 0.8517 0.4470 0.8517 0.9229
No log 6.6667 300 0.9531 0.5086 0.9531 0.9763
No log 6.7111 302 1.0873 0.4403 1.0873 1.0427
No log 6.7556 304 1.0552 0.4216 1.0552 1.0272
No log 6.8 306 0.9363 0.4927 0.9363 0.9676
No log 6.8444 308 0.8729 0.3908 0.8729 0.9343
No log 6.8889 310 0.8677 0.4142 0.8677 0.9315
No log 6.9333 312 0.8773 0.3914 0.8773 0.9366
No log 6.9778 314 0.9156 0.5130 0.9156 0.9569
No log 7.0222 316 0.9637 0.4655 0.9637 0.9817
No log 7.0667 318 0.9334 0.5086 0.9334 0.9661
No log 7.1111 320 0.8778 0.4934 0.8778 0.9369
No log 7.1556 322 0.8435 0.4505 0.8435 0.9184
No log 7.2 324 0.8450 0.4676 0.8450 0.9192
No log 7.2444 326 0.8666 0.4966 0.8666 0.9309
No log 7.2889 328 0.9229 0.5058 0.9229 0.9607
No log 7.3333 330 0.9838 0.4521 0.9838 0.9919
No log 7.3778 332 0.9545 0.4918 0.9545 0.9770
No log 7.4222 334 0.8625 0.5421 0.8625 0.9287
No log 7.4667 336 0.8088 0.4575 0.8088 0.8993
No log 7.5111 338 0.7914 0.4824 0.7914 0.8896
No log 7.5556 340 0.8032 0.4686 0.8032 0.8962
No log 7.6 342 0.8017 0.4389 0.8017 0.8954
No log 7.6444 344 0.8116 0.4824 0.8116 0.9009
No log 7.6889 346 0.9044 0.4943 0.9044 0.9510
No log 7.7333 348 0.9801 0.4587 0.9801 0.9900
No log 7.7778 350 0.9688 0.4702 0.9688 0.9843
No log 7.8222 352 0.8802 0.5058 0.8802 0.9382
No log 7.8667 354 0.8456 0.5374 0.8456 0.9195
No log 7.9111 356 0.7926 0.5014 0.7926 0.8903
No log 7.9556 358 0.7790 0.4824 0.7790 0.8826
No log 8.0 360 0.7786 0.4824 0.7786 0.8824
No log 8.0444 362 0.7972 0.4966 0.7972 0.8929
No log 8.0889 364 0.7920 0.4824 0.7920 0.8899
No log 8.1333 366 0.7957 0.4824 0.7957 0.8920
No log 8.1778 368 0.8110 0.5032 0.8110 0.9006
No log 8.2222 370 0.8619 0.5462 0.8619 0.9284
No log 8.2667 372 0.9446 0.5030 0.9446 0.9719
No log 8.3111 374 0.9710 0.5016 0.9710 0.9854
No log 8.3556 376 0.9163 0.5016 0.9163 0.9572
No log 8.4 378 0.8641 0.4631 0.8641 0.9296
No log 8.4444 380 0.8673 0.4838 0.8673 0.9313
No log 8.4889 382 0.8709 0.5016 0.8709 0.9332
No log 8.5333 384 0.8395 0.5014 0.8395 0.9163
No log 8.5778 386 0.8438 0.4742 0.8438 0.9186
No log 8.6222 388 0.8850 0.5016 0.8850 0.9407
No log 8.6667 390 0.9629 0.4617 0.9629 0.9813
No log 8.7111 392 0.9710 0.4617 0.9710 0.9854
No log 8.7556 394 0.9058 0.4091 0.9058 0.9517
No log 8.8 396 0.8723 0.4224 0.8723 0.9340
No log 8.8444 398 0.8395 0.5141 0.8395 0.9163
No log 8.8889 400 0.8418 0.5340 0.8418 0.9175
No log 8.9333 402 0.8940 0.5260 0.8940 0.9455
No log 8.9778 404 0.9115 0.5260 0.9115 0.9547
No log 9.0222 406 0.8805 0.5260 0.8805 0.9383
No log 9.0667 408 0.8677 0.5440 0.8677 0.9315
No log 9.1111 410 0.8668 0.4851 0.8668 0.9310
No log 9.1556 412 0.8265 0.5565 0.8265 0.9091
No log 9.2 414 0.7948 0.4601 0.7948 0.8915
No log 9.2444 416 0.8047 0.4601 0.8047 0.8970
No log 9.2889 418 0.8473 0.4893 0.8473 0.9205
No log 9.3333 420 0.9231 0.4641 0.9231 0.9608
No log 9.3778 422 0.9679 0.4587 0.9679 0.9838
No log 9.4222 424 0.9471 0.4426 0.9471 0.9732
No log 9.4667 426 0.9145 0.4894 0.9145 0.9563
No log 9.5111 428 0.8799 0.5080 0.8799 0.9380
No log 9.5556 430 0.8709 0.5080 0.8709 0.9332
No log 9.6 432 0.8570 0.4998 0.8570 0.9257
No log 9.6444 434 0.8607 0.4998 0.8607 0.9277
No log 9.6889 436 0.8456 0.5130 0.8456 0.9196
No log 9.7333 438 0.8708 0.4526 0.8708 0.9332
No log 9.7778 440 0.9019 0.4479 0.9019 0.9497
No log 9.8222 442 0.8753 0.4519 0.8753 0.9356
No log 9.8667 444 0.8417 0.4928 0.8417 0.9175
No log 9.9111 446 0.8385 0.4098 0.8385 0.9157
No log 9.9556 448 0.8394 0.4098 0.8394 0.9162
No log 10.0 450 0.8472 0.5047 0.8472 0.9204
No log 10.0444 452 0.9492 0.4601 0.9492 0.9743
No log 10.0889 454 1.0287 0.4574 1.0287 1.0143
No log 10.1333 456 0.9903 0.4580 0.9903 0.9952
No log 10.1778 458 0.9279 0.4440 0.9279 0.9633
No log 10.2222 460 0.8514 0.4657 0.8514 0.9227
No log 10.2667 462 0.8161 0.4501 0.8161 0.9034
No log 10.3111 464 0.8089 0.4367 0.8089 0.8994
No log 10.3556 466 0.7863 0.4960 0.7863 0.8868
No log 10.4 468 0.7713 0.5043 0.7713 0.8782
No log 10.4444 470 0.7889 0.5585 0.7889 0.8882
No log 10.4889 472 0.8100 0.5490 0.8100 0.9000
No log 10.5333 474 0.8467 0.54 0.8467 0.9202
No log 10.5778 476 0.9434 0.5113 0.9434 0.9713
No log 10.6222 478 0.9994 0.5016 0.9994 0.9997
No log 10.6667 480 0.9765 0.4851 0.9765 0.9882
No log 10.7111 482 1.0020 0.4851 1.0020 1.0010
No log 10.7556 484 0.9927 0.5030 0.9927 0.9963
No log 10.8 486 0.9298 0.5217 0.9298 0.9643
No log 10.8444 488 0.8631 0.5047 0.8631 0.9290
No log 10.8889 490 0.8461 0.4962 0.8461 0.9199
No log 10.9333 492 0.8496 0.5392 0.8496 0.9217
No log 10.9778 494 0.8515 0.5392 0.8515 0.9228
No log 11.0222 496 0.8458 0.5392 0.8458 0.9197
No log 11.0667 498 0.8433 0.5721 0.8433 0.9183
0.3819 11.1111 500 0.8312 0.4946 0.8312 0.9117
0.3819 11.1556 502 0.8220 0.4979 0.8220 0.9067
0.3819 11.2 504 0.8247 0.4722 0.8247 0.9081
0.3819 11.2444 506 0.8494 0.5149 0.8494 0.9216
0.3819 11.2889 508 0.8731 0.5594 0.8731 0.9344
0.3819 11.3333 510 0.8522 0.5149 0.8522 0.9231
0.3819 11.3778 512 0.8468 0.5149 0.8468 0.9202
0.3819 11.4222 514 0.8564 0.4777 0.8564 0.9254
0.3819 11.4667 516 0.9017 0.5063 0.9017 0.9496
0.3819 11.5111 518 0.9351 0.4863 0.9351 0.9670
0.3819 11.5556 520 0.9207 0.4466 0.9207 0.9595
0.3819 11.6 522 0.9277 0.4466 0.9277 0.9632
0.3819 11.6444 524 0.9332 0.4596 0.9332 0.9660
0.3819 11.6889 526 0.9202 0.4731 0.9202 0.9592

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1