ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k8_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7587
  • QWK (quadratic weighted kappa): 0.5426
  • MSE: 0.7587
  • RMSE: 0.8710
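The metrics above are related: the loss is the mean squared error, and RMSE is its square root. A minimal pure-Python sketch of how these metrics are computed (the helper names `quadratic_weighted_kappa` and `mse` are illustrative, not taken from the training code):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """QWK: agreement between ordinal labels, penalising disagreements
    by the squared distance between classes."""
    # Observed confusion matrix
    O = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    n = len(y_true)
    hist_true = [sum(row) for row in O]
    hist_pred = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2
            expected = hist_true[i] * hist_pred[j] / n  # chance agreement
            num += w * O[i][j]
            den += w * expected
    return 1.0 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# RMSE is the square root of MSE, which is why the card reports
# Loss == MSE and RMSE == sqrt(MSE):
print(f"{math.sqrt(0.7587):.4f}")  # 0.8710
```

QWK is the natural headline metric here because essay-organization scores are ordinal: predicting 1 when the true score is 4 is penalised far more than predicting 3.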

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
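The linear scheduler decays the learning rate from its initial value to zero over the course of training. A minimal sketch of that schedule, assuming no warmup (the card does not list warmup steps; the actual implementation in Transformers is `get_linear_schedule_with_warmup`):

```python
def linear_lr(step, total_steps, base_lr=2e-5):
    """Linearly decay the learning rate from base_lr to 0 over
    total_steps optimizer steps (lr_scheduler_type: linear, no warmup)."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

# Full rate at the start, half at the midpoint, zero at the end:
print(linear_lr(0, 1000), linear_lr(500, 1000), linear_lr(1000, 1000))
```

Note that `total_steps` is derived from `num_epochs` and the dataloader length, so with `num_epochs: 100` the decay is spread over the full configured run even if training stops earlier.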

Training results

Training loss is logged every 500 steps, so rows before step 500 show "No log". The log ends at epoch 12.0 (step 540), well short of the configured 100 epochs, which suggests early stopping or a truncated log. Columns: training loss, epoch, step, validation loss, QWK, MSE, RMSE.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0444 2 4.7089 0.0010 4.7089 2.1700
No log 0.0889 4 2.6298 0.0104 2.6298 1.6217
No log 0.1333 6 1.5179 0.0385 1.5179 1.2320
No log 0.1778 8 1.2134 0.1728 1.2134 1.1015
No log 0.2222 10 1.1498 0.1314 1.1498 1.0723
No log 0.2667 12 1.2481 0.1505 1.2481 1.1172
No log 0.3111 14 1.5357 0.0608 1.5357 1.2392
No log 0.3556 16 1.1061 0.3671 1.1061 1.0517
No log 0.4 18 1.0439 0.3431 1.0439 1.0217
No log 0.4444 20 1.0966 0.3602 1.0966 1.0472
No log 0.4889 22 1.0948 0.3937 1.0948 1.0463
No log 0.5333 24 1.1525 0.2984 1.1525 1.0735
No log 0.5778 26 1.0940 0.2938 1.0940 1.0459
No log 0.6222 28 1.2436 0.1842 1.2436 1.1151
No log 0.6667 30 1.2123 0.2240 1.2123 1.1011
No log 0.7111 32 1.1738 0.2410 1.1738 1.0834
No log 0.7556 34 1.1315 0.2361 1.1315 1.0637
No log 0.8 36 1.1837 0.2581 1.1837 1.0880
No log 0.8444 38 1.1709 0.2832 1.1709 1.0821
No log 0.8889 40 1.2053 0.2876 1.2053 1.0979
No log 0.9333 42 1.1150 0.3418 1.1150 1.0560
No log 0.9778 44 1.0365 0.4517 1.0365 1.0181
No log 1.0222 46 1.0426 0.4472 1.0426 1.0211
No log 1.0667 48 1.1275 0.4061 1.1275 1.0618
No log 1.1111 50 1.1280 0.4061 1.1280 1.0621
No log 1.1556 52 1.0896 0.5117 1.0896 1.0438
No log 1.2 54 1.0321 0.4563 1.0321 1.0159
No log 1.2444 56 1.0051 0.4536 1.0051 1.0025
No log 1.2889 58 1.2811 0.3631 1.2811 1.1318
No log 1.3333 60 1.7082 0.2902 1.7082 1.3070
No log 1.3778 62 1.6029 0.2461 1.6029 1.2661
No log 1.4222 64 1.0199 0.3453 1.0199 1.0099
No log 1.4667 66 1.0024 0.4 1.0024 1.0012
No log 1.5111 68 0.9584 0.4200 0.9584 0.9790
No log 1.5556 70 0.8837 0.5498 0.8837 0.9401
No log 1.6 72 0.8333 0.5579 0.8333 0.9128
No log 1.6444 74 0.8356 0.5921 0.8356 0.9141
No log 1.6889 76 1.0023 0.5570 1.0023 1.0011
No log 1.7333 78 1.0254 0.5370 1.0254 1.0126
No log 1.7778 80 0.8300 0.5067 0.8300 0.9110
No log 1.8222 82 0.8296 0.5067 0.8296 0.9108
No log 1.8667 84 0.8576 0.5753 0.8576 0.9261
No log 1.9111 86 0.9357 0.6170 0.9357 0.9673
No log 1.9556 88 0.9104 0.6170 0.9104 0.9542
No log 2.0 90 0.8154 0.5040 0.8154 0.9030
No log 2.0444 92 0.9356 0.5264 0.9356 0.9673
No log 2.0889 94 0.8842 0.4992 0.8842 0.9403
No log 2.1333 96 0.9052 0.6256 0.9052 0.9514
No log 2.1778 98 1.1456 0.4674 1.1456 1.0703
No log 2.2222 100 1.1103 0.4529 1.1103 1.0537
No log 2.2667 102 0.8719 0.5898 0.8719 0.9337
No log 2.3111 104 0.9338 0.4741 0.9338 0.9663
No log 2.3556 106 1.1041 0.3471 1.1041 1.0508
No log 2.4 108 0.9406 0.4439 0.9406 0.9698
No log 2.4444 110 0.8819 0.5012 0.8819 0.9391
No log 2.4889 112 1.1478 0.4339 1.1478 1.0713
No log 2.5333 114 1.0902 0.4881 1.0902 1.0441
No log 2.5778 116 0.8721 0.5171 0.8721 0.9339
No log 2.6222 118 0.8463 0.4655 0.8463 0.9199
No log 2.6667 120 0.8592 0.5110 0.8592 0.9269
No log 2.7111 122 0.8133 0.5241 0.8133 0.9018
No log 2.7556 124 0.8201 0.5420 0.8201 0.9056
No log 2.8 126 0.8242 0.5251 0.8242 0.9078
No log 2.8444 128 0.8502 0.5305 0.8502 0.9221
No log 2.8889 130 0.8508 0.5287 0.8508 0.9224
No log 2.9333 132 0.8961 0.5370 0.8961 0.9466
No log 2.9778 134 0.8634 0.4874 0.8634 0.9292
No log 3.0222 136 0.8504 0.5290 0.8504 0.9222
No log 3.0667 138 0.8925 0.4769 0.8925 0.9447
No log 3.1111 140 0.8818 0.5367 0.8818 0.9390
No log 3.1556 142 0.9729 0.5135 0.9729 0.9863
No log 3.2 144 1.2401 0.4618 1.2401 1.1136
No log 3.2444 146 1.1403 0.4631 1.1403 1.0678
No log 3.2889 148 0.8792 0.5370 0.8792 0.9376
No log 3.3333 150 0.8388 0.3681 0.8388 0.9159
No log 3.3778 152 0.8458 0.3390 0.8458 0.9196
No log 3.4222 154 0.8217 0.3541 0.8217 0.9065
No log 3.4667 156 0.8632 0.5753 0.8632 0.9291
No log 3.5111 158 0.9115 0.5179 0.9115 0.9547
No log 3.5556 160 0.9781 0.5202 0.9781 0.9890
No log 3.6 162 1.0221 0.5202 1.0221 1.0110
No log 3.6444 164 0.8413 0.5697 0.8413 0.9172
No log 3.6889 166 0.8216 0.5821 0.8216 0.9064
No log 3.7333 168 0.8148 0.5458 0.8148 0.9027
No log 3.7778 170 0.7983 0.5023 0.7983 0.8935
No log 3.8222 172 0.7988 0.5275 0.7988 0.8937
No log 3.8667 174 0.8237 0.4907 0.8237 0.9076
No log 3.9111 176 0.8609 0.5291 0.8609 0.9278
No log 3.9556 178 0.8337 0.5635 0.8337 0.9131
No log 4.0 180 0.8399 0.5896 0.8399 0.9165
No log 4.0444 182 0.8403 0.5790 0.8403 0.9167
No log 4.0889 184 0.9058 0.5907 0.9058 0.9517
No log 4.1333 186 0.8957 0.5781 0.8957 0.9464
No log 4.1778 188 0.8520 0.6001 0.8520 0.9231
No log 4.2222 190 0.8119 0.5977 0.8119 0.9011
No log 4.2667 192 0.8066 0.5532 0.8066 0.8981
No log 4.3111 194 0.8268 0.5220 0.8268 0.9093
No log 4.3556 196 0.9076 0.5439 0.9076 0.9527
No log 4.4 198 0.8772 0.5051 0.8772 0.9366
No log 4.4444 200 0.8798 0.4693 0.8798 0.9380
No log 4.4889 202 0.9523 0.5160 0.9523 0.9758
No log 4.5333 204 1.0008 0.4841 1.0008 1.0004
No log 4.5778 206 0.9467 0.4862 0.9467 0.9730
No log 4.6222 208 0.8240 0.4981 0.8240 0.9077
No log 4.6667 210 0.8071 0.4328 0.8071 0.8984
No log 4.7111 212 0.7882 0.4872 0.7882 0.8878
No log 4.7556 214 0.8518 0.6182 0.8518 0.9229
No log 4.8 216 1.1068 0.4783 1.1068 1.0520
No log 4.8444 218 1.3450 0.4495 1.3450 1.1597
No log 4.8889 220 1.2580 0.4411 1.2580 1.1216
No log 4.9333 222 0.9531 0.5614 0.9531 0.9762
No log 4.9778 224 0.8044 0.4818 0.8044 0.8969
No log 5.0222 226 0.8355 0.4648 0.8355 0.9140
No log 5.0667 228 0.8187 0.4992 0.8187 0.9048
No log 5.1111 230 0.7818 0.5864 0.7818 0.8842
No log 5.1556 232 0.9399 0.5592 0.9399 0.9695
No log 5.2 234 1.0239 0.5548 1.0239 1.0119
No log 5.2444 236 0.9587 0.5389 0.9587 0.9791
No log 5.2889 238 0.9047 0.5592 0.9047 0.9512
No log 5.3333 240 0.8239 0.5519 0.8239 0.9077
No log 5.3778 242 0.7998 0.6222 0.7998 0.8943
No log 5.4222 244 0.7965 0.5851 0.7965 0.8925
No log 5.4667 246 0.8228 0.5440 0.8228 0.9071
No log 5.5111 248 0.8300 0.5283 0.8300 0.9111
No log 5.5556 250 0.9501 0.5511 0.9501 0.9747
No log 5.6 252 1.0353 0.4952 1.0353 1.0175
No log 5.6444 254 0.9604 0.4372 0.9604 0.9800
No log 5.6889 256 0.8642 0.3094 0.8642 0.9296
No log 5.7333 258 0.8367 0.4476 0.8367 0.9147
No log 5.7778 260 0.8731 0.5649 0.8731 0.9344
No log 5.8222 262 1.0602 0.4977 1.0602 1.0296
No log 5.8667 264 1.1719 0.4856 1.1719 1.0825
No log 5.9111 266 1.0303 0.5253 1.0303 1.0150
No log 5.9556 268 0.8615 0.5935 0.8615 0.9282
No log 6.0 270 0.8286 0.4575 0.8286 0.9103
No log 6.0444 272 0.8313 0.4575 0.8313 0.9118
No log 6.0889 274 0.8351 0.4889 0.8351 0.9138
No log 6.1333 276 0.9183 0.5287 0.9183 0.9583
No log 6.1778 278 1.1073 0.5040 1.1073 1.0523
No log 6.2222 280 1.0877 0.5040 1.0877 1.0429
No log 6.2667 282 0.8952 0.4930 0.8952 0.9462
No log 6.3111 284 0.8357 0.4979 0.8357 0.9142
No log 6.3556 286 0.8485 0.4781 0.8485 0.9211
No log 6.4 288 0.8298 0.5349 0.8298 0.9109
No log 6.4444 290 0.9105 0.5183 0.9105 0.9542
No log 6.4889 292 1.0218 0.5094 1.0218 1.0108
No log 6.5333 294 1.0185 0.5124 1.0185 1.0092
No log 6.5778 296 0.9362 0.5374 0.9362 0.9676
No log 6.6222 298 0.9099 0.5 0.9099 0.9539
No log 6.6667 300 0.8668 0.5244 0.8668 0.9310
No log 6.7111 302 0.8396 0.5587 0.8396 0.9163
No log 6.7556 304 0.8168 0.5356 0.8168 0.9037
No log 6.8 306 0.8127 0.5806 0.8127 0.9015
No log 6.8444 308 0.8228 0.5796 0.8228 0.9071
No log 6.8889 310 0.8748 0.5575 0.8748 0.9353
No log 6.9333 312 0.8517 0.5575 0.8517 0.9229
No log 6.9778 314 0.8458 0.5379 0.8458 0.9197
No log 7.0222 316 0.8531 0.5326 0.8531 0.9236
No log 7.0667 318 0.8440 0.4618 0.8440 0.9187
No log 7.1111 320 0.8980 0.3989 0.8980 0.9476
No log 7.1556 322 0.9075 0.3989 0.9075 0.9526
No log 7.2 324 0.8778 0.3661 0.8778 0.9369
No log 7.2444 326 1.0184 0.5673 1.0184 1.0092
No log 7.2889 328 1.1696 0.5144 1.1696 1.0815
No log 7.3333 330 1.1199 0.5144 1.1199 1.0583
No log 7.3778 332 0.9267 0.5173 0.9267 0.9627
No log 7.4222 334 0.8270 0.5223 0.8270 0.9094
No log 7.4667 336 0.9122 0.5119 0.9122 0.9551
No log 7.5111 338 1.0284 0.5091 1.0284 1.0141
No log 7.5556 340 1.0015 0.5235 1.0015 1.0007
No log 7.6 342 0.8953 0.5877 0.8953 0.9462
No log 7.6444 344 0.8916 0.5890 0.8916 0.9442
No log 7.6889 346 1.0056 0.5500 1.0056 1.0028
No log 7.7333 348 1.0615 0.5489 1.0615 1.0303
No log 7.7778 350 1.0077 0.5729 1.0077 1.0038
No log 7.8222 352 0.8944 0.5696 0.8944 0.9457
No log 7.8667 354 0.8299 0.5601 0.8299 0.9110
No log 7.9111 356 0.8118 0.5876 0.8118 0.9010
No log 7.9556 358 0.8127 0.5798 0.8127 0.9015
No log 8.0 360 0.8116 0.5811 0.8116 0.9009
No log 8.0444 362 0.8209 0.5798 0.8209 0.9060
No log 8.0889 364 0.8292 0.5700 0.8292 0.9106
No log 8.1333 366 0.8316 0.5700 0.8316 0.9119
No log 8.1778 368 0.8108 0.5700 0.8108 0.9005
No log 8.2222 370 0.7878 0.5747 0.7878 0.8876
No log 8.2667 372 0.7891 0.5177 0.7891 0.8883
No log 8.3111 374 0.7986 0.5624 0.7986 0.8936
No log 8.3556 376 0.7954 0.5318 0.7954 0.8919
No log 8.4 378 0.7816 0.5322 0.7816 0.8841
No log 8.4444 380 0.8296 0.4281 0.8296 0.9108
No log 8.4889 382 0.9893 0.4850 0.9893 0.9947
No log 8.5333 384 1.0618 0.4138 1.0618 1.0304
No log 8.5778 386 0.9436 0.4850 0.9436 0.9714
No log 8.6222 388 0.7699 0.5106 0.7699 0.8775
No log 8.6667 390 0.8907 0.6061 0.8907 0.9438
No log 8.7111 392 1.0877 0.5153 1.0877 1.0429
No log 8.7556 394 1.0724 0.5214 1.0724 1.0355
No log 8.8 396 0.9508 0.5966 0.9508 0.9751
No log 8.8444 398 0.8527 0.5773 0.8527 0.9234
No log 8.8889 400 0.8597 0.5876 0.8597 0.9272
No log 8.9333 402 0.8707 0.6214 0.8707 0.9331
No log 8.9778 404 0.9452 0.5745 0.9452 0.9722
No log 9.0222 406 0.9897 0.5474 0.9897 0.9948
No log 9.0667 408 0.9386 0.6102 0.9386 0.9688
No log 9.1111 410 0.8725 0.6140 0.8725 0.9341
No log 9.1556 412 0.8321 0.5400 0.8321 0.9122
No log 9.2 414 0.8189 0.5229 0.8189 0.9049
No log 9.2444 416 0.8400 0.6032 0.8400 0.9165
No log 9.2889 418 0.8912 0.6004 0.8912 0.9440
No log 9.3333 420 0.9242 0.5951 0.9242 0.9614
No log 9.3778 422 0.9373 0.5951 0.9373 0.9682
No log 9.4222 424 0.9715 0.5690 0.9715 0.9857
No log 9.4667 426 0.9130 0.6062 0.9130 0.9555
No log 9.5111 428 0.8633 0.5947 0.8633 0.9291
No log 9.5556 430 0.8364 0.5283 0.8364 0.9145
No log 9.6 432 0.8204 0.5495 0.8204 0.9057
No log 9.6444 434 0.8153 0.5547 0.8153 0.9030
No log 9.6889 436 0.8178 0.5176 0.8178 0.9043
No log 9.7333 438 0.8515 0.5185 0.8515 0.9228
No log 9.7778 440 0.8607 0.5899 0.8607 0.9277
No log 9.8222 442 0.8446 0.6079 0.8446 0.9190
No log 9.8667 444 0.8075 0.5573 0.8075 0.8986
No log 9.9111 446 0.8105 0.5487 0.8105 0.9003
No log 9.9556 448 0.8026 0.4952 0.8026 0.8959
No log 10.0 450 0.8288 0.4556 0.8288 0.9104
No log 10.0444 452 0.8507 0.4556 0.8507 0.9223
No log 10.0889 454 0.8190 0.4556 0.8190 0.9050
No log 10.1333 456 0.7788 0.4508 0.7788 0.8825
No log 10.1778 458 0.8201 0.5942 0.8201 0.9056
No log 10.2222 460 0.8990 0.5696 0.8990 0.9481
No log 10.2667 462 0.8587 0.5800 0.8587 0.9266
No log 10.3111 464 0.7979 0.5261 0.7979 0.8933
No log 10.3556 466 0.7817 0.4282 0.7817 0.8841
No log 10.4 468 0.7747 0.4079 0.7747 0.8802
No log 10.4444 470 0.7655 0.4879 0.7655 0.8749
No log 10.4889 472 0.7862 0.5841 0.7862 0.8867
No log 10.5333 474 0.8910 0.6081 0.8910 0.9439
No log 10.5778 476 1.0925 0.5584 1.0925 1.0452
No log 10.6222 478 1.1978 0.4828 1.1978 1.0944
No log 10.6667 480 1.1601 0.4574 1.1601 1.0771
No log 10.7111 482 1.0397 0.5144 1.0397 1.0197
No log 10.7556 484 0.8832 0.6186 0.8832 0.9398
No log 10.8 486 0.7774 0.5519 0.7774 0.8817
No log 10.8444 488 0.7757 0.5094 0.7757 0.8807
No log 10.8889 490 0.8692 0.4922 0.8692 0.9323
No log 10.9333 492 0.9507 0.5109 0.9507 0.9751
No log 10.9778 494 0.9184 0.5337 0.9184 0.9583
No log 11.0222 496 0.8287 0.5217 0.8287 0.9103
No log 11.0667 498 0.8354 0.5858 0.8354 0.9140
0.2913 11.1111 500 0.9166 0.5842 0.9166 0.9574
0.2913 11.1556 502 0.9494 0.6004 0.9494 0.9744
0.2913 11.2 504 0.9190 0.6212 0.9190 0.9586
0.2913 11.2444 506 0.8705 0.5962 0.8705 0.9330
0.2913 11.2889 508 0.8142 0.6110 0.8142 0.9024
0.2913 11.3333 510 0.7967 0.6278 0.7967 0.8926
0.2913 11.3778 512 0.8149 0.6187 0.8149 0.9027
0.2913 11.4222 514 0.8863 0.6112 0.8863 0.9414
0.2913 11.4667 516 0.9416 0.5879 0.9416 0.9704
0.2913 11.5111 518 0.9064 0.5905 0.9064 0.9520
0.2913 11.5556 520 0.8440 0.6382 0.8440 0.9187
0.2913 11.6 522 0.8273 0.6054 0.8273 0.9096
0.2913 11.6444 524 0.8000 0.5791 0.8000 0.8944
0.2913 11.6889 526 0.7905 0.5819 0.7905 0.8891
0.2913 11.7333 528 0.7921 0.5763 0.7921 0.8900
0.2913 11.7778 530 0.7635 0.5992 0.7635 0.8738
0.2913 11.8222 532 0.7426 0.5233 0.7426 0.8617
0.2913 11.8667 534 0.7390 0.5358 0.7390 0.8596
0.2913 11.9111 536 0.7456 0.5648 0.7456 0.8635
0.2913 11.9556 538 0.7467 0.5648 0.7467 0.8641
0.2913 12.0 540 0.7587 0.5426 0.7587 0.8710
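As a rough cross-check of the dataset size (an estimate inferred from the log, not stated in the card): step 2 lands at epoch 0.0444, implying about 45 optimizer steps per epoch, and with `train_batch_size: 8` that suggests on the order of 360 training examples.

```python
# Inferred from the table above: step 2 corresponds to epoch 0.0444.
steps_per_epoch = round(2 / 0.0444)   # ≈ 45 optimizer steps per epoch
approx_examples = steps_per_epoch * 8  # train_batch_size = 8
print(steps_per_epoch, approx_examples)  # 45 360
```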

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

  • Parameters: 0.1B (Safetensors)
  • Tensor type: F32

Model tree

  • Base model: aubmindlab/bert-base-arabertv02
  • This model: MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k8_task2_organization