ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k11_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the dataset field of the card was not filled in). It achieves the following results on the evaluation set:

  • Loss: 0.7531
  • Qwk: 0.4604
  • Mse: 0.7531
  • Rmse: 0.8678
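The identical Loss and Mse values suggest the model is trained as a regressor with an MSE objective, with Qwk (Quadratic Weighted Kappa) computed on the rounded ordinal scores. The sketch below shows how these three metrics relate, using hypothetical gold/predicted organization scores on an assumed 0–3 scale:

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, num_classes):
    """Cohen's kappa with quadratic weights for ordinal labels 0..num_classes-1."""
    n = len(y_true)
    # Observed confusion matrix
    observed = [[0] * num_classes for _ in range(num_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # Marginal histograms of gold and predicted labels
    hist_true = [sum(row) for row in observed]
    hist_pred = [sum(observed[i][j] for i in range(num_classes))
                 for j in range(num_classes)]
    numer = denom = 0.0
    for i in range(num_classes):
        for j in range(num_classes):
            w = (i - j) ** 2 / (num_classes - 1) ** 2  # quadratic disagreement weight
            numer += w * observed[i][j]
            denom += w * hist_true[i] * hist_pred[j] / n
    return 1.0 - numer / denom

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# Hypothetical gold and predicted organization scores (0-3 scale assumed)
gold = [0, 1, 2, 3, 2, 1, 0, 3]
pred = [0, 1, 2, 2, 2, 1, 1, 3]

qwk = quadratic_weighted_kappa(gold, pred, num_classes=4)  # 0.875
error = mse(gold, pred)                                    # 0.25
rmse = math.sqrt(error)                                    # 0.5
```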

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
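For reference, the settings above map onto `transformers.TrainingArguments` roughly as below. This is a sketch with assumed argument names following the Trainer API conventions, not the authors' actual training script:

```python
# Assumed mapping of the reported hyperparameters onto
# transformers.TrainingArguments field names (sketch only).
training_config = {
    "learning_rate": 2e-5,
    "per_device_train_batch_size": 8,
    "per_device_eval_batch_size": 8,
    "seed": 42,
    "adam_beta1": 0.9,
    "adam_beta2": 0.999,
    "adam_epsilon": 1e-8,
    "lr_scheduler_type": "linear",
    "num_train_epochs": 100,
}
```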

Training results

The table below lists the training loss, epoch, step, validation loss, Qwk, Mse, and Rmse at each logged evaluation. "No log" in the training-loss column means the training loss had not yet been recorded; the Trainer logs it every 500 steps by default, so the first value (0.3324) appears at step 500.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0323 2 4.3939 -0.0041 4.3939 2.0962
No log 0.0645 4 2.4280 -0.0464 2.4280 1.5582
No log 0.0968 6 1.5727 0.0629 1.5727 1.2541
No log 0.1290 8 1.1912 0.2392 1.1912 1.0914
No log 0.1613 10 1.2642 0.0974 1.2642 1.1244
No log 0.1935 12 1.3376 0.0880 1.3376 1.1565
No log 0.2258 14 1.2516 0.1044 1.2516 1.1187
No log 0.2581 16 1.8054 0.0835 1.8054 1.3437
No log 0.2903 18 1.6265 0.1995 1.6265 1.2754
No log 0.3226 20 1.1768 0.1354 1.1768 1.0848
No log 0.3548 22 1.1814 0.1645 1.1814 1.0869
No log 0.3871 24 1.1612 0.1808 1.1612 1.0776
No log 0.4194 26 1.1147 0.1808 1.1147 1.0558
No log 0.4516 28 1.1059 0.1910 1.1059 1.0516
No log 0.4839 30 1.0730 0.1910 1.0730 1.0358
No log 0.5161 32 1.0566 0.2132 1.0566 1.0279
No log 0.5484 34 1.0390 0.3424 1.0390 1.0193
No log 0.5806 36 1.0680 0.2792 1.0680 1.0335
No log 0.6129 38 1.0444 0.4343 1.0444 1.0220
No log 0.6452 40 1.0574 0.4920 1.0574 1.0283
No log 0.6774 42 1.0824 0.4295 1.0824 1.0404
No log 0.7097 44 1.1186 0.3931 1.1186 1.0576
No log 0.7419 46 1.1972 0.2930 1.1972 1.0942
No log 0.7742 48 1.1576 0.3888 1.1576 1.0759
No log 0.8065 50 1.1352 0.3443 1.1352 1.0655
No log 0.8387 52 1.0765 0.4069 1.0765 1.0375
No log 0.8710 54 1.0397 0.4069 1.0397 1.0197
No log 0.9032 56 1.0354 0.4733 1.0354 1.0176
No log 0.9355 58 1.0155 0.4158 1.0155 1.0077
No log 0.9677 60 0.9921 0.5583 0.9921 0.9961
No log 1.0 62 0.8807 0.5793 0.8807 0.9385
No log 1.0323 64 0.9135 0.5390 0.9135 0.9558
No log 1.0645 66 0.8532 0.5958 0.8532 0.9237
No log 1.0968 68 0.8949 0.5756 0.8949 0.9460
No log 1.1290 70 0.8934 0.5645 0.8934 0.9452
No log 1.1613 72 0.8196 0.5942 0.8196 0.9053
No log 1.1935 74 0.8653 0.5183 0.8653 0.9302
No log 1.2258 76 0.8876 0.4507 0.8876 0.9421
No log 1.2581 78 0.8010 0.4423 0.8010 0.8950
No log 1.2903 80 0.7534 0.5869 0.7534 0.8680
No log 1.3226 82 0.7896 0.5935 0.7896 0.8886
No log 1.3548 84 0.7478 0.6357 0.7478 0.8648
No log 1.3871 86 0.6645 0.7237 0.6645 0.8152
No log 1.4194 88 0.9960 0.5636 0.9960 0.9980
No log 1.4516 90 1.2979 0.4750 1.2979 1.1393
No log 1.4839 92 1.2645 0.5361 1.2645 1.1245
No log 1.5161 94 0.9902 0.5460 0.9902 0.9951
No log 1.5484 96 0.7230 0.6540 0.7230 0.8503
No log 1.5806 98 0.7828 0.6555 0.7828 0.8847
No log 1.6129 100 0.7236 0.6313 0.7236 0.8506
No log 1.6452 102 0.8864 0.5899 0.8864 0.9415
No log 1.6774 104 1.0029 0.4332 1.0029 1.0014
No log 1.7097 106 0.9946 0.4105 0.9946 0.9973
No log 1.7419 108 0.8902 0.5648 0.8902 0.9435
No log 1.7742 110 0.7675 0.5587 0.7675 0.8761
No log 1.8065 112 0.7347 0.5695 0.7347 0.8571
No log 1.8387 114 0.7492 0.5968 0.7492 0.8656
No log 1.8710 116 0.9693 0.5276 0.9693 0.9845
No log 1.9032 118 1.1495 0.5256 1.1495 1.0722
No log 1.9355 120 1.0370 0.5292 1.0370 1.0183
No log 1.9677 122 0.8298 0.6062 0.8298 0.9109
No log 2.0 124 0.7283 0.6324 0.7283 0.8534
No log 2.0323 126 0.7838 0.5968 0.7838 0.8853
No log 2.0645 128 0.9313 0.5140 0.9313 0.9650
No log 2.0968 130 1.1387 0.4592 1.1387 1.0671
No log 2.1290 132 1.0745 0.4156 1.0745 1.0366
No log 2.1613 134 0.9404 0.4911 0.9404 0.9697
No log 2.1935 136 0.7437 0.6183 0.7437 0.8624
No log 2.2258 138 0.7283 0.6139 0.7283 0.8534
No log 2.2581 140 0.7690 0.6171 0.7690 0.8769
No log 2.2903 142 0.8444 0.5399 0.8444 0.9189
No log 2.3226 144 1.0490 0.4891 1.0490 1.0242
No log 2.3548 146 1.0618 0.4990 1.0618 1.0305
No log 2.3871 148 0.9529 0.4485 0.9529 0.9762
No log 2.4194 150 0.8878 0.4231 0.8878 0.9422
No log 2.4516 152 0.9292 0.5130 0.9292 0.9640
No log 2.4839 154 0.8169 0.5177 0.8169 0.9038
No log 2.5161 156 0.7087 0.6203 0.7087 0.8419
No log 2.5484 158 0.6896 0.6237 0.6896 0.8304
No log 2.5806 160 0.6850 0.6272 0.6850 0.8276
No log 2.6129 162 0.7865 0.5539 0.7865 0.8868
No log 2.6452 164 0.8083 0.5495 0.8083 0.8991
No log 2.6774 166 0.7929 0.5636 0.7929 0.8904
No log 2.7097 168 0.7686 0.6300 0.7686 0.8767
No log 2.7419 170 0.8334 0.5052 0.8334 0.9129
No log 2.7742 172 0.9875 0.5153 0.9875 0.9937
No log 2.8065 174 0.9769 0.4989 0.9769 0.9884
No log 2.8387 176 0.9625 0.5139 0.9625 0.9811
No log 2.8710 178 0.9153 0.4924 0.9153 0.9567
No log 2.9032 180 0.7220 0.6069 0.7220 0.8497
No log 2.9355 182 0.6908 0.6879 0.6908 0.8311
No log 2.9677 184 0.7476 0.6324 0.7476 0.8646
No log 3.0 186 0.9046 0.5859 0.9046 0.9511
No log 3.0323 188 1.0729 0.4618 1.0729 1.0358
No log 3.0645 190 0.9922 0.4776 0.9922 0.9961
No log 3.0968 192 0.9948 0.4580 0.9948 0.9974
No log 3.1290 194 0.9929 0.5159 0.9929 0.9964
No log 3.1613 196 0.9923 0.5375 0.9923 0.9961
No log 3.1935 198 0.9560 0.5913 0.9560 0.9778
No log 3.2258 200 0.9568 0.5913 0.9568 0.9781
No log 3.2581 202 1.1275 0.4565 1.1275 1.0619
No log 3.2903 204 1.2005 0.4441 1.2005 1.0957
No log 3.3226 206 1.0225 0.5101 1.0225 1.0112
No log 3.3548 208 0.8440 0.5647 0.8440 0.9187
No log 3.3871 210 0.8343 0.5458 0.8343 0.9134
No log 3.4194 212 0.8832 0.6207 0.8832 0.9398
No log 3.4516 214 0.9611 0.5551 0.9611 0.9804
No log 3.4839 216 0.9884 0.5346 0.9884 0.9942
No log 3.5161 218 0.8865 0.5170 0.8865 0.9415
No log 3.5484 220 0.7908 0.5012 0.7908 0.8893
No log 3.5806 222 0.8007 0.4966 0.8007 0.8948
No log 3.6129 224 0.8674 0.4898 0.8674 0.9313
No log 3.6452 226 0.8802 0.4763 0.8802 0.9382
No log 3.6774 228 0.8510 0.5294 0.8510 0.9225
No log 3.7097 230 0.8977 0.6051 0.8977 0.9475
No log 3.7419 232 0.9691 0.5902 0.9691 0.9844
No log 3.7742 234 0.9875 0.5733 0.9875 0.9937
No log 3.8065 236 0.9096 0.5902 0.9096 0.9537
No log 3.8387 238 0.8975 0.5294 0.8975 0.9474
No log 3.8710 240 1.0130 0.4898 1.0130 1.0065
No log 3.9032 242 1.1087 0.4803 1.1087 1.0530
No log 3.9355 244 1.0272 0.4909 1.0272 1.0135
No log 3.9677 246 0.8758 0.4972 0.8758 0.9358
No log 4.0 248 0.8184 0.5025 0.8184 0.9047
No log 4.0323 250 0.8196 0.5110 0.8196 0.9053
No log 4.0645 252 0.9617 0.4879 0.9617 0.9807
No log 4.0968 254 1.2361 0.4156 1.2361 1.1118
No log 4.1290 256 1.2876 0.3867 1.2876 1.1347
No log 4.1613 258 1.1480 0.3851 1.1480 1.0715
No log 4.1935 260 0.9582 0.4780 0.9582 0.9789
No log 4.2258 262 0.8407 0.5142 0.8407 0.9169
No log 4.2581 264 0.8320 0.5458 0.8320 0.9121
No log 4.2903 266 0.9018 0.4681 0.9018 0.9496
No log 4.3226 268 1.0132 0.5154 1.0132 1.0066
No log 4.3548 270 1.0433 0.5026 1.0433 1.0214
No log 4.3871 272 0.8996 0.5053 0.8996 0.9485
No log 4.4194 274 0.8164 0.5365 0.8164 0.9036
No log 4.4516 276 0.8070 0.5359 0.8070 0.8983
No log 4.4839 278 0.8420 0.4869 0.8420 0.9176
No log 4.5161 280 1.0604 0.4638 1.0604 1.0298
No log 4.5484 282 1.5484 0.4377 1.5484 1.2443
No log 4.5806 284 1.7304 0.4013 1.7304 1.3154
No log 4.6129 286 1.5540 0.4233 1.5540 1.2466
No log 4.6452 288 1.1991 0.5139 1.1991 1.0950
No log 4.6774 290 0.9316 0.4918 0.9316 0.9652
No log 4.7097 292 0.8792 0.4970 0.8792 0.9377
No log 4.7419 294 0.9085 0.4808 0.9085 0.9531
No log 4.7742 296 1.0014 0.4557 1.0014 1.0007
No log 4.8065 298 1.0704 0.4109 1.0704 1.0346
No log 4.8387 300 1.1610 0.4464 1.1610 1.0775
No log 4.8710 302 1.2353 0.4980 1.2353 1.1114
No log 4.9032 304 1.1354 0.5064 1.1354 1.0655
No log 4.9355 306 1.0036 0.5375 1.0036 1.0018
No log 4.9677 308 0.8546 0.5578 0.8546 0.9244
No log 5.0 310 0.8253 0.5253 0.8253 0.9084
No log 5.0323 312 0.8563 0.4942 0.8563 0.9254
No log 5.0645 314 0.9229 0.5241 0.9229 0.9607
No log 5.0968 316 1.1012 0.4400 1.1012 1.0494
No log 5.1290 318 1.1904 0.4416 1.1904 1.0911
No log 5.1613 320 1.0693 0.4590 1.0693 1.0341
No log 5.1935 322 0.8765 0.5553 0.8765 0.9362
No log 5.2258 324 0.7510 0.5476 0.7510 0.8666
No log 5.2581 326 0.7286 0.5587 0.7286 0.8536
No log 5.2903 328 0.7737 0.6162 0.7737 0.8796
No log 5.3226 330 0.8719 0.5867 0.8719 0.9338
No log 5.3548 332 0.9994 0.5062 0.9994 0.9997
No log 5.3871 334 0.9884 0.5002 0.9884 0.9942
No log 5.4194 336 0.9459 0.4734 0.9459 0.9726
No log 5.4516 338 0.8739 0.5472 0.8739 0.9348
No log 5.4839 340 0.8059 0.4998 0.8059 0.8977
No log 5.5161 342 0.8261 0.4676 0.8261 0.9089
No log 5.5484 344 0.9312 0.4714 0.9312 0.9650
No log 5.5806 346 1.0869 0.4758 1.0869 1.0426
No log 5.6129 348 1.1223 0.4571 1.1223 1.0594
No log 5.6452 350 0.9928 0.4976 0.9928 0.9964
No log 5.6774 352 0.8241 0.5472 0.8241 0.9078
No log 5.7097 354 0.7134 0.5455 0.7134 0.8446
No log 5.7419 356 0.6930 0.5954 0.6930 0.8325
No log 5.7742 358 0.6949 0.5749 0.6949 0.8336
No log 5.8065 360 0.7587 0.6162 0.7587 0.8710
No log 5.8387 362 0.9331 0.5120 0.9331 0.9660
No log 5.8710 364 1.0418 0.4511 1.0418 1.0207
No log 5.9032 366 0.9587 0.4714 0.9587 0.9791
No log 5.9355 368 0.8337 0.5424 0.8337 0.9131
No log 5.9677 370 0.7425 0.5274 0.7425 0.8617
No log 6.0 372 0.7248 0.5536 0.7248 0.8514
No log 6.0323 374 0.7097 0.5791 0.7097 0.8425
No log 6.0645 376 0.7224 0.6269 0.7224 0.8499
No log 6.0968 378 0.7159 0.6255 0.7159 0.8461
No log 6.1290 380 0.7222 0.6611 0.7222 0.8498
No log 6.1613 382 0.8383 0.5577 0.8383 0.9156
No log 6.1935 384 0.9269 0.4834 0.9269 0.9627
No log 6.2258 386 0.8982 0.4625 0.8982 0.9477
No log 6.2581 388 0.8033 0.4785 0.8033 0.8963
No log 6.2903 390 0.7628 0.5059 0.7628 0.8734
No log 6.3226 392 0.7719 0.5365 0.7719 0.8786
No log 6.3548 394 0.8773 0.5266 0.8773 0.9366
No log 6.3871 396 0.9328 0.4933 0.9328 0.9658
No log 6.4194 398 0.9538 0.4722 0.9538 0.9766
No log 6.4516 400 0.8478 0.5365 0.8478 0.9208
No log 6.4839 402 0.7242 0.5805 0.7242 0.8510
No log 6.5161 404 0.6945 0.6324 0.6945 0.8334
No log 6.5484 406 0.7013 0.5777 0.7013 0.8375
No log 6.5806 408 0.7795 0.5517 0.7795 0.8829
No log 6.6129 410 0.9506 0.5224 0.9506 0.9750
No log 6.6452 412 1.0193 0.5091 1.0193 1.0096
No log 6.6774 414 0.9574 0.5493 0.9574 0.9785
No log 6.7097 416 0.9103 0.5493 0.9103 0.9541
No log 6.7419 418 0.8328 0.5716 0.8328 0.9126
No log 6.7742 420 0.8115 0.5556 0.8115 0.9008
No log 6.8065 422 0.9028 0.5310 0.9028 0.9501
No log 6.8387 424 0.9710 0.5105 0.9710 0.9854
No log 6.8710 426 0.9770 0.5015 0.9770 0.9884
No log 6.9032 428 0.9699 0.4921 0.9699 0.9848
No log 6.9355 430 0.9055 0.4790 0.9055 0.9516
No log 6.9677 432 0.8323 0.5028 0.8323 0.9123
No log 7.0 434 0.7970 0.5624 0.7970 0.8927
No log 7.0323 436 0.7787 0.5624 0.7787 0.8824
No log 7.0645 438 0.7934 0.5624 0.7934 0.8907
No log 7.0968 440 0.7968 0.5624 0.7968 0.8926
No log 7.1290 442 0.7735 0.5291 0.7735 0.8795
No log 7.1613 444 0.7698 0.5119 0.7698 0.8774
No log 7.1935 446 0.7600 0.5300 0.7600 0.8718
No log 7.2258 448 0.7882 0.4700 0.7882 0.8878
No log 7.2581 450 0.8361 0.4394 0.8361 0.9144
No log 7.2903 452 0.8675 0.4127 0.8675 0.9314
No log 7.3226 454 0.9554 0.5188 0.9554 0.9775
No log 7.3548 456 0.9697 0.5086 0.9697 0.9848
No log 7.3871 458 0.8406 0.5465 0.8406 0.9168
No log 7.4194 460 0.7299 0.5331 0.7299 0.8543
No log 7.4516 462 0.7080 0.6107 0.7080 0.8415
No log 7.4839 464 0.7085 0.6648 0.7085 0.8417
No log 7.5161 466 0.6995 0.6456 0.6995 0.8364
No log 7.5484 468 0.7279 0.5624 0.7279 0.8532
No log 7.5806 470 0.8558 0.5340 0.8558 0.9251
No log 7.6129 472 1.0462 0.4677 1.0462 1.0229
No log 7.6452 474 1.0351 0.4758 1.0351 1.0174
No log 7.6774 476 0.9253 0.5348 0.9253 0.9619
No log 7.7097 478 0.8499 0.4295 0.8499 0.9219
No log 7.7419 480 0.8124 0.4553 0.8124 0.9013
No log 7.7742 482 0.8323 0.4425 0.8323 0.9123
No log 7.8065 484 0.9565 0.4617 0.9565 0.9780
No log 7.8387 486 1.1818 0.3856 1.1818 1.0871
No log 7.8710 488 1.1837 0.3856 1.1837 1.0880
No log 7.9032 490 1.0404 0.4749 1.0404 1.0200
No log 7.9355 492 0.8517 0.4412 0.8517 0.9229
No log 7.9677 494 0.7878 0.4840 0.7878 0.8876
No log 8.0 496 0.7563 0.5308 0.7563 0.8697
No log 8.0323 498 0.7511 0.5148 0.7511 0.8666
0.3324 8.0645 500 0.7873 0.5365 0.7873 0.8873
0.3324 8.0968 502 0.8181 0.5490 0.8181 0.9045
0.3324 8.1290 504 0.9015 0.4824 0.9015 0.9495
0.3324 8.1613 506 0.9017 0.5044 0.9017 0.9496
0.3324 8.1935 508 0.8319 0.4291 0.8319 0.9121
0.3324 8.2258 510 0.7707 0.4237 0.7707 0.8779
0.3324 8.2581 512 0.7531 0.4604 0.7531 0.8678
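The step/epoch columns also let one back out the training-set size: the table reaches epoch 1.0 at step 62, so with train_batch_size=8 the training split holds roughly 496 examples:

```python
# Epoch 1.0 is reached at step 62 (see the results table),
# and the reported train_batch_size is 8.
steps_per_epoch = 62
train_batch_size = 8
approx_train_examples = steps_per_epoch * train_batch_size
print(approx_train_examples)  # 496
```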

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
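To reproduce this environment when loading the checkpoint, pinning the listed releases is one option. The CUDA 11.8 PyTorch build (`+cu118`) comes from PyTorch's cu118 wheel index, assumed below:

```shell
pip install "transformers==4.44.2" "datasets==2.21.0" "tokenizers==0.19.1"
pip install "torch==2.4.0" --index-url https://download.pytorch.org/whl/cu118
```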

Model size

  • Parameters: 0.1B
  • Tensor type: F32 (Safetensors)

Full model ID: MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k11_task2_organization, fine-tuned from aubmindlab/bert-base-arabertv02.