ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k14_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9376
  • Qwk: 0.4565
  • Mse: 0.9376
  • Rmse: 0.9683
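The metrics above (Qwk is quadratically weighted Cohen's kappa; Mse/Rmse are mean squared error and its square root, computed on the same predictions, which is why Mse matches Loss here) can be reproduced with scikit-learn. A minimal sketch; the label arrays below are illustrative stand-ins, not the actual evaluation set:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Illustrative ordinal labels (e.g. organization scores), not real data.
y_true = np.array([0, 1, 2, 2, 3, 1])
y_pred = np.array([0, 1, 1, 2, 3, 2])

# Quadratically weighted kappa: disagreements are penalized by the
# squared distance between classes, so near-misses count less.
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
rmse = np.sqrt(mse)
```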

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0253 2 4.5241 0.0129 4.5241 2.1270
No log 0.0506 4 2.8013 -0.0091 2.8013 1.6737
No log 0.0759 6 1.6036 0.0372 1.6036 1.2664
No log 0.1013 8 1.2734 -0.0508 1.2734 1.1285
No log 0.1266 10 1.2041 0.2513 1.2041 1.0973
No log 0.1519 12 1.2501 0.1483 1.2501 1.1181
No log 0.1772 14 1.3428 0.0038 1.3428 1.1588
No log 0.2025 16 1.2132 0.1791 1.2132 1.1014
No log 0.2278 18 1.1451 0.2416 1.1451 1.0701
No log 0.2532 20 1.2117 0.1260 1.2117 1.1008
No log 0.2785 22 1.3519 0.1016 1.3519 1.1627
No log 0.3038 24 1.1975 0.1354 1.1975 1.0943
No log 0.3291 26 1.1811 0.0704 1.1811 1.0868
No log 0.3544 28 1.2229 0.1230 1.2229 1.1059
No log 0.3797 30 1.2452 0.1434 1.2452 1.1159
No log 0.4051 32 1.2268 0.3132 1.2268 1.1076
No log 0.4304 34 1.2800 0.3065 1.2800 1.1314
No log 0.4557 36 1.3249 0.3671 1.3249 1.1510
No log 0.4810 38 1.4768 0.2424 1.4768 1.2152
No log 0.5063 40 1.7559 0.2567 1.7559 1.3251
No log 0.5316 42 1.6226 0.2247 1.6226 1.2738
No log 0.5570 44 1.2732 0.2160 1.2732 1.1284
No log 0.5823 46 1.2026 0.1935 1.2026 1.0966
No log 0.6076 48 1.2030 0.2027 1.2030 1.0968
No log 0.6329 50 1.2382 0.2391 1.2382 1.1127
No log 0.6582 52 1.2144 0.2208 1.2144 1.1020
No log 0.6835 54 1.0873 0.4060 1.0873 1.0427
No log 0.7089 56 1.0749 0.3542 1.0749 1.0368
No log 0.7342 58 1.1723 0.4146 1.1723 1.0827
No log 0.7595 60 1.1710 0.3918 1.1710 1.0821
No log 0.7848 62 1.1390 0.3758 1.1390 1.0672
No log 0.8101 64 1.1178 0.3701 1.1178 1.0572
No log 0.8354 66 1.1685 0.4593 1.1685 1.0810
No log 0.8608 68 1.2236 0.4506 1.2236 1.1062
No log 0.8861 70 1.1217 0.4344 1.1217 1.0591
No log 0.9114 72 0.9958 0.4574 0.9958 0.9979
No log 0.9367 74 0.9132 0.4579 0.9132 0.9556
No log 0.9620 76 1.0914 0.3521 1.0914 1.0447
No log 0.9873 78 1.1511 0.3811 1.1511 1.0729
No log 1.0127 80 1.1364 0.3708 1.1364 1.0660
No log 1.0380 82 1.0312 0.4444 1.0312 1.0155
No log 1.0633 84 0.8926 0.4124 0.8926 0.9448
No log 1.0886 86 0.8864 0.4631 0.8864 0.9415
No log 1.1139 88 1.0015 0.4704 1.0015 1.0008
No log 1.1392 90 1.0045 0.4697 1.0045 1.0022
No log 1.1646 92 1.0319 0.4522 1.0319 1.0158
No log 1.1899 94 1.1237 0.4522 1.1237 1.0600
No log 1.2152 96 0.9683 0.3916 0.9683 0.9840
No log 1.2405 98 0.8562 0.4685 0.8562 0.9253
No log 1.2658 100 0.7182 0.5455 0.7182 0.8474
No log 1.2911 102 0.6946 0.5590 0.6946 0.8334
No log 1.3165 104 0.7075 0.5862 0.7075 0.8412
No log 1.3418 106 0.7637 0.5614 0.7637 0.8739
No log 1.3671 108 0.7892 0.6237 0.7892 0.8884
No log 1.3924 110 0.8815 0.5626 0.8815 0.9389
No log 1.4177 112 0.9537 0.5397 0.9537 0.9766
No log 1.4430 114 0.8494 0.5091 0.8494 0.9216
No log 1.4684 116 0.8018 0.5830 0.8018 0.8954
No log 1.4937 118 0.7411 0.5856 0.7411 0.8608
No log 1.5190 120 0.7113 0.6476 0.7113 0.8434
No log 1.5443 122 0.7304 0.5621 0.7304 0.8546
No log 1.5696 124 0.8863 0.5015 0.8863 0.9415
No log 1.5949 126 1.1492 0.4231 1.1492 1.0720
No log 1.6203 128 1.0437 0.4261 1.0437 1.0216
No log 1.6456 130 0.7868 0.5135 0.7868 0.8870
No log 1.6709 132 0.7416 0.5213 0.7416 0.8612
No log 1.6962 134 0.7434 0.4926 0.7434 0.8622
No log 1.7215 136 0.8956 0.5384 0.8956 0.9463
No log 1.7468 138 1.1303 0.4231 1.1303 1.0632
No log 1.7722 140 1.5506 0.3694 1.5506 1.2452
No log 1.7975 142 1.7382 0.3546 1.7382 1.3184
No log 1.8228 144 1.4681 0.4339 1.4681 1.2117
No log 1.8481 146 0.9146 0.5468 0.9146 0.9563
No log 1.8734 148 0.6964 0.5659 0.6964 0.8345
No log 1.8987 150 0.7803 0.6094 0.7803 0.8833
No log 1.9241 152 0.9799 0.4480 0.9799 0.9899
No log 1.9494 154 0.8303 0.6314 0.8303 0.9112
No log 1.9747 156 0.7317 0.6023 0.7317 0.8554
No log 2.0 158 0.9807 0.5216 0.9807 0.9903
No log 2.0253 160 1.1782 0.4240 1.1782 1.0855
No log 2.0506 162 1.1109 0.5452 1.1109 1.0540
No log 2.0759 164 1.0214 0.5564 1.0214 1.0106
No log 2.1013 166 0.9325 0.5571 0.9325 0.9656
No log 2.1266 168 0.9963 0.5335 0.9963 0.9982
No log 2.1519 170 1.0809 0.4965 1.0809 1.0397
No log 2.1772 172 1.0685 0.5222 1.0685 1.0337
No log 2.2025 174 1.0586 0.5410 1.0586 1.0289
No log 2.2278 176 1.1373 0.5370 1.1373 1.0665
No log 2.2532 178 1.0566 0.5829 1.0566 1.0279
No log 2.2785 180 1.0781 0.5388 1.0781 1.0383
No log 2.3038 182 1.0147 0.4410 1.0147 1.0073
No log 2.3291 184 0.9069 0.4968 0.9069 0.9523
No log 2.3544 186 0.8593 0.5028 0.8593 0.9270
No log 2.3797 188 0.8049 0.4794 0.8049 0.8972
No log 2.4051 190 0.7545 0.5892 0.7545 0.8686
No log 2.4304 192 0.7414 0.5750 0.7414 0.8610
No log 2.4557 194 0.7645 0.5908 0.7645 0.8744
No log 2.4810 196 0.7573 0.5908 0.7573 0.8702
No log 2.5063 198 0.7324 0.5936 0.7324 0.8558
No log 2.5316 200 0.7899 0.5810 0.7899 0.8887
No log 2.5570 202 1.0262 0.5229 1.0262 1.0130
No log 2.5823 204 1.0981 0.3902 1.0981 1.0479
No log 2.6076 206 0.9642 0.4830 0.9642 0.9819
No log 2.6329 208 0.7819 0.5814 0.7819 0.8843
No log 2.6582 210 0.6708 0.6558 0.6708 0.8190
No log 2.6835 212 0.6934 0.6252 0.6934 0.8327
No log 2.7089 214 0.6886 0.6404 0.6886 0.8298
No log 2.7342 216 0.7815 0.5637 0.7815 0.8840
No log 2.7595 218 0.9981 0.5125 0.9981 0.9991
No log 2.7848 220 1.4209 0.3852 1.4209 1.1920
No log 2.8101 222 1.5502 0.3770 1.5502 1.2451
No log 2.8354 224 1.3935 0.3484 1.3935 1.1805
No log 2.8608 226 1.2147 0.3375 1.2147 1.1021
No log 2.8861 228 0.9969 0.4862 0.9969 0.9984
No log 2.9114 230 0.8100 0.5647 0.8100 0.9000
No log 2.9367 232 0.7716 0.5647 0.7716 0.8784
No log 2.9620 234 0.7879 0.5519 0.7879 0.8877
No log 2.9873 236 0.9178 0.5648 0.9178 0.9580
No log 3.0127 238 1.0042 0.4952 1.0042 1.0021
No log 3.0380 240 1.0446 0.4468 1.0446 1.0221
No log 3.0633 242 0.9059 0.5165 0.9059 0.9518
No log 3.0886 244 0.7348 0.5528 0.7348 0.8572
No log 3.1139 246 0.7530 0.6214 0.7530 0.8678
No log 3.1392 248 0.7432 0.5673 0.7432 0.8621
No log 3.1646 250 0.7732 0.5648 0.7732 0.8793
No log 3.1899 252 0.9120 0.5484 0.9120 0.9550
No log 3.2152 254 0.9436 0.5029 0.9436 0.9714
No log 3.2405 256 0.8685 0.5347 0.8685 0.9319
No log 3.2658 258 0.8282 0.4859 0.8282 0.9101
No log 3.2911 260 0.7958 0.5112 0.7958 0.8920
No log 3.3165 262 0.8123 0.5379 0.8123 0.9013
No log 3.3418 264 0.8234 0.5013 0.8234 0.9074
No log 3.3671 266 0.9306 0.5679 0.9306 0.9647
No log 3.3924 268 0.9894 0.5754 0.9894 0.9947
No log 3.4177 270 0.9326 0.5703 0.9326 0.9657
No log 3.4430 272 0.9734 0.5330 0.9734 0.9866
No log 3.4684 274 0.9039 0.5091 0.9039 0.9507
No log 3.4937 276 0.7612 0.5648 0.7612 0.8724
No log 3.5190 278 0.7305 0.5298 0.7305 0.8547
No log 3.5443 280 0.7565 0.5673 0.7565 0.8698
No log 3.5696 282 0.8260 0.5554 0.8260 0.9088
No log 3.5949 284 0.9987 0.4898 0.9987 0.9994
No log 3.6203 286 1.0377 0.5066 1.0377 1.0187
No log 3.6456 288 0.9339 0.5591 0.9339 0.9664
No log 3.6709 290 0.8235 0.5578 0.8235 0.9075
No log 3.6962 292 0.7152 0.6100 0.7152 0.8457
No log 3.7215 294 0.6983 0.5971 0.6983 0.8356
No log 3.7468 296 0.7726 0.5708 0.7726 0.8790
No log 3.7722 298 0.9291 0.5105 0.9291 0.9639
No log 3.7975 300 1.0399 0.4977 1.0399 1.0198
No log 3.8228 302 1.1082 0.5139 1.1082 1.0527
No log 3.8481 304 1.0299 0.4618 1.0299 1.0148
No log 3.8734 306 0.8965 0.4902 0.8965 0.9468
No log 3.8987 308 0.8446 0.5227 0.8446 0.9190
No log 3.9241 310 0.7650 0.5159 0.7650 0.8746
No log 3.9494 312 0.7675 0.5560 0.7675 0.8761
No log 3.9747 314 0.8242 0.5877 0.8242 0.9078
No log 4.0 316 0.8250 0.5877 0.8250 0.9083
No log 4.0253 318 0.7420 0.5667 0.7420 0.8614
No log 4.0506 320 0.7623 0.6032 0.7623 0.8731
No log 4.0759 322 0.8303 0.5346 0.8303 0.9112
No log 4.1013 324 0.8129 0.5313 0.8129 0.9016
No log 4.1266 326 0.8213 0.5165 0.8213 0.9063
No log 4.1519 328 0.9106 0.5279 0.9106 0.9542
No log 4.1772 330 1.0166 0.4466 1.0166 1.0082
No log 4.2025 332 1.0704 0.4236 1.0704 1.0346
No log 4.2278 334 1.0799 0.5119 1.0799 1.0392
No log 4.2532 336 0.9570 0.5028 0.9570 0.9782
No log 4.2785 338 0.8142 0.5209 0.8142 0.9023
No log 4.3038 340 0.8055 0.5042 0.8055 0.8975
No log 4.3291 342 0.7923 0.5157 0.7923 0.8901
No log 4.3544 344 0.8732 0.5192 0.8732 0.9344
No log 4.3797 346 0.9236 0.4874 0.9236 0.9610
No log 4.4051 348 0.9396 0.5071 0.9396 0.9693
No log 4.4304 350 0.8494 0.5264 0.8494 0.9216
No log 4.4557 352 0.7659 0.4980 0.7659 0.8752
No log 4.4810 354 0.7529 0.5203 0.7529 0.8677
No log 4.5063 356 0.7778 0.5447 0.7778 0.8819
No log 4.5316 358 0.9020 0.5488 0.9020 0.9497
No log 4.5570 360 0.9131 0.5649 0.9131 0.9555
No log 4.5823 362 0.8332 0.5750 0.8332 0.9128
No log 4.6076 364 0.7642 0.5320 0.7642 0.8742
No log 4.6329 366 0.7212 0.5951 0.7212 0.8492
No log 4.6582 368 0.7610 0.5086 0.7610 0.8724
No log 4.6835 370 0.7919 0.5044 0.7919 0.8899
No log 4.7089 372 0.7825 0.5130 0.7825 0.8846
No log 4.7342 374 0.8279 0.4986 0.8279 0.9099
No log 4.7595 376 0.8634 0.5346 0.8634 0.9292
No log 4.7848 378 0.8451 0.5192 0.8451 0.9193
No log 4.8101 380 0.8056 0.5708 0.8056 0.8976
No log 4.8354 382 0.8092 0.5624 0.8092 0.8995
No log 4.8608 384 0.8313 0.5519 0.8313 0.9117
No log 4.8861 386 0.8695 0.5192 0.8695 0.9325
No log 4.9114 388 0.8687 0.5014 0.8687 0.9321
No log 4.9367 390 0.8783 0.5201 0.8783 0.9372
No log 4.9620 392 0.9083 0.5324 0.9083 0.9530
No log 4.9873 394 0.9291 0.4700 0.9291 0.9639
No log 5.0127 396 0.9873 0.4186 0.9873 0.9937
No log 5.0380 398 0.9713 0.4655 0.9713 0.9855
No log 5.0633 400 0.9182 0.4655 0.9182 0.9582
No log 5.0886 402 0.9059 0.4647 0.9059 0.9518
No log 5.1139 404 0.9430 0.4834 0.9430 0.9711
No log 5.1392 406 1.0115 0.4722 1.0115 1.0057
No log 5.1646 408 1.0129 0.4414 1.0129 1.0064
No log 5.1899 410 0.9658 0.4732 0.9658 0.9827
No log 5.2152 412 0.9796 0.4539 0.9796 0.9897
No log 5.2405 414 0.9606 0.4539 0.9606 0.9801
No log 5.2658 416 0.9636 0.4705 0.9636 0.9816
No log 5.2911 418 0.9327 0.4933 0.9327 0.9658
No log 5.3165 420 0.7978 0.5245 0.7978 0.8932
No log 5.3418 422 0.7368 0.5711 0.7368 0.8584
No log 5.3671 424 0.7362 0.6368 0.7362 0.8580
No log 5.3924 426 0.7400 0.6106 0.7400 0.8603
No log 5.4177 428 0.7413 0.6151 0.7413 0.8610
No log 5.4430 430 0.7781 0.5648 0.7781 0.8821
No log 5.4684 432 0.8624 0.5592 0.8624 0.9286
No log 5.4937 434 0.9271 0.5027 0.9271 0.9629
No log 5.5190 436 0.9293 0.4638 0.9293 0.9640
No log 5.5443 438 0.9461 0.4322 0.9461 0.9727
No log 5.5696 440 0.8718 0.5600 0.8718 0.9337
No log 5.5949 442 0.8246 0.5624 0.8246 0.9080
No log 5.6203 444 0.8240 0.5708 0.8240 0.9078
No log 5.6456 446 0.8836 0.5027 0.8836 0.9400
No log 5.6709 448 0.8900 0.5061 0.8900 0.9434
No log 5.6962 450 0.8157 0.5860 0.8157 0.9032
No log 5.7215 452 0.7988 0.5350 0.7988 0.8938
No log 5.7468 454 0.8175 0.4777 0.8175 0.9042
No log 5.7722 456 0.8502 0.5186 0.8502 0.9221
No log 5.7975 458 0.8663 0.4787 0.8663 0.9308
No log 5.8228 460 0.9296 0.4637 0.9296 0.9641
No log 5.8481 462 1.0200 0.4418 1.0200 1.0099
No log 5.8734 464 1.0411 0.3892 1.0411 1.0203
No log 5.8987 466 1.0844 0.3792 1.0844 1.0414
No log 5.9241 468 1.0307 0.4252 1.0307 1.0152
No log 5.9494 470 0.9989 0.4167 0.9989 0.9995
No log 5.9747 472 0.9409 0.4107 0.9409 0.9700
No log 6.0 474 0.9004 0.4665 0.9004 0.9489
No log 6.0253 476 0.8851 0.4893 0.8851 0.9408
No log 6.0506 478 0.8849 0.4568 0.8849 0.9407
No log 6.0759 480 0.9051 0.5000 0.9051 0.9513
No log 6.1013 482 0.9758 0.5002 0.9758 0.9879
No log 6.1266 484 1.1138 0.5300 1.1138 1.0554
No log 6.1519 486 1.0441 0.4963 1.0441 1.0218
No log 6.1772 488 0.8633 0.5354 0.8633 0.9292
No log 6.2025 490 0.7877 0.4754 0.7877 0.8875
No log 6.2278 492 0.7875 0.4331 0.7875 0.8874
No log 6.2532 494 0.8066 0.4202 0.8066 0.8981
No log 6.2785 496 0.8805 0.5190 0.8805 0.9384
No log 6.3038 498 1.0405 0.4669 1.0405 1.0201
0.3617 6.3291 500 1.1243 0.4669 1.1243 1.0603
0.3617 6.3544 502 1.1740 0.3970 1.1740 1.0835
0.3617 6.3797 504 1.1175 0.4344 1.1175 1.0571
0.3617 6.4051 506 1.0310 0.3785 1.0310 1.0154
0.3617 6.4304 508 1.0132 0.3868 1.0132 1.0066
0.3617 6.4557 510 0.9376 0.4565 0.9376 0.9683

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1