ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k4_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the dataset name was not recorded). It achieves the following results on the evaluation set:

  • Loss: 0.7732
  • Qwk: 0.5750
  • Mse: 0.7732
  • Rmse: 0.8793
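The reported Loss equals the Mse (0.7732 for both), which suggests the model was trained as a regressor with an MSE objective; Qwk is quadratic weighted kappa, the standard agreement metric for ordinal essay scores. As a minimal, dependency-free sketch of how these metrics are computed (the 0-3 score scale and the toy labels below are illustrative assumptions, not values from this run):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic weighted kappa for integer labels in [0, n_classes)."""
    n = len(y_true)
    # Observed confusion matrix.
    observed = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # Marginal histograms of true and predicted labels.
    hist_true = [sum(row) for row in observed]
    hist_pred = [sum(observed[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            weight = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic penalty
            expected = hist_true[i] * hist_pred[j] / n    # chance agreement
            num += weight * observed[i][j]
            den += weight * expected
    return 1.0 - num / den

# Toy example (hypothetical 0-3 organization scores):
y_true = [0, 1, 2, 3]
y_pred = [0, 1, 2, 2]
qwk = quadratic_weighted_kappa(y_true, y_pred, n_classes=4)   # 0.875
mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)  # 0.25
rmse = math.sqrt(mse)                                          # 0.5
```

Note that Rmse is simply the square root of Mse, so the table below carries one independent error metric plus Qwk.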

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
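With the linear scheduler, the learning rate decays from 2e-05 to 0 over the full run. From the table below, one epoch is about 23 optimizer steps (epoch 2.0 falls at step 46), so 100 epochs is roughly 2300 steps. A minimal sketch of this schedule, assuming zero warmup steps (warmup was not recorded in this card):

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Linear schedule: ramp up over warmup_steps, then decay linearly to 0."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

# ~23 steps/epoch * 100 epochs ≈ 2300 total steps (estimated from the table)
lr_start = linear_lr(0, 2300)      # 2e-05
lr_mid = linear_lr(1150, 2300)     # 1e-05
lr_end = linear_lr(2300, 2300)     # 0.0
```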

Training results

Validation metrics were recorded every 2 optimizer steps; training loss is logged only every 500 steps, so earlier rows show "No log".

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0870 2 4.6827 0.0010 4.6827 2.1640
No log 0.1739 4 2.8316 -0.0020 2.8316 1.6827
No log 0.2609 6 1.5107 0.0504 1.5107 1.2291
No log 0.3478 8 1.1460 0.2662 1.1460 1.0705
No log 0.4348 10 1.1142 0.2290 1.1142 1.0555
No log 0.5217 12 1.1008 0.1918 1.1008 1.0492
No log 0.6087 14 1.1053 0.1649 1.1053 1.0513
No log 0.6957 16 1.1145 0.1482 1.1145 1.0557
No log 0.7826 18 1.1600 0.0711 1.1600 1.0770
No log 0.8696 20 1.5405 0.0845 1.5405 1.2412
No log 0.9565 22 1.6194 0.0586 1.6194 1.2726
No log 1.0435 24 1.3259 0.0509 1.3259 1.1515
No log 1.1304 26 1.1203 0.2188 1.1203 1.0585
No log 1.2174 28 1.1792 0.2658 1.1792 1.0859
No log 1.3043 30 1.2598 0.2459 1.2598 1.1224
No log 1.3913 32 1.2961 0.2634 1.2961 1.1385
No log 1.4783 34 1.2520 0.2902 1.2520 1.1189
No log 1.5652 36 1.1840 0.3042 1.1840 1.0881
No log 1.6522 38 1.2596 0.2446 1.2596 1.1223
No log 1.7391 40 1.2660 0.2608 1.2660 1.1251
No log 1.8261 42 1.3915 0.3061 1.3915 1.1796
No log 1.9130 44 1.4585 0.3113 1.4585 1.2077
No log 2.0 46 1.3695 0.3233 1.3695 1.1703
No log 2.0870 48 1.1587 0.3265 1.1587 1.0764
No log 2.1739 50 0.9712 0.4685 0.9712 0.9855
No log 2.2609 52 0.8969 0.3363 0.8969 0.9470
No log 2.3478 54 0.8791 0.3282 0.8791 0.9376
No log 2.4348 56 0.8702 0.4308 0.8702 0.9329
No log 2.5217 58 0.9856 0.4068 0.9856 0.9928
No log 2.6087 60 1.0533 0.3768 1.0533 1.0263
No log 2.6957 62 0.9936 0.4454 0.9936 0.9968
No log 2.7826 64 0.8904 0.4889 0.8904 0.9436
No log 2.8696 66 0.8868 0.4165 0.8868 0.9417
No log 2.9565 68 0.9061 0.4295 0.9061 0.9519
No log 3.0435 70 0.9001 0.4495 0.9001 0.9487
No log 3.1304 72 0.8912 0.4956 0.8912 0.9440
No log 3.2174 74 0.9557 0.5249 0.9557 0.9776
No log 3.3043 76 0.9919 0.5102 0.9919 0.9959
No log 3.3913 78 0.9743 0.5574 0.9743 0.9871
No log 3.4783 80 1.0223 0.5222 1.0223 1.0111
No log 3.5652 82 1.0733 0.4521 1.0733 1.0360
No log 3.6522 84 0.9865 0.4449 0.9865 0.9932
No log 3.7391 86 0.8437 0.5396 0.8437 0.9185
No log 3.8261 88 0.8051 0.4800 0.8051 0.8973
No log 3.9130 90 0.8963 0.4932 0.8963 0.9467
No log 4.0 92 1.0226 0.3727 1.0226 1.0112
No log 4.0870 94 0.9629 0.4476 0.9629 0.9813
No log 4.1739 96 0.9107 0.4048 0.9107 0.9543
No log 4.2609 98 1.0305 0.4661 1.0305 1.0151
No log 4.3478 100 1.1456 0.4611 1.1456 1.0703
No log 4.4348 102 1.1777 0.4501 1.1777 1.0852
No log 4.5217 104 1.0804 0.4519 1.0804 1.0394
No log 4.6087 106 1.0689 0.4936 1.0689 1.0339
No log 4.6957 108 1.0266 0.5299 1.0266 1.0132
No log 4.7826 110 1.0006 0.5635 1.0006 1.0003
No log 4.8696 112 0.9851 0.4745 0.9851 0.9925
No log 4.9565 114 0.9220 0.5312 0.9220 0.9602
No log 5.0435 116 0.8873 0.5121 0.8873 0.9419
No log 5.1304 118 0.8751 0.4936 0.8751 0.9354
No log 5.2174 120 0.9128 0.5102 0.9128 0.9554
No log 5.3043 122 1.0115 0.5249 1.0115 1.0057
No log 5.3913 124 1.1448 0.4905 1.1448 1.0699
No log 5.4783 126 1.1351 0.4905 1.1351 1.0654
No log 5.5652 128 0.9847 0.5134 0.9847 0.9923
No log 5.6522 130 0.8209 0.5235 0.8209 0.9061
No log 5.7391 132 0.7201 0.6176 0.7201 0.8486
No log 5.8261 134 0.7234 0.5583 0.7234 0.8505
No log 5.9130 136 0.7611 0.5367 0.7611 0.8724
No log 6.0 138 0.8841 0.5542 0.8841 0.9402
No log 6.0870 140 0.9671 0.5781 0.9671 0.9834
No log 6.1739 142 1.0286 0.5537 1.0286 1.0142
No log 6.2609 144 0.9825 0.5556 0.9825 0.9912
No log 6.3478 146 1.0197 0.5144 1.0197 1.0098
No log 6.4348 148 1.0117 0.5232 1.0117 1.0059
No log 6.5217 150 0.9525 0.5077 0.9525 0.9760
No log 6.6087 152 0.8934 0.4792 0.8934 0.9452
No log 6.6957 154 0.9058 0.4546 0.9058 0.9518
No log 6.7826 156 1.0064 0.4539 1.0064 1.0032
No log 6.8696 158 1.2208 0.4487 1.2208 1.1049
No log 6.9565 160 1.3388 0.4726 1.3388 1.1571
No log 7.0435 162 1.1447 0.4955 1.1447 1.0699
No log 7.1304 164 0.9170 0.5172 0.9170 0.9576
No log 7.2174 166 0.8386 0.5678 0.8386 0.9157
No log 7.3043 168 0.8125 0.5558 0.8125 0.9014
No log 7.3913 170 0.7807 0.5898 0.7807 0.8836
No log 7.4783 172 0.7616 0.5959 0.7616 0.8727
No log 7.5652 174 0.7463 0.5774 0.7463 0.8639
No log 7.6522 176 0.7381 0.5661 0.7381 0.8591
No log 7.7391 178 0.7336 0.6070 0.7336 0.8565
No log 7.8261 180 0.7326 0.5925 0.7326 0.8559
No log 7.9130 182 0.7541 0.6191 0.7541 0.8684
No log 8.0 184 0.8568 0.5776 0.8568 0.9257
No log 8.0870 186 1.1351 0.5449 1.1351 1.0654
No log 8.1739 188 1.1981 0.5598 1.1981 1.0946
No log 8.2609 190 1.0364 0.5443 1.0364 1.0180
No log 8.3478 192 0.8715 0.5907 0.8715 0.9335
No log 8.4348 194 0.8011 0.5851 0.8011 0.8950
No log 8.5217 196 0.7873 0.5607 0.7873 0.8873
No log 8.6087 198 0.7891 0.5647 0.7891 0.8883
No log 8.6957 200 0.8226 0.5586 0.8226 0.9070
No log 8.7826 202 0.8867 0.4958 0.8867 0.9416
No log 8.8696 204 0.9223 0.5637 0.9223 0.9603
No log 8.9565 206 0.9021 0.5556 0.9021 0.9498
No log 9.0435 208 0.8465 0.6026 0.8465 0.9200
No log 9.1304 210 0.8324 0.5957 0.8324 0.9124
No log 9.2174 212 0.8396 0.5561 0.8396 0.9163
No log 9.3043 214 0.8499 0.5561 0.8499 0.9219
No log 9.3913 216 0.8791 0.5268 0.8791 0.9376
No log 9.4783 218 0.9394 0.4732 0.9394 0.9692
No log 9.5652 220 0.9253 0.4732 0.9253 0.9619
No log 9.6522 222 0.8405 0.5102 0.8405 0.9168
No log 9.7391 224 0.8337 0.5509 0.8337 0.9131
No log 9.8261 226 0.8687 0.4994 0.8687 0.9320
No log 9.9130 228 0.8811 0.5010 0.8811 0.9387
No log 10.0 230 0.9505 0.5430 0.9505 0.9749
No log 10.0870 232 1.1593 0.3948 1.1593 1.0767
No log 10.1739 234 1.1754 0.3581 1.1754 1.0842
No log 10.2609 236 1.0276 0.4505 1.0276 1.0137
No log 10.3478 238 0.8750 0.5624 0.8750 0.9354
No log 10.4348 240 0.8035 0.5938 0.8035 0.8964
No log 10.5217 242 0.7958 0.5699 0.7958 0.8921
No log 10.6087 244 0.7847 0.6051 0.7847 0.8858
No log 10.6957 246 0.7823 0.5944 0.7823 0.8845
No log 10.7826 248 0.7805 0.6230 0.7805 0.8834
No log 10.8696 250 0.8462 0.5365 0.8462 0.9199
No log 10.9565 252 0.8917 0.4841 0.8917 0.9443
No log 11.0435 254 0.9076 0.5015 0.9076 0.9527
No log 11.1304 256 0.8430 0.5561 0.8430 0.9182
No log 11.2174 258 0.8032 0.5481 0.8032 0.8962
No log 11.3043 260 0.8223 0.5121 0.8223 0.9068
No log 11.3913 262 0.8614 0.5138 0.8614 0.9281
No log 11.4783 264 0.8445 0.4968 0.8445 0.9189
No log 11.5652 266 0.8770 0.5787 0.8770 0.9365
No log 11.6522 268 0.9876 0.4714 0.9876 0.9938
No log 11.7391 270 1.0735 0.4123 1.0735 1.0361
No log 11.8261 272 1.1575 0.3752 1.1575 1.0759
No log 11.9130 274 1.1092 0.3785 1.1092 1.0532
No log 12.0 276 0.9616 0.5216 0.9616 0.9806
No log 12.0870 278 0.8480 0.5864 0.8480 0.9209
No log 12.1739 280 0.8029 0.5509 0.8029 0.8960
No log 12.2609 282 0.7931 0.5487 0.7931 0.8905
No log 12.3478 284 0.7953 0.5420 0.7953 0.8918
No log 12.4348 286 0.7992 0.5673 0.7992 0.8940
No log 12.5217 288 0.7935 0.5498 0.7935 0.8908
No log 12.6087 290 0.8112 0.5841 0.8112 0.9007
No log 12.6957 292 0.8136 0.5847 0.8136 0.9020
No log 12.7826 294 0.7690 0.5674 0.7690 0.8769
No log 12.8696 296 0.7597 0.5610 0.7597 0.8716
No log 12.9565 298 0.7641 0.5495 0.7641 0.8741
No log 13.0435 300 0.8004 0.5677 0.8004 0.8947
No log 13.1304 302 0.8373 0.5534 0.8373 0.9151
No log 13.2174 304 0.8617 0.5487 0.8617 0.9283
No log 13.3043 306 0.8619 0.5245 0.8619 0.9284
No log 13.3913 308 0.8400 0.5610 0.8400 0.9165
No log 13.4783 310 0.8524 0.5418 0.8524 0.9233
No log 13.5652 312 0.8508 0.5683 0.8508 0.9224
No log 13.6522 314 0.8410 0.5892 0.8410 0.9171
No log 13.7391 316 0.8526 0.5624 0.8526 0.9233
No log 13.8261 318 0.9147 0.5772 0.9147 0.9564
No log 13.9130 320 0.9192 0.5124 0.9192 0.9587
No log 14.0 322 0.8564 0.5114 0.8564 0.9254
No log 14.0870 324 0.8230 0.5283 0.8230 0.9072
No log 14.1739 326 0.8115 0.5184 0.8115 0.9008
No log 14.2609 328 0.8099 0.5203 0.8099 0.9000
No log 14.3478 330 0.8173 0.5165 0.8173 0.9041
No log 14.4348 332 0.8363 0.5610 0.8363 0.9145
No log 14.5217 334 0.9124 0.5268 0.9124 0.9552
No log 14.6087 336 0.9516 0.5614 0.9516 0.9755
No log 14.6957 338 0.9233 0.5721 0.9233 0.9609
No log 14.7826 340 0.8597 0.5313 0.8597 0.9272
No log 14.8696 342 0.8117 0.5787 0.8117 0.9009
No log 14.9565 344 0.8143 0.4764 0.8143 0.9024
No log 15.0435 346 0.8525 0.5135 0.8525 0.9233
No log 15.1304 348 0.8753 0.4007 0.8753 0.9356
No log 15.2174 350 0.8873 0.4010 0.8873 0.9420
No log 15.3043 352 0.8524 0.5473 0.8524 0.9232
No log 15.3913 354 0.8045 0.5696 0.8045 0.8969
No log 15.4783 356 0.7949 0.5611 0.7949 0.8915
No log 15.5652 358 0.8366 0.6110 0.8366 0.9147
No log 15.6522 360 0.8551 0.6054 0.8551 0.9247
No log 15.7391 362 0.8197 0.6208 0.8197 0.9054
No log 15.8261 364 0.7931 0.5670 0.7931 0.8906
No log 15.9130 366 0.8208 0.5836 0.8208 0.9060
No log 16.0 368 0.8608 0.5954 0.8608 0.9278
No log 16.0870 370 0.8209 0.5819 0.8209 0.9061
No log 16.1739 372 0.7901 0.5649 0.7901 0.8889
No log 16.2609 374 0.7589 0.5868 0.7589 0.8712
No log 16.3478 376 0.7497 0.5983 0.7497 0.8659
No log 16.4348 378 0.7488 0.5974 0.7488 0.8653
No log 16.5217 380 0.7690 0.5944 0.7690 0.8769
No log 16.6087 382 0.8451 0.6111 0.8451 0.9193
No log 16.6957 384 0.8987 0.5905 0.8987 0.9480
No log 16.7826 386 0.8691 0.5905 0.8691 0.9323
No log 16.8696 388 0.7945 0.5648 0.7945 0.8914
No log 16.9565 390 0.7619 0.5974 0.7619 0.8729
No log 17.0435 392 0.7651 0.5410 0.7651 0.8747
No log 17.1304 394 0.7633 0.5688 0.7633 0.8737
No log 17.2174 396 0.7790 0.6338 0.7790 0.8826
No log 17.3043 398 0.7846 0.6089 0.7846 0.8858
No log 17.3913 400 0.7555 0.6228 0.7555 0.8692
No log 17.4783 402 0.7540 0.6228 0.7540 0.8683
No log 17.5652 404 0.7720 0.6385 0.7720 0.8786
No log 17.6522 406 0.7775 0.6385 0.7775 0.8818
No log 17.7391 408 0.8063 0.6214 0.8063 0.8979
No log 17.8261 410 0.7893 0.6434 0.7893 0.8884
No log 17.9130 412 0.7461 0.6404 0.7461 0.8638
No log 18.0 414 0.7302 0.5898 0.7302 0.8545
No log 18.0870 416 0.7343 0.5971 0.7343 0.8569
No log 18.1739 418 0.7417 0.5650 0.7417 0.8612
No log 18.2609 420 0.7581 0.5419 0.7581 0.8707
No log 18.3478 422 0.7857 0.5396 0.7857 0.8864
No log 18.4348 424 0.8266 0.5933 0.8266 0.9092
No log 18.5217 426 0.9258 0.5015 0.9258 0.9622
No log 18.6087 428 0.9781 0.4380 0.9781 0.9890
No log 18.6957 430 0.9138 0.4918 0.9138 0.9559
No log 18.7826 432 0.8367 0.5338 0.8367 0.9147
No log 18.8696 434 0.8315 0.5214 0.8315 0.9118
No log 18.9565 436 0.8447 0.4643 0.8447 0.9191
No log 19.0435 438 0.8726 0.5079 0.8726 0.9342
No log 19.1304 440 0.8809 0.5116 0.8809 0.9386
No log 19.2174 442 0.8496 0.5988 0.8496 0.9217
No log 19.3043 444 0.8127 0.6089 0.8127 0.9015
No log 19.3913 446 0.7865 0.6044 0.7865 0.8868
No log 19.4783 448 0.7724 0.6242 0.7724 0.8788
No log 19.5652 450 0.7540 0.6388 0.7540 0.8683
No log 19.6522 452 0.7693 0.6132 0.7693 0.8771
No log 19.7391 454 0.8057 0.5933 0.8057 0.8976
No log 19.8261 456 0.8533 0.5340 0.8533 0.9238
No log 19.9130 458 0.8415 0.5056 0.8415 0.9173
No log 20.0 460 0.7655 0.5912 0.7655 0.8749
No log 20.0870 462 0.7232 0.6144 0.7232 0.8504
No log 20.1739 464 0.7130 0.6341 0.7130 0.8444
No log 20.2609 466 0.7188 0.6664 0.7188 0.8478
No log 20.3478 468 0.7370 0.6781 0.7370 0.8585
No log 20.4348 470 0.7343 0.6781 0.7343 0.8569
No log 20.5217 472 0.7214 0.6570 0.7214 0.8494
No log 20.6087 474 0.7012 0.6328 0.7012 0.8374
No log 20.6957 476 0.6914 0.6594 0.6914 0.8315
No log 20.7826 478 0.6956 0.6756 0.6956 0.8340
No log 20.8696 480 0.7378 0.6481 0.7378 0.8589
No log 20.9565 482 0.7536 0.6315 0.7536 0.8681
No log 21.0435 484 0.7093 0.6369 0.7093 0.8422
No log 21.1304 486 0.6743 0.6605 0.6743 0.8212
No log 21.2174 488 0.6883 0.6437 0.6883 0.8296
No log 21.3043 490 0.7305 0.5998 0.7305 0.8547
No log 21.3913 492 0.7472 0.5697 0.7472 0.8644
No log 21.4783 494 0.7336 0.5916 0.7336 0.8565
No log 21.5652 496 0.7202 0.5648 0.7202 0.8486
No log 21.6522 498 0.7349 0.5983 0.7349 0.8573
0.2863 21.7391 500 0.7469 0.6041 0.7469 0.8642
0.2863 21.8261 502 0.7504 0.5898 0.7504 0.8662
0.2863 21.9130 504 0.7683 0.5957 0.7683 0.8765
0.2863 22.0 506 0.7958 0.5819 0.7958 0.8921
0.2863 22.0870 508 0.7928 0.6029 0.7928 0.8904
0.2863 22.1739 510 0.7732 0.5750 0.7732 0.8793

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (F32 tensors, Safetensors format)

Full model ID: MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k4_task2_organization (fine-tuned from aubmindlab/bert-base-arabertv02)