ArabicNewSplits8_usingWellWrittenEssays_FineTuningAraBERT_run1_AugV5_k2_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6562
  • Qwk: 0.4943
  • Mse: 0.6562
  • Rmse: 0.8101
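Qwk here is quadratic weighted kappa, the standard agreement metric for ordinal essay-scoring labels; Mse and Rmse are the usual squared-error metrics on the same predictions. A minimal NumPy sketch of how these metrics are typically computed (the 3-class label space and the toy predictions are illustrative assumptions, not taken from this model's data):

```python
import numpy as np

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    # Observed agreement (confusion) matrix
    O = np.zeros((n_classes, n_classes))
    for t, p in zip(y_true, y_pred):
        O[t, p] += 1
    # Expected matrix under independence, from the marginal histograms
    E = np.outer(O.sum(axis=1), O.sum(axis=0)) / len(y_true)
    # Quadratic weights: penalty grows with the squared label distance
    i, j = np.indices((n_classes, n_classes))
    W = (i - j) ** 2 / (n_classes - 1) ** 2
    return 1.0 - (W * O).sum() / (W * E).sum()

# Toy ordinal scores on a hypothetical 0..2 scale
y_true = np.array([0, 1, 2, 2, 1])
y_pred = np.array([0, 2, 2, 1, 1])

qwk = quadratic_weighted_kappa(y_true, y_pred, n_classes=3)
mse = np.mean((y_true - y_pred) ** 2)
rmse = np.sqrt(mse)
```

Note that when the regression head emits a single continuous score, Mse equals the loss and Rmse is its square root, which matches the Loss/Mse/Rmse columns being identical throughout the table below.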

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
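With a linear scheduler and no warmup listed, the learning rate decays from 2e-05 to 0 over the full run. The training log below reports steps at epoch increments of ~0.1538 (step 2 at epoch 0.1538), which implies roughly 13 optimizer steps per epoch, i.e. about 1300 total steps over 100 epochs. A plain-Python sketch of that schedule (the 13 steps/epoch figure is inferred from the log, not stated in the card):

```python
def linear_lr(step, total_steps, base_lr=2e-5):
    # Transformers-style linear decay without warmup:
    # lr falls linearly from base_lr at step 0 to 0 at total_steps
    return base_lr * max(0.0, (total_steps - step) / total_steps)

steps_per_epoch = 13          # inferred: step 2 ~ epoch 0.1538
total_steps = steps_per_epoch * 100

# Learning rate at the start, midpoint, and end of training
start_lr = linear_lr(0, total_steps)
mid_lr = linear_lr(total_steps // 2, total_steps)
end_lr = linear_lr(total_steps, total_steps)
```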

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.1538 2 4.1669 0.0002 4.1669 2.0413
No log 0.3077 4 2.6283 0.0750 2.6283 1.6212
No log 0.4615 6 1.2270 0.0361 1.2270 1.1077
No log 0.6154 8 1.0530 -0.0572 1.0530 1.0262
No log 0.7692 10 0.9278 0.0441 0.9278 0.9632
No log 0.9231 12 0.9422 0.0212 0.9422 0.9707
No log 1.0769 14 0.9425 0.0450 0.9425 0.9708
No log 1.2308 16 0.8317 0.0857 0.8317 0.9120
No log 1.3846 18 0.7918 0.1890 0.7918 0.8898
No log 1.5385 20 0.8778 0.2034 0.8778 0.9369
No log 1.6923 22 1.1355 0.0153 1.1355 1.0656
No log 1.8462 24 1.2463 0.0 1.2463 1.1164
No log 2.0 26 1.2799 0.0055 1.2799 1.1313
No log 2.1538 28 1.3177 0.0154 1.3177 1.1479
No log 2.3077 30 0.8882 0.1825 0.8882 0.9424
No log 2.4615 32 0.8213 0.1359 0.8213 0.9062
No log 2.6154 34 0.8670 0.0509 0.8670 0.9311
No log 2.7692 36 0.8092 0.0726 0.8092 0.8996
No log 2.9231 38 0.8416 0.1316 0.8416 0.9174
No log 3.0769 40 0.8272 0.2245 0.8272 0.9095
No log 3.2308 42 0.8040 0.2772 0.8040 0.8967
No log 3.3846 44 0.8088 0.3254 0.8088 0.8993
No log 3.5385 46 0.8222 0.2135 0.8222 0.9068
No log 3.6923 48 0.8390 0.2578 0.8390 0.9160
No log 3.8462 50 0.9695 0.2250 0.9695 0.9846
No log 4.0 52 0.9658 0.2592 0.9658 0.9827
No log 4.1538 54 0.9458 0.2669 0.9458 0.9725
No log 4.3077 56 0.9077 0.2731 0.9077 0.9527
No log 4.4615 58 0.8745 0.2789 0.8745 0.9351
No log 4.6154 60 0.7814 0.2270 0.7814 0.8840
No log 4.7692 62 0.8188 0.2859 0.8188 0.9049
No log 4.9231 64 0.8777 0.3186 0.8777 0.9369
No log 5.0769 66 0.8285 0.4109 0.8285 0.9102
No log 5.2308 68 0.8579 0.3809 0.8579 0.9262
No log 5.3846 70 0.8911 0.4193 0.8911 0.9440
No log 5.5385 72 0.9348 0.3718 0.9348 0.9668
No log 5.6923 74 0.9385 0.3795 0.9385 0.9688
No log 5.8462 76 0.9785 0.4228 0.9785 0.9892
No log 6.0 78 0.8701 0.4180 0.8701 0.9328
No log 6.1538 80 0.8074 0.4751 0.8074 0.8986
No log 6.3077 82 0.7684 0.4760 0.7684 0.8766
No log 6.4615 84 0.7048 0.4419 0.7048 0.8395
No log 6.6154 86 0.8009 0.3545 0.8009 0.8950
No log 6.7692 88 0.8983 0.3419 0.8983 0.9478
No log 6.9231 90 0.8763 0.3310 0.8763 0.9361
No log 7.0769 92 0.8192 0.4725 0.8192 0.9051
No log 7.2308 94 0.7926 0.5628 0.7926 0.8903
No log 7.3846 96 0.7780 0.4560 0.7780 0.8820
No log 7.5385 98 0.7430 0.5277 0.7430 0.8620
No log 7.6923 100 0.7906 0.5776 0.7906 0.8892
No log 7.8462 102 0.8129 0.5415 0.8129 0.9016
No log 8.0 104 0.8053 0.5361 0.8053 0.8974
No log 8.1538 106 0.8694 0.4615 0.8694 0.9324
No log 8.3077 108 0.9768 0.4839 0.9768 0.9883
No log 8.4615 110 0.9635 0.4858 0.9635 0.9816
No log 8.6154 112 0.9564 0.4626 0.9564 0.9780
No log 8.7692 114 0.7705 0.5008 0.7705 0.8778
No log 8.9231 116 0.7113 0.5089 0.7113 0.8434
No log 9.0769 118 0.7222 0.5091 0.7222 0.8498
No log 9.2308 120 0.6998 0.5436 0.6998 0.8365
No log 9.3846 122 0.7388 0.5008 0.7388 0.8595
No log 9.5385 124 0.7094 0.5353 0.7094 0.8423
No log 9.6923 126 0.6988 0.5578 0.6988 0.8360
No log 9.8462 128 0.7546 0.5005 0.7546 0.8687
No log 10.0 130 0.7242 0.4923 0.7242 0.8510
No log 10.1538 132 0.8054 0.4639 0.8054 0.8974
No log 10.3077 134 0.8722 0.4312 0.8722 0.9339
No log 10.4615 136 0.8768 0.4774 0.8768 0.9364
No log 10.6154 138 0.7710 0.4915 0.7710 0.8781
No log 10.7692 140 0.7777 0.4718 0.7777 0.8819
No log 10.9231 142 0.7484 0.4785 0.7484 0.8651
No log 11.0769 144 0.7485 0.4936 0.7485 0.8651
No log 11.2308 146 0.8329 0.4380 0.8329 0.9126
No log 11.3846 148 0.7821 0.4896 0.7821 0.8844
No log 11.5385 150 0.7287 0.4911 0.7287 0.8537
No log 11.6923 152 0.7277 0.4831 0.7277 0.8531
No log 11.8462 154 0.7329 0.5057 0.7329 0.8561
No log 12.0 156 0.7460 0.4891 0.7460 0.8637
No log 12.1538 158 0.7637 0.4780 0.7637 0.8739
No log 12.3077 160 0.7101 0.4603 0.7101 0.8427
No log 12.4615 162 0.6448 0.4777 0.6448 0.8030
No log 12.6154 164 0.6809 0.4581 0.6809 0.8252
No log 12.7692 166 0.7006 0.4588 0.7006 0.8370
No log 12.9231 168 0.6834 0.4815 0.6834 0.8267
No log 13.0769 170 0.7962 0.4549 0.7962 0.8923
No log 13.2308 172 0.9033 0.4306 0.9033 0.9504
No log 13.3846 174 0.8654 0.4977 0.8654 0.9303
No log 13.5385 176 0.8124 0.5256 0.8124 0.9013
No log 13.6923 178 0.8643 0.4958 0.8643 0.9297
No log 13.8462 180 0.8723 0.4388 0.8723 0.9340
No log 14.0 182 0.7218 0.4459 0.7218 0.8496
No log 14.1538 184 0.6256 0.4747 0.6256 0.7910
No log 14.3077 186 0.6523 0.5517 0.6523 0.8076
No log 14.4615 188 0.6914 0.4984 0.6914 0.8315
No log 14.6154 190 0.6509 0.5704 0.6509 0.8068
No log 14.7692 192 0.6838 0.4771 0.6838 0.8269
No log 14.9231 194 0.7751 0.4625 0.7751 0.8804
No log 15.0769 196 0.7760 0.4838 0.7760 0.8809
No log 15.2308 198 0.7070 0.5260 0.7070 0.8408
No log 15.3846 200 0.7221 0.5127 0.7221 0.8498
No log 15.5385 202 0.7644 0.4975 0.7644 0.8743
No log 15.6923 204 0.7272 0.5003 0.7272 0.8528
No log 15.8462 206 0.7185 0.5179 0.7185 0.8476
No log 16.0 208 0.7182 0.5116 0.7182 0.8475
No log 16.1538 210 0.7095 0.4971 0.7095 0.8423
No log 16.3077 212 0.7072 0.4932 0.7072 0.8410
No log 16.4615 214 0.6880 0.5379 0.6880 0.8294
No log 16.6154 216 0.6806 0.5353 0.6806 0.8250
No log 16.7692 218 0.6900 0.5307 0.6900 0.8307
No log 16.9231 220 0.7143 0.5526 0.7143 0.8452
No log 17.0769 222 0.7169 0.5463 0.7169 0.8467
No log 17.2308 224 0.6977 0.5665 0.6977 0.8353
No log 17.3846 226 0.6887 0.5128 0.6887 0.8299
No log 17.5385 228 0.6644 0.5570 0.6644 0.8151
No log 17.6923 230 0.6625 0.5473 0.6625 0.8139
No log 17.8462 232 0.6415 0.5076 0.6415 0.8009
No log 18.0 234 0.6288 0.5261 0.6288 0.7929
No log 18.1538 236 0.6422 0.5125 0.6422 0.8014
No log 18.3077 238 0.6788 0.5126 0.6788 0.8239
No log 18.4615 240 0.7002 0.5200 0.7002 0.8368
No log 18.6154 242 0.7355 0.5391 0.7355 0.8576
No log 18.7692 244 0.7245 0.5386 0.7245 0.8512
No log 18.9231 246 0.7042 0.5347 0.7042 0.8392
No log 19.0769 248 0.6830 0.5420 0.6830 0.8265
No log 19.2308 250 0.6638 0.5610 0.6638 0.8147
No log 19.3846 252 0.6974 0.4820 0.6974 0.8351
No log 19.5385 254 0.6524 0.4907 0.6524 0.8077
No log 19.6923 256 0.5940 0.5257 0.5940 0.7707
No log 19.8462 258 0.5993 0.5552 0.5993 0.7742
No log 20.0 260 0.6205 0.5334 0.6205 0.7877
No log 20.1538 262 0.6417 0.5649 0.6417 0.8010
No log 20.3077 264 0.7325 0.5155 0.7325 0.8558
No log 20.4615 266 0.8066 0.5035 0.8066 0.8981
No log 20.6154 268 0.8258 0.5149 0.8258 0.9087
No log 20.7692 270 0.8111 0.5384 0.8111 0.9006
No log 20.9231 272 0.7576 0.5478 0.7576 0.8704
No log 21.0769 274 0.7205 0.5249 0.7205 0.8488
No log 21.2308 276 0.6835 0.5092 0.6835 0.8267
No log 21.3846 278 0.6593 0.5453 0.6593 0.8120
No log 21.5385 280 0.6424 0.5185 0.6424 0.8015
No log 21.6923 282 0.6392 0.5159 0.6392 0.7995
No log 21.8462 284 0.6376 0.4713 0.6376 0.7985
No log 22.0 286 0.6608 0.4778 0.6608 0.8129
No log 22.1538 288 0.7104 0.4467 0.7104 0.8429
No log 22.3077 290 0.6866 0.4393 0.6866 0.8286
No log 22.4615 292 0.6776 0.4943 0.6776 0.8232
No log 22.6154 294 0.6603 0.4845 0.6603 0.8126
No log 22.7692 296 0.6470 0.4879 0.6470 0.8043
No log 22.9231 298 0.6510 0.4379 0.6510 0.8068
No log 23.0769 300 0.6803 0.4896 0.6803 0.8248
No log 23.2308 302 0.6540 0.4399 0.6540 0.8087
No log 23.3846 304 0.5886 0.4158 0.5886 0.7672
No log 23.5385 306 0.5798 0.5216 0.5798 0.7614
No log 23.6923 308 0.5958 0.5241 0.5958 0.7719
No log 23.8462 310 0.5803 0.5649 0.5803 0.7618
No log 24.0 312 0.5977 0.5848 0.5977 0.7731
No log 24.1538 314 0.6105 0.5312 0.6105 0.7813
No log 24.3077 316 0.6295 0.5213 0.6295 0.7934
No log 24.4615 318 0.6414 0.5304 0.6414 0.8009
No log 24.6154 320 0.6650 0.5354 0.6650 0.8154
No log 24.7692 322 0.6760 0.5580 0.6760 0.8222
No log 24.9231 324 0.6620 0.5453 0.6620 0.8136
No log 25.0769 326 0.6402 0.5523 0.6402 0.8002
No log 25.2308 328 0.6305 0.5580 0.6305 0.7940
No log 25.3846 330 0.6266 0.5177 0.6266 0.7916
No log 25.5385 332 0.6272 0.5130 0.6272 0.7919
No log 25.6923 334 0.6249 0.4962 0.6249 0.7905
No log 25.8462 336 0.6397 0.5039 0.6397 0.7998
No log 26.0 338 0.6663 0.4780 0.6663 0.8163
No log 26.1538 340 0.8075 0.4801 0.8075 0.8986
No log 26.3077 342 0.9590 0.4354 0.9590 0.9793
No log 26.4615 344 0.9104 0.4632 0.9104 0.9542
No log 26.6154 346 0.7201 0.4439 0.7201 0.8486
No log 26.7692 348 0.6621 0.4359 0.6621 0.8137
No log 26.9231 350 0.6579 0.4375 0.6579 0.8111
No log 27.0769 352 0.6832 0.4732 0.6832 0.8265
No log 27.2308 354 0.6757 0.4696 0.6757 0.8220
No log 27.3846 356 0.6635 0.4450 0.6635 0.8145
No log 27.5385 358 0.6902 0.4642 0.6902 0.8308
No log 27.6923 360 0.7128 0.5088 0.7128 0.8443
No log 27.8462 362 0.7033 0.5228 0.7033 0.8386
No log 28.0 364 0.6790 0.5436 0.6790 0.8240
No log 28.1538 366 0.6695 0.5596 0.6695 0.8183
No log 28.3077 368 0.6551 0.5707 0.6551 0.8094
No log 28.4615 370 0.6503 0.5610 0.6503 0.8064
No log 28.6154 372 0.6358 0.5393 0.6358 0.7974
No log 28.7692 374 0.6278 0.5299 0.6278 0.7923
No log 28.9231 376 0.6321 0.5220 0.6321 0.7951
No log 29.0769 378 0.6316 0.5307 0.6316 0.7947
No log 29.2308 380 0.6136 0.5460 0.6136 0.7833
No log 29.3846 382 0.6085 0.5598 0.6085 0.7800
No log 29.5385 384 0.6160 0.5511 0.6160 0.7849
No log 29.6923 386 0.6411 0.5235 0.6411 0.8007
No log 29.8462 388 0.6413 0.5003 0.6413 0.8008
No log 30.0 390 0.6492 0.5583 0.6492 0.8057
No log 30.1538 392 0.6664 0.5202 0.6664 0.8163
No log 30.3077 394 0.6563 0.5410 0.6563 0.8101
No log 30.4615 396 0.6613 0.5565 0.6613 0.8132
No log 30.6154 398 0.6675 0.5260 0.6675 0.8170
No log 30.7692 400 0.7046 0.5127 0.7046 0.8394
No log 30.9231 402 0.7217 0.5152 0.7217 0.8496
No log 31.0769 404 0.7522 0.5406 0.7522 0.8673
No log 31.2308 406 0.7833 0.5253 0.7833 0.8850
No log 31.3846 408 0.7832 0.5214 0.7832 0.8850
No log 31.5385 410 0.7771 0.5525 0.7771 0.8815
No log 31.6923 412 0.7576 0.5191 0.7576 0.8704
No log 31.8462 414 0.7423 0.5003 0.7423 0.8616
No log 32.0 416 0.6816 0.4759 0.6816 0.8256
No log 32.1538 418 0.6283 0.5386 0.6283 0.7927
No log 32.3077 420 0.6441 0.5208 0.6441 0.8025
No log 32.4615 422 0.6744 0.4801 0.6744 0.8212
No log 32.6154 424 0.6777 0.5129 0.6777 0.8232
No log 32.7692 426 0.6880 0.4909 0.6880 0.8295
No log 32.9231 428 0.6829 0.4963 0.6829 0.8264
No log 33.0769 430 0.6556 0.4956 0.6556 0.8097
No log 33.2308 432 0.6417 0.5231 0.6417 0.8010
No log 33.3846 434 0.6382 0.5541 0.6382 0.7989
No log 33.5385 436 0.6319 0.5421 0.6319 0.7949
No log 33.6923 438 0.6311 0.5091 0.6311 0.7944
No log 33.8462 440 0.6368 0.5169 0.6368 0.7980
No log 34.0 442 0.6290 0.5159 0.6290 0.7931
No log 34.1538 444 0.6015 0.5607 0.6015 0.7755
No log 34.3077 446 0.5959 0.5323 0.5959 0.7719
No log 34.4615 448 0.6194 0.5371 0.6194 0.7870
No log 34.6154 450 0.6465 0.4933 0.6465 0.8040
No log 34.7692 452 0.6442 0.4812 0.6442 0.8026
No log 34.9231 454 0.6268 0.5304 0.6268 0.7917
No log 35.0769 456 0.6166 0.5483 0.6166 0.7853
No log 35.2308 458 0.6281 0.5393 0.6281 0.7925
No log 35.3846 460 0.6409 0.5450 0.6409 0.8005
No log 35.5385 462 0.6211 0.5505 0.6211 0.7881
No log 35.6923 464 0.5978 0.5258 0.5978 0.7732
No log 35.8462 466 0.5977 0.5274 0.5977 0.7731
No log 36.0 468 0.6067 0.5167 0.6067 0.7789
No log 36.1538 470 0.6211 0.5167 0.6211 0.7881
No log 36.3077 472 0.6220 0.5153 0.6220 0.7887
No log 36.4615 474 0.6155 0.5129 0.6155 0.7845
No log 36.6154 476 0.6041 0.5176 0.6041 0.7773
No log 36.7692 478 0.6005 0.5216 0.6005 0.7749
No log 36.9231 480 0.5985 0.5216 0.5985 0.7737
No log 37.0769 482 0.5921 0.5266 0.5921 0.7695
No log 37.2308 484 0.6054 0.5379 0.6054 0.7781
No log 37.3846 486 0.6092 0.5481 0.6092 0.7805
No log 37.5385 488 0.5958 0.5105 0.5958 0.7719
No log 37.6923 490 0.5865 0.5137 0.5865 0.7658
No log 37.8462 492 0.5877 0.5464 0.5877 0.7666
No log 38.0 494 0.5880 0.5111 0.5880 0.7668
No log 38.1538 496 0.6005 0.5053 0.6005 0.7749
No log 38.3077 498 0.6396 0.5445 0.6396 0.7997
0.3361 38.4615 500 0.6511 0.5701 0.6511 0.8069
0.3361 38.6154 502 0.6222 0.5376 0.6222 0.7888
0.3361 38.7692 504 0.6224 0.5129 0.6224 0.7889
0.3361 38.9231 506 0.6555 0.4822 0.6555 0.8097
0.3361 39.0769 508 0.6771 0.4475 0.6771 0.8229
0.3361 39.2308 510 0.6562 0.4943 0.6562 0.8101

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
