ArabicNewSplits8_usingALLEssays_FineTuningAraBERT_run1_AugV5_k19_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02; the training dataset is not documented in this card (listed as "None"). It achieves the following results on the evaluation set (a loading sketch follows the list):

  • Loss: 0.7465
  • Qwk: 0.6551
  • Mse: 0.7465
  • Rmse: 0.8640
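
The checkpoint can be loaded with the standard transformers API. The snippet below is a minimal sketch, not the authors' documented usage: the repository id is taken from the model page, and reading the head output as a single numeric organization score is an assumption based on the regression-style metrics (Loss/Mse/Rmse) reported above.

```python
# Minimal loading sketch (assumption: the head output is read as a
# regression-style score, inferred from the MSE/RMSE/QWK metrics in this card).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo_id = "MayBashendy/ArabicNewSplits8_usingALLEssays_FineTuningAraBERT_run1_AugV5_k19_task1_organization"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

essay = "..."  # replace with a real Arabic essay
inputs = tokenizer(essay, return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, num_labels) as set in the checkpoint config
print(logits)
```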

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a Trainer configuration sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
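
The sketch below mirrors the listed hyperparameters in a Hugging Face Trainer setup; it is an assumption-laden reconstruction, not the authors' script. The output directory, datasets, model, and metric function are placeholders, and the evaluation/logging step settings are inferred from the results table (evaluation every 2 steps; training loss first logged at step 500). The listed Adam betas and epsilon match the Trainer's default AdamW settings, so they are not set explicitly.

```python
# Trainer configuration sketch mirroring the hyperparameters above.
# Placeholders (not documented in this card): output_dir, model, train_ds,
# eval_ds, compute_metrics.
from transformers import TrainingArguments, Trainer

training_args = TrainingArguments(
    output_dir="arabert_task1_organization",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    eval_strategy="steps",  # the results table reports an evaluation every 2 steps
    eval_steps=2,
    logging_steps=500,      # training loss first appears at step 500
)

# trainer = Trainer(model=model, args=training_args,
#                   train_dataset=train_ds, eval_dataset=eval_ds,
#                   compute_metrics=compute_metrics)
# trainer.train()
```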

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0215 2 5.5427 -0.0114 5.5427 2.3543
No log 0.0430 4 3.1865 0.0406 3.1865 1.7851
No log 0.0645 6 2.2089 -0.0355 2.2089 1.4862
No log 0.0860 8 2.1579 -0.0396 2.1579 1.4690
No log 0.1075 10 1.4672 0.0216 1.4672 1.2113
No log 0.1290 12 1.2098 0.1299 1.2098 1.0999
No log 0.1505 14 1.1828 0.1735 1.1828 1.0875
No log 0.1720 16 1.1828 0.1822 1.1828 1.0876
No log 0.1935 18 1.2424 0.3358 1.2424 1.1146
No log 0.2151 20 1.2604 0.3171 1.2604 1.1227
No log 0.2366 22 1.2195 0.1684 1.2195 1.1043
No log 0.2581 24 1.1845 0.1510 1.1845 1.0884
No log 0.2796 26 1.1738 0.1460 1.1738 1.0834
No log 0.3011 28 1.1658 0.3051 1.1658 1.0797
No log 0.3226 30 1.2641 0.1864 1.2641 1.1243
No log 0.3441 32 1.5996 0.0906 1.5996 1.2648
No log 0.3656 34 1.7325 0.1329 1.7325 1.3163
No log 0.3871 36 1.4848 0.0872 1.4848 1.2185
No log 0.4086 38 1.2836 0.1564 1.2836 1.1330
No log 0.4301 40 1.1155 0.3748 1.1155 1.0562
No log 0.4516 42 1.0747 0.2847 1.0747 1.0367
No log 0.4731 44 1.0777 0.3391 1.0777 1.0381
No log 0.4946 46 1.0705 0.3977 1.0705 1.0346
No log 0.5161 48 1.0407 0.4342 1.0407 1.0202
No log 0.5376 50 1.1112 0.3757 1.1112 1.0541
No log 0.5591 52 1.1738 0.3467 1.1738 1.0834
No log 0.5806 54 1.0983 0.4458 1.0983 1.0480
No log 0.6022 56 1.2535 0.3719 1.2535 1.1196
No log 0.6237 58 1.9710 0.2092 1.9710 1.4039
No log 0.6452 60 1.9767 0.2577 1.9767 1.4060
No log 0.6667 62 1.3095 0.3935 1.3095 1.1443
No log 0.6882 64 1.0567 0.4812 1.0567 1.0279
No log 0.7097 66 1.0427 0.4357 1.0427 1.0211
No log 0.7312 68 1.0563 0.3922 1.0563 1.0278
No log 0.7527 70 1.0146 0.4317 1.0146 1.0073
No log 0.7742 72 1.0021 0.4784 1.0021 1.0011
No log 0.7957 74 1.1839 0.4276 1.1839 1.0881
No log 0.8172 76 1.8411 0.2575 1.8411 1.3569
No log 0.8387 78 2.8554 0.1149 2.8554 1.6898
No log 0.8602 80 2.7883 0.1297 2.7883 1.6698
No log 0.8817 82 2.1481 0.2417 2.1481 1.4656
No log 0.9032 84 1.4001 0.2875 1.4001 1.1833
No log 0.9247 86 0.9722 0.4674 0.9722 0.9860
No log 0.9462 88 0.8935 0.4113 0.8935 0.9452
No log 0.9677 90 0.9135 0.4114 0.9135 0.9557
No log 0.9892 92 0.9717 0.4626 0.9717 0.9858
No log 1.0108 94 0.9844 0.4764 0.9844 0.9922
No log 1.0323 96 0.9027 0.4176 0.9027 0.9501
No log 1.0538 98 0.9177 0.5131 0.9177 0.9579
No log 1.0753 100 1.1038 0.5161 1.1038 1.0506
No log 1.0968 102 1.1420 0.5044 1.1420 1.0686
No log 1.1183 104 1.0064 0.5340 1.0064 1.0032
No log 1.1398 106 0.8779 0.4747 0.8779 0.9370
No log 1.1613 108 0.9239 0.4740 0.9239 0.9612
No log 1.1828 110 0.9387 0.5009 0.9387 0.9689
No log 1.2043 112 0.9429 0.5185 0.9429 0.9710
No log 1.2258 114 0.9998 0.5195 0.9998 0.9999
No log 1.2473 116 1.0289 0.5079 1.0289 1.0144
No log 1.2688 118 1.0878 0.4902 1.0878 1.0430
No log 1.2903 120 0.9745 0.5136 0.9745 0.9872
No log 1.3118 122 0.8741 0.5435 0.8741 0.9350
No log 1.3333 124 0.8677 0.5239 0.8677 0.9315
No log 1.3548 126 0.8710 0.4972 0.8710 0.9333
No log 1.3763 128 0.8693 0.5142 0.8693 0.9324
No log 1.3978 130 0.8529 0.5511 0.8529 0.9235
No log 1.4194 132 0.8385 0.5153 0.8385 0.9157
No log 1.4409 134 0.9079 0.5406 0.9079 0.9528
No log 1.4624 136 1.0686 0.5008 1.0686 1.0337
No log 1.4839 138 1.0566 0.4422 1.0566 1.0279
No log 1.5054 140 0.9357 0.5069 0.9357 0.9673
No log 1.5269 142 0.8092 0.5291 0.8092 0.8995
No log 1.5484 144 0.7838 0.6146 0.7838 0.8853
No log 1.5699 146 0.7983 0.6162 0.7983 0.8935
No log 1.5914 148 0.8399 0.6120 0.8399 0.9165
No log 1.6129 150 1.0267 0.5369 1.0267 1.0133
No log 1.6344 152 1.1443 0.4647 1.1443 1.0697
No log 1.6559 154 1.0516 0.5384 1.0516 1.0255
No log 1.6774 156 0.9037 0.5657 0.9037 0.9506
No log 1.6989 158 0.8707 0.5484 0.8707 0.9331
No log 1.7204 160 0.8746 0.5484 0.8746 0.9352
No log 1.7419 162 0.8636 0.5041 0.8636 0.9293
No log 1.7634 164 0.9455 0.4997 0.9455 0.9724
No log 1.7849 166 1.0729 0.4441 1.0729 1.0358
No log 1.8065 168 1.0465 0.4696 1.0465 1.0230
No log 1.8280 170 0.9481 0.5300 0.9481 0.9737
No log 1.8495 172 0.8986 0.5428 0.8986 0.9480
No log 1.8710 174 0.8863 0.5360 0.8863 0.9414
No log 1.8925 176 0.8881 0.5286 0.8881 0.9424
No log 1.9140 178 0.9997 0.5637 0.9997 0.9998
No log 1.9355 180 1.1330 0.5218 1.1330 1.0644
No log 1.9570 182 1.1029 0.5282 1.1029 1.0502
No log 1.9785 184 1.0775 0.5486 1.0775 1.0380
No log 2.0 186 1.0058 0.5240 1.0058 1.0029
No log 2.0215 188 0.9327 0.5138 0.9327 0.9658
No log 2.0430 190 0.8707 0.5159 0.8707 0.9331
No log 2.0645 192 0.8994 0.5371 0.8994 0.9484
No log 2.0860 194 0.9129 0.5711 0.9129 0.9555
No log 2.1075 196 0.8166 0.5771 0.8166 0.9037
No log 2.1290 198 0.8935 0.5928 0.8935 0.9453
No log 2.1505 200 1.1277 0.5245 1.1277 1.0619
No log 2.1720 202 1.3988 0.4806 1.3988 1.1827
No log 2.1935 204 1.5503 0.3967 1.5503 1.2451
No log 2.2151 206 1.3673 0.4369 1.3673 1.1693
No log 2.2366 208 1.1383 0.4861 1.1383 1.0669
No log 2.2581 210 0.8900 0.6125 0.8900 0.9434
No log 2.2796 212 0.8367 0.6177 0.8367 0.9147
No log 2.3011 214 0.8539 0.6025 0.8539 0.9241
No log 2.3226 216 0.8266 0.6287 0.8266 0.9092
No log 2.3441 218 0.8396 0.6171 0.8396 0.9163
No log 2.3656 220 0.9531 0.5301 0.9531 0.9763
No log 2.3871 222 0.9749 0.5680 0.9749 0.9874
No log 2.4086 224 0.9989 0.5620 0.9989 0.9995
No log 2.4301 226 0.9306 0.5702 0.9306 0.9647
No log 2.4516 228 0.8035 0.5892 0.8035 0.8964
No log 2.4731 230 0.7811 0.6358 0.7811 0.8838
No log 2.4946 232 0.8599 0.5997 0.8599 0.9273
No log 2.5161 234 1.0476 0.5052 1.0476 1.0235
No log 2.5376 236 1.0756 0.5226 1.0756 1.0371
No log 2.5591 238 0.9287 0.5180 0.9287 0.9637
No log 2.5806 240 0.7617 0.6456 0.7617 0.8728
No log 2.6022 242 0.7256 0.6568 0.7256 0.8518
No log 2.6237 244 0.7690 0.6621 0.7690 0.8769
No log 2.6452 246 0.8369 0.6202 0.8369 0.9148
No log 2.6667 248 0.8100 0.6187 0.8100 0.9000
No log 2.6882 250 0.7664 0.6076 0.7664 0.8754
No log 2.7097 252 0.7526 0.5988 0.7526 0.8675
No log 2.7312 254 0.7759 0.5898 0.7759 0.8808
No log 2.7527 256 0.7786 0.6098 0.7786 0.8824
No log 2.7742 258 0.7840 0.6115 0.7840 0.8855
No log 2.7957 260 0.8043 0.6349 0.8043 0.8968
No log 2.8172 262 0.8232 0.6368 0.8232 0.9073
No log 2.8387 264 0.8363 0.5835 0.8363 0.9145
No log 2.8602 266 0.8894 0.5911 0.8894 0.9431
No log 2.8817 268 0.8567 0.6051 0.8567 0.9256
No log 2.9032 270 0.7877 0.6443 0.7877 0.8875
No log 2.9247 272 0.7817 0.6470 0.7817 0.8842
No log 2.9462 274 0.7908 0.6398 0.7908 0.8893
No log 2.9677 276 0.8130 0.6417 0.8130 0.9016
No log 2.9892 278 0.8451 0.6080 0.8451 0.9193
No log 3.0108 280 0.8429 0.6194 0.8429 0.9181
No log 3.0323 282 0.8419 0.6296 0.8419 0.9176
No log 3.0538 284 0.8493 0.6348 0.8493 0.9216
No log 3.0753 286 0.7952 0.6495 0.7952 0.8917
No log 3.0968 288 0.7361 0.6560 0.7361 0.8580
No log 3.1183 290 0.7938 0.6542 0.7938 0.8909
No log 3.1398 292 0.8484 0.6162 0.8484 0.9211
No log 3.1613 294 0.9669 0.5938 0.9669 0.9833
No log 3.1828 296 0.9768 0.6100 0.9768 0.9883
No log 3.2043 298 0.8608 0.6106 0.8608 0.9278
No log 3.2258 300 0.7584 0.6590 0.7584 0.8709
No log 3.2473 302 0.7254 0.6728 0.7254 0.8517
No log 3.2688 304 0.7067 0.6595 0.7067 0.8407
No log 3.2903 306 0.7057 0.6574 0.7057 0.8401
No log 3.3118 308 0.8084 0.6146 0.8084 0.8991
No log 3.3333 310 1.0131 0.5633 1.0131 1.0065
No log 3.3548 312 1.0420 0.5917 1.0420 1.0208
No log 3.3763 314 0.8465 0.5866 0.8465 0.9200
No log 3.3978 316 0.7418 0.6472 0.7418 0.8613
No log 3.4194 318 0.7037 0.5978 0.7037 0.8388
No log 3.4409 320 0.7114 0.5751 0.7114 0.8435
No log 3.4624 322 0.7308 0.6118 0.7308 0.8549
No log 3.4839 324 0.8918 0.5881 0.8918 0.9444
No log 3.5054 326 1.0319 0.5346 1.0319 1.0158
No log 3.5269 328 1.0763 0.5287 1.0763 1.0374
No log 3.5484 330 0.9977 0.5412 0.9977 0.9988
No log 3.5699 332 0.8603 0.6202 0.8603 0.9275
No log 3.5914 334 0.7969 0.6374 0.7969 0.8927
No log 3.6129 336 0.8169 0.6313 0.8169 0.9038
No log 3.6344 338 0.9127 0.5763 0.9127 0.9554
No log 3.6559 340 1.0130 0.5741 1.0130 1.0065
No log 3.6774 342 0.9794 0.5741 0.9794 0.9896
No log 3.6989 344 0.9327 0.5815 0.9327 0.9658
No log 3.7204 346 0.7827 0.5792 0.7827 0.8847
No log 3.7419 348 0.7338 0.6243 0.7338 0.8566
No log 3.7634 350 0.7286 0.5894 0.7286 0.8536
No log 3.7849 352 0.7385 0.6101 0.7385 0.8593
No log 3.8065 354 0.8258 0.5818 0.8258 0.9088
No log 3.8280 356 0.9217 0.5817 0.9217 0.9601
No log 3.8495 358 0.9141 0.6025 0.9141 0.9561
No log 3.8710 360 0.8395 0.5896 0.8395 0.9163
No log 3.8925 362 0.7791 0.6079 0.7791 0.8827
No log 3.9140 364 0.7772 0.6191 0.7772 0.8816
No log 3.9355 366 0.7643 0.6299 0.7643 0.8743
No log 3.9570 368 0.7492 0.6162 0.7492 0.8656
No log 3.9785 370 0.7542 0.6278 0.7542 0.8684
No log 4.0 372 0.7369 0.6630 0.7369 0.8584
No log 4.0215 374 0.7248 0.6752 0.7248 0.8513
No log 4.0430 376 0.7310 0.6702 0.7310 0.8550
No log 4.0645 378 0.7654 0.6079 0.7654 0.8748
No log 4.0860 380 0.7864 0.6186 0.7864 0.8868
No log 4.1075 382 0.8306 0.6178 0.8306 0.9114
No log 4.1290 384 0.8253 0.6155 0.8253 0.9085
No log 4.1505 386 0.8556 0.5931 0.8556 0.9250
No log 4.1720 388 0.8590 0.6034 0.8590 0.9268
No log 4.1935 390 0.8508 0.6333 0.8508 0.9224
No log 4.2151 392 0.9114 0.5937 0.9114 0.9547
No log 4.2366 394 0.9267 0.6098 0.9267 0.9627
No log 4.2581 396 0.8460 0.5960 0.8460 0.9198
No log 4.2796 398 0.8111 0.6398 0.8111 0.9006
No log 4.3011 400 0.7344 0.6632 0.7344 0.8570
No log 4.3226 402 0.7441 0.6799 0.7441 0.8626
No log 4.3441 404 0.7615 0.6759 0.7615 0.8726
No log 4.3656 406 0.8067 0.6595 0.8067 0.8982
No log 4.3871 408 0.8504 0.6053 0.8504 0.9222
No log 4.4086 410 0.7917 0.6359 0.7917 0.8898
No log 4.4301 412 0.7590 0.6376 0.7590 0.8712
No log 4.4516 414 0.7839 0.6380 0.7839 0.8854
No log 4.4731 416 0.8290 0.6142 0.8290 0.9105
No log 4.4946 418 0.8144 0.6153 0.8144 0.9024
No log 4.5161 420 0.8680 0.5920 0.8680 0.9317
No log 4.5376 422 0.9520 0.5753 0.9520 0.9757
No log 4.5591 424 0.9373 0.5813 0.9373 0.9681
No log 4.5806 426 0.8472 0.6130 0.8472 0.9205
No log 4.6022 428 0.8316 0.6424 0.8316 0.9119
No log 4.6237 430 0.8583 0.6211 0.8583 0.9265
No log 4.6452 432 0.9336 0.6093 0.9336 0.9662
No log 4.6667 434 1.1344 0.5492 1.1344 1.0651
No log 4.6882 436 1.1834 0.5262 1.1834 1.0878
No log 4.7097 438 1.0543 0.5790 1.0543 1.0268
No log 4.7312 440 0.9490 0.5854 0.9490 0.9742
No log 4.7527 442 0.8535 0.6289 0.8535 0.9239
No log 4.7742 444 0.8082 0.6314 0.8082 0.8990
No log 4.7957 446 0.7999 0.6578 0.7999 0.8944
No log 4.8172 448 0.7905 0.6645 0.7905 0.8891
No log 4.8387 450 0.7983 0.6407 0.7983 0.8935
No log 4.8602 452 0.8199 0.6069 0.8199 0.9055
No log 4.8817 454 0.8432 0.6452 0.8432 0.9183
No log 4.9032 456 0.8799 0.6332 0.8799 0.9380
No log 4.9247 458 0.9026 0.6115 0.9026 0.9500
No log 4.9462 460 0.9118 0.5658 0.9118 0.9549
No log 4.9677 462 0.8085 0.6375 0.8085 0.8992
No log 4.9892 464 0.7542 0.6546 0.7542 0.8684
No log 5.0108 466 0.7208 0.6629 0.7208 0.8490
No log 5.0323 468 0.7048 0.6514 0.7048 0.8395
No log 5.0538 470 0.6999 0.6383 0.6999 0.8366
No log 5.0753 472 0.7306 0.6489 0.7306 0.8547
No log 5.0968 474 0.8687 0.5945 0.8687 0.9320
No log 5.1183 476 1.0400 0.5647 1.0400 1.0198
No log 5.1398 478 1.0854 0.5614 1.0854 1.0418
No log 5.1613 480 1.1725 0.5563 1.1725 1.0828
No log 5.1828 482 1.2749 0.5242 1.2749 1.1291
No log 5.2043 484 1.1669 0.5236 1.1669 1.0802
No log 5.2258 486 1.0374 0.5613 1.0374 1.0185
No log 5.2473 488 0.8659 0.5931 0.8659 0.9305
No log 5.2688 490 0.7850 0.6404 0.7850 0.8860
No log 5.2903 492 0.7943 0.6379 0.7943 0.8913
No log 5.3118 494 0.8577 0.6134 0.8577 0.9261
No log 5.3333 496 0.8024 0.6203 0.8024 0.8958
No log 5.3548 498 0.6882 0.6882 0.6882 0.8296
0.4623 5.3763 500 0.6295 0.6937 0.6295 0.7934
0.4623 5.3978 502 0.6270 0.6998 0.6270 0.7918
0.4623 5.4194 504 0.6352 0.6801 0.6352 0.7970
0.4623 5.4409 506 0.6300 0.6459 0.6300 0.7937
0.4623 5.4624 508 0.6371 0.6366 0.6371 0.7982
0.4623 5.4839 510 0.6714 0.6471 0.6714 0.8194
0.4623 5.5054 512 0.7465 0.6551 0.7465 0.8640
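
In the table, the validation Loss and Mse columns coincide, which is consistent with an MSE loss on a numeric score; Qwk is quadratic weighted kappa. The sketch below shows one way the Qwk/Mse/Rmse columns could be recomputed from predictions and references; rounding regression outputs to integer score levels before computing kappa is an assumption, since the card does not document the metric pipeline.

```python
# Metric sketch for the Qwk / Mse / Rmse columns (assumption: predictions are
# rounded to integer score levels before computing quadratic weighted kappa).
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_eval_metrics(preds, labels):
    preds = np.asarray(preds, dtype=float)
    labels = np.asarray(labels, dtype=float)
    mse = mean_squared_error(labels, preds)
    qwk = cohen_kappa_score(
        np.rint(labels).astype(int),
        np.rint(preds).astype(int),
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}

# Hypothetical example: compute_eval_metrics([2.8, 1.1, 3.9], [3, 1, 4])
```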

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1