ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k14_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7480
  • Qwk: 0.4775
  • Mse: 0.7480
  • Rmse: 0.8649
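
Here, Qwk is quadratic weighted kappa and Rmse is the square root of Mse, both computed on the evaluation set. The snippet below is a minimal sketch of how such figures can be reproduced from predictions; the labels and predictions are hypothetical placeholders on an integer ordinal scale, not values from this card:

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Hypothetical ground-truth labels and (rounded) model predictions on an ordinal scale.
y_true = np.array([3, 2, 4, 1, 3, 0])
y_pred = np.array([3, 2, 3, 2, 4, 0])

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # "Qwk" column
mse = mean_squared_error(y_true, y_pred)                      # "Mse" column
rmse = float(np.sqrt(mse))                                    # "Rmse" column

print(f"Qwk: {qwk:.4f}  Mse: {mse:.4f}  Rmse: {rmse:.4f}")
```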

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
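
These settings map directly onto a Hugging Face TrainingArguments object. A minimal sketch under that assumption follows; the output directory is a placeholder, and the evaluation/logging cadence is inferred from the results table (validation every 2 steps, first training loss logged at step 500) rather than stated in this card:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_task2_organization",  # placeholder, not from this card
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    eval_strategy="steps",  # assumption: the table logs validation every 2 steps
    eval_steps=2,
    logging_steps=500,      # matches the first logged training loss at step 500
)
```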

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0465 2 4.5590 -0.0132 4.5590 2.1352
No log 0.0930 4 2.6215 0.0274 2.6215 1.6191
No log 0.1395 6 2.1014 -0.0634 2.1014 1.4496
No log 0.1860 8 1.5274 0.0682 1.5274 1.2359
No log 0.2326 10 1.3009 0.0050 1.3009 1.1406
No log 0.2791 12 1.3325 0.1472 1.3325 1.1543
No log 0.3256 14 1.4485 0.1106 1.4485 1.2035
No log 0.3721 16 1.2890 0.1136 1.2890 1.1353
No log 0.4186 18 1.2099 0.1753 1.2099 1.0999
No log 0.4651 20 1.1414 0.2628 1.1414 1.0684
No log 0.5116 22 1.1356 0.1995 1.1356 1.0657
No log 0.5581 24 1.1563 0.1812 1.1563 1.0753
No log 0.6047 26 1.1275 0.2520 1.1275 1.0618
No log 0.6512 28 1.1237 0.2416 1.1237 1.0600
No log 0.6977 30 1.3092 0.0290 1.3092 1.1442
No log 0.7442 32 1.6587 0.0972 1.6587 1.2879
No log 0.7907 34 1.4808 0.0878 1.4808 1.2169
No log 0.8372 36 1.0764 0.2378 1.0764 1.0375
No log 0.8837 38 1.1760 0.2119 1.1760 1.0844
No log 0.9302 40 1.2321 0.2454 1.2321 1.1100
No log 0.9767 42 1.1000 0.1343 1.1000 1.0488
No log 1.0233 44 0.9928 0.3356 0.9928 0.9964
No log 1.0698 46 1.3474 0.2112 1.3474 1.1608
No log 1.1163 48 1.6457 0.1758 1.6457 1.2829
No log 1.1628 50 1.4959 0.1364 1.4959 1.2231
No log 1.2093 52 1.2728 0.2067 1.2728 1.1282
No log 1.2558 54 0.9945 0.4557 0.9945 0.9972
No log 1.3023 56 0.9582 0.2351 0.9582 0.9789
No log 1.3488 58 1.1087 0.2781 1.1087 1.0530
No log 1.3953 60 1.1919 0.3261 1.1919 1.0917
No log 1.4419 62 0.9973 0.1902 0.9973 0.9987
No log 1.4884 64 0.8402 0.4036 0.8402 0.9166
No log 1.5349 66 0.9349 0.4379 0.9349 0.9669
No log 1.5814 68 0.8776 0.4379 0.8776 0.9368
No log 1.6279 70 0.7821 0.4295 0.7821 0.8844
No log 1.6744 72 1.3755 0.2540 1.3755 1.1728
No log 1.7209 74 1.4784 0.2672 1.4784 1.2159
No log 1.7674 76 0.9516 0.6015 0.9516 0.9755
No log 1.8140 78 0.9469 0.3636 0.9469 0.9731
No log 1.8605 80 1.2865 0.3304 1.2865 1.1342
No log 1.9070 82 1.1094 0.3989 1.1094 1.0533
No log 1.9535 84 0.8032 0.4846 0.8032 0.8962
No log 2.0 86 1.1049 0.4876 1.1049 1.0511
No log 2.0465 88 1.1337 0.4704 1.1337 1.0648
No log 2.0930 90 0.8739 0.5294 0.8739 0.9348
No log 2.1395 92 0.8654 0.4699 0.8654 0.9303
No log 2.1860 94 0.9298 0.4361 0.9298 0.9643
No log 2.2326 96 0.7870 0.5686 0.7870 0.8871
No log 2.2791 98 0.8074 0.5405 0.8074 0.8986
No log 2.3256 100 0.8089 0.5532 0.8089 0.8994
No log 2.3721 102 0.7256 0.5479 0.7256 0.8518
No log 2.4186 104 0.7106 0.5984 0.7106 0.8430
No log 2.4651 106 0.8180 0.6234 0.8180 0.9044
No log 2.5116 108 0.7201 0.5843 0.7201 0.8486
No log 2.5581 110 0.8258 0.5769 0.8258 0.9087
No log 2.6047 112 0.9466 0.5506 0.9466 0.9729
No log 2.6512 114 0.7998 0.5814 0.7998 0.8943
No log 2.6977 116 0.7498 0.5942 0.7498 0.8659
No log 2.7442 118 0.7850 0.5731 0.7850 0.8860
No log 2.7907 120 1.0036 0.5410 1.0036 1.0018
No log 2.8372 122 0.9994 0.5429 0.9994 0.9997
No log 2.8837 124 0.7826 0.5806 0.7826 0.8847
No log 2.9302 126 0.7634 0.5495 0.7634 0.8737
No log 2.9767 128 0.8619 0.5229 0.8619 0.9284
No log 3.0233 130 0.7744 0.4848 0.7744 0.8800
No log 3.0698 132 0.8335 0.4732 0.8335 0.9130
No log 3.1163 134 0.9673 0.5507 0.9673 0.9835
No log 3.1628 136 0.8444 0.4732 0.8444 0.9189
No log 3.2093 138 0.7729 0.4583 0.7729 0.8791
No log 3.2558 140 0.8106 0.5485 0.8106 0.9003
No log 3.3023 142 0.8225 0.5060 0.8225 0.9069
No log 3.3488 144 0.7820 0.5226 0.7820 0.8843
No log 3.3953 146 0.7702 0.5138 0.7702 0.8776
No log 3.4419 148 0.7831 0.5102 0.7831 0.8849
No log 3.4884 150 0.8073 0.5220 0.8073 0.8985
No log 3.5349 152 0.8111 0.4722 0.8111 0.9006
No log 3.5814 154 0.8936 0.5505 0.8936 0.9453
No log 3.6279 156 1.0941 0.5238 1.0941 1.0460
No log 3.6744 158 0.9810 0.4936 0.9810 0.9905
No log 3.7209 160 0.8100 0.5331 0.8100 0.9000
No log 3.7674 162 0.8875 0.5636 0.8875 0.9420
No log 3.8140 164 0.8782 0.5375 0.8782 0.9371
No log 3.8605 166 0.8196 0.4846 0.8196 0.9053
No log 3.9070 168 0.7867 0.4122 0.7867 0.8870
No log 3.9535 170 0.8089 0.3987 0.8089 0.8994
No log 4.0 172 0.8343 0.3819 0.8343 0.9134
No log 4.0465 174 0.8089 0.4767 0.8089 0.8994
No log 4.0930 176 0.8159 0.4914 0.8159 0.9033
No log 4.1395 178 0.9741 0.5648 0.9741 0.9869
No log 4.1860 180 1.0472 0.5659 1.0472 1.0233
No log 4.2326 182 0.9695 0.5873 0.9695 0.9846
No log 4.2791 184 0.9714 0.5509 0.9714 0.9856
No log 4.3256 186 0.8341 0.4638 0.8341 0.9133
No log 4.3721 188 0.7611 0.5114 0.7611 0.8724
No log 4.4186 190 0.7503 0.5846 0.7503 0.8662
No log 4.4651 192 0.7416 0.6084 0.7416 0.8612
No log 4.5116 194 0.7717 0.4886 0.7717 0.8785
No log 4.5581 196 0.9286 0.5224 0.9286 0.9636
No log 4.6047 198 1.0427 0.5272 1.0427 1.0211
No log 4.6512 200 0.9246 0.4404 0.9246 0.9616
No log 4.6977 202 0.7532 0.4609 0.7532 0.8679
No log 4.7442 204 0.7710 0.5618 0.7710 0.8780
No log 4.7907 206 0.7981 0.5766 0.7981 0.8934
No log 4.8372 208 0.7949 0.5520 0.7949 0.8916
No log 4.8837 210 0.7842 0.4977 0.7842 0.8855
No log 4.9302 212 0.7957 0.4563 0.7957 0.8920
No log 4.9767 214 0.8004 0.4847 0.8004 0.8946
No log 5.0233 216 0.9079 0.4375 0.9079 0.9528
No log 5.0698 218 1.0084 0.4761 1.0084 1.0042
No log 5.1163 220 0.9189 0.4166 0.9189 0.9586
No log 5.1628 222 0.8090 0.4628 0.8090 0.8994
No log 5.2093 224 0.7835 0.4822 0.7835 0.8851
No log 5.2558 226 0.7849 0.4760 0.7849 0.8859
No log 5.3023 228 0.8077 0.4568 0.8077 0.8987
No log 5.3488 230 0.8784 0.5593 0.8784 0.9372
No log 5.3953 232 0.9861 0.4520 0.9861 0.9930
No log 5.4419 234 0.9269 0.4613 0.9269 0.9628
No log 5.4884 236 0.8493 0.4803 0.8493 0.9215
No log 5.5349 238 0.8375 0.5143 0.8375 0.9151
No log 5.5814 240 0.8354 0.5329 0.8354 0.9140
No log 5.6279 242 0.8361 0.5481 0.8361 0.9144
No log 5.6744 244 0.8123 0.5153 0.8123 0.9013
No log 5.7209 246 0.8081 0.5343 0.8081 0.8990
No log 5.7674 248 0.8783 0.4646 0.8783 0.9372
No log 5.8140 250 0.9900 0.5315 0.9900 0.9950
No log 5.8605 252 0.9491 0.4781 0.9491 0.9742
No log 5.9070 254 0.8188 0.4644 0.8188 0.9049
No log 5.9535 256 0.8252 0.4961 0.8252 0.9084
No log 6.0 258 0.8125 0.4994 0.8125 0.9014
No log 6.0465 260 0.8084 0.5268 0.8084 0.8991
No log 6.0930 262 0.8719 0.4565 0.8719 0.9338
No log 6.1395 264 0.9431 0.4507 0.9431 0.9711
No log 6.1860 266 0.8633 0.4572 0.8633 0.9292
No log 6.2326 268 0.8124 0.3991 0.8124 0.9013
No log 6.2791 270 0.7888 0.4479 0.7888 0.8882
No log 6.3256 272 0.7721 0.4123 0.7721 0.8787
No log 6.3721 274 0.7535 0.4479 0.7535 0.8681
No log 6.4186 276 0.7619 0.4435 0.7619 0.8728
No log 6.4651 278 0.7450 0.4527 0.7450 0.8631
No log 6.5116 280 0.7170 0.4847 0.7170 0.8468
No log 6.5581 282 0.7338 0.6045 0.7338 0.8566
No log 6.6047 284 0.7139 0.4945 0.7139 0.8449
No log 6.6512 286 0.7506 0.4164 0.7506 0.8664
No log 6.6977 288 0.7541 0.4165 0.7541 0.8684
No log 6.7442 290 0.7466 0.4911 0.7466 0.8640
No log 6.7907 292 0.7216 0.5313 0.7216 0.8495
No log 6.8372 294 0.7500 0.5385 0.7500 0.8660
No log 6.8837 296 0.7418 0.5683 0.7418 0.8613
No log 6.9302 298 0.7022 0.5944 0.7022 0.8380
No log 6.9767 300 0.7021 0.5944 0.7021 0.8379
No log 7.0233 302 0.7794 0.54 0.7794 0.8828
No log 7.0698 304 0.8010 0.54 0.8010 0.8950
No log 7.1163 306 0.7216 0.4951 0.7216 0.8495
No log 7.1628 308 0.7301 0.6207 0.7301 0.8545
No log 7.2093 310 0.7561 0.5928 0.7561 0.8696
No log 7.2558 312 0.7351 0.5719 0.7351 0.8574
No log 7.3023 314 0.7260 0.6246 0.7260 0.8521
No log 7.3488 316 0.7468 0.5802 0.7468 0.8642
No log 7.3953 318 0.7280 0.6035 0.7280 0.8532
No log 7.4419 320 0.7377 0.5331 0.7377 0.8589
No log 7.4884 322 0.7708 0.5537 0.7708 0.8780
No log 7.5349 324 0.8834 0.5509 0.8834 0.9399
No log 7.5814 326 0.9624 0.5627 0.9624 0.9810
No log 7.6279 328 0.8660 0.5427 0.8660 0.9306
No log 7.6744 330 0.7725 0.5146 0.7725 0.8789
No log 7.7209 332 0.7377 0.4996 0.7377 0.8589
No log 7.7674 334 0.7400 0.4996 0.7400 0.8602
No log 7.8140 336 0.7515 0.5057 0.7515 0.8669
No log 7.8605 338 0.7783 0.4864 0.7783 0.8822
No log 7.9070 340 0.8005 0.4954 0.8005 0.8947
No log 7.9535 342 0.8714 0.5509 0.8714 0.9335
No log 8.0 344 0.8803 0.5553 0.8803 0.9382
No log 8.0465 346 0.8514 0.5712 0.8514 0.9227
No log 8.0930 348 0.8479 0.5712 0.8479 0.9208
No log 8.1395 350 0.8514 0.5712 0.8514 0.9227
No log 8.1860 352 0.8381 0.5686 0.8381 0.9155
No log 8.2326 354 0.7621 0.5283 0.7621 0.8730
No log 8.2791 356 0.7232 0.4555 0.7232 0.8504
No log 8.3256 358 0.7339 0.5327 0.7339 0.8567
No log 8.3721 360 0.7402 0.4373 0.7402 0.8603
No log 8.4186 362 0.7544 0.5192 0.7544 0.8686
No log 8.4651 364 0.7368 0.4704 0.7368 0.8584
No log 8.5116 366 0.7636 0.5192 0.7636 0.8739
No log 8.5581 368 0.7742 0.5451 0.7742 0.8799
No log 8.6047 370 0.8443 0.5637 0.8443 0.9189
No log 8.6512 372 0.9767 0.5569 0.9767 0.9883
No log 8.6977 374 1.1350 0.4861 1.1350 1.0654
No log 8.7442 376 1.0676 0.5055 1.0676 1.0332
No log 8.7907 378 0.9364 0.5297 0.9364 0.9677
No log 8.8372 380 0.8051 0.5515 0.8051 0.8973
No log 8.8837 382 0.7578 0.5291 0.7578 0.8705
No log 8.9302 384 0.7527 0.4998 0.7527 0.8676
No log 8.9767 386 0.7736 0.5157 0.7736 0.8795
No log 9.0233 388 0.9026 0.5365 0.9026 0.9500
No log 9.0698 390 0.9412 0.5365 0.9412 0.9701
No log 9.1163 392 0.8258 0.5444 0.8258 0.9087
No log 9.1628 394 0.7245 0.4965 0.7245 0.8512
No log 9.2093 396 0.8443 0.5575 0.8443 0.9188
No log 9.2558 398 1.0235 0.4248 1.0235 1.0117
No log 9.3023 400 1.0218 0.4625 1.0218 1.0109
No log 9.3488 402 0.8909 0.4258 0.8909 0.9439
No log 9.3953 404 0.7697 0.5634 0.7697 0.8773
No log 9.4419 406 0.7670 0.5634 0.7670 0.8758
No log 9.4884 408 0.7839 0.5562 0.7839 0.8854
No log 9.5349 410 0.9048 0.4700 0.9048 0.9512
No log 9.5814 412 0.8846 0.5054 0.8846 0.9405
No log 9.6279 414 0.7647 0.5517 0.7647 0.8745
No log 9.6744 416 0.7088 0.6043 0.7088 0.8419
No log 9.7209 418 0.6857 0.5766 0.6857 0.8280
No log 9.7674 420 0.6970 0.5571 0.6970 0.8349
No log 9.8140 422 0.6717 0.6287 0.6717 0.8196
No log 9.8605 424 0.6594 0.6725 0.6594 0.8121
No log 9.9070 426 0.6766 0.6239 0.6766 0.8226
No log 9.9535 428 0.8515 0.5853 0.8515 0.9227
No log 10.0 430 0.9904 0.5809 0.9904 0.9952
No log 10.0465 432 0.9580 0.5833 0.9580 0.9788
No log 10.0930 434 0.8249 0.5724 0.8249 0.9083
No log 10.1395 436 0.7018 0.6233 0.7018 0.8377
No log 10.1860 438 0.7164 0.6233 0.7164 0.8464
No log 10.2326 440 0.7942 0.5872 0.7942 0.8912
No log 10.2791 442 0.8807 0.5526 0.8807 0.9384
No log 10.3256 444 0.8823 0.5569 0.8823 0.9393
No log 10.3721 446 0.7936 0.5827 0.7936 0.8908
No log 10.4186 448 0.7130 0.5898 0.7130 0.8444
No log 10.4651 450 0.7009 0.5059 0.7009 0.8372
No log 10.5116 452 0.7592 0.5675 0.7592 0.8713
No log 10.5581 454 0.8892 0.5569 0.8892 0.9430
No log 10.6047 456 0.9459 0.5365 0.9459 0.9726
No log 10.6512 458 0.9108 0.5365 0.9108 0.9544
No log 10.6977 460 0.8775 0.5648 0.8775 0.9367
No log 10.7442 462 0.8542 0.5592 0.8542 0.9242
No log 10.7907 464 0.8547 0.5671 0.8547 0.9245
No log 10.8372 466 0.7699 0.5614 0.7699 0.8774
No log 10.8837 468 0.7259 0.5865 0.7259 0.8520
No log 10.9302 470 0.7461 0.5308 0.7461 0.8638
No log 10.9767 472 0.7463 0.4818 0.7463 0.8639
No log 11.0233 474 0.7089 0.5833 0.7089 0.8420
No log 11.0698 476 0.6876 0.6212 0.6876 0.8292
No log 11.1163 478 0.7142 0.6177 0.7142 0.8451
No log 11.1628 480 0.7121 0.6333 0.7121 0.8439
No log 11.2093 482 0.6989 0.5878 0.6989 0.8360
No log 11.2558 484 0.7177 0.6163 0.7177 0.8472
No log 11.3023 486 0.7277 0.5814 0.7277 0.8530
No log 11.3488 488 0.7182 0.5722 0.7182 0.8475
No log 11.3953 490 0.7133 0.5648 0.7133 0.8446
No log 11.4419 492 0.7292 0.5154 0.7292 0.8540
No log 11.4884 494 0.7602 0.5443 0.7602 0.8719
No log 11.5349 496 0.7714 0.5576 0.7714 0.8783
No log 11.5814 498 0.7449 0.5513 0.7449 0.8631
0.357 11.6279 500 0.7931 0.5403 0.7931 0.8906
0.357 11.6744 502 0.8039 0.5384 0.8039 0.8966
0.357 11.7209 504 0.7588 0.5276 0.7588 0.8711
0.357 11.7674 506 0.7696 0.5360 0.7696 0.8772
0.357 11.8140 508 0.7961 0.5233 0.7961 0.8923
0.357 11.8605 510 0.7480 0.4775 0.7480 0.8649

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
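
With the versions above, the checkpoint can be loaded as a standard Transformers sequence-classification model. The sketch below assumes a single-output regression-style head (which the MSE-based loss suggests); the input sentence is a placeholder:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo_id = "MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k14_task2_organization"

# Load the fine-tuned checkpoint and its AraBERT tokenizer.
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

# Placeholder Arabic input; the model presumably returns an "organization" score for the text.
inputs = tokenizer("نص تجريبي للتقييم", return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()  # assumes a 1-dimensional output
print(score)
```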