ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run2_AugV5_k19_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5786
  • Qwk: 0.4429
  • Mse: 0.5786
  • Rmse: 0.7607
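Qwk here is the quadratic weighted kappa, a standard agreement metric for ordinal essay scores; note that Mse equals Loss and Rmse is its square root. The card does not include the metric code, so the following is a minimal pure-Python sketch of how these three numbers are typically computed (equivalent to scikit-learn's `cohen_kappa_score(..., weights="quadratic")`); the helper names are illustrative, not from the training script.

```python
import math
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Quadratic weighted kappa over integer labels 0..n_classes-1."""
    n = len(y_true)
    # Observed label-pair counts
    observed = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    hist_true, hist_pred = Counter(y_true), Counter(y_pred)
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic penalty
            expected = hist_true[i] * hist_pred[j] / n  # chance agreement
            num += w * observed[i][j]
            den += w * expected
    return 1.0 - num / den if den else 1.0

def mse_rmse(y_true, y_pred):
    """Mean squared error and its square root."""
    mse = sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
    return mse, math.sqrt(mse)
```

With these helpers, perfect predictions give a kappa of 1.0 and an MSE of 0, matching the usual conventions behind the numbers reported above.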

Model description

More information needed

Intended uses & limitations

More information needed
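Although the card gives no usage details, the checkpoint appears to be a sequence-classification head on AraBERT for scoring essay organization, so it can presumably be loaded with the standard Transformers API. A hedged sketch (the example input text is a placeholder, and the label interpretation is an assumption):

```python
# Sketch: loading this checkpoint for inference with the standard
# AutoModelForSequenceClassification API. The meaning of the output
# labels (organization score levels) is an assumption, not documented.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = ("MayBashendy/ArabicNewSplits7_B_usingALLEssays_"
            "FineTuningAraBERT_run2_AugV5_k19_task7_organization")
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

inputs = tokenizer("نص المقال هنا", return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
predicted_score = logits.argmax(dim=-1).item()
```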

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
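The hyperparameters above can be expressed with the Trainer API from the Transformers version listed below; a minimal sketch (the `output_dir` is hypothetical, and `eval_steps=2` is inferred from the evaluation cadence visible in the results table):

```python
# Sketch reproducing the listed hyperparameters, not the author's script.
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="arabert-task7-organization",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    eval_strategy="steps",  # the table shows evaluation every 2 steps
    eval_steps=2,
)
```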

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0208 2 2.5688 -0.0109 2.5688 1.6028
No log 0.0417 4 1.2812 0.0997 1.2812 1.1319
No log 0.0625 6 1.1109 -0.1740 1.1109 1.0540
No log 0.0833 8 1.0219 0.0754 1.0219 1.0109
No log 0.1042 10 1.1842 0.0812 1.1842 1.0882
No log 0.125 12 1.0371 0.1870 1.0371 1.0184
No log 0.1458 14 0.8176 0.2407 0.8176 0.9042
No log 0.1667 16 0.8644 0.1972 0.8644 0.9297
No log 0.1875 18 0.9225 0.1550 0.9225 0.9605
No log 0.2083 20 0.9206 0.1217 0.9206 0.9595
No log 0.2292 22 0.7425 0.1407 0.7425 0.8617
No log 0.25 24 0.7217 0.0460 0.7217 0.8495
No log 0.2708 26 0.7183 0.0426 0.7183 0.8475
No log 0.2917 28 0.7412 0.0902 0.7412 0.8609
No log 0.3125 30 0.7751 0.1673 0.7751 0.8804
No log 0.3333 32 0.7764 0.1660 0.7764 0.8812
No log 0.3542 34 0.7977 0.2948 0.7977 0.8932
No log 0.375 36 0.8015 0.3401 0.8015 0.8953
No log 0.3958 38 0.9854 0.3206 0.9854 0.9927
No log 0.4167 40 0.9020 0.3538 0.9020 0.9498
No log 0.4375 42 0.7071 0.4575 0.7071 0.8409
No log 0.4583 44 0.6572 0.4986 0.6572 0.8107
No log 0.4792 46 0.7563 0.4208 0.7563 0.8697
No log 0.5 48 0.8486 0.3355 0.8486 0.9212
No log 0.5208 50 0.7527 0.4113 0.7527 0.8676
No log 0.5417 52 0.8717 0.3679 0.8717 0.9336
No log 0.5625 54 1.0618 0.3497 1.0618 1.0304
No log 0.5833 56 1.1228 0.2658 1.1228 1.0596
No log 0.6042 58 0.9433 0.4436 0.9433 0.9712
No log 0.625 60 0.6802 0.2804 0.6802 0.8248
No log 0.6458 62 0.6081 0.1981 0.6081 0.7798
No log 0.6667 64 0.6332 0.2709 0.6332 0.7957
No log 0.6875 66 0.7027 0.2804 0.7027 0.8383
No log 0.7083 68 0.7664 0.3398 0.7664 0.8755
No log 0.7292 70 0.7421 0.2769 0.7421 0.8615
No log 0.75 72 0.8213 0.2642 0.8213 0.9062
No log 0.7708 74 0.9986 0.3111 0.9986 0.9993
No log 0.7917 76 1.0235 0.3052 1.0235 1.0117
No log 0.8125 78 0.8972 0.2483 0.8972 0.9472
No log 0.8333 80 0.7172 0.2087 0.7172 0.8469
No log 0.8542 82 0.6237 0.1981 0.6237 0.7897
No log 0.875 84 0.5997 0.2345 0.5997 0.7744
No log 0.8958 86 0.6053 0.2711 0.6053 0.7780
No log 0.9167 88 0.6450 0.1754 0.6450 0.8031
No log 0.9375 90 0.6991 0.2381 0.6991 0.8361
No log 0.9583 92 0.7085 0.2381 0.7085 0.8417
No log 0.9792 94 0.6384 0.2402 0.6384 0.7990
No log 1.0 96 0.5685 0.3690 0.5685 0.7540
No log 1.0208 98 0.5412 0.4173 0.5412 0.7357
No log 1.0417 100 0.5389 0.4173 0.5389 0.7341
No log 1.0625 102 0.5495 0.4199 0.5495 0.7413
No log 1.0833 104 0.5559 0.4492 0.5559 0.7456
No log 1.1042 106 0.5639 0.4425 0.5639 0.7509
No log 1.125 108 0.6329 0.4958 0.6329 0.7956
No log 1.1458 110 0.6992 0.3528 0.6992 0.8362
No log 1.1667 112 0.6294 0.3677 0.6294 0.7934
No log 1.1875 114 0.7195 0.4263 0.7195 0.8483
No log 1.2083 116 0.7348 0.4495 0.7348 0.8572
No log 1.2292 118 0.6870 0.3834 0.6870 0.8289
No log 1.25 120 0.7231 0.3695 0.7231 0.8504
No log 1.2708 122 0.7775 0.3123 0.7775 0.8818
No log 1.2917 124 0.7530 0.3464 0.7530 0.8678
No log 1.3125 126 0.7139 0.3856 0.7139 0.8449
No log 1.3333 128 0.7196 0.3681 0.7196 0.8483
No log 1.3542 130 0.7646 0.3706 0.7646 0.8744
No log 1.375 132 0.8188 0.4100 0.8188 0.9049
No log 1.3958 134 0.7351 0.3831 0.7351 0.8574
No log 1.4167 136 0.6836 0.3318 0.6836 0.8268
No log 1.4375 138 0.6817 0.2675 0.6817 0.8256
No log 1.4583 140 0.7715 0.2992 0.7715 0.8783
No log 1.4792 142 0.8127 0.3509 0.8127 0.9015
No log 1.5 144 0.7790 0.3921 0.7790 0.8826
No log 1.5208 146 0.8232 0.2703 0.8232 0.9073
No log 1.5417 148 0.7154 0.4548 0.7154 0.8458
No log 1.5625 150 0.6311 0.3302 0.6311 0.7944
No log 1.5833 152 0.6170 0.3061 0.6170 0.7855
No log 1.6042 154 0.6552 0.3302 0.6552 0.8095
No log 1.625 156 0.7182 0.3189 0.7182 0.8475
No log 1.6458 158 0.6655 0.3050 0.6655 0.8158
No log 1.6667 160 0.6777 0.1673 0.6777 0.8232
No log 1.6875 162 0.6294 0.3633 0.6294 0.7933
No log 1.7083 164 0.6240 0.3633 0.6240 0.7900
No log 1.7292 166 0.6000 0.4703 0.6000 0.7746
No log 1.75 168 0.6226 0.4092 0.6226 0.7890
No log 1.7708 170 0.5952 0.4677 0.5952 0.7715
No log 1.7917 172 0.5609 0.4788 0.5609 0.7489
No log 1.8125 174 0.5650 0.4722 0.5650 0.7517
No log 1.8333 176 0.5998 0.5110 0.5998 0.7745
No log 1.8542 178 0.6209 0.4315 0.6209 0.7880
No log 1.875 180 0.5934 0.4876 0.5934 0.7703
No log 1.8958 182 0.5808 0.4182 0.5808 0.7621
No log 1.9167 184 0.5737 0.4137 0.5737 0.7574
No log 1.9375 186 0.5981 0.4883 0.5981 0.7734
No log 1.9583 188 0.6473 0.4704 0.6473 0.8046
No log 1.9792 190 0.6334 0.4222 0.6334 0.7958
No log 2.0 192 0.7116 0.3027 0.7116 0.8435
No log 2.0208 194 0.7149 0.2918 0.7149 0.8455
No log 2.0417 196 0.8033 0.3931 0.8033 0.8963
No log 2.0625 198 0.8172 0.3890 0.8172 0.9040
No log 2.0833 200 0.7861 0.3668 0.7861 0.8866
No log 2.1042 202 0.7233 0.3224 0.7233 0.8505
No log 2.125 204 0.7207 0.2571 0.7207 0.8490
No log 2.1458 206 0.7397 0.3099 0.7397 0.8601
No log 2.1667 208 0.6953 0.3265 0.6953 0.8338
No log 2.1875 210 0.6899 0.3265 0.6899 0.8306
No log 2.2083 212 0.7015 0.3967 0.7015 0.8376
No log 2.2292 214 0.6694 0.4096 0.6694 0.8182
No log 2.25 216 0.6734 0.4016 0.6734 0.8206
No log 2.2708 218 0.6607 0.3879 0.6607 0.8129
No log 2.2917 220 0.6467 0.3934 0.6467 0.8042
No log 2.3125 222 0.6542 0.3833 0.6542 0.8088
No log 2.3333 224 0.6215 0.3266 0.6215 0.7883
No log 2.3542 226 0.6176 0.4222 0.6176 0.7859
No log 2.375 228 0.6066 0.3984 0.6066 0.7789
No log 2.3958 230 0.5939 0.4423 0.5939 0.7707
No log 2.4167 232 0.7057 0.4100 0.7057 0.8400
No log 2.4375 234 0.7555 0.4051 0.7555 0.8692
No log 2.4583 236 0.6345 0.4307 0.6345 0.7966
No log 2.4792 238 0.6164 0.4205 0.6164 0.7851
No log 2.5 240 0.5928 0.4338 0.5928 0.7700
No log 2.5208 242 0.5934 0.4504 0.5934 0.7703
No log 2.5417 244 0.5981 0.4591 0.5981 0.7734
No log 2.5625 246 0.7333 0.3067 0.7333 0.8564
No log 2.5833 248 0.8291 0.3506 0.8291 0.9106
No log 2.6042 250 0.7285 0.3807 0.7285 0.8535
No log 2.625 252 0.6020 0.3111 0.6020 0.7759
No log 2.6458 254 0.6475 0.3425 0.6475 0.8047
No log 2.6667 256 0.6153 0.3622 0.6153 0.7844
No log 2.6875 258 0.6019 0.3633 0.6019 0.7758
No log 2.7083 260 0.6726 0.3590 0.6726 0.8201
No log 2.7292 262 0.6528 0.3857 0.6528 0.8079
No log 2.75 264 0.5956 0.4264 0.5956 0.7717
No log 2.7708 266 0.6109 0.3341 0.6109 0.7816
No log 2.7917 268 0.5914 0.4161 0.5914 0.7690
No log 2.8125 270 0.7377 0.4072 0.7377 0.8589
No log 2.8333 272 1.0385 0.2199 1.0385 1.0191
No log 2.8542 274 1.0891 0.2169 1.0891 1.0436
No log 2.875 276 0.9070 0.2230 0.9070 0.9524
No log 2.8958 278 0.6251 0.4373 0.6251 0.7906
No log 2.9167 280 0.6255 0.4354 0.6255 0.7909
No log 2.9375 282 0.7735 0.2817 0.7735 0.8795
No log 2.9583 284 0.8294 0.2817 0.8294 0.9107
No log 2.9792 286 0.7788 0.2345 0.7788 0.8825
No log 3.0 288 0.7068 0.2506 0.7068 0.8407
No log 3.0208 290 0.6658 0.2158 0.6658 0.8159
No log 3.0417 292 0.6459 0.3454 0.6459 0.8037
No log 3.0625 294 0.6492 0.4091 0.6492 0.8057
No log 3.0833 296 0.6904 0.4740 0.6904 0.8309
No log 3.1042 298 0.7657 0.4574 0.7657 0.8750
No log 3.125 300 0.7363 0.4795 0.7363 0.8581
No log 3.1458 302 0.6633 0.4434 0.6633 0.8144
No log 3.1667 304 0.6447 0.4000 0.6447 0.8029
No log 3.1875 306 0.6395 0.4307 0.6395 0.7997
No log 3.2083 308 0.6207 0.4429 0.6207 0.7879
No log 3.2292 310 0.5987 0.3280 0.5987 0.7738
No log 3.25 312 0.6091 0.3701 0.6091 0.7804
No log 3.2708 314 0.6279 0.4270 0.6279 0.7924
No log 3.2917 316 0.6038 0.4100 0.6038 0.7771
No log 3.3125 318 0.5699 0.4990 0.5699 0.7549
No log 3.3333 320 0.7007 0.4671 0.7007 0.8371
No log 3.3542 322 0.8518 0.3239 0.8518 0.9229
No log 3.375 324 0.8740 0.3173 0.8740 0.9349
No log 3.3958 326 0.8036 0.3509 0.8036 0.8964
No log 3.4167 328 0.7245 0.2464 0.7245 0.8512
No log 3.4375 330 0.6623 0.3001 0.6623 0.8138
No log 3.4583 332 0.6263 0.2334 0.6263 0.7914
No log 3.4792 334 0.6047 0.3010 0.6047 0.7776
No log 3.5 336 0.5863 0.3129 0.5863 0.7657
No log 3.5208 338 0.5768 0.2545 0.5768 0.7595
No log 3.5417 340 0.5710 0.4019 0.5710 0.7556
No log 3.5625 342 0.5715 0.4300 0.5715 0.7560
No log 3.5833 344 0.5771 0.4091 0.5771 0.7597
No log 3.6042 346 0.5600 0.4448 0.5600 0.7483
No log 3.625 348 0.6059 0.4630 0.6059 0.7784
No log 3.6458 350 0.6193 0.4336 0.6193 0.7870
No log 3.6667 352 0.5795 0.3857 0.5795 0.7613
No log 3.6875 354 0.5632 0.3608 0.5632 0.7504
No log 3.7083 356 0.5704 0.3289 0.5704 0.7553
No log 3.7292 358 0.5729 0.3258 0.5729 0.7569
No log 3.75 360 0.5889 0.3804 0.5889 0.7674
No log 3.7708 362 0.6342 0.4879 0.6342 0.7964
No log 3.7917 364 0.6097 0.4106 0.6097 0.7808
No log 3.8125 366 0.5543 0.4101 0.5543 0.7445
No log 3.8333 368 0.5484 0.4482 0.5484 0.7406
No log 3.8542 370 0.5519 0.5104 0.5519 0.7429
No log 3.875 372 0.5266 0.4788 0.5266 0.7257
No log 3.8958 374 0.5674 0.4860 0.5674 0.7533
No log 3.9167 376 0.7624 0.3425 0.7624 0.8731
No log 3.9375 378 0.9039 0.3052 0.9039 0.9507
No log 3.9583 380 0.8783 0.3511 0.8783 0.9372
No log 3.9792 382 0.7118 0.4092 0.7118 0.8437
No log 4.0 384 0.5455 0.5283 0.5455 0.7386
No log 4.0208 386 0.5292 0.4637 0.5292 0.7275
No log 4.0417 388 0.5595 0.4637 0.5595 0.7480
No log 4.0625 390 0.5715 0.3754 0.5715 0.7560
No log 4.0833 392 0.5935 0.3336 0.5935 0.7704
No log 4.1042 394 0.6350 0.2743 0.6350 0.7968
No log 4.125 396 0.6690 0.1673 0.6690 0.8179
No log 4.1458 398 0.6519 0.2920 0.6519 0.8074
No log 4.1667 400 0.6057 0.3608 0.6057 0.7783
No log 4.1875 402 0.6189 0.4086 0.6189 0.7867
No log 4.2083 404 0.6915 0.4392 0.6915 0.8316
No log 4.2292 406 0.7148 0.4392 0.7148 0.8455
No log 4.25 408 0.6733 0.4392 0.6733 0.8205
No log 4.2708 410 0.5992 0.3788 0.5992 0.7741
No log 4.2917 412 0.5796 0.4471 0.5796 0.7613
No log 4.3125 414 0.6285 0.4051 0.6285 0.7928
No log 4.3333 416 0.7727 0.3051 0.7727 0.8791
No log 4.3542 418 0.8394 0.3141 0.8394 0.9162
No log 4.375 420 0.7531 0.3323 0.7531 0.8678
No log 4.3958 422 0.6010 0.4507 0.6010 0.7753
No log 4.4167 424 0.5543 0.5022 0.5543 0.7445
No log 4.4375 426 0.5780 0.4292 0.5780 0.7603
No log 4.4583 428 0.5700 0.5053 0.5700 0.7550
No log 4.4792 430 0.5493 0.5177 0.5493 0.7412
No log 4.5 432 0.5811 0.5110 0.5811 0.7623
No log 4.5208 434 0.5984 0.4895 0.5984 0.7736
No log 4.5417 436 0.5547 0.5321 0.5547 0.7448
No log 4.5625 438 0.5389 0.5003 0.5389 0.7341
No log 4.5833 440 0.5431 0.5455 0.5431 0.7369
No log 4.6042 442 0.5357 0.4194 0.5357 0.7319
No log 4.625 444 0.5623 0.5010 0.5623 0.7499
No log 4.6458 446 0.6634 0.3921 0.6634 0.8145
No log 4.6667 448 0.7206 0.4072 0.7206 0.8489
No log 4.6875 450 0.6679 0.4341 0.6679 0.8173
No log 4.7083 452 0.5998 0.4931 0.5998 0.7745
No log 4.7292 454 0.5459 0.5379 0.5459 0.7389
No log 4.75 456 0.5359 0.4857 0.5359 0.7321
No log 4.7708 458 0.5344 0.4857 0.5344 0.7311
No log 4.7917 460 0.5379 0.5114 0.5379 0.7334
No log 4.8125 462 0.5772 0.4052 0.5772 0.7597
No log 4.8333 464 0.6908 0.3401 0.6908 0.8312
No log 4.8542 466 0.7472 0.3085 0.7472 0.8644
No log 4.875 468 0.6710 0.3754 0.6710 0.8191
No log 4.8958 470 0.5657 0.4259 0.5657 0.7521
No log 4.9167 472 0.5352 0.4677 0.5352 0.7316
No log 4.9375 474 0.5341 0.5460 0.5341 0.7308
No log 4.9583 476 0.5433 0.5357 0.5433 0.7371
No log 4.9792 478 0.5565 0.5160 0.5565 0.7460
No log 5.0 480 0.5861 0.5065 0.5861 0.7656
No log 5.0208 482 0.6573 0.4708 0.6573 0.8108
No log 5.0417 484 0.7209 0.4743 0.7209 0.8490
No log 5.0625 486 0.7008 0.4519 0.7008 0.8371
No log 5.0833 488 0.6140 0.4756 0.6140 0.7836
No log 5.1042 490 0.5580 0.5042 0.5580 0.7470
No log 5.125 492 0.5716 0.3382 0.5716 0.7560
No log 5.1458 494 0.6152 0.4247 0.6152 0.7844
No log 5.1667 496 0.6159 0.4247 0.6159 0.7848
No log 5.1875 498 0.5701 0.3840 0.5701 0.7550
0.3835 5.2083 500 0.5512 0.5227 0.5512 0.7425
0.3835 5.2292 502 0.5758 0.4429 0.5758 0.7588
0.3835 5.25 504 0.5992 0.4429 0.5992 0.7741
0.3835 5.2708 506 0.6389 0.4106 0.6389 0.7993
0.3835 5.2917 508 0.6044 0.3804 0.6044 0.7774
0.3835 5.3125 510 0.5786 0.4429 0.5786 0.7607

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1