ArabicNewSplits6_FineTuningAraBERT_run3_AugV5_k11_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6220
  • Qwk (quadratic weighted kappa): 0.2766
  • Mse (mean squared error): 0.6220
  • Rmse (root mean squared error): 0.7886
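For reference, Rmse is simply √Mse, and Qwk is Cohen's kappa with quadratic disagreement weights. A minimal, hypothetical sketch of how such scores can be computed in pure Python (not the evaluation code actually used for this run):

```python
import math
from collections import Counter

def mse(y_true, y_pred):
    # mean squared error over paired predictions
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    # root mean squared error; by definition RMSE = sqrt(MSE)
    return math.sqrt(mse(y_true, y_pred))

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    # Cohen's kappa with quadratic penalty: distant label disagreements
    # cost more than adjacent ones. Labels are integers 0..n_classes-1.
    observed = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    n = len(y_true)
    hist_t, hist_p = Counter(y_true), Counter(y_pred)
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2
            num += w * observed[i][j]                  # observed disagreement
            den += w * hist_t[i] * hist_p[j] / n       # chance disagreement
    return 1.0 - num / den
```

Consistent with the numbers above, √0.6220 ≈ 0.789, matching the reported Rmse up to rounding.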

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
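The evaluation log ends at step 520 after 10 epochs, which pins down the approximate training-set size. Assuming no gradient accumulation and drop_last=False (both assumptions; neither is stated in the card), each epoch is 52 optimizer steps, i.e. roughly 409–416 training examples. A quick back-of-the-envelope check:

```python
# Back-of-the-envelope check of the run's shape from the logged numbers.
# Assumes no gradient accumulation and drop_last=False (both assumptions).
batch_size = 8
num_epochs = 10
total_steps = 520          # last logged step in the results table

steps_per_epoch = total_steps // num_epochs          # 52
assert steps_per_epoch * num_epochs == total_steps   # epochs divide evenly

# ceil(n_examples / batch_size) == steps_per_epoch implies this range:
min_examples = batch_size * (steps_per_epoch - 1) + 1  # 409
max_examples = batch_size * steps_per_epoch            # 416
print(steps_per_epoch, min_examples, max_examples)     # 52 409 416
```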

Training results

Training loss is only logged every 500 steps, so rows before step 500 show "No log". Columns are: Training Loss, Epoch, Step, Validation Loss, Qwk, Mse, Rmse.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0385 2 3.2194 0.0013 3.2194 1.7943
No log 0.0769 4 1.6805 0.0168 1.6805 1.2963
No log 0.1154 6 1.2607 0.0255 1.2607 1.1228
No log 0.1538 8 0.7739 0.12 0.7739 0.8797
No log 0.1923 10 0.6710 0.0452 0.6710 0.8192
No log 0.2308 12 0.6282 0.0685 0.6282 0.7926
No log 0.2692 14 0.6420 0.0949 0.6420 0.8013
No log 0.3077 16 0.6375 0.0949 0.6375 0.7984
No log 0.3462 18 0.6561 0.0685 0.6561 0.8100
No log 0.3846 20 0.7627 0.0457 0.7627 0.8733
No log 0.4231 22 0.7419 0.1133 0.7419 0.8614
No log 0.4615 24 0.6618 0.0 0.6618 0.8135
No log 0.5 26 0.6146 0.0769 0.6146 0.7840
No log 0.5385 28 0.5894 0.0857 0.5894 0.7677
No log 0.5769 30 0.5618 0.0569 0.5618 0.7496
No log 0.6154 32 0.7232 -0.0818 0.7232 0.8504
No log 0.6538 34 1.1961 0.0201 1.1961 1.0937
No log 0.6923 36 1.1212 0.0244 1.1212 1.0589
No log 0.7308 38 0.8086 -0.0526 0.8086 0.8992
No log 0.7692 40 0.6385 -0.0081 0.6385 0.7991
No log 0.8077 42 0.6181 0.0 0.6181 0.7862
No log 0.8462 44 0.6455 -0.0159 0.6455 0.8034
No log 0.8846 46 0.6489 -0.0159 0.6489 0.8056
No log 0.9231 48 0.7005 -0.1156 0.7005 0.8370
No log 0.9615 50 0.7955 0.0 0.7955 0.8919
No log 1.0 52 1.0601 -0.0563 1.0601 1.0296
No log 1.0385 54 0.8507 0.0601 0.8507 0.9223
No log 1.0769 56 0.6705 0.0145 0.6705 0.8189
No log 1.1154 58 0.6608 0.0303 0.6608 0.8129
No log 1.1538 60 0.6600 0.1206 0.6600 0.8124
No log 1.1923 62 0.8966 0.0303 0.8966 0.9469
No log 1.2308 64 1.5698 -0.0268 1.5698 1.2529
No log 1.2692 66 1.4764 -0.0159 1.4764 1.2151
No log 1.3077 68 1.1355 0.0333 1.1355 1.0656
No log 1.3462 70 0.7555 0.0270 0.7555 0.8692
No log 1.3846 72 0.6812 0.2381 0.6812 0.8254
No log 1.4231 74 0.7227 0.1917 0.7227 0.8501
No log 1.4615 76 0.6854 0.2626 0.6854 0.8279
No log 1.5 78 0.6634 0.1059 0.6634 0.8145
No log 1.5385 80 0.7668 0.0 0.7668 0.8757
No log 1.5769 82 0.7049 0.0638 0.7049 0.8396
No log 1.6154 84 0.6416 0.2457 0.6416 0.8010
No log 1.6538 86 0.9048 0.1660 0.9048 0.9512
No log 1.6923 88 0.9016 0.0558 0.9016 0.9495
No log 1.7308 90 0.6525 0.1739 0.6525 0.8078
No log 1.7692 92 0.8082 0.0622 0.8082 0.8990
No log 1.8077 94 1.0076 0.1644 1.0076 1.0038
No log 1.8462 96 0.7786 0.0291 0.7786 0.8824
No log 1.8846 98 0.6465 0.2688 0.6465 0.8040
No log 1.9231 100 0.8261 0.1600 0.8261 0.9089
No log 1.9615 102 1.3301 0.0476 1.3301 1.1533
No log 2.0 104 1.4759 0.0545 1.4759 1.2149
No log 2.0385 106 1.0473 0.1020 1.0473 1.0234
No log 2.0769 108 0.6323 0.2000 0.6323 0.7952
No log 2.1154 110 0.7226 0.0805 0.7226 0.8501
No log 2.1538 112 0.7206 0.0909 0.7206 0.8489
No log 2.1923 114 0.6580 0.1716 0.6580 0.8112
No log 2.2308 116 0.8950 0.2332 0.8950 0.9461
No log 2.2692 118 1.1301 0.0815 1.1301 1.0631
No log 2.3077 120 0.9940 0.1347 0.9940 0.9970
No log 2.3462 122 0.7047 0.2000 0.7047 0.8395
No log 2.3846 124 0.7027 0.0843 0.7027 0.8383
No log 2.4231 126 0.7422 0.1461 0.7422 0.8615
No log 2.4615 128 0.6819 0.1163 0.6819 0.8258
No log 2.5 130 0.7286 0.2000 0.7286 0.8536
No log 2.5385 132 0.7675 0.2410 0.7675 0.8761
No log 2.5769 134 0.8036 0.28 0.8036 0.8965
No log 2.6154 136 0.9098 0.2877 0.9098 0.9538
No log 2.6538 138 0.9216 0.2727 0.9216 0.9600
No log 2.6923 140 0.9636 0.2000 0.9636 0.9816
No log 2.7308 142 0.9784 0.1718 0.9784 0.9891
No log 2.7692 144 0.8419 0.3010 0.8419 0.9175
No log 2.8077 146 0.8555 0.3010 0.8555 0.9249
No log 2.8462 148 0.9929 0.0877 0.9929 0.9964
No log 2.8846 150 0.9029 0.2593 0.9029 0.9502
No log 2.9231 152 0.7219 0.2871 0.7219 0.8497
No log 2.9615 154 0.7162 0.2475 0.7162 0.8463
No log 3.0 156 0.7093 0.2475 0.7093 0.8422
No log 3.0385 158 0.7845 0.2475 0.7845 0.8857
No log 3.0769 160 0.7184 0.3161 0.7184 0.8476
No log 3.1154 162 0.6720 0.2871 0.6720 0.8198
No log 3.1538 164 0.7065 0.3103 0.7065 0.8405
No log 3.1923 166 0.9598 0.0916 0.9598 0.9797
No log 3.2308 168 0.9434 0.0988 0.9434 0.9713
No log 3.2692 170 0.7099 0.3077 0.7099 0.8425
No log 3.3077 172 0.6089 0.3367 0.6089 0.7803
No log 3.3462 174 0.6444 0.3161 0.6444 0.8027
No log 3.3846 176 0.7864 0.2857 0.7864 0.8868
No log 3.4231 178 0.7219 0.3091 0.7219 0.8496
No log 3.4615 180 0.6181 0.3706 0.6181 0.7862
No log 3.5 182 0.6207 0.3814 0.6207 0.7878
No log 3.5385 184 0.7372 0.2696 0.7372 0.8586
No log 3.5769 186 0.7149 0.2711 0.7149 0.8455
No log 3.6154 188 0.6362 0.3939 0.6362 0.7976
No log 3.6538 190 0.6385 0.3641 0.6385 0.7991
No log 3.6923 192 0.5946 0.3684 0.5946 0.7711
No log 3.7308 194 0.5311 0.2889 0.5311 0.7288
No log 3.7692 196 0.5351 0.3684 0.5351 0.7315
No log 3.8077 198 0.6072 0.3778 0.6072 0.7792
No log 3.8462 200 0.5996 0.3778 0.5996 0.7743
No log 3.8846 202 0.5775 0.3371 0.5775 0.7599
No log 3.9231 204 0.5524 0.2994 0.5524 0.7432
No log 3.9615 206 0.5725 0.3563 0.5725 0.7566
No log 4.0 208 0.6768 0.3161 0.6768 0.8227
No log 4.0385 210 0.9615 0.1867 0.9615 0.9805
No log 4.0769 212 0.9282 0.1799 0.9282 0.9634
No log 4.1154 214 0.7190 0.3769 0.7190 0.8479
No log 4.1538 216 0.6704 0.3231 0.6704 0.8188
No log 4.1923 218 0.6785 0.3862 0.6785 0.8237
No log 4.2308 220 0.8266 0.1351 0.8266 0.9092
No log 4.2692 222 1.0361 0.0977 1.0361 1.0179
No log 4.3077 224 0.9228 0.1203 0.9228 0.9606
No log 4.3462 226 0.6837 0.2432 0.6837 0.8268
No log 4.3846 228 0.6252 0.2099 0.6252 0.7907
No log 4.4231 230 0.6496 0.1011 0.6496 0.8060
No log 4.4615 232 0.6228 0.2704 0.6228 0.7892
No log 4.5 234 0.6607 0.2195 0.6607 0.8128
No log 4.5385 236 0.7487 0.2265 0.7487 0.8653
No log 4.5769 238 0.6964 0.2289 0.6964 0.8345
No log 4.6154 240 0.6124 0.2099 0.6124 0.7825
No log 4.6538 242 0.6038 0.2099 0.6038 0.7771
No log 4.6923 244 0.5764 0.25 0.5764 0.7592
No log 4.7308 246 0.5840 0.3333 0.5840 0.7642
No log 4.7692 248 0.6396 0.3297 0.6396 0.7997
No log 4.8077 250 0.8239 0.2208 0.8239 0.9077
No log 4.8462 252 0.7453 0.3962 0.7453 0.8633
No log 4.8846 254 0.6512 0.3367 0.6512 0.8070
No log 4.9231 256 0.6263 0.3367 0.6263 0.7914
No log 4.9615 258 0.6523 0.3367 0.6523 0.8077
No log 5.0 260 0.7537 0.3455 0.7537 0.8681
No log 5.0385 262 0.8367 0.2743 0.8367 0.9147
No log 5.0769 264 0.8087 0.2743 0.8087 0.8993
No log 5.1154 266 0.6520 0.3769 0.6520 0.8075
No log 5.1538 268 0.6307 0.3769 0.6307 0.7942
No log 5.1923 270 0.7500 0.2793 0.7500 0.8660
No log 5.2308 272 0.8445 0.2366 0.8445 0.9189
No log 5.2692 274 0.8236 0.2713 0.8236 0.9075
No log 5.3077 276 0.7992 0.2397 0.7992 0.8940
No log 5.3462 278 0.6766 0.3028 0.6766 0.8226
No log 5.3846 280 0.5876 0.3446 0.5876 0.7666
No log 5.4231 282 0.5813 0.3258 0.5813 0.7625
No log 5.4615 284 0.5801 0.3488 0.5801 0.7616
No log 5.5 286 0.6375 0.2432 0.6375 0.7984
No log 5.5385 288 0.7410 0.2070 0.7410 0.8608
No log 5.5769 290 0.7501 0.2143 0.7501 0.8661
No log 5.6154 292 0.6751 0.2410 0.6751 0.8216
No log 5.6538 294 0.6003 0.3563 0.6003 0.7748
No log 5.6923 296 0.5883 0.3446 0.5883 0.7670
No log 5.7308 298 0.6004 0.3407 0.6004 0.7748
No log 5.7692 300 0.6370 0.3862 0.6370 0.7981
No log 5.8077 302 0.7874 0.2793 0.7874 0.8874
No log 5.8462 304 1.0629 0.1378 1.0629 1.0310
No log 5.8846 306 1.1039 0.1439 1.1039 1.0506
No log 5.9231 308 0.9263 0.1642 0.9263 0.9624
No log 5.9615 310 0.6780 0.3200 0.6780 0.8234
No log 6.0 312 0.5831 0.3371 0.5831 0.7636
No log 6.0385 314 0.6028 0.2273 0.6028 0.7764
No log 6.0769 316 0.5826 0.3412 0.5826 0.7633
No log 6.1154 318 0.5766 0.3609 0.5766 0.7594
No log 6.1538 320 0.6299 0.2832 0.6299 0.7937
No log 6.1923 322 0.7198 0.1759 0.7198 0.8484
No log 6.2308 324 0.7623 0.1776 0.7623 0.8731
No log 6.2692 326 0.7059 0.2780 0.7059 0.8402
No log 6.3077 328 0.6576 0.3878 0.6576 0.8109
No log 6.3462 330 0.6217 0.3706 0.6217 0.7885
No log 6.3846 332 0.6264 0.4059 0.6264 0.7915
No log 6.4231 334 0.6639 0.4118 0.6639 0.8148
No log 6.4615 336 0.6649 0.4118 0.6649 0.8154
No log 6.5 338 0.6371 0.4118 0.6371 0.7982
No log 6.5385 340 0.6372 0.4118 0.6372 0.7982
No log 6.5769 342 0.5905 0.3439 0.5905 0.7684
No log 6.6154 344 0.5563 0.3446 0.5563 0.7459
No log 6.6538 346 0.5454 0.3446 0.5454 0.7385
No log 6.6923 348 0.5450 0.3446 0.5450 0.7383
No log 6.7308 350 0.5750 0.3439 0.5750 0.7583
No log 6.7692 352 0.6521 0.2609 0.6521 0.8075
No log 6.8077 354 0.6991 0.2670 0.6991 0.8361
No log 6.8462 356 0.6616 0.2707 0.6616 0.8134
No log 6.8846 358 0.5951 0.3878 0.5951 0.7714
No log 6.9231 360 0.5954 0.3769 0.5954 0.7716
No log 6.9615 362 0.6777 0.3200 0.6777 0.8232
No log 7.0 364 0.7437 0.2986 0.7437 0.8624
No log 7.0385 366 0.7076 0.3143 0.7076 0.8412
No log 7.0769 368 0.6916 0.3803 0.6916 0.8316
No log 7.1154 370 0.6559 0.4175 0.6559 0.8099
No log 7.1538 372 0.6283 0.4118 0.6283 0.7927
No log 7.1923 374 0.6192 0.4118 0.6192 0.7869
No log 7.2308 376 0.6527 0.4175 0.6527 0.8079
No log 7.2692 378 0.7083 0.3208 0.7083 0.8416
No log 7.3077 380 0.6929 0.2941 0.6929 0.8324
No log 7.3462 382 0.6558 0.3231 0.6558 0.8098
No log 7.3846 384 0.6292 0.2766 0.6292 0.7933
No log 7.4231 386 0.5965 0.3814 0.5965 0.7723
No log 7.4615 388 0.6042 0.3508 0.6042 0.7773
No log 7.5 390 0.6478 0.2527 0.6478 0.8049
No log 7.5385 392 0.6456 0.2865 0.6456 0.8035
No log 7.5769 394 0.6108 0.3508 0.6108 0.7815
No log 7.6154 396 0.5565 0.3563 0.5565 0.7460
No log 7.6538 398 0.5382 0.3446 0.5382 0.7336
No log 7.6923 400 0.5392 0.3446 0.5392 0.7343
No log 7.7308 402 0.5520 0.3446 0.5520 0.7429
No log 7.7692 404 0.5864 0.3548 0.5864 0.7658
No log 7.8077 406 0.6576 0.2746 0.6576 0.8109
No log 7.8462 408 0.7165 0.2563 0.7165 0.8465
No log 7.8846 410 0.7185 0.2563 0.7185 0.8477
No log 7.9231 412 0.6651 0.2746 0.6651 0.8155
No log 7.9615 414 0.6034 0.3508 0.6034 0.7768
No log 8.0 416 0.5758 0.3478 0.5758 0.7588
No log 8.0385 418 0.5617 0.3563 0.5617 0.7495
No log 8.0769 420 0.5725 0.3563 0.5725 0.7567
No log 8.1154 422 0.6014 0.3548 0.6014 0.7755
No log 8.1538 424 0.6349 0.3508 0.6349 0.7968
No log 8.1923 426 0.6484 0.3508 0.6484 0.8052
No log 8.2308 428 0.6709 0.2746 0.6709 0.8191
No log 8.2692 430 0.6966 0.2308 0.6966 0.8346
No log 8.3077 432 0.7128 0.2762 0.7128 0.8443
No log 8.3462 434 0.7235 0.2850 0.7235 0.8506
No log 8.3846 436 0.7010 0.2676 0.7010 0.8373
No log 8.4231 438 0.6715 0.2709 0.6715 0.8195
No log 8.4615 440 0.6479 0.3469 0.6479 0.8049
No log 8.5 442 0.6658 0.2709 0.6658 0.8159
No log 8.5385 444 0.6712 0.2709 0.6712 0.8192
No log 8.5769 446 0.6703 0.3398 0.6703 0.8187
No log 8.6154 448 0.6383 0.3469 0.6383 0.7990
No log 8.6538 450 0.6164 0.3814 0.6164 0.7851
No log 8.6923 452 0.6193 0.3814 0.6193 0.7870
No log 8.7308 454 0.6478 0.3469 0.6478 0.8049
No log 8.7692 456 0.6692 0.3744 0.6692 0.8181
No log 8.8077 458 0.6956 0.2676 0.6956 0.8340
No log 8.8462 460 0.7491 0.3091 0.7491 0.8655
No log 8.8846 462 0.7643 0.3180 0.7643 0.8743
No log 8.9231 464 0.7434 0.3091 0.7434 0.8622
No log 8.9615 466 0.6986 0.2676 0.6986 0.8358
No log 9.0 468 0.6384 0.3508 0.6384 0.7990
No log 9.0385 470 0.6045 0.3814 0.6045 0.7775
No log 9.0769 472 0.5789 0.375 0.5789 0.7609
No log 9.1154 474 0.5742 0.375 0.5742 0.7577
No log 9.1538 476 0.5847 0.3814 0.5847 0.7647
No log 9.1923 478 0.6104 0.3508 0.6104 0.7813
No log 9.2308 480 0.6399 0.3508 0.6399 0.7999
No log 9.2692 482 0.6853 0.3171 0.6853 0.8278
No log 9.3077 484 0.7384 0.3271 0.7384 0.8593
No log 9.3462 486 0.7675 0.2986 0.7675 0.8761
No log 9.3846 488 0.7649 0.2986 0.7649 0.8746
No log 9.4231 490 0.7404 0.3271 0.7404 0.8604
No log 9.4615 492 0.7108 0.2762 0.7108 0.8431
No log 9.5 494 0.6850 0.3171 0.6850 0.8277
No log 9.5385 496 0.6635 0.2727 0.6635 0.8146
No log 9.5769 498 0.6549 0.2727 0.6549 0.8093
0.3604 9.6154 500 0.6472 0.2746 0.6472 0.8045
0.3604 9.6538 502 0.6401 0.2766 0.6401 0.8001
0.3604 9.6923 504 0.6423 0.2766 0.6423 0.8014
0.3604 9.7308 506 0.6412 0.2766 0.6412 0.8007
0.3604 9.7692 508 0.6405 0.2766 0.6405 0.8003
0.3604 9.8077 510 0.6353 0.2766 0.6353 0.7970
0.3604 9.8462 512 0.6299 0.2766 0.6299 0.7936
0.3604 9.8846 514 0.6261 0.2766 0.6261 0.7913
0.3604 9.9231 516 0.6231 0.2766 0.6231 0.7894
0.3604 9.9615 518 0.6219 0.2766 0.6219 0.7886
0.3604 10.0 520 0.6220 0.2766 0.6220 0.7886
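Note that the final checkpoint is not the best one by validation loss: the minimum in the table is 0.5311 at step 194 (epoch ≈3.73), versus 0.6220 at step 520. A small, hypothetical helper for scanning such a log, shown here on two sample rows copied from the table:

```python
# Scan rows of a Trainer-style results table for the lowest validation loss.
# Two sample rows are inlined; in practice you would paste the full table.
rows = """\
No log 3.7308 194 0.5311 0.2889 0.5311 0.7288
0.3604 10.0 520 0.6220 0.2766 0.6220 0.7886"""

def best_checkpoint(table_text):
    best = None
    for line in table_text.splitlines():
        # The training-loss column may be the two tokens "No log",
        # so parse the six numeric columns from the right.
        epoch, step, val_loss, qwk, _mse, _rmse = line.split()[-6:]
        row = (float(val_loss), int(step), float(epoch), float(qwk))
        if best is None or row < best:
            best = row
    return best  # (val_loss, step, epoch, qwk)

val_loss, step, epoch, qwk = best_checkpoint(rows)
print(step, val_loss)  # 194 0.5311
```

Keeping the checkpoint with the lowest validation loss (or highest Qwk) rather than the last one is a common way to mitigate the overfitting visible in the later epochs.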

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1