ArabicNewSplits6_FineTuningAraBERT_run1_AugV5_k11_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7523
  • Qwk: 0.0857
  • Mse: 0.7523
  • Rmse: 0.8674
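Qwk here is quadratic weighted kappa, which credits predictions that land close to the true ordinal label rather than only exact matches; Mse and Rmse are the usual regression errors. A minimal pure-Python sketch of these metrics (an illustration of the definitions, not the card's actual evaluation script):

```python
import math
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred):
    """Cohen's kappa with quadratic weights over an integer rating scale."""
    lo = min(min(y_true), min(y_pred))
    hi = max(max(y_true), max(y_pred))
    n = hi - lo + 1
    # Observed confusion matrix of (true, predicted) rating pairs.
    observed = [[0] * n for _ in range(n)]
    for t, p in zip(y_true, y_pred):
        observed[t - lo][p - lo] += 1
    hist_t = Counter(t - lo for t in y_true)
    hist_p = Counter(p - lo for p in y_pred)
    total = len(y_true)
    num = den = 0.0
    for i in range(n):
        for j in range(n):
            w = ((i - j) ** 2) / ((n - 1) ** 2)  # quadratic disagreement weight
            expected = hist_t[i] * hist_p[j] / total  # counts under independence
            num += w * observed[i][j]
            den += w * expected
    return 1.0 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    return math.sqrt(mse(y_true, y_pred))
```

Perfect agreement yields a kappa of 1.0, chance-level agreement yields 0.0, so the final Qwk of 0.0857 indicates agreement only slightly above chance.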

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
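With lr_scheduler_type: linear, the learning rate decays linearly from 2e-05 to zero over the run's 520 optimizer steps (52 steps/epoch × 10 epochs, matching the table below). A minimal sketch of the schedule, assuming zero warmup steps (the Trainer default when no warmup is specified):

```python
def linear_lr(step: int, total_steps: int, base_lr: float = 2e-05) -> float:
    """Linear decay from base_lr at step 0 to 0 at total_steps (no warmup)."""
    return base_lr * max(0.0, (total_steps - step) / total_steps)

# 10 epochs x 52 optimizer steps per epoch = 520 total steps
print(linear_lr(0, 520))    # 2e-05 at the start of training
print(linear_lr(260, 520))  # 1e-05 halfway through
print(linear_lr(520, 520))  # 0.0 at the end
```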

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0385 2 3.1183 -0.0126 3.1183 1.7659
No log 0.0769 4 1.6546 0.0168 1.6546 1.2863
No log 0.1154 6 1.1609 0.0294 1.1609 1.0775
No log 0.1538 8 0.6153 0.0222 0.6153 0.7844
No log 0.1923 10 0.5582 0.0569 0.5582 0.7471
No log 0.2308 12 0.5882 0.0569 0.5882 0.7669
No log 0.2692 14 0.6458 0.0145 0.6458 0.8036
No log 0.3077 16 0.8304 0.1870 0.8304 0.9113
No log 0.3462 18 0.8675 0.1545 0.8675 0.9314
No log 0.3846 20 0.6847 -0.0256 0.6847 0.8275
No log 0.4231 22 0.6024 0.0 0.6024 0.7761
No log 0.4615 24 0.6228 0.0 0.6228 0.7892
No log 0.5 26 0.5629 0.0 0.5629 0.7503
No log 0.5385 28 0.5678 0.2418 0.5678 0.7536
No log 0.5769 30 0.6745 0.2258 0.6745 0.8213
No log 0.6154 32 0.7319 0.2157 0.7319 0.8555
No log 0.6538 34 0.5650 0.0725 0.5650 0.7517
No log 0.6923 36 0.5959 0.0 0.5959 0.7720
No log 0.7308 38 0.5982 0.0 0.5982 0.7734
No log 0.7692 40 0.5856 0.0388 0.5856 0.7652
No log 0.8077 42 0.7146 0.0476 0.7146 0.8453
No log 0.8462 44 0.6707 -0.0222 0.6707 0.8189
No log 0.8846 46 0.6150 0.0 0.6150 0.7842
No log 0.9231 48 0.6732 0.0409 0.6732 0.8205
No log 0.9615 50 0.7207 0.0538 0.7207 0.8490
No log 1.0 52 0.8227 0.0588 0.8227 0.9070
No log 1.0385 54 0.7367 0.0538 0.7367 0.8583
No log 1.0769 56 0.5960 -0.0159 0.5960 0.7720
No log 1.1154 58 0.5739 0.0 0.5739 0.7576
No log 1.1538 60 0.5610 0.2653 0.5610 0.7490
No log 1.1923 62 0.7435 0.1416 0.7435 0.8623
No log 1.2308 64 0.8609 0.1644 0.8609 0.9278
No log 1.2692 66 0.6015 0.1795 0.6015 0.7755
No log 1.3077 68 0.5906 -0.0081 0.5906 0.7685
No log 1.3462 70 0.5850 0.0476 0.5850 0.7649
No log 1.3846 72 0.7393 0.0538 0.7393 0.8598
No log 1.4231 74 1.0303 0.1562 1.0303 1.0150
No log 1.4615 76 0.9945 0.1597 0.9945 0.9972
No log 1.5 78 0.7141 0.1258 0.7141 0.8451
No log 1.5385 80 0.8009 0.1707 0.8009 0.8949
No log 1.5769 82 0.7768 0.0604 0.7768 0.8814
No log 1.6154 84 0.8548 0.0164 0.8548 0.9245
No log 1.6538 86 0.8018 0.1030 0.8018 0.8954
No log 1.6923 88 0.8097 0.0952 0.8097 0.8998
No log 1.7308 90 0.7756 0.1823 0.7756 0.8807
No log 1.7692 92 0.7416 0.2688 0.7416 0.8611
No log 1.8077 94 0.7359 0.2179 0.7359 0.8579
No log 1.8462 96 0.6927 0.2281 0.6927 0.8323
No log 1.8846 98 0.7097 0.2609 0.7097 0.8425
No log 1.9231 100 0.6730 0.2752 0.6730 0.8204
No log 1.9615 102 0.6713 0.2485 0.6713 0.8193
No log 2.0 104 0.9089 0.1525 0.9089 0.9534
No log 2.0385 106 0.8171 0.1698 0.8171 0.9039
No log 2.0769 108 0.7456 0.1732 0.7456 0.8635
No log 2.1154 110 0.9923 0.0400 0.9923 0.9961
No log 2.1538 112 0.9496 0.0979 0.9496 0.9745
No log 2.1923 114 0.8015 0.1801 0.8015 0.8952
No log 2.2308 116 0.7076 0.0222 0.7076 0.8412
No log 2.2692 118 0.7005 0.0556 0.7005 0.8369
No log 2.3077 120 0.7096 0.0303 0.7096 0.8424
No log 2.3462 122 0.7474 0.2308 0.7474 0.8645
No log 2.3846 124 0.7628 0.1905 0.7628 0.8734
No log 2.4231 126 0.7089 0.1773 0.7089 0.8419
No log 2.4615 128 0.6387 0.1892 0.6387 0.7992
No log 2.5 130 0.7631 0.125 0.7631 0.8736
No log 2.5385 132 0.8946 0.0918 0.8946 0.9459
No log 2.5769 134 0.7437 0.1515 0.7437 0.8624
No log 2.6154 136 0.5850 0.2848 0.5850 0.7649
No log 2.6538 138 0.6262 0.1329 0.6262 0.7914
No log 2.6923 140 0.6539 0.1392 0.6539 0.8086
No log 2.7308 142 0.6292 0.3455 0.6292 0.7932
No log 2.7692 144 0.7252 0.2464 0.7252 0.8516
No log 2.8077 146 0.8402 0.1790 0.8402 0.9166
No log 2.8462 148 0.9104 0.0871 0.9104 0.9541
No log 2.8846 150 1.0518 0.0916 1.0518 1.0256
No log 2.9231 152 0.9088 0.1169 0.9088 0.9533
No log 2.9615 154 1.0089 0.2065 1.0089 1.0044
No log 3.0 156 0.9686 0.0947 0.9686 0.9842
No log 3.0385 158 1.0023 0.1873 1.0023 1.0012
No log 3.0769 160 1.1725 0.0977 1.1725 1.0828
No log 3.1154 162 1.0917 0.1264 1.0917 1.0448
No log 3.1538 164 0.8133 0.1705 0.8133 0.9018
No log 3.1923 166 0.7820 0.2217 0.7820 0.8843
No log 3.2308 168 0.7746 0.1776 0.7746 0.8801
No log 3.2692 170 0.8705 0.1453 0.8705 0.9330
No log 3.3077 172 0.7830 0.1619 0.7830 0.8849
No log 3.3462 174 0.7397 0.2217 0.7397 0.8601
No log 3.3846 176 0.7789 0.3427 0.7789 0.8825
No log 3.4231 178 0.7534 0.2294 0.7534 0.8680
No log 3.4615 180 1.0060 0.1486 1.0060 1.0030
No log 3.5 182 1.2059 0.0747 1.2059 1.0981
No log 3.5385 184 1.0020 0.1486 1.0020 1.0010
No log 3.5769 186 0.7396 0.2233 0.7396 0.8600
No log 3.6154 188 0.7741 0.2511 0.7741 0.8798
No log 3.6538 190 0.7250 0.2251 0.7250 0.8515
No log 3.6923 192 0.8289 0.1273 0.8289 0.9104
No log 3.7308 194 0.9590 0.0598 0.9590 0.9793
No log 3.7692 196 0.8395 0.1481 0.8395 0.9163
No log 3.8077 198 0.6712 0.2865 0.6712 0.8193
No log 3.8462 200 0.6940 0.2513 0.6940 0.8330
No log 3.8846 202 0.6565 0.2994 0.6565 0.8102
No log 3.9231 204 0.7269 0.2239 0.7269 0.8526
No log 3.9615 206 0.7636 0.1845 0.7636 0.8738
No log 4.0 208 0.6861 0.2707 0.6861 0.8283
No log 4.0385 210 0.6013 0.3136 0.6013 0.7755
No log 4.0769 212 0.6162 0.3371 0.6162 0.7850
No log 4.1154 214 0.6346 0.3708 0.6346 0.7966
No log 4.1538 216 0.8022 0.1515 0.8022 0.8956
No log 4.1923 218 0.8033 0.2068 0.8033 0.8962
No log 4.2308 220 0.7118 0.1841 0.7118 0.8437
No log 4.2692 222 0.7347 0.3010 0.7347 0.8571
No log 4.3077 224 0.7347 0.2919 0.7347 0.8571
No log 4.3462 226 0.7109 0.2621 0.7109 0.8432
No log 4.3846 228 0.7958 0.2069 0.7958 0.8921
No log 4.4231 230 0.8108 0.1429 0.8108 0.9005
No log 4.4615 232 0.7160 0.1289 0.7160 0.8461
No log 4.5 234 0.6574 0.3369 0.6574 0.8108
No log 4.5385 236 0.6639 0.2893 0.6639 0.8148
No log 4.5769 238 0.6794 0.3641 0.6794 0.8242
No log 4.6154 240 0.6637 0.2453 0.6637 0.8147
No log 4.6538 242 0.7474 0.2072 0.7474 0.8645
No log 4.6923 244 0.8046 0.2137 0.8046 0.8970
No log 4.7308 246 0.7085 0.2744 0.7085 0.8417
No log 4.7692 248 0.6819 0.2762 0.6819 0.8258
No log 4.8077 250 0.6272 0.3161 0.6272 0.7920
No log 4.8462 252 0.6110 0.4227 0.6110 0.7816
No log 4.8846 254 0.6025 0.3684 0.6025 0.7762
No log 4.9231 256 0.6041 0.3898 0.6041 0.7772
No log 4.9615 258 0.6752 0.2513 0.6752 0.8217
No log 5.0 260 0.6849 0.25 0.6849 0.8276
No log 5.0385 262 0.6519 0.1908 0.6519 0.8074
No log 5.0769 264 0.6497 0.2273 0.6497 0.8061
No log 5.1154 266 0.7237 0.2727 0.7237 0.8507
No log 5.1538 268 0.9334 0.1475 0.9334 0.9661
No log 5.1923 270 1.1708 0.1126 1.1708 1.0820
No log 5.2308 272 1.1028 0.1280 1.1028 1.0502
No log 5.2692 274 0.8425 0.0901 0.8425 0.9179
No log 5.3077 276 0.7888 0.2536 0.7888 0.8881
No log 5.3462 278 0.8095 0.1131 0.8095 0.8997
No log 5.3846 280 0.8560 0.0617 0.8560 0.9252
No log 5.4231 282 0.8152 0.1228 0.8152 0.9029
No log 5.4615 284 0.8587 0.1736 0.8587 0.9267
No log 5.5 286 0.8218 0.1379 0.8218 0.9065
No log 5.5385 288 0.7182 0.1848 0.7182 0.8475
No log 5.5769 290 0.7069 0.1923 0.7069 0.8408
No log 5.6154 292 0.7103 0.1923 0.7103 0.8428
No log 5.6538 294 0.7618 0.1644 0.7618 0.8728
No log 5.6923 296 0.8190 0.1786 0.8190 0.9050
No log 5.7308 298 0.8662 0.2713 0.8662 0.9307
No log 5.7692 300 0.8313 0.1660 0.8313 0.9118
No log 5.8077 302 0.8192 0.1933 0.8192 0.9051
No log 5.8462 304 0.9057 0.2685 0.9057 0.9517
No log 5.8846 306 0.9837 0.2121 0.9837 0.9918
No log 5.9231 308 0.9374 0.2124 0.9374 0.9682
No log 5.9615 310 0.7665 0.1652 0.7665 0.8755
No log 6.0 312 0.6613 0.2577 0.6613 0.8132
No log 6.0385 314 0.6469 0.2593 0.6469 0.8043
No log 6.0769 316 0.6739 0.3131 0.6739 0.8209
No log 6.1154 318 0.7891 0.1515 0.7891 0.8883
No log 6.1538 320 0.7886 0.1795 0.7886 0.8880
No log 6.1923 322 0.6814 0.1538 0.6814 0.8255
No log 6.2308 324 0.6493 0.2323 0.6493 0.8058
No log 6.2692 326 0.6637 0.1921 0.6637 0.8147
No log 6.3077 328 0.7767 0.2137 0.7767 0.8813
No log 6.3462 330 0.8261 0.2203 0.8261 0.9089
No log 6.3846 332 0.7352 0.1402 0.7352 0.8574
No log 6.4231 334 0.6203 0.2323 0.6203 0.7876
No log 6.4615 336 0.6088 0.3191 0.6088 0.7803
No log 6.5 338 0.6227 0.2727 0.6227 0.7891
No log 6.5385 340 0.6788 0.2308 0.6788 0.8239
No log 6.5769 342 0.7702 0.1724 0.7702 0.8776
No log 6.6154 344 0.8262 0.2531 0.8262 0.9090
No log 6.6538 346 0.7819 0.2000 0.7819 0.8843
No log 6.6923 348 0.6950 0.1852 0.6950 0.8337
No log 6.7308 350 0.6896 0.2074 0.6896 0.8304
No log 6.7692 352 0.7069 0.2150 0.7069 0.8408
No log 6.8077 354 0.7484 0.2222 0.7484 0.8651
No log 6.8462 356 0.8248 0.2000 0.8248 0.9082
No log 6.8846 358 0.8513 0.2000 0.8513 0.9227
No log 6.9231 360 0.7856 0.1855 0.7856 0.8864
No log 6.9615 362 0.7373 0.2287 0.7373 0.8587
No log 7.0 364 0.7304 0.1855 0.7304 0.8546
No log 7.0385 366 0.7217 0.2227 0.7217 0.8495
No log 7.0769 368 0.7215 0.1855 0.7215 0.8494
No log 7.1154 370 0.7169 0.2621 0.7169 0.8467
No log 7.1538 372 0.7185 0.2217 0.7185 0.8476
No log 7.1923 374 0.7740 0.1852 0.7740 0.8798
No log 7.2308 376 0.9398 0.2124 0.9398 0.9694
No log 7.2692 378 1.0377 0.1628 1.0377 1.0187
No log 7.3077 380 0.9824 0.1628 0.9824 0.9912
No log 7.3462 382 0.8104 0.1652 0.8104 0.9002
No log 7.3846 384 0.7211 0.1925 0.7211 0.8492
No log 7.4231 386 0.6756 0.2332 0.6756 0.8220
No log 7.4615 388 0.6787 0.2239 0.6787 0.8239
No log 7.5 390 0.6940 0.2239 0.6940 0.8330
No log 7.5385 392 0.7466 0.2308 0.7466 0.8641
No log 7.5769 394 0.7494 0.1925 0.7494 0.8657
No log 7.6154 396 0.7533 0.1925 0.7533 0.8679
No log 7.6538 398 0.7467 0.2308 0.7467 0.8641
No log 7.6923 400 0.7472 0.2308 0.7472 0.8644
No log 7.7308 402 0.7804 0.1289 0.7804 0.8834
No log 7.7692 404 0.7635 0.1636 0.7635 0.8738
No log 7.8077 406 0.7173 0.2315 0.7173 0.8469
No log 7.8462 408 0.6640 0.2239 0.6640 0.8149
No log 7.8846 410 0.6389 0.3131 0.6389 0.7993
No log 7.9231 412 0.6408 0.2941 0.6408 0.8005
No log 7.9615 414 0.6382 0.2941 0.6382 0.7989
No log 8.0 416 0.6316 0.3200 0.6316 0.7947
No log 8.0385 418 0.6463 0.2323 0.6463 0.8039
No log 8.0769 420 0.6828 0.2323 0.6828 0.8263
No log 8.1154 422 0.6872 0.1921 0.6872 0.8290
No log 8.1538 424 0.6581 0.2323 0.6581 0.8112
No log 8.1923 426 0.6176 0.2766 0.6176 0.7859
No log 8.2308 428 0.6122 0.3591 0.6122 0.7824
No log 8.2692 430 0.6252 0.2766 0.6252 0.7907
No log 8.3077 432 0.6443 0.2323 0.6443 0.8027
No log 8.3462 434 0.6874 0.1549 0.6874 0.8291
No log 8.3846 436 0.6918 0.1538 0.6918 0.8317
No log 8.4231 438 0.6600 0.2323 0.6600 0.8124
No log 8.4615 440 0.6413 0.2670 0.6413 0.8008
No log 8.5 442 0.6381 0.2688 0.6381 0.7988
No log 8.5385 444 0.6506 0.2340 0.6506 0.8066
No log 8.5769 446 0.6801 0.2323 0.6801 0.8247
No log 8.6154 448 0.7315 0.1538 0.7315 0.8553
No log 8.6538 450 0.7613 0.0566 0.7613 0.8725
No log 8.6923 452 0.7759 0.0599 0.7759 0.8808
No log 8.7308 454 0.7556 0.0884 0.7556 0.8692
No log 8.7692 456 0.7506 0.0857 0.7506 0.8664
No log 8.8077 458 0.7531 0.0857 0.7531 0.8678
No log 8.8462 460 0.7623 0.0884 0.7623 0.8731
No log 8.8846 462 0.7554 0.0857 0.7554 0.8691
No log 8.9231 464 0.7598 0.0857 0.7598 0.8716
No log 8.9615 466 0.7525 0.0857 0.7525 0.8675
No log 9.0 468 0.7394 0.1220 0.7394 0.8599
No log 9.0385 470 0.7320 0.1154 0.7320 0.8556
No log 9.0769 472 0.7246 0.2233 0.7246 0.8512
No log 9.1154 474 0.7106 0.2233 0.7106 0.8430
No log 9.1538 476 0.7116 0.2233 0.7116 0.8436
No log 9.1923 478 0.7217 0.1848 0.7217 0.8495
No log 9.2308 480 0.7239 0.1923 0.7239 0.8508
No log 9.2692 482 0.7193 0.1923 0.7193 0.8481
No log 9.3077 484 0.7146 0.2315 0.7146 0.8454
No log 9.3462 486 0.7082 0.2233 0.7082 0.8416
No log 9.3846 488 0.6971 0.2233 0.6971 0.8349
No log 9.4231 490 0.6987 0.2233 0.6987 0.8359
No log 9.4615 492 0.7060 0.2233 0.7060 0.8402
No log 9.5 494 0.7211 0.1549 0.7211 0.8492
No log 9.5385 496 0.7282 0.0857 0.7282 0.8533
No log 9.5769 498 0.7340 0.0857 0.7340 0.8567
0.3766 9.6154 500 0.7340 0.0857 0.7340 0.8568
0.3766 9.6538 502 0.7348 0.0857 0.7348 0.8572
0.3766 9.6923 504 0.7414 0.0857 0.7414 0.8610
0.3766 9.7308 506 0.7521 0.0857 0.7521 0.8672
0.3766 9.7692 508 0.7528 0.0857 0.7528 0.8676
0.3766 9.8077 510 0.7519 0.0857 0.7519 0.8671
0.3766 9.8462 512 0.7521 0.0857 0.7521 0.8672
0.3766 9.8846 514 0.7511 0.0857 0.7511 0.8667
0.3766 9.9231 516 0.7504 0.0857 0.7504 0.8663
0.3766 9.9615 518 0.7514 0.0857 0.7514 0.8668
0.3766 10.0 520 0.7523 0.0857 0.7523 0.8674
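Note that Validation Loss equals Mse at every step, which suggests an MSE regression objective, and Rmse is simply the square root of Mse. A quick arithmetic check on the final row:

```python
import math

final_mse = 0.7523  # final-epoch validation MSE from the last table row
print(round(math.sqrt(final_mse), 4))  # 0.8674, the reported Rmse
```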

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model details

  • Model size: 0.1B params
  • Tensor type: F32
  • Format: Safetensors

Model tree for MayBashendy/ArabicNewSplits6_FineTuningAraBERT_run1_AugV5_k11_task3_organization

  • Base model: aubmindlab/bert-base-arabertv02 → this model