ArabicNewSplits6_FineTuningAraBERT_run1_AugV5_k10_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (recomputed in the sketch after the list):

  • Loss: 0.8271
  • Qwk (Quadratic Weighted Kappa): 0.6081
  • Mse (Mean Squared Error): 0.8271
  • Rmse (Root Mean Squared Error): 0.9095
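For reference, these metrics can be recomputed with scikit-learn. The sketch below uses hypothetical score arrays as stand-ins, since the evaluation data is not documented:

```python
# Recompute Qwk / Mse / Rmse as reported above; the score arrays
# here are hypothetical stand-ins for the real labels and predictions.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([3, 2, 4, 1, 3])  # hypothetical gold scores
y_pred = np.array([3, 2, 3, 2, 4])  # hypothetical predicted scores

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)
print(f"Qwk={qwk:.4f} Mse={mse:.4f} Rmse={np.sqrt(mse):.4f}")
```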

Model description

More information needed

Intended uses & limitations

More information needed
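Pending documentation from the authors, here is a minimal, unverified loading sketch. It assumes the checkpoint exposes a sequence-classification head producing a single organization score, which the regression-style metrics above suggest but do not confirm:

```python
# Minimal loading sketch; the single-score regression head is an
# assumption inferred from the Mse/Rmse metrics reported above.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo = "MayBashendy/ArabicNewSplits6_FineTuningAraBERT_run1_AugV5_k10_task1_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)

inputs = tokenizer("نص تجريبي", return_tensors="pt", truncation=True)  # placeholder Arabic text
with torch.no_grad():
    score = model(**inputs).logits
print(score)
```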

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (mirrored in the sketch after the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
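These settings map directly onto transformers TrainingArguments. A sketch under the assumption that evaluation ran every 2 steps (as the results table below suggests), with an illustrative output directory; model and data setup are omitted:

```python
# Sketch of the listed hyperparameters as TrainingArguments; the
# output directory is illustrative, and eval_steps=2 is inferred
# from the step spacing in the results table rather than stated.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_task1_organization",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    eval_strategy="steps",
    eval_steps=2,
)
```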

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0385 2 5.4654 -0.0205 5.4654 2.3378
No log 0.0769 4 3.6061 0.0256 3.6061 1.8990
No log 0.1154 6 2.9172 -0.0952 2.9172 1.7080
No log 0.1538 8 1.6997 0.0735 1.6997 1.3037
No log 0.1923 10 1.2158 0.2871 1.2158 1.1026
No log 0.2308 12 1.0881 0.2456 1.0881 1.0431
No log 0.2692 14 1.1064 0.2004 1.1064 1.0519
No log 0.3077 16 1.1686 0.1981 1.1686 1.0810
No log 0.3462 18 1.2354 0.2672 1.2354 1.1115
No log 0.3846 20 1.4841 0.0530 1.4841 1.2182
No log 0.4231 22 1.5255 0.0032 1.5255 1.2351
No log 0.4615 24 1.3239 0.1523 1.3239 1.1506
No log 0.5 26 1.1864 0.3519 1.1864 1.0892
No log 0.5385 28 1.1417 0.3367 1.1417 1.0685
No log 0.5769 30 1.1303 0.2080 1.1303 1.0632
No log 0.6154 32 1.1060 0.2080 1.1060 1.0516
No log 0.6538 34 1.0908 0.3296 1.0908 1.0444
No log 0.6923 36 1.1312 0.3709 1.1312 1.0636
No log 0.7308 38 1.1971 0.2468 1.1971 1.0941
No log 0.7692 40 1.2123 0.2319 1.2123 1.1010
No log 0.8077 42 1.2654 0.2136 1.2654 1.1249
No log 0.8462 44 1.4356 0.2345 1.4356 1.1981
No log 0.8846 46 1.4031 0.2591 1.4031 1.1845
No log 0.9231 48 1.2999 0.2507 1.2999 1.1401
No log 0.9615 50 1.1356 0.3278 1.1356 1.0657
No log 1.0 52 1.0142 0.3939 1.0142 1.0071
No log 1.0385 54 1.0208 0.3988 1.0208 1.0104
No log 1.0769 56 1.1338 0.2807 1.1338 1.0648
No log 1.1154 58 1.3548 0.2769 1.3548 1.1640
No log 1.1538 60 1.6044 0.2532 1.6044 1.2666
No log 1.1923 62 1.5156 0.2842 1.5156 1.2311
No log 1.2308 64 1.2890 0.3298 1.2889 1.1353
No log 1.2692 66 1.0205 0.3799 1.0205 1.0102
No log 1.3077 68 0.8266 0.4505 0.8266 0.9092
No log 1.3462 70 0.8009 0.4912 0.8009 0.8949
No log 1.3846 72 0.8306 0.5012 0.8306 0.9114
No log 1.4231 74 0.7771 0.5349 0.7771 0.8815
No log 1.4615 76 0.7454 0.5964 0.7454 0.8634
No log 1.5 78 0.7046 0.5955 0.7046 0.8394
No log 1.5385 80 0.7287 0.5515 0.7287 0.8536
No log 1.5769 82 0.7488 0.5848 0.7488 0.8653
No log 1.6154 84 0.7252 0.6212 0.7252 0.8516
No log 1.6538 86 0.7236 0.5847 0.7236 0.8506
No log 1.6923 88 0.7120 0.5906 0.7120 0.8438
No log 1.7308 90 0.7250 0.6333 0.7250 0.8515
No log 1.7692 92 0.7588 0.5737 0.7588 0.8711
No log 1.8077 94 0.7566 0.5897 0.7566 0.8698
No log 1.8462 96 0.7340 0.5834 0.7340 0.8567
No log 1.8846 98 0.7109 0.5767 0.7109 0.8431
No log 1.9231 100 0.8039 0.5395 0.8039 0.8966
No log 1.9615 102 0.8589 0.5400 0.8589 0.9268
No log 2.0 104 0.8466 0.5497 0.8466 0.9201
No log 2.0385 106 0.6981 0.6323 0.6981 0.8355
No log 2.0769 108 0.6705 0.6617 0.6705 0.8188
No log 2.1154 110 0.6919 0.6525 0.6919 0.8318
No log 2.1538 112 0.6844 0.6697 0.6844 0.8273
No log 2.1923 114 0.6791 0.6785 0.6791 0.8241
No log 2.2308 116 0.7106 0.6748 0.7106 0.8430
No log 2.2692 118 0.7621 0.6633 0.7621 0.8730
No log 2.3077 120 0.8450 0.5896 0.8450 0.9192
No log 2.3462 122 0.9026 0.5368 0.9026 0.9500
No log 2.3846 124 0.9419 0.4885 0.9419 0.9705
No log 2.4231 126 0.9489 0.4619 0.9489 0.9741
No log 2.4615 128 0.9073 0.4939 0.9073 0.9525
No log 2.5 130 0.8534 0.6110 0.8534 0.9238
No log 2.5385 132 0.7799 0.6565 0.7799 0.8831
No log 2.5769 134 0.7239 0.6633 0.7239 0.8508
No log 2.6154 136 0.7397 0.6328 0.7397 0.8600
No log 2.6538 138 0.7358 0.6595 0.7358 0.8578
No log 2.6923 140 0.7125 0.6482 0.7125 0.8441
No log 2.7308 142 0.6996 0.6797 0.6996 0.8364
No log 2.7692 144 0.7449 0.6537 0.7449 0.8631
No log 2.8077 146 0.8062 0.6227 0.8062 0.8979
No log 2.8462 148 0.8052 0.6221 0.8052 0.8973
No log 2.8846 150 0.7586 0.6440 0.7586 0.8710
No log 2.9231 152 0.7382 0.6593 0.7382 0.8592
No log 2.9615 154 0.7147 0.6459 0.7147 0.8454
No log 3.0 156 0.7214 0.6320 0.7214 0.8493
No log 3.0385 158 0.7358 0.6303 0.7358 0.8578
No log 3.0769 160 0.7151 0.6361 0.7151 0.8456
No log 3.1154 162 0.6911 0.6867 0.6911 0.8313
No log 3.1538 164 0.7027 0.6495 0.7027 0.8383
No log 3.1923 166 0.7037 0.6411 0.7037 0.8389
No log 3.2308 168 0.6967 0.6861 0.6967 0.8347
No log 3.2692 170 0.6946 0.6904 0.6946 0.8334
No log 3.3077 172 0.7309 0.6747 0.7309 0.8550
No log 3.3462 174 0.7628 0.6756 0.7628 0.8734
No log 3.3846 176 0.7870 0.6538 0.7870 0.8872
No log 3.4231 178 0.7960 0.6668 0.7960 0.8922
No log 3.4615 180 0.8121 0.6406 0.8121 0.9011
No log 3.5 182 0.8337 0.6278 0.8337 0.9131
No log 3.5385 184 0.8711 0.6188 0.8711 0.9333
No log 3.5769 186 0.8432 0.6272 0.8432 0.9182
No log 3.6154 188 0.7948 0.6451 0.7948 0.8915
No log 3.6538 190 0.7661 0.6498 0.7661 0.8753
No log 3.6923 192 0.7409 0.6433 0.7409 0.8608
No log 3.7308 194 0.7179 0.6934 0.7179 0.8473
No log 3.7692 196 0.7273 0.6725 0.7273 0.8528
No log 3.8077 198 0.7394 0.6485 0.7394 0.8599
No log 3.8462 200 0.7470 0.6372 0.7470 0.8643
No log 3.8846 202 0.7646 0.6556 0.7646 0.8744
No log 3.9231 204 0.8152 0.6283 0.8152 0.9029
No log 3.9615 206 0.8638 0.6050 0.8638 0.9294
No log 4.0 208 0.9246 0.5768 0.9246 0.9616
No log 4.0385 210 0.9504 0.5737 0.9504 0.9749
No log 4.0769 212 0.9097 0.5858 0.9097 0.9538
No log 4.1154 214 0.8441 0.6243 0.8441 0.9188
No log 4.1538 216 0.8034 0.6558 0.8034 0.8963
No log 4.1923 218 0.7752 0.6685 0.7752 0.8805
No log 4.2308 220 0.7618 0.6714 0.7618 0.8728
No log 4.2692 222 0.7946 0.6276 0.7946 0.8914
No log 4.3077 224 0.8454 0.6114 0.8454 0.9194
No log 4.3462 226 0.8124 0.5981 0.8124 0.9013
No log 4.3846 228 0.7352 0.6432 0.7352 0.8574
No log 4.4231 230 0.7084 0.6615 0.7084 0.8417
No log 4.4615 232 0.7177 0.7094 0.7177 0.8471
No log 4.5 234 0.7189 0.7054 0.7189 0.8479
No log 4.5385 236 0.7142 0.6586 0.7142 0.8451
No log 4.5769 238 0.7722 0.6124 0.7722 0.8787
No log 4.6154 240 0.8425 0.6132 0.8425 0.9179
No log 4.6538 242 0.8805 0.6123 0.8805 0.9384
No log 4.6923 244 0.8518 0.6249 0.8518 0.9229
No log 4.7308 246 0.8649 0.6128 0.8649 0.9300
No log 4.7692 248 0.8388 0.6513 0.8388 0.9158
No log 4.8077 250 0.8025 0.6598 0.8025 0.8958
No log 4.8462 252 0.7554 0.6890 0.7554 0.8692
No log 4.8846 254 0.7520 0.6628 0.7520 0.8672
No log 4.9231 256 0.8093 0.6476 0.8093 0.8996
No log 4.9615 258 0.8906 0.6139 0.8906 0.9437
No log 5.0 260 0.8673 0.6100 0.8673 0.9313
No log 5.0385 262 0.8622 0.6182 0.8622 0.9285
No log 5.0769 264 0.8222 0.6271 0.8222 0.9068
No log 5.1154 266 0.7874 0.6376 0.7874 0.8873
No log 5.1538 268 0.7929 0.6269 0.7929 0.8905
No log 5.1923 270 0.8050 0.6405 0.8050 0.8972
No log 5.2308 272 0.8275 0.6479 0.8275 0.9097
No log 5.2692 274 0.9144 0.6275 0.9144 0.9563
No log 5.3077 276 1.0158 0.6149 1.0158 1.0079
No log 5.3462 278 1.0498 0.6212 1.0498 1.0246
No log 5.3846 280 1.0416 0.6165 1.0416 1.0206
No log 5.4231 282 1.0123 0.5862 1.0123 1.0061
No log 5.4615 284 0.8771 0.6307 0.8771 0.9366
No log 5.5 286 0.8081 0.6421 0.8081 0.8990
No log 5.5385 288 0.7648 0.6751 0.7648 0.8745
No log 5.5769 290 0.7702 0.6579 0.7702 0.8776
No log 5.6154 292 0.8017 0.6330 0.8017 0.8954
No log 5.6538 294 0.8668 0.6171 0.8668 0.9310
No log 5.6923 296 0.9368 0.6245 0.9368 0.9679
No log 5.7308 298 0.9425 0.6277 0.9425 0.9708
No log 5.7692 300 0.8866 0.6437 0.8866 0.9416
No log 5.8077 302 0.8281 0.6279 0.8281 0.9100
No log 5.8462 304 0.8254 0.6323 0.8254 0.9085
No log 5.8846 306 0.8835 0.6470 0.8835 0.9399
No log 5.9231 308 0.9376 0.6256 0.9376 0.9683
No log 5.9615 310 0.9420 0.6362 0.9420 0.9706
No log 6.0 312 0.9584 0.6323 0.9584 0.9790
No log 6.0385 314 1.0166 0.6195 1.0166 1.0082
No log 6.0769 316 1.0355 0.6358 1.0355 1.0176
No log 6.1154 318 1.0123 0.6132 1.0123 1.0061
No log 6.1538 320 0.9242 0.5951 0.9242 0.9614
No log 6.1923 322 0.8477 0.6072 0.8477 0.9207
No log 6.2308 324 0.8260 0.6076 0.8260 0.9088
No log 6.2692 326 0.8412 0.6035 0.8412 0.9171
No log 6.3077 328 0.8611 0.6136 0.8611 0.9279
No log 6.3462 330 0.8585 0.6284 0.8585 0.9265
No log 6.3846 332 0.8831 0.6407 0.8831 0.9397
No log 6.4231 334 0.9242 0.6407 0.9242 0.9614
No log 6.4615 336 0.9401 0.6353 0.9401 0.9696
No log 6.5 338 0.9180 0.6276 0.9180 0.9581
No log 6.5385 340 0.8366 0.6305 0.8366 0.9147
No log 6.5769 342 0.7751 0.6461 0.7751 0.8804
No log 6.6154 344 0.7661 0.6434 0.7661 0.8753
No log 6.6538 346 0.7731 0.6469 0.7731 0.8793
No log 6.6923 348 0.8157 0.6226 0.8157 0.9031
No log 6.7308 350 0.8592 0.6268 0.8592 0.9269
No log 6.7692 352 0.8500 0.6179 0.8500 0.9219
No log 6.8077 354 0.8190 0.6346 0.8190 0.9050
No log 6.8462 356 0.8109 0.6411 0.8109 0.9005
No log 6.8846 358 0.7941 0.6371 0.7941 0.8911
No log 6.9231 360 0.8132 0.6098 0.8132 0.9018
No log 6.9615 362 0.8555 0.6227 0.8555 0.9249
No log 7.0 364 0.9134 0.6184 0.9134 0.9557
No log 7.0385 366 0.9757 0.6156 0.9757 0.9878
No log 7.0769 368 0.9691 0.6156 0.9691 0.9844
No log 7.1154 370 0.9410 0.6170 0.9410 0.9701
No log 7.1538 372 0.9021 0.6267 0.9021 0.9498
No log 7.1923 374 0.8782 0.6321 0.8782 0.9371
No log 7.2308 376 0.8627 0.6196 0.8627 0.9288
No log 7.2692 378 0.8677 0.6230 0.8677 0.9315
No log 7.3077 380 0.8751 0.6251 0.8751 0.9355
No log 7.3462 382 0.8711 0.6251 0.8711 0.9333
No log 7.3846 384 0.8804 0.5984 0.8804 0.9383
No log 7.4231 386 0.9022 0.5984 0.9022 0.9499
No log 7.4615 388 0.9250 0.6048 0.9250 0.9618
No log 7.5 390 0.9628 0.5983 0.9628 0.9812
No log 7.5385 392 0.9988 0.6074 0.9988 0.9994
No log 7.5769 394 1.0152 0.6151 1.0152 1.0076
No log 7.6154 396 0.9867 0.6091 0.9867 0.9933
No log 7.6538 398 0.9431 0.5959 0.9431 0.9711
No log 7.6923 400 0.8815 0.6100 0.8815 0.9389
No log 7.7308 402 0.8579 0.5957 0.8579 0.9263
No log 7.7692 404 0.8417 0.5934 0.8417 0.9175
No log 7.8077 406 0.8450 0.5934 0.8450 0.9192
No log 7.8462 408 0.8698 0.6054 0.8698 0.9326
No log 7.8846 410 0.9111 0.5981 0.9111 0.9545
No log 7.9231 412 0.9431 0.6061 0.9431 0.9711
No log 7.9615 414 0.9311 0.6061 0.9311 0.9649
No log 8.0 416 0.9025 0.5932 0.9025 0.9500
No log 8.0385 418 0.8568 0.6022 0.8568 0.9256
No log 8.0769 420 0.8187 0.5871 0.8187 0.9048
No log 8.1154 422 0.8161 0.5871 0.8161 0.9034
No log 8.1538 424 0.8217 0.5934 0.8217 0.9065
No log 8.1923 426 0.8287 0.5934 0.8287 0.9103
No log 8.2308 428 0.8497 0.6181 0.8497 0.9218
No log 8.2692 430 0.8869 0.6161 0.8869 0.9417
No log 8.3077 432 0.9184 0.6149 0.9184 0.9584
No log 8.3462 434 0.9398 0.6197 0.9398 0.9694
No log 8.3846 436 0.9363 0.6197 0.9363 0.9676
No log 8.4231 438 0.9211 0.6231 0.9211 0.9597
No log 8.4615 440 0.9028 0.6169 0.9028 0.9502
No log 8.5 442 0.8975 0.6164 0.8975 0.9474
No log 8.5385 444 0.8926 0.6129 0.8926 0.9448
No log 8.5769 446 0.8782 0.6189 0.8782 0.9371
No log 8.6154 448 0.8596 0.6184 0.8596 0.9271
No log 8.6538 450 0.8513 0.6152 0.8513 0.9226
No log 8.6923 452 0.8615 0.6015 0.8615 0.9281
No log 8.7308 454 0.8820 0.5881 0.8820 0.9392
No log 8.7692 456 0.9164 0.6182 0.9164 0.9573
No log 8.8077 458 0.9409 0.6179 0.9409 0.9700
No log 8.8462 460 0.9732 0.6126 0.9732 0.9865
No log 8.8846 462 0.9789 0.6086 0.9789 0.9894
No log 8.9231 464 0.9634 0.6126 0.9634 0.9815
No log 8.9615 466 0.9317 0.6192 0.9317 0.9652
No log 9.0 468 0.9118 0.6130 0.9118 0.9549
No log 9.0385 470 0.8991 0.6142 0.8991 0.9482
No log 9.0769 472 0.8798 0.5985 0.8798 0.9380
No log 9.1154 474 0.8557 0.5854 0.8557 0.9250
No log 9.1538 476 0.8326 0.5960 0.8326 0.9125
No log 9.1923 478 0.8171 0.6172 0.8171 0.9039
No log 9.2308 480 0.8044 0.6215 0.8044 0.8969
No log 9.2692 482 0.7948 0.6256 0.7948 0.8915
No log 9.3077 484 0.7910 0.6256 0.7910 0.8894
No log 9.3462 486 0.7885 0.6220 0.7885 0.8880
No log 9.3846 488 0.7875 0.6220 0.7875 0.8874
No log 9.4231 490 0.7887 0.6110 0.7887 0.8881
No log 9.4615 492 0.7914 0.6110 0.7914 0.8896
No log 9.5 494 0.7986 0.6067 0.7986 0.8937
No log 9.5385 496 0.8042 0.6132 0.8042 0.8967
No log 9.5769 498 0.8080 0.6196 0.8080 0.8989
0.43 9.6154 500 0.8131 0.6201 0.8131 0.9017
0.43 9.6538 502 0.8171 0.6081 0.8171 0.9039
0.43 9.6923 504 0.8214 0.6081 0.8214 0.9063
0.43 9.7308 506 0.8244 0.6081 0.8244 0.9080
0.43 9.7692 508 0.8254 0.6081 0.8254 0.9085
0.43 9.8077 510 0.8253 0.6081 0.8253 0.9084
0.43 9.8462 512 0.8263 0.6081 0.8263 0.9090
0.43 9.8846 514 0.8274 0.6081 0.8274 0.9096
0.43 9.9231 516 0.8275 0.6081 0.8275 0.9097
0.43 9.9615 518 0.8273 0.6081 0.8273 0.9095
0.43 10.0 520 0.8271 0.6081 0.8271 0.9095
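The per-step Qwk/Mse/Rmse columns above are consistent with a Trainer compute_metrics hook of roughly this shape; a sketch, assuming a single regression output rounded to integer scores for the kappa computation:

```python
# Sketch of a compute_metrics hook that would produce the columns
# above; rounding predictions to integer scores is an assumption.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(eval_pred):
    predictions, labels = eval_pred
    preds = np.squeeze(predictions)
    qwk = cohen_kappa_score(
        np.rint(labels).astype(int),
        np.rint(preds).astype(int),
        weights="quadratic",
    )
    mse = mean_squared_error(labels, preds)
    return {"qwk": qwk, "mse": mse, "rmse": np.sqrt(mse)}
```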

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
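To check a local environment against these pinned versions, a small sketch:

```python
# Print installed versions to compare against the pinned ones above.
import datasets
import tokenizers
import torch
import transformers

for mod in (transformers, torch, datasets, tokenizers):
    print(mod.__name__, mod.__version__)
```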
Safetensors

  • Model size: 0.1B params
  • Tensor type: F32

Model tree for MayBashendy/ArabicNewSplits6_FineTuningAraBERT_run1_AugV5_k10_task1_organization

  • Finetuned from: aubmindlab/bert-base-arabertv02