ArabicNewSplits6_WithDuplicationsForScore5_FineTuningAraBERT_run2_AugV5_k6_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how these metrics are typically computed follows the list):

  • Loss: 1.0749
  • Qwk: 0.2624
  • Mse: 1.0749
  • Rmse: 1.0368
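
The reported loss equals the reported Mse, which is consistent with a single-logit regression head trained with mean-squared-error loss. As a rough guide, the three evaluation metrics can be computed as in the minimal sketch below; the labels are hypothetical and the actual evaluation code for this run is not published with the card.

    # Illustrative sketch of the Qwk / Mse / Rmse computation for an ordinal
    # scoring task. Hypothetical labels; not the card's own evaluation code.
    import numpy as np
    from sklearn.metrics import cohen_kappa_score, mean_squared_error

    y_true = np.array([0, 1, 2, 2, 3])  # hypothetical gold scores
    y_pred = np.array([0, 2, 2, 1, 3])  # hypothetical rounded model predictions

    qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # Qwk
    mse = mean_squared_error(y_true, y_pred)                      # Mse
    rmse = np.sqrt(mse)                                           # Rmse
    print(f"Qwk={qwk:.4f} Mse={mse:.4f} Rmse={rmse:.4f}")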

Model description

More information needed

Intended uses & limitations

More information needed
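
Although the card gives no usage details, a minimal hedged sketch of loading the checkpoint follows. It assumes the model carries a standard sequence-classification head; since the validation loss equals the MSE, a single regression logit (num_labels=1) for ordinal essay-organization scoring is the most likely setup.

    # Hedged usage sketch; the head configuration and score scale are assumptions.
    import torch
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    repo = "MayBashendy/ArabicNewSplits6_WithDuplicationsForScore5_FineTuningAraBERT_run2_AugV5_k6_task3_organization"
    tokenizer = AutoTokenizer.from_pretrained(repo)
    model = AutoModelForSequenceClassification.from_pretrained(repo)

    text = "هذا نص عربي تجريبي."  # hypothetical input text
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        score = model(**inputs).logits.squeeze().item()
    print(score)  # interpretation of the score scale is not documented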

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a hedged reconstruction as Trainer code follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
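
As a hedged reconstruction, the setup above maps onto Hugging Face Trainer code roughly as follows. Only the listed hyperparameters come from this card; the dataset, number of labels, and evaluation cadence are unknown or inferred and left as placeholders.

    # Hedged reconstruction of the training setup; dataset and head are assumptions.
    from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                              Trainer, TrainingArguments)

    base = "aubmindlab/bert-base-arabertv02"
    tokenizer = AutoTokenizer.from_pretrained(base)
    model = AutoModelForSequenceClassification.from_pretrained(base, num_labels=1)  # assumed regression head

    args = TrainingArguments(
        output_dir="out",
        learning_rate=2e-5,
        per_device_train_batch_size=8,
        per_device_eval_batch_size=8,
        seed=42,
        num_train_epochs=10,
        lr_scheduler_type="linear",
        # Adam betas=(0.9, 0.999) and epsilon=1e-8 are the Transformers defaults
        eval_strategy="steps",
        eval_steps=2,  # matches the 2-step evaluation cadence in the log below
    )
    # trainer = Trainer(model=model, args=args,
    #                   train_dataset=...,  # not published with this card
    #                   eval_dataset=...)
    # trainer.train()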

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0606 2 3.3679 -0.0149 3.3679 1.8352
No log 0.1212 4 1.6985 -0.0130 1.6985 1.3033
No log 0.1818 6 1.0227 0.1128 1.0227 1.0113
No log 0.2424 8 0.9105 0.0744 0.9105 0.9542
No log 0.3030 10 0.6080 0.0388 0.6080 0.7798
No log 0.3636 12 0.5954 -0.0159 0.5954 0.7716
No log 0.4242 14 0.5688 0.0 0.5688 0.7542
No log 0.4848 16 0.6246 0.0850 0.6246 0.7903
No log 0.5455 18 1.2170 0.0149 1.2170 1.1032
No log 0.6061 20 0.7404 0.2000 0.7404 0.8604
No log 0.6667 22 0.5831 -0.0081 0.5831 0.7636
No log 0.7273 24 0.6769 0.0 0.6769 0.8227
No log 0.7879 26 0.7000 -0.0732 0.7000 0.8367
No log 0.8485 28 0.6690 -0.0159 0.6690 0.8179
No log 0.9091 30 0.6500 -0.0303 0.6500 0.8062
No log 0.9697 32 0.7191 0.2083 0.7191 0.8480
No log 1.0303 34 1.0044 0.0 1.0044 1.0022
No log 1.0909 36 0.9933 0.0345 0.9933 0.9966
No log 1.1515 38 0.7876 0.0991 0.7876 0.8874
No log 1.2121 40 0.6288 0.0909 0.6288 0.7929
No log 1.2727 42 0.5737 -0.0233 0.5737 0.7574
No log 1.3333 44 0.5742 -0.0233 0.5742 0.7578
No log 1.3939 46 0.5750 -0.0233 0.5750 0.7583
No log 1.4545 48 0.5919 0.1020 0.5919 0.7694
No log 1.5152 50 0.6712 0.1556 0.6712 0.8193
No log 1.5758 52 0.6495 0.1556 0.6495 0.8059
No log 1.6364 54 0.5886 0.1773 0.5886 0.7672
No log 1.6970 56 0.6992 0.0448 0.6992 0.8362
No log 1.7576 58 0.7537 0.0843 0.7537 0.8681
No log 1.8182 60 0.6155 0.0857 0.6155 0.7845
No log 1.8788 62 0.5893 0.1667 0.5893 0.7677
No log 1.9394 64 0.5739 0.2109 0.5739 0.7575
No log 2.0 66 0.5626 0.2941 0.5626 0.7501
No log 2.0606 68 0.5424 0.2821 0.5424 0.7365
No log 2.1212 70 0.5165 0.1888 0.5165 0.7187
No log 2.1818 72 0.5085 0.1579 0.5085 0.7131
No log 2.2424 74 0.5160 0.1688 0.5160 0.7183
No log 2.3030 76 0.6355 0.1429 0.6355 0.7972
No log 2.3636 78 0.6623 0.2340 0.6623 0.8138
No log 2.4242 80 0.5813 0.2000 0.5813 0.7624
No log 2.4848 82 0.5736 0.3831 0.5736 0.7574
No log 2.5455 84 0.6625 0.2212 0.6625 0.8139
No log 2.6061 86 1.0097 0.2243 1.0097 1.0049
No log 2.6667 88 1.0041 0.2243 1.0041 1.0020
No log 2.7273 90 0.8374 0.2258 0.8374 0.9151
No log 2.7879 92 0.6092 0.2881 0.6092 0.7805
No log 2.8485 94 0.6499 0.3251 0.6499 0.8062
No log 2.9091 96 0.7269 0.3231 0.7269 0.8526
No log 2.9697 98 0.7132 0.2581 0.7132 0.8445
No log 3.0303 100 0.7166 0.2269 0.7166 0.8465
No log 3.0909 102 0.6418 0.3909 0.6418 0.8011
No log 3.1515 104 0.8213 0.2868 0.8213 0.9063
No log 3.2121 106 0.8335 0.2959 0.8335 0.9130
No log 3.2727 108 0.8429 0.2741 0.8429 0.9181
No log 3.3333 110 1.0481 0.2738 1.0481 1.0238
No log 3.3939 112 1.1927 0.2593 1.1927 1.0921
No log 3.4545 114 1.1299 0.2727 1.1299 1.0630
No log 3.5152 116 1.1627 0.3012 1.1627 1.0783
No log 3.5758 118 1.2257 0.2779 1.2257 1.1071
No log 3.6364 120 1.2215 0.2762 1.2215 1.1052
No log 3.6970 122 0.6912 0.3798 0.6912 0.8314
No log 3.7576 124 0.6198 0.3466 0.6198 0.7873
No log 3.8182 126 0.9318 0.2490 0.9318 0.9653
No log 3.8788 128 1.5871 0.1591 1.5871 1.2598
No log 3.9394 130 1.3751 0.1402 1.3751 1.1727
No log 4.0 132 0.8650 0.2459 0.8650 0.9300
No log 4.0606 134 0.7838 0.2405 0.7838 0.8853
No log 4.1212 136 0.8417 0.2450 0.8417 0.9175
No log 4.1818 138 0.9031 0.2441 0.9031 0.9503
No log 4.2424 140 1.2605 0.1672 1.2605 1.1227
No log 4.3030 142 1.2579 0.1890 1.2579 1.1216
No log 4.3636 144 0.9990 0.1942 0.9990 0.9995
No log 4.4242 146 0.8848 0.2797 0.8848 0.9406
No log 4.4848 148 0.7525 0.3701 0.7525 0.8675
No log 4.5455 150 0.8282 0.3667 0.8282 0.9101
No log 4.6061 152 1.0944 0.2104 1.0944 1.0462
No log 4.6667 154 1.1744 0.2399 1.1744 1.0837
No log 4.7273 156 0.9218 0.1304 0.9218 0.9601
No log 4.7879 158 0.7435 0.2281 0.7435 0.8623
No log 4.8485 160 0.7274 0.2000 0.7274 0.8529
No log 4.9091 162 1.0574 0.2258 1.0574 1.0283
No log 4.9697 164 1.1144 0.2975 1.1144 1.0556
No log 5.0303 166 1.1136 0.2603 1.1136 1.0553
No log 5.0909 168 0.8509 0.3114 0.8509 0.9225
No log 5.1515 170 1.0363 0.2583 1.0363 1.0180
No log 5.2121 172 1.1551 0.2821 1.1551 1.0748
No log 5.2727 174 1.5238 0.1956 1.5238 1.2344
No log 5.3333 176 1.5369 0.1957 1.5369 1.2397
No log 5.3939 178 1.0632 0.2516 1.0632 1.0311
No log 5.4545 180 0.8192 0.2868 0.8192 0.9051
No log 5.5152 182 0.9539 0.2571 0.9539 0.9767
No log 5.5758 184 1.3567 0.2663 1.3567 1.1648
No log 5.6364 186 1.3929 0.2184 1.3929 1.1802
No log 5.6970 188 1.2455 0.2762 1.2455 1.1160
No log 5.7576 190 0.8841 0.1880 0.8841 0.9403
No log 5.8182 192 0.6463 0.3537 0.6463 0.8040
No log 5.8788 194 0.6529 0.3939 0.6529 0.8080
No log 5.9394 196 0.8716 0.2409 0.8716 0.9336
No log 6.0 198 1.4524 0.1771 1.4524 1.2052
No log 6.0606 200 1.7324 0.1921 1.7324 1.3162
No log 6.1212 202 1.4885 0.1813 1.4885 1.2201
No log 6.1818 204 0.9733 0.2676 0.9733 0.9865
No log 6.2424 206 0.6196 0.3131 0.6196 0.7871
No log 6.3030 208 0.5559 0.4 0.5559 0.7456
No log 6.3636 210 0.5484 0.4286 0.5484 0.7405
No log 6.4242 212 0.6320 0.2487 0.6320 0.7950
No log 6.4848 214 0.8803 0.2481 0.8803 0.9383
No log 6.5455 216 1.0076 0.2664 1.0076 1.0038
No log 6.6061 218 0.9273 0.2688 0.9273 0.9630
No log 6.6667 220 0.8771 0.2993 0.8771 0.9365
No log 6.7273 222 0.7783 0.2727 0.7783 0.8822
No log 6.7879 224 0.7857 0.2959 0.7857 0.8864
No log 6.8485 226 1.0183 0.2603 1.0183 1.0091
No log 6.9091 228 1.5681 0.1958 1.5681 1.2522
No log 6.9697 230 1.8204 0.1959 1.8204 1.3492
No log 7.0303 232 1.8845 0.1695 1.8845 1.3728
No log 7.0909 234 1.6980 0.1959 1.6980 1.3031
No log 7.1515 236 1.2368 0.2566 1.2368 1.1121
No log 7.2121 238 0.9739 0.2703 0.9739 0.9869
No log 7.2727 240 0.9890 0.2491 0.9890 0.9945
No log 7.3333 242 1.0773 0.2552 1.0773 1.0379
No log 7.3939 244 1.0135 0.2552 1.0135 1.0067
No log 7.4545 246 1.0722 0.2688 1.0722 1.0355
No log 7.5152 248 0.9865 0.2624 0.9865 0.9932
No log 7.5758 250 0.8896 0.2060 0.8896 0.9432
No log 7.6364 252 0.8901 0.2060 0.8901 0.9434
No log 7.6970 254 1.0563 0.2688 1.0563 1.0277
No log 7.7576 256 1.2811 0.2368 1.2811 1.1318
No log 7.8182 258 1.2550 0.2368 1.2550 1.1203
No log 7.8788 260 1.0568 0.2688 1.0568 1.0280
No log 7.9394 262 0.8000 0.2640 0.8000 0.8944
No log 8.0 264 0.6578 0.3128 0.6578 0.8111
No log 8.0606 266 0.6490 0.3128 0.6490 0.8056
No log 8.1212 268 0.7290 0.3125 0.7290 0.8538
No log 8.1818 270 0.9436 0.2409 0.9436 0.9714
No log 8.2424 272 1.1966 0.2664 1.1966 1.0939
No log 8.3030 274 1.2752 0.2294 1.2752 1.1292
No log 8.3636 276 1.2170 0.2583 1.2170 1.1032
No log 8.4242 278 1.0472 0.2624 1.0472 1.0233
No log 8.4848 280 0.8688 0.2701 0.8688 0.9321
No log 8.5455 282 0.8053 0.2959 0.8053 0.8974
No log 8.6061 284 0.8375 0.2701 0.8375 0.9151
No log 8.6667 286 0.9199 0.2360 0.9199 0.9591
No log 8.7273 288 0.9928 0.2635 0.9928 0.9964
No log 8.7879 290 1.0460 0.2624 1.0460 1.0228
No log 8.8485 292 1.0284 0.2624 1.0284 1.0141
No log 8.9091 294 0.9879 0.2635 0.9879 0.9940
No log 8.9697 296 0.9667 0.2353 0.9667 0.9832
No log 9.0303 298 1.0107 0.2635 1.0107 1.0053
No log 9.0909 300 1.0040 0.2635 1.0040 1.0020
No log 9.1515 302 0.9674 0.2448 0.9674 0.9836
No log 9.2121 304 0.9504 0.2448 0.9504 0.9749
No log 9.2727 306 0.9288 0.2448 0.9288 0.9637
No log 9.3333 308 0.9143 0.2448 0.9143 0.9562
No log 9.3939 310 0.9316 0.2448 0.9316 0.9652
No log 9.4545 312 0.9567 0.2448 0.9567 0.9781
No log 9.5152 314 0.9916 0.2448 0.9916 0.9958
No log 9.5758 316 1.0011 0.2448 1.0011 1.0005
No log 9.6364 318 0.9932 0.2448 0.9932 0.9966
No log 9.6970 320 1.0049 0.2226 1.0049 1.0024
No log 9.7576 322 1.0202 0.25 1.0202 1.0101
No log 9.8182 324 1.0465 0.2491 1.0465 1.0230
No log 9.8788 326 1.0666 0.2624 1.0666 1.0327
No log 9.9394 328 1.0716 0.2624 1.0716 1.0352
No log 10.0 330 1.0749 0.2624 1.0749 1.0368

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
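
To approximate this environment, the listed versions can be pinned as below. The CUDA 11.8 wheel index for PyTorch is an assumption based on the +cu118 build tag; adjust it for your platform.

    pip install transformers==4.44.2 datasets==2.21.0 tokenizers==0.19.1
    pip install torch==2.4.0 --index-url https://download.pytorch.org/whl/cu118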