ArabicNewSplits4_WithDuplicationsForScore5_FineTuningAraBERT_run1_AugV5_k8_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a brief inference sketch follows the list):

  • Loss: 0.9514
  • QWK: 0.4805
  • MSE: 0.9514
  • RMSE: 0.9754
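As a usage aid, here is a minimal inference sketch. It assumes the checkpoint is published under the repository id shown in this card and that the task head is a single-output regression head (consistent with the MSE/RMSE/QWK metrics above); both assumptions should be checked against the actual model config.

```python
# Minimal inference sketch. Assumptions: repo id as in this card, and a
# single-logit regression head (verify num_labels in the model config).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo_id = "MayBashendy/ArabicNewSplits4_WithDuplicationsForScore5_FineTuningAraBERT_run1_AugV5_k8_task2_organization"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

text = "..."  # an Arabic response to score for organization
inputs = tokenizer(text, truncation=True, return_tensors="pt")
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(f"predicted organization score: {score:.3f}")
```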

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
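For reproducibility, the values above map onto a Hugging Face TrainingArguments object roughly as sketched below. This is an illustration, not the original training script; the output directory is a placeholder, and the dataset/Trainer wiring is not published.

```python
# Sketch of the reported hyperparameters as TrainingArguments.
# output_dir is a placeholder; the original training script is unavailable.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_task2_organization",  # placeholder
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=10,
)
```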

Training results

Training Loss  Epoch  Step  Validation Loss  QWK  MSE  RMSE
No log 0.0465 2 4.0935 -0.0066 4.0935 2.0232
No log 0.0930 4 1.9138 0.0445 1.9138 1.3834
No log 0.1395 6 1.1022 0.0528 1.1022 1.0499
No log 0.1860 8 0.7413 0.1994 0.7413 0.8610
No log 0.2326 10 0.7105 0.1404 0.7105 0.8429
No log 0.2791 12 0.7839 0.0973 0.7839 0.8854
No log 0.3256 14 0.7566 0.2057 0.7566 0.8698
No log 0.3721 16 0.7381 0.2326 0.7381 0.8591
No log 0.4186 18 1.0099 0.1561 1.0099 1.0049
No log 0.4651 20 0.8718 0.2516 0.8718 0.9337
No log 0.5116 22 0.7493 0.4404 0.7493 0.8656
No log 0.5581 24 0.9393 0.3345 0.9393 0.9692
No log 0.6047 26 0.8570 0.3645 0.8570 0.9257
No log 0.6512 28 0.6168 0.3937 0.6168 0.7854
No log 0.6977 30 0.7384 0.3492 0.7384 0.8593
No log 0.7442 32 1.4401 0.1855 1.4401 1.2000
No log 0.7907 34 1.5488 0.1677 1.5488 1.2445
No log 0.8372 36 1.1609 0.1948 1.1609 1.0775
No log 0.8837 38 0.7304 0.2251 0.7304 0.8546
No log 0.9302 40 0.5585 0.4341 0.5585 0.7473
No log 0.9767 42 0.5985 0.2518 0.5985 0.7736
No log 1.0233 44 0.6611 0.4090 0.6611 0.8131
No log 1.0698 46 0.5406 0.4528 0.5406 0.7352
No log 1.1163 48 0.5839 0.4599 0.5839 0.7641
No log 1.1628 50 0.6402 0.4892 0.6402 0.8001
No log 1.2093 52 0.5670 0.4745 0.5670 0.7530
No log 1.2558 54 0.6860 0.5033 0.6860 0.8283
No log 1.3023 56 0.6249 0.5114 0.6249 0.7905
No log 1.3488 58 0.6293 0.4994 0.6293 0.7933
No log 1.3953 60 0.6981 0.5238 0.6981 0.8355
No log 1.4419 62 0.7186 0.5356 0.7186 0.8477
No log 1.4884 64 0.7173 0.5405 0.7173 0.8469
No log 1.5349 66 0.7787 0.5272 0.7787 0.8825
No log 1.5814 68 0.8370 0.5370 0.8370 0.9149
No log 1.6279 70 0.8667 0.5435 0.8667 0.9309
No log 1.6744 72 0.9079 0.5470 0.9079 0.9529
No log 1.7209 74 0.9950 0.4944 0.9950 0.9975
No log 1.7674 76 1.0303 0.4854 1.0303 1.0150
No log 1.8140 78 0.8371 0.5171 0.8371 0.9149
No log 1.8605 80 0.6783 0.5455 0.6783 0.8236
No log 1.9070 82 0.7106 0.5721 0.7106 0.8429
No log 1.9535 84 0.7784 0.5547 0.7784 0.8823
No log 2.0 86 0.7978 0.5329 0.7978 0.8932
No log 2.0465 88 1.0051 0.5368 1.0051 1.0025
No log 2.0930 90 1.1312 0.5086 1.1312 1.0636
No log 2.1395 92 1.2032 0.5212 1.2032 1.0969
No log 2.1860 94 1.1351 0.5016 1.1351 1.0654
No log 2.2326 96 1.0838 0.5320 1.0838 1.0410
No log 2.2791 98 1.1592 0.5002 1.1592 1.0767
No log 2.3256 100 1.3532 0.4712 1.3532 1.1633
No log 2.3721 102 1.4669 0.4643 1.4669 1.2112
No log 2.4186 104 1.5285 0.4541 1.5285 1.2363
No log 2.4651 106 1.4237 0.4622 1.4237 1.1932
No log 2.5116 108 1.1630 0.4945 1.1630 1.0784
No log 2.5581 110 1.1606 0.4907 1.1606 1.0773
No log 2.6047 112 1.2498 0.4903 1.2498 1.1179
No log 2.6512 114 1.3025 0.4516 1.3025 1.1413
No log 2.6977 116 1.1241 0.4771 1.1241 1.0603
No log 2.7442 118 0.8603 0.5440 0.8603 0.9275
No log 2.7907 120 0.8279 0.5014 0.8279 0.9099
No log 2.8372 122 0.8680 0.5287 0.8680 0.9317
No log 2.8837 124 1.0530 0.4939 1.0530 1.0261
No log 2.9302 126 1.1801 0.5084 1.1801 1.0863
No log 2.9767 128 1.2872 0.4552 1.2872 1.1345
No log 3.0233 130 1.4847 0.4264 1.4847 1.2185
No log 3.0698 132 1.2394 0.4665 1.2394 1.1133
No log 3.1163 134 0.9095 0.5319 0.9095 0.9537
No log 3.1628 136 0.8454 0.5577 0.8454 0.9195
No log 3.2093 138 0.8765 0.5058 0.8765 0.9362
No log 3.2558 140 0.9802 0.4473 0.9802 0.9901
No log 3.3023 142 0.9726 0.4578 0.9726 0.9862
No log 3.3488 144 0.9799 0.4578 0.9799 0.9899
No log 3.3953 146 0.8457 0.4778 0.8457 0.9196
No log 3.4419 148 0.7345 0.4880 0.7345 0.8570
No log 3.4884 150 0.7483 0.4536 0.7483 0.8650
No log 3.5349 152 0.9518 0.5154 0.9518 0.9756
No log 3.5814 154 1.0169 0.4836 1.0169 1.0084
No log 3.6279 156 1.0187 0.4855 1.0187 1.0093
No log 3.6744 158 1.2570 0.4550 1.2570 1.1211
No log 3.7209 160 1.3338 0.4563 1.3338 1.1549
No log 3.7674 162 1.1499 0.4358 1.1499 1.0723
No log 3.8140 164 1.0531 0.4832 1.0531 1.0262
No log 3.8605 166 1.0405 0.4980 1.0405 1.0201
No log 3.9070 168 1.0616 0.5008 1.0616 1.0303
No log 3.9535 170 1.0710 0.4939 1.0710 1.0349
No log 4.0 172 1.0761 0.4754 1.0761 1.0374
No log 4.0465 174 0.9919 0.4906 0.9919 0.9959
No log 4.0930 176 1.0499 0.4470 1.0499 1.0246
No log 4.1395 178 1.0437 0.4594 1.0437 1.0216
No log 4.1860 180 0.9775 0.4680 0.9775 0.9887
No log 4.2326 182 0.9607 0.4860 0.9607 0.9802
No log 4.2791 184 0.9503 0.4795 0.9503 0.9749
No log 4.3256 186 0.9126 0.4856 0.9126 0.9553
No log 4.3721 188 0.9666 0.4793 0.9666 0.9832
No log 4.4186 190 1.1184 0.4645 1.1184 1.0575
No log 4.4651 192 1.1725 0.4534 1.1725 1.0828
No log 4.5116 194 1.1407 0.4606 1.1407 1.0680
No log 4.5581 196 1.1762 0.4769 1.1762 1.0845
No log 4.6047 198 1.1876 0.4739 1.1876 1.0898
No log 4.6512 200 1.1229 0.4767 1.1229 1.0597
No log 4.6977 202 1.0990 0.4751 1.0990 1.0484
No log 4.7442 204 1.1546 0.4507 1.1546 1.0745
No log 4.7907 206 1.1906 0.4383 1.1906 1.0912
No log 4.8372 208 1.2210 0.4291 1.2210 1.1050
No log 4.8837 210 1.1531 0.4291 1.1531 1.0738
No log 4.9302 212 1.1355 0.4373 1.1355 1.0656
No log 4.9767 214 1.0266 0.4909 1.0266 1.0132
No log 5.0233 216 1.0417 0.4936 1.0417 1.0206
No log 5.0698 218 1.0596 0.4918 1.0596 1.0294
No log 5.1163 220 1.1027 0.4669 1.1027 1.0501
No log 5.1628 222 0.9974 0.5229 0.9974 0.9987
No log 5.2093 224 0.8957 0.4811 0.8957 0.9464
No log 5.2558 226 0.8797 0.4939 0.8797 0.9379
No log 5.3023 228 0.8885 0.5094 0.8885 0.9426
No log 5.3488 230 1.0105 0.5019 1.0105 1.0053
No log 5.3953 232 1.2182 0.4578 1.2182 1.1037
No log 5.4419 234 1.2267 0.4578 1.2267 1.1076
No log 5.4884 236 1.0596 0.5238 1.0596 1.0293
No log 5.5349 238 0.8892 0.5055 0.8892 0.9430
No log 5.5814 240 0.8331 0.4754 0.8331 0.9127
No log 5.6279 242 0.8299 0.4968 0.8299 0.9110
No log 5.6744 244 0.8544 0.4591 0.8544 0.9243
No log 5.7209 246 0.9444 0.5026 0.9444 0.9718
No log 5.7674 248 1.1352 0.4711 1.1352 1.0655
No log 5.8140 250 1.1616 0.4734 1.1616 1.0778
No log 5.8605 252 1.0160 0.4991 1.0160 1.0080
No log 5.9070 254 0.8532 0.4870 0.8532 0.9237
No log 5.9535 256 0.8139 0.4703 0.8139 0.9022
No log 6.0 258 0.8435 0.4900 0.8435 0.9184
No log 6.0465 260 0.9643 0.5162 0.9643 0.9820
No log 6.0930 262 1.0288 0.4749 1.0288 1.0143
No log 6.1395 264 1.0684 0.4642 1.0684 1.0336
No log 6.1860 266 1.1624 0.4552 1.1624 1.0781
No log 6.2326 268 1.1382 0.4513 1.1382 1.0669
No log 6.2791 270 1.0825 0.4491 1.0825 1.0404
No log 6.3256 272 0.9935 0.4771 0.9935 0.9968
No log 6.3721 274 0.9206 0.5085 0.9206 0.9595
No log 6.4186 276 0.8746 0.5185 0.8746 0.9352
No log 6.4651 278 0.8649 0.5296 0.8649 0.9300
No log 6.5116 280 0.9178 0.5071 0.9178 0.9580
No log 6.5581 282 0.9910 0.4792 0.9910 0.9955
No log 6.6047 284 1.0791 0.4652 1.0791 1.0388
No log 6.6512 286 1.0514 0.4757 1.0514 1.0254
No log 6.6977 288 0.9236 0.5018 0.9236 0.9611
No log 6.7442 290 0.8445 0.5221 0.8445 0.9190
No log 6.7907 292 0.7663 0.5391 0.7663 0.8754
No log 6.8372 294 0.7491 0.5465 0.7491 0.8655
No log 6.8837 296 0.7856 0.5215 0.7856 0.8863
No log 6.9302 298 0.7994 0.5295 0.7994 0.8941
No log 6.9767 300 0.8412 0.5158 0.8412 0.9171
No log 7.0233 302 0.8576 0.5155 0.8576 0.9260
No log 7.0698 304 0.9272 0.5170 0.9272 0.9629
No log 7.1163 306 0.9864 0.5146 0.9864 0.9932
No log 7.1628 308 1.0309 0.5067 1.0309 1.0153
No log 7.2093 310 0.9661 0.5141 0.9661 0.9829
No log 7.2558 312 0.8843 0.5210 0.8843 0.9404
No log 7.3023 314 0.8394 0.5445 0.8394 0.9162
No log 7.3488 316 0.8252 0.5323 0.8252 0.9084
No log 7.3953 318 0.8012 0.5141 0.8012 0.8951
No log 7.4419 320 0.7940 0.5227 0.7940 0.8911
No log 7.4884 322 0.8162 0.5248 0.8162 0.9034
No log 7.5349 324 0.8514 0.5165 0.8514 0.9227
No log 7.5814 326 0.8699 0.5019 0.8699 0.9327
No log 7.6279 328 0.8452 0.5103 0.8452 0.9194
No log 7.6744 330 0.7891 0.5319 0.7891 0.8883
No log 7.7209 332 0.7431 0.5352 0.7431 0.8620
No log 7.7674 334 0.7319 0.5203 0.7319 0.8555
No log 7.8140 336 0.7551 0.5453 0.7551 0.8690
No log 7.8605 338 0.8226 0.5235 0.8226 0.9070
No log 7.9070 340 0.9366 0.4915 0.9366 0.9678
No log 7.9535 342 1.0894 0.4609 1.0894 1.0437
No log 8.0 344 1.2414 0.4355 1.2414 1.1142
No log 8.0465 346 1.2789 0.4315 1.2789 1.1309
No log 8.0930 348 1.2323 0.4419 1.2323 1.1101
No log 8.1395 350 1.1636 0.4573 1.1636 1.0787
No log 8.1860 352 1.1035 0.4670 1.1035 1.0505
No log 8.2326 354 1.0398 0.4690 1.0398 1.0197
No log 8.2791 356 0.9638 0.4965 0.9638 0.9817
No log 8.3256 358 0.9097 0.4876 0.9097 0.9538
No log 8.3721 360 0.8593 0.5208 0.8593 0.9270
No log 8.4186 362 0.8427 0.5315 0.8427 0.9180
No log 8.4651 364 0.8363 0.5315 0.8363 0.9145
No log 8.5116 366 0.8355 0.5315 0.8355 0.9141
No log 8.5581 368 0.8352 0.5366 0.8352 0.9139
No log 8.6047 370 0.8393 0.5366 0.8393 0.9161
No log 8.6512 372 0.8598 0.5131 0.8598 0.9273
No log 8.6977 374 0.9093 0.5198 0.9093 0.9536
No log 8.7442 376 0.9599 0.4809 0.9599 0.9797
No log 8.7907 378 0.9828 0.4799 0.9828 0.9914
No log 8.8372 380 0.9794 0.4911 0.9794 0.9896
No log 8.8837 382 0.9417 0.5041 0.9417 0.9704
No log 8.9302 384 0.8983 0.5244 0.8983 0.9478
No log 8.9767 386 0.8733 0.5545 0.8733 0.9345
No log 9.0233 388 0.8778 0.5533 0.8778 0.9369
No log 9.0698 390 0.8821 0.5300 0.8821 0.9392
No log 9.1163 392 0.9021 0.5073 0.9021 0.9498
No log 9.1628 394 0.9479 0.4826 0.9479 0.9736
No log 9.2093 396 0.9851 0.4855 0.9851 0.9925
No log 9.2558 398 1.0116 0.4799 1.0116 1.0058
No log 9.3023 400 1.0209 0.4740 1.0209 1.0104
No log 9.3488 402 1.0187 0.4740 1.0187 1.0093
No log 9.3953 404 1.0139 0.4740 1.0139 1.0069
No log 9.4419 406 1.0231 0.4782 1.0231 1.0115
No log 9.4884 408 1.0264 0.4777 1.0264 1.0131
No log 9.5349 410 1.0172 0.4740 1.0172 1.0086
No log 9.5814 412 1.0077 0.4740 1.0077 1.0038
No log 9.6279 414 0.9957 0.4744 0.9957 0.9978
No log 9.6744 416 0.9915 0.4744 0.9915 0.9958
No log 9.7209 418 0.9856 0.4744 0.9856 0.9928
No log 9.7674 420 0.9750 0.4799 0.9750 0.9874
No log 9.8140 422 0.9674 0.4799 0.9674 0.9836
No log 9.8605 424 0.9591 0.4805 0.9591 0.9793
No log 9.9070 426 0.9550 0.4805 0.9550 0.9772
No log 9.9535 428 0.9523 0.4805 0.9523 0.9758
No log 10.0 430 0.9514 0.4805 0.9514 0.9754
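The QWK column above is the quadratic weighted kappa. A sketch of how these three metrics can be computed (for example inside a Trainer compute_metrics callback) is shown below; rounding the regression outputs to integer labels before computing kappa is an assumption about the evaluation setup, not a documented detail.

```python
# Sketch of the reported evaluation metrics: QWK, MSE, RMSE.
# Rounding predictions to integer labels before kappa is an assumption.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(eval_pred):
    preds, labels = eval_pred
    preds = np.asarray(preds).squeeze()
    mse = mean_squared_error(labels, preds)
    qwk = cohen_kappa_score(
        np.round(labels).astype(int),
        np.round(preds).astype(int),
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse, "rmse": np.sqrt(mse)}
```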

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
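To reproduce this environment, a pinned install along the following lines should work; the CUDA 11.8 build of PyTorch is pulled from the matching wheel index, which may need adjusting for other hardware.

```bash
pip install transformers==4.44.2 datasets==2.21.0 tokenizers==0.19.1
pip install torch==2.4.0 --index-url https://download.pytorch.org/whl/cu118
```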