ArabicNewSplits4_WithDuplicationsForScore5_FineTuningAraBERT_run3_AugV5_k9_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a short sketch of how these metrics can be reproduced follows the list):

  • Loss: 1.0812
  • Qwk: 0.5921
  • Mse: 1.0812
  • Rmse: 1.0398
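
Here, Qwk is the Quadratic Weighted Kappa between predicted and gold scores, and Mse/Rmse are the (root) mean squared error. The snippet below is a minimal sketch, not taken from the training code, of how these metrics could be computed with scikit-learn; the gold and predicted score arrays are hypothetical.

```python
# Minimal sketch (not from the training script) of how Qwk, Mse and Rmse
# could be computed; the score arrays below are hypothetical.
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([3, 4, 2, 5, 3])            # hypothetical gold organization scores
y_pred = np.array([2.8, 4.2, 2.1, 4.6, 3.3])  # hypothetical continuous model outputs

mse = mean_squared_error(y_true, y_pred)
rmse = float(np.sqrt(mse))
# QWK is defined over discrete labels, so continuous outputs are rounded first.
qwk = cohen_kappa_score(y_true, np.rint(y_pred).astype(int), weights="quadratic")

print(f"Mse: {mse:.4f}  Rmse: {rmse:.4f}  Qwk: {qwk:.4f}")
```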

Model description

More information needed

Intended uses & limitations

More information needed
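
No usage details are provided in the card. As a hedged starting point, the sketch below assumes the checkpoint exposes a standard sequence-classification head with a single regression-style output (consistent with the MSE-based evaluation loss); verify the head configuration before relying on it.

```python
# Hedged usage sketch; assumes a single-output (regression-style) classification head.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MayBashendy/ArabicNewSplits4_WithDuplicationsForScore5_FineTuningAraBERT_run3_AugV5_k9_task1_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "نص مقال تجريبي لتقييم التنظيم"  # hypothetical Arabic essay text
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
score = logits.squeeze().item()  # single continuous organization score (assumed)
print(f"Predicted organization score: {score:.2f}")
```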

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of the corresponding TrainingArguments follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
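
The sketch below shows how these settings might map onto transformers TrainingArguments; the output directory and the evaluation cadence (every 2 steps, inferred from the results table) are assumptions, not taken from the original training script.

```python
# Sketch only; mirrors the listed hyperparameters, everything else is an assumption.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_task1_organization",  # hypothetical output path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    # Adam betas=(0.9, 0.999) and epsilon=1e-08 match the Trainer's AdamW defaults,
    # so they need no explicit arguments here.
    eval_strategy="steps",
    eval_steps=2,  # the results table logs validation metrics every 2 steps
)
```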

Training results

Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse
No log 0.0408 2 5.2026 0.0066 5.2026 2.2809
No log 0.0816 4 3.1301 0.0535 3.1301 1.7692
No log 0.1224 6 1.9094 0.1114 1.9094 1.3818
No log 0.1633 8 1.3450 0.1369 1.3450 1.1597
No log 0.2041 10 1.3282 0.1208 1.3282 1.1525
No log 0.2449 12 1.4931 -0.0429 1.4931 1.2219
No log 0.2857 14 1.3362 0.1435 1.3362 1.1560
No log 0.3265 16 1.3547 0.1158 1.3547 1.1639
No log 0.3673 18 1.3932 0.1112 1.3932 1.1803
No log 0.4082 20 1.4061 0.0915 1.4061 1.1858
No log 0.4490 22 1.3700 0.0353 1.3700 1.1705
No log 0.4898 24 1.3034 0.0955 1.3034 1.1417
No log 0.5306 26 1.2875 0.0872 1.2875 1.1347
No log 0.5714 28 1.2623 0.0788 1.2623 1.1235
No log 0.6122 30 1.2349 0.1221 1.2349 1.1112
No log 0.6531 32 1.1849 0.1466 1.1849 1.0885
No log 0.6939 34 1.1179 0.1973 1.1179 1.0573
No log 0.7347 36 1.0662 0.2207 1.0662 1.0326
No log 0.7755 38 1.0740 0.2132 1.0740 1.0364
No log 0.8163 40 1.1851 0.2331 1.1851 1.0886
No log 0.8571 42 1.1840 0.2331 1.1840 1.0881
No log 0.8980 44 1.1893 0.2153 1.1893 1.0905
No log 0.9388 46 1.0998 0.2386 1.0998 1.0487
No log 0.9796 48 1.0645 0.3497 1.0645 1.0318
No log 1.0204 50 1.0117 0.3625 1.0117 1.0058
No log 1.0612 52 0.9843 0.3576 0.9843 0.9921
No log 1.1020 54 1.3113 0.2870 1.3113 1.1451
No log 1.1429 56 1.4150 0.1927 1.4150 1.1895
No log 1.1837 58 1.1725 0.3282 1.1725 1.0828
No log 1.2245 60 0.9596 0.4095 0.9596 0.9796
No log 1.2653 62 0.8661 0.5052 0.8661 0.9307
No log 1.3061 64 0.9211 0.4538 0.9211 0.9598
No log 1.3469 66 0.9679 0.4200 0.9679 0.9838
No log 1.3878 68 1.0886 0.4487 1.0886 1.0433
No log 1.4286 70 1.3508 0.4924 1.3508 1.1622
No log 1.4694 72 1.5309 0.4667 1.5309 1.2373
No log 1.5102 74 1.4642 0.4714 1.4642 1.2101
No log 1.5510 76 1.1622 0.5305 1.1622 1.0781
No log 1.5918 78 0.8606 0.5622 0.8606 0.9277
No log 1.6327 80 0.8405 0.5805 0.8405 0.9168
No log 1.6735 82 0.8001 0.5861 0.8001 0.8945
No log 1.7143 84 0.7516 0.6243 0.7516 0.8670
No log 1.7551 86 0.8561 0.6022 0.8561 0.9252
No log 1.7959 88 1.2933 0.4740 1.2933 1.1372
No log 1.8367 90 1.6440 0.3726 1.6440 1.2822
No log 1.8776 92 1.8078 0.3776 1.8078 1.3446
No log 1.9184 94 1.8764 0.4292 1.8764 1.3698
No log 1.9592 96 1.3065 0.4824 1.3065 1.1430
No log 2.0 98 1.1290 0.4012 1.1290 1.0626
No log 2.0408 100 1.1680 0.4626 1.1680 1.0808
No log 2.0816 102 1.1561 0.4822 1.1561 1.0752
No log 2.1224 104 1.0517 0.5437 1.0517 1.0255
No log 2.1633 106 0.9990 0.5497 0.9990 0.9995
No log 2.2041 108 0.9047 0.6186 0.9047 0.9512
No log 2.2449 110 0.9683 0.6047 0.9683 0.9840
No log 2.2857 112 1.1094 0.5511 1.1094 1.0533
No log 2.3265 114 1.4207 0.5180 1.4207 1.1920
No log 2.3673 116 1.4293 0.5018 1.4293 1.1955
No log 2.4082 118 1.3031 0.4914 1.3031 1.1415
No log 2.4490 120 1.2813 0.4845 1.2813 1.1319
No log 2.4898 122 1.2220 0.4791 1.2220 1.1054
No log 2.5306 124 1.1165 0.4540 1.1165 1.0566
No log 2.5714 126 1.1032 0.4420 1.1032 1.0503
No log 2.6122 128 1.3042 0.4843 1.3042 1.1420
No log 2.6531 130 1.3514 0.5108 1.3514 1.1625
No log 2.6939 132 1.2830 0.5647 1.2830 1.1327
No log 2.7347 134 1.3589 0.5311 1.3589 1.1657
No log 2.7755 136 1.4507 0.4927 1.4507 1.2044
No log 2.8163 138 1.6903 0.4485 1.6903 1.3001
No log 2.8571 140 1.8595 0.4120 1.8595 1.3636
No log 2.8980 142 1.6734 0.4673 1.6734 1.2936
No log 2.9388 144 1.3400 0.5516 1.3400 1.1576
No log 2.9796 146 1.0302 0.6016 1.0302 1.0150
No log 3.0204 148 0.9055 0.6014 0.9055 0.9516
No log 3.0612 150 0.9427 0.5992 0.9427 0.9709
No log 3.1020 152 1.1458 0.5615 1.1458 1.0704
No log 3.1429 154 1.4309 0.5416 1.4309 1.1962
No log 3.1837 156 1.4665 0.5179 1.4665 1.2110
No log 3.2245 158 1.5477 0.5221 1.5477 1.2441
No log 3.2653 160 1.5761 0.5058 1.5761 1.2554
No log 3.3061 162 1.5974 0.4779 1.5974 1.2639
No log 3.3469 164 1.5196 0.4936 1.5196 1.2327
No log 3.3878 166 1.4002 0.4860 1.4002 1.1833
No log 3.4286 168 1.3601 0.5070 1.3601 1.1662
No log 3.4694 170 1.3702 0.5276 1.3702 1.1705
No log 3.5102 172 1.4029 0.5284 1.4029 1.1844
No log 3.5510 174 1.2713 0.5542 1.2713 1.1275
No log 3.5918 176 1.2751 0.5601 1.2751 1.1292
No log 3.6327 178 1.3462 0.5498 1.3462 1.1603
No log 3.6735 180 1.3167 0.5557 1.3167 1.1475
No log 3.7143 182 1.3193 0.5458 1.3193 1.1486
No log 3.7551 184 1.2153 0.5385 1.2153 1.1024
No log 3.7959 186 1.0365 0.5795 1.0365 1.0181
No log 3.8367 188 0.9592 0.5892 0.9592 0.9794
No log 3.8776 190 1.0017 0.5742 1.0017 1.0009
No log 3.9184 192 1.1063 0.5823 1.1063 1.0518
No log 3.9592 194 1.1622 0.5793 1.1622 1.0781
No log 4.0 196 1.2111 0.5695 1.2111 1.1005
No log 4.0408 198 1.3431 0.5210 1.3431 1.1589
No log 4.0816 200 1.3569 0.5351 1.3569 1.1648
No log 4.1224 202 1.4802 0.5259 1.4802 1.2166
No log 4.1633 204 1.4768 0.5168 1.4768 1.2152
No log 4.2041 206 1.2322 0.5298 1.2322 1.1101
No log 4.2449 208 1.0974 0.5826 1.0974 1.0476
No log 4.2857 210 0.9779 0.5861 0.9779 0.9889
No log 4.3265 212 0.9948 0.5581 0.9948 0.9974
No log 4.3673 214 1.1260 0.4804 1.1260 1.0611
No log 4.4082 216 1.3369 0.4689 1.3369 1.1562
No log 4.4490 218 1.6210 0.4654 1.6210 1.2732
No log 4.4898 220 1.8149 0.4648 1.8149 1.3472
No log 4.5306 222 1.6651 0.4732 1.6651 1.2904
No log 4.5714 224 1.3449 0.5117 1.3449 1.1597
No log 4.6122 226 1.1716 0.5421 1.1716 1.0824
No log 4.6531 228 1.1193 0.5507 1.1193 1.0580
No log 4.6939 230 1.0734 0.5720 1.0734 1.0361
No log 4.7347 232 1.0356 0.5716 1.0356 1.0177
No log 4.7755 234 0.9953 0.5770 0.9953 0.9977
No log 4.8163 236 0.9830 0.5824 0.9830 0.9915
No log 4.8571 238 1.0215 0.5935 1.0215 1.0107
No log 4.8980 240 1.0381 0.5871 1.0381 1.0189
No log 4.9388 242 1.0747 0.5860 1.0747 1.0367
No log 4.9796 244 0.9699 0.6078 0.9699 0.9848
No log 5.0204 246 0.8597 0.5976 0.8597 0.9272
No log 5.0612 248 0.8502 0.6289 0.8502 0.9221
No log 5.1020 250 0.9244 0.6246 0.9244 0.9615
No log 5.1429 252 0.9783 0.6114 0.9783 0.9891
No log 5.1837 254 0.9586 0.6178 0.9586 0.9791
No log 5.2245 256 0.9606 0.6285 0.9606 0.9801
No log 5.2653 258 0.9427 0.6071 0.9427 0.9709
No log 5.3061 260 0.9438 0.6073 0.9438 0.9715
No log 5.3469 262 0.9520 0.6073 0.9520 0.9757
No log 5.3878 264 0.9880 0.5779 0.9880 0.9940
No log 5.4286 266 1.0506 0.5875 1.0506 1.0250
No log 5.4694 268 1.0174 0.5824 1.0174 1.0087
No log 5.5102 270 0.9689 0.5623 0.9689 0.9843
No log 5.5510 272 0.9930 0.5726 0.9930 0.9965
No log 5.5918 274 1.0213 0.5629 1.0213 1.0106
No log 5.6327 276 0.9767 0.5669 0.9767 0.9883
No log 5.6735 278 0.9194 0.5653 0.9194 0.9589
No log 5.7143 280 0.9613 0.5684 0.9613 0.9805
No log 5.7551 282 1.0627 0.5635 1.0627 1.0309
No log 5.7959 284 1.0797 0.5671 1.0797 1.0391
No log 5.8367 286 1.0210 0.5997 1.0210 1.0104
No log 5.8776 288 0.9365 0.6173 0.9365 0.9677
No log 5.9184 290 0.8304 0.6569 0.8304 0.9112
No log 5.9592 292 0.7907 0.6462 0.7907 0.8892
No log 6.0 294 0.8282 0.6300 0.8282 0.9100
No log 6.0408 296 0.9303 0.5900 0.9303 0.9645
No log 6.0816 298 1.0076 0.5843 1.0076 1.0038
No log 6.1224 300 1.0589 0.5803 1.0589 1.0290
No log 6.1633 302 1.0621 0.5803 1.0621 1.0306
No log 6.2041 304 1.0175 0.5813 1.0175 1.0087
No log 6.2449 306 0.9500 0.5836 0.9500 0.9747
No log 6.2857 308 0.9491 0.5905 0.9491 0.9742
No log 6.3265 310 0.9941 0.5803 0.9941 0.9970
No log 6.3673 312 1.0776 0.5760 1.0776 1.0381
No log 6.4082 314 1.1062 0.5792 1.1062 1.0518
No log 6.4490 316 1.1635 0.5695 1.1635 1.0787
No log 6.4898 318 1.1765 0.5587 1.1765 1.0847
No log 6.5306 320 1.0906 0.5863 1.0906 1.0443
No log 6.5714 322 0.9998 0.5854 0.9998 0.9999
No log 6.6122 324 0.9682 0.5963 0.9682 0.9840
No log 6.6531 326 0.9592 0.6024 0.9592 0.9794
No log 6.6939 328 0.9981 0.5932 0.9981 0.9991
No log 6.7347 330 1.1110 0.5740 1.1110 1.0540
No log 6.7755 332 1.1835 0.5515 1.1835 1.0879
No log 6.8163 334 1.2501 0.5370 1.2501 1.1181
No log 6.8571 336 1.2636 0.5370 1.2636 1.1241
No log 6.8980 338 1.2005 0.5365 1.2005 1.0957
No log 6.9388 340 1.0809 0.5533 1.0809 1.0397
No log 6.9796 342 0.9431 0.5730 0.9431 0.9711
No log 7.0204 344 0.8914 0.6015 0.8914 0.9441
No log 7.0612 346 0.9036 0.6015 0.9036 0.9506
No log 7.1020 348 0.9725 0.5777 0.9725 0.9862
No log 7.1429 350 1.0669 0.5741 1.0669 1.0329
No log 7.1837 352 1.1796 0.5706 1.1796 1.0861
No log 7.2245 354 1.2615 0.5523 1.2615 1.1232
No log 7.2653 356 1.3855 0.5419 1.3855 1.1771
No log 7.3061 358 1.4646 0.5435 1.4646 1.2102
No log 7.3469 360 1.4602 0.5435 1.4602 1.2084
No log 7.3878 362 1.3500 0.5469 1.3500 1.1619
No log 7.4286 364 1.2081 0.5555 1.2081 1.0992
No log 7.4694 366 1.0998 0.5709 1.0998 1.0487
No log 7.5102 368 1.0045 0.5657 1.0045 1.0023
No log 7.5510 370 0.9608 0.6034 0.9608 0.9802
No log 7.5918 372 0.9531 0.6077 0.9531 0.9762
No log 7.6327 374 0.9933 0.5866 0.9933 0.9966
No log 7.6735 376 1.0515 0.5975 1.0515 1.0254
No log 7.7143 378 1.1681 0.5734 1.1681 1.0808
No log 7.7551 380 1.2995 0.5644 1.2995 1.1399
No log 7.7959 382 1.3716 0.5422 1.3716 1.1711
No log 7.8367 384 1.4246 0.5313 1.4246 1.1936
No log 7.8776 386 1.3987 0.5341 1.3987 1.1827
No log 7.9184 388 1.3283 0.5378 1.3283 1.1525
No log 7.9592 390 1.2292 0.5454 1.2292 1.1087
No log 8.0 392 1.1361 0.5723 1.1361 1.0659
No log 8.0408 394 1.0572 0.5768 1.0572 1.0282
No log 8.0816 396 1.0117 0.5932 1.0117 1.0058
No log 8.1224 398 1.0176 0.5892 1.0176 1.0088
No log 8.1633 400 1.0749 0.5576 1.0749 1.0368
No log 8.2041 402 1.1704 0.5695 1.1704 1.0818
No log 8.2449 404 1.2244 0.5572 1.2244 1.1065
No log 8.2857 406 1.2362 0.5476 1.2362 1.1118
No log 8.3265 408 1.2455 0.5500 1.2455 1.1160
No log 8.3673 410 1.2221 0.5547 1.2221 1.1055
No log 8.4082 412 1.1996 0.5531 1.1996 1.0952
No log 8.4490 414 1.1654 0.5613 1.1654 1.0795
No log 8.4898 416 1.1259 0.5663 1.1259 1.0611
No log 8.5306 418 1.1043 0.5747 1.1043 1.0509
No log 8.5714 420 1.0792 0.5714 1.0792 1.0389
No log 8.6122 422 1.0820 0.5747 1.0820 1.0402
No log 8.6531 424 1.0651 0.5767 1.0651 1.0320
No log 8.6939 426 1.0522 0.5776 1.0522 1.0258
No log 8.7347 428 1.0542 0.5851 1.0542 1.0267
No log 8.7755 430 1.0530 0.5965 1.0530 1.0262
No log 8.8163 432 1.0707 0.5913 1.0707 1.0347
No log 8.8571 434 1.1188 0.5747 1.1188 1.0577
No log 8.8980 436 1.1408 0.5666 1.1408 1.0681
No log 8.9388 438 1.1478 0.5667 1.1478 1.0714
No log 8.9796 440 1.1294 0.5666 1.1294 1.0627
No log 9.0204 442 1.1112 0.5716 1.1112 1.0541
No log 9.0612 444 1.0758 0.5880 1.0758 1.0372
No log 9.1020 446 1.0378 0.5871 1.0378 1.0187
No log 9.1429 448 1.0239 0.5740 1.0239 1.0119
No log 9.1837 450 1.0089 0.5740 1.0089 1.0045
No log 9.2245 452 0.9995 0.5740 0.9995 0.9997
No log 9.2653 454 0.9976 0.5740 0.9976 0.9988
No log 9.3061 456 1.0006 0.5740 1.0006 1.0003
No log 9.3469 458 1.0115 0.5740 1.0115 1.0057
No log 9.3878 460 1.0208 0.5841 1.0208 1.0104
No log 9.4286 462 1.0286 0.5907 1.0286 1.0142
No log 9.4694 464 1.0319 0.5904 1.0319 1.0158
No log 9.5102 466 1.0262 0.5943 1.0262 1.0130
No log 9.5510 468 1.0311 0.5943 1.0311 1.0154
No log 9.5918 470 1.0402 0.5943 1.0402 1.0199
No log 9.6327 472 1.0512 0.5921 1.0512 1.0253
No log 9.6735 474 1.0609 0.5921 1.0609 1.0300
No log 9.7143 476 1.0701 0.5880 1.0701 1.0344
No log 9.7551 478 1.0719 0.5880 1.0719 1.0353
No log 9.7959 480 1.0764 0.5880 1.0764 1.0375
No log 9.8367 482 1.0785 0.5880 1.0785 1.0385
No log 9.8776 484 1.0807 0.5880 1.0807 1.0396
No log 9.9184 486 1.0810 0.5921 1.0810 1.0397
No log 9.9592 488 1.0815 0.5921 1.0815 1.0399
No log 10.0 490 1.0812 0.5921 1.0812 1.0398

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1