ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k17_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the card's "None" placeholder). It achieves the following results on the evaluation set:

  • Loss: 1.2100
  • Qwk (quadratic weighted kappa): 0.2876
  • Mse (mean squared error): 1.2100
  • Rmse (root mean squared error): 1.1000
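The three regression-style metrics are related: Rmse is just the square root of Mse (here 1.1000 = √1.2100), while Qwk measures ordinal agreement. A minimal sketch of how these metrics are computed, on hypothetical toy labels rather than the actual evaluation set:

```python
import math
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Qwk = 1 - sum(w*O) / sum(w*E), with quadratic weights w_ij = (i-j)^2 / (N-1)^2."""
    obs = Counter(zip(y_true, y_pred))   # observed confusion counts O_ij
    t_marg = Counter(y_true)             # row marginals
    p_marg = Counter(y_pred)             # column marginals
    n = len(y_true)
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2
            num += w * obs[(i, j)]
            den += w * t_marg[i] * p_marg[j] / n   # expected counts E_ij under independence
    return 1.0 - num / den

# Hypothetical integer scores standing in for the real evaluation data.
y_true = [0, 1, 2, 2, 3, 1, 0, 2]
y_pred = [0, 2, 2, 1, 3, 1, 1, 2]

mse = sum((a - b) ** 2 for a, b in zip(y_true, y_pred)) / len(y_true)
rmse = math.sqrt(mse)   # Rmse is sqrt(Mse), which is why the two columns track each other
qwk = quadratic_weighted_kappa(y_true, y_pred, n_classes=4)
```

`sklearn.metrics.cohen_kappa_score(y_true, y_pred, weights="quadratic")` computes the same Qwk value.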

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0385 2 4.9040 0.0010 4.9040 2.2145
No log 0.0769 4 3.0627 0.0048 3.0627 1.7500
No log 0.1154 6 2.4011 -0.0464 2.4011 1.5495
No log 0.1538 8 1.8626 -0.0303 1.8626 1.3648
No log 0.1923 10 1.4342 -0.0870 1.4342 1.1976
No log 0.2308 12 1.2580 0.0434 1.2580 1.1216
No log 0.2692 14 1.2598 0.1567 1.2598 1.1224
No log 0.3077 16 1.2803 0.1173 1.2803 1.1315
No log 0.3462 18 1.4674 0.0331 1.4674 1.2114
No log 0.3846 20 1.9721 0.1169 1.9721 1.4043
No log 0.4231 22 2.4483 0.0682 2.4483 1.5647
No log 0.4615 24 2.3379 0.1971 2.3379 1.5290
No log 0.5 26 1.8792 0.0372 1.8792 1.3708
No log 0.5385 28 1.4911 0.0372 1.4911 1.2211
No log 0.5769 30 1.3394 0.0372 1.3394 1.1573
No log 0.6154 32 1.2733 0.0707 1.2733 1.1284
No log 0.6538 34 1.2618 0.2110 1.2618 1.1233
No log 0.6923 36 1.2262 0.1650 1.2262 1.1073
No log 0.7308 38 1.2139 0.1480 1.2139 1.1018
No log 0.7692 40 1.2171 0.0427 1.2171 1.1032
No log 0.8077 42 1.2621 -0.0616 1.2621 1.1234
No log 0.8462 44 1.4410 0.0811 1.4410 1.2004
No log 0.8846 46 1.5287 0.0169 1.5287 1.2364
No log 0.9231 48 1.5853 0.0169 1.5853 1.2591
No log 0.9615 50 1.6715 -0.0149 1.6715 1.2928
No log 1.0 52 1.6064 -0.0149 1.6064 1.2674
No log 1.0385 54 1.3479 -0.0251 1.3479 1.1610
No log 1.0769 56 1.2589 0.0788 1.2589 1.1220
No log 1.1154 58 1.2320 0.0872 1.2320 1.1099
No log 1.1538 60 1.2467 0.2188 1.2467 1.1165
No log 1.1923 62 1.2514 0.1752 1.2514 1.1187
No log 1.2308 64 1.2951 0.0070 1.2951 1.1380
No log 1.2692 66 1.5820 -0.0149 1.5820 1.2578
No log 1.3077 68 1.8021 0.0000 1.8021 1.3424
No log 1.3462 70 1.7045 0.0000 1.7045 1.3056
No log 1.3846 72 1.5558 0.0000 1.5558 1.2473
No log 1.4231 74 1.4201 0.0575 1.4201 1.1917
No log 1.4615 76 1.2857 0.0561 1.2857 1.1339
No log 1.5 78 1.1949 0.1968 1.1949 1.0931
No log 1.5385 80 1.1591 0.2293 1.1591 1.0766
No log 1.5769 82 1.1766 0.1857 1.1766 1.0847
No log 1.6154 84 1.2023 0.2678 1.2023 1.0965
No log 1.6538 86 1.2653 0.0768 1.2653 1.1249
No log 1.6923 88 1.3497 0.0561 1.3497 1.1617
No log 1.7308 90 1.4285 0.0776 1.4285 1.1952
No log 1.7692 92 1.3183 0.0613 1.3183 1.1482
No log 1.8077 94 1.1894 0.1927 1.1894 1.0906
No log 1.8462 96 1.2058 0.1081 1.2058 1.0981
No log 1.8846 98 1.2317 0.1406 1.2317 1.1098
No log 1.9231 100 1.2830 0.1379 1.2830 1.1327
No log 1.9615 102 1.2864 0.1379 1.2864 1.1342
No log 2.0 104 1.3129 0.1596 1.3129 1.1458
No log 2.0385 106 1.2864 0.2273 1.2864 1.1342
No log 2.0769 108 1.2506 0.1919 1.2506 1.1183
No log 2.1154 110 1.3538 0.1354 1.3538 1.1635
No log 2.1538 112 1.2103 0.1772 1.2103 1.1001
No log 2.1923 114 1.1752 0.3343 1.1752 1.0841
No log 2.2308 116 1.2952 0.2173 1.2952 1.1381
No log 2.2692 118 1.1886 0.2922 1.1886 1.0902
No log 2.3077 120 1.1491 0.2200 1.1491 1.0719
No log 2.3462 122 1.1514 0.2155 1.1514 1.0730
No log 2.3846 124 1.1624 0.2111 1.1624 1.0781
No log 2.4231 126 1.1246 0.2268 1.1246 1.0605
No log 2.4615 128 1.0754 0.3431 1.0754 1.0370
No log 2.5 130 1.1293 0.3721 1.1293 1.0627
No log 2.5385 132 1.0627 0.3961 1.0627 1.0309
No log 2.5769 134 1.1847 0.2180 1.1847 1.0884
No log 2.6154 136 1.2730 0.2143 1.2730 1.1283
No log 2.6538 138 1.0708 0.2784 1.0708 1.0348
No log 2.6923 140 1.0110 0.2967 1.0110 1.0055
No log 2.7308 142 1.0189 0.2628 1.0189 1.0094
No log 2.7692 144 1.0933 0.1830 1.0933 1.0456
No log 2.8077 146 1.0779 0.1013 1.0779 1.0382
No log 2.8462 148 1.0821 0.2062 1.0821 1.0402
No log 2.8846 150 1.0982 0.2650 1.0982 1.0479
No log 2.9231 152 1.1965 0.2641 1.1965 1.0938
No log 2.9615 154 1.1351 0.3139 1.1351 1.0654
No log 3.0 156 1.0746 0.4763 1.0746 1.0366
No log 3.0385 158 1.1330 0.4218 1.1330 1.0644
No log 3.0769 160 1.0917 0.4010 1.0917 1.0448
No log 3.1154 162 0.9904 0.4165 0.9904 0.9952
No log 3.1538 164 1.0240 0.2633 1.0240 1.0119
No log 3.1923 166 1.0607 0.2697 1.0607 1.0299
No log 3.2308 168 1.0649 0.3551 1.0649 1.0319
No log 3.2692 170 1.1154 0.4028 1.1154 1.0561
No log 3.3077 172 1.1204 0.2100 1.1204 1.0585
No log 3.3462 174 1.1792 0.1728 1.1792 1.0859
No log 3.3846 176 1.1712 0.2417 1.1712 1.0822
No log 3.4231 178 1.1184 0.2244 1.1184 1.0575
No log 3.4615 180 1.2563 0.2542 1.2563 1.1208
No log 3.5 182 1.3635 0.2593 1.3635 1.1677
No log 3.5385 184 1.2347 0.2330 1.2347 1.1112
No log 3.5769 186 1.1112 0.3229 1.1112 1.0541
No log 3.6154 188 1.1140 0.2409 1.1140 1.0555
No log 3.6538 190 1.1152 0.3223 1.1152 1.0560
No log 3.6923 192 1.3589 0.2418 1.3589 1.1657
No log 3.7308 194 1.5097 0.2841 1.5097 1.2287
No log 3.7692 196 1.3196 0.2807 1.3196 1.1488
No log 3.8077 198 1.0567 0.3049 1.0567 1.0280
No log 3.8462 200 1.0341 0.1982 1.0341 1.0169
No log 3.8846 202 1.2974 0.3047 1.2974 1.1390
No log 3.9231 204 1.6269 0.2846 1.6269 1.2755
No log 3.9615 206 1.6873 0.3195 1.6873 1.2990
No log 4.0 208 1.4483 0.2863 1.4483 1.2034
No log 4.0385 210 1.0916 0.2534 1.0916 1.0448
No log 4.0769 212 1.1482 0.4016 1.1482 1.0715
No log 4.1154 214 1.5049 0.3058 1.5049 1.2267
No log 4.1538 216 1.5262 0.3280 1.5262 1.2354
No log 4.1923 218 1.3017 0.2468 1.3017 1.1409
No log 4.2308 220 1.0510 0.3459 1.0510 1.0252
No log 4.2692 222 0.9862 0.2628 0.9862 0.9931
No log 4.3077 224 1.0566 0.2461 1.0566 1.0279
No log 4.3462 226 1.0371 0.2461 1.0371 1.0184
No log 4.3846 228 0.9621 0.4217 0.9621 0.9809
No log 4.4231 230 1.0362 0.3928 1.0362 1.0179
No log 4.4615 232 1.1348 0.4297 1.1348 1.0653
No log 4.5 234 1.0942 0.4298 1.0942 1.0460
No log 4.5385 236 1.0974 0.4101 1.0974 1.0476
No log 4.5769 238 1.0710 0.3657 1.0710 1.0349
No log 4.6154 240 1.0230 0.3735 1.0230 1.0114
No log 4.6538 242 0.9735 0.3564 0.9735 0.9867
No log 4.6923 244 1.0173 0.2223 1.0173 1.0086
No log 4.7308 246 1.1012 0.1950 1.1012 1.0494
No log 4.7692 248 1.0945 0.1950 1.0945 1.0462
No log 4.8077 250 1.0811 0.2130 1.0811 1.0397
No log 4.8462 252 1.0774 0.2452 1.0774 1.0380
No log 4.8846 254 1.0485 0.2128 1.0485 1.0240
No log 4.9231 256 1.0828 0.1797 1.0828 1.0406
No log 4.9615 258 1.2838 0.3028 1.2838 1.1330
No log 5.0 260 1.5974 0.2680 1.5974 1.2639
No log 5.0385 262 1.7140 0.2318 1.7140 1.3092
No log 5.0769 264 1.6102 0.2400 1.6102 1.2689
No log 5.1154 266 1.3584 0.2418 1.3584 1.1655
No log 5.1538 268 1.0416 0.1327 1.0416 1.0206
No log 5.1923 270 0.9592 0.3994 0.9592 0.9794
No log 5.2308 272 0.9610 0.4243 0.9610 0.9803
No log 5.2692 274 0.9694 0.2991 0.9694 0.9846
No log 5.3077 276 1.0003 0.2892 1.0003 1.0002
No log 5.3462 278 1.0088 0.2964 1.0088 1.0044
No log 5.3846 280 1.0319 0.2871 1.0319 1.0158
No log 5.4231 282 1.0205 0.3059 1.0205 1.0102
No log 5.4615 284 0.9812 0.3268 0.9812 0.9905
No log 5.5 286 0.9340 0.4023 0.9340 0.9664
No log 5.5385 288 0.9178 0.4256 0.9178 0.9580
No log 5.5769 290 0.9160 0.4346 0.9160 0.9571
No log 5.6154 292 0.9283 0.4290 0.9283 0.9635
No log 5.6538 294 0.9428 0.3741 0.9428 0.9710
No log 5.6923 296 0.9225 0.4450 0.9225 0.9605
No log 5.7308 298 0.9353 0.4181 0.9353 0.9671
No log 5.7692 300 0.9489 0.4474 0.9489 0.9741
No log 5.8077 302 0.9581 0.4739 0.9581 0.9788
No log 5.8462 304 1.0078 0.3516 1.0078 1.0039
No log 5.8846 306 1.0149 0.3427 1.0149 1.0074
No log 5.9231 308 0.9584 0.4011 0.9584 0.9790
No log 5.9615 310 0.9264 0.3454 0.9264 0.9625
No log 6.0 312 0.9374 0.3712 0.9374 0.9682
No log 6.0385 314 0.9291 0.3712 0.9291 0.9639
No log 6.0769 316 0.9268 0.3935 0.9268 0.9627
No log 6.1154 318 0.9511 0.4326 0.9511 0.9752
No log 6.1538 320 0.9370 0.3738 0.9370 0.9680
No log 6.1923 322 0.9328 0.3392 0.9328 0.9658
No log 6.2308 324 0.9299 0.3392 0.9299 0.9643
No log 6.2692 326 0.9280 0.3539 0.9280 0.9633
No log 6.3077 328 0.9287 0.3738 0.9287 0.9637
No log 6.3462 330 0.9354 0.3738 0.9354 0.9672
No log 6.3846 332 0.9203 0.3845 0.9203 0.9593
No log 6.4231 334 0.9202 0.4100 0.9202 0.9593
No log 6.4615 336 0.9449 0.4439 0.9449 0.9721
No log 6.5 338 0.9600 0.4631 0.9600 0.9798
No log 6.5385 340 0.9440 0.4765 0.9440 0.9716
No log 6.5769 342 0.9384 0.4439 0.9384 0.9687
No log 6.6154 344 0.9133 0.4343 0.9133 0.9557
No log 6.6538 346 0.9156 0.4711 0.9156 0.9569
No log 6.6923 348 0.9333 0.3886 0.9333 0.9661
No log 6.7308 350 1.0115 0.4800 1.0115 1.0057
No log 6.7692 352 1.0885 0.4655 1.0885 1.0433
No log 6.8077 354 1.0992 0.4655 1.0992 1.0484
No log 6.8462 356 1.0476 0.4631 1.0476 1.0235
No log 6.8846 358 0.9557 0.4548 0.9557 0.9776
No log 6.9231 360 0.9819 0.3733 0.9819 0.9909
No log 6.9615 362 0.9853 0.4100 0.9853 0.9926
No log 7.0 364 0.9543 0.4159 0.9543 0.9769
No log 7.0385 366 0.9646 0.4316 0.9646 0.9821
No log 7.0769 368 1.0367 0.3931 1.0367 1.0182
No log 7.1154 370 1.0596 0.4354 1.0596 1.0294
No log 7.1538 372 1.0088 0.4340 1.0088 1.0044
No log 7.1923 374 0.9591 0.3838 0.9591 0.9793
No log 7.2308 376 0.9621 0.4119 0.9621 0.9809
No log 7.2692 378 0.9538 0.3804 0.9538 0.9766
No log 7.3077 380 0.9996 0.4105 0.9996 0.9998
No log 7.3462 382 1.0553 0.3918 1.0553 1.0273
No log 7.3846 384 1.0072 0.4648 1.0072 1.0036
No log 7.4231 386 0.9538 0.4343 0.9538 0.9766
No log 7.4615 388 0.9116 0.3421 0.9116 0.9548
No log 7.5 390 0.9185 0.3467 0.9185 0.9584
No log 7.5385 392 0.9280 0.3365 0.9280 0.9633
No log 7.5769 394 0.9228 0.3374 0.9228 0.9606
No log 7.6154 396 0.9771 0.3326 0.9771 0.9885
No log 7.6538 398 1.0707 0.4040 1.0707 1.0347
No log 7.6923 400 1.0870 0.4132 1.0870 1.0426
No log 7.7308 402 1.0266 0.3892 1.0266 1.0132
No log 7.7692 404 0.9529 0.4102 0.9529 0.9762
No log 7.8077 406 0.9673 0.3365 0.9673 0.9835
No log 7.8462 408 1.0404 0.2891 1.0404 1.0200
No log 7.8846 410 1.0747 0.3008 1.0747 1.0367
No log 7.9231 412 1.0119 0.2716 1.0119 1.0059
No log 7.9615 414 0.9769 0.4587 0.9769 0.9884
No log 8.0 416 1.0052 0.4785 1.0052 1.0026
No log 8.0385 418 1.0034 0.4785 1.0034 1.0017
No log 8.0769 420 0.9709 0.3578 0.9709 0.9853
No log 8.1154 422 0.9920 0.3205 0.9920 0.9960
No log 8.1538 424 1.0036 0.3205 1.0036 1.0018
No log 8.1923 426 1.0151 0.3154 1.0151 1.0075
No log 8.2308 428 0.9900 0.3275 0.9900 0.9950
No log 8.2692 430 0.9885 0.3902 0.9885 0.9943
No log 8.3077 432 0.9835 0.3804 0.9835 0.9917
No log 8.3462 434 0.9738 0.4002 0.9738 0.9868
No log 8.3846 436 0.9586 0.4142 0.9586 0.9791
No log 8.4231 438 0.9485 0.3838 0.9485 0.9739
No log 8.4615 440 0.9479 0.3838 0.9479 0.9736
No log 8.5 442 0.9468 0.4280 0.9468 0.9730
No log 8.5385 444 0.9553 0.3966 0.9553 0.9774
No log 8.5769 446 1.0045 0.4572 1.0045 1.0023
No log 8.6154 448 1.0273 0.4552 1.0273 1.0136
No log 8.6538 450 1.0080 0.4672 1.0080 1.0040
No log 8.6923 452 0.9910 0.5000 0.9910 0.9955
No log 8.7308 454 0.9515 0.5077 0.9515 0.9755
No log 8.7692 456 0.9669 0.5030 0.9669 0.9833
No log 8.8077 458 1.0150 0.4558 1.0150 1.0075
No log 8.8462 460 1.0807 0.4761 1.0807 1.0396
No log 8.8846 462 1.1451 0.3826 1.1451 1.0701
No log 8.9231 464 1.1437 0.4016 1.1437 1.0694
No log 8.9615 466 1.1154 0.4221 1.1154 1.0561
No log 9.0 468 1.0281 0.4028 1.0281 1.0140
No log 9.0385 470 1.0035 0.4028 1.0035 1.0017
No log 9.0769 472 0.9801 0.3820 0.9801 0.9900
No log 9.1154 474 0.9775 0.3564 0.9775 0.9887
No log 9.1538 476 0.9992 0.4065 0.9992 0.9996
No log 9.1923 478 1.0439 0.4034 1.0439 1.0217
No log 9.2308 480 1.0409 0.4031 1.0409 1.0203
No log 9.2692 482 1.0485 0.3794 1.0485 1.0240
No log 9.3077 484 0.9933 0.3920 0.9933 0.9966
No log 9.3462 486 0.9772 0.3666 0.9772 0.9885
No log 9.3846 488 1.0030 0.3666 1.0030 1.0015
No log 9.4231 490 1.0362 0.4164 1.0362 1.0179
No log 9.4615 492 1.0663 0.4164 1.0663 1.0326
No log 9.5 494 1.0449 0.4164 1.0449 1.0222
No log 9.5385 496 1.0404 0.3577 1.0404 1.0200
No log 9.5769 498 1.0596 0.3920 1.0596 1.0293
0.4129 9.6154 500 1.0638 0.3920 1.0638 1.0314
0.4129 9.6538 502 1.0980 0.3920 1.0980 1.0479
0.4129 9.6923 504 1.0886 0.3463 1.0886 1.0434
0.4129 9.7308 506 1.0927 0.2679 1.0927 1.0453
0.4129 9.7692 508 1.1774 0.2670 1.1774 1.0851
0.4129 9.8077 510 1.2100 0.2876 1.2100 1.1000
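Note that the final checkpoint (Qwk 0.2876 at epoch 9.81) is well below the best validation Qwk seen during training (0.5077 around epoch 8.73), so selecting the best checkpoint by Qwk rather than keeping the last one may be preferable. A quick sketch of that selection over a few rows copied from the table (a real script would parse the full log):

```python
# (epoch, validation_loss, qwk) triples copied from selected rows above.
rows = [
    (3.0000, 1.0746, 0.4763),
    (8.7308, 0.9515, 0.5077),
    (9.8077, 1.2100, 0.2876),   # final reported checkpoint
]

# Pick the checkpoint with the highest validation Qwk.
best = max(rows, key=lambda r: r[2])
```

With the Transformers `Trainer`, this corresponds to setting `load_best_model_at_end=True` with `metric_for_best_model` pointed at the Qwk metric.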

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1