ArabicNewSplits8_usingALLEssays_FineTuningAraBERT_run2_AugV5_k2_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.2062
  • Qwk: 0.4746
  • Mse: 1.2062
  • Rmse: 1.0983
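The three metrics above can be reproduced from a set of gold and predicted scores. A minimal pure-Python sketch (the label values below are illustrative only, not taken from the actual evaluation set):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Qwk: agreement between raters, penalizing distant disagreements quadratically."""
    # Observed confusion matrix
    observed = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    n = len(y_true)
    hist_true = [sum(row) for row in observed]
    hist_pred = [sum(observed[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic weight
            expected = hist_true[i] * hist_pred[j] / n  # chance agreement
            num += w * observed[i][j]
            den += w * expected
    return 1.0 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# Illustrative gold and predicted organization scores (4 classes)
y_true = [0, 1, 2, 3, 2, 1]
y_pred = [0, 1, 1, 3, 2, 2]
qwk = quadratic_weighted_kappa(y_true, y_pred, 4)
m = mse(y_true, y_pred)
rmse = math.sqrt(m)
```

Note that Mse and Rmse here treat the discrete scores as ordinal values, which is why the reported Loss and Mse coincide when the model is trained with a regression head.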

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
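A sketch of the linear learning-rate schedule listed above, assuming no warmup (the Trainer default when warmup steps are not set): the rate decays from learning_rate to 0 over the total number of optimizer steps. The 12 steps per epoch used below is read off the training log (epoch 1.0 at step 12).

```python
def linear_lr(step, total_steps, base_lr=2e-5):
    """Linear decay from base_lr at step 0 to 0 at total_steps (no warmup)."""
    return base_lr * max(0.0, (total_steps - step) / total_steps)

total = 100 * 12  # num_epochs x ~12 optimizer steps per epoch, per the log below
print(linear_lr(0, total))    # 2e-05 at the start of training
print(linear_lr(600, total))  # halved at the midpoint
```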

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.1667 2 5.0992 -0.0137 5.0992 2.2581
No log 0.3333 4 3.0456 0.1050 3.0456 1.7452
No log 0.5 6 1.8156 0.0659 1.8156 1.3474
No log 0.6667 8 1.3225 0.2042 1.3225 1.1500
No log 0.8333 10 1.1221 0.1982 1.1221 1.0593
No log 1.0 12 1.1441 0.2046 1.1441 1.0696
No log 1.1667 14 1.1118 0.2322 1.1118 1.0544
No log 1.3333 16 1.1470 0.2351 1.1470 1.0710
No log 1.5 18 1.0939 0.1711 1.0939 1.0459
No log 1.6667 20 1.0514 0.1662 1.0514 1.0254
No log 1.8333 22 1.2003 0.2505 1.2003 1.0956
No log 2.0 24 1.3410 0.1621 1.3410 1.1580
No log 2.1667 26 1.0676 0.2727 1.0676 1.0332
No log 2.3333 28 1.3819 0.1177 1.3819 1.1755
No log 2.5 30 1.6118 -0.0063 1.6118 1.2696
No log 2.6667 32 1.3528 0.1648 1.3528 1.1631
No log 2.8333 34 1.0981 0.1348 1.0981 1.0479
No log 3.0 36 1.0462 0.2022 1.0463 1.0229
No log 3.1667 38 1.0951 0.2356 1.0951 1.0465
No log 3.3333 40 1.0196 0.2562 1.0196 1.0097
No log 3.5 42 0.9681 0.2113 0.9681 0.9839
No log 3.6667 44 1.0046 0.4082 1.0046 1.0023
No log 3.8333 46 0.9305 0.3632 0.9305 0.9646
No log 4.0 48 0.9391 0.3984 0.9391 0.9691
No log 4.1667 50 0.9355 0.4368 0.9355 0.9672
No log 4.3333 52 0.8593 0.4873 0.8593 0.9270
No log 4.5 54 0.8138 0.5231 0.8138 0.9021
No log 4.6667 56 0.7776 0.4904 0.7776 0.8818
No log 4.8333 58 0.7820 0.4562 0.7820 0.8843
No log 5.0 60 0.8000 0.5292 0.8000 0.8944
No log 5.1667 62 0.8744 0.5784 0.8744 0.9351
No log 5.3333 64 0.9979 0.5294 0.9979 0.9989
No log 5.5 66 0.9753 0.5615 0.9753 0.9876
No log 5.6667 68 0.9525 0.5254 0.9525 0.9760
No log 5.8333 70 0.9773 0.4857 0.9773 0.9886
No log 6.0 72 0.9878 0.5117 0.9878 0.9939
No log 6.1667 74 1.0114 0.5167 1.0114 1.0057
No log 6.3333 76 1.0653 0.5101 1.0653 1.0321
No log 6.5 78 1.1902 0.4983 1.1902 1.0910
No log 6.6667 80 1.2389 0.4338 1.2389 1.1131
No log 6.8333 82 1.2375 0.4628 1.2375 1.1124
No log 7.0 84 1.1969 0.4443 1.1969 1.0941
No log 7.1667 86 1.1604 0.5105 1.1604 1.0772
No log 7.3333 88 1.1510 0.5047 1.1510 1.0729
No log 7.5 90 1.2355 0.4811 1.2355 1.1115
No log 7.6667 92 1.2356 0.4542 1.2356 1.1116
No log 7.8333 94 1.1101 0.4600 1.1101 1.0536
No log 8.0 96 0.9717 0.5287 0.9717 0.9858
No log 8.1667 98 1.0496 0.5146 1.0496 1.0245
No log 8.3333 100 1.2324 0.4640 1.2324 1.1101
No log 8.5 102 1.3659 0.4399 1.3659 1.1687
No log 8.6667 104 1.2453 0.4901 1.2453 1.1159
No log 8.8333 106 1.0552 0.5166 1.0552 1.0272
No log 9.0 108 1.0302 0.4848 1.0302 1.0150
No log 9.1667 110 1.0962 0.5028 1.0962 1.0470
No log 9.3333 112 1.3551 0.4149 1.3551 1.1641
No log 9.5 114 1.4056 0.4449 1.4056 1.1856
No log 9.6667 116 1.3722 0.4661 1.3722 1.1714
No log 9.8333 118 1.3109 0.5009 1.3109 1.1449
No log 10.0 120 1.1090 0.4571 1.1090 1.0531
No log 10.1667 122 1.0626 0.4230 1.0626 1.0308
No log 10.3333 124 1.0441 0.5396 1.0441 1.0218
No log 10.5 126 1.1663 0.4988 1.1663 1.0799
No log 10.6667 128 1.1048 0.5482 1.1048 1.0511
No log 10.8333 130 1.1063 0.5058 1.1063 1.0518
No log 11.0 132 1.2142 0.4900 1.2142 1.1019
No log 11.1667 134 1.3911 0.4574 1.3911 1.1794
No log 11.3333 136 1.5417 0.4243 1.5417 1.2417
No log 11.5 138 1.3376 0.4164 1.3376 1.1565
No log 11.6667 140 1.2936 0.4409 1.2936 1.1374
No log 11.8333 142 1.1991 0.4853 1.1991 1.0950
No log 12.0 144 0.8598 0.5362 0.8598 0.9273
No log 12.1667 146 0.7825 0.5950 0.7825 0.8846
No log 12.3333 148 0.9008 0.5943 0.9008 0.9491
No log 12.5 150 1.1635 0.5191 1.1635 1.0786
No log 12.6667 152 1.2853 0.4811 1.2853 1.1337
No log 12.8333 154 1.2218 0.4577 1.2218 1.1054
No log 13.0 156 1.1611 0.4625 1.1611 1.0775
No log 13.1667 158 1.0181 0.5141 1.0181 1.0090
No log 13.3333 160 0.9808 0.5353 0.9808 0.9903
No log 13.5 162 0.8669 0.5469 0.8669 0.9311
No log 13.6667 164 0.8888 0.5376 0.8888 0.9428
No log 13.8333 166 1.0207 0.5169 1.0207 1.0103
No log 14.0 168 1.2719 0.4668 1.2719 1.1278
No log 14.1667 170 1.1283 0.4935 1.1283 1.0622
No log 14.3333 172 1.0227 0.5409 1.0227 1.0113
No log 14.5 174 1.1558 0.4734 1.1558 1.0751
No log 14.6667 176 1.2447 0.4685 1.2447 1.1157
No log 14.8333 178 1.2142 0.4778 1.2142 1.1019
No log 15.0 180 1.0561 0.5371 1.0561 1.0277
No log 15.1667 182 1.1985 0.4755 1.1985 1.0948
No log 15.3333 184 1.4169 0.4436 1.4169 1.1903
No log 15.5 186 1.2872 0.4905 1.2872 1.1345
No log 15.6667 188 1.1757 0.5105 1.1757 1.0843
No log 15.8333 190 1.1860 0.4818 1.1860 1.0891
No log 16.0 192 1.1442 0.4911 1.1442 1.0697
No log 16.1667 194 1.0903 0.4885 1.0903 1.0442
No log 16.3333 196 1.0160 0.5126 1.0160 1.0080
No log 16.5 198 0.9607 0.5771 0.9607 0.9802
No log 16.6667 200 0.9361 0.5797 0.9361 0.9675
No log 16.8333 202 0.9859 0.5515 0.9859 0.9929
No log 17.0 204 1.1110 0.5169 1.1110 1.0540
No log 17.1667 206 1.0690 0.5578 1.0690 1.0339
No log 17.3333 208 1.0706 0.5483 1.0706 1.0347
No log 17.5 210 0.9943 0.5850 0.9943 0.9971
No log 17.6667 212 0.8925 0.6101 0.8925 0.9447
No log 17.8333 214 0.9150 0.6180 0.9150 0.9566
No log 18.0 216 0.9835 0.5428 0.9835 0.9917
No log 18.1667 218 1.1534 0.4707 1.1534 1.0740
No log 18.3333 220 1.0750 0.5047 1.0750 1.0368
No log 18.5 222 0.8865 0.5845 0.8865 0.9415
No log 18.6667 224 0.9067 0.5845 0.9067 0.9522
No log 18.8333 226 1.0149 0.5462 1.0149 1.0074
No log 19.0 228 1.2966 0.5110 1.2966 1.1387
No log 19.1667 230 1.3155 0.5097 1.3155 1.1469
No log 19.3333 232 1.1206 0.5711 1.1206 1.0586
No log 19.5 234 0.9648 0.5855 0.9648 0.9822
No log 19.6667 236 0.9843 0.5853 0.9843 0.9921
No log 19.8333 238 1.2418 0.4655 1.2418 1.1144
No log 20.0 240 1.3891 0.4505 1.3891 1.1786
No log 20.1667 242 1.2406 0.4169 1.2406 1.1138
No log 20.3333 244 0.9549 0.5651 0.9548 0.9772
No log 20.5 246 0.9065 0.6081 0.9065 0.9521
No log 20.6667 248 1.0076 0.5271 1.0076 1.0038
No log 20.8333 250 1.1045 0.4772 1.1045 1.0509
No log 21.0 252 1.0201 0.5466 1.0201 1.0100
No log 21.1667 254 0.8941 0.6383 0.8941 0.9456
No log 21.3333 256 0.8319 0.6584 0.8319 0.9121
No log 21.5 258 0.8987 0.6183 0.8987 0.9480
No log 21.6667 260 1.0521 0.5224 1.0521 1.0257
No log 21.8333 262 1.2635 0.4429 1.2635 1.1241
No log 22.0 264 1.2308 0.4534 1.2308 1.1094
No log 22.1667 266 0.9987 0.5342 0.9987 0.9994
No log 22.3333 268 0.8045 0.6478 0.8045 0.8969
No log 22.5 270 0.7775 0.6231 0.7775 0.8817
No log 22.6667 272 0.8029 0.6399 0.8029 0.8960
No log 22.8333 274 0.8924 0.5688 0.8924 0.9447
No log 23.0 276 1.1651 0.4568 1.1651 1.0794
No log 23.1667 278 1.3284 0.4441 1.3284 1.1526
No log 23.3333 280 1.2432 0.4255 1.2432 1.1150
No log 23.5 282 1.0307 0.5262 1.0307 1.0152
No log 23.6667 284 0.8989 0.5995 0.8989 0.9481
No log 23.8333 286 0.9071 0.5797 0.9071 0.9524
No log 24.0 288 0.9474 0.5995 0.9474 0.9733
No log 24.1667 290 1.0084 0.5809 1.0084 1.0042
No log 24.3333 292 1.1016 0.5186 1.1016 1.0496
No log 24.5 294 1.3241 0.4454 1.3241 1.1507
No log 24.6667 296 1.4217 0.4279 1.4217 1.1924
No log 24.8333 298 1.2619 0.4648 1.2619 1.1234
No log 25.0 300 1.0409 0.4827 1.0409 1.0203
No log 25.1667 302 0.9926 0.5035 0.9926 0.9963
No log 25.3333 304 1.0578 0.4963 1.0578 1.0285
No log 25.5 306 1.1623 0.4909 1.1623 1.0781
No log 25.6667 308 1.1743 0.4859 1.1743 1.0836
No log 25.8333 310 1.2248 0.4845 1.2248 1.1067
No log 26.0 312 1.2148 0.4901 1.2148 1.1022
No log 26.1667 314 1.1034 0.5398 1.1034 1.0504
No log 26.3333 316 0.9027 0.6077 0.9027 0.9501
No log 26.5 318 0.8394 0.6200 0.8394 0.9162
No log 26.6667 320 0.8971 0.5987 0.8971 0.9472
No log 26.8333 322 0.9813 0.5503 0.9813 0.9906
No log 27.0 324 1.1776 0.4706 1.1776 1.0852
No log 27.1667 326 1.2085 0.4701 1.2085 1.0993
No log 27.3333 328 1.0790 0.5164 1.0790 1.0387
No log 27.5 330 0.9966 0.5257 0.9966 0.9983
No log 27.6667 332 0.9631 0.5032 0.9631 0.9814
No log 27.8333 334 1.0132 0.4927 1.0132 1.0066
No log 28.0 336 1.0883 0.4915 1.0883 1.0432
No log 28.1667 338 1.0117 0.5198 1.0117 1.0058
No log 28.3333 340 0.9705 0.5444 0.9705 0.9852
No log 28.5 342 1.0174 0.5504 1.0174 1.0087
No log 28.6667 344 1.1389 0.4893 1.1389 1.0672
No log 28.8333 346 1.1955 0.4575 1.1955 1.0934
No log 29.0 348 1.1494 0.4730 1.1494 1.0721
No log 29.1667 350 1.2520 0.4525 1.2520 1.1189
No log 29.3333 352 1.2990 0.4300 1.2990 1.1398
No log 29.5 354 1.3201 0.5058 1.3201 1.1489
No log 29.6667 356 1.2559 0.5064 1.2559 1.1206
No log 29.8333 358 1.1490 0.5127 1.1490 1.0719
No log 30.0 360 1.0560 0.5380 1.0560 1.0276
No log 30.1667 362 1.0513 0.5427 1.0513 1.0253
No log 30.3333 364 1.0979 0.5324 1.0979 1.0478
No log 30.5 366 1.1126 0.5122 1.1126 1.0548
No log 30.6667 368 1.0540 0.5321 1.0540 1.0267
No log 30.8333 370 1.0284 0.5278 1.0284 1.0141
No log 31.0 372 1.0541 0.5127 1.0541 1.0267
No log 31.1667 374 1.1642 0.4532 1.1642 1.0790
No log 31.3333 376 1.2801 0.4363 1.2801 1.1314
No log 31.5 378 1.2706 0.4358 1.2706 1.1272
No log 31.6667 380 1.1395 0.4712 1.1395 1.0675
No log 31.8333 382 0.9650 0.5459 0.9650 0.9824
No log 32.0 384 0.9580 0.5427 0.9580 0.9788
No log 32.1667 386 1.0550 0.5179 1.0550 1.0271
No log 32.3333 388 1.1642 0.5265 1.1642 1.0790
No log 32.5 390 1.1678 0.5366 1.1678 1.0806
No log 32.6667 392 1.0353 0.5550 1.0353 1.0175
No log 32.8333 394 0.9513 0.5732 0.9513 0.9754
No log 33.0 396 0.9519 0.5596 0.9519 0.9757
No log 33.1667 398 0.9842 0.5525 0.9842 0.9921
No log 33.3333 400 1.1091 0.5493 1.1091 1.0531
No log 33.5 402 1.2837 0.5135 1.2837 1.1330
No log 33.6667 404 1.2867 0.4766 1.2867 1.1343
No log 33.8333 406 1.1213 0.5083 1.1213 1.0589
No log 34.0 408 1.0158 0.5295 1.0158 1.0079
No log 34.1667 410 0.8902 0.5489 0.8902 0.9435
No log 34.3333 412 0.8718 0.5634 0.8718 0.9337
No log 34.5 414 0.9686 0.5361 0.9686 0.9842
No log 34.6667 416 1.0270 0.5392 1.0270 1.0134
No log 34.8333 418 1.0245 0.5494 1.0245 1.0122
No log 35.0 420 0.9819 0.5504 0.9819 0.9909
No log 35.1667 422 0.9923 0.5504 0.9923 0.9962
No log 35.3333 424 1.0170 0.5402 1.0170 1.0085
No log 35.5 426 1.0926 0.5392 1.0926 1.0453
No log 35.6667 428 1.1348 0.5193 1.1348 1.0653
No log 35.8333 430 1.1546 0.5132 1.1546 1.0745
No log 36.0 432 1.1067 0.5226 1.1067 1.0520
No log 36.1667 434 1.0865 0.5276 1.0865 1.0424
No log 36.3333 436 1.0944 0.5276 1.0944 1.0461
No log 36.5 438 1.1247 0.5299 1.1247 1.0605
No log 36.6667 440 1.1479 0.5299 1.1479 1.0714
No log 36.8333 442 1.0817 0.5372 1.0817 1.0401
No log 37.0 444 0.9589 0.5464 0.9589 0.9792
No log 37.1667 446 0.9077 0.5878 0.9077 0.9527
No log 37.3333 448 0.8656 0.5690 0.8656 0.9304
No log 37.5 450 0.9351 0.5746 0.9351 0.9670
No log 37.6667 452 1.1177 0.5468 1.1177 1.0572
No log 37.8333 454 1.3113 0.4528 1.3113 1.1451
No log 38.0 456 1.3543 0.4360 1.3543 1.1637
No log 38.1667 458 1.2943 0.4491 1.2943 1.1377
No log 38.3333 460 1.1127 0.5053 1.1127 1.0548
No log 38.5 462 0.8818 0.5918 0.8818 0.9391
No log 38.6667 464 0.7971 0.5907 0.7971 0.8928
No log 38.8333 466 0.8174 0.5895 0.8174 0.9041
No log 39.0 468 0.8570 0.6049 0.8570 0.9257
No log 39.1667 470 0.9077 0.5917 0.9077 0.9527
No log 39.3333 472 1.0453 0.5111 1.0453 1.0224
No log 39.5 474 1.1217 0.4879 1.1217 1.0591
No log 39.6667 476 1.1500 0.4966 1.1500 1.0724
No log 39.8333 478 1.1522 0.5165 1.1522 1.0734
No log 40.0 480 1.0636 0.5376 1.0636 1.0313
No log 40.1667 482 0.9523 0.5785 0.9523 0.9759
No log 40.3333 484 0.9451 0.5785 0.9451 0.9722
No log 40.5 486 1.0542 0.5335 1.0542 1.0267
No log 40.6667 488 1.2485 0.5085 1.2485 1.1174
No log 40.8333 490 1.2924 0.5071 1.2924 1.1368
No log 41.0 492 1.2346 0.4950 1.2346 1.1111
No log 41.1667 494 1.1930 0.4883 1.1930 1.0922
No log 41.3333 496 1.1243 0.5366 1.1243 1.0603
No log 41.5 498 1.0495 0.5470 1.0495 1.0244
0.3561 41.6667 500 0.9659 0.5586 0.9659 0.9828
0.3561 41.8333 502 0.9953 0.5448 0.9953 0.9977
0.3561 42.0 504 1.0734 0.5009 1.0734 1.0360
0.3561 42.1667 506 1.1909 0.4685 1.1909 1.0913
0.3561 42.3333 508 1.2293 0.4522 1.2293 1.1087
0.3561 42.5 510 1.2062 0.4746 1.2062 1.0983

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1