ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k19_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.0706
  • Qwk: 0.1860
  • Mse: 1.0706
  • Rmse: 1.0347
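
For reference, Qwk is quadratic weighted kappa (Cohen's kappa with quadratic disagreement weights, common for ordinal scoring tasks), and Rmse is simply the square root of Mse. On integer labels these metrics can be computed with a short pure-Python sketch; the toy labels below are illustrative only, not from this model's evaluation set:

```python
import math

def mse(y_true, y_pred):
    """Mean squared error over paired integer labels."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def quadratic_weighted_kappa(y_true, y_pred):
    """Cohen's kappa with quadratic weights over integer labels 0..k-1."""
    k = max(max(y_true), max(y_pred)) + 1
    n = len(y_true)
    # Observed confusion matrix.
    obs = [[0.0] * k for _ in range(k)]
    for t, p in zip(y_true, y_pred):
        obs[t][p] += 1
    # Marginal label counts for the expected (chance) matrix.
    row = [sum(obs[i]) for i in range(k)]
    col = [sum(obs[i][j] for i in range(k)) for j in range(k)]
    num = den = 0.0
    for i in range(k):
        for j in range(k):
            w = (i - j) ** 2 / (k - 1) ** 2  # quadratic disagreement weight
            num += w * obs[i][j]
            den += w * row[i] * col[j] / n
    return 1.0 - num / den

y_true, y_pred = [0, 1, 2, 2], [0, 1, 1, 2]
print(quadratic_weighted_kappa(y_true, y_pred))  # 0.8
print(mse(y_true, y_pred))                       # 0.25
print(math.sqrt(mse(y_true, y_pred)))            # 0.5 (Rmse)
```

Note that Mse and Loss coincide in the results above, which is consistent with a regression head trained directly on MSE loss.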

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
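
These settings correspond roughly to the following Hugging Face `TrainingArguments` sketch. The `output_dir` value is a placeholder, not the path actually used; the Adam betas/epsilon and linear schedule match the values listed above (which are also the Trainer defaults):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_task2_organization",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```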

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0345 2 4.8087 0.0010 4.8087 2.1929
No log 0.0690 4 2.7275 -0.0233 2.7275 1.6515
No log 0.1034 6 1.7450 0.0062 1.7450 1.3210
No log 0.1379 8 1.4470 -0.0211 1.4470 1.2029
No log 0.1724 10 1.2554 0.1273 1.2554 1.1204
No log 0.2069 12 1.2659 0.0677 1.2659 1.1251
No log 0.2414 14 1.4700 0.0169 1.4700 1.2124
No log 0.2759 16 1.7470 0.0 1.7470 1.3218
No log 0.3103 18 1.6484 0.0 1.6484 1.2839
No log 0.3448 20 1.3762 0.0317 1.3762 1.1731
No log 0.3793 22 1.2286 0.1865 1.2286 1.1084
No log 0.4138 24 1.1692 0.3195 1.1692 1.0813
No log 0.4483 26 1.1450 0.2360 1.1450 1.0700
No log 0.4828 28 1.0983 0.2200 1.0983 1.0480
No log 0.5172 30 1.1247 0.2936 1.1247 1.0605
No log 0.5517 32 1.2490 0.1346 1.2490 1.1176
No log 0.5862 34 1.1800 0.1715 1.1800 1.0863
No log 0.6207 36 1.2158 0.1622 1.2158 1.1026
No log 0.6552 38 1.2783 0.1404 1.2783 1.1306
No log 0.6897 40 1.3105 0.0898 1.3105 1.1448
No log 0.7241 42 1.3315 0.0898 1.3315 1.1539
No log 0.7586 44 1.2605 0.1314 1.2605 1.1227
No log 0.7931 46 1.3090 0.1314 1.3090 1.1441
No log 0.8276 48 1.4590 0.0254 1.4590 1.2079
No log 0.8621 50 1.5439 -0.0149 1.5439 1.2425
No log 0.8966 52 1.4195 -0.0066 1.4195 1.1914
No log 0.9310 54 1.3090 0.0575 1.3090 1.1441
No log 0.9655 56 1.1950 0.2298 1.1950 1.0932
No log 1.0 58 1.0729 0.2782 1.0729 1.0358
No log 1.0345 60 1.0644 0.3374 1.0644 1.0317
No log 1.0690 62 1.0245 0.2916 1.0245 1.0122
No log 1.1034 64 1.0276 0.3115 1.0276 1.0137
No log 1.1379 66 1.1064 0.2395 1.1064 1.0519
No log 1.1724 68 1.1391 0.2690 1.1391 1.0673
No log 1.2069 70 1.0760 0.2432 1.0760 1.0373
No log 1.2414 72 0.9917 0.3646 0.9917 0.9959
No log 1.2759 74 0.9425 0.4180 0.9425 0.9708
No log 1.3103 76 0.9560 0.4236 0.9560 0.9777
No log 1.3448 78 0.9402 0.4772 0.9402 0.9697
No log 1.3793 80 0.9074 0.4976 0.9074 0.9526
No log 1.4138 82 0.8773 0.5131 0.8773 0.9367
No log 1.4483 84 0.8645 0.4411 0.8645 0.9298
No log 1.4828 86 0.9528 0.4198 0.9528 0.9761
No log 1.5172 88 0.9795 0.3946 0.9795 0.9897
No log 1.5517 90 0.8816 0.4479 0.8816 0.9389
No log 1.5862 92 0.8419 0.3478 0.8419 0.9176
No log 1.6207 94 0.9077 0.3503 0.9077 0.9528
No log 1.6552 96 0.8781 0.3686 0.8781 0.9371
No log 1.6897 98 0.8248 0.5076 0.8248 0.9082
No log 1.7241 100 1.1610 0.3323 1.1610 1.0775
No log 1.7586 102 1.0025 0.3672 1.0025 1.0012
No log 1.7931 104 0.7911 0.4736 0.7911 0.8894
No log 1.8276 106 0.8696 0.4 0.8696 0.9325
No log 1.8621 108 0.9048 0.3778 0.9048 0.9512
No log 1.8966 110 0.9132 0.3885 0.9132 0.9556
No log 1.9310 112 0.9514 0.3996 0.9514 0.9754
No log 1.9655 114 0.9787 0.3409 0.9787 0.9893
No log 2.0 116 1.0254 0.4508 1.0254 1.0126
No log 2.0345 118 1.0132 0.2974 1.0132 1.0066
No log 2.0690 120 1.0088 0.3509 1.0088 1.0044
No log 2.1034 122 0.9921 0.3387 0.9921 0.9960
No log 2.1379 124 0.9857 0.3948 0.9857 0.9928
No log 2.1724 126 0.9802 0.4352 0.9802 0.9901
No log 2.2069 128 0.9742 0.4548 0.9742 0.9870
No log 2.2414 130 0.9923 0.4258 0.9923 0.9962
No log 2.2759 132 1.1146 0.2560 1.1146 1.0557
No log 2.3103 134 1.1096 0.3602 1.1096 1.0534
No log 2.3448 136 1.0029 0.4158 1.0029 1.0015
No log 2.3793 138 0.9813 0.2850 0.9813 0.9906
No log 2.4138 140 0.9848 0.2877 0.9848 0.9923
No log 2.4483 142 0.9794 0.3396 0.9794 0.9897
No log 2.4828 144 0.9880 0.3886 0.9880 0.9940
No log 2.5172 146 0.9415 0.3804 0.9415 0.9703
No log 2.5517 148 0.9594 0.4358 0.9594 0.9795
No log 2.5862 150 0.9099 0.4593 0.9099 0.9539
No log 2.6207 152 0.9400 0.4515 0.9400 0.9695
No log 2.6552 154 0.8928 0.4798 0.8928 0.9449
No log 2.6897 156 0.9530 0.4492 0.9530 0.9762
No log 2.7241 158 1.0291 0.4645 1.0291 1.0145
No log 2.7586 160 0.8789 0.5125 0.8789 0.9375
No log 2.7931 162 0.8760 0.3961 0.8760 0.9359
No log 2.8276 164 0.9362 0.3747 0.9362 0.9676
No log 2.8621 166 0.9349 0.3931 0.9349 0.9669
No log 2.8966 168 0.9609 0.5000 0.9609 0.9802
No log 2.9310 170 1.0284 0.4219 1.0284 1.0141
No log 2.9655 172 0.9419 0.5015 0.9419 0.9705
No log 3.0 174 0.9213 0.5374 0.9213 0.9599
No log 3.0345 176 0.8637 0.3891 0.8637 0.9293
No log 3.0690 178 0.8688 0.5176 0.8688 0.9321
No log 3.1034 180 0.8870 0.5219 0.8870 0.9418
No log 3.1379 182 0.8204 0.4993 0.8204 0.9058
No log 3.1724 184 0.8943 0.5303 0.8943 0.9457
No log 3.2069 186 1.0762 0.4410 1.0762 1.0374
No log 3.2414 188 1.2113 0.4048 1.2113 1.1006
No log 3.2759 190 1.1043 0.3884 1.1043 1.0508
No log 3.3103 192 1.0627 0.3912 1.0627 1.0309
No log 3.3448 194 1.0492 0.4497 1.0492 1.0243
No log 3.3793 196 0.9492 0.4606 0.9492 0.9743
No log 3.4138 198 0.9147 0.4394 0.9147 0.9564
No log 3.4483 200 0.8925 0.3827 0.8925 0.9447
No log 3.4828 202 0.8669 0.3663 0.8669 0.9311
No log 3.5172 204 0.8937 0.3827 0.8937 0.9454
No log 3.5517 206 0.9017 0.3687 0.9017 0.9496
No log 3.5862 208 0.8990 0.3687 0.8990 0.9481
No log 3.6207 210 0.8732 0.3908 0.8732 0.9344
No log 3.6552 212 0.8911 0.3908 0.8911 0.9440
No log 3.6897 214 0.9432 0.3436 0.9432 0.9712
No log 3.7241 216 1.0343 0.4101 1.0343 1.0170
No log 3.7586 218 1.0508 0.4013 1.0508 1.0251
No log 3.7931 220 0.9904 0.4130 0.9904 0.9952
No log 3.8276 222 0.8860 0.3573 0.8860 0.9413
No log 3.8621 224 0.8422 0.4860 0.8422 0.9177
No log 3.8966 226 1.0583 0.3908 1.0583 1.0287
No log 3.9310 228 1.1150 0.4186 1.1150 1.0559
No log 3.9655 230 0.9140 0.4898 0.9140 0.9560
No log 4.0 232 0.7983 0.5925 0.7983 0.8935
No log 4.0345 234 0.8047 0.5946 0.8047 0.8970
No log 4.0690 236 0.8118 0.6127 0.8118 0.9010
No log 4.1034 238 0.8892 0.5170 0.8892 0.9430
No log 4.1379 240 1.0033 0.4436 1.0033 1.0017
No log 4.1724 242 1.1649 0.4026 1.1649 1.0793
No log 4.2069 244 1.2344 0.3987 1.2344 1.1111
No log 4.2414 246 1.0220 0.4440 1.0220 1.0110
No log 4.2759 248 0.8353 0.5006 0.8353 0.9140
No log 4.3103 250 0.7817 0.5150 0.7817 0.8842
No log 4.3448 252 0.7739 0.4352 0.7739 0.8797
No log 4.3793 254 0.7774 0.3796 0.7774 0.8817
No log 4.4138 256 0.7913 0.3424 0.7913 0.8896
No log 4.4483 258 0.8220 0.4334 0.8220 0.9066
No log 4.4828 260 0.8429 0.4743 0.8429 0.9181
No log 4.5172 262 0.8439 0.4105 0.8439 0.9186
No log 4.5517 264 0.8507 0.3380 0.8507 0.9223
No log 4.5862 266 0.9027 0.4814 0.9027 0.9501
No log 4.6207 268 0.9960 0.4394 0.9960 0.9980
No log 4.6552 270 0.9920 0.4397 0.9920 0.9960
No log 4.6897 272 0.9397 0.5198 0.9397 0.9694
No log 4.7241 274 0.8493 0.3363 0.8493 0.9216
No log 4.7586 276 0.8241 0.3747 0.8241 0.9078
No log 4.7931 278 0.8314 0.3747 0.8314 0.9118
No log 4.8276 280 0.8530 0.3652 0.8530 0.9236
No log 4.8621 282 0.9516 0.5061 0.9516 0.9755
No log 4.8966 284 0.9610 0.5252 0.9610 0.9803
No log 4.9310 286 0.8504 0.5211 0.8504 0.9222
No log 4.9655 288 0.8284 0.4159 0.8284 0.9101
No log 5.0 290 0.8608 0.3733 0.8608 0.9278
No log 5.0345 292 0.9097 0.4606 0.9097 0.9538
No log 5.0690 294 0.8956 0.3960 0.8956 0.9463
No log 5.1034 296 0.8896 0.3565 0.8896 0.9432
No log 5.1379 298 0.9194 0.3868 0.9194 0.9589
No log 5.1724 300 0.9120 0.3522 0.9120 0.9550
No log 5.2069 302 0.8674 0.3705 0.8674 0.9314
No log 5.2414 304 0.8464 0.3796 0.8464 0.9200
No log 5.2759 306 0.8413 0.4308 0.8413 0.9172
No log 5.3103 308 0.8771 0.5192 0.8771 0.9366
No log 5.3448 310 0.8391 0.5220 0.8391 0.9160
No log 5.3793 312 0.8109 0.4308 0.8109 0.9005
No log 5.4138 314 0.7980 0.4161 0.7980 0.8933
No log 5.4483 316 0.7946 0.4196 0.7946 0.8914
No log 5.4828 318 0.8127 0.5040 0.8127 0.9015
No log 5.5172 320 0.8434 0.4835 0.8434 0.9184
No log 5.5517 322 0.8537 0.3690 0.8537 0.9240
No log 5.5862 324 0.9030 0.4401 0.9030 0.9503
No log 5.6207 326 1.0126 0.4792 1.0126 1.0063
No log 5.6552 328 1.1137 0.4390 1.1137 1.0553
No log 5.6897 330 1.1446 0.4152 1.1446 1.0699
No log 5.7241 332 1.1032 0.4711 1.1032 1.0503
No log 5.7586 334 1.0415 0.4222 1.0415 1.0206
No log 5.7931 336 0.9515 0.3211 0.9515 0.9754
No log 5.8276 338 0.8755 0.3525 0.8755 0.9357
No log 5.8621 340 0.8763 0.4489 0.8763 0.9361
No log 5.8966 342 0.9095 0.3836 0.9095 0.9537
No log 5.9310 344 0.9377 0.3641 0.9377 0.9683
No log 5.9655 346 1.0051 0.2276 1.0051 1.0026
No log 6.0 348 1.1297 0.1801 1.1297 1.0629
No log 6.0345 350 1.1843 0.3115 1.1843 1.0883
No log 6.0690 352 1.1244 0.3474 1.1244 1.0604
No log 6.1034 354 0.9885 0.3864 0.9885 0.9942
No log 6.1379 356 0.8706 0.3738 0.8706 0.9331
No log 6.1724 358 0.8206 0.4126 0.8206 0.9059
No log 6.2069 360 0.7962 0.4965 0.7962 0.8923
No log 6.2414 362 0.7809 0.5181 0.7809 0.8837
No log 6.2759 364 0.7774 0.5607 0.7774 0.8817
No log 6.3103 366 0.7713 0.5772 0.7713 0.8782
No log 6.3448 368 0.7993 0.5402 0.7993 0.8940
No log 6.3793 370 0.9147 0.4511 0.9147 0.9564
No log 6.4138 372 0.9310 0.4246 0.9310 0.9649
No log 6.4483 374 0.9106 0.4272 0.9106 0.9543
No log 6.4828 376 0.7659 0.5211 0.7659 0.8752
No log 6.5172 378 0.7476 0.5489 0.7476 0.8646
No log 6.5517 380 0.8248 0.5102 0.8248 0.9082
No log 6.5862 382 0.8169 0.4968 0.8169 0.9038
No log 6.6207 384 0.8223 0.4796 0.8223 0.9068
No log 6.6552 386 0.8658 0.4732 0.8658 0.9305
No log 6.6897 388 0.9634 0.4 0.9634 0.9815
No log 6.7241 390 1.0217 0.3367 1.0217 1.0108
No log 6.7586 392 1.0022 0.3730 1.0022 1.0011
No log 6.7931 394 0.9344 0.2850 0.9344 0.9667
No log 6.8276 396 0.9085 0.3122 0.9085 0.9532
No log 6.8621 398 0.9449 0.3474 0.9449 0.9721
No log 6.8966 400 0.9935 0.4519 0.9935 0.9968
No log 6.9310 402 1.0445 0.4481 1.0445 1.0220
No log 6.9655 404 0.9302 0.4354 0.9302 0.9645
No log 7.0 406 0.8459 0.3666 0.8459 0.9197
No log 7.0345 408 0.8324 0.3957 0.8324 0.9123
No log 7.0690 410 0.8218 0.4244 0.8218 0.9065
No log 7.1034 412 0.8356 0.4100 0.8356 0.9141
No log 7.1379 414 0.8248 0.4100 0.8248 0.9082
No log 7.1724 416 0.8439 0.3957 0.8439 0.9186
No log 7.2069 418 0.8345 0.3957 0.8345 0.9135
No log 7.2414 420 0.7990 0.4548 0.7990 0.8939
No log 7.2759 422 0.8488 0.4785 0.8488 0.9213
No log 7.3103 424 0.8368 0.4763 0.8368 0.9148
No log 7.3448 426 0.8093 0.4512 0.8093 0.8996
No log 7.3793 428 0.9407 0.3281 0.9407 0.9699
No log 7.4138 430 1.1999 0.4154 1.1999 1.0954
No log 7.4483 432 1.3414 0.3431 1.3414 1.1582
No log 7.4828 434 1.2995 0.2757 1.2995 1.1399
No log 7.5172 436 1.1879 0.1756 1.1879 1.0899
No log 7.5517 438 1.1910 0.1756 1.1910 1.0913
No log 7.5862 440 1.1914 0.2020 1.1914 1.0915
No log 7.6207 442 1.2574 0.3405 1.2574 1.1213
No log 7.6552 444 1.3432 0.3026 1.3432 1.1590
No log 7.6897 446 1.3152 0.3026 1.3152 1.1468
No log 7.7241 448 1.2336 0.3405 1.2336 1.1107
No log 7.7586 450 1.0960 0.1650 1.0960 1.0469
No log 7.7931 452 0.9882 0.2871 0.9882 0.9941
No log 7.8276 454 0.9755 0.2969 0.9755 0.9877
No log 7.8621 456 1.0051 0.2343 1.0051 1.0025
No log 7.8966 458 1.0815 0.2330 1.0815 1.0399
No log 7.9310 460 1.1073 0.3709 1.1073 1.0523
No log 7.9655 462 1.0600 0.3881 1.0600 1.0296
No log 8.0 464 1.0384 0.4221 1.0384 1.0190
No log 8.0345 466 0.9749 0.3451 0.9749 0.9874
No log 8.0690 468 0.9452 0.3809 0.9452 0.9722
No log 8.1034 470 0.9739 0.4258 0.9739 0.9869
No log 8.1379 472 1.0345 0.4545 1.0345 1.0171
No log 8.1724 474 1.0801 0.4786 1.0801 1.0393
No log 8.2069 476 1.0439 0.4786 1.0439 1.0217
No log 8.2414 478 1.0348 0.4410 1.0348 1.0173
No log 8.2759 480 1.0201 0.4220 1.0201 1.0100
No log 8.3103 482 0.9382 0.3556 0.9382 0.9686
No log 8.3448 484 0.8759 0.4062 0.8759 0.9359
No log 8.3793 486 0.8884 0.3786 0.8884 0.9425
No log 8.4138 488 0.9673 0.4345 0.9673 0.9835
No log 8.4483 490 1.0021 0.4781 1.0021 1.0010
No log 8.4828 492 0.9626 0.4781 0.9626 0.9811
No log 8.5172 494 0.8993 0.3653 0.8993 0.9483
No log 8.5517 496 0.9018 0.3689 0.9018 0.9496
No log 8.5862 498 0.9248 0.3773 0.9248 0.9617
0.3798 8.6207 500 0.9552 0.3606 0.9552 0.9773
0.3798 8.6552 502 0.9392 0.3981 0.9392 0.9691
0.3798 8.6897 504 0.9126 0.3758 0.9126 0.9553
0.3798 8.7241 506 0.9403 0.4028 0.9403 0.9697
0.3798 8.7586 508 0.9647 0.3950 0.9647 0.9822
0.3798 8.7931 510 1.0153 0.3359 1.0153 1.0076
0.3798 8.8276 512 1.0092 0.2317 1.0092 1.0046
0.3798 8.8621 514 0.9748 0.2432 0.9748 0.9873
0.3798 8.8966 516 0.9938 0.2276 0.9938 0.9969
0.3798 8.9310 518 1.0706 0.1860 1.0706 1.0347
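
Two sanity checks on the table above: the "No log" entries in the Training Loss column reflect the Trainer's default logging interval (training loss is reported every 500 steps, hence the first logged value of 0.3798 at step 500), and since epoch 1.0 falls at step 58, a train_batch_size of 8 implies a training split of roughly 58 × 8 = 464 examples (the last batch may be partial):

```python
import math

train_batch_size = 8
steps_per_epoch = 58  # from the table: epoch 1.0 is reached at step 58
approx_train_examples = steps_per_epoch * train_batch_size
print(approx_train_examples)  # 464

# Any training split of 457-464 examples yields 58 steps at batch size 8.
assert math.ceil(approx_train_examples / train_batch_size) == steps_per_epoch
```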

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1