ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k10_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the dataset name was not recorded in the training metadata). It achieves the following results on the evaluation set:

  • Loss: 0.5349
  • Qwk (quadratic weighted kappa): 0.6451
  • Mse (mean squared error): 0.5349
  • Rmse (root mean squared error): 0.7314
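Note that Mse equals Loss here (the model is trained with a mean-squared-error regression objective) and Rmse is its square root. As a sketch, the three evaluation metrics above can be reproduced from predictions and gold labels roughly as follows; the function names, the number of score classes, and the rounding of regression outputs before computing QWK are all illustrative assumptions, not details taken from the training code:

```python
import numpy as np

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights, standard for ordinal scoring tasks."""
    O = np.zeros((n_classes, n_classes))          # observed confusion matrix
    for t, p in zip(y_true, y_pred):
        O[t, p] += 1
    idx = np.arange(n_classes)
    W = (idx[:, None] - idx[None, :]) ** 2 / (n_classes - 1) ** 2  # quadratic weights
    E = np.outer(O.sum(axis=1), O.sum(axis=0)) / O.sum()           # expected by chance
    return 1.0 - (W * O).sum() / (W * E).sum()

def evaluate(preds, labels, n_classes):
    # QWK needs discrete categories, so round the regression outputs first.
    rounded = np.clip(np.rint(preds).astype(int), 0, n_classes - 1)
    mse = float(np.mean((preds - labels) ** 2))
    return {"qwk": quadratic_weighted_kappa(labels, rounded, n_classes),
            "mse": mse,
            "rmse": float(np.sqrt(mse))}

# Example with made-up predictions (not the real eval set):
# evaluate(np.array([1.1, 2.0, 2.9, 4.2]), np.array([1, 2, 3, 4]), n_classes=5)
```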

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
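From the results table below, each epoch spans 27 optimizer steps (epoch 0.0741 ≈ 2/27), which at a train batch size of 8 suggests a training set of roughly 216 examples. A minimal sketch of the linear learning-rate schedule implied by these hyperparameters follows; the zero-warmup default is an assumption, since the warmup setting was not recorded:

```python
def linear_lr(step, total_steps, base_lr=2e-5, warmup_steps=0):
    """Linear decay from base_lr to 0, with an optional linear warmup phase."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

steps_per_epoch = 27                 # inferred from the results table below
total_steps = 100 * steps_per_epoch  # num_epochs = 100
```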

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0741 2 4.0880 0.0024 4.0880 2.0219
No log 0.1481 4 2.3282 0.0541 2.3282 1.5258
No log 0.2222 6 2.0560 -0.0450 2.0560 1.4339
No log 0.2963 8 1.4911 0.0294 1.4911 1.2211
No log 0.3704 10 1.1092 0.3003 1.1092 1.0532
No log 0.4444 12 1.0452 0.3625 1.0452 1.0223
No log 0.5185 14 1.0255 0.3521 1.0255 1.0127
No log 0.5926 16 1.0632 0.1764 1.0632 1.0311
No log 0.6667 18 1.1130 0.1764 1.1130 1.0550
No log 0.7407 20 1.0794 0.2981 1.0794 1.0389
No log 0.8148 22 1.0039 0.2108 1.0039 1.0019
No log 0.8889 24 1.0397 0.1516 1.0397 1.0197
No log 0.9630 26 1.0547 0.1137 1.0547 1.0270
No log 1.0370 28 1.2706 0.0814 1.2706 1.1272
No log 1.1111 30 1.3902 0.1487 1.3902 1.1791
No log 1.1852 32 1.0989 0.2441 1.0989 1.0483
No log 1.2593 34 1.0044 0.2265 1.0044 1.0022
No log 1.3333 36 1.1569 0.2293 1.1569 1.0756
No log 1.4074 38 1.3568 -0.0296 1.3568 1.1648
No log 1.4815 40 1.4769 -0.0148 1.4769 1.2153
No log 1.5556 42 1.3959 -0.0148 1.3959 1.1815
No log 1.6296 44 1.3592 0.0 1.3592 1.1658
No log 1.7037 46 1.1966 0.1024 1.1966 1.0939
No log 1.7778 48 1.0175 0.3003 1.0175 1.0087
No log 1.8519 50 0.9568 0.2566 0.9568 0.9782
No log 1.9259 52 0.9354 0.2849 0.9354 0.9672
No log 2.0 54 0.9409 0.1389 0.9409 0.9700
No log 2.0741 56 0.9543 0.1601 0.9543 0.9769
No log 2.1481 58 0.9327 0.2818 0.9327 0.9658
No log 2.2222 60 0.9016 0.4402 0.9016 0.9495
No log 2.2963 62 0.8892 0.4312 0.8892 0.9430
No log 2.3704 64 0.8344 0.4022 0.8344 0.9135
No log 2.4444 66 0.8468 0.3288 0.8468 0.9202
No log 2.5185 68 0.9091 0.2262 0.9091 0.9535
No log 2.5926 70 0.9734 0.1998 0.9734 0.9866
No log 2.6667 72 0.9566 0.1799 0.9566 0.9781
No log 2.7407 74 0.8942 0.3094 0.8942 0.9456
No log 2.8148 76 0.8805 0.4275 0.8805 0.9384
No log 2.8889 78 0.8467 0.4710 0.8467 0.9201
No log 2.9630 80 0.8043 0.4727 0.8043 0.8968
No log 3.0370 82 0.7295 0.4932 0.7295 0.8541
No log 3.1111 84 0.7177 0.5146 0.7177 0.8472
No log 3.1852 86 0.8133 0.3844 0.8133 0.9018
No log 3.2593 88 0.8868 0.4004 0.8868 0.9417
No log 3.3333 90 0.9298 0.2960 0.9298 0.9642
No log 3.4074 92 0.9505 0.3283 0.9505 0.9749
No log 3.4815 94 0.7994 0.4650 0.7994 0.8941
No log 3.5556 96 0.7235 0.5403 0.7235 0.8506
No log 3.6296 98 0.7522 0.5435 0.7522 0.8673
No log 3.7037 100 0.7400 0.5994 0.7400 0.8602
No log 3.7778 102 0.7164 0.6079 0.7164 0.8464
No log 3.8519 104 0.6414 0.6209 0.6414 0.8009
No log 3.9259 106 0.5822 0.6252 0.5822 0.7631
No log 4.0 108 0.5971 0.6032 0.5971 0.7727
No log 4.0741 110 0.7344 0.5916 0.7344 0.8570
No log 4.1481 112 0.8791 0.4681 0.8791 0.9376
No log 4.2222 114 0.8022 0.4902 0.8022 0.8957
No log 4.2963 116 0.6624 0.5923 0.6624 0.8139
No log 4.3704 118 0.5633 0.7049 0.5633 0.7505
No log 4.4444 120 0.5428 0.7018 0.5428 0.7367
No log 4.5185 122 0.5315 0.6931 0.5315 0.7291
No log 4.5926 124 0.5686 0.6324 0.5686 0.7541
No log 4.6667 126 0.5711 0.5840 0.5711 0.7557
No log 4.7407 128 0.5482 0.6301 0.5482 0.7404
No log 4.8148 130 0.5646 0.6634 0.5646 0.7514
No log 4.8889 132 0.5654 0.6419 0.5654 0.7520
No log 4.9630 134 0.5289 0.6324 0.5289 0.7272
No log 5.0370 136 0.7338 0.6539 0.7338 0.8566
No log 5.1111 138 0.7311 0.6539 0.7311 0.8550
No log 5.1852 140 0.5510 0.6324 0.5510 0.7423
No log 5.2593 142 0.6552 0.6080 0.6552 0.8094
No log 5.3333 144 0.6952 0.6275 0.6952 0.8338
No log 5.4074 146 0.5923 0.6215 0.5923 0.7696
No log 5.4815 148 0.6194 0.6314 0.6194 0.7871
No log 5.5556 150 0.6244 0.6700 0.6244 0.7902
No log 5.6296 152 0.5640 0.6796 0.5639 0.7510
No log 5.7037 154 0.5514 0.6690 0.5514 0.7425
No log 5.7778 156 0.5388 0.6164 0.5388 0.7340
No log 5.8519 158 0.5406 0.6455 0.5406 0.7352
No log 5.9259 160 0.6336 0.6160 0.6336 0.7960
No log 6.0 162 0.6442 0.5867 0.6442 0.8026
No log 6.0741 164 0.5524 0.6584 0.5524 0.7432
No log 6.1481 166 0.5500 0.5679 0.5500 0.7416
No log 6.2222 168 0.5491 0.5549 0.5491 0.7410
No log 6.2963 170 0.5427 0.5972 0.5427 0.7367
No log 6.3704 172 0.5573 0.6688 0.5573 0.7465
No log 6.4444 174 0.5321 0.6445 0.5321 0.7294
No log 6.5185 176 0.5078 0.6363 0.5078 0.7126
No log 6.5926 178 0.4864 0.6897 0.4864 0.6974
No log 6.6667 180 0.4974 0.6479 0.4974 0.7053
No log 6.7407 182 0.4970 0.6724 0.4970 0.7050
No log 6.8148 184 0.5175 0.6716 0.5175 0.7194
No log 6.8889 186 0.5584 0.6716 0.5584 0.7472
No log 6.9630 188 0.5333 0.6667 0.5333 0.7303
No log 7.0370 190 0.5907 0.6328 0.5907 0.7686
No log 7.1111 192 0.6823 0.5428 0.6823 0.8260
No log 7.1852 194 0.6712 0.5745 0.6712 0.8193
No log 7.2593 196 0.6039 0.6405 0.6039 0.7771
No log 7.3333 198 0.5822 0.7074 0.5822 0.7630
No log 7.4074 200 0.6019 0.6209 0.6019 0.7758
No log 7.4815 202 0.5788 0.5648 0.5788 0.7608
No log 7.5556 204 0.5587 0.5648 0.5587 0.7475
No log 7.6296 206 0.5534 0.6138 0.5534 0.7439
No log 7.7037 208 0.5290 0.6157 0.5290 0.7273
No log 7.7778 210 0.5175 0.6756 0.5175 0.7194
No log 7.8519 212 0.5182 0.6605 0.5182 0.7199
No log 7.9259 214 0.5238 0.6936 0.5238 0.7237
No log 8.0 216 0.6211 0.6045 0.6211 0.7881
No log 8.0741 218 0.6696 0.6218 0.6696 0.8183
No log 8.1481 220 0.6019 0.6227 0.6019 0.7758
No log 8.2222 222 0.5184 0.6333 0.5184 0.7200
No log 8.2963 224 0.5162 0.7064 0.5162 0.7185
No log 8.3704 226 0.5519 0.6546 0.5519 0.7429
No log 8.4444 228 0.6131 0.6045 0.6131 0.7830
No log 8.5185 230 0.5761 0.6331 0.5761 0.7590
No log 8.5926 232 0.5147 0.6919 0.5147 0.7174
No log 8.6667 234 0.6113 0.5688 0.6113 0.7818
No log 8.7407 236 0.5841 0.6204 0.5841 0.7643
No log 8.8148 238 0.5095 0.6936 0.5095 0.7138
No log 8.8889 240 0.6475 0.5861 0.6475 0.8046
No log 8.9630 242 0.7090 0.5750 0.7090 0.8420
No log 9.0370 244 0.5935 0.6340 0.5935 0.7704
No log 9.1111 246 0.5471 0.6770 0.5471 0.7396
No log 9.1852 248 0.5534 0.6634 0.5534 0.7439
No log 9.2593 250 0.5540 0.6950 0.5540 0.7443
No log 9.3333 252 0.5847 0.6154 0.5847 0.7647
No log 9.4074 254 0.5678 0.6105 0.5678 0.7535
No log 9.4815 256 0.5499 0.6043 0.5499 0.7416
No log 9.5556 258 0.5324 0.6548 0.5324 0.7296
No log 9.6296 260 0.5548 0.6272 0.5548 0.7448
No log 9.7037 262 0.5652 0.5998 0.5652 0.7518
No log 9.7778 264 0.5157 0.6798 0.5157 0.7181
No log 9.8519 266 0.4950 0.6788 0.4950 0.7035
No log 9.9259 268 0.4972 0.6451 0.4972 0.7051
No log 10.0 270 0.5229 0.6584 0.5229 0.7231
No log 10.0741 272 0.5495 0.6008 0.5495 0.7413
No log 10.1481 274 0.5568 0.6617 0.5568 0.7462
No log 10.2222 276 0.5491 0.6617 0.5491 0.7410
No log 10.2963 278 0.5443 0.7081 0.5443 0.7378
No log 10.3704 280 0.5566 0.6438 0.5566 0.7460
No log 10.4444 282 0.5408 0.6753 0.5408 0.7354
No log 10.5185 284 0.5541 0.6927 0.5541 0.7444
No log 10.5926 286 0.5763 0.5759 0.5763 0.7591
No log 10.6667 288 0.6044 0.5463 0.6044 0.7774
No log 10.7407 290 0.6902 0.5484 0.6902 0.8308
No log 10.8148 292 0.6677 0.5403 0.6677 0.8171
No log 10.8889 294 0.6049 0.5928 0.6049 0.7777
No log 10.9630 296 0.6230 0.5441 0.6230 0.7893
No log 11.0370 298 0.6326 0.5441 0.6326 0.7954
No log 11.1111 300 0.6136 0.5771 0.6136 0.7833
No log 11.1852 302 0.6294 0.6122 0.6294 0.7934
No log 11.2593 304 0.6476 0.6063 0.6476 0.8047
No log 11.3333 306 0.6109 0.6011 0.6109 0.7816
No log 11.4074 308 0.5904 0.6011 0.5904 0.7684
No log 11.4815 310 0.6069 0.6502 0.6069 0.7790
No log 11.5556 312 0.6336 0.6401 0.6336 0.7960
No log 11.6296 314 0.5761 0.6611 0.5761 0.7590
No log 11.7037 316 0.5461 0.6252 0.5461 0.7390
No log 11.7778 318 0.5229 0.6566 0.5229 0.7231
No log 11.8519 320 0.5288 0.7019 0.5288 0.7272
No log 11.9259 322 0.5603 0.7114 0.5603 0.7486
No log 12.0 324 0.6441 0.6045 0.6441 0.8025
No log 12.0741 326 0.6053 0.6287 0.6053 0.7780
No log 12.1481 328 0.5324 0.6857 0.5324 0.7297
No log 12.2222 330 0.4868 0.6962 0.4868 0.6977
No log 12.2963 332 0.4906 0.7116 0.4906 0.7004
No log 12.3704 334 0.5209 0.6789 0.5209 0.7217
No log 12.4444 336 0.6010 0.6127 0.6010 0.7753
No log 12.5185 338 0.5962 0.6127 0.5962 0.7721
No log 12.5926 340 0.5202 0.6833 0.5202 0.7212
No log 12.6667 342 0.5098 0.7007 0.5098 0.7140
No log 12.7407 344 0.5129 0.6942 0.5129 0.7162
No log 12.8148 346 0.5489 0.6678 0.5489 0.7409
No log 12.8889 348 0.5734 0.6455 0.5734 0.7572
No log 12.9630 350 0.5879 0.6257 0.5879 0.7668
No log 13.0370 352 0.5342 0.6573 0.5342 0.7309
No log 13.1111 354 0.5235 0.6782 0.5235 0.7236
No log 13.1852 356 0.5417 0.6740 0.5417 0.7360
No log 13.2593 358 0.5775 0.6252 0.5775 0.7599
No log 13.3333 360 0.5910 0.6137 0.5910 0.7688
No log 13.4074 362 0.5427 0.6529 0.5427 0.7367
No log 13.4815 364 0.5043 0.7309 0.5043 0.7102
No log 13.5556 366 0.5092 0.6890 0.5092 0.7136
No log 13.6296 368 0.4949 0.7309 0.4949 0.7035
No log 13.7037 370 0.5045 0.6510 0.5045 0.7103
No log 13.7778 372 0.5593 0.6520 0.5593 0.7479
No log 13.8519 374 0.5730 0.6520 0.5730 0.7570
No log 13.9259 376 0.5352 0.6630 0.5352 0.7315
No log 14.0 378 0.5480 0.6333 0.5480 0.7403
No log 14.0741 380 0.5462 0.6178 0.5462 0.7390
No log 14.1481 382 0.5466 0.6178 0.5466 0.7393
No log 14.2222 384 0.5266 0.6637 0.5266 0.7257
No log 14.2963 386 0.5230 0.6678 0.5230 0.7232
No log 14.3704 388 0.5855 0.6485 0.5855 0.7652
No log 14.4444 390 0.6686 0.5821 0.6686 0.8177
No log 14.5185 392 0.6111 0.6653 0.6111 0.7817
No log 14.5926 394 0.5354 0.6584 0.5354 0.7317
No log 14.6667 396 0.5322 0.6593 0.5322 0.7295
No log 14.7407 398 0.5151 0.6988 0.5151 0.7177
No log 14.8148 400 0.5207 0.6916 0.5207 0.7216
No log 14.8889 402 0.5427 0.6798 0.5427 0.7367
No log 14.9630 404 0.5619 0.6473 0.5619 0.7496
No log 15.0370 406 0.5657 0.6361 0.5657 0.7521
No log 15.1111 408 0.5777 0.6249 0.5777 0.7600
No log 15.1852 410 0.6079 0.6429 0.6079 0.7796
No log 15.2593 412 0.5676 0.6473 0.5676 0.7534
No log 15.3333 414 0.5577 0.6664 0.5577 0.7468
No log 15.4074 416 0.5436 0.6701 0.5436 0.7373
No log 15.4815 418 0.5564 0.6128 0.5564 0.7459
No log 15.5556 420 0.6060 0.6282 0.6060 0.7785
No log 15.6296 422 0.7383 0.5856 0.7383 0.8592
No log 15.7037 424 0.7911 0.4556 0.7911 0.8894
No log 15.7778 426 0.6944 0.5856 0.6944 0.8333
No log 15.8519 428 0.5683 0.6630 0.5683 0.7538
No log 15.9259 430 0.5056 0.6779 0.5056 0.7111
No log 16.0 432 0.4914 0.6625 0.4914 0.7010
No log 16.0741 434 0.4981 0.6974 0.4981 0.7058
No log 16.1481 436 0.4933 0.6805 0.4933 0.7023
No log 16.2222 438 0.5441 0.6282 0.5441 0.7376
No log 16.2963 440 0.7197 0.5968 0.7197 0.8484
No log 16.3704 442 0.8353 0.4975 0.8353 0.9140
No log 16.4444 444 0.7550 0.4881 0.7550 0.8689
No log 16.5185 446 0.6260 0.5292 0.6260 0.7912
No log 16.5926 448 0.5814 0.6255 0.5814 0.7625
No log 16.6667 450 0.5667 0.6646 0.5667 0.7528
No log 16.7407 452 0.5833 0.6209 0.5833 0.7638
No log 16.8148 454 0.6140 0.6099 0.6140 0.7836
No log 16.8889 456 0.6492 0.6333 0.6492 0.8057
No log 16.9630 458 0.6120 0.6147 0.6120 0.7823
No log 17.0370 460 0.5867 0.6167 0.5867 0.7660
No log 17.1111 462 0.5404 0.6824 0.5404 0.7351
No log 17.1852 464 0.5248 0.6894 0.5248 0.7244
No log 17.2593 466 0.5135 0.6806 0.5135 0.7166
No log 17.3333 468 0.5000 0.6695 0.5000 0.7071
No log 17.4074 470 0.4704 0.6780 0.4704 0.6859
No log 17.4815 472 0.4525 0.6995 0.4525 0.6727
No log 17.5556 474 0.4607 0.7042 0.4607 0.6787
No log 17.6296 476 0.4719 0.6756 0.4719 0.6869
No log 17.7037 478 0.4894 0.7325 0.4894 0.6996
No log 17.7778 480 0.4698 0.7011 0.4698 0.6855
No log 17.8519 482 0.4789 0.6777 0.4789 0.6920
No log 17.9259 484 0.4885 0.6966 0.4885 0.6989
No log 18.0 486 0.4848 0.6903 0.4848 0.6963
No log 18.0741 488 0.5117 0.6815 0.5117 0.7153
No log 18.1481 490 0.6073 0.6815 0.6073 0.7793
No log 18.2222 492 0.5930 0.6624 0.5930 0.7701
No log 18.2963 494 0.5242 0.6983 0.5242 0.7240
No log 18.3704 496 0.4907 0.6772 0.4907 0.7005
No log 18.4444 498 0.4793 0.6712 0.4793 0.6924
0.322 18.5185 500 0.4808 0.6712 0.4808 0.6934
0.322 18.5926 502 0.5131 0.6695 0.5131 0.7163
0.322 18.6667 504 0.5654 0.6774 0.5654 0.7519
0.322 18.7407 506 0.5768 0.6476 0.5768 0.7594
0.322 18.8148 508 0.6088 0.6476 0.6088 0.7802
0.322 18.8889 510 0.5957 0.6476 0.5957 0.7718
0.322 18.9630 512 0.5514 0.6519 0.5514 0.7425
0.322 19.0370 514 0.5349 0.6451 0.5349 0.7314

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model tree for MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k10_task5_organization

Finetuned from aubmindlab/bert-base-arabertv02.