ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k16_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8075
  • Qwk: 0.4796
  • Mse: 0.8075
  • Rmse: 0.8986
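
The evaluation metrics above (quadratic weighted kappa, MSE, RMSE) can be computed with scikit-learn. A minimal sketch, assuming integer ordinal labels and raw regression outputs that are rounded to the nearest label before scoring kappa (the exact evaluation code for this run is not published):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def eval_metrics(y_true, y_pred):
    """Compute QWK, MSE, and RMSE for ordinal predictions.

    y_true: gold integer labels; y_pred: raw model outputs,
    rounded to the nearest integer label for the kappa score.
    """
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    mse = mean_squared_error(y_true, y_pred)
    qwk = cohen_kappa_score(
        y_true.astype(int), np.rint(y_pred).astype(int), weights="quadratic"
    )
    return {"qwk": qwk, "mse": float(mse), "rmse": float(np.sqrt(mse))}
```

Note that Loss and Mse coincide in the results above (both 0.8075), consistent with the model being trained against a mean-squared-error regression objective.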

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
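
The optimizer and schedule above can be reproduced outside the HF Trainer in plain PyTorch. A sketch under stated assumptions: the model below is a placeholder for the AraBERT regression head, and the step count uses the 41 steps per epoch implied by the results table (epoch 2.0 at step 82):

```python
import torch

# Placeholder for the fine-tuned AraBERT model with a regression head.
model = torch.nn.Linear(768, 1)

# Adam with the hyperparameters listed above.
optimizer = torch.optim.Adam(
    model.parameters(), lr=2e-05, betas=(0.9, 0.999), eps=1e-08
)

# Linear decay from the initial LR to 0 over num_epochs * steps_per_epoch.
# 41 steps/epoch is inferred from the results table (epoch 2.0 at step 82).
total_steps = 100 * 41
scheduler = torch.optim.lr_scheduler.LambdaLR(
    optimizer, lambda step: max(0.0, 1.0 - step / total_steps)
)
```

In a training loop, `optimizer.step()` is followed by `scheduler.step()` once per batch, so the learning rate falls linearly to zero at step 4100.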

Training results

Training loss was logged every 500 steps, so earlier rows show "No log" in the first column; the first logged value (0.306) appears at step 500.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0488 2 4.1074 0.0069 4.1074 2.0267
No log 0.0976 4 2.0961 0.1002 2.0961 1.4478
No log 0.1463 6 1.4143 0.0143 1.4143 1.1892
No log 0.1951 8 1.0876 0.2221 1.0876 1.0429
No log 0.2439 10 1.0962 0.1725 1.0962 1.0470
No log 0.2927 12 1.4127 -0.0278 1.4127 1.1886
No log 0.3415 14 1.7442 -0.0398 1.7442 1.3207
No log 0.3902 16 1.5107 -0.0560 1.5107 1.2291
No log 0.4390 18 1.1483 0.1119 1.1483 1.0716
No log 0.4878 20 1.0804 0.2140 1.0804 1.0394
No log 0.5366 22 1.2017 0.1658 1.2017 1.0962
No log 0.5854 24 1.5324 0.0 1.5324 1.2379
No log 0.6341 26 1.5029 0.0 1.5029 1.2259
No log 0.6829 28 1.2931 0.0380 1.2931 1.1371
No log 0.7317 30 1.1481 0.1148 1.1481 1.0715
No log 0.7805 32 1.0776 0.1989 1.0776 1.0381
No log 0.8293 34 1.0707 0.1891 1.0707 1.0347
No log 0.8780 36 1.0672 0.1131 1.0672 1.0331
No log 0.9268 38 1.0589 0.1755 1.0589 1.0290
No log 0.9756 40 1.0625 0.1713 1.0625 1.0308
No log 1.0244 42 1.1155 0.1509 1.1155 1.0562
No log 1.0732 44 1.1576 0.2004 1.1576 1.0759
No log 1.1220 46 1.0573 0.2367 1.0573 1.0283
No log 1.1707 48 0.9751 0.2390 0.9751 0.9875
No log 1.2195 50 1.1348 0.2632 1.1348 1.0653
No log 1.2683 52 1.1737 0.0888 1.1737 1.0834
No log 1.3171 54 1.1011 0.1233 1.1011 1.0493
No log 1.3659 56 1.0346 0.1962 1.0346 1.0171
No log 1.4146 58 1.0134 0.4186 1.0134 1.0067
No log 1.4634 60 1.0019 0.3272 1.0019 1.0009
No log 1.5122 62 0.9739 0.2716 0.9739 0.9869
No log 1.5610 64 0.9885 0.1989 0.9885 0.9942
No log 1.6098 66 1.0444 0.1699 1.0444 1.0220
No log 1.6585 68 1.0804 0.1826 1.0804 1.0394
No log 1.7073 70 1.0548 0.3278 1.0548 1.0270
No log 1.7561 72 0.9483 0.2932 0.9483 0.9738
No log 1.8049 74 0.9363 0.3896 0.9363 0.9676
No log 1.8537 76 1.0326 0.3119 1.0326 1.0162
No log 1.9024 78 1.2722 0.2149 1.2722 1.1279
No log 1.9512 80 1.3866 0.1790 1.3866 1.1775
No log 2.0 82 1.1931 0.3310 1.1931 1.0923
No log 2.0488 84 0.9046 0.3973 0.9046 0.9511
No log 2.0976 86 0.8300 0.3642 0.8300 0.9111
No log 2.1463 88 0.8375 0.4628 0.8375 0.9152
No log 2.1951 90 1.0470 0.4222 1.0470 1.0232
No log 2.2439 92 1.4174 0.2424 1.4174 1.1905
No log 2.2927 94 1.3345 0.2381 1.3345 1.1552
No log 2.3415 96 0.9724 0.4563 0.9724 0.9861
No log 2.3902 98 0.8816 0.3957 0.8816 0.9389
No log 2.4390 100 0.9373 0.4681 0.9373 0.9682
No log 2.4878 102 1.0961 0.3283 1.0961 1.0469
No log 2.5366 104 1.2547 0.2970 1.2547 1.1202
No log 2.5854 106 1.1644 0.3478 1.1644 1.0791
No log 2.6341 108 0.9188 0.4291 0.9188 0.9585
No log 2.6829 110 0.8502 0.2742 0.8502 0.9220
No log 2.7317 112 0.8317 0.3052 0.8317 0.9120
No log 2.7805 114 0.8840 0.4115 0.8840 0.9402
No log 2.8293 116 1.2434 0.2863 1.2434 1.1151
No log 2.8780 118 1.5096 0.2396 1.5096 1.2287
No log 2.9268 120 1.3055 0.2898 1.3055 1.1426
No log 2.9756 122 0.9309 0.4455 0.9309 0.9649
No log 3.0244 124 0.7905 0.4547 0.7905 0.8891
No log 3.0732 126 0.8068 0.5425 0.8068 0.8982
No log 3.1220 128 0.7926 0.4676 0.7926 0.8903
No log 3.1707 130 0.8776 0.4943 0.8776 0.9368
No log 3.2195 132 1.0752 0.3724 1.0752 1.0369
No log 3.2683 134 1.1722 0.2934 1.1722 1.0827
No log 3.3171 136 1.0456 0.4186 1.0456 1.0226
No log 3.3659 138 0.9205 0.4423 0.9205 0.9594
No log 3.4146 140 0.8029 0.4759 0.8029 0.8960
No log 3.4634 142 0.8064 0.4759 0.8064 0.8980
No log 3.5122 144 0.9073 0.4575 0.9073 0.9525
No log 3.5610 146 1.1225 0.3385 1.1225 1.0595
No log 3.6098 148 1.0859 0.3984 1.0859 1.0421
No log 3.6585 150 0.9130 0.4807 0.9130 0.9555
No log 3.7073 152 0.8007 0.4898 0.8007 0.8948
No log 3.7561 154 0.7976 0.4557 0.7976 0.8931
No log 3.8049 156 0.7982 0.4557 0.7982 0.8934
No log 3.8537 158 0.8152 0.4660 0.8152 0.9029
No log 3.9024 160 0.8553 0.4511 0.8553 0.9248
No log 3.9512 162 0.9509 0.4478 0.9509 0.9751
No log 4.0 164 1.1278 0.3461 1.1278 1.0620
No log 4.0488 166 1.1438 0.3461 1.1438 1.0695
No log 4.0976 168 1.0800 0.4255 1.0800 1.0392
No log 4.1463 170 1.0039 0.3775 1.0039 1.0019
No log 4.1951 172 1.0443 0.3846 1.0443 1.0219
No log 4.2439 174 1.0144 0.4151 1.0144 1.0072
No log 4.2927 176 0.9081 0.3902 0.9081 0.9529
No log 4.3415 178 0.7848 0.5113 0.7848 0.8859
No log 4.3902 180 0.7547 0.5156 0.7547 0.8687
No log 4.4390 182 0.7603 0.5002 0.7603 0.8720
No log 4.4878 184 0.8238 0.4815 0.8238 0.9076
No log 4.5366 186 0.8952 0.4581 0.8952 0.9462
No log 4.5854 188 0.8805 0.4807 0.8805 0.9383
No log 4.6341 190 0.8229 0.5366 0.8229 0.9072
No log 4.6829 192 0.7403 0.5135 0.7403 0.8604
No log 4.7317 194 0.7453 0.5069 0.7453 0.8633
No log 4.7805 196 0.8101 0.3577 0.8101 0.9000
No log 4.8293 198 0.7751 0.3537 0.7751 0.8804
No log 4.8780 200 0.7765 0.4743 0.7765 0.8812
No log 4.9268 202 0.8983 0.4695 0.8983 0.9478
No log 4.9756 204 0.9661 0.4167 0.9661 0.9829
No log 5.0244 206 1.0790 0.4162 1.0790 1.0387
No log 5.0732 208 1.1066 0.4152 1.1066 1.0520
No log 5.1220 210 0.9510 0.3959 0.9510 0.9752
No log 5.1707 212 0.9140 0.3897 0.9140 0.9560
No log 5.2195 214 0.9380 0.3862 0.9380 0.9685
No log 5.2683 216 1.0926 0.3928 1.0926 1.0453
No log 5.3171 218 1.1948 0.3493 1.1948 1.0931
No log 5.3659 220 1.1253 0.3810 1.1253 1.0608
No log 5.4146 222 0.9463 0.3207 0.9463 0.9728
No log 5.4634 224 0.8477 0.3940 0.8477 0.9207
No log 5.5122 226 0.8142 0.4973 0.8142 0.9023
No log 5.5610 228 0.8469 0.4584 0.8469 0.9203
No log 5.6098 230 0.9360 0.4579 0.9360 0.9675
No log 5.6585 232 1.0298 0.4668 1.0298 1.0148
No log 5.7073 234 0.9643 0.4779 0.9643 0.9820
No log 5.7561 236 0.8780 0.5131 0.8780 0.9370
No log 5.8049 238 0.8179 0.5504 0.8179 0.9044
No log 5.8537 240 0.8084 0.5153 0.8084 0.8991
No log 5.9024 242 0.8807 0.5318 0.8807 0.9385
No log 5.9512 244 0.8107 0.5439 0.8107 0.9004
No log 6.0 246 0.8108 0.5549 0.8108 0.9005
No log 6.0488 248 0.7708 0.5383 0.7708 0.8780
No log 6.0976 250 0.6939 0.4888 0.6939 0.8330
No log 6.1463 252 0.6890 0.5368 0.6890 0.8300
No log 6.1951 254 0.7364 0.5885 0.7364 0.8581
No log 6.2439 256 0.8005 0.5532 0.8005 0.8947
No log 6.2927 258 0.8252 0.5532 0.8252 0.9084
No log 6.3415 260 0.7873 0.5267 0.7873 0.8873
No log 6.3902 262 0.8593 0.5355 0.8593 0.9270
No log 6.4390 264 0.9600 0.5 0.9600 0.9798
No log 6.4878 266 1.0133 0.4204 1.0133 1.0066
No log 6.5366 268 0.9413 0.4694 0.9413 0.9702
No log 6.5854 270 0.8169 0.4960 0.8169 0.9038
No log 6.6341 272 0.7784 0.5139 0.7784 0.8823
No log 6.6829 274 0.7749 0.5030 0.7749 0.8803
No log 6.7317 276 0.7768 0.4789 0.7768 0.8814
No log 6.7805 278 0.8053 0.4478 0.8053 0.8974
No log 6.8293 280 0.8209 0.4344 0.8209 0.9060
No log 6.8780 282 0.8299 0.4839 0.8299 0.9110
No log 6.9268 284 0.7954 0.5221 0.7954 0.8919
No log 6.9756 286 0.7682 0.5247 0.7682 0.8764
No log 7.0244 288 0.7788 0.5223 0.7788 0.8825
No log 7.0732 290 0.7963 0.4739 0.7963 0.8923
No log 7.1220 292 0.7757 0.4865 0.7757 0.8807
No log 7.1707 294 0.7471 0.4411 0.7471 0.8644
No log 7.2195 296 0.8201 0.5805 0.8201 0.9056
No log 7.2683 298 0.7965 0.6004 0.7965 0.8925
No log 7.3171 300 0.7257 0.5032 0.7257 0.8519
No log 7.3659 302 0.7343 0.5002 0.7343 0.8569
No log 7.4146 304 0.7499 0.5221 0.7499 0.8660
No log 7.4634 306 0.7403 0.5002 0.7403 0.8604
No log 7.5122 308 0.7135 0.5248 0.7135 0.8447
No log 7.5610 310 0.7117 0.5809 0.7117 0.8436
No log 7.6098 312 0.7721 0.5054 0.7721 0.8787
No log 7.6585 314 0.8140 0.4818 0.8140 0.9022
No log 7.7073 316 0.7565 0.4818 0.7565 0.8698
No log 7.7561 318 0.7219 0.6025 0.7219 0.8496
No log 7.8049 320 0.7490 0.5719 0.7490 0.8654
No log 7.8537 322 0.7806 0.5397 0.7806 0.8835
No log 7.9024 324 0.7843 0.4946 0.7843 0.8856
No log 7.9512 326 0.8213 0.4946 0.8213 0.9063
No log 8.0 328 0.8050 0.4958 0.8050 0.8972
No log 8.0488 330 0.8209 0.4593 0.8209 0.9061
No log 8.0976 332 0.8384 0.5088 0.8384 0.9156
No log 8.1463 334 0.8392 0.5070 0.8392 0.9161
No log 8.1951 336 0.8056 0.4995 0.8056 0.8975
No log 8.2439 338 0.7947 0.4912 0.7947 0.8914
No log 8.2927 340 0.7949 0.5329 0.7949 0.8916
No log 8.3415 342 0.8624 0.5150 0.8624 0.9287
No log 8.3902 344 0.9937 0.5219 0.9937 0.9968
No log 8.4390 346 0.9666 0.5013 0.9666 0.9832
No log 8.4878 348 0.8708 0.4327 0.8708 0.9332
No log 8.5366 350 0.7964 0.4378 0.7964 0.8924
No log 8.5854 352 0.7916 0.4151 0.7916 0.8897
No log 8.6341 354 0.7917 0.4378 0.7917 0.8898
No log 8.6829 356 0.8400 0.4839 0.8400 0.9165
No log 8.7317 358 0.9960 0.4902 0.9960 0.9980
No log 8.7805 360 1.1004 0.3699 1.1004 1.0490
No log 8.8293 362 1.0546 0.4667 1.0546 1.0270
No log 8.8780 364 0.9322 0.4455 0.9322 0.9655
No log 8.9268 366 0.8718 0.3780 0.8718 0.9337
No log 8.9756 368 0.8250 0.4060 0.8250 0.9083
No log 9.0244 370 0.7616 0.4198 0.7616 0.8727
No log 9.0732 372 0.7161 0.5146 0.7161 0.8462
No log 9.1220 374 0.7114 0.5274 0.7114 0.8435
No log 9.1707 376 0.7163 0.5361 0.7163 0.8463
No log 9.2195 378 0.7188 0.5546 0.7188 0.8478
No log 9.2683 380 0.7192 0.5221 0.7192 0.8480
No log 9.3171 382 0.7192 0.5503 0.7192 0.8480
No log 9.3659 384 0.7247 0.4544 0.7247 0.8513
No log 9.4146 386 0.7307 0.4428 0.7307 0.8548
No log 9.4634 388 0.7352 0.4923 0.7352 0.8574
No log 9.5122 390 0.7260 0.4893 0.7260 0.8521
No log 9.5610 392 0.7660 0.5305 0.7660 0.8752
No log 9.6098 394 0.7964 0.5153 0.7964 0.8924
No log 9.6585 396 0.7595 0.5291 0.7595 0.8715
No log 9.7073 398 0.7166 0.4774 0.7166 0.8465
No log 9.7561 400 0.7115 0.5032 0.7115 0.8435
No log 9.8049 402 0.7076 0.4908 0.7076 0.8412
No log 9.8537 404 0.7319 0.5103 0.7319 0.8555
No log 9.9024 406 0.7211 0.5232 0.7211 0.8492
No log 9.9512 408 0.7000 0.5017 0.7000 0.8367
No log 10.0 410 0.6967 0.5112 0.6967 0.8347
No log 10.0488 412 0.6954 0.5330 0.6954 0.8339
No log 10.0976 414 0.7292 0.6209 0.7292 0.8539
No log 10.1463 416 0.7394 0.6209 0.7394 0.8599
No log 10.1951 418 0.6916 0.5442 0.6916 0.8317
No log 10.2439 420 0.6662 0.6292 0.6662 0.8162
No log 10.2927 422 0.6867 0.5879 0.6867 0.8287
No log 10.3415 424 0.6774 0.5868 0.6774 0.8231
No log 10.3902 426 0.6884 0.5117 0.6884 0.8297
No log 10.4390 428 0.7482 0.4946 0.7482 0.8650
No log 10.4878 430 0.7673 0.5266 0.7673 0.8759
No log 10.5366 432 0.7427 0.5084 0.7427 0.8618
No log 10.5854 434 0.7254 0.5810 0.7254 0.8517
No log 10.6341 436 0.7374 0.5986 0.7374 0.8587
No log 10.6829 438 0.7227 0.5822 0.7227 0.8501
No log 10.7317 440 0.7100 0.5498 0.7100 0.8426
No log 10.7805 442 0.7154 0.5381 0.7154 0.8458
No log 10.8293 444 0.7238 0.5808 0.7238 0.8508
No log 10.8780 446 0.7369 0.6014 0.7369 0.8584
No log 10.9268 448 0.7636 0.5333 0.7636 0.8739
No log 10.9756 450 0.7625 0.4995 0.7625 0.8732
No log 11.0244 452 0.7555 0.4269 0.7555 0.8692
No log 11.0732 454 0.7877 0.4090 0.7877 0.8875
No log 11.1220 456 0.7886 0.4510 0.7886 0.8880
No log 11.1707 458 0.7422 0.4691 0.7422 0.8615
No log 11.2195 460 0.7376 0.4745 0.7376 0.8588
No log 11.2683 462 0.7848 0.5305 0.7848 0.8859
No log 11.3171 464 0.8178 0.5504 0.8178 0.9043
No log 11.3659 466 0.7974 0.5504 0.7974 0.8930
No log 11.4146 468 0.7609 0.5634 0.7609 0.8723
No log 11.4634 470 0.7444 0.5645 0.7444 0.8628
No log 11.5122 472 0.7905 0.5833 0.7905 0.8891
No log 11.5610 474 0.8976 0.5137 0.8976 0.9474
No log 11.6098 476 0.9333 0.5013 0.9333 0.9661
No log 11.6585 478 0.9416 0.5007 0.9416 0.9704
No log 11.7073 480 0.8712 0.5027 0.8712 0.9334
No log 11.7561 482 0.8002 0.4696 0.8002 0.8945
No log 11.8049 484 0.7406 0.4327 0.7406 0.8606
No log 11.8537 486 0.7079 0.4642 0.7079 0.8413
No log 11.9024 488 0.7003 0.4642 0.7003 0.8368
No log 11.9512 490 0.7016 0.5746 0.7016 0.8376
No log 12.0 492 0.7434 0.5708 0.7434 0.8622
No log 12.0488 494 0.7740 0.4924 0.7740 0.8798
No log 12.0976 496 0.7620 0.5140 0.7620 0.8729
No log 12.1463 498 0.7512 0.5160 0.7512 0.8667
0.306 12.1951 500 0.7433 0.4946 0.7433 0.8621
0.306 12.2439 502 0.7504 0.4946 0.7504 0.8663
0.306 12.2927 504 0.7682 0.4696 0.7682 0.8765
0.306 12.3415 506 0.7784 0.4696 0.7784 0.8823
0.306 12.3902 508 0.8064 0.4681 0.8064 0.8980
0.306 12.4390 510 0.8075 0.4796 0.8075 0.8986

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
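
To reproduce this environment, the versions above can be pinned, e.g.:

```shell
pip install "transformers==4.44.2" "torch==2.4.0" "datasets==2.21.0" "tokenizers==0.19.1"
```

The `+cu118` PyTorch build additionally requires installing from the CUDA 11.8 wheel index (`--index-url https://download.pytorch.org/whl/cu118`).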