ArabicNewSplits7_FineTuningAraBERT_run1_AugV5_k6_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8069
  • Qwk: 0.4006
  • Mse: 0.8069
  • Rmse: 0.8983
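
Here Qwk is quadratic weighted kappa (agreement between predicted and true ordinal scores), and because the model is trained with an MSE objective, the reported Loss equals Mse and Rmse is its square root (0.8983 ≈ √0.8069). As a minimal, illustrative sketch (not the exact evaluation code used for this card), these metrics can be computed in pure Python:

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights for ordinal labels 0..n_classes-1."""
    # Observed confusion matrix
    O = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    n = len(y_true)
    # Marginal histograms of true and predicted labels
    hist_t = [sum(row) for row in O]
    hist_p = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic disagreement weight
            E = hist_t[i] * hist_p[j] / n            # expected count under independence
            num += w * O[i][j]
            den += w * E
    return 1.0 - num / den

def mse(y_true, y_pred):
    """Mean squared error; RMSE is math.sqrt(mse(...))."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
```

This matches the definition used by scikit-learn's `cohen_kappa_score(..., weights="quadratic")`.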

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
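
With a linear scheduler and no warmup steps listed, the learning rate decays linearly from 2e-05 toward 0 over the planned training run. The table below logs 2 optimizer steps per 0.1 epoch (20 per epoch), so 100 epochs implies 2000 total steps. A hedged sketch of this schedule (the helper name and step counts are illustrative, mirroring Transformers' linear schedule with warmup):

```python
def linear_schedule_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Learning rate at a given optimizer step: linear warmup, then linear decay to 0."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))
    return base_lr * remaining

# 20 logged steps per epoch * 100 epochs = 2000 planned optimizer steps
TOTAL_STEPS = 20 * 100
```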

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.1 2 4.7318 -0.0020 4.7318 2.1753
No log 0.2 4 2.6264 0.0578 2.6264 1.6206
No log 0.3 6 1.5365 0.0372 1.5365 1.2395
No log 0.4 8 1.2692 0.1388 1.2692 1.1266
No log 0.5 10 1.2014 0.2424 1.2014 1.0961
No log 0.6 12 1.1805 0.2268 1.1805 1.0865
No log 0.7 14 1.4875 0.0019 1.4875 1.2196
No log 0.8 16 1.7115 0.0789 1.7115 1.3083
No log 0.9 18 1.5467 0.0749 1.5467 1.2437
No log 1.0 20 1.3868 0.1140 1.3868 1.1776
No log 1.1 22 1.2508 0.1196 1.2508 1.1184
No log 1.2 24 1.1015 0.3318 1.1015 1.0495
No log 1.3 26 1.0883 0.2424 1.0883 1.0432
No log 1.4 28 1.0462 0.2464 1.0462 1.0228
No log 1.5 30 1.0576 0.2938 1.0576 1.0284
No log 1.6 32 1.2929 0.2252 1.2929 1.1371
No log 1.7 34 1.4301 0.2033 1.4301 1.1958
No log 1.8 36 1.4259 0.1995 1.4259 1.1941
No log 1.9 38 1.1662 0.2417 1.1662 1.0799
No log 2.0 40 1.0480 0.3356 1.0480 1.0237
No log 2.1 42 0.9867 0.3663 0.9867 0.9933
No log 2.2 44 0.9837 0.4363 0.9837 0.9918
No log 2.3 46 1.0002 0.3552 1.0002 1.0001
No log 2.4 48 1.0679 0.2857 1.0679 1.0334
No log 2.5 50 1.0590 0.3908 1.0590 1.0291
No log 2.6 52 1.2208 0.2320 1.2208 1.1049
No log 2.7 54 1.5041 0.1785 1.5041 1.2264
No log 2.8 56 1.2978 0.2229 1.2978 1.1392
No log 2.9 58 1.0963 0.3173 1.0963 1.0470
No log 3.0 60 1.7664 0.3009 1.7664 1.3291
No log 3.1 62 1.6640 0.2994 1.6640 1.2899
No log 3.2 64 1.0854 0.3758 1.0854 1.0418
No log 3.3 66 1.1813 0.3273 1.1813 1.0869
No log 3.4 68 1.5337 0.2873 1.5337 1.2384
No log 3.5 70 1.2980 0.2962 1.2980 1.1393
No log 3.6 72 1.0100 0.3995 1.0100 1.0050
No log 3.7 74 1.1814 0.3734 1.1814 1.0869
No log 3.8 76 1.2404 0.4365 1.2404 1.1137
No log 3.9 78 1.0465 0.3655 1.0465 1.0230
No log 4.0 80 0.9547 0.4749 0.9547 0.9771
No log 4.1 82 0.9524 0.4749 0.9524 0.9759
No log 4.2 84 0.9775 0.4498 0.9775 0.9887
No log 4.3 86 0.9888 0.4182 0.9888 0.9944
No log 4.4 88 0.9300 0.5052 0.9300 0.9644
No log 4.5 90 0.9111 0.4825 0.9111 0.9545
No log 4.6 92 0.9335 0.5560 0.9335 0.9662
No log 4.7 94 0.9777 0.5307 0.9777 0.9888
No log 4.8 96 0.8787 0.5510 0.8787 0.9374
No log 4.9 98 1.1293 0.5173 1.1293 1.0627
No log 5.0 100 1.2553 0.5173 1.2553 1.1204
No log 5.1 102 1.0857 0.5050 1.0857 1.0420
No log 5.2 104 0.8530 0.5024 0.8530 0.9236
No log 5.3 106 0.8313 0.4818 0.8313 0.9118
No log 5.4 108 0.8446 0.4454 0.8446 0.9190
No log 5.5 110 0.8586 0.4823 0.8586 0.9266
No log 5.6 112 0.9459 0.4350 0.9459 0.9726
No log 5.7 114 0.9689 0.4353 0.9689 0.9843
No log 5.8 116 0.9779 0.4356 0.9779 0.9889
No log 5.9 118 0.9339 0.3631 0.9339 0.9664
No log 6.0 120 1.0892 0.4464 1.0892 1.0436
No log 6.1 122 1.1938 0.3377 1.1938 1.0926
No log 6.2 124 1.0765 0.4494 1.0765 1.0375
No log 6.3 126 1.0029 0.3650 1.0029 1.0014
No log 6.4 128 1.0090 0.3371 1.0090 1.0045
No log 6.5 130 1.0140 0.3619 1.0140 1.0070
No log 6.6 132 1.0115 0.4123 1.0115 1.0057
No log 6.7 134 1.0286 0.4385 1.0286 1.0142
No log 6.8 136 1.0881 0.4052 1.0881 1.0431
No log 6.9 138 1.0712 0.4052 1.0712 1.0350
No log 7.0 140 0.9741 0.4266 0.9741 0.9870
No log 7.1 142 0.9250 0.4218 0.9250 0.9618
No log 7.2 144 0.9058 0.4218 0.9058 0.9517
No log 7.3 146 0.8935 0.4218 0.8935 0.9452
No log 7.4 148 0.8847 0.4405 0.8847 0.9406
No log 7.5 150 0.8654 0.4810 0.8654 0.9302
No log 7.6 152 0.8725 0.4800 0.8725 0.9341
No log 7.7 154 0.8512 0.4889 0.8512 0.9226
No log 7.8 156 0.8444 0.4714 0.8444 0.9189
No log 7.9 158 0.8681 0.4672 0.8681 0.9317
No log 8.0 160 0.8408 0.4181 0.8408 0.9169
No log 8.1 162 0.8685 0.5382 0.8685 0.9319
No log 8.2 164 0.9431 0.4629 0.9431 0.9711
No log 8.3 166 0.9608 0.4629 0.9608 0.9802
No log 8.4 168 0.8484 0.4196 0.8484 0.9211
No log 8.5 170 0.9698 0.5365 0.9698 0.9848
No log 8.6 172 1.1289 0.4916 1.1289 1.0625
No log 8.7 174 1.0037 0.5429 1.0037 1.0019
No log 8.8 176 0.8770 0.4869 0.8770 0.9365
No log 8.9 178 0.8476 0.4575 0.8476 0.9207
No log 9.0 180 0.8762 0.5172 0.8762 0.9360
No log 9.1 182 0.8398 0.4388 0.8398 0.9164
No log 9.2 184 0.8332 0.4045 0.8332 0.9128
No log 9.3 186 0.8688 0.4037 0.8688 0.9321
No log 9.4 188 0.8884 0.4468 0.8884 0.9425
No log 9.5 190 0.8573 0.4201 0.8573 0.9259
No log 9.6 192 0.8089 0.4254 0.8089 0.8994
No log 9.7 194 0.8148 0.5334 0.8148 0.9027
No log 9.8 196 0.8064 0.4650 0.8064 0.8980
No log 9.9 198 0.8345 0.5541 0.8345 0.9135
No log 10.0 200 0.8484 0.5414 0.8484 0.9211
No log 10.1 202 0.8173 0.5469 0.8173 0.9041
No log 10.2 204 0.9182 0.5034 0.9182 0.9583
No log 10.3 206 0.9815 0.4484 0.9815 0.9907
No log 10.4 208 0.9699 0.4659 0.9699 0.9848
No log 10.5 210 1.0230 0.4775 1.0230 1.0114
No log 10.6 212 0.9096 0.5361 0.9096 0.9537
No log 10.7 214 0.8575 0.4729 0.8575 0.9260
No log 10.8 216 0.8820 0.3979 0.8820 0.9391
No log 10.9 218 0.8722 0.4334 0.8722 0.9339
No log 11.0 220 0.8527 0.4708 0.8527 0.9234
No log 11.1 222 0.8509 0.4618 0.8509 0.9224
No log 11.2 224 0.8299 0.4618 0.8299 0.9110
No log 11.3 226 0.8402 0.4635 0.8402 0.9166
No log 11.4 228 0.8728 0.5558 0.8728 0.9342
No log 11.5 230 0.8391 0.5242 0.8391 0.9160
No log 11.6 232 0.7948 0.5424 0.7948 0.8915
No log 11.7 234 0.7658 0.5611 0.7658 0.8751
No log 11.8 236 0.7429 0.6212 0.7429 0.8619
No log 11.9 238 0.7357 0.5302 0.7357 0.8577
No log 12.0 240 0.7261 0.5056 0.7261 0.8521
No log 12.1 242 0.7532 0.5526 0.7532 0.8679
No log 12.2 244 0.7764 0.5851 0.7764 0.8811
No log 12.3 246 0.7443 0.5503 0.7443 0.8628
No log 12.4 248 0.7361 0.5232 0.7361 0.8580
No log 12.5 250 0.7641 0.5142 0.7641 0.8742
No log 12.6 252 0.7570 0.4554 0.7570 0.8701
No log 12.7 254 0.7849 0.5710 0.7849 0.8859
No log 12.8 256 0.8105 0.5409 0.8105 0.9003
No log 12.9 258 0.7691 0.4771 0.7691 0.8770
No log 13.0 260 0.7634 0.5041 0.7634 0.8737
No log 13.1 262 0.8013 0.4956 0.8013 0.8951
No log 13.2 264 0.7828 0.5168 0.7828 0.8848
No log 13.3 266 0.7728 0.4256 0.7728 0.8791
No log 13.4 268 0.9313 0.4855 0.9313 0.9650
No log 13.5 270 1.0940 0.4410 1.0940 1.0459
No log 13.6 272 1.0572 0.4833 1.0572 1.0282
No log 13.7 274 0.9161 0.4855 0.9161 0.9571
No log 13.8 276 0.8193 0.4292 0.8193 0.9051
No log 13.9 278 0.8296 0.4816 0.8296 0.9108
No log 14.0 280 0.8775 0.4814 0.8775 0.9368
No log 14.1 282 0.8742 0.3687 0.8742 0.9350
No log 14.2 284 0.8694 0.3345 0.8694 0.9324
No log 14.3 286 0.8781 0.3278 0.8781 0.9371
No log 14.4 288 0.8679 0.3838 0.8679 0.9316
No log 14.5 290 0.8631 0.4120 0.8631 0.9290
No log 14.6 292 0.8684 0.3632 0.8684 0.9319
No log 14.7 294 0.9056 0.4803 0.9056 0.9517
No log 14.8 296 0.8994 0.4803 0.8994 0.9484
No log 14.9 298 0.8436 0.4237 0.8436 0.9185
No log 15.0 300 0.8355 0.3685 0.8355 0.9141
No log 15.1 302 0.8403 0.3629 0.8403 0.9167
No log 15.2 304 0.8481 0.3970 0.8481 0.9209
No log 15.3 306 0.9098 0.4558 0.9098 0.9538
No log 15.4 308 0.9387 0.4930 0.9387 0.9689
No log 15.5 310 0.8660 0.4072 0.8660 0.9306
No log 15.6 312 0.8470 0.3596 0.8470 0.9203
No log 15.7 314 0.8990 0.4100 0.8990 0.9481
No log 15.8 316 0.9029 0.3821 0.9029 0.9502
No log 15.9 318 0.8804 0.3256 0.8804 0.9383
No log 16.0 320 0.9021 0.4105 0.9021 0.9498
No log 16.1 322 0.9357 0.4838 0.9357 0.9673
No log 16.2 324 0.9308 0.4164 0.9308 0.9648
No log 16.3 326 0.9056 0.3710 0.9056 0.9516
No log 16.4 328 0.8888 0.3508 0.8888 0.9428
No log 16.5 330 0.8821 0.3382 0.8821 0.9392
No log 16.6 332 0.8774 0.3671 0.8774 0.9367
No log 16.7 334 0.8856 0.3663 0.8856 0.9411
No log 16.8 336 0.8949 0.3762 0.8949 0.9460
No log 16.9 338 0.8916 0.3374 0.8916 0.9442
No log 17.0 340 0.8921 0.3374 0.8921 0.9445
No log 17.1 342 0.8840 0.4042 0.8840 0.9402
No log 17.2 344 0.8650 0.3885 0.8650 0.9301
No log 17.3 346 0.8711 0.4494 0.8711 0.9333
No log 17.4 348 0.8540 0.4292 0.8540 0.9241
No log 17.5 350 0.8487 0.4218 0.8487 0.9212
No log 17.6 352 0.8560 0.3548 0.8560 0.9252
No log 17.7 354 0.8466 0.3535 0.8466 0.9201
No log 17.8 356 0.8426 0.3747 0.8426 0.9179
No log 17.9 358 0.8440 0.3788 0.8440 0.9187
No log 18.0 360 0.8412 0.3525 0.8412 0.9171
No log 18.1 362 0.8586 0.4369 0.8586 0.9266
No log 18.2 364 0.8857 0.4871 0.8857 0.9411
No log 18.3 366 0.9041 0.4871 0.9041 0.9509
No log 18.4 368 0.8655 0.4612 0.8655 0.9303
No log 18.5 370 0.8514 0.4029 0.8514 0.9227
No log 18.6 372 0.8537 0.3925 0.8537 0.9240
No log 18.7 374 0.8536 0.3392 0.8536 0.9239
No log 18.8 376 0.8911 0.4203 0.8911 0.9440
No log 18.9 378 0.9575 0.4087 0.9575 0.9785
No log 19.0 380 0.9797 0.3711 0.9797 0.9898
No log 19.1 382 0.9273 0.4429 0.9273 0.9630
No log 19.2 384 0.8622 0.4009 0.8622 0.9286
No log 19.3 386 0.8939 0.4803 0.8939 0.9455
No log 19.4 388 0.9248 0.5291 0.9248 0.9617
No log 19.5 390 0.8912 0.5334 0.8912 0.9440
No log 19.6 392 0.8516 0.4374 0.8516 0.9228
No log 19.7 394 0.8704 0.4546 0.8704 0.9329
No log 19.8 396 0.9281 0.5071 0.9281 0.9634
No log 19.9 398 0.9542 0.4502 0.9542 0.9768
No log 20.0 400 0.9025 0.4903 0.9025 0.9500
No log 20.1 402 0.8437 0.4526 0.8437 0.9185
No log 20.2 404 0.8134 0.3902 0.8134 0.9019
No log 20.3 406 0.8092 0.3608 0.8092 0.8995
No log 20.4 408 0.8202 0.4241 0.8202 0.9057
No log 20.5 410 0.8284 0.4871 0.8284 0.9102
No log 20.6 412 0.8034 0.5159 0.8034 0.8963
No log 20.7 414 0.7906 0.4982 0.7906 0.8892
No log 20.8 416 0.8012 0.5142 0.8012 0.8951
No log 20.9 418 0.8399 0.5118 0.8399 0.9165
No log 21.0 420 0.8717 0.5208 0.8717 0.9337
No log 21.1 422 0.8370 0.4966 0.8370 0.9149
No log 21.2 424 0.7879 0.4439 0.7879 0.8876
No log 21.3 426 0.7851 0.3453 0.7851 0.8860
No log 21.4 428 0.7925 0.3596 0.7925 0.8902
No log 21.5 430 0.8052 0.4439 0.8052 0.8974
No log 21.6 432 0.8186 0.4666 0.8186 0.9048
No log 21.7 434 0.8045 0.4465 0.8045 0.8969
No log 21.8 436 0.7920 0.4724 0.7920 0.8900
No log 21.9 438 0.8141 0.4808 0.8141 0.9023
No log 22.0 440 0.8164 0.4934 0.8164 0.9035
No log 22.1 442 0.8334 0.4808 0.8334 0.9129
No log 22.2 444 0.8498 0.4681 0.8498 0.9218
No log 22.3 446 0.8683 0.4775 0.8683 0.9318
No log 22.4 448 0.9177 0.4743 0.9177 0.9580
No log 22.5 450 0.9556 0.4386 0.9556 0.9775
No log 22.6 452 0.9795 0.4771 0.9795 0.9897
No log 22.7 454 0.9329 0.4639 0.9329 0.9659
No log 22.8 456 0.8683 0.4252 0.8683 0.9318
No log 22.9 458 0.8573 0.4290 0.8573 0.9259
No log 23.0 460 0.9033 0.5238 0.9033 0.9504
No log 23.1 462 0.9518 0.4855 0.9518 0.9756
No log 23.2 464 0.9134 0.4855 0.9134 0.9557
No log 23.3 466 0.8727 0.5409 0.8727 0.9342
No log 23.4 468 0.8350 0.5022 0.8350 0.9138
No log 23.5 470 0.8136 0.4829 0.8136 0.9020
No log 23.6 472 0.8111 0.4685 0.8111 0.9006
No log 23.7 474 0.8095 0.4685 0.8095 0.8997
No log 23.8 476 0.8057 0.4601 0.8057 0.8976
No log 23.9 478 0.8009 0.4626 0.8009 0.8950
No log 24.0 480 0.8105 0.4401 0.8105 0.9003
No log 24.1 482 0.8746 0.4785 0.8746 0.9352
No log 24.2 484 0.9500 0.4973 0.9500 0.9747
No log 24.3 486 0.9660 0.5040 0.9660 0.9829
No log 24.4 488 0.9321 0.4808 0.9321 0.9654
No log 24.5 490 0.8675 0.5275 0.8675 0.9314
No log 24.6 492 0.8396 0.4906 0.8396 0.9163
No log 24.7 494 0.8551 0.4301 0.8551 0.9247
No log 24.8 496 0.8704 0.4247 0.8704 0.9329
No log 24.9 498 0.8493 0.3820 0.8493 0.9216
0.2917 25.0 500 0.8254 0.4256 0.8254 0.9085
0.2917 25.1 502 0.8356 0.4840 0.8356 0.9141
0.2917 25.2 504 0.8629 0.4553 0.8629 0.9289
0.2917 25.3 506 0.8803 0.4811 0.8803 0.9382
0.2917 25.4 508 0.8580 0.4714 0.8580 0.9263
0.2917 25.5 510 0.8331 0.4685 0.8331 0.9128
0.2917 25.6 512 0.8397 0.4196 0.8397 0.9164
0.2917 25.7 514 0.8557 0.3555 0.8557 0.9250
0.2917 25.8 516 0.8465 0.3925 0.8465 0.9200
0.2917 25.9 518 0.8453 0.4349 0.8453 0.9194
0.2917 26.0 520 0.8812 0.4202 0.8812 0.9387
0.2917 26.1 522 0.9569 0.4894 0.9569 0.9782
0.2917 26.2 524 1.0056 0.4722 1.0056 1.0028
0.2917 26.3 526 1.0254 0.4894 1.0254 1.0126
0.2917 26.4 528 0.9888 0.4339 0.9888 0.9944
0.2917 26.5 530 0.9109 0.4838 0.9109 0.9544
0.2917 26.6 532 0.8587 0.4657 0.8587 0.9266
0.2917 26.7 534 0.8382 0.4620 0.8382 0.9156
0.2917 26.8 536 0.8184 0.4217 0.8184 0.9046
0.2917 26.9 538 0.8152 0.4435 0.8152 0.9029
0.2917 27.0 540 0.8281 0.4937 0.8281 0.9100
0.2917 27.1 542 0.8540 0.4648 0.8540 0.9241
0.2917 27.2 544 0.8821 0.4513 0.8821 0.9392
0.2917 27.3 546 0.8611 0.4420 0.8611 0.9280
0.2917 27.4 548 0.8212 0.4240 0.8212 0.9062
0.2917 27.5 550 0.8056 0.4505 0.8056 0.8975
0.2917 27.6 552 0.8166 0.4681 0.8166 0.9037
0.2917 27.7 554 0.8614 0.4754 0.8614 0.9281
0.2917 27.8 556 0.9190 0.5135 0.9190 0.9586
0.2917 27.9 558 0.9166 0.5135 0.9166 0.9574
0.2917 28.0 560 0.8932 0.5135 0.8932 0.9451
0.2917 28.1 562 0.8551 0.4546 0.8551 0.9247
0.2917 28.2 564 0.8267 0.4240 0.8267 0.9092
0.2917 28.3 566 0.8272 0.4240 0.8272 0.9095
0.2917 28.4 568 0.8476 0.4787 0.8476 0.9207
0.2917 28.5 570 0.8806 0.4420 0.8806 0.9384
0.2917 28.6 572 0.9118 0.4420 0.9118 0.9549
0.2917 28.7 574 0.9198 0.4631 0.9198 0.9590
0.2917 28.8 576 0.8957 0.4420 0.8957 0.9464
0.2917 28.9 578 0.8450 0.4240 0.8450 0.9192
0.2917 29.0 580 0.8168 0.4042 0.8168 0.9038
0.2917 29.1 582 0.8103 0.4318 0.8103 0.9002
0.2917 29.2 584 0.8192 0.4277 0.8192 0.9051
0.2917 29.3 586 0.8445 0.4240 0.8445 0.9189
0.2917 29.4 588 0.8868 0.4239 0.8868 0.9417
0.2917 29.5 590 0.9601 0.4598 0.9601 0.9799
0.2917 29.6 592 1.0276 0.4426 1.0276 1.0137
0.2917 29.7 594 1.0381 0.4817 1.0381 1.0189
0.2917 29.8 596 0.9904 0.4663 0.9904 0.9952
0.2917 29.9 598 0.9115 0.4710 0.9115 0.9547
0.2917 30.0 600 0.8467 0.4239 0.8467 0.9202
0.2917 30.1 602 0.8109 0.3908 0.8109 0.9005
0.2917 30.2 604 0.7984 0.4181 0.7984 0.8935
0.2917 30.3 606 0.7958 0.4239 0.7958 0.8921
0.2917 30.4 608 0.8229 0.4459 0.8229 0.9071
0.2917 30.5 610 0.8925 0.4857 0.8925 0.9447
0.2917 30.6 612 0.9309 0.5224 0.9309 0.9648
0.2917 30.7 614 0.9133 0.5242 0.9133 0.9556
0.2917 30.8 616 0.8732 0.5124 0.8732 0.9345
0.2917 30.9 618 0.8395 0.4743 0.8395 0.9163
0.2917 31.0 620 0.8167 0.4239 0.8167 0.9037
0.2917 31.1 622 0.8045 0.4006 0.8045 0.8970
0.2917 31.2 624 0.8084 0.4006 0.8084 0.8991
0.2917 31.3 626 0.8100 0.4006 0.8100 0.9000
0.2917 31.4 628 0.8069 0.4006 0.8069 0.8983

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model tree for MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run1_AugV5_k6_task2_organization: fine-tuned from aubmindlab/bert-base-arabertv02.