ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k16_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7902
  • QWK (quadratic weighted kappa): 0.4966
  • MSE: 0.7902
  • RMSE: 0.8889
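These metrics are related: RMSE is the square root of MSE (√0.7902 ≈ 0.8889), and QWK is Cohen's kappa with quadratic weights, a standard metric for ordinal scoring tasks. A minimal NumPy sketch of both metrics (illustrative only, not the exact evaluation code used for this run):

```python
import numpy as np

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic penalty weights on label distance."""
    # Observed confusion matrix O
    O = np.zeros((n_classes, n_classes))
    for t, p in zip(y_true, y_pred):
        O[t, p] += 1
    # Quadratic weight matrix: penalty grows with squared label distance
    idx = np.arange(n_classes)
    W = (idx[:, None] - idx[None, :]) ** 2 / (n_classes - 1) ** 2
    # Expected matrix E from the marginals, scaled to the same total as O
    E = np.outer(O.sum(axis=1), O.sum(axis=0)) / O.sum()
    return 1.0 - (W * O).sum() / (W * E).sum()

def mse_rmse(y_true, y_pred):
    """Mean squared error and its square root."""
    mse = np.mean((np.asarray(y_true) - np.asarray(y_pred)) ** 2)
    return mse, np.sqrt(mse)
```

Perfect agreement gives QWK = 1, while predictions no better than chance give QWK ≈ 0.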

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
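With `lr_scheduler_type: linear`, the learning rate decays linearly from its initial value to zero over the total number of training steps (no warmup is listed for this run). A small sketch of that schedule, assuming the semantics of Hugging Face's linear scheduler:

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Linear warmup (if any), then linear decay to zero."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0.0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)
```

Halfway through training the learning rate is half the initial 2e-05; at the final step it reaches zero.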

Training results

Training Loss  Epoch  Step  Validation Loss  QWK  MSE  RMSE
No log 0.0408 2 4.7859 0.0010 4.7859 2.1877
No log 0.0816 4 2.9549 -0.0072 2.9549 1.7190
No log 0.1224 6 1.9347 -0.0164 1.9347 1.3909
No log 0.1633 8 1.7823 -0.1329 1.7823 1.3350
No log 0.2041 10 1.6453 -0.0122 1.6453 1.2827
No log 0.2449 12 1.2777 0.1142 1.2777 1.1303
No log 0.2857 14 1.2950 0.0952 1.2950 1.1380
No log 0.3265 16 1.3304 -0.0422 1.3304 1.1534
No log 0.3673 18 1.2623 0.1616 1.2623 1.1235
No log 0.4082 20 1.2307 0.1110 1.2307 1.1094
No log 0.4490 22 1.2188 0.1546 1.2188 1.1040
No log 0.4898 24 1.2123 0.1546 1.2123 1.1010
No log 0.5306 26 1.2320 0.1237 1.2320 1.1100
No log 0.5714 28 1.2583 0.0253 1.2583 1.1218
No log 0.6122 30 1.2215 0.0532 1.2215 1.1052
No log 0.6531 32 1.1892 0.1952 1.1892 1.0905
No log 0.6939 34 1.1723 0.1570 1.1723 1.0827
No log 0.7347 36 1.1850 0.1242 1.1850 1.0886
No log 0.7755 38 1.2015 0.1799 1.2015 1.0961
No log 0.8163 40 1.0913 0.3283 1.0913 1.0446
No log 0.8571 42 1.0309 0.3728 1.0309 1.0154
No log 0.8980 44 0.9757 0.4301 0.9757 0.9878
No log 0.9388 46 0.9525 0.4444 0.9525 0.9760
No log 0.9796 48 0.9572 0.4352 0.9572 0.9784
No log 1.0204 50 1.0734 0.3947 1.0734 1.0361
No log 1.0612 52 1.3763 0.2730 1.3763 1.1731
No log 1.1020 54 1.5994 0.1609 1.5994 1.2647
No log 1.1429 56 1.4050 0.2349 1.4050 1.1853
No log 1.1837 58 1.1199 0.3476 1.1199 1.0582
No log 1.2245 60 1.1388 0.3446 1.1388 1.0671
No log 1.2653 62 1.0378 0.3427 1.0378 1.0187
No log 1.3061 64 1.0569 0.3857 1.0569 1.0281
No log 1.3469 66 1.0355 0.4041 1.0355 1.0176
No log 1.3878 68 0.8937 0.4981 0.8937 0.9453
No log 1.4286 70 1.1610 0.2864 1.1610 1.0775
No log 1.4694 72 1.3151 0.1076 1.3151 1.1468
No log 1.5102 74 1.1909 0.1920 1.1909 1.0913
No log 1.5510 76 0.9159 0.4635 0.9159 0.9570
No log 1.5918 78 1.1898 0.4468 1.1898 1.0908
No log 1.6327 80 1.6402 0.2731 1.6402 1.2807
No log 1.6735 82 1.5951 0.3252 1.5951 1.2630
No log 1.7143 84 1.1226 0.4390 1.1226 1.0595
No log 1.7551 86 0.8591 0.5137 0.8591 0.9269
No log 1.7959 88 1.0077 0.3008 1.0077 1.0038
No log 1.8367 90 1.0771 0.2797 1.0771 1.0378
No log 1.8776 92 1.0023 0.3198 1.0023 1.0012
No log 1.9184 94 0.8326 0.4784 0.8326 0.9125
No log 1.9592 96 0.7491 0.4780 0.7491 0.8655
No log 2.0 98 0.9399 0.4444 0.9399 0.9695
No log 2.0408 100 1.0646 0.4950 1.0646 1.0318
No log 2.0816 102 0.9418 0.4742 0.9418 0.9704
No log 2.1224 104 0.8174 0.5211 0.8174 0.9041
No log 2.1633 106 0.7371 0.4726 0.7371 0.8586
No log 2.2041 108 0.7510 0.4737 0.7510 0.8666
No log 2.2449 110 0.8273 0.5213 0.8273 0.9095
No log 2.2857 112 0.9160 0.4300 0.9160 0.9571
No log 2.3265 114 0.9005 0.4666 0.9005 0.9490
No log 2.3673 116 0.8675 0.5352 0.8675 0.9314
No log 2.4082 118 0.8168 0.4757 0.8168 0.9038
No log 2.4490 120 0.8073 0.4898 0.8073 0.8985
No log 2.4898 122 0.8072 0.5443 0.8072 0.8985
No log 2.5306 124 0.8090 0.5327 0.8090 0.8994
No log 2.5714 126 0.8786 0.4646 0.8786 0.9374
No log 2.6122 128 0.9160 0.4886 0.9160 0.9571
No log 2.6531 130 0.8686 0.4754 0.8686 0.9320
No log 2.6939 132 0.9143 0.5109 0.9143 0.9562
No log 2.7347 134 1.1787 0.4112 1.1787 1.0857
No log 2.7755 136 1.2341 0.4010 1.2341 1.1109
No log 2.8163 138 1.0240 0.4673 1.0240 1.0119
No log 2.8571 140 0.8915 0.5361 0.8915 0.9442
No log 2.8980 142 1.1206 0.4457 1.1206 1.0586
No log 2.9388 144 1.1687 0.4256 1.1687 1.0810
No log 2.9796 146 0.9336 0.4713 0.9336 0.9662
No log 3.0204 148 0.8907 0.4181 0.8907 0.9437
No log 3.0612 150 1.0608 0.3647 1.0608 1.0300
No log 3.1020 152 1.0669 0.3876 1.0669 1.0329
No log 3.1429 154 0.9260 0.3868 0.9260 0.9623
No log 3.1837 156 0.8908 0.4662 0.8908 0.9438
No log 3.2245 158 0.8796 0.5253 0.8796 0.9379
No log 3.2653 160 0.9018 0.5047 0.9018 0.9496
No log 3.3061 162 1.0018 0.5704 1.0018 1.0009
No log 3.3469 164 1.2112 0.4198 1.2112 1.1005
No log 3.3878 166 1.4598 0.375 1.4598 1.2082
No log 3.4286 168 1.4031 0.4231 1.4031 1.1845
No log 3.4694 170 1.1561 0.4025 1.1561 1.0752
No log 3.5102 172 0.9847 0.2926 0.9847 0.9923
No log 3.5510 174 0.9500 0.2865 0.9500 0.9747
No log 3.5918 176 1.0535 0.3141 1.0535 1.0264
No log 3.6327 178 1.1339 0.4377 1.1339 1.0648
No log 3.6735 180 1.0552 0.3945 1.0552 1.0272
No log 3.7143 182 0.9270 0.4325 0.9270 0.9628
No log 3.7551 184 0.8865 0.4430 0.8865 0.9415
No log 3.7959 186 0.8692 0.4794 0.8692 0.9323
No log 3.8367 188 0.9220 0.4962 0.9220 0.9602
No log 3.8776 190 0.8555 0.5304 0.8555 0.9249
No log 3.9184 192 0.8548 0.4575 0.8548 0.9246
No log 3.9592 194 0.9032 0.3590 0.9032 0.9504
No log 4.0 196 0.8738 0.3700 0.8738 0.9348
No log 4.0408 198 0.8832 0.5209 0.8832 0.9398
No log 4.0816 200 1.0321 0.4685 1.0321 1.0159
No log 4.1224 202 0.9957 0.4807 0.9957 0.9978
No log 4.1633 204 0.8314 0.4617 0.8314 0.9118
No log 4.2041 206 0.9824 0.4372 0.9824 0.9911
No log 4.2449 208 1.0551 0.5055 1.0551 1.0272
No log 4.2857 210 0.9185 0.4167 0.9185 0.9584
No log 4.3265 212 0.8467 0.4337 0.8467 0.9202
No log 4.3673 214 0.8293 0.4476 0.8293 0.9106
No log 4.4082 216 0.8405 0.4340 0.8405 0.9168
No log 4.4490 218 0.9199 0.4382 0.9199 0.9591
No log 4.4898 220 1.0599 0.5080 1.0599 1.0295
No log 4.5306 222 1.2284 0.4673 1.2284 1.1083
No log 4.5714 224 1.1790 0.4604 1.1790 1.0858
No log 4.6122 226 1.0357 0.5280 1.0357 1.0177
No log 4.6531 228 0.8749 0.5220 0.8749 0.9354
No log 4.6939 230 0.8679 0.5440 0.8679 0.9316
No log 4.7347 232 0.9046 0.5068 0.9046 0.9511
No log 4.7755 234 1.0636 0.5384 1.0636 1.0313
No log 4.8163 236 1.0917 0.4708 1.0917 1.0448
No log 4.8571 238 1.0086 0.5250 1.0086 1.0043
No log 4.8980 240 0.9716 0.4501 0.9716 0.9857
No log 4.9388 242 0.8772 0.4595 0.8772 0.9366
No log 4.9796 244 0.8816 0.4459 0.8816 0.9389
No log 5.0204 246 1.0162 0.4258 1.0162 1.0081
No log 5.0612 248 1.2298 0.4594 1.2298 1.1090
No log 5.1020 250 1.2593 0.3995 1.2593 1.1222
No log 5.1429 252 1.0743 0.4132 1.0743 1.0365
No log 5.1837 254 0.9801 0.4527 0.9801 0.9900
No log 5.2245 256 1.0137 0.4048 1.0137 1.0068
No log 5.2653 258 0.9636 0.4515 0.9636 0.9816
No log 5.3061 260 0.8763 0.5365 0.8763 0.9361
No log 5.3469 262 0.9160 0.4252 0.9160 0.9571
No log 5.3878 264 1.1184 0.4473 1.1184 1.0575
No log 5.4286 266 1.1940 0.4723 1.1940 1.0927
No log 5.4694 268 1.0451 0.4693 1.0451 1.0223
No log 5.5102 270 0.8682 0.4657 0.8682 0.9318
No log 5.5510 272 0.7496 0.4316 0.7496 0.8658
No log 5.5918 274 0.7500 0.4450 0.7500 0.8660
No log 5.6327 276 0.8157 0.5124 0.8157 0.9032
No log 5.6735 278 1.0357 0.5159 1.0357 1.0177
No log 5.7143 280 1.1856 0.4456 1.1856 1.0889
No log 5.7551 282 1.1985 0.4387 1.1985 1.0948
No log 5.7959 284 1.0461 0.5042 1.0461 1.0228
No log 5.8367 286 0.8647 0.5709 0.8647 0.9299
No log 5.8776 288 0.8207 0.4994 0.8207 0.9059
No log 5.9184 290 0.8402 0.5271 0.8402 0.9166
No log 5.9592 292 0.9101 0.5013 0.9101 0.9540
No log 6.0 294 1.0823 0.4806 1.0823 1.0403
No log 6.0408 296 1.2186 0.4468 1.2186 1.1039
No log 6.0816 298 1.1956 0.4749 1.1956 1.0934
No log 6.1224 300 1.0430 0.4689 1.0430 1.0213
No log 6.1633 302 0.8577 0.5248 0.8577 0.9261
No log 6.2041 304 0.7989 0.5022 0.7989 0.8938
No log 6.2449 306 0.8148 0.4694 0.8148 0.9027
No log 6.2857 308 0.8037 0.4607 0.8037 0.8965
No log 6.3265 310 0.8134 0.5149 0.8134 0.9019
No log 6.3673 312 0.8968 0.5517 0.8968 0.9470
No log 6.4082 314 1.0036 0.4428 1.0036 1.0018
No log 6.4490 316 0.9702 0.5216 0.9702 0.9850
No log 6.4898 318 0.8988 0.4937 0.8988 0.9481
No log 6.5306 320 0.8801 0.4615 0.8801 0.9381
No log 6.5714 322 0.9125 0.3897 0.9125 0.9553
No log 6.6122 324 0.8945 0.4054 0.8945 0.9458
No log 6.6531 326 0.8747 0.4571 0.8747 0.9353
No log 6.6939 328 0.8911 0.5080 0.8911 0.9440
No log 6.7347 330 0.9000 0.5201 0.9000 0.9487
No log 6.7755 332 0.8464 0.4948 0.8464 0.9200
No log 6.8163 334 0.8097 0.5300 0.8097 0.8998
No log 6.8571 336 0.7705 0.5265 0.7705 0.8778
No log 6.8980 338 0.7757 0.5702 0.7757 0.8807
No log 6.9388 340 0.8556 0.5553 0.8556 0.9250
No log 6.9796 342 0.9445 0.5255 0.9445 0.9719
No log 7.0204 344 0.8750 0.5419 0.8750 0.9354
No log 7.0612 346 0.8146 0.5439 0.8146 0.9026
No log 7.1020 348 0.7834 0.4754 0.7834 0.8851
No log 7.1429 350 0.7395 0.4203 0.7395 0.8599
No log 7.1837 352 0.7290 0.4584 0.7290 0.8538
No log 7.2245 354 0.7629 0.4743 0.7629 0.8734
No log 7.2653 356 0.7843 0.4743 0.7843 0.8856
No log 7.3061 358 0.7539 0.5279 0.7539 0.8683
No log 7.3469 360 0.7461 0.4916 0.7461 0.8638
No log 7.3878 362 0.7581 0.5180 0.7581 0.8707
No log 7.4286 364 0.7549 0.5607 0.7549 0.8689
No log 7.4694 366 0.7695 0.4789 0.7695 0.8772
No log 7.5102 368 0.7815 0.4420 0.7815 0.8840
No log 7.5510 370 0.8015 0.5311 0.8015 0.8953
No log 7.5918 372 0.7921 0.5279 0.7921 0.8900
No log 7.6327 374 0.8435 0.5090 0.8435 0.9184
No log 7.6735 376 0.8333 0.5073 0.8333 0.9129
No log 7.7143 378 0.8007 0.4694 0.8007 0.8948
No log 7.7551 380 0.8240 0.5236 0.8240 0.9077
No log 7.7959 382 0.8177 0.5077 0.8177 0.9043
No log 7.8367 384 0.8133 0.5077 0.8133 0.9018
No log 7.8776 386 0.7971 0.3961 0.7971 0.8928
No log 7.9184 388 0.7732 0.4102 0.7732 0.8793
No log 7.9592 390 0.7466 0.5274 0.7466 0.8641
No log 8.0 392 0.7405 0.5701 0.7405 0.8605
No log 8.0408 394 0.7476 0.5884 0.7476 0.8646
No log 8.0816 396 0.7539 0.5688 0.7539 0.8683
No log 8.1224 398 0.7900 0.5470 0.7900 0.8888
No log 8.1633 400 0.7704 0.5898 0.7704 0.8777
No log 8.2041 402 0.7449 0.5660 0.7449 0.8631
No log 8.2449 404 0.7388 0.6348 0.7388 0.8596
No log 8.2857 406 0.7371 0.5711 0.7371 0.8586
No log 8.3265 408 0.7901 0.5521 0.7901 0.8889
No log 8.3673 410 0.8682 0.4546 0.8682 0.9318
No log 8.4082 412 0.9312 0.5118 0.9312 0.9650
No log 8.4490 414 0.9514 0.5190 0.9514 0.9754
No log 8.4898 416 0.8941 0.5306 0.8941 0.9456
No log 8.5306 418 0.7981 0.4968 0.7981 0.8933
No log 8.5714 420 0.7344 0.5501 0.7344 0.8570
No log 8.6122 422 0.7316 0.5027 0.7316 0.8553
No log 8.6531 424 0.7633 0.5230 0.7633 0.8737
No log 8.6939 426 0.8341 0.5044 0.8341 0.9133
No log 8.7347 428 0.8204 0.5044 0.8204 0.9058
No log 8.7755 430 0.7732 0.5451 0.7732 0.8793
No log 8.8163 432 0.7630 0.5455 0.7630 0.8735
No log 8.8571 434 0.7692 0.5308 0.7692 0.8770
No log 8.8980 436 0.7769 0.4611 0.7769 0.8814
No log 8.9388 438 0.8025 0.4203 0.8025 0.8958
No log 8.9796 440 0.8292 0.4304 0.8292 0.9106
No log 9.0204 442 0.8495 0.3655 0.8495 0.9217
No log 9.0612 444 0.8716 0.3354 0.8716 0.9336
No log 9.1020 446 0.8692 0.3505 0.8692 0.9323
No log 9.1429 448 0.8341 0.3493 0.8341 0.9133
No log 9.1837 450 0.7911 0.5215 0.7911 0.8894
No log 9.2245 452 0.7897 0.5662 0.7897 0.8886
No log 9.2653 454 0.8375 0.4690 0.8375 0.9151
No log 9.3061 456 0.9039 0.4710 0.9039 0.9507
No log 9.3469 458 0.8763 0.4631 0.8763 0.9361
No log 9.3878 460 0.8041 0.4908 0.8041 0.8967
No log 9.4286 462 0.7845 0.4439 0.7845 0.8857
No log 9.4694 464 0.7903 0.5057 0.7903 0.8890
No log 9.5102 466 0.7836 0.5536 0.7836 0.8852
No log 9.5510 468 0.8641 0.4722 0.8641 0.9296
No log 9.5918 470 0.9556 0.4742 0.9556 0.9775
No log 9.6327 472 0.9048 0.4685 0.9048 0.9512
No log 9.6735 474 0.8014 0.5898 0.8014 0.8952
No log 9.7143 476 0.7666 0.5348 0.7666 0.8755
No log 9.7551 478 0.7755 0.5348 0.7755 0.8806
No log 9.7959 480 0.7744 0.5150 0.7744 0.8800
No log 9.8367 482 0.7962 0.5274 0.7962 0.8923
No log 9.8776 484 0.8650 0.4983 0.8650 0.9301
No log 9.9184 486 0.9293 0.4133 0.9293 0.9640
No log 9.9592 488 0.9160 0.4345 0.9160 0.9571
No log 10.0 490 0.8582 0.5015 0.8582 0.9264
No log 10.0408 492 0.7787 0.5841 0.7787 0.8825
No log 10.0816 494 0.7480 0.5724 0.7480 0.8649
No log 10.1224 496 0.7464 0.5872 0.7464 0.8640
No log 10.1633 498 0.7818 0.6209 0.7818 0.8842
0.3638 10.2041 500 0.9081 0.5015 0.9081 0.9530
0.3638 10.2449 502 0.9776 0.4697 0.9776 0.9887
0.3638 10.2857 504 1.0347 0.4215 1.0347 1.0172
0.3638 10.3265 506 0.9987 0.4590 0.9987 0.9993
0.3638 10.3673 508 0.8767 0.4458 0.8767 0.9363
0.3638 10.4082 510 0.7902 0.4966 0.7902 0.8889
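The Step and Epoch columns above also imply the training-set size: step 490 corresponds to epoch 10.0, i.e. 49 optimizer steps per epoch, which at train_batch_size = 8 suggests roughly 49 × 8 = 392 training examples (an estimate; the last batch of an epoch may be partial):

```python
# Derived from the log table: step 490 <-> epoch 10.0
steps_per_epoch = 490 // 10           # 49 optimizer steps per epoch
train_batch_size = 8                  # from the hyperparameters above
approx_train_examples = steps_per_epoch * train_batch_size
print(steps_per_epoch, approx_train_examples)
```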

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model details

  • Model size: 0.1B params
  • Tensor type: F32 (Safetensors)
Model tree for MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k16_task2_organization

Finetuned
(4019)
this model