ArabicNewSplits6_FineTuningAraBERT_run2_AugV5_k9_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.1386
  • Qwk (quadratic weighted kappa): 0.6022
  • Mse (mean squared error): 1.1386
  • Rmse (root mean squared error): 1.0670
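The Qwk reported above is Cohen's kappa with quadratic weights, and Rmse is simply the square root of Mse. A minimal pure-Python sketch of how these metrics are computed (the label values and class count below are illustrative; `sklearn.metrics.cohen_kappa_score(..., weights="quadratic")` yields the same kappa):

```python
import math
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred, num_classes):
    """Cohen's kappa with quadratic disagreement weights."""
    n = len(y_true)
    # Observed confusion matrix.
    observed = [[0] * num_classes for _ in range(num_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    hist_true = Counter(y_true)
    hist_pred = Counter(y_pred)
    num = den = 0.0
    for i in range(num_classes):
        for j in range(num_classes):
            w = (i - j) ** 2 / (num_classes - 1) ** 2   # quadratic penalty
            expected = hist_true[i] * hist_pred[j] / n  # chance agreement
            num += w * observed[i][j]
            den += w * expected
    return 1.0 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# Illustrative scores: one off-by-one error out of five predictions.
labels = [0, 1, 2, 3, 1]
preds = [0, 1, 2, 2, 1]
print(quadratic_weighted_kappa(labels, preds, num_classes=4))
print(mse(labels, preds), math.sqrt(mse(labels, preds)))  # Rmse = sqrt(Mse)
```

Perfect agreement gives a kappa of exactly 1.0; quadratic weighting penalizes large ordinal errors more than near misses, which is why it is the standard metric for essay-trait scoring tasks like this one.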

Model description

More information needed

Intended uses & limitations

More information needed
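Since this section is empty, here is a hedged loading sketch. The repo id comes from this card's Hub page; the Qwk/Mse metrics suggest ordinal trait scoring, but whether the head is a single regression logit or a softmax over score classes is not documented, so the sketch checks `num_labels` at runtime. The Arabic input string is a placeholder, not from the training data.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Repo id as listed on the Hub page for this card.
repo = "MayBashendy/ArabicNewSplits6_FineTuningAraBERT_run2_AugV5_k9_task5_organization"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)

# Placeholder Arabic text ("Arabic text for evaluation").
inputs = tokenizer("نص عربي للتقييم", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Assumption: a single logit means a regression head (round to a score);
# multiple logits mean a classification head (take the argmax).
if model.config.num_labels == 1:
    score = logits.squeeze().round().item()
else:
    score = logits.argmax(dim=-1).item()
print(score)
```

This requires network access to the Hugging Face Hub on first run; the checkpoint is then cached locally.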

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
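The hyperparameters above map directly onto `transformers.TrainingArguments`. A sketch of the equivalent configuration (the `output_dir` is a hypothetical path; the Adam betas/epsilon and linear scheduler are the Trainer defaults, written out explicitly to match the list above):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_task5_organization",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=10,
    lr_scheduler_type="linear",  # linear decay to 0 over training
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```

Note that `per_device_train_batch_size=8` equals the listed `train_batch_size` only on a single GPU; with gradient accumulation or multiple devices the effective batch size would differ.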

Training results

In the table below, "No log" means the training loss was never recorded: the Trainer's default logging interval (logging_steps=500) exceeds the 340 total training steps. Evaluation ran every 2 steps, with 34 steps per epoch.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0588 2 2.4082 0.0137 2.4082 1.5518
No log 0.1176 4 1.5267 0.2087 1.5267 1.2356
No log 0.1765 6 1.4284 0.1561 1.4284 1.1952
No log 0.2353 8 1.5042 0.1543 1.5042 1.2265
No log 0.2941 10 1.4282 0.1502 1.4282 1.1951
No log 0.3529 12 1.4061 0.1496 1.4061 1.1858
No log 0.4118 14 1.4265 0.1569 1.4265 1.1944
No log 0.4706 16 1.4242 0.1292 1.4242 1.1934
No log 0.5294 18 1.4583 0.1718 1.4583 1.2076
No log 0.5882 20 1.4602 0.2417 1.4602 1.2084
No log 0.6471 22 1.4220 0.2087 1.4220 1.1925
No log 0.7059 24 1.3436 0.1601 1.3436 1.1592
No log 0.7647 26 1.2805 0.2653 1.2805 1.1316
No log 0.8235 28 1.2598 0.3254 1.2598 1.1224
No log 0.8824 30 1.2121 0.3002 1.2121 1.1009
No log 0.9412 32 1.2386 0.3076 1.2386 1.1129
No log 1.0 34 1.3241 0.2867 1.3241 1.1507
No log 1.0588 36 1.3406 0.3734 1.3406 1.1578
No log 1.1176 38 1.2769 0.3101 1.2769 1.1300
No log 1.1765 40 1.1742 0.3199 1.1742 1.0836
No log 1.2353 42 1.1208 0.4144 1.1208 1.0587
No log 1.2941 44 1.1052 0.4045 1.1052 1.0513
No log 1.3529 46 1.0800 0.4724 1.0800 1.0392
No log 1.4118 48 1.0649 0.4352 1.0649 1.0319
No log 1.4706 50 1.1554 0.4681 1.1554 1.0749
No log 1.5294 52 1.2664 0.4516 1.2664 1.1254
No log 1.5882 54 1.2980 0.4463 1.2980 1.1393
No log 1.6471 56 1.2564 0.4610 1.2564 1.1209
No log 1.7059 58 1.2773 0.4637 1.2773 1.1302
No log 1.7647 60 1.2680 0.4637 1.2680 1.1260
No log 1.8235 62 1.2251 0.4610 1.2251 1.1068
No log 1.8824 64 1.2144 0.4701 1.2144 1.1020
No log 1.9412 66 1.0750 0.4772 1.0750 1.0368
No log 2.0 68 0.9633 0.5276 0.9633 0.9815
No log 2.0588 70 0.9482 0.5050 0.9482 0.9738
No log 2.1176 72 0.9558 0.5156 0.9558 0.9776
No log 2.1765 74 1.0899 0.4909 1.0899 1.0440
No log 2.2353 76 1.2871 0.4657 1.2871 1.1345
No log 2.2941 78 1.2839 0.4709 1.2839 1.1331
No log 2.3529 80 1.2378 0.4756 1.2378 1.1126
No log 2.4118 82 1.3922 0.4579 1.3922 1.1799
No log 2.4706 84 1.7948 0.3792 1.7948 1.3397
No log 2.5294 86 2.0900 0.3795 2.0900 1.4457
No log 2.5882 88 2.0156 0.3773 2.0156 1.4197
No log 2.6471 90 1.6931 0.4214 1.6931 1.3012
No log 2.7059 92 1.2058 0.4795 1.2058 1.0981
No log 2.7647 94 0.9995 0.5092 0.9995 0.9997
No log 2.8235 96 0.9513 0.4934 0.9513 0.9753
No log 2.8824 98 0.9682 0.4769 0.9682 0.9840
No log 2.9412 100 0.9761 0.4746 0.9761 0.9880
No log 3.0 102 1.0283 0.5216 1.0283 1.0140
No log 3.0588 104 1.1557 0.5281 1.1557 1.0750
No log 3.1176 106 1.3142 0.4919 1.3142 1.1464
No log 3.1765 108 1.2654 0.4715 1.2654 1.1249
No log 3.2353 110 1.1518 0.5530 1.1518 1.0732
No log 3.2941 112 1.0986 0.5428 1.0986 1.0481
No log 3.3529 114 1.2073 0.5161 1.2073 1.0988
No log 3.4118 116 1.4199 0.4746 1.4199 1.1916
No log 3.4706 118 1.6298 0.4818 1.6298 1.2766
No log 3.5294 120 1.7701 0.4837 1.7701 1.3304
No log 3.5882 122 1.7576 0.4937 1.7576 1.3257
No log 3.6471 124 1.5723 0.5030 1.5723 1.2539
No log 3.7059 126 1.5035 0.5001 1.5035 1.2262
No log 3.7647 128 1.4008 0.5388 1.4008 1.1836
No log 3.8235 130 1.3282 0.5233 1.3282 1.1525
No log 3.8824 132 1.3259 0.5233 1.3259 1.1515
No log 3.9412 134 1.2110 0.5373 1.2110 1.1005
No log 4.0 136 1.0928 0.5616 1.0928 1.0454
No log 4.0588 138 1.1813 0.5851 1.1813 1.0869
No log 4.1176 140 1.4262 0.5139 1.4262 1.1942
No log 4.1765 142 1.4548 0.5081 1.4548 1.2061
No log 4.2353 144 1.2314 0.5426 1.2314 1.1097
No log 4.2941 146 1.0339 0.5642 1.0339 1.0168
No log 4.3529 148 0.9523 0.6084 0.9523 0.9759
No log 4.4118 150 0.9785 0.5899 0.9785 0.9892
No log 4.4706 152 1.0848 0.5625 1.0848 1.0415
No log 4.5294 154 1.2963 0.5087 1.2963 1.1385
No log 4.5882 156 1.3580 0.5170 1.3580 1.1653
No log 4.6471 158 1.2892 0.5604 1.2892 1.1354
No log 4.7059 160 1.3007 0.5453 1.3007 1.1405
No log 4.7647 162 1.3379 0.5498 1.3379 1.1567
No log 4.8235 164 1.3989 0.5488 1.3989 1.1828
No log 4.8824 166 1.3227 0.5492 1.3227 1.1501
No log 4.9412 168 1.3111 0.5303 1.3111 1.1450
No log 5.0 170 1.2436 0.5767 1.2436 1.1152
No log 5.0588 172 1.1345 0.5821 1.1345 1.0651
No log 5.1176 174 1.1004 0.5821 1.1004 1.0490
No log 5.1765 176 1.1026 0.5956 1.1026 1.0501
No log 5.2353 178 0.9784 0.6204 0.9784 0.9892
No log 5.2941 180 0.8543 0.6284 0.8543 0.9243
No log 5.3529 182 0.8392 0.6475 0.8392 0.9161
No log 5.4118 184 0.9311 0.6280 0.9311 0.9650
No log 5.4706 186 1.1229 0.6132 1.1229 1.0597
No log 5.5294 188 1.3939 0.5539 1.3939 1.1806
No log 5.5882 190 1.4707 0.5508 1.4707 1.2127
No log 5.6471 192 1.3937 0.5635 1.3937 1.1805
No log 5.7059 194 1.1930 0.5737 1.1930 1.0923
No log 5.7647 196 0.9436 0.6207 0.9436 0.9714
No log 5.8235 198 0.8547 0.6274 0.8547 0.9245
No log 5.8824 200 0.8696 0.6341 0.8696 0.9325
No log 5.9412 202 0.9684 0.6179 0.9684 0.9841
No log 6.0 204 1.1713 0.5973 1.1713 1.0823
No log 6.0588 206 1.3164 0.5699 1.3164 1.1474
No log 6.1176 208 1.3399 0.5786 1.3399 1.1575
No log 6.1765 210 1.2807 0.5611 1.2807 1.1317
No log 6.2353 212 1.1840 0.5816 1.1840 1.0881
No log 6.2941 214 1.1113 0.6020 1.1113 1.0542
No log 6.3529 216 1.1144 0.5958 1.1144 1.0557
No log 6.4118 218 1.1905 0.5797 1.1905 1.0911
No log 6.4706 220 1.3243 0.5488 1.3243 1.1508
No log 6.5294 222 1.3824 0.5488 1.3824 1.1758
No log 6.5882 224 1.3412 0.5394 1.3412 1.1581
No log 6.6471 226 1.2635 0.5530 1.2635 1.1240
No log 6.7059 228 1.2071 0.5673 1.2071 1.0987
No log 6.7647 230 1.1690 0.5417 1.1690 1.0812
No log 6.8235 232 1.0885 0.5621 1.0885 1.0433
No log 6.8824 234 1.0152 0.5851 1.0152 1.0076
No log 6.9412 236 0.9884 0.5844 0.9884 0.9942
No log 7.0 238 0.9822 0.5971 0.9822 0.9910
No log 7.0588 240 0.9837 0.5969 0.9837 0.9918
No log 7.1176 242 0.9940 0.6150 0.9940 0.9970
No log 7.1765 244 1.0378 0.6126 1.0378 1.0187
No log 7.2353 246 1.0782 0.5702 1.0782 1.0384
No log 7.2941 248 1.1093 0.5769 1.1093 1.0532
No log 7.3529 250 1.1320 0.5790 1.1320 1.0640
No log 7.4118 252 1.1638 0.5683 1.1638 1.0788
No log 7.4706 254 1.1794 0.5726 1.1794 1.0860
No log 7.5294 256 1.2229 0.5644 1.2229 1.1059
No log 7.5882 258 1.2998 0.5554 1.2998 1.1401
No log 7.6471 260 1.3249 0.5679 1.3249 1.1511
No log 7.7059 262 1.2980 0.5767 1.2980 1.1393
No log 7.7647 264 1.2456 0.5635 1.2456 1.1161
No log 7.8235 266 1.1806 0.5767 1.1806 1.0865
No log 7.8824 268 1.1240 0.6032 1.1240 1.0602
No log 7.9412 270 1.0822 0.6119 1.0822 1.0403
No log 8.0 272 1.0718 0.6135 1.0718 1.0353
No log 8.0588 274 1.0575 0.6095 1.0575 1.0283
No log 8.1176 276 1.0755 0.6103 1.0755 1.0371
No log 8.1765 278 1.0740 0.6103 1.0740 1.0364
No log 8.2353 280 1.0682 0.6103 1.0682 1.0335
No log 8.2941 282 1.0506 0.6095 1.0506 1.0250
No log 8.3529 284 1.0649 0.6103 1.0649 1.0320
No log 8.4118 286 1.0734 0.6103 1.0734 1.0361
No log 8.4706 288 1.0832 0.6103 1.0832 1.0408
No log 8.5294 290 1.1030 0.5988 1.1030 1.0503
No log 8.5882 292 1.1128 0.5988 1.1128 1.0549
No log 8.6471 294 1.1078 0.6074 1.1078 1.0525
No log 8.7059 296 1.1114 0.6103 1.1114 1.0542
No log 8.7647 298 1.1360 0.6165 1.1360 1.0658
No log 8.8235 300 1.1666 0.6085 1.1666 1.0801
No log 8.8824 302 1.1838 0.6051 1.1838 1.0880
No log 8.9412 304 1.2165 0.5824 1.2165 1.1029
No log 9.0 306 1.2504 0.5776 1.2504 1.1182
No log 9.0588 308 1.2643 0.5766 1.2643 1.1244
No log 9.1176 310 1.2566 0.5766 1.2566 1.1210
No log 9.1765 312 1.2300 0.5776 1.2300 1.1091
No log 9.2353 314 1.1944 0.5903 1.1944 1.0929
No log 9.2941 316 1.1582 0.6170 1.1582 1.0762
No log 9.3529 318 1.1459 0.6109 1.1459 1.0705
No log 9.4118 320 1.1369 0.6022 1.1369 1.0663
No log 9.4706 322 1.1283 0.6022 1.1283 1.0622
No log 9.5294 324 1.1194 0.6022 1.1194 1.0580
No log 9.5882 326 1.1235 0.6022 1.1235 1.0599
No log 9.6471 328 1.1280 0.6022 1.1280 1.0621
No log 9.7059 330 1.1292 0.6022 1.1292 1.0626
No log 9.7647 332 1.1305 0.6022 1.1305 1.0632
No log 9.8235 334 1.1312 0.6022 1.1312 1.0636
No log 9.8824 336 1.1337 0.6022 1.1337 1.0647
No log 9.9412 338 1.1368 0.6022 1.1368 1.0662
No log 10.0 340 1.1386 0.6022 1.1386 1.0670

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Safetensors model size: 0.1B params (F32)

Model tree: MayBashendy/ArabicNewSplits6_FineTuningAraBERT_run2_AugV5_k9_task5_organization, fine-tuned from aubmindlab/bert-base-arabertv02.