ArabicNewSplits7_FineTuningAraBERT_noAug_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a minimal loading sketch follows the list):

  • Loss: 0.6749
  • Qwk (quadratic weighted kappa): 0.7448
  • Mse: 0.6749
  • Rmse: 0.8215
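
The card provides no usage example. Below is a minimal loading sketch, assuming the checkpoint exposes a standard sequence-classification/regression head (consistent with the MSE/RMSE metrics above); the repository id is taken from the model listing, but the head type and input format are assumptions, not documented behavior.

```python
# Minimal loading sketch (assumptions: the checkpoint carries a standard
# sequence-classification/regression head, and inputs are single Arabic
# texts to be scored for organization quality).
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "MayBashendy/ArabicNewSplits7_FineTuningAraBERT_noAug_task1_organization"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

text = "..."  # an Arabic essay or paragraph to score
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
with torch.no_grad():
    logits = model(**inputs).logits  # raw head output; shape (1, num_labels)
print(logits)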

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
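
These values map directly onto Transformers' TrainingArguments. The sketch below is illustrative: output_dir is an assumed name, and the dataset/Trainer wiring is omitted because the card does not describe the training data. The card lists "Adam" with these betas and epsilon, which matches the Trainer's default AdamW settings.

```python
# Hedged sketch: the hyperparameters above expressed as TrainingArguments.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert-task1-organization",  # assumed name, not from the card
    learning_rate=2e-05,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```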

Training results

| Training Loss | Epoch | Step | Validation Loss | Qwk | Mse | Rmse |
|:---|:---|:---|:---|:---|:---|:---|
| No log | 0.5 | 2 | 5.7283 | 0.0 | 5.7283 | 2.3934 |
| No log | 1.0 | 4 | 4.4212 | 0.0899 | 4.4212 | 2.1027 |
| No log | 1.5 | 6 | 2.8327 | 0.1091 | 2.8327 | 1.6831 |
| No log | 2.0 | 8 | 2.0037 | 0.1818 | 2.0037 | 1.4155 |
| No log | 2.5 | 10 | 1.5763 | 0.1143 | 1.5763 | 1.2555 |
| No log | 3.0 | 12 | 1.7950 | 0.1345 | 1.7950 | 1.3398 |
| No log | 3.5 | 14 | 1.7567 | 0.2439 | 1.7567 | 1.3254 |
| No log | 4.0 | 16 | 1.3368 | 0.4640 | 1.3368 | 1.1562 |
| No log | 4.5 | 18 | 1.1361 | 0.5625 | 1.1361 | 1.0659 |
| No log | 5.0 | 20 | 1.0035 | 0.6 | 1.0035 | 1.0018 |
| No log | 5.5 | 22 | 0.7957 | 0.6466 | 0.7957 | 0.8920 |
| No log | 6.0 | 24 | 0.9489 | 0.6803 | 0.9489 | 0.9741 |
| No log | 6.5 | 26 | 0.9905 | 0.6711 | 0.9905 | 0.9952 |
| No log | 7.0 | 28 | 0.7061 | 0.7215 | 0.7061 | 0.8403 |
| No log | 7.5 | 30 | 0.7623 | 0.7089 | 0.7623 | 0.8731 |
| No log | 8.0 | 32 | 0.7881 | 0.675 | 0.7881 | 0.8877 |
| No log | 8.5 | 34 | 0.7602 | 0.7286 | 0.7602 | 0.8719 |
| No log | 9.0 | 36 | 0.7555 | 0.7324 | 0.7555 | 0.8692 |
| No log | 9.5 | 38 | 0.7334 | 0.7671 | 0.7334 | 0.8564 |
| No log | 10.0 | 40 | 0.8464 | 0.6901 | 0.8464 | 0.9200 |
| No log | 10.5 | 42 | 0.7635 | 0.7093 | 0.7635 | 0.8738 |
| No log | 11.0 | 44 | 0.6611 | 0.7875 | 0.6611 | 0.8131 |
| No log | 11.5 | 46 | 0.6986 | 0.7531 | 0.6986 | 0.8358 |
| No log | 12.0 | 48 | 0.6103 | 0.8090 | 0.6103 | 0.7812 |
| No log | 12.5 | 50 | 0.9149 | 0.7374 | 0.9149 | 0.9565 |
| No log | 13.0 | 52 | 0.8405 | 0.7423 | 0.8405 | 0.9168 |
| No log | 13.5 | 54 | 0.6384 | 0.7771 | 0.6384 | 0.7990 |
| No log | 14.0 | 56 | 0.6231 | 0.7712 | 0.6231 | 0.7894 |
| No log | 14.5 | 58 | 0.7146 | 0.7568 | 0.7146 | 0.8453 |
| No log | 15.0 | 60 | 0.8190 | 0.6849 | 0.8190 | 0.9050 |
| No log | 15.5 | 62 | 0.7697 | 0.7222 | 0.7697 | 0.8773 |
| No log | 16.0 | 64 | 0.7883 | 0.7260 | 0.7883 | 0.8879 |
| No log | 16.5 | 66 | 0.7168 | 0.6993 | 0.7168 | 0.8467 |
| No log | 17.0 | 68 | 0.6648 | 0.7467 | 0.6648 | 0.8153 |
| No log | 17.5 | 70 | 0.6308 | 0.7550 | 0.6308 | 0.7943 |
| No log | 18.0 | 72 | 0.6351 | 0.775 | 0.6351 | 0.7969 |
| No log | 18.5 | 74 | 0.6116 | 0.7799 | 0.6116 | 0.7821 |
| No log | 19.0 | 76 | 0.5927 | 0.7564 | 0.5927 | 0.7699 |
| No log | 19.5 | 78 | 0.6433 | 0.7799 | 0.6433 | 0.8020 |
| No log | 20.0 | 80 | 0.6251 | 0.7613 | 0.6251 | 0.7907 |
| No log | 20.5 | 82 | 0.6753 | 0.7901 | 0.6753 | 0.8218 |
| No log | 21.0 | 84 | 0.6971 | 0.7901 | 0.6971 | 0.8349 |
| No log | 21.5 | 86 | 0.5967 | 0.7632 | 0.5967 | 0.7725 |
| No log | 22.0 | 88 | 0.6512 | 0.7595 | 0.6512 | 0.8069 |
| No log | 22.5 | 90 | 0.8192 | 0.6792 | 0.8192 | 0.9051 |
| No log | 23.0 | 92 | 0.7403 | 0.7215 | 0.7403 | 0.8604 |
| No log | 23.5 | 94 | 0.5969 | 0.7763 | 0.5969 | 0.7726 |
| No log | 24.0 | 96 | 0.7296 | 0.7632 | 0.7296 | 0.8542 |
| No log | 24.5 | 98 | 0.7538 | 0.7206 | 0.7538 | 0.8682 |
| No log | 25.0 | 100 | 0.7907 | 0.6718 | 0.7907 | 0.8892 |
| No log | 25.5 | 102 | 0.7714 | 0.6866 | 0.7714 | 0.8783 |
| No log | 26.0 | 104 | 0.7152 | 0.7101 | 0.7152 | 0.8457 |
| No log | 26.5 | 106 | 0.6623 | 0.7183 | 0.6623 | 0.8138 |
| No log | 27.0 | 108 | 0.6467 | 0.7361 | 0.6467 | 0.8042 |
| No log | 27.5 | 110 | 0.6404 | 0.8 | 0.6404 | 0.8002 |
| No log | 28.0 | 112 | 0.6807 | 0.7347 | 0.6807 | 0.8251 |
| No log | 28.5 | 114 | 0.6866 | 0.7347 | 0.6866 | 0.8286 |
| No log | 29.0 | 116 | 0.6683 | 0.7397 | 0.6683 | 0.8175 |
| No log | 29.5 | 118 | 0.7229 | 0.75 | 0.7229 | 0.8503 |
| No log | 30.0 | 120 | 0.7675 | 0.72 | 0.7675 | 0.8761 |
| No log | 30.5 | 122 | 0.7178 | 0.7101 | 0.7178 | 0.8472 |
| No log | 31.0 | 124 | 0.7037 | 0.6866 | 0.7037 | 0.8389 |
| No log | 31.5 | 126 | 0.6864 | 0.7234 | 0.6864 | 0.8285 |
| No log | 32.0 | 128 | 0.6806 | 0.7234 | 0.6806 | 0.8250 |
| No log | 32.5 | 130 | 0.6529 | 0.7785 | 0.6529 | 0.8081 |
| No log | 33.0 | 132 | 0.6303 | 0.7815 | 0.6303 | 0.7939 |
| No log | 33.5 | 134 | 0.6330 | 0.7785 | 0.6330 | 0.7956 |
| No log | 34.0 | 136 | 0.6680 | 0.7871 | 0.6680 | 0.8173 |
| No log | 34.5 | 138 | 0.6989 | 0.7976 | 0.6989 | 0.8360 |
| No log | 35.0 | 140 | 0.6774 | 0.8049 | 0.6774 | 0.8231 |
| No log | 35.5 | 142 | 0.6164 | 0.7898 | 0.6164 | 0.7851 |
| No log | 36.0 | 144 | 0.6464 | 0.7898 | 0.6464 | 0.8040 |
| No log | 36.5 | 146 | 0.7079 | 0.7975 | 0.7079 | 0.8414 |
| No log | 37.0 | 148 | 0.7986 | 0.7673 | 0.7986 | 0.8936 |
| No log | 37.5 | 150 | 0.7841 | 0.7673 | 0.7841 | 0.8855 |
| No log | 38.0 | 152 | 0.7336 | 0.7550 | 0.7336 | 0.8565 |
| No log | 38.5 | 154 | 0.6689 | 0.7361 | 0.6689 | 0.8178 |
| No log | 39.0 | 156 | 0.6495 | 0.7333 | 0.6495 | 0.8059 |
| No log | 39.5 | 158 | 0.6309 | 0.7333 | 0.6309 | 0.7943 |
| No log | 40.0 | 160 | 0.6394 | 0.7643 | 0.6394 | 0.7996 |
| No log | 40.5 | 162 | 0.6691 | 0.7643 | 0.6691 | 0.8180 |
| No log | 41.0 | 164 | 0.6491 | 0.7532 | 0.6491 | 0.8057 |
| No log | 41.5 | 166 | 0.6487 | 0.7799 | 0.6487 | 0.8054 |
| No log | 42.0 | 168 | 0.6488 | 0.775 | 0.6488 | 0.8055 |
| No log | 42.5 | 170 | 0.6648 | 0.7436 | 0.6648 | 0.8153 |
| No log | 43.0 | 172 | 0.7060 | 0.7564 | 0.7060 | 0.8403 |
| No log | 43.5 | 174 | 0.7596 | 0.7531 | 0.7595 | 0.8715 |
| No log | 44.0 | 176 | 0.7547 | 0.7564 | 0.7547 | 0.8687 |
| No log | 44.5 | 178 | 0.7462 | 0.6993 | 0.7462 | 0.8638 |
| No log | 45.0 | 180 | 0.7579 | 0.7042 | 0.7579 | 0.8706 |
| No log | 45.5 | 182 | 0.7430 | 0.7133 | 0.7430 | 0.8620 |
| No log | 46.0 | 184 | 0.7356 | 0.7133 | 0.7356 | 0.8577 |
| No log | 46.5 | 186 | 0.7246 | 0.7133 | 0.7246 | 0.8512 |
| No log | 47.0 | 188 | 0.7530 | 0.7403 | 0.7530 | 0.8678 |
| No log | 47.5 | 190 | 0.8012 | 0.7636 | 0.8012 | 0.8951 |
| No log | 48.0 | 192 | 0.7931 | 0.7738 | 0.7931 | 0.8906 |
| No log | 48.5 | 194 | 0.7278 | 0.7702 | 0.7278 | 0.8531 |
| No log | 49.0 | 196 | 0.6743 | 0.7483 | 0.6743 | 0.8211 |
| No log | 49.5 | 198 | 0.6482 | 0.7448 | 0.6482 | 0.8051 |
| No log | 50.0 | 200 | 0.6446 | 0.7397 | 0.6446 | 0.8029 |
| No log | 50.5 | 202 | 0.6448 | 0.7310 | 0.6448 | 0.8030 |
| No log | 51.0 | 204 | 0.6260 | 0.7361 | 0.6260 | 0.7912 |
| No log | 51.5 | 206 | 0.6310 | 0.7234 | 0.6310 | 0.7944 |
| No log | 52.0 | 208 | 0.6484 | 0.7448 | 0.6484 | 0.8052 |
| No log | 52.5 | 210 | 0.6807 | 0.7234 | 0.6807 | 0.8251 |
| No log | 53.0 | 212 | 0.6967 | 0.7347 | 0.6967 | 0.8347 |
| No log | 53.5 | 214 | 0.7160 | 0.7682 | 0.7160 | 0.8462 |
| No log | 54.0 | 216 | 0.7614 | 0.7578 | 0.7614 | 0.8726 |
| No log | 54.5 | 218 | 0.7615 | 0.7578 | 0.7615 | 0.8726 |
| No log | 55.0 | 220 | 0.7213 | 0.7595 | 0.7213 | 0.8493 |
| No log | 55.5 | 222 | 0.7117 | 0.7578 | 0.7117 | 0.8436 |
| No log | 56.0 | 224 | 0.6695 | 0.7421 | 0.6695 | 0.8182 |
| No log | 56.5 | 226 | 0.6704 | 0.7407 | 0.6704 | 0.8188 |
| No log | 57.0 | 228 | 0.7212 | 0.7857 | 0.7212 | 0.8492 |
| No log | 57.5 | 230 | 0.7341 | 0.7758 | 0.7341 | 0.8568 |
| No log | 58.0 | 232 | 0.7238 | 0.7831 | 0.7238 | 0.8508 |
| No log | 58.5 | 234 | 0.7268 | 0.7758 | 0.7268 | 0.8525 |
| No log | 59.0 | 236 | 0.7271 | 0.7578 | 0.7271 | 0.8527 |
| No log | 59.5 | 238 | 0.6985 | 0.75 | 0.6985 | 0.8357 |
| No log | 60.0 | 240 | 0.6931 | 0.7234 | 0.6931 | 0.8325 |
| No log | 60.5 | 242 | 0.7165 | 0.7 | 0.7165 | 0.8465 |
| No log | 61.0 | 244 | 0.7219 | 0.7234 | 0.7219 | 0.8496 |
| No log | 61.5 | 246 | 0.7521 | 0.7092 | 0.7521 | 0.8672 |
| No log | 62.0 | 248 | 0.7855 | 0.75 | 0.7855 | 0.8863 |
| No log | 62.5 | 250 | 0.7865 | 0.7484 | 0.7865 | 0.8868 |
| No log | 63.0 | 252 | 0.7517 | 0.75 | 0.7517 | 0.8670 |
| No log | 63.5 | 254 | 0.7019 | 0.75 | 0.7019 | 0.8378 |
| No log | 64.0 | 256 | 0.7025 | 0.7484 | 0.7025 | 0.8381 |
| No log | 64.5 | 258 | 0.7140 | 0.7613 | 0.7140 | 0.8450 |
| No log | 65.0 | 260 | 0.7186 | 0.7403 | 0.7186 | 0.8477 |
| No log | 65.5 | 262 | 0.7214 | 0.7613 | 0.7214 | 0.8493 |
| No log | 66.0 | 264 | 0.7213 | 0.7595 | 0.7213 | 0.8493 |
| No log | 66.5 | 266 | 0.6882 | 0.7403 | 0.6882 | 0.8296 |
| No log | 67.0 | 268 | 0.6727 | 0.7417 | 0.6727 | 0.8202 |
| No log | 67.5 | 270 | 0.6825 | 0.7467 | 0.6825 | 0.8262 |
| No log | 68.0 | 272 | 0.6921 | 0.7222 | 0.6921 | 0.8319 |
| No log | 68.5 | 274 | 0.7065 | 0.7234 | 0.7065 | 0.8405 |
| No log | 69.0 | 276 | 0.7205 | 0.7101 | 0.7205 | 0.8488 |
| No log | 69.5 | 278 | 0.7308 | 0.7101 | 0.7308 | 0.8549 |
| No log | 70.0 | 280 | 0.7660 | 0.7083 | 0.7660 | 0.8752 |
| No log | 70.5 | 282 | 0.7801 | 0.7172 | 0.7801 | 0.8832 |
| No log | 71.0 | 284 | 0.7736 | 0.7172 | 0.7736 | 0.8795 |
| No log | 71.5 | 286 | 0.7469 | 0.7083 | 0.7469 | 0.8642 |
| No log | 72.0 | 288 | 0.7196 | 0.7101 | 0.7196 | 0.8483 |
| No log | 72.5 | 290 | 0.7104 | 0.7101 | 0.7104 | 0.8428 |
| No log | 73.0 | 292 | 0.6989 | 0.7361 | 0.6989 | 0.8360 |
| No log | 73.5 | 294 | 0.6817 | 0.7361 | 0.6817 | 0.8256 |
| No log | 74.0 | 296 | 0.6807 | 0.7361 | 0.6807 | 0.8250 |
| No log | 74.5 | 298 | 0.6898 | 0.7297 | 0.6898 | 0.8306 |
| No log | 75.0 | 300 | 0.7230 | 0.76 | 0.7230 | 0.8503 |
| No log | 75.5 | 302 | 0.7631 | 0.7284 | 0.7631 | 0.8735 |
| No log | 76.0 | 304 | 0.7765 | 0.7284 | 0.7765 | 0.8812 |
| No log | 76.5 | 306 | 0.7481 | 0.75 | 0.7481 | 0.8649 |
| No log | 77.0 | 308 | 0.7310 | 0.7848 | 0.7310 | 0.8550 |
| No log | 77.5 | 310 | 0.7070 | 0.76 | 0.7070 | 0.8408 |
| No log | 78.0 | 312 | 0.6705 | 0.7361 | 0.6705 | 0.8188 |
| No log | 78.5 | 314 | 0.6472 | 0.7361 | 0.6472 | 0.8045 |
| No log | 79.0 | 316 | 0.6447 | 0.7361 | 0.6447 | 0.8029 |
| No log | 79.5 | 318 | 0.6460 | 0.7347 | 0.6460 | 0.8037 |
| No log | 80.0 | 320 | 0.6673 | 0.7763 | 0.6673 | 0.8169 |
| No log | 80.5 | 322 | 0.6800 | 0.7975 | 0.6800 | 0.8246 |
| No log | 81.0 | 324 | 0.6779 | 0.7871 | 0.6779 | 0.8233 |
| No log | 81.5 | 326 | 0.6616 | 0.7651 | 0.6616 | 0.8134 |
| No log | 82.0 | 328 | 0.6539 | 0.7671 | 0.6539 | 0.8086 |
| No log | 82.5 | 330 | 0.6553 | 0.7763 | 0.6553 | 0.8095 |
| No log | 83.0 | 332 | 0.6531 | 0.7763 | 0.6531 | 0.8081 |
| No log | 83.5 | 334 | 0.6594 | 0.7671 | 0.6594 | 0.8121 |
| No log | 84.0 | 336 | 0.6597 | 0.7671 | 0.6597 | 0.8122 |
| No log | 84.5 | 338 | 0.6486 | 0.7671 | 0.6486 | 0.8054 |
| No log | 85.0 | 340 | 0.6342 | 0.7483 | 0.6342 | 0.7964 |
| No log | 85.5 | 342 | 0.6275 | 0.7432 | 0.6275 | 0.7922 |
| No log | 86.0 | 344 | 0.6329 | 0.7310 | 0.6329 | 0.7955 |
| No log | 86.5 | 346 | 0.6419 | 0.7586 | 0.6419 | 0.8012 |
| No log | 87.0 | 348 | 0.6416 | 0.7361 | 0.6416 | 0.8010 |
| No log | 87.5 | 350 | 0.6460 | 0.7361 | 0.6460 | 0.8038 |
| No log | 88.0 | 352 | 0.6464 | 0.7361 | 0.6464 | 0.8040 |
| No log | 88.5 | 354 | 0.6448 | 0.7310 | 0.6448 | 0.8030 |
| No log | 89.0 | 356 | 0.6464 | 0.7183 | 0.6464 | 0.8040 |
| No log | 89.5 | 358 | 0.6528 | 0.7183 | 0.6528 | 0.8080 |
| No log | 90.0 | 360 | 0.6637 | 0.7183 | 0.6637 | 0.8147 |
| No log | 90.5 | 362 | 0.6732 | 0.7183 | 0.6732 | 0.8205 |
| No log | 91.0 | 364 | 0.6781 | 0.7234 | 0.6781 | 0.8235 |
| No log | 91.5 | 366 | 0.6850 | 0.7234 | 0.6850 | 0.8277 |
| No log | 92.0 | 368 | 0.6902 | 0.7448 | 0.6902 | 0.8308 |
| No log | 92.5 | 370 | 0.6941 | 0.7448 | 0.6941 | 0.8331 |
| No log | 93.0 | 372 | 0.6951 | 0.7361 | 0.6951 | 0.8337 |
| No log | 93.5 | 374 | 0.6946 | 0.7361 | 0.6946 | 0.8334 |
| No log | 94.0 | 376 | 0.6924 | 0.7361 | 0.6924 | 0.8321 |
| No log | 94.5 | 378 | 0.6860 | 0.7448 | 0.6860 | 0.8283 |
| No log | 95.0 | 380 | 0.6789 | 0.7448 | 0.6789 | 0.8240 |
| No log | 95.5 | 382 | 0.6743 | 0.7448 | 0.6743 | 0.8211 |
| No log | 96.0 | 384 | 0.6730 | 0.7448 | 0.6730 | 0.8204 |
| No log | 96.5 | 386 | 0.6732 | 0.7448 | 0.6732 | 0.8205 |
| No log | 97.0 | 388 | 0.6743 | 0.7448 | 0.6743 | 0.8212 |
| No log | 97.5 | 390 | 0.6755 | 0.7448 | 0.6755 | 0.8219 |
| No log | 98.0 | 392 | 0.6755 | 0.7448 | 0.6755 | 0.8219 |
| No log | 98.5 | 394 | 0.6746 | 0.7448 | 0.6746 | 0.8213 |
| No log | 99.0 | 396 | 0.6743 | 0.7448 | 0.6743 | 0.8212 |
| No log | 99.5 | 398 | 0.6747 | 0.7448 | 0.6747 | 0.8214 |
| No log | 100.0 | 400 | 0.6749 | 0.7448 | 0.6749 | 0.8215 |
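
"No log" in the Training Loss column means no training loss was recorded, most likely because the Trainer's logging interval exceeded the 400 total optimization steps. The card does not show how the Qwk, Mse, and Rmse columns were computed; below is a minimal compute_metrics sketch that would produce them, assuming a single-output regression head, integer gold scores, and scikit-learn's quadratic weighted kappa. The rounding of continuous predictions to integer scores is an assumption, not documented behavior.

```python
# Hedged sketch of a compute_metrics function producing the columns above.
# Assumes a single-output regression head and integer gold scores, so the
# continuous predictions are rounded before computing weighted kappa.
import numpy as np
from sklearn.metrics import cohen_kappa_score

def compute_metrics(eval_pred):
    predictions, labels = eval_pred
    preds = np.asarray(predictions).squeeze(-1)  # (n, 1) -> (n,)
    labels = np.asarray(labels)
    mse = float(np.mean((preds - labels) ** 2))
    qwk = cohen_kappa_score(
        np.rint(preds).astype(int),
        labels.astype(int),
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}
```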

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1