Arabic_CrossPrompt_FineTuningAraBERT_noAug_TestTask7_mechanics

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5742
  • QWK: 0.5905
  • MSE: 0.5742
  • RMSE: 0.7577
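These metrics can be reproduced from gold labels and model predictions with scikit-learn; a minimal sketch, where the score arrays are purely illustrative (the card does not provide the evaluation data):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

# Illustrative gold scores and predictions on an integer essay-score scale
y_true = np.array([3, 2, 4, 1, 3, 2])
y_pred = np.array([3, 2, 3, 1, 2, 2])

# QWK = Cohen's kappa with quadratic weights, the standard scoring metric
qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")
mse = mean_squared_error(y_true, y_pred)  # also the regression loss, hence Loss == MSE above
rmse = float(np.sqrt(mse))
```

Note that Loss and MSE coincide in the table above because the model is trained with an MSE regression objective.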

Model description

More information needed

Intended uses & limitations

More information needed
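The card gives no usage details; as a hedged sketch, the checkpoint can be loaded like any sequence-classification fine-tune of AraBERT. The single-logit regression head is an assumption inferred from the MSE loss reported below, and the input text is a placeholder:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo = "MayBashendy/Arabic_CrossPrompt_FineTuningAraBERT_noAug_TestTask7_mechanics"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)

# Placeholder Arabic text; the model presumably scores the "mechanics" essay trait
inputs = tokenizer("نص المقال هنا", return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
```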

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100

Training results

Training Loss  Epoch  Step  Validation Loss  QWK  MSE  RMSE
No log 0.0202 2 3.8911 -0.0083 3.8911 1.9726
No log 0.0404 4 2.8935 0.0324 2.8935 1.7010
No log 0.0606 6 1.8160 0.0920 1.8160 1.3476
No log 0.0808 8 0.9053 0.1535 0.9053 0.9515
No log 0.1010 10 0.8947 0.0679 0.8947 0.9459
No log 0.1212 12 1.0524 0.0333 1.0524 1.0259
No log 0.1414 14 1.0171 -0.0014 1.0171 1.0085
No log 0.1616 16 0.9202 0.0735 0.9202 0.9593
No log 0.1818 18 0.8181 0.1191 0.8181 0.9045
No log 0.2020 20 0.7104 0.2245 0.7104 0.8428
No log 0.2222 22 0.6800 0.2893 0.6800 0.8246
No log 0.2424 24 0.6951 0.2054 0.6951 0.8337
No log 0.2626 26 0.7410 0.2246 0.7410 0.8608
No log 0.2828 28 0.6703 0.2351 0.6703 0.8187
No log 0.3030 30 0.5821 0.4573 0.5821 0.7630
No log 0.3232 32 0.5424 0.4721 0.5424 0.7365
No log 0.3434 34 0.5497 0.4869 0.5497 0.7414
No log 0.3636 36 0.6582 0.3133 0.6582 0.8113
No log 0.3838 38 0.8011 0.2414 0.8011 0.8951
No log 0.4040 40 0.8739 0.1776 0.8739 0.9348
No log 0.4242 42 0.7984 0.2508 0.7984 0.8935
No log 0.4444 44 0.6795 0.3023 0.6795 0.8243
No log 0.4646 46 0.6155 0.3699 0.6155 0.7845
No log 0.4848 48 0.5150 0.4943 0.5150 0.7176
No log 0.5051 50 0.5045 0.5077 0.5045 0.7103
No log 0.5253 52 0.5328 0.4894 0.5328 0.7299
No log 0.5455 54 0.5815 0.4131 0.5815 0.7625
No log 0.5657 56 0.6490 0.3603 0.6490 0.8056
No log 0.5859 58 0.5978 0.3896 0.5978 0.7732
No log 0.6061 60 0.5587 0.4629 0.5587 0.7475
No log 0.6263 62 0.5580 0.4663 0.5580 0.7470
No log 0.6465 64 0.5631 0.4650 0.5631 0.7504
No log 0.6667 66 0.5805 0.4035 0.5805 0.7619
No log 0.6869 68 0.6809 0.3120 0.6809 0.8252
No log 0.7071 70 0.9463 0.4124 0.9463 0.9728
No log 0.7273 72 1.0101 0.4098 1.0101 1.0050
No log 0.7475 74 0.7274 0.5092 0.7274 0.8529
No log 0.7677 76 0.5420 0.6353 0.5420 0.7362
No log 0.7879 78 0.5591 0.6416 0.5591 0.7478
No log 0.8081 80 0.5488 0.6109 0.5488 0.7408
No log 0.8283 82 0.6354 0.5505 0.6354 0.7971
No log 0.8485 84 0.6254 0.5397 0.6254 0.7908
No log 0.8687 86 0.5170 0.5965 0.5170 0.7190
No log 0.8889 88 0.4638 0.5707 0.4638 0.6810
No log 0.9091 90 0.4649 0.6154 0.4649 0.6819
No log 0.9293 92 0.5358 0.6017 0.5358 0.7320
No log 0.9495 94 0.6444 0.5333 0.6444 0.8028
No log 0.9697 96 0.6537 0.5423 0.6537 0.8085
No log 0.9899 98 0.5787 0.5706 0.5787 0.7607
No log 1.0101 100 0.5901 0.5771 0.5901 0.7682
No log 1.0303 102 0.5715 0.5770 0.5715 0.7560
No log 1.0505 104 0.5373 0.5917 0.5373 0.7330
No log 1.0707 106 0.5274 0.5851 0.5274 0.7262
No log 1.0909 108 0.5510 0.5840 0.5510 0.7423
No log 1.1111 110 0.6367 0.5413 0.6367 0.7980
No log 1.1313 112 0.8808 0.4583 0.8808 0.9385
No log 1.1515 114 0.8529 0.4862 0.8529 0.9235
No log 1.1717 116 0.6515 0.5179 0.6515 0.8072
No log 1.1919 118 0.5422 0.6252 0.5422 0.7364
No log 1.2121 120 0.5245 0.6421 0.5245 0.7242
No log 1.2323 122 0.5666 0.5950 0.5666 0.7527
No log 1.2525 124 0.6857 0.5105 0.6857 0.8281
No log 1.2727 126 0.6104 0.5345 0.6104 0.7812
No log 1.2929 128 0.4572 0.5587 0.4572 0.6761
No log 1.3131 130 0.4437 0.4929 0.4437 0.6661
No log 1.3333 132 0.5135 0.5268 0.5135 0.7166
No log 1.3535 134 0.7553 0.3251 0.7553 0.8691
No log 1.3737 136 0.8406 0.2027 0.8406 0.9169
No log 1.3939 138 0.6475 0.4249 0.6475 0.8047
No log 1.4141 140 0.4871 0.5598 0.4871 0.6979
No log 1.4343 142 0.4305 0.5862 0.4305 0.6561
No log 1.4545 144 0.4292 0.6449 0.4292 0.6551
No log 1.4747 146 0.4542 0.6610 0.4542 0.6740
No log 1.4949 148 0.5735 0.6110 0.5735 0.7573
No log 1.5152 150 0.7459 0.4839 0.7459 0.8637
No log 1.5354 152 0.8623 0.4414 0.8623 0.9286
No log 1.5556 154 0.8020 0.4474 0.8020 0.8955
No log 1.5758 156 0.6764 0.5211 0.6764 0.8224
No log 1.5960 158 0.5846 0.5645 0.5846 0.7646
No log 1.6162 160 0.5796 0.5812 0.5796 0.7613
No log 1.6364 162 0.6681 0.5376 0.6681 0.8174
No log 1.6566 164 0.7081 0.5564 0.7081 0.8415
No log 1.6768 166 0.6211 0.5960 0.6211 0.7881
No log 1.6970 168 0.5591 0.6352 0.5591 0.7477
No log 1.7172 170 0.5665 0.6297 0.5665 0.7526
No log 1.7374 172 0.6344 0.5639 0.6344 0.7965
No log 1.7576 174 0.7950 0.4759 0.7950 0.8916
No log 1.7778 176 0.8250 0.4559 0.8250 0.9083
No log 1.7980 178 0.7743 0.4613 0.7743 0.8800
No log 1.8182 180 0.6349 0.4944 0.6349 0.7968
No log 1.8384 182 0.5440 0.4739 0.5440 0.7375
No log 1.8586 184 0.5432 0.4733 0.5432 0.7370
No log 1.8788 186 0.5852 0.5124 0.5852 0.7650
No log 1.8990 188 0.5750 0.4871 0.5750 0.7583
No log 1.9192 190 0.5273 0.5341 0.5273 0.7262
No log 1.9394 192 0.4639 0.5825 0.4639 0.6811
No log 1.9596 194 0.4854 0.5901 0.4854 0.6967
No log 1.9798 196 0.4820 0.5836 0.4820 0.6943
No log 2.0 198 0.4725 0.6115 0.4725 0.6874
No log 2.0202 200 0.7666 0.5057 0.7666 0.8756
No log 2.0404 202 1.0847 0.4023 1.0847 1.0415
No log 2.0606 204 0.9921 0.4220 0.9921 0.9960
No log 2.0808 206 0.6711 0.5046 0.6711 0.8192
No log 2.1010 208 0.5332 0.5696 0.5332 0.7302
No log 2.1212 210 0.5174 0.5463 0.5174 0.7193
No log 2.1414 212 0.5813 0.5229 0.5813 0.7624
No log 2.1616 214 0.6729 0.4519 0.6729 0.8203
No log 2.1818 216 0.6477 0.5026 0.6477 0.8048
No log 2.2020 218 0.6077 0.5727 0.6077 0.7795
No log 2.2222 220 0.6662 0.5176 0.6662 0.8162
No log 2.2424 222 0.7300 0.5412 0.7300 0.8544
No log 2.2626 224 0.6981 0.5674 0.6981 0.8355
No log 2.2828 226 0.5211 0.6425 0.5211 0.7219
No log 2.3030 228 0.5111 0.6678 0.5111 0.7149
No log 2.3232 230 0.4919 0.7057 0.4919 0.7014
No log 2.3434 232 0.5580 0.6456 0.5580 0.7470
No log 2.3636 234 0.8002 0.5741 0.8002 0.8946
No log 2.3838 236 1.0304 0.4433 1.0304 1.0151
No log 2.4040 238 0.9678 0.4691 0.9678 0.9838
No log 2.4242 240 0.6268 0.5459 0.6268 0.7917
No log 2.4444 242 0.4423 0.6824 0.4423 0.6650
No log 2.4646 244 0.4350 0.6693 0.4350 0.6595
No log 2.4848 246 0.5610 0.6180 0.5610 0.7490
No log 2.5051 248 0.7365 0.5476 0.7365 0.8582
No log 2.5253 250 0.7416 0.5416 0.7416 0.8612
No log 2.5455 252 0.5160 0.6238 0.5160 0.7184
No log 2.5657 254 0.4587 0.6481 0.4587 0.6773
No log 2.5859 256 0.4959 0.6436 0.4959 0.7042
No log 2.6061 258 0.6363 0.5550 0.6363 0.7977
No log 2.6263 260 0.8538 0.5039 0.8538 0.9240
No log 2.6465 262 0.7857 0.5397 0.7857 0.8864
No log 2.6667 264 0.6574 0.5806 0.6574 0.8108
No log 2.6869 266 0.5045 0.6602 0.5045 0.7103
No log 2.7071 268 0.4740 0.6869 0.4740 0.6884
No log 2.7273 270 0.4472 0.6752 0.4472 0.6687
No log 2.7475 272 0.4099 0.7140 0.4099 0.6402
No log 2.7677 274 0.4134 0.7107 0.4134 0.6430
No log 2.7879 276 0.4076 0.7024 0.4076 0.6384
No log 2.8081 278 0.5083 0.6302 0.5083 0.7129
No log 2.8283 280 0.5772 0.6007 0.5772 0.7597
No log 2.8485 282 0.5910 0.5727 0.5910 0.7688
No log 2.8687 284 0.5044 0.6037 0.5044 0.7102
No log 2.8889 286 0.5000 0.5586 0.5000 0.7071
No log 2.9091 288 0.5470 0.5604 0.5470 0.7396
No log 2.9293 290 0.5830 0.5469 0.5830 0.7636
No log 2.9495 292 0.6060 0.5416 0.6060 0.7784
No log 2.9697 294 0.4979 0.5963 0.4979 0.7056
No log 2.9899 296 0.4867 0.6371 0.4867 0.6976
No log 3.0101 298 0.4305 0.6746 0.4305 0.6562
No log 3.0303 300 0.4436 0.6823 0.4436 0.6660
No log 3.0505 302 0.6525 0.6001 0.6525 0.8078
No log 3.0707 304 0.9932 0.4911 0.9932 0.9966
No log 3.0909 306 1.0557 0.4701 1.0557 1.0275
No log 3.1111 308 0.7505 0.5695 0.7505 0.8663
No log 3.1313 310 0.4175 0.6856 0.4175 0.6461
No log 3.1515 312 0.4003 0.6911 0.4003 0.6327
No log 3.1717 314 0.3913 0.6943 0.3913 0.6255
No log 3.1919 316 0.3964 0.6789 0.3964 0.6296
No log 3.2121 318 0.6276 0.5817 0.6276 0.7922
No log 3.2323 320 1.0910 0.4483 1.0910 1.0445
No log 3.2525 322 1.1582 0.4328 1.1582 1.0762
No log 3.2727 324 0.8391 0.5013 0.8391 0.9160
No log 3.2929 326 0.4624 0.6184 0.4624 0.6800
No log 3.3131 328 0.4318 0.6610 0.4318 0.6571
No log 3.3333 330 0.4743 0.6269 0.4743 0.6887
No log 3.3535 332 0.4093 0.6516 0.4093 0.6398
No log 3.3737 334 0.4917 0.5751 0.4917 0.7012
No log 3.3939 336 0.7217 0.5027 0.7217 0.8495
No log 3.4141 338 0.8588 0.4899 0.8588 0.9267
No log 3.4343 340 0.7535 0.5488 0.7535 0.8680
No log 3.4545 342 0.6685 0.6098 0.6685 0.8176
No log 3.4747 344 0.5719 0.6423 0.5719 0.7562
No log 3.4949 346 0.5125 0.6785 0.5125 0.7159
No log 3.5152 348 0.5430 0.6749 0.5430 0.7369
No log 3.5354 350 0.7452 0.6050 0.7452 0.8632
No log 3.5556 352 0.9571 0.5011 0.9571 0.9783
No log 3.5758 354 0.9427 0.4891 0.9427 0.9709
No log 3.5960 356 0.8397 0.4859 0.8397 0.9163
No log 3.6162 358 0.8448 0.4924 0.8448 0.9191
No log 3.6364 360 0.7249 0.5447 0.7249 0.8514
No log 3.6566 362 0.6795 0.5426 0.6795 0.8243
No log 3.6768 364 0.7630 0.5100 0.7630 0.8735
No log 3.6970 366 0.7194 0.5078 0.7194 0.8482
No log 3.7172 368 0.6375 0.5309 0.6375 0.7984
No log 3.7374 370 0.5940 0.5554 0.5940 0.7707
No log 3.7576 372 0.5047 0.6293 0.5047 0.7104
No log 3.7778 374 0.5185 0.6263 0.5185 0.7201
No log 3.7980 376 0.6887 0.5599 0.6887 0.8299
No log 3.8182 378 0.6907 0.5730 0.6907 0.8311
No log 3.8384 380 0.6438 0.6017 0.6438 0.8024
No log 3.8586 382 0.5749 0.6356 0.5749 0.7582
No log 3.8788 384 0.7188 0.5636 0.7188 0.8478
No log 3.8990 386 0.7382 0.5498 0.7382 0.8592
No log 3.9192 388 0.6242 0.5713 0.6242 0.7901
No log 3.9394 390 0.5731 0.5657 0.5731 0.7570
No log 3.9596 392 0.5608 0.5793 0.5608 0.7489
No log 3.9798 394 0.4890 0.6103 0.4890 0.6993
No log 4.0 396 0.5903 0.5759 0.5903 0.7683
No log 4.0202 398 0.6723 0.5432 0.6723 0.8199
No log 4.0404 400 0.6604 0.5569 0.6604 0.8127
No log 4.0606 402 0.5130 0.6122 0.5130 0.7163
No log 4.0808 404 0.4968 0.6248 0.4968 0.7049
No log 4.1010 406 0.5014 0.6510 0.5014 0.7081
No log 4.1212 408 0.5714 0.6232 0.5714 0.7559
No log 4.1414 410 0.7158 0.5741 0.7158 0.8460
No log 4.1616 412 0.7092 0.6024 0.7092 0.8421
No log 4.1818 414 0.5858 0.6321 0.5858 0.7654
No log 4.2020 416 0.4986 0.6110 0.4986 0.7061
No log 4.2222 418 0.5137 0.6031 0.5137 0.7167
No log 4.2424 420 0.8278 0.4931 0.8278 0.9098
No log 4.2626 422 1.0591 0.4488 1.0591 1.0291
No log 4.2828 424 0.8933 0.4894 0.8933 0.9452
No log 4.3030 426 0.6413 0.5804 0.6413 0.8008
No log 4.3232 428 0.4917 0.5862 0.4917 0.7012
No log 4.3434 430 0.5261 0.5963 0.5261 0.7254
No log 4.3636 432 0.6300 0.5777 0.6300 0.7937
No log 4.3838 434 0.8186 0.5136 0.8186 0.9047
No log 4.4040 436 0.7317 0.5456 0.7317 0.8554
No log 4.4242 438 0.6869 0.5750 0.6869 0.8288
No log 4.4444 440 0.7405 0.5476 0.7405 0.8605
No log 4.4646 442 0.6675 0.5849 0.6675 0.8170
No log 4.4848 444 0.4827 0.6538 0.4827 0.6948
No log 4.5051 446 0.4742 0.6537 0.4742 0.6886
No log 4.5253 448 0.6462 0.6299 0.6462 0.8039
No log 4.5455 450 0.9154 0.5370 0.9154 0.9567
No log 4.5657 452 0.8375 0.5508 0.8375 0.9151
No log 4.5859 454 0.6039 0.6003 0.6039 0.7771
No log 4.6061 456 0.5956 0.5978 0.5956 0.7718
No log 4.6263 458 0.5030 0.6124 0.5030 0.7092
No log 4.6465 460 0.5504 0.6263 0.5504 0.7419
No log 4.6667 462 0.7214 0.5718 0.7214 0.8493
No log 4.6869 464 0.7414 0.5916 0.7414 0.8610
No log 4.7071 466 0.8637 0.5338 0.8637 0.9293
No log 4.7273 468 0.7630 0.5528 0.7630 0.8735
No log 4.7475 470 0.6515 0.6098 0.6515 0.8071
No log 4.7677 472 0.7804 0.5213 0.7804 0.8834
No log 4.7879 474 0.9885 0.4777 0.9885 0.9942
No log 4.8081 476 0.8272 0.5441 0.8272 0.9095
No log 4.8283 478 0.5013 0.6576 0.5013 0.7080
No log 4.8485 480 0.4621 0.6787 0.4621 0.6798
No log 4.8687 482 0.5100 0.6167 0.5100 0.7141
No log 4.8889 484 0.6163 0.5502 0.6163 0.7850
No log 4.9091 486 0.6764 0.5201 0.6764 0.8225
No log 4.9293 488 0.6801 0.5121 0.6801 0.8247
No log 4.9495 490 0.5296 0.6152 0.5296 0.7278
No log 4.9697 492 0.5036 0.6660 0.5036 0.7096
No log 4.9899 494 0.6369 0.5971 0.6369 0.7980
No log 5.0101 496 0.7071 0.5904 0.7071 0.8409
No log 5.0303 498 0.5989 0.6352 0.5989 0.7739
0.5083 5.0505 500 0.5872 0.6410 0.5872 0.7663
0.5083 5.0707 502 0.6633 0.5896 0.6633 0.8145
0.5083 5.0909 504 0.6894 0.5394 0.6894 0.8303
0.5083 5.1111 506 0.6542 0.5537 0.6542 0.8088
0.5083 5.1313 508 0.6052 0.5756 0.6052 0.7779
0.5083 5.1515 510 0.5742 0.5905 0.5742 0.7577

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (Safetensors, F32 tensors)

Model tree for MayBashendy/Arabic_CrossPrompt_FineTuningAraBERT_noAug_TestTask7_mechanics

Finetuned from aubmindlab/bert-base-arabertv02