ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k19_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the dataset name was not recorded in this card). It achieves the following results on the evaluation set:

  • Loss: 0.5517
  • Qwk (quadratic weighted kappa): 0.4681
  • Mse (mean squared error): 0.5517
  • Rmse (root mean squared error): 0.7428
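For reference, the three error metrics are related in a simple way: RMSE is the square root of MSE (note 0.7428 ≈ √0.5517 above), and Qwk is Cohen's kappa with quadratic disagreement weights over ordinal labels. A minimal pure-Python sketch of both, assuming integer ratings 0..K-1 (the label values below are illustrative, not taken from this model's evaluation set):

```python
import math
from collections import Counter

def quadratic_weighted_kappa(y_true, y_pred, num_classes):
    """Cohen's kappa with quadratic weights over integer labels 0..num_classes-1."""
    n = len(y_true)
    # Observed joint distribution of (true, predicted) labels.
    observed = [[0.0] * num_classes for _ in range(num_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1.0 / n
    # Expected joint distribution under independence of the two raters.
    hist_t, hist_p = Counter(y_true), Counter(y_pred)
    expected = [[hist_t[i] * hist_p[j] / (n * n) for j in range(num_classes)]
                for i in range(num_classes)]
    # Quadratic disagreement weights: 0 on the diagonal, growing with distance.
    w = [[((i - j) ** 2) / ((num_classes - 1) ** 2) for j in range(num_classes)]
         for i in range(num_classes)]
    num = sum(w[i][j] * observed[i][j]
              for i in range(num_classes) for j in range(num_classes))
    den = sum(w[i][j] * expected[i][j]
              for i in range(num_classes) for j in range(num_classes))
    return 1.0 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# RMSE is sqrt(MSE); consistent with the card's Mse/Rmse pair.
assert abs(math.sqrt(0.5517) - 0.7428) < 1e-4
```

Perfect agreement yields a kappa of 1, zero agreement beyond chance yields 0, and values around 0.47, as reported here, indicate moderate agreement.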

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
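With lr_scheduler_type: linear, the learning rate decays linearly from 2e-05 toward 0 over the planned training run. A minimal sketch of that schedule, assuming zero warmup steps (the card does not record a warmup setting), with total steps derived from the results table (48 optimizer steps per epoch × 100 epochs = 4800):

```python
def linear_lr(step, total_steps, base_lr=2e-5, warmup_steps=0):
    """Linear decay from base_lr to 0, with optional linear warmup.

    warmup_steps=0 is an assumption; the card does not record a warmup value.
    """
    if step < warmup_steps:
        # Ramp up linearly during warmup.
        return base_lr * step / max(1, warmup_steps)
    # Decay linearly from base_lr at the end of warmup to 0 at total_steps.
    remaining = max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))
    return base_lr * remaining

# The results table shows step 48 at epoch 1.0, so a full 100-epoch run
# would correspond to total_steps = 4800.
print(linear_lr(2400, 4800))  # halfway through: 1e-05
```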

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0417 2 2.5798 -0.0262 2.5798 1.6062
No log 0.0833 4 1.3015 0.0257 1.3015 1.1408
No log 0.125 6 0.9846 0.0952 0.9846 0.9922
No log 0.1667 8 0.9859 -0.0622 0.9859 0.9929
No log 0.2083 10 1.0198 0.1010 1.0198 1.0098
No log 0.25 12 1.0320 0.1573 1.0320 1.0158
No log 0.2917 14 0.8183 0.2604 0.8183 0.9046
No log 0.3333 16 0.7493 0.1604 0.7493 0.8656
No log 0.375 18 0.7263 0.1604 0.7263 0.8522
No log 0.4167 20 0.7026 0.2713 0.7026 0.8382
No log 0.4583 22 0.6832 0.2374 0.6832 0.8265
No log 0.5 24 0.6945 0.1272 0.6945 0.8333
No log 0.5417 26 0.6746 0.1282 0.6746 0.8214
No log 0.5833 28 0.6806 0.1282 0.6806 0.8250
No log 0.625 30 0.7425 0.1358 0.7425 0.8617
No log 0.6667 32 0.8876 0.3051 0.8876 0.9421
No log 0.7083 34 0.9346 0.2908 0.9346 0.9667
No log 0.75 36 0.8546 0.2109 0.8546 0.9244
No log 0.7917 38 0.7977 0.0 0.7977 0.8931
No log 0.8333 40 0.7529 0.0 0.7529 0.8677
No log 0.875 42 0.7419 0.0840 0.7419 0.8614
No log 0.9167 44 0.7544 0.1236 0.7544 0.8686
No log 0.9583 46 0.7678 0.1617 0.7678 0.8763
No log 1.0 48 0.7639 0.1617 0.7639 0.8740
No log 1.0417 50 0.7656 0.1617 0.7656 0.8750
No log 1.0833 52 0.7988 0.1648 0.7988 0.8937
No log 1.125 54 0.9725 0.2939 0.9725 0.9861
No log 1.1667 56 0.8658 0.3231 0.8658 0.9305
No log 1.2083 58 0.6718 0.2145 0.6718 0.8197
No log 1.25 60 0.6417 0.2405 0.6417 0.8011
No log 1.2917 62 0.6544 0.3151 0.6544 0.8089
No log 1.3333 64 0.5926 0.2923 0.5926 0.7698
No log 1.375 66 0.5840 0.3502 0.5840 0.7642
No log 1.4167 68 0.5950 0.4526 0.5950 0.7714
No log 1.4583 70 0.6134 0.4294 0.6134 0.7832
No log 1.5 72 0.6334 0.4850 0.6334 0.7959
No log 1.5417 74 0.6144 0.4294 0.6144 0.7838
No log 1.5833 76 0.6195 0.5142 0.6195 0.7871
No log 1.625 78 0.7888 0.3567 0.7888 0.8881
No log 1.6667 80 0.8591 0.3095 0.8591 0.9268
No log 1.7083 82 0.6895 0.4516 0.6895 0.8304
No log 1.75 84 0.6865 0.3953 0.6865 0.8286
No log 1.7917 86 0.6918 0.3706 0.6918 0.8317
No log 1.8333 88 0.6630 0.3656 0.6630 0.8142
No log 1.875 90 0.7336 0.3840 0.7336 0.8565
No log 1.9167 92 0.6924 0.3296 0.6924 0.8321
No log 1.9583 94 0.6084 0.2923 0.6084 0.7800
No log 2.0 96 0.6327 0.3563 0.6327 0.7954
No log 2.0417 98 0.7051 0.3522 0.7051 0.8397
No log 2.0833 100 0.6547 0.4083 0.6547 0.8091
No log 2.125 102 0.6031 0.2783 0.6031 0.7766
No log 2.1667 104 0.7853 0.3319 0.7853 0.8861
No log 2.2083 106 0.8405 0.3562 0.8405 0.9168
No log 2.25 108 0.6887 0.3840 0.6887 0.8299
No log 2.2917 110 0.6108 0.4074 0.6108 0.7815
No log 2.3333 112 0.5918 0.3859 0.5918 0.7693
No log 2.375 114 0.5867 0.3478 0.5867 0.7660
No log 2.4167 116 0.6434 0.4597 0.6434 0.8021
No log 2.4583 118 0.6364 0.4835 0.6364 0.7978
No log 2.5 120 0.5780 0.4684 0.5780 0.7602
No log 2.5417 122 0.5992 0.4782 0.5992 0.7741
No log 2.5833 124 0.5554 0.4455 0.5554 0.7452
No log 2.625 126 0.5550 0.4681 0.5550 0.7450
No log 2.6667 128 0.7089 0.5215 0.7089 0.8420
No log 2.7083 130 0.9970 0.3798 0.9970 0.9985
No log 2.75 132 0.7940 0.4366 0.7940 0.8911
No log 2.7917 134 0.5935 0.4212 0.5935 0.7704
No log 2.8333 136 0.6088 0.4051 0.6088 0.7802
No log 2.875 138 0.6471 0.4466 0.6471 0.8045
No log 2.9167 140 0.6425 0.3517 0.6425 0.8015
No log 2.9583 142 0.6223 0.4434 0.6223 0.7889
No log 3.0 144 0.8061 0.3909 0.8061 0.8978
No log 3.0417 146 0.7328 0.4444 0.7328 0.8561
No log 3.0833 148 0.5918 0.4555 0.5918 0.7693
No log 3.125 150 0.6318 0.3069 0.6318 0.7948
No log 3.1667 152 0.6055 0.3373 0.6055 0.7782
No log 3.2083 154 0.5700 0.4661 0.5700 0.7550
No log 3.25 156 0.5773 0.4473 0.5773 0.7598
No log 3.2917 158 0.5858 0.4352 0.5858 0.7654
No log 3.3333 160 0.5823 0.2852 0.5823 0.7631
No log 3.375 162 0.5747 0.3474 0.5747 0.7581
No log 3.4167 164 0.5621 0.3728 0.5621 0.7498
No log 3.4583 166 0.5712 0.3738 0.5712 0.7558
No log 3.5 168 0.6100 0.4222 0.6100 0.7810
No log 3.5417 170 0.5913 0.4516 0.5913 0.7690
No log 3.5833 172 0.5866 0.5319 0.5866 0.7659
No log 3.625 174 0.5603 0.5286 0.5603 0.7485
No log 3.6667 176 0.5483 0.5682 0.5483 0.7405
No log 3.7083 178 0.5315 0.5951 0.5315 0.7290
No log 3.75 180 0.5336 0.4904 0.5336 0.7305
No log 3.7917 182 0.6123 0.4395 0.6123 0.7825
No log 3.8333 184 0.6645 0.4341 0.6645 0.8152
No log 3.875 186 0.7128 0.4380 0.7128 0.8443
No log 3.9167 188 0.6302 0.4967 0.6302 0.7939
No log 3.9583 190 0.5567 0.5234 0.5567 0.7461
No log 4.0 192 0.5490 0.5084 0.5490 0.7410
No log 4.0417 194 0.5493 0.4757 0.5493 0.7411
No log 4.0833 196 0.5602 0.5009 0.5602 0.7485
No log 4.125 198 0.5532 0.5009 0.5532 0.7438
No log 4.1667 200 0.5448 0.4343 0.5448 0.7381
No log 4.2083 202 0.5526 0.4149 0.5526 0.7434
No log 4.25 204 0.5949 0.4732 0.5949 0.7713
No log 4.2917 206 0.5469 0.5587 0.5469 0.7395
No log 4.3333 208 0.5257 0.5009 0.5257 0.7250
No log 4.375 210 0.5152 0.5039 0.5152 0.7178
No log 4.4167 212 0.5099 0.4973 0.5099 0.7141
No log 4.4583 214 0.5375 0.5779 0.5375 0.7331
No log 4.5 216 0.5731 0.5922 0.5731 0.7570
No log 4.5417 218 0.6129 0.5664 0.6129 0.7829
No log 4.5833 220 0.5552 0.5584 0.5552 0.7451
No log 4.625 222 0.5122 0.5533 0.5122 0.7157
No log 4.6667 224 0.5588 0.6070 0.5588 0.7476
No log 4.7083 226 0.6038 0.5673 0.6038 0.7771
No log 4.75 228 0.5454 0.6070 0.5454 0.7385
No log 4.7917 230 0.5237 0.5009 0.5237 0.7237
No log 4.8333 232 0.5210 0.4776 0.5210 0.7218
No log 4.875 234 0.5221 0.4329 0.5221 0.7226
No log 4.9167 236 0.5156 0.4114 0.5156 0.7181
No log 4.9583 238 0.5331 0.4724 0.5331 0.7301
No log 5.0 240 0.5177 0.4937 0.5177 0.7195
No log 5.0417 242 0.5120 0.4757 0.5120 0.7155
No log 5.0833 244 0.5108 0.5523 0.5108 0.7147
No log 5.125 246 0.5234 0.5483 0.5234 0.7235
No log 5.1667 248 0.5024 0.4697 0.5024 0.7088
No log 5.2083 250 0.5505 0.5166 0.5505 0.7420
No log 5.25 252 0.6384 0.4606 0.6384 0.7990
No log 5.2917 254 0.5567 0.4948 0.5567 0.7461
No log 5.3333 256 0.5274 0.3806 0.5274 0.7262
No log 5.375 258 0.6804 0.4015 0.6804 0.8249
No log 5.4167 260 0.6983 0.3777 0.6983 0.8356
No log 5.4583 262 0.5749 0.3730 0.5749 0.7582
No log 5.5 264 0.5177 0.4747 0.5177 0.7195
No log 5.5417 266 0.5383 0.4808 0.5383 0.7337
No log 5.5833 268 0.5104 0.4660 0.5104 0.7144
No log 5.625 270 0.5013 0.5125 0.5013 0.7080
No log 5.6667 272 0.5981 0.4842 0.5981 0.7734
No log 5.7083 274 0.6379 0.4783 0.6379 0.7987
No log 5.75 276 0.5666 0.4789 0.5666 0.7527
No log 5.7917 278 0.5287 0.5177 0.5287 0.7271
No log 5.8333 280 0.6073 0.4721 0.6073 0.7793
No log 5.875 282 0.6193 0.4721 0.6193 0.7870
No log 5.9167 284 0.5589 0.4337 0.5589 0.7476
No log 5.9583 286 0.5610 0.4504 0.5610 0.7490
No log 6.0 288 0.5671 0.3651 0.5671 0.7531
No log 6.0417 290 0.5739 0.3915 0.5739 0.7575
No log 6.0833 292 0.6329 0.4292 0.6329 0.7955
No log 6.125 294 0.7294 0.4387 0.7294 0.8541
No log 6.1667 296 0.7101 0.4224 0.7101 0.8427
No log 6.2083 298 0.6321 0.4292 0.6321 0.7951
No log 6.25 300 0.5898 0.4229 0.5898 0.7680
No log 6.2917 302 0.5833 0.4171 0.5833 0.7638
No log 6.3333 304 0.6025 0.4314 0.6025 0.7762
No log 6.375 306 0.6306 0.4292 0.6306 0.7941
No log 6.4167 308 0.6608 0.4292 0.6608 0.8129
No log 6.4583 310 0.6255 0.4547 0.6255 0.7909
No log 6.5 312 0.6102 0.3426 0.6102 0.7811
No log 6.5417 314 0.5930 0.3995 0.5930 0.7701
No log 6.5833 316 0.6411 0.4473 0.6411 0.8007
No log 6.625 318 0.7025 0.4484 0.7025 0.8382
No log 6.6667 320 0.7091 0.4408 0.7091 0.8421
No log 6.7083 322 0.6690 0.4704 0.6690 0.8179
No log 6.75 324 0.6616 0.4622 0.6616 0.8134
No log 6.7917 326 0.6045 0.3864 0.6045 0.7775
No log 6.8333 328 0.5776 0.4091 0.5776 0.7600
No log 6.875 330 0.5691 0.4114 0.5691 0.7544
No log 6.9167 332 0.5719 0.3890 0.5719 0.7562
No log 6.9583 334 0.5661 0.4194 0.5661 0.7524
No log 7.0 336 0.5652 0.3970 0.5652 0.7518
No log 7.0417 338 0.5599 0.4463 0.5599 0.7483
No log 7.0833 340 0.5677 0.4849 0.5677 0.7535
No log 7.125 342 0.5659 0.3426 0.5659 0.7522
No log 7.1667 344 0.5896 0.3675 0.5896 0.7679
No log 7.2083 346 0.6409 0.3640 0.6409 0.8006
No log 7.25 348 0.6251 0.3640 0.6251 0.7907
No log 7.2917 350 0.5599 0.3970 0.5599 0.7482
No log 7.3333 352 0.5413 0.4547 0.5413 0.7357
No log 7.375 354 0.5350 0.4463 0.5350 0.7314
No log 7.4167 356 0.5440 0.4337 0.5440 0.7376
No log 7.4583 358 0.5374 0.4596 0.5374 0.7331
No log 7.5 360 0.5232 0.4617 0.5232 0.7233
No log 7.5417 362 0.5230 0.4555 0.5230 0.7232
No log 7.5833 364 0.5308 0.4234 0.5308 0.7285
No log 7.625 366 0.5374 0.4788 0.5374 0.7330
No log 7.6667 368 0.5020 0.5232 0.5020 0.7086
No log 7.7083 370 0.4959 0.4885 0.4959 0.7042
No log 7.75 372 0.4920 0.4885 0.4920 0.7014
No log 7.7917 374 0.4835 0.5107 0.4835 0.6953
No log 7.8333 376 0.4828 0.5150 0.4828 0.6949
No log 7.875 378 0.4793 0.5340 0.4793 0.6923
No log 7.9167 380 0.4927 0.5507 0.4927 0.7020
No log 7.9583 382 0.5041 0.5467 0.5041 0.7100
No log 8.0 384 0.5116 0.5592 0.5116 0.7153
No log 8.0417 386 0.4958 0.5619 0.4958 0.7041
No log 8.0833 388 0.6494 0.6028 0.6494 0.8059
No log 8.125 390 0.7909 0.5372 0.7909 0.8894
No log 8.1667 392 0.6887 0.5467 0.6887 0.8299
No log 8.2083 394 0.5397 0.5485 0.5397 0.7346
No log 8.25 396 0.4943 0.4991 0.4943 0.7031
No log 8.2917 398 0.5281 0.5078 0.5281 0.7267
No log 8.3333 400 0.5255 0.5078 0.5255 0.7249
No log 8.375 402 0.5097 0.5061 0.5097 0.7139
No log 8.4167 404 0.5719 0.5016 0.5719 0.7562
No log 8.4583 406 0.6006 0.5016 0.6006 0.7750
No log 8.5 408 0.5625 0.4067 0.5625 0.7500
No log 8.5417 410 0.5438 0.4253 0.5438 0.7374
No log 8.5833 412 0.5451 0.4019 0.5451 0.7383
No log 8.625 414 0.5384 0.3988 0.5384 0.7338
No log 8.6667 416 0.5327 0.4809 0.5327 0.7298
No log 8.7083 418 0.5315 0.5463 0.5315 0.7291
No log 8.75 420 0.5352 0.5430 0.5352 0.7316
No log 8.7917 422 0.5735 0.4701 0.5735 0.7573
No log 8.8333 424 0.6644 0.4644 0.6644 0.8151
No log 8.875 426 0.7094 0.5231 0.7094 0.8423
No log 8.9167 428 0.6779 0.5237 0.6779 0.8233
No log 8.9583 430 0.6068 0.4473 0.6068 0.7790
No log 9.0 432 0.5630 0.4455 0.5630 0.7503
No log 9.0417 434 0.5470 0.4953 0.5470 0.7396
No log 9.0833 436 0.5661 0.4937 0.5661 0.7524
No log 9.125 438 0.6306 0.4885 0.6306 0.7941
No log 9.1667 440 0.7798 0.5335 0.7798 0.8831
No log 9.2083 442 0.8961 0.4568 0.8961 0.9466
No log 9.25 444 0.8649 0.4336 0.8649 0.9300
No log 9.2917 446 0.7876 0.5085 0.7876 0.8875
No log 9.3333 448 0.7090 0.4294 0.7090 0.8420
No log 9.375 450 0.6548 0.3723 0.6548 0.8092
No log 9.4167 452 0.6212 0.4660 0.6212 0.7882
No log 9.4583 454 0.6223 0.4576 0.6223 0.7888
No log 9.5 456 0.6261 0.3723 0.6261 0.7912
No log 9.5417 458 0.6242 0.4179 0.6242 0.7901
No log 9.5833 460 0.6196 0.3625 0.6196 0.7871
No log 9.625 462 0.6316 0.2847 0.6316 0.7947
No log 9.6667 464 0.6354 0.3183 0.6354 0.7971
No log 9.7083 466 0.6233 0.2847 0.6233 0.7895
No log 9.75 468 0.6041 0.3728 0.6041 0.7772
No log 9.7917 470 0.5956 0.3939 0.5956 0.7717
No log 9.8333 472 0.5855 0.3435 0.5855 0.7652
No log 9.875 474 0.5837 0.3435 0.5837 0.7640
No log 9.9167 476 0.5948 0.4701 0.5948 0.7712
No log 9.9583 478 0.6230 0.4700 0.6230 0.7893
No log 10.0 480 0.6114 0.4700 0.6114 0.7819
No log 10.0417 482 0.6070 0.4681 0.6070 0.7791
No log 10.0833 484 0.6191 0.4747 0.6191 0.7869
No log 10.125 486 0.6472 0.3914 0.6472 0.8045
No log 10.1667 488 0.6436 0.3891 0.6436 0.8022
No log 10.2083 490 0.6182 0.3914 0.6182 0.7863
No log 10.25 492 0.6041 0.4257 0.6041 0.7773
No log 10.2917 494 0.6248 0.4413 0.6248 0.7904
No log 10.3333 496 0.6943 0.4703 0.6943 0.8333
No log 10.375 498 0.7465 0.4631 0.7465 0.8640
0.3214 10.4167 500 0.7730 0.4684 0.7730 0.8792
0.3214 10.4583 502 0.7484 0.4829 0.7484 0.8651
0.3214 10.5 504 0.6532 0.4782 0.6532 0.8082
0.3214 10.5417 506 0.5967 0.4013 0.5967 0.7724
0.3214 10.5833 508 0.5619 0.4419 0.5619 0.7496
0.3214 10.625 510 0.5783 0.4419 0.5783 0.7605
0.3214 10.6667 512 0.5990 0.4639 0.5990 0.7739
0.3214 10.7083 514 0.5809 0.4984 0.5809 0.7622
0.3214 10.75 516 0.5274 0.4934 0.5274 0.7263
0.3214 10.7917 518 0.5109 0.5195 0.5109 0.7148
0.3214 10.8333 520 0.5170 0.5177 0.5170 0.7191
0.3214 10.875 522 0.5654 0.4419 0.5654 0.7519
0.3214 10.9167 524 0.6086 0.4555 0.6086 0.7802
0.3214 10.9583 526 0.6076 0.4555 0.6076 0.7795
0.3214 11.0 528 0.6077 0.4555 0.6077 0.7795
0.3214 11.0417 530 0.5517 0.4681 0.5517 0.7428

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Downloads last month: 1

Model size: 0.1B params (Safetensors, tensor type F32)

Model tree for MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k19_task7_organization

This model is fine-tuned from aubmindlab/bert-base-arabertv02.