ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run2_AugV5_k11_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7015
  • QWK: 0.4379
  • MSE: 0.7015
  • RMSE: 0.8376
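
These values suggest the model is trained as a regression-style scorer (the loss equals the MSE) and additionally evaluated with quadratic weighted kappa (QWK) on discretized scores. The following is a minimal sketch of how such metrics are commonly computed; the actual metric code used for this run is not included in the card, so the function and variable names below are illustrative assumptions.

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def compute_metrics(preds: np.ndarray, labels: np.ndarray) -> dict:
    """Hypothetical reconstruction of the reported Loss/QWK/MSE/RMSE metrics."""
    mse = mean_squared_error(labels, preds)
    rmse = float(np.sqrt(mse))
    # QWK requires discrete categories, so continuous scores are rounded first.
    qwk = cohen_kappa_score(
        np.rint(labels).astype(int),
        np.rint(preds).astype(int),
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse, "rmse": rmse}
```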

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
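
A hedged sketch of how these hyperparameters map onto a Transformers `TrainingArguments` configuration is shown below; the original training script is not part of this card, so everything beyond the listed values (the output path in particular) is an assumption.

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the listed hyperparameters; settings not
# listed in the card (output_dir, logging, evaluation cadence) are assumptions.
training_args = TrainingArguments(
    output_dir="arabert-task7-organization",  # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```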

Training results

Training Loss Epoch Step Validation Loss QWK MSE RMSE
No log 0.0364 2 2.6379 -0.0262 2.6379 1.6242
No log 0.0727 4 1.4088 0.0470 1.4088 1.1869
No log 0.1091 6 1.2908 -0.1255 1.2908 1.1361
No log 0.1455 8 1.1505 0.0156 1.1505 1.0726
No log 0.1818 10 1.0562 0.1277 1.0562 1.0277
No log 0.2182 12 0.8602 0.2116 0.8602 0.9275
No log 0.2545 14 0.7794 0.1321 0.7794 0.8828
No log 0.2909 16 1.1112 0.0986 1.1112 1.0541
No log 0.3273 18 1.2099 0.0257 1.2099 1.1000
No log 0.3636 20 0.8968 0.2736 0.8968 0.9470
No log 0.4 22 0.7706 0.2424 0.7706 0.8778
No log 0.4364 24 0.7388 0.2353 0.7388 0.8595
No log 0.4727 26 0.8184 0.2300 0.8184 0.9046
No log 0.5091 28 0.9214 0.1941 0.9214 0.9599
No log 0.5455 30 0.7728 0.1686 0.7728 0.8791
No log 0.5818 32 0.7320 0.2002 0.7320 0.8556
No log 0.6182 34 0.7561 0.0771 0.7561 0.8695
No log 0.6545 36 0.8100 0.0898 0.8100 0.9000
No log 0.6909 38 0.9279 0.0949 0.9279 0.9633
No log 0.7273 40 0.9887 0.1697 0.9887 0.9944
No log 0.7636 42 1.1125 0.1882 1.1125 1.0547
No log 0.8 44 1.2283 0.0078 1.2283 1.1083
No log 0.8364 46 1.0963 0.2147 1.0963 1.0470
No log 0.8727 48 0.8844 0.0940 0.8844 0.9404
No log 0.9091 50 0.8247 0.1752 0.8247 0.9081
No log 0.9455 52 0.9246 0.2012 0.9246 0.9615
No log 0.9818 54 0.9193 0.2297 0.9193 0.9588
No log 1.0182 56 0.8571 0.2527 0.8571 0.9258
No log 1.0545 58 0.7809 0.0717 0.7809 0.8837
No log 1.0909 60 0.7867 0.0 0.7867 0.8869
No log 1.1273 62 0.8648 0.0 0.8648 0.9299
No log 1.1636 64 0.8901 -0.0426 0.8901 0.9434
No log 1.2 66 0.8610 0.0053 0.8610 0.9279
No log 1.2364 68 0.8007 0.0026 0.8007 0.8948
No log 1.2727 70 0.7362 0.1508 0.7362 0.8580
No log 1.3091 72 0.7313 0.1972 0.7313 0.8551
No log 1.3455 74 0.7224 0.0966 0.7224 0.8499
No log 1.3818 76 0.7970 0.2015 0.7970 0.8927
No log 1.4182 78 0.8598 0.3013 0.8598 0.9272
No log 1.4545 80 0.7762 0.3360 0.7762 0.8810
No log 1.4909 82 0.7379 0.3360 0.7379 0.8590
No log 1.5273 84 0.8110 0.3090 0.8110 0.9005
No log 1.5636 86 0.8901 0.3777 0.8901 0.9435
No log 1.6 88 0.9828 0.2958 0.9828 0.9914
No log 1.6364 90 1.1178 0.1484 1.1178 1.0572
No log 1.6727 92 1.0902 0.2298 1.0902 1.0441
No log 1.7091 94 0.9151 0.3323 0.9151 0.9566
No log 1.7455 96 0.7394 0.2361 0.7394 0.8599
No log 1.7818 98 0.6929 0.3105 0.6929 0.8324
No log 1.8182 100 0.6743 0.3446 0.6743 0.8212
No log 1.8545 102 0.7581 0.3854 0.7581 0.8707
No log 1.8909 104 0.9723 0.2627 0.9723 0.9860
No log 1.9273 106 1.2086 0.1232 1.2086 1.0993
No log 1.9636 108 1.1904 0.1232 1.1904 1.0911
No log 2.0 110 0.9854 0.2051 0.9854 0.9927
No log 2.0364 112 0.8292 0.3493 0.8292 0.9106
No log 2.0727 114 0.7639 0.4556 0.7639 0.8740
No log 2.1091 116 0.7433 0.4375 0.7433 0.8622
No log 2.1455 118 0.7001 0.4028 0.7001 0.8367
No log 2.1818 120 0.7053 0.3566 0.7053 0.8398
No log 2.2182 122 0.7778 0.3730 0.7778 0.8819
No log 2.2545 124 0.7452 0.4139 0.7452 0.8633
No log 2.2909 126 0.6926 0.3144 0.6926 0.8322
No log 2.3273 128 0.7510 0.4218 0.7510 0.8666
No log 2.3636 130 0.7862 0.3520 0.7862 0.8867
No log 2.4 132 0.6769 0.3793 0.6769 0.8227
No log 2.4364 134 0.6602 0.3200 0.6602 0.8125
No log 2.4727 136 0.6594 0.4632 0.6594 0.8121
No log 2.5091 138 0.6377 0.5189 0.6377 0.7986
No log 2.5455 140 0.7240 0.3718 0.7240 0.8509
No log 2.5818 142 0.9396 0.3156 0.9396 0.9693
No log 2.6182 144 1.1207 0.3039 1.1207 1.0586
No log 2.6545 146 1.2474 0.1444 1.2474 1.1169
No log 2.6909 148 1.1176 0.2577 1.1176 1.0571
No log 2.7273 150 0.8178 0.3247 0.8178 0.9043
No log 2.7636 152 0.5765 0.5538 0.5765 0.7593
No log 2.8 154 0.5729 0.4315 0.5729 0.7569
No log 2.8364 156 0.5844 0.4134 0.5844 0.7645
No log 2.8727 158 0.5930 0.4264 0.5930 0.7701
No log 2.9091 160 0.6361 0.2345 0.6361 0.7976
No log 2.9455 162 0.6607 0.2290 0.6607 0.8129
No log 2.9818 164 0.7044 0.2862 0.7044 0.8393
No log 3.0182 166 0.8280 0.3455 0.8280 0.9100
No log 3.0545 168 0.8329 0.3381 0.8329 0.9126
No log 3.0909 170 0.8383 0.3346 0.8383 0.9156
No log 3.1273 172 0.8890 0.3044 0.8890 0.9428
No log 3.1636 174 0.8160 0.3346 0.8160 0.9033
No log 3.2 176 0.6096 0.4384 0.6096 0.7807
No log 3.2364 178 0.6892 0.4197 0.6892 0.8302
No log 3.2727 180 0.8471 0.3251 0.8471 0.9204
No log 3.3091 182 0.7810 0.3538 0.7810 0.8837
No log 3.3455 184 0.6999 0.4247 0.6999 0.8366
No log 3.3818 186 0.6547 0.3545 0.6547 0.8091
No log 3.4182 188 0.6251 0.3813 0.6251 0.7907
No log 3.4545 190 0.6125 0.3481 0.6125 0.7826
No log 3.4909 192 0.6578 0.3716 0.6578 0.8111
No log 3.5273 194 0.7818 0.4519 0.7818 0.8842
No log 3.5636 196 0.9543 0.2154 0.9543 0.9769
No log 3.6 198 0.9791 0.2191 0.9791 0.9895
No log 3.6364 200 0.8526 0.2987 0.8526 0.9234
No log 3.6727 202 0.6627 0.4444 0.6627 0.8141
No log 3.7091 204 0.6164 0.4019 0.6164 0.7851
No log 3.7455 206 0.6187 0.3788 0.6187 0.7866
No log 3.7818 208 0.6125 0.2811 0.6125 0.7826
No log 3.8182 210 0.6804 0.4949 0.6804 0.8249
No log 3.8545 212 0.7529 0.3913 0.7529 0.8677
No log 3.8909 214 0.7251 0.4431 0.7251 0.8515
No log 3.9273 216 0.6489 0.4389 0.6489 0.8055
No log 3.9636 218 0.6110 0.4397 0.6110 0.7817
No log 4.0 220 0.6992 0.3417 0.6992 0.8362
No log 4.0364 222 0.6995 0.3976 0.6995 0.8363
No log 4.0727 224 0.6507 0.4515 0.6507 0.8067
No log 4.1091 226 0.7073 0.4923 0.7073 0.8410
No log 4.1455 228 0.7363 0.4177 0.7363 0.8581
No log 4.1818 230 0.7307 0.4272 0.7307 0.8548
No log 4.2182 232 0.7136 0.4091 0.7136 0.8447
No log 4.2545 234 0.6592 0.5304 0.6592 0.8119
No log 4.2909 236 0.6622 0.4523 0.6622 0.8137
No log 4.3273 238 0.7124 0.3798 0.7124 0.8440
No log 4.3636 240 0.7398 0.3798 0.7398 0.8601
No log 4.4 242 0.6921 0.4239 0.6921 0.8319
No log 4.4364 244 0.6703 0.3408 0.6703 0.8187
No log 4.4727 246 0.6937 0.3581 0.6937 0.8329
No log 4.5091 248 0.6946 0.3859 0.6946 0.8335
No log 4.5455 250 0.7000 0.3859 0.7000 0.8366
No log 4.5818 252 0.7553 0.3133 0.7553 0.8691
No log 4.6182 254 0.7995 0.3807 0.7995 0.8942
No log 4.6545 256 0.7918 0.3615 0.7918 0.8898
No log 4.6909 258 0.7322 0.4128 0.7322 0.8557
No log 4.7273 260 0.7150 0.4186 0.7150 0.8456
No log 4.7636 262 0.7918 0.4051 0.7918 0.8898
No log 4.8 264 0.7376 0.4123 0.7376 0.8588
No log 4.8364 266 0.6766 0.3366 0.6766 0.8226
No log 4.8727 268 0.6809 0.3467 0.6809 0.8252
No log 4.9091 270 0.7291 0.4412 0.7291 0.8538
No log 4.9455 272 0.7203 0.4059 0.7203 0.8487
No log 4.9818 274 0.6691 0.3442 0.6691 0.8180
No log 5.0182 276 0.6803 0.3934 0.6803 0.8248
No log 5.0545 278 0.6886 0.4012 0.6886 0.8298
No log 5.0909 280 0.6800 0.3509 0.6800 0.8246
No log 5.1273 282 0.7054 0.3754 0.7054 0.8399
No log 5.1636 284 0.6902 0.4005 0.6902 0.8308
No log 5.2 286 0.6427 0.3625 0.6427 0.8017
No log 5.2364 288 0.6419 0.3864 0.6419 0.8012
No log 5.2727 290 0.6092 0.3625 0.6092 0.7805
No log 5.3091 292 0.6079 0.3509 0.6079 0.7797
No log 5.3455 294 0.6425 0.4271 0.6425 0.8016
No log 5.3818 296 0.6096 0.3982 0.6096 0.7808
No log 5.4182 298 0.5682 0.4052 0.5682 0.7538
No log 5.4545 300 0.5552 0.4569 0.5552 0.7451
No log 5.4909 302 0.5506 0.4052 0.5506 0.7420
No log 5.5273 304 0.6107 0.5033 0.6107 0.7815
No log 5.5636 306 0.6679 0.4783 0.6679 0.8173
No log 5.6 308 0.6683 0.4725 0.6683 0.8175
No log 5.6364 310 0.5712 0.5587 0.5712 0.7558
No log 5.6727 312 0.5364 0.5877 0.5364 0.7324
No log 5.7091 314 0.5457 0.6013 0.5457 0.7387
No log 5.7455 316 0.5364 0.6295 0.5364 0.7324
No log 5.7818 318 0.5299 0.6197 0.5299 0.7280
No log 5.8182 320 0.5223 0.6101 0.5223 0.7227
No log 5.8545 322 0.5441 0.6210 0.5441 0.7377
No log 5.8909 324 0.5489 0.5498 0.5489 0.7409
No log 5.9273 326 0.5341 0.5943 0.5341 0.7308
No log 5.9636 328 0.5308 0.5904 0.5308 0.7285
No log 6.0 330 0.5863 0.5112 0.5863 0.7657
No log 6.0364 332 0.6080 0.4765 0.6080 0.7797
No log 6.0727 334 0.6423 0.4986 0.6423 0.8015
No log 6.1091 336 0.6049 0.5247 0.6049 0.7778
No log 6.1455 338 0.5632 0.5587 0.5632 0.7505
No log 6.1818 340 0.5327 0.5463 0.5327 0.7299
No log 6.2182 342 0.5341 0.4885 0.5341 0.7308
No log 6.2545 344 0.5364 0.4637 0.5364 0.7324
No log 6.2909 346 0.5515 0.5414 0.5515 0.7426
No log 6.3273 348 0.5735 0.5356 0.5735 0.7573
No log 6.3636 350 0.5455 0.5171 0.5455 0.7386
No log 6.4 352 0.5456 0.4596 0.5456 0.7387
No log 6.4364 354 0.5485 0.4337 0.5485 0.7406
No log 6.4727 356 0.5502 0.4681 0.5502 0.7417
No log 6.5091 358 0.5604 0.4249 0.5604 0.7486
No log 6.5455 360 0.5575 0.3552 0.5575 0.7466
No log 6.5818 362 0.5544 0.3552 0.5544 0.7446
No log 6.6182 364 0.5501 0.3552 0.5501 0.7417
No log 6.6545 366 0.5742 0.4126 0.5742 0.7578
No log 6.6909 368 0.6266 0.5603 0.6266 0.7916
No log 6.7273 370 0.6388 0.5603 0.6388 0.7992
No log 6.7636 372 0.5898 0.4888 0.5898 0.7680
No log 6.8 374 0.5564 0.3863 0.5564 0.7459
No log 6.8364 376 0.5610 0.3552 0.5610 0.7490
No log 6.8727 378 0.5902 0.3661 0.5902 0.7683
No log 6.9091 380 0.6484 0.3693 0.6484 0.8052
No log 6.9455 382 0.6547 0.3643 0.6547 0.8091
No log 6.9818 384 0.5995 0.4452 0.5995 0.7743
No log 7.0182 386 0.5506 0.4849 0.5506 0.7420
No log 7.0545 388 0.5266 0.5625 0.5266 0.7257
No log 7.0909 390 0.5172 0.6346 0.5172 0.7192
No log 7.1273 392 0.5351 0.5904 0.5351 0.7315
No log 7.1636 394 0.5796 0.5841 0.5796 0.7613
No log 7.2 396 0.6370 0.5003 0.6370 0.7981
No log 7.2364 398 0.6236 0.4851 0.6236 0.7897
No log 7.2727 400 0.5784 0.5528 0.5784 0.7605
No log 7.3091 402 0.5397 0.5339 0.5397 0.7346
No log 7.3455 404 0.5501 0.5117 0.5501 0.7417
No log 7.3818 406 0.6252 0.4946 0.6252 0.7907
No log 7.4182 408 0.7871 0.3782 0.7871 0.8872
No log 7.4545 410 0.8788 0.2873 0.8788 0.9374
No log 7.4909 412 0.7961 0.3560 0.7961 0.8923
No log 7.5273 414 0.6525 0.4860 0.6525 0.8078
No log 7.5636 416 0.5535 0.5098 0.5535 0.7440
No log 7.6 418 0.5498 0.4229 0.5498 0.7415
No log 7.6364 420 0.6071 0.4389 0.6071 0.7792
No log 7.6727 422 0.5930 0.4933 0.5930 0.7701
No log 7.7091 424 0.5532 0.5159 0.5532 0.7438
No log 7.7455 426 0.5533 0.5231 0.5533 0.7439
No log 7.7818 428 0.6173 0.5384 0.6173 0.7857
No log 7.8182 430 0.6298 0.5384 0.6298 0.7936
No log 7.8545 432 0.5774 0.5373 0.5774 0.7599
No log 7.8909 434 0.5314 0.5061 0.5314 0.7290
No log 7.9273 436 0.5485 0.5614 0.5485 0.7406
No log 7.9636 438 0.5543 0.5367 0.5543 0.7445
No log 8.0 440 0.5360 0.4314 0.5360 0.7321
No log 8.0364 442 0.5581 0.4698 0.5581 0.7470
No log 8.0727 444 0.6143 0.4106 0.6143 0.7838
No log 8.1091 446 0.6148 0.4163 0.6148 0.7841
No log 8.1455 448 0.6103 0.3640 0.6103 0.7812
No log 8.1818 450 0.5632 0.4273 0.5632 0.7504
No log 8.2182 452 0.5452 0.4774 0.5452 0.7384
No log 8.2545 454 0.5403 0.4703 0.5403 0.7351
No log 8.2909 456 0.5389 0.4746 0.5389 0.7341
No log 8.3273 458 0.5464 0.5117 0.5464 0.7392
No log 8.3636 460 0.5968 0.4969 0.5968 0.7725
No log 8.4 462 0.6662 0.5061 0.6662 0.8162
No log 8.4364 464 0.7322 0.4635 0.7322 0.8557
No log 8.4727 466 0.6530 0.5061 0.6530 0.8081
No log 8.5091 468 0.5447 0.6589 0.5447 0.7380
No log 8.5455 470 0.5221 0.5656 0.5221 0.7226
No log 8.5818 472 0.5266 0.4828 0.5266 0.7256
No log 8.6182 474 0.5253 0.5915 0.5253 0.7248
No log 8.6545 476 0.5424 0.5765 0.5424 0.7365
No log 8.6909 478 0.5799 0.4841 0.5799 0.7615
No log 8.7273 480 0.6402 0.4732 0.6402 0.8001
No log 8.7636 482 0.6230 0.5101 0.6230 0.7893
No log 8.8 484 0.5794 0.5356 0.5794 0.7612
No log 8.8364 486 0.5699 0.4746 0.5699 0.7549
No log 8.8727 488 0.5618 0.4161 0.5618 0.7495
No log 8.9091 490 0.5617 0.3863 0.5617 0.7495
No log 8.9455 492 0.5560 0.4161 0.5560 0.7456
No log 8.9818 494 0.5675 0.5189 0.5675 0.7533
No log 9.0182 496 0.5774 0.5603 0.5774 0.7599
No log 9.0545 498 0.6024 0.5544 0.6024 0.7762
0.359 9.0909 500 0.6096 0.5470 0.6096 0.7808
0.359 9.1273 502 0.5700 0.5300 0.5700 0.7550
0.359 9.1636 504 0.5286 0.5195 0.5286 0.7271
0.359 9.2 506 0.4985 0.6269 0.4985 0.7060
0.359 9.2364 508 0.5022 0.5816 0.5022 0.7087
0.359 9.2727 510 0.4961 0.6073 0.4961 0.7043
0.359 9.3091 512 0.5024 0.6101 0.5024 0.7088
0.359 9.3455 514 0.5047 0.5904 0.5047 0.7104
0.359 9.3818 516 0.4841 0.5904 0.4841 0.6958
0.359 9.4182 518 0.4678 0.6073 0.4678 0.6840
0.359 9.4545 520 0.4783 0.5722 0.4783 0.6916
0.359 9.4909 522 0.5204 0.6156 0.5204 0.7214
0.359 9.5273 524 0.6115 0.5450 0.6115 0.7820
0.359 9.5636 526 0.7569 0.4601 0.7569 0.8700
0.359 9.6 528 0.8177 0.4699 0.8177 0.9043
0.359 9.6364 530 0.8051 0.4699 0.8051 0.8973
0.359 9.6727 532 0.7015 0.4379 0.7015 0.8376

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
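
The snippet below is a minimal sketch of loading this checkpoint for inference with Transformers. It assumes the model was saved with a sequence-classification head whose output is an organization score; the exact head type and how its output should be interpreted are not documented in this card.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Repository id from the model page; head type and label mapping are assumptions.
model_id = "MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run2_AugV5_k11_task7_organization"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# Sample Arabic essay text; replace with the essay to be scored.
inputs = tokenizer("نص المقال هنا", return_tensors="pt", truncation=True)
with torch.no_grad():
    outputs = model(**inputs)
print(outputs.logits)  # raw score(s); interpretation depends on the training setup
```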