ArabicNewSplits5_FineTuningAraBERT_run1_AugV5_k9_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5317
  • QWK: 0.7382
  • MSE: 0.5317
  • RMSE: 0.7292

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
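With a linear scheduler and, apparently, no warmup, the learning rate decays from 2e-05 at step 0 to 0 at the final optimizer step (570 in the training log below). A sketch of the schedule, assuming zero warmup steps:

```python
INITIAL_LR = 2e-05
TOTAL_STEPS = 570  # final step in the training log

def linear_lr(step, initial_lr=INITIAL_LR, total_steps=TOTAL_STEPS):
    """Linearly decay the learning rate to zero; no warmup assumed."""
    return initial_lr * max(0.0, 1.0 - step / total_steps)
```

For example, halfway through training (`linear_lr(285)`) the rate is half the initial value.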

Training results

Training loss is reported only at logging steps (first recorded at step 500), so earlier rows show "No log".

Training Loss Epoch Step Validation Loss QWK MSE RMSE
No log 0.0351 2 5.3966 -0.0402 5.3966 2.3231
No log 0.0702 4 3.2393 0.0743 3.2393 1.7998
No log 0.1053 6 3.0077 -0.0966 3.0077 1.7343
No log 0.1404 8 2.3659 -0.0525 2.3659 1.5381
No log 0.1754 10 1.3731 0.2247 1.3731 1.1718
No log 0.2105 12 1.4555 0.0605 1.4555 1.2065
No log 0.2456 14 2.1159 0.0409 2.1159 1.4546
No log 0.2807 16 2.8104 -0.0515 2.8104 1.6764
No log 0.3158 18 2.6694 0.0080 2.6694 1.6338
No log 0.3509 20 2.1302 0.0950 2.1302 1.4595
No log 0.3860 22 1.6291 0.0368 1.6291 1.2764
No log 0.4211 24 1.5039 0.0187 1.5039 1.2263
No log 0.4561 26 1.4443 0.0187 1.4443 1.2018
No log 0.4912 28 1.3489 0.0614 1.3489 1.1614
No log 0.5263 30 1.3396 0.0385 1.3396 1.1574
No log 0.5614 32 1.3582 0.0470 1.3582 1.1654
No log 0.5965 34 1.4148 0.0385 1.4148 1.1894
No log 0.6316 36 1.4938 0.0636 1.4938 1.2222
No log 0.6667 38 1.4611 0.0980 1.4611 1.2088
No log 0.7018 40 1.6172 0.0985 1.6172 1.2717
No log 0.7368 42 1.7006 0.1452 1.7006 1.3041
No log 0.7719 44 1.5391 0.1443 1.5391 1.2406
No log 0.8070 46 1.3963 0.1864 1.3963 1.1817
No log 0.8421 48 1.2286 0.2726 1.2286 1.1084
No log 0.8772 50 1.3794 0.2297 1.3794 1.1745
No log 0.9123 52 1.6015 0.1948 1.6015 1.2655
No log 0.9474 54 1.8448 0.2343 1.8448 1.3582
No log 0.9825 56 2.0646 0.2665 2.0646 1.4369
No log 1.0175 58 2.0615 0.2795 2.0615 1.4358
No log 1.0526 60 1.8900 0.2643 1.8900 1.3748
No log 1.0877 62 1.4174 0.2761 1.4174 1.1905
No log 1.1228 64 1.0588 0.3930 1.0588 1.0290
No log 1.1579 66 0.9726 0.4372 0.9726 0.9862
No log 1.1930 68 1.0550 0.4239 1.0550 1.0271
No log 1.2281 70 1.5155 0.3479 1.5155 1.2311
No log 1.2632 72 2.2052 0.2229 2.2052 1.4850
No log 1.2982 74 2.2084 0.2024 2.2084 1.4861
No log 1.3333 76 1.7347 0.3304 1.7347 1.3171
No log 1.3684 78 1.1051 0.4792 1.1051 1.0512
No log 1.4035 80 0.7470 0.5636 0.7470 0.8643
No log 1.4386 82 0.7248 0.5912 0.7248 0.8513
No log 1.4737 84 0.7591 0.5768 0.7591 0.8713
No log 1.5088 86 0.6803 0.6371 0.6803 0.8248
No log 1.5439 88 0.6879 0.6429 0.6879 0.8294
No log 1.5789 90 1.0311 0.5427 1.0311 1.0154
No log 1.6140 92 1.2615 0.4739 1.2615 1.1232
No log 1.6491 94 1.2093 0.4710 1.2093 1.0997
No log 1.6842 96 0.9636 0.5447 0.9636 0.9816
No log 1.7193 98 0.7508 0.6453 0.7508 0.8665
No log 1.7544 100 0.6541 0.6398 0.6541 0.8087
No log 1.7895 102 0.6470 0.6343 0.6470 0.8044
No log 1.8246 104 0.7320 0.5974 0.7320 0.8556
No log 1.8596 106 0.9398 0.5879 0.9398 0.9694
No log 1.8947 108 0.9499 0.6110 0.9499 0.9746
No log 1.9298 110 0.6939 0.6702 0.6939 0.8330
No log 1.9649 112 0.6333 0.7051 0.6333 0.7958
No log 2.0 114 0.5932 0.7331 0.5932 0.7702
No log 2.0351 116 0.5901 0.7367 0.5901 0.7682
No log 2.0702 118 0.6028 0.7446 0.6028 0.7764
No log 2.1053 120 0.7412 0.6679 0.7412 0.8609
No log 2.1404 122 0.7802 0.6429 0.7802 0.8833
No log 2.1754 124 0.6845 0.7204 0.6845 0.8274
No log 2.2105 126 0.5794 0.7556 0.5794 0.7612
No log 2.2456 128 0.6640 0.7245 0.6640 0.8148
No log 2.2807 130 0.6971 0.7222 0.6971 0.8349
No log 2.3158 132 0.5938 0.7287 0.5938 0.7706
No log 2.3509 134 0.7938 0.6395 0.7938 0.8910
No log 2.3860 136 1.2181 0.4099 1.2181 1.1037
No log 2.4211 138 1.3034 0.3680 1.3034 1.1417
No log 2.4561 140 0.9976 0.5233 0.9976 0.9988
No log 2.4912 142 0.7506 0.6262 0.7506 0.8664
No log 2.5263 144 0.6413 0.6960 0.6413 0.8008
No log 2.5614 146 0.5541 0.7086 0.5541 0.7444
No log 2.5965 148 0.5524 0.7096 0.5524 0.7432
No log 2.6316 150 0.5346 0.7208 0.5346 0.7312
No log 2.6667 152 0.5310 0.7445 0.5310 0.7287
No log 2.7018 154 0.5649 0.7669 0.5649 0.7516
No log 2.7368 156 0.5771 0.7460 0.5771 0.7597
No log 2.7719 158 0.5909 0.7432 0.5909 0.7687
No log 2.8070 160 0.5472 0.7741 0.5472 0.7397
No log 2.8421 162 0.5427 0.7647 0.5427 0.7367
No log 2.8772 164 0.5307 0.7574 0.5307 0.7285
No log 2.9123 166 0.5308 0.7492 0.5308 0.7286
No log 2.9474 168 0.5547 0.7539 0.5547 0.7448
No log 2.9825 170 0.5720 0.7500 0.5720 0.7563
No log 3.0175 172 0.5676 0.7569 0.5676 0.7534
No log 3.0526 174 0.5449 0.7133 0.5449 0.7382
No log 3.0877 176 0.5996 0.7356 0.5996 0.7743
No log 3.1228 178 0.6617 0.7357 0.6617 0.8135
No log 3.1579 180 0.6059 0.7324 0.6059 0.7784
No log 3.1930 182 0.5580 0.7277 0.5580 0.7470
No log 3.2281 184 0.6040 0.7409 0.6040 0.7772
No log 3.2632 186 0.6157 0.7436 0.6157 0.7847
No log 3.2982 188 0.5721 0.7348 0.5721 0.7563
No log 3.3333 190 0.5510 0.7359 0.5510 0.7423
No log 3.3684 192 0.6029 0.7450 0.6029 0.7765
No log 3.4035 194 0.6357 0.7318 0.6357 0.7973
No log 3.4386 196 0.5927 0.7350 0.5927 0.7699
No log 3.4737 198 0.5398 0.7655 0.5398 0.7347
No log 3.5088 200 0.5436 0.7494 0.5436 0.7373
No log 3.5439 202 0.5400 0.7469 0.5400 0.7349
No log 3.5789 204 0.5254 0.7514 0.5254 0.7249
No log 3.6140 206 0.5251 0.7590 0.5251 0.7247
No log 3.6491 208 0.5284 0.7524 0.5284 0.7269
No log 3.6842 210 0.5548 0.7489 0.5548 0.7449
No log 3.7193 212 0.5686 0.7533 0.5686 0.7540
No log 3.7544 214 0.5529 0.7452 0.5529 0.7436
No log 3.7895 216 0.5574 0.7443 0.5574 0.7466
No log 3.8246 218 0.5786 0.7306 0.5786 0.7607
No log 3.8596 220 0.6079 0.7383 0.6079 0.7796
No log 3.8947 222 0.5706 0.7124 0.5706 0.7554
No log 3.9298 224 0.5549 0.7351 0.5549 0.7449
No log 3.9649 226 0.5608 0.7368 0.5608 0.7489
No log 4.0 228 0.5642 0.7209 0.5642 0.7511
No log 4.0351 230 0.5716 0.7360 0.5716 0.7560
No log 4.0702 232 0.5732 0.7198 0.5732 0.7571
No log 4.1053 234 0.5541 0.7261 0.5541 0.7444
No log 4.1404 236 0.5469 0.7363 0.5469 0.7395
No log 4.1754 238 0.5370 0.7432 0.5370 0.7328
No log 4.2105 240 0.5316 0.7407 0.5316 0.7291
No log 4.2456 242 0.5286 0.7418 0.5286 0.7270
No log 4.2807 244 0.5346 0.7492 0.5346 0.7312
No log 4.3158 246 0.5383 0.7484 0.5383 0.7337
No log 4.3509 248 0.5353 0.7374 0.5353 0.7317
No log 4.3860 250 0.5464 0.7359 0.5464 0.7392
No log 4.4211 252 0.5542 0.7560 0.5542 0.7444
No log 4.4561 254 0.5435 0.7368 0.5435 0.7372
No log 4.4912 256 0.5525 0.7383 0.5525 0.7433
No log 4.5263 258 0.5603 0.7285 0.5603 0.7485
No log 4.5614 260 0.5733 0.7428 0.5733 0.7572
No log 4.5965 262 0.5609 0.7389 0.5609 0.7489
No log 4.6316 264 0.5563 0.7279 0.5563 0.7458
No log 4.6667 266 0.5696 0.7622 0.5696 0.7547
No log 4.7018 268 0.6342 0.7159 0.6342 0.7963
No log 4.7368 270 0.6658 0.6771 0.6658 0.8160
No log 4.7719 272 0.6281 0.7070 0.6281 0.7925
No log 4.8070 274 0.5984 0.7253 0.5984 0.7736
No log 4.8421 276 0.5726 0.7360 0.5726 0.7567
No log 4.8772 278 0.5660 0.7474 0.5660 0.7523
No log 4.9123 280 0.5671 0.7385 0.5671 0.7531
No log 4.9474 282 0.5684 0.7339 0.5684 0.7539
No log 4.9825 284 0.5697 0.7413 0.5697 0.7548
No log 5.0175 286 0.5670 0.7437 0.5670 0.7530
No log 5.0526 288 0.5867 0.7523 0.5867 0.7659
No log 5.0877 290 0.6038 0.7378 0.6038 0.7770
No log 5.1228 292 0.5812 0.7616 0.5812 0.7623
No log 5.1579 294 0.5830 0.7223 0.5830 0.7636
No log 5.1930 296 0.6130 0.7299 0.6130 0.7829
No log 5.2281 298 0.6200 0.7248 0.6200 0.7874
No log 5.2632 300 0.6118 0.7314 0.6118 0.7822
No log 5.2982 302 0.6018 0.7177 0.6018 0.7757
No log 5.3333 304 0.5891 0.7181 0.5891 0.7675
No log 5.3684 306 0.5711 0.7350 0.5711 0.7557
No log 5.4035 308 0.5586 0.7369 0.5586 0.7474
No log 5.4386 310 0.5532 0.7461 0.5532 0.7438
No log 5.4737 312 0.5594 0.7604 0.5594 0.7480
No log 5.5088 314 0.5558 0.7473 0.5558 0.7455
No log 5.5439 316 0.5602 0.7401 0.5602 0.7484
No log 5.5789 318 0.5768 0.7371 0.5768 0.7595
No log 5.6140 320 0.6028 0.7455 0.6028 0.7764
No log 5.6491 322 0.6563 0.7276 0.6563 0.8101
No log 5.6842 324 0.6640 0.7261 0.6640 0.8149
No log 5.7193 326 0.6550 0.7382 0.6550 0.8093
No log 5.7544 328 0.6056 0.7405 0.6056 0.7782
No log 5.7895 330 0.5808 0.7257 0.5808 0.7621
No log 5.8246 332 0.5597 0.7267 0.5597 0.7481
No log 5.8596 334 0.5516 0.7335 0.5516 0.7427
No log 5.8947 336 0.5525 0.7323 0.5525 0.7433
No log 5.9298 338 0.5603 0.7395 0.5603 0.7485
No log 5.9649 340 0.5467 0.7379 0.5467 0.7394
No log 6.0 342 0.5319 0.7568 0.5319 0.7293
No log 6.0351 344 0.5289 0.7684 0.5289 0.7272
No log 6.0702 346 0.5272 0.7624 0.5272 0.7261
No log 6.1053 348 0.5184 0.7624 0.5184 0.7200
No log 6.1404 350 0.5156 0.7562 0.5156 0.7180
No log 6.1754 352 0.5176 0.7582 0.5176 0.7194
No log 6.2105 354 0.5116 0.7562 0.5116 0.7153
No log 6.2456 356 0.5119 0.7624 0.5119 0.7155
No log 6.2807 358 0.5206 0.7675 0.5206 0.7215
No log 6.3158 360 0.5219 0.7655 0.5219 0.7224
No log 6.3509 362 0.5088 0.7775 0.5088 0.7133
No log 6.3860 364 0.5027 0.7613 0.5027 0.7090
No log 6.4211 366 0.5037 0.7613 0.5037 0.7097
No log 6.4561 368 0.5058 0.7664 0.5058 0.7112
No log 6.4912 370 0.5139 0.7775 0.5139 0.7169
No log 6.5263 372 0.5295 0.7593 0.5295 0.7277
No log 6.5614 374 0.5376 0.7379 0.5376 0.7332
No log 6.5965 376 0.5293 0.7604 0.5293 0.7275
No log 6.6316 378 0.5204 0.7483 0.5204 0.7214
No log 6.6667 380 0.5312 0.7400 0.5312 0.7288
No log 6.7018 382 0.5377 0.7327 0.5377 0.7333
No log 6.7368 384 0.5348 0.7505 0.5348 0.7313
No log 6.7719 386 0.5250 0.7603 0.5250 0.7245
No log 6.8070 388 0.5262 0.7510 0.5262 0.7254
No log 6.8421 390 0.5251 0.7510 0.5251 0.7246
No log 6.8772 392 0.5240 0.7623 0.5240 0.7239
No log 6.9123 394 0.5403 0.7418 0.5403 0.7351
No log 6.9474 396 0.5484 0.7343 0.5484 0.7405
No log 6.9825 398 0.5437 0.7433 0.5437 0.7373
No log 7.0175 400 0.5420 0.7587 0.5420 0.7362
No log 7.0526 402 0.5320 0.7572 0.5320 0.7294
No log 7.0877 404 0.5294 0.7642 0.5294 0.7276
No log 7.1228 406 0.5318 0.7572 0.5318 0.7292
No log 7.1579 408 0.5441 0.7536 0.5441 0.7376
No log 7.1930 410 0.5539 0.7576 0.5539 0.7442
No log 7.2281 412 0.5711 0.7467 0.5711 0.7557
No log 7.2632 414 0.5787 0.7503 0.5787 0.7607
No log 7.2982 416 0.5902 0.7331 0.5902 0.7682
No log 7.3333 418 0.5949 0.7331 0.5949 0.7713
No log 7.3684 420 0.5919 0.7347 0.5919 0.7694
No log 7.4035 422 0.5745 0.7422 0.5745 0.7580
No log 7.4386 424 0.5631 0.7576 0.5631 0.7504
No log 7.4737 426 0.5566 0.7576 0.5566 0.7461
No log 7.5088 428 0.5595 0.7530 0.5595 0.7480
No log 7.5439 430 0.5539 0.7429 0.5539 0.7442
No log 7.5789 432 0.5484 0.7475 0.5484 0.7405
No log 7.6140 434 0.5427 0.7572 0.5427 0.7367
No log 7.6491 436 0.5409 0.7617 0.5409 0.7355
No log 7.6842 438 0.5386 0.7559 0.5386 0.7339
No log 7.7193 440 0.5394 0.7599 0.5394 0.7344
No log 7.7544 442 0.5398 0.7599 0.5398 0.7347
No log 7.7895 444 0.5355 0.7540 0.5355 0.7318
No log 7.8246 446 0.5322 0.7501 0.5322 0.7295
No log 7.8596 448 0.5314 0.7501 0.5314 0.7290
No log 7.8947 450 0.5337 0.7540 0.5337 0.7305
No log 7.9298 452 0.5409 0.7568 0.5409 0.7355
No log 7.9649 454 0.5413 0.7481 0.5413 0.7358
No log 8.0 456 0.5377 0.7463 0.5377 0.7333
No log 8.0351 458 0.5324 0.7514 0.5324 0.7296
No log 8.0702 460 0.5297 0.7480 0.5297 0.7278
No log 8.1053 462 0.5297 0.7380 0.5297 0.7278
No log 8.1404 464 0.5293 0.7380 0.5293 0.7275
No log 8.1754 466 0.5289 0.7501 0.5289 0.7272
No log 8.2105 468 0.5282 0.7501 0.5282 0.7268
No log 8.2456 470 0.5271 0.7501 0.5271 0.7260
No log 8.2807 472 0.5265 0.7520 0.5265 0.7256
No log 8.3158 474 0.5273 0.7536 0.5273 0.7261
No log 8.3509 476 0.5284 0.7576 0.5284 0.7269
No log 8.3860 478 0.5331 0.7435 0.5331 0.7302
No log 8.4211 480 0.5399 0.7321 0.5399 0.7348
No log 8.4561 482 0.5437 0.7422 0.5437 0.7373
No log 8.4912 484 0.5467 0.7482 0.5467 0.7394
No log 8.5263 486 0.5446 0.7482 0.5446 0.7380
No log 8.5614 488 0.5436 0.7482 0.5436 0.7373
No log 8.5965 490 0.5360 0.7467 0.5360 0.7321
No log 8.6316 492 0.5320 0.7538 0.5320 0.7294
No log 8.6667 494 0.5314 0.7538 0.5314 0.7290
No log 8.7018 496 0.5287 0.7499 0.5287 0.7271
No log 8.7368 498 0.5292 0.7538 0.5292 0.7275
0.4266 8.7719 500 0.5327 0.7467 0.5327 0.7299
0.4266 8.8070 502 0.5409 0.7422 0.5409 0.7354
0.4266 8.8421 504 0.5550 0.7416 0.5550 0.7450
0.4266 8.8772 506 0.5668 0.7416 0.5668 0.7529
0.4266 8.9123 508 0.5684 0.7416 0.5684 0.7540
0.4266 8.9474 510 0.5607 0.7354 0.5607 0.7488
0.4266 8.9825 512 0.5484 0.7354 0.5484 0.7405
0.4266 9.0175 514 0.5361 0.7360 0.5361 0.7322
0.4266 9.0526 516 0.5309 0.7435 0.5309 0.7286
0.4266 9.0877 518 0.5276 0.7435 0.5276 0.7264
0.4266 9.1228 520 0.5244 0.7591 0.5244 0.7241
0.4266 9.1579 522 0.5237 0.7530 0.5237 0.7237
0.4266 9.1930 524 0.5237 0.7591 0.5237 0.7237
0.4266 9.2281 526 0.5221 0.7591 0.5221 0.7226
0.4266 9.2632 528 0.5217 0.7591 0.5217 0.7223
0.4266 9.2982 530 0.5216 0.7591 0.5216 0.7222
0.4266 9.3333 532 0.5219 0.7591 0.5219 0.7224
0.4266 9.3684 534 0.5215 0.7591 0.5215 0.7222
0.4266 9.4035 536 0.5226 0.7481 0.5226 0.7229
0.4266 9.4386 538 0.5238 0.7412 0.5238 0.7237
0.4266 9.4737 540 0.5252 0.7412 0.5252 0.7247
0.4266 9.5088 542 0.5273 0.7412 0.5273 0.7262
0.4266 9.5439 544 0.5289 0.7366 0.5289 0.7273
0.4266 9.5789 546 0.5294 0.7428 0.5294 0.7276
0.4266 9.6140 548 0.5306 0.7428 0.5306 0.7284
0.4266 9.6491 550 0.5308 0.7428 0.5308 0.7286
0.4266 9.6842 552 0.5313 0.7428 0.5313 0.7289
0.4266 9.7193 554 0.5317 0.7428 0.5317 0.7292
0.4266 9.7544 556 0.5322 0.7382 0.5322 0.7295
0.4266 9.7895 558 0.5325 0.7382 0.5325 0.7297
0.4266 9.8246 560 0.5323 0.7382 0.5323 0.7296
0.4266 9.8596 562 0.5324 0.7382 0.5324 0.7296
0.4266 9.8947 564 0.5324 0.7382 0.5324 0.7296
0.4266 9.9298 566 0.5319 0.7382 0.5319 0.7293
0.4266 9.9649 568 0.5318 0.7382 0.5318 0.7292
0.4266 10.0 570 0.5317 0.7382 0.5317 0.7292

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
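Loading the model for inference should look roughly like the following. This is a sketch assuming a standard sequence-classification head; the MSE/RMSE metrics suggest a single regression output, so adjust the post-processing if the head is categorical. Requires transformers and torch.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "MayBashendy/ArabicNewSplits5_FineTuningAraBERT_run1_AugV5_k9_task1_organization"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

# Example input; replace with the text to score.
inputs = tokenizer("نص المقال هنا", return_tensors="pt")
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(score)
```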
Model size

  • Parameters: 0.1B (Safetensors)
  • Tensor type: F32

Full model ID: MayBashendy/ArabicNewSplits5_FineTuningAraBERT_run1_AugV5_k9_task1_organization (finetuned from aubmindlab/bert-base-arabertv02)