ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k18_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a minimal loading sketch follows the list below):

  • Loss: 0.7076
  • Qwk: 0.7273
  • Mse: 0.7076
  • Rmse: 0.8412
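
The reported QWK, MSE and RMSE suggest the checkpoint scores Arabic text on a numeric scale through a single-output regression head; the card does not state this, so treat it as an assumption. Under that assumption, a minimal loading and inference sketch:

```python
# Minimal inference sketch. The single-output regression head is an assumption
# based on the MSE/RMSE metrics reported above; adjust if the checkpoint
# actually uses a classification head.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

repo = "MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run2_AugV5_k18_task1_organization"

tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForSequenceClassification.from_pretrained(repo)
model.eval()

inputs = tokenizer("نص عربي للتقييم", return_tensors="pt", truncation=True)
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()
print(score)
```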

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a matching TrainingArguments sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
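
For reference, a sketch of a Transformers TrainingArguments object mirroring the list above. Settings not mentioned in this card (output path, evaluation and logging cadence) are assumptions inferred from the results table, not documented values:

```python
# Sketch of a Trainer configuration matching the hyperparameters above.
# Anything not listed in the card (warmup, weight decay, etc.) is left at
# the library defaults; output_dir is hypothetical.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_task1_organization",  # hypothetical
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    eval_strategy="steps",  # the results table reports validation every 2 steps
    eval_steps=2,
    logging_steps=500,      # training loss first appears at step 500
)
```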

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0244 2 6.7876 0.0185 6.7876 2.6053
No log 0.0488 4 4.1137 0.0365 4.1137 2.0282
No log 0.0732 6 2.9131 0.0387 2.9131 1.7068
No log 0.0976 8 2.1470 0.1343 2.1470 1.4653
No log 0.1220 10 2.2268 0.0146 2.2268 1.4923
No log 0.1463 12 2.0053 0.2698 2.0053 1.4161
No log 0.1707 14 1.7299 0.3167 1.7299 1.3153
No log 0.1951 16 1.9376 0.3358 1.9376 1.3920
No log 0.2195 18 2.7419 0.0741 2.7419 1.6559
No log 0.2439 20 2.4864 0.0629 2.4864 1.5768
No log 0.2683 22 1.8850 0.2520 1.8850 1.3730
No log 0.2927 24 1.5340 0.2593 1.5340 1.2385
No log 0.3171 26 1.5311 0.2752 1.5311 1.2374
No log 0.3415 28 1.5619 0.2931 1.5619 1.2498
No log 0.3659 30 1.5292 0.3361 1.5292 1.2366
No log 0.3902 32 1.2774 0.3717 1.2774 1.1302
No log 0.4146 34 1.3420 0.5289 1.3420 1.1584
No log 0.4390 36 1.5430 0.3590 1.5430 1.2422
No log 0.4634 38 1.4809 0.3091 1.4809 1.2169
No log 0.4878 40 1.4110 0.2430 1.4110 1.1879
No log 0.5122 42 1.3461 0.2222 1.3461 1.1602
No log 0.5366 44 1.4516 0.2202 1.4516 1.2048
No log 0.5610 46 1.5154 0.2832 1.5154 1.2310
No log 0.5854 48 1.4801 0.2202 1.4801 1.2166
No log 0.6098 50 1.4534 0.2783 1.4534 1.2056
No log 0.6341 52 1.4835 0.3529 1.4835 1.2180
No log 0.6585 54 1.4513 0.3667 1.4513 1.2047
No log 0.6829 56 1.4071 0.3667 1.4071 1.1862
No log 0.7073 58 1.3082 0.3130 1.3082 1.1438
No log 0.7317 60 1.1575 0.3304 1.1575 1.0759
No log 0.7561 62 1.1145 0.5210 1.1145 1.0557
No log 0.7805 64 1.0785 0.5484 1.0785 1.0385
No log 0.8049 66 0.9587 0.496 0.9587 0.9791
No log 0.8293 68 0.9238 0.6324 0.9238 0.9612
No log 0.8537 70 0.8578 0.7162 0.8578 0.9262
No log 0.8780 72 0.8091 0.6761 0.8091 0.8995
No log 0.9024 74 0.8366 0.6806 0.8366 0.9147
No log 0.9268 76 0.7886 0.7123 0.7886 0.8881
No log 0.9512 78 0.7256 0.7763 0.7256 0.8518
No log 0.9756 80 0.7199 0.7763 0.7199 0.8485
No log 1.0 82 0.7081 0.7949 0.7081 0.8415
No log 1.0244 84 0.6919 0.8125 0.6919 0.8318
No log 1.0488 86 0.6681 0.7673 0.6681 0.8173
No log 1.0732 88 0.8306 0.7215 0.8306 0.9114
No log 1.0976 90 0.9486 0.6667 0.9486 0.9740
No log 1.1220 92 0.8391 0.6974 0.8391 0.9160
No log 1.1463 94 0.7730 0.7368 0.7730 0.8792
No log 1.1707 96 1.0581 0.5563 1.0581 1.0286
No log 1.1951 98 1.0205 0.56 1.0205 1.0102
No log 1.2195 100 0.7906 0.7285 0.7906 0.8891
No log 1.2439 102 0.8538 0.6892 0.8538 0.9240
No log 1.2683 104 0.9389 0.6486 0.9389 0.9690
No log 1.2927 106 0.8790 0.6968 0.8790 0.9375
No log 1.3171 108 0.7014 0.7901 0.7014 0.8375
No log 1.3415 110 0.5864 0.8098 0.5864 0.7658
No log 1.3659 112 0.5523 0.8293 0.5523 0.7431
No log 1.3902 114 0.5593 0.7975 0.5593 0.7478
No log 1.4146 116 0.6747 0.7347 0.6747 0.8214
No log 1.4390 118 0.7128 0.7172 0.7128 0.8443
No log 1.4634 120 0.6388 0.8105 0.6388 0.7993
No log 1.4878 122 0.7963 0.6294 0.7963 0.8923
No log 1.5122 124 0.8082 0.6294 0.8082 0.8990
No log 1.5366 126 0.7315 0.7755 0.7315 0.8553
No log 1.5610 128 0.8699 0.6573 0.8699 0.9327
No log 1.5854 130 0.9960 0.6301 0.9960 0.9980
No log 1.6098 132 0.9203 0.6301 0.9203 0.9593
No log 1.6341 134 0.7363 0.7919 0.7363 0.8581
No log 1.6585 136 0.7149 0.7843 0.7149 0.8455
No log 1.6829 138 0.7359 0.7632 0.7359 0.8578
No log 1.7073 140 0.7379 0.7417 0.7379 0.8590
No log 1.7317 142 0.9338 0.6621 0.9338 0.9663
No log 1.7561 144 1.1600 0.6027 1.1600 1.0770
No log 1.7805 146 1.1027 0.6207 1.1027 1.0501
No log 1.8049 148 0.8579 0.6853 0.8579 0.9262
No log 1.8293 150 0.7371 0.7517 0.7371 0.8586
No log 1.8537 152 0.7176 0.7517 0.7176 0.8471
No log 1.8780 154 0.8122 0.7075 0.8122 0.9012
No log 1.9024 156 1.1475 0.6027 1.1475 1.0712
No log 1.9268 158 1.2951 0.5369 1.2951 1.1380
No log 1.9512 160 1.1728 0.5850 1.1728 1.0830
No log 1.9756 162 0.9801 0.6846 0.9801 0.9900
No log 2.0 164 0.9357 0.7034 0.9357 0.9673
No log 2.0244 166 0.9463 0.6713 0.9463 0.9728
No log 2.0488 168 1.0350 0.6809 1.0350 1.0174
No log 2.0732 170 1.1134 0.6667 1.1134 1.0552
No log 2.0976 172 1.1458 0.6434 1.1458 1.0704
No log 2.1220 174 1.0822 0.625 1.0822 1.0403
No log 2.1463 176 0.9393 0.6803 0.9393 0.9692
No log 2.1707 178 0.9032 0.6803 0.9032 0.9504
No log 2.1951 180 0.9119 0.6622 0.9119 0.9550
No log 2.2195 182 0.8145 0.7308 0.8145 0.9025
No log 2.2439 184 0.7285 0.7662 0.7285 0.8535
No log 2.2683 186 0.7105 0.7742 0.7105 0.8429
No log 2.2927 188 0.6636 0.7792 0.6636 0.8146
No log 2.3171 190 0.6376 0.8302 0.6376 0.7985
No log 2.3415 192 0.6059 0.8050 0.6059 0.7784
No log 2.3659 194 0.6362 0.8025 0.6362 0.7977
No log 2.3902 196 0.6724 0.7368 0.6724 0.8200
No log 2.4146 198 0.8100 0.6803 0.8100 0.9000
No log 2.4390 200 0.7622 0.7105 0.7622 0.8730
No log 2.4634 202 0.6760 0.7285 0.6760 0.8222
No log 2.4878 204 0.5869 0.7763 0.5869 0.7661
No log 2.5122 206 0.5728 0.7843 0.5728 0.7568
No log 2.5366 208 0.5657 0.7712 0.5657 0.7521
No log 2.5610 210 0.5806 0.7550 0.5806 0.7620
No log 2.5854 212 0.6440 0.7123 0.6440 0.8025
No log 2.6098 214 0.6159 0.7919 0.6159 0.7848
No log 2.6341 216 0.6158 0.8054 0.6158 0.7847
No log 2.6585 218 0.6661 0.7347 0.6661 0.8162
No log 2.6829 220 0.7159 0.7114 0.7159 0.8461
No log 2.7073 222 0.8014 0.6803 0.8014 0.8952
No log 2.7317 224 0.7567 0.7172 0.7567 0.8699
No log 2.7561 226 0.7655 0.7639 0.7655 0.8749
No log 2.7805 228 0.7909 0.7234 0.7909 0.8894
No log 2.8049 230 0.8036 0.7324 0.8036 0.8965
No log 2.8293 232 0.9234 0.6667 0.9234 0.9610
No log 2.8537 234 1.0327 0.6383 1.0327 1.0162
No log 2.8780 236 1.0024 0.6383 1.0024 1.0012
No log 2.9024 238 0.8453 0.6986 0.8453 0.9194
No log 2.9268 240 0.7197 0.7703 0.7197 0.8484
No log 2.9512 242 0.7340 0.8101 0.7340 0.8568
No log 2.9756 244 0.7234 0.8077 0.7234 0.8505
No log 3.0 246 0.7517 0.7 0.7517 0.8670
No log 3.0244 248 0.7741 0.6906 0.7741 0.8798
No log 3.0488 250 0.7460 0.7143 0.7460 0.8637
No log 3.0732 252 0.6872 0.75 0.6872 0.8290
No log 3.0976 254 0.6683 0.75 0.6683 0.8175
No log 3.1220 256 0.6613 0.7075 0.6613 0.8132
No log 3.1463 258 0.5989 0.7733 0.5989 0.7739
No log 3.1707 260 0.5618 0.8571 0.5618 0.7495
No log 3.1951 262 0.5679 0.8571 0.5679 0.7536
No log 3.2195 264 0.6209 0.7417 0.6209 0.7880
No log 3.2439 266 0.7778 0.6968 0.7778 0.8819
No log 3.2683 268 0.7665 0.6980 0.7665 0.8755
No log 3.2927 270 0.6480 0.7368 0.6480 0.8050
No log 3.3171 272 0.5802 0.8323 0.5802 0.7617
No log 3.3415 274 0.6005 0.8333 0.6005 0.7749
No log 3.3659 276 0.6788 0.7297 0.6788 0.8239
No log 3.3902 278 0.6926 0.7297 0.6926 0.8322
No log 3.4146 280 0.6365 0.7662 0.6365 0.7978
No log 3.4390 282 0.6415 0.7662 0.6415 0.8010
No log 3.4634 284 0.5413 0.8221 0.5413 0.7358
No log 3.4878 286 0.5243 0.8537 0.5243 0.7241
No log 3.5122 288 0.5708 0.7975 0.5708 0.7555
No log 3.5366 290 0.6550 0.7643 0.6550 0.8093
No log 3.5610 292 0.6206 0.7643 0.6206 0.7878
No log 3.5854 294 0.5786 0.8323 0.5786 0.7607
No log 3.6098 296 0.5908 0.8375 0.5908 0.7686
No log 3.6341 298 0.6578 0.8421 0.6578 0.8111
No log 3.6585 300 0.7294 0.7397 0.7294 0.8541
No log 3.6829 302 0.8729 0.6528 0.8729 0.9343
No log 3.7073 304 1.0448 0.6154 1.0448 1.0222
No log 3.7317 306 1.1310 0.5899 1.1310 1.0635
No log 3.7561 308 1.0271 0.6383 1.0271 1.0134
No log 3.7805 310 0.8673 0.6993 0.8673 0.9313
No log 3.8049 312 0.8036 0.7034 0.8036 0.8964
No log 3.8293 314 0.8334 0.6759 0.8334 0.9129
No log 3.8537 316 0.9645 0.6389 0.9645 0.9821
No log 3.8780 318 1.1026 0.6164 1.1026 1.0500
No log 3.9024 320 1.0480 0.6 1.0480 1.0237
No log 3.9268 322 0.9502 0.6714 0.9502 0.9748
No log 3.9512 324 0.8875 0.7050 0.8875 0.9421
No log 3.9756 326 0.9105 0.6957 0.9105 0.9542
No log 4.0 328 0.9653 0.6714 0.9653 0.9825
No log 4.0244 330 0.9713 0.6471 0.9713 0.9855
No log 4.0488 332 0.9912 0.6232 0.9912 0.9956
No log 4.0732 334 0.9190 0.6571 0.9190 0.9586
No log 4.0976 336 0.7742 0.6957 0.7742 0.8799
No log 4.1220 338 0.7270 0.7324 0.7270 0.8526
No log 4.1463 340 0.7197 0.7324 0.7197 0.8483
No log 4.1707 342 0.7309 0.7194 0.7309 0.8549
No log 4.1951 344 0.7393 0.7194 0.7393 0.8598
No log 4.2195 346 0.7084 0.7448 0.7084 0.8417
No log 4.2439 348 0.6766 0.7568 0.6766 0.8226
No log 4.2683 350 0.6884 0.76 0.6884 0.8297
No log 4.2927 352 0.7727 0.6759 0.7727 0.8790
No log 4.3171 354 0.8265 0.5839 0.8265 0.9091
No log 4.3415 356 0.8139 0.6131 0.8139 0.9022
No log 4.3659 358 0.7901 0.6618 0.7901 0.8889
No log 4.3902 360 0.7698 0.7465 0.7698 0.8774
No log 4.4146 362 0.7133 0.75 0.7133 0.8446
No log 4.4390 364 0.6506 0.7448 0.6506 0.8066
No log 4.4634 366 0.5815 0.8050 0.5815 0.7626
No log 4.4878 368 0.5550 0.8642 0.5550 0.7450
No log 4.5122 370 0.5722 0.8606 0.5722 0.7565
No log 4.5366 372 0.6551 0.8 0.6551 0.8094
No log 4.5610 374 0.6352 0.8075 0.6352 0.7970
No log 4.5854 376 0.5936 0.8642 0.5936 0.7705
No log 4.6098 378 0.6087 0.8462 0.6087 0.7802
No log 4.6341 380 0.5932 0.7973 0.5932 0.7702
No log 4.6585 382 0.5664 0.7755 0.5664 0.7526
No log 4.6829 384 0.5540 0.8535 0.5540 0.7443
No log 4.7073 386 0.6148 0.8050 0.6148 0.7841
No log 4.7317 388 0.6295 0.7871 0.6295 0.7934
No log 4.7561 390 0.5745 0.8280 0.5745 0.7580
No log 4.7805 392 0.5613 0.7815 0.5613 0.7492
No log 4.8049 394 0.6426 0.7347 0.6426 0.8016
No log 4.8293 396 0.6518 0.7397 0.6518 0.8074
No log 4.8537 398 0.6505 0.7815 0.6505 0.8065
No log 4.8780 400 0.7443 0.7285 0.7443 0.8627
No log 4.9024 402 0.7402 0.75 0.7402 0.8604
No log 4.9268 404 0.6729 0.7949 0.6729 0.8203
No log 4.9512 406 0.6043 0.7692 0.6043 0.7773
No log 4.9756 408 0.7150 0.6887 0.7150 0.8456
No log 5.0 410 0.7765 0.6667 0.7765 0.8812
No log 5.0244 412 0.6619 0.7407 0.6619 0.8136
No log 5.0488 414 0.5603 0.8098 0.5603 0.7485
No log 5.0732 416 0.5494 0.8098 0.5494 0.7412
No log 5.0976 418 0.6040 0.7578 0.6040 0.7772
No log 5.1220 420 0.6796 0.6753 0.6796 0.8244
No log 5.1463 422 0.8409 0.6623 0.8409 0.9170
No log 5.1707 424 0.9547 0.6528 0.9547 0.9771
No log 5.1951 426 0.8880 0.6621 0.8880 0.9424
No log 5.2195 428 0.7101 0.6986 0.7101 0.8427
No log 5.2439 430 0.5683 0.7871 0.5683 0.7539
No log 5.2683 432 0.5462 0.8176 0.5462 0.7390
No log 5.2927 434 0.5382 0.8125 0.5382 0.7336
No log 5.3171 436 0.5621 0.7821 0.5621 0.7498
No log 5.3415 438 0.6096 0.7432 0.6096 0.7808
No log 5.3659 440 0.6943 0.7172 0.6943 0.8332
No log 5.3902 442 0.7659 0.6846 0.7659 0.8752
No log 5.4146 444 0.7752 0.6846 0.7752 0.8805
No log 5.4390 446 0.7893 0.6918 0.7893 0.8884
No log 5.4634 448 0.6848 0.7397 0.6848 0.8275
No log 5.4878 450 0.6365 0.75 0.6365 0.7978
No log 5.5122 452 0.6692 0.7413 0.6692 0.8180
No log 5.5366 454 0.6973 0.7324 0.6973 0.8350
No log 5.5610 456 0.7841 0.6806 0.7841 0.8855
No log 5.5854 458 0.8807 0.6806 0.8807 0.9384
No log 5.6098 460 0.8703 0.6621 0.8703 0.9329
No log 5.6341 462 0.7308 0.6933 0.7308 0.8549
No log 5.6585 464 0.6282 0.7484 0.6282 0.7926
No log 5.6829 466 0.6082 0.7922 0.6082 0.7799
No log 5.7073 468 0.6199 0.7733 0.6199 0.7873
No log 5.7317 470 0.6179 0.7724 0.6179 0.7861
No log 5.7561 472 0.5957 0.7568 0.5956 0.7718
No log 5.7805 474 0.5490 0.7763 0.5490 0.7409
No log 5.8049 476 0.5524 0.7742 0.5524 0.7433
No log 5.8293 478 0.4996 0.8519 0.4996 0.7068
No log 5.8537 480 0.4972 0.8625 0.4972 0.7051
No log 5.8780 482 0.5115 0.8625 0.5115 0.7152
No log 5.9024 484 0.5657 0.8280 0.5657 0.7522
No log 5.9268 486 0.7009 0.7211 0.7009 0.8372
No log 5.9512 488 0.7567 0.7083 0.7567 0.8699
No log 5.9756 490 0.7210 0.7083 0.7210 0.8491
No log 6.0 492 0.6619 0.7692 0.6619 0.8136
No log 6.0244 494 0.6356 0.8138 0.6356 0.7973
No log 6.0488 496 0.6057 0.8079 0.6057 0.7782
No log 6.0732 498 0.5766 0.8133 0.5766 0.7593
0.4163 6.0976 500 0.6101 0.7432 0.6101 0.7811
0.4163 6.1220 502 0.7107 0.7211 0.7107 0.8430
0.4163 6.1463 504 0.7979 0.6667 0.7979 0.8932
0.4163 6.1707 506 0.7413 0.7273 0.7413 0.8610
0.4163 6.1951 508 0.7163 0.7260 0.7163 0.8464
0.4163 6.2195 510 0.7076 0.7273 0.7076 0.8412

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1