ArabicNewSplits7_FineTuningAraBERT_run1_AugV5_k13_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8816
  • Qwk: 0.2993
  • Mse: 0.8816
  • Rmse: 0.9389
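Here, Qwk is Cohen's kappa with quadratic weights, and Rmse is simply the square root of the reported Mse (0.9389 ≈ √0.8816). A minimal stdlib sketch of how these metrics are computed, using toy scores rather than the actual evaluation data:

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights (the Qwk metric above)."""
    n = len(y_true)
    # Observed agreement matrix: obs[i][j] counts gold label i predicted as j
    obs = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        obs[t][p] += 1
    # Expected matrix comes from the marginal label histograms
    hist_t = [sum(row) for row in obs]
    hist_p = [sum(obs[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic disagreement penalty
            num += w * obs[i][j]
            den += w * hist_t[i] * hist_p[j] / n
    return 1.0 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# Toy integer scores (NOT the actual evaluation data)
gold = [0, 1, 2, 2, 3, 1]
pred = [0, 1, 1, 2, 3, 2]
print(quadratic_weighted_kappa(gold, pred, 4), mse(gold, pred), math.sqrt(mse(gold, pred)))
```

A Qwk of 0.2993 indicates fairly weak agreement beyond chance; quadratic weighting penalizes predictions more the further they land from the gold score, which suits ordinal scoring tasks like this one.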

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
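With lr_scheduler_type: linear and no warmup listed, the learning rate decays linearly from 2e-05 to 0 over the total number of optimizer steps. A small stdlib sketch of that schedule (the step totals are illustrative, inferred from the log below where epoch 2.0 falls at step 82, i.e. roughly 41 steps per epoch):

```python
BASE_LR = 2e-05  # learning_rate from the hyperparameters above

def linear_lr(step, total_steps, warmup_steps=0, base_lr=BASE_LR):
    """Linear schedule: ramp up over warmup_steps, then decay to 0 at total_steps."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

total = 41 * 100  # ~41 steps/epoch * num_epochs (illustrative assumption)
print(linear_lr(0, total), linear_lr(total // 2, total), linear_lr(total, total))
```

Note that the run was configured for 100 epochs but the log below stops around epoch 12.4, so the effective schedule likely never reached the low-LR tail.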

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0488 2 4.8233 0.0010 4.8233 2.1962
No log 0.0976 4 2.9005 -0.0050 2.9005 1.7031
No log 0.1463 6 2.2692 -0.0796 2.2692 1.5064
No log 0.1951 8 1.6581 0.0451 1.6581 1.2877
No log 0.2439 10 1.3370 0.0361 1.3370 1.1563
No log 0.2927 12 1.3908 0.0532 1.3908 1.1793
No log 0.3415 14 1.3834 0.0543 1.3834 1.1762
No log 0.3902 16 1.4131 0.0446 1.4131 1.1887
No log 0.4390 18 1.4611 0.0439 1.4611 1.2087
No log 0.4878 20 1.5641 0.0613 1.5641 1.2506
No log 0.5366 22 1.4771 0.1045 1.4771 1.2154
No log 0.5854 24 1.5235 0.0039 1.5235 1.2343
No log 0.6341 26 1.9318 -0.2124 1.9318 1.3899
No log 0.6829 28 2.1564 -0.2483 2.1564 1.4685
No log 0.7317 30 1.6082 -0.0171 1.6082 1.2681
No log 0.7805 32 1.3341 0.1045 1.3341 1.1550
No log 0.8293 34 1.4814 0.0232 1.4814 1.2171
No log 0.8780 36 1.5432 0.0297 1.5432 1.2423
No log 0.9268 38 1.3193 0.0687 1.3193 1.1486
No log 0.9756 40 1.3010 0.2033 1.3010 1.1406
No log 1.0244 42 1.3994 0.2090 1.3994 1.1830
No log 1.0732 44 1.3195 0.2009 1.3195 1.1487
No log 1.1220 46 1.2374 0.1688 1.2374 1.1124
No log 1.1707 48 1.3803 0.1527 1.3803 1.1749
No log 1.2195 50 1.4087 0.1622 1.4087 1.1869
No log 1.2683 52 1.2132 0.1896 1.2132 1.1014
No log 1.3171 54 1.1669 0.2346 1.1669 1.0802
No log 1.3659 56 1.1528 0.2763 1.1528 1.0737
No log 1.4146 58 1.1542 0.1671 1.1542 1.0744
No log 1.4634 60 1.1286 0.2562 1.1286 1.0624
No log 1.5122 62 1.0944 0.2921 1.0944 1.0461
No log 1.5610 64 1.0870 0.3062 1.0870 1.0426
No log 1.6098 66 1.2659 0.1921 1.2659 1.1251
No log 1.6585 68 1.1139 0.2095 1.1139 1.0554
No log 1.7073 70 0.9403 0.4724 0.9403 0.9697
No log 1.7561 72 0.9089 0.3070 0.9089 0.9533
No log 1.8049 74 0.8865 0.4571 0.8865 0.9415
No log 1.8537 76 0.9113 0.3956 0.9113 0.9546
No log 1.9024 78 0.8813 0.4541 0.8813 0.9388
No log 1.9512 80 0.8140 0.4942 0.8140 0.9022
No log 2.0 82 0.7960 0.4499 0.7960 0.8922
No log 2.0488 84 0.8540 0.5141 0.8540 0.9241
No log 2.0976 86 0.7922 0.5165 0.7922 0.8900
No log 2.1463 88 0.8196 0.4998 0.8196 0.9053
No log 2.1951 90 0.9208 0.4814 0.9208 0.9596
No log 2.2439 92 0.9467 0.4390 0.9467 0.9730
No log 2.2927 94 0.8946 0.3493 0.8946 0.9459
No log 2.3415 96 0.8583 0.3584 0.8583 0.9264
No log 2.3902 98 0.8783 0.3346 0.8783 0.9372
No log 2.4390 100 0.8992 0.3695 0.8992 0.9482
No log 2.4878 102 0.9292 0.3493 0.9292 0.9640
No log 2.5366 104 0.9078 0.3637 0.9078 0.9528
No log 2.5854 106 0.9230 0.3637 0.9230 0.9607
No log 2.6341 108 0.9853 0.3700 0.9853 0.9926
No log 2.6829 110 1.0863 0.2006 1.0863 1.0422
No log 2.7317 112 0.9786 0.4141 0.9786 0.9892
No log 2.7805 114 0.9811 0.3486 0.9811 0.9905
No log 2.8293 116 1.0162 0.3312 1.0162 1.0080
No log 2.8780 118 1.0353 0.4620 1.0353 1.0175
No log 2.9268 120 1.1418 0.3705 1.1418 1.0685
No log 2.9756 122 1.0923 0.4146 1.0923 1.0451
No log 3.0244 124 1.0772 0.3998 1.0772 1.0379
No log 3.0732 126 1.1338 0.4289 1.1338 1.0648
No log 3.1220 128 1.1446 0.3954 1.1446 1.0699
No log 3.1707 130 1.0379 0.4871 1.0379 1.0188
No log 3.2195 132 1.1390 0.3084 1.1390 1.0673
No log 3.2683 134 1.1695 0.3759 1.1695 1.0814
No log 3.3171 136 1.0573 0.4802 1.0573 1.0283
No log 3.3659 138 1.0511 0.4493 1.0511 1.0252
No log 3.4146 140 1.1315 0.3325 1.1315 1.0637
No log 3.4634 142 1.0192 0.4493 1.0192 1.0096
No log 3.5122 144 0.9885 0.4119 0.9885 0.9942
No log 3.5610 146 1.0010 0.3529 1.0010 1.0005
No log 3.6098 148 0.9418 0.4119 0.9418 0.9704
No log 3.6585 150 0.9313 0.4455 0.9313 0.9650
No log 3.7073 152 0.9156 0.3925 0.9156 0.9569
No log 3.7561 154 0.9490 0.3126 0.9490 0.9742
No log 3.8049 156 0.8552 0.3596 0.8552 0.9248
No log 3.8537 158 0.8344 0.4120 0.8344 0.9135
No log 3.9024 160 0.9146 0.5023 0.9146 0.9563
No log 3.9512 162 1.0622 0.4728 1.0622 1.0307
No log 4.0 164 0.9934 0.4494 0.9934 0.9967
No log 4.0488 166 0.8396 0.4137 0.8396 0.9163
No log 4.0976 168 0.8621 0.4581 0.8621 0.9285
No log 4.1463 170 0.8666 0.4581 0.8666 0.9309
No log 4.1951 172 0.9545 0.5090 0.9545 0.9770
No log 4.2439 174 0.9237 0.4976 0.9237 0.9611
No log 4.2927 176 0.8389 0.4661 0.8389 0.9159
No log 4.3415 178 0.8513 0.4159 0.8513 0.9227
No log 4.3902 180 0.8858 0.4706 0.8858 0.9412
No log 4.4390 182 0.8606 0.4294 0.8606 0.9277
No log 4.4878 184 0.8595 0.4023 0.8595 0.9271
No log 4.5366 186 0.8892 0.4234 0.8892 0.9430
No log 4.5854 188 0.9599 0.4155 0.9599 0.9798
No log 4.6341 190 0.9060 0.3913 0.9060 0.9518
No log 4.6829 192 0.9188 0.4637 0.9188 0.9585
No log 4.7317 194 1.0705 0.4414 1.0705 1.0346
No log 4.7805 196 0.9759 0.4840 0.9759 0.9879
No log 4.8293 198 0.8634 0.4160 0.8634 0.9292
No log 4.8780 200 0.9146 0.4672 0.9146 0.9563
No log 4.9268 202 0.8606 0.5024 0.8606 0.9277
No log 4.9756 204 0.8690 0.4637 0.8690 0.9322
No log 5.0244 206 0.8966 0.5136 0.8966 0.9469
No log 5.0732 208 0.8352 0.4352 0.8352 0.9139
No log 5.1220 210 0.9870 0.4469 0.9870 0.9935
No log 5.1707 212 1.0922 0.4205 1.0922 1.0451
No log 5.2195 214 1.0159 0.4497 1.0159 1.0079
No log 5.2683 216 0.8852 0.4159 0.8852 0.9409
No log 5.3171 218 0.8607 0.4023 0.8607 0.9277
No log 5.3659 220 0.8554 0.4352 0.8554 0.9249
No log 5.4146 222 0.8572 0.4084 0.8572 0.9258
No log 5.4634 224 0.8743 0.4007 0.8743 0.9350
No log 5.5122 226 0.9161 0.4435 0.9161 0.9571
No log 5.5610 228 0.8697 0.3875 0.8697 0.9326
No log 5.6098 230 0.8642 0.4181 0.8642 0.9296
No log 5.6585 232 0.9932 0.3696 0.9932 0.9966
No log 5.7073 234 1.0308 0.4134 1.0308 1.0153
No log 5.7561 236 0.9341 0.4565 0.9341 0.9665
No log 5.8049 238 0.8951 0.4123 0.8951 0.9461
No log 5.8537 240 0.8925 0.4123 0.8925 0.9447
No log 5.9024 242 1.0084 0.4131 1.0084 1.0042
No log 5.9512 244 1.2905 0.4312 1.2905 1.1360
No log 6.0 246 1.3063 0.3752 1.3063 1.1429
No log 6.0488 248 1.1109 0.2953 1.1109 1.0540
No log 6.0976 250 0.9552 0.2812 0.9552 0.9773
No log 6.1463 252 0.9091 0.3639 0.9091 0.9535
No log 6.1951 254 0.8999 0.3639 0.8999 0.9486
No log 6.2439 256 0.8958 0.3453 0.8958 0.9464
No log 6.2927 258 0.9354 0.2812 0.9354 0.9672
No log 6.3415 260 1.0285 0.3333 1.0285 1.0141
No log 6.3902 262 1.0428 0.3831 1.0428 1.0212
No log 6.4390 264 0.8901 0.4493 0.8901 0.9435
No log 6.4878 266 0.8227 0.4257 0.8227 0.9070
No log 6.5366 268 0.8228 0.4120 0.8228 0.9071
No log 6.5854 270 0.8424 0.4879 0.8424 0.9178
No log 6.6341 272 0.8592 0.4592 0.8592 0.9269
No log 6.6829 274 0.8658 0.4140 0.8658 0.9305
No log 6.7317 276 0.8919 0.3960 0.8919 0.9444
No log 6.7805 278 0.9253 0.4118 0.9253 0.9619
No log 6.8293 280 0.8677 0.4197 0.8677 0.9315
No log 6.8780 282 0.8505 0.4845 0.8505 0.9222
No log 6.9268 284 0.8506 0.4845 0.8506 0.9223
No log 6.9756 286 0.8226 0.4393 0.8226 0.9070
No log 7.0244 288 0.8436 0.4364 0.8436 0.9185
No log 7.0732 290 0.8135 0.4196 0.8135 0.9019
No log 7.1220 292 0.8241 0.4780 0.8241 0.9078
No log 7.1707 294 0.9055 0.5080 0.9055 0.9516
No log 7.2195 296 0.9790 0.4186 0.9790 0.9894
No log 7.2683 298 0.9312 0.3489 0.9312 0.9650
No log 7.3171 300 0.8727 0.3401 0.8727 0.9342
No log 7.3659 302 0.8433 0.4220 0.8433 0.9183
No log 7.4146 304 0.8348 0.4280 0.8348 0.9137
No log 7.4634 306 0.8908 0.5094 0.8908 0.9438
No log 7.5122 308 0.9329 0.5487 0.9329 0.9658
No log 7.5610 310 0.9621 0.4783 0.9621 0.9809
No log 7.6098 312 0.8619 0.4142 0.8619 0.9284
No log 7.6585 314 0.8304 0.4081 0.8304 0.9113
No log 7.7073 316 0.8248 0.4081 0.8248 0.9082
No log 7.7561 318 0.8449 0.4409 0.8449 0.9192
No log 7.8049 320 0.8987 0.5094 0.8987 0.9480
No log 7.8537 322 0.8791 0.4568 0.8791 0.9376
No log 7.9024 324 0.8885 0.4062 0.8885 0.9426
No log 7.9512 326 0.9200 0.4062 0.9200 0.9592
No log 8.0 328 0.9635 0.2998 0.9635 0.9816
No log 8.0488 330 1.0198 0.3281 1.0198 1.0099
No log 8.0976 332 1.0152 0.3097 1.0152 1.0076
No log 8.1463 334 0.9570 0.2083 0.9570 0.9783
No log 8.1951 336 0.9427 0.3100 0.9427 0.9709
No log 8.2439 338 0.9352 0.3383 0.9352 0.9670
No log 8.2927 340 0.9315 0.4884 0.9315 0.9651
No log 8.3415 342 0.9085 0.5070 0.9085 0.9532
No log 8.3902 344 0.9518 0.5578 0.9518 0.9756
No log 8.4390 346 0.9487 0.5600 0.9487 0.9740
No log 8.4878 348 0.8679 0.4704 0.8679 0.9316
No log 8.5366 350 0.8483 0.3885 0.8483 0.9210
No log 8.5854 352 0.9093 0.4465 0.9093 0.9536
No log 8.6341 354 0.9049 0.4465 0.9049 0.9512
No log 8.6829 356 0.8850 0.4197 0.8850 0.9407
No log 8.7317 358 0.8438 0.3493 0.8438 0.9186
No log 8.7805 360 0.8141 0.3747 0.8141 0.9023
No log 8.8293 362 0.8155 0.4686 0.8155 0.9030
No log 8.8780 364 0.8271 0.4741 0.8271 0.9094
No log 8.9268 366 0.8382 0.4741 0.8382 0.9155
No log 8.9756 368 0.7753 0.4575 0.7753 0.8805
No log 9.0244 370 0.7882 0.5611 0.7882 0.8878
No log 9.0732 372 0.9034 0.5577 0.9034 0.9505
No log 9.1220 374 0.9819 0.5389 0.9819 0.9909
No log 9.1707 376 0.8868 0.5536 0.8868 0.9417
No log 9.2195 378 0.7942 0.4220 0.7942 0.8912
No log 9.2683 380 0.8945 0.4469 0.8945 0.9458
No log 9.3171 382 0.9308 0.4723 0.9308 0.9648
No log 9.3659 384 0.8563 0.4334 0.8563 0.9254
No log 9.4146 386 0.8311 0.3938 0.8311 0.9117
No log 9.4634 388 0.8970 0.4265 0.8970 0.9471
No log 9.5122 390 1.0305 0.3838 1.0305 1.0152
No log 9.5610 392 1.0691 0.3838 1.0691 1.0340
No log 9.6098 394 1.0031 0.4186 1.0031 1.0015
No log 9.6585 396 0.8873 0.3812 0.8873 0.9420
No log 9.7073 398 0.8600 0.3415 0.8600 0.9273
No log 9.7561 400 0.8669 0.3290 0.8669 0.9311
No log 9.8049 402 0.8787 0.2678 0.8787 0.9374
No log 9.8537 404 0.9320 0.3229 0.9320 0.9654
No log 9.9024 406 0.9585 0.3229 0.9585 0.9790
No log 9.9512 408 0.9528 0.3229 0.9528 0.9761
No log 10.0 410 0.9712 0.2812 0.9712 0.9855
No log 10.0488 412 0.9761 0.3122 0.9761 0.9880
No log 10.0976 414 0.9098 0.4631 0.9098 0.9539
No log 10.1463 416 0.8232 0.4343 0.8232 0.9073
No log 10.1951 418 0.8057 0.4254 0.8057 0.8976
No log 10.2439 420 0.8166 0.4898 0.8166 0.9036
No log 10.2927 422 0.8501 0.4304 0.8501 0.9220
No log 10.3415 424 0.8686 0.4059 0.8686 0.9320
No log 10.3902 426 0.8635 0.3045 0.8635 0.9293
No log 10.4390 428 0.8863 0.3250 0.8863 0.9415
No log 10.4878 430 0.8976 0.3095 0.8976 0.9474
No log 10.5366 432 0.9068 0.3570 0.9068 0.9522
No log 10.5854 434 0.8813 0.3421 0.8813 0.9388
No log 10.6341 436 0.8704 0.3804 0.8704 0.9330
No log 10.6829 438 0.8935 0.4743 0.8935 0.9453
No log 10.7317 440 0.8430 0.4711 0.8430 0.9182
No log 10.7805 442 0.8382 0.4119 0.8382 0.9155
No log 10.8293 444 0.8918 0.4500 0.8918 0.9443
No log 10.8780 446 0.8763 0.4570 0.8763 0.9361
No log 10.9268 448 0.8690 0.3326 0.8690 0.9322
No log 10.9756 450 0.9866 0.3993 0.9866 0.9933
No log 11.0244 452 1.0629 0.2602 1.0629 1.0310
No log 11.0732 454 1.0389 0.2152 1.0389 1.0193
No log 11.1220 456 0.9945 0.2351 0.9945 0.9972
No log 11.1707 458 0.9608 0.3258 0.9608 0.9802
No log 11.2195 460 0.9584 0.2678 0.9584 0.9790
No log 11.2683 462 0.9332 0.2678 0.9332 0.9660
No log 11.3171 464 0.9201 0.2969 0.9201 0.9592
No log 11.3659 466 0.9171 0.3250 0.9171 0.9577
No log 11.4146 468 0.9152 0.3383 0.9152 0.9567
No log 11.4634 470 0.9153 0.3383 0.9153 0.9567
No log 11.5122 472 0.9258 0.2624 0.9258 0.9622
No log 11.5610 474 0.9467 0.2939 0.9467 0.9730
No log 11.6098 476 0.9306 0.2782 0.9306 0.9647
No log 11.6585 478 0.9077 0.2624 0.9077 0.9528
No log 11.7073 480 0.8877 0.2624 0.8877 0.9422
No log 11.7561 482 0.8747 0.3119 0.8747 0.9353
No log 11.8049 484 0.8721 0.4529 0.8721 0.9339
No log 11.8537 486 0.8615 0.4529 0.8615 0.9282
No log 11.9024 488 0.8439 0.4197 0.8439 0.9186
No log 11.9512 490 0.8206 0.4455 0.8206 0.9059
No log 12.0 492 0.8116 0.4039 0.8116 0.9009
No log 12.0488 494 0.8012 0.4039 0.8012 0.8951
No log 12.0976 496 0.8025 0.4142 0.8025 0.8958
No log 12.1463 498 0.8149 0.4142 0.8149 0.9027
0.3313 12.1951 500 0.8437 0.3861 0.8437 0.9185
0.3313 12.2439 502 0.8603 0.3854 0.8603 0.9275
0.3313 12.2927 504 0.8801 0.3100 0.8801 0.9381
0.3313 12.3415 506 0.9028 0.2941 0.9028 0.9502
0.3313 12.3902 508 0.8965 0.3100 0.8965 0.9468
0.3313 12.4390 510 0.8816 0.2993 0.8816 0.9389

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params (Safetensors, F32)

Model tree for MayBashendy/ArabicNewSplits7_FineTuningAraBERT_run1_AugV5_k13_task2_organization

Finetuned from aubmindlab/bert-base-arabertv02 (one of 4023 fine-tunes of that base model)