ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run3_AugV5_k9_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set (a sketch of how these metrics can be computed follows the list):

  • Loss: 0.8239
  • Qwk: 0.3948
  • Mse: 0.8239
  • Rmse: 0.9077
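
For reference, Rmse is the square root of Mse (√0.8239 ≈ 0.9077) and Qwk is the quadratic weighted kappa between predicted and gold scores. Below is a minimal sketch of how such metrics can be computed with scikit-learn; this is an illustrative assumption, not the evaluation code actually used for this card:

    import numpy as np
    from sklearn.metrics import cohen_kappa_score, mean_squared_error

    def compute_metrics(y_true, y_pred):
        # MSE doubles as the loss when training with a mean-squared-error
        # objective, which is why Loss and Mse match in the results above.
        mse = mean_squared_error(y_true, y_pred)
        rmse = float(np.sqrt(mse))
        # QWK compares discrete labels, so continuous scores are rounded first.
        qwk = cohen_kappa_score(
            np.rint(y_true).astype(int),
            np.rint(y_pred).astype(int),
            weights="quadratic",
        )
        return {"qwk": qwk, "mse": mse, "rmse": rmse}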

Model description

More information needed

Intended uses & limitations

More information needed
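
No official usage guidance is given, but the checkpoint can be loaded like any Transformers sequence-classification model. The sketch below assumes a single-output regression head for essay organization scoring (suggested by the Mse/Rmse metrics and the task name in the model id); the head class and the sample input are illustrative assumptions:

    import torch
    from transformers import AutoTokenizer, AutoModelForSequenceClassification

    repo = ("MayBashendy/ArabicNewSplits7_B_usingALLEssays_"
            "FineTuningAraBERT_run3_AugV5_k9_task7_organization")
    tokenizer = AutoTokenizer.from_pretrained(repo)
    model = AutoModelForSequenceClassification.from_pretrained(repo)

    # Score a sample essay; with a single-output regression head the raw
    # logit is read directly as the predicted organization score.
    inputs = tokenizer("ضع نص المقال هنا", return_tensors="pt", truncation=True)
    with torch.no_grad():
        print(model(**inputs).logits.squeeze().item())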

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (an illustrative TrainingArguments sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
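
These settings map directly onto Hugging Face TrainingArguments. The sketch below mirrors them, with everything unlisted left at library defaults and output_dir a hypothetical placeholder:

    from transformers import TrainingArguments

    # Mirrors the hyperparameters listed above; all other arguments keep
    # their library defaults.
    training_args = TrainingArguments(
        output_dir="arabert_task7_organization",  # hypothetical path
        learning_rate=2e-5,
        per_device_train_batch_size=8,
        per_device_eval_batch_size=8,
        seed=42,
        adam_beta1=0.9,
        adam_beta2=0.999,
        adam_epsilon=1e-8,
        lr_scheduler_type="linear",
        num_train_epochs=100,
    )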

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0435 2 2.7407 -0.0407 2.7407 1.6555
No log 0.0870 4 1.3475 0.0991 1.3475 1.1608
No log 0.1304 6 1.0856 -0.1802 1.0856 1.0419
No log 0.1739 8 0.9523 -0.0223 0.9523 0.9759
No log 0.2174 10 1.0459 0.0256 1.0459 1.0227
No log 0.2609 12 1.0454 0.0863 1.0454 1.0224
No log 0.3043 14 1.0075 0.2076 1.0075 1.0037
No log 0.3478 16 0.9324 0.2283 0.9324 0.9656
No log 0.3913 18 0.8801 0.2484 0.8801 0.9381
No log 0.4348 20 0.9525 0.2936 0.9525 0.9759
No log 0.4783 22 0.8715 0.2942 0.8715 0.9335
No log 0.5217 24 0.8449 0.2229 0.8449 0.9192
No log 0.5652 26 0.7865 0.2310 0.7865 0.8869
No log 0.6087 28 0.7092 0.3280 0.7092 0.8421
No log 0.6522 30 0.7193 0.2907 0.7193 0.8481
No log 0.6957 32 0.7415 0.2053 0.7415 0.8611
No log 0.7391 34 0.7294 0.2451 0.7294 0.8541
No log 0.7826 36 0.7161 0.2642 0.7161 0.8462
No log 0.8261 38 0.7896 0.2226 0.7896 0.8886
No log 0.8696 40 0.7353 0.2264 0.7353 0.8575
No log 0.9130 42 0.7136 0.1962 0.7136 0.8447
No log 0.9565 44 0.7103 0.1962 0.7103 0.8428
No log 1.0 46 0.7140 0.2285 0.7140 0.8450
No log 1.0435 48 0.7144 0.2285 0.7144 0.8452
No log 1.0870 50 0.7120 0.2027 0.7120 0.8438
No log 1.1304 52 0.7267 0.1942 0.7267 0.8525
No log 1.1739 54 0.7165 0.2379 0.7165 0.8465
No log 1.2174 56 0.7187 0.2285 0.7187 0.8478
No log 1.2609 58 0.7152 0.2285 0.7152 0.8457
No log 1.3043 60 0.7103 0.2973 0.7103 0.8428
No log 1.3478 62 0.7708 0.2652 0.7708 0.8779
No log 1.3913 64 0.8710 0.2809 0.8710 0.9333
No log 1.4348 66 0.8234 0.4071 0.8234 0.9074
No log 1.4783 68 0.8310 0.3666 0.8310 0.9116
No log 1.5217 70 0.8478 0.3610 0.8478 0.9208
No log 1.5652 72 0.8239 0.3076 0.8239 0.9077
No log 1.6087 74 0.7182 0.3691 0.7182 0.8475
No log 1.6522 76 0.7146 0.3653 0.7146 0.8453
No log 1.6957 78 0.6987 0.4089 0.6987 0.8359
No log 1.7391 80 0.6095 0.4375 0.6095 0.7807
No log 1.7826 82 0.6807 0.3417 0.6807 0.8251
No log 1.8261 84 0.7558 0.2867 0.7558 0.8694
No log 1.8696 86 0.7603 0.3107 0.7603 0.8719
No log 1.9130 88 0.7008 0.3060 0.7008 0.8371
No log 1.9565 90 0.7027 0.3287 0.7027 0.8383
No log 2.0 92 0.7727 0.3157 0.7727 0.8791
No log 2.0435 94 0.7920 0.3157 0.7920 0.8900
No log 2.0870 96 0.7861 0.2475 0.7861 0.8866
No log 2.1304 98 0.9240 0.2996 0.9240 0.9613
No log 2.1739 100 1.0949 0.2487 1.0949 1.0464
No log 2.2174 102 0.9912 0.3490 0.9912 0.9956
No log 2.2609 104 0.8527 0.3335 0.8527 0.9234
No log 2.3043 106 0.7957 0.3841 0.7957 0.8920
No log 2.3478 108 0.7480 0.3966 0.7480 0.8649
No log 2.3913 110 0.7023 0.3738 0.7023 0.8380
No log 2.4348 112 0.7123 0.2895 0.7123 0.8440
No log 2.4783 114 0.8112 0.2624 0.8112 0.9007
No log 2.5217 116 0.8500 0.1819 0.8500 0.9219
No log 2.5652 118 0.7456 0.3175 0.7456 0.8635
No log 2.6087 120 0.7606 0.3769 0.7606 0.8721
No log 2.6522 122 0.8046 0.3157 0.8046 0.8970
No log 2.6957 124 0.7446 0.3549 0.7446 0.8629
No log 2.7391 126 0.7953 0.3457 0.7953 0.8918
No log 2.7826 128 0.9649 0.1988 0.9649 0.9823
No log 2.8261 130 0.9486 0.1737 0.9486 0.9740
No log 2.8696 132 0.8199 0.2439 0.8199 0.9055
No log 2.9130 134 0.7764 0.1870 0.7764 0.8811
No log 2.9565 136 0.7273 0.3398 0.7273 0.8528
No log 3.0 138 0.7436 0.3545 0.7436 0.8623
No log 3.0435 140 0.8444 0.2975 0.8444 0.9189
No log 3.0870 142 0.8251 0.3034 0.8251 0.9084
No log 3.1304 144 0.7526 0.3569 0.7526 0.8675
No log 3.1739 146 0.7560 0.3025 0.7560 0.8695
No log 3.2174 148 0.7663 0.3102 0.7663 0.8754
No log 3.2609 150 0.7430 0.3321 0.7430 0.8619
No log 3.3043 152 0.7172 0.3667 0.7172 0.8469
No log 3.3478 154 0.7043 0.3314 0.7043 0.8392
No log 3.3913 156 0.7148 0.3023 0.7148 0.8455
No log 3.4348 158 0.7304 0.3305 0.7304 0.8547
No log 3.4783 160 0.7206 0.2866 0.7206 0.8489
No log 3.5217 162 0.7438 0.3060 0.7438 0.8625
No log 3.5652 164 0.8225 0.3475 0.8225 0.9069
No log 3.6087 166 0.7829 0.2793 0.7829 0.8848
No log 3.6522 168 0.7367 0.3068 0.7367 0.8583
No log 3.6957 170 0.7956 0.3060 0.7956 0.8919
No log 3.7391 172 0.8700 0.2968 0.8700 0.9328
No log 3.7826 174 0.9154 0.3068 0.9154 0.9568
No log 3.8261 176 0.9542 0.3560 0.9542 0.9768
No log 3.8696 178 0.8895 0.3034 0.8895 0.9431
No log 3.9130 180 0.8356 0.1884 0.8356 0.9141
No log 3.9565 182 0.8344 0.0637 0.8344 0.9135
No log 4.0 184 0.8570 0.1373 0.8570 0.9257
No log 4.0435 186 0.8203 0.3209 0.8203 0.9057
No log 4.0870 188 0.8457 0.4085 0.8457 0.9196
No log 4.1304 190 0.8137 0.3433 0.8137 0.9021
No log 4.1739 192 0.7072 0.3966 0.7072 0.8410
No log 4.2174 194 0.7213 0.4343 0.7213 0.8493
No log 4.2609 196 0.8297 0.2652 0.8297 0.9109
No log 4.3043 198 0.8552 0.2052 0.8552 0.9248
No log 4.3478 200 0.7973 0.3093 0.7973 0.8929
No log 4.3913 202 0.7423 0.3796 0.7423 0.8615
No log 4.4348 204 0.7548 0.4662 0.7548 0.8688
No log 4.4783 206 0.7322 0.3175 0.7322 0.8557
No log 4.5217 208 0.7204 0.2652 0.7204 0.8488
No log 4.5652 210 0.7116 0.4336 0.7116 0.8436
No log 4.6087 212 0.7718 0.4354 0.7718 0.8785
No log 4.6522 214 0.8419 0.3781 0.8419 0.9175
No log 4.6957 216 0.7864 0.3754 0.7864 0.8868
No log 4.7391 218 0.7643 0.4428 0.7643 0.8742
No log 4.7826 220 0.8515 0.3864 0.8515 0.9228
No log 4.8261 222 0.9169 0.3724 0.9169 0.9575
No log 4.8696 224 1.0329 0.3897 1.0329 1.0163
No log 4.9130 226 0.9060 0.3568 0.9060 0.9518
No log 4.9565 228 0.8573 0.4706 0.8573 0.9259
No log 5.0 230 0.8074 0.4992 0.8074 0.8986
No log 5.0435 232 0.7532 0.4794 0.7532 0.8679
No log 5.0870 234 0.7140 0.4715 0.7140 0.8450
No log 5.1304 236 0.8338 0.4234 0.8338 0.9131
No log 5.1739 238 0.9524 0.3721 0.9524 0.9759
No log 5.2174 240 0.9502 0.3590 0.9502 0.9748
No log 5.2609 242 0.7911 0.3847 0.7911 0.8895
No log 5.3043 244 0.6685 0.3763 0.6685 0.8176
No log 5.3478 246 0.6584 0.3788 0.6584 0.8114
No log 5.3913 248 0.7474 0.3409 0.7474 0.8645
No log 5.4348 250 1.0010 0.2919 1.0010 1.0005
No log 5.4783 252 1.1662 0.2477 1.1662 1.0799
No log 5.5217 254 1.0376 0.3318 1.0376 1.0186
No log 5.5652 256 0.7790 0.4186 0.7790 0.8826
No log 5.6087 258 0.7387 0.4903 0.7387 0.8595
No log 5.6522 260 0.7524 0.5359 0.7524 0.8674
No log 5.6957 262 0.7245 0.4751 0.7245 0.8512
No log 5.7391 264 0.8758 0.3906 0.8758 0.9358
No log 5.7826 266 0.9584 0.3481 0.9584 0.9790
No log 5.8261 268 0.8219 0.4056 0.8219 0.9066
No log 5.8696 270 0.7014 0.4438 0.7014 0.8375
No log 5.9130 272 0.7521 0.4024 0.7521 0.8673
No log 5.9565 274 0.7002 0.4308 0.7002 0.8368
No log 6.0 276 0.7249 0.4609 0.7249 0.8514
No log 6.0435 278 0.8986 0.3941 0.8986 0.9480
No log 6.0870 280 0.9473 0.3885 0.9473 0.9733
No log 6.1304 282 0.8253 0.4088 0.8253 0.9084
No log 6.1739 284 0.6933 0.4504 0.6933 0.8327
No log 6.2174 286 0.6532 0.4681 0.6532 0.8082
No log 6.2609 288 0.6659 0.4659 0.6659 0.8161
No log 6.3043 290 0.6778 0.4523 0.6778 0.8233
No log 6.3478 292 0.8055 0.4417 0.8055 0.8975
No log 6.3913 294 0.8801 0.4197 0.8801 0.9381
No log 6.4348 296 0.8134 0.4212 0.8134 0.9019
No log 6.4783 298 0.7254 0.4940 0.7254 0.8517
No log 6.5217 300 0.6881 0.4889 0.6881 0.8295
No log 6.5652 302 0.6970 0.3958 0.6970 0.8349
No log 6.6087 304 0.7278 0.4081 0.7278 0.8531
No log 6.6522 306 0.7557 0.3794 0.7557 0.8693
No log 6.6957 308 0.7244 0.4464 0.7244 0.8511
No log 6.7391 310 0.6560 0.4582 0.6560 0.8099
No log 6.7826 312 0.6518 0.5201 0.6518 0.8073
No log 6.8261 314 0.6487 0.5306 0.6487 0.8054
No log 6.8696 316 0.6568 0.5389 0.6568 0.8104
No log 6.9130 318 0.6693 0.5167 0.6693 0.8181
No log 6.9565 320 0.6919 0.5218 0.6919 0.8318
No log 7.0 322 0.7400 0.4154 0.7400 0.8602
No log 7.0435 324 0.7628 0.4154 0.7628 0.8734
No log 7.0870 326 0.7753 0.4608 0.7753 0.8805
No log 7.1304 328 0.7903 0.3780 0.7903 0.8890
No log 7.1739 330 0.8099 0.3780 0.8099 0.8999
No log 7.2174 332 0.8413 0.3453 0.8413 0.9172
No log 7.2609 334 0.8649 0.3453 0.8649 0.9300
No log 7.3043 336 0.8266 0.3922 0.8266 0.9092
No log 7.3478 338 0.8906 0.2408 0.8906 0.9437
No log 7.3913 340 0.9969 0.2359 0.9969 0.9985
No log 7.4348 342 0.9330 0.2354 0.9330 0.9659
No log 7.4783 344 0.8457 0.3880 0.8457 0.9196
No log 7.5217 346 1.0487 0.3195 1.0487 1.0240
No log 7.5652 348 1.1860 0.2670 1.1860 1.0890
No log 7.6087 350 1.0904 0.2861 1.0904 1.0442
No log 7.6522 352 0.9784 0.3418 0.9784 0.9892
No log 7.6957 354 0.8266 0.2849 0.8266 0.9092
No log 7.7391 356 0.8233 0.3003 0.8233 0.9074
No log 7.7826 358 0.8218 0.1642 0.8218 0.9065
No log 7.8261 360 0.7608 0.3112 0.7608 0.8722
No log 7.8696 362 0.7604 0.4582 0.7604 0.8720
No log 7.9130 364 0.8564 0.3089 0.8564 0.9254
No log 7.9565 366 0.8990 0.3216 0.8990 0.9482
No log 8.0 368 0.8361 0.3141 0.8361 0.9144
No log 8.0435 370 0.7650 0.4460 0.7650 0.8746
No log 8.0870 372 0.7251 0.4864 0.7251 0.8515
No log 8.1304 374 0.7145 0.4276 0.7145 0.8453
No log 8.1739 376 0.6920 0.4600 0.6920 0.8319
No log 8.2174 378 0.6777 0.4155 0.6777 0.8232
No log 8.2609 380 0.6714 0.4135 0.6714 0.8194
No log 8.3043 382 0.6619 0.4473 0.6619 0.8136
No log 8.3478 384 0.6505 0.4473 0.6505 0.8065
No log 8.3913 386 0.6883 0.4684 0.6883 0.8296
No log 8.4348 388 0.7195 0.3914 0.7195 0.8482
No log 8.4783 390 0.7008 0.4315 0.7008 0.8372
No log 8.5217 392 0.7115 0.5032 0.7115 0.8435
No log 8.5652 394 0.7587 0.4539 0.7587 0.8710
No log 8.6087 396 0.7985 0.3987 0.7985 0.8936
No log 8.6522 398 0.7652 0.4539 0.7652 0.8747
No log 8.6957 400 0.7685 0.4587 0.7685 0.8767
No log 8.7391 402 0.8170 0.3432 0.8170 0.9039
No log 8.7826 404 0.9169 0.3580 0.9169 0.9576
No log 8.8261 406 0.8850 0.3300 0.8850 0.9407
No log 8.8696 408 0.8265 0.3565 0.8265 0.9091
No log 8.9130 410 0.7169 0.4207 0.7169 0.8467
No log 8.9565 412 0.6830 0.4514 0.6830 0.8264
No log 9.0 414 0.6825 0.4514 0.6825 0.8261
No log 9.0435 416 0.7025 0.4504 0.7025 0.8382
No log 9.0870 418 0.7810 0.3379 0.7810 0.8837
No log 9.1304 420 0.9009 0.3613 0.9009 0.9492
No log 9.1739 422 0.8806 0.3596 0.8806 0.9384
No log 9.2174 424 0.7758 0.4031 0.7758 0.8808
No log 9.2609 426 0.7164 0.4856 0.7164 0.8464
No log 9.3043 428 0.7420 0.4717 0.7420 0.8614
No log 9.3478 430 0.8114 0.3987 0.8114 0.9008
No log 9.3913 432 1.0107 0.3560 1.0107 1.0053
No log 9.4348 434 1.1567 0.3135 1.1567 1.0755
No log 9.4783 436 1.1447 0.3277 1.1447 1.0699
No log 9.5217 438 1.0047 0.3268 1.0047 1.0024
No log 9.5652 440 0.8456 0.4862 0.8456 0.9196
No log 9.6087 442 0.8154 0.4507 0.8154 0.9030
No log 9.6522 444 0.8029 0.4887 0.8029 0.8960
No log 9.6957 446 0.8020 0.4540 0.8020 0.8956
No log 9.7391 448 0.7770 0.4678 0.7770 0.8815
No log 9.7826 450 0.7508 0.4462 0.7508 0.8665
No log 9.8261 452 0.7460 0.4986 0.7460 0.8637
No log 9.8696 454 0.7612 0.4951 0.7612 0.8725
No log 9.9130 456 0.7991 0.4587 0.7991 0.8939
No log 9.9565 458 0.8140 0.4275 0.8140 0.9022
No log 10.0 460 0.7927 0.5044 0.7927 0.8903
No log 10.0435 462 0.7915 0.5044 0.7915 0.8896
No log 10.0870 464 0.8065 0.4714 0.8065 0.8981
No log 10.1304 466 0.8741 0.3882 0.8741 0.9349
No log 10.1739 468 0.9072 0.4536 0.9072 0.9525
No log 10.2174 470 0.9074 0.4217 0.9074 0.9526
No log 10.2609 472 0.8200 0.4717 0.8200 0.9055
No log 10.3043 474 0.7755 0.4154 0.7755 0.8806
No log 10.3478 476 0.7793 0.4154 0.7793 0.8828
No log 10.3913 478 0.8323 0.4648 0.8323 0.9123
No log 10.4348 480 0.8873 0.4045 0.8873 0.9419
No log 10.4783 482 0.9092 0.3826 0.9092 0.9535
No log 10.5217 484 0.8507 0.4045 0.8507 0.9223
No log 10.5652 486 0.7959 0.4539 0.7959 0.8921
No log 10.6087 488 0.7474 0.4836 0.7474 0.8645
No log 10.6522 490 0.7531 0.4023 0.7531 0.8678
No log 10.6957 492 0.7894 0.4334 0.7894 0.8885
No log 10.7391 494 0.8291 0.4012 0.8291 0.9106
No log 10.7826 496 0.8307 0.4025 0.8307 0.9115
No log 10.8261 498 0.8303 0.4612 0.8303 0.9112
0.3551 10.8696 500 0.8357 0.4563 0.8357 0.9142
0.3551 10.9130 502 0.8368 0.4408 0.8368 0.9147
0.3551 10.9565 504 0.8232 0.4595 0.8232 0.9073
0.3551 11.0 506 0.7910 0.4327 0.7910 0.8894
0.3551 11.0435 508 0.7729 0.4426 0.7729 0.8791
0.3551 11.0870 510 0.7889 0.4183 0.7889 0.8882
0.3551 11.1304 512 0.8211 0.4124 0.8211 0.9061
0.3551 11.1739 514 0.8239 0.3948 0.8239 0.9077

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
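
To check that a local environment matches these pinned versions, a quick sketch:

    import datasets, tokenizers, torch, transformers

    # Expected: 4.44.2 / 2.4.0+cu118 / 2.21.0 / 0.19.1
    for mod in (transformers, torch, datasets, tokenizers):
        print(mod.__name__, mod.__version__)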