ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k11_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7220
  • Qwk: 0.2652
  • Mse: 0.7220
  • Rmse: 0.8497
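
For reference, the reported metrics relate as follows: Rmse is the square root of Mse, and Qwk is Cohen's kappa with quadratic weights, which rewards predicted scores for being close to the true scores rather than only exactly equal. A minimal pure-Python sketch (the score lists and the 4-class range are illustrative, not taken from the actual evaluation set):

```python
import math
from collections import Counter

def mse(y_true, y_pred):
    """Mean squared error between integer score lists (the `Mse` column)."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights (the `Qwk` column)."""
    n = len(y_true)
    # Observed agreement counts
    observed = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        observed[t][p] += 1
    # Compare against the agreement expected from the marginals alone,
    # penalising each disagreement by its squared distance.
    hist_t, hist_p = Counter(y_true), Counter(y_pred)
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2
            num += w * observed[i][j]
            den += w * hist_t[i] * hist_p[j] / n
    return 1.0 - num / den

# Illustrative scores only -- not from the actual evaluation set.
y_true, y_pred = [0, 1, 2, 3, 2, 1], [0, 2, 2, 3, 1, 1]
m = mse(y_true, y_pred)
print(round(m, 4), round(math.sqrt(m), 4))  # Rmse = sqrt(Mse)
print(round(quadratic_weighted_kappa(y_true, y_pred, 4), 4))
```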

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
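
The linear scheduler above can be sketched in plain Python; the zero warmup and the total-step horizon (100 epochs × 37 optimizer steps per epoch, inferred from the step/epoch columns of the results table) are assumptions, and training itself stopped well before that horizon:

```python
# Sketch of the `linear` lr_scheduler_type: the learning rate decays linearly
# from the initial 2e-05 down to 0 over the scheduler's total step horizon.
# warmup_steps=0 is an assumption (no warmup is listed above).
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    if step < warmup_steps:                      # linear ramp-up, if any
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)       # then linear decay to zero
    return base_lr * remaining / max(1, total_steps - warmup_steps)

total_steps = 37 * 100  # assumed: 37 steps per epoch times num_epochs
print(linear_lr(0, total_steps))            # full 2e-05 at the start
print(linear_lr(total_steps, total_steps))  # 0.0 at the end
```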

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0541 2 2.5226 0.0231 2.5226 1.5883
No log 0.1081 4 1.1974 0.0731 1.1974 1.0942
No log 0.1622 6 0.8975 0.0145 0.8975 0.9474
No log 0.2162 8 0.8985 -0.0389 0.8985 0.9479
No log 0.2703 10 0.8817 -0.0073 0.8817 0.9390
No log 0.3243 12 0.8906 0.0627 0.8906 0.9437
No log 0.3784 14 1.1304 -0.0941 1.1304 1.0632
No log 0.4324 16 1.1078 -0.1734 1.1078 1.0525
No log 0.4865 18 0.8249 -0.0076 0.8249 0.9083
No log 0.5405 20 0.7807 0.1456 0.7807 0.8836
No log 0.5946 22 0.7879 0.1139 0.7879 0.8877
No log 0.6486 24 0.7918 0.0327 0.7918 0.8898
No log 0.7027 26 0.8316 0.0937 0.8316 0.9119
No log 0.7568 28 0.8139 0.0937 0.8139 0.9022
No log 0.8108 30 0.7329 -0.0054 0.7329 0.8561
No log 0.8649 32 0.7534 0.0265 0.7534 0.8680
No log 0.9189 34 0.8342 0.0966 0.8342 0.9134
No log 0.9730 36 0.8957 0.0181 0.8957 0.9464
No log 1.0270 38 0.8916 0.0208 0.8916 0.9443
No log 1.0811 40 0.9397 0.0441 0.9397 0.9694
No log 1.1351 42 0.9667 0.0153 0.9667 0.9832
No log 1.1892 44 0.8211 0.0408 0.8211 0.9062
No log 1.2432 46 0.7506 0.2572 0.7506 0.8664
No log 1.2973 48 0.7449 0.2476 0.7449 0.8631
No log 1.3514 50 0.7457 0.0697 0.7457 0.8635
No log 1.4054 52 0.7995 0.1660 0.7995 0.8941
No log 1.4595 54 0.8251 0.1310 0.8251 0.9083
No log 1.5135 56 0.8357 0.0474 0.8357 0.9142
No log 1.5676 58 0.8584 0.0940 0.8584 0.9265
No log 1.6216 60 0.8272 -0.0483 0.8272 0.9095
No log 1.6757 62 0.8305 -0.0444 0.8305 0.9113
No log 1.7297 64 0.8507 0.0026 0.8507 0.9224
No log 1.7838 66 0.8772 -0.0753 0.8772 0.9366
No log 1.8378 68 0.8762 0.0119 0.8762 0.9361
No log 1.8919 70 0.9352 0.0277 0.9352 0.9671
No log 1.9459 72 0.9219 0.1231 0.9219 0.9602
No log 2.0 74 0.8978 0.1528 0.8978 0.9475
No log 2.0541 76 1.0022 0.0596 1.0022 1.0011
No log 2.1081 78 1.2025 0.0715 1.2025 1.0966
No log 2.1622 80 1.1107 0.0355 1.1107 1.0539
No log 2.2162 82 0.9167 0.0861 0.9167 0.9574
No log 2.2703 84 0.8028 0.1093 0.8028 0.8960
No log 2.3243 86 0.8315 0.2652 0.8315 0.9119
No log 2.3784 88 0.8294 0.2652 0.8294 0.9107
No log 2.4324 90 0.7988 0.1648 0.7988 0.8938
No log 2.4865 92 0.8086 0.2441 0.8086 0.8992
No log 2.5405 94 0.8924 0.0471 0.8924 0.9447
No log 2.5946 96 1.1004 0.1451 1.1004 1.0490
No log 2.6486 98 1.1250 0.1180 1.1250 1.0607
No log 2.7027 100 0.9650 0.1198 0.9650 0.9824
No log 2.7568 102 0.8835 0.0785 0.8835 0.9400
No log 2.8108 104 0.8840 0.2811 0.8840 0.9402
No log 2.8649 106 0.9195 0.2237 0.9195 0.9589
No log 2.9189 108 0.9804 0.1651 0.9804 0.9902
No log 2.9730 110 1.0363 0.0860 1.0363 1.0180
No log 3.0270 112 1.0244 0.0891 1.0244 1.0121
No log 3.0811 114 1.0329 0.1800 1.0329 1.0163
No log 3.1351 116 0.9766 0.2116 0.9766 0.9882
No log 3.1892 118 0.9467 0.2227 0.9467 0.9730
No log 3.2432 120 0.9641 0.2116 0.9641 0.9819
No log 3.2973 122 0.9824 0.1522 0.9824 0.9912
No log 3.3514 124 1.0134 0.1506 1.0134 1.0067
No log 3.4054 126 1.2039 -0.0195 1.2039 1.0972
No log 3.4595 128 1.2011 -0.1059 1.2011 1.0960
No log 3.5135 130 0.9872 0.2019 0.9872 0.9936
No log 3.5676 132 0.8641 0.1567 0.8641 0.9296
No log 3.6216 134 0.8053 0.2319 0.8053 0.8974
No log 3.6757 136 0.7978 0.2817 0.7978 0.8932
No log 3.7297 138 0.8452 0.2227 0.8452 0.9194
No log 3.7838 140 0.9148 0.0520 0.9148 0.9565
No log 3.8378 142 0.9742 0.1887 0.9742 0.9870
No log 3.8919 144 0.9592 0.1500 0.9592 0.9794
No log 3.9459 146 0.8959 0.1867 0.8959 0.9465
No log 4.0 148 0.8797 0.1918 0.8797 0.9379
No log 4.0541 150 0.8555 0.2383 0.8555 0.9250
No log 4.1081 152 0.7723 0.3060 0.7723 0.8788
No log 4.1622 154 0.7613 0.3002 0.7613 0.8725
No log 4.2162 156 0.8306 0.2429 0.8306 0.9114
No log 4.2703 158 0.9068 0.3025 0.9068 0.9523
No log 4.3243 160 0.9130 0.2680 0.9130 0.9555
No log 4.3784 162 0.8795 0.2105 0.8795 0.9378
No log 4.4324 164 0.8852 0.1835 0.8852 0.9409
No log 4.4865 166 0.9937 0.1178 0.9937 0.9968
No log 4.5405 168 1.0154 0.1534 1.0154 1.0077
No log 4.5946 170 0.9941 0.1403 0.9941 0.9970
No log 4.6486 172 0.9888 0.1091 0.9888 0.9944
No log 4.7027 174 0.9461 0.1091 0.9461 0.9727
No log 4.7568 176 0.8867 0.2058 0.8867 0.9416
No log 4.8108 178 0.8201 0.1988 0.8201 0.9056
No log 4.8649 180 0.8053 0.1760 0.8053 0.8974
No log 4.9189 182 0.8400 0.2385 0.8400 0.9165
No log 4.9730 184 0.7973 0.2379 0.7973 0.8929
No log 5.0270 186 0.7460 0.2684 0.7460 0.8637
No log 5.0811 188 0.7815 0.3891 0.7815 0.8840
No log 5.1351 190 0.7645 0.3891 0.7645 0.8744
No log 5.1892 192 0.7377 0.3175 0.7377 0.8589
No log 5.2432 194 0.8759 0.3059 0.8759 0.9359
No log 5.2973 196 0.8957 0.3059 0.8957 0.9464
No log 5.3514 198 0.7688 0.2958 0.7688 0.8768
No log 5.4054 200 0.7155 0.3042 0.7155 0.8459
No log 5.4595 202 0.7181 0.3618 0.7181 0.8474
No log 5.5135 204 0.7186 0.4017 0.7186 0.8477
No log 5.5676 206 0.7140 0.3574 0.7140 0.8450
No log 5.6216 208 0.7089 0.3667 0.7089 0.8420
No log 5.6757 210 0.7044 0.3818 0.7044 0.8393
No log 5.7297 212 0.7121 0.4291 0.7121 0.8439
No log 5.7838 214 0.6949 0.4397 0.6949 0.8336
No log 5.8378 216 0.6958 0.3861 0.6958 0.8342
No log 5.8919 218 0.7003 0.3887 0.7003 0.8368
No log 5.9459 220 0.7077 0.3096 0.7077 0.8412
No log 6.0 222 0.7256 0.2747 0.7256 0.8518
No log 6.0541 224 0.7578 0.3980 0.7578 0.8705
No log 6.1081 226 0.8330 0.4284 0.8330 0.9127
No log 6.1622 228 0.8084 0.3689 0.8084 0.8991
No log 6.2162 230 0.7802 0.3616 0.7802 0.8833
No log 6.2703 232 0.8120 0.2822 0.8120 0.9011
No log 6.3243 234 0.8280 0.3068 0.8280 0.9099
No log 6.3784 236 0.8218 0.2717 0.8218 0.9066
No log 6.4324 238 0.8402 0.2689 0.8402 0.9166
No log 6.4865 240 0.8548 0.2479 0.8548 0.9245
No log 6.5405 242 0.8874 0.1935 0.8874 0.9420
No log 6.5946 244 0.9009 0.2537 0.9009 0.9491
No log 6.6486 246 0.8964 0.2270 0.8964 0.9468
No log 6.7027 248 0.9121 0.2907 0.9121 0.9551
No log 6.7568 250 0.8907 0.2471 0.8907 0.9437
No log 6.8108 252 0.8534 0.3299 0.8534 0.9238
No log 6.8649 254 0.8092 0.2429 0.8092 0.8995
No log 6.9189 256 0.7883 0.2192 0.7883 0.8878
No log 6.9730 258 0.7661 0.2568 0.7661 0.8752
No log 7.0270 260 0.7300 0.2377 0.7300 0.8544
No log 7.0811 262 0.7368 0.2862 0.7368 0.8584
No log 7.1351 264 0.7325 0.2605 0.7325 0.8559
No log 7.1892 266 0.7340 0.2424 0.7340 0.8568
No log 7.2432 268 0.7554 0.2862 0.7554 0.8691
No log 7.2973 270 0.8612 0.3586 0.8612 0.9280
No log 7.3514 272 0.9175 0.3219 0.9175 0.9578
No log 7.4054 274 0.8859 0.3344 0.8859 0.9412
No log 7.4595 276 0.8885 0.3219 0.8885 0.9426
No log 7.5135 278 0.8804 0.3131 0.8804 0.9383
No log 7.5676 280 0.8501 0.3789 0.8501 0.9220
No log 7.6216 282 0.8868 0.3819 0.8868 0.9417
No log 7.6757 284 0.8524 0.3399 0.8524 0.9233
No log 7.7297 286 0.8293 0.2813 0.8293 0.9107
No log 7.7838 288 0.8280 0.3183 0.8280 0.9100
No log 7.8378 290 0.8140 0.3355 0.8140 0.9022
No log 7.8919 292 0.7534 0.3408 0.7534 0.8680
No log 7.9459 294 0.7328 0.2318 0.7328 0.8561
No log 8.0 296 0.7209 0.2318 0.7209 0.8491
No log 8.0541 298 0.7285 0.3841 0.7285 0.8535
No log 8.1081 300 0.8103 0.3344 0.8103 0.9002
No log 8.1622 302 0.7706 0.3996 0.7706 0.8778
No log 8.2162 304 0.6935 0.4124 0.6935 0.8328
No log 8.2703 306 0.7067 0.2193 0.7067 0.8406
No log 8.3243 308 0.7292 0.2491 0.7292 0.8539
No log 8.3784 310 0.6880 0.2038 0.6880 0.8295
No log 8.4324 312 0.6764 0.3267 0.6764 0.8224
No log 8.4865 314 0.8040 0.3770 0.8040 0.8966
No log 8.5405 316 0.8480 0.3665 0.8480 0.9209
No log 8.5946 318 0.8691 0.3601 0.8691 0.9322
No log 8.6486 320 0.7839 0.4014 0.7839 0.8854
No log 8.7027 322 0.6719 0.3267 0.6719 0.8197
No log 8.7568 324 0.6551 0.3551 0.6551 0.8094
No log 8.8108 326 0.6694 0.2877 0.6694 0.8182
No log 8.8649 328 0.6709 0.4067 0.6709 0.8191
No log 8.9189 330 0.6814 0.3498 0.6814 0.8255
No log 8.9730 332 0.6698 0.4013 0.6698 0.8184
No log 9.0270 334 0.6814 0.4013 0.6814 0.8255
No log 9.0811 336 0.6871 0.4013 0.6871 0.8289
No log 9.1351 338 0.7073 0.3990 0.7073 0.8410
No log 9.1892 340 0.7420 0.4282 0.7420 0.8614
No log 9.2432 342 0.7272 0.4393 0.7272 0.8528
No log 9.2973 344 0.6936 0.4924 0.6936 0.8328
No log 9.3514 346 0.6819 0.5150 0.6819 0.8258
No log 9.4054 348 0.6843 0.4919 0.6843 0.8272
No log 9.4595 350 0.6753 0.4919 0.6753 0.8218
No log 9.5135 352 0.6963 0.4808 0.6963 0.8345
No log 9.5676 354 0.7168 0.4854 0.7168 0.8466
No log 9.6216 356 0.6592 0.4486 0.6592 0.8119
No log 9.6757 358 0.6496 0.4795 0.6496 0.8060
No log 9.7297 360 0.7068 0.4522 0.7068 0.8407
No log 9.7838 362 0.7182 0.3888 0.7182 0.8475
No log 9.8378 364 0.6753 0.4374 0.6753 0.8218
No log 9.8919 366 0.6431 0.5071 0.6431 0.8019
No log 9.9459 368 0.6434 0.4847 0.6434 0.8021
No log 10.0 370 0.6802 0.4227 0.6802 0.8247
No log 10.0541 372 0.7313 0.4044 0.7313 0.8552
No log 10.1081 374 0.7312 0.4550 0.7312 0.8551
No log 10.1622 376 0.7548 0.4979 0.7548 0.8688
No log 10.2162 378 0.7474 0.4788 0.7474 0.8645
No log 10.2703 380 0.7387 0.4555 0.7387 0.8595
No log 10.3243 382 0.7057 0.4257 0.7057 0.8400
No log 10.3784 384 0.7278 0.4292 0.7278 0.8531
No log 10.4324 386 0.7668 0.3934 0.7668 0.8757
No log 10.4865 388 0.7636 0.3710 0.7636 0.8738
No log 10.5405 390 0.7419 0.3955 0.7419 0.8613
No log 10.5946 392 0.6930 0.4763 0.6930 0.8325
No log 10.6486 394 0.6732 0.4086 0.6732 0.8205
No log 10.7027 396 0.6862 0.4107 0.6862 0.8284
No log 10.7568 398 0.7001 0.4137 0.7001 0.8367
No log 10.8108 400 0.7686 0.4928 0.7686 0.8767
No log 10.8649 402 0.9358 0.3790 0.9358 0.9674
No log 10.9189 404 0.9238 0.3998 0.9238 0.9612
No log 10.9730 406 0.7889 0.4732 0.7889 0.8882
No log 11.0270 408 0.7195 0.4058 0.7195 0.8482
No log 11.0811 410 0.7280 0.3308 0.7280 0.8532
No log 11.1351 412 0.7183 0.3297 0.7183 0.8475
No log 11.1892 414 0.7680 0.4125 0.7680 0.8764
No log 11.2432 416 0.8948 0.2777 0.8948 0.9460
No log 11.2973 418 0.9731 0.2636 0.9731 0.9865
No log 11.3514 420 0.9194 0.2509 0.9194 0.9588
No log 11.4054 422 0.8021 0.3775 0.8021 0.8956
No log 11.4595 424 0.7756 0.3582 0.7756 0.8807
No log 11.5135 426 0.7679 0.3834 0.7679 0.8763
No log 11.5676 428 0.7777 0.3175 0.7777 0.8819
No log 11.6216 430 0.8822 0.3902 0.8822 0.9392
No log 11.6757 432 0.8582 0.3527 0.8582 0.9264
No log 11.7297 434 0.7612 0.4144 0.7612 0.8725
No log 11.7838 436 0.7206 0.3976 0.7206 0.8489
No log 11.8378 438 0.6941 0.2389 0.6941 0.8331
No log 11.8919 440 0.6869 0.3530 0.6869 0.8288
No log 11.9459 442 0.6951 0.3011 0.6951 0.8338
No log 12.0 444 0.7164 0.2877 0.7164 0.8464
No log 12.0541 446 0.7653 0.3060 0.7653 0.8748
No log 12.1081 448 0.9251 0.4186 0.9251 0.9618
No log 12.1622 450 1.1261 0.3059 1.1261 1.0612
No log 12.2162 452 1.0753 0.3059 1.0753 1.0370
No log 12.2703 454 0.9043 0.3123 0.9043 0.9509
No log 12.3243 456 0.8567 0.3355 0.8567 0.9256
No log 12.3784 458 0.8747 0.3206 0.8747 0.9353
No log 12.4324 460 0.9152 0.2636 0.9152 0.9567
No log 12.4865 462 0.9923 0.2622 0.9923 0.9961
No log 12.5405 464 1.0073 0.2885 1.0073 1.0037
No log 12.5946 466 0.9455 0.3146 0.9455 0.9724
No log 12.6486 468 0.8621 0.4222 0.8621 0.9285
No log 12.7027 470 0.7687 0.3604 0.7687 0.8768
No log 12.7568 472 0.7266 0.3633 0.7266 0.8524
No log 12.8108 474 0.6774 0.3863 0.6774 0.8230
No log 12.8649 476 0.6599 0.4146 0.6599 0.8124
No log 12.9189 478 0.6523 0.4470 0.6523 0.8077
No log 12.9730 480 0.6581 0.4360 0.6581 0.8112
No log 13.0270 482 0.6802 0.4290 0.6802 0.8247
No log 13.0811 484 0.7280 0.3273 0.7280 0.8532
No log 13.1351 486 0.7665 0.4171 0.7665 0.8755
No log 13.1892 488 0.7332 0.3510 0.7332 0.8563
No log 13.2432 490 0.7000 0.4180 0.7000 0.8367
No log 13.2973 492 0.7163 0.3681 0.7163 0.8464
No log 13.3514 494 0.7518 0.3505 0.7518 0.8671
No log 13.4054 496 0.8317 0.4085 0.8317 0.9120
No log 13.4595 498 0.8331 0.4085 0.8331 0.9127
0.3672 13.5135 500 0.7967 0.3754 0.7967 0.8926
0.3672 13.5676 502 0.7692 0.4036 0.7692 0.8771
0.3672 13.6216 504 0.7046 0.3340 0.7046 0.8394
0.3672 13.6757 506 0.6940 0.4036 0.6940 0.8331
0.3672 13.7297 508 0.7048 0.4036 0.7048 0.8395
0.3672 13.7838 510 0.6982 0.4006 0.6982 0.8356
0.3672 13.8378 512 0.7087 0.4248 0.7087 0.8419
0.3672 13.8919 514 0.7460 0.4186 0.7460 0.8637
0.3672 13.9459 516 0.7066 0.4301 0.7066 0.8406
0.3672 14.0 518 0.6741 0.3460 0.6741 0.8210
0.3672 14.0541 520 0.6653 0.3862 0.6653 0.8156
0.3672 14.1081 522 0.6760 0.3253 0.6760 0.8222
0.3672 14.1622 524 0.6941 0.3919 0.6941 0.8331
0.3672 14.2162 526 0.6797 0.3081 0.6797 0.8244
0.3672 14.2703 528 0.6735 0.2334 0.6735 0.8207
0.3672 14.3243 530 0.6719 0.2038 0.6719 0.8197
0.3672 14.3784 532 0.6883 0.3945 0.6883 0.8296
0.3672 14.4324 534 0.7500 0.3723 0.7500 0.8660
0.3672 14.4865 536 0.7709 0.3776 0.7709 0.8780
0.3672 14.5405 538 0.7232 0.3382 0.7232 0.8504
0.3672 14.5946 540 0.7043 0.2936 0.7043 0.8392
0.3672 14.6486 542 0.7037 0.3252 0.7037 0.8389
0.3672 14.7027 544 0.7148 0.2889 0.7148 0.8455
0.3672 14.7568 546 0.7220 0.2652 0.7220 0.8497

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model size

  • 0.1B params (Safetensors, F32)