ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run2_AugV5_k15_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8801
  • Qwk (quadratic weighted kappa): 0.3318
  • Mse: 0.8801 (identical to the loss; the training objective is mean squared error)
  • Rmse: 0.9381
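Qwk (quadratic weighted kappa) measures agreement between predicted and reference scores, penalizing disagreements by the square of their distance on the rating scale. As a reference for how the metric behaves, here is a minimal pure-Python sketch (assuming integer ratings; for a regression model like this one, predictions would first be rounded to the nearest valid score):

```python
def quadratic_weighted_kappa(rater_a, rater_b):
    """Quadratic weighted kappa between two equal-length lists of integer ratings."""
    assert len(rater_a) == len(rater_b) and rater_a
    min_r = min(min(rater_a), min(rater_b))
    max_r = max(max(rater_a), max(rater_b))
    n = max_r - min_r + 1
    num_items = len(rater_a)

    # Observed confusion matrix and marginal histograms.
    observed = [[0] * n for _ in range(n)]
    hist_a = [0] * n
    hist_b = [0] * n
    for a, b in zip(rater_a, rater_b):
        observed[a - min_r][b - min_r] += 1
        hist_a[a - min_r] += 1
        hist_b[b - min_r] += 1

    # Weighted observed vs. expected (chance) disagreement.
    numerator = 0.0
    denominator = 0.0
    for i in range(n):
        for j in range(n):
            weight = ((i - j) ** 2) / ((n - 1) ** 2) if n > 1 else 0.0
            expected = hist_a[i] * hist_b[j] / num_items
            numerator += weight * observed[i][j]
            denominator += weight * expected
    return (1.0 - numerator / denominator) if denominator else 1.0
```

scikit-learn's `cohen_kappa_score(y_true, y_pred, weights="quadratic")` computes the same quantity.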

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
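With lr_scheduler_type: linear, the learning rate ramps up over any warmup steps and then decays linearly to zero across the total number of training steps. A minimal sketch of that schedule (pure Python; zero warmup is an assumption, since the card reports no warmup setting):

```python
def linear_lr(step: int, total_steps: int, base_lr: float = 2e-5,
              warmup_steps: int = 0) -> float:
    """Linear warmup followed by linear decay to zero."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0.0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

# The log below shows 54 optimizer steps per epoch, so 100 epochs
# correspond to 5400 total steps (the log itself ends near step 532).
total_steps = 54 * 100
```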

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0370 2 4.4116 -0.0163 4.4116 2.1004
No log 0.0741 4 2.3967 0.0228 2.3967 1.5481
No log 0.1111 6 1.8517 0.0062 1.8517 1.3608
No log 0.1481 8 1.5165 0.0310 1.5165 1.2315
No log 0.1852 10 1.4309 0.0018 1.4309 1.1962
No log 0.2222 12 1.4751 0.0393 1.4751 1.2145
No log 0.2593 14 1.7926 0.1117 1.7926 1.3389
No log 0.2963 16 1.7444 0.1357 1.7444 1.3208
No log 0.3333 18 1.6821 0.1028 1.6821 1.2969
No log 0.3704 20 1.7036 0.0827 1.7036 1.3052
No log 0.4074 22 1.3904 0.1314 1.3904 1.1791
No log 0.4444 24 1.1180 0.1943 1.1180 1.0574
No log 0.4815 26 1.0902 0.2386 1.0902 1.0441
No log 0.5185 28 1.2716 0.1427 1.2716 1.1277
No log 0.5556 30 1.7332 0.2356 1.7332 1.3165
No log 0.5926 32 1.7061 0.2091 1.7061 1.3062
No log 0.6296 34 1.5165 0.1833 1.5165 1.2315
No log 0.6667 36 1.4176 0.1418 1.4176 1.1906
No log 0.7037 38 1.1902 0.2071 1.1902 1.0910
No log 0.7407 40 1.1537 0.2256 1.1537 1.0741
No log 0.7778 42 1.1751 0.1974 1.1751 1.0840
No log 0.8148 44 1.2471 0.1698 1.2471 1.1167
No log 0.8519 46 1.3732 0.0662 1.3732 1.1718
No log 0.8889 48 1.3735 0.1404 1.3735 1.1720
No log 0.9259 50 1.2745 0.1530 1.2745 1.1290
No log 0.9630 52 1.2970 0.1080 1.2970 1.1389
No log 1.0 54 1.4006 0.0700 1.4006 1.1835
No log 1.0370 56 1.5617 0.1819 1.5617 1.2497
No log 1.0741 58 1.6476 0.1735 1.6476 1.2836
No log 1.1111 60 1.4605 0.1395 1.4605 1.2085
No log 1.1481 62 1.1983 0.1827 1.1983 1.0947
No log 1.1852 64 1.0113 0.2640 1.0113 1.0056
No log 1.2222 66 0.9153 0.3737 0.9153 0.9567
No log 1.2593 68 1.0325 0.2984 1.0325 1.0161
No log 1.2963 70 1.2660 0.1587 1.2660 1.1252
No log 1.3333 72 1.2554 0.1438 1.2554 1.1204
No log 1.3704 74 1.3774 0.1165 1.3774 1.1736
No log 1.4074 76 1.3636 0.1165 1.3636 1.1677
No log 1.4444 78 1.2731 0.1438 1.2731 1.1283
No log 1.4815 80 1.0453 0.3441 1.0453 1.0224
No log 1.5185 82 0.9810 0.3498 0.9810 0.9904
No log 1.5556 84 0.9911 0.3365 0.9911 0.9956
No log 1.5926 86 0.9655 0.3394 0.9655 0.9826
No log 1.6296 88 1.0332 0.2285 1.0332 1.0165
No log 1.6667 90 1.1211 0.2439 1.1211 1.0588
No log 1.7037 92 1.1154 0.3291 1.1154 1.0561
No log 1.7407 94 0.9376 0.3996 0.9376 0.9683
No log 1.7778 96 0.8659 0.3250 0.8659 0.9305
No log 1.8148 98 1.1002 0.3418 1.1002 1.0489
No log 1.8519 100 1.1825 0.2692 1.1825 1.0874
No log 1.8889 102 0.9246 0.3674 0.9246 0.9616
No log 1.9259 104 1.0454 0.3872 1.0454 1.0225
No log 1.9630 106 1.3590 0.3121 1.3590 1.1658
No log 2.0 108 1.4120 0.3421 1.4120 1.1883
No log 2.0370 110 1.1818 0.4512 1.1818 1.0871
No log 2.0741 112 0.8963 0.4705 0.8963 0.9467
No log 2.1111 114 0.9213 0.4527 0.9213 0.9599
No log 2.1481 116 0.9421 0.4200 0.9421 0.9706
No log 2.1852 118 0.8708 0.5203 0.8708 0.9332
No log 2.2222 120 1.0479 0.3330 1.0479 1.0237
No log 2.2593 122 1.3774 0.2333 1.3774 1.1736
No log 2.2963 124 1.4298 0.2077 1.4298 1.1957
No log 2.3333 126 1.2336 0.2589 1.2336 1.1107
No log 2.3704 128 0.9928 0.3033 0.9928 0.9964
No log 2.4074 130 0.9573 0.3365 0.9573 0.9784
No log 2.4444 132 0.9442 0.3846 0.9442 0.9717
No log 2.4815 134 0.9724 0.4063 0.9724 0.9861
No log 2.5185 136 1.1293 0.3670 1.1293 1.0627
No log 2.5556 138 1.3516 0.4550 1.3516 1.1626
No log 2.5926 140 1.2267 0.4227 1.2267 1.1076
No log 2.6296 142 0.9333 0.4885 0.9333 0.9661
No log 2.6667 144 0.8315 0.5226 0.8315 0.9119
No log 2.7037 146 0.8052 0.5358 0.8052 0.8973
No log 2.7407 148 0.8055 0.5155 0.8055 0.8975
No log 2.7778 150 0.8274 0.5132 0.8274 0.9096
No log 2.8148 152 0.8246 0.4667 0.8246 0.9081
No log 2.8519 154 0.8314 0.4834 0.8314 0.9118
No log 2.8889 156 0.8880 0.4581 0.8880 0.9423
No log 2.9259 158 1.0736 0.4820 1.0736 1.0361
No log 2.9630 160 1.0484 0.4878 1.0484 1.0239
No log 3.0 162 0.8903 0.4489 0.8903 0.9435
No log 3.0370 164 0.8479 0.3925 0.8479 0.9208
No log 3.0741 166 0.9257 0.4376 0.9257 0.9621
No log 3.1111 168 0.8884 0.4522 0.8884 0.9426
No log 3.1481 170 0.8347 0.4847 0.8347 0.9136
No log 3.1852 172 0.8500 0.4548 0.8500 0.9219
No log 3.2222 174 0.8946 0.4796 0.8946 0.9458
No log 3.2593 176 0.9006 0.4920 0.9006 0.9490
No log 3.2963 178 1.0813 0.4365 1.0813 1.0399
No log 3.3333 180 1.1407 0.4365 1.1407 1.0680
No log 3.3704 182 0.9466 0.5073 0.9466 0.9729
No log 3.4074 184 0.9143 0.5287 0.9143 0.9562
No log 3.4444 186 1.0114 0.4487 1.0114 1.0057
No log 3.4815 188 0.9107 0.5618 0.9107 0.9543
No log 3.5185 190 0.8870 0.4123 0.8870 0.9418
No log 3.5556 192 0.9217 0.3535 0.9217 0.9600
No log 3.5926 194 0.9213 0.3819 0.9213 0.9599
No log 3.6296 196 0.9500 0.4042 0.9500 0.9747
No log 3.6667 198 0.9481 0.4465 0.9481 0.9737
No log 3.7037 200 0.9343 0.4241 0.9343 0.9666
No log 3.7407 202 1.0900 0.4098 1.0900 1.0440
No log 3.7778 204 1.1455 0.3763 1.1455 1.0703
No log 3.8148 206 0.9961 0.4420 0.9961 0.9981
No log 3.8519 208 0.9336 0.4061 0.9336 0.9662
No log 3.8889 210 0.9840 0.3852 0.9840 0.9920
No log 3.9259 212 1.0590 0.3887 1.0590 1.0291
No log 3.9630 214 0.9591 0.4212 0.9591 0.9794
No log 4.0 216 0.9347 0.3486 0.9347 0.9668
No log 4.0370 218 0.9285 0.3637 0.9285 0.9636
No log 4.0741 220 0.9077 0.2892 0.9077 0.9527
No log 4.1111 222 0.9008 0.3948 0.9008 0.9491
No log 4.1481 224 0.8985 0.3224 0.8985 0.9479
No log 4.1852 226 0.8988 0.3224 0.8988 0.9481
No log 4.2222 228 0.8985 0.4316 0.8985 0.9479
No log 4.2593 230 0.9449 0.4622 0.9449 0.9721
No log 4.2963 232 1.0202 0.4490 1.0202 1.0100
No log 4.3333 234 0.9349 0.4811 0.9349 0.9669
No log 4.3704 236 0.8634 0.3943 0.8634 0.9292
No log 4.4074 238 0.8612 0.4352 0.8612 0.9280
No log 4.4444 240 0.8651 0.4352 0.8651 0.9301
No log 4.4815 242 0.8807 0.4280 0.8807 0.9385
No log 4.5185 244 0.9003 0.4242 0.9003 0.9489
No log 4.5556 246 0.8937 0.3483 0.8937 0.9454
No log 4.5926 248 1.0235 0.3714 1.0235 1.0117
No log 4.6296 250 0.9931 0.3078 0.9931 0.9966
No log 4.6667 252 0.8967 0.4555 0.8967 0.9470
No log 4.7037 254 1.0080 0.4000 1.0080 1.0040
No log 4.7407 256 1.0498 0.3971 1.0498 1.0246
No log 4.7778 258 0.9291 0.4009 0.9291 0.9639
No log 4.8148 260 0.9935 0.3813 0.9935 0.9967
No log 4.8519 262 1.0471 0.3985 1.0471 1.0233
No log 4.8889 264 0.9523 0.3483 0.9523 0.9758
No log 4.9259 266 1.0502 0.2972 1.0502 1.0248
No log 4.9630 268 1.2609 0.2119 1.2609 1.1229
No log 5.0 270 1.2018 0.2083 1.2018 1.0963
No log 5.0370 272 1.0268 0.3243 1.0268 1.0133
No log 5.0741 274 1.0067 0.3382 1.0067 1.0034
No log 5.1111 276 1.0912 0.3100 1.0912 1.0446
No log 5.1481 278 1.0712 0.3106 1.0712 1.0350
No log 5.1852 280 0.9651 0.3866 0.9651 0.9824
No log 5.2222 282 0.9594 0.2871 0.9594 0.9795
No log 5.2593 284 0.9362 0.3552 0.9362 0.9676
No log 5.2963 286 0.9127 0.3920 0.9127 0.9554
No log 5.3333 288 0.9878 0.3759 0.9878 0.9939
No log 5.3704 290 1.0303 0.4225 1.0303 1.0151
No log 5.4074 292 1.0224 0.4350 1.0224 1.0112
No log 5.4444 294 0.9876 0.4373 0.9876 0.9938
No log 5.4815 296 0.9106 0.4510 0.9106 0.9543
No log 5.5185 298 0.8971 0.4582 0.8971 0.9472
No log 5.5556 300 0.9109 0.4138 0.9109 0.9544
No log 5.5926 302 0.9301 0.4137 0.9301 0.9644
No log 5.6296 304 0.9411 0.3382 0.9411 0.9701
No log 5.6667 306 0.9420 0.3290 0.9420 0.9706
No log 5.7037 308 0.9546 0.3237 0.9546 0.9770
No log 5.7407 310 0.9925 0.3937 0.9925 0.9962
No log 5.7778 312 0.9622 0.3283 0.9622 0.9809
No log 5.8148 314 0.9461 0.3142 0.9461 0.9727
No log 5.8519 316 0.9287 0.3552 0.9287 0.9637
No log 5.8889 318 0.9124 0.3891 0.9124 0.9552
No log 5.9259 320 0.9019 0.3935 0.9019 0.9497
No log 5.9630 322 0.8938 0.4066 0.8938 0.9454
No log 6.0 324 0.9001 0.3925 0.9001 0.9487
No log 6.0370 326 0.8970 0.3190 0.8970 0.9471
No log 6.0741 328 0.9225 0.3854 0.9225 0.9605
No log 6.1111 330 0.9406 0.2938 0.9406 0.9698
No log 6.1481 332 0.9934 0.3977 0.9934 0.9967
No log 6.1852 334 0.9717 0.3891 0.9717 0.9858
No log 6.2222 336 0.9095 0.3337 0.9095 0.9537
No log 6.2593 338 0.9277 0.4568 0.9277 0.9632
No log 6.2963 340 0.9305 0.4568 0.9305 0.9646
No log 6.3333 342 0.8906 0.4143 0.8906 0.9437
No log 6.3704 344 0.8778 0.4364 0.8778 0.9369
No log 6.4074 346 0.8712 0.4235 0.8712 0.9334
No log 6.4444 348 0.8683 0.4218 0.8683 0.9318
No log 6.4815 350 1.0120 0.3543 1.0120 1.0060
No log 6.5185 352 1.1214 0.3117 1.1214 1.0590
No log 6.5556 354 1.0596 0.2506 1.0596 1.0294
No log 6.5926 356 0.9390 0.3508 0.9390 0.9690
No log 6.6296 358 0.9173 0.4054 0.9173 0.9578
No log 6.6667 360 0.9728 0.3925 0.9728 0.9863
No log 6.7037 362 0.9968 0.4372 0.9968 0.9984
No log 6.7407 364 0.9150 0.4305 0.9150 0.9565
No log 6.7778 366 0.8920 0.5137 0.8920 0.9445
No log 6.8148 368 0.9769 0.3892 0.9769 0.9884
No log 6.8519 370 0.9595 0.4328 0.9595 0.9795
No log 6.8889 372 0.8645 0.5458 0.8645 0.9298
No log 6.9259 374 0.8699 0.4823 0.8699 0.9327
No log 6.9630 376 0.8878 0.4946 0.8878 0.9423
No log 7.0 378 0.8748 0.4465 0.8748 0.9353
No log 7.0370 380 0.8835 0.4388 0.8835 0.9400
No log 7.0741 382 0.9104 0.3983 0.9104 0.9541
No log 7.1111 384 0.9410 0.3558 0.9410 0.9700
No log 7.1481 386 0.9552 0.3365 0.9552 0.9773
No log 7.1852 388 0.9629 0.3263 0.9629 0.9813
No log 7.2222 390 0.9768 0.4175 0.9768 0.9883
No log 7.2593 392 1.0400 0.3056 1.0400 1.0198
No log 7.2963 394 1.0618 0.2721 1.0618 1.0304
No log 7.3333 396 0.9823 0.3996 0.9823 0.9911
No log 7.3704 398 0.9061 0.4084 0.9061 0.9519
No log 7.4074 400 0.9158 0.4796 0.9158 0.9570
No log 7.4444 402 0.9001 0.5059 0.9001 0.9488
No log 7.4815 404 0.9139 0.3814 0.9139 0.9560
No log 7.5185 406 1.0269 0.3937 1.0269 1.0133
No log 7.5556 408 1.1076 0.3096 1.1076 1.0524
No log 7.5926 410 1.0581 0.3152 1.0581 1.0286
No log 7.6296 412 0.9498 0.3996 0.9498 0.9746
No log 7.6667 414 0.9333 0.3671 0.9333 0.9661
No log 7.7037 416 0.9219 0.3879 0.9219 0.9602
No log 7.7407 418 0.9343 0.3652 0.9343 0.9666
No log 7.7778 420 0.9541 0.3720 0.9541 0.9768
No log 7.8148 422 0.9335 0.4455 0.9335 0.9662
No log 7.8519 424 0.9256 0.3780 0.9256 0.9621
No log 7.8889 426 0.9750 0.3699 0.9750 0.9874
No log 7.9259 428 0.9720 0.3699 0.9720 0.9859
No log 7.9630 430 0.9409 0.3814 0.9409 0.9700
No log 8.0 432 0.9350 0.4023 0.9350 0.9669
No log 8.0370 434 0.9545 0.3535 0.9545 0.9770
No log 8.0741 436 0.9652 0.3779 0.9652 0.9825
No log 8.1111 438 0.9442 0.3804 0.9442 0.9717
No log 8.1481 440 0.9297 0.3738 0.9297 0.9642
No log 8.1852 442 0.9477 0.3373 0.9477 0.9735
No log 8.2222 444 0.9350 0.3483 0.9350 0.9670
No log 8.2593 446 0.9371 0.3577 0.9371 0.9681
No log 8.2963 448 0.9781 0.4588 0.9781 0.9890
No log 8.3333 450 0.9480 0.4164 0.9480 0.9737
No log 8.3704 452 0.8975 0.4413 0.8975 0.9473
No log 8.4074 454 0.8847 0.3983 0.8847 0.9406
No log 8.4444 456 0.8797 0.3879 0.8797 0.9379
No log 8.4815 458 0.8751 0.4141 0.8751 0.9355
No log 8.5185 460 0.8762 0.3933 0.8762 0.9361
No log 8.5556 462 0.8721 0.4036 0.8721 0.9339
No log 8.5926 464 0.8656 0.4243 0.8656 0.9304
No log 8.6296 466 0.8902 0.4998 0.8902 0.9435
No log 8.6667 468 0.8659 0.4722 0.8659 0.9305
No log 8.7037 470 0.8473 0.4555 0.8473 0.9205
No log 8.7407 472 0.8897 0.3276 0.8897 0.9432
No log 8.7778 474 0.9694 0.3152 0.9694 0.9846
No log 8.8148 476 0.9528 0.3106 0.9528 0.9761
No log 8.8519 478 0.9051 0.4221 0.9051 0.9514
No log 8.8889 480 0.8852 0.3641 0.8852 0.9409
No log 8.9259 482 0.8747 0.4181 0.8747 0.9352
No log 8.9630 484 0.8622 0.4912 0.8622 0.9285
No log 9.0 486 0.8491 0.4583 0.8491 0.9215
No log 9.0370 488 0.8765 0.3728 0.8765 0.9362
No log 9.0741 490 0.8704 0.3852 0.8704 0.9330
No log 9.1111 492 0.8315 0.4292 0.8315 0.9119
No log 9.1481 494 0.8494 0.5426 0.8494 0.9216
No log 9.1852 496 0.8607 0.4568 0.8607 0.9277
No log 9.2222 498 0.8376 0.4780 0.8376 0.9152
0.321 9.2593 500 0.8364 0.3879 0.8364 0.9146
0.321 9.2963 502 0.8438 0.3925 0.8438 0.9186
0.321 9.3333 504 0.8463 0.3925 0.8463 0.9199
0.321 9.3704 506 0.8430 0.4023 0.8430 0.9182
0.321 9.4074 508 0.8399 0.4122 0.8399 0.9165
0.321 9.4444 510 0.8413 0.4026 0.8413 0.9172
0.321 9.4815 512 0.8454 0.4695 0.8454 0.9195
0.321 9.5185 514 0.8512 0.4254 0.8512 0.9226
0.321 9.5556 516 0.8731 0.3969 0.8731 0.9344
0.321 9.5926 518 0.8731 0.3836 0.8731 0.9344
0.321 9.6296 520 0.8721 0.3983 0.8721 0.9339
0.321 9.6667 522 0.8738 0.4081 0.8738 0.9348
0.321 9.7037 524 0.8886 0.3879 0.8886 0.9426
0.321 9.7407 526 0.9171 0.3011 0.9171 0.9577
0.321 9.7778 528 0.9076 0.3091 0.9076 0.9527
0.321 9.8148 530 0.8989 0.3299 0.8989 0.9481
0.321 9.8519 532 0.8801 0.3318 0.8801 0.9381
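The final row (loss 0.8801, Qwk 0.3318) is not the strongest in the log: Qwk peaks at 0.5618 near epoch 3.48, and validation loss bottoms out at 0.8052 near epoch 2.70. If intermediate checkpoints were kept, selecting one from such a log could look like this (illustrative; `log_rows` is a hand-copied subset of the table above):

```python
# Each row: (epoch, validation_loss, qwk) -- a few rows from the log above.
log_rows = [
    (2.7037, 0.8052, 0.5358),
    (3.4815, 0.9107, 0.5618),
    (9.8519, 0.8801, 0.3318),
]

# Pick the checkpoint with the highest Qwk, or the lowest validation loss.
best_by_qwk = max(log_rows, key=lambda row: row[2])
best_by_loss = min(log_rows, key=lambda row: row[1])
```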

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
