ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run1_AugV5_k17_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5228
  • Qwk: 0.4527
  • Mse: 0.5228
  • Rmse: 0.7231
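
Here Qwk is the quadratic weighted kappa and Rmse is the square root of the Mse; the reported Loss equals the Mse, which suggests the model was trained as a regressor with an MSE objective. A self-contained sketch of these metrics in pure Python (the label values below are hypothetical, used only to illustrate the computation):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights over integer ratings 0..n_classes-1."""
    # Observed agreement (confusion) matrix
    O = [[0.0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    N = len(y_true)
    # Marginal histograms of true and predicted ratings
    hist_t = [sum(row) for row in O]
    hist_p = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2   # quadratic disagreement weight
            e = hist_t[i] * hist_p[j] / N             # expected count under chance
            num += w * O[i][j]
            den += w * e
    return 1.0 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# Hypothetical ratings on a 0-2 scale:
y_true = [0, 1, 2, 2, 1]
y_pred = [0, 1, 1, 2, 1]
print(quadratic_weighted_kappa(y_true, y_pred, 3))  # 0.8
print(mse(y_true, y_pred))                          # 0.2
print(math.sqrt(mse(y_true, y_pred)))               # RMSE = sqrt(MSE)
```

Note that the reported Rmse (0.7231) is indeed the square root of the reported Mse (0.5228), consistent with this definition.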

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
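
The scheduler is linear and no warmup is reported. A minimal sketch of the decay rule under the assumption of zero warmup (the total step count is inferred, not stated in the card):

```python
def linear_lr(step, total_steps, base_lr=2e-05):
    """Linearly decay the learning rate from base_lr to 0, assuming no warmup."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

# The log below reaches epoch 1.0 at step 86, so with num_epochs=100 the
# schedule would span roughly 8600 optimizer steps:
print(linear_lr(0, 8600))     # 2e-05 at the start
print(linear_lr(4300, 8600))  # 1e-05 at the halfway point
print(linear_lr(8600, 8600))  # 0.0 at the end
```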

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0233 2 2.6798 -0.0230 2.6798 1.6370
No log 0.0465 4 1.2995 0.0511 1.2995 1.1400
No log 0.0698 6 0.9989 -0.0622 0.9989 0.9994
No log 0.0930 8 1.0468 0.0478 1.0468 1.0231
No log 0.1163 10 0.9453 0.2703 0.9453 0.9723
No log 0.1395 12 0.7612 0.1648 0.7612 0.8725
No log 0.1628 14 0.7284 0.2336 0.7284 0.8535
No log 0.1860 16 0.7165 0.1807 0.7165 0.8465
No log 0.2093 18 0.7029 0.1807 0.7029 0.8384
No log 0.2326 20 0.6895 0.2718 0.6895 0.8303
No log 0.2558 22 0.7225 0.2424 0.7225 0.8500
No log 0.2791 24 0.9845 0.2239 0.9845 0.9922
No log 0.3023 26 1.0539 0.2435 1.0539 1.0266
No log 0.3256 28 0.8378 0.3895 0.8378 0.9153
No log 0.3488 30 0.6332 0.3050 0.6332 0.7957
No log 0.3721 32 0.5615 0.2955 0.5615 0.7493
No log 0.3953 34 0.5613 0.2243 0.5613 0.7492
No log 0.4186 36 0.5900 0.2382 0.5900 0.7681
No log 0.4419 38 0.6331 0.1315 0.6331 0.7957
No log 0.4651 40 0.8066 0.3598 0.8066 0.8981
No log 0.4884 42 0.8706 0.3777 0.8706 0.9331
No log 0.5116 44 0.7387 0.3051 0.7387 0.8595
No log 0.5349 46 0.5897 0.1983 0.5897 0.7679
No log 0.5581 48 0.5970 0.3546 0.5970 0.7727
No log 0.5814 50 0.5960 0.3084 0.5960 0.7720
No log 0.6047 52 0.7067 0.2051 0.7067 0.8407
No log 0.6279 54 0.8556 0.3643 0.8556 0.9250
No log 0.6512 56 1.0438 0.2263 1.0438 1.0217
No log 0.6744 58 1.1741 0.1710 1.1741 1.0836
No log 0.6977 60 1.0396 0.1746 1.0396 1.0196
No log 0.7209 62 0.7900 0.4431 0.7900 0.8888
No log 0.7442 64 0.6619 0.3481 0.6619 0.8136
No log 0.7674 66 0.6506 0.2811 0.6506 0.8066
No log 0.7907 68 0.6548 0.3160 0.6548 0.8092
No log 0.8140 70 0.6615 0.3293 0.6615 0.8133
No log 0.8372 72 0.6523 0.2715 0.6523 0.8076
No log 0.8605 74 0.8482 0.2975 0.8482 0.9210
No log 0.8837 76 0.9186 0.3069 0.9186 0.9585
No log 0.9070 78 0.7183 0.3819 0.7183 0.8475
No log 0.9302 80 0.6052 0.3089 0.6052 0.7780
No log 0.9535 82 0.7180 0.4411 0.7180 0.8473
No log 0.9767 84 0.7372 0.4161 0.7372 0.8586
No log 1.0 86 0.6900 0.3681 0.6900 0.8307
No log 1.0233 88 0.6852 0.3681 0.6852 0.8278
No log 1.0465 90 0.6493 0.3443 0.6493 0.8058
No log 1.0698 92 0.6192 0.4276 0.6192 0.7869
No log 1.0930 94 0.6287 0.4182 0.6287 0.7929
No log 1.1163 96 0.8350 0.4051 0.8350 0.9138
No log 1.1395 98 1.2298 0.1453 1.2298 1.1090
No log 1.1628 100 1.2299 0.1453 1.2299 1.1090
No log 1.1860 102 0.9205 0.3808 0.9205 0.9594
No log 1.2093 104 0.8132 0.3988 0.8132 0.9018
No log 1.2326 106 0.7916 0.3597 0.7916 0.8897
No log 1.2558 108 0.6951 0.4103 0.6951 0.8337
No log 1.2791 110 0.6393 0.4693 0.6393 0.7995
No log 1.3023 112 0.6366 0.4586 0.6366 0.7979
No log 1.3256 114 0.6616 0.5845 0.6616 0.8134
No log 1.3488 116 0.6802 0.4767 0.6802 0.8247
No log 1.3721 118 0.6612 0.5257 0.6612 0.8132
No log 1.3953 120 0.6559 0.4831 0.6559 0.8099
No log 1.4186 122 0.7262 0.4357 0.7262 0.8522
No log 1.4419 124 0.6859 0.3768 0.6859 0.8282
No log 1.4651 126 0.6555 0.4542 0.6555 0.8096
No log 1.4884 128 0.6858 0.3586 0.6858 0.8282
No log 1.5116 130 0.6337 0.3196 0.6337 0.7961
No log 1.5349 132 0.6427 0.3580 0.6427 0.8017
No log 1.5581 134 0.7352 0.2245 0.7352 0.8574
No log 1.5814 136 0.7599 0.2528 0.7599 0.8717
No log 1.6047 138 0.6994 0.3124 0.6994 0.8363
No log 1.6279 140 0.6404 0.4548 0.6404 0.8002
No log 1.6512 142 0.6586 0.3808 0.6586 0.8115
No log 1.6744 144 0.7136 0.3761 0.7136 0.8448
No log 1.6977 146 0.7131 0.3590 0.7131 0.8445
No log 1.7209 148 0.6293 0.3435 0.6293 0.7933
No log 1.7442 150 0.6277 0.3755 0.6277 0.7923
No log 1.7674 152 0.6456 0.2995 0.6456 0.8035
No log 1.7907 154 0.6178 0.3966 0.6178 0.7860
No log 1.8140 156 0.6936 0.4016 0.6936 0.8328
No log 1.8372 158 0.8228 0.3718 0.8228 0.9071
No log 1.8605 160 0.7544 0.4391 0.7544 0.8686
No log 1.8837 162 0.6509 0.3746 0.6509 0.8068
No log 1.9070 164 0.6105 0.4073 0.6105 0.7813
No log 1.9302 166 0.6085 0.4001 0.6085 0.7801
No log 1.9535 168 0.6086 0.3970 0.6086 0.7801
No log 1.9767 170 0.6552 0.3729 0.6552 0.8095
No log 2.0 172 0.8077 0.3239 0.8077 0.8987
No log 2.0233 174 0.9586 0.2680 0.9586 0.9791
No log 2.0465 176 0.9053 0.2727 0.9053 0.9514
No log 2.0698 178 0.7730 0.4377 0.7730 0.8792
No log 2.0930 180 0.6986 0.4224 0.6986 0.8358
No log 2.1163 182 0.6995 0.4263 0.6995 0.8364
No log 2.1395 184 0.7525 0.4392 0.7525 0.8675
No log 2.1628 186 0.7192 0.4177 0.7192 0.8480
No log 2.1860 188 0.6776 0.4189 0.6776 0.8232
No log 2.2093 190 0.6414 0.4444 0.6414 0.8009
No log 2.2326 192 0.6486 0.4429 0.6486 0.8054
No log 2.2558 194 0.6605 0.3879 0.6605 0.8127
No log 2.2791 196 0.6508 0.4633 0.6508 0.8067
No log 2.3023 198 0.6773 0.4199 0.6773 0.8230
No log 2.3256 200 0.6458 0.3590 0.6458 0.8036
No log 2.3488 202 0.6238 0.4193 0.6238 0.7898
No log 2.3721 204 0.6585 0.4139 0.6585 0.8115
No log 2.3953 206 0.6281 0.3856 0.6281 0.7926
No log 2.4186 208 0.6097 0.3970 0.6097 0.7808
No log 2.4419 210 0.6573 0.4513 0.6573 0.8107
No log 2.4651 212 0.6448 0.3990 0.6448 0.8030
No log 2.4884 214 0.6379 0.3504 0.6379 0.7987
No log 2.5116 216 0.7060 0.3761 0.7060 0.8402
No log 2.5349 218 0.7896 0.3550 0.7896 0.8886
No log 2.5581 220 0.7804 0.3590 0.7804 0.8834
No log 2.5814 222 0.7039 0.3761 0.7039 0.8390
No log 2.6047 224 0.6559 0.4547 0.6559 0.8099
No log 2.6279 226 0.6500 0.4361 0.6500 0.8062
No log 2.6512 228 0.6739 0.3934 0.6739 0.8209
No log 2.6744 230 0.6726 0.3934 0.6726 0.8201
No log 2.6977 232 0.6908 0.3355 0.6908 0.8311
No log 2.7209 234 0.6934 0.3355 0.6934 0.8327
No log 2.7442 236 0.6553 0.3141 0.6553 0.8095
No log 2.7674 238 0.6544 0.3141 0.6544 0.8090
No log 2.7907 240 0.6887 0.3474 0.6887 0.8299
No log 2.8140 242 0.7036 0.3474 0.7036 0.8388
No log 2.8372 244 0.6728 0.3561 0.6728 0.8202
No log 2.8605 246 0.6590 0.3561 0.6590 0.8118
No log 2.8837 248 0.6474 0.4308 0.6474 0.8046
No log 2.9070 250 0.6458 0.3725 0.6458 0.8036
No log 2.9302 252 0.6529 0.3378 0.6529 0.8080
No log 2.9535 254 0.7095 0.3896 0.7095 0.8423
No log 2.9767 256 0.6921 0.3896 0.6921 0.8319
No log 3.0 258 0.6552 0.3653 0.6552 0.8094
No log 3.0233 260 0.6060 0.4322 0.6060 0.7785
No log 3.0465 262 0.6108 0.4299 0.6108 0.7816
No log 3.0698 264 0.6294 0.3563 0.6294 0.7933
No log 3.0930 266 0.6885 0.4131 0.6885 0.8298
No log 3.1163 268 0.7362 0.3829 0.7362 0.8580
No log 3.1395 270 0.7131 0.3633 0.7131 0.8444
No log 3.1628 272 0.7302 0.4084 0.7302 0.8545
No log 3.1860 274 0.7612 0.4168 0.7612 0.8725
No log 3.2093 276 0.7314 0.3912 0.7314 0.8552
No log 3.2326 278 0.6970 0.4689 0.6970 0.8349
No log 3.2558 280 0.7020 0.4340 0.7020 0.8378
No log 3.2791 282 0.6892 0.4340 0.6892 0.8302
No log 3.3023 284 0.6888 0.4111 0.6888 0.8300
No log 3.3256 286 0.6756 0.4444 0.6756 0.8220
No log 3.3488 288 0.6199 0.4211 0.6199 0.7873
No log 3.3721 290 0.6078 0.4253 0.6078 0.7796
No log 3.3953 292 0.6175 0.3906 0.6175 0.7858
No log 3.4186 294 0.7276 0.3410 0.7276 0.8530
No log 3.4419 296 0.8585 0.3924 0.8585 0.9266
No log 3.4651 298 0.8262 0.4007 0.8262 0.9089
No log 3.4884 300 0.6846 0.3896 0.6846 0.8274
No log 3.5116 302 0.6040 0.4322 0.6040 0.7772
No log 3.5349 304 0.6043 0.4314 0.6043 0.7774
No log 3.5581 306 0.6023 0.3407 0.6023 0.7761
No log 3.5814 308 0.6105 0.3107 0.6105 0.7814
No log 3.6047 310 0.6212 0.3481 0.6212 0.7881
No log 3.6279 312 0.6021 0.3982 0.6021 0.7760
No log 3.6512 314 0.5923 0.4466 0.5923 0.7696
No log 3.6744 316 0.5931 0.4536 0.5931 0.7702
No log 3.6977 318 0.6342 0.4939 0.6342 0.7964
No log 3.7209 320 0.6435 0.5085 0.6435 0.8022
No log 3.7442 322 0.6114 0.5254 0.6114 0.7819
No log 3.7674 324 0.5816 0.5301 0.5816 0.7626
No log 3.7907 326 0.5811 0.5399 0.5811 0.7623
No log 3.8140 328 0.6107 0.5407 0.6107 0.7815
No log 3.8372 330 0.5857 0.5628 0.5857 0.7653
No log 3.8605 332 0.5752 0.5848 0.5752 0.7584
No log 3.8837 334 0.5571 0.5758 0.5571 0.7464
No log 3.9070 336 0.5554 0.5487 0.5554 0.7452
No log 3.9302 338 0.5255 0.5758 0.5255 0.7249
No log 3.9535 340 0.5283 0.6076 0.5283 0.7269
No log 3.9767 342 0.6295 0.5464 0.6295 0.7934
No log 4.0 344 0.6727 0.5146 0.6727 0.8202
No log 4.0233 346 0.6299 0.5464 0.6299 0.7937
No log 4.0465 348 0.5616 0.5795 0.5616 0.7494
No log 4.0698 350 0.5656 0.6195 0.5656 0.7520
No log 4.0930 352 0.5561 0.5835 0.5561 0.7457
No log 4.1163 354 0.5982 0.5757 0.5982 0.7734
No log 4.1395 356 0.6829 0.4794 0.6829 0.8264
No log 4.1628 358 0.7504 0.5032 0.7504 0.8663
No log 4.1860 360 0.7703 0.4852 0.7703 0.8777
No log 4.2093 362 0.6417 0.5101 0.6417 0.8010
No log 4.2326 364 0.5628 0.4620 0.5628 0.7502
No log 4.2558 366 0.6117 0.4836 0.6117 0.7821
No log 4.2791 368 0.6155 0.4808 0.6155 0.7845
No log 4.3023 370 0.5681 0.5327 0.5681 0.7537
No log 4.3256 372 0.5565 0.3530 0.5565 0.7460
No log 4.3488 374 0.5980 0.4116 0.5980 0.7733
No log 4.3721 376 0.6214 0.4083 0.6214 0.7883
No log 4.3953 378 0.5926 0.4174 0.5926 0.7698
No log 4.4186 380 0.5726 0.4174 0.5726 0.7567
No log 4.4419 382 0.5735 0.4059 0.5735 0.7573
No log 4.4651 384 0.5950 0.4589 0.5950 0.7714
No log 4.4884 386 0.6600 0.5065 0.6600 0.8124
No log 4.5116 388 0.6191 0.4949 0.6191 0.7868
No log 4.5349 390 0.5329 0.4632 0.5329 0.7300
No log 4.5581 392 0.5081 0.5485 0.5081 0.7128
No log 4.5814 394 0.5100 0.5732 0.5100 0.7142
No log 4.6047 396 0.4999 0.6452 0.4999 0.7071
No log 4.6279 398 0.5382 0.5368 0.5382 0.7336
No log 4.6512 400 0.6272 0.4756 0.6272 0.7920
No log 4.6744 402 0.6501 0.4773 0.6501 0.8063
No log 4.6977 404 0.5718 0.4851 0.5718 0.7562
No log 4.7209 406 0.4904 0.6170 0.4904 0.7003
No log 4.7442 408 0.5048 0.5307 0.5048 0.7105
No log 4.7674 410 0.5242 0.5131 0.5242 0.7240
No log 4.7907 412 0.5100 0.5170 0.5100 0.7141
No log 4.8140 414 0.5128 0.4505 0.5128 0.7161
No log 4.8372 416 0.5808 0.4969 0.5808 0.7621
No log 4.8605 418 0.6047 0.5122 0.6047 0.7776
No log 4.8837 420 0.5557 0.5429 0.5557 0.7455
No log 4.9070 422 0.5130 0.5596 0.5130 0.7162
No log 4.9302 424 0.5151 0.6087 0.5151 0.7177
No log 4.9535 426 0.5583 0.5822 0.5583 0.7472
No log 4.9767 428 0.6138 0.5116 0.6138 0.7835
No log 5.0 430 0.6013 0.5436 0.6013 0.7754
No log 5.0233 432 0.5469 0.5538 0.5469 0.7395
No log 5.0465 434 0.5461 0.4964 0.5461 0.7390
No log 5.0698 436 0.5815 0.4663 0.5815 0.7626
No log 5.0930 438 0.5724 0.4845 0.5724 0.7566
No log 5.1163 440 0.5565 0.3865 0.5565 0.7460
No log 5.1395 442 0.5524 0.4314 0.5524 0.7433
No log 5.1628 444 0.5503 0.4526 0.5503 0.7418
No log 5.1860 446 0.5462 0.4397 0.5462 0.7390
No log 5.2093 448 0.5420 0.4964 0.5420 0.7362
No log 5.2326 450 0.5430 0.4516 0.5430 0.7369
No log 5.2558 452 0.5684 0.5678 0.5684 0.7539
No log 5.2791 454 0.6016 0.5168 0.6016 0.7756
No log 5.3023 456 0.5998 0.5283 0.5998 0.7745
No log 5.3256 458 0.5825 0.5471 0.5825 0.7632
No log 5.3488 460 0.5527 0.5943 0.5527 0.7434
No log 5.3721 462 0.5417 0.4757 0.5417 0.7360
No log 5.3953 464 0.5387 0.4757 0.5387 0.7340
No log 5.4186 466 0.5441 0.5596 0.5441 0.7376
No log 5.4419 468 0.5702 0.5554 0.5702 0.7551
No log 5.4651 470 0.5609 0.5538 0.5609 0.7489
No log 5.4884 472 0.5414 0.4495 0.5414 0.7358
No log 5.5116 474 0.5505 0.4782 0.5505 0.7419
No log 5.5349 476 0.5964 0.4513 0.5964 0.7723
No log 5.5581 478 0.6058 0.4460 0.6058 0.7783
No log 5.5814 480 0.5682 0.4912 0.5682 0.7538
No log 5.6047 482 0.5648 0.6411 0.5648 0.7515
No log 5.6279 484 0.6231 0.5283 0.6231 0.7894
No log 5.6512 486 0.6597 0.5002 0.6597 0.8122
No log 5.6744 488 0.6170 0.5236 0.6170 0.7855
No log 5.6977 490 0.5574 0.6092 0.5574 0.7466
No log 5.7209 492 0.5323 0.5479 0.5323 0.7296
No log 5.7442 494 0.5246 0.5160 0.5246 0.7243
No log 5.7674 496 0.5294 0.5028 0.5294 0.7276
No log 5.7907 498 0.5359 0.5266 0.5359 0.7320
0.3365 5.8140 500 0.5253 0.4782 0.5253 0.7248
0.3365 5.8372 502 0.5233 0.4724 0.5233 0.7234
0.3365 5.8605 504 0.5198 0.4634 0.5198 0.7210
0.3365 5.8837 506 0.5219 0.4471 0.5219 0.7224
0.3365 5.9070 508 0.5250 0.4677 0.5250 0.7246
0.3365 5.9302 510 0.5228 0.4527 0.5228 0.7231
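
Two details worth noting from the log above: epoch 1.0 is reached at step 86, and logging stops at epoch ≈5.93 even though num_epochs is 100, which suggests early stopping or a truncated run. The step count also bounds the training-set size:

```python
steps_per_epoch = 86    # epoch 1.0 is reached at step 86 in the table above
train_batch_size = 8    # from the hyperparameters
# Each optimizer step consumes one batch, so the training set holds at most
# steps_per_epoch * train_batch_size examples (the last batch may be partial).
approx_train_examples = steps_per_epoch * train_batch_size
print(approx_train_examples)  # 688
```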

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Safetensors

  • Model size: 0.1B params
  • Tensor type: F32
Model tree for MayBashendy/ArabicNewSplits7_B_usingALLEssays_FineTuningAraBERT_run1_AugV5_k17_task7_organization

Finetuned
(4019)
this model