ArabicNewSplits6_FineTuningAraBERT_run1_AugV5_k10_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8429
  • Qwk: 0.4631
  • Mse: 0.8429
  • Rmse: 0.9181
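
The three metrics above are related: the loss equals the MSE here, and RMSE is its square root (√0.8429 ≈ 0.9181). As a minimal pure-Python sketch (function names are illustrative; QWK is assumed to be computed over integer ordinal labels, as is typical for essay/trait scoring):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights over integer labels 0..n_classes-1."""
    n = len(y_true)
    # observed confusion matrix
    obs = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        obs[t][p] += 1
    # marginal histograms of true and predicted labels
    hist_true = [sum(row) for row in obs]
    hist_pred = [sum(obs[i][j] for i in range(n_classes)) for j in range(n_classes)]
    # quadratic disagreement weights: penalty grows with squared distance
    w = [[(i - j) ** 2 / (n_classes - 1) ** 2 for j in range(n_classes)]
         for i in range(n_classes)]
    observed = sum(w[i][j] * obs[i][j]
                   for i in range(n_classes) for j in range(n_classes))
    expected = sum(w[i][j] * hist_true[i] * hist_pred[j] / n
                   for i in range(n_classes) for j in range(n_classes))
    return 1.0 - observed / expected

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    return math.sqrt(mse(y_true, y_pred))
```

QWK is 1.0 for perfect agreement, 0.0 for chance-level agreement, and negative for systematic disagreement, so the 0.4631 above indicates moderate agreement with the reference scores.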

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
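
The `linear` scheduler decays the learning rate from its base value to zero over the full run (optionally after a warmup ramp; no warmup is reported for this run). A minimal sketch, assuming the semantics of transformers' linear schedule and the 520 total steps visible in the table below:

```python
def linear_lr(step, total_steps, base_lr=2e-5, warmup_steps=0):
    """Linear schedule: ramp up over warmup_steps, then decay linearly to 0."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))
    return base_lr * remaining
```

For example, halfway through training (step 260 of 520) the learning rate has dropped to 1e-05, and it reaches 0 at the final step.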

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0385 2 4.2390 -0.0152 4.2390 2.0589
No log 0.0769 4 2.4279 0.0444 2.4279 1.5582
No log 0.1154 6 1.3950 0.0190 1.3950 1.1811
No log 0.1538 8 1.3345 -0.0305 1.3345 1.1552
No log 0.1923 10 0.9753 -0.0673 0.9753 0.9876
No log 0.2308 12 0.7276 0.1754 0.7276 0.8530
No log 0.2692 14 0.7335 0.1844 0.7335 0.8565
No log 0.3077 16 0.9238 0.1198 0.9238 0.9611
No log 0.3462 18 0.8142 0.1910 0.8142 0.9023
No log 0.3846 20 0.7527 0.1977 0.7527 0.8676
No log 0.4231 22 0.7998 0.2217 0.7998 0.8943
No log 0.4615 24 0.9654 0.1387 0.9654 0.9825
No log 0.5 26 0.9558 0.1879 0.9558 0.9776
No log 0.5385 28 0.9368 0.2358 0.9368 0.9679
No log 0.5769 30 0.7613 0.2443 0.7613 0.8725
No log 0.6154 32 0.7218 0.2947 0.7218 0.8496
No log 0.6538 34 0.7760 0.1493 0.7760 0.8809
No log 0.6923 36 0.7449 0.3248 0.7449 0.8631
No log 0.7308 38 0.8531 0.3235 0.8531 0.9236
No log 0.7692 40 1.0518 0.2205 1.0518 1.0255
No log 0.8077 42 1.0192 0.2537 1.0192 1.0096
No log 0.8462 44 0.8303 0.3807 0.8303 0.9112
No log 0.8846 46 0.7082 0.3779 0.7082 0.8416
No log 0.9231 48 0.7099 0.3504 0.7099 0.8425
No log 0.9615 50 0.7150 0.3414 0.7150 0.8456
No log 1.0 52 0.6835 0.3554 0.6835 0.8268
No log 1.0385 54 0.6659 0.3784 0.6659 0.8160
No log 1.0769 56 0.7065 0.3579 0.7065 0.8405
No log 1.1154 58 0.8901 0.3649 0.8901 0.9435
No log 1.1538 60 0.9753 0.3382 0.9753 0.9876
No log 1.1923 62 0.9955 0.3267 0.9955 0.9978
No log 1.2308 64 1.1133 0.2175 1.1133 1.0551
No log 1.2692 66 1.1530 0.1534 1.1530 1.0738
No log 1.3077 68 1.0516 0.1709 1.0516 1.0255
No log 1.3462 70 0.8716 0.2153 0.8716 0.9336
No log 1.3846 72 0.8317 0.3205 0.8317 0.9120
No log 1.4231 74 0.9348 0.3623 0.9348 0.9668
No log 1.4615 76 0.9214 0.3804 0.9214 0.9599
No log 1.5 78 0.8566 0.3836 0.8566 0.9256
No log 1.5385 80 0.8492 0.2303 0.8492 0.9215
No log 1.5769 82 0.9509 0.1219 0.9509 0.9751
No log 1.6154 84 0.9915 0.0507 0.9915 0.9957
No log 1.6538 86 0.7559 0.2991 0.7559 0.8694
No log 1.6923 88 0.6204 0.3953 0.6204 0.7876
No log 1.7308 90 0.6726 0.4441 0.6726 0.8201
No log 1.7692 92 0.8128 0.3405 0.8128 0.9015
No log 1.8077 94 0.8488 0.3494 0.8488 0.9213
No log 1.8462 96 0.7485 0.4160 0.7485 0.8651
No log 1.8846 98 0.7217 0.4446 0.7217 0.8496
No log 1.9231 100 0.7430 0.4584 0.7430 0.8620
No log 1.9615 102 0.8144 0.4211 0.8144 0.9024
No log 2.0 104 0.8153 0.4631 0.8153 0.9029
No log 2.0385 106 0.8329 0.4978 0.8329 0.9126
No log 2.0769 108 0.8429 0.4474 0.8429 0.9181
No log 2.1154 110 0.8597 0.4491 0.8597 0.9272
No log 2.1538 112 0.8551 0.4295 0.8551 0.9247
No log 2.1923 114 0.8567 0.4727 0.8567 0.9256
No log 2.2308 116 0.8589 0.4743 0.8589 0.9268
No log 2.2692 118 0.8608 0.4931 0.8608 0.9278
No log 2.3077 120 0.8883 0.4321 0.8883 0.9425
No log 2.3462 122 0.8547 0.4614 0.8547 0.9245
No log 2.3846 124 0.8203 0.4994 0.8203 0.9057
No log 2.4231 126 0.8720 0.4691 0.8720 0.9338
No log 2.4615 128 0.8869 0.4311 0.8869 0.9417
No log 2.5 130 0.8738 0.4350 0.8738 0.9347
No log 2.5385 132 0.7580 0.4961 0.7580 0.8706
No log 2.5769 134 0.7282 0.4699 0.7282 0.8533
No log 2.6154 136 0.7984 0.3937 0.7984 0.8935
No log 2.6538 138 0.8517 0.4021 0.8517 0.9229
No log 2.6923 140 0.8056 0.4018 0.8056 0.8975
No log 2.7308 142 0.7224 0.3538 0.7224 0.8499
No log 2.7692 144 0.6651 0.4115 0.6651 0.8155
No log 2.8077 146 0.6841 0.5054 0.6841 0.8271
No log 2.8462 148 0.6875 0.4542 0.6875 0.8292
No log 2.8846 150 0.6832 0.4184 0.6832 0.8266
No log 2.9231 152 0.7130 0.4614 0.7130 0.8444
No log 2.9615 154 0.7084 0.4487 0.7084 0.8417
No log 3.0 156 0.7261 0.3970 0.7261 0.8521
No log 3.0385 158 0.7605 0.4394 0.7605 0.8721
No log 3.0769 160 0.8043 0.4834 0.8043 0.8968
No log 3.1154 162 0.8323 0.4974 0.8323 0.9123
No log 3.1538 164 0.8113 0.4945 0.8113 0.9007
No log 3.1923 166 0.8196 0.4462 0.8196 0.9053
No log 3.2308 168 0.8747 0.4182 0.8747 0.9353
No log 3.2692 170 0.8658 0.3934 0.8658 0.9305
No log 3.3077 172 0.7891 0.4294 0.7891 0.8883
No log 3.3462 174 0.7502 0.4747 0.7502 0.8661
No log 3.3846 176 0.7631 0.5461 0.7631 0.8736
No log 3.4231 178 0.7785 0.5166 0.7785 0.8823
No log 3.4615 180 0.8078 0.4609 0.8078 0.8988
No log 3.5 182 0.8688 0.4325 0.8688 0.9321
No log 3.5385 184 0.8926 0.4323 0.8926 0.9448
No log 3.5769 186 0.8700 0.4155 0.8700 0.9327
No log 3.6154 188 0.8503 0.4098 0.8503 0.9221
No log 3.6538 190 0.8465 0.3877 0.8465 0.9201
No log 3.6923 192 0.8402 0.4190 0.8402 0.9166
No log 3.7308 194 0.8240 0.4052 0.8240 0.9078
No log 3.7692 196 0.7950 0.4293 0.7950 0.8916
No log 3.8077 198 0.8008 0.4345 0.8008 0.8949
No log 3.8462 200 0.7973 0.4559 0.7973 0.8929
No log 3.8846 202 0.8073 0.4818 0.8073 0.8985
No log 3.9231 204 0.8371 0.4779 0.8371 0.9150
No log 3.9615 206 0.8290 0.4824 0.8290 0.9105
No log 4.0 208 0.8254 0.4673 0.8254 0.9085
No log 4.0385 210 0.8313 0.4595 0.8313 0.9118
No log 4.0769 212 0.8404 0.3903 0.8404 0.9167
No log 4.1154 214 0.8466 0.4048 0.8466 0.9201
No log 4.1538 216 0.8503 0.3900 0.8503 0.9221
No log 4.1923 218 0.8343 0.4253 0.8343 0.9134
No log 4.2308 220 0.8126 0.4346 0.8126 0.9015
No log 4.2692 222 0.8122 0.4515 0.8122 0.9012
No log 4.3077 224 0.7935 0.4444 0.7935 0.8908
No log 4.3462 226 0.7637 0.4277 0.7637 0.8739
No log 4.3846 228 0.7536 0.5179 0.7536 0.8681
No log 4.4231 230 0.7590 0.5030 0.7590 0.8712
No log 4.4615 232 0.7498 0.4138 0.7498 0.8659
No log 4.5 234 0.7633 0.4505 0.7633 0.8737
No log 4.5385 236 0.7989 0.4346 0.7989 0.8938
No log 4.5769 238 0.8080 0.4496 0.8080 0.8989
No log 4.6154 240 0.8194 0.4849 0.8194 0.9052
No log 4.6538 242 0.8415 0.4718 0.8415 0.9173
No log 4.6923 244 0.8708 0.4934 0.8708 0.9332
No log 4.7308 246 0.8969 0.4751 0.8969 0.9470
No log 4.7692 248 0.9035 0.4549 0.9035 0.9505
No log 4.8077 250 0.8994 0.4483 0.8994 0.9484
No log 4.8462 252 0.9091 0.4377 0.9091 0.9535
No log 4.8846 254 0.8715 0.4432 0.8715 0.9335
No log 4.9231 256 0.8679 0.4613 0.8679 0.9316
No log 4.9615 258 0.9058 0.4603 0.9058 0.9517
No log 5.0 260 0.9055 0.4592 0.9055 0.9516
No log 5.0385 262 0.8898 0.4400 0.8898 0.9433
No log 5.0769 264 0.8876 0.4513 0.8876 0.9421
No log 5.1154 266 0.8833 0.4668 0.8833 0.9399
No log 5.1538 268 0.8694 0.4432 0.8694 0.9324
No log 5.1923 270 0.8460 0.4403 0.8460 0.9198
No log 5.2308 272 0.8339 0.4401 0.8339 0.9132
No log 5.2692 274 0.8262 0.4621 0.8262 0.9089
No log 5.3077 276 0.8404 0.4818 0.8404 0.9167
No log 5.3462 278 0.8421 0.4764 0.8421 0.9177
No log 5.3846 280 0.8511 0.4723 0.8511 0.9225
No log 5.4231 282 0.8648 0.4922 0.8648 0.9299
No log 5.4615 284 0.9018 0.4658 0.9018 0.9497
No log 5.5 286 0.9356 0.4650 0.9356 0.9673
No log 5.5385 288 0.9356 0.4650 0.9356 0.9672
No log 5.5769 290 0.9049 0.4833 0.9049 0.9512
No log 5.6154 292 0.8946 0.4840 0.8946 0.9458
No log 5.6538 294 0.9207 0.4507 0.9207 0.9595
No log 5.6923 296 0.9147 0.4473 0.9147 0.9564
No log 5.7308 298 0.8608 0.4809 0.8608 0.9278
No log 5.7692 300 0.8200 0.4751 0.8200 0.9056
No log 5.8077 302 0.8205 0.5025 0.8205 0.9058
No log 5.8462 304 0.8312 0.4493 0.8312 0.9117
No log 5.8846 306 0.8164 0.4493 0.8164 0.9035
No log 5.9231 308 0.8043 0.4992 0.8043 0.8968
No log 5.9615 310 0.7937 0.4852 0.7937 0.8909
No log 6.0 312 0.7992 0.4852 0.7992 0.8940
No log 6.0385 314 0.8082 0.4960 0.8082 0.8990
No log 6.0769 316 0.8349 0.4724 0.8349 0.9137
No log 6.1154 318 0.8542 0.4440 0.8542 0.9242
No log 6.1538 320 0.8691 0.4651 0.8691 0.9322
No log 6.1923 322 0.8663 0.4587 0.8663 0.9307
No log 6.2308 324 0.8696 0.4461 0.8696 0.9325
No log 6.2692 326 0.8650 0.4539 0.8650 0.9300
No log 6.3077 328 0.8480 0.4755 0.8480 0.9209
No log 6.3462 330 0.8442 0.4796 0.8442 0.9188
No log 6.3846 332 0.8466 0.4914 0.8466 0.9201
No log 6.4231 334 0.8439 0.5029 0.8439 0.9186
No log 6.4615 336 0.8511 0.4890 0.8511 0.9225
No log 6.5 338 0.8483 0.4183 0.8483 0.9210
No log 6.5385 340 0.8306 0.4180 0.8306 0.9114
No log 6.5769 342 0.8166 0.4874 0.8166 0.9037
No log 6.6154 344 0.8109 0.4822 0.8109 0.9005
No log 6.6538 346 0.8094 0.4822 0.8094 0.8996
No log 6.6923 348 0.8067 0.4975 0.8067 0.8982
No log 6.7308 350 0.8063 0.4967 0.8063 0.8980
No log 6.7692 352 0.8068 0.4983 0.8068 0.8982
No log 6.8077 354 0.8006 0.4536 0.8006 0.8948
No log 6.8462 356 0.8002 0.4691 0.8002 0.8945
No log 6.8846 358 0.7933 0.4634 0.7933 0.8907
No log 6.9231 360 0.7958 0.4634 0.7958 0.8921
No log 6.9615 362 0.8118 0.4463 0.8118 0.9010
No log 7.0 364 0.8487 0.4342 0.8487 0.9213
No log 7.0385 366 0.8825 0.4196 0.8825 0.9394
No log 7.0769 368 0.8764 0.4196 0.8764 0.9362
No log 7.1154 370 0.8568 0.4342 0.8568 0.9256
No log 7.1538 372 0.8438 0.4696 0.8438 0.9186
No log 7.1923 374 0.8491 0.4588 0.8491 0.9215
No log 7.2308 376 0.8644 0.4658 0.8644 0.9297
No log 7.2692 378 0.8926 0.4411 0.8926 0.9448
No log 7.3077 380 0.9323 0.4341 0.9323 0.9655
No log 7.3462 382 0.9371 0.4411 0.9371 0.9680
No log 7.3846 384 0.9580 0.4340 0.9580 0.9788
No log 7.4231 386 0.9682 0.4409 0.9682 0.9840
No log 7.4615 388 0.9522 0.4409 0.9522 0.9758
No log 7.5 390 0.9212 0.4848 0.9212 0.9598
No log 7.5385 392 0.9174 0.4985 0.9174 0.9578
No log 7.5769 394 0.9316 0.4906 0.9316 0.9652
No log 7.6154 396 0.9771 0.4339 0.9771 0.9885
No log 7.6538 398 1.0207 0.4286 1.0207 1.0103
No log 7.6923 400 1.0645 0.4334 1.0645 1.0317
No log 7.7308 402 1.0994 0.4118 1.0994 1.0485
No log 7.7692 404 1.0966 0.4095 1.0966 1.0472
No log 7.8077 406 1.0471 0.4383 1.0471 1.0233
No log 7.8462 408 0.9846 0.4497 0.9846 0.9923
No log 7.8846 410 0.9483 0.4852 0.9483 0.9738
No log 7.9231 412 0.9306 0.4803 0.9306 0.9647
No log 7.9615 414 0.9039 0.4639 0.9039 0.9507
No log 8.0 416 0.8760 0.4753 0.8760 0.9359
No log 8.0385 418 0.8576 0.4839 0.8576 0.9261
No log 8.0769 420 0.8485 0.4555 0.8485 0.9211
No log 8.1154 422 0.8546 0.4253 0.8546 0.9244
No log 8.1538 424 0.8655 0.4215 0.8655 0.9303
No log 8.1923 426 0.8729 0.4215 0.8729 0.9343
No log 8.2308 428 0.8696 0.4195 0.8696 0.9325
No log 8.2692 430 0.8570 0.4215 0.8570 0.9257
No log 8.3077 432 0.8461 0.4159 0.8461 0.9198
No log 8.3462 434 0.8395 0.4593 0.8395 0.9162
No log 8.3846 436 0.8326 0.4566 0.8326 0.9125
No log 8.4231 438 0.8344 0.4767 0.8344 0.9134
No log 8.4615 440 0.8270 0.4722 0.8270 0.9094
No log 8.5 442 0.8248 0.4676 0.8248 0.9082
No log 8.5385 444 0.8262 0.4740 0.8262 0.9090
No log 8.5769 446 0.8320 0.4740 0.8320 0.9122
No log 8.6154 448 0.8425 0.4722 0.8425 0.9179
No log 8.6538 450 0.8507 0.4310 0.8507 0.9223
No log 8.6923 452 0.8547 0.4363 0.8547 0.9245
No log 8.7308 454 0.8610 0.4363 0.8610 0.9279
No log 8.7692 456 0.8591 0.4363 0.8591 0.9269
No log 8.8077 458 0.8551 0.4363 0.8551 0.9247
No log 8.8462 460 0.8429 0.4644 0.8429 0.9181
No log 8.8846 462 0.8430 0.4790 0.8430 0.9181
No log 8.9231 464 0.8483 0.4802 0.8483 0.9210
No log 8.9615 466 0.8529 0.5016 0.8529 0.9235
No log 9.0 468 0.8562 0.5008 0.8562 0.9253
No log 9.0385 470 0.8605 0.5008 0.8605 0.9276
No log 9.0769 472 0.8680 0.4796 0.8680 0.9317
No log 9.1154 474 0.8744 0.4790 0.8744 0.9351
No log 9.1538 476 0.8806 0.4538 0.8806 0.9384
No log 9.1923 478 0.8816 0.4535 0.8816 0.9390
No log 9.2308 480 0.8780 0.4538 0.8780 0.9370
No log 9.2692 482 0.8768 0.4345 0.8768 0.9364
No log 9.3077 484 0.8740 0.4344 0.8740 0.9349
No log 9.3462 486 0.8684 0.4328 0.8684 0.9319
No log 9.3846 488 0.8621 0.4328 0.8621 0.9285
No log 9.4231 490 0.8579 0.4328 0.8579 0.9262
No log 9.4615 492 0.8584 0.4328 0.8584 0.9265
No log 9.5 494 0.8623 0.4380 0.8623 0.9286
No log 9.5385 496 0.8621 0.4380 0.8621 0.9285
No log 9.5769 498 0.8625 0.4619 0.8625 0.9287
0.4013 9.6154 500 0.8649 0.4480 0.8649 0.9300
0.4013 9.6538 502 0.8649 0.4480 0.8649 0.9300
0.4013 9.6923 504 0.8624 0.4480 0.8624 0.9287
0.4013 9.7308 506 0.8582 0.4480 0.8582 0.9264
0.4013 9.7692 508 0.8536 0.4483 0.8536 0.9239
0.4013 9.8077 510 0.8500 0.4483 0.8500 0.9219
0.4013 9.8462 512 0.8468 0.4380 0.8468 0.9202
0.4013 9.8846 514 0.8444 0.4380 0.8444 0.9189
0.4013 9.9231 516 0.8434 0.4631 0.8434 0.9184
0.4013 9.9615 518 0.8430 0.4631 0.8430 0.9182
0.4013 10.0 520 0.8429 0.4631 0.8429 0.9181
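
Two quantities implied by the table, as a back-of-the-envelope sketch (the training-set size is an inference from steps × batch size, not a reported figure):

```python
# 10 epochs span 520 optimizer steps, so 52 steps per epoch;
# with train_batch_size = 8 this implies at most 52 * 8 = 416 training examples.
total_steps, num_epochs, batch_size = 520, 10, 8
steps_per_epoch = total_steps // num_epochs
max_train_examples = steps_per_epoch * batch_size

# The "No log" entries in the Training Loss column likely reflect the
# Trainer's default logging_steps of 500: the running training loss
# (0.4013) first appears at step 500.
```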

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size

  • 0.1B params
  • Tensor type: F32 (Safetensors)

Model tree for MayBashendy/ArabicNewSplits6_FineTuningAraBERT_run1_AugV5_k10_task2_organization

  • Finetuned from aubmindlab/bert-base-arabertv02