ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k5_task2_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9218
  • Qwk: 0.3551
  • Mse: 0.9218
  • Rmse: 0.9601
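
The reported Qwk is the quadratic weighted kappa, and Rmse is the square root of Mse; the fact that Loss and Mse are identical is consistent with an MSE regression objective. A minimal sketch of how these metrics are conventionally computed is shown below; the score arrays are illustrative and not taken from this model's evaluation data.

```python
# Illustrative computation of the reported metrics (not the original eval script).
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

y_true = np.array([3, 2, 4, 1, 3])  # hypothetical gold organization scores
y_pred = np.array([3, 3, 4, 2, 2])  # hypothetical (rounded) model predictions

qwk = cohen_kappa_score(y_true, y_pred, weights="quadratic")  # Qwk
mse = mean_squared_error(y_true, y_pred)                      # Mse
rmse = np.sqrt(mse)                                           # Rmse
print(f"Qwk={qwk:.4f}  Mse={mse:.4f}  Rmse={rmse:.4f}")
```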

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
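
For reproducibility, these settings map roughly onto a `transformers` `TrainingArguments` configuration as in the sketch below; the output directory, evaluation strategy, and logging interval are assumptions inferred from the results table (evaluation every 2 steps, training loss first logged at step 500), not values confirmed by the original training script.

```python
# Hedged sketch of a TrainingArguments setup matching the listed hyperparameters.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="arabert_task2_organization",  # hypothetical path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    eval_strategy="steps",  # assumption: the table below evaluates every 2 steps
    eval_steps=2,
    logging_steps=500,      # assumption: training loss first appears at step 500
)
```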

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0690 2 4.5357 0.0163 4.5357 2.1297
No log 0.1379 4 3.1033 -0.0029 3.1033 1.7616
No log 0.2069 6 1.6664 0.0709 1.6664 1.2909
No log 0.2759 8 1.3077 0.1142 1.3077 1.1436
No log 0.3448 10 1.7946 -0.0351 1.7946 1.3396
No log 0.4138 12 1.3600 0.0404 1.3600 1.1662
No log 0.4828 14 1.1954 0.1076 1.1954 1.0934
No log 0.5517 16 1.1775 0.0904 1.1775 1.0851
No log 0.6207 18 1.2050 0.1144 1.2050 1.0977
No log 0.6897 20 1.2051 0.1814 1.2051 1.0978
No log 0.7586 22 1.2192 0.2785 1.2192 1.1042
No log 0.8276 24 1.2079 0.2439 1.2079 1.0990
No log 0.8966 26 1.1827 0.2984 1.1827 1.0875
No log 0.9655 28 1.1247 0.3635 1.1247 1.0605
No log 1.0345 30 1.1027 0.3635 1.1027 1.0501
No log 1.1034 32 1.0859 0.2439 1.0859 1.0421
No log 1.1724 34 1.0035 0.4289 1.0035 1.0017
No log 1.2414 36 0.9513 0.3877 0.9513 0.9754
No log 1.3103 38 0.9179 0.4138 0.9179 0.9581
No log 1.3793 40 0.9158 0.3431 0.9158 0.9570
No log 1.4483 42 0.9446 0.3298 0.9446 0.9719
No log 1.5172 44 0.9223 0.3317 0.9223 0.9604
No log 1.5862 46 0.8710 0.4284 0.8710 0.9333
No log 1.6552 48 0.8423 0.4654 0.8423 0.9178
No log 1.7241 50 0.8509 0.4974 0.8509 0.9224
No log 1.7931 52 0.8942 0.5548 0.8942 0.9456
No log 1.8621 54 0.8423 0.5740 0.8423 0.9178
No log 1.9310 56 0.7909 0.5846 0.7909 0.8893
No log 2.0 58 0.7858 0.5672 0.7858 0.8865
No log 2.0690 60 0.7870 0.4969 0.7870 0.8871
No log 2.1379 62 0.8440 0.5280 0.8440 0.9187
No log 2.2069 64 0.8238 0.5760 0.8238 0.9076
No log 2.2759 66 0.7663 0.5238 0.7663 0.8754
No log 2.3448 68 0.7599 0.5176 0.7599 0.8717
No log 2.4138 70 0.7499 0.5968 0.7499 0.8659
No log 2.4828 72 0.7131 0.6171 0.7131 0.8445
No log 2.5517 74 0.6732 0.6489 0.6732 0.8205
No log 2.6207 76 0.7034 0.5684 0.7034 0.8387
No log 2.6897 78 0.7172 0.5856 0.7172 0.8469
No log 2.7586 80 0.7006 0.6372 0.7006 0.8370
No log 2.8276 82 0.7408 0.6334 0.7408 0.8607
No log 2.8966 84 0.7495 0.6062 0.7495 0.8657
No log 2.9655 86 0.7600 0.5961 0.7600 0.8718
No log 3.0345 88 0.7948 0.5658 0.7948 0.8915
No log 3.1034 90 0.8621 0.5766 0.8621 0.9285
No log 3.1724 92 0.7995 0.5760 0.7995 0.8942
No log 3.2414 94 0.7655 0.5968 0.7655 0.8750
No log 3.3103 96 0.8238 0.5658 0.8238 0.9077
No log 3.3793 98 0.9401 0.5448 0.9401 0.9696
No log 3.4483 100 0.9484 0.5591 0.9484 0.9739
No log 3.5172 102 0.8205 0.5223 0.8205 0.9058
No log 3.5862 104 0.8143 0.5811 0.8143 0.9024
No log 3.6552 106 0.7548 0.5797 0.7548 0.8688
No log 3.7241 108 0.7188 0.5908 0.7188 0.8478
No log 3.7931 110 0.7695 0.5601 0.7695 0.8772
No log 3.8621 112 0.7174 0.6289 0.7174 0.8470
No log 3.9310 114 0.7038 0.6118 0.7038 0.8389
No log 4.0 116 0.7040 0.5878 0.7040 0.8390
No log 4.0690 118 0.7976 0.5682 0.7976 0.8931
No log 4.1379 120 1.0076 0.5088 1.0076 1.0038
No log 4.2069 122 1.0292 0.5088 1.0292 1.0145
No log 4.2759 124 0.8872 0.5339 0.8872 0.9419
No log 4.3448 126 0.7505 0.5550 0.7505 0.8663
No log 4.4138 128 0.7296 0.6084 0.7296 0.8542
No log 4.4828 130 0.7338 0.5572 0.7338 0.8566
No log 4.5517 132 0.8416 0.5637 0.8416 0.9174
No log 4.6207 134 0.9170 0.5199 0.9170 0.9576
No log 4.6897 136 0.8800 0.5083 0.8800 0.9381
No log 4.7586 138 0.8747 0.4715 0.8747 0.9352
No log 4.8276 140 0.9020 0.4124 0.9020 0.9497
No log 4.8966 142 0.8376 0.5435 0.8376 0.9152
No log 4.9655 144 0.7323 0.5504 0.7323 0.8558
No log 5.0345 146 0.7175 0.5622 0.7175 0.8470
No log 5.1034 148 0.7510 0.5477 0.7510 0.8666
No log 5.1724 150 0.8300 0.4444 0.8300 0.9110
No log 5.2414 152 0.9383 0.4946 0.9383 0.9687
No log 5.3103 154 1.0401 0.4714 1.0401 1.0198
No log 5.3793 156 0.9535 0.5015 0.9535 0.9765
No log 5.4483 158 0.8345 0.4236 0.8345 0.9135
No log 5.5172 160 0.8141 0.4397 0.8141 0.9023
No log 5.5862 162 0.8330 0.5157 0.8330 0.9127
No log 5.6552 164 0.8519 0.5326 0.8519 0.9230
No log 5.7241 166 0.8500 0.4982 0.8500 0.9220
No log 5.7931 168 0.8764 0.5055 0.8764 0.9362
No log 5.8621 170 0.9587 0.5170 0.9587 0.9791
No log 5.9310 172 0.9344 0.5194 0.9344 0.9667
No log 6.0 174 0.7973 0.5908 0.7973 0.8929
No log 6.0690 176 0.8253 0.6025 0.8253 0.9085
No log 6.1379 178 0.8368 0.6025 0.8368 0.9148
No log 6.2069 180 0.7977 0.5916 0.7977 0.8932
No log 6.2759 182 0.9114 0.5448 0.9114 0.9547
No log 6.3448 184 1.0743 0.4477 1.0743 1.0365
No log 6.4138 186 0.9804 0.4714 0.9804 0.9902
No log 6.4828 188 0.8308 0.5029 0.8308 0.9115
No log 6.5517 190 0.7975 0.5336 0.7975 0.8930
No log 6.6207 192 0.8828 0.4874 0.8828 0.9396
No log 6.6897 194 0.8362 0.5505 0.8362 0.9144
No log 6.7586 196 0.8792 0.5236 0.8792 0.9376
No log 6.8276 198 1.0359 0.5144 1.0359 1.0178
No log 6.8966 200 1.0747 0.4565 1.0747 1.0367
No log 6.9655 202 0.9366 0.5080 0.9366 0.9678
No log 7.0345 204 0.7828 0.5169 0.7828 0.8848
No log 7.1034 206 0.7645 0.6272 0.7645 0.8743
No log 7.1724 208 0.8302 0.5934 0.8302 0.9112
No log 7.2414 210 0.8132 0.5633 0.8132 0.9018
No log 7.3103 212 0.8198 0.5204 0.8198 0.9054
No log 7.3793 214 0.8995 0.4972 0.8995 0.9484
No log 7.4483 216 0.9596 0.4948 0.9596 0.9796
No log 7.5172 218 0.9393 0.4615 0.9393 0.9692
No log 7.5862 220 0.9080 0.5185 0.9080 0.9529
No log 7.6552 222 0.8559 0.5041 0.8559 0.9251
No log 7.7241 224 0.8576 0.5041 0.8576 0.9260
No log 7.7931 226 0.8612 0.5291 0.8612 0.9280
No log 7.8621 228 0.8294 0.5086 0.8294 0.9107
No log 7.9310 230 0.7871 0.5202 0.7871 0.8872
No log 8.0 232 0.7547 0.5318 0.7547 0.8688
No log 8.0690 234 0.7821 0.5264 0.7821 0.8844
No log 8.1379 236 0.8063 0.5385 0.8063 0.8980
No log 8.2069 238 0.7782 0.5264 0.7782 0.8821
No log 8.2759 240 0.7341 0.6120 0.7341 0.8568
No log 8.3448 242 0.7227 0.5883 0.7227 0.8501
No log 8.4138 244 0.7989 0.5219 0.7989 0.8938
No log 8.4828 246 0.8276 0.4907 0.8276 0.9097
No log 8.5517 248 0.7727 0.5412 0.7727 0.8790
No log 8.6207 250 0.7750 0.5165 0.7750 0.8804
No log 8.6897 252 0.9046 0.5029 0.9046 0.9511
No log 8.7586 254 1.0099 0.4813 1.0099 1.0049
No log 8.8276 256 0.9596 0.4563 0.9596 0.9796
No log 8.8966 258 0.8425 0.5211 0.8425 0.9179
No log 8.9655 260 0.8127 0.4813 0.8127 0.9015
No log 9.0345 262 0.8120 0.4898 0.8120 0.9011
No log 9.1034 264 0.8213 0.5043 0.8213 0.9062
No log 9.1724 266 0.8848 0.5447 0.8848 0.9406
No log 9.2414 268 0.9197 0.5218 0.9197 0.9590
No log 9.3103 270 0.8817 0.5470 0.8817 0.9390
No log 9.3793 272 0.8541 0.4595 0.8541 0.9242
No log 9.4483 274 0.8561 0.5157 0.8561 0.9252
No log 9.5172 276 0.8809 0.4959 0.8809 0.9386
No log 9.5862 278 0.8561 0.4527 0.8561 0.9253
No log 9.6552 280 0.8761 0.5131 0.8761 0.9360
No log 9.7241 282 0.9288 0.5601 0.9288 0.9637
No log 9.7931 284 0.8857 0.5673 0.8857 0.9411
No log 9.8621 286 0.8238 0.5242 0.8238 0.9076
No log 9.9310 288 0.8090 0.5089 0.8090 0.8995
No log 10.0 290 0.8203 0.5137 0.8203 0.9057
No log 10.0690 292 0.8563 0.4923 0.8563 0.9254
No log 10.1379 294 0.8740 0.5114 0.8740 0.9349
No log 10.2069 296 0.8586 0.5070 0.8586 0.9266
No log 10.2759 298 0.8398 0.5070 0.8398 0.9164
No log 10.3448 300 0.8255 0.5194 0.8255 0.9086
No log 10.4138 302 0.8338 0.5338 0.8338 0.9131
No log 10.4828 304 0.8528 0.5759 0.8528 0.9235
No log 10.5517 306 0.8226 0.5256 0.8226 0.9070
No log 10.6207 308 0.8439 0.5756 0.8439 0.9186
No log 10.6897 310 0.8711 0.5731 0.8711 0.9333
No log 10.7586 312 0.8363 0.5211 0.8363 0.9145
No log 10.8276 314 0.8042 0.5322 0.8042 0.8968
No log 10.8966 316 0.8189 0.5507 0.8189 0.9049
No log 10.9655 318 0.8702 0.4972 0.8702 0.9328
No log 11.0345 320 0.9084 0.5297 0.9084 0.9531
No log 11.1034 322 0.9781 0.5536 0.9781 0.9890
No log 11.1724 324 1.0384 0.5293 1.0384 1.0190
No log 11.2414 326 0.9561 0.5458 0.9561 0.9778
No log 11.3103 328 0.8469 0.5324 0.8469 0.9203
No log 11.3793 330 0.8145 0.5830 0.8145 0.9025
No log 11.4483 332 0.8254 0.5300 0.8254 0.9085
No log 11.5172 334 0.8651 0.4998 0.8651 0.9301
No log 11.5862 336 0.9104 0.4874 0.9104 0.9541
No log 11.6552 338 0.9880 0.4551 0.9880 0.9940
No log 11.7241 340 0.9864 0.4557 0.9864 0.9932
No log 11.7931 342 0.9106 0.5313 0.9106 0.9542
No log 11.8621 344 0.8240 0.5622 0.8240 0.9078
No log 11.9310 346 0.8492 0.4563 0.8492 0.9215
No log 12.0 348 0.8830 0.5072 0.8830 0.9397
No log 12.0690 350 0.8429 0.4294 0.8429 0.9181
No log 12.1379 352 0.8360 0.4852 0.8360 0.9143
No log 12.2069 354 0.9238 0.4938 0.9238 0.9612
No log 12.2759 356 1.0596 0.3992 1.0596 1.0294
No log 12.3448 358 1.0577 0.4697 1.0577 1.0284
No log 12.4138 360 0.9192 0.5385 0.9192 0.9587
No log 12.4828 362 0.7919 0.5983 0.7919 0.8899
No log 12.5517 364 0.7822 0.5846 0.7822 0.8844
No log 12.6207 366 0.8303 0.5837 0.8303 0.9112
No log 12.6897 368 0.8186 0.5618 0.8186 0.9048
No log 12.7586 370 0.7716 0.5940 0.7716 0.8784
No log 12.8276 372 0.8003 0.5385 0.8003 0.8946
No log 12.8966 374 0.9089 0.5577 0.9089 0.9534
No log 12.9655 376 0.9453 0.5140 0.9453 0.9723
No log 13.0345 378 0.8900 0.4986 0.8900 0.9434
No log 13.1034 380 0.8228 0.4587 0.8228 0.9071
No log 13.1724 382 0.8136 0.4726 0.8136 0.9020
No log 13.2414 384 0.8223 0.5590 0.8223 0.9068
No log 13.3103 386 0.8177 0.4726 0.8177 0.9043
No log 13.3793 388 0.8427 0.4328 0.8427 0.9180
No log 13.4483 390 0.9228 0.4565 0.9228 0.9606
No log 13.5172 392 0.9845 0.5016 0.9845 0.9922
No log 13.5862 394 0.9428 0.4986 0.9428 0.9710
No log 13.6552 396 0.8436 0.4454 0.8436 0.9185
No log 13.7241 398 0.8139 0.4726 0.8139 0.9022
No log 13.7931 400 0.8353 0.4953 0.8353 0.9140
No log 13.8621 402 0.8904 0.5430 0.8904 0.9436
No log 13.9310 404 0.8968 0.5541 0.8968 0.9470
No log 14.0 406 0.9018 0.5279 0.9018 0.9496
No log 14.0690 408 0.8781 0.5279 0.8781 0.9371
No log 14.1379 410 0.8536 0.5229 0.8536 0.9239
No log 14.2069 412 0.8505 0.5176 0.8505 0.9222
No log 14.2759 414 0.8338 0.5042 0.8338 0.9131
No log 14.3448 416 0.8280 0.4617 0.8280 0.9100
No log 14.4138 418 0.8395 0.4512 0.8395 0.9162
No log 14.4828 420 0.8820 0.4902 0.8820 0.9392
No log 14.5517 422 0.9216 0.4490 0.9216 0.9600
No log 14.6207 424 1.0099 0.4741 1.0099 1.0049
No log 14.6897 426 1.0565 0.4497 1.0565 1.0279
No log 14.7586 428 0.9967 0.4135 0.9967 0.9984
No log 14.8276 430 0.9202 0.5028 0.9202 0.9593
No log 14.8966 432 0.8318 0.5119 0.8318 0.9121
No log 14.9655 434 0.8163 0.4930 0.8163 0.9035
No log 15.0345 436 0.8288 0.4930 0.8288 0.9104
No log 15.1034 438 0.8453 0.4009 0.8453 0.9194
No log 15.1724 440 0.8932 0.5118 0.8932 0.9451
No log 15.2414 442 0.9093 0.5118 0.9093 0.9536
No log 15.3103 444 0.8740 0.5374 0.8740 0.9349
No log 15.3793 446 0.8484 0.5838 0.8484 0.9211
No log 15.4483 448 0.8196 0.4575 0.8196 0.9053
No log 15.5172 450 0.8182 0.4575 0.8182 0.9045
No log 15.5862 452 0.8351 0.5408 0.8351 0.9139
No log 15.6552 454 0.9182 0.5759 0.9182 0.9582
No log 15.7241 456 1.0917 0.3963 1.0917 1.0449
No log 15.7931 458 1.1476 0.3490 1.1476 1.0712
No log 15.8621 460 1.0584 0.4407 1.0584 1.0288
No log 15.9310 462 0.9106 0.5451 0.9106 0.9543
No log 16.0 464 0.8667 0.5571 0.8667 0.9309
No log 16.0690 466 0.8601 0.5176 0.8601 0.9274
No log 16.1379 468 0.8673 0.5662 0.8673 0.9313
No log 16.2069 470 0.8771 0.5844 0.8771 0.9366
No log 16.2759 472 0.8808 0.5815 0.8808 0.9385
No log 16.3448 474 0.8371 0.5844 0.8371 0.9149
No log 16.4138 476 0.8322 0.5352 0.8322 0.9123
No log 16.4828 478 0.8508 0.5585 0.8508 0.9224
No log 16.5517 480 0.8382 0.4512 0.8382 0.9156
No log 16.6207 482 0.8555 0.4242 0.8555 0.9249
No log 16.6897 484 0.9130 0.4631 0.9130 0.9555
No log 16.7586 486 0.9356 0.4291 0.9356 0.9673
No log 16.8276 488 0.9275 0.4540 0.9275 0.9631
No log 16.8966 490 0.9097 0.4366 0.9097 0.9538
No log 16.9655 492 0.8920 0.5043 0.8920 0.9445
No log 17.0345 494 0.8980 0.5148 0.8980 0.9476
No log 17.1034 496 0.9218 0.5114 0.9218 0.9601
No log 17.1724 498 0.9386 0.5202 0.9386 0.9688
0.3227 17.2414 500 0.9386 0.4449 0.9386 0.9688
0.3227 17.3103 502 0.9391 0.4297 0.9391 0.9690
0.3227 17.3793 504 0.9431 0.4262 0.9431 0.9711
0.3227 17.4483 506 0.9369 0.3401 0.9369 0.9679
0.3227 17.5172 508 0.9365 0.3401 0.9365 0.9677
0.3227 17.5862 510 0.9218 0.3551 0.9218 0.9601

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
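
A minimal usage sketch for loading this checkpoint is shown below. It assumes the model exposes a standard sequence-classification head with a single regression output (the identical Loss and Mse values above suggest an MSE objective); verify the head configuration before relying on the raw score.

```python
# Hedged usage sketch; the single-output regression head is an assumption.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "MayBashendy/ArabicNewSplits7_OSS_usingWellWrittenEssays_FineTuningAraBERT_run3_AugV5_k5_task2_organization"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)

inputs = tokenizer("نص مقال عربي لتقييم التنظيم", return_tensors="pt")  # illustrative Arabic input
with torch.no_grad():
    score = model(**inputs).logits.squeeze().item()  # organization score estimate
print(score)
```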