ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k4_task5_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.6411
  • QWK: 0.5891
  • MSE: 0.6411
  • RMSE: 0.8007
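The reported QWK is Cohen's kappa with quadratic weights, a standard agreement metric for ordinal scoring tasks such as essay grading; MSE and RMSE are the usual squared-error metrics (note that 0.8007 ≈ √0.6411). As a reference, here is a minimal pure-Python sketch of the three metrics; the function names and the integer-label assumption are illustrative, not taken from the training code:

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, num_classes):
    """Cohen's kappa with quadratic weights (the QWK column), for integer labels 0..num_classes-1."""
    n = len(y_true)
    # Observed confusion counts
    O = [[0.0] * num_classes for _ in range(num_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    # Marginal histograms, used for the expected (chance) counts
    hist_t = [sum(1 for t in y_true if t == i) for i in range(num_classes)]
    hist_p = [sum(1 for p in y_pred if p == i) for i in range(num_classes)]
    num = den = 0.0
    for i in range(num_classes):
        for j in range(num_classes):
            w = (i - j) ** 2 / (num_classes - 1) ** 2  # quadratic disagreement weight
            expected = hist_t[i] * hist_p[j] / n       # chance-level count for cell (i, j)
            num += w * O[i][j]
            den += w * expected
    return 1.0 - num / den

def mse(y_true, y_pred):
    """Mean squared error between predicted and gold scores."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def rmse(y_true, y_pred):
    """Root mean squared error: simply the square root of MSE."""
    return math.sqrt(mse(y_true, y_pred))
```

With scikit-learn available, `cohen_kappa_score(y_true, y_pred, weights="quadratic")` computes the same QWK value.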

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
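From the results table below, the epoch counter advances by 0.1333 every 2 steps, i.e. 15 optimizer steps per epoch, so 100 epochs correspond to roughly 1,500 training steps. A minimal sketch of the `linear` scheduler's learning-rate curve under that derived step count (warmup steps are not listed in this card, so 0 is assumed):

```python
def linear_lr(step, base_lr=2e-05, total_steps=1500, warmup_steps=0):
    """Transformers-style linear schedule: ramp up over warmup_steps, then decay linearly to 0."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))
```

In a Transformers training run this curve is what `lr_scheduler_type="linear"` produces: the learning rate starts at 2e-05 and reaches 0 at the final step.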

Training results

Training Loss | Epoch | Step | Validation Loss | QWK | MSE | RMSE
("No log" indicates the training loss had not yet been recorded; the first logged value, 0.2408, appears at step 500.)
No log 0.1333 2 4.0760 -0.0323 4.0760 2.0189
No log 0.2667 4 2.2768 0.0705 2.2768 1.5089
No log 0.4 6 1.6016 -0.0180 1.6016 1.2655
No log 0.5333 8 1.6261 0.0532 1.6261 1.2752
No log 0.6667 10 1.9060 0.0535 1.9060 1.3806
No log 0.8 12 2.1386 0.1065 2.1386 1.4624
No log 0.9333 14 1.5975 0.0408 1.5975 1.2639
No log 1.0667 16 1.3989 -0.0032 1.3989 1.1827
No log 1.2 18 1.1537 0.1028 1.1537 1.0741
No log 1.3333 20 1.0718 0.1725 1.0718 1.0353
No log 1.4667 22 1.1341 0.0523 1.1341 1.0650
No log 1.6 24 1.3075 0.0883 1.3075 1.1435
No log 1.7333 26 1.5539 0.1339 1.5539 1.2465
No log 1.8667 28 1.6367 0.1843 1.6367 1.2793
No log 2.0 30 1.3398 0.1612 1.3398 1.1575
No log 2.1333 32 1.1341 0.2547 1.1341 1.0649
No log 2.2667 34 1.3812 0.2962 1.3812 1.1752
No log 2.4 36 1.4182 0.3323 1.4182 1.1909
No log 2.5333 38 0.9954 0.3697 0.9954 0.9977
No log 2.6667 40 0.8495 0.2967 0.8495 0.9217
No log 2.8 42 0.9741 0.3027 0.9741 0.9870
No log 2.9333 44 0.8568 0.3647 0.8568 0.9257
No log 3.0667 46 1.1190 0.3478 1.1190 1.0578
No log 3.2 48 1.4883 0.3096 1.4883 1.2200
No log 3.3333 50 1.3317 0.3617 1.3317 1.1540
No log 3.4667 52 1.0212 0.3203 1.0212 1.0106
No log 3.6 54 0.8089 0.4936 0.8089 0.8994
No log 3.7333 56 0.8214 0.4159 0.8214 0.9063
No log 3.8667 58 0.8306 0.4832 0.8306 0.9114
No log 4.0 60 0.7586 0.4922 0.7586 0.8710
No log 4.1333 62 1.0410 0.5272 1.0410 1.0203
No log 4.2667 64 1.2692 0.3929 1.2692 1.1266
No log 4.4 66 1.0733 0.4994 1.0733 1.0360
No log 4.5333 68 0.8097 0.5336 0.8097 0.8998
No log 4.6667 70 0.7636 0.4118 0.7636 0.8738
No log 4.8 72 0.8359 0.4792 0.8359 0.9143
No log 4.9333 74 0.9638 0.5272 0.9638 0.9818
No log 5.0667 76 1.1012 0.4107 1.1012 1.0494
No log 5.2 78 1.2320 0.3902 1.2320 1.1100
No log 5.3333 80 0.9282 0.4738 0.9282 0.9634
No log 5.4667 82 0.7437 0.5274 0.7437 0.8624
No log 5.6 84 0.7529 0.5345 0.7529 0.8677
No log 5.7333 86 0.8054 0.4710 0.8054 0.8975
No log 5.8667 88 0.7585 0.4754 0.7585 0.8709
No log 6.0 90 0.7574 0.5315 0.7574 0.8703
No log 6.1333 92 0.8130 0.4998 0.8130 0.9016
No log 6.2667 94 0.8073 0.4861 0.8073 0.8985
No log 6.4 96 0.8246 0.5390 0.8246 0.9081
No log 6.5333 98 0.8478 0.5363 0.8478 0.9208
No log 6.6667 100 0.8655 0.5363 0.8655 0.9303
No log 6.8 102 0.8201 0.5401 0.8201 0.9056
No log 6.9333 104 0.7944 0.5545 0.7944 0.8913
No log 7.0667 106 0.8176 0.5627 0.8176 0.9042
No log 7.2 108 0.8133 0.5131 0.8133 0.9018
No log 7.3333 110 0.8508 0.5257 0.8508 0.9224
No log 7.4667 112 0.8781 0.4502 0.8781 0.9371
No log 7.6 114 0.9161 0.5033 0.9161 0.9572
No log 7.7333 116 0.9983 0.5841 0.9983 0.9991
No log 7.8667 118 1.0011 0.4602 1.0011 1.0006
No log 8.0 120 1.0511 0.4629 1.0511 1.0252
No log 8.1333 122 1.0276 0.4992 1.0276 1.0137
No log 8.2667 124 1.0063 0.4848 1.0063 1.0032
No log 8.4 126 0.9935 0.4584 0.9935 0.9968
No log 8.5333 128 0.9570 0.4383 0.9570 0.9782
No log 8.6667 130 0.8919 0.5025 0.8919 0.9444
No log 8.8 132 0.8838 0.5059 0.8838 0.9401
No log 8.9333 134 0.8786 0.5246 0.8786 0.9374
No log 9.0667 136 0.8513 0.5451 0.8513 0.9226
No log 9.2 138 0.8438 0.5508 0.8438 0.9186
No log 9.3333 140 0.8475 0.5675 0.8475 0.9206
No log 9.4667 142 0.8834 0.6082 0.8834 0.9399
No log 9.6 144 0.8686 0.5679 0.8686 0.9320
No log 9.7333 146 0.8132 0.5919 0.8132 0.9018
No log 9.8667 148 0.7898 0.5224 0.7898 0.8887
No log 10.0 150 0.7920 0.5374 0.7920 0.8900
No log 10.1333 152 0.7793 0.5263 0.7793 0.8828
No log 10.2667 154 0.8016 0.5275 0.8016 0.8953
No log 10.4 156 0.9663 0.5243 0.9663 0.9830
No log 10.5333 158 0.9622 0.5243 0.9622 0.9809
No log 10.6667 160 0.8372 0.5298 0.8372 0.9150
No log 10.8 162 0.7793 0.5600 0.7793 0.8828
No log 10.9333 164 0.7873 0.5796 0.7873 0.8873
No log 11.0667 166 0.7606 0.5742 0.7606 0.8721
No log 11.2 168 0.7194 0.6001 0.7194 0.8482
No log 11.3333 170 0.7043 0.5638 0.7043 0.8392
No log 11.4667 172 0.6960 0.5771 0.6960 0.8342
No log 11.6 174 0.7220 0.5427 0.7220 0.8497
No log 11.7333 176 0.7039 0.5548 0.7039 0.8390
No log 11.8667 178 0.6669 0.6230 0.6669 0.8167
No log 12.0 180 0.6557 0.6311 0.6557 0.8097
No log 12.1333 182 0.6600 0.6374 0.6600 0.8124
No log 12.2667 184 0.7073 0.5404 0.7073 0.8410
No log 12.4 186 0.6639 0.6246 0.6639 0.8148
No log 12.5333 188 0.7037 0.6511 0.7037 0.8389
No log 12.6667 190 0.7789 0.6459 0.7789 0.8825
No log 12.8 192 0.7010 0.5832 0.7010 0.8373
No log 12.9333 194 0.6687 0.5736 0.6687 0.8177
No log 13.0667 196 0.8852 0.5182 0.8852 0.9409
No log 13.2 198 0.9364 0.5295 0.9364 0.9677
No log 13.3333 200 0.7974 0.4922 0.7974 0.8930
No log 13.4667 202 0.6790 0.5735 0.6790 0.8240
No log 13.6 204 0.7517 0.5498 0.7517 0.8670
No log 13.7333 206 0.8241 0.5560 0.8241 0.9078
No log 13.8667 208 0.7425 0.6293 0.7425 0.8617
No log 14.0 210 0.7801 0.5823 0.7801 0.8833
No log 14.1333 212 0.8842 0.5384 0.8842 0.9403
No log 14.2667 214 0.8284 0.5384 0.8284 0.9102
No log 14.4 216 0.7341 0.5654 0.7341 0.8568
No log 14.5333 218 0.6761 0.5921 0.6761 0.8223
No log 14.6667 220 0.6791 0.6055 0.6791 0.8241
No log 14.8 222 0.6986 0.5949 0.6986 0.8358
No log 14.9333 224 0.7376 0.5459 0.7376 0.8588
No log 15.0667 226 0.7264 0.5774 0.7264 0.8523
No log 15.2 228 0.6867 0.6154 0.6867 0.8287
No log 15.3333 230 0.6838 0.6076 0.6838 0.8269
No log 15.4667 232 0.6834 0.6498 0.6834 0.8267
No log 15.6 234 0.6943 0.6187 0.6943 0.8332
No log 15.7333 236 0.7636 0.5279 0.7636 0.8738
No log 15.8667 238 0.8158 0.4836 0.8158 0.9032
No log 16.0 240 0.7740 0.5489 0.7740 0.8797
No log 16.1333 242 0.7419 0.5684 0.7419 0.8613
No log 16.2667 244 0.8096 0.5207 0.8096 0.8998
No log 16.4 246 0.7947 0.5483 0.7947 0.8914
No log 16.5333 248 0.7560 0.5264 0.7560 0.8695
No log 16.6667 250 0.7723 0.5178 0.7723 0.8788
No log 16.8 252 0.9080 0.4722 0.9080 0.9529
No log 16.9333 254 0.9118 0.4492 0.9118 0.9549
No log 17.0667 256 0.8163 0.5173 0.8163 0.9035
No log 17.2 258 0.7230 0.5585 0.7230 0.8503
No log 17.3333 260 0.7011 0.5845 0.7011 0.8373
No log 17.4667 262 0.6985 0.5735 0.6985 0.8358
No log 17.6 264 0.6845 0.5368 0.6845 0.8274
No log 17.7333 266 0.6963 0.5959 0.6963 0.8345
No log 17.8667 268 0.7653 0.5383 0.7653 0.8748
No log 18.0 270 0.8114 0.4938 0.8114 0.9008
No log 18.1333 272 0.7895 0.5383 0.7895 0.8886
No log 18.2667 274 0.7264 0.5173 0.7264 0.8523
No log 18.4 276 0.6928 0.5847 0.6928 0.8323
No log 18.5333 278 0.7065 0.5274 0.7065 0.8405
No log 18.6667 280 0.7146 0.5060 0.7146 0.8453
No log 18.8 282 0.7076 0.5274 0.7076 0.8412
No log 18.9333 284 0.7026 0.5249 0.7026 0.8382
No log 19.0667 286 0.7086 0.5142 0.7086 0.8418
No log 19.2 288 0.7251 0.5364 0.7251 0.8515
No log 19.3333 290 0.7406 0.5795 0.7406 0.8606
No log 19.4667 292 0.7202 0.5364 0.7202 0.8487
No log 19.6 294 0.7040 0.5475 0.7040 0.8391
No log 19.7333 296 0.7039 0.5129 0.7039 0.8390
No log 19.8667 298 0.6833 0.5475 0.6833 0.8266
No log 20.0 300 0.6798 0.5923 0.6798 0.8245
No log 20.1333 302 0.6650 0.5594 0.6650 0.8155
No log 20.2667 304 0.7076 0.4974 0.7076 0.8412
No log 20.4 306 0.7633 0.5279 0.7633 0.8736
No log 20.5333 308 0.7390 0.5173 0.7390 0.8596
No log 20.6667 310 0.6553 0.5450 0.6553 0.8095
No log 20.8 312 0.6455 0.6488 0.6455 0.8034
No log 20.9333 314 0.6309 0.6154 0.6309 0.7943
No log 21.0667 316 0.6352 0.6291 0.6352 0.7970
No log 21.2 318 0.7366 0.5163 0.7366 0.8583
No log 21.3333 320 0.7568 0.5266 0.7568 0.8699
No log 21.4667 322 0.7071 0.5279 0.7071 0.8409
No log 21.6 324 0.6719 0.5585 0.6719 0.8197
No log 21.7333 326 0.6315 0.5960 0.6315 0.7947
No log 21.8667 328 0.6059 0.6025 0.6059 0.7784
No log 22.0 330 0.5988 0.6491 0.5988 0.7738
No log 22.1333 332 0.6069 0.6347 0.6069 0.7791
No log 22.2667 334 0.6291 0.6446 0.6291 0.7931
No log 22.4 336 0.6303 0.6446 0.6303 0.7939
No log 22.5333 338 0.6482 0.6073 0.6482 0.8051
No log 22.6667 340 0.6724 0.6073 0.6724 0.8200
No log 22.8 342 0.6963 0.5558 0.6963 0.8345
No log 22.9333 344 0.7235 0.5605 0.7235 0.8506
No log 23.0667 346 0.7434 0.5103 0.7434 0.8622
No log 23.2 348 0.7497 0.5516 0.7497 0.8658
No log 23.3333 350 0.7339 0.5858 0.7339 0.8567
No log 23.4667 352 0.7058 0.5585 0.7058 0.8401
No log 23.6 354 0.6842 0.5261 0.6842 0.8272
No log 23.7333 356 0.6785 0.5396 0.6785 0.8237
No log 23.8667 358 0.6691 0.5614 0.6691 0.8180
No log 24.0 360 0.6733 0.5485 0.6733 0.8205
No log 24.1333 362 0.6801 0.6014 0.6801 0.8247
No log 24.2667 364 0.7146 0.5729 0.7146 0.8453
No log 24.4 366 0.7238 0.5729 0.7238 0.8507
No log 24.5333 368 0.6828 0.5740 0.6828 0.8263
No log 24.6667 370 0.6510 0.6143 0.6510 0.8069
No log 24.8 372 0.6575 0.6143 0.6575 0.8109
No log 24.9333 374 0.6845 0.5986 0.6845 0.8273
No log 25.0667 376 0.7155 0.6092 0.7155 0.8459
No log 25.2 378 0.7561 0.5622 0.7561 0.8695
No log 25.3333 380 0.7806 0.5591 0.7806 0.8835
No log 25.4667 382 0.7439 0.5504 0.7439 0.8625
No log 25.6 384 0.6852 0.5688 0.6852 0.8278
No log 25.7333 386 0.6678 0.5905 0.6678 0.8172
No log 25.8667 388 0.6734 0.5905 0.6734 0.8206
No log 26.0 390 0.6913 0.5740 0.6913 0.8315
No log 26.1333 392 0.7289 0.5266 0.7289 0.8538
No log 26.2667 394 0.8116 0.5475 0.8116 0.9009
No log 26.4 396 0.7909 0.5591 0.7909 0.8893
No log 26.5333 398 0.6971 0.5498 0.6971 0.8350
No log 26.6667 400 0.6132 0.6360 0.6132 0.7831
No log 26.8 402 0.6070 0.5831 0.6070 0.7791
No log 26.9333 404 0.6085 0.5833 0.6085 0.7801
No log 27.0667 406 0.6201 0.6032 0.6201 0.7875
No log 27.2 408 0.6327 0.5774 0.6327 0.7954
No log 27.3333 410 0.6523 0.5751 0.6523 0.8077
No log 27.4667 412 0.6657 0.5516 0.6657 0.8159
No log 27.6 414 0.6452 0.5855 0.6452 0.8032
No log 27.7333 416 0.6329 0.5763 0.6329 0.7956
No log 27.8667 418 0.6381 0.6237 0.6381 0.7988
No log 28.0 420 0.6460 0.6215 0.6460 0.8038
No log 28.1333 422 0.6490 0.6186 0.6490 0.8056
No log 28.2667 424 0.6386 0.6284 0.6386 0.7991
No log 28.4 426 0.6360 0.6389 0.6360 0.7975
No log 28.5333 428 0.6415 0.5969 0.6415 0.8009
No log 28.6667 430 0.6654 0.5634 0.6654 0.8157
No log 28.8 432 0.6467 0.5645 0.6467 0.8042
No log 28.9333 434 0.6226 0.6219 0.6226 0.7891
No log 29.0667 436 0.6193 0.6014 0.6193 0.7869
No log 29.2 438 0.6604 0.5634 0.6604 0.8126
No log 29.3333 440 0.6924 0.5516 0.6924 0.8321
No log 29.4667 442 0.7176 0.5622 0.7176 0.8471
No log 29.6 444 0.7135 0.5622 0.7135 0.8447
No log 29.7333 446 0.6655 0.5634 0.6655 0.8158
No log 29.8667 448 0.6424 0.5645 0.6424 0.8015
No log 30.0 450 0.6228 0.5863 0.6228 0.7892
No log 30.1333 452 0.6171 0.5887 0.6171 0.7856
No log 30.2667 454 0.6179 0.5964 0.6179 0.7861
No log 30.4 456 0.6475 0.5516 0.6475 0.8047
No log 30.5333 458 0.7406 0.5622 0.7406 0.8606
No log 30.6667 460 0.8610 0.5458 0.8610 0.9279
No log 30.8 462 0.9250 0.5208 0.9250 0.9618
No log 30.9333 464 0.9091 0.5106 0.9091 0.9535
No log 31.0667 466 0.8227 0.5147 0.8227 0.9071
No log 31.2 468 0.7087 0.5622 0.7087 0.8418
No log 31.3333 470 0.6599 0.5546 0.6599 0.8123
No log 31.4667 472 0.6444 0.5455 0.6444 0.8028
No log 31.6 474 0.6467 0.5455 0.6467 0.8042
No log 31.7333 476 0.6683 0.5855 0.6683 0.8175
No log 31.8667 478 0.7170 0.5410 0.7170 0.8468
No log 32.0 480 0.7389 0.5516 0.7389 0.8596
No log 32.1333 482 0.7090 0.5528 0.7090 0.8420
No log 32.2667 484 0.6800 0.5678 0.6800 0.8246
No log 32.4 486 0.6779 0.5480 0.6779 0.8234
No log 32.5333 488 0.6814 0.5932 0.6814 0.8255
No log 32.6667 490 0.6927 0.5798 0.6927 0.8323
No log 32.8 492 0.6886 0.5798 0.6886 0.8298
No log 32.9333 494 0.7018 0.5528 0.7018 0.8378
No log 33.0667 496 0.6977 0.5528 0.6977 0.8353
No log 33.2 498 0.6912 0.5528 0.6912 0.8314
0.2408 33.3333 500 0.6704 0.6043 0.6704 0.8188
0.2408 33.4667 502 0.6636 0.5669 0.6636 0.8146
0.2408 33.6 504 0.6554 0.5902 0.6554 0.8095
0.2408 33.7333 506 0.6493 0.5887 0.6493 0.8058
0.2408 33.8667 508 0.6446 0.6028 0.6446 0.8029
0.2408 34.0 510 0.6411 0.5891 0.6411 0.8007

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1