ArabicNewSplits7_usingALLEssays_FineTuningAraBERT_run1_AugV5_k8_task7_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset (the card's dataset field was left unset). It achieves the following results on the evaluation set:

  • Loss: 0.5296
  • Qwk (Quadratic Weighted Kappa): 0.4997
  • Mse (mean squared error): 0.5296
  • Rmse (root mean squared error): 0.7277
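
The Qwk, Mse, and Rmse values above can be reproduced with standard scikit-learn metrics; a minimal sketch (the function name is illustrative and not from the published training script):

```python
import numpy as np
from sklearn.metrics import cohen_kappa_score, mean_squared_error

def eval_metrics(y_true, y_pred):
    """Return the metrics reported above for gold scores vs. predictions."""
    mse = mean_squared_error(y_true, y_pred)
    # QWK (Quadratic Weighted Kappa) is defined on integer labels, so
    # continuous predictions are rounded to the nearest score first.
    qwk = cohen_kappa_score(
        np.rint(y_true).astype(int),
        np.rint(y_pred).astype(int),
        weights="quadratic",
    )
    return {"qwk": qwk, "mse": mse, "rmse": float(np.sqrt(mse))}
```

Note that when the model's regression outputs equal the gold scores exactly, QWK is 1.0 and both error metrics are 0.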

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
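
The listed settings map onto `transformers.TrainingArguments` roughly as follows. This is a sketch, not the actual training script (which is not published with this card); the output directory is a placeholder, and the Adam betas/epsilon shown are the Transformers defaults, matching the values above.

```python
# Sketch of the hyperparameters above as TrainingArguments keyword
# arguments (Transformers 4.44 API); output_dir is a placeholder.
training_kwargs = dict(
    output_dir="arabert_task7_organization",  # placeholder path
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    adam_beta1=0.9,     # optimizer: Adam with betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,  # and epsilon=1e-08
)
# Usage: transformers.TrainingArguments(**training_kwargs)
```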

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0690 2 2.6698 0.0070 2.6698 1.6339
No log 0.1379 4 1.4086 -0.0092 1.4086 1.1869
No log 0.2069 6 0.8465 0.0535 0.8465 0.9201
No log 0.2759 8 0.9350 0.0053 0.9350 0.9669
No log 0.3448 10 1.1076 -0.0173 1.1076 1.0524
No log 0.4138 12 0.8806 0.0 0.8806 0.9384
No log 0.4828 14 0.7868 0.0 0.7868 0.8870
No log 0.5517 16 0.8409 0.0 0.8409 0.9170
No log 0.6207 18 0.9885 0.0509 0.9885 0.9942
No log 0.6897 20 0.9870 0.0509 0.9870 0.9935
No log 0.7586 22 0.9603 0.0053 0.9603 0.9799
No log 0.8276 24 0.8383 0.0 0.8383 0.9156
No log 0.8966 26 0.8032 0.0 0.8032 0.8962
No log 0.9655 28 0.8257 0.0 0.8257 0.9087
No log 1.0345 30 0.8871 0.0 0.8871 0.9418
No log 1.1034 32 0.9202 0.0481 0.9202 0.9593
No log 1.1724 34 0.9055 0.0 0.9055 0.9516
No log 1.2414 36 0.8905 0.0 0.8905 0.9437
No log 1.3103 38 0.8168 0.0 0.8168 0.9038
No log 1.3793 40 0.7317 -0.0027 0.7317 0.8554
No log 1.4483 42 0.6554 0.2270 0.6554 0.8095
No log 1.5172 44 0.6321 0.1561 0.6321 0.7951
No log 1.5862 46 0.6793 0.1327 0.6793 0.8242
No log 1.6552 48 0.7921 0.2526 0.7921 0.8900
No log 1.7241 50 0.9480 0.2613 0.9480 0.9737
No log 1.7931 52 0.9377 0.2703 0.9377 0.9683
No log 1.8621 54 0.7791 0.1770 0.7791 0.8827
No log 1.9310 56 0.7389 0.0940 0.7389 0.8596
No log 2.0 58 0.7002 0.0764 0.7002 0.8368
No log 2.0690 60 0.6914 0.2606 0.6914 0.8315
No log 2.1379 62 0.6734 0.1903 0.6734 0.8206
No log 2.2069 64 0.7441 0.0027 0.7441 0.8626
No log 2.2759 66 0.9222 0.2736 0.9222 0.9603
No log 2.3448 68 0.9832 0.2642 0.9832 0.9916
No log 2.4138 70 0.8676 0.1339 0.8676 0.9314
No log 2.4828 72 0.7495 0.0 0.7495 0.8657
No log 2.5517 74 0.6941 0.1400 0.6941 0.8331
No log 2.6207 76 0.6289 0.2540 0.6289 0.7930
No log 2.6897 78 0.6366 0.2884 0.6366 0.7979
No log 2.7586 80 0.7122 0.3118 0.7122 0.8439
No log 2.8276 82 0.7488 0.2991 0.7488 0.8653
No log 2.8966 84 0.7697 0.2574 0.7697 0.8773
No log 2.9655 86 0.8165 0.2857 0.8165 0.9036
No log 3.0345 88 0.8040 0.2939 0.8040 0.8967
No log 3.1034 90 0.6920 0.2239 0.6920 0.8319
No log 3.1724 92 0.6496 0.3153 0.6496 0.8060
No log 3.2414 94 0.6449 0.2787 0.6449 0.8031
No log 3.3103 96 0.6399 0.2787 0.6399 0.7999
No log 3.3793 98 0.6951 0.3745 0.6951 0.8337
No log 3.4483 100 0.8504 0.2369 0.8504 0.9222
No log 3.5172 102 0.8060 0.3312 0.8060 0.8978
No log 3.5862 104 0.6552 0.5208 0.6552 0.8094
No log 3.6552 106 0.6087 0.3407 0.6087 0.7802
No log 3.7241 108 0.5992 0.3608 0.5992 0.7741
No log 3.7931 110 0.6106 0.3942 0.6106 0.7814
No log 3.8621 112 0.6234 0.4698 0.6234 0.7896
No log 3.9310 114 0.6687 0.4412 0.6687 0.8177
No log 4.0 116 0.6475 0.4719 0.6475 0.8047
No log 4.0690 118 0.6062 0.3366 0.6062 0.7786
No log 4.1379 120 0.5646 0.2923 0.5646 0.7514
No log 4.2069 122 0.5668 0.3445 0.5668 0.7529
No log 4.2759 124 0.5446 0.3445 0.5446 0.7380
No log 4.3448 126 0.5495 0.5449 0.5495 0.7413
No log 4.4138 128 0.5449 0.5332 0.5449 0.7382
No log 4.4828 130 0.5961 0.5046 0.5961 0.7721
No log 4.5517 132 0.5437 0.6141 0.5437 0.7374
No log 4.6207 134 0.5061 0.5583 0.5061 0.7114
No log 4.6897 136 0.5197 0.5395 0.5197 0.7209
No log 4.7586 138 0.5218 0.5015 0.5218 0.7223
No log 4.8276 140 0.5130 0.5379 0.5130 0.7163
No log 4.8966 142 0.5486 0.6027 0.5486 0.7407
No log 4.9655 144 0.5079 0.5475 0.5079 0.7127
No log 5.0345 146 0.5016 0.5641 0.5016 0.7083
No log 5.1034 148 0.5062 0.5424 0.5062 0.7115
No log 5.1724 150 0.5754 0.6124 0.5754 0.7585
No log 5.2414 152 0.6872 0.3417 0.6872 0.8290
No log 5.3103 154 0.6121 0.5247 0.6121 0.7824
No log 5.3793 156 0.4951 0.5489 0.4951 0.7037
No log 5.4483 158 0.5287 0.5639 0.5287 0.7271
No log 5.5172 160 0.5213 0.5111 0.5213 0.7220
No log 5.5862 162 0.5253 0.5367 0.5253 0.7248
No log 5.6552 164 0.5916 0.4819 0.5916 0.7691
No log 5.7241 166 0.5650 0.4997 0.5650 0.7517
No log 5.7931 168 0.5299 0.5485 0.5299 0.7279
No log 5.8621 170 0.5208 0.5071 0.5208 0.7217
No log 5.9310 172 0.5239 0.5753 0.5239 0.7238
No log 6.0 174 0.5274 0.5797 0.5274 0.7263
No log 6.0690 176 0.5140 0.5306 0.5140 0.7169
No log 6.1379 178 0.5301 0.5601 0.5301 0.7281
No log 6.2069 180 0.5053 0.5053 0.5053 0.7109
No log 6.2759 182 0.5258 0.6475 0.5258 0.7251
No log 6.3448 184 0.5574 0.5682 0.5574 0.7466
No log 6.4138 186 0.5489 0.5283 0.5489 0.7409
No log 6.4828 188 0.5211 0.5710 0.5211 0.7219
No log 6.5517 190 0.5282 0.5348 0.5282 0.7267
No log 6.6207 192 0.5445 0.5521 0.5445 0.7379
No log 6.6897 194 0.5495 0.5003 0.5495 0.7413
No log 6.7586 196 0.5627 0.4438 0.5627 0.7502
No log 6.8276 198 0.5909 0.4715 0.5909 0.7687
No log 6.8966 200 0.5663 0.4322 0.5663 0.7525
No log 6.9655 202 0.5733 0.4547 0.5733 0.7572
No log 7.0345 204 0.5922 0.4875 0.5922 0.7695
No log 7.1034 206 0.6002 0.4948 0.6002 0.7747
No log 7.1724 208 0.5489 0.4866 0.5489 0.7409
No log 7.2414 210 0.6062 0.4653 0.6062 0.7786
No log 7.3103 212 0.7187 0.3522 0.7187 0.8478
No log 7.3793 214 0.6351 0.4489 0.6351 0.7969
No log 7.4483 216 0.5834 0.4139 0.5834 0.7638
No log 7.5172 218 0.5453 0.4677 0.5453 0.7384
No log 7.5862 220 0.5442 0.4538 0.5442 0.7377
No log 7.6552 222 0.5449 0.3976 0.5449 0.7382
No log 7.7241 224 0.5254 0.4420 0.5254 0.7248
No log 7.7931 226 0.5080 0.4983 0.5080 0.7127
No log 7.8621 228 0.4932 0.5765 0.4932 0.7023
No log 7.9310 230 0.5498 0.5083 0.5498 0.7415
No log 8.0 232 0.5672 0.4482 0.5672 0.7531
No log 8.0690 234 0.4808 0.5633 0.4808 0.6934
No log 8.1379 236 0.4706 0.5736 0.4706 0.6860
No log 8.2069 238 0.4777 0.6029 0.4777 0.6912
No log 8.2759 240 0.4608 0.5732 0.4608 0.6788
No log 8.3448 242 0.4992 0.6308 0.4992 0.7065
No log 8.4138 244 0.5851 0.4501 0.5851 0.7649
No log 8.4828 246 0.5326 0.6074 0.5326 0.7298
No log 8.5517 248 0.4626 0.5930 0.4626 0.6802
No log 8.6207 250 0.4646 0.6264 0.4646 0.6816
No log 8.6897 252 0.4635 0.6303 0.4635 0.6808
No log 8.7586 254 0.4647 0.5404 0.4647 0.6817
No log 8.8276 256 0.4728 0.6068 0.4728 0.6876
No log 8.8966 258 0.4694 0.5782 0.4694 0.6852
No log 8.9655 260 0.4737 0.5286 0.4737 0.6883
No log 9.0345 262 0.4782 0.5189 0.4782 0.6915
No log 9.1034 264 0.4807 0.5367 0.4807 0.6933
No log 9.1724 266 0.4854 0.5836 0.4854 0.6967
No log 9.2414 268 0.4836 0.5252 0.4836 0.6954
No log 9.3103 270 0.5542 0.5621 0.5542 0.7445
No log 9.3793 272 0.6055 0.5373 0.6055 0.7781
No log 9.4483 274 0.5578 0.6004 0.5578 0.7469
No log 9.5172 276 0.5230 0.5662 0.5230 0.7232
No log 9.5862 278 0.5696 0.4920 0.5696 0.7547
No log 9.6552 280 0.5503 0.4703 0.5503 0.7418
No log 9.7241 282 0.5143 0.5472 0.5143 0.7172
No log 9.7931 284 0.5336 0.5714 0.5336 0.7305
No log 9.8621 286 0.5432 0.5555 0.5432 0.7370
No log 9.9310 288 0.5246 0.5475 0.5246 0.7243
No log 10.0 290 0.5253 0.5472 0.5253 0.7248
No log 10.0690 292 0.5338 0.5472 0.5338 0.7306
No log 10.1379 294 0.5454 0.5195 0.5454 0.7385
No log 10.2069 296 0.5524 0.5177 0.5524 0.7433
No log 10.2759 298 0.5330 0.4964 0.5330 0.7301
No log 10.3448 300 0.5196 0.5625 0.5196 0.7209
No log 10.4138 302 0.5352 0.5428 0.5352 0.7316
No log 10.4828 304 0.5466 0.5628 0.5466 0.7393
No log 10.5517 306 0.5410 0.5628 0.5410 0.7355
No log 10.6207 308 0.5253 0.6158 0.5253 0.7248
No log 10.6897 310 0.5189 0.5632 0.5189 0.7203
No log 10.7586 312 0.5177 0.6060 0.5177 0.7195
No log 10.8276 314 0.5657 0.5650 0.5657 0.7521
No log 10.8966 316 0.5621 0.5947 0.5621 0.7497
No log 10.9655 318 0.5347 0.5476 0.5347 0.7312
No log 11.0345 320 0.5227 0.5323 0.5227 0.7230
No log 11.1034 322 0.5235 0.5457 0.5235 0.7235
No log 11.1724 324 0.5292 0.5324 0.5292 0.7274
No log 11.2414 326 0.5195 0.5556 0.5195 0.7208
No log 11.3103 328 0.5110 0.5056 0.5110 0.7148
No log 11.3793 330 0.5589 0.5086 0.5589 0.7476
No log 11.4483 332 0.5460 0.5327 0.5460 0.7390
No log 11.5172 334 0.5113 0.4808 0.5113 0.7151
No log 11.5862 336 0.5127 0.5683 0.5127 0.7160
No log 11.6552 338 0.6561 0.5436 0.6561 0.8100
No log 11.7241 340 0.7029 0.4427 0.7029 0.8384
No log 11.7931 342 0.6301 0.5160 0.6301 0.7938
No log 11.8621 344 0.5308 0.5663 0.5308 0.7286
No log 11.9310 346 0.5179 0.5413 0.5179 0.7196
No log 12.0 348 0.5256 0.5076 0.5256 0.7250
No log 12.0690 350 0.5127 0.5076 0.5127 0.7161
No log 12.1379 352 0.5055 0.4768 0.5055 0.7110
No log 12.2069 354 0.5091 0.5379 0.5091 0.7135
No log 12.2759 356 0.5022 0.5022 0.5022 0.7087
No log 12.3448 358 0.4965 0.5095 0.4965 0.7046
No log 12.4138 360 0.4909 0.5095 0.4909 0.7006
No log 12.4828 362 0.4902 0.5095 0.4902 0.7001
No log 12.5517 364 0.5033 0.5577 0.5033 0.7094
No log 12.6207 366 0.5087 0.6087 0.5087 0.7132
No log 12.6897 368 0.5233 0.5817 0.5233 0.7234
No log 12.7586 370 0.5080 0.5723 0.5080 0.7128
No log 12.8276 372 0.5067 0.6039 0.5067 0.7118
No log 12.8966 374 0.5223 0.5714 0.5223 0.7227
No log 12.9655 376 0.5218 0.5714 0.5218 0.7224
No log 13.0345 378 0.5264 0.5714 0.5264 0.7256
No log 13.1034 380 0.5296 0.5714 0.5296 0.7278
No log 13.1724 382 0.5155 0.5472 0.5155 0.7180
No log 13.2414 384 0.5269 0.5468 0.5269 0.7259
No log 13.3103 386 0.5214 0.5131 0.5214 0.7220
No log 13.3793 388 0.5173 0.4888 0.5173 0.7192
No log 13.4483 390 0.5210 0.5432 0.5210 0.7218
No log 13.5172 392 0.5260 0.4825 0.5260 0.7252
No log 13.5862 394 0.5388 0.4888 0.5388 0.7340
No log 13.6552 396 0.5447 0.4538 0.5447 0.7381
No log 13.7241 398 0.5405 0.4136 0.5405 0.7352
No log 13.7931 400 0.5420 0.4136 0.5420 0.7362
No log 13.8621 402 0.5380 0.4136 0.5380 0.7335
No log 13.9310 404 0.5321 0.4361 0.5321 0.7295
No log 14.0 406 0.5367 0.4136 0.5367 0.7326
No log 14.0690 408 0.5371 0.4898 0.5371 0.7328
No log 14.1379 410 0.5437 0.5079 0.5437 0.7374
No log 14.2069 412 0.5536 0.5171 0.5536 0.7441
No log 14.2759 414 0.5625 0.4656 0.5625 0.7500
No log 14.3448 416 0.5520 0.3228 0.5520 0.7430
No log 14.4138 418 0.5582 0.3953 0.5582 0.7471
No log 14.4828 420 0.5563 0.3523 0.5563 0.7458
No log 14.5517 422 0.5553 0.4249 0.5553 0.7452
No log 14.6207 424 0.5499 0.5010 0.5499 0.7416
No log 14.6897 426 0.5510 0.5510 0.5510 0.7423
No log 14.7586 428 0.5220 0.5286 0.5220 0.7225
No log 14.8276 430 0.5206 0.4788 0.5206 0.7215
No log 14.8966 432 0.5200 0.4788 0.5200 0.7211
No log 14.9655 434 0.5231 0.5061 0.5231 0.7232
No log 15.0345 436 0.5439 0.5747 0.5439 0.7375
No log 15.1034 438 0.5283 0.5171 0.5283 0.7269
No log 15.1724 440 0.5114 0.5357 0.5114 0.7151
No log 15.2414 442 0.5072 0.5388 0.5072 0.7122
No log 15.3103 444 0.5297 0.5639 0.5297 0.7278
No log 15.3793 446 0.5305 0.5677 0.5305 0.7284
No log 15.4483 448 0.4913 0.5692 0.4913 0.7009
No log 15.5172 450 0.4642 0.6254 0.4642 0.6814
No log 15.5862 452 0.4634 0.5714 0.4634 0.6807
No log 15.6552 454 0.4604 0.5930 0.4604 0.6786
No log 15.7241 456 0.4711 0.6201 0.4711 0.6864
No log 15.7931 458 0.5257 0.5452 0.5257 0.7251
No log 15.8621 460 0.5762 0.5080 0.5762 0.7591
No log 15.9310 462 0.5396 0.5497 0.5396 0.7346
No log 16.0 464 0.5387 0.4911 0.5387 0.7340
No log 16.0690 466 0.5412 0.4502 0.5412 0.7356
No log 16.1379 468 0.5365 0.4409 0.5365 0.7325
No log 16.2069 470 0.5256 0.5270 0.5256 0.7250
No log 16.2759 472 0.5164 0.5018 0.5164 0.7186
No log 16.3448 474 0.5114 0.5182 0.5114 0.7152
No log 16.4138 476 0.5187 0.5413 0.5187 0.7202
No log 16.4828 478 0.5411 0.4663 0.5411 0.7356
No log 16.5517 480 0.5324 0.4925 0.5324 0.7297
No log 16.6207 482 0.5028 0.5248 0.5028 0.7091
No log 16.6897 484 0.5006 0.5151 0.5006 0.7075
No log 16.7586 486 0.4984 0.4591 0.4984 0.7060
No log 16.8276 488 0.4956 0.5556 0.4956 0.7040
No log 16.8966 490 0.5033 0.5397 0.5033 0.7095
No log 16.9655 492 0.4972 0.5044 0.4972 0.7051
No log 17.0345 494 0.5001 0.5044 0.5001 0.7072
No log 17.1034 496 0.4931 0.4774 0.4931 0.7022
No log 17.1724 498 0.5000 0.4888 0.5000 0.7071
0.2895 17.2414 500 0.5421 0.5452 0.5421 0.7362
0.2895 17.3103 502 0.5603 0.5452 0.5603 0.7485
0.2895 17.3793 504 0.5265 0.4816 0.5265 0.7256
0.2895 17.4483 506 0.5124 0.5640 0.5124 0.7158
0.2895 17.5172 508 0.5244 0.4638 0.5244 0.7241
0.2895 17.5862 510 0.5392 0.5627 0.5392 0.7343
0.2895 17.6552 512 0.5212 0.5057 0.5212 0.7219
0.2895 17.7241 514 0.5197 0.5362 0.5197 0.7209
0.2895 17.7931 516 0.5865 0.5045 0.5865 0.7659
0.2895 17.8621 518 0.5954 0.5045 0.5954 0.7716
0.2895 17.9310 520 0.5296 0.4997 0.5296 0.7277
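
As a quick consistency check on the table, each Rmse value is simply the square root of the corresponding Mse value, up to rounding; for the final checkpoint (step 520):

```python
import math

# Final-row values from the table above (epoch 17.9310, step 520).
final_mse = 0.5296
final_rmse = 0.7277

# Rmse = sqrt(Mse), within the table's 4-decimal rounding.
assert math.isclose(math.sqrt(final_mse), final_rmse, abs_tol=5e-4)
```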

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Model size: 0.1B params · Tensor type: F32 (Safetensors)
