ArabicNewSplits6_FineTuningAraBERT_run2_AugV5_k9_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5814
  • Qwk: 0.3295
  • Mse: 0.5814
  • Rmse: 0.7625
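
Qwk here presumably denotes Cohen's quadratic weighted kappa over the ordinal task scores, and the reported loss equals the MSE, which suggests the model was trained as a regressor with an MSE objective. A self-contained sketch of how these three metrics can be computed, assuming integer class labels (the function names and the 3-class example below are illustrative, not taken from the training code):

```python
import math

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights over ordinal labels 0..n_classes-1."""
    # Observed confusion matrix
    O = [[0] * n_classes for _ in range(n_classes)]
    for t, p in zip(y_true, y_pred):
        O[t][p] += 1
    n = len(y_true)
    # Marginal histograms, used to build the chance-agreement matrix
    hist_t = [sum(row) for row in O]
    hist_p = [sum(O[i][j] for i in range(n_classes)) for j in range(n_classes)]
    num = den = 0.0
    for i in range(n_classes):
        for j in range(n_classes):
            w = (i - j) ** 2 / (n_classes - 1) ** 2  # quadratic disagreement weight
            num += w * O[i][j]
            den += w * hist_t[i] * hist_p[j] / n
    return 1.0 - num / den

def mse(y_true, y_pred):
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

# Tiny illustrative example with 3 ordinal classes
y_true = [0, 1, 2, 2, 1]
y_pred = [0, 1, 1, 2, 0]
qwk = quadratic_weighted_kappa(y_true, y_pred, n_classes=3)  # → 0.6875
m = mse(y_true, y_pred)                                      # → 0.4
rmse = math.sqrt(m)  # RMSE is simply the square root of MSE, as in the results above
```

This also explains why the Loss and Mse columns in the results table are identical.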

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
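
Under the linear schedule listed above, the learning rate decays from its initial value to zero over the course of training. A minimal sketch of that schedule (the number of warmup steps is not reported in this card and is assumed to be zero here; the function name is illustrative):

```python
def linear_lr(step, total_steps, base_lr=2e-05, warmup_steps=0):
    """Linear schedule: ramp up over warmup_steps, then decay linearly to zero."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

total = 430  # final optimizer step in the training log below
lr_start = linear_lr(0, total)      # 2e-05, the configured learning rate
lr_mid = linear_lr(215, total)      # 1e-05, halfway through training
lr_end = linear_lr(total, total)    # 0.0 at the last step
```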

Training results

Validation metrics were recorded every two optimizer steps; "No log" indicates that no training-loss value was logged at those points.

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0465 2 3.2912 -0.0149 3.2912 1.8142
No log 0.0930 4 1.5681 0.0255 1.5681 1.2522
No log 0.1395 6 0.9329 0.0947 0.9329 0.9659
No log 0.1860 8 0.6913 0.1724 0.6913 0.8314
No log 0.2326 10 0.5917 0.1111 0.5917 0.7692
No log 0.2791 12 0.7840 0.0515 0.7840 0.8854
No log 0.3256 14 1.0015 0.1130 1.0015 1.0007
No log 0.3721 16 0.9975 0.0916 0.9975 0.9987
No log 0.4186 18 0.7145 0.1345 0.7145 0.8453
No log 0.4651 20 0.6284 0.0145 0.6284 0.7927
No log 0.5116 22 0.7303 0.2000 0.7303 0.8546
No log 0.5581 24 0.6570 0.1638 0.6570 0.8106
No log 0.6047 26 0.6204 0.1698 0.6204 0.7877
No log 0.6512 28 0.5529 0.0303 0.5529 0.7435
No log 0.6977 30 0.5899 0.0476 0.5899 0.7681
No log 0.7442 32 0.5793 0.0400 0.5793 0.7611
No log 0.7907 34 0.6246 0.1186 0.6246 0.7903
No log 0.8372 36 0.5966 0.2099 0.5966 0.7724
No log 0.8837 38 0.6097 0.2000 0.6097 0.7808
No log 0.9302 40 0.9054 0.1453 0.9054 0.9515
No log 0.9767 42 0.6096 0.1698 0.6096 0.7807
No log 1.0233 44 0.5652 0.0638 0.5652 0.7518
No log 1.0698 46 0.5397 -0.0303 0.5397 0.7346
No log 1.1163 48 0.6348 0.1813 0.6348 0.7967
No log 1.1628 50 0.5946 0.1195 0.5946 0.7711
No log 1.2093 52 0.6040 0.1781 0.6040 0.7772
No log 1.2558 54 0.6430 0.1888 0.6430 0.8019
No log 1.3023 56 0.5712 0.2195 0.5712 0.7557
No log 1.3488 58 0.9290 0.1453 0.9290 0.9639
No log 1.3953 60 1.2297 0.0638 1.2297 1.1089
No log 1.4419 62 0.9566 0.0947 0.9566 0.9780
No log 1.4884 64 0.5178 0.1484 0.5178 0.7195
No log 1.5349 66 0.5247 0.1045 0.5247 0.7243
No log 1.5814 68 0.5213 0.1888 0.5213 0.7220
No log 1.6279 70 0.5719 0.0850 0.5719 0.7563
No log 1.6744 72 0.9639 0.2066 0.9639 0.9818
No log 1.7209 74 0.8779 0.2356 0.8779 0.9370
No log 1.7674 76 0.6020 0.1329 0.6020 0.7759
No log 1.8140 78 0.7606 0.2308 0.7606 0.8721
No log 1.8605 80 0.7239 0.2318 0.7239 0.8508
No log 1.9070 82 0.6304 0.0123 0.6304 0.7940
No log 1.9535 84 0.7414 0.1919 0.7414 0.8610
No log 2.0 86 0.6801 0.1038 0.6801 0.8247
No log 2.0465 88 0.5801 0.1902 0.5801 0.7616
No log 2.0930 90 0.5706 0.2000 0.5706 0.7554
No log 2.1395 92 0.6545 0.2208 0.6545 0.8090
No log 2.1860 94 0.6512 0.1698 0.6512 0.8070
No log 2.2326 96 0.6403 0.1285 0.6403 0.8002
No log 2.2791 98 0.6642 0.0703 0.6642 0.8150
No log 2.3256 100 0.6637 0.1285 0.6637 0.8147
No log 2.3721 102 0.6713 0.1364 0.6713 0.8194
No log 2.4186 104 0.7372 0.2487 0.7372 0.8586
No log 2.4651 106 1.0951 0.1032 1.0951 1.0465
No log 2.5116 108 0.8766 0.1130 0.8766 0.9362
No log 2.5581 110 0.5900 0.3149 0.5900 0.7681
No log 2.6047 112 0.5854 0.3846 0.5854 0.7651
No log 2.6512 114 1.0422 0.1293 1.0422 1.0209
No log 2.6977 116 1.2516 0.0541 1.2516 1.1187
No log 2.7442 118 0.7832 0.2475 0.7832 0.8850
No log 2.7907 120 0.7037 0.3116 0.7037 0.8389
No log 2.8372 122 0.8122 0.2000 0.8122 0.9012
No log 2.8837 124 0.5859 0.2917 0.5859 0.7654
No log 2.9302 126 0.7198 0.2077 0.7198 0.8484
No log 2.9767 128 0.7398 0.2150 0.7398 0.8601
No log 3.0233 130 0.5462 0.2704 0.5462 0.7391
No log 3.0698 132 0.6553 0.2577 0.6553 0.8095
No log 3.1163 134 0.8232 0.1148 0.8232 0.9073
No log 3.1628 136 0.6344 0.1461 0.6344 0.7965
No log 3.2093 138 0.6372 0.2832 0.6372 0.7983
No log 3.2558 140 0.6796 0.2832 0.6796 0.8244
No log 3.3023 142 0.7857 0.1111 0.7857 0.8864
No log 3.3488 144 0.9256 0.1930 0.9256 0.9621
No log 3.3953 146 0.7974 0.1238 0.7974 0.8930
No log 3.4419 148 0.9119 0.1790 0.9119 0.9550
No log 3.4884 150 0.8560 0.2511 0.8560 0.9252
No log 3.5349 152 0.6971 0.0995 0.6971 0.8350
No log 3.5814 154 0.6997 0.1179 0.6997 0.8365
No log 3.6279 156 0.6503 0.0802 0.6503 0.8064
No log 3.6744 158 0.9424 0.1807 0.9424 0.9708
No log 3.7209 160 1.1884 0.1014 1.1884 1.0901
No log 3.7674 162 0.8538 0.2793 0.8538 0.9240
No log 3.8140 164 0.6291 0.2000 0.6291 0.7931
No log 3.8605 166 0.6249 0.2081 0.6249 0.7905
No log 3.9070 168 0.6248 0.2914 0.6248 0.7905
No log 3.9535 170 0.9881 0.1292 0.9881 0.9940
No log 4.0 172 1.1636 0.0791 1.1636 1.0787
No log 4.0465 174 1.0414 0.1062 1.0414 1.0205
No log 4.0930 176 0.6733 0.2444 0.6733 0.8205
No log 4.1395 178 0.6050 0.2350 0.6050 0.7778
No log 4.1860 180 0.6413 0.2471 0.6413 0.8008
No log 4.2326 182 0.7136 0.3333 0.7136 0.8447
No log 4.2791 184 0.6978 0.2967 0.6978 0.8354
No log 4.3256 186 0.8320 0.2793 0.8320 0.9122
No log 4.3721 188 0.6549 0.2994 0.6549 0.8092
No log 4.4186 190 0.6081 0.2780 0.6081 0.7798
No log 4.4651 192 0.6008 0.3641 0.6008 0.7751
No log 4.5116 194 0.7236 0.2871 0.7236 0.8507
No log 4.5581 196 0.7530 0.2830 0.7530 0.8677
No log 4.6047 198 0.6848 0.2967 0.6848 0.8275
No log 4.6512 200 0.6430 0.3371 0.6430 0.8019
No log 4.6977 202 0.5609 0.3333 0.5609 0.7489
No log 4.7442 204 0.5478 0.3016 0.5478 0.7401
No log 4.7907 206 0.5672 0.3641 0.5672 0.7532
No log 4.8372 208 0.5528 0.3089 0.5528 0.7435
No log 4.8837 210 0.5949 0.3043 0.5949 0.7713
No log 4.9302 212 0.7872 0.3153 0.7872 0.8872
No log 4.9767 214 0.6943 0.3846 0.6943 0.8333
No log 5.0233 216 0.6279 0.3402 0.6279 0.7924
No log 5.0698 218 0.5839 0.3641 0.5839 0.7641
No log 5.1163 220 0.6491 0.4234 0.6491 0.8056
No log 5.1628 222 0.6171 0.2990 0.6171 0.7855
No log 5.2093 224 0.6860 0.3237 0.6860 0.8282
No log 5.2558 226 0.8797 0.2389 0.8797 0.9379
No log 5.3023 228 0.7760 0.2423 0.7760 0.8809
No log 5.3488 230 0.6104 0.3216 0.6104 0.7813
No log 5.3953 232 0.6652 0.1837 0.6652 0.8156
No log 5.4419 234 0.6534 0.1503 0.6534 0.8083
No log 5.4884 236 0.5699 0.2575 0.5699 0.7549
No log 5.5349 238 0.5872 0.3253 0.5872 0.7663
No log 5.5814 240 0.5715 0.2941 0.5715 0.7560
No log 5.6279 242 0.5837 0.3371 0.5837 0.7640
No log 5.6744 244 0.5953 0.3371 0.5953 0.7716
No log 5.7209 246 0.6048 0.3016 0.6048 0.7777
No log 5.7674 248 0.6119 0.3016 0.6119 0.7823
No log 5.8140 250 0.6006 0.3016 0.6006 0.7750
No log 5.8605 252 0.6189 0.2165 0.6189 0.7867
No log 5.9070 254 0.6143 0.1489 0.6143 0.7838
No log 5.9535 256 0.6022 0.2421 0.6022 0.7760
No log 6.0 258 0.6037 0.3118 0.6037 0.7770
No log 6.0465 260 0.6530 0.2917 0.6530 0.8081
No log 6.0930 262 0.7305 0.2830 0.7305 0.8547
No log 6.1395 264 0.6801 0.3365 0.6801 0.8247
No log 6.1860 266 0.6295 0.2821 0.6295 0.7934
No log 6.2326 268 0.6270 0.2323 0.6270 0.7919
No log 6.2791 270 0.6083 0.3118 0.6083 0.7799
No log 6.3256 272 0.6736 0.3333 0.6736 0.8207
No log 6.3721 274 0.6803 0.2893 0.6803 0.8248
No log 6.4186 276 0.6228 0.2970 0.6228 0.7892
No log 6.4651 278 0.6011 0.2970 0.6011 0.7753
No log 6.5116 280 0.5626 0.3488 0.5626 0.7501
No log 6.5581 282 0.5585 0.3591 0.5585 0.7473
No log 6.6047 284 0.5495 0.3591 0.5495 0.7413
No log 6.6512 286 0.5403 0.4083 0.5403 0.7351
No log 6.6977 288 0.5665 0.3333 0.5665 0.7526
No log 6.7442 290 0.5571 0.3333 0.5571 0.7464
No log 6.7907 292 0.5354 0.3684 0.5354 0.7317
No log 6.8372 294 0.5286 0.4083 0.5286 0.7271
No log 6.8837 296 0.5232 0.3412 0.5232 0.7233
No log 6.9302 298 0.5202 0.3829 0.5202 0.7212
No log 6.9767 300 0.5240 0.3735 0.5240 0.7239
No log 7.0233 302 0.5412 0.2970 0.5412 0.7356
No log 7.0698 304 0.5375 0.3000 0.5375 0.7332
No log 7.1163 306 0.5485 0.3735 0.5485 0.7406
No log 7.1628 308 0.5750 0.3333 0.5750 0.7583
No log 7.2093 310 0.5873 0.3708 0.5873 0.7663
No log 7.2558 312 0.5719 0.4023 0.5719 0.7563
No log 7.3023 314 0.5835 0.3636 0.5835 0.7639
No log 7.3488 316 0.5976 0.4098 0.5976 0.7730
No log 7.3953 318 0.5865 0.3333 0.5865 0.7658
No log 7.4419 320 0.5591 0.3563 0.5591 0.7477
No log 7.4884 322 0.5602 0.3563 0.5602 0.7484
No log 7.5349 324 0.5576 0.3563 0.5576 0.7467
No log 7.5814 326 0.5493 0.3016 0.5493 0.7412
No log 7.6279 328 0.5489 0.3231 0.5489 0.7409
No log 7.6744 330 0.5502 0.3253 0.5502 0.7418
No log 7.7209 332 0.5927 0.2941 0.5927 0.7699
No log 7.7674 334 0.6336 0.3446 0.6336 0.7960
No log 7.8140 336 0.6425 0.3103 0.6425 0.8015
No log 7.8605 338 0.6123 0.3446 0.6123 0.7825
No log 7.9070 340 0.5652 0.3216 0.5652 0.7518
No log 7.9535 342 0.5486 0.3333 0.5486 0.7407
No log 8.0 344 0.5556 0.2941 0.5556 0.7454
No log 8.0465 346 0.5573 0.3089 0.5573 0.7465
No log 8.0930 348 0.5811 0.3182 0.5811 0.7623
No log 8.1395 350 0.6377 0.3263 0.6377 0.7986
No log 8.1860 352 0.6720 0.3171 0.6720 0.8198
No log 8.2326 354 0.6590 0.3200 0.6590 0.8118
No log 8.2791 356 0.6582 0.3641 0.6582 0.8113
No log 8.3256 358 0.6500 0.3641 0.6500 0.8062
No log 8.3721 360 0.5981 0.2914 0.5981 0.7734
No log 8.4186 362 0.5579 0.3446 0.5579 0.7469
No log 8.4651 364 0.5464 0.3118 0.5464 0.7392
No log 8.5116 366 0.5407 0.2707 0.5407 0.7353
No log 8.5581 368 0.5343 0.3609 0.5343 0.7310
No log 8.6047 370 0.5387 0.3333 0.5387 0.7340
No log 8.6512 372 0.5415 0.3333 0.5415 0.7359
No log 8.6977 374 0.5342 0.3609 0.5342 0.7309
No log 8.7442 376 0.5342 0.3609 0.5342 0.7309
No log 8.7907 378 0.5366 0.3609 0.5366 0.7325
No log 8.8372 380 0.5416 0.3563 0.5416 0.7359
No log 8.8837 382 0.5414 0.3563 0.5414 0.7358
No log 8.9302 384 0.5435 0.3297 0.5435 0.7372
No log 8.9767 386 0.5455 0.3118 0.5455 0.7386
No log 9.0233 388 0.5495 0.3191 0.5495 0.7413
No log 9.0698 390 0.5511 0.3191 0.5511 0.7424
No log 9.1163 392 0.5570 0.3297 0.5570 0.7464
No log 9.1628 394 0.5656 0.3333 0.5656 0.7521
No log 9.2093 396 0.5835 0.3295 0.5835 0.7639
No log 9.2558 398 0.5915 0.3295 0.5915 0.7691
No log 9.3023 400 0.5959 0.3295 0.5959 0.7720
No log 9.3488 402 0.5866 0.3295 0.5866 0.7659
No log 9.3953 404 0.5770 0.3295 0.5770 0.7596
No log 9.4419 406 0.5670 0.3333 0.5670 0.7530
No log 9.4884 408 0.5602 0.3563 0.5602 0.7485
No log 9.5349 410 0.5575 0.3563 0.5575 0.7467
No log 9.5814 412 0.5596 0.3563 0.5596 0.7481
No log 9.6279 414 0.5649 0.3333 0.5649 0.7516
No log 9.6744 416 0.5725 0.3333 0.5725 0.7566
No log 9.7209 418 0.5779 0.3333 0.5779 0.7602
No log 9.7674 420 0.5808 0.3295 0.5808 0.7621
No log 9.8140 422 0.5836 0.3295 0.5836 0.7639
No log 9.8605 424 0.5840 0.3295 0.5840 0.7642
No log 9.9070 426 0.5830 0.3295 0.5830 0.7635
No log 9.9535 428 0.5818 0.3295 0.5818 0.7628
No log 10.0 430 0.5814 0.3295 0.5814 0.7625
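
The log above ends at step 430 after 10 epochs, which pins down the size of the training loader. A quick sanity check (the example-count estimate assumes no gradient accumulation and full batches, neither of which is stated in this card):

```python
total_steps = 430        # final step in the training log
num_epochs = 10          # from the hyperparameters
train_batch_size = 8     # from the hyperparameters

steps_per_epoch = total_steps // num_epochs             # 43 optimizer steps per epoch
approx_train_examples = steps_per_epoch * train_batch_size  # roughly 344 training examples
```

This is also consistent with the epoch column: evaluations every 2 steps land at epoch increments of 2/43 ≈ 0.0465.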

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1

Model tree for MayBashendy/ArabicNewSplits6_FineTuningAraBERT_run2_AugV5_k9_task3_organization

Fine-tuned from aubmindlab/bert-base-arabertv02.