ArabicNewSplits7_FineTuningAraBERT_run3_AugV5_k7_task3_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.7113
  • QWK (quadratic weighted kappa): 0.0562
  • MSE: 0.7113
  • RMSE: 0.8434
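The identical Loss and MSE values suggest an MSE training objective. As a minimal sketch (not the actual evaluation code of this run; function names are illustrative), these metrics can be computed from integer ordinal labels with NumPy alone:

```python
import numpy as np

def quadratic_weighted_kappa(y_true, y_pred, n_classes):
    """Cohen's kappa with quadratic weights over ordinal labels in [0, n_classes)."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    # Observed agreement (confusion matrix).
    observed = np.zeros((n_classes, n_classes))
    for t, p in zip(y_true, y_pred):
        observed[t, p] += 1
    # Quadratic disagreement weights: 0 on the diagonal, growing with label distance.
    idx = np.arange(n_classes)
    weights = (idx[:, None] - idx[None, :]) ** 2 / (n_classes - 1) ** 2
    # Expected agreement under independent marginals.
    expected = np.outer(np.bincount(y_true, minlength=n_classes),
                        np.bincount(y_pred, minlength=n_classes)) / len(y_true)
    return 1.0 - (weights * observed).sum() / (weights * expected).sum()

def regression_metrics(y_true, y_pred):
    """MSE and RMSE between true and predicted scores."""
    err = np.asarray(y_true, dtype=float) - np.asarray(y_pred, dtype=float)
    mse = float(np.mean(err ** 2))
    return {"mse": mse, "rmse": float(np.sqrt(mse))}
```

QWK near 0, as reported here, means agreement with the gold labels is barely above chance, even though the RMSE is moderate.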

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
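The hyperparameters above map directly onto the transformers `TrainingArguments` API. A configuration sketch, assuming the standard Trainer setup (the `output_dir` is a placeholder, not from this run):

```python
from transformers import TrainingArguments

# Sketch only: mirrors the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="./results",          # placeholder path, not from the original run
    learning_rate=2e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=100,
)
```

Note that although `num_epochs` was set to 100, the log below ends around epoch 31.4 (step 660), so training evidently did not run for the full configured schedule.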

Training results

Training Loss Epoch Step Validation Loss Qwk Mse Rmse
No log 0.0952 2 3.6784 -0.0068 3.6784 1.9179
No log 0.1905 4 2.0918 0.0737 2.0918 1.4463
No log 0.2857 6 2.2968 0.0076 2.2968 1.5155
No log 0.3810 8 1.4426 0.0047 1.4426 1.2011
No log 0.4762 10 1.0063 -0.0936 1.0063 1.0031
No log 0.5714 12 1.0118 -0.0606 1.0118 1.0059
No log 0.6667 14 0.9097 0.0207 0.9097 0.9538
No log 0.7619 16 0.7505 0.0759 0.7505 0.8663
No log 0.8571 18 0.7451 0.0807 0.7451 0.8632
No log 0.9524 20 1.0125 -0.0638 1.0125 1.0062
No log 1.0476 22 1.2395 -0.0744 1.2395 1.1133
No log 1.1429 24 1.4883 0.0 1.4883 1.2200
No log 1.2381 26 1.1733 -0.0490 1.1733 1.0832
No log 1.3333 28 0.9321 -0.0961 0.9321 0.9654
No log 1.4286 30 0.8129 -0.1239 0.8129 0.9016
No log 1.5238 32 0.8522 0.0043 0.8522 0.9231
No log 1.6190 34 0.8500 -0.0766 0.8500 0.9220
No log 1.7143 36 0.9093 -0.0916 0.9093 0.9536
No log 1.8095 38 1.1113 -0.0479 1.1113 1.0542
No log 1.9048 40 1.3176 -0.0247 1.3176 1.1479
No log 2.0 42 1.8691 0.0 1.8691 1.3671
No log 2.0952 44 2.1069 -0.0015 2.1069 1.4515
No log 2.1905 46 1.7412 0.0 1.7412 1.3195
No log 2.2857 48 1.2256 0.0 1.2256 1.1071
No log 2.3810 50 0.8572 -0.1253 0.8572 0.9259
No log 2.4762 52 0.7580 -0.1223 0.7580 0.8706
No log 2.5714 54 0.7225 -0.0035 0.7225 0.8500
No log 2.6667 56 0.8083 0.0129 0.8083 0.8991
No log 2.7619 58 1.2583 0.0746 1.2583 1.1217
No log 2.8571 60 1.2925 0.0115 1.2925 1.1369
No log 2.9524 62 1.1273 0.0878 1.1273 1.0617
No log 3.0476 64 1.1601 -0.0361 1.1601 1.0771
No log 3.1429 66 0.7907 -0.0675 0.7907 0.8892
No log 3.2381 68 0.7314 -0.0069 0.7314 0.8552
No log 3.3333 70 0.9203 -0.0526 0.9203 0.9593
No log 3.4286 72 1.3159 -0.0207 1.3159 1.1471
No log 3.5238 74 1.2272 -0.0207 1.2272 1.1078
No log 3.6190 76 0.8624 0.0676 0.8624 0.9287
No log 3.7143 78 0.7682 0.0374 0.7682 0.8765
No log 3.8095 80 0.8100 0.0628 0.8100 0.9000
No log 3.9048 82 0.9139 0.0016 0.9139 0.9560
No log 4.0 84 0.8287 0.0956 0.8287 0.9103
No log 4.0952 86 0.8797 0.0805 0.8797 0.9379
No log 4.1905 88 1.5023 -0.0113 1.5023 1.2257
No log 4.2857 90 1.4546 0.0179 1.4546 1.2061
No log 4.3810 92 0.9691 -0.0349 0.9691 0.9844
No log 4.4762 94 0.8758 0.0441 0.8758 0.9359
No log 4.5714 96 0.9117 -0.0128 0.9117 0.9548
No log 4.6667 98 1.1850 0.0950 1.1850 1.0886
No log 4.7619 100 0.9422 -0.0939 0.9422 0.9706
No log 4.8571 102 0.9631 0.1361 0.9631 0.9814
No log 4.9524 104 0.8942 -0.0569 0.8942 0.9456
No log 5.0476 106 1.4408 0.0370 1.4408 1.2003
No log 5.1429 108 1.3101 -0.0080 1.3101 1.1446
No log 5.2381 110 0.9395 -0.0828 0.9395 0.9693
No log 5.3333 112 1.0634 -0.0175 1.0634 1.0312
No log 5.4286 114 1.0544 -0.0175 1.0544 1.0269
No log 5.5238 116 0.9233 -0.0425 0.9233 0.9609
No log 5.6190 118 0.8762 -0.0799 0.8762 0.9361
No log 5.7143 120 0.8343 0.0123 0.8343 0.9134
No log 5.8095 122 0.8235 0.0538 0.8235 0.9075
No log 5.9048 124 0.9273 0.0293 0.9273 0.9629
No log 6.0 126 0.7886 0.0538 0.7886 0.8880
No log 6.0952 128 0.8061 0.1393 0.8061 0.8978
No log 6.1905 130 0.7927 0.0985 0.7927 0.8903
No log 6.2857 132 0.8144 0.0633 0.8144 0.9024
No log 6.3810 134 0.7918 0.0798 0.7918 0.8899
No log 6.4762 136 0.9003 0.0029 0.9003 0.9489
No log 6.5714 138 0.8851 0.0741 0.8851 0.9408
No log 6.6667 140 0.7718 0.1287 0.7718 0.8785
No log 6.7619 142 0.7740 0.1347 0.7740 0.8798
No log 6.8571 144 0.8152 0.1379 0.8152 0.9029
No log 6.9524 146 0.8770 -0.0187 0.8770 0.9365
No log 7.0476 148 0.9192 -0.0528 0.9192 0.9588
No log 7.1429 150 0.8580 -0.0465 0.8580 0.9263
No log 7.2381 152 0.7754 0.0821 0.7754 0.8806
No log 7.3333 154 0.8812 0.0676 0.8812 0.9387
No log 7.4286 156 0.8662 0.0346 0.8662 0.9307
No log 7.5238 158 0.7099 0.0776 0.7099 0.8426
No log 7.6190 160 0.7301 0.1513 0.7301 0.8544
No log 7.7143 162 0.7216 0.1538 0.7216 0.8495
No log 7.8095 164 0.7408 0.0557 0.7408 0.8607
No log 7.9048 166 0.7466 0.0879 0.7466 0.8641
No log 8.0 168 0.9959 0.0454 0.9959 0.9979
No log 8.0952 170 1.1095 0.0025 1.1095 1.0533
No log 8.1905 172 0.7985 0.0600 0.7985 0.8936
No log 8.2857 174 0.8314 0.0240 0.8314 0.9118
No log 8.3810 176 0.8908 0.1165 0.8908 0.9438
No log 8.4762 178 0.7432 0.0585 0.7432 0.8621
No log 8.5714 180 0.8735 0.0424 0.8735 0.9346
No log 8.6667 182 1.0982 0.0977 1.0982 1.0479
No log 8.7619 184 0.9308 -0.0054 0.9308 0.9648
No log 8.8571 186 0.7531 0.0976 0.7531 0.8678
No log 8.9524 188 0.7456 0.0585 0.7456 0.8635
No log 9.0476 190 0.7542 0.0741 0.7542 0.8684
No log 9.1429 192 0.8421 0.0016 0.8421 0.9177
No log 9.2381 194 0.7927 0.0123 0.7927 0.8904
No log 9.3333 196 0.7264 0.1525 0.7264 0.8523
No log 9.4286 198 0.7644 0.0196 0.7644 0.8743
No log 9.5238 200 0.7211 0.1513 0.7211 0.8492
No log 9.6190 202 0.7582 0.0600 0.7582 0.8707
No log 9.7143 204 0.9478 0.1265 0.9478 0.9735
No log 9.8095 206 0.8615 0.0392 0.8615 0.9282
No log 9.9048 208 0.7671 0.0869 0.7671 0.8759
No log 10.0 210 0.7602 0.0449 0.7602 0.8719
No log 10.0952 212 0.7807 0.0355 0.7807 0.8836
No log 10.1905 214 0.9503 0.0224 0.9503 0.9748
No log 10.2857 216 1.3706 0.0350 1.3706 1.1707
No log 10.3810 218 1.3533 0.0359 1.3533 1.1633
No log 10.4762 220 0.9907 -0.0194 0.9907 0.9953
No log 10.5714 222 0.8125 0.1095 0.8125 0.9014
No log 10.6667 224 0.8374 0.0562 0.8374 0.9151
No log 10.7619 226 1.0787 0.0045 1.0787 1.0386
No log 10.8571 228 1.1450 0.0623 1.1450 1.0700
No log 10.9524 230 0.8484 0.0684 0.8484 0.9211
No log 11.0476 232 0.7334 0.0393 0.7334 0.8564
No log 11.1429 234 0.7406 0.0393 0.7406 0.8606
No log 11.2381 236 0.7490 0.0562 0.7490 0.8655
No log 11.3333 238 1.0383 0.1534 1.0383 1.0190
No log 11.4286 240 1.2239 0.0859 1.2239 1.1063
No log 11.5238 242 0.9601 0.1269 0.9601 0.9799
No log 11.6190 244 0.7830 0.0600 0.7830 0.8849
No log 11.7143 246 0.8369 0.0068 0.8369 0.9148
No log 11.8095 248 0.8825 0.0424 0.8825 0.9394
No log 11.9048 250 0.9967 0.0951 0.9967 0.9984
No log 12.0 252 0.8805 0.0041 0.8805 0.9383
No log 12.0952 254 0.9075 0.0831 0.9075 0.9526
No log 12.1905 256 1.1499 0.0666 1.1499 1.0723
No log 12.2857 258 1.1453 0.0366 1.1453 1.0702
No log 12.3810 260 0.9151 -0.0101 0.9151 0.9566
No log 12.4762 262 0.7851 -0.0274 0.7851 0.8861
No log 12.5714 264 0.7713 0.0214 0.7713 0.8782
No log 12.6667 266 0.8093 -0.0390 0.8093 0.8996
No log 12.7619 268 0.8523 -0.0033 0.8523 0.9232
No log 12.8571 270 0.8643 0.0316 0.8643 0.9297
No log 12.9524 272 0.7888 0.0549 0.7888 0.8882
No log 13.0476 274 0.8249 0.0490 0.8249 0.9083
No log 13.1429 276 0.7531 0.0600 0.7531 0.8678
No log 13.2381 278 0.7706 0.0518 0.7706 0.8778
No log 13.3333 280 0.7534 0.0058 0.7534 0.8680
No log 13.4286 282 0.7223 0.0318 0.7223 0.8499
No log 13.5238 284 0.7884 0.0525 0.7884 0.8879
No log 13.6190 286 0.8930 0.1025 0.8930 0.9450
No log 13.7143 288 0.8113 0.1593 0.8113 0.9007
No log 13.8095 290 0.7292 0.0282 0.7292 0.8539
No log 13.9048 292 0.7370 0.0680 0.7370 0.8585
No log 14.0 294 0.7201 0.0282 0.7201 0.8486
No log 14.0952 296 0.7496 0.0269 0.7496 0.8658
No log 14.1905 298 0.7836 0.0920 0.7836 0.8852
No log 14.2857 300 0.8259 0.1145 0.8259 0.9088
No log 14.3810 302 0.8382 0.1065 0.8382 0.9155
No log 14.4762 304 0.7522 0.0670 0.7522 0.8673
No log 14.5714 306 0.8136 0.1367 0.8136 0.9020
No log 14.6667 308 0.8046 0.1367 0.8046 0.8970
No log 14.7619 310 0.7288 0.1094 0.7288 0.8537
No log 14.8571 312 0.8368 0.1437 0.8368 0.9148
No log 14.9524 314 0.8136 0.1532 0.8136 0.9020
No log 15.0476 316 0.7345 0.0723 0.7345 0.8570
No log 15.1429 318 0.7410 0.0394 0.7410 0.8608
No log 15.2381 320 0.7590 0.0357 0.7590 0.8712
No log 15.3333 322 0.8289 0.0805 0.8289 0.9105
No log 15.4286 324 0.9132 0.0988 0.9132 0.9556
No log 15.5238 326 0.8132 0.0490 0.8132 0.9018
No log 15.6190 328 0.7471 -0.0179 0.7471 0.8643
No log 15.7143 330 0.7534 0.0869 0.7534 0.8680
No log 15.8095 332 0.7888 0.0123 0.7888 0.8881
No log 15.9048 334 0.8121 0.0871 0.8121 0.9011
No log 16.0 336 0.8642 0.0333 0.8642 0.9296
No log 16.0952 338 0.9272 -0.0118 0.9272 0.9629
No log 16.1905 340 0.9136 -0.0118 0.9136 0.9558
No log 16.2857 342 0.8089 0.0392 0.8089 0.8994
No log 16.3810 344 0.9134 0.0873 0.9134 0.9557
No log 16.4762 346 0.9874 0.1334 0.9874 0.9937
No log 16.5714 348 0.9302 0.1228 0.9302 0.9645
No log 16.6667 350 0.8949 0.0949 0.8949 0.9460
No log 16.7619 352 0.8020 0.0452 0.8020 0.8956
No log 16.8571 354 0.7386 0.1423 0.7386 0.8594
No log 16.9524 356 0.6957 0.0680 0.6957 0.8341
No log 17.0476 358 0.6917 0.2424 0.6917 0.8317
No log 17.1429 360 0.6832 0.1202 0.6832 0.8266
No log 17.2381 362 0.7139 0.2424 0.7139 0.8449
No log 17.3333 364 0.7306 0.2034 0.7306 0.8548
No log 17.4286 366 0.8010 0.0438 0.8010 0.8950
No log 17.5238 368 0.7956 0.0504 0.7956 0.8920
No log 17.6190 370 0.7286 0.1144 0.7286 0.8536
No log 17.7143 372 0.6904 -0.0118 0.6904 0.8309
No log 17.8095 374 0.7066 0.1943 0.7066 0.8406
No log 17.9048 376 0.7728 0.0711 0.7728 0.8791
No log 18.0 378 0.7389 0.1387 0.7389 0.8596
No log 18.0952 380 0.7040 -0.0118 0.7040 0.8391
No log 18.1905 382 0.7299 0.0528 0.7299 0.8544
No log 18.2857 384 0.7335 0.0976 0.7335 0.8564
No log 18.3810 386 0.7240 0.1003 0.7240 0.8509
No log 18.4762 388 0.8159 0.0362 0.8159 0.9033
No log 18.5714 390 0.8744 0.1064 0.8744 0.9351
No log 18.6667 392 0.8427 0.1064 0.8427 0.9180
No log 18.7619 394 0.7427 0.0867 0.7427 0.8618
No log 18.8571 396 0.7022 0.0205 0.7022 0.8380
No log 18.9524 398 0.7087 0.0175 0.7087 0.8419
No log 19.0476 400 0.7754 0.0867 0.7754 0.8806
No log 19.1429 402 0.8692 0.2000 0.8692 0.9323
No log 19.2381 404 0.7719 0.0909 0.7719 0.8786
No log 19.3333 406 0.7047 -0.0145 0.7047 0.8395
No log 19.4286 408 0.7205 -0.0513 0.7205 0.8488
No log 19.5238 410 0.7156 -0.0096 0.7156 0.8459
No log 19.6190 412 0.7592 0.0123 0.7592 0.8713
No log 19.7143 414 0.7645 0.0152 0.7645 0.8743
No log 19.8095 416 0.7764 0.0247 0.7764 0.8811
No log 19.9048 418 0.7897 0.0247 0.7897 0.8886
No log 20.0 420 0.8642 0.0277 0.8642 0.9296
No log 20.0952 422 1.0187 0.0631 1.0187 1.0093
No log 20.1905 424 1.0491 0.0909 1.0491 1.0243
No log 20.2857 426 1.0894 0.0238 1.0894 1.0437
No log 20.3810 428 0.9137 0.0277 0.9137 0.9559
No log 20.4762 430 0.8678 0.0277 0.8678 0.9316
No log 20.5714 432 0.9161 0.0642 0.9161 0.9571
No log 20.6667 434 0.8905 0.0642 0.8905 0.9437
No log 20.7619 436 0.8816 0.1064 0.8816 0.9390
No log 20.8571 438 0.8747 0.1360 0.8747 0.9353
No log 20.9524 440 0.8104 -0.0373 0.8104 0.9002
No log 21.0476 442 0.7775 -0.0156 0.7775 0.8818
No log 21.1429 444 0.7784 -0.0218 0.7784 0.8823
No log 21.2381 446 0.8584 0.1065 0.8584 0.9265
No log 21.3333 448 0.9915 0.1152 0.9915 0.9957
No log 21.4286 450 0.9994 0.0741 0.9994 0.9997
No log 21.5238 452 0.8135 0.0442 0.8135 0.9019
No log 21.6190 454 0.7095 0.0768 0.7095 0.8423
No log 21.7143 456 0.6950 -0.0125 0.6950 0.8336
No log 21.8095 458 0.6872 0.0318 0.6872 0.8290
No log 21.9048 460 0.7046 0.1146 0.7046 0.8394
No log 22.0 462 0.7028 0.1199 0.7028 0.8383
No log 22.0952 464 0.7090 0.0732 0.7090 0.8420
No log 22.1905 466 0.7402 0.0574 0.7402 0.8603
No log 22.2857 468 0.8480 0.1502 0.8480 0.9209
No log 22.3810 470 0.8565 0.1406 0.8565 0.9255
No log 22.4762 472 0.7841 0.0909 0.7841 0.8855
No log 22.5714 474 0.7139 0.0303 0.7139 0.8449
No log 22.6667 476 0.7202 0.0 0.7202 0.8486
No log 22.7619 478 0.7141 -0.0059 0.7141 0.8450
No log 22.8571 480 0.7260 0.0690 0.7260 0.8521
No log 22.9524 482 0.8426 0.0867 0.8426 0.9179
No log 23.0476 484 0.9455 0.0805 0.9455 0.9724
No log 23.1429 486 0.8628 0.0748 0.8628 0.9289
No log 23.2381 488 0.7785 0.0490 0.7785 0.8823
No log 23.3333 490 0.7219 0.0303 0.7219 0.8497
No log 23.4286 492 0.7267 -0.0059 0.7267 0.8524
No log 23.5238 494 0.7184 0.0394 0.7184 0.8476
No log 23.6190 496 0.7526 0.0095 0.7526 0.8675
No log 23.7143 498 0.8879 0.1360 0.8879 0.9423
0.2955 23.8095 500 0.9840 0.0623 0.9840 0.9919
0.2955 23.9048 502 0.9097 0.0949 0.9097 0.9538
0.2955 24.0 504 0.7900 -0.0303 0.7900 0.8888
0.2955 24.0952 506 0.7822 0.0167 0.7822 0.8844
0.2955 24.1905 508 0.8169 -0.0008 0.8169 0.9038
0.2955 24.2857 510 0.9417 0.0810 0.9417 0.9704
0.2955 24.3810 512 0.9709 0.0974 0.9709 0.9854
0.2955 24.4762 514 0.8736 0.1106 0.8736 0.9346
0.2955 24.5714 516 0.7711 0.0095 0.7711 0.8781
0.2955 24.6667 518 0.7313 0.0723 0.7313 0.8552
0.2955 24.7619 520 0.7256 0.0723 0.7256 0.8518
0.2955 24.8571 522 0.7658 0.0913 0.7658 0.8751
0.2955 24.9524 524 0.8179 0.1502 0.8179 0.9044
0.2955 25.0476 526 0.8613 0.0984 0.8613 0.9281
0.2955 25.1429 528 0.8077 0.1106 0.8077 0.8987
0.2955 25.2381 530 0.7233 0.1141 0.7233 0.8505
0.2955 25.3333 532 0.7268 0.0834 0.7268 0.8525
0.2955 25.4286 534 0.7138 0.0791 0.7138 0.8449
0.2955 25.5238 536 0.7596 0.1106 0.7596 0.8716
0.2955 25.6190 538 0.9110 0.1798 0.9110 0.9544
0.2955 25.7143 540 1.0538 -0.0133 1.0538 1.0266
0.2955 25.8095 542 1.0821 -0.0133 1.0821 1.0402
0.2955 25.9048 544 0.9155 0.2037 0.9155 0.9568
0.2955 26.0 546 0.7712 0.1106 0.7712 0.8782
0.2955 26.0952 548 0.7389 0.0091 0.7389 0.8596
0.2955 26.1905 550 0.7346 0.0700 0.7346 0.8571
0.2955 26.2857 552 0.7564 -0.0355 0.7564 0.8697
0.2955 26.3810 554 0.8509 0.1354 0.8509 0.9224
0.2955 26.4762 556 0.9353 0.0843 0.9353 0.9671
0.2955 26.5714 558 0.8794 0.1311 0.8794 0.9377
0.2955 26.6667 560 0.7744 0.0041 0.7744 0.8800
0.2955 26.7619 562 0.7595 0.0041 0.7595 0.8715
0.2955 26.8571 564 0.7913 0.1149 0.7913 0.8896
0.2955 26.9524 566 0.8430 0.1399 0.8430 0.9181
0.2955 27.0476 568 0.8832 0.1228 0.8832 0.9398
0.2955 27.1429 570 0.8174 0.1399 0.8174 0.9041
0.2955 27.2381 572 0.7342 0.0123 0.7342 0.8569
0.2955 27.3333 574 0.7181 -0.0086 0.7181 0.8474
0.2955 27.4286 576 0.7268 -0.0059 0.7268 0.8525
0.2955 27.5238 578 0.7294 -0.0152 0.7294 0.8540
0.2955 27.6190 580 0.8301 0.1065 0.8301 0.9111
0.2955 27.7143 582 1.0397 0.0543 1.0397 1.0197
0.2955 27.8095 584 1.1238 -0.0133 1.1238 1.0601
0.2955 27.9048 586 1.0416 0.0543 1.0416 1.0206
0.2955 28.0 588 0.8804 0.0912 0.8804 0.9383
0.2955 28.0952 590 0.7520 -0.0152 0.7520 0.8672
0.2955 28.1905 592 0.7738 0.0027 0.7738 0.8796
0.2955 28.2857 594 0.7732 0.0465 0.7732 0.8793
0.2955 28.3810 596 0.7574 0.0791 0.7574 0.8703
0.2955 28.4762 598 0.7592 0.0068 0.7592 0.8713
0.2955 28.5714 600 0.7450 0.0690 0.7450 0.8631
0.2955 28.6667 602 0.7357 0.0791 0.7357 0.8577
0.2955 28.7619 604 0.7312 0.1189 0.7312 0.8551
0.2955 28.8571 606 0.7348 0.1001 0.7348 0.8572
0.2955 28.9524 608 0.7480 0.0041 0.7480 0.8649
0.2955 29.0476 610 0.7508 0.0041 0.7508 0.8665
0.2955 29.1429 612 0.7291 0.1047 0.7291 0.8539
0.2955 29.2381 614 0.7211 0.0214 0.7211 0.8492
0.2955 29.3333 616 0.7260 0.0214 0.7260 0.8520
0.2955 29.4286 618 0.7317 0.0214 0.7317 0.8554
0.2955 29.5238 620 0.7522 0.1001 0.7522 0.8673
0.2955 29.6190 622 0.7534 0.1001 0.7534 0.8680
0.2955 29.7143 624 0.7689 0.1518 0.7689 0.8769
0.2955 29.8095 626 0.8011 0.1004 0.8011 0.8950
0.2955 29.9048 628 0.8473 -0.0376 0.8473 0.9205
0.2955 30.0 630 0.8714 0.0333 0.8714 0.9335
0.2955 30.0952 632 0.8141 0.0016 0.8141 0.9023
0.2955 30.1905 634 0.7604 0.1096 0.7604 0.8720
0.2955 30.2857 636 0.7396 0.1298 0.7396 0.8600
0.2955 30.3810 638 0.7249 0.0318 0.7249 0.8514
0.2955 30.4762 640 0.7324 0.1096 0.7324 0.8558
0.2955 30.5714 642 0.7701 -0.0008 0.7701 0.8775
0.2955 30.6667 644 0.8278 0.0333 0.8278 0.9098
0.2955 30.7619 646 0.8300 0.0711 0.8300 0.9111
0.2955 30.8571 648 0.7618 0.0041 0.7618 0.8728
0.2955 30.9524 650 0.7004 0.1096 0.7004 0.8369
0.2955 31.0476 652 0.7245 -0.0967 0.7245 0.8512
0.2955 31.1429 654 0.7572 0.0141 0.7572 0.8701
0.2955 31.2381 656 0.7321 0.0031 0.7321 0.8556
0.2955 31.3333 658 0.6866 0.0318 0.6866 0.8286
0.2955 31.4286 660 0.7113 0.0562 0.7113 0.8434

Framework versions

  • Transformers 4.44.2
  • Pytorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1