ArabicNewSplits6_FineTuningAraBERT_run2_AugV5_k13_task1_organization

This model is a fine-tuned version of aubmindlab/bert-base-arabertv02 (the training dataset is not documented in this card). It achieves the following results on the evaluation set:

  • Loss: 0.7846
  • QWK: 0.6676
  • MSE: 0.7846
  • RMSE: 0.8857

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
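With `lr_scheduler_type: linear` and no warmup listed (so none is assumed), the learning rate decays linearly from 2e-05 to 0 over the run's 650 optimizer steps (10 epochs × 65 steps per epoch, as seen in the results table). A small sketch of that schedule:

```python
BASE_LR = 2e-5
TOTAL_STEPS = 650  # 10 epochs x 65 steps/epoch, per the results table

def linear_lr(step, base_lr=BASE_LR, total_steps=TOTAL_STEPS, warmup_steps=0):
    """Linear decay to zero after an optional warmup (assumed 0 here)."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)

# LR at the start, midpoint, and end of training
print(linear_lr(0), linear_lr(325), linear_lr(650))
```

At the midpoint (step 325) the rate is exactly half the base value, and it reaches zero at the final step.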

Training results

Validation metrics were computed every 2 steps (65 steps per epoch, 650 in total). Training loss is only recorded from its first logging step (500), so earlier rows show "No log".

Training Loss Epoch Step Validation Loss QWK MSE RMSE
No log 0.0308 2 5.1001 -0.0428 5.1001 2.2583
No log 0.0615 4 3.3131 0.0329 3.3131 1.8202
No log 0.0923 6 2.6624 -0.1322 2.6624 1.6317
No log 0.1231 8 1.7138 0.0885 1.7138 1.3091
No log 0.1538 10 1.2779 0.2393 1.2779 1.1304
No log 0.1846 12 1.1523 0.2350 1.1523 1.0735
No log 0.2154 14 1.4043 0.0794 1.4043 1.1850
No log 0.2462 16 1.5480 0.0075 1.5480 1.2442
No log 0.2769 18 1.9403 -0.0766 1.9403 1.3929
No log 0.3077 20 1.5427 0.0322 1.5427 1.2420
No log 0.3385 22 1.3791 0.1262 1.3791 1.1743
No log 0.3692 24 1.3734 0.1118 1.3734 1.1719
No log 0.4 26 1.3089 0.2288 1.3089 1.1441
No log 0.4308 28 1.2496 0.2065 1.2496 1.1179
No log 0.4615 30 1.2918 0.1812 1.2918 1.1366
No log 0.4923 32 2.2051 0.1754 2.2051 1.4850
No log 0.5231 34 2.9271 0.0953 2.9271 1.7109
No log 0.5538 36 2.5721 0.1388 2.5721 1.6038
No log 0.5846 38 1.7583 0.2269 1.7583 1.3260
No log 0.6154 40 1.2992 0.2051 1.2992 1.1398
No log 0.6462 42 1.1522 0.4103 1.1522 1.0734
No log 0.6769 44 1.0573 0.4092 1.0573 1.0283
No log 0.7077 46 1.0191 0.4093 1.0191 1.0095
No log 0.7385 48 1.0169 0.4189 1.0169 1.0084
No log 0.7692 50 1.1290 0.4062 1.1290 1.0625
No log 0.8 52 1.4256 0.3459 1.4256 1.1940
No log 0.8308 54 1.8291 0.2526 1.8291 1.3525
No log 0.8615 56 1.6570 0.2722 1.6570 1.2873
No log 0.8923 58 1.2104 0.3779 1.2104 1.1002
No log 0.9231 60 0.9602 0.5026 0.9602 0.9799
No log 0.9538 62 0.8986 0.5077 0.8986 0.9479
No log 0.9846 64 0.8786 0.5273 0.8786 0.9373
No log 1.0154 66 0.8914 0.5736 0.8914 0.9441
No log 1.0462 68 1.0429 0.5065 1.0429 1.0212
No log 1.0769 70 1.0840 0.4630 1.0840 1.0411
No log 1.1077 72 1.1132 0.4141 1.1132 1.0551
No log 1.1385 74 0.9231 0.5538 0.9231 0.9608
No log 1.1692 76 0.8359 0.5618 0.8359 0.9143
No log 1.2 78 0.8609 0.5723 0.8609 0.9278
No log 1.2308 80 0.9123 0.5468 0.9123 0.9552
No log 1.2615 82 1.1956 0.4370 1.1956 1.0934
No log 1.2923 84 1.6871 0.3315 1.6871 1.2989
No log 1.3231 86 2.1200 0.2472 2.1200 1.4560
No log 1.3538 88 1.8267 0.2935 1.8267 1.3516
No log 1.3846 90 1.1488 0.4181 1.1488 1.0718
No log 1.4154 92 0.9033 0.5536 0.9033 0.9504
No log 1.4462 94 0.8759 0.5420 0.8759 0.9359
No log 1.4769 96 0.9046 0.4715 0.9046 0.9511
No log 1.5077 98 0.8971 0.4846 0.8971 0.9472
No log 1.5385 100 0.8928 0.4800 0.8928 0.9449
No log 1.5692 102 0.8309 0.5128 0.8309 0.9115
No log 1.6 104 0.7794 0.5828 0.7794 0.8828
No log 1.6308 106 0.7595 0.5966 0.7595 0.8715
No log 1.6615 108 0.7513 0.5906 0.7513 0.8668
No log 1.6923 110 0.7825 0.5617 0.7825 0.8846
No log 1.7231 112 1.0201 0.5101 1.0201 1.0100
No log 1.7538 114 1.5725 0.3875 1.5725 1.2540
No log 1.7846 116 1.6331 0.3650 1.6331 1.2779
No log 1.8154 118 1.2147 0.4546 1.2147 1.1021
No log 1.8462 120 0.8881 0.5707 0.8881 0.9424
No log 1.8769 122 0.9038 0.5707 0.9038 0.9507
No log 1.9077 124 1.1969 0.4458 1.1969 1.0940
No log 1.9385 126 1.5901 0.3542 1.5901 1.2610
No log 1.9692 128 1.4105 0.3969 1.4105 1.1876
No log 2.0 130 0.9980 0.4879 0.9980 0.9990
No log 2.0308 132 0.7324 0.5845 0.7324 0.8558
No log 2.0615 134 0.7990 0.5892 0.7990 0.8938
No log 2.0923 136 0.8699 0.5964 0.8699 0.9327
No log 2.1231 138 0.8301 0.6143 0.8301 0.9111
No log 2.1538 140 0.6890 0.6185 0.6890 0.8301
No log 2.1846 142 0.6667 0.5909 0.6667 0.8165
No log 2.2154 144 0.6827 0.6070 0.6827 0.8263
No log 2.2462 146 0.7665 0.6451 0.7665 0.8755
No log 2.2769 148 0.7953 0.6271 0.7953 0.8918
No log 2.3077 150 0.7428 0.6062 0.7428 0.8618
No log 2.3385 152 0.7882 0.6119 0.7882 0.8878
No log 2.3692 154 0.8351 0.5941 0.8351 0.9138
No log 2.4 156 0.7772 0.6086 0.7772 0.8816
No log 2.4308 158 0.7858 0.6387 0.7858 0.8864
No log 2.4615 160 0.8167 0.6695 0.8167 0.9037
No log 2.4923 162 0.7761 0.6613 0.7761 0.8810
No log 2.5231 164 0.7434 0.6159 0.7434 0.8622
No log 2.5538 166 0.7302 0.6387 0.7302 0.8545
No log 2.5846 168 0.7242 0.6234 0.7242 0.8510
No log 2.6154 170 0.7186 0.6348 0.7186 0.8477
No log 2.6462 172 0.7206 0.6355 0.7206 0.8489
No log 2.6769 174 0.7209 0.6240 0.7209 0.8490
No log 2.7077 176 0.7175 0.6382 0.7175 0.8471
No log 2.7385 178 0.7262 0.6189 0.7262 0.8522
No log 2.7692 180 0.7299 0.5831 0.7299 0.8543
No log 2.8 182 0.7368 0.6190 0.7368 0.8584
No log 2.8308 184 0.7420 0.6258 0.7420 0.8614
No log 2.8615 186 0.7424 0.6196 0.7424 0.8616
No log 2.8923 188 0.7313 0.6608 0.7313 0.8552
No log 2.9231 190 0.7202 0.6298 0.7202 0.8486
No log 2.9538 192 0.7857 0.6757 0.7857 0.8864
No log 2.9846 194 0.8106 0.6818 0.8106 0.9003
No log 3.0154 196 0.8065 0.6818 0.8065 0.8981
No log 3.0462 198 0.7571 0.6788 0.7571 0.8701
No log 3.0769 200 0.6991 0.6671 0.6991 0.8361
No log 3.1077 202 0.6807 0.6627 0.6807 0.8250
No log 3.1385 204 0.6690 0.6582 0.6690 0.8179
No log 3.1692 206 0.6566 0.6402 0.6566 0.8103
No log 3.2 208 0.6506 0.6351 0.6506 0.8066
No log 3.2308 210 0.6542 0.6733 0.6542 0.8088
No log 3.2615 212 0.6848 0.6782 0.6848 0.8275
No log 3.2923 214 0.6555 0.6734 0.6555 0.8096
No log 3.3231 216 0.6501 0.6453 0.6501 0.8063
No log 3.3538 218 0.6524 0.6610 0.6524 0.8077
No log 3.3846 220 0.6604 0.6692 0.6604 0.8126
No log 3.4154 222 0.6973 0.6666 0.6973 0.8350
No log 3.4462 224 0.7425 0.7093 0.7425 0.8617
No log 3.4769 226 0.7587 0.7049 0.7587 0.8710
No log 3.5077 228 0.7347 0.6905 0.7347 0.8572
No log 3.5385 230 0.7039 0.6769 0.7039 0.8390
No log 3.5692 232 0.7169 0.6789 0.7169 0.8467
No log 3.6 234 0.7640 0.6967 0.7640 0.8741
No log 3.6308 236 0.7702 0.6948 0.7702 0.8776
No log 3.6615 238 0.7049 0.6870 0.7049 0.8396
No log 3.6923 240 0.6650 0.7066 0.6650 0.8155
No log 3.7231 242 0.6800 0.6997 0.6800 0.8246
No log 3.7538 244 0.7134 0.6894 0.7134 0.8447
No log 3.7846 246 0.7321 0.6759 0.7321 0.8556
No log 3.8154 248 0.7054 0.6915 0.7054 0.8399
No log 3.8462 250 0.6974 0.6864 0.6974 0.8351
No log 3.8769 252 0.7064 0.6397 0.7064 0.8405
No log 3.9077 254 0.7167 0.6136 0.7167 0.8466
No log 3.9385 256 0.7249 0.6276 0.7249 0.8514
No log 3.9692 258 0.7388 0.6655 0.7388 0.8595
No log 4.0 260 0.7349 0.6212 0.7349 0.8572
No log 4.0308 262 0.7701 0.5739 0.7701 0.8776
No log 4.0615 264 0.7975 0.5868 0.7975 0.8930
No log 4.0923 266 0.7553 0.5823 0.7553 0.8691
No log 4.1231 268 0.7406 0.6097 0.7406 0.8606
No log 4.1538 270 0.7391 0.6107 0.7391 0.8597
No log 4.1846 272 0.7553 0.6055 0.7553 0.8691
No log 4.2154 274 0.7466 0.6163 0.7466 0.8641
No log 4.2462 276 0.7175 0.6576 0.7175 0.8471
No log 4.2769 278 0.7589 0.6734 0.7589 0.8712
No log 4.3077 280 0.8300 0.6714 0.8300 0.9110
No log 4.3385 282 0.8726 0.6606 0.8726 0.9341
No log 4.3692 284 0.8471 0.6655 0.8471 0.9204
No log 4.4 286 0.7528 0.6817 0.7528 0.8677
No log 4.4308 288 0.6871 0.6800 0.6871 0.8289
No log 4.4615 290 0.7300 0.6883 0.7300 0.8544
No log 4.4923 292 0.7269 0.6504 0.7269 0.8526
No log 4.5231 294 0.7332 0.6824 0.7332 0.8563
No log 4.5538 296 0.7790 0.6543 0.7790 0.8826
No log 4.5846 298 0.7904 0.6593 0.7904 0.8890
No log 4.6154 300 0.7718 0.6686 0.7718 0.8785
No log 4.6462 302 0.7510 0.6422 0.7510 0.8666
No log 4.6769 304 0.7454 0.6572 0.7454 0.8634
No log 4.7077 306 0.7525 0.6786 0.7525 0.8674
No log 4.7385 308 0.8187 0.6592 0.8187 0.9048
No log 4.7692 310 0.8440 0.6459 0.8440 0.9187
No log 4.8 312 0.7996 0.6817 0.7996 0.8942
No log 4.8308 314 0.7452 0.6590 0.7452 0.8632
No log 4.8615 316 0.7282 0.6485 0.7282 0.8533
No log 4.8923 318 0.7342 0.7033 0.7342 0.8569
No log 4.9231 320 0.7820 0.6478 0.7820 0.8843
No log 4.9538 322 0.8209 0.6457 0.8209 0.9060
No log 4.9846 324 0.8074 0.6517 0.8074 0.8985
No log 5.0154 326 0.7790 0.6536 0.7790 0.8826
No log 5.0462 328 0.7470 0.6820 0.7470 0.8643
No log 5.0769 330 0.7036 0.6723 0.7036 0.8388
No log 5.1077 332 0.6841 0.6447 0.6841 0.8271
No log 5.1385 334 0.6815 0.6366 0.6815 0.8255
No log 5.1692 336 0.6912 0.6633 0.6912 0.8314
No log 5.2 338 0.6932 0.6647 0.6932 0.8326
No log 5.2308 340 0.7025 0.6679 0.7025 0.8382
No log 5.2615 342 0.7290 0.6905 0.7290 0.8538
No log 5.2923 344 0.7428 0.6905 0.7428 0.8619
No log 5.3231 346 0.7547 0.6857 0.7547 0.8688
No log 5.3538 348 0.7780 0.6585 0.7780 0.8821
No log 5.3846 350 0.7586 0.6905 0.7586 0.8710
No log 5.4154 352 0.7099 0.6254 0.7099 0.8426
No log 5.4462 354 0.7014 0.6274 0.7014 0.8375
No log 5.4769 356 0.7103 0.6212 0.7103 0.8428
No log 5.5077 358 0.7453 0.6844 0.7453 0.8633
No log 5.5385 360 0.7483 0.6864 0.7483 0.8651
No log 5.5692 362 0.7677 0.7034 0.7677 0.8762
No log 5.6 364 0.7585 0.6949 0.7585 0.8709
No log 5.6308 366 0.7138 0.6796 0.7138 0.8449
No log 5.6615 368 0.7246 0.6627 0.7246 0.8513
No log 5.6923 370 0.7823 0.6501 0.7823 0.8845
No log 5.7231 372 0.8780 0.6626 0.8780 0.9370
No log 5.7538 374 0.9950 0.6294 0.9950 0.9975
No log 5.7846 376 0.9800 0.6496 0.9800 0.9899
No log 5.8154 378 0.8804 0.6492 0.8804 0.9383
No log 5.8462 380 0.7444 0.6814 0.7444 0.8628
No log 5.8769 382 0.6989 0.6684 0.6989 0.8360
No log 5.9077 384 0.7276 0.6886 0.7276 0.8530
No log 5.9385 386 0.8125 0.6382 0.8125 0.9014
No log 5.9692 388 0.8550 0.6502 0.8550 0.9247
No log 6.0 390 0.8880 0.6394 0.8880 0.9423
No log 6.0308 392 0.8874 0.6394 0.8874 0.9420
No log 6.0615 394 0.8436 0.6588 0.8436 0.9185
No log 6.0923 396 0.8114 0.6560 0.8114 0.9008
No log 6.1231 398 0.8616 0.6479 0.8616 0.9282
No log 6.1538 400 0.9295 0.6136 0.9295 0.9641
No log 6.1846 402 0.9988 0.6137 0.9988 0.9994
No log 6.2154 404 0.9776 0.6180 0.9776 0.9887
No log 6.2462 406 0.9091 0.6274 0.9091 0.9535
No log 6.2769 408 0.8259 0.6569 0.8259 0.9088
No log 6.3077 410 0.7949 0.6529 0.7949 0.8916
No log 6.3385 412 0.8211 0.6622 0.8211 0.9061
No log 6.3692 414 0.8610 0.6557 0.8610 0.9279
No log 6.4 416 0.8854 0.6323 0.8854 0.9409
No log 6.4308 418 0.9516 0.6231 0.9516 0.9755
No log 6.4615 420 1.0341 0.5976 1.0341 1.0169
No log 6.4923 422 1.0653 0.5510 1.0653 1.0322
No log 6.5231 424 0.9960 0.6111 0.9960 0.9980
No log 6.5538 426 0.9010 0.6317 0.9010 0.9492
No log 6.5846 428 0.7945 0.6546 0.7945 0.8914
No log 6.6154 430 0.7517 0.6728 0.7517 0.8670
No log 6.6462 432 0.7775 0.6669 0.7775 0.8818
No log 6.6769 434 0.8078 0.6509 0.8078 0.8988
No log 6.7077 436 0.7785 0.6604 0.7785 0.8823
No log 6.7385 438 0.7504 0.6943 0.7504 0.8662
No log 6.7692 440 0.7529 0.6956 0.7529 0.8677
No log 6.8 442 0.7807 0.6621 0.7807 0.8836
No log 6.8308 444 0.7865 0.6604 0.7865 0.8868
No log 6.8615 446 0.7637 0.6741 0.7637 0.8739
No log 6.8923 448 0.7307 0.6919 0.7307 0.8548
No log 6.9231 450 0.7236 0.7114 0.7236 0.8506
No log 6.9538 452 0.7381 0.7114 0.7381 0.8591
No log 6.9846 454 0.7446 0.6960 0.7446 0.8629
No log 7.0154 456 0.7436 0.6831 0.7436 0.8623
No log 7.0462 458 0.7731 0.6898 0.7731 0.8792
No log 7.0769 460 0.8245 0.6585 0.8245 0.9080
No log 7.1077 462 0.8281 0.6385 0.8281 0.9100
No log 7.1385 464 0.7878 0.6685 0.7878 0.8876
No log 7.1692 466 0.7359 0.6630 0.7359 0.8579
No log 7.2 468 0.7127 0.6584 0.7127 0.8442
No log 7.2308 470 0.7107 0.6471 0.7107 0.8430
No log 7.2615 472 0.7109 0.6534 0.7109 0.8432
No log 7.2923 474 0.7124 0.6692 0.7124 0.8441
No log 7.3231 476 0.7195 0.6802 0.7195 0.8482
No log 7.3538 478 0.7413 0.6774 0.7413 0.8610
No log 7.3846 480 0.7499 0.6814 0.7499 0.8660
No log 7.4154 482 0.7520 0.6752 0.7520 0.8672
No log 7.4462 484 0.7467 0.6771 0.7467 0.8641
No log 7.4769 486 0.7351 0.6817 0.7351 0.8574
No log 7.5077 488 0.7230 0.7229 0.7230 0.8503
No log 7.5385 490 0.7180 0.7179 0.7180 0.8473
No log 7.5692 492 0.7445 0.6806 0.7445 0.8628
No log 7.6 494 0.7569 0.6715 0.7569 0.8700
No log 7.6308 496 0.7526 0.6694 0.7526 0.8675
No log 7.6615 498 0.7407 0.6694 0.7407 0.8606
0.4621 7.6923 500 0.7412 0.6694 0.7412 0.8609
0.4621 7.7231 502 0.7609 0.6676 0.7609 0.8723
0.4621 7.7538 504 0.7918 0.6561 0.7918 0.8898
0.4621 7.7846 506 0.7838 0.6578 0.7838 0.8853
0.4621 7.8154 508 0.7547 0.6721 0.7547 0.8687
0.4621 7.8462 510 0.7352 0.6757 0.7352 0.8574
0.4621 7.8769 512 0.7432 0.6757 0.7432 0.8621
0.4621 7.9077 514 0.7494 0.6802 0.7494 0.8657
0.4621 7.9385 516 0.7484 0.6802 0.7484 0.8651
0.4621 7.9692 518 0.7412 0.6836 0.7412 0.8609
0.4621 8.0 520 0.7578 0.6836 0.7578 0.8705
0.4621 8.0308 522 0.7804 0.6684 0.7804 0.8834
0.4621 8.0615 524 0.8038 0.6652 0.8038 0.8965
0.4621 8.0923 526 0.8000 0.6695 0.8000 0.8944
0.4621 8.1231 528 0.8063 0.6652 0.8063 0.8979
0.4621 8.1538 530 0.8384 0.6681 0.8384 0.9156
0.4621 8.1846 532 0.8560 0.6537 0.8560 0.9252
0.4621 8.2154 534 0.8416 0.6690 0.8416 0.9174
0.4621 8.2462 536 0.8229 0.6681 0.8229 0.9071
0.4621 8.2769 538 0.8303 0.6681 0.8303 0.9112
0.4621 8.3077 540 0.8328 0.6690 0.8328 0.9126
0.4621 8.3385 542 0.8409 0.6690 0.8409 0.9170
0.4621 8.3692 544 0.8243 0.6690 0.8243 0.9079
0.4621 8.4 546 0.8202 0.6635 0.8202 0.9056
0.4621 8.4308 548 0.8319 0.6681 0.8319 0.9121
0.4621 8.4615 550 0.8210 0.6698 0.8210 0.9061
0.4621 8.4923 552 0.8283 0.6681 0.8283 0.9101
0.4621 8.5231 554 0.8372 0.6681 0.8372 0.9150
0.4621 8.5538 556 0.8388 0.6681 0.8388 0.9159
0.4621 8.5846 558 0.8362 0.6681 0.8362 0.9145
0.4621 8.6154 560 0.8219 0.6698 0.8219 0.9066
0.4621 8.6462 562 0.7962 0.6758 0.7962 0.8923
0.4621 8.6769 564 0.7733 0.6666 0.7733 0.8794
0.4621 8.7077 566 0.7499 0.6784 0.7499 0.8660
0.4621 8.7385 568 0.7281 0.6809 0.7281 0.8533
0.4621 8.7692 570 0.7081 0.7040 0.7081 0.8415
0.4621 8.8 572 0.6988 0.7247 0.6988 0.8360
0.4621 8.8308 574 0.7008 0.7226 0.7008 0.8371
0.4621 8.8615 576 0.7145 0.6854 0.7145 0.8453
0.4621 8.8923 578 0.7420 0.6802 0.7420 0.8614
0.4621 8.9231 580 0.7769 0.6767 0.7769 0.8814
0.4621 8.9538 582 0.8157 0.6707 0.8157 0.9031
0.4621 8.9846 584 0.8525 0.6683 0.8525 0.9233
0.4621 9.0154 586 0.8745 0.6719 0.8745 0.9352
0.4621 9.0462 588 0.8793 0.6565 0.8793 0.9377
0.4621 9.0769 590 0.8647 0.6683 0.8647 0.9299
0.4621 9.1077 592 0.8384 0.6683 0.8384 0.9156
0.4621 9.1385 594 0.8211 0.6674 0.8211 0.9061
0.4621 9.1692 596 0.8184 0.6707 0.8184 0.9046
0.4621 9.2 598 0.8111 0.6662 0.8111 0.9006
0.4621 9.2308 600 0.8035 0.6679 0.8035 0.8964
0.4621 9.2615 602 0.7929 0.6767 0.7929 0.8905
0.4621 9.2923 604 0.7827 0.6767 0.7827 0.8847
0.4621 9.3231 606 0.7695 0.6866 0.7695 0.8772
0.4621 9.3538 608 0.7620 0.6855 0.7620 0.8730
0.4621 9.3846 610 0.7629 0.6855 0.7629 0.8734
0.4621 9.4154 612 0.7611 0.6855 0.7611 0.8724
0.4621 9.4462 614 0.7554 0.6855 0.7554 0.8692
0.4621 9.4769 616 0.7529 0.6766 0.7529 0.8677
0.4621 9.5077 618 0.7567 0.6766 0.7567 0.8699
0.4621 9.5385 620 0.7654 0.6766 0.7654 0.8749
0.4621 9.5692 622 0.7723 0.6666 0.7723 0.8788
0.4621 9.6 624 0.7803 0.6676 0.7803 0.8833
0.4621 9.6308 626 0.7843 0.6676 0.7843 0.8856
0.4621 9.6615 628 0.7891 0.6632 0.7891 0.8883
0.4621 9.6923 630 0.7925 0.6632 0.7925 0.8902
0.4621 9.7231 632 0.7986 0.6632 0.7986 0.8936
0.4621 9.7538 634 0.8016 0.6632 0.8016 0.8953
0.4621 9.7846 636 0.8002 0.6632 0.8002 0.8946
0.4621 9.8154 638 0.7986 0.6632 0.7986 0.8937
0.4621 9.8462 640 0.7969 0.6632 0.7969 0.8927
0.4621 9.8769 642 0.7938 0.6632 0.7938 0.8909
0.4621 9.9077 644 0.7905 0.6632 0.7905 0.8891
0.4621 9.9385 646 0.7874 0.6632 0.7874 0.8873
0.4621 9.9692 648 0.7852 0.6676 0.7852 0.8861
0.4621 10.0 650 0.7846 0.6676 0.7846 0.8857
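Note that the final step-650 checkpoint (QWK 0.6676) is not the best one in the table: validation QWK peaks at 0.7247 at step 572. Selecting the best row from an evaluation log can be sketched as follows (the dict field names are an assumption modeled on the trainer's log-history convention, not taken from this run):

```python
# Hypothetical excerpt of the evaluation log; a real run would read the
# trainer's full history (e.g. from trainer_state.json).
log_history = [
    {"step": 568, "eval_loss": 0.7281, "eval_qwk": 0.6809},
    {"step": 572, "eval_loss": 0.6988, "eval_qwk": 0.7247},
    {"step": 650, "eval_loss": 0.7846, "eval_qwk": 0.6676},
]

# Pick the checkpoint with the highest validation QWK
best = max(log_history, key=lambda row: row["eval_qwk"])
print(best["step"], best["eval_qwk"])
```

For this run, keeping the best-QWK checkpoint rather than the last epoch would be worth considering, since the final-epoch metrics are noticeably below the peak.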

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0+cu118
  • Datasets 2.21.0
  • Tokenizers 0.19.1
Full model ID: MayBashendy/ArabicNewSplits6_FineTuningAraBERT_run2_AugV5_k13_task1_organization (≈0.1B parameters, F32 safetensors).