---
base_model: anderloh/Hugginhface-master-wav2vec-pretreined-5-class-train-test
tags:
  - generated_from_trainer
metrics:
  - accuracy
model-index:
  - name: PushToHubModel
    results: []
---

# PushToHubModel

This model is a fine-tuned version of [anderloh/Hugginhface-master-wav2vec-pretreined-5-class-train-test](https://huggingface.co/anderloh/Hugginhface-master-wav2vec-pretreined-5-class-train-test) on an unknown dataset. It achieves the following results on the evaluation set (a brief usage sketch follows the list):

- Loss: 0.8947
- Accuracy: 0.6748
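
Since the base model is a wav2vec2 audio classifier, the checkpoint can presumably be loaded with the standard `transformers` audio-classification pipeline. A minimal sketch, assuming a hypothetical repo id `anderloh/PushToHubModel` and that the feature extractor was pushed alongside the weights:

```python
from transformers import pipeline

# Hypothetical repo id -- substitute the actual Hub path of this checkpoint.
classifier = pipeline(
    "audio-classification",
    model="anderloh/PushToHubModel",
)

# Accepts a local audio file path (decoded via ffmpeg) or a raw waveform array.
predictions = classifier("sample.wav")
print(predictions)  # list of {"label": ..., "score": ...} dicts, best first
```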

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (mirrored in the `TrainingArguments` sketch after this list):

- learning_rate: 3e-05
- train_batch_size: 128
- eval_batch_size: 128
- seed: 0
- gradient_accumulation_steps: 4
- total_train_batch_size: 512
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 250.0
- mixed_precision_training: Native AMP
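
These values map directly onto the `TrainingArguments` of the `transformers` Trainer that generated this card. A minimal sketch of the corresponding configuration, assuming the standard Trainer setup (the output directory is a placeholder; the Adam betas and epsilon listed above are the optimizer defaults):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="PushToHubModel",    # placeholder
    learning_rate=3e-05,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    seed=0,
    gradient_accumulation_steps=4,  # effective train batch: 128 * 4 = 512
    lr_scheduler_type="linear",
    warmup_ratio=0.1,               # lr_scheduler_warmup_ratio
    num_train_epochs=250.0,
    fp16=True,                      # "Native AMP" mixed precision
)
```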

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 0.92 | 3 | 1.5989 | 0.3427 |
| No log | 1.85 | 6 | 1.5987 | 0.3427 |
| No log | 2.77 | 9 | 1.5984 | 0.3427 |
| No log | 4.0 | 13 | 1.5977 | 0.3427 |
| No log | 4.92 | 16 | 1.5970 | 0.3427 |
| No log | 5.85 | 19 | 1.5963 | 0.3392 |
| No log | 6.77 | 22 | 1.5953 | 0.3357 |
| No log | 8.0 | 26 | 1.5938 | 0.3287 |
| No log | 8.92 | 29 | 1.5925 | 0.3217 |
| No log | 9.85 | 32 | 1.5911 | 0.3147 |
| No log | 10.77 | 35 | 1.5896 | 0.3112 |
| No log | 12.0 | 39 | 1.5873 | 0.3042 |
| No log | 12.92 | 42 | 1.5854 | 0.3007 |
| No log | 13.85 | 45 | 1.5834 | 0.2972 |
| No log | 14.77 | 48 | 1.5813 | 0.2832 |
| 1.5878 | 16.0 | 52 | 1.5784 | 0.2727 |
| 1.5878 | 16.92 | 55 | 1.5760 | 0.2727 |
| 1.5878 | 17.85 | 58 | 1.5736 | 0.2692 |
| 1.5878 | 18.77 | 61 | 1.5711 | 0.2517 |
| 1.5878 | 20.0 | 65 | 1.5675 | 0.2378 |
| 1.5878 | 20.92 | 68 | 1.5645 | 0.2343 |
| 1.5878 | 21.85 | 71 | 1.5613 | 0.2238 |
| 1.5878 | 22.77 | 74 | 1.5581 | 0.2273 |
| 1.5878 | 24.0 | 78 | 1.5537 | 0.2273 |
| 1.5878 | 24.92 | 81 | 1.5504 | 0.2273 |
| 1.5878 | 25.85 | 84 | 1.5473 | 0.2273 |
| 1.5878 | 26.77 | 87 | 1.5444 | 0.2273 |
| 1.5878 | 28.0 | 91 | 1.5405 | 0.2273 |
| 1.5878 | 28.92 | 94 | 1.5378 | 0.2273 |
| 1.5878 | 29.85 | 97 | 1.5353 | 0.2273 |
| 1.5185 | 30.77 | 100 | 1.5335 | 0.2273 |
| 1.5185 | 32.0 | 104 | 1.5322 | 0.2273 |
| 1.5185 | 32.92 | 107 | 1.5323 | 0.2273 |
| 1.5185 | 33.85 | 110 | 1.5341 | 0.2273 |
| 1.5185 | 34.77 | 113 | 1.5373 | 0.2273 |
| 1.5185 | 36.0 | 117 | 1.5437 | 0.2273 |
| 1.5185 | 36.92 | 120 | 1.5511 | 0.2273 |
| 1.5185 | 37.85 | 123 | 1.5605 | 0.2273 |
| 1.5185 | 38.77 | 126 | 1.5712 | 0.2273 |
| 1.5185 | 40.0 | 130 | 1.5858 | 0.2273 |
| 1.5185 | 40.92 | 133 | 1.5946 | 0.2273 |
| 1.5185 | 41.85 | 136 | 1.6022 | 0.2273 |
| 1.5185 | 42.77 | 139 | 1.6077 | 0.2273 |
| 1.5185 | 44.0 | 143 | 1.6055 | 0.2308 |
| 1.5185 | 44.92 | 146 | 1.5978 | 0.2238 |
| 1.5185 | 45.85 | 149 | 1.5862 | 0.2413 |
| 1.3621 | 46.77 | 152 | 1.5746 | 0.2692 |
| 1.3621 | 48.0 | 156 | 1.5576 | 0.2657 |
| 1.3621 | 48.92 | 159 | 1.5446 | 0.2867 |
| 1.3621 | 49.85 | 162 | 1.5347 | 0.3007 |
| 1.3621 | 50.77 | 165 | 1.5267 | 0.3182 |
| 1.3621 | 52.0 | 169 | 1.5192 | 0.3322 |
| 1.3621 | 52.92 | 172 | 1.5158 | 0.3357 |
| 1.3621 | 53.85 | 175 | 1.5128 | 0.3392 |
| 1.3621 | 54.77 | 178 | 1.5093 | 0.3427 |
| 1.3621 | 56.0 | 182 | 1.5042 | 0.3462 |
| 1.3621 | 56.92 | 185 | 1.4979 | 0.3531 |
| 1.3621 | 57.85 | 188 | 1.4927 | 0.3566 |
| 1.3621 | 58.77 | 191 | 1.4858 | 0.3601 |
| 1.3621 | 60.0 | 195 | 1.4790 | 0.3706 |
| 1.3621 | 60.92 | 198 | 1.4717 | 0.3741 |
| 1.2297 | 61.85 | 201 | 1.4674 | 0.3846 |
| 1.2297 | 62.77 | 204 | 1.4588 | 0.3881 |
| 1.2297 | 64.0 | 208 | 1.4482 | 0.4021 |
| 1.2297 | 64.92 | 211 | 1.4374 | 0.4161 |
| 1.2297 | 65.85 | 214 | 1.4255 | 0.4231 |
| 1.2297 | 66.77 | 217 | 1.4126 | 0.4336 |
| 1.2297 | 68.0 | 221 | 1.4000 | 0.4371 |
| 1.2297 | 68.92 | 224 | 1.3919 | 0.4371 |
| 1.2297 | 69.85 | 227 | 1.3865 | 0.4406 |
| 1.2297 | 70.77 | 230 | 1.3836 | 0.4441 |
| 1.2297 | 72.0 | 234 | 1.3742 | 0.4441 |
| 1.2297 | 72.92 | 237 | 1.3636 | 0.4476 |
| 1.2297 | 73.85 | 240 | 1.3518 | 0.4580 |
| 1.2297 | 74.77 | 243 | 1.3429 | 0.4685 |
| 1.2297 | 76.0 | 247 | 1.3334 | 0.4825 |
| 1.1141 | 76.92 | 250 | 1.3253 | 0.4860 |
| 1.1141 | 77.85 | 253 | 1.3172 | 0.4860 |
| 1.1141 | 78.77 | 256 | 1.3118 | 0.4825 |
| 1.1141 | 80.0 | 260 | 1.3054 | 0.4790 |
| 1.1141 | 80.92 | 263 | 1.2986 | 0.4790 |
| 1.1141 | 81.85 | 266 | 1.2907 | 0.4790 |
| 1.1141 | 82.77 | 269 | 1.2791 | 0.4860 |
| 1.1141 | 84.0 | 273 | 1.2688 | 0.4860 |
| 1.1141 | 84.92 | 276 | 1.2623 | 0.4895 |
| 1.1141 | 85.85 | 279 | 1.2557 | 0.4930 |
| 1.1141 | 86.77 | 282 | 1.2547 | 0.5 |
| 1.1141 | 88.0 | 286 | 1.2539 | 0.5070 |
| 1.1141 | 88.92 | 289 | 1.2504 | 0.5070 |
| 1.1141 | 89.85 | 292 | 1.2435 | 0.5070 |
| 1.1141 | 90.77 | 295 | 1.2374 | 0.5070 |
| 1.1141 | 92.0 | 299 | 1.2278 | 0.5140 |
| 1.0055 | 92.92 | 302 | 1.2204 | 0.5210 |
| 1.0055 | 93.85 | 305 | 1.2155 | 0.5210 |
| 1.0055 | 94.77 | 308 | 1.2131 | 0.5210 |
| 1.0055 | 96.0 | 312 | 1.2082 | 0.5280 |
| 1.0055 | 96.92 | 315 | 1.2022 | 0.5350 |
| 1.0055 | 97.85 | 318 | 1.1926 | 0.5350 |
| 1.0055 | 98.77 | 321 | 1.1854 | 0.5420 |
| 1.0055 | 100.0 | 325 | 1.1779 | 0.5455 |
| 1.0055 | 100.92 | 328 | 1.1747 | 0.5490 |
| 1.0055 | 101.85 | 331 | 1.1715 | 0.5490 |
| 1.0055 | 102.77 | 334 | 1.1681 | 0.5490 |
| 1.0055 | 104.0 | 338 | 1.1552 | 0.5524 |
| 1.0055 | 104.92 | 341 | 1.1457 | 0.5594 |
| 1.0055 | 105.85 | 344 | 1.1385 | 0.5629 |
| 1.0055 | 106.77 | 347 | 1.1312 | 0.5699 |
| 0.9214 | 108.0 | 351 | 1.1231 | 0.5769 |
| 0.9214 | 108.92 | 354 | 1.1204 | 0.5804 |
| 0.9214 | 109.85 | 357 | 1.1177 | 0.5769 |
| 0.9214 | 110.77 | 360 | 1.1143 | 0.5804 |
| 0.9214 | 112.0 | 364 | 1.1112 | 0.5769 |
| 0.9214 | 112.92 | 367 | 1.1073 | 0.5944 |
| 0.9214 | 113.85 | 370 | 1.1025 | 0.5944 |
| 0.9214 | 114.77 | 373 | 1.0935 | 0.6049 |
| 0.9214 | 116.0 | 377 | 1.0794 | 0.6049 |
| 0.9214 | 116.92 | 380 | 1.0689 | 0.6084 |
| 0.9214 | 117.85 | 383 | 1.0585 | 0.6189 |
| 0.9214 | 118.77 | 386 | 1.0515 | 0.6259 |
| 0.9214 | 120.0 | 390 | 1.0456 | 0.6189 |
| 0.9214 | 120.92 | 393 | 1.0421 | 0.6189 |
| 0.9214 | 121.85 | 396 | 1.0393 | 0.6189 |
| 0.9214 | 122.77 | 399 | 1.0351 | 0.6189 |
| 0.8358 | 124.0 | 403 | 1.0309 | 0.6189 |
| 0.8358 | 124.92 | 406 | 1.0282 | 0.6189 |
| 0.8358 | 125.85 | 409 | 1.0239 | 0.6189 |
| 0.8358 | 126.77 | 412 | 1.0165 | 0.6259 |
| 0.8358 | 128.0 | 416 | 1.0052 | 0.6294 |
| 0.8358 | 128.92 | 419 | 0.9978 | 0.6329 |
| 0.8358 | 129.85 | 422 | 0.9944 | 0.6399 |
| 0.8358 | 130.77 | 425 | 0.9936 | 0.6399 |
| 0.8358 | 132.0 | 429 | 0.9909 | 0.6399 |
| 0.8358 | 132.92 | 432 | 0.9893 | 0.6434 |
| 0.8358 | 133.85 | 435 | 0.9849 | 0.6469 |
| 0.8358 | 134.77 | 438 | 0.9814 | 0.6469 |
| 0.8358 | 136.0 | 442 | 0.9776 | 0.6434 |
| 0.8358 | 136.92 | 445 | 0.9732 | 0.6503 |
| 0.8358 | 137.85 | 448 | 0.9680 | 0.6503 |
| 0.773 | 138.77 | 451 | 0.9665 | 0.6503 |
| 0.773 | 140.0 | 455 | 0.9641 | 0.6503 |
| 0.773 | 140.92 | 458 | 0.9629 | 0.6538 |
| 0.773 | 141.85 | 461 | 0.9600 | 0.6538 |
| 0.773 | 142.77 | 464 | 0.9584 | 0.6503 |
| 0.773 | 144.0 | 468 | 0.9512 | 0.6538 |
| 0.773 | 144.92 | 471 | 0.9475 | 0.6538 |
| 0.773 | 145.85 | 474 | 0.9492 | 0.6538 |
| 0.773 | 146.77 | 477 | 0.9511 | 0.6573 |
| 0.773 | 148.0 | 481 | 0.9548 | 0.6608 |
| 0.773 | 148.92 | 484 | 0.9539 | 0.6573 |
| 0.773 | 149.85 | 487 | 0.9493 | 0.6678 |
| 0.773 | 150.77 | 490 | 0.9428 | 0.6678 |
| 0.773 | 152.0 | 494 | 0.9381 | 0.6643 |
| 0.773 | 152.92 | 497 | 0.9356 | 0.6643 |
| 0.7252 | 153.85 | 500 | 0.9317 | 0.6643 |
| 0.7252 | 154.77 | 503 | 0.9293 | 0.6643 |
| 0.7252 | 156.0 | 507 | 0.9327 | 0.6678 |
| 0.7252 | 156.92 | 510 | 0.9338 | 0.6678 |
| 0.7252 | 157.85 | 513 | 0.9341 | 0.6643 |
| 0.7252 | 158.77 | 516 | 0.9338 | 0.6643 |
| 0.7252 | 160.0 | 520 | 0.9282 | 0.6643 |
| 0.7252 | 160.92 | 523 | 0.9234 | 0.6678 |
| 0.7252 | 161.85 | 526 | 0.9192 | 0.6678 |
| 0.7252 | 162.77 | 529 | 0.9182 | 0.6678 |
| 0.7252 | 164.0 | 533 | 0.9217 | 0.6678 |
| 0.7252 | 164.92 | 536 | 0.9230 | 0.6643 |
| 0.7252 | 165.85 | 539 | 0.9247 | 0.6678 |
| 0.7252 | 166.77 | 542 | 0.9255 | 0.6713 |
| 0.7252 | 168.0 | 546 | 0.9225 | 0.6713 |
| 0.7252 | 168.92 | 549 | 0.9201 | 0.6713 |
| 0.697 | 169.85 | 552 | 0.9186 | 0.6678 |
| 0.697 | 170.77 | 555 | 0.9153 | 0.6678 |
| 0.697 | 172.0 | 559 | 0.9132 | 0.6713 |
| 0.697 | 172.92 | 562 | 0.9127 | 0.6713 |
| 0.697 | 173.85 | 565 | 0.9122 | 0.6713 |
| 0.697 | 174.77 | 568 | 0.9141 | 0.6713 |
| 0.697 | 176.0 | 572 | 0.9148 | 0.6713 |
| 0.697 | 176.92 | 575 | 0.9140 | 0.6713 |
| 0.697 | 177.85 | 578 | 0.9140 | 0.6713 |
| 0.697 | 178.77 | 581 | 0.9127 | 0.6713 |
| 0.697 | 180.0 | 585 | 0.9130 | 0.6748 |
| 0.697 | 180.92 | 588 | 0.9104 | 0.6748 |
| 0.697 | 181.85 | 591 | 0.9088 | 0.6748 |
| 0.697 | 182.77 | 594 | 0.9052 | 0.6748 |
| 0.697 | 184.0 | 598 | 0.9011 | 0.6748 |
| 0.6822 | 184.92 | 601 | 0.8989 | 0.6748 |
| 0.6822 | 185.85 | 604 | 0.8974 | 0.6748 |
| 0.6822 | 186.77 | 607 | 0.8963 | 0.6748 |
| 0.6822 | 188.0 | 611 | 0.8967 | 0.6748 |
| 0.6822 | 188.92 | 614 | 0.8982 | 0.6748 |
| 0.6822 | 189.85 | 617 | 0.9005 | 0.6748 |
| 0.6822 | 190.77 | 620 | 0.9020 | 0.6748 |
| 0.6822 | 192.0 | 624 | 0.9018 | 0.6748 |
| 0.6822 | 192.92 | 627 | 0.9009 | 0.6748 |
| 0.6822 | 193.85 | 630 | 0.9002 | 0.6748 |
| 0.6822 | 194.77 | 633 | 0.8995 | 0.6748 |
| 0.6822 | 196.0 | 637 | 0.8988 | 0.6748 |
| 0.6822 | 196.92 | 640 | 0.8973 | 0.6748 |
| 0.6822 | 197.85 | 643 | 0.8967 | 0.6748 |
| 0.6822 | 198.77 | 646 | 0.8970 | 0.6748 |
| 0.6578 | 200.0 | 650 | 0.8954 | 0.6748 |
| 0.6578 | 200.92 | 653 | 0.8951 | 0.6748 |
| 0.6578 | 201.85 | 656 | 0.8945 | 0.6748 |
| 0.6578 | 202.77 | 659 | 0.8946 | 0.6748 |
| 0.6578 | 204.0 | 663 | 0.8944 | 0.6748 |
| 0.6578 | 204.92 | 666 | 0.8950 | 0.6748 |
| 0.6578 | 205.85 | 669 | 0.8960 | 0.6748 |
| 0.6578 | 206.77 | 672 | 0.8969 | 0.6748 |
| 0.6578 | 208.0 | 676 | 0.8992 | 0.6748 |
| 0.6578 | 208.92 | 679 | 0.8995 | 0.6748 |
| 0.6578 | 209.85 | 682 | 0.8992 | 0.6748 |
| 0.6578 | 210.77 | 685 | 0.8990 | 0.6748 |
| 0.6578 | 212.0 | 689 | 0.8986 | 0.6748 |
| 0.6578 | 212.92 | 692 | 0.8984 | 0.6748 |
| 0.6578 | 213.85 | 695 | 0.8981 | 0.6748 |
| 0.6578 | 214.77 | 698 | 0.8979 | 0.6748 |
| 0.6633 | 216.0 | 702 | 0.8977 | 0.6748 |
| 0.6633 | 216.92 | 705 | 0.8973 | 0.6748 |
| 0.6633 | 217.85 | 708 | 0.8968 | 0.6748 |
| 0.6633 | 218.77 | 711 | 0.8963 | 0.6748 |
| 0.6633 | 220.0 | 715 | 0.8957 | 0.6748 |
| 0.6633 | 220.92 | 718 | 0.8955 | 0.6748 |
| 0.6633 | 221.85 | 721 | 0.8952 | 0.6748 |
| 0.6633 | 222.77 | 724 | 0.8954 | 0.6748 |
| 0.6633 | 224.0 | 728 | 0.8953 | 0.6748 |
| 0.6633 | 224.92 | 731 | 0.8952 | 0.6748 |
| 0.6633 | 225.85 | 734 | 0.8952 | 0.6748 |
| 0.6633 | 226.77 | 737 | 0.8951 | 0.6748 |
| 0.6633 | 228.0 | 741 | 0.8949 | 0.6748 |
| 0.6633 | 228.92 | 744 | 0.8948 | 0.6748 |
| 0.6633 | 229.85 | 747 | 0.8948 | 0.6748 |
| 0.6693 | 230.77 | 750 | 0.8947 | 0.6748 |

### Framework versions

- Transformers 4.39.0.dev0
- Pytorch 2.2.1+cu121
- Datasets 2.17.1
- Tokenizers 0.15.2