Iliass Lasri committed on
Commit a9e7ffa · 1 Parent(s): eff87e0

add layer

Files changed (1)
  1. README.md +11 -11
README.md CHANGED
@@ -76,17 +76,17 @@ We trained quantizers across different encoders, codebook sizes, and augmentatio
  - **All augmentations, single** — all augmentations are enabled, but only one randomly chosen augmentation is applied per sample.
  - **No extra augmentations, single** — only the baseline augmentations (from the original paper) are used, with one applied per sample.
 
- | Encoder | Codebook | Augmentation Strategy |
- |:---|:---:|:---|
- | HuBERT | 500 | All augmentations, chained |
- | HuBERT | 500 | All augmentations, single |
- | HuBERT | 500 | No extra augmentations, single |
- | | | |
- | SpidR | 256 | No extra augmentations, single |
- | SpidR | 256 | All augmentations, chained |
- | | | |
- | DinoSR (original) | 256 | All augmentations, chained |
- | DinoSR (reproduced) | 256 | All augmentations, chained |
+ | Encoder | Layer | Codebook | Augmentation Strategy |
+ |:---|:---:|:---:|:---|
+ | HuBERT | 6 | 500 | All augmentations, chained |
+ | | | | All augmentations, single |
+ | | | | No extra augmentations, single |
+ | | | | |
+ | SpidR | 6 | 256 | No extra augmentations, single |
+ | | | | All augmentations, chained |
+ | | | | |
+ | DinoSR (original) | 5 | 256 | All augmentations, chained |
+ | DinoSR (reproduced) | 5 | 256 | All augmentations, chained |
 
  ## Links
  - Paper: [Algayres et al., Interspeech 2023](https://aclanthology.org/2023.iwslt-1.46/)
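The "chained" vs. "single" distinction in the strategy names above can be sketched as follows. This is a minimal illustration, not the repository's actual code; the function names and the use of callables for augmentations are assumptions for the example.

```python
import random

def augment_chained(sample, augmentations):
    # "chained": apply every enabled augmentation in sequence to the sample
    for aug in augmentations:
        sample = aug(sample)
    return sample

def augment_single(sample, augmentations):
    # "single": apply exactly one randomly chosen augmentation per sample
    return random.choice(augmentations)(sample)

# Toy augmentations on integers, standing in for waveform transforms
augs = [lambda x: x + 1, lambda x: x * 2]
chained = augment_chained(0, augs)   # (0 + 1) * 2 = 2
single = augment_single(0, augs)     # either 0 + 1 = 1 or 0 * 2 = 0
```

"No extra augmentations, single" then corresponds to calling `augment_single` with only the baseline augmentation list from the original paper.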