Datasets:
Update README.md
README.md CHANGED
@@ -163,6 +163,35 @@ Take down: We will comply to legitimate requests by removing the affected source
| zh | Mandarin Chinese | 44 |
### Usage
Although webdataset can be used in a streaming fashion, it is recommended to first make a local copy of the dataset using `git clone`:
```
git lfs install
git clone git@hf.co:datasets/TalTechNLP/voxlingua107_wds
```
Then you can use the Python webdataset library to create an iterable dataset out of it:
```python
import glob
import random

import webdataset as wds

# You can also limit the training data to selected languages by adjusting the glob pattern
train_files = glob.glob("voxlingua107_wds/train/**/*.tar", recursive=True)
dev_files = glob.glob("voxlingua107_wds/dev/*.tar")

random.shuffle(train_files)

def mapper(sample):
    # the "audio" field represents 16 kHz raw audio
    return {"audio": sample[0], "lang": sample[1]["lang"]}

# Since each shard contains 500 samples for a single language, it is good to use
# a reasonably large buffer size to get nicely shuffled samples
buffer_size = 100000
dataset = (
    wds.WebDataset(train_files, shardshuffle=2000)
    .shuffle(buffer_size, initial=buffer_size)
    .decode(wds.torch_audio)
    .to_tuple("wav", "json")
    .map(mapper)
)
dev_dataset = (
    wds.WebDataset(dev_files)
    .decode(wds.torch_audio)
    .to_tuple("wav", "json")
    .map(mapper)
)

train_iter = iter(dataset)
print(next(train_iter))
# {'audio': (tensor([[-9.7656e-04, -8.5449e-04, -3.0518e-05,  ...,  2.7466e-03,
#            3.7842e-03,  5.1880e-03]]), 16000), 'lang': 'kn'}
```
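For reference, each shard is a plain tar archive in which members sharing a basename form one sample, and their file extensions (`wav`, `json`) become the sample's fields; this grouping is what `to_tuple("wav", "json")` relies on. A stdlib-only sketch of that layout (the file names and payloads below are made up for illustration):

```python
import io
import json
import tarfile

def add_member(tar, name, payload: bytes):
    # helper to add an in-memory file to the tar archive
    info = tarfile.TarInfo(name=name)
    info.size = len(payload)
    tar.addfile(info, io.BytesIO(payload))

# Write a toy shard with one sample: a fake wav payload plus a json sidecar.
# The shared basename "sample_000" is what groups the two files into one sample.
buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode="w") as tar:
    add_member(tar, "sample_000.wav", b"\x00" * 32)  # placeholder audio bytes
    add_member(tar, "sample_000.json", json.dumps({"lang": "zh"}).encode())

# Read it back, grouping members by basename the way webdataset does.
buf.seek(0)
samples = {}
with tarfile.open(fileobj=buf, mode="r") as tar:
    for member in tar.getmembers():
        key, ext = member.name.rsplit(".", 1)
        samples.setdefault(key, {})[ext] = tar.extractfile(member).read()

meta = json.loads(samples["sample_000"]["json"])
print(meta["lang"])  # prints zh
```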