Dataset: kuanhuggingface/google_tts_speech_tokenizer (likes: 0)

Modalities: Text
Formats: parquet
Size: 100K - 1M rows
Libraries: Datasets, Dask, Croissant, + 1
Branch: refs/convert/parquet

google_tts_speech_tokenizer / default
148 MB · 2 contributors · History: 9 commits
Latest commit: bea8574 (verified) by parquet-converter — "Delete old duckdb index files", almost 2 years ago

test/          Delete old duckdb index files    almost 2 years ago
train/         Delete old duckdb index files    almost 2 years ago
validation/    Delete old duckdb index files    almost 2 years ago