Dataset: zion84006/tencentdata_speech_tokenizer
Modalities: Tabular, Text
Formats: parquet
Size: 100K - 1M
Libraries: Datasets, Dask, Croissant
Branch: refs/convert/parquet
Path: tencentdata_speech_tokenizer / default
Size: 708 MB
Contributors: 1
History: 38 commits
Latest commit: 1d7c798 (verified) by parquet-converter, "Delete old duckdb index files", about 2 years ago

test    Delete old duckdb index files    about 2 years ago
train   Delete old duckdb index files    about 2 years ago
valid   Delete old duckdb index files    about 2 years ago