Dataset: Kush26/wikitext103-tokenized-gpt2
Formats: parquet
Size: 100K - 1M rows
Libraries: Datasets, Dask, Croissant, +1
License: apache-2.0
Branch: refs/convert/parquet
wikitext103-tokenized-gpt2 / default · 475 MB · 1 contributor
History: 1 commit
parquet-converter · Update parquet files · bd3b2f8 (verified) · 6 months ago
test · Update parquet files · 6 months ago
train · Update parquet files · 6 months ago
validation · Update parquet files · 6 months ago