Dataset: Mavies5526/wikiqa_tokenized
Modalities: Text
Formats: parquet
Size: 10K - 100K rows
Libraries: Datasets, pandas, Croissant, + 1
wikiqa_tokenized (9.16 MB, 1 contributor, 3 commits)
Latest commit: Mavies5526, "Upload dataset" (36c3ddb, verified, 6 months ago)
Files:

Name                   Size       Last commit                           Age
chunk_0/               -          Upload folder using huggingface_hub   6 months ago
data/                  -          Upload dataset                        6 months ago
.amlignore             315 Bytes  Upload folder using huggingface_hub   6 months ago
.amlignore.amltmp      315 Bytes  Upload folder using huggingface_hub   6 months ago
.gitattributes         2.46 kB    initial commit                        6 months ago
README.md              376 Bytes  Upload dataset                        6 months ago
tokenization_log.txt   373 Bytes  Upload folder using huggingface_hub   6 months ago