Dataset: Nma/tokenize_resume_dataset
Formats: parquet
Size: 100K - 1M
Libraries: Datasets, Dask, Croissant, + 1
tokenize_resume_dataset — 522 MB, 1 contributor, 6 commits

Latest commit: Nma, "Upload README.md with huggingface_hub" (f0471f9, about 3 years ago)

Files:
- data/ — "Upload data/train-00002-of-00003-5e2a1decf0a3bec6.parquet with huggingface_hub", about 3 years ago
- .gitattributes — 2.27 kB, "initial commit", about 3 years ago
- README.md — 491 Bytes, "Upload README.md with huggingface_hub", about 3 years ago
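The commit log above shows the train split stored in data/ as numbered parquet shards. A minimal sketch of that shard-naming convention, assuming the standard Hugging Face pattern `{split}-{index:05d}-of-{total:05d}-{hash}.parquet` (only the one hash suffix visible in the log is known; the others are not shown on this page):

```python
def shard_name(split: str, index: int, total: int, suffix: str) -> str:
    """Build a parquet shard filename in the Hugging Face Hub convention,
    e.g. train-00002-of-00003-<hash>.parquet."""
    return f"{split}-{index:05d}-of-{total:05d}-{suffix}.parquet"

# The one shard visible in the commit history above:
print(shard_name("train", 2, 3, "5e2a1decf0a3bec6"))
# train-00002-of-00003-5e2a1decf0a3bec6.parquet
```

In practice you would not build these names by hand: `datasets.load_dataset("Nma/tokenize_resume_dataset")` resolves and downloads all shards of the repo automatically (network access and the `datasets` package required).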