Dataset: nikolina-p/gutenberg_clean_tokenized_en

Tasks: Text Generation, Summarization
Modalities: Text
Formats: parquet
Languages: English
Size: 10K - 100K
Libraries: Datasets, Dask
gutenberg_clean_tokenized_en
15 GB • 1 contributor • History: 4 commits
Latest commit: Update README.md by nikolina-p (eb1b25b, verified, 7 months ago)
  • data
    Initial upload of deduplicated, cleaned, and tokenized (tiktoken gpt2) streaming dataset, 7 months ago
  • .gitattributes (2.46 kB)
    initial commit, 7 months ago
  • README.md (3.32 kB)
    Update README.md, 7 months ago