---
license: mit
task_categories:
- text-generation
language:
- en
pretty_name: fineweb-sample-100BT-tokenized-45b
size_categories:
- 10B<n<100B
---
[Fineweb sample-100BT](https://huggingface.co/datasets/HuggingFaceFW/fineweb) tokenized with the [gpt2 tokenizer](https://huggingface.co/openai-community/gpt2). Sequences are truncated to a length of 1024 tokens.

For efficiency, tokens are stored in uint16 numpy arrays. Shards are stored in the webdataset format.
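A minimal sketch of the storage pattern described above: gpt2's vocabulary (50,257 ids) fits in uint16, which halves the storage cost relative to int32, and sequences are clipped at 1024 tokens. The helper names and the toy token ids here are illustrative, not part of this dataset's actual pipeline:

```python
import io

import numpy as np

MAX_LEN = 1024  # sequences are truncated to this length


def pack_tokens(token_ids):
    """Truncate a token-id sequence and serialize it as a uint16 numpy array.

    All gpt2 token ids (< 50257) fit in uint16 (max 65535), so no
    information is lost by the narrower dtype.
    """
    arr = np.asarray(token_ids[:MAX_LEN], dtype=np.uint16)
    buf = io.BytesIO()
    np.save(buf, arr)
    return buf.getvalue()


def unpack_tokens(raw_bytes):
    """Recover the uint16 token array from its serialized bytes."""
    return np.load(io.BytesIO(raw_bytes))


# Toy example with made-up token ids longer than the 1024-token limit.
ids = list(range(2000))
payload = pack_tokens(ids)
tokens = unpack_tokens(payload)
```

Payloads like this would then be grouped into `.tar` shards for webdataset-style sequential reading; the actual shard layout is whatever the dataset's files define.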