---
dataset_info:
  features:
    - name: text
      dtype: string
    - name: id
      dtype: string
    - name: dump
      dtype: string
    - name: url
      dtype: string
    - name: date
      dtype: string
    - name: file_path
      dtype: string
    - name: language
      dtype: string
    - name: language_score
      dtype: float64
    - name: token_count
      dtype: int64
    - name: score
      dtype: float64
    - name: int_score
      dtype: int64
    - name: dataset
      dtype: string
  splits:
    - name: train
      num_examples: 160677091
license: odc-by
language:
  - en
size_categories:
  - 100M<n<1B
tags:
  - pretraining
  - smol-data
pretty_name: FineWeb 100BT (Shuffled)
---
# FineWeb 100BT (Shuffled)
A globally shuffled version of [HuggingFaceFW/fineweb_100BT](https://huggingface.co/datasets/HuggingFaceFW/fineweb_100BT).
Part of the [Smol-Data](https://huggingface.co/collections/HuggingFaceFW/smol-data) collection of tried-and-tested data mixes for strong pretraining.
## Dataset Description
This dataset contains the same ~100B tokens as [fineweb_100BT](https://huggingface.co/datasets/HuggingFaceFW/fineweb_100BT) but with all documents globally shuffled (seed=42). Use this version when you need randomized document ordering for pretraining.
## How It Was Created
The unshuffled dataset was loaded into memory, shuffled with `dataset.shuffle(seed=42)`, and re-uploaded as 100 shards. See the [smol_data.py](https://github.com/huggingface/datatrove/blob/main/examples/smol_data.py) script for details.
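A minimal sketch of that process using the `datasets` library (the repo ids and shard count follow the description above; the linked script is the authoritative implementation):
```python
from datasets import load_dataset

# Load the unshuffled source split fully into memory
# (~160M documents, so this needs a machine with ample RAM and disk).
ds = load_dataset("HuggingFaceFW/fineweb_100BT", split="train")

# Globally shuffle all documents with a fixed seed for reproducibility.
ds = ds.shuffle(seed=42)

# Re-upload, writing the shuffled data as 100 shards.
ds.push_to_hub("HuggingFaceFW/fineweb_100BT-shuffled", num_shards=100)
```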
## Usage
```python
from datasets import load_dataset
# Stream the shuffled split so nothing has to be downloaded up front
ds = load_dataset("HuggingFaceFW/fineweb_100BT-shuffled", split="train", streaming=True)

# Peek at the first document
for sample in ds:
    print(sample["text"][:200])
    break
```
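Because the documents are already globally shuffled, reading them in storage order (as above) yields a randomized sequence; no additional `shuffle(buffer_size=...)` step is needed on the streaming iterator.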
## Citation
```bibtex
@misc{niklaus2026smoldata,
  title={SmolData},
  author={Joel Niklaus and Hynek Kydl{\'\i}{\v{c}}ek},
  year={2026},
  publisher={Hugging Face},
  journal={Hugging Face repository},
  howpublished={\url{https://huggingface.co/collections/HuggingFaceFW/smol-data}}
}
```