
# all-wikis-mix

FineWeb-style Parquet shards created by merging the previously built monolingual Wikipedia datasets into a single dataset.

## Why this exists

Training pipelines that round-robin by shard can "loop" on tiny datasets, repeating the same few shards over and over. A single merged dataset has a much larger shard count, which keeps mixing stable.

## What's inside

- Format: `nanochat-parquet-v1`
- Layout: `shard_*.parquet` + `metadata.json`
- Text column: `text`
- Parquet settings: zstd (level 3), `row_group_size=1024`, `use_dictionary=False`, `write_statistics=False`

## Loading

```python
from datasets import load_dataset

ds = load_dataset("JayJayThrowThrow/all-wikis-mix", split="train")
print(ds.column_names)  # ["text"]
```