---
pretty_name: test strict shuffled
task_categories:
- text-classification
tags:
- parquet
- shuffled
- exact-shuffle
---
# Strict shuffled copy of `KantaHayashiAI/test`
This dataset was produced by assigning each source row a deterministic pseudo-random sort key
derived from:
- source parquet path
- row index inside that parquet
- shuffle seed `2026-04-01-strict-shuffle-v1`
Rows were first partitioned into `512` buckets by the high bits of the key,
then each bucket was fully sorted by `(__key_hi, __key_lo, __file_id, __row_idx)`.
This yields a deterministic, globally shuffled order without requiring the full dataset
to be materialized twice on local disk.
Expected train shard count: `512`.
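The keying and bucketing scheme described above can be sketched as follows. This is a minimal illustration, not the actual generation script: the card does not name the hash function, so `blake2b` and the exact byte layout of the key are assumptions; the seed string and bucket count are taken from the text.

```python
import hashlib

# Constants from the card; the hash choice below is an assumption.
SEED = "2026-04-01-strict-shuffle-v1"
NUM_BUCKETS = 512  # 2**9

def sort_key(parquet_path: str, row_idx: int) -> tuple[int, int]:
    """Derive a deterministic 128-bit key from (path, row index, seed),
    returned as (high 64 bits, low 64 bits)."""
    digest = hashlib.blake2b(
        f"{parquet_path}\x00{row_idx}\x00{SEED}".encode(),
        digest_size=16,
    ).digest()
    return int.from_bytes(digest[:8], "big"), int.from_bytes(digest[8:], "big")

def bucket_of(key_hi: int) -> int:
    """The top 9 bits of the high half select one of 512 buckets."""
    return key_hi >> 55

def shuffled_order(rows: list[tuple[int, str, int]]) -> list[tuple[int, int]]:
    """rows: (file_id, parquet_path, row_idx) triples.

    Sorting everything by (key_hi, key_lo, file_id, row_idx) gives the
    same order as sorting each bucket separately and concatenating the
    buckets, because the bucket id is just the high bits of key_hi --
    which is why the per-bucket pass yields a global shuffled order.
    """
    keyed = [
        (*sort_key(path, idx), file_id, idx)
        for file_id, path, idx in rows
    ]
    keyed.sort()
    return [(file_id, idx) for _hi, _lo, file_id, idx in keyed]
```

Because the key depends only on the source path, row index, and seed, rerunning the process reproduces the identical order, and rows from the same source file are scattered rather than kept adjacent.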