
MultiWorld Duet OOV Scripted v1

A 2-player out-of-view (OOV) Minecraft gameplay dataset for training and evaluating multiplayer world models, released as a companion to the MultiWorld paper.

Overview

  • Episodes: ~4,300 (8,600+ player streams)
  • Format: Solaris-compatible sharded WebDataset TARs
  • Resolution: 1280×720 @ 20 FPS
  • Actions: 25+ discrete action types matching nyu-visionx/solaris-training-dataset
  • Shards: 219 TAR files, ~116 GB total
  • Server: Minecraft 1.21 (Paper), flat world

Why this dataset exists

The Solaris Duet dataset (12.64M frames, 2 players) is overwhelmingly co-visible gameplay. MultiWorld's architecture — learned client-server decomposition, authority gates, latent reconciliation — only matters when partners are not in view of each other and client-side prediction has to fill the gap. This dataset provides the missing OOV coverage across six distinct regimes.

Scenarios

1. splitAndRejoin — Decoupled futures

Both bots start near each other, then walk in opposite directions to a random distance (60–180 blocks). At their remote locations, each bot independently performs world-state mutations (mining and placing blocks) that the other bot has no observation of. Both bots then pathfind back to the rendezvous point.

OOV test: Can the world model maintain consistent predictions about world state that was modified by an unobserved partner?

Parameter             Range
Separation distance   60–180 blocks
Mutations per bot     2–5 (mine/place alternating)
Separation direction  Random angle θ ∈ [0, 2π)

Phases: SEPARATE → barrier → MUTATE → barrier → REJOIN → eye-contact hold → END
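
The barrier steps above are synchronization points: neither bot enters the next phase until both have finished the current one. A minimal sketch of that pattern using Python's `threading.Barrier` (illustrative only — the actual solaris-engine synchronization mechanism is not documented here):

```python
import threading

# One barrier shared by both bot threads; reused at every phase boundary.
barrier = threading.Barrier(2)
log = []
log_lock = threading.Lock()

def run_bot(name: str) -> None:
    # Phase 1: both bots separate independently.
    with log_lock:
        log.append((name, "SEPARATE"))
    barrier.wait()  # neither bot mutates until both have separated
    # Phase 2: independent world-state mutations.
    with log_lock:
        log.append((name, "MUTATE"))
    barrier.wait()  # neither bot returns until both have mutated
    # Phase 3: pathfind back to the rendezvous point.
    with log_lock:
        log.append((name, "REJOIN"))

threads = [threading.Thread(target=run_bot, args=(n,)) for n in ("Alpha", "Bravo")]
for t in threads:
    t.start()
for t in threads:
    t.join()
```

After both threads finish, every SEPARATE entry in `log` precedes every MUTATE entry, and every MUTATE precedes every REJOIN — exactly the phase ordering the barriers enforce.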


2. throughTheWall — Occluder-only co-presence

A stone wall (5–9 blocks wide, 4–6 blocks tall) is RCON-placed at the midpoint between the bots. Both bots walk toward the wall and stand on opposite sides. The "active" bot performs a sequence of world-state mutations (mining/placing) near the wall while the other bot holds position. Both bots then walk away from the wall to verify the aftermath.

OOV test: Can the world model infer partner activity through an occluder, using only indirect cues (audio, block-update packets)?

Parameter    Range
Wall width   5–9 blocks
Wall height  4–6 blocks
Mutations    2–5 per active bot
Audio cues   4–8 noteblock actions

Phases: WALL_BUILD → APPROACH → barrier → ACTIVE_PHASE → barrier → AUDIO_CUES → barrier → WITHDRAW → END


3. farApartBuild — Remote construction

Both bots are teleported to distant positions (200–600 blocks apart). Each bot independently builds a tower of random height (2–6 blocks). After building, both bots are teleported to each other's build site to inspect the other's construction for several seconds.

OOV test: Can the world model predict what a remote partner built when neither bot observed the construction?

Parameter               Range
Scatter distance        200–600 blocks
Tower height (per bot)  2–6 blocks
Inspect duration        4 seconds

Phases: SCATTER (tp) → barrier → BUILD → barrier → SWAP (tp) → INSPECT → END


4. asymmetricInformation — Partial information

Both bots are teleported to distant positions (400–1000 blocks apart). The "witness" bot summons a passive mob (pig/cow/sheep/chicken/rabbit) at its location and observes it. The "observer" bot waits in its own region with no knowledge of the mob. Both bots then swap positions — the observer visits the witness's region to see the mob aftermath, and the witness visits the observer's empty region.

OOV test: Can the world model represent asymmetric knowledge — one player knows something the other doesn't — and reconcile it when they swap?

Parameter         Range
Scatter distance  400–1000 blocks
Mob type          pig, cow, sheep, chicken, rabbit
Witness role      Random (50/50)
Witness duration  5 seconds

Phases: SCATTER (tp) → barrier → WITNESS/OBSERVE → barrier → SWAP (tp) → INSPECT → END


5. jointWitnessReunite — Shared-memory persistence

Both bots start near each other. A passive mob is RCON-summoned at the midpoint. Both bots co-observe the mob for ~5 seconds, then separate in opposite directions (40–80 blocks). During the hold-OOV period (5–15 seconds), the mob continues wandering under the server's normal AI. Both bots then pathfind back to the midpoint to observe the mob's new state.

OOV test: Can the world model maintain a consistent prediction across both bots' perspectives about a jointly-observed entity during a period when neither bot can see it? (Shared-memory test)

Parameter            Range
Mob type             pig, cow, sheep, chicken, rabbit
Separation distance  40–80 blocks
Hold-OOV duration    5–15 seconds
Co-observe hold      5 seconds

Phases: POSITION_EXCHANGE → CO_OBSERVE (summon + look) → barrier → SEPARATE → barrier → HOLD_OOV → barrier → RETURN → POST_OBSERVE → END


6. sharedBuildReunite — Joint-artifact persistence

Both bots start near each other. A stone platform (3×3 to 5×5) is RCON-placed at the midpoint. Both bots collaboratively place blocks on the platform — lower-name bot fills even-indexed positions, higher-name fills odd-indexed positions (4–8 blocks total). Both bots observe the completed structure, then separate (40–80 blocks apart) and hold OOV (5–15 seconds). Both bots return to verify the structure is unchanged.

OOV test: Can the world model maintain a consistent prediction across both bots about a jointly-constructed artifact during their absence? (Shared-memory + joint-construction test)

Parameter            Range
Platform size        3×3 to 5×5
Co-built blocks      4–8
Separation distance  40–80 blocks
Hold-OOV duration    5–15 seconds

Phases: POSITION_EXCHANGE → PLATFORM_SETUP → barrier → CO_BUILD → barrier → CO_OBSERVE → barrier → SEPARATE → barrier → HOLD_OOV → barrier → RETURN → POST_OBSERVE → END


Episode metadata

Each player stream in an episode ships with an _episode_info.json file containing scenario-specific metadata:

{
  "episode_type": "splitAndRejoin",
  "encountered_error": false,
  "peer_encountered_error": false,
  "bot_died": false,
  "eval_metadata": {
    "separation_blocks": 142.7,
    "num_mutations": 3
  }
}

The eval_metadata fields vary by scenario type and include the procedural parameters used for that episode (distances, durations, mob types, etc.).

File structure

Each TAR shard contains entries under train/ or test/ prefixes:

train/<prefix>_<episode>_Alpha_instance_<i>.mp4                # Player 1 video
train/<prefix>_<episode>_Alpha_instance_<i>.json               # Player 1 per-frame actions
train/<prefix>_<episode>_Alpha_instance_<i>_episode_info.json  # Player 1 episode metadata
train/<prefix>_<episode>_Bravo_instance_<i>.mp4                # Player 2 video
train/<prefix>_<episode>_Bravo_instance_<i>.json               # Player 2 per-frame actions
train/<prefix>_<episode>_Bravo_instance_<i>_episode_info.json  # Player 2 episode metadata
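
Under this naming scheme, the two players' streams for one episode can be paired by stripping the player token from the basename. A sketch (the grouping key is inferred from the pattern above):

```python
import re
from collections import defaultdict

# Matches entries like "train/<prefix>_<episode>_Alpha_instance_<i>.mp4"
# and captures split, episode key, player, instance, and file kind.
ENTRY_RE = re.compile(
    r"^(?P<split>train|test)/(?P<episode>.+)_(?P<player>Alpha|Bravo)"
    r"_instance_(?P<inst>\d+)(?P<kind>_episode_info\.json|\.json|\.mp4)$"
)

def group_by_episode(member_names):
    """Group TAR member names as {(split, episode, inst): {player: {kind: name}}}."""
    groups = defaultdict(lambda: defaultdict(dict))
    for name in member_names:
        m = ENTRY_RE.match(name)
        if not m:
            continue
        key = (m["split"], m["episode"], m["inst"])
        groups[key][m["player"]][m["kind"]] = name
    return groups
```

Each group then holds up to six entries: video, per-frame actions, and episode metadata for both Alpha and Bravo.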

Usage

Compatible with the Solaris training pipeline. Extract shards:

# Each shard contains train/ and test/ prefixed entries
for f in *.tar; do tar xf "$f"; done
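
If only one split is needed, shards can also be unpacked selectively rather than extracted in full. A sketch using Python's tarfile module (the shard path and output directory are placeholders):

```python
import tarfile

def extract_split(shard_path: str, out_dir: str, split: str = "train") -> int:
    """Extract only the members under the given split prefix; return the count."""
    extracted = 0
    with tarfile.open(shard_path) as tar:
        for member in tar.getmembers():
            if member.name.startswith(f"{split}/"):
                tar.extract(member, path=out_dir)
                extracted += 1
    return extracted
```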

Collection infrastructure

Built on solaris-engine with custom episode handlers for OOV scenarios. Headless Minecraft 1.21 (Paper), mineflayer bots with pathfinder, GPU-accelerated recording via EGL. All procedural parameters are driven by a shared PRNG seed so both bots make compatible decisions independently.
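
The shared seed means each bot re-derives the same episode parameters locally, with no runtime coordination. A minimal sketch (parameter names and ranges follow the splitAndRejoin table above; the actual sampling order in solaris-engine is an assumption):

```python
import math
import random

def sample_split_and_rejoin_params(seed: int) -> dict:
    """Derive splitAndRejoin parameters deterministically from the episode seed.

    Both bots call this with the same seed, so they agree on separation
    distance, direction, and mutation count without exchanging messages.
    """
    rng = random.Random(seed)
    return {
        "separation_blocks": rng.uniform(60, 180),         # 60–180 blocks
        "separation_angle": rng.uniform(0.0, 2 * math.pi),  # θ ∈ [0, 2π)
        "num_mutations": rng.randint(2, 5),                 # 2–5 per bot
    }
```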

Citation

If you use this dataset, please cite the MultiWorld paper (forthcoming).
