---
license: cc0-1.0
task_categories:
- reinforcement-learning
- robotics
- image-to-video
- image-text-to-video
- image-to-3d
language:
- en
tags:
- world-model
- reinforcement-learning
- human-in-the-loop
- agent
pretty_name: No Man's Sky High-Fidelity Human-in-the-loop World Model Dataset
size_categories:
- 100K<n<1M
---
# No Man's Sky High-Fidelity Human-in-the-loop World Model Dataset
## Overview
This dataset is designed for world model training using real human gameplay data from No Man’s Sky.
It captures high-fidelity human–computer interaction by recording both the game video and time-aligned input actions, preserving the realistic latency characteristics of a human-in-the-loop system.
Compared with “internal game state” datasets, this dataset retains the physical interaction chain (input → game/render → screen → capture), making it well-suited for training models that need to operate under real-world latency and sensory constraints.
## Dataset Structure
Each recording session is stored in a UUID directory.
A typical session contains:
```
<uuid>/
  recording.mp4
  actions.jsonl
  events.jsonl
  metadata.json
  actions_resampled.jsonl
```
### 1) recording.mp4
The recorded gameplay video.
### 2) actions.jsonl (per-frame input state)
One JSON object per video frame. Each entry contains the input state sampled at frame time.
Schema:
- `frame` (int): frame index
- `timestamp_ms` (int): wall-clock timestamp in milliseconds
- `frame_pts_ms` (float): frame time in milliseconds (PTS-based)
- `capture_ns` (int): OBS compositor timestamp in nanoseconds
- `key` (string[]): list of pressed keys at this frame
- `mouse` (object):
  - `dx` (int): accumulated mouse delta X during the frame
  - `dy` (int): accumulated mouse delta Y during the frame
  - `x` (int): absolute mouse X position
  - `y` (int): absolute mouse Y position
  - `scroll_dy` (int): scroll delta during the frame
  - `button` (string[]): pressed mouse buttons (e.g., `LeftButton`, `Button4`)
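To illustrate the schema, here is a minimal sketch of parsing `actions.jsonl`. The helper name and the two sample frames are illustrative, not part of the dataset:

```python
import io
import json

def load_actions(fp):
    """Parse per-frame input states from an actions.jsonl stream
    (one JSON object per non-empty line)."""
    return [json.loads(line) for line in fp if line.strip()]

# Two illustrative frames; field names follow the schema above,
# the values are made up.
sample = io.StringIO(
    '{"frame": 0, "timestamp_ms": 1000, "key": ["W"], '
    '"mouse": {"dx": 3, "dy": -1, "x": 640, "y": 360, '
    '"scroll_dy": 0, "button": []}}\n'
    '{"frame": 1, "timestamp_ms": 1016, "key": ["W"], '
    '"mouse": {"dx": 0, "dy": 0, "x": 640, "y": 360, '
    '"scroll_dy": 0, "button": ["LeftButton"]}}\n'
)
actions = load_actions(sample)

# Frames during which the left mouse button was held:
held = [a["frame"] for a in actions if "LeftButton" in a["mouse"]["button"]]
# held == [1]
```

For a real session, pass an open file handle for the session's `actions.jsonl` instead of the in-memory sample.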
### 3) events.jsonl (raw sub-frame input events)
Raw input events with microsecond timing, captured from the OS event stream.
Schema:
- `type` (string): event type, one of `key_down`, `key_up`, `flags_changed`, `mouse_move`, `mouse_button_down`, `mouse_button_up`, `scroll`
- `timestamp_ms` (int): wall-clock timestamp
- `session_offset_us` (int): microsecond offset from session start
- `key` (string): key name, for key events
- `button` (string): mouse button name
- `dx`, `dy`, `x`, `y` (int): mouse movement
- `scroll_dy` (int): scroll delta
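Because events carry microsecond timing, the held-key state at any instant can be replayed from the raw stream. A minimal sketch, with illustrative key names and offsets (the real key-name vocabulary comes from the OS event stream):

```python
def keys_at(events, t_us):
    """Replay key_down/key_up events up to t_us (microseconds from
    session start) and return the set of keys held at that instant.
    Assumes events are sorted by session_offset_us, as recorded."""
    held = set()
    for ev in events:
        if ev["session_offset_us"] > t_us:
            break
        if ev["type"] == "key_down":
            held.add(ev["key"])
        elif ev["type"] == "key_up":
            held.discard(ev["key"])
    return held

# Illustrative raw events (values are made up):
events = [
    {"type": "key_down", "session_offset_us": 1000, "key": "W"},
    {"type": "key_down", "session_offset_us": 5000, "key": "Shift"},
    {"type": "key_up",   "session_offset_us": 9000, "key": "W"},
]
keys_at(events, 6000)   # {"W", "Shift"}
keys_at(events, 12000)  # {"Shift"}
```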
### 4) metadata.json
Session-level metadata and video info.
Schema:
- `stream_name` (string): session UUID
- `game_name` (string): game name
- `platform` (string): `mac` / `windows` / `linux`
- `video_meta` (object):
  - `width` (int)
  - `height` (int)
  - `fps` (float)
  - `total_frames` (int)
  - `duration_ms` (int)
- `input_latency_bias_ms` (number): recommended latency bias for alignment
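A sketch of consuming `metadata.json`; the payload below is fabricated for illustration, and real values come from the recorded session:

```python
import json

# Hypothetical metadata.json contents (field names follow the schema above).
meta = json.loads('''{
  "stream_name": "0b1e3c5a-7d9f-4a2b-8c6d-112233445566",
  "game_name": "No Man's Sky",
  "platform": "mac",
  "video_meta": {"width": 1920, "height": 1080, "fps": 60.0,
                 "total_frames": 7200, "duration_ms": 120000},
  "input_latency_bias_ms": 33.4
}''')

vm = meta["video_meta"]
frame_interval_ms = 1000.0 / vm["fps"]  # nominal time per frame

# Shift an input timestamp by the recommended bias when aligning it
# against video frames:
aligned_ms = 5000 + meta["input_latency_bias_ms"]  # 5033.4
```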
### 5) actions_resampled.jsonl
High-precision resampled per-frame actions, reconstructed from `events.jsonl` using latency compensation.
This is the recommended aligned input stream for training.
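The event-to-frame mapping behind such a resampling can be sketched as follows. This is a simplified illustration, not the exact resampling procedure, and it assumes the bias is added so that an input is assigned to the frame that first shows its effect:

```python
def frame_for_event(offset_us, fps, latency_bias_ms):
    """Map a raw input event (microsecond offset from session start)
    to the video frame index that reflects it, after shifting by the
    recommended latency bias. Simplified sketch only."""
    effective_ms = offset_us / 1000.0 + latency_bias_ms
    return int(effective_ms * fps / 1000.0)

# An event 100 ms into the session, 60 fps video, 33 ms bias:
frame_for_event(100_000, 60.0, 33.0)  # -> frame 7
```

Grouping every event by its compensated frame index, then reducing each group to a per-frame input state (as in `actions.jsonl`), yields a latency-aligned stream of the same shape as `actions_resampled.jsonl`.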
## Suggested Usage
- For world model training, use `recording.<ext>` + `actions_resampled.jsonl`.
- For analysis or recalibration, use `events.jsonl` and `metadata.json`.
## Notes
- The dataset captures realistic system latency; alignment is provided but does not remove physical pipeline delay.
- This design targets high-fidelity human-in-the-loop interaction for robust world-model learning.