---
title: HY-WorldPlay
emoji: 🌍
colorFrom: blue
colorTo: green
sdk: gradio
python_version: "3.10"
app_file: app.py
hardware: zero-gpu
license: apache-2.0
short_description: HY-World 1.5 - Interactive World Modeling
---
# HY-WorldPlay (HunyuanWorld 1.5)
This Space demonstrates HY-WorldPlay: a streaming video diffusion model for real-time interactive world modeling.
## Model Details
- Base Model: HunyuanVideo-1.5
- Architecture: Latent Video Diffusion with Dual Action Representation
- Capability: Generates long-horizon streaming video with geometric consistency.
## Usage
- Prompt: Enter a text description of the scene.
- Image: (Optional) Upload a starting image for Image-to-Video generation.
- Camera Path: Upload a JSON file defining the camera trajectory (Pose).
- Note: Example pose files are available in the official repository; if the demo bundles a default trajectory, you can use that instead.
- Generate: Click generate to create the video.
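As a rough illustration of the "Camera Path" input, the sketch below builds a simple forward-dolly trajectory and saves it as JSON. The actual schema is defined by the official HY-WorldPlay repository; the field names here (`frames`, `transform` as a 4x4 camera-to-world matrix) are assumptions for illustration only.

```python
# Hypothetical camera-path JSON builder (schema is an assumption;
# consult the official repository for the real pose format).
import json

def identity_pose():
    """4x4 identity extrinsic (camera at origin)."""
    return [[1.0 if i == j else 0.0 for j in range(4)] for i in range(4)]

def forward_dolly(num_frames=16, step=0.1):
    """Translate the camera along -Z by `step` units per frame."""
    frames = []
    for t in range(num_frames):
        pose = identity_pose()
        pose[2][3] = -step * t  # z-translation component
        frames.append({"transform": pose})
    return {"frames": frames}

path = forward_dolly()
with open("camera_path.json", "w") as f:
    json.dump(path, f, indent=2)
```

Other motions (orbits, pans) would follow the same pattern, varying the rotation and translation entries of each frame's matrix.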
## Notes
- This Space uses ZeroGPU for inference.
- The first run might take longer to download the model weights (~30GB+).
- The model is running in Bidirectional mode by default for quality.
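For readers adapting this Space, ZeroGPU inference in Gradio is typically wired through the `spaces.GPU` decorator, which requests a GPU only for the duration of the decorated call. The sketch below is a minimal illustration of that pattern, not this Space's actual `app.py`; the fallback decorator lets it run outside Hugging Face, and `generate` is a placeholder for the real diffusion call.

```python
# Minimal ZeroGPU wiring sketch (assumption: app.py follows the
# standard `spaces.GPU` pattern; `generate` is a placeholder).
try:
    import spaces  # provided inside Hugging Face Spaces
    gpu = spaces.GPU
except ImportError:
    # Local fallback: a no-op decorator so the sketch runs anywhere.
    def gpu(func):
        return func

@gpu
def generate(prompt: str) -> str:
    # Placeholder for the real model call; on ZeroGPU, a GPU is
    # allocated only while this function executes.
    return f"video for: {prompt}"

result = generate("a walk through a forest")
```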
## Citation

```bibtex
@article{worldplay2025,
  title={WorldPlay: Towards Long-Term Geometric Consistency for Real-Time Interactive World Model},
  author={Wenqiang Sun and Haiyu Zhang and Haoyuan Wang and Junta Wu and Zehan Wang and Zhenwei Wang and Yunhong Wang and Jun Zhang and Tengfei Wang and Chunchao Guo},
  year={2025},
  journal={arXiv preprint}
}
```