---
arxiv: "2602.14381"
tags:
- video-generation
- real-time
- autoregressive
- diffusion
- webrtc
- vace
- wan
license: apache-2.0
---
# Daydream Scope
Scope is an open-source tool for running real-time, interactive generative AI video pipelines. It supports multiple autoregressive video diffusion models with WebRTC streaming, Spout integration for live VJ workflows, and a composable pipeline architecture.
## Highlights
- **Multiple model backends** — StreamDiffusionV2, LongLive, Krea Realtime, RewardForcing, MemFlow, and more via plugins
- **VACE integration** — Use reference images and control videos (depth, scribble, optical flow, etc.) to guide generation in real time
- **Spout / NDI** — Pipe video in and out of tools like Resolume Arena, TouchDesigner, OBS
- **LoRA support** — Customize styles and concepts on the fly
- **Composable pipelines** — Chain video diffusion with depth mapping, frame interpolation, and other processors
- **WebRTC streaming API** — Low-latency output to any browser
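The composable-pipeline idea above — chaining a diffusion stage with processors like depth mapping — can be sketched as a sequence of frame transforms. Note this is a minimal illustrative sketch: the `Pipeline` class, the dict-based `Frame` type, and the stage functions are assumptions for demonstration, not Scope's actual API.

```python
from typing import Callable, List

# Illustrative frame representation: a plain dict of metadata and planes.
# Scope's real frame type will differ; this only shows the chaining pattern.
Frame = dict
Processor = Callable[[Frame], Frame]

class Pipeline:
    """Chains processors so each stage's output feeds the next stage."""

    def __init__(self, stages: List[Processor]):
        self.stages = stages

    def __call__(self, frame: Frame) -> Frame:
        for stage in self.stages:
            frame = stage(frame)
        return frame

# Stand-in stages (hypothetical): a depth estimator and a 2x upscaler.
def estimate_depth(frame: Frame) -> Frame:
    # Attach a dummy depth map with one row per pixel row of the input.
    return {**frame, "depth": [[0.5] * frame["width"]] * frame["height"]}

def upscale_2x(frame: Frame) -> Frame:
    # Double the reported resolution; pixel resampling omitted for brevity.
    return {**frame, "width": frame["width"] * 2, "height": frame["height"] * 2}

pipeline = Pipeline([estimate_depth, upscale_2x])
out = pipeline({"width": 4, "height": 2, "pixels": None})
print(out["width"], out["height"])  # 8 4
print(len(out["depth"]))            # 2 (depth was computed before upscaling)
```

Because each stage is just a callable taking and returning a frame, reordering or swapping processors (e.g. inserting frame interpolation between the two stages) needs no changes to the pipeline itself.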
## Demos
- Resolume Arena as live input into Scope via Spout
- Real-time depth-conditioned generation
## Links
- [GitHub](https://github.com/daydreamlive/scope)
- [Documentation](https://docs.daydream.live/scope/getting-started/quickstart)
- [Discord](https://discord.gg/mnfGR4Fjhp)
- [Daydream](https://www.daydream.live/)
## Citation
The VACE integration in Scope is described in:
```bibtex
@article{fosdick2026adapting,
  title={Adapting VACE for Real-Time Autoregressive Video Diffusion},
  author={Fosdick, Ryan},
  journal={arXiv preprint arXiv:2602.14381},
  year={2026}
}
```