---
arxiv: "2602.14381"
tags:
- video-generation
- real-time
- autoregressive
- diffusion
- webrtc
- vace
- wan
license: apache-2.0
---
# Daydream Scope
Scope is an open-source tool for running real-time, interactive generative AI video pipelines. It supports multiple autoregressive video diffusion models with WebRTC streaming, Spout integration for live VJ workflows, and a composable pipeline architecture.
## Highlights
- **Multiple model backends** — StreamDiffusionV2, LongLive, Krea Realtime, RewardForcing, MemFlow, and more via plugins
- **VACE integration** — Use reference images and control videos (depth, scribble, optical flow, etc.) to guide generation in real time
- **Spout / NDI** — Pipe video in and out of tools like Resolume Arena, TouchDesigner, OBS
- **LoRA support** — Customize styles and concepts on the fly
- **Composable pipelines** — Chain video diffusion with depth mapping, frame interpolation, and other processors
- **WebRTC streaming API** — Low-latency output to any browser
## Demos
Resolume Arena as live input into Scope via Spout:
<video src="https://huggingface.co/daydreamlive/scope/resolve/main/videos/resolume.mp4" controls autoplay loop muted></video>
Real-time depth-conditioned generation:
<video src="https://huggingface.co/daydreamlive/scope/resolve/main/videos/david_depth.mp4" controls autoplay loop muted></video>
## Links
- [GitHub](https://github.com/daydreamlive/scope)
- [Documentation](https://docs.daydream.live/scope/getting-started/quickstart)
- [Discord](https://discord.gg/mnfGR4Fjhp)
- [Daydream](https://www.daydream.live/)
## Citation
The VACE integration in Scope is described in:
```bibtex
@article{fosdick2026adapting,
  title={Adapting VACE for Real-Time Autoregressive Video Diffusion},
  author={Fosdick, Ryan},
  journal={arXiv preprint arXiv:2602.14381},
  year={2026}
}
```