---
arxiv: "2602.14381"
tags:
- video-generation
- real-time
- autoregressive
- diffusion
- webrtc
- vace
- wan
license: apache-2.0
---
# Daydream Scope
Scope is an open-source tool for running real-time, interactive generative AI video pipelines. It supports multiple autoregressive video diffusion models with WebRTC streaming, Spout integration for live VJ workflows, and a composable pipeline architecture.
## Highlights
- **Multiple model backends** — StreamDiffusionV2, LongLive, Krea Realtime, RewardForcing, MemFlow, and more via plugins
- **VACE integration** — Use reference images and control videos (depth, scribble, optical flow, etc.) to guide generation in real time
- **Spout / NDI** — Pipe video in and out of tools like Resolume Arena, TouchDesigner, OBS
- **LoRA support** — Customize styles and concepts on the fly
- **Composable pipelines** — Chain video diffusion with depth mapping, frame interpolation, and other processors
- **WebRTC streaming API** — Low-latency output to any browser
## Demos
Resolume Arena feeding live input into Scope via Spout:
<video src="https://huggingface.co/daydreamlive/scope/resolve/main/videos/resolume.mp4" controls autoplay loop muted></video>
Real-time depth-conditioned generation:
<video src="https://huggingface.co/daydreamlive/scope/resolve/main/videos/david_depth.mp4" controls autoplay loop muted></video>
## Links
- [GitHub](https://github.com/daydreamlive/scope)
- [Documentation](https://docs.daydream.live/scope/getting-started/quickstart)
- [Discord](https://discord.gg/mnfGR4Fjhp)
- [Daydream](https://www.daydream.live/)
## Citation
The VACE integration in Scope is described in:
```bibtex
@article{fosdick2026adapting,
  title={Adapting VACE for Real-Time Autoregressive Video Diffusion},
  author={Fosdick, Ryan},
  journal={arXiv preprint arXiv:2602.14381},
  year={2026}
}
```