Papers
arxiv:2603.26546

AutoWeather4D: Autonomous Driving Video Weather Conversion via G-Buffer Dual-Pass Editing

Published on Mar 27
· Submitted by tianyu liu on Apr 1
· HKUST

Abstract

AutoWeather4D is a 3D-aware weather editing framework that decouples geometry and illumination through a dual-pass mechanism, enabling efficient and physically accurate weather modification for autonomous driving applications.

AI-generated summary

Generative video models have significantly advanced the photorealistic synthesis of adverse weather for autonomous driving; however, they consistently demand massive datasets to learn rare weather scenarios. While 3D-aware editing methods alleviate these data constraints by augmenting existing video footage, they are fundamentally bottlenecked by costly per-scene optimization and suffer from inherent geometric and illumination entanglement. In this work, we introduce AutoWeather4D, a feed-forward 3D-aware weather editing framework designed to explicitly decouple geometry and illumination. At the core of our approach is a G-buffer Dual-pass Editing mechanism. The Geometry Pass leverages explicit structural foundations to enable surface-anchored physical interactions, while the Light Pass analytically resolves light transport, accumulating the contributions of local illuminants into the global illumination to enable dynamic 3D local relighting. Extensive experiments demonstrate that AutoWeather4D achieves comparable photorealism and structural consistency to generative baselines while enabling fine-grained parametric physical control, serving as a practical data engine for autonomous driving.

Community

Paper author · Paper submitter

Introducing a feed-forward framework for autonomous driving video editing. By explicitly decoupling geometry from lighting and performing direct editing on the G-buffer, our method achieves efficient, physically controllable, and photorealistic transitions across multiple weather conditions and times of day.

With this framework, editing real-world driving videos becomes as intuitive as using a game engine: it allows for flexible control over scene illumination while naturally rendering dynamic weather effects like rain, snow, and fog.
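The actual implementation is in the linked repository; purely as an illustration of the two-pass idea described above, a toy NumPy sketch is given below. All function names and models here are assumptions for illustration (a Lambertian "Light Pass" that accumulates local illuminants into the global term, and a depth-anchored exponential fog standing in for the "Geometry Pass" surface/volume interactions), not the authors' actual method.

```python
import numpy as np

def light_pass(albedo, normals, sun_dir, sun_color, point_lights, positions):
    """Toy analytic light transport: start from a global (sun) Lambertian
    term, then accumulate each local illuminant's contribution.
    albedo, normals, positions: (H, W, 3); sun_dir, sun_color: (3,)
    point_lights: list of (light_position (3,), light_color (3,)) pairs."""
    shading = np.clip(normals @ sun_dir, 0.0, None)[..., None] * sun_color
    for light_pos, light_color in point_lights:
        to_light = light_pos - positions
        dist = np.linalg.norm(to_light, axis=-1, keepdims=True)
        l_dir = to_light / np.maximum(dist, 1e-6)
        ndotl = np.clip(np.sum(normals * l_dir, axis=-1, keepdims=True),
                        0.0, None)
        # Inverse-square falloff accumulated into the global illumination.
        shading += ndotl * light_color / np.maximum(dist ** 2, 1e-6)
    return albedo * shading

def fog_from_depth(image, depth, fog_color, density):
    """Toy depth-anchored weather effect: exponential transmittance
    blends distant pixels toward the fog color."""
    transmittance = np.exp(-density * depth)[..., None]
    return image * transmittance + fog_color * (1.0 - transmittance)
```

Because the lighting is resolved analytically per frame from G-buffer quantities (normals, positions, depth) rather than baked into pixels, illumination and weather parameters stay independently editable, which is the decoupling the framework emphasizes.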

🔗 Project Page: https://lty2226262.github.io/autoweather4d/
📄 arXiv: https://arxiv.org/abs/2603.26546
💻 Code: https://github.com/lty2226262/AutoWeather4D
🤗 Hugging Face Paper: https://huggingface.co/papers/2603.26546

Get this paper in your agent:

hf papers read 2603.26546
Don't have the latest CLI?
curl -LsSf https://hf.co/cli/install.sh | bash
