UniScale: Unified Scale-Aware 3D Reconstruction for Multi-View Understanding via Prior Injection for Robotic Perception
Paper · 2602.23224 · Published
Part of the ANIMA Perception Suite by Robot Flow Labs.
UniScale predicts camera intrinsics, extrinsics, metric depth, and 3D point clouds in a single neural network forward pass. The core design couples frozen DINOv2 ViT-B/14 foundation-model features with lightweight scale-aware pose and depth decoders, enabling metrically consistent multi-view 3D reconstruction without iterative optimization (no RANSAC, no bundle adjustment).
Available model formats:
| Format | File | Size | Use Case |
|---|---|---|---|
| PyTorch (.pth) | pytorch/urd_v1.pth | ~1.0 GB | Training, fine-tuning, resume |
| SafeTensors | pytorch/urd_v1.safetensors | ~347 MB | Fast loading, safe deserialization |
| ONNX | onnx/urd_v1.onnx | ~347 MB | Cross-platform inference |
| TensorRT FP16 | tensorrt/urd_v1_fp16.engine | ~177 MB | Edge deployment (Jetson/L4) |
| TensorRT FP32 | tensorrt/urd_v1_fp32.engine | ~355 MB | Full-precision inference |
Training checkpoint: final.pth (epoch 30/30). See configs/ for full hyperparameters and logs/training_history.json for loss curves.
```python
import torch
from anima_urd.model import UniScale

# Load from checkpoint
model = UniScale.load("pytorch/urd_v1.pth", device="cuda")
model.eval()

# Inference: 4 multi-view images at 512x512
images = torch.randn(1, 4, 3, 512, 512, device="cuda")
with torch.no_grad():
    output = model(images)

depth = output.depth_maps              # [1, 4, 512, 512] metric depth (meters)
confidence = output.depth_confidence   # [1, 4, 512, 512]
intrinsics = output.intrinsics         # [1, 3, 3]
scale = output.scale_factors           # [1] metric scale
```
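The depth maps, intrinsics, and scale factor above are sufficient to recover a metric point cloud per view. A minimal numpy sketch of pinhole backprojection; the synthetic depth and intrinsics here are placeholders, and the model also emits point clouds directly, so this only illustrates the geometry:

```python
import numpy as np

def backproject(depth, K, scale=1.0):
    """Unproject a depth map to a metric 3D point cloud via pinhole intrinsics K."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))  # pixel coordinates
    fx, fy = K[0, 0], K[1, 1]
    cx, cy = K[0, 2], K[1, 2]
    z = depth * scale                # apply the predicted metric scale factor
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.stack([x, y, z], axis=-1).reshape(-1, 3)  # [H*W, 3]

depth = np.full((512, 512), 2.0)  # synthetic flat plane 2 m from the camera
K = np.array([[512.0, 0.0, 256.0],
              [0.0, 512.0, 256.0],
              [0.0,   0.0,   1.0]])
points = backproject(depth, K)
print(points.shape)  # (262144, 3)
```

The pixel at the principal point (256, 256) maps to (0, 0, 2), i.e. straight along the optical axis at 2 m, which is a quick sanity check on the intrinsics.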
UniScale is designed for multi-robot coordination, where agents must agree on a consistent metric scale.
```bibtex
@article{UniScale_2026,
  title={UniScale: Unified Scale-Aware Multi-View 3D Reconstruction},
  year={2026},
  eprint={2602.23224},
  archivePrefix={arXiv},
  primaryClass={cs.CV},
  url={https://arxiv.org/abs/2602.23224}
}
```
License: Apache 2.0 · Robot Flow Labs / AIFLOW LABS LIMITED