This is the Jittor implementation of the DepthAnything series for monocular depth estimation, covering DepthAnything, DepthAnything V2, and DepthAnything-AC. Built on the Jittor deep learning framework, it provides efficient training and inference.
This repository contains the official Jittor implementation of the following papers:
Depth Anything at Any Condition
Boyuan Sun*, Modi Jin*, Bowen Yin, Qibin Hou†
arXiv 2025. Paper Link | Homepage | PyTorch Version
Depth Anything V2
Lihe Yang, Bingyi Kang, Zilong Huang, Zhen Zhao, Xiaogang Xu, Jiashi Feng, Hengshuang Zhao†
NeurIPS 2024. Paper Link | Homepage | PyTorch Version without training
Depth Anything: Unleashing the Power of Large-Scale Unlabeled Data
Lihe Yang, Bingyi Kang, Zilong Huang, Xiaogang Xu, Jiashi Feng, Hengshuang Zhao†
CVPR 2024. Paper Link | Homepage | PyTorch Version without training
Model tree for ghost233lism/NK-depth-Jittor
Base model: depth-anything/Depth-Anything-V2-Small