This is the Jittor implementation of the DepthAnything series for monocular depth estimation, including DepthAnything, DepthAnything V2, and DepthAnything-AC. Built on the Jittor deep learning framework, it provides efficient solutions for both training and inference.
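Models in this series predict a relative (not metric) depth map per image. As a minimal sketch of the usual post-processing step, the snippet below min-max normalizes a raw prediction to an 8-bit image for visualization; it uses NumPy with a random array standing in for a model output, since this card does not document the repository's actual inference API.

```python
import numpy as np

def normalize_depth(depth: np.ndarray) -> np.ndarray:
    """Min-max normalize a relative depth prediction to uint8 [0, 255]."""
    d_min, d_max = float(depth.min()), float(depth.max())
    if d_max - d_min < 1e-8:  # flat prediction: avoid division by zero
        return np.zeros_like(depth, dtype=np.uint8)
    scaled = (depth - d_min) / (d_max - d_min)
    return (scaled * 255.0).astype(np.uint8)

# Stand-in for an H x W relative-depth prediction from the model.
pred = np.random.rand(480, 640).astype(np.float32)
vis = normalize_depth(pred)  # uint8 image, ready to save or colormap
```

The resulting array can be saved directly or passed through a colormap for the familiar depth visualizations.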

This repository contains the official Jittor implementation of the following papers:

Depth Anything at Any Condition
Boyuan Sun*, Modi Jin*, Bowen Yin, Qibin Hou
arXiv 2025. Paper Link | Homepage | PyTorch Version

Depth Anything V2
Lihe Yang, Bingyi Kang, Zilong Huang, Zhen Zhao, Xiaogang Xu, Jiashi Feng, Hengshuang Zhao
NeurIPS 2024. Paper Link | Homepage | PyTorch Version (without training code)

Depth Anything: Unleashing the Power of Large-Scale Unlabeled Data
Lihe Yang, Bingyi Kang, Zilong Huang, Xiaogang Xu, Jiashi Feng, Hengshuang Zhao
CVPR 2024. Paper Link | Homepage | PyTorch Version (without training code)
