Instructions for using depth-anything/Depth-Anything-V2-Small with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- DepthAnythingV2
How to use depth-anything/Depth-Anything-V2-Small with DepthAnythingV2:
```python
# Install from https://github.com/DepthAnything/Depth-Anything-V2

# Load the model and infer depth from an image
import cv2
import torch
from huggingface_hub import hf_hub_download
from depth_anything_v2.dpt import DepthAnythingV2

# Instantiate the model (ViT-S configuration)
model = DepthAnythingV2(encoder="vits", features=64, out_channels=[48, 96, 192, 384])

# Download and load the weights
filepath = hf_hub_download(
    repo_id="depth-anything/Depth-Anything-V2-Small",
    filename="depth_anything_v2_vits.pth",
    repo_type="model",
)
state_dict = torch.load(filepath, map_location="cpu")
model.load_state_dict(state_dict)
model.eval()

raw_img = cv2.imread("your/image/path")
depth = model.infer_image(raw_img)  # HxW raw depth map in numpy
```
- Notebooks
- Google Colab
- Kaggle
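The raw depth map returned by `infer_image` is a float array with relative (not metric) values, so it usually needs rescaling before it can be saved or displayed as an image. A minimal sketch of that step, using a small hypothetical array in place of the model output:

```python
import numpy as np

# Stand-in for the HxW float depth map returned by model.infer_image
depth = np.array([[0.5, 1.5], [2.5, 4.5]], dtype=np.float32)

# Min-max normalize to the 0-255 range and convert to 8-bit
depth_norm = (depth - depth.min()) / (depth.max() - depth.min()) * 255.0
depth_u8 = depth_norm.astype(np.uint8)

# The result can then be written out, e.g. cv2.imwrite("depth.png", depth_u8)
```

Because the values are relative depth, this normalization is per-image; two outputs with the same gray level do not correspond to the same physical distance.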
Make sure download stats work
Corresponding PR: https://github.com/huggingface/huggingface.js/pull/785
README.md
CHANGED

```diff
@@ -3,6 +3,7 @@ license: apache-2.0
 language:
 - en
 pipeline_tag: depth-estimation
+library_name: depth-anything-v2
 tags:
 - depth
 - relative depth
```
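After this change, the model card's YAML front matter would read roughly as follows (reconstructed from the diff above; the `library_name` field is what lets the Hub attribute downloads of this repo to the depth-anything-v2 library):

```yaml
license: apache-2.0
language:
- en
pipeline_tag: depth-estimation
library_name: depth-anything-v2
tags:
- depth
- relative depth
```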