Update README.md
README.md CHANGED
@@ -12,7 +12,7 @@ model: Viglong/OriAnyV2_ckpt
 <h1>[NeurIPS 2025 Spotlight]<br>
 Orient Anything V2: Unifying Orientation and Rotation Understanding</h1>
 
-[**Zehan Wang**](https://scholar.google.com/citations?user=euXK0lkAAAAJ)<sup>1*</sup> · [**Ziang Zhang**](https://scholar.google.com/citations?hl=zh-CN&user=DptGMnYAAAAJ)<sup>1*</sup> · [**
+[**Zehan Wang**](https://scholar.google.com/citations?user=euXK0lkAAAAJ)<sup>1*</sup> · [**Ziang Zhang**](https://scholar.google.com/citations?hl=zh-CN&user=DptGMnYAAAAJ)<sup>1*</sup> · [**Jiayang Xu**](https://github.com/1339354001)<sup>1</sup> · [**Jialei Wang**](https://scholar.google.com/citations?hl=en&user=OIuFz1gAAAAJ)<sup>1</sup> · [**Tianyu Pang**](https://scholar.google.com/citations?hl=zh-CN&user=wYDbtFsAAAAJ)<sup>2</sup> · [**Du Chao**](https://scholar.google.com/citations?hl=zh-CN&user=QOp7xW0AAAAJ)<sup>2</sup> · [**Hengshuang Zhao**](https://scholar.google.com/citations?user=4uE10I0AAAAJ&hl&oi=ao)<sup>3</sup> · [**Zhou Zhao**](https://scholar.google.com/citations?user=IIoFY90AAAAJ&hl&oi=ao)<sup>1</sup>
 
 <sup>1</sup>Zhejiang University    <sup>2</sup>SEA AI Lab    <sup>3</sup>HKU
 
@@ -24,7 +24,7 @@ Orient Anything V2: Unifying Orientation and Rotation Understanding</h1>
 <a href='https://huggingface.co/spaces/Viglong/Orient-Anything-V2'><img src='https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Spaces-blue'></a>
 <a href='https://huggingface.co/datasets/Viglong/OriAnyV2_Train_Render'><img src='https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Train Data-orange'></a>
 <a href='https://huggingface.co/datasets/Viglong/OriAnyV2_Inference'><img src='https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Test Data-orange'></a>
-<a href='https://huggingface.co/papers/
+<a href='https://huggingface.co/papers/2601.05573'><img src='https://img.shields.io/badge/%F0%9F%A4%97%20Hugging%20Face-Paper-yellow'></a>
 </div>
 
 **Orient Anything V2**, a unified spatial vision model for understanding orientation, symmetry, and relative rotation, achieves SOTA performance across 14 datasets.
 
@@ -32,7 +32,7 @@ Orient Anything V2: Unifying Orientation and Rotation Understanding</h1>
 <!--  -->
 
 ## News
-* **2025-10-24:** 🔥[Paper](https://arxiv.org/abs/
+* **2025-10-24:** 🔥[Paper](https://arxiv.org/abs/2601.05573), [Project Page](https://orient-anythingv2.github.io), [Code](https://github.com/SpatialVision/Orient-Anything-V2), [Model Checkpoint](https://huggingface.co/Viglong/OriAnyV2_ckpt/blob/main/demo_ckpts/rotmod_realrotaug_best.pt), and [Demo](https://huggingface.co/spaces/Viglong/Orient-Anything-V2) have been released!
 
 * **2025-09-18:** 🔥Orient Anything V2 has been accepted as a Spotlight @ NeurIPS 2025!
 
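For readers who want to try the checkpoint linked in the News entry above, here is a minimal sketch (not part of the README diff itself): it only downloads the file with the `huggingface_hub` client and inspects its raw contents. It assumes `huggingface_hub` and `torch` are installed and that the `.pt` file stores a plain state/tensor dict; the actual model definition, preprocessing, and inference pipeline live in the linked SpatialVision/Orient-Anything-V2 repository and are not reproduced here.

```python
# Sketch only: fetch the released checkpoint referenced in the README's News entry.
# Repo id and filename are taken from the Model Checkpoint link above; loading the
# weights into the real model requires the code in
# https://github.com/SpatialVision/Orient-Anything-V2.
from huggingface_hub import hf_hub_download
import torch

ckpt_path = hf_hub_download(
    repo_id="Viglong/OriAnyV2_ckpt",
    filename="demo_ckpts/rotmod_realrotaug_best.pt",
)

# Assumption: the file holds a plain dict of tensors / a state dict; adjust if it
# wraps additional training metadata.
state = torch.load(ckpt_path, map_location="cpu")
if isinstance(state, dict):
    print(f"{len(state)} entries; first keys: {list(state)[:5]}")
```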