Update model card: add pipeline tag, paper link, and sample usage
#4
by nielsr (HF Staff) - opened

README.md CHANGED
@@ -1,11 +1,13 @@
 ---
-license: mit
-datasets:
-- OpenResearcher/OpenResearcher-Dataset
 base_model:
 - nvidia/NVIDIA-Nemotron-3-Nano-30B-A3B-Base-BF16
+datasets:
+- OpenResearcher/OpenResearcher-Dataset
 library_name: transformers
+license: mit
+pipeline_tag: text-generation
 ---
+
 <div style="display: flex; align-items: center; justify-content: center; gap: 8px;">
 <img src="imgs/or-logo1.png" style="height: 84px; width: auto;">
 <img src="imgs/openresearcher-title.svg" style="height: 84px; width: auto;">
@@ -13,6 +15,7 @@ library_name: transformers
 
 
 <div align="center">
+<a href="https://huggingface.co/papers/2603.20278"><img src="https://img.shields.io/badge/arXiv-B31B1B?style=for-the-badge&logo=arXiv&logoColor=white" alt="Paper"></a>
 <a href="https://x.com/DongfuJiang/status/2020946549422031040"><img src="https://img.shields.io/badge/Twitter-000000?style=for-the-badge&logo=X&logoColor=white" alt="Twitter"></a>
 <a href="https://boiled-honeycup-4c7.notion.site/OpenResearcher-A-Fully-Open-Pipeline-for-Long-Horizon-Deep-Research-Trajectory-Synthesis-2f7e290627b5800cb3a0cd7e8d6ec0ea?source=copy_link"><img src="https://img.shields.io/badge/Blog-4285F4?style=for-the-badge&logo=google-chrome&logoColor=white" alt="Blog"></a>
 <a href="https://github.com/TIGER-AI-Lab/OpenResearcher"><img src="https://img.shields.io/badge/Github-181717?style=for-the-badge&logo=github&logoColor=white" alt="GitHub"></a>
@@ -22,7 +25,7 @@ library_name: transformers
 <!-- <a href="https://wandb.ai/dongfu/nano-v3-sft-search"><img src="https://img.shields.io/badge/WandB%20Logs-48B5A3?style=for-the-badge&logo=weightsandbiases&logoColor=white" alt="WandB Logs"></a> -->
 <a href="https://huggingface.co/datasets/OpenResearcher/OpenResearcher-Eval-Logs/tree/main"><img src="https://img.shields.io/badge/Eval%20Logs-755BB4?style=for-the-badge&logo=google-sheets&logoColor=white" alt="Eval Logs"></a>
 </div>
-
+
 <p align="center">
 🤗 <a href="https://huggingface.co/collections/TIGER-Lab/openresearcher" target="_blank">HuggingFace</a> |
 <img src="imgs/notion.svg" width="15px" style="display:inline;"> <a href="https://boiled-honeycup-4c7.notion.site/OpenResearcher-A-Fully-Open-Pipeline-for-Long-Horizon-Deep-Research-Trajectory-Synthesis-2f7e290627b5800cb3a0cd7e8d6ec0ea?source=copy_link" target="_blank">Blog</a> | <img src="imgs/slack.png" width="14px" style="display:inline;"> <a href="https://join.slack.com/t/openresearcher/shared_invite/zt-3p0r32cky-PqtZkVjjWIAI14~XwcRMfQ" target="_blank">Slack</a> | <img src="imgs/wechat.svg" width="14px" style="display:inline;"> <a href="https://github.com/TIGER-AI-Lab/OpenResearcher/blob/main/assets/imgs/wechat_group.jpg" target="_blank">WeChat</a>
@@ -30,7 +33,9 @@ library_name: transformers
 </p>
 
 ## OpenResearcher-30B-A3B Overview
-OpenResearcher-30B-A3B
+OpenResearcher-30B-A3B is an agentic large language model designed for long-horizon deep research, presented in the paper [OpenResearcher: A Fully Open Pipeline for Long-Horizon Deep Research Trajectory Synthesis](https://huggingface.co/papers/2603.20278).
+
+It is fine-tuned from [NVIDIA-Nemotron-3-Nano-30B-A3B-Base-BF16](https://huggingface.co/nvidia/NVIDIA-Nemotron-3-Nano-30B-A3B-Base-BF16) on the 96K-trajectory [OpenResearcher dataset](https://huggingface.co/datasets/OpenResearcher/OpenResearcher-Dataset), whose trajectories span **100+** turns. The dataset was distilled from GPT-OSS-120B using [native browser tools](https://docs.vllm.ai/projects/recipes/en/latest/OpenAI/GPT-OSS.html#usage:~:text=Limitation%20section%20below.-,Tool%20Use,-%C2%B6); see the [dataset card](https://huggingface.co/datasets/OpenResearcher/OpenResearcher-Dataset) for details.
 
 The model achieves an impressive **54.8%** accuracy on [BrowseComp-Plus](https://huggingface.co/spaces/Tevatron/BrowseComp-Plus), surpassing `GPT-4.1`, `Claude-Opus-4`, `Gemini-2.5-Pro`, `DeepSeek-R1`, and `Tongyi-DeepResearch`.
 <div align="center">
@@ -47,18 +52,44 @@ The model achieves an impressive **54.8%** accuracy on [BrowseComp-Plus](https:/
 We evaluate OpenResearcher-30B-A3B across a range of deep research benchmarks, including BrowseComp-Plus, BrowseComp, GAIA, and xbench-DeepSearch. Please find more details on [GitHub](https://github.com/TIGER-AI-Lab/OpenResearcher?tab=readme-ov-file#-benchmark-openresearcher).
 
 
-##
+## Sample Usage
+
+The following example demonstrates how to use `OpenResearcher-30B-A3B` for deep research within its agentic environment. This requires the tools and environment setup provided in the [official GitHub repository](https://github.com/TIGER-AI-Lab/OpenResearcher).
+
+```python
+import asyncio
+from deploy_agent import run_one, BrowserPool
+from utils.openai_generator import OpenAIAsyncGenerator
+
+async def main():
+    # Initialize generator and browser
+    generator = OpenAIAsyncGenerator(
+        base_url="http://localhost:8001/v1",
+        model_name="OpenResearcher/OpenResearcher-30B-A3B",
+        use_native_tools=True
+    )
+    browser_pool = BrowserPool(search_url=None, browser_backend="serper")
+
+    # Run deep research
+    await run_one(
+        question="What is the latest news about OpenAI?",
+        qid="quick_start",
+        generator=generator,
+        browser_pool=browser_pool,
+    )
 
-
+    browser_pool.cleanup("quick_start")
 
+if __name__ == "__main__":
+    asyncio.run(main())
+```
 
 ## Citation
 ```bibtex
-@
-title={OpenResearcher: A Fully Open Pipeline for Long-Horizon Deep Research Trajectory Synthesis},
-author={
-
-
-note={Notion Blog}
+@article{li2026openresearcher,
+  title={{OpenResearcher: A Fully Open Pipeline for Long-Horizon Deep Research Trajectory Synthesis}},
+  author={Li, Zhuofeng and Jiang, Dongfu and Ma, Xueguang and Zhang, Haoxiang and Nie, Ping and Zhang, Yuyu and Zou, Kai and Xie, Jianwen and Zhang, Yu and Chen, Wenhu},
+  journal={arXiv preprint arXiv:2603.20278},
+  year={2026}
 }
 ```
---
base_model:
- nvidia/NVIDIA-Nemotron-3-Nano-30B-A3B-Base-BF16
datasets:
- OpenResearcher/OpenResearcher-Dataset
library_name: transformers
license: mit
pipeline_tag: text-generation
---

<div style="display: flex; align-items: center; justify-content: center; gap: 8px;">
<img src="imgs/or-logo1.png" style="height: 84px; width: auto;">
<img src="imgs/openresearcher-title.svg" style="height: 84px; width: auto;">
</div>

<div align="center">
<a href="https://huggingface.co/papers/2603.20278"><img src="https://img.shields.io/badge/arXiv-B31B1B?style=for-the-badge&logo=arXiv&logoColor=white" alt="Paper"></a>
<a href="https://x.com/DongfuJiang/status/2020946549422031040"><img src="https://img.shields.io/badge/Twitter-000000?style=for-the-badge&logo=X&logoColor=white" alt="Twitter"></a>
<a href="https://boiled-honeycup-4c7.notion.site/OpenResearcher-A-Fully-Open-Pipeline-for-Long-Horizon-Deep-Research-Trajectory-Synthesis-2f7e290627b5800cb3a0cd7e8d6ec0ea?source=copy_link"><img src="https://img.shields.io/badge/Blog-4285F4?style=for-the-badge&logo=google-chrome&logoColor=white" alt="Blog"></a>
<a href="https://github.com/TIGER-AI-Lab/OpenResearcher"><img src="https://img.shields.io/badge/Github-181717?style=for-the-badge&logo=github&logoColor=white" alt="GitHub"></a>
<!-- <a href="https://wandb.ai/dongfu/nano-v3-sft-search"><img src="https://img.shields.io/badge/WandB%20Logs-48B5A3?style=for-the-badge&logo=weightsandbiases&logoColor=white" alt="WandB Logs"></a> -->
<a href="https://huggingface.co/datasets/OpenResearcher/OpenResearcher-Eval-Logs/tree/main"><img src="https://img.shields.io/badge/Eval%20Logs-755BB4?style=for-the-badge&logo=google-sheets&logoColor=white" alt="Eval Logs"></a>
</div>

<p align="center">
🤗 <a href="https://huggingface.co/collections/TIGER-Lab/openresearcher" target="_blank">HuggingFace</a> |
<img src="imgs/notion.svg" width="15px" style="display:inline;"> <a href="https://boiled-honeycup-4c7.notion.site/OpenResearcher-A-Fully-Open-Pipeline-for-Long-Horizon-Deep-Research-Trajectory-Synthesis-2f7e290627b5800cb3a0cd7e8d6ec0ea?source=copy_link" target="_blank">Blog</a> | <img src="imgs/slack.png" width="14px" style="display:inline;"> <a href="https://join.slack.com/t/openresearcher/shared_invite/zt-3p0r32cky-PqtZkVjjWIAI14~XwcRMfQ" target="_blank">Slack</a> | <img src="imgs/wechat.svg" width="14px" style="display:inline;"> <a href="https://github.com/TIGER-AI-Lab/OpenResearcher/blob/main/assets/imgs/wechat_group.jpg" target="_blank">WeChat</a>
</p>

## OpenResearcher-30B-A3B Overview

OpenResearcher-30B-A3B is an agentic large language model designed for long-horizon deep research, presented in the paper [OpenResearcher: A Fully Open Pipeline for Long-Horizon Deep Research Trajectory Synthesis](https://huggingface.co/papers/2603.20278).

It is fine-tuned from [NVIDIA-Nemotron-3-Nano-30B-A3B-Base-BF16](https://huggingface.co/nvidia/NVIDIA-Nemotron-3-Nano-30B-A3B-Base-BF16) on the 96K-trajectory [OpenResearcher dataset](https://huggingface.co/datasets/OpenResearcher/OpenResearcher-Dataset), whose trajectories span **100+** turns. The dataset was distilled from GPT-OSS-120B using [native browser tools](https://docs.vllm.ai/projects/recipes/en/latest/OpenAI/GPT-OSS.html#usage:~:text=Limitation%20section%20below.-,Tool%20Use,-%C2%B6); see the [dataset card](https://huggingface.co/datasets/OpenResearcher/OpenResearcher-Dataset) for details.

The model achieves an impressive **54.8%** accuracy on [BrowseComp-Plus](https://huggingface.co/spaces/Tevatron/BrowseComp-Plus), surpassing `GPT-4.1`, `Claude-Opus-4`, `Gemini-2.5-Pro`, `DeepSeek-R1`, and `Tongyi-DeepResearch`.
<div align="center">
<!-- results figure (elided) -->
</div>

We evaluate OpenResearcher-30B-A3B across a range of deep research benchmarks, including BrowseComp-Plus, BrowseComp, GAIA, and xbench-DeepSearch. Please find more details on [GitHub](https://github.com/TIGER-AI-Lab/OpenResearcher?tab=readme-ov-file#-benchmark-openresearcher).

## Sample Usage

The following example demonstrates how to use `OpenResearcher-30B-A3B` for deep research within its agentic environment. This requires the tools and environment setup provided in the [official GitHub repository](https://github.com/TIGER-AI-Lab/OpenResearcher).

```python
import asyncio
from deploy_agent import run_one, BrowserPool
from utils.openai_generator import OpenAIAsyncGenerator

async def main():
    # Initialize generator and browser
    generator = OpenAIAsyncGenerator(
        base_url="http://localhost:8001/v1",
        model_name="OpenResearcher/OpenResearcher-30B-A3B",
        use_native_tools=True
    )
    browser_pool = BrowserPool(search_url=None, browser_backend="serper")

    # Run deep research
    await run_one(
        question="What is the latest news about OpenAI?",
        qid="quick_start",
        generator=generator,
        browser_pool=browser_pool,
    )

    browser_pool.cleanup("quick_start")

if __name__ == "__main__":
    asyncio.run(main())
```
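The script above expects an OpenAI-compatible endpoint at `http://localhost:8001/v1`. One way to provide that endpoint is to serve the checkpoint with vLLM; the command below is an illustrative sketch (the port matches the script, but the exact serving flags the repository expects may differ; see the GitHub README):

```shell
# Illustrative serving command; assumed flags, check the repo's setup docs.
vllm serve OpenResearcher/OpenResearcher-30B-A3B --port 8001
```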

## Citation

```bibtex
@article{li2026openresearcher,
  title={{OpenResearcher: A Fully Open Pipeline for Long-Horizon Deep Research Trajectory Synthesis}},
  author={Li, Zhuofeng and Jiang, Dongfu and Ma, Xueguang and Zhang, Haoxiang and Nie, Ping and Zhang, Yuyu and Zou, Kai and Xie, Jianwen and Zhang, Yu and Chen, Wenhu},
  journal={arXiv preprint arXiv:2603.20278},
  year={2026}
}
```