Add essential metadata and improve links in model card
#2
by nielsr (HF Staff) · opened
README.md
CHANGED
@@ -1,3 +1,9 @@
+---
+license: apache-2.0
+library_name: transformers
+pipeline_tag: text-generation
+---
+
 # ⚛️ Atom-Searcher: Enhancing Agentic Deep Research via Fine-Grained Atomic Thought Reward
 
 <p align="center">
@@ -77,7 +83,7 @@ We demonstrate through extensive experiments that Atom-Searcher sets a new state
 - 🚀 It achieves significant performance gains over strong baselines like **DeepResearcher** and **R1-Searcher** on seven distinct benchmarks.
 - 🧠 At test time, Atom-Searcher **scales its computation effectively**, generating 3.2x more tokens and making 1.24x more tool calls on average than the SOTA baseline, indicating deeper exploration and reasoning without explicit incentives.
 
-🤗[Hugging Face
+🤗 [Hugging Face Collection](https://huggingface.co/collections/ant-group/atom-searcher)
 
 -----
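The metadata added by this PR is a plain YAML front-matter block delimited by `---` lines at the top of `README.md`. As a minimal sketch of how such a block can be read, here is a hand-rolled, stdlib-only parser for flat `key: value` metadata (an illustration only, not the Hub's actual implementation, and it does not handle nested YAML):

```python
def parse_front_matter(readme_text):
    """Extract a flat key: value metadata block delimited by '---' lines
    at the top of a model card. Returns {} if no block is present."""
    lines = readme_text.splitlines()
    if not lines or lines[0].strip() != "---":
        return {}
    meta = {}
    for line in lines[1:]:
        if line.strip() == "---":  # closing delimiter ends the block
            break
        if ":" in line:
            key, _, value = line.partition(":")
            meta[key.strip()] = value.strip()
    return meta

# Example using the exact front matter added in this PR.
card = """---
license: apache-2.0
library_name: transformers
pipeline_tag: text-generation
---

# Atom-Searcher
"""
print(parse_front_matter(card))
# → {'license': 'apache-2.0', 'library_name': 'transformers', 'pipeline_tag': 'text-generation'}
```

On the Hub, these particular keys drive the license badge, the suggested loading library, and the task widget shown on the model page, which is why the PR calls them essential.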