Instructions to use gasolsun/DynamicRAG-7B with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use gasolsun/DynamicRAG-7B with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

# DynamicRAG-7B is a causal language model, so "text-generation" is the
# matching pipeline task; the auto-generated "question-answering" task
# expects an extractive QA head and would not load this model correctly.
pipe = pipeline("text-generation", model="gasolsun/DynamicRAG-7B")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("gasolsun/DynamicRAG-7B")
model = AutoModelForCausalLM.from_pretrained("gasolsun/DynamicRAG-7B")
```

- Notebooks
- Google Colab
- Kaggle
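Since DynamicRAG uses the LLM itself as a reranker, a natural next step after loading the model is to prompt it with a question and numbered candidate passages. The prompt format below is a hypothetical illustration, not the model's documented template; check the model card and paper for the actual instruction format before relying on it.

```python
# Sketch: build a reranking-style prompt for a causal LM reranker.
# The exact instruction wording is an assumption for illustration only.

def build_rerank_prompt(question, passages):
    """Number the candidate passages and ask the model to rank them."""
    numbered = "\n".join(f"[{i + 1}] {p}" for i, p in enumerate(passages))
    return (
        "Rank the following passages by relevance to the question.\n"
        f"Question: {question}\n"
        f"Passages:\n{numbered}\n"
        "Ranking:"
    )

prompt = build_rerank_prompt(
    "What does DynamicRAG do?",
    ["DynamicRAG trains a dynamic reranker.", "An unrelated passage."],
)
print(prompt)

# To actually query the model (downloads ~7B of weights, needs a GPU
# for reasonable speed):
# from transformers import AutoTokenizer, AutoModelForCausalLM
# tokenizer = AutoTokenizer.from_pretrained("gasolsun/DynamicRAG-7B")
# model = AutoModelForCausalLM.from_pretrained("gasolsun/DynamicRAG-7B")
# inputs = tokenizer(prompt, return_tensors="pt")
# outputs = model.generate(**inputs, max_new_tokens=64)
# print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Keeping the prompt construction separate from the model call makes it easy to experiment with the instruction wording before committing to a full generation run.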
Improve model card (#2), opened by nielsr (HF Staff)
README.md CHANGED

````diff
@@ -6,7 +6,7 @@ datasets:
 language:
 - en
 license: apache-2.0
-pipeline_tag:
+pipeline_tag: text-ranking
 library_name: transformers
 ---
 
@@ -43,7 +43,7 @@ library_name: transformers
 - Explores improved trajectories and further optimizes the reranker.
 
 ## How to cite
-If you extend or use this work, please cite the [paper](https://arxiv.org/abs/
+If you extend or use this work, please cite the [paper](https://arxiv.org/abs/2505.07233) where it was introduced:
 
 ```
 @misc{sun2025dynamicragleveragingoutputslarge,
````