Update pipeline tag, add library name and GitHub link, fix example import

#1
by nielsr HF Staff - opened
Files changed (1)
  1. README.md +10 -6
README.md CHANGED
@@ -1,10 +1,11 @@
 ---
-license: mit
-language:
-- en
 base_model:
 - Qwen/Qwen2.5-VL-7B-Instruct
-pipeline_tag: reinforcement-learning
+language:
+- en
+license: mit
+pipeline_tag: image-text-to-text
+library_name: transformers
 ---
 
 # 🧠 Ariadne
@@ -12,10 +13,12 @@ pipeline_tag: reinforcement-learning
 This is the official model checkpoint for the paper:
 **[Ariadne: A Controllable Framework for Probing and Extending VLM Reasoning Boundaries](https://arxiv.org/abs/2511.00710)**
 
+Code: https://github.com/Minghe-Shen/Ariadne
+
 ### 🔬 Example
 
 ```python
-
+import torch # Added for torch.bfloat16 and torch.cuda.is_available()
 from transformers import AutoModelForImageTextToText, AutoProcessor
 
 MODEL_ID = "..." # path
@@ -62,4 +65,5 @@ input_len = inputs["input_ids"].shape[1]
 gen_ids = sequences[0, input_len:]
 resp_text = processor.tokenizer.decode(
     gen_ids, skip_special_tokens=True, clean_up_tokenization_spaces=True
-).strip()
+).strip()
+```
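The last hunk of the diff ends with the prompt-stripping step: `generate` returns the prompt tokens followed by the newly generated ones, so the README example slices at `input_len` before decoding. A minimal, model-free sketch of that step, with made-up token ids standing in for the tensors the real example produces:

```python
# Sketch of the prompt-stripping step from the README example.
# `sequences` stands in for model.generate(...) output: each row is the
# prompt token ids followed by the newly generated ids (values invented).
sequences = [[101, 7592, 2088, 999, 102, 2023, 2003, 102]]
input_len = 5  # prompt length in tokens, i.e. inputs["input_ids"].shape[1]

# Keep only the tokens generated after the prompt; the diff's
# `gen_ids = sequences[0, input_len:]` does the same on a tensor.
gen_ids = sequences[0][input_len:]
print(gen_ids)  # the three newly generated token ids
```

Decoding `gen_ids` (rather than all of `sequences[0]`) is what keeps the prompt text out of `resp_text`.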