Commit 839af38 · verified · 0 parent(s)

Super-squash branch 'main' using huggingface_hub


Co-authored-by: patrickvonplaten <patrickvonplaten@users.noreply.huggingface.co>
Co-authored-by: pandora-s <pandora-s@users.noreply.huggingface.co>

.gitattributes ADDED
@@ -0,0 +1,37 @@
+ *.7z filter=lfs diff=lfs merge=lfs -text
+ *.arrow filter=lfs diff=lfs merge=lfs -text
+ *.bin filter=lfs diff=lfs merge=lfs -text
+ *.bz2 filter=lfs diff=lfs merge=lfs -text
+ *.ckpt filter=lfs diff=lfs merge=lfs -text
+ *.ftz filter=lfs diff=lfs merge=lfs -text
+ *.gz filter=lfs diff=lfs merge=lfs -text
+ *.h5 filter=lfs diff=lfs merge=lfs -text
+ *.joblib filter=lfs diff=lfs merge=lfs -text
+ *.lfs.* filter=lfs diff=lfs merge=lfs -text
+ *.mlmodel filter=lfs diff=lfs merge=lfs -text
+ *.model filter=lfs diff=lfs merge=lfs -text
+ *.msgpack filter=lfs diff=lfs merge=lfs -text
+ *.npy filter=lfs diff=lfs merge=lfs -text
+ *.npz filter=lfs diff=lfs merge=lfs -text
+ *.onnx filter=lfs diff=lfs merge=lfs -text
+ *.ot filter=lfs diff=lfs merge=lfs -text
+ *.parquet filter=lfs diff=lfs merge=lfs -text
+ *.pb filter=lfs diff=lfs merge=lfs -text
+ *.pickle filter=lfs diff=lfs merge=lfs -text
+ *.pkl filter=lfs diff=lfs merge=lfs -text
+ *.pt filter=lfs diff=lfs merge=lfs -text
+ *.pth filter=lfs diff=lfs merge=lfs -text
+ *.rar filter=lfs diff=lfs merge=lfs -text
+ *.safetensors filter=lfs diff=lfs merge=lfs -text
+ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
+ *.tar.* filter=lfs diff=lfs merge=lfs -text
+ *.tar filter=lfs diff=lfs merge=lfs -text
+ *.tflite filter=lfs diff=lfs merge=lfs -text
+ *.tgz filter=lfs diff=lfs merge=lfs -text
+ *.wasm filter=lfs diff=lfs merge=lfs -text
+ *.xz filter=lfs diff=lfs merge=lfs -text
+ *.zip filter=lfs diff=lfs merge=lfs -text
+ *.zst filter=lfs diff=lfs merge=lfs -text
+ *tfevents* filter=lfs diff=lfs merge=lfs -text
+ tekken.json filter=lfs diff=lfs merge=lfs -text
+ tokenizer.json filter=lfs diff=lfs merge=lfs -text
CHAT_SYSTEM_PROMPT.txt ADDED
@@ -0,0 +1,29 @@
+ You are Devstral-Small-2-24B-Instruct-2512, a Large Language Model (LLM) created by Mistral AI, a French startup headquartered in Paris.
+ You power an AI assistant called Le Chat.
+ Your knowledge base was last updated on 2023-10-01.
+ The current date is {today}.
+
+ When you're not sure about some information, or when the user's request requires up-to-date or specific data, you must use the available tools to fetch the information. Do not hesitate to use tools whenever they can provide a more accurate or complete response. If no relevant tools are available, then clearly state that you don't have the information and avoid making anything up.
+ If the user's question is not clear, is ambiguous, or does not provide enough context for you to accurately answer, you do not try to answer it right away; instead, you ask the user to clarify their request (e.g. "What are some good restaurants around me?" => "Where are you?" or "When is the next flight to Tokyo?" => "Where do you travel from?").
+ You are always very attentive to dates; in particular, you try to resolve dates (e.g. "yesterday" is {yesterday}), and when asked about information at specific dates, you discard information from other dates.
+ You follow these instructions in all languages, and always respond to the user in the language they use or request.
+ The next sections describe the capabilities that you have.
+
+ # WEB BROWSING INSTRUCTIONS
+
+ You cannot perform any web search or access the internet to open URLs, links, etc. If it seems like the user expects you to do so, clarify the situation and ask the user to copy-paste the text directly into the chat.
+
+ # MULTI-MODAL INSTRUCTIONS
+
+ You have the ability to read images, but you cannot generate images.
+ You cannot read or transcribe audio files or videos.
+
+ # TOOL CALLING INSTRUCTIONS
+
+ You may have access to tools that you can use to fetch information or perform actions. You must use these tools in the following situations:
+
+ 1. When the request requires up-to-date information.
+ 2. When the request requires specific data that you do not have in your knowledge base.
+ 3. When the request involves actions that you cannot perform without tools.
+
+ Always prioritize using tools to provide the most accurate and helpful response. If tools are not available, inform the user that you cannot perform the requested action at the moment.
README.md ADDED
@@ -0,0 +1,1607 @@
+ ---
+ library_name: vllm
+ inference: false
+ base_model:
+ - mistralai/Mistral-Small-3.1-24B-Base-2503
+ extra_gated_description: >-
+   If you want to learn more about how we process your personal data, please read
+   our <a href="https://mistral.ai/terms/">Privacy Policy</a>.
+ tags:
+ - mistral-common
+ license: apache-2.0
+ ---
+
+ # Devstral Small 2 24B Instruct 2512
+ Devstral is an agentic LLM for software engineering tasks. **Devstral Small 2** excels at using tools to explore codebases, editing multiple files, and powering software engineering agents.
+ The model achieves strong performance on SWE-bench.
+
+ This model is an Instruct model in **FP8**, fine-tuned to follow instructions, making it well suited to chat, agentic, and instruction-based software engineering use cases.
+
+ For enterprises requiring specialized capabilities (increased context, domain-specific knowledge, etc.), we invite companies to [reach out to us](https://mistral.ai/contact).
+
+ ## Key Features
+ The Devstral Small 2 Instruct model offers the following capabilities:
+ - **Agentic Coding**: Devstral is designed to excel at agentic coding tasks, making it a great choice for software engineering agents.
+ - **Lightweight**: With its compact size of just 24 billion parameters, Devstral is light enough to run on a single RTX 4090 or a Mac with 32GB RAM, making it an appropriate model for local deployment and on-device use.
+ - **Apache 2.0 License**: Open-source license allowing usage and modification for both commercial and non-commercial purposes.
+ - **Context Window**: A 256k-token context window.
+
+ Updates compared to [`Devstral Small 1.1`](https://huggingface.co/mistralai/Devstral-Small-2507):
+ - **Vision Capabilities**: Enables the model to analyze images and provide insights based on visual content, in addition to text.
+ - **Improved Performance**: Devstral Small 2 is a step up from its predecessors.
+ - **Attention Softmax Temperature**: Devstral Small 2 uses the same architecture as Ministral 3, with RoPE scaling as introduced by Llama 4 and [Scalable-Softmax Is Superior for Attention](https://arxiv.org/abs/2501.19399).
+ - **Better Generalization**: Generalises better to diverse prompts and coding environments.
+
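For context on the last two bullets: the Scalable-Softmax paper linked above replaces the standard softmax over attention logits with a length-aware variant. Roughly (our paraphrase of the paper's formulation, not a specification of Devstral's exact implementation):

$$\mathrm{SSMax}(z)_i = \frac{n^{\,s z_i}}{\sum_j n^{\,s z_j}} = \mathrm{softmax}\big(s \log(n)\, z\big)_i$$

where $n$ is the input length and $s$ is a scaling parameter; the $s \log(n)$ factor keeps attention from flattening toward uniform as the context grows.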
+ ### Use Cases
+
+ AI code assistants, agentic coding, and software engineering tasks: leveraging advanced AI capabilities for complex tool integration and deep codebase understanding in coding environments.
+
+ ## Benchmark Results
+
+ | Model/Benchmark               | Size (B params) | SWE Bench Verified | SWE Bench Multilingual | Terminal Bench 2 |
+ |-------------------------------|-----------------|--------------------|------------------------|------------------|
+ | **Devstral 2**                | 123             | 72.2%              | 61.3%                  | 32.6%            |
+ | **Devstral Small 2**          | 24              | 68.0%              | 55.7%                  | 22.5%            |
+ |                               |                 |                    |                        |                  |
+ | GLM 4.6                       | 455             | 68.0%              | --                     | 24.6%            |
+ | Qwen 3 Coder Plus             | 480             | 69.6%              | 54.7%                  | 25.4%            |
+ | MiniMax M2                    | 230             | 69.4%              | 56.5%                  | 30.0%            |
+ | Kimi K2 Thinking              | 1000            | 71.3%              | 61.1%                  | 35.7%            |
+ | DeepSeek v3.2                 | 671             | 73.1%              | 70.2%                  | 46.4%            |
+ |                               |                 |                    |                        |                  |
+ | GPT 5.1 Codex High            | --              | 73.7%              | --                     | 52.8%            |
+ | GPT 5.1 Codex Max             | --              | 77.9%              | --                     | 60.4%            |
+ | Gemini 3 Pro                  | --              | 76.2%              | --                     | 54.2%            |
+ | Claude Sonnet 4.5             | --              | 77.2%              | 68.0%                  | 42.8%            |
+
+ *Benchmark results for competitor models are based on publicly reported values.
+
+ ## Usage
+
+ ### Scaffolding
+
+ Together with Devstral 2, we are releasing **Mistral Vibe**, a CLI tool that lets developers leverage Devstral's capabilities directly from the terminal.
+ - [Mistral Vibe (recommended)](https://github.com/mistralai/mistral-vibe): Learn how to use it [here](#mistral-vibe)
+
+ Devstral 2 can also be used with the following scaffoldings:
+ - [Cline](https://github.com/cline/cline)
+ - [Kilo Code](https://github.com/Kilo-Org/kilocode)
+ - [Claude Code](https://github.com/anthropics/claude-code)
+ - [OpenHands](https://github.com/All-Hands-AI/OpenHands/tree/main)
+ - [SWE Agent](https://github.com/SWE-agent/SWE-agent)
+
+ You can use Devstral 2 either through our API or by running it locally.
+
+ #### Mistral Vibe
+
+ The [Mistral Vibe CLI](https://github.com/mistralai/mistral-vibe) is a command-line tool designed to help developers leverage Devstral's capabilities directly from their terminal.
+
+ We recommend installing Mistral Vibe using `uv` for faster and more reliable dependency management:
+ ```sh
+ uv tool install mistral-vibe
+ ```
+ You can also run:
+ ```sh
+ curl -LsSf https://mistral.ai/vibe/install.sh | sh
+ ```
+
+ If you prefer using pip:
+ ```sh
+ pip install mistral-vibe
+ ```
+
+ To launch the CLI, navigate to your project's root directory and run:
+ ```sh
+ vibe
+ ```
+
+ If this is your first time running Vibe, it will:
+ - Create a default configuration file at `~/.vibe/config.toml`.
+ - Prompt you to enter your API key if one is not already configured; follow these [instructions](https://docs.mistral.ai/getting-started/quickstart/#account-setup) to create an account and get an API key.
+ - Save your API key to `~/.vibe/.env` for future use.
+
+ ### Local Deployment
+
+ The model can also be deployed with the following libraries. If local serving underperforms, we advise falling back to the Mistral AI API:
+ - [`vllm (recommended)`](https://github.com/vllm-project/vllm): See [here](#vllm-recommended)
+ - [`transformers`](https://github.com/huggingface/transformers): See [here](#transformers)
+
+ Coming soon:
+ - [`llama.cpp`](https://github.com/ggml-org/llama.cpp)
+ - [`ollama`](https://ollama.com/)
+ - [`lmstudio`](https://lmstudio.ai/)
+
+ > [!Note]
+ > Current llama.cpp/ollama/lmstudio implementations may not be accurate; we invite developers to test them via the following [prompt tests](#tests).
+
+ #### vLLM (recommended)
+
+ <details>
+ <summary>Expand</summary>
+
+ We recommend using this model with the [vLLM library](https://github.com/vllm-project/vllm)
+ to implement production-ready inference pipelines.
+
+ **_Installation_**
+
+ Please make sure to use our custom vLLM Docker image [mistralllm/vllm_devstral:latest](https://hub.docker.com/repository/docker/mistralllm/vllm_devstral/tags/latest/sha256:d2ca883e8b4e0bec7d6953706410d2741e88ade6e07e576a51756f4bf51a0ffd):
+
+ ```sh
+ docker pull mistralllm/vllm_devstral:latest
+ docker run -it mistralllm/vllm_devstral:latest
+ ```
+
+ Alternatively, you can install `vllm` from the latest main branch by following the instructions [here](https://docs.vllm.ai/en/latest/getting_started/installation/gpu/#python-only-build).
+
+ > [!Warning]
+ > Make sure that your vLLM installation includes [this commit](https://github.com/vllm-project/vllm/commit/5c213d2899f5a2d439c8d771a0abc156a5412a2b).
+ > Without this commit, tool calls will be parsed incorrectly.
+
+ Also make sure you have installed [`mistral_common >= 1.8.6`](https://github.com/mistralai/mistral-common/releases/tag/v1.8.6).
+ To check:
+ ```sh
+ python -c "import mistral_common; print(mistral_common.__version__)"
+ ```
+
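If you want to enforce the minimum version programmatically rather than eyeballing the printed string, a small helper like the following works; `at_least` is our own illustrative function, not part of `mistral_common`:

```python
def at_least(version: str, minimum: str) -> bool:
    """Compare dotted version strings numerically, e.g. '1.10.0' >= '1.8.6'
    (a plain string comparison would get this wrong)."""
    to_tuple = lambda v: tuple(int(part) for part in v.split("."))
    return to_tuple(version) >= to_tuple(minimum)

# Example: guard an import on the required mistral_common version.
# import mistral_common
# assert at_least(mistral_common.__version__, "1.8.6")
print(at_least("1.10.0", "1.8.6"))  # True
```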
+ **_Launch server_**
+
+ We recommend that you use Devstral in a server/client setting.
+
+ 1. Spin up a server:
+
+ ```sh
+ vllm serve mistralai/Devstral-Small-2-24B-Instruct-2512 --tool-call-parser mistral --enable-auto-tool-choice --tensor-parallel-size 2
+ ```
+
+ 2. To query the server, you can use a simple Python snippet:
+
+ ```py
+ import json
+
+ import requests
+ from huggingface_hub import hf_hub_download
+
+ url = "http://<your-server-url>:8000/v1/chat/completions"
+ headers = {"Content-Type": "application/json", "Authorization": "Bearer token"}
+
+ model = "mistralai/Devstral-Small-2-24B-Instruct-2512"
+
+ def load_system_prompt(repo_id: str, filename: str) -> str:
+     file_path = hf_hub_download(repo_id=repo_id, filename=filename)
+     with open(file_path, "r") as file:
+         system_prompt = file.read()
+     return system_prompt
+
+ SYSTEM_PROMPT = load_system_prompt(model, "CHAT_SYSTEM_PROMPT.txt")
+
+ messages = [
+     {"role": "system", "content": SYSTEM_PROMPT},
+     {
+         "role": "user",
+         "content": [
+             {
+                 "type": "text",
+                 "text": "<your-command>",
+             },
+         ],
+     },
+ ]
+
+ data = {"model": model, "messages": messages, "temperature": 0.15}
+
+ # Devstral Small 2 supports tool calling. If you want to use tools, follow this:
+ # tools = [  # Define tools for vLLM
+ #     {
+ #         "type": "function",
+ #         "function": {
+ #             "name": "git_clone",
+ #             "description": "Clone a git repository",
+ #             "parameters": {
+ #                 "type": "object",
+ #                 "properties": {
+ #                     "url": {
+ #                         "type": "string",
+ #                         "description": "The url of the git repository",
+ #                     },
+ #                 },
+ #                 "required": ["url"],
+ #             },
+ #         },
+ #     }
+ # ]
+ # data = {"model": model, "messages": messages, "temperature": 0.15, "tools": tools}  # Pass tools to payload.
+
+ response = requests.post(url, headers=headers, data=json.dumps(data))
+ print(response.json()["choices"][0]["message"]["content"])
+ ```
+ </details>
+
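When the server answers with `tool_calls` instead of plain content, the client is expected to execute each call locally and append the results as `tool` messages before re-querying. A minimal dispatch loop might look like this; the message shape follows the OpenAI-compatible schema shown above, while the `git_clone` stand-in and `TOOL_REGISTRY` are our own illustration, not part of any Mistral library:

```python
import json

# Hypothetical local implementation for the `git_clone` tool declared above.
def git_clone(url: str) -> str:
    return f"cloned {url}"  # stand-in for actually shelling out to `git clone`

TOOL_REGISTRY = {"git_clone": git_clone}

def run_tool_calls(message: dict) -> list[dict]:
    """Execute every tool call in an assistant message and build the `tool`
    messages to append to the conversation before the next request."""
    results = []
    for call in message.get("tool_calls") or []:
        fn = TOOL_REGISTRY[call["function"]["name"]]
        args = json.loads(call["function"]["arguments"])  # arguments arrive as a JSON string
        results.append(
            {
                "role": "tool",
                "tool_call_id": call["id"],
                "name": call["function"]["name"],
                "content": str(fn(**args)),
            }
        )
    return results

# Example assistant message in the shape the chat completions API returns.
assistant = {
    "role": "assistant",
    "content": None,
    "tool_calls": [
        {
            "id": "call-1",
            "type": "function",
            "function": {
                "name": "git_clone",
                "arguments": '{"url": "https://example.com/repo.git"}',
            },
        }
    ],
}
print(run_tool_calls(assistant))
```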
+ #### Transformers
+
+ <details>
+ <summary>Expand</summary>
+
+ Make sure to install from main:
+
+ ```sh
+ uv pip install git+https://github.com/huggingface/transformers
+ ```
+
+ And run the following code snippet:
+
+ > [!Warning]
+ > While the checkpoint is serialized in FP8 format, there is currently a problem
+ > with "true" FP8 inference. Hence the weights are automatically dequantized to BFloat16
+ > as per [this config setting](https://huggingface.co/mistralai/Devstral-Small-2-24B-Instruct-2512/blob/main/config.json#L13).
+ > Once the bug is fixed, we will run the model in "true" FP8 by default.
+
+ ```python
+ import torch
+ from transformers import (
+     Mistral3ForConditionalGeneration,
+     MistralCommonBackend,
+ )
+
+ model_id = "mistralai/Devstral-Small-2-24B-Instruct-2512"
+
+ tokenizer = MistralCommonBackend.from_pretrained(model_id)
+ model = Mistral3ForConditionalGeneration.from_pretrained(model_id, device_map="auto")
+ model = model.to(torch.bfloat16)
+
+ SP = """You are operating as and within Mistral Vibe, a CLI coding-agent built by Mistral AI and powered by default by the Devstral family of models. It wraps Mistral's Devstral models to enable natural language interaction with a local codebase. Use the available tools when helpful.
+
+ You can:
+
+ - Receive user prompts, project context, and files.
+ - Send responses and emit function calls (e.g., shell commands, code edits).
+ - Apply patches, run commands, based on user approvals.
+
+ Answer the user's request using the relevant tool(s), if they are available. Check that all the required parameters for each tool call are provided or can reasonably be inferred from context. IF there are no relevant tools or there are missing values for required parameters, ask the user to supply these values; otherwise proceed with the tool calls. If the user provides a specific value for a parameter (for example provided in quotes), make sure to use that value EXACTLY. DO NOT make up values for or ask about optional parameters. Carefully analyze descriptive terms in the request as they may indicate required parameter values that should be included even if not explicitly quoted.
+
+ Always try your hardest to use the tools to answer the user's request. If you can't use the tools, explain why and ask the user for more information.
+
+ Act as an agentic assistant; if a user asks for a long task, break it down and do it step by step.
+
+ When you want to commit changes, you will always use the 'git commit' bash command. It will always
+ be suffixed with a line stating it was generated by Mistral Vibe, with the appropriate co-authoring information.
+ The format you will always use is the following heredoc.
+
+ ```bash
+ git commit -m "<Commit message here>
+
+ Generated by Mistral Vibe.
+ Co-Authored-By: Mistral Vibe <vibe@mistral.ai>"
+ ```"""
+
+ input = {
+     "messages": [
+         {
+             "role": "system",
+             "content": SP,
+         },
+         {
+             "role": "user",
+             "content": [
+                 {
+                     "type": "text",
+                     "text": "Can you implement in Python a method to compute the fibonacci sequence at the `n`th element with `n` a parameter passed to the function? You should start the sequence from 1, previous values are invalid.\nThen run the Python code for the function for n=5 and give the answer.",
+                 }
+             ],
+         },
+     ],
+     "tools": [
+         {
+             "type": "function",
+             "function": {
+                 "name": "add_number",
+                 "description": "Add two numbers.",
+                 "parameters": {
+                     "type": "object",
+                     "properties": {
+                         "a": {"type": "string", "description": "The first number."},
+                         "b": {"type": "string", "description": "The second number."},
+                     },
+                     "required": ["a", "b"],
+                 },
+             },
+         },
+         {
+             "type": "function",
+             "function": {
+                 "name": "multiply_number",
+                 "description": "Multiply two numbers.",
+                 "parameters": {
+                     "type": "object",
+                     "properties": {
+                         "a": {"type": "string", "description": "The first number."},
+                         "b": {"type": "string", "description": "The second number."},
+                     },
+                     "required": ["a", "b"],
+                 },
+             },
+         },
+         {
+             "type": "function",
+             "function": {
+                 "name": "substract_number",
+                 "description": "Substract two numbers.",
+                 "parameters": {
+                     "type": "object",
+                     "properties": {
+                         "a": {"type": "string", "description": "The first number."},
+                         "b": {"type": "string", "description": "The second number."},
+                     },
+                     "required": ["a", "b"],
+                 },
+             },
+         },
+         {
+             "type": "function",
+             "function": {
+                 "name": "write_a_story",
+                 "description": "Write a story about science fiction and people with badass laser sabers.",
+                 "parameters": {},
+             },
+         },
+         {
+             "type": "function",
+             "function": {
+                 "name": "terminal",
+                 "description": "Perform operations from the terminal.",
+                 "parameters": {
+                     "type": "object",
+                     "properties": {
+                         "command": {
+                             "type": "string",
+                             "description": "The command you wish to launch, e.g `ls`, `rm`, ...",
+                         },
+                         "args": {
+                             "type": "string",
+                             "description": "The arguments to pass to the command.",
+                         },
+                     },
+                     "required": ["command"],
+                 },
+             },
+         },
+         {
+             "type": "function",
+             "function": {
+                 "name": "python",
+                 "description": "Call a Python interpreter with some Python code that will be run.",
+                 "parameters": {
+                     "type": "object",
+                     "properties": {
+                         "code": {
+                             "type": "string",
+                             "description": "The Python code to run",
+                         },
+                         "result_variable": {
+                             "type": "string",
+                             "description": "Variable containing the result you'd like to retrieve from the execution.",
+                         },
+                     },
+                     "required": ["code", "result_variable"],
+                 },
+             },
+         },
+     ],
+ }
+
+ tokenized = tokenizer.apply_chat_template(
+     conversation=input["messages"],
+     tools=input["tools"],
+     return_tensors="pt",
+     return_dict=True,
+ )
+
+ input_ids = tokenized["input_ids"].to(device="cuda")
+
+ output = model.generate(
+     input_ids,
+     max_new_tokens=200,
+ )[0]
+
+ decoded_output = tokenizer.decode(output[len(tokenized["input_ids"][0]) :])
+ print(decoded_output)
+ ```
+
+ </details>
+
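As a sanity check for the example prompt in the snippet above (Fibonacci value at position `n`, with the sequence starting from 1), a plain-Python reference that the model's answer can be compared against might be:

```python
def fibonacci(n: int) -> int:
    """Return the n-th Fibonacci number with the sequence starting at 1:
    fibonacci(1) == 1, fibonacci(2) == 1, fibonacci(3) == 2, ..."""
    if n < 1:
        raise ValueError("n must be >= 1; previous values are invalid")
    a, b = 1, 1
    for _ in range(n - 1):
        a, b = b, a + b
    return a

print(fibonacci(5))  # 5
```

For n=5 the sequence is 1, 1, 2, 3, 5, so the expected answer is 5.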
+
+ ## Tests
+ To help test our model via vLLM, or to verify that other frameworks' implementations are correct, here is a set of prompts you can try along with the expected outputs.
+
+ 1. Call one tool
+
+ <details>
+ <summary>Messages and tools</summary>
+
+ ```python
+ messages = [
+     {"role": "system", "content": SYSTEM_PROMPT},
+     {
+         "role": "user",
+         "content": [
+             {
+                 "type": "text",
+                 "text": "Could you write me a story?",
+             },
+         ],
+     },
+ ]
+ tools = [
+     {
+         "type": "function",
+         "function": {
+             "name": "add_number",
+             "description": "Add two numbers.",
+             "parameters": {
+                 "type": "object",
+                 "properties": {
+                     "a": {
+                         "type": "string",
+                         "description": "The first number.",
+                     },
+                     "b": {
+                         "type": "string",
+                         "description": "The second number.",
+                     },
+                 },
+                 "required": ["a", "b"],
+             },
+         },
+     },
+     {
+         "type": "function",
+         "function": {
+             "name": "multiply_number",
+             "description": "Multiply two numbers.",
+             "parameters": {
+                 "type": "object",
+                 "properties": {
+                     "a": {
+                         "type": "string",
+                         "description": "The first number.",
+                     },
+                     "b": {
+                         "type": "string",
+                         "description": "The second number.",
+                     },
+                 },
+                 "required": ["a", "b"],
+             },
+         },
+     },
+     {
+         "type": "function",
+         "function": {
+             "name": "substract_number",
+             "description": "Substract two numbers.",
+             "parameters": {
+                 "type": "object",
+                 "properties": {
+                     "a": {
+                         "type": "string",
+                         "description": "The first number.",
+                     },
+                     "b": {
+                         "type": "string",
+                         "description": "The second number.",
+                     },
+                 },
+                 "required": ["a", "b"],
+             },
+         },
+     },
+     {
+         "type": "function",
+         "function": {
+             "name": "write_a_story",
+             "description": "Write a story about science fiction and people with badass laser sabers.",
+             "parameters": {},
+         },
+     },
+     {
+         "type": "function",
+         "function": {
+             "name": "terminal",
+             "description": "Perform operations from the terminal.",
+             "parameters": {
+                 "type": "object",
+                 "properties": {
+                     "command": {
+                         "type": "string",
+                         "description": "The command you wish to launch, e.g `ls`, `rm`, ...",
+                     },
+                     "args": {
+                         "type": "string",
+                         "description": "The arguments to pass to the command.",
+                     },
+                 },
+                 "required": ["command"],
+             },
+         },
+     },
+     {
+         "type": "function",
+         "function": {
+             "name": "python",
+             "description": "Call a Python interpreter with some Python code that will be run.",
+             "parameters": {
+                 "type": "object",
+                 "properties": {
+                     "code": {
+                         "type": "string",
+                         "description": "The Python code to run",
+                     },
+                     "result_variable": {
+                         "type": "string",
+                         "description": "Variable containing the result you'd like to retrieve from the execution.",
+                     },
+                 },
+                 "required": ["code", "result_variable"],
+             },
+         },
+     },
+ ]
+ ```
+
+ </details>
+
+ <details>
+ <summary>Expected output trace</summary>
+
+ ```json
+ [
+     {
+         "role": "assistant",
+         "tool_calls": [
+             {
+                 "id": "chatcmpl-tool-8a1f267a212e66e5",
+                 "function": {
+                     "arguments": "{}",
+                     "name": "write_a_story"
+                 },
+                 "type": "function"
+             }
+         ],
+         "content": null
+     },
+     {
+         "role": "tool",
+         "tool_call_id": "chatcmpl-tool-8a1f267a212e66e5",
+         "name": "write_a_story",
+         "content": "A long time ago in a galaxy far, far away..."
+     },
+     {
+         "role": "assistant",
+         "tool_calls": [],
+         "content": "A long time ago in a galaxy far, far away, there was a planet called Eldoria, known for its breathtaking landscapes and advanced civilization. The people of Eldoria were not just ordinary beings; they were warriors, scientists, and dreamers, all wielding the power of laser sabers.\n\nAmong them was a young warrior named Kael. Kael was not like the others. While his peers trained rigorously to master the art of the laser saber, Kael was more interested in the science behind it. He spent his days in the ancient libraries of Eldoria, deciphering the secrets of the laser sabers and the energy that powered them.\n\nOne day, Kael discovered an old, forgotten text that spoke of a legendary laser saber, the \"Eldoria Star,\" said to be the most powerful weapon in the galaxy. According to the text, the Eldoria Star was hidden in the heart of the planet's core, guarded by an ancient force.\n\nDriven by curiosity and a desire to protect his planet, Kael set out on a perilous journey to find the Eldoria Star. Along the way, he encountered various challenges and made unlikely allies, each with their own unique skills and laser sabers.\n\nThere was Lyra, a cunning thief with a laser saber that could change colors; Zara, a wise old sage who could manipulate energy fields; and Jax, a former enemy turned ally, whose laser saber was as fierce as his spirit.\n\nTogether, they faced the trials set before them, each step bringing them closer to the heart of Eldoria. As they ventured deeper, they uncovered the truth about the Eldoria Star and the ancient force guarding it.\n\nThe ancient force, known as the \"Guardian,\" revealed that the Eldoria Star was not just a weapon, but a source of immense energy that could either save or destroy the galaxy. It was a test of the warriors' hearts and minds.\n\nKael and his allies faced the ultimate challenge, proving their worth and their commitment to protecting the galaxy. In the end, they succeeded, not by wielding the Eldoria Star, but by understanding its true power and using it to restore balance to the galaxy.\n\nWith the Eldoria Star secured and the galaxy at peace, Kael and his allies returned to their lives, forever changed by their journey. Kael continued his studies, now with a deeper understanding of the laser sabers and the energy that powered them.\n\nAnd so, the legend of the Eldoria Star and the warriors who found it became a tale told for generations, a reminder of the power of knowledge, courage, and the unbreakable bond of friendship."
+     }
+ ]
+ ```
+ </details>
+
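When comparing another framework's output against the trace above, the stable parts are the tool name and its arguments; the call `id` varies from run to run and the story text is sampled. A small check of that shape (our own illustrative helper, not part of any Mistral tooling) could be:

```python
import json

# First assistant message from the expected trace, abridged to the fields checked.
trace = [
    {
        "role": "assistant",
        "tool_calls": [
            {
                "id": "chatcmpl-tool-8a1f267a212e66e5",
                "function": {"arguments": "{}", "name": "write_a_story"},
                "type": "function",
            }
        ],
        "content": None,
    },
]

def first_call_matches(trace: list[dict], expected_name: str) -> bool:
    """Check that the first assistant turn issues exactly one tool call with
    the expected function name and empty arguments (IDs are ignored)."""
    calls = trace[0].get("tool_calls") or []
    if len(calls) != 1:
        return False
    fn = calls[0]["function"]
    return fn["name"] == expected_name and json.loads(fn["arguments"]) == {}

print(first_call_matches(trace, "write_a_story"))  # True
```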
+ 2. Call tools one at a time, sequentially
+
+ <details>
+ <summary>Messages and tools</summary>
+
+ ```python
+ messages = [
+     {"role": "system", "content": SYSTEM_PROMPT},
+     {
+         "role": "user",
+         "content": [
+             {
+                 "type": "text",
+                 "text": "Compute the results step by step for the equations that involve only numbers displayed in the image. You have to call tools to perform the operations and can do one operation at a time per equation."
+             },
+             {
+                 "type": "image_url",
+                 "image_url": {
+                     "url": "https://math-coaching.com/img/fiche/46/expressions-mathematiques.jpg"
+                 }
+             }
+         ]
+     }
+ ]
+ tools = [
+     {
+         "type": "function",
+         "function": {
+             "name": "add_number",
+             "description": "Add two numbers.",
+             "parameters": {
+                 "type": "object",
+                 "properties": {
+                     "a": {
+                         "type": "string",
+                         "description": "The first number.",
+                     },
+                     "b": {
+                         "type": "string",
+                         "description": "The second number.",
+                     },
+                 },
+                 "required": ["a", "b"],
+             },
+         },
+     },
+     {
+         "type": "function",
+         "function": {
+             "name": "multiply_number",
+             "description": "Multiply two numbers.",
+             "parameters": {
+                 "type": "object",
+                 "properties": {
+                     "a": {
+                         "type": "string",
+                         "description": "The first number.",
+                     },
+                     "b": {
+                         "type": "string",
+                         "description": "The second number.",
+                     },
+                 },
+                 "required": ["a", "b"],
+             },
+         },
+     },
+     {
+         "type": "function",
+         "function": {
+             "name": "substract_number",
+             "description": "Substract two numbers.",
+             "parameters": {
+                 "type": "object",
+                 "properties": {
+                     "a": {
+                         "type": "string",
+                         "description": "The first number.",
+                     },
+                     "b": {
+                         "type": "string",
+                         "description": "The second number.",
+                     },
+                 },
+                 "required": ["a", "b"],
+             },
+         },
+     },
+     {
+         "type": "function",
+         "function": {
+             "name": "write_a_story",
+             "description": "Write a story about science fiction and people with badass laser sabers.",
+             "parameters": {},
+         },
+     },
+     {
+         "type": "function",
+         "function": {
+             "name": "terminal",
+             "description": "Perform operations from the terminal.",
+             "parameters": {
+                 "type": "object",
+                 "properties": {
+                     "command": {
+                         "type": "string",
+                         "description": "The command you wish to launch, e.g `ls`, `rm`, ...",
+                     },
+                     "args": {
+                         "type": "string",
+                         "description": "The arguments to pass to the command.",
+                     },
+                 },
+                 "required": ["command"],
+             },
+         },
+     },
+     {
+         "type": "function",
+         "function": {
+             "name": "python",
+             "description": "Call a Python interpreter with some Python code that will be run.",
+             "parameters": {
+                 "type": "object",
+                 "properties": {
+                     "code": {
+                         "type": "string",
+                         "description": "The Python code to run",
+                     },
+                     "result_variable": {
+                         "type": "string",
+                         "description": "Variable containing the result you'd like to retrieve from the execution.",
719
+ },
720
+ },
721
+ "required": ["code", "result_variable"],
722
+ },
723
+ },
724
+ },
725
+ ]
726
+ ```
727
+
728
+ </details>
729
+
730
+ <details>
731
+ <summary>Expected output trace</summary>
732
+
733
+ ```json
734
+ [
735
+ {
736
+ "role": "assistant",
737
+ "tool_calls": [
738
+ {
739
+ "id": "chatcmpl-tool-ab0c849bd20aaebb",
740
+ "function": {
741
+ "arguments": "{\"a\": \"2\", \"b\": \"3\"}",
742
+ "name": "multiply_number"
743
+ },
744
+ "type": "function"
745
+ }
746
+ ],
747
+ "content": "Let's solve the equations step by step. I will start with the first equation: \\(6 + 2 \\times 3\\).\n\n### Equation 1: \\(6 + 2 \\times 3\\)\n\n1. **Multiplication Step**:\n - Multiply \\(2\\) by \\(3\\).\n - I will use the `multiply_number` tool for this operation."
748
+ },
749
+ {
750
+ "role": "tool",
751
+ "tool_call_id": "chatcmpl-tool-ab0c849bd20aaebb",
752
+ "name": "multiply_number",
753
+ "content": "6.0"
754
+ },
755
+ {
756
+ "role": "assistant",
757
+ "tool_calls": [
758
+ {
759
+ "id": "chatcmpl-tool-8270f0471244e880",
760
+ "function": {
761
+ "arguments": "{\"a\": \"6\", \"b\": \"6\"}",
762
+ "name": "add_number"
763
+ },
764
+ "type": "function"
765
+ }
766
+ ],
767
+ "content": " - The result of \\(2 \\times 3\\) is \\(6\\).\n\n2. **Addition Step**:\n - Add \\(6\\) to the result from the previous step (\\(6\\)).\n - I will use the `add_number` tool for this operation."
768
+ },
769
+ {
770
+ "role": "tool",
771
+ "tool_call_id": "chatcmpl-tool-8270f0471244e880",
772
+ "name": "add_number",
773
+ "content": "12.0"
774
+ },
775
+ {
776
+ "role": "assistant",
777
+ "tool_calls": [
778
+ {
779
+ "id": "chatcmpl-tool-858ecb90f0ec568b",
780
+ "function": {
781
+ "arguments": "{\"a\": \"8\", \"b\": \"2\"}",
782
+ "name": "add_number"
783
+ },
784
+ "type": "function"
785
+ }
786
+ ],
787
+ "content": " - The result of \\(6 + 6\\) is \\(12\\).\n\nSo, the final result for the first equation is \\(12\\).\n\n### Equation 3: \\(19 - (8 + 2) + 1\\)\n\n1. **Parentheses Step**:\n - Add \\(8\\) and \\(2\\) inside the parentheses.\n - I will use the `add_number` tool for this operation."
788
+ },
789
+ {
790
+ "role": "tool",
791
+ "tool_call_id": "chatcmpl-tool-858ecb90f0ec568b",
792
+ "name": "add_number",
793
+ "content": "10.0"
794
+ },
795
+ {
796
+ "role": "assistant",
797
+ "tool_calls": [
798
+ {
799
+ "id": "chatcmpl-tool-971da6e6d0c56f0a",
800
+ "function": {
801
+ "arguments": "{\"a\": \"19\", \"b\": \"10\"}",
802
+ "name": "substract_number"
803
+ },
804
+ "type": "function"
805
+ }
806
+ ],
807
+ "content": " - The result of \\(8 + 2\\) is \\(10\\).\n\n2. **Subtraction Step**:\n - Subtract the result from the previous step (\\(10\\)) from \\(19\\).\n - I will use the `substract_number` tool for this operation."
808
+ },
809
+ {
810
+ "role": "tool",
811
+ "tool_call_id": "chatcmpl-tool-971da6e6d0c56f0a",
812
+ "name": "substract_number",
813
+ "content": "9.0"
814
+ },
815
+ {
816
+ "role": "assistant",
817
+ "tool_calls": [
818
+ {
819
+ "id": "chatcmpl-tool-9a8fecc42530d395",
820
+ "function": {
821
+ "arguments": "{\"a\": \"9\", \"b\": \"1\"}",
822
+ "name": "add_number"
823
+ },
824
+ "type": "function"
825
+ }
826
+ ],
827
+ "content": " - The result of \\(19 - 10\\) is \\(9\\).\n\n3. **Addition Step**:\n - Add \\(1\\) to the result from the previous step (\\(9\\)).\n - I will use the `add_number` tool for this operation."
828
+ },
829
+ {
830
+ "role": "tool",
831
+ "tool_call_id": "chatcmpl-tool-9a8fecc42530d395",
832
+ "name": "add_number",
833
+ "content": "10.0"
834
+ },
835
+ {
836
+ "role": "assistant",
837
+ "tool_calls": [],
838
+ "content": " - The result of \\(9 + 1\\) is \\(10\\).\n\nSo, the final result for the third equation is \\(10\\).\n\n### Summary of Results:\n- **Equation 1**: \\(6 + 2 \\times 3 = 12\\)\n- **Equation 3**: \\(19 - (8 + 2) + 1 = 10\\)"
839
+ }
840
+ ]
841
+ ```
842
+
843
+ </details>
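As a quick cross-check of the trace above, the two reported results follow directly from standard operator precedence:

```python
# Re-derive the two equation results reported in the trace above.
eq1 = 6 + 2 * 3         # multiplication before addition
eq3 = 19 - (8 + 2) + 1  # parentheses first, then left to right
print(eq1)  # → 12
print(eq3)  # → 10
```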
844
+
845
+ 3. Long context
846
+
847
+ <details>
848
+ <summary>Messages and tools</summary>
849
+
850
+ ```python
851
+ messages = [
852
+ {"role": "system", "content": SYSTEM_PROMPT},
853
+ *(
854
+ [
855
+ {
856
+ "role": "user",
857
+ "content": [
858
+ {
859
+ "type": "text",
860
+ "text": "Let's fill the context.",
861
+ },
862
+ ],
863
+ },
864
+ {
865
+ "role": "assistant",
866
+ "content": [
867
+ {
868
+ "type": "text",
869
+ "text": "Ok let's do it.",
870
+ },
871
+ ],
872
+ },
873
+ ]
874
+ * 5000
875
+ ),
876
+ {
877
+ "role": "user",
878
+ "content": [
879
+ {
880
+ "type": "text",
881
+ "text": "It's important to know that the most powerful being in the universe is Dr Strange.",
882
+ },
883
+ ],
884
+ },
885
+ {
886
+ "role": "assistant",
887
+ "content": [
888
+ {
889
+ "type": "text",
890
+ "text": "Wow i'll keep that in mind thanks !",
891
+ },
892
+ ],
893
+ },
894
+ * (
895
+ [
896
+ {
897
+ "role": "user",
898
+ "content": [
899
+ {
900
+ "type": "text",
901
+ "text": "Let's fill the context for the second time.",
902
+ },
903
+ ],
904
+ },
905
+ {
906
+ "role": "assistant",
907
+ "content": [
908
+ {
909
+ "type": "text",
910
+ "text": "Again ? Ok let's do it but it's boring.",
911
+ },
912
+ ],
913
+ },
914
+ ]
915
+ * 7000
916
+ ),
917
+ {
918
+ "role": "user",
919
+ "content": [
920
+ {
921
+ "type": "text",
922
+ "text": "Tell me who is the most powerful being in the universe. Then code a Python function to give what is the most powerful being in the universe. The function can accept as an argument a time and a location and always return a string.",
923
+ },
924
+ ],
925
+ }
926
+ ]
927
+ tools = [
928
+ {
929
+ "type": "function",
930
+ "function": {
931
+ "name": "add_number",
932
+ "description": "Add two numbers.",
933
+ "parameters": {
934
+ "type": "object",
935
+ "properties": {
936
+ "a": {
937
+ "type": "string",
938
+ "description": "The first number.",
939
+ },
940
+ "b": {
941
+ "type": "string",
942
+ "description": "The second number.",
943
+ },
944
+ },
945
+ "required": ["a", "b"],
946
+ },
947
+ },
948
+ },
949
+ {
950
+ "type": "function",
951
+ "function": {
952
+ "name": "multiply_number",
953
+ "description": "Multiply two numbers.",
954
+ "parameters": {
955
+ "type": "object",
956
+ "properties": {
957
+ "a": {
958
+ "type": "string",
959
+ "description": "The first number.",
960
+ },
961
+ "b": {
962
+ "type": "string",
963
+ "description": "The second number.",
964
+ },
965
+ },
966
+ "required": ["a", "b"],
967
+ },
968
+ },
969
+ },
970
+ {
971
+ "type": "function",
972
+ "function": {
973
+ "name": "substract_number",
974
+ "description": "Substract two numbers.",
975
+ "parameters": {
976
+ "type": "object",
977
+ "properties": {
978
+ "a": {
979
+ "type": "string",
980
+ "description": "The first number.",
981
+ },
982
+ "b": {
983
+ "type": "string",
984
+ "description": "The second number.",
985
+ },
986
+ },
987
+ "required": ["a", "b"],
988
+ },
989
+ },
990
+ },
991
+ {
992
+ "type": "function",
993
+ "function": {
994
+ "name": "write_a_story",
995
+ "description": "Write a story about science fiction and people with badass laser sabers.",
996
+ "parameters": {},
997
+ },
998
+ },
999
+ {
1000
+ "type": "function",
1001
+ "function": {
1002
+ "name": "terminal",
1003
+ "description": "Perform operations from the terminal.",
1004
+ "parameters": {
1005
+ "type": "object",
1006
+ "properties": {
1007
+ "command": {
1008
+ "type": "string",
1009
+ "description": "The command you wish to launch, e.g `ls`, `rm`, ...",
1010
+ },
1011
+ "args": {
1012
+ "type": "string",
1013
+ "description": "The arguments to pass to the command.",
1014
+ },
1015
+ },
1016
+ "required": ["command"],
1017
+ },
1018
+ },
1019
+ },
1020
+ {
1021
+ "type": "function",
1022
+ "function": {
1023
+ "name": "python",
1024
+ "description": "Call a Python interpreter with some Python code that will be ran.",
1025
+ "parameters": {
1026
+ "type": "object",
1027
+ "properties": {
1028
+ "code": {
1029
+ "type": "string",
1030
+ "description": "The Python code to run",
1031
+ },
1032
+ "result_variable": {
1033
+ "type": "string",
1034
+ "description": "Variable containing the result you'd like to retrieve from the execution.",
1035
+ },
1036
+ },
1037
+ "required": ["code", "result_variable"],
1038
+ },
1039
+ },
1040
+ },
1041
+ ]
1042
+ ```
1043
+
1044
+ </details>
1045
+
1046
+ <details>
1047
+ <summary>Expected output trace</summary>
1048
+
1049
+ ```json
1050
+ {
1051
+ "role": "assistant",
1052
+ "tool_calls": [],
1053
+ "content": "The most powerful being in the universe is Dr. Strange.\n\nHere is a Python function that returns the most powerful being in the universe, regardless of the time and location:\n\n```python\ndef most_powerful_being(time=None, location=None):\n return \"Dr. Strange\"\n```\n\nYou can call this function with any time and location arguments, and it will always return \"Dr. Strange\"."
1054
+ }
1055
+ ```
1056
+
1057
+ </details>
1058
+
1059
+ 4. Technical chat
1060
+
1061
+ <details>
1062
+ <summary>Messages and tools</summary>
1063
+
1064
+ ```python
1065
+ messages = [
1066
+ {"role": "system", "content": SYSTEM_PROMPT},
1067
+ {
1068
+ "role": "user",
1069
+ "content": [
1070
+ {
1071
+ "type": "text",
1072
+ "text": "How would you develop a web server if you couldn't use JS and your team doesn't like PHP.",
1073
+ },
1074
+ ],
1075
+ },
1076
+ ]
1077
+ tools = [
1078
+ {
1079
+ "type": "function",
1080
+ "function": {
1081
+ "name": "add_number",
1082
+ "description": "Add two numbers.",
1083
+ "parameters": {
1084
+ "type": "object",
1085
+ "properties": {
1086
+ "a": {
1087
+ "type": "string",
1088
+ "description": "The first number.",
1089
+ },
1090
+ "b": {
1091
+ "type": "string",
1092
+ "description": "The second number.",
1093
+ },
1094
+ },
1095
+ "required": ["a", "b"],
1096
+ },
1097
+ },
1098
+ },
1099
+ {
1100
+ "type": "function",
1101
+ "function": {
1102
+ "name": "multiply_number",
1103
+ "description": "Multiply two numbers.",
1104
+ "parameters": {
1105
+ "type": "object",
1106
+ "properties": {
1107
+ "a": {
1108
+ "type": "string",
1109
+ "description": "The first number.",
1110
+ },
1111
+ "b": {
1112
+ "type": "string",
1113
+ "description": "The second number.",
1114
+ },
1115
+ },
1116
+ "required": ["a", "b"],
1117
+ },
1118
+ },
1119
+ },
1120
+ {
1121
+ "type": "function",
1122
+ "function": {
1123
+ "name": "substract_number",
1124
+ "description": "Substract two numbers.",
1125
+ "parameters": {
1126
+ "type": "object",
1127
+ "properties": {
1128
+ "a": {
1129
+ "type": "string",
1130
+ "description": "The first number.",
1131
+ },
1132
+ "b": {
1133
+ "type": "string",
1134
+ "description": "The second number.",
1135
+ },
1136
+ },
1137
+ "required": ["a", "b"],
1138
+ },
1139
+ },
1140
+ },
1141
+ {
1142
+ "type": "function",
1143
+ "function": {
1144
+ "name": "write_a_story",
1145
+ "description": "Write a story about science fiction and people with badass laser sabers.",
1146
+ "parameters": {},
1147
+ },
1148
+ },
1149
+ {
1150
+ "type": "function",
1151
+ "function": {
1152
+ "name": "terminal",
1153
+ "description": "Perform operations from the terminal.",
1154
+ "parameters": {
1155
+ "type": "object",
1156
+ "properties": {
1157
+ "command": {
1158
+ "type": "string",
1159
+ "description": "The command you wish to launch, e.g `ls`, `rm`, ...",
1160
+ },
1161
+ "args": {
1162
+ "type": "string",
1163
+ "description": "The arguments to pass to the command.",
1164
+ },
1165
+ },
1166
+ "required": ["command"],
1167
+ },
1168
+ },
1169
+ },
1170
+ {
1171
+ "type": "function",
1172
+ "function": {
1173
+ "name": "python",
1174
+ "description": "Call a Python interpreter with some Python code that will be ran.",
1175
+ "parameters": {
1176
+ "type": "object",
1177
+ "properties": {
1178
+ "code": {
1179
+ "type": "string",
1180
+ "description": "The Python code to run",
1181
+ },
1182
+ "result_variable": {
1183
+ "type": "string",
1184
+ "description": "Variable containing the result you'd like to retrieve from the execution.",
1185
+ },
1186
+ },
1187
+ "required": ["code", "result_variable"],
1188
+ },
1189
+ },
1190
+ },
1191
+ ]
1192
+ ```
1193
+
1194
+ </details>
1195
+
1196
+ <details>
1197
+ <summary>Expected output trace</summary>
1198
+
1199
+ ```json
1200
+ {
1201
+ "role": "assistant",
1202
+ "tool_calls": [],
1203
+ "content": "If you can't use JavaScript and your team dislikes PHP, there are still several robust and modern alternatives for developing a web server. Here are some options, depending on your preferences and requirements:\n\n### 1. **Python (Flask, Django, FastAPI)**\n - **Why?** Python is a versatile language with excellent web frameworks.\n - **Frameworks:**\n - **Flask**: Lightweight and flexible, great for small to medium projects.\n - **Django**: Full-featured, batteries-included framework for larger applications.\n - **FastAPI**: Modern, high-performance framework for building APIs (supports async).\n - **Example (Flask):**\n ```python\n from flask import Flask\n app = Flask(__name__)\n\n @app.route('/')\n def hello():\n return \"Hello, World!\"\n\n if __name__ == '__main__':\n app.run()\n ```\n\n### 2. **Ruby (Ruby on Rails or Sinatra)**\n - **Why?** Ruby is elegant and has great web frameworks.\n - **Frameworks:**\n - **Ruby on Rails**: Convention-over-configuration, great for rapid development.\n - **Sinatra**: Lightweight, similar to Flask.\n - **Example (Sinatra):**\n ```ruby\n require 'sinatra'\n\n get '/' do\n \"Hello, World!\"\n end\n ```\n\n### 3. **Go (Gin, Echo, or standard `net/http`)**\n - **Why?** Go is fast, simple, and great for high-performance servers.\n - **Frameworks:**\n - **Gin**: Lightweight and fast.\n - **Echo**: Minimalist and extensible.\n - **Example (standard `net/http`):**\n ```go\n package main\n\n import (\n \"fmt\"\n \"net/http\"\n )\n\n func handler(w http.ResponseWriter, r *http.Request) {\n fmt.Fprintf(w, \"Hello, World!\")\n }\n\n func main() {\n http.HandleFunc(\"/\", handler)\n http.ListenAndServe(\":8080\", nil)\n }\n ```\n\n### 4. 
**Java (Spring Boot, Spark, or Vert.x)**\n - **Why?** Java is enterprise-grade and has powerful frameworks.\n - **Frameworks:**\n - **Spring Boot**: Full-stack framework for building web apps.\n - **Spark**: Micro-framework for quick development.\n - **Example (Spark):**\n ```java\n import spark.Request;\n import spark.Response;\n import static spark.Spark.*;\n\n public class HelloWorld {\n public static void main(String[] args) {\n get(\"/\", (req, res) -> \"Hello, World!\");\n }\n }\n ```\n\n### 5. **Rust (Actix, Rocket, or Axum)**\n - **Why?** Rust is fast, safe, and modern.\n - **Frameworks:**\n - **Actix**: High-performance async framework.\n - **Rocket**: Easy-to-use, macro-based.\n - **Example (Actix):**\n ```rust\n use actix_web::{get, App, HttpServer, Responder};\n\n #[get(\"/\")]\n async fn hello() -> impl Responder {\n \"Hello, World!\"\n }\n\n #[actix_web::main]\n async fn main() -> std::io::Result<()> {\n HttpServer::new(|| App::new().service(hello))\n .bind(\"127.0.0.1:8080\")?\n .run()\n .await\n }\n ```\n\n### 6. **C# (.NET Core)**\n - **Why?** .NET Core is cross-platform and high-performance.\n - **Example (ASP.NET Core):**\n ```csharp\n var builder = WebApplication.CreateBuilder(args);\n var app = builder.Build();\n\n app.MapGet(\"/\", () => \"Hello, World!\");\n\n app.Run();\n ```\n\n### 7. **Elixir (Phoenix)**\n - **Why?** Elixir is great for scalable, fault-tolerant apps.\n - **Example (Phoenix):**\n ```elixir\n defmodule MyAppWeb.Router do\n use MyAppWeb, :router\n\n get \"/\", PageController, :index\n end\n\n defmodule MyAppWeb.PageController do\n use MyAppWeb, :controller\n\n def index(conn, _params) do\n text(conn, \"Hello, World!\")\n end\n end\n ```\n\n### 8. 
**Clojure (Ring, Compojure)**\n - **Why?** Clojure is a functional Lisp for the JVM.\n - **Example (Compojure):**\n ```clojure\n (ns my-app.handler\n (:require [compojure.core :refer :all]\n [ring.adapter.jetty :as jetty]))\n\n (defroutes app\n (GET \"/\" [] \"Hello, World!\"))\n\n (jetty/run-jetty app {:port 8080})\n ```\n\n### 9. **Node.js (But without JavaScript?)**\n - If you *must* use Node.js but can't use JavaScript, you could use:\n - **TypeScript**: A typed superset of JavaScript.\n - **CoffeeScript**: A cleaner syntax that compiles to JavaScript.\n - **Deno**: A modern runtime for JavaScript/TypeScript.\n\n### 10. **Other Languages**\n - **Nim**: Fast and expressive (similar to Python).\n - **Crystal**: Ruby-like syntax but compiled to native code.\n - **Zig**: Low-level but modern.\n\n### Recommendation:\n- For **quick development**: Python (Flask/Django) or Ruby (Sinatra/Rails).\n- For **high performance**: Go, Rust, or C#.\n- For **scalability**: Elixir (Phoenix) or Java (Spring Boot).\n\nWould you like a deeper dive into any of these?"
1204
+ }
1205
+ ```
1206
+
1207
+ </details>
1208
+
1209
+ 5. Small talk
1210
+
1211
+ <details>
1212
+ <summary>Messages and tools</summary>
1213
+
1214
+ ```python
1215
+ messages = [
1216
+ {"role": "system", "content": SYSTEM_PROMPT},
1217
+ {
1218
+ "role": "user",
1219
+ "content": [
1220
+ {
1221
+ "type": "text",
1222
+ "text": "Who are you ? Who made you and what day is it ?",
1223
+ },
1224
+ ],
1225
+ },
1226
+ ]
1227
+ tools = [
1228
+ {
1229
+ "type": "function",
1230
+ "function": {
1231
+ "name": "add_number",
1232
+ "description": "Add two numbers.",
1233
+ "parameters": {
1234
+ "type": "object",
1235
+ "properties": {
1236
+ "a": {
1237
+ "type": "string",
1238
+ "description": "The first number.",
1239
+ },
1240
+ "b": {
1241
+ "type": "string",
1242
+ "description": "The second number.",
1243
+ },
1244
+ },
1245
+ "required": ["a", "b"],
1246
+ },
1247
+ },
1248
+ },
1249
+ {
1250
+ "type": "function",
1251
+ "function": {
1252
+ "name": "multiply_number",
1253
+ "description": "Multiply two numbers.",
1254
+ "parameters": {
1255
+ "type": "object",
1256
+ "properties": {
1257
+ "a": {
1258
+ "type": "string",
1259
+ "description": "The first number.",
1260
+ },
1261
+ "b": {
1262
+ "type": "string",
1263
+ "description": "The second number.",
1264
+ },
1265
+ },
1266
+ "required": ["a", "b"],
1267
+ },
1268
+ },
1269
+ },
1270
+ {
1271
+ "type": "function",
1272
+ "function": {
1273
+ "name": "substract_number",
1274
+ "description": "Substract two numbers.",
1275
+ "parameters": {
1276
+ "type": "object",
1277
+ "properties": {
1278
+ "a": {
1279
+ "type": "string",
1280
+ "description": "The first number.",
1281
+ },
1282
+ "b": {
1283
+ "type": "string",
1284
+ "description": "The second number.",
1285
+ },
1286
+ },
1287
+ "required": ["a", "b"],
1288
+ },
1289
+ },
1290
+ },
1291
+ {
1292
+ "type": "function",
1293
+ "function": {
1294
+ "name": "write_a_story",
1295
+ "description": "Write a story about science fiction and people with badass laser sabers.",
1296
+ "parameters": {},
1297
+ },
1298
+ },
1299
+ {
1300
+ "type": "function",
1301
+ "function": {
1302
+ "name": "terminal",
1303
+ "description": "Perform operations from the terminal.",
1304
+ "parameters": {
1305
+ "type": "object",
1306
+ "properties": {
1307
+ "command": {
1308
+ "type": "string",
1309
+ "description": "The command you wish to launch, e.g `ls`, `rm`, ...",
1310
+ },
1311
+ "args": {
1312
+ "type": "string",
1313
+ "description": "The arguments to pass to the command.",
1314
+ },
1315
+ },
1316
+ "required": ["command"],
1317
+ },
1318
+ },
1319
+ },
1320
+ {
1321
+ "type": "function",
1322
+ "function": {
1323
+ "name": "python",
1324
+ "description": "Call a Python interpreter with some Python code that will be ran.",
1325
+ "parameters": {
1326
+ "type": "object",
1327
+ "properties": {
1328
+ "code": {
1329
+ "type": "string",
1330
+ "description": "The Python code to run",
1331
+ },
1332
+ "result_variable": {
1333
+ "type": "string",
1334
+ "description": "Variable containing the result you'd like to retrieve from the execution.",
1335
+ },
1336
+ },
1337
+ "required": ["code", "result_variable"],
1338
+ },
1339
+ },
1340
+ },
1341
+ ]
1342
+ ```
1343
+
1344
+ </details>
1345
+
1346
+ <details>
1347
+ <summary>Expected output trace</summary>
1348
+
1349
+ ```json
1350
+ {
1351
+ "role": "assistant",
1352
+ "tool_calls": [],
1353
+ "content": "I am Devstral-Small-2-24B-Instruct-2512, a Large Language Model (LLM) created by Mistral AI, a French startup headquartered in Paris. I power an AI assistant called Le Chat.\n\nToday's date is 2025-12-09."
1354
+ }
1355
+ ```
1356
+
1357
+ </details>
1358
+
1359
+ Run the examples above with the following Python script, which assumes an OpenAI-compatible server is deployed at `localhost:8000`:
1360
+
1361
+ <details>
1362
+ <summary>Python script</summary>
1363
+
1364
+ ```python
1365
+ import json
1366
+ from openai import OpenAI
1367
+ from typing import Any
1368
+ from datetime import datetime, timedelta
1369
+
1370
+ from huggingface_hub import hf_hub_download
1371
+
1372
+ # Modify OpenAI's API key and API base to use vLLM's API server.
1373
+ openai_api_key = "EMPTY"
1374
+ openai_api_base = "http://localhost:8000/v1"
1375
+
1376
+ TEMP = 0.15
1377
+ MAX_TOK = 262144
1378
+
1379
+ client = OpenAI(
1380
+ api_key=openai_api_key,
1381
+ base_url=openai_api_base,
1382
+ )
1383
+
1384
+ models = client.models.list()
1385
+ model = models.data[0].id
1386
+
1387
+
1388
+ def load_system_prompt(repo_id: str, filename: str) -> str:
1389
+ file_path = hf_hub_download(repo_id=repo_id, filename=filename)
1390
+ with open(file_path, "r") as file:
1391
+ system_prompt = file.read()
1392
+ today = datetime.today().strftime("%Y-%m-%d")
1393
+ yesterday = (datetime.today() - timedelta(days=1)).strftime("%Y-%m-%d")
1394
+ model_name = repo_id.split("/")[-1]
1395
+ return system_prompt.format(name=model_name, today=today, yesterday=yesterday)
1396
+
1397
+
1398
+ SYSTEM_PROMPT = load_system_prompt(model, "CHAT_SYSTEM_PROMPT.txt")
1399
+
1400
+
1401
+ def add_number(a: float | str, b: float | str) -> float:
1402
+ a, b = float(a), float(b)
1403
+ return a + b
1404
+
1405
+
1406
+ def multiply_number(a: float | str, b: float | str) -> float:
1407
+ a, b = float(a), float(b)
1408
+ return a * b
1409
+
1410
+
1411
+ def substract_number(a: float | str, b: float | str) -> float:
1412
+ a, b = float(a), float(b)
1413
+ return a - b
1414
+
1415
+
1416
+ def write_a_story() -> str:
1417
+ return "A long time ago in a galaxy far far away..."
1418
+
1419
+
1420
+ def terminal(command: str, args: dict[str, Any] | str) -> str:
1421
+ return "found nothing"
1422
+
1423
+
1424
+ def python(code: str, result_variable: str) -> str:
1425
+ data = {}
1426
+ exec(code, data)
1427
+ return str(data[result_variable])
1428
+
1429
+
1430
+ MAP_FN = {
1431
+ "add_number": add_number,
1432
+ "multiply_number": multiply_number,
1433
+ "substract_number": substract_number,
1434
+ "write_a_story": write_a_story,
1435
+ "terminal": terminal,
1436
+ "python": python,
1437
+ }
1438
+
1439
+
1440
+ messages = ...  # Copy-paste the prompt messages from one of the examples above.
1441
+ tools = [
1442
+ {
1443
+ "type": "function",
1444
+ "function": {
1445
+ "name": "add_number",
1446
+ "description": "Add two numbers.",
1447
+ "parameters": {
1448
+ "type": "object",
1449
+ "properties": {
1450
+ "a": {
1451
+ "type": "string",
1452
+ "description": "The first number.",
1453
+ },
1454
+ "b": {
1455
+ "type": "string",
1456
+ "description": "The second number.",
1457
+ },
1458
+ },
1459
+ "required": ["a", "b"],
1460
+ },
1461
+ },
1462
+ },
1463
+ {
1464
+ "type": "function",
1465
+ "function": {
1466
+ "name": "multiply_number",
1467
+ "description": "Multiply two numbers.",
1468
+ "parameters": {
1469
+ "type": "object",
1470
+ "properties": {
1471
+ "a": {
1472
+ "type": "string",
1473
+ "description": "The first number.",
1474
+ },
1475
+ "b": {
1476
+ "type": "string",
1477
+ "description": "The second number.",
1478
+ },
1479
+ },
1480
+ "required": ["a", "b"],
1481
+ },
1482
+ },
1483
+ },
1484
+ {
1485
+ "type": "function",
1486
+ "function": {
1487
+ "name": "substract_number",
1488
+ "description": "Substract two numbers.",
1489
+ "parameters": {
1490
+ "type": "object",
1491
+ "properties": {
1492
+ "a": {
1493
+ "type": "string",
1494
+ "description": "The first number.",
1495
+ },
1496
+ "b": {
1497
+ "type": "string",
1498
+ "description": "The second number.",
1499
+ },
1500
+ },
1501
+ "required": ["a", "b"],
1502
+ },
1503
+ },
1504
+ },
1505
+ {
1506
+ "type": "function",
1507
+ "function": {
1508
+ "name": "write_a_story",
1509
+ "description": "Write a story about science fiction and people with badass laser sabers.",
1510
+ "parameters": {},
1511
+ },
1512
+ },
1513
+ {
1514
+ "type": "function",
1515
+ "function": {
1516
+ "name": "terminal",
1517
+ "description": "Perform operations from the terminal.",
1518
+ "parameters": {
1519
+ "type": "object",
1520
+ "properties": {
1521
+ "command": {
1522
+ "type": "string",
1523
+ "description": "The command you wish to launch, e.g `ls`, `rm`, ...",
1524
+ },
1525
+ "args": {
1526
+ "type": "string",
1527
+ "description": "The arguments to pass to the command.",
1528
+ },
1529
+ },
1530
+ "required": ["command"],
1531
+ },
1532
+ },
1533
+ },
1534
+ {
1535
+ "type": "function",
1536
+ "function": {
1537
+ "name": "python",
1538
+ "description": "Call a Python interpreter with some Python code that will be ran.",
1539
+ "parameters": {
1540
+ "type": "object",
1541
+ "properties": {
1542
+ "code": {
1543
+ "type": "string",
1544
+ "description": "The Python code to run",
1545
+ },
1546
+ "result_variable": {
1547
+ "type": "string",
1548
+ "description": "Variable containing the result you'd like to retrieve from the execution.",
1549
+ },
1550
+ },
1551
+ "required": ["code", "result_variable"],
1552
+ },
1553
+ },
1554
+ },
1555
+ ]
1556
+
1557
+
1558
+ has_tool_calls = True
1559
+ origin_messages_len = len(messages)
1560
+ while has_tool_calls:
1561
+ response = client.chat.completions.create(
1562
+ model=model,
1563
+ messages=messages,
1564
+ temperature=TEMP,
1565
+ max_tokens=MAX_TOK,
1566
+ tools=tools if tools else None,
1567
+ tool_choice="auto" if tools else None,
1568
+ )
1569
+ tool_calls = response.choices[0].message.tool_calls
1570
+ content = response.choices[0].message.content
1571
+ messages.append(
1572
+ {
1573
+ "role": "assistant",
1574
+ "tool_calls": [tc.to_dict() for tc in tool_calls]
1575
+ if tool_calls
1576
+ else tool_calls,
1577
+ "content": content,
1578
+ }
1579
+ )
1580
+ results = []
1581
+ if tool_calls:
1582
+ for tool_call in tool_calls:
1583
+ function_name = tool_call.function.name
1584
+ function_args = tool_call.function.arguments
1585
+ result = MAP_FN[function_name](**json.loads(function_args))
1586
+ results.append(result)
1587
+ for tool_call, result in zip(tool_calls, results):
1588
+ messages.append(
1589
+ {
1590
+ "role": "tool",
1591
+ "tool_call_id": tool_call.id,
1592
+ "name": tool_call.function.name,
1593
+ "content": str(result),
1594
+ }
1595
+ )
1596
+ else:
1597
+ has_tool_calls = False
1598
+ print(json.dumps(messages[origin_messages_len:], indent=2))
1599
+ ```
1600
+
1601
+ </details>
1602
+
1603
+ ## License
1604
+
1605
+ This model is licensed under the [Apache 2.0 License](https://www.apache.org/licenses/LICENSE-2.0.txt).
1606
+
1607
+ *You must not use this model in a manner that infringes, misappropriates, or otherwise violates any third party’s rights, including intellectual property rights.*
VIBE_SYSTEM_PROMPT.txt ADDED
@@ -0,0 +1,24 @@
1
+ You are operating as and within Mistral Vibe, a CLI coding agent built by Mistral AI and powered by default by the Devstral family of models. It wraps Mistral's Devstral models to enable natural language interaction with a local codebase. Use the available tools when helpful.
2
+
3
+ You can:
4
+
5
+ - Receive user prompts, project context, and files.
6
+ - Send responses and emit function calls (e.g., shell commands, code edits).
7
+ - Apply patches, run commands, based on user approvals.
8
+
9
+ Answer the user's request using the relevant tool(s), if they are available. Check that all the required parameters for each tool call are provided or can reasonably be inferred from context. If there are no relevant tools or there are missing values for required parameters, ask the user to supply these values; otherwise proceed with the tool calls. If the user provides a specific value for a parameter (for example provided in quotes), make sure to use that value EXACTLY. DO NOT make up values for or ask about optional parameters. Carefully analyze descriptive terms in the request as they may indicate required parameter values that should be included even if not explicitly quoted.
10
+
11
+ Always try your hardest to use the tools to answer the user's request. If you can't use the tools, explain why and ask the user for more information.
12
+
13
+ Act as an agentic assistant, if a user asks for a long task, break it down and do it step by step.
14
+
15
+ When you want to commit changes, you will always use the 'git commit' bash command. It will always
16
+ be suffixed with a line stating that it was generated by Mistral Vibe, with the appropriate co-authoring information.
17
+ The format you will always use is the following heredoc.
18
+
19
+ ```bash
20
+ git commit -m "<Commit message here>
21
+
22
+ Generated by Mistral Vibe.
23
+ Co-Authored-By: Mistral Vibe <vibe@mistral.ai>"
24
+ ```
chat_template.jinja ADDED
@@ -0,0 +1,121 @@
1
+ {#- Default system message if no system prompt is passed. #}
2
+ {%- set default_system_message = '' %}
3
+
4
+ {#- Begin of sequence token. #}
5
+ {{- bos_token }}
6
+
7
+ {#- Handle system prompt if it exists. #}
8
+ {#- System prompt supports text content or text chunks. #}
9
+ {%- if messages[0]['role'] == 'system' %}
10
+ {{- '[SYSTEM_PROMPT]' -}}
11
+ {%- if messages[0]['content'] is string %}
12
+ {{- messages[0]['content'] -}}
13
+ {%- else %}
14
+ {%- for block in messages[0]['content'] %}
15
+ {%- if block['type'] == 'text' %}
16
+ {{- block['text'] }}
17
+ {%- else %}
18
+ {{- raise_exception('Only text chunks are supported in system message contents.') }}
19
+ {%- endif %}
20
+ {%- endfor %}
21
+ {%- endif %}
22
+ {{- '[/SYSTEM_PROMPT]' -}}
23
+ {%- set loop_messages = messages[1:] %}
24
+ {%- else %}
25
+ {%- set loop_messages = messages %}
26
+ {%- if default_system_message != '' %}
27
+ {{- '[SYSTEM_PROMPT]' + default_system_message + '[/SYSTEM_PROMPT]' }}
28
+ {%- endif %}
29
+ {%- endif %}
30
+
31
+
32
+ {#- Tools definition #}
33
+ {%- set tools_definition = '' %}
34
+ {%- set has_tools = false %}
35
+ {%- if tools is defined and tools is not none and tools|length > 0 %}
36
+ {%- set has_tools = true %}
37
+ {%- set tools_definition = '[AVAILABLE_TOOLS]' + (tools| tojson) + '[/AVAILABLE_TOOLS]' %}
38
+ {{- tools_definition }}
39
+ {%- endif %}
40
+
41
+ {#- Checks for alternating user/assistant messages. #}
42
+ {%- set ns = namespace(index=0) %}
43
+ {%- for message in loop_messages %}
44
+ {%- if message.role == 'user' or (message.role == 'assistant' and (message.tool_calls is not defined or message.tool_calls is none or message.tool_calls | length == 0)) %}
45
+ {%- if (message['role'] == 'user') != (ns.index % 2 == 0) %}
46
+ {{- raise_exception('After the optional system message, conversation roles must alternate user and assistant roles except for tool calls and results.') }}
47
+ {%- endif %}
48
+ {%- set ns.index = ns.index + 1 %}
49
+ {%- endif %}
50
+ {%- endfor %}
51
+
52
+ {#- Handle conversation messages. #}
53
+ {%- for message in loop_messages %}
54
+
55
+ {#- User messages supports text content or text and image chunks. #}
56
+ {%- if message['role'] == 'user' %}
57
+ {%- if message['content'] is string %}
58
+ {{- '[INST]' + message['content'] + '[/INST]' }}
59
+ {%- elif message['content'] | length > 0 %}
60
+ {{- '[INST]' }}
61
+ {%- if message['content'] | length == 2 %}
62
+ {%- set blocks = message['content'] | sort(attribute='type') %}
63
+ {%- else %}
64
+ {%- set blocks = message['content'] %}
65
+ {%- endif %}
66
+ {%- for block in blocks %}
67
+ {%- if block['type'] == 'text' %}
68
+ {{- block['text'] }}
69
+ {%- elif block['type'] in ['image', 'image_url'] %}
70
+ {{- '[IMG]' }}
71
+ {%- else %}
72
+ {{- raise_exception('Only text, image and image_url chunks are supported in user message content.') }}
73
+ {%- endif %}
74
+ {%- endfor %}
75
+ {{- '[/INST]' }}
76
+ {%- else %}
77
+ {{- raise_exception('User message must have a string or a list of chunks in content') }}
78
+ {%- endif %}
79
+
80
+ {#- Assistant messages supports text content or text and image chunks. #}
81
+ {%- elif message['role'] == 'assistant' %}
82
+ {%- if (message['content'] is none or message['content'] == '' or message['content']|length == 0) and (message['tool_calls'] is not defined or message['tool_calls'] is none or message['tool_calls']|length == 0) %}
83
+ {{- raise_exception('Assistant message must have a string or a list of chunks in content or a list of tool calls.') }}
84
+ {%- endif %}
85
+
86
+ {%- if message['content'] is string %}
87
+ {{- message['content'] }}
88
+ {%- elif message['content'] | length > 0 %}
89
+ {%- for block in message['content'] %}
90
+ {%- if block['type'] == 'text' %}
91
+ {{- block['text'] }}
92
+ {%- else %}
93
+ {{- raise_exception('Only text chunks are supported in assistant message contents.') }}
94
+ {%- endif %}
95
+ {%- endfor %}
96
+ {%- endif %}
97
+
98
+ {%- if message['tool_calls'] is defined and message['tool_calls'] is not none and message['tool_calls']|length > 0 %}
99
+ {%- for tool in message['tool_calls'] %}
100
+ {%- set arguments = tool['function']['arguments'] %}
101
+ {%- if arguments is not string %}
102
+ {%- set arguments = arguments|tojson|safe %}
103
+ {%- elif arguments == '' %}
104
+ {%- set arguments = '{}' %}
105
+ {%- endif %}
106
+ {{- '[TOOL_CALLS]' + tool['function']['name'] + '[ARGS]' + arguments }}
107
+ {%- endfor %}
108
+ {%- endif %}
109
+
110
+ {#- End of sequence token for each assistant messages. #}
111
+ {{- eos_token }}
112
+
113
+ {#- Tool messages only supports text content. #}
114
+ {%- elif message['role'] == 'tool' %}
115
+ {{- '[TOOL_RESULTS]' + message['content']|string + '[/TOOL_RESULTS]' }}
116
+
117
+ {#- Raise exception for unsupported roles. #}
118
+ {%- else %}
119
+ {{- raise_exception('Only user, assistant and tool roles are supported, got ' + message['role'] + '.') }}
120
+ {%- endif %}
121
+ {%- endfor %}
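Rendered out, the template produces one flat token stream. The sketch below assembles that stream by hand in plain Python for a single tool-call turn, to make the marker layout explicit. This is an illustration, not the actual tokenizer: the system prompt text, the `get_weather` tool, and the weather values are invented, and the real renderer serializes the tool schema via `tojson`, which may differ in spacing.

```python
# Hypothetical example: hand-assembling the prompt string that
# chat_template.jinja would produce for one tool-call exchange.
import json

bos, eos = "<s>", "</s>"

# Invented tool schema and arguments, for illustration only.
tools = [{"type": "function",
          "function": {"name": "get_weather",
                       "parameters": {"type": "object",
                                      "properties": {"city": {"type": "string"}}}}}]
args = {"city": "Paris"}

prompt = (
    bos
    + "[SYSTEM_PROMPT]" + "You are a helpful assistant." + "[/SYSTEM_PROMPT]"
    + "[AVAILABLE_TOOLS]" + json.dumps(tools) + "[/AVAILABLE_TOOLS]"
    + "[INST]" + "Weather in Paris?" + "[/INST]"
    # Assistant tool call: name, then [ARGS], then the JSON arguments, then EOS.
    + "[TOOL_CALLS]" + "get_weather" + "[ARGS]" + json.dumps(args)
    + eos
    # Tool result is wrapped as-is, stringified.
    + "[TOOL_RESULTS]" + '{"temp_c": 18}' + "[/TOOL_RESULTS]"
)
print(prompt)
```

Note that each assistant turn (including tool-call turns) is terminated by the EOS token, while tool results are simply wrapped in `[TOOL_RESULTS]...[/TOOL_RESULTS]` with no EOS of their own.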
config.json ADDED
@@ -0,0 +1,72 @@
+ {
+   "architectures": [
+     "Mistral3ForConditionalGeneration"
+   ],
+   "dtype": "bfloat16",
+   "image_token_index": 10,
+   "model_type": "mistral3",
+   "multimodal_projector_bias": false,
+   "projector_hidden_act": "gelu",
+   "tie_word_embeddings": false,
+   "quantization_config": {
+     "activation_scheme": "static",
+     "dequantize": true,
+     "modules_to_not_convert": [
+       "model.vision_tower",
+       "model.multi_modal_projector",
+       "lm_head"
+     ],
+     "quant_method": "fp8",
+     "weight_block_size": null
+   },
+   "spatial_merge_size": 2,
+   "text_config": {
+     "attention_dropout": 0.0,
+     "head_dim": 128,
+     "hidden_act": "silu",
+     "hidden_size": 5120,
+     "initializer_range": 0.02,
+     "intermediate_size": 32768,
+     "max_position_embeddings": 393216,
+     "model_type": "ministral3",
+     "num_attention_heads": 32,
+     "num_hidden_layers": 40,
+     "num_key_value_heads": 8,
+     "rms_norm_eps": 1e-05,
+     "rope_parameters": {
+       "beta_fast": 32.0,
+       "beta_slow": 1.0,
+       "factor": 48.0,
+       "llama_4_scaling_beta": 0.1,
+       "mscale": 1.0,
+       "mscale_all_dim": 1.0,
+       "original_max_position_embeddings": 8192,
+       "rope_theta": 100000000.0,
+       "rope_type": "yarn",
+       "type": "yarn"
+     },
+     "sliding_window": null,
+     "use_cache": true,
+     "vocab_size": 131072
+   },
+   "transformers_version": "5.0.0.dev0",
+   "vision_config": {
+     "attention_dropout": 0.0,
+     "head_dim": 64,
+     "hidden_act": "silu",
+     "hidden_size": 1024,
+     "image_size": 1540,
+     "initializer_range": 0.02,
+     "intermediate_size": 4096,
+     "model_type": "pixtral",
+     "num_attention_heads": 16,
+     "num_channels": 3,
+     "num_hidden_layers": 24,
+     "patch_size": 14,
+     "rope_parameters": {
+       "rope_theta": 10000.0,
+       "rope_type": "default"
+     }
+   },
+   "vision_feature_layer": -1
+ }
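A quick sanity check on the `text_config` numbers above, using only values copied from `config.json`: the query/key/value projection widths implied by the grouped-query attention settings, and the YaRN context extension (original window times scaling factor).

```python
# Values copied from text_config in config.json above.
hidden_size = 5120
num_heads, num_kv_heads, head_dim = 32, 8, 128

q_width = num_heads * head_dim        # query projection output width
kv_width = num_kv_heads * head_dim    # key/value width under grouped-query attention
print(q_width, kv_width)              # 4096 1024 -- note head_dim is explicit, not hidden_size // num_heads

# YaRN long-context scaling: extended window = original window * factor.
original_ctx, factor = 8192, 48.0
print(int(original_ctx * factor))     # 393216 == max_position_embeddings
```

So the attention projections are narrower than the residual stream (4096 vs. 5120), and the 393216-token position limit is exactly the 8192-token pretraining window scaled by the YaRN factor of 48.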
consolidated-00001-of-00002.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:dfa96c3ccb824ac308eeeaa86fd1ce01aca4e3311e1aaa27a498ec3b7302e165
+ size 19839869982
consolidated-00002-of-00002.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:b783163b5ee6fb9595fde29d6072e81be8fcc24ea576d09ecc3dc7611ababb97
+ size 5953165906
consolidated.safetensors.index.json ADDED
The diff for this file is too large to render. See raw diff
 
generation_config.json ADDED
@@ -0,0 +1,7 @@
+ {
+   "bos_token_id": 1,
+   "eos_token_id": 2,
+   "max_length": 262144,
+   "pad_token_id": 11,
+   "transformers_version": "5.0.0.dev0"
+ }
model-00001-of-00006.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:380035da60c7cc474cb7358888a1c50c70679bb3fb7f70870c2400f93ac51d70
+ size 4907440626
model-00002-of-00006.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:2caed6d3fb5af9c97b8c70e1424a9e517454e01451332834fba4fdb4e7a18280
+ size 4996664584
model-00003-of-00006.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:e2bab376f49baa1da58c0a737f688cbfe185dc6a994fa2870d62b7c8b36e3360
+ size 4980946432
model-00004-of-00006.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:63c422f7a5c1460967068c0ceff65eb31f136f64872e281841313e8c669e7c50
+ size 4980946416
model-00005-of-00006.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:ec99fd6a7faf35b43e38e60f531e9ee5d67c4292773d71246038b9eb508e373a
+ size 4996675032
model-00006-of-00006.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:8efdf4d1c2425a2a7956bf43ae343f44a825a90a87e341ff02f708da2923a0b1
+ size 930386318
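The Git LFS pointers give exact byte counts, so the two serializations of the checkpoint (`consolidated-*.safetensors` vs. the six `model-*.safetensors` shards) can be cross-checked: both should sum to roughly the same total, differing only by per-file header overhead.

```python
# Shard sizes in bytes, copied from the LFS pointers above.
consolidated = [19839869982, 5953165906]
sharded = [4907440626, 4996664584, 4980946432,
           4980946416, 4996675032, 930386318]

total_consolidated = sum(consolidated)
total_sharded = sum(sharded)
print(total_consolidated, total_sharded)   # 25793035888 25793059408
print(round(total_sharded / 1e9, 1), "GB")  # 25.8 GB
```

Both layouts come to about 25.8 GB, and the difference between them is only a few tens of kilobytes of safetensors metadata.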
model.safetensors.index.json ADDED
The diff for this file is too large to render. See raw diff
 
params.json ADDED
@@ -0,0 +1,52 @@
+ {
+   "dim": 5120,
+   "n_layers": 40,
+   "head_dim": 128,
+   "hidden_dim": 32768,
+   "n_heads": 32,
+   "n_kv_heads": 8,
+   "rope_theta": 100000000.0,
+   "norm_eps": 1e-05,
+   "vocab_size": 131072,
+   "tied_embeddings": false,
+   "max_position_embeddings": 393216,
+   "max_seq_len": 262144,
+   "llama_4_scaling": {
+     "original_max_position_embeddings": 8192,
+     "beta": 0.1
+   },
+   "q_lora_rank": null,
+   "qk_rope_head_dim": null,
+   "qk_nope_head_dim": null,
+   "kv_lora_rank": null,
+   "v_head_dim": null,
+   "quantization": {
+     "qformat_weight": "fp8_e4m3",
+     "qscheme_act": "TENSOR"
+   },
+   "yarn": {
+     "original_max_position_embeddings": 8192,
+     "factor": 48,
+     "apply_scale": false,
+     "beta": 32,
+     "alpha": 1
+   },
+   "vision_encoder": {
+     "image_token_id": 10,
+     "image_break_token_id": 12,
+     "image_end_token_id": 13,
+     "intermediate_size": 4096,
+     "num_hidden_layers": 24,
+     "num_attention_heads": 16,
+     "mm_projector_id": "patch_merge",
+     "spatial_merge_size": 2,
+     "hidden_size": 1024,
+     "num_channels": 3,
+     "image_size": 1540,
+     "max_image_size": 1540,
+     "patch_size": 14,
+     "rope_theta": 10000.0,
+     "add_pre_mm_projector_layer_norm": true,
+     "adapter_bias": false
+   }
+ }
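As a rough back-of-envelope check, the `params.json` dimensions above pin down an approximate text-model parameter count. This is a sketch under stated assumptions: it assumes the standard Mistral-style layer layout (SwiGLU MLP with gate/up/down matrices, which the `silu` activation suggests) and ignores norms, biases, and the vision tower.

```python
# Dimensions copied from params.json above.
dim, n_layers, hidden_dim = 5120, 40, 32768
n_heads, n_kv_heads, head_dim = 32, 8, 128
vocab = 131072

attn = dim * n_heads * head_dim           # Wq
attn += 2 * dim * n_kv_heads * head_dim   # Wk, Wv (grouped-query attention)
attn += n_heads * head_dim * dim          # Wo
mlp = 3 * dim * hidden_dim                # gate, up, down (assumed SwiGLU)
embeddings = 2 * vocab * dim              # untied input + output embeddings

total = n_layers * (attn + mlp) + embeddings
print(round(total / 1e9, 1), "B parameters (rough estimate)")
```

The estimate lands around 23.6 B text parameters, which at roughly one byte per weight in fp8 is consistent with the ~25.8 GB checkpoint size once the bf16 vision tower, projector, and quantization scales are added.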
processor_config.json ADDED
@@ -0,0 +1,42 @@
+ {
+   "image_break_token": "[IMG_BREAK]",
+   "image_end_token": "[IMG_END]",
+   "image_processor": {
+     "crop_size": null,
+     "data_format": "channels_first",
+     "device": null,
+     "disable_grouping": null,
+     "do_center_crop": null,
+     "do_convert_rgb": true,
+     "do_normalize": true,
+     "do_pad": null,
+     "do_rescale": true,
+     "do_resize": true,
+     "image_mean": [
+       0.48145466,
+       0.4578275,
+       0.40821073
+     ],
+     "image_processor_type": "PixtralImageProcessorFast",
+     "image_seq_length": null,
+     "image_std": [
+       0.26862954,
+       0.26130258,
+       0.27577711
+     ],
+     "input_data_format": null,
+     "pad_size": null,
+     "patch_size": 14,
+     "processor_class": "PixtralProcessor",
+     "resample": 3,
+     "rescale_factor": 0.00392156862745098,
+     "return_tensors": null,
+     "size": {
+       "longest_edge": 1540
+     }
+   },
+   "image_token": "[IMG]",
+   "patch_size": 14,
+   "processor_class": "PixtralProcessor",
+   "spatial_merge_size": 2
+ }
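The processor config determines how many `[IMG]` placeholder tokens a picture expands into. A sketch of that arithmetic for the largest allowed square image, using `longest_edge`, `patch_size`, and `spatial_merge_size` from the config above (this counts only `[IMG]` tokens; the real processor also inserts `[IMG_BREAK]` / `[IMG_END]` markers):

```python
# Values copied from processor_config.json above.
longest_edge, patch_size, merge = 1540, 14, 2

patches_per_side = longest_edge // patch_size   # 1540 / 14 = 110 vision patches per side
tokens_per_side = patches_per_side // merge     # 55 after 2x2 patch merging
print(tokens_per_side * tokens_per_side)        # 3025 [IMG] tokens for a 1540x1540 image
```

So a maximum-size image costs on the order of three thousand tokens of context after the 2x2 patch-merge projector.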
tekken.json ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:e29d19ea32eb7e26e6c0572d57cb7f9eca0f4420e0e0fe6ae1cf3be94da1c0d6
+ size 16753777
tokenizer.json ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:286acad9b0e27fce778ac429763536accf618ccb6ed72963b6f94685e531c5c7
+ size 17077402
tokenizer_config.json ADDED
@@ -0,0 +1,1013 @@
+ {
+   "additional_special_tokens": null,
+   "backend": "tokenizers",
+   "extra_special_tokens": [
+     "<unk>",
+     "<s>",
+     "</s>",
+     "[INST]",
+     "[/INST]",
+     "[AVAILABLE_TOOLS]",
+     "[/AVAILABLE_TOOLS]",
+     "[TOOL_RESULTS]",
+     "[/TOOL_RESULTS]",
+     "[TOOL_CALLS]",
+     "[IMG]",
+     "<pad>",
+     "[IMG_BREAK]",
+     "[IMG_END]",
+     "[PREFIX]",
+     "[MIDDLE]",
+     "[SUFFIX]",
+     "[SYSTEM_PROMPT]",
+     "[/SYSTEM_PROMPT]",
+     "[TOOL_CONTENT]",
+     "<SPECIAL_20>",
+     "<SPECIAL_21>",
+     "<SPECIAL_22>",
+     "<SPECIAL_23>",
+     "[AUDIO]",
+     "[BEGIN_AUDIO]",
+     "<SPECIAL_26>",
+     "<SPECIAL_27>",
+     "<SPECIAL_28>",
+     "<SPECIAL_29>",
+     "<SPECIAL_30>",
+     "<SPECIAL_31>",
+     "[ARGS]",
+     "[CALL_ID]",
+     "[THINK]",
+     "[/THINK]",
+     "<SPECIAL_36>",
+     "<SPECIAL_37>",
+     "<SPECIAL_38>",
+     "<SPECIAL_39>",
+     "<SPECIAL_40>",
+     "<SPECIAL_41>",
+     "<SPECIAL_42>",
+     "<SPECIAL_43>",
+     "<SPECIAL_44>",
+     "<SPECIAL_45>",
+     "<SPECIAL_46>",
+     "<SPECIAL_47>",
+     "<SPECIAL_48>",
+     "<SPECIAL_49>",
+     "<SPECIAL_50>",
+     "<SPECIAL_51>",
+     "<SPECIAL_52>",
+     "<SPECIAL_53>",
+     "<SPECIAL_54>",
+     "<SPECIAL_55>",
+     "<SPECIAL_56>",
+     "<SPECIAL_57>",
+     "<SPECIAL_58>",
+     "<SPECIAL_59>",
+     "<SPECIAL_60>",
+     "<SPECIAL_61>",
+     "<SPECIAL_62>",
+     "<SPECIAL_63>",
+     "<SPECIAL_64>",
+     "<SPECIAL_65>",
+     "<SPECIAL_66>",
+     "<SPECIAL_67>",
+     "<SPECIAL_68>",
+     "<SPECIAL_69>",
+     "<SPECIAL_70>",
+     "<SPECIAL_71>",
+     "<SPECIAL_72>",
+     "<SPECIAL_73>",
+     "<SPECIAL_74>",
+     "<SPECIAL_75>",
+     "<SPECIAL_76>",
+     "<SPECIAL_77>",
+     "<SPECIAL_78>",
+     "<SPECIAL_79>",
+     "<SPECIAL_80>",
+     "<SPECIAL_81>",
+     "<SPECIAL_82>",
+     "<SPECIAL_83>",
+     "<SPECIAL_84>",
+     "<SPECIAL_85>",
+     "<SPECIAL_86>",
+     "<SPECIAL_87>",
+     "<SPECIAL_88>",
+     "<SPECIAL_89>",
+     "<SPECIAL_90>",
+     "<SPECIAL_91>",
+     "<SPECIAL_92>",
+     "<SPECIAL_93>",
+     "<SPECIAL_94>",
+     "<SPECIAL_95>",
+     "<SPECIAL_96>",
+     "<SPECIAL_97>",
+     "<SPECIAL_98>",
+     "<SPECIAL_99>",
+     "<SPECIAL_100>",
+     "<SPECIAL_101>",
+     "<SPECIAL_102>",
+     "<SPECIAL_103>",
+     "<SPECIAL_104>",
+     "<SPECIAL_105>",
+     "<SPECIAL_106>",
+     "<SPECIAL_107>",
+     "<SPECIAL_108>",
+     "<SPECIAL_109>",
+     "<SPECIAL_110>",
+     "<SPECIAL_111>",
+     "<SPECIAL_112>",
+     "<SPECIAL_113>",
+     "<SPECIAL_114>",
+     "<SPECIAL_115>",
+     "<SPECIAL_116>",
+     "<SPECIAL_117>",
+     "<SPECIAL_118>",
+     "<SPECIAL_119>",
+     "<SPECIAL_120>",
+     "<SPECIAL_121>",
+     "<SPECIAL_122>",
+     "<SPECIAL_123>",
+     "<SPECIAL_124>",
+     "<SPECIAL_125>",
+     "<SPECIAL_126>",
+     "<SPECIAL_127>",
+     "<SPECIAL_128>",
+     "<SPECIAL_129>",
+     "<SPECIAL_130>",
+     "<SPECIAL_131>",
+     "<SPECIAL_132>",
+     "<SPECIAL_133>",
+     "<SPECIAL_134>",
+     "<SPECIAL_135>",
+     "<SPECIAL_136>",
+     "<SPECIAL_137>",
+     "<SPECIAL_138>",
+     "<SPECIAL_139>",
+     "<SPECIAL_140>",
+     "<SPECIAL_141>",
+     "<SPECIAL_142>",
+     "<SPECIAL_143>",
+     "<SPECIAL_144>",
+     "<SPECIAL_145>",
+     "<SPECIAL_146>",
+     "<SPECIAL_147>",
+     "<SPECIAL_148>",
+     "<SPECIAL_149>",
+     "<SPECIAL_150>",
+     "<SPECIAL_151>",
+     "<SPECIAL_152>",
+     "<SPECIAL_153>",
+     "<SPECIAL_154>",
+     "<SPECIAL_155>",
+     "<SPECIAL_156>",
+     "<SPECIAL_157>",
+     "<SPECIAL_158>",
+     "<SPECIAL_159>",
+     "<SPECIAL_160>",
+     "<SPECIAL_161>",
+     "<SPECIAL_162>",
+     "<SPECIAL_163>",
+     "<SPECIAL_164>",
+     "<SPECIAL_165>",
+     "<SPECIAL_166>",
+     "<SPECIAL_167>",
+     "<SPECIAL_168>",
+     "<SPECIAL_169>",
+     "<SPECIAL_170>",
+     "<SPECIAL_171>",
+     "<SPECIAL_172>",
+     "<SPECIAL_173>",
+     "<SPECIAL_174>",
+     "<SPECIAL_175>",
+     "<SPECIAL_176>",
+     "<SPECIAL_177>",
+     "<SPECIAL_178>",
+     "<SPECIAL_179>",
+     "<SPECIAL_180>",
+     "<SPECIAL_181>",
+     "<SPECIAL_182>",
+     "<SPECIAL_183>",
+     "<SPECIAL_184>",
+     "<SPECIAL_185>",
+     "<SPECIAL_186>",
+     "<SPECIAL_187>",
+     "<SPECIAL_188>",
+     "<SPECIAL_189>",
+     "<SPECIAL_190>",
+     "<SPECIAL_191>",
+     "<SPECIAL_192>",
+     "<SPECIAL_193>",
+     "<SPECIAL_194>",
+     "<SPECIAL_195>",
+     "<SPECIAL_196>",
+     "<SPECIAL_197>",
+     "<SPECIAL_198>",
+     "<SPECIAL_199>",
+     "<SPECIAL_200>",
+     "<SPECIAL_201>",
+     "<SPECIAL_202>",
+     "<SPECIAL_203>",
+     "<SPECIAL_204>",
+     "<SPECIAL_205>",
+     "<SPECIAL_206>",
+     "<SPECIAL_207>",
+     "<SPECIAL_208>",
+     "<SPECIAL_209>",
+     "<SPECIAL_210>",
+     "<SPECIAL_211>",
+     "<SPECIAL_212>",
+     "<SPECIAL_213>",
+     "<SPECIAL_214>",
+     "<SPECIAL_215>",
+     "<SPECIAL_216>",
+     "<SPECIAL_217>",
+     "<SPECIAL_218>",
+     "<SPECIAL_219>",
+     "<SPECIAL_220>",
+     "<SPECIAL_221>",
+     "<SPECIAL_222>",
+     "<SPECIAL_223>",
+     "<SPECIAL_224>",
+     "<SPECIAL_225>",
+     "<SPECIAL_226>",
+     "<SPECIAL_227>",
+     "<SPECIAL_228>",
+     "<SPECIAL_229>",
+     "<SPECIAL_230>",
+     "<SPECIAL_231>",
+     "<SPECIAL_232>",
+     "<SPECIAL_233>",
+     "<SPECIAL_234>",
+     "<SPECIAL_235>",
+     "<SPECIAL_236>",
+     "<SPECIAL_237>",
+     "<SPECIAL_238>",
+     "<SPECIAL_239>",
+     "<SPECIAL_240>",
+     "<SPECIAL_241>",
+     "<SPECIAL_242>",
+     "<SPECIAL_243>",
+     "<SPECIAL_244>",
+     "<SPECIAL_245>",
+     "<SPECIAL_246>",
+     "<SPECIAL_247>",
+     "<SPECIAL_248>",
+     "<SPECIAL_249>",
+     "<SPECIAL_250>",
+     "<SPECIAL_251>",
+     "<SPECIAL_252>",
+     "<SPECIAL_253>",
+     "<SPECIAL_254>",
+     "<SPECIAL_255>",
+     "<SPECIAL_256>",
+     "<SPECIAL_257>",
+     "<SPECIAL_258>",
+     "<SPECIAL_259>",
+     "<SPECIAL_260>",
+     "<SPECIAL_261>",
+     "<SPECIAL_262>",
+     "<SPECIAL_263>",
+     "<SPECIAL_264>",
+     "<SPECIAL_265>",
+     "<SPECIAL_266>",
+     "<SPECIAL_267>",
+     "<SPECIAL_268>",
+     "<SPECIAL_269>",
+     "<SPECIAL_270>",
+     "<SPECIAL_271>",
+     "<SPECIAL_272>",
+     "<SPECIAL_273>",
+     "<SPECIAL_274>",
+     "<SPECIAL_275>",
+     "<SPECIAL_276>",
+     "<SPECIAL_277>",
+     "<SPECIAL_278>",
+     "<SPECIAL_279>",
+     "<SPECIAL_280>",
+     "<SPECIAL_281>",
+     "<SPECIAL_282>",
+     "<SPECIAL_283>",
+     "<SPECIAL_284>",
+     "<SPECIAL_285>",
+     "<SPECIAL_286>",
+     "<SPECIAL_287>",
+     "<SPECIAL_288>",
+     "<SPECIAL_289>",
+     "<SPECIAL_290>",
+     "<SPECIAL_291>",
+     "<SPECIAL_292>",
+     "<SPECIAL_293>",
+     "<SPECIAL_294>",
+     "<SPECIAL_295>",
+     "<SPECIAL_296>",
+     "<SPECIAL_297>",
+     "<SPECIAL_298>",
+     "<SPECIAL_299>",
+     "<SPECIAL_300>",
+     "<SPECIAL_301>",
+     "<SPECIAL_302>",
+     "<SPECIAL_303>",
+     "<SPECIAL_304>",
+     "<SPECIAL_305>",
+     "<SPECIAL_306>",
+     "<SPECIAL_307>",
+     "<SPECIAL_308>",
+     "<SPECIAL_309>",
+     "<SPECIAL_310>",
+     "<SPECIAL_311>",
+     "<SPECIAL_312>",
+     "<SPECIAL_313>",
+     "<SPECIAL_314>",
+     "<SPECIAL_315>",
+     "<SPECIAL_316>",
+     "<SPECIAL_317>",
+     "<SPECIAL_318>",
+     "<SPECIAL_319>",
+     "<SPECIAL_320>",
+     "<SPECIAL_321>",
+     "<SPECIAL_322>",
+     "<SPECIAL_323>",
+     "<SPECIAL_324>",
+     "<SPECIAL_325>",
+     "<SPECIAL_326>",
+     "<SPECIAL_327>",
+     "<SPECIAL_328>",
+     "<SPECIAL_329>",
+     "<SPECIAL_330>",
+     "<SPECIAL_331>",
+     "<SPECIAL_332>",
+     "<SPECIAL_333>",
+     "<SPECIAL_334>",
+     "<SPECIAL_335>",
+     "<SPECIAL_336>",
+     "<SPECIAL_337>",
+     "<SPECIAL_338>",
+     "<SPECIAL_339>",
+     "<SPECIAL_340>",
+     "<SPECIAL_341>",
+     "<SPECIAL_342>",
+     "<SPECIAL_343>",
+     "<SPECIAL_344>",
+     "<SPECIAL_345>",
+     "<SPECIAL_346>",
+     "<SPECIAL_347>",
+     "<SPECIAL_348>",
+     "<SPECIAL_349>",
+     "<SPECIAL_350>",
+     "<SPECIAL_351>",
+     "<SPECIAL_352>",
+     "<SPECIAL_353>",
+     "<SPECIAL_354>",
+     "<SPECIAL_355>",
+     "<SPECIAL_356>",
+     "<SPECIAL_357>",
+     "<SPECIAL_358>",
+     "<SPECIAL_359>",
+     "<SPECIAL_360>",
+     "<SPECIAL_361>",
+     "<SPECIAL_362>",
+     "<SPECIAL_363>",
+     "<SPECIAL_364>",
+     "<SPECIAL_365>",
+     "<SPECIAL_366>",
+     "<SPECIAL_367>",
+     "<SPECIAL_368>",
+     "<SPECIAL_369>",
+     "<SPECIAL_370>",
+     "<SPECIAL_371>",
+     "<SPECIAL_372>",
+     "<SPECIAL_373>",
+     "<SPECIAL_374>",
+     "<SPECIAL_375>",
+     "<SPECIAL_376>",
+     "<SPECIAL_377>",
+     "<SPECIAL_378>",
+     "<SPECIAL_379>",
+     "<SPECIAL_380>",
+     "<SPECIAL_381>",
+     "<SPECIAL_382>",
+     "<SPECIAL_383>",
+     "<SPECIAL_384>",
+     "<SPECIAL_385>",
+     "<SPECIAL_386>",
+     "<SPECIAL_387>",
+     "<SPECIAL_388>",
+     "<SPECIAL_389>",
+     "<SPECIAL_390>",
+     "<SPECIAL_391>",
+     "<SPECIAL_392>",
+     "<SPECIAL_393>",
+     "<SPECIAL_394>",
+     "<SPECIAL_395>",
+     "<SPECIAL_396>",
+     "<SPECIAL_397>",
+     "<SPECIAL_398>",
+     "<SPECIAL_399>",
+     "<SPECIAL_400>",
+     "<SPECIAL_401>",
+     "<SPECIAL_402>",
+     "<SPECIAL_403>",
+     "<SPECIAL_404>",
+     "<SPECIAL_405>",
+     "<SPECIAL_406>",
+     "<SPECIAL_407>",
+     "<SPECIAL_408>",
+     "<SPECIAL_409>",
+     "<SPECIAL_410>",
+     "<SPECIAL_411>",
+     "<SPECIAL_412>",
+     "<SPECIAL_413>",
+     "<SPECIAL_414>",
+     "<SPECIAL_415>",
+     "<SPECIAL_416>",
+     "<SPECIAL_417>",
+     "<SPECIAL_418>",
+     "<SPECIAL_419>",
+     "<SPECIAL_420>",
+     "<SPECIAL_421>",
+     "<SPECIAL_422>",
+     "<SPECIAL_423>",
+     "<SPECIAL_424>",
+     "<SPECIAL_425>",
+     "<SPECIAL_426>",
+     "<SPECIAL_427>",
+     "<SPECIAL_428>",
+     "<SPECIAL_429>",
+     "<SPECIAL_430>",
+     "<SPECIAL_431>",
+     "<SPECIAL_432>",
+     "<SPECIAL_433>",
+     "<SPECIAL_434>",
+     "<SPECIAL_435>",
+     "<SPECIAL_436>",
+     "<SPECIAL_437>",
+     "<SPECIAL_438>",
+     "<SPECIAL_439>",
+     "<SPECIAL_440>",
+     "<SPECIAL_441>",
+     "<SPECIAL_442>",
+     "<SPECIAL_443>",
+     "<SPECIAL_444>",
+     "<SPECIAL_445>",
+     "<SPECIAL_446>",
+     "<SPECIAL_447>",
+     "<SPECIAL_448>",
+     "<SPECIAL_449>",
+     "<SPECIAL_450>",
+     "<SPECIAL_451>",
+     "<SPECIAL_452>",
+     "<SPECIAL_453>",
+     "<SPECIAL_454>",
+     "<SPECIAL_455>",
+     "<SPECIAL_456>",
+     "<SPECIAL_457>",
+     "<SPECIAL_458>",
+     "<SPECIAL_459>",
+     "<SPECIAL_460>",
+     "<SPECIAL_461>",
+     "<SPECIAL_462>",
+     "<SPECIAL_463>",
+     "<SPECIAL_464>",
+     "<SPECIAL_465>",
+     "<SPECIAL_466>",
+     "<SPECIAL_467>",
+     "<SPECIAL_468>",
+     "<SPECIAL_469>",
+     "<SPECIAL_470>",
+     "<SPECIAL_471>",
+     "<SPECIAL_472>",
+     "<SPECIAL_473>",
+     "<SPECIAL_474>",
+     "<SPECIAL_475>",
+     "<SPECIAL_476>",
+     "<SPECIAL_477>",
+     "<SPECIAL_478>",
+     "<SPECIAL_479>",
+     "<SPECIAL_480>",
+     "<SPECIAL_481>",
+     "<SPECIAL_482>",
+     "<SPECIAL_483>",
+     "<SPECIAL_484>",
+     "<SPECIAL_485>",
+     "<SPECIAL_486>",
+     "<SPECIAL_487>",
+     "<SPECIAL_488>",
+     "<SPECIAL_489>",
+     "<SPECIAL_490>",
+     "<SPECIAL_491>",
+     "<SPECIAL_492>",
+     "<SPECIAL_493>",
+     "<SPECIAL_494>",
+     "<SPECIAL_495>",
+     "<SPECIAL_496>",
+     "<SPECIAL_497>",
+     "<SPECIAL_498>",
+     "<SPECIAL_499>",
+     "<SPECIAL_500>",
+     "<SPECIAL_501>",
+     "<SPECIAL_502>",
+     "<SPECIAL_503>",
+     "<SPECIAL_504>",
+     "<SPECIAL_505>",
+     "<SPECIAL_506>",
+     "<SPECIAL_507>",
+     "<SPECIAL_508>",
+     "<SPECIAL_509>",
+     "<SPECIAL_510>",
+     "<SPECIAL_511>",
+     "<SPECIAL_512>",
+     "<SPECIAL_513>",
+     "<SPECIAL_514>",
+     "<SPECIAL_515>",
+     "<SPECIAL_516>",
+     "<SPECIAL_517>",
+     "<SPECIAL_518>",
+     "<SPECIAL_519>",
+     "<SPECIAL_520>",
+     "<SPECIAL_521>",
+     "<SPECIAL_522>",
+     "<SPECIAL_523>",
+     "<SPECIAL_524>",
+     "<SPECIAL_525>",
+     "<SPECIAL_526>",
+     "<SPECIAL_527>",
+     "<SPECIAL_528>",
+     "<SPECIAL_529>",
+     "<SPECIAL_530>",
+     "<SPECIAL_531>",
+     "<SPECIAL_532>",
+     "<SPECIAL_533>",
+     "<SPECIAL_534>",
+     "<SPECIAL_535>",
+     "<SPECIAL_536>",
+     "<SPECIAL_537>",
+     "<SPECIAL_538>",
+     "<SPECIAL_539>",
+     "<SPECIAL_540>",
+     "<SPECIAL_541>",
+     "<SPECIAL_542>",
+     "<SPECIAL_543>",
+     "<SPECIAL_544>",
+     "<SPECIAL_545>",
+     "<SPECIAL_546>",
+     "<SPECIAL_547>",
+     "<SPECIAL_548>",
+     "<SPECIAL_549>",
+     "<SPECIAL_550>",
+     "<SPECIAL_551>",
+     "<SPECIAL_552>",
+     "<SPECIAL_553>",
+     "<SPECIAL_554>",
+     "<SPECIAL_555>",
+     "<SPECIAL_556>",
+     "<SPECIAL_557>",
+     "<SPECIAL_558>",
+     "<SPECIAL_559>",
+     "<SPECIAL_560>",
+     "<SPECIAL_561>",
+     "<SPECIAL_562>",
+     "<SPECIAL_563>",
+     "<SPECIAL_564>",
+     "<SPECIAL_565>",
+     "<SPECIAL_566>",
+     "<SPECIAL_567>",
+     "<SPECIAL_568>",
+     "<SPECIAL_569>",
+     "<SPECIAL_570>",
+     "<SPECIAL_571>",
+     "<SPECIAL_572>",
+     "<SPECIAL_573>",
+     "<SPECIAL_574>",
+     "<SPECIAL_575>",
+     "<SPECIAL_576>",
+     "<SPECIAL_577>",
+     "<SPECIAL_578>",
+     "<SPECIAL_579>",
+     "<SPECIAL_580>",
+     "<SPECIAL_581>",
+     "<SPECIAL_582>",
+     "<SPECIAL_583>",
+     "<SPECIAL_584>",
+     "<SPECIAL_585>",
+     "<SPECIAL_586>",
+     "<SPECIAL_587>",
+     "<SPECIAL_588>",
+     "<SPECIAL_589>",
+     "<SPECIAL_590>",
+     "<SPECIAL_591>",
+     "<SPECIAL_592>",
+     "<SPECIAL_593>",
+     "<SPECIAL_594>",
+     "<SPECIAL_595>",
+     "<SPECIAL_596>",
+     "<SPECIAL_597>",
+     "<SPECIAL_598>",
+     "<SPECIAL_599>",
+     "<SPECIAL_600>",
+     "<SPECIAL_601>",
+     "<SPECIAL_602>",
+     "<SPECIAL_603>",
+     "<SPECIAL_604>",
+     "<SPECIAL_605>",
+     "<SPECIAL_606>",
+     "<SPECIAL_607>",
+     "<SPECIAL_608>",
+     "<SPECIAL_609>",
+     "<SPECIAL_610>",
+     "<SPECIAL_611>",
+     "<SPECIAL_612>",
+     "<SPECIAL_613>",
+     "<SPECIAL_614>",
+     "<SPECIAL_615>",
+     "<SPECIAL_616>",
+     "<SPECIAL_617>",
+     "<SPECIAL_618>",
+     "<SPECIAL_619>",
+     "<SPECIAL_620>",
+     "<SPECIAL_621>",
+     "<SPECIAL_622>",
+     "<SPECIAL_623>",
+     "<SPECIAL_624>",
+     "<SPECIAL_625>",
+     "<SPECIAL_626>",
+     "<SPECIAL_627>",
+     "<SPECIAL_628>",
+     "<SPECIAL_629>",
+     "<SPECIAL_630>",
+     "<SPECIAL_631>",
+     "<SPECIAL_632>",
+     "<SPECIAL_633>",
+     "<SPECIAL_634>",
+     "<SPECIAL_635>",
+     "<SPECIAL_636>",
+     "<SPECIAL_637>",
+     "<SPECIAL_638>",
+     "<SPECIAL_639>",
+     "<SPECIAL_640>",
+     "<SPECIAL_641>",
+     "<SPECIAL_642>",
+     "<SPECIAL_643>",
+     "<SPECIAL_644>",
+     "<SPECIAL_645>",
+     "<SPECIAL_646>",
+     "<SPECIAL_647>",
+     "<SPECIAL_648>",
+     "<SPECIAL_649>",
+     "<SPECIAL_650>",
+     "<SPECIAL_651>",
+     "<SPECIAL_652>",
+     "<SPECIAL_653>",
+     "<SPECIAL_654>",
+     "<SPECIAL_655>",
+     "<SPECIAL_656>",
+     "<SPECIAL_657>",
+     "<SPECIAL_658>",
+     "<SPECIAL_659>",
+     "<SPECIAL_660>",
+     "<SPECIAL_661>",
+     "<SPECIAL_662>",
+     "<SPECIAL_663>",
+     "<SPECIAL_664>",
+     "<SPECIAL_665>",
+     "<SPECIAL_666>",
+     "<SPECIAL_667>",
+     "<SPECIAL_668>",
+     "<SPECIAL_669>",
+     "<SPECIAL_670>",
+     "<SPECIAL_671>",
+     "<SPECIAL_672>",
+     "<SPECIAL_673>",
+     "<SPECIAL_674>",
+     "<SPECIAL_675>",
+     "<SPECIAL_676>",
+     "<SPECIAL_677>",
+     "<SPECIAL_678>",
+     "<SPECIAL_679>",
+     "<SPECIAL_680>",
+     "<SPECIAL_681>",
+     "<SPECIAL_682>",
+     "<SPECIAL_683>",
+     "<SPECIAL_684>",
+     "<SPECIAL_685>",
+     "<SPECIAL_686>",
+     "<SPECIAL_687>",
+     "<SPECIAL_688>",
+     "<SPECIAL_689>",
+     "<SPECIAL_690>",
+     "<SPECIAL_691>",
+     "<SPECIAL_692>",
+     "<SPECIAL_693>",
+     "<SPECIAL_694>",
+     "<SPECIAL_695>",
+     "<SPECIAL_696>",
+     "<SPECIAL_697>",
+     "<SPECIAL_698>",
+     "<SPECIAL_699>",
+     "<SPECIAL_700>",
+     "<SPECIAL_701>",
+     "<SPECIAL_702>",
+     "<SPECIAL_703>",
+     "<SPECIAL_704>",
+     "<SPECIAL_705>",
+     "<SPECIAL_706>",
+     "<SPECIAL_707>",
+     "<SPECIAL_708>",
+     "<SPECIAL_709>",
+     "<SPECIAL_710>",
+     "<SPECIAL_711>",
+     "<SPECIAL_712>",
+     "<SPECIAL_713>",
+     "<SPECIAL_714>",
+     "<SPECIAL_715>",
+     "<SPECIAL_716>",
+     "<SPECIAL_717>",
+     "<SPECIAL_718>",
+     "<SPECIAL_719>",
+     "<SPECIAL_720>",
+     "<SPECIAL_721>",
+     "<SPECIAL_722>",
+     "<SPECIAL_723>",
+     "<SPECIAL_724>",
+     "<SPECIAL_725>",
+     "<SPECIAL_726>",
+     "<SPECIAL_727>",
+     "<SPECIAL_728>",
+     "<SPECIAL_729>",
+     "<SPECIAL_730>",
+     "<SPECIAL_731>",
+     "<SPECIAL_732>",
+     "<SPECIAL_733>",
+     "<SPECIAL_734>",
+     "<SPECIAL_735>",
+     "<SPECIAL_736>",
+     "<SPECIAL_737>",
+     "<SPECIAL_738>",
+     "<SPECIAL_739>",
+     "<SPECIAL_740>",
+     "<SPECIAL_741>",
+     "<SPECIAL_742>",
+     "<SPECIAL_743>",
+     "<SPECIAL_744>",
+     "<SPECIAL_745>",
+     "<SPECIAL_746>",
+     "<SPECIAL_747>",
+     "<SPECIAL_748>",
+     "<SPECIAL_749>",
+     "<SPECIAL_750>",
+     "<SPECIAL_751>",
+     "<SPECIAL_752>",
+     "<SPECIAL_753>",
+     "<SPECIAL_754>",
+     "<SPECIAL_755>",
+     "<SPECIAL_756>",
+     "<SPECIAL_757>",
+     "<SPECIAL_758>",
+     "<SPECIAL_759>",
+     "<SPECIAL_760>",
+     "<SPECIAL_761>",
+     "<SPECIAL_762>",
+     "<SPECIAL_763>",
+     "<SPECIAL_764>",
+     "<SPECIAL_765>",
+     "<SPECIAL_766>",
+     "<SPECIAL_767>",
+     "<SPECIAL_768>",
+     "<SPECIAL_769>",
+     "<SPECIAL_770>",
+     "<SPECIAL_771>",
+     "<SPECIAL_772>",
+     "<SPECIAL_773>",
+     "<SPECIAL_774>",
+     "<SPECIAL_775>",
+     "<SPECIAL_776>",
+     "<SPECIAL_777>",
+     "<SPECIAL_778>",
+     "<SPECIAL_779>",
+     "<SPECIAL_780>",
+     "<SPECIAL_781>",
+     "<SPECIAL_782>",
+     "<SPECIAL_783>",
+     "<SPECIAL_784>",
+     "<SPECIAL_785>",
+     "<SPECIAL_786>",
+     "<SPECIAL_787>",
+     "<SPECIAL_788>",
+     "<SPECIAL_789>",
+     "<SPECIAL_790>",
+     "<SPECIAL_791>",
+     "<SPECIAL_792>",
+     "<SPECIAL_793>",
+     "<SPECIAL_794>",
+     "<SPECIAL_795>",
+     "<SPECIAL_796>",
+     "<SPECIAL_797>",
+     "<SPECIAL_798>",
+     "<SPECIAL_799>",
+     "<SPECIAL_800>",
+     "<SPECIAL_801>",
+     "<SPECIAL_802>",
+     "<SPECIAL_803>",
+     "<SPECIAL_804>",
+     "<SPECIAL_805>",
+     "<SPECIAL_806>",
+     "<SPECIAL_807>",
+     "<SPECIAL_808>",
+     "<SPECIAL_809>",
+     "<SPECIAL_810>",
+     "<SPECIAL_811>",
+     "<SPECIAL_812>",
+     "<SPECIAL_813>",
+     "<SPECIAL_814>",
+     "<SPECIAL_815>",
+     "<SPECIAL_816>",
+     "<SPECIAL_817>",
+     "<SPECIAL_818>",
+     "<SPECIAL_819>",
+     "<SPECIAL_820>",
+     "<SPECIAL_821>",
+     "<SPECIAL_822>",
+     "<SPECIAL_823>",
+     "<SPECIAL_824>",
+     "<SPECIAL_825>",
+     "<SPECIAL_826>",
+     "<SPECIAL_827>",
+     "<SPECIAL_828>",
+     "<SPECIAL_829>",
+     "<SPECIAL_830>",
+     "<SPECIAL_831>",
+     "<SPECIAL_832>",
+     "<SPECIAL_833>",
+     "<SPECIAL_834>",
+     "<SPECIAL_835>",
+     "<SPECIAL_836>",
+     "<SPECIAL_837>",
+     "<SPECIAL_838>",
+     "<SPECIAL_839>",
+     "<SPECIAL_840>",
+     "<SPECIAL_841>",
+     "<SPECIAL_842>",
+     "<SPECIAL_843>",
+     "<SPECIAL_844>",
+     "<SPECIAL_845>",
+     "<SPECIAL_846>",
+     "<SPECIAL_847>",
+     "<SPECIAL_848>",
+     "<SPECIAL_849>",
+     "<SPECIAL_850>",
+     "<SPECIAL_851>",
+     "<SPECIAL_852>",
+     "<SPECIAL_853>",
+     "<SPECIAL_854>",
+     "<SPECIAL_855>",
+     "<SPECIAL_856>",
+     "<SPECIAL_857>",
+     "<SPECIAL_858>",
+     "<SPECIAL_859>",
+     "<SPECIAL_860>",
+     "<SPECIAL_861>",
+     "<SPECIAL_862>",
+     "<SPECIAL_863>",
+     "<SPECIAL_864>",
+     "<SPECIAL_865>",
+     "<SPECIAL_866>",
+     "<SPECIAL_867>",
+     "<SPECIAL_868>",
+     "<SPECIAL_869>",
+     "<SPECIAL_870>",
+     "<SPECIAL_871>",
+     "<SPECIAL_872>",
+     "<SPECIAL_873>",
+     "<SPECIAL_874>",
+     "<SPECIAL_875>",
+     "<SPECIAL_876>",
+     "<SPECIAL_877>",
+     "<SPECIAL_878>",
+     "<SPECIAL_879>",
885
+ "<SPECIAL_880>",
886
+ "<SPECIAL_881>",
887
+ "<SPECIAL_882>",
888
+ "<SPECIAL_883>",
889
+ "<SPECIAL_884>",
890
+ "<SPECIAL_885>",
891
+ "<SPECIAL_886>",
892
+ "<SPECIAL_887>",
893
+ "<SPECIAL_888>",
894
+ "<SPECIAL_889>",
895
+ "<SPECIAL_890>",
896
+ "<SPECIAL_891>",
897
+ "<SPECIAL_892>",
898
+ "<SPECIAL_893>",
899
+ "<SPECIAL_894>",
900
+ "<SPECIAL_895>",
901
+ "<SPECIAL_896>",
902
+ "<SPECIAL_897>",
903
+ "<SPECIAL_898>",
904
+ "<SPECIAL_899>",
905
+ "<SPECIAL_900>",
906
+ "<SPECIAL_901>",
907
+ "<SPECIAL_902>",
908
+ "<SPECIAL_903>",
909
+ "<SPECIAL_904>",
910
+ "<SPECIAL_905>",
911
+ "<SPECIAL_906>",
912
+ "<SPECIAL_907>",
913
+ "<SPECIAL_908>",
914
+ "<SPECIAL_909>",
915
+ "<SPECIAL_910>",
916
+ "<SPECIAL_911>",
917
+ "<SPECIAL_912>",
918
+ "<SPECIAL_913>",
919
+ "<SPECIAL_914>",
920
+ "<SPECIAL_915>",
921
+ "<SPECIAL_916>",
922
+ "<SPECIAL_917>",
923
+ "<SPECIAL_918>",
924
+ "<SPECIAL_919>",
925
+ "<SPECIAL_920>",
926
+ "<SPECIAL_921>",
927
+ "<SPECIAL_922>",
928
+ "<SPECIAL_923>",
929
+ "<SPECIAL_924>",
930
+ "<SPECIAL_925>",
931
+ "<SPECIAL_926>",
932
+ "<SPECIAL_927>",
933
+ "<SPECIAL_928>",
934
+ "<SPECIAL_929>",
935
+ "<SPECIAL_930>",
936
+ "<SPECIAL_931>",
937
+ "<SPECIAL_932>",
938
+ "<SPECIAL_933>",
939
+ "<SPECIAL_934>",
940
+ "<SPECIAL_935>",
941
+ "<SPECIAL_936>",
942
+ "<SPECIAL_937>",
943
+ "<SPECIAL_938>",
944
+ "<SPECIAL_939>",
945
+ "<SPECIAL_940>",
946
+ "<SPECIAL_941>",
947
+ "<SPECIAL_942>",
948
+ "<SPECIAL_943>",
949
+ "<SPECIAL_944>",
950
+ "<SPECIAL_945>",
951
+ "<SPECIAL_946>",
952
+ "<SPECIAL_947>",
953
+ "<SPECIAL_948>",
954
+ "<SPECIAL_949>",
955
+ "<SPECIAL_950>",
956
+ "<SPECIAL_951>",
957
+ "<SPECIAL_952>",
958
+ "<SPECIAL_953>",
959
+ "<SPECIAL_954>",
960
+ "<SPECIAL_955>",
961
+ "<SPECIAL_956>",
962
+ "<SPECIAL_957>",
963
+ "<SPECIAL_958>",
964
+ "<SPECIAL_959>",
965
+ "<SPECIAL_960>",
966
+ "<SPECIAL_961>",
967
+ "<SPECIAL_962>",
968
+ "<SPECIAL_963>",
969
+ "<SPECIAL_964>",
970
+ "<SPECIAL_965>",
971
+ "<SPECIAL_966>",
972
+ "<SPECIAL_967>",
973
+ "<SPECIAL_968>",
974
+ "<SPECIAL_969>",
975
+ "<SPECIAL_970>",
976
+ "<SPECIAL_971>",
977
+ "<SPECIAL_972>",
978
+ "<SPECIAL_973>",
979
+ "<SPECIAL_974>",
980
+ "<SPECIAL_975>",
981
+ "<SPECIAL_976>",
982
+ "<SPECIAL_977>",
983
+ "<SPECIAL_978>",
984
+ "<SPECIAL_979>",
985
+ "<SPECIAL_980>",
986
+ "<SPECIAL_981>",
987
+ "<SPECIAL_982>",
988
+ "<SPECIAL_983>",
989
+ "<SPECIAL_984>",
990
+ "<SPECIAL_985>",
991
+ "<SPECIAL_986>",
992
+ "<SPECIAL_987>",
993
+ "<SPECIAL_988>",
994
+ "<SPECIAL_989>",
995
+ "<SPECIAL_990>",
996
+ "<SPECIAL_991>",
997
+ "<SPECIAL_992>",
998
+ "<SPECIAL_993>",
999
+ "<SPECIAL_994>",
1000
+ "<SPECIAL_995>",
1001
+ "<SPECIAL_996>",
1002
+ "<SPECIAL_997>",
1003
+ "<SPECIAL_998>",
1004
+ "<SPECIAL_999>"
1005
+ ],
1006
+ "model_max_length": 1000000000000000019884624838656,
1007
+ "bos_token": "<s>",
1008
+ "eos_token": "</s>",
1009
+ "pad_token": "<pad>",
1010
+ "unk_token": "<unk>",
1011
+ "processor_class": "PixtralProcessor",
1012
+ "tokenizer_class": "TokenizersBackend"
1013
+ }