leejunhyeok committed
Commit c2ca405 · verified · 1 Parent(s): d63f78d

Update README.md

Files changed (1): README.md (+1 −2)
README.md CHANGED
````diff
@@ -74,10 +74,9 @@ For maximum performance, we highly recommend using the options below.
 ```--compilation_config '{"full_cuda_graph": true}'``` : Activates cuda [full graph capture](https://docs.vllm.ai/en/stable/design/cuda_graphs/#cudagraphmodes)
 ```--rope-scaling '{"rope_type":"yarn","factor":2.0,"original_max_position_embeddings":65536}'```: Apply [yarn](https://arxiv.org/abs/2309.00071) to support 128K context length
 ```--enable-auto-tool-choice --tool-call-parser hermes``` : Enables [tool calling](https://docs.vllm.ai/en/latest/features/tool_calling/)
-```--logits-processors logit_:WrappedPerReqLogitsProcessor``` : Enables VLLM_THINK_BUDGET_RATIO env variable and repetition-based auto-stop for thinking autostop
+```--logits-processors logit_:WrappedPerReqLogitsProcessor```: Enables a ratio-based thinking budget and repetition-based auto-stop. The model is guided to think for ```(model_max_len - input_prompt_len) * VLLM_THINK_BUDGET_RATIO``` tokens, using the rest of the context window to generate the response
 ```--reasoning-parser deepseek_r1``` : Parses [reasoning outputs](https://docs.vllm.ai/en/latest/features/reasoning_outputs/)
 
-### how to use
 ```bash
 pip install -U "huggingface_hub[cli]"
 hf download Motif-Technologies/Motif-2-12.7B-Reasoning \
````
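The budget formula in the updated line can be sketched in a few lines of Python. This is a minimal illustration of the arithmetic only, not the actual logits-processor implementation; the function name `thinking_budget` and the fallback default of `0.5` for `VLLM_THINK_BUDGET_RATIO` are assumptions made here for the example.

```python
import os

def thinking_budget(model_max_len: int, input_prompt_len: int) -> int:
    """Tokens the model is guided to spend thinking, per the README formula:
    (model_max_len - input_prompt_len) * VLLM_THINK_BUDGET_RATIO.
    The remainder of the context window is left for the final response."""
    # The 0.5 fallback is an assumption for this sketch, not a documented default.
    ratio = float(os.environ.get("VLLM_THINK_BUDGET_RATIO", "0.5"))
    remaining = model_max_len - input_prompt_len
    return int(remaining * ratio)

# With a 131072-token window, a 1072-token prompt, and ratio 0.5,
# the thinking budget is (131072 - 1072) * 0.5 = 65000 tokens.
```

Lowering the ratio trades reasoning depth for more room in the response; the repetition-based auto-stop can end the thinking phase earlier than the budget allows.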