Commit cecfc98 (verified) · Parent(s): 6cea7e7
YuLu0713 committed: Update README.md

Files changed (1): README.md (+8 -4)
README.md CHANGED
@@ -44,12 +44,16 @@ This repo contains the **Seed-X-Instruct** model, with the following features:
 
 ## Quickstart
 
-* The **language tags** at the end of the prompt is necessary, which are used in PPO training. For example, when the target language is German, \<de\> needs to be added. You can refer to the above table for language abbreviations.
-* This model is specialized in multilingual translation, which is unexpected to support other tasks.
-* We don't have any chat template, thus you don't have to perform ```tokenizer.apply_chat_template```. Please avoid prompting the model in a multi-round conversation format.
-* We recommend against using unofficial quantized versions for local deployment. We will soon release an official quantized model and develop a demo on Hugging Face Space.
+📮 **Notice**
+* **The language tags at the end of the prompt are necessary**; they are used in PPO training. For example, when the target language is German, \<de\> needs to be appended. You can refer to the table above for language abbreviations.
+* **This model is specialized in multilingual translation** and is not expected to support other tasks.
+* **We don't have any chat template**, so you don't need to call ```tokenizer.apply_chat_template```. Please avoid prompting the model in a multi-round conversation format.
+* **We recommend against using unofficial quantized versions for local deployment.** We will soon release an official quantized model and develop a demo on Hugging Face Space.
 
 Here is a simple example demonstrating how to load the model and perform translation using ```vllm```
+
+Recommended: ```vllm==0.8.0, transformers==4.51.3```
+
 ```python
 from vllm import LLM, SamplingParams, BeamSearchParams
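The prompt format described in the notice above can be sketched as follows. This is a minimal illustration only: `build_prompt` is a hypothetical helper, not part of the official README, and it assumes a single-turn instruction with the target-language tag (e.g. `<de>`) appended at the end, as the notice requires.

```python
# Hypothetical helper illustrating the Seed-X prompt format:
# a single-turn instruction (no chat template) with the required
# target-language tag, e.g. <de> for German, appended at the end.
def build_prompt(source_text: str, target_language: str, tag: str) -> str:
    return (
        f"Translate the following English sentence into {target_language}:\n"
        f"{source_text} <{tag}>"
    )

prompt = build_prompt("May the force be with you", "German", "de")
print(prompt)
# Translate the following English sentence into German:
# May the force be with you <de>
```

The resulting string can then be passed directly to `model.generate` in the ```vllm``` example above, without applying any chat template.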