liumaolin committed
Commit c531bfc · 1 Parent(s): bbf79f2

Update LLM model parameters: change top_p to 0.8 and add a max_tokens parameter set to 32768.

src/voice_dialogue/config/llm_config.py CHANGED
@@ -36,8 +36,9 @@ def get_llm_model_params() -> Dict[str, Any]:
     'n_gpu_layers': -1,
     'n_batch': 1024,
     'temperature': 0.7,
-    'top_p': 0.9,
+    'top_p': 0.8,
     'top_k': 20,
+    'max_tokens': 32768,
     'model_kwargs': {
         'mini_p': 0,
         'presence_penalty': 1.5
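For context, a minimal sketch of what `get_llm_model_params()` presumably returns after this commit is shown below. Only the keys visible in the hunk are included; any configuration earlier in the file (for example a model path or context size) is not part of this diff and is omitted, and the inline comments are interpretive rather than taken from the source.

```python
from typing import Any, Dict


def get_llm_model_params() -> Dict[str, Any]:
    """Sampling/runtime parameters for the local LLM (values as of commit c531bfc)."""
    return {
        'n_gpu_layers': -1,     # offload all layers to GPU (llama.cpp convention)
        'n_batch': 1024,        # prompt-processing batch size
        'temperature': 0.7,
        'top_p': 0.8,           # lowered from 0.9 in this commit
        'top_k': 20,
        'max_tokens': 32768,    # newly added: cap on generated tokens
        'model_kwargs': {
            'mini_p': 0,
            'presence_penalty': 1.5,
        },
    }
```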