---
license: apache-2.0
---
# CNTDAI-6B

## Model Description

CNTDAI-6B is an experimental Chinese-English LLM fine-tuned by the Community AI Model Group as a proof of concept (POC) to meet internal company requirements. It is fine-tuned from the GLM Transformer model using more diverse training data, more training steps, and a more carefully designed training strategy, and is optimized on datasets covering semantics, mathematics, reasoning, code, and knowledge.

## Usage

```python
import os
import platform

import torch
from transformers import AutoTokenizer, AutoModel

model_path = "cntd/CNTDAI-6B"  # local path or Hugging Face repo id

print("CUDA available:", torch.cuda.is_available())       # check whether a GPU is usable
print("GPU count:", torch.cuda.device_count())            # number of visible GPUs
print("CUDA version (torch):", torch.version.cuda)        # CUDA version torch was built with
print("Current GPU index:", torch.cuda.current_device())  # index of the active GPU

tokenizer = AutoTokenizer.from_pretrained(model_path, trust_remote_code=True)
model = AutoModel.from_pretrained(model_path, trust_remote_code=True).half().cuda()
model = model.eval()

os_name = platform.system()
clear_command = 'cls' if os_name == 'Windows' else 'clear'
stop_stream = False

WELCOME = ("Welcome to the CNTDAI-6B model. Type your message to chat, "
           "'clear' to clear the history, 'stop' to exit.")


def build_prompt(history):
    prompt = WELCOME
    for query, response in history:
        prompt += f"\n\nUser: {query}"
        prompt += f"\n\nCNTDAI-6B: {response}"
    return prompt


def main():
    past_key_values, history = None, []
    global stop_stream
    print(WELCOME)
    while True:
        query = input("\nUser: ")
        if query.strip() == "stop":
            break
        if query.strip() == "clear":
            past_key_values, history = None, []
            os.system(clear_command)
            print(WELCOME)
            continue
        print("\nCNTDAI: ", end="")
        current_length = 0
        # Stream the response and print only the newly generated suffix.
        for response, history, past_key_values in model.stream_chat(
                tokenizer, query, history=history,
                past_key_values=past_key_values,
                return_past_key_values=True):
            if stop_stream:
                stop_stream = False
                break
            print(response[current_length:], end="", flush=True)
            current_length = len(response)
        print("")


if __name__ == "__main__":
    main()
```
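The conversation format produced by `build_prompt` in the script above can be sanity-checked without loading the model or a GPU; a minimal standalone sketch (the banner text here is an illustrative placeholder, not the exact string used by the model):

```python
# Standalone sketch of the prompt formatting used in the chat script above.
# No model download is required; WELCOME is an illustrative placeholder.
WELCOME = "Welcome to the CNTDAI-6B model."


def build_prompt(history):
    # Concatenate each (query, response) turn onto the welcome banner.
    prompt = WELCOME
    for query, response in history:
        prompt += f"\n\nUser: {query}"
        prompt += f"\n\nCNTDAI-6B: {response}"
    return prompt


history = [("Hello", "Hi, how can I help you?")]
print(build_prompt(history))
```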