How to use zai-org/chatglm-6b with Transformers:

```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained(
    "zai-org/chatglm-6b",
    trust_remote_code=True,
    dtype="auto",
)
```
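The snippet above only loads the weights. A fuller sketch, based on the conversational `model.chat(tokenizer, query, history)` helper that ChatGLM-6B's custom modeling code attaches when loaded with `trust_remote_code=True`, might look like this. Note this is a sketch to adapt: the first call downloads roughly 13 GB of weights, and the `chat_once` helper name is mine, not part of any API.

```python
# Sketch: one chat turn with ChatGLM-6B (hypothetical helper, heavy download on first use).

def chat_once(query, history=None, model_id="zai-org/chatglm-6b"):
    """Load ChatGLM-6B and run a single chat turn.

    `model.chat` comes from the repo's custom modeling code, which is
    why trust_remote_code=True is required on both loads.
    """
    from transformers import AutoModel, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
    model = AutoModel.from_pretrained(
        model_id, trust_remote_code=True, dtype="auto"
    ).eval()
    response, history = model.chat(tokenizer, query, history=history or [])
    return response, history
```

Calling `chat_once("你好")` would return the model's reply plus the updated history, which you pass back in on the next turn to keep the conversation going.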
https://github.com/wangzhaode/ChatGLM-MNN/releases/tag/v0.6
tql ("太强了" — awesome)
How do I compile the model? And how do I download it? I'm getting no response at all...