How to use zai-org/chatglm-6b with Transformers:
```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("zai-org/chatglm-6b", trust_remote_code=True, dtype="auto")
```
On Windows 10 with an RTX 3060, deploying ChatGLM-6B locally produced this error.
`pip install sat` also fails with an error.
Is my Python version too high? Quite a few libraries are still incompatible with Python 3.11.
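Python 3.11 incompatibility is plausible: the dependency stack commonly used with ChatGLM-6B at the time (e.g. the pinned `transformers` release in its repo) was typically run on Python 3.8–3.10. A minimal sketch for checking the interpreter before installing; the version bounds here are an assumption, adjust them to your own dependency set:

```python
import sys

# Assumed compatible range for the ChatGLM-6B era dependency stack
# (3.8 through 3.10); verify against the requirements of your pinned packages.
LOW, HIGH = (3, 8), (3, 10)

major_minor = sys.version_info[:2]
compatible = LOW <= major_minor <= HIGH
print(f"Python {sys.version.split()[0]}: "
      f"{'within assumed range' if compatible else 'consider a 3.8-3.10 environment'}")
```

If the check fails, creating a separate virtual environment with an older interpreter (e.g. via `conda create -n chatglm python=3.10` or a 3.10 install plus `python -m venv`) is the usual workaround.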