How to use zai-org/chatglm-6b with Transformers:
```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained(
    "zai-org/chatglm-6b",
    trust_remote_code=True,
    dtype="auto",
)
```
[Fix] Fixed an issue where the model did not truncate `finished_sequences` when a batch contains multiple inputs whose responses finish at different lengths.
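The intent of that fix can be sketched in plain Python (hypothetical names such as `step_generate` and `finished`; this is an illustration of masking finished sequences during batched generation, not the model's actual code):

```python
EOS = 2  # assumed end-of-sequence token id for this sketch

def step_generate(sequences, next_tokens, finished):
    # Append the next token only to sequences that have not yet emitted EOS;
    # finished sequences must be excluded from further updates.
    for i, tok in enumerate(next_tokens):
        if finished[i]:
            continue
        sequences[i].append(tok)
        if tok == EOS:
            finished[i] = True
    return sequences, finished

seqs = [[1], [1]]
finished = [False, False]
seqs, finished = step_generate(seqs, [5, EOS], finished)  # second sequence finishes
seqs, finished = step_generate(seqs, [7, 9], finished)    # only the first keeps growing
# seqs == [[1, 5, 7], [1, 2]]
```

Without the `finished` check, the shorter response would keep accumulating tokens past its EOS, which is the class of bug the changelog entry describes.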