How to use zai-org/chatglm3-6b-base with Transformers:
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained(
    "zai-org/chatglm3-6b-base",
    trust_remote_code=True,
    dtype="auto",
)
You are right, the output format is very strange.