How to use L1-m1ng/chatglm2 with Transformers:
```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained(
    "L1-m1ng/chatglm2",
    trust_remote_code=True,
    dtype="auto",
)
```
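For an end-to-end chat call, the tokenizer has to be loaded alongside the model. The sketch below assumes `L1-m1ng/chatglm2` mirrors the upstream ChatGLM2-6B remote code, which exposes a custom `chat` method on the model object; the `chat_once` wrapper is a hypothetical helper, not part of the repo.

```python
# Hedged sketch: one-shot chat, assuming the repo ships ChatGLM2-style
# remote code (a `chat` method on the model). Not verified against this repo.
def chat_once(query, repo_id="L1-m1ng/chatglm2"):
    # Imports deferred so the file imports cleanly without transformers installed.
    from transformers import AutoModel, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
    model = AutoModel.from_pretrained(repo_id, trust_remote_code=True, dtype="auto")
    model = model.eval()

    # `chat` is provided by the repo's remote code, not by Transformers itself.
    response, history = model.chat(tokenizer, query, history=[])
    return response
```

Because `trust_remote_code=True` executes code downloaded from the repository, inspect the repo's modeling files before running this against an untrusted model.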