How to use zai-org/chatglm2-6b with Transformers:
```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained(
    "zai-org/chatglm2-6b",
    trust_remote_code=True,
    dtype="auto",
)
```
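Beyond loading, inference typically goes through the tokenizer and the model's custom `chat` method (exposed via `trust_remote_code`). A minimal sketch, with two assumptions to flag: `build_prompt` is an illustrative helper mirroring the round-based prompt layout ChatGLM2's remote code constructs internally, and the heavy download and generation are guarded under `__main__` (with a lazy import) so the helper can be read and tested without the weights or `transformers` installed:

```python
def build_prompt(query, history=None):
    """Illustrative sketch of ChatGLM2's round-based prompt layout.

    history is a list of (query, response) pairs from earlier turns.
    """
    history = history or []
    prompt = ""
    for i, (old_query, response) in enumerate(history):
        prompt += "[Round {}]\n\n问：{}\n\n答：{}\n\n".format(i + 1, old_query, response)
    prompt += "[Round {}]\n\n问：{}\n\n答：".format(len(history) + 1, query)
    return prompt


if __name__ == "__main__":
    # Imported lazily: the first run downloads ~12 GB of weights and
    # generation is only practical on a GPU.
    from transformers import AutoModel, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(
        "zai-org/chatglm2-6b", trust_remote_code=True
    )
    model = AutoModel.from_pretrained(
        "zai-org/chatglm2-6b", trust_remote_code=True, dtype="auto"
    ).eval()
    # chat() is a convenience method supplied by the model's remote code;
    # it handles prompt construction and history threading itself.
    response, history = model.chat(tokenizer, "Hello", history=[])
    print(response)
```

In practice you would call `model.chat(...)` directly and let the remote code build the prompt; the helper above only shows what that formatting looks like.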
Please also release a GGML version of this model that is compatible with llama.cpp. Many users would use it. In the meantime, this project supports ChatGLM2 in GGML format:
https://github.com/li-plus/chatglm.cpp