How to use zai-org/chatglm2-6b with Transformers:
```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("zai-org/chatglm2-6b", trust_remote_code=True, dtype="auto")
```
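A fuller sketch of loading and chatting with the model, assuming the `chat()` helper that ChatGLM2's custom modeling code exposes (downloading the multi-gigabyte checkpoint is required, so the heavy imports are kept inside the function):

```python
def load_chatglm2(model_id="zai-org/chatglm2-6b"):
    """Load the ChatGLM2 tokenizer and model.

    Requires `pip install transformers` and downloads the checkpoint
    on first call; trust_remote_code=True is needed because ChatGLM2
    ships its own modeling code rather than a built-in architecture.
    """
    from transformers import AutoTokenizer, AutoModel

    tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
    model = AutoModel.from_pretrained(
        model_id, trust_remote_code=True, dtype="auto"
    ).eval()
    return tokenizer, model


if __name__ == "__main__":
    tokenizer, model = load_chatglm2()
    # chat() is part of the model's remote code, not the core
    # transformers API; it returns the reply and updated history.
    response, history = model.chat(tokenizer, "Hello", history=[])
    print(response)
```

The `__main__` guard lets the module be imported without triggering the download.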
Failed to load model 'TheBloke • chatglm2 6B q4_0 ggml'. Error: llama.cpp: tensor 'tok_embeddings.weight' is missing from model

This failure is expected: llama.cpp loads LLaMA-family models and looks for LLaMA tensor names such as `tok_embeddings.weight`. ChatGLM2 is a GLM-architecture model, so its GGML conversion does not contain those tensors and cannot be loaded with llama.cpp.