Instructions for using zai-org/chatglm3-6b with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use zai-org/chatglm3-6b with Transformers:
```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("zai-org/chatglm3-6b", trust_remote_code=True, dtype="auto")
```
- Notebooks
- Google Colab
- Kaggle
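Beyond loading the bare model, ChatGLM3's remote code ships a custom `chat` helper on the model object for multi-turn conversation. A minimal sketch, assuming that helper and the standard `AutoTokenizer`/`AutoModel` entry points (the `load` function name is my own):

```python
# Sketch: load zai-org/chatglm3-6b for chat-style inference.
# The model's remote code (trust_remote_code=True) provides model.chat(...);
# the actual call requires downloading the ~6B weights, so it is shown
# commented out rather than executed here.
from transformers import AutoModel, AutoTokenizer

MODEL_ID = "zai-org/chatglm3-6b"

def load():
    # Both tokenizer and model need trust_remote_code, since ChatGLM3
    # defines custom classes outside the core transformers library.
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
    model = AutoModel.from_pretrained(MODEL_ID, trust_remote_code=True, dtype="auto")
    return tokenizer, model.eval()

# Usage (requires the weight download and, realistically, a GPU):
#   tokenizer, model = load()
#   response, history = model.chat(tokenizer, "Hello", history=[])
#   print(response)
```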
vllm for chatglm3 (discussion #65, opened by allenwang37)
ERROR - Error during request: Error code: 400 - {'object': 'error', 'message': 'As of transformers v4.44, default chat template is no longer allowed, so you must provide a chat template if the tokenizer does not define one.', 'type': 'BadRequestError', 'param': None, 'code': 400}