Instructions to use zai-org/chatglm2-6b with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use zai-org/chatglm2-6b with Transformers:
```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("zai-org/chatglm2-6b", trust_remote_code=True, dtype="auto")
```
- Notebooks
- Google Colab
- Kaggle
Triton model configuration needed
#55
by wangruiai2023 - opened
I have already converted the repository into a PyTorch (.pt) model file. To deploy it with Triton, a config.pbtxt is needed, but I don't know how to determine the input parameters, or which backend is recommended.
I determined the output via the tokenizer.
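For reference, a minimal sketch of what such a `config.pbtxt` could look like, assuming the Python backend with a single string prompt in and a single string response out. The tensor names, shapes, batch size, and backend choice here are illustrative assumptions, not values taken from the converted repository; they must be adjusted to match the actual exported model.

```
# Illustrative config.pbtxt sketch — all names and dims are assumptions
name: "chatglm2-6b"
backend: "python"        # assumed; a TorchScript .pt file could instead use the "pytorch" backend
max_batch_size: 8

input [
  {
    name: "prompt"       # hypothetical input tensor name
    data_type: TYPE_STRING
    dims: [ 1 ]
  }
]
output [
  {
    name: "response"     # hypothetical output tensor name
    data_type: TYPE_STRING
    dims: [ 1 ]
  }
]

instance_group [
  { kind: KIND_GPU, count: 1 }
]
```

With the Python backend, the input/output names and dtypes only need to agree with what the backend's `model.py` reads and writes; with the PyTorch backend, they must instead match the traced model's tensor signature.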
wangruiai2023 changed discussion status to closed