Instructions to use zai-org/glm-2b with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
  - Transformers
How to use zai-org/glm-2b with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("feature-extraction", model="zai-org/glm-2b", trust_remote_code=True)

# Load model directly
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("zai-org/glm-2b", trust_remote_code=True)
model = AutoModel.from_pretrained("zai-org/glm-2b", trust_remote_code=True)
```
- Notebooks
  - Google Colab
  - Kaggle
Commit History
- Fix batch beam search (774fda8), committed by duzx16
- add tag (6ed40fb), committed by duzx16
- init commit (9c579b8), committed by duzx16
- init commit (aa9679f), committed by duzx16
- initial commit (c15a605), committed by Zhengxiao Du