Instructions for using zai-org/chatglm2-6b with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use zai-org/chatglm2-6b with Transformers (a fuller chat sketch follows the notebook list below):

```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained(
    "zai-org/chatglm2-6b",
    trust_remote_code=True,
    dtype="auto",
)
```

- Notebooks
- Google Colab
- Kaggle
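The snippet above only instantiates the model. A minimal end-to-end sketch, assuming the `chat()` helper provided by the repository's remote code (as shown in the ChatGLM2-6B model card), could look like this:

```python
# Minimal chat sketch; trust_remote_code=True is required because the
# tokenizer and modeling code live in the repository itself.
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("zai-org/chatglm2-6b", trust_remote_code=True)
model = AutoModel.from_pretrained("zai-org/chatglm2-6b", trust_remote_code=True, dtype="auto")
# On a GPU, the model card loads the model as .half().cuda() instead.
model = model.eval()

# chat() returns the reply text and the updated conversation history.
response, history = model.chat(tokenizer, "你好", history=[])
print(response)
```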
Local loading fails?
#39
by enozhu - opened
```
Traceback (most recent call last):
  File "test.py", line 3, in <module>
    tokenizer = AutoTokenizer.from_pretrained("ChatGLM2-6B-file", trust_remote_code=True)
  File "/home/zhuxiaolin/tool/miniconda3/envs/ChatGLM2-6B_py3.8/lib/python3.8/site-packages/transformers/models/auto/tokenization_auto.py", line 719, in from_pretrained
    raise ValueError(
ValueError: Unrecognized configuration class <class 'transformers_modules.ChatGLM2-6B-file.configuration_chatglm.ChatGLMConfig'> to build an AutoTokenizer.
```
I manually downloaded the model and configuration files and put them in the ChatGLM2-6B-file/ directory. Doesn't loading them with tokenizer = AutoTokenizer.from_pretrained("ChatGLM2-6B-file", trust_remote_code=True) work? Do the files have to be under the .cache directory?
Why don't you use an absolute path?
An absolute path doesn't work either; same error. It only loads when the files sit under the corresponding snapshots directory in the cache.
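One possible cause, offered here as an assumption, is an incomplete or outdated manual download: this repository ships custom Python files (configuration_chatglm.py, tokenization_chatglm.py, modeling_chatglm.py) and a tokenizer_config.json whose auto_map entry trust_remote_code relies on, and if any of these are missing or stale, AutoTokenizer cannot resolve the custom config class. A hedged workaround sketch is to fetch the complete snapshot with huggingface_hub and load from the returned path (the local_dir value below is illustrative):

```python
# Fetch the complete repository snapshot, including tokenizer_config.json
# and the custom *.py files that trust_remote_code needs, instead of
# downloading individual files by hand.
from huggingface_hub import snapshot_download
from transformers import AutoModel, AutoTokenizer

local_path = snapshot_download(
    repo_id="zai-org/chatglm2-6b",
    local_dir="ChatGLM2-6B-file",  # assumed target directory; any writable path works
)

# Load from the absolute path returned by snapshot_download.
tokenizer = AutoTokenizer.from_pretrained(local_path, trust_remote_code=True)
model = AutoModel.from_pretrained(local_path, trust_remote_code=True)
```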