How to use zai-org/chatglm3-6b-base with Transformers:
```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained(
    "zai-org/chatglm3-6b-base",
    trust_remote_code=True,
    dtype="auto",
)
```
When fine-tuning, I get the error `torchrun: not found`. What is the cause?