How to use shareAI/CodeLLaMA-chat-13b-Chinese with Transformers:
Use a pipeline as a high-level helper. Note that this is a causal chat model, so the appropriate pipeline task is `"text-generation"`, not `"question-answering"`:

```python
from transformers import pipeline

pipe = pipeline("text-generation", model="shareAI/CodeLLaMA-chat-13b-Chinese")
```
Or load the model and tokenizer directly. For text generation, use `AutoModelForCausalLM` rather than the bare `AutoModel`, which returns the model without its language-modeling head:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("shareAI/CodeLLaMA-chat-13b-Chinese")
model = AutoModelForCausalLM.from_pretrained(
    "shareAI/CodeLLaMA-chat-13b-Chinese", dtype="auto"
)
```
Hoping for a 7b Chinese code llama release.