Instructions to use shareAI/Phi-3-mini-128k-instruct-Chinese with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use shareAI/Phi-3-mini-128k-instruct-Chinese with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="shareAI/Phi-3-mini-128k-instruct-Chinese")
```

```python
# Load the model directly
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("shareAI/Phi-3-mini-128k-instruct-Chinese", dtype="auto")
```
- Notebooks
- Google Colab
- Kaggle
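For manual generation without `pipeline`, prompts for this model need the Phi-3 instruct turn markers (`<|user|>`, `<|assistant|>`, `<|end|>`). The helper below is a minimal sketch of that formatting, assuming the model keeps the upstream Phi-3 chat template; in practice `tokenizer.apply_chat_template` on the loaded tokenizer builds this string for you.

```python
# Build a Phi-3-style chat prompt by hand (assumed format; normally
# produced by tokenizer.apply_chat_template on the model's tokenizer).
def build_phi3_prompt(messages):
    parts = []
    for msg in messages:
        # Each turn is rendered as: <|role|>\ncontent<|end|>\n
        parts.append(f"<|{msg['role']}|>\n{msg['content']}<|end|>\n")
    # A trailing assistant tag asks the model to generate the next reply.
    parts.append("<|assistant|>\n")
    return "".join(parts)

prompt = build_phi3_prompt([{"role": "user", "content": "你好，介绍一下你自己。"}])
print(prompt)
```

The resulting string can be tokenized and passed to `model.generate` as usual.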
Community discussions:
- #2 "Compared with the original model, the quality of Chinese has dropped significantly (same Q8_0 quantization)." (opened about 2 years ago by riceballl)
- #1 "GGUF model file" (opened about 2 years ago by BB8-dev)