Instructions for using openbmb/Eurus-RM-7b with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
- Transformers
How to use openbmb/Eurus-RM-7b with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-classification", model="openbmb/Eurus-RM-7b", trust_remote_code=True)

# Load model directly
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("openbmb/Eurus-RM-7b", trust_remote_code=True)
model = AutoModel.from_pretrained("openbmb/Eurus-RM-7b", trust_remote_code=True)
```
- Notebooks
- Google Colab
- Kaggle
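As a concrete illustration of what a reward model is typically used for, here is a best-of-n sketch: score several candidate responses to the same prompt and keep the highest-scoring one. This is a hedged sketch, not the model's documented API — the `[INST] ... [/INST]` prompt format and the pipeline's output shape are assumptions; `pick_best` itself is plain Python.

```python
# Best-of-n selection sketch: score candidates with a reward model and
# keep the best one. Only the model call is illustrative; pick_best is
# ordinary argmax selection.
from typing import Sequence


def pick_best(candidates: Sequence[str], scores: Sequence[float]) -> str:
    """Return the candidate with the highest reward score."""
    if len(candidates) != len(scores):
        raise ValueError("candidates and scores must have the same length")
    best_index = max(range(len(scores)), key=lambda i: scores[i])
    return candidates[best_index]


if __name__ == "__main__":
    from transformers import pipeline

    # trust_remote_code=True loads the model's custom scoring head.
    pipe = pipeline("text-classification",
                    model="openbmb/Eurus-RM-7b",
                    trust_remote_code=True)
    prompt = "[INST] What is 2 + 2? [/INST] "  # assumed prompt format
    candidates = ["4", "5"]
    # Output format assumed: one dict with a "score" field per input.
    scores = [pipe(prompt + c)[0]["score"] for c in candidates]
    print(pick_best(candidates, scores))
```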
Community discussions:
- How to use with batch size > 1? (#9, opened over 1 year ago by RylanSchaeffer)
- Using Mistral's chat_template produces different text than the demo. (#8, opened almost 2 years ago by EutronH)
- Model does not produce correct output with trust_remote_code=False (#7, opened almost 2 years ago by AIR-hl)
- Multi-turn input template (#6, opened almost 2 years ago by linzeqipku)
- Wrong Output in Usage Example Code? (#5, opened about 2 years ago by hmomin)
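Discussion #9 above asks about batch size > 1. A hedged sketch of one way to do it, assuming the tokenizer supports padding and the model's custom reward head returns one value per sequence: split the inputs into batches with a plain-Python helper, then tokenize each batch with `padding=True`. The model-dependent part is kept under `__main__` because the exact output format of the remote-code head is an assumption here.

```python
# Batched reward-scoring sketch. chunk() is plain Python; the model
# calls are illustrative and their output format is assumed.
from typing import List, Sequence


def chunk(items: Sequence[str], batch_size: int) -> List[Sequence[str]]:
    """Split items into consecutive batches of at most batch_size."""
    if batch_size < 1:
        raise ValueError("batch_size must be >= 1")
    return [items[i:i + batch_size] for i in range(0, len(items), batch_size)]


if __name__ == "__main__":
    import torch
    from transformers import AutoTokenizer, AutoModel

    tokenizer = AutoTokenizer.from_pretrained("openbmb/Eurus-RM-7b",
                                              trust_remote_code=True)
    model = AutoModel.from_pretrained("openbmb/Eurus-RM-7b",
                                      trust_remote_code=True)
    texts = ["[INST] Hi [/INST] Hello!",    # assumed prompt format
             "[INST] Hi [/INST] Go away."]
    for batch in chunk(texts, batch_size=2):
        # padding=True pads sequences to a common length within the batch;
        # the reward head is assumed to return one scalar per sequence.
        inputs = tokenizer(list(batch), padding=True, return_tensors="pt")
        with torch.no_grad():
            rewards = model(**inputs)
        print(rewards)
```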