How to use euclaise/Ferret-3B with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

# Ferret-3B is a causal (text-generation) model, so use the
# "text-generation" task rather than "question-answering"
pipe = pipeline("text-generation", model="euclaise/Ferret-3B", trust_remote_code=True)
```
```python
# Load model directly
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("euclaise/Ferret-3B", trust_remote_code=True, dtype="auto")
```
A 3B chain-of-thought (CoT) model that uses the ChatML prompt format. Use it however you want; consider it public domain.
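Since the model expects ChatML, here is a minimal sketch of building a ChatML prompt by hand. The `<|im_start|>`/`<|im_end|>` markers and role names follow the standard ChatML convention; the helper name and example message are illustrative, not part of this repo:

```python
# Build a ChatML-formatted prompt: each turn is wrapped in
# <|im_start|>role ... <|im_end|>, and the prompt ends with an
# opened assistant turn so the model continues from there.
def build_chatml_prompt(messages):
    parts = [
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n"
        for m in messages
    ]
    parts.append("<|im_start|>assistant\n")
    return "".join(parts)

prompt = build_chatml_prompt([
    {"role": "user", "content": "What is 2 + 2? Think step by step."},
])
print(prompt)
```

If the repo ships a chat template, `tokenizer.apply_chat_template(messages, add_generation_prompt=True)` will produce the equivalent string without hand-formatting.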