# Micro Mistral

A small version of Mistral. Similar to some of the small Llama variants, but uses GQA, tied embeddings, and sliding window attention.

Load the model directly:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("Nbardy/micro-mistral")
model = AutoModelForCausalLM.from_pretrained("Nbardy/micro-mistral")
```
## Dataset

- Minipile
- Instruct Math
- OpenOrca
- Synthetic Data

TODO: Complete Dataset section
Use a pipeline as a high-level helper:

```python
from transformers import pipeline

pipe = pipeline("text-generation", model="Nbardy/micro-mistral")
```
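A sketch of invoking the pipeline once it is created; the prompt and generation arguments below are illustrative assumptions, not values from the model card:

```python
from transformers import pipeline

pipe = pipeline("text-generation", model="Nbardy/micro-mistral")

# max_new_tokens and do_sample are arbitrary example settings.
result = pipe("Once upon a time", max_new_tokens=32, do_sample=True)
print(result[0]["generated_text"])
```

The pipeline returns a list of dicts, one per generated sequence, each with a `"generated_text"` key.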