```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("athirdpath/BigMistral-11b")
model = AutoModelForCausalLM.from_pretrained("athirdpath/BigMistral-11b")
```
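Once loaded, the model generates like any other causal LM in transformers. The snippet below is a minimal sketch; the prompt and sampling settings are placeholders, and since this is an untuned base model the output is raw continuation rather than chat-style responses.

```python
import torch

prompt = "The merge produced a model that"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Sampling settings chosen for illustration only
with torch.no_grad():
    output_ids = model.generate(
        **inputs,
        max_new_tokens=64,
        do_sample=True,
        temperature=0.8,
        top_p=0.95,
    )

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```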
EDIT: Base Mistral, but with some minor head trauma. Promising in theory, but it needs finetuning, and for now it only outperforms the 14b in size.
An 11b Mistral base model, based on the NeverSleep recipe.
Recipe
```yaml
slices:
  - sources:
      - model: mistralai/Mistral-7B-v0.1
        layer_range: [0, 24]
  - sources:
      - model: mistralai/Mistral-7B-v0.1
        layer_range: [8, 32]
merge_method: passthrough
dtype: bfloat16
```
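The passthrough merge simply stacks the two slices in order, so a middle block of the base model appears twice in the result. A minimal sketch of the layer arithmetic (assuming Mistral-7B-v0.1's 32 decoder layers and mergekit's half-open `layer_range` convention):

```python
# Layer stacking implied by the recipe above.
# Assumes layer_range: [a, b] selects layers a..b-1 of the base model.
slice_a = list(range(0, 24))   # layers 0-23 of Mistral-7B-v0.1
slice_b = list(range(8, 32))   # layers 8-31 of the same model

stacked = slice_a + slice_b    # passthrough: concatenate the slices in order
print(len(stacked))            # 48 decoder layers in the merge (vs. 32 in the base)
print(sorted(set(slice_a) & set(slice_b)))  # layers 8-23 are duplicated
```

Going from 32 to 48 layers is what pushes the 7b base up to roughly 11b parameters.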
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="athirdpath/BigMistral-11b")
```