AI Chef

A LoRA adapter for mistralai/Mistral-7B-v0.1, fine-tuned for culinary and nutrition-aware recipe generation.

Given a list of available ingredients and dietary restrictions, the model generates a complete structured recipe including nutritional information (calories, protein, carbs, fat) and step-by-step instructions.

Model Details

| Field | Value |
|---|---|
| Base model | mistralai/Mistral-7B-v0.1 |
| Fine-tuning technique | QLoRA (4-bit NF4 + LoRA) |
| LoRA rank (r) | 16 |
| LoRA alpha | 16 |
| Target modules | q_proj, k_proj, v_proj, o_proj, gate_proj, up_proj, down_proj |
| Trainable parameters | 41,943,040 (0.58% of total) |
| Training examples | 3,000 |
| Epochs | 1 |
| Final eval loss | 0.454 |
| Training tool | Unsloth on a Google Colab T4 |
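As a sanity check, the trainable-parameter figure in the table follows directly from the LoRA rank and the standard Mistral-7B-v0.1 layer shapes (hidden size 4096, 32 layers, grouped-query attention with 8 KV heads of size 128, MLP intermediate size 14336): a LoRA pair on a `d_in × d_out` matrix adds `r · (d_in + d_out)` parameters. The short sketch below recomputes the total under those assumptions:

```python
# Recompute the trainable-parameter count from standard Mistral-7B-v0.1
# dimensions. Each LoRA pair (A: d_in x r, B: r x d_out) adds
# r * (d_in + d_out) parameters per adapted matrix.
r = 16
hidden, n_layers = 4096, 32
kv_dim = 8 * 128          # grouped-query attention: k_proj/v_proj output dim
intermediate = 14336

shapes = {
    "q_proj": (hidden, hidden),
    "k_proj": (hidden, kv_dim),
    "v_proj": (hidden, kv_dim),
    "o_proj": (hidden, hidden),
    "gate_proj": (hidden, intermediate),
    "up_proj": (hidden, intermediate),
    "down_proj": (intermediate, hidden),
}

per_layer = sum(r * (d_in + d_out) for d_in, d_out in shapes.values())
total = per_layer * n_layers
print(total)  # 41943040 — matches the table
```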

Training Dataset

  • Dataset: Shengtao/recipe
  • 24,970 recipes after filtering (3,000 used for training)
  • Dietary labels (Vegan, Vegetarian, Gluten-Free) inferred heuristically
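The exact heuristics used to infer the dietary labels are not published in this card. A minimal sketch of how keyword-based labeling might work (the function name and all keyword lists here are illustrative, not the real rules):

```python
# Illustrative sketch only: the actual labeling heuristics for the dataset
# are not published. Keyword lists are examples, not the real rules.
ANIMAL = {"chicken", "beef", "pork", "fish", "gelatin"}
ANIMAL_BYPRODUCT = {"egg", "eggs", "milk", "butter", "cheese", "honey"}
GLUTEN = {"flour", "wheat", "barley", "rye", "pasta", "bread"}

def infer_labels(ingredients):
    """Infer coarse dietary labels from a list of ingredient names."""
    words = {w for ing in ingredients for w in ing.lower().split()}
    labels = []
    if not words & (ANIMAL | ANIMAL_BYPRODUCT):
        labels.append("Vegan")
    if not words & ANIMAL:
        labels.append("Vegetarian")
    if not words & GLUTEN:
        labels.append("Gluten-Free")
    return labels

print(infer_labels(["eggs", "tomato", "onion", "olive oil"]))
# ['Vegetarian', 'Gluten-Free']
```

Keyword matching like this is exactly why the card flags the labels as heuristic: compound or ambiguous ingredients (e.g. "stock", "sauce") slip through.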

Usage

from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

# Load base model + adapter
tokenizer = AutoTokenizer.from_pretrained("paulaschez/Mistral-7B-AI-Chef")
model = AutoModelForCausalLM.from_pretrained(
    "mistralai/Mistral-7B-v0.1",
    torch_dtype=torch.float16,
    device_map="auto"
)
model = PeftModel.from_pretrained(model, "paulaschez/Mistral-7B-AI-Chef")

# Example prompt
prompt = """[INST] You are AI Chef, an advanced culinary and nutrition assistant \
for SmartKitchen Solutions. Given a list of available ingredients and dietary \
restrictions, generate a complete, structured recipe with nutritional information.

Ingredients: eggs, tomato, onion, olive oil
Dietary Restrictions: no specific restrictions [/INST]"""

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=400, do_sample=True, temperature=0.3)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
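To query the model with other ingredient lists, the prompt just needs to follow the same `[INST] … [/INST]` format as the example above. A small hypothetical helper (not part of the released code) that assembles it:

```python
# Hypothetical convenience helper: builds a prompt in the same [INST] format
# as the example above. Not part of the released adapter code.
SYSTEM = (
    "You are AI Chef, an advanced culinary and nutrition assistant "
    "for SmartKitchen Solutions. Given a list of available ingredients "
    "and dietary restrictions, generate a complete, structured recipe "
    "with nutritional information."
)

def build_prompt(ingredients, restrictions="no specific restrictions"):
    return (
        f"[INST] {SYSTEM}\n\n"
        f"Ingredients: {', '.join(ingredients)}\n"
        f"Dietary Restrictions: {restrictions} [/INST]"
    )

prompt = build_prompt(["tofu", "broccoli", "rice"], "Vegan")
```

The resulting string can be passed to `tokenizer(...)` exactly as in the usage example.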

Limitations

  • Dietary labels are inferred heuristically and may be inaccurate for complex or ambiguous ingredients
  • Trained on 3,000 examples (subset of available data) due to compute constraints
  • English only