# whis-22/bankai-recipe
This is a fine-tuned version of the GPT-2 model for recipe generation.
## Model Details
- Model type: GPT-2
- Language(s) (NLP): English
- License: MIT
- Finetuned from: gpt2
## Intended Uses & Limitations
This model is intended for generating cooking recipes from a short text prompt (e.g. a recipe title). It was fine-tuned on a recipe dataset, so prompts outside the cooking domain may produce low-quality or incoherent output.
## How to Use

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("whis-22/bankai-recipe")
tokenizer = AutoTokenizer.from_pretrained("whis-22/bankai-recipe")

# Generate a recipe from a title-style prompt
input_prompt = "Chocolate Chip Cookies Recipe:"
input_ids = tokenizer.encode(input_prompt, return_tensors="pt")
output = model.generate(
    input_ids,
    max_length=300,
    num_return_sequences=1,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no pad token; reuse EOS to avoid a warning
)
generated_recipe = tokenizer.decode(output[0], skip_special_tokens=True)
print(generated_recipe)
```
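The decoded output arrives as one flat string. If the model emits section headers such as `Ingredients:` and `Directions:` (an assumption for illustration, not something this card guarantees), a small helper can split the text into parts for display:

```python
def split_recipe(text: str) -> dict:
    """Split a generated recipe string into title / ingredients / directions.

    Assumes (hypothetically; not guaranteed by the model card) that the
    generated text contains "Ingredients:" and "Directions:" markers.
    """
    ing = text.find("Ingredients:")
    dirs = text.find("Directions:")
    title = text[: ing if ing != -1 else len(text)].strip()
    ingredients = text[ing:dirs].strip() if ing != -1 and dirs != -1 else ""
    directions = text[dirs:].strip() if dirs != -1 else ""
    return {"title": title, "ingredients": ingredients, "directions": directions}

sample = (
    "Chocolate Chip Cookies Recipe: Ingredients: 2 cups flour, 1 cup sugar. "
    "Directions: Mix and bake at 350F."
)
parts = split_recipe(sample)
print(parts["title"])  # "Chocolate Chip Cookies Recipe:"
```

If the fine-tuned model uses different section markers, adjust the strings passed to `find` accordingly.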