# MinimalistRecipeTextGenerator

## Overview
This model is a fine-tuned version of the GPT-2 (small) language model, specifically trained to generate coherent and realistic short recipe texts. Given a prompt (e.g., "A quick chicken curry"), the model completes the text, often generating ingredient lists and basic instructions.
## Model Architecture
The model uses the standard GPT-2 language modeling architecture.
- Core: A 12-layer, 768-dimensional transformer decoder stack.
- Mechanism: It operates based on attention mechanisms, predicting the next token in a sequence given all previous tokens.
- Training: Fine-tuned on a dataset of simple, short recipes, enabling it to learn the structural patterns of recipes (Title -> Ingredients -> Instructions).
- Generation Parameters: The `config.json` sets default generation parameters for high-quality output:
  - `do_sample: true` (enables sampling for creative text generation)
  - `temperature: 0.7` (controls randomness)
  - `max_length: 256` (keeps recipes short and complete)
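Assuming the defaults listed above, the generation-related portion of `config.json` might look like the fragment below (the real file also contains many architecture fields such as `n_layer` and `n_embd`; only the generation keys are shown here):

```json
{
  "do_sample": true,
  "temperature": 0.7,
  "max_length": 256
}
```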
## Intended Use
This model is intended for creative and content generation purposes:
- Creative Writing/Blogging: Generating unique recipe ideas for food blogs or social media.
- Data Augmentation: Creating synthetic, but structurally correct, recipe texts for training other culinary-focused models.
- Demonstration: Serving as a basic example of fine-tuning GPT-2 on a domain-specific corpus.
## How to use

```python
from transformers import pipeline

generator = pipeline("text-generation", model="your_username/MinimalistRecipeTextGenerator")  # Replace with actual hub path

prompt = "Recipe for a refreshing summer salad:"
output = generator(prompt, max_length=150, num_return_sequences=1, temperature=0.8)
print(output[0]["generated_text"])
```
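The effect of the `temperature` argument can be sketched with a toy sampler in plain Python. This is an illustration of the general technique, not the actual `transformers` implementation: logits are divided by the temperature before the softmax, so values below 1.0 sharpen the distribution (more conservative output) and values above 1.0 flatten it (more random output).

```python
import math
import random


def sample_with_temperature(logits, temperature=0.7, rng=None):
    """Toy temperature sampling: scale logits, softmax, then draw one index."""
    rng = rng or random.Random(0)
    scaled = [l / temperature for l in logits]
    # numerically stable softmax over the scaled logits
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # draw one token index from the resulting distribution
    r = rng.random()
    cum = 0.0
    for i, p in enumerate(probs):
        cum += p
        if r <= cum:
            return i, probs
    return len(probs) - 1, probs


# Lower temperature concentrates probability mass on the highest logit.
logits = [2.0, 1.0, 0.1]
_, sharp = sample_with_temperature(logits, temperature=0.5)
_, flat = sample_with_temperature(logits, temperature=2.0)
print(sharp[0] > flat[0])  # the top token gets more mass at low temperature
```

This is why the model card's default of `temperature: 0.7` yields output that is varied but still mostly follows the recipe structure seen in training.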