Basic Text Generator
Overview
This is a fine-tuned GPT-2 model for general text generation. It can continue prompts, generate stories, or produce coherent paragraphs from input text. It was trained on a diverse corpus for broad applicability.
Model Architecture
- Base Model: GPT-2
- Layers: 12
- Hidden Size: 768
- Attention Heads: 12
- Context Window: 1024 tokens
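The numbers above pin down the model size. A back-of-the-envelope sketch of the parameter count, assuming the standard GPT-2 small shapes (the vocabulary size of 50,257 is the stock GPT-2 value and is not stated in this card):

```python
# Estimate the parameter count from the listed architecture.
# vocab size 50257 is the standard GPT-2 value (assumed, not in this card).
vocab, d, layers, ctx = 50257, 768, 12, 1024

token_emb = vocab * d          # token embedding matrix
pos_emb = ctx * d              # learned position embeddings
per_layer = (
    d * 3 * d + 3 * d          # fused QKV projection (weights + bias)
    + d * d + d                # attention output projection
    + d * 4 * d + 4 * d        # MLP up-projection
    + 4 * d * d + d            # MLP down-projection
    + 4 * d                    # two LayerNorms (scale + shift each)
)
final_ln = 2 * d
total = token_emb + pos_emb + layers * per_layer + final_ln
print(f"{total:,} parameters")  # 124,439,808 — the familiar ~124M of GPT-2 small
```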
Intended Use
Suitable for creative writing, content generation, or prototyping language-based applications.
Limitations
- May generate biased or inappropriate content, reflecting biases in its training data.
- Outputs can be repetitive or nonsensical for long generations.
- Not optimized for specific domains like code or math.
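The repetition issue noted above is commonly mitigated by forbidding n-grams that have already appeared, the idea behind the `no_repeat_ngram_size` generation parameter. A minimal pure-Python sketch of that check (the function name and token IDs are illustrative):

```python
def banned_next_tokens(history, n=3):
    """Return token IDs that would complete an n-gram already in `history`."""
    if len(history) < n - 1:
        return set()
    prefix = tuple(history[-(n - 1):])  # the n-1 most recent tokens
    banned = set()
    for i in range(len(history) - n + 1):
        if tuple(history[i:i + n - 1]) == prefix:
            banned.add(history[i + n - 1])  # completing this n-gram repeats it
    return banned

# The last two tokens (10, 20) already occurred at the start followed by 30,
# so emitting 30 again would repeat the trigram (10, 20, 30) and is banned.
history = [10, 20, 30, 40, 10, 20]   # illustrative token IDs
print(banned_next_tokens(history))   # {30}
```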
Example Code
```python
from transformers import pipeline

# Model ID as published on the Hub.
generator = pipeline("text-generation", model="Shoriful025/basic_text_generator")
result = generator("Once upon a time,", max_length=50)
print(result[0]["generated_text"])
# Example output (sampling is stochastic, so yours will differ):
# "Once upon a time, in a land far away..."
```
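Generation quality can be tuned through sampling parameters such as `top_p`. A sketch of the nucleus (top-p) filtering step in pure Python, independent of the model (the function name and toy distribution are illustrative):

```python
def top_p_filter(probs, p=0.9):
    """Keep the smallest set of highest-probability entries whose cumulative
    mass reaches p, renormalize them, and zero out everything else."""
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)
    kept, mass = set(), 0.0
    for i in order:
        kept.add(i)
        mass += probs[i]
        if mass >= p:          # nucleus reached: stop adding tokens
            break
    total = sum(probs[i] for i in kept)
    return [probs[i] / total if i in kept else 0.0 for i in range(len(probs))]

# Toy four-token distribution: top-p = 0.8 keeps only the first two tokens.
print(top_p_filter([0.5, 0.3, 0.1, 0.1], p=0.8))  # ≈ [0.625, 0.375, 0.0, 0.0]
```

The next token is then sampled from the renormalized distribution, which trims the long tail of unlikely tokens that causes incoherent output.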
Dataset used to train Shoriful025/basic_text_generator
Evaluation Results
- Perplexity on BookCorpus (self-reported): 25.300
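Perplexity is the exponential of the mean per-token cross-entropy loss, so the reported 25.300 corresponds to an average loss of roughly 3.23 nats per token. A minimal sketch of the relationship (the loss values are illustrative):

```python
import math

def perplexity(token_log_probs):
    """exp of the mean negative log-likelihood over tokens (natural log)."""
    nll = -sum(token_log_probs) / len(token_log_probs)
    return math.exp(nll)

# A mean loss of ~3.231 nats/token matches the reported perplexity.
print(math.exp(3.231))                       # ≈ 25.3
print(perplexity([-3.231, -3.231, -3.231]))  # ≈ 25.3
```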