Athspi LLM

🧠 A small but capable language model for creative story generation, trained on the TinyStories dataset.

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("Athagi/Gg")
model = AutoModelForCausalLM.from_pretrained("Athagi/Gg")
```
Model Details
Architecture
- Model Type: Transformer-based language model
- Layers: 4
- Embedding Dim: 384
- Heads: 6
- Sequence Length: 128 tokens
- Parameters: ~28M
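As a sanity check, the configuration above can be turned into a rough parameter count. This sketch assumes a GPT-2-style transformer (4× attention projections, 4× MLP expansion) with tied input/output embeddings and a GPT-2-sized vocabulary of 50,257 tokens; the vocabulary size is an assumption, since the card does not state it.

```python
# Rough parameter count for a GPT-2-style transformer with the
# configuration listed above. The vocabulary size (50257) is an
# assumption, not stated in this model card.
n_layer, d_model, n_ctx, vocab = 4, 384, 128, 50257

embed = vocab * d_model + n_ctx * d_model   # token + position embeddings
attn = 4 * d_model * d_model                # q, k, v, and output projections
mlp = 2 * d_model * (4 * d_model)           # up- and down-projections
total = embed + n_layer * (attn + mlp)
print(f"~{total / 1e6:.1f}M parameters")    # ~26.4M parameters
```

This lands in the same range as the card's ~28M figure; the exact number depends on the true vocabulary size, layer norms and biases, and whether the output head is tied to the embeddings.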
Training Data
- Dataset: TinyStories
- Training Coverage: 5% of dataset (~100k samples)
Usage
Installation
```shell
pip install torch transformers sentencepiece
```

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="Athagi/Gg")
result = pipe("Once upon a time", max_new_tokens=64)
print(result[0]["generated_text"])
```