Tags: Text Generation · Transformers · PyTorch · Safetensors · gpt_neox · alpaca · instruction · pythia · text-generation-inference
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("GerbilLab/IPythia-70m")
model = AutoModelForCausalLM.from_pretrained("GerbilLab/IPythia-70m")
```
All IPythia models were trained on an internal GerbilLab high-quality instruction dataset of ~75k instructions for 3 epochs. The prompt format is:
```
Instruction: [instruction goes here]
Input: [input goes here]
Output: [output will be generated here]
```

or

```
Instruction: [instruction goes here]
Output: [output will be generated here]
```
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="GerbilLab/IPythia-70m")
```