```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("rexwang8/py125")
model = AutoModelForCausalLM.from_pretrained("rexwang8/py125")
```
Test repo for Pythia 125 with additional files from NeoX-20B, to try to get it to work.
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="rexwang8/py125")
```