```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("breadlicker45/gpt-ya")
model = AutoModelForCausalLM.from_pretrained("breadlicker45/gpt-ya")
```
gpt-ya

"Ya" stands for Yahoo Answers, the dataset this model was trained on.

This model was trained on CPUs, so keep your expectations for its quality very, very low.
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="breadlicker45/gpt-ya")
```
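Once the pipeline is created, generating text is a single call. A minimal sketch follows; the prompt and the generation parameters (`max_new_tokens`, `do_sample`) are illustrative choices, not settings recommended by the model author.

```python
from transformers import pipeline

# Build the text-generation pipeline for this model.
pipe = pipeline("text-generation", model="breadlicker45/gpt-ya")

# Example prompt and generation settings (illustrative values).
out = pipe(
    "What is the best way to learn Python?",
    max_new_tokens=30,   # cap the length of the continuation
    do_sample=False,     # greedy decoding for reproducible output
)

# The pipeline returns a list of dicts, each with a "generated_text" key
# containing the prompt plus the model's continuation.
print(out[0]["generated_text"])
```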