```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("SteveC/sdc_bot_15K")
model = AutoModelForCausalLM.from_pretrained("SteveC/sdc_bot_15K")
```
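Once the tokenizer and model are loaded, generating a reply follows the standard `transformers` pattern. A minimal sketch; the prompt text and sampling parameters here are illustrative assumptions, not settings from this model card:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("SteveC/sdc_bot_15K")
model = AutoModelForCausalLM.from_pretrained("SteveC/sdc_bot_15K")

# Encode an example prompt (illustrative, not from the model card)
inputs = tokenizer("Hello, how are you?", return_tensors="pt")

# Sample a continuation; these sampling settings are assumptions, tune to taste
output_ids = model.generate(
    **inputs,
    max_new_tokens=40,
    do_sample=True,
    top_p=0.9,
    pad_token_id=tokenizer.eos_token_id,
)

reply = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(reply)
```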
It's just a dialog bot trained on my Tweets. Unfortunately, since tweets aren't very conversational, its replies come off as pretty random.
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("text-generation", model="SteveC/sdc_bot_15K")
```
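Calling the pipeline object directly returns a list of generated completions. A short sketch; the prompt and `max_new_tokens` value are illustrative assumptions:

```python
from transformers import pipeline

pipe = pipeline("text-generation", model="SteveC/sdc_bot_15K")

# Prompt is illustrative; the bot was trained on tweets, so replies may be random
result = pipe("What did you do today?", max_new_tokens=30)
print(result[0]["generated_text"])
```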