# Load model directly

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("BigSalmon/Backwards")
model = AutoModelForCausalLM.from_pretrained("BigSalmon/Backwards")
```
Trained on the same dataset as https://huggingface.co/BigSalmon/InformalToFormalLincoln73Paraphrase, but with all the words in reverse order.
- Note: I probably should have trained it for more epochs, as the loss was still very high. That said, keep an eye on my profile if this is something you are interested in.
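Since the training data has every sentence's words in reverse order, one plausible way to use the model (an assumption, not a documented prompt format) is to reverse the word order of the prompt before generation and reverse the output back. A minimal sketch of the word-reversal helper:

```python
def reverse_words(text: str) -> str:
    """Reverse the order of whitespace-separated words in a string."""
    return " ".join(reversed(text.split()))

# Hypothetical usage: reverse the prompt, generate, then un-reverse the output.
prompt = reverse_words("informal english: i am hungry")
print(prompt)  # "hungry am i english: informal"
# output = pipe(prompt)[0]["generated_text"]  # `pipe` from the snippet below
# print(reverse_words(output))
```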
# Use a pipeline as a high-level helper

```python
from transformers import pipeline

pipe = pipeline("text-generation", model="BigSalmon/Backwards")
```