How to use tuhanasinan/Conclusion-generator with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("tuhanasinan/Conclusion-generator")
model = AutoModelForSeq2SeqLM.from_pretrained("tuhanasinan/Conclusion-generator")
```
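Loading the tokenizer and model is only half the job; you still need to run generation to get a conclusion out. A minimal inference sketch, assuming the checkpoint is a standard encoder-decoder (seq2seq) model; the input text and the generation parameters (`max_new_tokens`, `num_beams`) are illustrative, not prescribed by the model card:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("tuhanasinan/Conclusion-generator")
model = AutoModelForSeq2SeqLM.from_pretrained("tuhanasinan/Conclusion-generator")

# Illustrative source text to summarize into a conclusion
text = "The study compared three training strategies on the same benchmark and measured accuracy and runtime for each."

# Tokenize, generate, and decode the model's output
inputs = tokenizer(text, return_tensors="pt", truncation=True)
output_ids = model.generate(**inputs, max_new_tokens=128, num_beams=4)
conclusion = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(conclusion)
```

If you prefer a one-liner, the same checkpoint should also work through `pipeline("text2text-generation", model="tuhanasinan/Conclusion-generator")`.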
How do I fix this?