Instructions for using espiusedwards/flant5-large-lora with libraries, inference providers, notebooks, and local apps.
- Libraries
- PEFT
How to use espiusedwards/flant5-large-lora with PEFT:
```python
from peft import PeftModel
from transformers import AutoModelForSeq2SeqLM

base_model = AutoModelForSeq2SeqLM.from_pretrained("google/flan-t5-large")
model = PeftModel.from_pretrained(base_model, "espiusedwards/flant5-large-lora")
```
- Notebooks
- Google Colab
- Kaggle
Discussions:
- Librarian Bot: Add base_model information to model (#18, opened over 2 years ago by librarian-bot)
- change evaluation, r = 24, 8 epoch (#14, opened over 2 years ago by espiusedwards)
- fix argument, r = 24, 3 epoch (#12, opened over 2 years ago by espiusedwards)
- fix argument, r = 24, 3 epoch (#11, opened over 2 years ago by espiusedwards)
- simplify tokenize question - context (#8, opened over 2 years ago by espiusedwards)