Amalq/shared_TaskA
How to use Amalq/flan_t5_large_chat_summary with Transformers:
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
tokenizer = AutoTokenizer.from_pretrained("Amalq/flan_t5_large_chat_summary")
model = AutoModelForSeq2SeqLM.from_pretrained("Amalq/flan_t5_large_chat_summary")

This model is a fine-tuned version of google/flan-t5-large on the shared_TaskA dataset.
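Once the tokenizer and model are loaded, summarizing a chat is a tokenize-generate-decode round trip. A minimal sketch (the example dialogue and the generation settings such as `max_new_tokens=64` are illustrative assumptions, not values from this model card):

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("Amalq/flan_t5_large_chat_summary")
model = AutoModelForSeq2SeqLM.from_pretrained("Amalq/flan_t5_large_chat_summary")

# Hypothetical example dialogue; any chat-style text works as input.
dialogue = (
    "Person A: Are we still on for lunch tomorrow? "
    "Person B: Yes, noon at the usual place. "
    "Person A: Great, see you then."
)

# Tokenize the dialogue and generate a summary (64 new tokens is an
# assumed cap, tune it for your inputs).
inputs = tokenizer(dialogue, return_tensors="pt", truncation=True)
summary_ids = model.generate(**inputs, max_new_tokens=64)
summary = tokenizer.decode(summary_ids[0], skip_special_tokens=True)
print(summary)
```

Note that `AutoModelForSeq2SeqLM.from_pretrained` downloads the full FLAN-T5-large checkpoint (roughly 3 GB) on first use, so the initial call may take a while.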
The following hyperparameters were used during training: