Tags: Question Answering, Transformers, PyTorch, TensorFlow, JAX, Vietnamese, t5, text2text-generation, summarization, translation, text-generation-inference
Instructions to use VietAI/vit5-large with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use VietAI/vit5-large with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("question-answering", model="VietAI/vit5-large")

# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("VietAI/vit5-large")
model = AutoModelForSeq2SeqLM.from_pretrained("VietAI/vit5-large")
```

- Notebooks
- Google Colab
- Kaggle
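Because ViT5 is a T5-style text-to-text model, inference typically means prepending a task prefix to the input and calling `generate` on the loaded model. A minimal sketch of that pattern is below; the `"vietnews: "` prefix and the helper names are illustrative assumptions, not part of this model card, so check the checkpoint's documentation for the prefix it actually expects.

```python
def build_input(task_prefix: str, text: str) -> str:
    """Prepend a T5-style task prefix to the source text.

    The exact prefix (e.g. "vietnews: ") is an assumption here;
    different ViT5 fine-tunes may expect different prefixes.
    """
    return f"{task_prefix}{text}"


def summarize(text: str, max_new_tokens: int = 128) -> str:
    # Imported lazily so build_input stays usable without transformers
    # installed; loading vit5-large downloads the full checkpoint.
    from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

    tokenizer = AutoTokenizer.from_pretrained("VietAI/vit5-large")
    model = AutoModelForSeq2SeqLM.from_pretrained("VietAI/vit5-large")

    # Tokenize the prefixed input, generate, and decode the result.
    inputs = tokenizer(build_input("vietnews: ", text), return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```

For example, `summarize("Hà Nội là thủ đô của Việt Nam. ...")` would return a generated Vietnamese summary string.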
Commit History
Update README.md b4c595d
Update README.md 64b4f33
Update README.md 934b36a
Update config.json 398bbba
Update README.md 8a6430b
Fix OOV problem 0b699c1
root committed on
Update README.md a2a4014
Upload tokenizer.json 8170ced
Update README.md 0bab53e
Update README.md d6f16cf
Update README.md 8596981
Update README.md 3b4eaa2
Update config 7ecf377
root committed on
Add flax version for vit5 large 1m5steps f62106a
root committed on
Add tf version for vit5 large 1m5 steps ca105b1
root committed on
Add vit5 large 1m5 steps c71db58
root committed on
Delete tf_model.h5 aa55836
Delete pytorch_model.bin 321e804
Update README.md d20d355
Update README.md 3a92e0c
Create README.md 45e9490
Add vit5 large model 26c4133
root committed on