Tags: Transformers, PyTorch, English, t5, text2text-generation, t5-small, dialog state tracking, conversational system, task-oriented dialog, Eval Results (legacy), text-generation-inference
Instructions to use ConvLab/t5-small-dst-sgd with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use ConvLab/t5-small-dst-sgd with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("ConvLab/t5-small-dst-sgd")
model = AutoModelForSeq2SeqLM.from_pretrained("ConvLab/t5-small-dst-sgd")
```
- Notebooks
- Google Colab
- Kaggle
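Once loaded, the checkpoint can be run with the standard seq2seq `generate` API. Below is a minimal sketch; the dialogue context string is a hypothetical example, and the exact input serialization this DST model expects follows ConvLab-3's conventions, which are not documented on this page:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("ConvLab/t5-small-dst-sgd")
model = AutoModelForSeq2SeqLM.from_pretrained("ConvLab/t5-small-dst-sgd")

# Hypothetical dialogue context; the real serialization format (speaker
# prefixes, turn separators, task instruction) is defined by ConvLab-3.
context = "user: I want to book a flight from Boston to Seattle on Friday."
inputs = tokenizer(context, return_tensors="pt")

# Greedy decoding of the predicted dialogue state string.
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

The model emits the tracked dialogue state as plain text, so downstream code typically parses the decoded string back into domain/slot/value triples.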
Librarian Bot: Add base_model information to model
#2
by librarian-bot - opened
This pull request aims to enrich the metadata of your model by adding t5-small as a base_model field, situated in the YAML block of your model's README.md.
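Concretely, the change adds one key to the YAML front matter at the top of README.md. A minimal sketch (only the `base_model` line is from this PR; any other metadata fields in your README are left untouched):

```yaml
---
base_model: t5-small
---
```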
How did we find this information? We ran a regular expression match over your README.md file to identify the base model it references.
Why add this? Enhancing your model's metadata in this way:
- Boosts discoverability - it becomes straightforward to trace the relationships between models on the Hugging Face Hub.
- Highlights impact - it showcases the contributions and influence different models have within the community.
For a hands-on example of how such metadata can play a pivotal role in mapping model connections, take a look at librarian-bots/base_model_explorer.
This PR comes courtesy of Librarian Bot. If you have any feedback, queries, or need assistance, please don't hesitate to reach out to @davanstrien. Your input is invaluable to us!