Instructions for using pollner/reviews-generator2 with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use pollner/reviews-generator2 with Transformers:
```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("pollner/reviews-generator2")
model = AutoModelForSeq2SeqLM.from_pretrained("pollner/reviews-generator2")
```
- Notebooks
- Google Colab
- Kaggle
Librarian Bot: Add base_model information to model
This pull request aims to enrich the metadata of your model by adding facebook/bart-base as a base_model field, situated in the YAML block of your model's README.md.
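Concretely, a change like this adds a single `base_model` line to the YAML front matter at the top of README.md. A minimal sketch, assuming a typical model-card layout (the fields other than `base_model` are illustrative placeholders, not taken from this model's actual card):

```yaml
---
# Hypothetical front matter; only the base_model line reflects this PR
license: apache-2.0        # placeholder
tags:                      # placeholder
  - text2text-generation
base_model: facebook/bart-base   # field added by this pull request
---
```

The Hub reads this front matter to link the fine-tuned model back to facebook/bart-base in its model graph.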
How did we find this information? We performed a regular expression match on your README.md file to determine the connection.
Why add this? Enhancing your model's metadata in this way:
- Boosts Discoverability - It becomes straightforward to trace the relationships between various models on the Hugging Face Hub.
- Highlights Impact - It showcases the contributions and influences different models have within the community.
For a hands-on example of how such metadata can play a pivotal role in mapping model connections, take a look at librarian-bots/base_model_explorer.
This PR comes courtesy of Librarian Bot. If you have any feedback, queries, or need assistance, please don't hesitate to reach out to @davanstrien.
If you want to automatically add base_model metadata to more of your models, you can use the Librarian Bot Metadata Request Service!