Instructions to use Bochkov/bvv241-max with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use Bochkov/bvv241-max with Transformers:

```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("feature-extraction", model="Bochkov/bvv241-max")

# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("Bochkov/bvv241-max", dtype="auto")
```

- Notebooks
- Google Colab
- Kaggle
Improve model card: Add metadata and GitHub link
#1
by nielsr - opened
This PR enhances the model card for the Bochkov/bvv241-max tokenizer by:
- Adding `license: apache-2.0`, `library_name: transformers`, and `pipeline_tag: feature-extraction` to the YAML metadata. This improves discoverability on the Hugging Face Hub and ensures the correct "how to use" widget appears for the tokenizer.
- Adding a link to the associated research paper for easy reference.
- Including a direct link to the GitHub repository where the code and research resources are hosted.
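The metadata fields listed above live in the YAML front matter at the top of the model card's `README.md`. A minimal sketch of what this PR's additions would look like (the paper and repository URLs are placeholders, not the actual links from the PR):

```yaml
---
license: apache-2.0
library_name: transformers
pipeline_tag: feature-extraction
---
```

The Hub reads this block to populate the license badge, pick the correct code snippet widget, and index the model under the `feature-extraction` task filter.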
These changes provide more comprehensive information for users and integrate the model better within the Hugging Face ecosystem.
Thank you
Bochkov changed pull request status to merged