Instructions for using Bochkov/bvv241-abs with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use Bochkov/bvv241-abs with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("feature-extraction", model="Bochkov/bvv241-abs")
```

```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("Bochkov/bvv241-abs", dtype="auto")
```
- Notebooks
- Google Colab
- Kaggle
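The feature-extraction pipeline returns one embedding vector per token. A common follow-up step is mean-pooling those token embeddings into a single sentence vector. The sketch below uses a dummy nested list standing in for the pipeline's output shape (`[batch][token][hidden_dim]`); in real use you would pass the result of `pipe("some text")` instead:

```python
# Dummy stand-in for pipe("some text") output: [batch][token][hidden_dim]
features = [[[0.0, 1.0, 2.0],
             [2.0, 3.0, 4.0],
             [4.0, 5.0, 6.0]]]

tokens = features[0]          # token embeddings for the first (only) input
hidden = len(tokens[0])       # embedding dimensionality

# Mean-pool across tokens to get one fixed-size sentence vector
sentence_vec = [sum(tok[d] for tok in tokens) / len(tokens) for d in range(hidden)]
print(sentence_vec)  # [2.0, 3.0, 4.0]
```

This pooling step is a convention, not part of the model itself; other strategies (CLS-token, max-pooling) are equally valid depending on the downstream task.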
Improve model card: Add metadata and paper link
#1
opened by nielsr (HF Staff)
This PR enhances the model card by:
- Adding `pipeline_tag: feature-extraction` to improve discoverability on the Hub.
- Specifying `library_name: transformers` for better integration with the `transformers` library.
- Including the `license: apache-2.0`.
- Adding relevant `tags` such as `tokenizer`, `embeddings`, `LLM`, `MoE`, and `unicode` for improved searchability.
- Linking directly to the Hugging Face paper page (Growing Transformers: Modular Composition and Layer-wise Expansion on a Frozen Substrate) and providing a brief summary of the paper's focus for quick context.
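Taken together, the metadata changes above correspond to a YAML front-matter block at the top of the model card's README. A sketch of what that block would look like (the exact contents of the merged card may differ):

```yaml
---
pipeline_tag: feature-extraction
library_name: transformers
license: apache-2.0
tags:
  - tokenizer
  - embeddings
  - LLM
  - MoE
  - unicode
---
```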
These changes will provide clearer and more comprehensive information to users.
Thank you so much
Bochkov changed pull request status to merged