Instructions to use MindscapeRAG/MiA-Emb-4B with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- Transformers
How to use MindscapeRAG/MiA-Emb-4B with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("feature-extraction", model="MindscapeRAG/MiA-Emb-4B")
```

```python
# Load model directly
from transformers import AutoModel

model = AutoModel.from_pretrained("MindscapeRAG/MiA-Emb-4B", dtype="auto")
```

- Notebooks
- Google Colab
- Kaggle
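The `feature-extraction` pipeline above returns token-level vectors, so a sentence embedding still needs a pooling step. Below is a minimal sketch of mean pooling; the `mean_pool` helper is an illustration and not the model's documented usage, so check the model card for the pooling strategy MiA-Emb-4B actually expects. The real pipeline call is left commented out because it downloads the full 4B-parameter checkpoint.

```python
import numpy as np

def mean_pool(token_embeddings: np.ndarray) -> np.ndarray:
    """Average token vectors of shape [tokens, hidden] into one embedding.

    Illustrative helper; verify the intended pooling in the model card.
    """
    return token_embeddings.mean(axis=0)

# With the real model (downloads the full checkpoint):
# from transformers import pipeline
# pipe = pipeline("feature-extraction", model="MindscapeRAG/MiA-Emb-4B")
# feats = np.array(pipe("hello world")[0])  # shape [tokens, hidden]
# embedding = mean_pool(feats)

# Demonstration with dummy token features (3 tokens, hidden size 2):
feats = np.array([[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]])
embedding = mean_pool(feats)
print(embedding)  # → [3. 4.]
```

Mean pooling is a common default for embedding models, but some checkpoints use last-token or CLS pooling instead, which is why the choice should come from the model card rather than this sketch.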
Add pipeline tag and library name to model card
#1
Opened by nielsr (HF Staff)
Hi! I'm Niels from the Hugging Face community team.
This PR improves your model card's metadata by adding:
- `pipeline_tag: feature-extraction`: This helps users find your model when filtering by task on the Hub.
- `library_name: transformers`: This enables an automated code snippet on the model page, showcasing how to use the model with the Transformers library.
The rest of the model card remains unchanged.
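Concretely, this kind of metadata change amounts to a small YAML front-matter block at the top of the model card's README.md, along these lines (a sketch of the standard Hub metadata format, not the exact diff in this PR):

```yaml
---
pipeline_tag: feature-extraction
library_name: transformers
---
```

The Hub reads this front matter to populate task filters and to generate the automated usage snippet on the model page.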