
# BGE-M3 Embedding Model (Private Repository)

This is a BGE-M3 embedding model from FlagEmbedding, a state-of-the-art multilingual model that supports 100+ languages.

## Model Details

- Model Type: Text Embedding
- Languages: 100+ languages
- Input: Text
- Output: Dense vector embeddings (see the sketch after this list)
- Model Architecture: Based on XLM-RoBERTa, a BERT-family encoder, with modifications
- Parameters: ~0.6B (F32, safetensors)
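
Since the model maps text to fixed-size dense vectors, similarity can be computed directly on the outputs. A minimal sketch, assuming the repository is accessible with a valid token (see Usage and Access Control below); the sentences are made-up placeholders:

```python
import numpy as np
from FlagEmbedding import FlagModel

model = FlagModel('LA1512/bge-m3-new-finetune-SimCSE-hn')  # assumes prior authentication (see Usage)

sentences = ["BGE-M3 supports more than 100 languages.", "This embedding model is multilingual."]
embeddings = model.encode(sentences)
print(embeddings.shape)  # (2, 1024): one 1024-dimensional vector per input sentence

# FlagModel L2-normalizes embeddings by default, so a dot product is cosine similarity.
print(float(np.dot(embeddings[0], embeddings[1])))
```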

## Usage

```python
from huggingface_hub import login
from FlagEmbedding import FlagModel

# The repository is private: authenticate first so the weights can be downloaded.
login(token="your_token_here")  # placeholder token

model = FlagModel('LA1512/bge-m3-new-finetune-SimCSE-hn')
embeddings = model.encode("Your text here")
```
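
For retrieval, queries and passages are encoded separately and scored against each other. A minimal sketch under the same assumptions as above; the query and passage strings are made-up examples:

```python
from FlagEmbedding import FlagModel

model = FlagModel('LA1512/bge-m3-new-finetune-SimCSE-hn')  # assumes prior authentication

queries = ["What languages does BGE-M3 support?"]
passages = [
    "BGE-M3 is a multilingual embedding model covering 100+ languages.",
    "The weather today is sunny with a light breeze.",
]

q_emb = model.encode_queries(queries)  # applies the query instruction, if one is configured
p_emb = model.encode(passages)

scores = q_emb @ p_emb.T  # cosine similarities, shape (num_queries, num_passages)
print(scores)
```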

## Access Control

This is a private repository. Only authorized users can access this model.
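
Downloads therefore require a Hugging Face access token with read permission for this repository. A minimal sketch of fetching the files explicitly with `huggingface_hub`; the `HF_TOKEN` environment variable and the inline placeholder are assumptions, not configured secrets:

```python
import os
from huggingface_hub import snapshot_download

# Read the token from the environment, falling back to an inline placeholder.
local_dir = snapshot_download(
    "LA1512/bge-m3-new-finetune-SimCSE-hn",
    token=os.environ.get("HF_TOKEN", "your_token_here"),
)
print(local_dir)  # FlagModel can also load the model from this local path
```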
