openbmb/MiniCPM-Embedding-Light

Tags: Feature Extraction · Transformers · Safetensors · sentence-transformers · minicpm · mteb · custom_code · Eval Results (legacy)
Instructions for using openbmb/MiniCPM-Embedding-Light with libraries, notebooks, and local apps.

  • Libraries
  • Transformers

    How to use openbmb/MiniCPM-Embedding-Light with Transformers:

    # Use a pipeline as a high-level helper
    from transformers import pipeline
    
    pipe = pipeline("feature-extraction", model="openbmb/MiniCPM-Embedding-Light", trust_remote_code=True)
    
    # Or load the model directly
    from transformers import AutoModel
    
    model = AutoModel.from_pretrained("openbmb/MiniCPM-Embedding-Light", trust_remote_code=True, dtype="auto")
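
    The feature-extraction pipeline returns per-token hidden states, so obtaining a single sentence vector usually requires pooling. Below is a minimal sketch of attention-mask-aware mean pooling using toy numpy arrays rather than real model outputs; the function name and toy values are illustrative, and the model card should be consulted for the pooling this model actually expects.

```python
import numpy as np

def mean_pool(token_embeddings, attention_mask):
    """Average token vectors, ignoring padded positions via the mask."""
    mask = attention_mask[..., None].astype(token_embeddings.dtype)
    summed = (token_embeddings * mask).sum(axis=0)
    counts = mask.sum(axis=0).clip(min=1e-9)  # avoid division by zero
    return summed / counts

# Toy example: 4 tokens x 3 dims, last token is padding.
tokens = np.array([[1.0, 2.0, 3.0],
                   [3.0, 2.0, 1.0],
                   [2.0, 2.0, 2.0],
                   [9.0, 9.0, 9.0]])  # padding row, masked out
mask = np.array([1, 1, 1, 0])
print(mean_pool(tokens, mask))  # → [2. 2. 2.]
```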
  • sentence-transformers

    How to use openbmb/MiniCPM-Embedding-Light with sentence-transformers:

    from sentence_transformers import SentenceTransformer
    
    model = SentenceTransformer("openbmb/MiniCPM-Embedding-Light", trust_remote_code=True)
    
    sentences = [
        "The weather is lovely today.",
        "It's so sunny outside!",
        "He drove to the stadium."
    ]
    embeddings = model.encode(sentences)
    
    similarities = model.similarity(embeddings, embeddings)
    print(similarities.shape)
    # torch.Size([3, 3])
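
    In sentence-transformers, model.similarity defaults to cosine similarity. The following is a minimal numpy sketch of that computation with toy 2-D vectors (the helper name and values are illustrative, not from the library):

```python
import numpy as np

def cosine_similarity_matrix(embeddings):
    """Pairwise cosine similarity: normalize rows, then take dot products."""
    norms = np.linalg.norm(embeddings, axis=1, keepdims=True)
    normalized = embeddings / norms
    return normalized @ normalized.T

# Toy embeddings: 3 "sentences" in 2 dimensions.
emb = np.array([[1.0, 0.0],
                [1.0, 1.0],
                [0.0, 1.0]])
sims = cosine_similarity_matrix(emb)
print(sims.shape)  # (3, 3); diagonal entries are 1.0 (self-similarity)
```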
  • Notebooks
  • Google Colab
  • Kaggle
MiniCPM-Embedding-Light / results (955 kB)

2 contributors · History: 1 commit ("init", 75f07f8, over 1 year ago)

  • dense.md (356 kB)
  • dense_sparse.md (300 kB)
  • sparse.md (299 kB)