This Model2Vec model was created using Tokenlearn, with nomic-embed-text-v2-moe as the base model, trained on around 3.5M passages (English and Portuguese).
I have yet to run any formal benchmarks on it, but it easily outperforms potion-multilingual-128M on my custom Portuguese testing workload.
The output dimension is 512.
Load this model using the `from_pretrained` method:

```python
from model2vec import StaticModel

# Load the pretrained Model2Vec model
model = StaticModel.from_pretrained("cnmoro/static-nomic-eng-ptbr")

# Compute text embeddings
embeddings = model.encode(["Example sentence"])
```
Base model: FacebookAI/xlm-roberta-base