nomic-embed-text-v2-moe-unsupervised
nomic-embed-text-v2-moe-unsupervised is a multilingual Mixture-of-Experts (MoE) text embedding model. It is an intermediate checkpoint taken after the contrastive pretraining stage of the multi-stage contrastive training that produces the final model.
If you want a model for extracting embeddings, we suggest using nomic-embed-text-v2-moe instead.
Join the Nomic Community