The model is initialized from nomic-ai/modernbert-embed-base-unsupervised.
Fine-tuning details:
- Training steps: 10K
- Training data: Contrastive training on Tevatron/msmarco-passage-new (without titles)
- Batch size: 32 * 2
- Training group size: 8
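The training setup above (contrastive training with a group size of 8) can be sketched as an InfoNCE-style loss, where each query is scored against a group of 8 passages: one positive and 7 negatives. This is a minimal NumPy sketch, not the actual training code; the temperature value and the convention that the positive sits at index 0 are assumptions.

```python
import numpy as np

def info_nce_loss(query, passages, temperature=0.05):
    """Contrastive (InfoNCE) loss for one query against a group of
    passages, where index 0 is the positive and the rest are negatives.
    The temperature is a hypothetical choice, not taken from the card."""
    q = query / np.linalg.norm(query)
    p = passages / np.linalg.norm(passages, axis=1, keepdims=True)
    sims = p @ q / temperature          # scaled cosine similarities
    sims -= sims.max()                  # subtract max for numerical stability
    probs = np.exp(sims) / np.exp(sims).sum()
    return -np.log(probs[0])            # negative log-likelihood of the positive

rng = np.random.default_rng(0)
query = rng.normal(size=64)
group = rng.normal(size=(8, 64))        # training group size: 8
loss = info_nce_loss(query, group)
```

In practice the same loss is computed over the whole batch (here 32 queries per device), with the other queries' passages typically also serving as in-batch negatives.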