gety-embed-v0

Fine-tuned from intfloat/multilingual-e5-small using open-source and proprietary synthetic data, optimized for local search scenarios.

ONNX

File                    Quantization     Size
onnx/model_uint8.onnx   UINT8 (dynamic)  112 MB
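Like other e5-family encoders, models in this line are typically used with mean pooling over the attention mask followed by L2 normalization, with `"query: "` / `"passage: "` prefixes on the input text. The sketch below shows only that post-processing step; the dummy hidden-state array stands in for the actual output of an ONNX Runtime session on `onnx/model_uint8.onnx` (the real tokenization and inference calls are omitted, and the shapes here are illustrative assumptions).

```python
import numpy as np

def mean_pool(last_hidden: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
    """Average token embeddings, ignoring padding positions."""
    mask = attention_mask[..., None].astype(last_hidden.dtype)  # (batch, seq, 1)
    summed = (last_hidden * mask).sum(axis=1)
    counts = np.clip(mask.sum(axis=1), 1e-9, None)
    return summed / counts

def l2_normalize(vectors: np.ndarray) -> np.ndarray:
    """Scale each row to unit length so dot products are cosine similarities."""
    return vectors / np.linalg.norm(vectors, axis=-1, keepdims=True)

# Dummy stand-in for an ONNX session output: batch of 2 texts,
# 4 tokens each, hidden size 8 (real e5-small uses dim 384).
rng = np.random.default_rng(0)
hidden = rng.standard_normal((2, 4, 8)).astype(np.float32)
mask = np.array([[1, 1, 1, 0],
                 [1, 1, 0, 0]], dtype=np.int64)  # 1 = real token, 0 = padding

embeddings = l2_normalize(mean_pool(hidden, mask))
similarity = embeddings @ embeddings.T  # cosine similarity matrix
```

Because the rows are unit-normalized, ranking passages for a query reduces to a matrix multiply, which is the usual pattern in local search scenarios.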

Model tree for gety-ai/gety-embed-v0