### Description
SparkEmbedding-300m is a 300-million-parameter multilingual text embedding model with **state-of-the-art cross-lingual retrieval** performance, developed by the XenArcAI team. Fine-tuned from Google's EmbeddingGemma-300m on an additional 1 million curated samples spanning 119 languages (including all 22 scheduled Indian languages), it emphasizes data complexity, linguistic diversity, and deep language understanding. This fine-tuning yields embeddings with stronger semantic alignment across languages in multilingual settings.
The model generates high-dimensional vector representations that capture rich semantic and contextual information, bridging linguistic gaps in applications such as global information retrieval, multilingual question answering, and cross-language semantic search. With a native 2048-token context window, it handles extended inputs (e.g., full articles or documents) while preserving long-range dependencies.
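As a minimal sketch of how such embeddings are typically used for cross-language retrieval: encode a query and candidate documents, then rank documents by cosine similarity. The commented `sentence-transformers` loading step and the Hub repo id `XenArcAI/SparkEmbedding-300m` are assumptions, not confirmed by this card; the ranking helper itself is generic.

```python
import numpy as np

# Hypothetical loading step (repo id and API assumed, not confirmed here):
# from sentence_transformers import SentenceTransformer
# model = SentenceTransformer("XenArcAI/SparkEmbedding-300m")
# query_vec = model.encode("What is the capital of France?")
# doc_vecs = model.encode(["Paris est la capitale de la France.",
#                          "Berlin ist die Hauptstadt Deutschlands."])

def rank_by_cosine(query_vec: np.ndarray, doc_vecs: np.ndarray):
    """Return document indices sorted by cosine similarity to the query."""
    q = query_vec / np.linalg.norm(query_vec)
    d = doc_vecs / np.linalg.norm(doc_vecs, axis=1, keepdims=True)
    scores = d @ q                  # cosine similarity per document
    order = np.argsort(-scores)     # best match first
    return order, scores[order]

# Toy 2-d vectors standing in for real embeddings:
order, scores = rank_by_cosine(
    np.array([1.0, 0.0]),
    np.array([[0.9, 0.1], [0.0, 1.0]]),
)
print(order.tolist())  # → [0, 1]
```

For cross-lingual search, the query and documents need not share a language; the model's aligned embedding space is what makes the cosine ranking meaningful.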