Commit fbf4793 · Parent: b490a25 · "Steffen's review"

README.md CHANGED
@@ -7,31 +7,23 @@ sdk: static
pinned: false
---

-Sinequa Hugging Face homepage
-
# About Sinequa

-Sinequa
-accurate, secure work partner so they are more effective, more informed, more productive, and less stressed. Best of all,
-Sinequa Assistants streamline workflows and automatically navigate the chaotic enterprise information landscape, so that employees
-can skip the grind and focus on doing the kind of work that makes the most impact. Sinequa’s Assistants achieve this by combining
-the power of comprehensive enterprise search with the ease of generative AI in a configurable and easily managed Assistant framework,
-for an accurate, traceable, and fully secure conversational experience. Deploy an out-of-the-box Assistant or configure a tailored
-experience and specialized workflow to augment your people and your company. For more information, visit www.sinequa.com.

# Neural Search models

-Sinequa Search
-This search workflow implies

-The two collections below bring together the recommended model combinations for English only and multilingual

## Vectorizer

Vectorizers are models which produce an embedding vector given a passage or a query. The passage vectors are stored in our vector index and the
query vector is used at query time to look up relevant passages in the index.

-Here is an overview of the

| Model | Languages | Relevance | Inference Time | GPU Memory |
|--------------------------------|-----------------------------|-----------|----------------|------------|
@@ -44,9 +36,9 @@ Here is an overview of the model we deliver publicly here.

## Passage Ranker

-Passage Rankers are models which produce a relevance score given a query-passage pair and is used to order search results coming from

-Here is an overview of the

| Model | Languages | Relevance | Inference Time | GPU Memory |
|---------------------------------|-----------------------------|-----------|----------------|------------|
@@ -58,23 +50,3 @@ Here is an overview of the model we deliver publicly here.
| passage-ranker.strawberry | de, en, es, fr, it, ja, nl, pt, zs | 0.451 | 63 ms | 1060 MiB |
| passage-ranker.mango | de, en, es, fr, it, ja, nl, pt, zs | 0.480 | 358 ms | 1070 MiB |
| passage-ranker.pistachio | de, en, es, fr, it, ja, nl, pt, zs, pl | 0.380 | 358 ms | 1070 MiB |
-
-
-## Answer Finder
-
-Answer Finder are extractive question answering models developed by Sinequa. Given a query and a passage, they produce two lists of logit scores corresponding
-to the start token and end token of an answer.
-
-Here is an overview of the model we deliver publicly here.
-
-| Model | Languages | de | en | es | fr | ja |
-|--------------------------------|-----------------|------|------|------|------|------|
-| answer-finder-v1-S-en | en | 70.6 | 79.5 | 54.1 | 0.5 | X |
-| answer-finder-v1-L-multilingual | de, en, es, fr | 90.8 | 75.0 | 67.1 | 73.4 | X |
-| answer-finder.yuzu | ja | X | X | X | X | 91.5 |
-
-| Model | Inference Time | GPU Memory |
-|--------------------------------|----------------|------------|
-| answer-finder-v1-S-en | 128 ms | 560 MiB |
-| answer-finder-v1-L-multilingual | 362 ms | 1060 MiB |
-| answer-finder.yuzu | 361 ms | 1320 MiB |
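The Answer Finder models described above emit two lists of logit scores, one for the answer's start token and one for its end token. Decoding those lists into an answer span is the standard extractive-QA step; the sketch below is a generic illustration of that decoding rule, not Sinequa's actual implementation:

```python
def best_answer_span(start_logits, end_logits, max_len=30):
    """Pick the (start, end) token pair with the highest combined logit score.

    Standard extractive-QA decoding: the answer must start before it ends
    and stay under a maximum token length. Logit lists here are toy values,
    not real model output.
    """
    best, best_score = None, float("-inf")
    for i, s in enumerate(start_logits):
        for j, e in enumerate(end_logits):
            if i <= j < i + max_len and s + e > best_score:
                best_score = s + e
                best = (i, j)
    return best, best_score

# Toy logits: the strongest start is token 2, the strongest end is token 4.
start = [0.1, 0.2, 3.0, 0.1, 0.4]
end = [0.1, 0.1, 0.5, 0.2, 2.5]
span, score = best_answer_span(start, end)
print(span)  # (2, 4)
```

A real deployment would apply this to per-token logits returned by the model for each (query, passage) pair and map the token span back to text.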
README.md (new version):

pinned: false
---

# About Sinequa

Sinequa provides an Enterprise Search solution that lets you search through your company's internal documents. It uses Neural Search to provide the most relevant content for your search request.

# Neural Search models

Sinequa Search uses a technology called Neural Search. Neural Search is a hybrid search solution based on both Keyword Search and Vector Search.
This search workflow involves two types of models, for which we deliver various versions here.

The two collections below bring together the recommended model combinations for English-only and multilingual content.
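A hybrid workflow like the one described above produces two ranked result lists (one from Keyword Search, one from Vector Search) that must be merged. Reciprocal rank fusion is one common merging strategy, shown here purely as an illustration; it is not stated anywhere that this is Sinequa's actual fusion method:

```python
def reciprocal_rank_fusion(rankings, k=60):
    """Merge several ranked result lists into one.

    Each document scores sum(1 / (k + rank)) over the lists it appears in,
    so items ranked highly by either keyword or vector search rise to the
    top. k=60 is the constant commonly used with RRF.
    """
    scores = {}
    for ranking in rankings:
        for rank, doc in enumerate(ranking, start=1):
            scores[doc] = scores.get(doc, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

# Hypothetical result lists from the two retrieval paths.
keyword_hits = ["doc_a", "doc_b", "doc_c"]
vector_hits = ["doc_c", "doc_a", "doc_d"]
fused = reciprocal_rank_fusion([keyword_hits, vector_hits])
print(fused)  # ['doc_a', 'doc_c', 'doc_b', 'doc_d']
```

Documents found by both paths (`doc_a`, `doc_c`) accumulate score from each list and outrank documents found by only one.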
## Vectorizer

Vectorizers are models which produce an embedding vector given a passage or a query. The passage vectors are stored in our vector index and the query vector is used at query time to look up relevant passages in the index.

Here is an overview of the models we deliver publicly.

| Model | Languages | Relevance | Inference Time | GPU Memory |
|--------------------------------|-----------------------------|-----------|----------------|------------|
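The vectorizer workflow above (embed passages offline into an index, embed the query at search time, return the nearest passages) can be sketched as follows. The toy character-bigram `embed` function is a stand-in for a real vectorizer model such as those in the table, which would produce dense semantic embeddings:

```python
import math
import zlib

def embed(text, dims=64):
    """Toy hashed bag-of-character-bigrams embedding, unit-normalised.

    Stand-in for a neural vectorizer; only the indexing/lookup flow
    around it mirrors the workflow described above.
    """
    vec = [0.0] * dims
    t = text.lower()
    for i in range(len(t) - 1):
        vec[zlib.crc32(t[i:i + 2].encode()) % dims] += 1.0
    norm = math.sqrt(sum(v * v for v in vec)) or 1.0
    return [v / norm for v in vec]

def cosine(a, b):
    # Vectors are unit-normalised, so the dot product is cosine similarity.
    return sum(x * y for x, y in zip(a, b))

# Offline: store one embedding per passage in the index.
passages = [
    "the quarterly sales report is due friday",
    "employees can reset their password in the portal",
    "gpu memory usage of the ranker models",
]
index = [(p, embed(p)) for p in passages]

# Query time: embed the query and look up the most similar passages.
query = embed("how do I reset my password")
results = sorted(index, key=lambda pe: cosine(query, pe[1]), reverse=True)
print(results[0][0])  # the password-reset passage ranks first
```

A production index would use an approximate-nearest-neighbour structure rather than a full scan, but the embed-store-lookup shape is the same.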
## Passage Ranker

Passage Rankers are models which produce a relevance score given a query-passage pair and are used to order search results coming from Keyword and Vector search.

Here is an overview of the models we deliver publicly.

| Model | Languages | Relevance | Inference Time | GPU Memory |
|---------------------------------|-----------------------------|-----------|----------------|------------|
| passage-ranker.strawberry | de, en, es, fr, it, ja, nl, pt, zs | 0.451 | 63 ms | 1060 MiB |
| passage-ranker.mango | de, en, es, fr, it, ja, nl, pt, zs | 0.480 | 358 ms | 1070 MiB |
| passage-ranker.pistachio | de, en, es, fr, it, ja, nl, pt, zs, pl | 0.380 | 358 ms | 1070 MiB |
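A passage ranker as described above scores each (query, passage) pair and the candidates are reordered by that score. In this minimal sketch a simple term-overlap function stands in for the neural cross-scoring model; only the rerank step mirrors the described workflow:

```python
def rank_score(query, passage):
    """Stand-in for a passage-ranker model: fraction of query terms that
    appear in the passage. A real ranker scores the (query, passage)
    pair jointly with a neural network."""
    q_terms = set(query.lower().split())
    p_terms = set(passage.lower().split())
    return len(q_terms & p_terms) / len(q_terms)

def rerank(query, candidates):
    """Order candidate passages (e.g. merged keyword and vector hits)
    by their relevance score, best first."""
    return sorted(candidates, key=lambda p: rank_score(query, p), reverse=True)

candidates = [
    "office opening hours and holiday schedule",
    "how to configure the passage ranker model",
    "passage ranker inference time on gpu",
]
top = rerank("passage ranker inference time", candidates)[0]
print(top)  # "passage ranker inference time on gpu"
```

Because ranking scores every pair with a model pass, inference time per pair (as listed in the table) multiplies by the number of candidates, which is why rankers are applied only to a short list of top results.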