Add new SentenceTransformer model

Files changed:
- README.md (+183, -209)
- config_sentence_transformers.json (+1, -1)

README.md
CHANGED
Old version of the changed hunks (lines removed in this commit are prefixed with "-"; unchanged context lines with a space; "…" marks text truncated in this view):

@@ -7,112 +7,86 @@ tags:
 - generated_from_trainer
 - dataset_size:111470
 - loss:MultipleNegativesRankingLoss
-base_model:
 widget:
-- source_sentence:
   sentences:
-  - …
   sentences:
-  - …
-    In 1933, the first Test series in India was played between India and England with
-    matches in Bombay, Calcutta (now Kolkata) and Madras (now Chennai). England won
-    the series 2–0.[16] The Indian team continued to improve throughout the 1930s
-    and '40s but did not achieve an international victory during this period. In the
-    early 1940s, India didn't play any Test cricket due to the Second World War. The
-    team's first series as an independent country was in late 1947 against Sir Donald
-    Bradman's Invincibles (a name given to the Australia national cricket team of
-    that time). It was also the first Test series India played which was not against
-    England. Australia won the five-match series 4–0, with Bradman tormenting the
-    Indian bowling in his final Australian summer.[17] India subsequently played their
-    first Test series at home not against England against the West Indies in 1948.
-    West Indies won the 5-Test series 1–0.[18]
-  - Hindi Medium (film) Raj Batra (Irrfan Khan) is a rich businessman from Delhi staying
-    with his wife Mita (Saba Qamar). They studied in a Hindi Medium school but wants
-    their 5 year old daughter, Pia (Dishita Sehgal), to be admitted to one of the
-    top schools in Delhi. The top school, 'Delhi Grammar School', has a condition
-    that they will admit students who reside within 3km radius, so the family moves
-    to Vasant Vihar.
-- source_sentence: i am human and nothing of that which is human is alien to me meaning
   sentences:
-  - …
-  - '
   sentences:
-  - …
-    Tracy McConnell appears in 8 episodes from "Lucky Penny" to "The Time Travelers"
-    as an unseen character; she was first seen fully in "Something New" and was promoted
-    to a main character in season 9. The Mother is played by Cristin Milioti.
-  - Marsupial Marsupials are any members of the mammalian infraclass Marsupialia.
-    All extant marsupials are endemic to Australasia and the Americas. A distinctive
-    characteristic common to these species is that most of the young are carried in
-    a pouch. Well-known marsupials include kangaroos, wallabies, koalas, possums,
-    opossums, wombats, and Tasmanian devils. Some lesser-known marsupials are the
-    potoroo and the quokka.
-- source_sentence: 'It was Easipower that said :'
   sentences:
-  - …
-    separate and simultaneous state elections instead of a single national election
-    run by the federal government.
-  - It is said that Easipower was ,
-  - 'It was Easipower that said :'
 pipeline_tag: sentence-similarity
 library_name: sentence-transformers
 metrics:

@@ -132,7 +106,7 @@ metrics:
 - cosine_mrr@10
 - cosine_map@100
 model-index:
-- name: SentenceTransformer based on
   results:
   - task:
       type: information-retrieval

@@ -142,49 +116,49 @@ model-index:
       type: NanoMSMARCO
     metrics:
     - type: cosine_accuracy@1
-      value: 0.
       name: Cosine Accuracy@1
     - type: cosine_accuracy@3
-      value: 0.
       name: Cosine Accuracy@3
     - type: cosine_accuracy@5
-      value: 0.
       name: Cosine Accuracy@5
     - type: cosine_accuracy@10
-      value: 0.
       name: Cosine Accuracy@10
     - type: cosine_precision@1
-      value: 0.
       name: Cosine Precision@1
     - type: cosine_precision@3
-      value: 0.
       name: Cosine Precision@3
     - type: cosine_precision@5
-      value: 0.
       name: Cosine Precision@5
     - type: cosine_precision@10
-      value: 0.
       name: Cosine Precision@10
     - type: cosine_recall@1
-      value: 0.
       name: Cosine Recall@1
     - type: cosine_recall@3
-      value: 0.
       name: Cosine Recall@3
     - type: cosine_recall@5
-      value: 0.
       name: Cosine Recall@5
     - type: cosine_recall@10
-      value: 0.
       name: Cosine Recall@10
     - type: cosine_ndcg@10
-      value: 0.
       name: Cosine Ndcg@10
     - type: cosine_mrr@10
-      value: 0.
       name: Cosine Mrr@10
     - type: cosine_map@100
-      value: 0.
       name: Cosine Map@100
   - task:
       type: information-retrieval

@@ -194,49 +168,49 @@ model-index:
       type: NanoNQ
     metrics:
     - type: cosine_accuracy@1
-      value: 0.
       name: Cosine Accuracy@1
     - type: cosine_accuracy@3
-      value: 0.
       name: Cosine Accuracy@3
     - type: cosine_accuracy@5
-      value: 0.
       name: Cosine Accuracy@5
     - type: cosine_accuracy@10
-      value: 0.
       name: Cosine Accuracy@10
     - type: cosine_precision@1
-      value: 0.
       name: Cosine Precision@1
     - type: cosine_precision@3
-      value: 0.
       name: Cosine Precision@3
     - type: cosine_precision@5
-      value: 0.
       name: Cosine Precision@5
     - type: cosine_precision@10
-      value: 0.
       name: Cosine Precision@10
     - type: cosine_recall@1
-      value: 0.
       name: Cosine Recall@1
     - type: cosine_recall@3
-      value: 0.
       name: Cosine Recall@3
     - type: cosine_recall@5
-      value: 0.
       name: Cosine Recall@5
     - type: cosine_recall@10
-      value: 0.
       name: Cosine Recall@10
     - type: cosine_ndcg@10
-      value: 0.
       name: Cosine Ndcg@10
     - type: cosine_mrr@10
-      value: 0.
       name: Cosine Mrr@10
     - type: cosine_map@100
-      value: 0.
       name: Cosine Map@100
   - task:
       type: nano-beir

@@ -246,61 +220,61 @@ model-index:
       type: NanoBEIR_mean
     metrics:
     - type: cosine_accuracy@1
-      value: 0.
       name: Cosine Accuracy@1
     - type: cosine_accuracy@3
-      value: 0.
       name: Cosine Accuracy@3
     - type: cosine_accuracy@5
-      value: 0.
       name: Cosine Accuracy@5
     - type: cosine_accuracy@10
       value: 0.5800000000000001
       name: Cosine Accuracy@10
     - type: cosine_precision@1
-      value: 0.
       name: Cosine Precision@1
     - type: cosine_precision@3
-      value: 0.
       name: Cosine Precision@3
     - type: cosine_precision@5
-      value: 0.
       name: Cosine Precision@5
     - type: cosine_precision@10
-      value: 0.
       name: Cosine Precision@10
     - type: cosine_recall@1
-      value: 0.
       name: Cosine Recall@1
     - type: cosine_recall@3
-      value: 0.
       name: Cosine Recall@3
     - type: cosine_recall@5
-      value: 0.
       name: Cosine Recall@5
     - type: cosine_recall@10
-      value: 0.
       name: Cosine Recall@10
     - type: cosine_ndcg@10
-      value: 0.
       name: Cosine Ndcg@10
     - type: cosine_mrr@10
-      value: 0.
       name: Cosine Mrr@10
     - type: cosine_map@100
-      value: 0.
       name: Cosine Map@100
 ---

-# SentenceTransformer based on

-This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [

 ## Model Details

 ### Model Description
 - **Model Type:** Sentence Transformer
-- **Base model:** [
 - **Maximum Sequence Length:** 128 tokens
 - **Output Dimensionality:** 384 dimensions
 - **Similarity Function:** Cosine Similarity

@@ -342,9 +316,9 @@ from sentence_transformers import SentenceTransformer
 model = SentenceTransformer("redis/model-b-structured")
 # Run inference
 sentences = [
-    '
-    '
-    '
 ]
 embeddings = model.encode(sentences)
 print(embeddings.shape)

@@ -353,9 +327,9 @@ print(embeddings.shape)
 # Get the similarity scores for the embeddings
 similarities = model.similarity(embeddings, embeddings)
 print(similarities)
-# tensor([[1.0000, 1.0000, 0.
-#         [1.0000, 1.0000, 0.
-#         [0.
 ```

 <!--

@@ -393,21 +367,21 @@ You can finetune this model on your own dataset.

 | Metric | NanoMSMARCO | NanoNQ |
 |:--------------------|:------------|:-----------|
-| cosine_accuracy@1 | 0.
-| cosine_accuracy@3 | 0.
-| cosine_accuracy@5 | 0.
-| cosine_accuracy@10 | 0.
-| cosine_precision@1 | 0.
-| cosine_precision@3 | 0.
-| cosine_precision@5 | 0.
-| cosine_precision@10 | 0.
-| cosine_recall@1 | 0.
-| cosine_recall@3 | 0.
-| cosine_recall@5 | 0.
-| cosine_recall@10 | 0.
-| **cosine_ndcg@10** | **0.
-| cosine_mrr@10 | 0.
-| cosine_map@100 | 0.

 #### Nano BEIR

@@ -425,21 +399,21 @@ You can finetune this model on your own dataset.

 | Metric | Value |
 |:--------------------|:-----------|
-| cosine_accuracy@1 | 0.
-| cosine_accuracy@3 | 0.
-| cosine_accuracy@5 | 0.
 | cosine_accuracy@10 | 0.58 |
-| cosine_precision@1 | 0.
-| cosine_precision@3 | 0.
-| cosine_precision@5 | 0.
-| cosine_precision@10 | 0.
-| cosine_recall@1 | 0.
-| cosine_recall@3 | 0.
-| cosine_recall@5 | 0.
-| cosine_recall@10 | 0.
-| **cosine_ndcg@10** | **0.
-| cosine_mrr@10 | 0.
-| cosine_map@100 | 0.

 <!--
 ## Bias, Risks and Limitations

@@ -465,13 +439,13 @@ You can finetune this model on your own dataset.
 | | anchor | positive | negative |
 |:--------|:---|:---|:---|
 | type | string | string | string |
-| details | <ul><li>min:
 * Samples:
-  | anchor
-  …
-  | <code>
-  | <code>
-  | <code>
 * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
   ```json
   {

@@ -491,13 +465,13 @@ You can finetune this model on your own dataset.
 | | anchor | positive | negative |
 |:--------|:---|:---|:---|
 | type | string | string | string |
-| details | <ul><li>min:
 * Samples:
-  | anchor
-  …
-  | <code>In
-  | <code>
-  | <code>
 * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
   ```json
   {

@@ -656,20 +630,20 @@ You can finetune this model on your own dataset.
 ### Training Logs
 | Epoch | Step | Training Loss | Validation Loss | NanoMSMARCO_cosine_ndcg@10 | NanoNQ_cosine_ndcg@10 | NanoBEIR_mean_cosine_ndcg@10 |
 |:------:|:----:|:-------------:|:---------------:|:--------------------------:|:---------------------:|:----------------------------:|
-| 0 | 0 | - |
-| 0.2874 | 250 | 3.
-| 0.5747 | 500 | 3.
-| 0.8621 | 750 | 3.
-| 1.1494 | 1000 |
-| 1.4368 | 1250 |
-| 1.7241 | 1500 | 2.
-| 2.0115 | 1750 | 2.
-| 2.2989 | 2000 | 2.
-| 2.5862 | 2250 | 2.
-| 2.8736 | 2500 | 2.
-| 3.1609 | 2750 | 2.
-| 3.4483 | 3000 | 2.
-| 3.7356 | 3250 | 2.


 ### Framework Versions
New version of the changed hunks (lines added in this commit are prefixed with "+"):

@@ -7,112 +7,86 @@ tags:
 - generated_from_trainer
 - dataset_size:111470
 - loss:MultipleNegativesRankingLoss
+base_model: thenlper/gte-small
 widget:
+- source_sentence: why are some rocks radioactive
   sentences:
+  - Radioactive accessory minerals such as zircon may contribute to the radioactivity
+    of a mineral which is otherwise non-radioactive by calculation. Many granites
+    or other igneous rocks contain some radioactivity because of minor, but highly
+    radioactive, accessory minerals.re = mineral density (S Atomic number / Molecular
+    Weight) where re is the electron density in grams/cc.efinition. Radioactivity
+    in minerals are caused by the inclusion of naturally-occurring radioactive elements
+    in the mineral's composition. The degree of radioactivity is dependent on the
+    concentration and isotope present in the mineral.
+  - Taking B-complex vitamins, which include vitamin B12, can cause urine to have
+    a bright yellow or even orange color, but check with your doctor to be sure that's
+    what is going on in your case. B vitamins are water-soluble vitamins, which means
+    that what your body doesn't use is excreted in your urine. Riboflavin (vitamin
+    B2) is especially likely to cause this color change in urine. Several medications
+    can also turn urine a bright yellow or orange color. Changes in urine color may
+    also signal certain health problems.
+  - Radioactive material is just another name for a group of unstable atoms that emit
+    ionizing radiation. These groups of unstable atoms emit radiation because they
+    try to become stable. Radioactive materials emit radiation in a process called
+    radioactive decay.
+- source_sentence: How was your experience of Lucid dreaming at home?
   sentences:
+  - How was your experience of Lucid dreaming at home?
+  - How was your experience of Lucid dreaming outside the home?
+  - Bournemouth /ˈbɔərnməθ/ is a large coastal resort town on the
+    south coast of England directly to the east of the Jurassic Coast, a 96-mile
+    (155 km) World Heritage Site. According to the 2011 census, the town has a population
+    of 183,491 making it the largest settlement in Dorset.he Bournemouth Eye is
+    a helium-filled balloon attached to a steel cable in the town's lower gardens.
+    The spherical balloon is 69 m (226 ft) in circumference and carries an enclosed,
+    steel gondola. Rising to a height of 150 m (492 ft), it provides a panoramic
+    view of the surrounding area for up to 28 passengers.
+- source_sentence: what is iraq's dominant religion
   sentences:
+  - 'If you are working, consider taking maternity leave as early as you can. This
+    makes sense anyway because carrying twins is hard work, and most twins arrive
+    earlier than single babies (NCCWCH 2011: 128) . More than half of twins arrive
+    early, before 37 weeks (NCCWCH 2011: 120, Tamba 2012) .Talk to your midwife or
+    doctor if you are feeling down about your pregnancy (NICE 2011) .f you are working,
+    consider taking maternity leave as early as you can. This makes sense anyway because
+    carrying twins is hard work, and most twins arrive earlier than single babies
+    (NCCWCH 2011: 128) . More than half of twins arrive early, before 37 weeks (NCCWCH
+    2011: 120, Tamba 2012) .'
+  - Introduction. Although Iran’s state religion is Shiite Islam and the
+    majority of its population is ethnically Persian, millions of minorities from
+    various ethnic, religious, and linguistic backgrounds also reside in Iran. Among
+    these groups are ethnic Kurds, Baluchis, and Azeris.lthough Iran’s state
+    religion is Shiite Islam and the majority of its population is ethnically Persian,
+    millions of minorities from various ethnic, religious, and linguistic backgrounds
+    also reside in Iran.
+  - In today's Republic of Iraq, where Islam is the state religion and claims the
+    beliefs of 95 percent of the population, the majority of Iraqis identify with
+    Arab culture. The second-largest cultural group is the Kurds, who are in the highlands
+    and mountain valleys of the north in a politically autonomous settlement.
+- source_sentence: how many years of education are needed to become a pediatric nurse
   sentences:
+  - In terms of educational background, pediatric nurse requirements include either
+    an Associate's or a Bachelor's degree in Nursing. An Associate's degree (ADN)
+    typically takes two years to complete, while a Bachelor's degree (BSN) typically
+    takes four years. ADN programs are usually offered by community colleges.
+  - Photo of Oxford Suites Sonoma County - Rohnert Park - Rohnert Park, CA, United
+    States Photo of Oxford Suites Sonoma County - Rohnert Park - Rohnert Park, CA,
+    United States Living area with king bed by Monique' M. “And there's
+    a complimentary reception with 2 drinks, soup and salad bar nightly.”
+    in 2 reviews
+  - 'From there, additional training specific to the care of children is required.
+    Pediatric nurses can become certified in the field and may choose to further specialize
+    in a particular area. Program Levels: Associate''s degree, bachelor''s degree.'
+- source_sentence: Schliemann recognized five shafts and cleared them like the graves
+    mentioned by Pausanias .
   sentences:
+  - IBM banned the usage of the POWER5+ in its System p5 510Q, 520Q, 550Q and 560Q
+    servers.
+  - Schliemann cleared five shafts and recognized them as the graves mentioned by
+    Pausania .
+  - Schliemann recognized five shafts and cleared them like the graves mentioned by
+    Pausanias .
 pipeline_tag: sentence-similarity
 library_name: sentence-transformers
 metrics:

@@ -132,7 +106,7 @@ metrics:
 - cosine_mrr@10
 - cosine_map@100
 model-index:
+- name: SentenceTransformer based on thenlper/gte-small
   results:
   - task:
       type: information-retrieval

@@ -142,49 +116,49 @@ model-index:
       type: NanoMSMARCO
     metrics:
     - type: cosine_accuracy@1
+      value: 0.24
       name: Cosine Accuracy@1
     - type: cosine_accuracy@3
+      value: 0.46
       name: Cosine Accuracy@3
     - type: cosine_accuracy@5
+      value: 0.54
       name: Cosine Accuracy@5
     - type: cosine_accuracy@10
+      value: 0.66
       name: Cosine Accuracy@10
     - type: cosine_precision@1
+      value: 0.24
       name: Cosine Precision@1
     - type: cosine_precision@3
+      value: 0.15333333333333332
       name: Cosine Precision@3
     - type: cosine_precision@5
+      value: 0.10800000000000001
       name: Cosine Precision@5
     - type: cosine_precision@10
+      value: 0.066
       name: Cosine Precision@10
     - type: cosine_recall@1
+      value: 0.24
       name: Cosine Recall@1
     - type: cosine_recall@3
+      value: 0.46
       name: Cosine Recall@3
     - type: cosine_recall@5
+      value: 0.54
       name: Cosine Recall@5
     - type: cosine_recall@10
+      value: 0.66
       name: Cosine Recall@10
     - type: cosine_ndcg@10
+      value: 0.44291538669701197
       name: Cosine Ndcg@10
     - type: cosine_mrr@10
+      value: 0.3744682539682539
       name: Cosine Mrr@10
     - type: cosine_map@100
+      value: 0.3864402026320918
       name: Cosine Map@100
   - task:
       type: information-retrieval

@@ -194,49 +168,49 @@ model-index:
       type: NanoNQ
     metrics:
     - type: cosine_accuracy@1
+      value: 0.24
       name: Cosine Accuracy@1
     - type: cosine_accuracy@3
+      value: 0.34
       name: Cosine Accuracy@3
     - type: cosine_accuracy@5
+      value: 0.38
       name: Cosine Accuracy@5
     - type: cosine_accuracy@10
+      value: 0.5
       name: Cosine Accuracy@10
     - type: cosine_precision@1
+      value: 0.24
       name: Cosine Precision@1
     - type: cosine_precision@3
+      value: 0.11333333333333333
       name: Cosine Precision@3
     - type: cosine_precision@5
+      value: 0.08
       name: Cosine Precision@5
     - type: cosine_precision@10
+      value: 0.052000000000000005
       name: Cosine Precision@10
     - type: cosine_recall@1
+      value: 0.22
       name: Cosine Recall@1
     - type: cosine_recall@3
+      value: 0.32
       name: Cosine Recall@3
     - type: cosine_recall@5
+      value: 0.37
       name: Cosine Recall@5
     - type: cosine_recall@10
+      value: 0.48
       name: Cosine Recall@10
     - type: cosine_ndcg@10
+      value: 0.3425602412795631
       name: Cosine Ndcg@10
     - type: cosine_mrr@10
+      value: 0.31126984126984125
       name: Cosine Mrr@10
     - type: cosine_map@100
+      value: 0.30988313931877554
       name: Cosine Map@100
   - task:
       type: nano-beir

@@ -246,61 +220,61 @@ model-index:
       type: NanoBEIR_mean
     metrics:
     - type: cosine_accuracy@1
+      value: 0.24
       name: Cosine Accuracy@1
     - type: cosine_accuracy@3
+      value: 0.4
       name: Cosine Accuracy@3
     - type: cosine_accuracy@5
+      value: 0.46
       name: Cosine Accuracy@5
     - type: cosine_accuracy@10
       value: 0.5800000000000001
       name: Cosine Accuracy@10
     - type: cosine_precision@1
+      value: 0.24
       name: Cosine Precision@1
     - type: cosine_precision@3
+      value: 0.13333333333333333
       name: Cosine Precision@3
     - type: cosine_precision@5
+      value: 0.094
       name: Cosine Precision@5
     - type: cosine_precision@10
+      value: 0.059000000000000004
       name: Cosine Precision@10
     - type: cosine_recall@1
+      value: 0.22999999999999998
       name: Cosine Recall@1
     - type: cosine_recall@3
+      value: 0.39
       name: Cosine Recall@3
     - type: cosine_recall@5
+      value: 0.455
       name: Cosine Recall@5
     - type: cosine_recall@10
+      value: 0.5700000000000001
       name: Cosine Recall@10
     - type: cosine_ndcg@10
+      value: 0.39273781398828755
       name: Cosine Ndcg@10
     - type: cosine_mrr@10
+      value: 0.3428690476190476
       name: Cosine Mrr@10
     - type: cosine_map@100
+      value: 0.34816167097543366
       name: Cosine Map@100
 ---

+# SentenceTransformer based on thenlper/gte-small

+This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [thenlper/gte-small](https://huggingface.co/thenlper/gte-small). It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

 ## Model Details

 ### Model Description
 - **Model Type:** Sentence Transformer
+- **Base model:** [thenlper/gte-small](https://huggingface.co/thenlper/gte-small) <!-- at revision 17e1f347d17fe144873b1201da91788898c639cd -->
 - **Maximum Sequence Length:** 128 tokens
 - **Output Dimensionality:** 384 dimensions
 - **Similarity Function:** Cosine Similarity

@@ -342,9 +316,9 @@ from sentence_transformers import SentenceTransformer
 model = SentenceTransformer("redis/model-b-structured")
 # Run inference
 sentences = [
+    'Schliemann recognized five shafts and cleared them like the graves mentioned by Pausanias .',
+    'Schliemann recognized five shafts and cleared them like the graves mentioned by Pausanias .',
+    'Schliemann cleared five shafts and recognized them as the graves mentioned by Pausania .',
 ]
 embeddings = model.encode(sentences)
 print(embeddings.shape)

@@ -353,9 +327,9 @@ print(embeddings.shape)
 # Get the similarity scores for the embeddings
 similarities = model.similarity(embeddings, embeddings)
 print(similarities)
+# tensor([[1.0000, 1.0000, 0.9879],
+#         [1.0000, 1.0000, 0.9879],
+#         [0.9879, 0.9879, 1.0000]])
 ```
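Reviewer note: the updated usage snippet encodes three near-paraphrases, which is why every entry of the printed similarity tensor is 1.0 or ≈0.99. A minimal retrieval-style sketch of the same API, assuming the `redis/model-b-structured` checkpoint shown above (the query and passages are lifted from the card's widget examples and are illustrative only, not part of the card):

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("redis/model-b-structured")

# One widget query from the card, scored against two candidate passages.
query = "why are some rocks radioactive"
passages = [
    "Radioactive material is just another name for a group of unstable atoms that emit ionizing radiation.",
    "Marsupials are any members of the mammalian infraclass Marsupialia.",
]

query_emb = model.encode([query])      # shape: (1, 384)
passage_embs = model.encode(passages)  # shape: (2, 384)

# model.similarity applies the card's similarity function (cosine here).
scores = model.similarity(query_emb, passage_embs)  # shape: (1, 2)
best = int(scores.argmax())
print(passages[best], float(scores[0, best]))
```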

 <!--

@@ -393,21 +367,21 @@ You can finetune this model on your own dataset.

 | Metric | NanoMSMARCO | NanoNQ |
 |:--------------------|:------------|:-----------|
+| cosine_accuracy@1 | 0.24 | 0.24 |
+| cosine_accuracy@3 | 0.46 | 0.34 |
+| cosine_accuracy@5 | 0.54 | 0.38 |
+| cosine_accuracy@10 | 0.66 | 0.5 |
+| cosine_precision@1 | 0.24 | 0.24 |
+| cosine_precision@3 | 0.1533 | 0.1133 |
+| cosine_precision@5 | 0.108 | 0.08 |
+| cosine_precision@10 | 0.066 | 0.052 |
+| cosine_recall@1 | 0.24 | 0.22 |
+| cosine_recall@3 | 0.46 | 0.32 |
+| cosine_recall@5 | 0.54 | 0.37 |
+| cosine_recall@10 | 0.66 | 0.48 |
+| **cosine_ndcg@10** | **0.4429** | **0.3426** |
+| cosine_mrr@10 | 0.3745 | 0.3113 |
+| cosine_map@100 | 0.3864 | 0.3099 |

 #### Nano BEIR

@@ -425,21 +399,21 @@ You can finetune this model on your own dataset.

 | Metric | Value |
 |:--------------------|:-----------|
+| cosine_accuracy@1 | 0.24 |
+| cosine_accuracy@3 | 0.4 |
+| cosine_accuracy@5 | 0.46 |
 | cosine_accuracy@10 | 0.58 |
+| cosine_precision@1 | 0.24 |
+| cosine_precision@3 | 0.1333 |
+| cosine_precision@5 | 0.094 |
+| cosine_precision@10 | 0.059 |
+| cosine_recall@1 | 0.23 |
+| cosine_recall@3 | 0.39 |
+| cosine_recall@5 | 0.455 |
+| cosine_recall@10 | 0.57 |
+| **cosine_ndcg@10** | **0.3927** |
+| cosine_mrr@10 | 0.3429 |
+| cosine_map@100 | 0.3482 |
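Reviewer note: the NanoMSMARCO, NanoNQ, and Nano BEIR tables above come from sentence-transformers' information-retrieval evaluation. A hedged sketch of re-running the same evaluation with `NanoBEIREvaluator` (available in recent sentence-transformers releases; dataset-name spelling and result keys may differ slightly across versions):

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import NanoBEIREvaluator

model = SentenceTransformer("redis/model-b-structured")

# Evaluate on the same two Nano BEIR subsets reported in this card.
evaluator = NanoBEIREvaluator(dataset_names=["msmarco", "nq"])
results = evaluator(model)

# Result keys mirror the table rows, e.g. the bolded headline metric.
print(results["NanoBEIR_mean_cosine_ndcg@10"])
```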
 <!--
 ## Bias, Risks and Limitations

@@ -465,13 +439,13 @@ You can finetune this model on your own dataset.
 | | anchor | positive | negative |
 |:--------|:---|:---|:---|
 | type | string | string | string |
+| details | <ul><li>min: 4 tokens</li><li>mean: 10.95 tokens</li><li>max: 60 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 67.57 tokens</li><li>max: 128 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 66.64 tokens</li><li>max: 128 tokens</li></ul> |
 * Samples:
+  | anchor | positive | negative |
+  |:---|:---|:---|
+  | <code>how far is sandos caracol eco resort from cancun airport</code> | <code>The Sandos Caracol Eco Resort is 2 miles from the Church of Guadalupe and a 45-minute drive from Cancun Cancún. Airport The Gran Coral Golf Riviera maya is located within the same estate as The. Sandos we speak your! Language Hotel: rooms, 680 Hotel: Chain Sandos & Hotels. resorts</code> | <code>Featuring a spa, 8 restaurants and 2 outdoor pools, Sandos Caracol Eco Resort is set on Playa del Carmen Beach, overlooking Cozumel Island. Its rooms have balconies overlooking the Caribbean Sea. Sandos Caracol Eco Resort is in beautiful gardens and features bright accommodations.</code> |
+  | <code>can eggs expire</code> | <code>Here is a link from Georgia Eggs Commission about eggs and expiration dates. The following is from Swedish Medical Center Eggs: If you ve purchased a carton of eggs before the date expires, you should be able to use them safely for three to five weeks after expiration.ere is a link from Georgia Eggs Commission about eggs and expiration dates. The following is from Swedish Medical Center Eggs: If you ve purchased a carton of eggs before the date expires, you should be able to use them safely for three to five weeks after expiration.</code> | <code>The answer to this question may surprise you: while uncooked eggs typically last four to five weeks when properly refrigerated, hard-boiled eggs will only last about a week. This is because egg shells, which are highly porous, are sprayed before sale with a thin coating of mineral oil that seals the egg.</code> |
+  | <code>how old are first graders?</code> | <code>First Grade Worksheets Online. 6 and 7 year old kids get their first taste of real schooling in first grade. Help children learn the basics in math, reading, language and science with our printable first grade worksheets. Spelling Worksheets for 1st Grade.</code> | <code>Average BMI percentile-for-age values were 59.5 (28.8) for first-graders, 59.5 (30.5) for third-graders, and 62.4 (31.7) for fifth-graders. The number of participants classified as obese was 144 (25.6% of first-graders, 28.5% of third-graders, and 34.5% of fifth-graders). The percentage of students who reported a reasonable height or weight ranged from 20% (first grade, height) to 92% (fifth grade, weight) (Table). In general, self-report ability was better in older children and when self-reporting weight.</code> |
 * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
   ```json
   {

@@ -491,13 +465,13 @@ You can finetune this model on your own dataset.
 | | anchor | positive | negative |
 |:--------|:---|:---|:---|
 | type | string | string | string |
+| details | <ul><li>min: 4 tokens</li><li>mean: 11.11 tokens</li><li>max: 66 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 67.99 tokens</li><li>max: 128 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 66.08 tokens</li><li>max: 128 tokens</li></ul> |
 * Samples:
+  | anchor | positive | negative |
+  |:---|:---|:---|
+  | <code>In 1883 , the first schools were built in the vicinity for 400 white and 60 black students .</code> | <code>In 1883 , the first schools were built in the vicinity for 400 white and 60 black students .</code> | <code>In 1883 , the first schools in the area were built for 400 black students and 60 white students .</code> |
+  | <code>what is the origin of the name haja</code> | <code>Haja is a Muslim baby Girl name, it is an Arabic originated name. Haja name meaning is In the heart condition through and the lucky number associated with Haja is 5. Find all the relevant details about the Haja Meaning, Origin, Lucky Number and Religion from this page. Average rating of Haja is 1 stars, based on 0 reviews.</code> | <code>Synonomis with the exclamation commonly used in urban circles Holla. Haba is derived from the term, Holla Bitches, which became Haba Litches, which eventually evolved to Habalicious, and finally became just Haba. When seeing a fine female passing by, Russell exclaimed, Haba.</code> |
+  | <code>what causes itch rash</code> | <code>A rash is a noticeable change in the texture or color of the skin. The skin may become itchy, bumpy, chapped, scaly, or otherwise irritated. Rashes are caused by a wide range of conditions, including allergies, medication, cosmetics, and various diseases. The rash is often reddish and itchy, with a scaly texture. 2 bug bites: tick bites are of particular concern, as they can transmit disease. 3 psoriasis: a scaly, itchy, red rash that forms along the scalp and joints. 4 dandruff: an itchy, flaky rash on the scalp.</code> | <code>Causes of Similar Symptoms to Behind knee rash. Research the causes of these symptoms that are similar to, or related to, the symptom Behind knee rash: 1 Behind knee itch (14 causes). 2 Knee rash (18 causes).3 Knee pain (122 causes). 4 Knee tingling (6 causes). 5 Knee symptoms (149 causes). 6 Skin itch (1068 causes). 7 Skin rash (461 causes). 8 Insect bite.auses of Similar Symptoms to Behind knee rash. Research the causes of these symptoms that are similar to, or related to, the symptom Behind knee rash: 1 Behind knee itch (14 causes). 2 Knee rash (18 causes).</code> |
 * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
   ```json
   {

@@ -656,20 +630,20 @@ You can finetune this model on your own dataset.
 ### Training Logs
 | Epoch | Step | Training Loss | Validation Loss | NanoMSMARCO_cosine_ndcg@10 | NanoNQ_cosine_ndcg@10 | NanoBEIR_mean_cosine_ndcg@10 |
 |:------:|:----:|:-------------:|:---------------:|:--------------------------:|:---------------------:|:----------------------------:|
+| 0 | 0 | - | 4.9037 | 0.6259 | 0.6583 | 0.6421 |
+| 0.2874 | 250 | 3.7176 | 3.0286 | 0.5347 | 0.4519 | 0.4933 |
+| 0.5747 | 500 | 3.111 | 2.9929 | 0.4499 | 0.4041 | 0.4270 |
+| 0.8621 | 750 | 3.0734 | 2.9741 | 0.4816 | 0.3752 | 0.4284 |
+| 1.1494 | 1000 | 3.0287 | 2.9680 | 0.4802 | 0.3422 | 0.4112 |
+| 1.4368 | 1250 | 3.0024 | 2.9618 | 0.4850 | 0.3506 | 0.4178 |
+| 1.7241 | 1500 | 2.9962 | 2.9568 | 0.4677 | 0.3843 | 0.4260 |
+| 2.0115 | 1750 | 2.9903 | 2.9532 | 0.4694 | 0.3430 | 0.4062 |
+| 2.2989 | 2000 | 2.9473 | 2.9527 | 0.4446 | 0.3497 | 0.3972 |
+| 2.5862 | 2250 | 2.9458 | 2.9519 | 0.4300 | 0.3340 | 0.3820 |
+| 2.8736 | 2500 | 2.9392 | 2.9500 | 0.4570 | 0.3466 | 0.4018 |
+| 3.1609 | 2750 | 2.9292 | 2.9500 | 0.4529 | 0.3474 | 0.4001 |
+| 3.4483 | 3000 | 2.9165 | 2.9502 | 0.4507 | 0.3365 | 0.3936 |
+| 3.7356 | 3250 | 2.9151 | 2.9491 | 0.4429 | 0.3426 | 0.3927 |


 ### Framework Versions
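Reviewer note: the card trains with `MultipleNegativesRankingLoss` over (anchor, positive, negative) triplets, with the loss-parameter JSON truncated in this view. A minimal fine-tuning sketch under those assumptions, using a tiny inline dataset for illustration (the real run used the 111,470-example set summarized in the dataset tables above):

```python
from datasets import Dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer
from sentence_transformers.losses import MultipleNegativesRankingLoss

# Start from the base model named in the card.
model = SentenceTransformer("thenlper/gte-small")

# Illustrative triplets in the (anchor, positive, negative) format shown above.
train_dataset = Dataset.from_dict({
    "anchor": ["can eggs expire", "how old are first graders?"],
    "positive": [
        "Eggs can be used safely for three to five weeks after the date expires.",
        "6 and 7 year old kids get their first taste of real schooling in first grade.",
    ],
    "negative": [
        "Hard-boiled eggs will only last about a week in the refrigerator.",
        "Average BMI percentile-for-age values were 59.5 for first-graders.",
    ],
})

# In-batch negatives ranking loss, as listed in the card's tags.
loss = MultipleNegativesRankingLoss(model)

trainer = SentenceTransformerTrainer(model=model, train_dataset=train_dataset, loss=loss)
trainer.train()
```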
config_sentence_transformers.json
CHANGED

@@ -1,10 +1,10 @@
 {
+  "model_type": "SentenceTransformer",
   "__version__": {
     "sentence_transformers": "5.2.0",
     "transformers": "4.57.3",
     "pytorch": "2.9.1+cu128"
   },
-  "model_type": "SentenceTransformer",
   "prompts": {
     "query": "",
     "document": ""