---
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- dense
- generated_from_trainer
- dataset_size:111470
- loss:MultipleNegativesRankingLoss
base_model: thenlper/gte-small
widget:
- source_sentence: when was the first elephant brought to america
  sentences:
  - Old Bet The first elephant brought to the United States was in 1796, aboard the
    America which set sail from Calcutta for New York on December 3, 1795.[4] However,
    it is not certain that this was Old Bet.[2] The first references to Old Bet start
    in 1804 in Boston as part of a menagerie.[1] In 1808, while residing in Somers,
    New York, Hachaliah Bailey purchased the menagerie elephant for $1,000 and named
    it "Old Bet".[5][6]
  - Cronus Rhea secretly gave birth to Zeus in Crete, and handed Cronus a stone wrapped
    in swaddling clothes, also known as the Omphalos Stone, which he promptly swallowed,
    thinking that it was his son.
  - Renal artery One or two accessory renal arteries are frequently found, especially
    on the left side since they usually arise from the aorta, and may come off above
    (more common) or below the main artery. Instead of entering the kidney at the
    hilus, they usually pierce the upper or lower part of the organ.
- source_sentence: who won the india's next superstar grand finale
  sentences:
  - India's Next Superstars India's Next Superstars is a talent-search Indian reality
    TV show, which premiered on Star Plus and is streamed on Hotstar.[1] Karan Johar
    and Rohit Shetty are the judges for the show. [2] Aman Gandotra and Natasha Bharadwaj
    were declared winners of 2018 season. Shruti Sharma won a 'Special Mention' award.
    Runners up in the male category were Aashish Mehrotra and Harshvardhan Deo and
    in the female category were Naina Singh and Shruti Sharma. [3]
  - India national cricket team India was invited to The Imperial Cricket Council
    in 1926, and made their debut as a Test playing nation in England in 1932, led
    by CK Nayudu, who was considered as the best Indian batsman at the time.[14] The
    one-off Test match between the two sides was played at Lord's in London. The team
    was not strong in their batting at this point and went on to lose by 158 runs.[15]
    In 1933, the first Test series in India was played between India and England with
    matches in Bombay, Calcutta (now Kolkata) and Madras (now Chennai). England won
    the series 2–0.[16] The Indian team continued to improve throughout the 1930s
    and '40s but did not achieve an international victory during this period. In the
    early 1940s, India didn't play any Test cricket due to the Second World War. The
    team's first series as an independent country was in late 1947 against Sir Donald
    Bradman's Invincibles (a name given to the Australia national cricket team of
    that time). It was also the first Test series India played which was not against
    England. Australia won the five-match series 4–0, with Bradman tormenting the
    Indian bowling in his final Australian summer.[17] India subsequently played their
    first Test series at home not against England against the West Indies in 1948.
    West Indies won the 5-Test series 1–0.[18]
  - Hindi Medium (film) Raj Batra (Irrfan Khan) is a rich businessman from Delhi staying
    with his wife Mita (Saba Qamar). They studied in a Hindi Medium school but wants
    their 5 year old daughter, Pia (Dishita Sehgal), to be admitted to one of the
    top schools in Delhi. The top school, 'Delhi Grammar School', has a condition
    that they will admit students who reside within 3km radius, so the family moves
    to Vasant Vihar.
- source_sentence: i am human and nothing of that which is human is alien to me meaning
  sentences:
  - America's Got Talent Introduced in season nine, the "Golden Buzzer" is located
    on the center of the judges' desk and may be used once per season by each judge.
    In season 9, a judge could press the golden buzzer to save an act from elimination,
    regardless of the number of X's earned from the other judges. Starting in season
    10 and onward, any act that receives a golden buzzer advances directly to the
    live show; and in season 11, the hosts also were given the power to use the golden
    buzzer. The golden buzzer is also used in the Judge Cuts format.
  - You'll Never Walk Alone "You'll Never Walk Alone" is a show tune from the 1945
    Rodgers and Hammerstein musical Carousel. In the second act of the musical, Nettie
    Fowler, the cousin of the female protagonist Julie Jordan, sings "You'll Never
    Walk Alone" to comfort and encourage Julie when her husband, Billy Bigelow, the
    male lead, commits suicide after a failed robbery attempt. It is reprised in the
    final scene to encourage a graduation class of which Louise (Billy and Julie's
    daughter) is a member. The now invisible Billy, who has been granted the chance
    to return to Earth for one day in order to redeem himself, watches the ceremony
    and is able to silently motivate the unhappy Louise to join in the song.
  - 'Terence One famous quotation by Terence reads: "Homo sum, humani nihil a me alienum
    puto", or "I am human, and I think that nothing of that which is human is alien
    to me." This appeared in his play Heauton Timorumenos.'
- source_sentence: what do glial cells do in the brain
  sentences:
  - 'Neuroglia Neuroglia, also called glial cells or simply glia, are non-neuronal
    cells in the central nervous system (brain and spinal cord) and the peripheral
    nervous system. They maintain homeostasis, form myelin, and provide support and
    protection for neurons.[1] In the central nervous system, glial cells include
    oligodendrocytes, astrocytes, ependymal cells and microglia, and in the peripheral
    nervous system glial cells include Schwann cells and satellite cells. They have
    four main functions: (1) To surround neurons and hold them in place (2) To supply
    nutrients and oxygen to neurons (3) To insulate one neuron from another (4) To
    destroy pathogens and remove dead neurons. They also play a role in neurotransmission
    and synaptic connections,[2] and in physiological processes like breathing,[3][4]
    .'
  - The Mother (How I Met Your Mother) Tracy McConnell, better known as "The Mother",
    is the title character from the CBS television sitcom How I Met Your Mother. The
    show, narrated by Future Ted, tells the story of how Ted Mosby met The Mother.
    Tracy McConnell appears in 8 episodes from "Lucky Penny" to "The Time Travelers"
    as an unseen character; she was first seen fully in "Something New" and was promoted
    to a main character in season 9. The Mother is played by Cristin Milioti.
  - Marsupial Marsupials are any members of the mammalian infraclass Marsupialia.
    All extant marsupials are endemic to Australasia and the Americas. A distinctive
    characteristic common to these species is that most of the young are carried in
    a pouch. Well-known marsupials include kangaroos, wallabies, koalas, possums,
    opossums, wombats, and Tasmanian devils. Some lesser-known marsupials are the
    potoroo and the quokka.
- source_sentence: 'It was Easipower that said :'
  sentences:
  - United States presidential election However, federal law does specify that all
    electors must be selected on the same day, which is "the first Tuesday after the
    first Monday in November," i.e., a Tuesday no earlier than November 2 and no later
    than November 8.[17] Today, the states and the District of Columbia each conduct
    their own popular elections on Election Day to help determine their respective
    slate of electors. Thus, the presidential election is really an amalgamation of
    separate and simultaneous state elections instead of a single national election
    run by the federal government.
  - It is said that Easipower was ,
  - 'It was Easipower that said :'
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
- cosine_accuracy@1
- cosine_accuracy@3
- cosine_accuracy@5
- cosine_accuracy@10
- cosine_precision@1
- cosine_precision@3
- cosine_precision@5
- cosine_precision@10
- cosine_recall@1
- cosine_recall@3
- cosine_recall@5
- cosine_recall@10
- cosine_ndcg@10
- cosine_mrr@10
- cosine_map@100
model-index:
- name: SentenceTransformer based on thenlper/gte-small
  results:
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: NanoMSMARCO
      type: NanoMSMARCO
    metrics:
    - type: cosine_accuracy@1
      value: 0.16
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.32
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.4
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.54
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.16
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.10666666666666666
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.08
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.05400000000000001
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.16
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.32
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.4
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.54
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.32698862634234876
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.2620793650793651
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.2747949118190278
      name: Cosine Map@100
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: NanoNQ
      type: NanoNQ
    metrics:
    - type: cosine_accuracy@1
      value: 0.22
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.44
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.52
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.64
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.22
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.14666666666666664
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.10400000000000001
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.06400000000000002
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.22
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.42
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.49
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.6
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.39877036805974797
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.3438015873015873
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.3445409270682024
      name: Cosine Map@100
  - task:
      type: nano-beir
      name: Nano BEIR
    dataset:
      name: NanoBEIR mean
      type: NanoBEIR_mean
    metrics:
    - type: cosine_accuracy@1
      value: 0.19
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.38
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.46
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.5900000000000001
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.19
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.12666666666666665
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.092
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.05900000000000001
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.19
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.37
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.445
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.5700000000000001
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.3628794972010484
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.30294047619047615
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.3096679194436151
      name: Cosine Map@100
---

# SentenceTransformer based on thenlper/gte-small

This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [thenlper/gte-small](https://huggingface.co/thenlper/gte-small). It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

## Model Details

### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [thenlper/gte-small](https://huggingface.co/thenlper/gte-small) <!-- at revision 17e1f347d17fe144873b1201da91788898c639cd -->
- **Maximum Sequence Length:** 128 tokens
- **Output Dimensionality:** 384 dimensions
- **Similarity Function:** Cosine Similarity
<!-- - **Training Dataset:** Unknown -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->

### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/huggingface/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 128, 'do_lower_case': False, 'architecture': 'BertModel'})
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
```
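The `Pooling` and `Normalize` modules above are simple post-processing steps, so the same embeddings can also be reproduced with plain `transformers`. A minimal sketch of the equivalent computation (mean pooling over attention-masked tokens, then L2 normalization); the example sentence is taken from the usage section below:

```python
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("redis/model-b-structured")
model = AutoModel.from_pretrained("redis/model-b-structured")

batch = tokenizer(
    ["what do glial cells do in the brain"],
    padding=True, truncation=True, max_length=128, return_tensors="pt",
)

with torch.no_grad():
    token_embeddings = model(**batch).last_hidden_state  # (batch, seq_len, 384)

# Pooling: mean over real tokens only, using the attention mask to drop padding.
mask = batch["attention_mask"].unsqueeze(-1).float()
embeddings = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)

# Normalize: unit-length vectors, so dot products equal cosine similarities.
embeddings = F.normalize(embeddings, p=2, dim=1)
print(embeddings.shape)  # torch.Size([1, 384])
```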

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("redis/model-b-structured")
# Run inference
sentences = [
    'It was Easipower that said :',
    'It was Easipower that said :',
    'It is said that Easipower was ,',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 384)

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities)
# tensor([[1.0001, 1.0001, 0.4386],
#         [1.0001, 1.0001, 0.4386],
#         [0.4386, 0.4386, 1.0000]])
```
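Because the model was trained with an in-batch ranking loss, a natural downstream use is semantic search: embed a query and a passage collection, then rank passages by cosine similarity. A short sketch with a made-up two-passage corpus:

```python
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("redis/model-b-structured")

# Hypothetical mini-corpus; replace with your own passages.
corpus = [
    "Neuroglia maintain homeostasis, form myelin, and support neurons.",
    "Bonnie Tyler is a Welsh singer known for her distinctive husky voice.",
]
corpus_embeddings = model.encode(corpus, convert_to_tensor=True)

query_embedding = model.encode(
    "what do glial cells do in the brain", convert_to_tensor=True
)

# Returns one ranked hit list per query; each hit has a corpus_id and a score.
hits = util.semantic_search(query_embedding, corpus_embeddings, top_k=2)[0]
for hit in hits:
    print(f"{hit['score']:.3f}  {corpus[hit['corpus_id']]}")
```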

<!--
### Direct Usage (Transformers)

<details><summary>Click to see the direct usage in Transformers</summary>

</details>
-->

<!--
### Downstream Usage (Sentence Transformers)

You can finetune this model on your own dataset.

<details><summary>Click to expand</summary>

</details>
-->

<!--
### Out-of-Scope Use

*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->

## Evaluation

### Metrics

#### Information Retrieval

* Datasets: `NanoMSMARCO` and `NanoNQ`
* Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)

| Metric              | NanoMSMARCO | NanoNQ     |
|:--------------------|:------------|:-----------|
| cosine_accuracy@1   | 0.16        | 0.22       |
| cosine_accuracy@3   | 0.32        | 0.44       |
| cosine_accuracy@5   | 0.4         | 0.52       |
| cosine_accuracy@10  | 0.54        | 0.64       |
| cosine_precision@1  | 0.16        | 0.22       |
| cosine_precision@3  | 0.1067      | 0.1467     |
| cosine_precision@5  | 0.08        | 0.104      |
| cosine_precision@10 | 0.054       | 0.064      |
| cosine_recall@1     | 0.16        | 0.22       |
| cosine_recall@3     | 0.32        | 0.42       |
| cosine_recall@5     | 0.4         | 0.49       |
| cosine_recall@10    | 0.54        | 0.6        |
| **cosine_ndcg@10**  | **0.327**   | **0.3988** |
| cosine_mrr@10       | 0.2621      | 0.3438     |
| cosine_map@100      | 0.2748      | 0.3445     |

#### Nano BEIR

* Dataset: `NanoBEIR_mean`
* Evaluated with [<code>NanoBEIREvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.NanoBEIREvaluator) with these parameters:
  ```json
  {
      "dataset_names": [
          "msmarco",
          "nq"
      ],
      "dataset_id": "lightonai/NanoBEIR-en"
  }
  ```

| Metric              | Value      |
|:--------------------|:-----------|
| cosine_accuracy@1   | 0.19       |
| cosine_accuracy@3   | 0.38       |
| cosine_accuracy@5   | 0.46       |
| cosine_accuracy@10  | 0.59       |
| cosine_precision@1  | 0.19       |
| cosine_precision@3  | 0.1267     |
| cosine_precision@5  | 0.092      |
| cosine_precision@10 | 0.059      |
| cosine_recall@1     | 0.19       |
| cosine_recall@3     | 0.37       |
| cosine_recall@5     | 0.445      |
| cosine_recall@10    | 0.57       |
| **cosine_ndcg@10**  | **0.3629** |
| cosine_mrr@10       | 0.3029     |
| cosine_map@100      | 0.3097     |
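These numbers should be approximately reproducible by re-running the evaluator. A sketch, assuming the `NanoBEIREvaluator` API of recent sentence-transformers releases (exact result keys and values may differ slightly across library versions):

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import NanoBEIREvaluator

model = SentenceTransformer("redis/model-b-structured")

# Restrict the benchmark to the two subsets reported in this card.
evaluator = NanoBEIREvaluator(dataset_names=["msmarco", "nq"])
results = evaluator(model)
print(results["NanoBEIR_mean_cosine_ndcg@10"])
```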

<!--
## Bias, Risks and Limitations

*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->

<!--
### Recommendations

*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->

## Training Details

### Training Dataset

#### Unnamed Dataset

* Size: 111,470 training samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor                                                                            | positive                                                                           | negative                                                                           |
  |:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                             | string                                                                             |
  | details | <ul><li>min: 6 tokens</li><li>mean: 13.22 tokens</li><li>max: 44 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 90.67 tokens</li><li>max: 128 tokens</li></ul> | <ul><li>min: 7 tokens</li><li>mean: 89.65 tokens</li><li>max: 128 tokens</li></ul> |
* Samples:
  | anchor                                                                                                  | positive                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                            | negative                                                                                                                                                                                                                                                                                                                                                                                                                               |
  |:--------------------------------------------------------------------------------------------------------|:--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>which state is home to the arizona ice tea beverage company</code>                                | <code>Arizona Beverage Company Arizona Beverages USA (stylized as AriZona) is an American producer of many flavors of iced tea, juice cocktails and energy drinks based in Woodbury, New York.[2] Arizona's first product was made available in 1992.</code>                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                        | <code>Arya Vaishya Arya Vaishya (Arya Vysya) is an Indian caste. Orthodox Arya Vaishyas follow rituals prescribed in the Vasavi Puranam, a religious text written in the late Middle Ages. Their kuladevata is Vasavi. The community were formerly known as Komati Chettiars in Tamil Nadu but now prefer to be referred to as Arya Vaishya.[1]</code>                                                                                 |
  | <code>when were afro-american and africana studies programs founded in colleges and universities</code> | <code>African-American studies Programs and departments of African-American studies were first created in the 1960s and 1970s as a result of inter-ethnic student and faculty activism at many universities, sparked by a five-month strike for black studies at San Francisco State. In February 1968, San Francisco State hired sociologist Nathan Hare to coordinate the first black studies program and write a proposal for the first Department of Black Studies; the department was created in September 1968 and gained official status at the end of the five-months strike in the spring of 1969. The creation of programs and departments in Black studies was a common demand of protests and sit-ins by minority students and their allies, who felt that their cultures and interests were underserved by the traditional academic structures.</code> | <code>Maze Runner: The Death Cure Maze Runner: The Death Cure was originally set to be released on February 17, 2017, in the United States by 20th Century Fox, but the studio rescheduled the film's release for January 26, 2018 in theatres and IMAX, allowing time for O'Brien to recover from injuries he sustained during filming. The film received mixed reviews from critics and grossed over $284 million worldwide.</code> |
  | <code>who recorded the song total eclipse of the heart</code>                                           | <code>Bonnie Tyler Bonnie Tyler (born Gaynor Hopkins; 8 June 1951) is a Welsh singer, known for her distinctive husky voice. Tyler came to prominence with the release of her 1977 album The World Starts Tonight and its singles "Lost in France" and "More Than a Lover". Her 1978 single "It's a Heartache" reached number four on the UK Singles Chart, and number three on the US Billboard Hot 100.</code>                                                                                                                                                                                                                                                                                                                                                                                                                                                    | <code>Manny Pacquiao vs. Juan Manuel Márquez IV Marquez defeated Pacquiao by knockout with one second remaining in the sixth round. It was named Fight of the Year and Knockout of the Year by Ring Magazine, with round five garnering Round of the Year honors.[2]</code>                                                                                                                                                            |
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 3.0,
      "similarity_fct": "cos_sim",
      "gather_across_devices": false
  }
  ```
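For reference, this corresponds roughly to the following loss construction (a sketch; the original training script is not included in this card). Note that `scale: 3.0` is well below the library default of 20.0, which makes the softmax over in-batch negatives considerably softer:

```python
from sentence_transformers import SentenceTransformer, util
from sentence_transformers.losses import MultipleNegativesRankingLoss

model = SentenceTransformer("thenlper/gte-small")

# With (anchor, positive, negative) triplets, each anchor's positive is ranked
# against its explicit negative plus all other passages in the batch.
loss = MultipleNegativesRankingLoss(model, scale=3.0, similarity_fct=util.cos_sim)
```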

### Evaluation Dataset

#### Unnamed Dataset

* Size: 12,386 evaluation samples
* Columns: <code>anchor</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | anchor                                                                            | positive                                                                           | negative                                                                           |
  |:--------|:----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------|
  | type    | string                                                                            | string                                                                             | string                                                                             |
  | details | <ul><li>min: 6 tokens</li><li>mean: 13.03 tokens</li><li>max: 44 tokens</li></ul> | <ul><li>min: 6 tokens</li><li>mean: 89.36 tokens</li><li>max: 128 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 88.87 tokens</li><li>max: 128 tokens</li></ul> |
* Samples:
  | anchor                                                                                                                                  | positive                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                         | negative                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                        |
  |:----------------------------------------------------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
  | <code>In early July , Steve Whitley , the criminal father of Harper Whitley and Garrett Whitley , and brother of Benny Cameron .</code> | <code>In early July , Steve Whitley , the criminal father of Harper Whitley and Garrett Whitley , and brother of Benny Cameron .</code>                                                                                                                                                                                                                                                                                                                                                                                                                          | <code>In early July , Garrett Whitley , who is the criminal father of Harper Whitley and Steve Whitley , and the brother of Benny Cameron , appeared .</code>                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                   |
  | <code>when will the next season of house of cards be released</code>                                                                    | <code>House of Cards (season 6) The sixth and final season of the American political drama web television series House of Cards was confirmed by Netflix on December 4, 2017, and is scheduled to be released on November 2, 2018. Unlike previous seasons that consisted of thirteen episodes each, the sixth season will consist of only eight. The season will not include former lead actor Kevin Spacey, who was fired from the show due to sexual misconduct allegations.</code>                                                                           | <code>Wild 'n Out For the first four seasons, the show filmed from Los Angeles/Hollywood and aired on MTV. The first run episodes were suspended as Mr. Renaissance Entertainment became Ncredible Entertainment in 2012. Upon being revived in 2012, the show was produced in New York City and aired on MTV2 during Seasons 5–7, it also returned to that location for Season 9. In 2016, the show returned to airing new episodes on MTV and also for the first time since Season 4, production is in Los Angeles.</code>                                                                                                                                                                                                                                                                                                                                                                  |
  | <code>who played the father on father knows best</code>                                                                                 | <code>Father Knows Best The series began August 25, 1949, on NBC Radio. Set in the Midwest, it starred Robert Young as the General Insurance agent Jim Anderson. His wife Margaret was first portrayed by June Whitley and later by Jean Vander Pyl. The Anderson children were Betty (Rhoda Williams), Bud (Ted Donaldson), and Kathy (Norma Jean Nilsson). Others in the cast were Eleanor Audley, Herb Vigran and Sam Edwards. Sponsored through most of its run by General Foods, the series was heard Thursday evenings on NBC until March 25, 1954.</code> | <code>List of To Kill a Mockingbird characters Maycomb children believe he is a horrible person, due to the rumors spread about him and a trial he underwent as a teenager. It is implied during the story that Boo is a very lonely man who attempts to reach out to Jem and Scout for love and friendship, such as leaving them small gifts and figures in a tree knothole. Scout finally meets him at the very end of the book, when he saves the children's lives from Bob Ewell. Scout describes him as being sickly white, with a thin mouth, thin and feathery hair, and grey eyes, almost as if he were blind. During the same night, when Boo whispers to Scout to walk him back to the Radley house, Scout takes a moment to picture what it would be like to be Boo Radley. While standing on his porch, she realizes his "exile" inside his house is really not that lonely.</code> |
* Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 3.0,
      "similarity_fct": "cos_sim",
      "gather_across_devices": false
  }
  ```

### Training Hyperparameters
#### Non-Default Hyperparameters

- `eval_strategy`: steps
- `per_device_train_batch_size`: 128
- `per_device_eval_batch_size`: 128
- `learning_rate`: 8e-05
- `weight_decay`: 0.005
- `max_steps`: 3375
- `warmup_ratio`: 0.1
- `fp16`: True
- `dataloader_drop_last`: True
- `dataloader_num_workers`: 1
- `dataloader_prefetch_factor`: 1
- `load_best_model_at_end`: True
- `optim`: adamw_torch
- `ddp_find_unused_parameters`: False
- `push_to_hub`: True
- `hub_model_id`: redis/model-b-structured
- `eval_on_start`: True
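
These map onto a trainer setup along the following lines (a sketch, assuming the `SentenceTransformerTrainer` API; only the non-default values above are spelled out, and `output_dir` is a hypothetical placeholder):

```python
from sentence_transformers import (
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)

args = SentenceTransformerTrainingArguments(
    output_dir="output",  # hypothetical; not stated in this card
    eval_strategy="steps",
    per_device_train_batch_size=128,
    per_device_eval_batch_size=128,
    learning_rate=8e-5,
    weight_decay=0.005,
    max_steps=3375,
    warmup_ratio=0.1,
    fp16=True,
    dataloader_drop_last=True,
    dataloader_num_workers=1,
    dataloader_prefetch_factor=1,
    load_best_model_at_end=True,
    optim="adamw_torch",
    ddp_find_unused_parameters=False,
    push_to_hub=True,
    hub_model_id="redis/model-b-structured",
    eval_on_start=True,
)

# trainer = SentenceTransformerTrainer(
#     model=model, args=args,
#     train_dataset=train_dataset, eval_dataset=eval_dataset, loss=loss,
# )
# trainer.train()
```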

#### All Hyperparameters
<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: steps
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 128
- `per_device_eval_batch_size`: 128
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 8e-05
- `weight_decay`: 0.005
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 3.0
- `max_steps`: 3375
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.1
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `bf16`: False
- `fp16`: True
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: True
- `dataloader_num_workers`: 1
- `dataloader_prefetch_factor`: 1
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: True
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `parallelism_config`: None
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `project`: huggingface
- `trackio_space_id`: trackio
- `ddp_find_unused_parameters`: False
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: True
- `resume_from_checkpoint`: None
- `hub_model_id`: redis/model-b-structured
- `hub_strategy`: every_save
- `hub_private_repo`: None
- `hub_always_push`: False
- `hub_revision`: None
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `include_for_metrics`: []
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`: 
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: no
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: True
- `use_liger_kernel`: False
- `liger_kernel_config`: None
- `eval_use_gather_object`: False
- `average_tokens_across_devices`: True
- `prompts`: None
- `batch_sampler`: batch_sampler
- `multi_dataset_batch_sampler`: proportional
- `router_mapping`: {}
- `learning_rate_mapping`: {}

</details>

### Training Logs
| Epoch  | Step | Training Loss | Validation Loss | NanoMSMARCO_cosine_ndcg@10 | NanoNQ_cosine_ndcg@10 | NanoBEIR_mean_cosine_ndcg@10 |
|:------:|:----:|:-------------:|:---------------:|:--------------------------:|:---------------------:|:----------------------------:|
| 0      | 0    | -             | 4.9445          | 0.6259                     | 0.6583                | 0.6421                       |
| 0.2874 | 250  | 3.6887        | 3.0013          | 0.4676                     | 0.4424                | 0.4550                       |
| 0.5747 | 500  | 3.0661        | 2.9415          | 0.4647                     | 0.4688                | 0.4667                       |
| 0.8621 | 750  | 3.0125        | 2.9161          | 0.3994                     | 0.4479                | 0.4237                       |
| 1.1494 | 1000 | 2.9691        | 2.9044          | 0.3845                     | 0.4090                | 0.3968                       |
| 1.4368 | 1250 | 2.9407        | 2.8981          | 0.3614                     | 0.3858                | 0.3736                       |
| 1.7241 | 1500 | 2.9321        | 2.8893          | 0.3182                     | 0.3811                | 0.3496                       |
| 2.0115 | 1750 | 2.9227        | 2.8817          | 0.3444                     | 0.3973                | 0.3708                       |
| 2.2989 | 2000 | 2.8854        | 2.8807          | 0.3088                     | 0.3730                | 0.3409                       |
| 2.5862 | 2250 | 2.8832        | 2.8744          | 0.3251                     | 0.3968                | 0.3610                       |
| 2.8736 | 2500 | 2.8857        | 2.8730          | 0.3504                     | 0.4101                | 0.3802                       |
| 3.1609 | 2750 | 2.8677        | 2.8714          | 0.3233                     | 0.4021                | 0.3627                       |
| 3.4483 | 3000 | 2.8600        | 2.8697          | 0.3239                     | 0.4106                | 0.3673                       |
| 3.7356 | 3250 | 2.8584        | 2.8686          | 0.3270                     | 0.3988                | 0.3629                       |


### Framework Versions
- Python: 3.10.18
- Sentence Transformers: 5.2.0
- Transformers: 4.57.3
- PyTorch: 2.9.1+cu128
- Accelerate: 1.12.0
- Datasets: 2.21.0
- Tokenizers: 0.22.1

## Citation

### BibTeX

#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

#### MultipleNegativesRankingLoss
```bibtex
@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply},
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}
```

<!--
## Glossary

*Clearly define terms in order to be accessible across audiences.*
-->

<!--
## Model Card Authors

*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->

<!--
## Model Card Contact

*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->