tien314 committed
Commit ca201f8 · verified · Parent: 5ac8027

mtien/miriad-embedding
1_Pooling/config.json ADDED
```json
{
  "word_embedding_dimension": 768,
  "pooling_mode_cls_token": false,
  "pooling_mode_mean_tokens": true,
  "pooling_mode_max_tokens": false,
  "pooling_mode_mean_sqrt_len_tokens": false,
  "pooling_mode_weightedmean_tokens": false,
  "pooling_mode_lasttoken": false,
  "include_prompt": true
}
```
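The config above enables only mean pooling (`pooling_mode_mean_tokens: true`): token vectors from the transformer are averaged, with padding positions excluded via the attention mask. A minimal numpy sketch of that operation (illustrative only, not the library's implementation; the toy shapes are assumptions):

```python
import numpy as np

def mean_pool(token_embeddings: np.ndarray, attention_mask: np.ndarray) -> np.ndarray:
    """Average token vectors, skipping padding positions.

    token_embeddings: (batch, seq_len, hidden) transformer output
    attention_mask:   (batch, seq_len), 1 for real tokens, 0 for padding
    """
    mask = attention_mask[:, :, None].astype(float)   # (batch, seq_len, 1)
    summed = (token_embeddings * mask).sum(axis=1)    # (batch, hidden)
    counts = np.clip(mask.sum(axis=1), 1e-9, None)    # avoid division by zero
    return summed / counts

# Toy batch: 2 sequences, 3 tokens, hidden size 4 (the real model uses 768)
emb = np.ones((2, 3, 4))
mask = np.array([[1, 1, 0], [1, 1, 1]])
print(mean_pool(emb, mask).shape)  # (2, 4)
```

In the deployed model this module sits between the MPNet transformer and a final `Normalize()` step.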
README.md ADDED
---
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- dense
- generated_from_trainer
- dataset_size:2000
- loss:MatryoshkaLoss
- loss:MultipleNegativesRankingLoss
base_model: sentence-transformers/all-mpnet-base-v2
widget:
- source_sentence: "What methods have been attempted to improve resin bond strength to irradiated dentin?\n"
  sentences:
  - Patients with BHD syndrome may have concerns about communicating genetic risk to their family members, especially if their family has different communication patterns or cultural norms. Some patients may find it difficult to share information about an inherited, potentially lethal disorder with their family members. It is observed that families in which affected members have experienced significant morbidity are more likely to pursue genetic testing and surveillance. However, this phenomenon has not been systematically studied in the BHD population. Patients may also worry that their family members are not motivated to pursue genetic testing and surveillance. In these situations, patients can share medical papers and handouts with their family members and inform them about the process to obtain genetic testing. Additionally, patients can encourage their family members to attend scientific meetings and connect with other BHD families through resources like the Myrovlytis website. Cancer Genetic Counselors (CGC) and/or Advanced Practice Nurses in Genetics (APNG) can also provide support and guidance to patients and their families in coping with the psychosocial ramifications of BHD.
  - Psychological stress has been found to have a significant impact on medical illness, including ocular disease. While vision researchers have not fully embraced the approach of psychoneuroimmunology in addressing ocular disease, it is clear that no organ system is protected from the effects of negative emotional states. Stress is more prevalent among the elderly, and conditions such as retirement, chronic illness, loss of loved ones, and caregiver's stress can induce chronic debilitating stress. Ophthalmologists should prioritize time with patients to establish a compassionate rapport and address emotional factors that may contribute to ocular conditions. Failure to do so compromises the individual's opportunity for healing.
  - Many researchers have attempted to improve resin bond strength to irradiated dentin by removing the denatured layer mechanically and chemically. However, efficient methods for clinical application have not yet been established. The reduction of dentin bonding strength is believed to be due to the denatured layer of dentin surface, which has led to the exploration of various techniques to remove or mitigate its effects.
- source_sentence: "What are the clinical features of peripheral ossifying fibroma?\n"
  sentences:
  - The management of intracranial hemorrhage after thrombolysis is still uncertain. It is unclear whether patients with severe intracranial hemorrhage soon after thrombolytic therapy should receive only supportive medical care or should be aggressively managed with treatment of increased intracranial pressure, ventriculostomy, or neurosurgical evacuation. The use of clinical decision-making aids, such as Figure 1, may assist clinicians in making empirical decisions for these patients.
  - When the diagnosis of HIT is confirmed, therapeutic doses of alternative non-heparin anticoagulants are usually required. Heparin treatments must be stopped immediately, including heparin-bonded catheters and heparin flushes. Patients should be given a non-heparin anticoagulant such as direct thrombin inhibitors like Bivalirudin, Argatroban, or Lepirudin. These inhibitors directly inhibit the actions of thrombin and do not require a cofactor. They are active against both free and clot-bound thrombin and do not interact with or produce heparin-dependent antibodies.
  - Histopathological evaluation of biopsy specimens of peripheral ossifying fibroma typically reveals intact or ulcerated stratified squamous surface epithelium, potentially mature mineralized material, epithelial proliferation, benign fibrous connective tissue with varying fibroblast content, myofibroblasts and collagen, lamellar or woven osteoid, and cement-like material or dystrophic calcifications. The presence of acute and chronic inflammatory cells may also be observed.
- source_sentence: "What are the common clinical features and diagnostic criteria of relapsing polychondritis?\n"
  sentences:
  - Lethal complications of relapsing polychondritis are often associated with airway or cardiovascular involvement. This can include complications such as aortic incompetence, mitral regurgitation, pericarditis, cardiac ischemia, aneurysms of large arteries, vasculitis of the central nervous system, phlebitis, and Raynaud's phenomenon. Neurological and renal system involvement can also occur, although it is rare. Regular follow-up and management are important to monitor and prevent potential complications in patients with relapsing polychondritis.
  - Media focus can contribute to the risk of burnout in managers. Burnout is a prolonged response to chronic emotional and interpersonal stressors at work. The pressure and scrutiny from the media can lead to feelings of exhaustion, cynicism, and inefficacy, which are the three dimensions of burnout. Managers may respond to increased pressure by becoming avoidant, narrow-minded, and hard on themselves, their subordinates, and their families. They may also try to establish emotional and cognitive distance from the pressuring situation. Ultimately, the exposure to negative media focus with elements of personification can increase the risk of burnout in some managers.
  - Intrathymic injection of MBP has potential applications in various medical treatments. It can be used in surgical brain injuries caused by cutting, electric coagulation, suction, and traction to alleviate the secondary attack to the brain tissue and reduce the auto-inflammation process triggered by the exposure of autoantigens. It may also be beneficial for elective surgeries, such as intracranial tumor operations, to induce immune tolerance and alleviate auto-inflammation. With the development of minimally invasive operation techniques, intrathymic injection without exposing the thorax can become a simple, efficient, and safe procedure. Further studies are needed to investigate the potential applications of intrathymic injection of MBP in vivo.
- source_sentence: "What are some potential mechanisms by which quercetin may protect against cancer?\n"
  sentences:
  - There is a significant correlation between serum B2M levels and some biochemical parameters, such as ALK, bilirubin, and INR, in patients with liver disease. However, no significant correlation has been found between serum B2M levels and viral load among patients with liver disease.
  - When the diagnosis of HIT is confirmed, therapeutic doses of alternative non-heparin anticoagulants are usually required. Heparin treatments must be stopped immediately, including heparin-bonded catheters and heparin flushes. Patients should be given a non-heparin anticoagulant such as direct thrombin inhibitors like Bivalirudin, Argatroban, or Lepirudin. These inhibitors directly inhibit the actions of thrombin and do not require a cofactor. They are active against both free and clot-bound thrombin and do not interact with or produce heparin-dependent antibodies.
  - Silymarin and Ginkgo biloba extract have been found to possess hepatoprotective effects against NDEA-induced hepatocarcinogenesis. These extracts can scavenge free radicals, prevent hepatocellular damage, and suppress the leakage of enzymes through plasma membranes. They may also modify the biotransformation/detoxification of NDEA, reducing its liver toxicity. Additionally, silymarin can reduce intracellular ROS levels, prevent oxidative stress-induced cellular damage, and stimulate hepatic cell proliferation for liver regeneration. These effects make silymarin and Ginkgo biloba extract strong candidates as chemopreventive agents for liver cancer.
- source_sentence: "What are the molecular mechanisms involved in the synergistic induction of SAA by IL-1, TNF-α, and IL-6?\n"
  sentences:
  - The complex formation of STAT3, NF-κB p65, and p300 is involved in the transcriptional activity of the SAA1 gene. STAT3 and p300 are recruited to the SAA1 promoter region in response to IL-6 or IL-1β + IL-6 stimulation. Co-expression of wild type p300 with wild type STAT3 enhances the luciferase activity of the SAA1 gene in a dose-dependent manner. This suggests that the heteromeric complex formation of STAT3, NF-κB p65, and p300 contributes to the transcriptional activity of the SAA1 gene.
  - Intrathymic injection of MBP has potential applications in various medical treatments. It can be used in surgical brain injuries caused by cutting, electric coagulation, suction, and traction to alleviate the secondary attack to the brain tissue and reduce the auto-inflammation process triggered by the exposure of autoantigens. It may also be beneficial for elective surgeries, such as intracranial tumor operations, to induce immune tolerance and alleviate auto-inflammation. With the development of minimally invasive operation techniques, intrathymic injection without exposing the thorax can become a simple, efficient, and safe procedure. Further studies are needed to investigate the potential applications of intrathymic injection of MBP in vivo.
  - Phenotypic screens of approved drug collections and synergistic combinations can be a useful approach for rapid identification of new therapeutics for drug-resistant bacteria. This approach can also be applied to emerging outbreaks of infectious diseases where vaccines and therapeutic agents are unavailable or unrealistic to develop in a short period of time. By screening existing drugs and combinations, new therapeutics can be identified and potentially repurposed for the treatment of drug-resistant infections.
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
- cosine_accuracy@1
- cosine_accuracy@3
- cosine_accuracy@5
- cosine_accuracy@10
- cosine_precision@1
- cosine_precision@3
- cosine_precision@5
- cosine_precision@10
- cosine_recall@1
- cosine_recall@3
- cosine_recall@5
- cosine_recall@10
- cosine_ndcg@10
- cosine_mrr@10
- cosine_map@100
model-index:
- name: SentenceTransformer based on sentence-transformers/all-mpnet-base-v2
  results:
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: dim 768
      type: dim_768
    metrics:
    - type: cosine_accuracy@1
      value: 0.7775
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.8885
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.917
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.947
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.7775
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.29616666666666663
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.18340000000000004
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.09470000000000002
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.7775
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.8885
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.917
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.947
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.8637977392462012
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.8369255952380947
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.8394380047776188
      name: Cosine Map@100
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: dim 512
      type: dim_512
    metrics:
    - type: cosine_accuracy@1
      value: 0.7785
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.8825
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.917
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.944
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.7785
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.29416666666666663
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.18340000000000004
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.09440000000000003
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.7785
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.8825
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.917
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.944
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.8623716893141778
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.8360055555555553
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.8388749447751291
      name: Cosine Map@100
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: dim 128
      type: dim_128
    metrics:
    - type: cosine_accuracy@1
      value: 0.7555
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.8655
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.9145
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.943
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.7555
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.2884999999999999
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.18290000000000003
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.09430000000000001
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.7555
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.8655
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.9145
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.943
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.8499528413626729
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.8199301587301584
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.8224780775804242
      name: Cosine Map@100
  - task:
      type: information-retrieval
      name: Information Retrieval
    dataset:
      name: dim 64
      type: dim_64
    metrics:
    - type: cosine_accuracy@1
      value: 0.714
      name: Cosine Accuracy@1
    - type: cosine_accuracy@3
      value: 0.8365
      name: Cosine Accuracy@3
    - type: cosine_accuracy@5
      value: 0.877
      name: Cosine Accuracy@5
    - type: cosine_accuracy@10
      value: 0.9285
      name: Cosine Accuracy@10
    - type: cosine_precision@1
      value: 0.714
      name: Cosine Precision@1
    - type: cosine_precision@3
      value: 0.27883333333333327
      name: Cosine Precision@3
    - type: cosine_precision@5
      value: 0.1754
      name: Cosine Precision@5
    - type: cosine_precision@10
      value: 0.09285
      name: Cosine Precision@10
    - type: cosine_recall@1
      value: 0.714
      name: Cosine Recall@1
    - type: cosine_recall@3
      value: 0.8365
      name: Cosine Recall@3
    - type: cosine_recall@5
      value: 0.877
      name: Cosine Recall@5
    - type: cosine_recall@10
      value: 0.9285
      name: Cosine Recall@10
    - type: cosine_ndcg@10
      value: 0.8195584918161248
      name: Cosine Ndcg@10
    - type: cosine_mrr@10
      value: 0.7848236111111104
      name: Cosine Mrr@10
    - type: cosine_map@100
      value: 0.7878148778237813
      name: Cosine Map@100
---

# SentenceTransformer based on sentence-transformers/all-mpnet-base-v2

This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [sentence-transformers/all-mpnet-base-v2](https://huggingface.co/sentence-transformers/all-mpnet-base-v2). It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

## Model Details

### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [sentence-transformers/all-mpnet-base-v2](https://huggingface.co/sentence-transformers/all-mpnet-base-v2) <!-- at revision e8c3b32edf5434bc2275fc9bab85f82640a19130 -->
- **Maximum Sequence Length:** 384 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
<!-- - **Training Dataset:** Unknown -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->

### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/huggingface/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 384, 'do_lower_case': False, 'architecture': 'MPNetModel'})
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
```

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("sentence_transformers_model_id")
# Run inference
sentences = [
    'What are the molecular mechanisms involved in the synergistic induction of SAA by IL-1, TNF-α, and IL-6?\n',
    'The complex formation of STAT3, NF-κB p65, and p300 is involved in the transcriptional activity of the SAA1 gene. STAT3 and p300 are recruited to the SAA1 promoter region in response to IL-6 or IL-1β + IL-6 stimulation. Co-expression of wild type p300 with wild type STAT3 enhances the luciferase activity of the SAA1 gene in a dose-dependent manner. This suggests that the heteromeric complex formation of STAT3, NF-κB p65, and p300 contributes to the transcriptional activity of the SAA1 gene.',
    'Phenotypic screens of approved drug collections and synergistic combinations can be a useful approach for rapid identification of new therapeutics for drug-resistant bacteria. This approach can also be applied to emerging outbreaks of infectious diseases where vaccines and therapeutic agents are unavailable or unrealistic to develop in a short period of time. By screening existing drugs and combinations, new therapeutics can be identified and potentially repurposed for the treatment of drug-resistant infections.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities)
# tensor([[1.0000, 0.7925, 0.1356],
#         [0.7925, 1.0000, 0.1694],
#         [0.1356, 0.1694, 1.0000]])
```

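Because this model was trained with MatryoshkaLoss (dims 768/512/128/64, per the metadata above), its embeddings can be truncated to a smaller prefix and re-normalized with only a modest quality drop; recent versions of sentence-transformers also expose this via a `truncate_dim` argument to `SentenceTransformer`. A self-contained numpy sketch of the truncate-and-renormalize step (illustrative, not the library code):

```python
import numpy as np

def truncate_and_renormalize(embeddings: np.ndarray, dim: int) -> np.ndarray:
    """Keep the first `dim` components of each embedding and rescale to unit
    length, which is how Matryoshka embeddings are used at lower dimensions."""
    cut = embeddings[:, :dim]
    norms = np.linalg.norm(cut, axis=1, keepdims=True)
    return cut / np.clip(norms, 1e-12, None)

full = np.random.default_rng(0).normal(size=(3, 768))   # stand-in for model output
small = truncate_and_renormalize(full, 128)
print(small.shape)  # (3, 128)
```

Smaller dimensions trade a little retrieval quality (see the `dim_128` and `dim_64` tables below) for much cheaper storage and faster similarity search.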
<!--
### Direct Usage (Transformers)

<details><summary>Click to see the direct usage in Transformers</summary>

</details>
-->

<!--
### Downstream Usage (Sentence Transformers)

You can finetune this model on your own dataset.

<details><summary>Click to expand</summary>

</details>
-->

<!--
### Out-of-Scope Use

*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->

## Evaluation

### Metrics

#### Information Retrieval

* Dataset: `dim_768`
* Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) with these parameters:
  ```json
  {
      "truncate_dim": 768
  }
  ```

| Metric              | Value      |
|:--------------------|:-----------|
| cosine_accuracy@1   | 0.7775     |
| cosine_accuracy@3   | 0.8885     |
| cosine_accuracy@5   | 0.917      |
| cosine_accuracy@10  | 0.947      |
| cosine_precision@1  | 0.7775     |
| cosine_precision@3  | 0.2962     |
| cosine_precision@5  | 0.1834     |
| cosine_precision@10 | 0.0947     |
| cosine_recall@1     | 0.7775     |
| cosine_recall@3     | 0.8885     |
| cosine_recall@5     | 0.917      |
| cosine_recall@10    | 0.947      |
| **cosine_ndcg@10**  | **0.8638** |
| cosine_mrr@10       | 0.8369     |
| cosine_map@100      | 0.8394     |

#### Information Retrieval

* Dataset: `dim_512`
* Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) with these parameters:
  ```json
  {
      "truncate_dim": 512
  }
  ```

| Metric              | Value      |
|:--------------------|:-----------|
| cosine_accuracy@1   | 0.7785     |
| cosine_accuracy@3   | 0.8825     |
| cosine_accuracy@5   | 0.917      |
| cosine_accuracy@10  | 0.944      |
| cosine_precision@1  | 0.7785     |
| cosine_precision@3  | 0.2942     |
| cosine_precision@5  | 0.1834     |
| cosine_precision@10 | 0.0944     |
| cosine_recall@1     | 0.7785     |
| cosine_recall@3     | 0.8825     |
| cosine_recall@5     | 0.917      |
| cosine_recall@10    | 0.944      |
| **cosine_ndcg@10**  | **0.8624** |
| cosine_mrr@10       | 0.836      |
| cosine_map@100      | 0.8389     |

#### Information Retrieval

* Dataset: `dim_128`
* Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) with these parameters:
  ```json
  {
      "truncate_dim": 128
  }
  ```

| Metric              | Value    |
|:--------------------|:---------|
| cosine_accuracy@1   | 0.7555   |
| cosine_accuracy@3   | 0.8655   |
| cosine_accuracy@5   | 0.9145   |
| cosine_accuracy@10  | 0.943    |
| cosine_precision@1  | 0.7555   |
| cosine_precision@3  | 0.2885   |
| cosine_precision@5  | 0.1829   |
| cosine_precision@10 | 0.0943   |
| cosine_recall@1     | 0.7555   |
| cosine_recall@3     | 0.8655   |
| cosine_recall@5     | 0.9145   |
| cosine_recall@10    | 0.943    |
| **cosine_ndcg@10**  | **0.85** |
| cosine_mrr@10       | 0.8199   |
| cosine_map@100      | 0.8225   |

#### Information Retrieval

* Dataset: `dim_64`
* Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator) with these parameters:
  ```json
  {
      "truncate_dim": 64
  }
  ```

| Metric              | Value      |
|:--------------------|:-----------|
| cosine_accuracy@1   | 0.714      |
| cosine_accuracy@3   | 0.8365     |
| cosine_accuracy@5   | 0.877      |
| cosine_accuracy@10  | 0.9285     |
| cosine_precision@1  | 0.714      |
| cosine_precision@3  | 0.2788     |
| cosine_precision@5  | 0.1754     |
| cosine_precision@10 | 0.0929     |
| cosine_recall@1     | 0.714      |
| cosine_recall@3     | 0.8365     |
| cosine_recall@5     | 0.877      |
| cosine_recall@10    | 0.9285     |
| **cosine_ndcg@10**  | **0.8196** |
| cosine_mrr@10       | 0.7848     |
| cosine_map@100      | 0.7878     |

<!--
## Bias, Risks and Limitations

*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->

<!--
### Recommendations

*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->

603
+ ## Training Details
604
+
605
+ ### Training Dataset
606
+
607
+ #### Unnamed Dataset
608
+
609
+ * Size: 2,000 training samples
610
+ * Columns: <code>anchor</code> and <code>positive</code>
611
+ * Approximate statistics based on the first 1000 samples:
612
+ | | anchor | positive |
613
+ |:--------|:----------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|
614
+ | type | string | string |
615
+ | details | <ul><li>min: 8 tokens</li><li>mean: 20.92 tokens</li><li>max: 51 tokens</li></ul> | <ul><li>min: 30 tokens</li><li>mean: 116.22 tokens</li><li>max: 227 tokens</li></ul> |
616
+ * Samples:
617
+ | anchor | positive |
618
+ |:------------------------------------------------------------------------------------------------------------|:------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|
619
+ | <code>What are the common clinical features and diagnostic criteria of relapsing polychondritis?<br></code> | <code>Lethal complications of relapsing polychondritis are often associated with airway or cardiovascular involvement. This can include complications such as aortic incompetence, mitral regurgitation, pericarditis, cardiac ischemia, aneurysms of large arteries, vasculitis of the central nervous system, phlebitis, and Raynaud's phenomenon. Neurological and renal system involvement can also occur, although it is rare. Regular follow-up and management are important to monitor and prevent potential complications in patients with relapsing polychondritis.</code> |
+ | <code>What are the treatment options for relapsing polychondritis?<br></code> | <code>Lethal complications of relapsing polychondritis are often associated with airway or cardiovascular involvement. This can include complications such as aortic incompetence, mitral regurgitation, pericarditis, cardiac ischemia, aneurysms of large arteries, vasculitis of the central nervous system, phlebitis, and Raynaud's phenomenon. Neurological and renal system involvement can also occur, although it is rare. Regular follow-up and management are important to monitor and prevent potential complications in patients with relapsing polychondritis.</code> |
+ | <code>What are the potential complications associated with relapsing polychondritis?<br></code> | <code>Lethal complications of relapsing polychondritis are often associated with airway or cardiovascular involvement. This can include complications such as aortic incompetence, mitral regurgitation, pericarditis, cardiac ischemia, aneurysms of large arteries, vasculitis of the central nervous system, phlebitis, and Raynaud's phenomenon. Neurological and renal system involvement can also occur, although it is rare. Regular follow-up and management are important to monitor and prevent potential complications in patients with relapsing polychondritis.</code> |
+ * Loss: [<code>MatryoshkaLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters:
+ ```json
+ {
+     "loss": "MultipleNegativesRankingLoss",
+     "matryoshka_dims": [768, 512, 128, 64],
+     "matryoshka_weights": [1, 1, 1, 1],
+     "n_dims_per_step": -1
+ }
+ ```
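
MatryoshkaLoss trains the model so that prefixes of the full 768-dimensional embedding (here 512, 128, and 64 dimensions) remain usable on their own. As a rough, framework-free sketch of what a consumer does at query time, truncate each vector to the leading dimensions and re-normalize before comparing (the vectors below are toy values, not real model output):

```python
import math

def truncate_and_normalize(embedding, dim):
    """Keep the first `dim` coordinates and rescale the prefix to unit length."""
    head = embedding[:dim]
    norm = math.sqrt(sum(x * x for x in head))
    return [x / norm for x in head]

def cosine(u, v):
    # Inputs are unit-length, so the dot product is the cosine similarity.
    return sum(a * b for a, b in zip(u, v))

# Toy 8-dimensional "full" embeddings standing in for the model's 768 dims.
doc = [0.41, 0.12, -0.33, 0.25, 0.08, -0.11, 0.19, 0.05]
query = [0.38, 0.15, -0.30, 0.22, 0.10, -0.09, 0.21, 0.02]

full_sim = cosine(truncate_and_normalize(doc, 8), truncate_and_normalize(query, 8))
small_sim = cosine(truncate_and_normalize(doc, 4), truncate_and_normalize(query, 4))
```

On a Matryoshka-trained model, the truncated similarity tracks the full-dimension one closely; the dim_512/dim_128/dim_64 evaluation columns in this card measure exactly that retention.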
+ 
+ ### Training Hyperparameters
+ #### Non-Default Hyperparameters
+ 
+ - `eval_strategy`: steps
+ - `per_device_train_batch_size`: 16
+ - `gradient_accumulation_steps`: 4
+ - `learning_rate`: 2e-05
+ - `num_train_epochs`: 1
+ - `lr_scheduler_type`: cosine
+ - `warmup_ratio`: 0.1
+ - `warmup_steps`: 0.1
+ - `bf16`: True
+ - `load_best_model_at_end`: True
+ - `batch_sampler`: no_duplicates
+ 
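
The effective batch size and warmup length follow from these values (`warmup_steps` is recorded as 0.1, apparently echoing the ratio rather than an absolute step count). A quick sanity check, assuming the 32 optimizer steps shown in the training logs for this one-epoch run:

```python
per_device_train_batch_size = 16
gradient_accumulation_steps = 4
warmup_ratio = 0.1
total_optimizer_steps = 32  # one epoch, per the training logs in this card

# Each optimizer step accumulates gradients over several forward passes.
effective_batch_size = per_device_train_batch_size * gradient_accumulation_steps

# A warmup ratio is resolved against the total number of optimizer steps.
warmup_steps = round(total_optimizer_steps * warmup_ratio)

# Lower bound on training pairs seen during the epoch.
pairs_per_epoch = effective_batch_size * total_optimizer_steps
```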
+ #### All Hyperparameters
+ <details><summary>Click to expand</summary>
+ 
+ - `do_predict`: False
+ - `eval_strategy`: steps
+ - `prediction_loss_only`: True
+ - `per_device_train_batch_size`: 16
+ - `per_device_eval_batch_size`: 8
+ - `gradient_accumulation_steps`: 4
+ - `eval_accumulation_steps`: None
+ - `torch_empty_cache_steps`: None
+ - `learning_rate`: 2e-05
+ - `weight_decay`: 0.0
+ - `adam_beta1`: 0.9
+ - `adam_beta2`: 0.999
+ - `adam_epsilon`: 1e-08
+ - `max_grad_norm`: 1.0
+ - `num_train_epochs`: 1
+ - `max_steps`: -1
+ - `lr_scheduler_type`: cosine
+ - `lr_scheduler_kwargs`: None
+ - `warmup_ratio`: 0.1
+ - `warmup_steps`: 0.1
+ - `log_level`: passive
+ - `log_level_replica`: warning
+ - `log_on_each_node`: True
+ - `logging_nan_inf_filter`: True
+ - `enable_jit_checkpoint`: False
+ - `save_on_each_node`: False
+ - `save_only_model`: False
+ - `restore_callback_states_from_checkpoint`: False
+ - `use_cpu`: False
+ - `seed`: 42
+ - `data_seed`: None
+ - `bf16`: True
+ - `fp16`: False
+ - `bf16_full_eval`: False
+ - `fp16_full_eval`: False
+ - `tf32`: None
+ - `local_rank`: -1
+ - `ddp_backend`: None
+ - `debug`: []
+ - `dataloader_drop_last`: False
+ - `dataloader_num_workers`: 0
+ - `dataloader_prefetch_factor`: None
+ - `disable_tqdm`: False
+ - `remove_unused_columns`: True
+ - `label_names`: None
+ - `load_best_model_at_end`: True
+ - `ignore_data_skip`: False
+ - `fsdp`: []
+ - `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
+ - `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
+ - `parallelism_config`: None
+ - `deepspeed`: None
+ - `label_smoothing_factor`: 0.0
+ - `optim`: adamw_torch_fused
+ - `optim_args`: None
+ - `group_by_length`: False
+ - `length_column_name`: length
+ - `project`: huggingface
+ - `trackio_space_id`: trackio
+ - `ddp_find_unused_parameters`: None
+ - `ddp_bucket_cap_mb`: None
+ - `ddp_broadcast_buffers`: False
+ - `dataloader_pin_memory`: True
+ - `dataloader_persistent_workers`: False
+ - `skip_memory_metrics`: True
+ - `push_to_hub`: False
+ - `resume_from_checkpoint`: None
+ - `hub_model_id`: None
+ - `hub_strategy`: every_save
+ - `hub_private_repo`: None
+ - `hub_always_push`: False
+ - `hub_revision`: None
+ - `gradient_checkpointing`: False
+ - `gradient_checkpointing_kwargs`: None
+ - `include_for_metrics`: []
+ - `eval_do_concat_batches`: True
+ - `auto_find_batch_size`: False
+ - `full_determinism`: False
+ - `ddp_timeout`: 1800
+ - `torch_compile`: False
+ - `torch_compile_backend`: None
+ - `torch_compile_mode`: None
+ - `include_num_input_tokens_seen`: no
+ - `neftune_noise_alpha`: None
+ - `optim_target_modules`: None
+ - `batch_eval_metrics`: False
+ - `eval_on_start`: False
+ - `use_liger_kernel`: False
+ - `liger_kernel_config`: None
+ - `eval_use_gather_object`: False
+ - `average_tokens_across_devices`: True
+ - `use_cache`: False
+ - `prompts`: None
+ - `batch_sampler`: no_duplicates
+ - `multi_dataset_batch_sampler`: proportional
+ - `router_mapping`: {}
+ - `learning_rate_mapping`: {}
+ 
+ </details>
+ 
+ ### Training Logs
+ | Epoch | Step | Training Loss | dim_768_cosine_ndcg@10 | dim_512_cosine_ndcg@10 | dim_128_cosine_ndcg@10 | dim_64_cosine_ndcg@10 |
+ |:-----:|:----:|:-------------:|:----------------------:|:----------------------:|:----------------------:|:---------------------:|
+ | -1 | -1 | - | 0.8142 | 0.8058 | 0.7676 | 0.7053 |
+ | 0.032 | 1 | 1.5764 | 0.8146 | 0.8055 | 0.7669 | 0.7049 |
+ | 0.064 | 2 | 2.6620 | 0.8162 | 0.8077 | 0.7690 | 0.7086 |
+ | 0.096 | 3 | 1.9032 | 0.8204 | 0.8126 | 0.7759 | 0.7173 |
+ | 0.128 | 4 | 1.6601 | 0.8252 | 0.8177 | 0.7849 | 0.7282 |
+ | 0.16 | 5 | 1.1083 | 0.8315 | 0.8251 | 0.7902 | 0.7419 |
+ | 0.192 | 6 | 2.7345 | 0.8361 | 0.8317 | 0.7970 | 0.7510 |
+ | 0.224 | 7 | 1.2922 | 0.8375 | 0.8351 | 0.8025 | 0.7620 |
+ | 0.256 | 8 | 1.6647 | 0.8399 | 0.8367 | 0.8080 | 0.7686 |
+ | 0.288 | 9 | 1.1997 | 0.8425 | 0.8398 | 0.8133 | 0.7754 |
+ | 0.32 | 10 | 0.8064 | 0.8441 | 0.8419 | 0.8181 | 0.7799 |
+ | 0.352 | 11 | 1.1935 | 0.8468 | 0.8442 | 0.8220 | 0.7843 |
+ | 0.384 | 12 | 0.7776 | 0.8482 | 0.8462 | 0.8242 | 0.7886 |
+ | 0.416 | 13 | 0.9272 | 0.8494 | 0.8484 | 0.8261 | 0.7940 |
+ | 0.448 | 14 | 1.2406 | 0.8510 | 0.8502 | 0.8294 | 0.7978 |
+ | 0.48 | 15 | 1.0830 | 0.8520 | 0.8518 | 0.8325 | 0.7999 |
+ | 0.512 | 16 | 1.9336 | 0.8534 | 0.8532 | 0.8340 | 0.8017 |
+ | 0.544 | 17 | 1.2190 | 0.8541 | 0.8537 | 0.8360 | 0.8026 |
+ | 0.576 | 18 | 1.7060 | 0.8554 | 0.8545 | 0.8388 | 0.8063 |
+ | 0.608 | 19 | 1.4131 | 0.8571 | 0.8561 | 0.8412 | 0.8084 |
+ | 0.64 | 20 | 1.1700 | 0.8581 | 0.8569 | 0.8429 | 0.8101 |
+ | 0.672 | 21 | 0.5671 | 0.8599 | 0.8580 | 0.8445 | 0.8118 |
+ | 0.704 | 22 | 1.4699 | 0.8613 | 0.8596 | 0.8455 | 0.8140 |
+ | 0.736 | 23 | 1.6544 | 0.8620 | 0.8608 | 0.8463 | 0.8158 |
+ | 0.768 | 24 | 2.0854 | 0.8624 | 0.8614 | 0.8476 | 0.8169 |
+ | 0.8 | 25 | 0.9175 | 0.8630 | 0.8616 | 0.8484 | 0.8180 |
+ | 0.832 | 26 | 1.3673 | 0.8632 | 0.8615 | 0.8485 | 0.8182 |
+ | 0.864 | 27 | 1.2114 | 0.8637 | 0.8617 | 0.8491 | 0.8190 |
+ | 0.896 | 28 | 0.9807 | 0.8637 | 0.8620 | 0.8497 | 0.8190 |
+ | 0.928 | 29 | 0.9052 | 0.8635 | 0.8620 | 0.8497 | 0.8192 |
+ | 0.96 | 30 | 1.7420 | 0.8640 | 0.8624 | 0.8500 | 0.8194 |
+ | 0.992 | 31 | 1.3071 | 0.8640 | 0.8622 | 0.8497 | 0.8193 |
+ | 1.0 | 32 | 1.3117 | 0.8638 | 0.8624 | 0.8500 | 0.8196 |
+ 
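
Reading the final row of the table, truncating from 768 to 64 dimensions shrinks the vectors twelvefold while giving up only about 5% of NDCG@10. A small check of that trade-off using the last-epoch numbers:

```python
# Final-epoch NDCG@10 values from the training log in this card.
ndcg_768 = 0.8638
ndcg_64 = 0.8196

# Fraction of ranking quality lost by truncating 768 -> 64 dims.
relative_drop = 1 - ndcg_64 / ndcg_768

# Vector storage (and dot-product cost) shrinks by the dimension ratio.
storage_ratio = 768 / 64
```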
+ 
+ 
+ ### Framework Versions
+ - Python: 3.12.12
+ - Sentence Transformers: 5.2.3
+ - Transformers: 5.0.0
+ - PyTorch: 2.10.0+cu128
+ - Accelerate: 1.12.0
+ - Datasets: 4.0.0
+ - Tokenizers: 0.22.2
+ 
+ ## Citation
+ 
+ ### BibTeX
+ 
+ #### Sentence Transformers
+ ```bibtex
+ @inproceedings{reimers-2019-sentence-bert,
+     title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
+     author = "Reimers, Nils and Gurevych, Iryna",
+     booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
+     month = "11",
+     year = "2019",
+     publisher = "Association for Computational Linguistics",
+     url = "https://arxiv.org/abs/1908.10084",
+ }
+ ```
+ 
+ #### MatryoshkaLoss
+ ```bibtex
+ @misc{kusupati2024matryoshka,
+     title = {Matryoshka Representation Learning},
+     author = {Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
+     year = {2024},
+     eprint = {2205.13147},
+     archivePrefix = {arXiv},
+     primaryClass = {cs.LG}
+ }
+ ```
+ 
+ #### MultipleNegativesRankingLoss
+ ```bibtex
+ @misc{henderson2017efficient,
+     title = {Efficient Natural Language Response Suggestion for Smart Reply},
+     author = {Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
+     year = {2017},
+     eprint = {1705.00652},
+     archivePrefix = {arXiv},
+     primaryClass = {cs.CL}
+ }
+ ```
+ 
+ <!--
+ ## Glossary
+ 
+ *Clearly define terms in order to be accessible across audiences.*
+ -->
+ 
+ <!--
+ ## Model Card Authors
+ 
+ *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
+ -->
+ 
+ <!--
+ ## Model Card Contact
+ 
+ *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
+ -->
config.json ADDED
@@ -0,0 +1,24 @@
+ {
+   "architectures": [
+     "MPNetModel"
+   ],
+   "attention_probs_dropout_prob": 0.1,
+   "bos_token_id": 0,
+   "dtype": "float32",
+   "eos_token_id": 2,
+   "hidden_act": "gelu",
+   "hidden_dropout_prob": 0.1,
+   "hidden_size": 768,
+   "initializer_range": 0.02,
+   "intermediate_size": 3072,
+   "layer_norm_eps": 1e-05,
+   "max_position_embeddings": 514,
+   "model_type": "mpnet",
+   "num_attention_heads": 12,
+   "num_hidden_layers": 12,
+   "pad_token_id": 1,
+   "relative_attention_num_buckets": 32,
+   "tie_word_embeddings": true,
+   "transformers_version": "5.0.0",
+   "vocab_size": 30527
+ }
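
The architecture fields above are internally consistent: attention splits `hidden_size` evenly across heads, and the feed-forward layer uses the usual 4x expansion. A hypothetical sanity check over a copy of these values:

```python
# Values copied from the config above.
config = {
    "hidden_size": 768,
    "num_attention_heads": 12,
    "intermediate_size": 3072,
    "max_position_embeddings": 514,
}

# Per-head dimension: the hidden vector is split evenly across heads.
head_dim = config["hidden_size"] // config["num_attention_heads"]

# Feed-forward expansion factor relative to the hidden size.
ffn_expansion = config["intermediate_size"] // config["hidden_size"]
```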
config_sentence_transformers.json ADDED
@@ -0,0 +1,14 @@
+ {
+   "__version__": {
+     "sentence_transformers": "5.2.3",
+     "transformers": "5.0.0",
+     "pytorch": "2.10.0+cu128"
+   },
+   "model_type": "SentenceTransformer",
+   "prompts": {
+     "query": "",
+     "document": ""
+   },
+   "default_prompt_name": null,
+   "similarity_fn_name": "cosine"
+ }
eval/Information-Retrieval_evaluation_dim_128_results.csv ADDED
@@ -0,0 +1,38 @@
1
+ epoch,steps,cosine-Accuracy@1,cosine-Accuracy@3,cosine-Accuracy@5,cosine-Accuracy@10,cosine-Precision@1,cosine-Recall@1,cosine-Precision@3,cosine-Recall@3,cosine-Precision@5,cosine-Recall@5,cosine-Precision@10,cosine-Recall@10,cosine-MRR@10,cosine-NDCG@10,cosine-MAP@100
2
+ 0.04,10,0.575,0.691375,0.733375,0.787625,0.575,0.575,0.23045833333333332,0.691375,0.146675,0.733375,0.0787625,0.787625,0.6441267361111113,0.6786380265078881,0.649418689463333
3
+ 0.004,1,0.575,0.691375,0.733375,0.787625,0.575,0.575,0.23045833333333332,0.691375,0.146675,0.733375,0.0787625,0.787625,0.6441267361111113,0.6786380265078881,0.649418689463333
4
+ 0.008,1,0.575,0.691375,0.733375,0.787625,0.575,0.575,0.23045833333333332,0.691375,0.146675,0.733375,0.0787625,0.787625,0.6441267361111113,0.6786380265078881,0.649418689463333
5
+ 0.016,2,0.574875,0.691625,0.732875,0.78825,0.574875,0.574875,0.23054166666666664,0.691625,0.14657499999999998,0.732875,0.078825,0.78825,0.6442039682539683,0.6788389368229334,0.6494562837448018
6
+ 0.024,3,0.57675,0.691625,0.734375,0.78875,0.57675,0.57675,0.23054166666666664,0.691625,0.146875,0.734375,0.078875,0.78875,0.6453487599206355,0.6798218082864294,0.6506137757898717
7
+ 0.032,1,0.6805,0.777,0.8115,0.8595,0.6805,0.6805,0.259,0.777,0.1623,0.8115,0.08595000000000001,0.8595,0.737736706349206,0.766934049324939,0.7422491371283857
8
+ 0.064,2,0.6825,0.7815,0.8145,0.861,0.6825,0.6825,0.2605,0.7815,0.16290000000000002,0.8145,0.08610000000000001,0.861,0.7399787698412695,0.7690444113693291,0.7445272878077759
9
+ 0.096,3,0.691,0.7845,0.82,0.868,0.691,0.691,0.2615,0.7845,0.164,0.82,0.08680000000000002,0.868,0.7468555555555552,0.7759006792421442,0.7511691345495797
10
+ 0.128,4,0.6995,0.7945,0.8305,0.878,0.6995,0.6995,0.2648333333333333,0.7945,0.16609999999999997,0.8305,0.08780000000000002,0.878,0.7556011904761899,0.7849364917684537,0.7594771045902264
11
+ 0.16,5,0.701,0.8015,0.8445,0.8845,0.701,0.701,0.2671666666666666,0.8015,0.1689,0.8445,0.08845000000000001,0.8845,0.7603162698412695,0.7902464513281268,0.7640101745956649
12
+ 0.192,6,0.707,0.813,0.852,0.8905,0.707,0.707,0.271,0.813,0.1704,0.852,0.08905000000000002,0.8905,0.7671827380952376,0.796996035770871,0.7708475717989774
13
+ 0.224,7,0.714,0.8175,0.8585,0.8935,0.714,0.714,0.2725,0.8175,0.1717,0.8585,0.08935000000000003,0.8935,0.7734748015873014,0.8025453073289789,0.7773357900811564
14
+ 0.256,8,0.7195,0.8205,0.86,0.9,0.7195,0.7195,0.2735,0.8205,0.172,0.86,0.09000000000000001,0.9,0.7786384920634918,0.8079829676026711,0.7822203184510239
15
+ 0.288,9,0.7245,0.826,0.866,0.9045,0.7245,0.7245,0.2753333333333333,0.826,0.1732,0.866,0.09045000000000002,0.9045,0.7841779761904758,0.8132992652456396,0.7876619483670485
16
+ 0.32,10,0.73,0.827,0.871,0.91,0.73,0.73,0.2756666666666666,0.827,0.17420000000000002,0.871,0.09100000000000001,0.91,0.7888827380952381,0.8181264990801913,0.7921345716418697
17
+ 0.352,11,0.734,0.8335,0.877,0.913,0.734,0.734,0.2778333333333333,0.8335,0.1754,0.877,0.0913,0.913,0.7928722222222219,0.8219660805063054,0.7960459204102145
18
+ 0.384,12,0.735,0.842,0.8825,0.9145,0.735,0.735,0.2806666666666666,0.842,0.1765,0.8825,0.09145000000000002,0.9145,0.7952206349206346,0.8242243195036895,0.798516074746295
19
+ 0.416,13,0.7355,0.846,0.8845,0.9175,0.7355,0.7355,0.282,0.846,0.17690000000000003,0.8845,0.09175,0.9175,0.7967228174603171,0.8261096934678105,0.7999960881284415
20
+ 0.448,14,0.7375,0.849,0.8865,0.9205,0.7375,0.7375,0.2829999999999999,0.849,0.1773,0.8865,0.09205,0.9205,0.8000948412698409,0.8294319489861681,0.803328130800734
21
+ 0.48,15,0.739,0.8495,0.891,0.925,0.739,0.739,0.2831666666666666,0.8495,0.17820000000000003,0.891,0.0925,0.925,0.8026706349206346,0.8324623501755561,0.8056592282362983
22
+ 0.512,16,0.74,0.8515,0.8945,0.9255,0.74,0.74,0.2838333333333333,0.8515,0.1789,0.8945,0.09255000000000001,0.9255,0.8044456349206346,0.8340317145605646,0.8075158147667176
23
+ 0.544,17,0.7425,0.853,0.897,0.927,0.7425,0.7425,0.28433333333333327,0.853,0.17940000000000003,0.897,0.09270000000000002,0.927,0.8065587301587298,0.8359912033745117,0.8096508110438471
24
+ 0.576,18,0.7475,0.856,0.899,0.9275,0.7475,0.7475,0.2853333333333333,0.856,0.17980000000000002,0.899,0.09275000000000001,0.9275,0.8100607142857139,0.8387780033707947,0.813271652089544
25
+ 0.608,19,0.7485,0.858,0.902,0.933,0.7485,0.7485,0.286,0.858,0.1804,0.902,0.09330000000000001,0.933,0.811662301587301,0.8412165919216121,0.8145207748903083
26
+ 0.64,20,0.75,0.8585,0.906,0.935,0.75,0.75,0.2861666666666666,0.8585,0.18120000000000003,0.906,0.09350000000000003,0.935,0.8132216269841267,0.8428993494120013,0.8159916268164611
27
+ 0.672,21,0.751,0.86,0.9085,0.937,0.751,0.751,0.2866666666666666,0.86,0.18170000000000003,0.9085,0.09370000000000002,0.937,0.8147244047619046,0.8445398100169161,0.8174219347850519
28
+ 0.704,22,0.7515,0.8605,0.9095,0.9385,0.7515,0.7515,0.2868333333333333,0.8605,0.18190000000000003,0.9095,0.09385000000000002,0.9385,0.8155331349206346,0.8455213483361718,0.8181997278913368
29
+ 0.736,23,0.752,0.861,0.9115,0.939,0.752,0.752,0.287,0.861,0.18230000000000002,0.9115,0.09390000000000003,0.939,0.8163089285714282,0.8462576895083372,0.8190022009262776
30
+ 0.768,24,0.754,0.863,0.912,0.94,0.754,0.754,0.2876666666666666,0.863,0.18240000000000003,0.912,0.09400000000000001,0.94,0.8177426587301584,0.8475859217819075,0.8203979211180829
31
+ 0.8,25,0.7535,0.865,0.912,0.942,0.7535,0.7535,0.2883333333333333,0.865,0.18240000000000003,0.912,0.09420000000000002,0.942,0.8181787698412694,0.8483820060507,0.8206961025415495
32
+ 0.832,26,0.7535,0.8645,0.9135,0.942,0.7535,0.7535,0.2881666666666666,0.8645,0.18270000000000003,0.9135,0.09420000000000002,0.942,0.8182884920634916,0.8484779602466193,0.8208425760250101
33
+ 0.864,27,0.7545,0.8645,0.913,0.942,0.7545,0.7545,0.2881666666666666,0.8645,0.1826,0.913,0.09420000000000002,0.942,0.8191484126984123,0.8491368585096968,0.8217332401237013
34
+ 0.896,28,0.7555,0.8645,0.9135,0.9425,0.7555,0.7555,0.2881666666666666,0.8645,0.18270000000000003,0.9135,0.09425000000000001,0.9425,0.8197345238095235,0.8496817132700939,0.8222992871916319
35
+ 0.928,29,0.7555,0.8645,0.913,0.9425,0.7555,0.7555,0.2881666666666666,0.8645,0.1826,0.913,0.09425000000000001,0.9425,0.8197059523809519,0.8496549535331434,0.8222882052349033
36
+ 0.96,30,0.756,0.865,0.914,0.9425,0.756,0.756,0.2883333333333333,0.865,0.18280000000000002,0.914,0.09425000000000001,0.9425,0.8201031746031743,0.8499702257593353,0.82268559169512
37
+ 0.992,31,0.7555,0.865,0.914,0.9425,0.7555,0.7555,0.2883333333333333,0.865,0.18280000000000002,0.914,0.09425000000000001,0.9425,0.8197634920634916,0.849714771417121,0.8223578731890134
38
+ 1.0,32,0.7555,0.8655,0.9145,0.943,0.7555,0.7555,0.2884999999999999,0.8655,0.18290000000000003,0.9145,0.09430000000000001,0.943,0.8199301587301584,0.8499528413626729,0.8224780775804242
eval/Information-Retrieval_evaluation_dim_512_results.csv ADDED
@@ -0,0 +1,40 @@
1
+ epoch,steps,cosine-Accuracy@1,cosine-Accuracy@3,cosine-Accuracy@5,cosine-Accuracy@10,cosine-Precision@1,cosine-Recall@1,cosine-Precision@3,cosine-Recall@3,cosine-Precision@5,cosine-Recall@5,cosine-Precision@10,cosine-Recall@10,cosine-MRR@10,cosine-NDCG@10,cosine-MAP@100
2
+ 0.04,10,0.623,0.739875,0.78225,0.834,0.623,0.623,0.246625,0.739875,0.15644999999999998,0.78225,0.0834,0.834,0.6919211309523794,0.7261635261817171,0.6966834448618515
3
+ 0.004,1,0.623,0.739875,0.78225,0.834,0.623,0.623,0.246625,0.739875,0.15644999999999998,0.78225,0.0834,0.834,0.6919211309523794,0.7261635261817171,0.6966834448618515
4
+ 0.008,1,0.623,0.739875,0.78225,0.834,0.623,0.623,0.246625,0.739875,0.15644999999999998,0.78225,0.0834,0.834,0.6919211309523794,0.7261635261817171,0.6966834448618515
5
+ 0.016,2,0.623375,0.73975,0.78225,0.834,0.623375,0.623375,0.24658333333333332,0.73975,0.15644999999999998,0.78225,0.0834,0.834,0.6920851190476177,0.7262849679068353,0.6968552256797925
6
+ 0.024,3,0.624625,0.739875,0.783,0.834375,0.624625,0.624625,0.246625,0.739875,0.1566,0.783,0.0834375,0.834375,0.6928214285714267,0.7269205729287098,0.6975935999478502
7
+ 0.032,4,0.62525,0.740125,0.784625,0.835875,0.62525,0.62525,0.2467083333333333,0.740125,0.156925,0.784625,0.08358750000000001,0.835875,0.6935397321428544,0.727817401445926,0.698234810037224
8
+ 0.008,1,0.616875,0.73675,0.778125,0.829125,0.616875,0.616875,0.24558333333333332,0.73675,0.155625,0.778125,0.08291250000000001,0.829125,0.6867003968253955,0.7210577274511136,0.6915203290210378
9
+ 0.032,1,0.719,0.8195,0.859,0.8955,0.719,0.719,0.2731666666666666,0.8195,0.1718,0.859,0.08955,0.8955,0.7767442460317455,0.8054660716640676,0.7803908528152956
10
+ 0.064,2,0.7195,0.821,0.8625,0.899,0.7195,0.7195,0.2736666666666666,0.821,0.1725,0.8625,0.08990000000000001,0.899,0.7786208333333329,0.8077385092665426,0.7820502585141732
11
+ 0.096,3,0.7245,0.8255,0.865,0.9025,0.7245,0.7245,0.2751666666666666,0.8255,0.173,0.865,0.09025,0.9025,0.7838982142857135,0.8126398568794393,0.7872407831664519
12
+ 0.128,4,0.7285,0.834,0.8725,0.9055,0.7285,0.7285,0.278,0.834,0.17450000000000002,0.8725,0.09055,0.9055,0.7894962301587296,0.8177383537451753,0.7928702978150627
13
+ 0.16,5,0.738,0.8435,0.878,0.9095,0.738,0.738,0.2811666666666666,0.8435,0.17560000000000003,0.878,0.09095000000000002,0.9095,0.7978470238095235,0.825070780319082,0.8011617302066206
14
+ 0.192,6,0.746,0.85,0.887,0.914,0.746,0.746,0.28333333333333327,0.85,0.17740000000000003,0.887,0.09140000000000001,0.914,0.8051607142857135,0.8317400998557125,0.8083522523708684
15
+ 0.224,7,0.7515,0.8525,0.8885,0.918,0.7515,0.7515,0.2841666666666666,0.8525,0.17770000000000002,0.8885,0.0918,0.918,0.8084732142857131,0.8351071359877208,0.8115495552281191
16
+ 0.256,8,0.7535,0.857,0.8915,0.918,0.7535,0.7535,0.2856666666666666,0.857,0.17830000000000001,0.8915,0.0918,0.918,0.8104964285714279,0.8367420757747164,0.8137732837440027
17
+ 0.288,9,0.7585,0.858,0.8935,0.92,0.7585,0.7585,0.286,0.858,0.1787,0.8935,0.092,0.92,0.8139027777777772,0.8398017322516969,0.8172214156838584
18
+ 0.32,10,0.76,0.8595,0.895,0.924,0.76,0.76,0.2865,0.8595,0.179,0.895,0.09240000000000001,0.924,0.815524404761904,0.8419431852850898,0.8186486952621506
19
+ 0.352,11,0.7625,0.8625,0.8975,0.9255,0.7625,0.7625,0.2874999999999999,0.8625,0.1795,0.8975,0.09255000000000001,0.9255,0.8179480158730155,0.8441772947534036,0.8210761964314017
20
+ 0.384,12,0.7645,0.865,0.901,0.9275,0.7645,0.7645,0.2883333333333333,0.865,0.18020000000000003,0.901,0.09275000000000001,0.9275,0.81992996031746,0.846168914810826,0.8230234863222943
21
+ 0.416,13,0.7675,0.8675,0.904,0.928,0.7675,0.7675,0.2891666666666666,0.8675,0.18080000000000002,0.904,0.09280000000000001,0.928,0.8226111111111107,0.8483550929151045,0.8258763867249792
22
+ 0.448,14,0.7685,0.869,0.9035,0.9315,0.7685,0.7685,0.28966666666666663,0.869,0.18070000000000003,0.9035,0.09315000000000001,0.9315,0.8240527777777773,0.8502432777457252,0.8271529182158947
23
+ 0.48,15,0.769,0.872,0.9035,0.934,0.769,0.769,0.29066666666666663,0.872,0.18070000000000003,0.9035,0.09340000000000001,0.934,0.8252809523809518,0.85175714457259,0.82829910172932
24
+ 0.512,16,0.7685,0.872,0.9055,0.937,0.7685,0.7685,0.29066666666666663,0.872,0.1811,0.9055,0.0937,0.937,0.8262124999999998,0.8531862497509192,0.8290421219459508
25
+ 0.544,17,0.7675,0.872,0.9075,0.938,0.7675,0.7675,0.29066666666666663,0.872,0.1815,0.9075,0.09380000000000001,0.938,0.826553968253968,0.8537331550294353,0.8293906649262901
26
+ 0.576,18,0.7685,0.8725,0.909,0.939,0.7685,0.7685,0.2908333333333333,0.8725,0.18180000000000002,0.909,0.09390000000000001,0.939,0.8272208333333332,0.8544673422751337,0.830057668166273
27
+ 0.608,19,0.7695,0.8735,0.912,0.9415,0.7695,0.7695,0.29116666666666663,0.8735,0.18240000000000003,0.912,0.09415000000000003,0.9415,0.8286297619047617,0.8561146716347957,0.8313427968275068
28
+ 0.64,20,0.7715,0.8755,0.912,0.941,0.7715,0.7715,0.2918333333333333,0.8755,0.18240000000000003,0.912,0.09410000000000002,0.941,0.8298319444444443,0.8569327233265853,0.8326593775680619
29
+ 0.672,21,0.773,0.877,0.9145,0.941,0.773,0.773,0.29233333333333333,0.877,0.18290000000000003,0.9145,0.09410000000000002,0.941,0.8312283730158728,0.8580251095927921,0.8341128747301987
30
+ 0.704,22,0.7765,0.8775,0.914,0.941,0.7765,0.7765,0.2925,0.8775,0.18280000000000002,0.914,0.09410000000000002,0.941,0.8332499999999999,0.8595556613092501,0.8361719860315263
31
+ 0.736,23,0.7775,0.8795,0.916,0.942,0.7775,0.7775,0.29316666666666663,0.8795,0.18320000000000006,0.916,0.09420000000000002,0.942,0.8345416666666665,0.860790524246275,0.8374246221931334
32
+ 0.768,24,0.7785,0.8805,0.9155,0.9425,0.7785,0.7785,0.2935,0.8805,0.1831,0.9155,0.09425000000000001,0.9425,0.8352079365079362,0.8614029284760106,0.8380864669685953
33
+ 0.8,25,0.7785,0.881,0.916,0.943,0.7785,0.7785,0.29366666666666663,0.881,0.18320000000000003,0.916,0.09430000000000001,0.943,0.8353817460317459,0.8616450418404342,0.8382631284981227
34
+ 0.832,26,0.778,0.8815,0.9155,0.943,0.778,0.778,0.2938333333333333,0.8815,0.1831,0.9155,0.09430000000000001,0.943,0.8351831349206347,0.8615061993679543,0.8380910260465462
35
+ 0.864,27,0.778,0.8825,0.916,0.9435,0.778,0.778,0.29416666666666663,0.8825,0.18320000000000003,0.916,0.09435000000000002,0.9435,0.8353492063492061,0.8617483556802487,0.8382305859137469
36
+ 0.896,28,0.7785,0.8825,0.9165,0.9435,0.7785,0.7785,0.29416666666666663,0.8825,0.18330000000000002,0.9165,0.09435000000000002,0.9435,0.8356928571428568,0.8620082241480976,0.8385912633555377
37
+ 0.928,29,0.778,0.8825,0.917,0.9435,0.778,0.778,0.29416666666666663,0.8825,0.18340000000000004,0.917,0.09435000000000002,0.9435,0.8356749999999998,0.8620147273177122,0.8385833889087004
38
+ 0.96,30,0.7785,0.8815,0.9175,0.944,0.7785,0.7785,0.2938333333333333,0.8815,0.18350000000000002,0.9175,0.09440000000000003,0.944,0.8359742063492062,0.8623529766504225,0.8388387410866271
39
+ 0.992,31,0.7785,0.882,0.917,0.944,0.7785,0.7785,0.294,0.882,0.18340000000000004,0.917,0.09440000000000003,0.944,0.835813095238095,0.8622222495084777,0.8386818610740074
40
+ 1.0,32,0.7785,0.8825,0.917,0.944,0.7785,0.7785,0.29416666666666663,0.8825,0.18340000000000004,0.917,0.09440000000000003,0.944,0.8360055555555553,0.8623716893141778,0.8388749447751291
eval/Information-Retrieval_evaluation_dim_64_results.csv ADDED
@@ -0,0 +1,38 @@
1
+ epoch,steps,cosine-Accuracy@1,cosine-Accuracy@3,cosine-Accuracy@5,cosine-Accuracy@10,cosine-Precision@1,cosine-Recall@1,cosine-Precision@3,cosine-Recall@3,cosine-Precision@5,cosine-Recall@5,cosine-Precision@10,cosine-Recall@10,cosine-MRR@10,cosine-NDCG@10,cosine-MAP@100
2
+ 0.04,10,0.5,0.618375,0.66425,0.722875,0.5,0.5,0.206125,0.618375,0.13285,0.66425,0.07228749999999999,0.722875,0.5708128968253983,0.6073085496344399,0.5773798753985288
3
+ 0.004,1,0.5,0.618375,0.66425,0.722875,0.5,0.5,0.206125,0.618375,0.13285,0.66425,0.07228749999999999,0.722875,0.5708128968253983,0.6073085496344399,0.5773798753985288
4
+ 0.008,1,0.5,0.618375,0.66425,0.722875,0.5,0.5,0.206125,0.618375,0.13285,0.66425,0.07228749999999999,0.722875,0.5708128968253983,0.6073085496344399,0.5773798753985288
5
+ 0.016,2,0.5,0.618875,0.665,0.7235,0.5,0.5,0.20629166666666665,0.618875,0.133,0.665,0.07235,0.7235,0.5710846726190489,0.6076678211191968,0.577636858683181
6
+ 0.024,3,0.5015,0.61975,0.66525,0.724625,0.5015,0.5015,0.2065833333333333,0.61975,0.13305,0.66525,0.0724625,0.724625,0.5724763392857158,0.6089942473875521,0.5790054730620181
7
+ 0.032,1,0.6075,0.7135,0.7575,0.8085,0.6075,0.6075,0.2378333333333333,0.7135,0.1515,0.7575,0.08084999999999999,0.8085,0.6721408730158728,0.704939686143244,0.6780006920349362
8
+ 0.064,2,0.6105,0.7165,0.761,0.812,0.6105,0.6105,0.23883333333333331,0.7165,0.1522,0.761,0.08120000000000001,0.812,0.6758531746031747,0.7086111963133628,0.6816449110563683
9
+ 0.096,3,0.623,0.725,0.769,0.8165,0.623,0.623,0.2416666666666667,0.725,0.15380000000000002,0.769,0.08165,0.8165,0.685842857142857,0.7172988468993426,0.691831885413325
10
+ 0.128,4,0.634,0.735,0.781,0.828,0.634,0.634,0.245,0.735,0.15619999999999998,0.781,0.0828,0.828,0.6966019841269839,0.7281890709729733,0.7022621576437416
11
+ 0.16,5,0.6475,0.7515,0.7915,0.8425,0.6475,0.6475,0.2505,0.7515,0.15830000000000002,0.7915,0.08425000000000002,0.8425,0.710149603174603,0.7419113376727037,0.7152669968166208
12
+ 0.192,6,0.656,0.7605,0.8,0.853,0.656,0.656,0.25349999999999995,0.7605,0.16,0.8,0.08530000000000001,0.853,0.7187648809523808,0.7509556318176539,0.723653823828694
13
+ 0.224,7,0.668,0.769,0.8125,0.864,0.668,0.668,0.25633333333333336,0.769,0.1625,0.8125,0.0864,0.864,0.729863492063492,0.7620257757574136,0.7342935023862088
14
+ 0.256,8,0.673,0.7805,0.8235,0.87,0.673,0.673,0.2601666666666666,0.7805,0.16469999999999999,0.8235,0.087,0.87,0.7364648809523809,0.768642745259573,0.74088132990401
15
+ 0.288,9,0.679,0.789,0.83,0.8755,0.679,0.679,0.263,0.789,0.166,0.83,0.08755000000000002,0.8755,0.7435724206349207,0.7754211768339871,0.748020160386372
16
+ 0.32,10,0.6795,0.7975,0.8375,0.8825,0.6795,0.6795,0.2658333333333333,0.7975,0.1675,0.8375,0.08825,0.8825,0.7472067460317463,0.7799357946514196,0.7515110606154827
17
+ 0.352,11,0.6835,0.7995,0.843,0.8865,0.6835,0.6835,0.2665,0.7995,0.16860000000000003,0.843,0.08865,0.8865,0.7516863095238097,0.7843265508742878,0.7559771879915966
18
+ 0.384,12,0.686,0.807,0.85,0.892,0.686,0.686,0.269,0.807,0.17,0.85,0.08920000000000002,0.892,0.7555261904761905,0.7886232766354258,0.7596799368375443
19
+ 0.416,13,0.6925,0.809,0.8515,0.899,0.6925,0.6925,0.2696666666666666,0.809,0.1703,0.8515,0.08990000000000001,0.899,0.7604974206349207,0.7939585238732911,0.7643151373617535
20
+ 0.448,14,0.694,0.8135,0.858,0.9065,0.694,0.694,0.2711666666666666,0.8135,0.17160000000000003,0.858,0.09065000000000001,0.9065,0.7633210317460314,0.797838488742098,0.766722782226307
21
+ 0.48,15,0.6955,0.8165,0.861,0.907,0.6955,0.6955,0.2721666666666666,0.8165,0.17220000000000002,0.861,0.0907,0.907,0.7656529761904759,0.7998601550776309,0.7692181139028066
22
+ 0.512,16,0.696,0.8175,0.8635,0.9095,0.696,0.696,0.2725,0.8175,0.17270000000000005,0.8635,0.09095000000000002,0.9095,0.7672934523809519,0.8017461465199474,0.7708312234241821
23
+ 0.544,17,0.696,0.82,0.8675,0.9095,0.696,0.696,0.2733333333333333,0.82,0.1735,0.8675,0.09095000000000002,0.9095,0.7683271825396824,0.8026299233502736,0.7720590839932417
24
+ 0.576,18,0.7,0.8245,0.8675,0.914,0.7,0.7,0.27483333333333326,0.8245,0.17350000000000002,0.8675,0.09140000000000001,0.914,0.7718053571428569,0.8062557517386626,0.775273366883454
25
+ 0.608,19,0.7025,0.8265,0.869,0.916,0.7025,0.7025,0.2755,0.8265,0.1738,0.869,0.09160000000000001,0.916,0.7740037698412693,0.808422961487686,0.777433805214516
26
+ 0.64,20,0.7035,0.8265,0.87,0.9185,0.7035,0.7035,0.27549999999999997,0.8265,0.174,0.87,0.09185000000000001,0.9185,0.7754817460317456,0.8101227039731832,0.7788407891913901
27
+ 0.672,21,0.7045,0.829,0.871,0.9205,0.7045,0.7045,0.2763333333333333,0.829,0.1742,0.871,0.09205,0.9205,0.7770789682539677,0.8118171672346021,0.7804012721701683
28
+ 0.704,22,0.707,0.8295,0.874,0.923,0.707,0.707,0.2765,0.8295,0.1748,0.874,0.09230000000000001,0.923,0.7791535714285709,0.8139547247570527,0.7823493187453222
29
+ 0.736,23,0.709,0.831,0.8735,0.9265,0.709,0.709,0.277,0.831,0.17470000000000002,0.8735,0.09265000000000001,0.9265,0.7806351190476185,0.815841040996392,0.7835671920496632
30
+ 0.768,24,0.7115,0.8305,0.8745,0.926,0.7115,0.7115,0.2768333333333334,0.8305,0.1749,0.8745,0.0926,0.926,0.7821702380952374,0.8169055077744363,0.7852207442974272
31
+ 0.8,25,0.713,0.8325,0.875,0.927,0.713,0.713,0.2775,0.8325,0.175,0.875,0.0927,0.927,0.7832513888888881,0.81796168303122,0.7862742036940107
32
+ 0.832,26,0.7125,0.8325,0.8755,0.928,0.7125,0.7125,0.2775,0.8325,0.17510000000000003,0.8755,0.09280000000000001,0.928,0.7832660714285706,0.8182124259942498,0.7862380234452195
33
+ 0.864,27,0.714,0.834,0.8755,0.928,0.714,0.714,0.278,0.834,0.1751,0.8755,0.09280000000000001,0.928,0.784346031746031,0.8190474599929967,0.7873381010537127
34
+ 0.896,28,0.714,0.835,0.876,0.9275,0.714,0.714,0.2783333333333333,0.835,0.1752,0.876,0.09275,0.9275,0.7844458333333327,0.8190345716043556,0.7875113219785125
35
+ 0.928,29,0.7135,0.8355,0.877,0.9285,0.7135,0.7135,0.2785,0.8355,0.1754,0.877,0.09285,0.9285,0.7844027777777771,0.8192322567980077,0.7873782069610507
36
+ 0.96,30,0.714,0.8355,0.877,0.9285,0.714,0.714,0.2785,0.8355,0.17539999999999997,0.877,0.09285,0.9285,0.7846738095238089,0.8194406410771478,0.7876666097003387
37
+ 0.992,31,0.7135,0.8345,0.8775,0.9285,0.7135,0.7135,0.2781666666666666,0.8345,0.1755,0.8775,0.09285,0.9285,0.7844365079365073,0.8192642815893417,0.7874332571155255
38
+ 1.0,32,0.714,0.8365,0.877,0.9285,0.714,0.714,0.27883333333333327,0.8365,0.1754,0.877,0.09285,0.9285,0.7848236111111104,0.8195584918161248,0.7878148778237813
eval/Information-Retrieval_evaluation_dim_768_results.csv ADDED
@@ -0,0 +1,40 @@
+ epoch,steps,cosine-Accuracy@1,cosine-Accuracy@3,cosine-Accuracy@5,cosine-Accuracy@10,cosine-Precision@1,cosine-Recall@1,cosine-Precision@3,cosine-Recall@3,cosine-Precision@5,cosine-Recall@5,cosine-Precision@10,cosine-Recall@10,cosine-MRR@10,cosine-NDCG@10,cosine-MAP@100
+ 0.04,10,0.62675,0.7475,0.787375,0.839125,0.62675,0.62675,0.24916666666666665,0.7475,0.157475,0.787375,0.0839125,0.839125,0.6970065476190463,0.7313021522721747,0.7017586049481668
+ 0.004,1,0.62675,0.7475,0.787375,0.839125,0.62675,0.62675,0.24916666666666665,0.7475,0.157475,0.787375,0.0839125,0.839125,0.6970065476190463,0.7313021522721747,0.7017586049481668
+ 0.008,1,0.62675,0.7475,0.787375,0.839125,0.62675,0.62675,0.24916666666666665,0.7475,0.157475,0.787375,0.0839125,0.839125,0.6970065476190463,0.7313021522721747,0.7017586049481668
+ 0.016,2,0.627125,0.7475,0.78775,0.839625,0.627125,0.627125,0.24916666666666662,0.7475,0.15755,0.78775,0.08396250000000001,0.839625,0.697394444444443,0.7317144781338307,0.7021009618559859
+ 0.024,3,0.627625,0.747625,0.78775,0.840125,0.627625,0.627625,0.2492083333333333,0.747625,0.15755000000000002,0.78775,0.08401250000000002,0.840125,0.6977708829365066,0.732109251952619,0.702458572749224
+ 0.032,4,0.628125,0.748375,0.788625,0.841125,0.628125,0.628125,0.2494583333333333,0.748375,0.157725,0.788625,0.0841125,0.841125,0.6983445932539664,0.73278828853728,0.7029842805539042
+ 0.008,1,0.622125,0.743125,0.784375,0.836625,0.622125,0.622125,0.2477083333333333,0.743125,0.156875,0.784375,0.0836625,0.836625,0.6928252976190464,0.7274848128840606,0.6974804142607848
+ 0.032,1,0.729,0.8245,0.8665,0.9045,0.729,0.729,0.2748333333333333,0.8245,0.1733,0.8665,0.09045,0.9045,0.78605873015873,0.8146233575175829,0.7892834369675533
+ 0.064,2,0.7305,0.8265,0.869,0.906,0.7305,0.7305,0.2755,0.8265,0.1738,0.869,0.09060000000000001,0.906,0.7875956349206346,0.8161756499685691,0.790779415670245
+ 0.096,3,0.737,0.833,0.8715,0.9075,0.737,0.737,0.2776666666666666,0.833,0.1743,0.8715,0.09075,0.9075,0.7926101190476188,0.8203841578858828,0.795856766658934
+ 0.128,4,0.741,0.8385,0.876,0.91,0.741,0.741,0.2795,0.8385,0.17520000000000002,0.876,0.09100000000000001,0.91,0.7979531746031743,0.8251617754991674,0.8012390257014121
+ 0.16,5,0.7465,0.8445,0.885,0.917,0.7465,0.7465,0.2814999999999999,0.8445,0.177,0.885,0.0917,0.917,0.8040509920634915,0.8314633219747707,0.8069413574570579
+ 0.192,6,0.751,0.852,0.8885,0.92,0.751,0.751,0.284,0.852,0.1777,0.8885,0.092,0.92,0.8091216269841263,0.8361132764895302,0.8119339101033282
+ 0.224,7,0.751,0.858,0.891,0.9215,0.751,0.751,0.286,0.858,0.17820000000000003,0.891,0.09215000000000001,0.9215,0.8103501984126977,0.8374843292743606,0.8132450271457848
+ 0.256,8,0.7545,0.862,0.8935,0.924,0.7545,0.7545,0.2873333333333333,0.862,0.1787,0.8935,0.09240000000000001,0.924,0.8127174603174596,0.8398562335625106,0.815524276482181
+ 0.288,9,0.759,0.8625,0.8945,0.925,0.759,0.759,0.2875,0.8625,0.1789,0.8945,0.09250000000000001,0.925,0.8158624999999992,0.8424711472124009,0.8187995668566092
+ 0.32,10,0.7605,0.865,0.895,0.926,0.7605,0.7605,0.28833333333333333,0.865,0.179,0.895,0.09260000000000003,0.926,0.8176849206349199,0.8441224154196798,0.8207136431469253
+ 0.352,11,0.765,0.867,0.897,0.928,0.765,0.765,0.289,0.867,0.17940000000000003,0.897,0.09280000000000001,0.928,0.8206541666666659,0.8468001158580941,0.823656190832764
+ 0.384,12,0.766,0.869,0.9015,0.9295,0.766,0.766,0.28966666666666663,0.869,0.18030000000000004,0.9015,0.09295000000000002,0.9295,0.8220462301587299,0.8482367139537283,0.8250310865002473
+ 0.416,13,0.766,0.8705,0.9035,0.931,0.766,0.766,0.29016666666666663,0.8705,0.18070000000000006,0.9035,0.0931,0.931,0.8230438492063489,0.849386244106597,0.8260404766413967
+ 0.448,14,0.768,0.8705,0.906,0.933,0.768,0.768,0.29016666666666663,0.8705,0.18120000000000003,0.906,0.09330000000000001,0.933,0.8245803571428572,0.8510064069021597,0.8275233721994071
+ 0.48,15,0.7665,0.8705,0.908,0.9365,0.7665,0.7665,0.29016666666666663,0.8705,0.1816,0.908,0.09365000000000001,0.9365,0.8247773809523808,0.8519690493412264,0.8275216856839888
+ 0.512,16,0.767,0.872,0.909,0.9385,0.767,0.767,0.29066666666666663,0.872,0.18180000000000002,0.909,0.09385,0.9385,0.8260617063492064,0.8534295762476782,0.8287243278120745
+ 0.544,17,0.7665,0.8765,0.91,0.9395,0.7665,0.7665,0.29216666666666663,0.8765,0.182,0.91,0.09395,0.9395,0.8265644841269838,0.85410523359625,0.8291954347186519
+ 0.576,18,0.7665,0.879,0.9115,0.942,0.7665,0.7665,0.2929999999999999,0.879,0.18230000000000002,0.9115,0.09420000000000002,0.942,0.8275023809523806,0.8554239362145454,0.8300007101998121
+ 0.608,19,0.77,0.881,0.912,0.9425,0.77,0.77,0.29366666666666663,0.881,0.18240000000000003,0.912,0.09425000000000001,0.9425,0.8296075396825393,0.8571165900335822,0.8321480183985285
+ 0.64,20,0.7705,0.8825,0.913,0.943,0.7705,0.7705,0.29416666666666663,0.8825,0.1826,0.913,0.09430000000000001,0.943,0.8307418650793646,0.8581212309055561,0.833318979870546
+ 0.672,21,0.7735,0.884,0.9125,0.9435,0.7735,0.7735,0.29466666666666663,0.884,0.1825,0.9125,0.09435000000000002,0.9435,0.8328736111111106,0.8598683172832627,0.8354782460926624
+ 0.704,22,0.7745,0.886,0.913,0.9455,0.7745,0.7745,0.2953333333333333,0.886,0.1826,0.913,0.09455000000000001,0.9455,0.8341117063492056,0.8612654106913422,0.8365684448363621
+ 0.736,23,0.7755,0.886,0.914,0.946,0.7755,0.7755,0.2953333333333333,0.886,0.18280000000000002,0.914,0.0946,0.946,0.8349136904761898,0.8619927812856331,0.8373824847338985
+ 0.768,24,0.775,0.8875,0.914,0.9475,0.775,0.775,0.29583333333333334,0.8875,0.18280000000000002,0.914,0.09475,0.9475,0.8350505952380946,0.8624415608033617,0.8374210480672024
+ 0.8,25,0.776,0.888,0.9145,0.947,0.776,0.776,0.296,0.888,0.18290000000000003,0.9145,0.09470000000000002,0.947,0.8359511904761897,0.8630339076624403,0.8383963877039214
+ 0.832,26,0.7765,0.8885,0.915,0.9465,0.7765,0.7765,0.29616666666666663,0.8885,0.183,0.915,0.09465000000000001,0.9465,0.8362726190476186,0.8631867714673521,0.8387877865210644
+ 0.864,27,0.7775,0.889,0.9155,0.947,0.7775,0.7775,0.29633333333333334,0.889,0.1831,0.9155,0.09470000000000002,0.947,0.8368220238095233,0.8637095307333018,0.8393139804615898
+ 0.896,28,0.777,0.8885,0.916,0.947,0.777,0.777,0.29616666666666663,0.8885,0.18320000000000003,0.916,0.09470000000000002,0.947,0.8367289682539678,0.863650975023247,0.8392289785025171
+ 0.928,29,0.777,0.8885,0.9165,0.9465,0.777,0.777,0.29616666666666663,0.8885,0.18330000000000002,0.9165,0.09465000000000001,0.9465,0.8366059523809518,0.8634508462011512,0.8391514317674956
+ 0.96,30,0.7775,0.889,0.916,0.947,0.7775,0.7775,0.29633333333333334,0.889,0.18320000000000003,0.916,0.09470000000000002,0.947,0.8371333333333328,0.8639644931826581,0.8396465624394464
+ 0.992,31,0.7775,0.889,0.917,0.947,0.7775,0.7775,0.29633333333333334,0.889,0.18340000000000004,0.917,0.09470000000000002,0.947,0.8371458333333328,0.8639747676476234,0.8396591650905201
+ 1.0,32,0.7775,0.8885,0.917,0.947,0.7775,0.7775,0.29616666666666663,0.8885,0.18340000000000004,0.917,0.09470000000000002,0.947,0.8369255952380947,0.8637977392462012,0.8394380047776188
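The evaluation CSVs above can be scanned mechanically to find the best checkpoint. A minimal sketch (stdlib only); the header and the two sample rows are trimmed copies of the dim_768 file above, keeping only two of its metric columns:

```python
import csv
import io

# Two rows copied (and column-trimmed) from the dim_768 evaluation CSV above.
SAMPLE = """epoch,steps,cosine-Accuracy@1,cosine-NDCG@10
0.04,10,0.62675,0.7313021522721747
0.992,31,0.7775,0.8639747676476234
"""

def best_checkpoint(csv_text, metric="cosine-NDCG@10"):
    """Return the CSV row (as a dict) with the highest value for `metric`."""
    rows = csv.DictReader(io.StringIO(csv_text))
    return max(rows, key=lambda row: float(row[metric]))

best = best_checkpoint(SAMPLE)
print(best["epoch"], best["cosine-NDCG@10"])  # → 0.992 0.8639747676476234
```

In the full file, NDCG@10 plateaus near the end of training (~0.864 from epoch 0.96 onward), so any late checkpoint is essentially equivalent by this metric.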
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:69e84d3b2f13e1f25cc8fe6f0fca4ef2e9b712fb1598ae0ab9f9a8dd6adb0504
+ size 437967648
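The file above is a git-LFS pointer, not the weights themselves; the actual safetensors blob is fetched by its SHA-256 OID on checkout. A small sketch of reading such a pointer (the pointer text is copied from the file above):

```python
# Sketch: parse a git-LFS pointer file (key-value pairs, one per line)
# into a dict. Pointer text copied from model.safetensors above.
def parse_lfs_pointer(text):
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

POINTER = """version https://git-lfs.github.com/spec/v1
oid sha256:69e84d3b2f13e1f25cc8fe6f0fca4ef2e9b712fb1598ae0ab9f9a8dd6adb0504
size 437967648
"""

info = parse_lfs_pointer(POINTER)
print(info["size"])  # → 437967648 (bytes, i.e. ~438 MB of weights)
```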
modules.json ADDED
@@ -0,0 +1,20 @@
+ [
+   {
+     "idx": 0,
+     "name": "0",
+     "path": "",
+     "type": "sentence_transformers.models.Transformer"
+   },
+   {
+     "idx": 1,
+     "name": "1",
+     "path": "1_Pooling",
+     "type": "sentence_transformers.models.Pooling"
+   },
+   {
+     "idx": 2,
+     "name": "2",
+     "path": "2_Normalize",
+     "type": "sentence_transformers.models.Normalize"
+   }
+ ]
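modules.json declares a three-stage pipeline: Transformer → Pooling → Normalize. The last two stages can be sketched in plain Python; this assumes mean pooling over non-padding tokens, as the Pooling config (`pooling_mode_mean_tokens: true`) in this commit indicates. The toy token vectors below are illustrative, not real model outputs:

```python
import math

# Sketch of the Pooling and Normalize stages from modules.json above,
# assuming mean pooling over tokens where attention_mask == 1.
def mean_pool(token_embeddings, attention_mask):
    dim = len(token_embeddings[0])
    summed = [0.0] * dim
    count = 0
    for vec, keep in zip(token_embeddings, attention_mask):
        if keep:
            count += 1
            for i, x in enumerate(vec):
                summed[i] += x
    return [x / max(count, 1) for x in summed]

def l2_normalize(vec):
    norm = math.sqrt(sum(x * x for x in vec)) or 1.0
    return [x / norm for x in vec]

tokens = [[1.0, 0.0], [3.0, 4.0], [100.0, 100.0]]  # (seq_len=3, dim=2) toy values
mask = [1, 1, 0]  # last position is padding and is ignored by the pooler
embedding = l2_normalize(mean_pool(tokens, mask))  # unit-length sentence vector
```

Because of the Normalize stage, every embedding has unit L2 norm, so cosine similarity reduces to a dot product, which is why the evaluation CSVs report cosine metrics.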
sentence_bert_config.json ADDED
@@ -0,0 +1,4 @@
+ {
+   "max_seq_length": 384,
+   "do_lower_case": false
+ }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,16 @@
+ {
+   "backend": "tokenizers",
+   "bos_token": "<s>",
+   "cls_token": "<s>",
+   "do_lower_case": true,
+   "eos_token": "</s>",
+   "is_local": false,
+   "mask_token": "<mask>",
+   "model_max_length": 384,
+   "pad_token": "<pad>",
+   "sep_token": "</s>",
+   "strip_accents": null,
+   "tokenize_chinese_chars": true,
+   "tokenizer_class": "MPNetTokenizer",
+   "unk_token": "[UNK]"
+ }
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:7e8d5380b372edd789d97972f713b2d45ad34bd303aa1fd0212daa8a8382d586
+ size 5521