1shoomun committed
Commit 6391626 · verified · 1 Parent(s): 7957364

Updated Weights
README.md CHANGED
@@ -4,38 +4,37 @@ tags:
  - sentence-similarity
  - feature-extraction
  - generated_from_trainer
- - dataset_size:2620
  - loss:MultipleNegativesRankingLoss
  - loss:CosineSimilarityLoss
  base_model: jinaai/jina-embedding-b-en-v1
  widget:
- - source_sentence: What sector am I most heavily invested in?
  sentences:
- - 'Show me how to switch my stock portfolio to mutual funds
-
- '
- - What percentage of my portfolio is in X
- - Which sector do I invest most in?
- - source_sentence: Can you tell me how my portfolio ranks among others?
  sentences:
- - What is my AMC wise split ?
- - In which funds am I paying highest fees
- - Compare my portfolio with others?
- - source_sentence: Which of my funds has the highest risk level?
  sentences:
- - Give me python code to find best funds in my portfolio
- - Show my stocks ranked by performance
- - Show my riskiest mutual funds
- - source_sentence: What's going right with my portfolio?
  sentences:
- - Is my portfolio linked?
- - My portfolio returns over all the years
- - What's going well in my portfolio
- - source_sentence: I'd like to know the percentage of large cap in my investments.
  sentences:
- - Show my riskiest holdings
- - Can you show what percentage of my portfolio consists of large cap
- - What is the expected return of my portfolio?
  pipeline_tag: sentence-similarity
  library_name: sentence-transformers
  metrics:
@@ -65,10 +64,10 @@ model-index:
  type: test-eval
  metrics:
  - type: cosine_accuracy@1
- value: 0.8625954198473282
  name: Cosine Accuracy@1
  - type: cosine_accuracy@3
- value: 0.9961832061068703
  name: Cosine Accuracy@3
  - type: cosine_accuracy@5
  value: 1.0
@@ -77,10 +76,10 @@ model-index:
  value: 1.0
  name: Cosine Accuracy@10
  - type: cosine_precision@1
- value: 0.8625954198473282
  name: Cosine Precision@1
  - type: cosine_precision@3
- value: 0.33206106870229
  name: Cosine Precision@3
  - type: cosine_precision@5
  value: 0.19999999999999998
@@ -89,10 +88,10 @@ model-index:
  value: 0.09999999999999999
  name: Cosine Precision@10
  - type: cosine_recall@1
- value: 0.8625954198473282
  name: Cosine Recall@1
  - type: cosine_recall@3
- value: 0.9961832061068703
  name: Cosine Recall@3
  - type: cosine_recall@5
  value: 1.0
@@ -101,13 +100,13 @@ model-index:
  value: 1.0
  name: Cosine Recall@10
  - type: cosine_ndcg@10
- value: 0.9460250731496836
  name: Cosine Ndcg@10
  - type: cosine_mrr@10
- value: 0.9271628498727736
  name: Cosine Mrr@10
  - type: cosine_map@100
- value: 0.9271628498727736
  name: Cosine Map@100
  ---
 
@@ -160,9 +159,9 @@ from sentence_transformers import SentenceTransformer
  model = SentenceTransformer("sentence_transformers_model_id")
  # Run inference
  sentences = [
- "I'd like to know the percentage of large cap in my investments.",
- 'Can you show what percentage of my portfolio consists of large cap',
- 'Show my riskiest holdings',
  ]
  embeddings = model.encode(sentences)
  print(embeddings.shape)
@@ -207,23 +206,23 @@ You can finetune this model on your own dataset.
  * Dataset: `test-eval`
  * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)

- | Metric              | Value     |
- |:--------------------|:----------|
- | cosine_accuracy@1   | 0.8626    |
- | cosine_accuracy@3   | 0.9962    |
- | cosine_accuracy@5   | 1.0       |
- | cosine_accuracy@10  | 1.0       |
- | cosine_precision@1  | 0.8626    |
- | cosine_precision@3  | 0.3321    |
- | cosine_precision@5  | 0.2       |
- | cosine_precision@10 | 0.1       |
- | cosine_recall@1     | 0.8626    |
- | cosine_recall@3     | 0.9962    |
- | cosine_recall@5     | 1.0       |
- | cosine_recall@10    | 1.0       |
- | **cosine_ndcg@10**  | **0.946** |
- | cosine_mrr@10       | 0.9272    |
- | cosine_map@100      | 0.9272    |

  <!--
  ## Bias, Risks and Limitations
@@ -243,19 +242,19 @@ You can finetune this model on your own dataset.

  #### Unnamed Dataset

- * Size: 1,310 training samples
  * Columns: <code>sentence_0</code>, <code>sentence_1</code>, and <code>label</code>
  * Approximate statistics based on the first 1000 samples:
  |         | sentence_0 | sentence_1 | label |
  |:--------|:-----------|:-----------|:------|
  | type    | string     | string     | float |
- | details | <ul><li>min: 4 tokens</li><li>mean: 10.62 tokens</li><li>max: 22 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 9.06 tokens</li><li>max: 17 tokens</li></ul> | <ul><li>min: 1.0</li><li>mean: 1.0</li><li>max: 1.0</li></ul> |
  * Samples:
- | sentence_0 | sentence_1 | label |
- |:-----------|:-----------|:------|
- | <code>are there any of my funds that are lagging behind</code> | <code>do I hold any funds that haven't been performing well</code> | <code>1.0</code> |
- | <code>Which sectors are performing the best in my portfolio?</code> | <code>What are my best performing sectors?</code> | <code>1.0</code> |
- | <code>List some of my top holdings</code> | <code>Show some of my best performing holdings</code> | <code>1.0</code> |
  * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
@@ -266,19 +265,19 @@ You can finetune this model on your own dataset.

  #### Unnamed Dataset

- * Size: 1,310 training samples
  * Columns: <code>sentence_0</code>, <code>sentence_1</code>, and <code>label</code>
  * Approximate statistics based on the first 1000 samples:
  |         | sentence_0 | sentence_1 | label |
  |:--------|:-----------|:-----------|:------|
  | type    | string     | string     | float |
- | details | <ul><li>min: 4 tokens</li><li>mean: 10.68 tokens</li><li>max: 22 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 9.13 tokens</li><li>max: 17 tokens</li></ul> | <ul><li>min: 1.0</li><li>mean: 1.0</li><li>max: 1.0</li></ul> |
  * Samples:
- | sentence_0 | sentence_1 | label |
- |:-----------|:-----------|:------|
- | <code>I need my portfolio to hit 1000% returns by next month</code> | <code>make my portfolio return 1000% by next month</code> | <code>1.0</code> |
- | <code>What are my stocks?</code> | <code>Show my stocks</code> | <code>1.0</code> |
- | <code>I'd like to know my sector distribution.</code> | <code>What is my sector allocation?</code> | <code>1.0</code> |
  * Loss: [<code>CosineSimilarityLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cosinesimilarityloss) with these parameters:
  ```json
  {
@@ -416,22 +415,20 @@ You can finetune this model on your own dataset.
  </details>

  ### Training Logs
- | Epoch   | Step | Training Loss | test-eval_cosine_ndcg@10 |
- |:-------:|:----:|:-------------:|:------------------------:|
- | 1.0     | 82   | -             | 0.8929                   |
- | 2.0     | 164  | -             | 0.9007                   |
- | 3.0     | 246  | -             | 0.9112                   |
- | 4.0     | 328  | -             | 0.9188                   |
- | 5.0     | 410  | -             | 0.9285                   |
- | 6.0     | 492  | -             | 0.9286                   |
- | 6.0976  | 500  | 0.2352        | 0.9291                   |
- | 7.0     | 574  | -             | 0.9356                   |
- | 8.0     | 656  | -             | 0.9404                   |
- | 9.0     | 738  | -             | 0.9406                   |
- | 10.0    | 820  | -             | 0.9434                   |
- | 11.0    | 902  | -             | 0.9424                   |
- | 12.0    | 984  | -             | 0.9455                   |
- | 12.1951 | 1000 | 0.164         | 0.9460                   |


  ### Framework Versions
 
  - sentence-similarity
  - feature-extraction
  - generated_from_trainer
+ - dataset_size:2640
  - loss:MultipleNegativesRankingLoss
  - loss:CosineSimilarityLoss
  base_model: jinaai/jina-embedding-b-en-v1
  widget:
+ - source_sentence: Can you tell me how my portfolio did last week?
  sentences:
+ - Suggest recommendations for me
+ - Do you have any insights on my portfolio
+ - How did my portfolio perform last week ?
+ - source_sentence: What are my most risky holdings?
  sentences:
+ - View my holdings
+ - Show my market cap breakdown
+ - Show my riskiest holdings
+ - source_sentence: What profits do I have in my portfolio?
  sentences:
+ - How can I swap my stocks for mutual funds?
+ - Show me the cash in my portfolio?
+ - What are the profits I have gained in my portfolio
+ - source_sentence: I'm curious, which investments have the highest volatility in my
+ portfolio?
  sentences:
+ - Which sector do I invest most in?
+ - is there anything wrong with my investments?
+ - Which of my investments have the highest volatility?
+ - source_sentence: Sort my investment portfolio by ESG rating, please.
  sentences:
+ - What stock makes up the largest percentage of my portfolio?
+ - Can you show my worst performing holdings
+ - Show my investments sorted by ESG rating.
  pipeline_tag: sentence-similarity
  library_name: sentence-transformers
  metrics:
 
  type: test-eval
  metrics:
  - type: cosine_accuracy@1
+ value: 0.8636363636363636
  name: Cosine Accuracy@1
  - type: cosine_accuracy@3
+ value: 0.9924242424242424
  name: Cosine Accuracy@3
  - type: cosine_accuracy@5
  value: 1.0
 
  value: 1.0
  name: Cosine Accuracy@10
  - type: cosine_precision@1
+ value: 0.8636363636363636
  name: Cosine Precision@1
  - type: cosine_precision@3
+ value: 0.3308080808080807
  name: Cosine Precision@3
  - type: cosine_precision@5
  value: 0.19999999999999998
 
  value: 0.09999999999999999
  name: Cosine Precision@10
  - type: cosine_recall@1
+ value: 0.8636363636363636
  name: Cosine Recall@1
  - type: cosine_recall@3
+ value: 0.9924242424242424
  name: Cosine Recall@3
  - type: cosine_recall@5
  value: 1.0
 
  value: 1.0
  name: Cosine Recall@10
  - type: cosine_ndcg@10
+ value: 0.9436916551342168
  name: Cosine Ndcg@10
  - type: cosine_mrr@10
+ value: 0.9242424242424244
  name: Cosine Mrr@10
  - type: cosine_map@100
+ value: 0.9242424242424242
  name: Cosine Map@100
  ---
 
 
  model = SentenceTransformer("sentence_transformers_model_id")
  # Run inference
  sentences = [
+ 'Sort my investment portfolio by ESG rating, please.',
+ 'Show my investments sorted by ESG rating.',
+ 'Can you show my worst performing holdings',
  ]
  embeddings = model.encode(sentences)
  print(embeddings.shape)
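After `model.encode`, the usual next step on a sentence-similarity card is scoring the sentences against each other with cosine similarity. A minimal sketch, using random stand-in vectors of the base model's 768-dimensional output size so it runs without downloading the model (the real `embeddings` array would come from the snippet above):

```python
import numpy as np

def cosine_sim_matrix(embeddings: np.ndarray) -> np.ndarray:
    """Pairwise cosine similarity between the row vectors of `embeddings`."""
    unit = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    return unit @ unit.T

# Stand-in for model.encode(sentences); the real output is a (3, 768) float array.
rng = np.random.default_rng(0)
embeddings = rng.normal(size=(3, 768))

sim = cosine_sim_matrix(embeddings)
print(sim.shape)  # (3, 3); sim[i, j] scores sentence i against sentence j
```

With the actual model, the two ESG-rating paraphrases above should score far higher against each other than against the unrelated third sentence.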
 
  * Dataset: `test-eval`
  * Evaluated with [<code>InformationRetrievalEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)

+ | Metric              | Value      |
+ |:--------------------|:-----------|
+ | cosine_accuracy@1   | 0.8636     |
+ | cosine_accuracy@3   | 0.9924     |
+ | cosine_accuracy@5   | 1.0        |
+ | cosine_accuracy@10  | 1.0        |
+ | cosine_precision@1  | 0.8636     |
+ | cosine_precision@3  | 0.3308     |
+ | cosine_precision@5  | 0.2        |
+ | cosine_precision@10 | 0.1        |
+ | cosine_recall@1     | 0.8636     |
+ | cosine_recall@3     | 0.9924     |
+ | cosine_recall@5     | 1.0        |
+ | cosine_recall@10    | 1.0        |
+ | **cosine_ndcg@10**  | **0.9437** |
+ | cosine_mrr@10       | 0.9242     |
+ | cosine_map@100      | 0.9242     |
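The @k metrics in this table follow the standard information-retrieval definitions for the single-relevant-document case, which is why recall@k equals accuracy@k here. A toy sketch of how accuracy@k and MRR@k fall out of per-query ranks (illustrative numbers, not the evaluator's actual output):

```python
# Rank (1-based) at which each query's single relevant sentence was retrieved.
ranks = [1, 1, 3, 1, 2, 1, 1, 4, 1, 1]

def accuracy_at_k(ranks, k):
    # Fraction of queries whose relevant item appears in the top k.
    return sum(r <= k for r in ranks) / len(ranks)

def mrr_at_k(ranks, k):
    # Mean reciprocal rank, counting only hits within the top k.
    return sum(1.0 / r for r in ranks if r <= k) / len(ranks)

print(accuracy_at_k(ranks, 1))        # 0.7
print(round(mrr_at_k(ranks, 10), 4))  # 0.8083
```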
 
  <!--
  ## Bias, Risks and Limitations
 

  #### Unnamed Dataset

+ * Size: 1,320 training samples
  * Columns: <code>sentence_0</code>, <code>sentence_1</code>, and <code>label</code>
  * Approximate statistics based on the first 1000 samples:
  |         | sentence_0 | sentence_1 | label |
  |:--------|:-----------|:-----------|:------|
  | type    | string     | string     | float |
+ | details | <ul><li>min: 4 tokens</li><li>mean: 10.62 tokens</li><li>max: 20 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 9.06 tokens</li><li>max: 17 tokens</li></ul> | <ul><li>min: 1.0</li><li>mean: 1.0</li><li>max: 1.0</li></ul> |
  * Samples:
+ | sentence_0 | sentence_1 | label |
+ |:-----------|:-----------|:------|
+ | <code>How does my portfolio score look?</code> | <code>What is my portfolio score?</code> | <code>1.0</code> |
+ | <code>Show me the risk profile of my portfolio.</code> | <code>Details on my portfolio risk</code> | <code>1.0</code> |
+ | <code>Which of my shares are the most erratic?</code> | <code>Which of my stocks are most volatile?</code> | <code>1.0</code> |
  * Loss: [<code>MultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#multiplenegativesrankingloss) with these parameters:
  ```json
  {
 

  #### Unnamed Dataset

+ * Size: 1,320 training samples
  * Columns: <code>sentence_0</code>, <code>sentence_1</code>, and <code>label</code>
  * Approximate statistics based on the first 1000 samples:
  |         | sentence_0 | sentence_1 | label |
  |:--------|:-----------|:-----------|:------|
  | type    | string     | string     | float |
+ | details | <ul><li>min: 4 tokens</li><li>mean: 10.62 tokens</li><li>max: 22 tokens</li></ul> | <ul><li>min: 4 tokens</li><li>mean: 9.05 tokens</li><li>max: 17 tokens</li></ul> | <ul><li>min: 1.0</li><li>mean: 1.0</li><li>max: 1.0</li></ul> |
  * Samples:
+ | sentence_0 | sentence_1 | label |
+ |:-----------|:-----------|:------|
+ | <code>What holdings carry the least risk in my portfolio?</code> | <code>What are the least risky holdings in my portfolio?</code> | <code>1.0</code> |
+ | <code>How have my investments fared over the previous year?</code> | <code>How has my portfolio performed over the last year?</code> | <code>1.0</code> |
+ | <code>How well is my portfolio performing?</code> | <code>How is my portfolio performing</code> | <code>1.0</code> |
  * Loss: [<code>CosineSimilarityLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cosinesimilarityloss) with these parameters:
  ```json
  {
 
  </details>

  ### Training Logs
+ | Epoch  | Step | Training Loss | test-eval_cosine_ndcg@10 |
+ |:------:|:----:|:-------------:|:------------------------:|
+ | 1.0    | 84   | -             | 0.8877                   |
+ | 2.0    | 168  | -             | 0.8944                   |
+ | 3.0    | 252  | -             | 0.9042                   |
+ | 4.0    | 336  | -             | 0.9123                   |
+ | 5.0    | 420  | -             | 0.9241                   |
+ | 5.9524 | 500  | 0.2478        | 0.9209                   |
+ | 6.0    | 504  | -             | 0.9209                   |
+ | 7.0    | 588  | -             | 0.9261                   |
+ | 8.0    | 672  | -             | 0.9327                   |
+ | 9.0    | 756  | -             | 0.9364                   |
+ | 10.0   | 840  | -             | 0.9370                   |
+ | 11.0   | 924  | -             | 0.9437                   |
 
 
  ### Framework Versions
eval/Information-Retrieval_evaluation_test-eval_results.csv CHANGED
@@ -282,3 +282,55 @@ epoch,steps,cosine-Accuracy@1,cosine-Accuracy@3,cosine-Accuracy@5,cosine-Accurac
  13.0,1066,0.8625954198473282,0.9961832061068703,1.0,1.0,0.8625954198473282,0.8625954198473282,0.33206106870229,0.9961832061068703,0.19999999999999998,1.0,0.09999999999999999,1.0,0.9271628498727736,0.9460250731496836,0.9271628498727736
  14.0,1148,0.8625954198473282,0.9961832061068703,1.0,1.0,0.8625954198473282,0.8625954198473282,0.33206106870229,0.9961832061068703,0.19999999999999998,1.0,0.09999999999999999,1.0,0.9271628498727736,0.9460250731496836,0.9271628498727736
  15.0,1230,0.8625954198473282,0.9961832061068703,1.0,1.0,0.8625954198473282,0.8625954198473282,0.33206106870229,0.9961832061068703,0.19999999999999998,1.0,0.09999999999999999,1.0,0.9271628498727736,0.9460250731496836,0.9271628498727736
+ 1.0,82,0.7824427480916031,0.9312977099236641,0.9580152671755725,0.9809160305343512,0.7824427480916031,0.7824427480916031,0.31043256997455465,0.9312977099236641,0.1916030534351145,0.9580152671755725,0.0980916030534351,0.9809160305343512,0.860570398642918,0.8905145801993015,0.861361998713256
+ 1.0,82,0.7824427480916031,0.9312977099236641,0.9580152671755725,0.9809160305343512,0.7824427480916031,0.7824427480916031,0.31043256997455465,0.9312977099236641,0.1916030534351145,0.9580152671755725,0.0980916030534351,0.9809160305343512,0.8606385556767239,0.8905827804151322,0.86144605905495
+ 2.0,164,0.7938931297709924,0.9351145038167938,0.9618320610687023,0.9847328244274809,0.7938931297709924,0.7938931297709924,0.31170483460559795,0.9351145038167938,0.19236641221374043,0.9618320610687023,0.09847328244274808,0.9847328244274809,0.8679374166969588,0.8969072384575816,0.8686258801096439
+ 3.0,246,0.8091603053435115,0.9427480916030534,0.9694656488549618,0.9885496183206107,0.8091603053435115,0.8091603053435115,0.31424936386768443,0.9427480916030534,0.19389312977099235,0.9694656488549618,0.09885496183206106,0.9885496183206107,0.8803480552526356,0.9073392892160393,0.8809831990175502
+ 4.0,328,0.8206106870229007,0.9465648854961832,0.9770992366412213,0.9923664122137404,0.8206106870229007,0.8206106870229007,0.3155216284987277,0.9465648854961832,0.19541984732824427,0.9770992366412213,0.09923664122137404,0.9923664122137404,0.8892221010541622,0.9149911482565117,0.8896283704321423
+ 5.0,410,0.8435114503816794,0.950381679389313,0.9770992366412213,0.9961832061068703,0.8435114503816794,0.8435114503816794,0.31679389312977096,0.950381679389313,0.19541984732824427,0.9770992366412213,0.09961832061068703,0.9961832061068703,0.9029731612746882,0.9261241949153339,0.9032912274324489
+ 6.0,492,0.8358778625954199,0.9580152671755725,0.9847328244274809,1.0,0.8358778625954199,0.8358778625954199,0.31933842239185745,0.9580152671755725,0.1969465648854962,0.9847328244274809,0.09999999999999999,1.0,0.9016993820428936,0.9262880503236717,0.9016993820428936
+ 6.097560975609756,500,0.8396946564885496,0.9618320610687023,0.9847328244274809,1.0,0.8396946564885496,0.8396946564885496,0.3206106870229007,0.9618320610687023,0.1969465648854962,0.9847328244274809,0.09999999999999999,1.0,0.9039258451472193,0.9279613086761722,0.9039258451472193
+ 7.0,574,0.8435114503816794,0.9656488549618321,0.9923664122137404,1.0,0.8435114503816794,0.8435114503816794,0.3218829516539439,0.9656488549618321,0.1984732824427481,0.9923664122137404,0.09999999999999999,1.0,0.9089694656488551,0.9319641455171047,0.908969465648855
+ 8.0,656,0.851145038167939,0.9732824427480916,0.9923664122137404,1.0,0.851145038167939,0.851145038167939,0.32442748091603046,0.9732824427480916,0.1984732824427481,0.9923664122137404,0.09999999999999999,1.0,0.9129770992366413,0.9349781965628212,0.9129770992366412
+ 9.0,738,0.8625954198473282,0.9847328244274809,0.9923664122137404,1.0,0.8625954198473282,0.8625954198473282,0.32824427480916024,0.9847328244274809,0.1984732824427481,0.9923664122137404,0.09999999999999999,1.0,0.9202926208651401,0.9404977035041834,0.92029262086514
+ 10.0,820,0.8625954198473282,0.9809160305343512,0.9923664122137404,1.0,0.8625954198473282,0.8625954198473282,0.326972010178117,0.9809160305343512,0.1984732824427481,0.9923664122137404,0.09999999999999999,1.0,0.9227099236641223,0.942399303974405,0.9227099236641222
+ 11.0,902,0.8549618320610687,0.9809160305343512,1.0,1.0,0.8549618320610687,0.8549618320610687,0.326972010178117,0.9809160305343512,0.19999999999999998,1.0,0.09999999999999999,1.0,0.9193384223918576,0.9399831761050722,0.9193384223918575
+ 12.0,984,0.8664122137404581,0.9961832061068703,1.0,1.0,0.8664122137404581,0.8664122137404581,0.33206106870229,0.9961832061068703,0.19999999999999998,1.0,0.09999999999999999,1.0,0.9277989821882953,0.9464342744446671,0.927798982188295
+ 12.195121951219512,1000,0.8625954198473282,0.9961832061068703,1.0,1.0,0.8625954198473282,0.8625954198473282,0.33206106870229,0.9961832061068703,0.19999999999999998,1.0,0.09999999999999999,1.0,0.9258905852417304,0.9450256093819626,0.9258905852417302
+ 13.0,1066,0.8625954198473282,0.9961832061068703,1.0,1.0,0.8625954198473282,0.8625954198473282,0.33206106870229,0.9961832061068703,0.19999999999999998,1.0,0.09999999999999999,1.0,0.9258905852417304,0.9450256093819626,0.9258905852417302
+ 14.0,1148,0.8625954198473282,0.9961832061068703,1.0,1.0,0.8625954198473282,0.8625954198473282,0.33206106870229,0.9961832061068703,0.19999999999999998,1.0,0.09999999999999999,1.0,0.9258905852417304,0.9450256093819626,0.9258905852417302
+ 15.0,1230,0.8625954198473282,0.9961832061068703,1.0,1.0,0.8625954198473282,0.8625954198473282,0.33206106870229,0.9961832061068703,0.19999999999999998,1.0,0.09999999999999999,1.0,0.9258905852417304,0.9450256093819626,0.9258905852417302
+ 1.0,84,0.7765151515151515,0.928030303030303,0.9545454545454546,0.9772727272727273,0.7765151515151515,0.7765151515151515,0.3093434343434343,0.928030303030303,0.19090909090909092,0.9545454545454546,0.09772727272727273,0.9772727272727273,0.855071548821549,0.885441520379759,0.8559033455654118
+ 2.0,168,0.7878787878787878,0.9318181818181818,0.9583333333333334,0.9810606060606061,0.7878787878787878,0.7878787878787878,0.3106060606060606,0.9318181818181818,0.19166666666666665,0.9583333333333334,0.09810606060606061,0.9810606060606061,0.8627991221741224,0.8921594191337098,0.8635728310360663
+ 3.0,252,0.8068181818181818,0.9393939393939394,0.9621212121212122,0.9848484848484849,0.8068181818181818,0.8068181818181818,0.3131313131313131,0.9393939393939394,0.19242424242424241,0.9621212121212122,0.09848484848484848,0.9848484848484849,0.8773178210678213,0.9041183437774571,0.8780734290961565
+ 4.0,336,0.8181818181818182,0.9431818181818182,0.9696969696969697,0.9886363636363636,0.8181818181818182,0.8181818181818182,0.3143939393939394,0.9431818181818182,0.19393939393939394,0.9696969696969697,0.09886363636363638,0.9886363636363636,0.8861216329966333,0.9117088330904471,0.8867431734788546
+ 5.0,420,0.8371212121212122,0.9507575757575758,0.9734848484848485,0.9962121212121212,0.8371212121212122,0.8371212121212122,0.3169191919191919,0.9507575757575758,0.19469696969696973,0.9734848484848485,0.09962121212121212,0.9962121212121212,0.8984217171717173,0.9225765140640186,0.8987373737373737
+ 5.9523809523809526,500,0.821969696969697,0.9507575757575758,0.9848484848484849,1.0,0.821969696969697,0.821969696969697,0.3169191919191919,0.9507575757575758,0.19696969696969696,0.9848484848484849,0.09999999999999999,1.0,0.8941438191438192,0.9205488994795041,0.8941438191438191
+ 6.0,504,0.8257575757575758,0.9507575757575758,0.9848484848484849,1.0,0.8257575757575758,0.8257575757575758,0.3169191919191919,0.9507575757575758,0.19696969696969696,0.9848484848484849,0.09999999999999999,1.0,0.8963699494949495,0.9222542128589125,0.8963699494949494
+ 7.0,588,0.8333333333333334,0.9621212121212122,0.9848484848484849,1.0,0.8333333333333334,0.8333333333333334,0.32070707070707066,0.9621212121212122,0.19696969696969696,0.9848484848484849,0.09999999999999999,1.0,0.9014971139971141,0.9262070156586041,0.901497113997114
+ 8.0,672,0.8446969696969697,0.9659090909090909,0.9924242424242424,1.0,0.8446969696969697,0.8446969696969697,0.3219696969696969,0.9659090909090909,0.1984848484848485,0.9924242424242424,0.09999999999999999,1.0,0.9084054834054835,0.931480388869803,0.9084054834054834
+ 9.0,756,0.8484848484848485,0.9924242424242424,0.9962121212121212,1.0,0.8484848484848485,0.8484848484848485,0.3308080808080808,0.9924242424242424,0.19924242424242425,0.9962121212121212,0.09999999999999999,1.0,0.915088383838384,0.9368257086803438,0.9150883838383838
+ 10.0,840,0.8484848484848485,0.9924242424242424,1.0,1.0,0.8484848484848485,0.8484848484848485,0.3308080808080808,0.9924242424242424,0.19999999999999998,1.0,0.09999999999999999,1.0,0.9154040404040407,0.9371077896309703,0.9154040404040403
+ 11.0,924,0.8598484848484849,0.9886363636363636,1.0,1.0,0.8598484848484849,0.8598484848484849,0.32954545454545453,0.9886363636363636,0.19999999999999998,1.0,0.09999999999999999,1.0,0.9214015151515154,0.9415351269451009,0.9214015151515151
+ 11.904761904761905,1000,0.8636363636363636,0.9924242424242424,1.0,1.0,0.8636363636363636,0.8636363636363636,0.3308080808080807,0.9924242424242424,0.19999999999999998,1.0,0.09999999999999999,1.0,0.9248737373737375,0.9441876011704723,0.9248737373737373
+ 12.0,1008,0.8636363636363636,0.9924242424242424,1.0,1.0,0.8636363636363636,0.8636363636363636,0.3308080808080807,0.9924242424242424,0.19999999999999998,1.0,0.09999999999999999,1.0,0.9248737373737375,0.9441876011704723,0.9248737373737373
+ 13.0,1092,0.8636363636363636,0.9924242424242424,1.0,1.0,0.8636363636363636,0.8636363636363636,0.3308080808080807,0.9924242424242424,0.19999999999999998,1.0,0.09999999999999999,1.0,0.9248737373737375,0.9441876011704723,0.9248737373737373
+ 14.0,1176,0.8636363636363636,0.9924242424242424,1.0,1.0,0.8636363636363636,0.8636363636363636,0.3308080808080807,0.9924242424242424,0.19999999999999998,1.0,0.09999999999999999,1.0,0.9248737373737375,0.9441876011704723,0.9248737373737373
+ 15.0,1260,0.8636363636363636,0.9924242424242424,1.0,1.0,0.8636363636363636,0.8636363636363636,0.3308080808080807,0.9924242424242424,0.19999999999999998,1.0,0.09999999999999999,1.0,0.9248737373737375,0.9441876011704723,0.9248737373737373
+ 1.0,84,0.7803030303030303,0.9318181818181818,0.9545454545454546,0.9772727272727273,0.7803030303030303,0.7803030303030303,0.3106060606060606,0.9318181818181818,0.19090909090909092,0.9545454545454546,0.09772727272727273,0.9772727272727273,0.8580327080327081,0.8877204096964104,0.8589320700615277
+ 2.0,168,0.7916666666666666,0.9356060606060606,0.9583333333333334,0.9810606060606061,0.7916666666666666,0.7916666666666666,0.31186868686868685,0.9356060606060606,0.19166666666666665,0.9583333333333334,0.09810606060606061,0.9810606060606061,0.8657302188552191,0.8944025907083359,0.8666292619417619
+ 3.0,252,0.8068181818181818,0.9393939393939394,0.9659090909090909,0.9848484848484849,0.8068181818181818,0.8068181818181818,0.3131313131313131,0.9393939393939394,0.19318181818181818,0.9659090909090909,0.09848484848484848,0.9848484848484849,0.8774440836940838,0.9042344256718758,0.8783187393414666
+ 4.0,336,0.8181818181818182,0.9431818181818182,0.9696969696969697,0.9924242424242424,0.8181818181818182,0.8181818181818182,0.3143939393939394,0.9431818181818182,0.19393939393939394,0.9696969696969697,0.09924242424242424,0.9924242424242424,0.885869107744108,0.912307829578123,0.8862678318270424
+ 5.0,420,0.8409090909090909,0.9507575757575758,0.9734848484848485,0.9962121212121212,0.8409090909090909,0.8409090909090909,0.3169191919191919,0.9507575757575758,0.19469696969696973,0.9734848484848485,0.09962121212121212,0.9962121212121212,0.9004103535353537,0.9240745076128686,0.9007260101010102
+ 5.9523809523809526,500,0.821969696969697,0.9507575757575758,0.9848484848484849,1.0,0.821969696969697,0.821969696969697,0.3169191919191919,0.9507575757575758,0.19696969696969696,0.9848484848484849,0.09999999999999999,1.0,0.8945902477152479,0.9209485811394484,0.8945902477152478
+ 6.0,504,0.821969696969697,0.9507575757575758,0.9848484848484849,1.0,0.821969696969697,0.821969696969697,0.3169191919191919,0.9507575757575758,0.19696969696969696,0.9848484848484849,0.09999999999999999,1.0,0.8944910413660414,0.9208692254687509,0.8944910413660413
+ 7.0,588,0.8333333333333334,0.9621212121212122,0.9848484848484849,1.0,0.8333333333333334,0.8333333333333334,0.32070707070707066,0.9621212121212122,0.19696969696969696,0.9848484848484849,0.09999999999999999,1.0,0.9014294733044733,0.9261393321110754,0.9014294733044732
+ 8.0,672,0.8446969696969697,0.9734848484848485,0.9962121212121212,1.0,0.8446969696969697,0.8446969696969697,0.32449494949494945,0.9734848484848485,0.19924242424242425,0.9962121212121212,0.09999999999999999,1.0,0.9098845598845601,0.9327042377763404,0.9098845598845599
+ 9.0,756,0.8484848484848485,0.9924242424242424,1.0,1.0,0.8484848484848485,0.8484848484848485,0.3308080808080808,0.9924242424242424,0.19999999999999998,1.0,0.09999999999999999,1.0,0.9145833333333335,0.936445844538507,0.9145833333333333
+ 10.0,840,0.8446969696969697,0.9962121212121212,1.0,1.0,0.8446969696969697,0.8446969696969697,0.332070707070707,0.9962121212121212,0.19999999999999998,1.0,0.09999999999999999,1.0,0.9150883838383841,0.9369642771409741,0.9150883838383838
+ 11.0,924,0.8636363636363636,0.9924242424242424,1.0,1.0,0.8636363636363636,0.8636363636363636,0.3308080808080807,0.9924242424242424,0.19999999999999998,1.0,0.09999999999999999,1.0,0.9242424242424244,0.9436916551342168,0.9242424242424242
+ 11.904761904761905,1000,0.8598484848484849,0.9924242424242424,1.0,1.0,0.8598484848484849,0.8598484848484849,0.3308080808080807,0.9924242424242424,0.19999999999999998,1.0,0.09999999999999999,1.0,0.9223484848484851,0.942293661776533,0.9223484848484849
+ 12.0,1008,0.8598484848484849,0.9924242424242424,1.0,1.0,0.8598484848484849,0.8598484848484849,0.3308080808080807,0.9924242424242424,0.19999999999999998,1.0,0.09999999999999999,1.0,0.9223484848484851,0.942293661776533,0.9223484848484849
+ 13.0,1092,0.8598484848484849,0.9924242424242424,1.0,1.0,0.8598484848484849,0.8598484848484849,0.3308080808080807,0.9924242424242424,0.19999999999999998,1.0,0.09999999999999999,1.0,0.9223484848484851,0.942293661776533,0.9223484848484849
+ 14.0,1176,0.8598484848484849,0.9924242424242424,1.0,1.0,0.8598484848484849,0.8598484848484849,0.3308080808080807,0.9924242424242424,0.19999999999999998,1.0,0.09999999999999999,1.0,0.9223484848484851,0.942293661776533,0.9223484848484849
+ 15.0,1260,0.8598484848484849,0.9924242424242424,1.0,1.0,0.8598484848484849,0.8598484848484849,0.3308080808080807,0.9924242424242424,0.19999999999999998,1.0,0.09999999999999999,1.0,0.9223484848484851,0.942293661776533,0.9223484848484849
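The appended CSV rows log one evaluation per epoch. A small sketch of pulling the best epoch out of such a file with the stdlib `csv` module, using a toy two-row sample with an illustrative column subset (the real file's header carries the full metric list and may use different column names):

```python
import csv
import io

# Toy sample mimicking the results schema; column names here are illustrative.
sample = """epoch,steps,cosine-Accuracy@1,cosine-NDCG@10
11.0,924,0.8636363636363636,0.9436916551342168
12.0,1008,0.8636363636363636,0.9441876011704723
"""

rows = list(csv.DictReader(io.StringIO(sample)))
best = max(rows, key=lambda r: float(r["cosine-NDCG@10"]))
print(best["epoch"])  # epoch with the highest NDCG@10 in the sample
```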
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:c79bf5f32af985a0f7bfb5bf7fb1bb6ffea5ed0dc5d965e3e84bfa4238460d9a
+ oid sha256:d4655fc8cd286eb9b6710fccb335a50310c386c6c546fedab01a251248646ab9
  size 438525864