Update README.md
README.md
````diff
@@ -15,28 +15,33 @@ tags:
 base_model: prajjwal1/bert-mini
 widget:
 - text: what did marlo thomas play on
-- text:
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
-
+- text: >-
+    Unused vacation does not roll over. to next calendar year and is not paid
+    out at termination. Please Note: This table applies to employees in
+    positions with 100% FTE. The number of hours/days of vacation are pro-rated
+    for FTEs between 75%. and 99%. For Example: If a non-exempt employee who is
+    within their first five years of service has an FTE of 75%, then the number
+    of hours they would. accrue each month would be 6 (8 x .75 = 6) and not 8.
+- text: >-
+    To convert from miles to feet by hand, multiply miles by 5280. miles * 5280
+    = feet. To convert from feet to miles by hand, divide feet by 5280. feet /
+    5280 = miles. An automated version of this calculator can be found here:
+- text: >-
+    All you have to do is click on the button directly below and follow the
+    instructions. Click he relevant payment button below for £55 payment.
+    --------------------------. Star Attuned Crystals for Activation. Another
+    way to receive star energies and activations is to buy the attuned crystals
+    that I provide.These carry the energies of specific stars or star beings and
+    guides. By holding these you can feel the energies of the star or star being
+    and this brings healing, activation, spiritual growth and sometimes
+    communication.he highest form of star energy work / activation is to receive
+    a Star Attunement. Star Attunements carry the power of stars and evolved
+    star beings. They are off the scale and profoundly beautiful and spiritual.
+- text: >-
+    Fermentation is a metabolic pathway that produce ATP molecules under
+    anaerobic conditions (only undergoes glycolysis), NAD+ is used directly in
+    glycolysis to form ATP molecules, which is not as efficient as cellular
+    respiration because only 2ATP molecules are formed during the glycolysis.
 pipeline_tag: feature-extraction
 library_name: sentence-transformers
 metrics:
````
````diff
@@ -126,38 +131,13 @@ model-index:
 - type: corpus_sparsity_ratio
   value: 0.9962823369573983
   name: Corpus Sparsity Ratio
+datasets:
+- microsoft/ms_marco
 ---
 
 # SPLADE-BERT-Mini-Distil
 
 This is a [SPLADE Sparse Encoder](https://www.sbert.net/docs/sparse_encoder/usage/usage.html) model finetuned from [prajjwal1/bert-mini](https://huggingface.co/prajjwal1/bert-mini) using the [sentence-transformers](https://www.SBERT.net) library. It maps sentences & paragraphs to a 30522-dimensional sparse vector space and can be used for semantic search and sparse retrieval.
-## Model Details
-
-### Model Description
-- **Model Type:** SPLADE Sparse Encoder
-- **Base model:** [prajjwal1/bert-mini](https://huggingface.co/prajjwal1/bert-mini) <!-- at revision 5e123abc2480f0c4b4cac186d3b3f09299c258fc -->
-- **Maximum Sequence Length:** 512 tokens
-- **Output Dimensionality:** 30522 dimensions
-- **Similarity Function:** Dot Product
-<!-- - **Training Dataset:** Unknown -->
-- **Language:** en
-- **License:** mit
-
-### Model Sources
-
-- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
-- **Documentation:** [Sparse Encoder Documentation](https://www.sbert.net/docs/sparse_encoder/usage/usage.html)
-- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
-- **Hugging Face:** [Sparse Encoders on Hugging Face](https://huggingface.co/models?library=sentence-transformers&other=sparse-encoder)
-
-### Full Model Architecture
-
-```
-SparseEncoder(
-  (0): MLMTransformer({'max_seq_length': 512, 'do_lower_case': False, 'architecture': 'BertForMaskedLM'})
-  (1): SpladePooling({'pooling_strategy': 'max', 'activation_function': 'relu', 'word_embedding_dimension': 30522})
-)
-```
 
 ## Usage
 
````
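The `SpladePooling` settings in the architecture shown in this hunk (`relu` activation, `max` pooling over a 30522-entry vocabulary) follow the standard SPLADE formulation, in which each per-token MLM logit column is passed through `log(1 + relu(x))` and max-pooled over the sequence, so any vocabulary dimension with no positive logit stays exactly zero. A minimal NumPy sketch of that pooling rule under this standard formulation, with a toy 4-entry vocabulary and made-up logit values in place of real model output:

```python
import numpy as np

def splade_pool(logits: np.ndarray) -> np.ndarray:
    """Max-pool log(1 + relu(logits)) over the sequence axis.

    logits: (seq_len, vocab_size) MLM logits for one input text.
    Returns a (vocab_size,) non-negative vector; dimensions with no
    positive logit anywhere in the sequence come out exactly zero.
    """
    return np.log1p(np.maximum(logits, 0.0)).max(axis=0)

# Toy example: 3 tokens, vocabulary of 4 entries (the real model uses 30522).
logits = np.array([
    [2.0, -1.0,  0.0, 0.5],
    [0.0,  3.0, -2.0, 0.0],
    [1.0,  0.0, -0.5, 0.0],
])
vec = splade_pool(logits)
# Column 2 has no positive logit, so vec[2] is exactly 0.0 -> sparsity.
```

The exact-zero behaviour of `relu` is what produces the high sparsity ratios reported in the metadata above.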
````diff
@@ -174,7 +154,7 @@ Then you can load this model and run inference.
 from sentence_transformers import SparseEncoder
 
 # Download from the 🤗 Hub
-model = SparseEncoder("
+model = SparseEncoder("rasyosef/splade-mini")
 # Run inference
 queries = [
     "definition of fermentation in the lab",
```
````
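The hunk above fills in the model id (`rasyosef/splade-mini`). Once queries and documents are encoded, ranking reduces to the dot product named in the model description's **Similarity Function** entry. A minimal, library-free sketch of that scoring step, with sparse vectors as `{token_id: weight}` dicts whose ids and weights are invented for illustration (not output of this model):

```python
# Sparse dot-product scoring between SPLADE-style vectors.
# All token ids and weights below are illustrative only.

def dot_score(q: dict[int, float], d: dict[int, float]) -> float:
    """Only vocabulary dimensions active in both vectors contribute."""
    return sum(w * d[t] for t, w in q.items() if t in d)

query = {2054: 1.2, 18404: 2.7, 4013: 0.4}   # hypothetical query expansion
doc_a = {18404: 2.1, 2054: 0.8, 9007: 1.5}   # shares two active terms
doc_b = {5280: 1.9, 2813: 1.1}               # shares none

scores = {"doc_a": dot_score(query, doc_a), "doc_b": dot_score(query, doc_b)}
# doc_a: 1.2*0.8 + 2.7*2.1 = 6.63 ; doc_b: 0.0 (no overlapping dimensions)
```

Because most dimensions are zero, the score touches only the few dimensions active in both vectors, which is what lets SPLADE vectors be served from an inverted index.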
````diff
@@ -219,6 +199,36 @@ You can finetune this model on your own dataset.
 *List how the model may foreseeably be misused and address what users ought not to do with the model.*
 -->
 
+## Model Details
+
+### Model Description
+- **Model Type:** SPLADE Sparse Encoder
+- **Base model:** [prajjwal1/bert-mini](https://huggingface.co/prajjwal1/bert-mini) <!-- at revision 5e123abc2480f0c4b4cac186d3b3f09299c258fc -->
+- **Maximum Sequence Length:** 512 tokens
+- **Output Dimensionality:** 30522 dimensions
+- **Similarity Function:** Dot Product
+<!-- - **Training Dataset:** Unknown -->
+- **Language:** en
+- **License:** mit
+
+### Model Sources
+
+- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
+- **Documentation:** [Sparse Encoder Documentation](https://www.sbert.net/docs/sparse_encoder/usage/usage.html)
+- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
+- **Hugging Face:** [Sparse Encoders on Hugging Face](https://huggingface.co/models?library=sentence-transformers&other=sparse-encoder)
+
+### Full Model Architecture
+
+```
+SparseEncoder(
+  (0): MLMTransformer({'max_seq_length': 512, 'do_lower_case': False, 'architecture': 'BertForMaskedLM'})
+  (1): SpladePooling({'pooling_strategy': 'max', 'activation_function': 'relu', 'word_embedding_dimension': 30522})
+)
+```
+
+## More
+<details><summary>Click to expand</summary>
 ## Evaluation
 
 ### Metrics
````
````diff
@@ -515,4 +525,5 @@ You can finetune this model on your own dataset.
 ## Model Card Contact
 
 *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
--->
+-->
+</details>
````