rasyosef committed
Commit 5c01cdf · verified · 1 Parent(s): 5e1cebb

Update README.md

Files changed (1): README.md +1 -1
README.md CHANGED

```diff
@@ -134,7 +134,7 @@ language:
 
 This is a SPLADE sparse retrieval model based on BERT-Small (29M) that was trained by distilling a Cross-Encoder on the MSMARCO dataset. The cross-encoder used was [ms-marco-MiniLM-L6-v2](https://huggingface.co/cross-encoder/ms-marco-MiniLM-L6-v2).
 
-This tiny SPLADE model is `2x` smaller than Naver's official `splade-v3-distilbert` while having `91%` of it's performance on the MSMARCO benchmark. This model is small enough to be used without a GPU on a dataset of a few thousand documents.
+This SPLADE model is `2x` smaller than Naver's official `splade-v3-distilbert` while having `91%` of it's performance on the MSMARCO benchmark. This model is small enough to be used without a GPU on a dataset of a few thousand documents.
 
 - `Collection:` https://huggingface.co/collections/rasyosef/splade-tiny-msmarco-687c548c0691d95babf65b70
 - `Distillation Dataset:` https://huggingface.co/datasets/yosefw/msmarco-train-distil-v2
```