rasyosef committed · Commit 34aef2d · verified · 1 Parent(s): e03eb83

Update README.md

Files changed (1): README.md +1 -1
README.md CHANGED
@@ -152,7 +152,7 @@ datasets:
 
 This is a SPLADE sparse retrieval model based on BERT-Tiny (4M) that was trained by distilling a Cross-Encoder on the MSMARCO dataset. The cross-encoder used was [ms-marco-MiniLM-L6-v2](https://huggingface.co/cross-encoder/ms-marco-MiniLM-L6-v2).
 
-This tiny SPLADE model is `15x` smaller than Naver's official `splade-v3-distilbert` while having `80%` of it's performance on the MSMARCO benchmark. This model is small enough to be used without a GPU on a dataset of a few thousand documents.
+This tiny SPLADE model beats `BM25` by `65.6%` on the MSMARCO benchmark. While it is `15x` smaller than Naver's official `splade-v3-distilbert`, it retains `80%` of its performance on MSMARCO. This model is small enough to be used without a GPU on a dataset of a few thousand documents.
 
 - `Collection:` https://huggingface.co/collections/rasyosef/splade-tiny-msmarco-687c548c0691d95babf65b70
 - `Distillation Dataset:` https://huggingface.co/datasets/yosefw/msmarco-train-distil-v2