## 5. Roadmap and future directions for industry suitability
- **Improve efficiency**: This is a bottomless pit; we will continue to improve serving and retrieval efficiency.
- **Custom/Domain Finetuning**: The out-of-domain zero-shot performance of SPLADE models is great, but in industry settings it matters less than the ability to finetune on custom datasets or domains. Finetuning SPLADE on a new dataset is not cheap and requires labelled queries and passages.
  So we will continue to explore how our recipe can be finetuned economically on custom datasets without expensive labelling.
- **Multilingual SPLADE**: The training cost of SPLADE (i.e. the GPU budget) is directly proportional to the vocabulary size of the base model, so a multilingual SPLADE built on either mBERT or XLM-R can be expensive as they have
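The vocabulary-size point above can be made concrete with a back-of-the-envelope sketch. The vocabulary sizes below are taken from the public tokenizer configs of these standard checkpoints (verify for your exact variant); the hidden size of 768 is shared by all three base models. SPLADE's term-importance head projects hidden states onto the full vocabulary, so that projection's parameter count (and activation memory) scales with |V|:

```python
# Vocabulary sizes from the public tokenizer configs of these checkpoints.
vocab_sizes = {
    "bert-base-uncased": 30522,              # English BERT
    "bert-base-multilingual-cased": 119547,  # mBERT
    "xlm-roberta-base": 250002,              # XLM-R
}

hidden = 768  # hidden size shared by all three base models

for name, vocab in vocab_sizes.items():
    # SPLADE projects each hidden state onto the full vocabulary, so the
    # projection's parameter count grows linearly with vocabulary size.
    head_params_m = hidden * vocab / 1e6
    ratio = vocab / vocab_sizes["bert-base-uncased"]
    print(f"{name}: |V|={vocab:,}, projection ~{head_params_m:.0f}M params, "
          f"{ratio:.1f}x English BERT")
```

XLM-R's vocabulary is roughly 8x that of English BERT, which is the core of the cost concern raised in the bullet above.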
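On the efficiency bullet above: one common serving-side lever for SPLADE-style models (a generic sketch, not necessarily the recipe used here) is pruning low-weight terms from the sparse representation, which shrinks inverted-index postings and speeds up retrieval at a small cost in quality. The function and token weights below are illustrative:

```python
def prune_sparse_rep(rep: dict, top_k: int = 64) -> dict:
    """Keep only the top_k highest-weight terms of a SPLADE-style sparse vector.

    rep maps tokens (or term ids) to non-negative weights. Shorter vectors mean
    smaller postings lists and cheaper dot products at retrieval time.
    """
    top = sorted(rep.items(), key=lambda kv: kv[1], reverse=True)[:top_k]
    return dict(top)

# Toy sparse representation: token -> expansion weight (illustrative values).
rep = {"splade": 2.1, "sparse": 1.7, "model": 0.9, "the": 0.05, "a": 0.01}
print(prune_sparse_rep(rep, top_k=3))  # drops the near-zero stopword terms
```

How aggressively to prune (the `top_k` here) is a quality/latency trade-off that has to be tuned per dataset.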