pretty_name: Wikipedia Paragraphs MPNet Embeddings
---

Embeddings of the [English Wikipedia](https://huggingface.co/datasets/wikipedia) [paragraphs](https://huggingface.co/datasets/olmer/wiki_paragraphs), computed with the [all-mpnet-base-v2](https://huggingface.co/sentence-transformers/all-mpnet-base-v2) sentence-transformers encoder.
The dataset contains 43,911,155 paragraphs from 6,458,670 Wikipedia articles.
Paragraph length ranges from 20 to 2,000 characters.
Each paragraph has a 768-dimensional embedding.
Embeddings are stored in NumPy files, 1,000,000 embeddings per file.
Each embedding file is paired with an ids file listing the ids of the corresponding paragraphs.
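A minimal sketch of the embedding-file / ids-file pairing described above. The shard filenames (`embeddings_0000.npy`, `ids_0000.npy`) are hypothetical, and the example writes a tiny synthetic shard so it is self-contained; in the real dataset each shard holds up to 1,000,000 rows:

```python
import numpy as np

# Hypothetical shard filenames -- the actual naming in the dataset may differ.
emb_file = "embeddings_0000.npy"
ids_file = "ids_0000.npy"

# Synthetic stand-in for one shard: 5 embeddings of dimension 768,
# plus the paragraph ids they belong to (row i pairs with ids[i]).
rng = np.random.default_rng(0)
np.save(emb_file, rng.standard_normal((5, 768)).astype(np.float32))
np.save(ids_file, np.arange(100, 105))

# Loading a shard.
embeddings = np.load(emb_file)  # shape (n, 768)
ids = np.load(ids_file)         # shape (n,)
assert len(ids) == len(embeddings)

# Look up the embedding for a given paragraph id.
row = int(np.where(ids == 102)[0][0])
vector = embeddings[row]
print(vector.shape)
```

The same row-aligned loading pattern applies to every shard; memory-mapping with `np.load(emb_file, mmap_mode="r")` avoids reading a whole shard into RAM.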
__Be careful: the dataset is 151 GB.__