---
language:
  - en
size_categories:
  - 10M<n<100M
---

This dataset contains 768-dimensional embedding vectors derived from the "content" column of the RefinedWeb dataset.
The embeddings were generated with the E5-Base-4k model at a context length of 1024 tokens, using scaled dot-product attention.

Original dataset:
This dataset is a derivative work of RefinedWeb, an English web dataset created by the Technology Innovation Institute (TII).
Attribution is given to TII as the original authors of the RefinedWeb dataset, per Section 4 of RefinedWeb's Open Data Commons Attribution License (ODC-By).
The embeddings were produced with the MiniCorpus pipeline, an investigation of the MiniPile data-distillation method (Kaddour, 2023).

  • Embedding Model: E5-Base-4k
  • Embedding Dimensions: 768
  • Embedding Context Length: 1024 tokens
  • Embedding Implementation Specifics: Scaled Dot-Product Attention
  • Embedding Scope: only the "content" field is embedded; all other fields are disregarded.
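As an illustration of how such embeddings are typically derived, the sketch below shows the pooling step commonly used with E5-family models: average the token-level hidden states over non-padding positions, then L2-normalize to obtain a fixed-size 768-dimensional vector. This is a hedged sketch, not the exact MiniCorpus pipeline code; the function name `mean_pool` and the NumPy formulation are assumptions made here for clarity.

```python
import numpy as np

def mean_pool(last_hidden_state, attention_mask):
    """Mean-pool token embeddings over non-padding positions,
    then L2-normalize (the usual E5-style pooling recipe).

    last_hidden_state: (batch, seq_len, 768) float array
    attention_mask:    (batch, seq_len) 0/1 array
    returns:           (batch, 768) unit-norm vectors
    """
    mask = attention_mask[..., None].astype(last_hidden_state.dtype)
    summed = (last_hidden_state * mask).sum(axis=1)
    counts = np.clip(mask.sum(axis=1), 1e-9, None)  # avoid divide-by-zero
    pooled = summed / counts
    norms = np.clip(np.linalg.norm(pooled, axis=1, keepdims=True), 1e-9, None)
    return pooled / norms
```

In a full pipeline one would first tokenize each "content" string (E5 models conventionally expect a "passage: " prefix on document text), run it through the model truncated to 1024 tokens, and apply this pooling to the final hidden states.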

Original Dataset Link:
https://huggingface.co/datasets/tiiuae/falcon-refinedweb