Capitalize "English" on dataset card.
#75
by MihaiPopa-1 - opened

README.md CHANGED
@@ -509,7 +509,7 @@ configs:
 
 ## What is it?
 
-The 🍷 FineWeb dataset consists of more than **18.5T tokens** (originally 15T tokens) of cleaned and deduplicated english web data from CommonCrawl. The data processing pipeline is optimized for LLM performance and ran on the 🏭 [`datatrove`](https://github.com/huggingface/datatrove/) library, our large scale data processing library.
+The 🍷 FineWeb dataset consists of more than **18.5T tokens** (originally 15T tokens) of cleaned and deduplicated English web data from CommonCrawl. The data processing pipeline is optimized for LLM performance and ran on the 🏭 [`datatrove`](https://github.com/huggingface/datatrove/) library, our large scale data processing library.
 
 🍷 FineWeb was originally meant to be a fully open replication of 🦅 [RefinedWeb](https://huggingface.co/papers/2306.01116), with a release of the **full dataset** under the **ODC-By 1.0 license**. However, by carefully adding additional filtering steps, we managed to push the performance of 🍷 FineWeb well above that of the original 🦅 RefinedWeb, and models trained on our dataset also outperform models trained on other commonly used high quality web datasets (like C4, Dolma-v1.6, The Pile, SlimPajama, RedPajama2) on our aggregate group of [benchmark tasks](https://huggingface.co/datasets/HuggingFaceFW/fineweb/blob/main/lighteval_tasks.py).