Update README.md
README.md CHANGED

@@ -7,7 +7,7 @@ language:
 
 ### Dataset Summary
 
-Data obtained from
+Data obtained from 2013~2025 Common Crawl.
 
 Downloaded and processed using [code](https://github.com/jedcheng/c4-dataset-script) based on another [project](https://github.com/shjwudp/c4-dataset-script) attempting to recreate the C4 dataset.
 
@@ -17,3 +17,10 @@ It was then filtered using a [modified list](https://github.com/jedcheng/c4-data
 
 
 Unfortunately, I don't have enough funding to run a deduplication across all dumps at the moment. I will do it in the future (hopefully).
+
+### Acknowledgement
+Special thanks to [Eons Data Communications Limited](https://eons.cloud/) and [Votee AI](https://votee.ai/) for the compute resources to download and process the Common Crawl dumps.
+
+Deduplication was carried out using compute resources offered under the category of General Projects by the Research Institute for Information Technology, Kyushu University.
+This work was also partly achieved through the use of SQUID at the D3 Center, The University of Osaka.
+
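The card notes that deduplication across all dumps is still pending. As a rough illustration only (this is not the project's actual pipeline, and the `deduplicate` helper is hypothetical), exact-duplicate removal over a batch of documents could be sketched like this:

```python
import hashlib

def deduplicate(docs):
    """Keep the first occurrence of each document, dropping exact duplicates.

    A minimal sketch: hashes whitespace-normalized text so trivially
    reformatted copies collide. Real cross-dump deduplication (e.g. MinHash
    for near-duplicates) would be far more involved.
    """
    seen = set()
    unique = []
    for doc in docs:
        # Normalize whitespace before hashing so spacing differences don't matter.
        key = hashlib.sha256(" ".join(doc.split()).encode("utf-8")).hexdigest()
        if key not in seen:
            seen.add(key)
            unique.append(doc)
    return unique

docs = ["hello  world", "hello world", "another page"]
print(deduplicate(docs))  # ['hello  world', 'another page']
```

In practice the hashes would be accumulated across dumps (or sharded by hash prefix) rather than held in one in-memory set.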