---
license: odc-by
task_categories:
- fill-mask
- text-generation
language:
- en
tags:
- manufacturing
- engineering
size_categories:
- 1M<n<10M
---
# Manu-FineWeb
**Manu-FineWeb** is a high-quality, large-scale corpus specifically curated for the **manufacturing domain**. It was extracted from the 15-trillion-token FineWeb dataset and refined to facilitate efficient domain-specific pretraining for models like **ManufactuBERT**.
## Dataset Summary
- **Developed by:** Robin Armingaud and Romaric Besançon (Université Paris-Saclay, CEA, List)
- **Statistics:** 2B tokens / 4.5 million documents
## Construction & Curation
The dataset was built using a rigorous pipeline to ensure high relevance and low redundancy:
### 1. Domain-Specific Filtering
A **fastText classifier** was trained on a positive set of manufacturing-specific sources to filter the general FineWeb corpus. The training sources included:
* **Elsevier:** Abstracts from industrial and manufacturing engineering journals.
* **ArXiv:** Abstracts from categories like physics, computer science, and engineering related to industrial processes.
* **Wikipedia:** Articles from manufacturing and engineering categories.
* **BigPatent:** Patent descriptions containing "manufacturing" keywords.
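The filtering step can be sketched as follows. The actual pipeline uses a fastText supervised classifier (trained via `fasttext.train_supervised` on the positive sources above); to keep this sketch dependency-free, `score_document` below is a toy keyword scorer standing in for the trained model, and the term list and threshold are illustrative assumptions, not values from the paper.

```python
# Sketch of classifier-based domain filtering: score each FineWeb document
# for "manufacturing-ness" and keep only documents above a threshold.
# score_document is a toy stand-in for a trained fastText model's predict().

# Hypothetical domain vocabulary (illustration only).
MANUFACTURING_TERMS = {"machining", "welding", "cnc", "assembly", "tolerance"}

def score_document(text: str) -> float:
    """Toy stand-in for a classifier score: fraction of domain terms present."""
    words = set(text.lower().split())
    return len(words & MANUFACTURING_TERMS) / len(MANUFACTURING_TERMS)

def filter_corpus(docs: list[str], threshold: float = 0.2) -> list[str]:
    """Keep documents whose domain score clears the threshold."""
    return [d for d in docs if score_document(d) >= threshold]

docs = [
    "CNC machining and welding tolerance specs for the assembly line",
    "Celebrity gossip and entertainment news of the week",
]
kept = filter_corpus(docs)  # only the manufacturing document survives
```

In the real pipeline the classifier score replaces the keyword heuristic, but the surrounding logic (score, threshold, keep) is the same.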
### 2. Multi-Stage Deduplication
To improve training efficiency, the initial 10B-token filtered corpus was reduced by ~80% through:
* **Lexical Deduplication (MinHash):** Eliminating near-exact text duplicates.
* **Semantic Deduplication (SemDeDup):** Identifying and removing semantically redundant documents using sentence embeddings (all-MiniLM-L6-v2), leaving only the most representative data points.
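The lexical stage can be illustrated with a minimal MinHash sketch. Production pipelines use optimized libraries (e.g. `datasketch`) plus locality-sensitive hashing to avoid pairwise comparisons; the word-shingle size, number of hash functions, and example documents below are illustrative assumptions.

```python
# Minimal MinHash near-duplicate detection: documents are turned into word
# shingles, each shingle set is compressed into a fixed-length signature of
# per-seed minimum hashes, and the fraction of matching signature slots
# estimates the Jaccard similarity of the original shingle sets.
import hashlib

def shingles(text: str, k: int = 3) -> set[str]:
    """Overlapping k-word shingles of the lowercased text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(max(1, len(words) - k + 1))}

def minhash_signature(shingle_set: set[str], num_hashes: int = 64) -> list[int]:
    """For each seed, keep the minimum hash over all shingles."""
    return [
        min(int(hashlib.md5(f"{seed}:{s}".encode()).hexdigest(), 16)
            for s in shingle_set)
        for seed in range(num_hashes)
    ]

def estimated_jaccard(sig_a: list[int], sig_b: list[int]) -> float:
    """Fraction of matching slots approximates Jaccard similarity."""
    return sum(a == b for a, b in zip(sig_a, sig_b)) / len(sig_a)

doc_a = "the quick brown fox jumps over the lazy dog near the old mill"
doc_b = "the quick brown fox jumps over the lazy dog near the old barn"
doc_c = "quarterly financial earnings report for the software division"

sim_ab = estimated_jaccard(minhash_signature(shingles(doc_a)),
                           minhash_signature(shingles(doc_b)))
sim_ac = estimated_jaccard(minhash_signature(shingles(doc_a)),
                           minhash_signature(shingles(doc_c)))
# sim_ab is high (near-duplicates), sim_ac near zero (unrelated).
```

Documents whose estimated similarity exceeds a chosen threshold are collapsed to a single representative; the SemDeDup stage applies the same keep-one-representative idea, but clusters sentence embeddings (all-MiniLM-L6-v2) instead of comparing lexical signatures.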
## Citation
If you use Manu-FineWeb in your research, please cite:
```bibtex
@misc{armingaud2025manufactubertefficientcontinualpretraining,
  title={ManufactuBERT: Efficient Continual Pretraining for Manufacturing},
  author={Robin Armingaud and Romaric Besançon},
  year={2025},
  eprint={2511.05135},
  archivePrefix={arXiv},
  primaryClass={cs.CL},
  url={https://arxiv.org/abs/2511.05135},
}
```