# allenai/Bolmo-1B
This repository contains the data used to train Bolmo, the first family of competitive, fully open byte-level language models (LMs). See our technical report for details: https://allenai.org/papers/bolmo.
| Name | Tokens | License |
|---|---|---|
| Common Crawl | 121.0B | ODC-BY |
| olmOCR Science PDFs | 19.9B | ODC-BY |
| StackEdu | 26.3B | ODC-BY |
| FineMath 3+ | 4.1B | ODC-BY |
| arXiv | 1.3B | ODC-BY |
| Wikipedia & Wikibooks | 64.6M | ODC-BY |
| Character Understanding | 75.5M | ODC-BY |
| Total | 172.7B | |
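As a quick sanity check, the per-source token counts in the table above sum to the stated total (a minimal sketch; the figures are copied directly from the table):

```python
# Per-source token counts from the table above, in billions of tokens.
sources = {
    "Common Crawl": 121.0,
    "olmOCR Science PDFs": 19.9,
    "StackEdu": 26.3,
    "FineMath 3+": 4.1,
    "arXiv": 1.3,
    "Wikipedia & Wikibooks": 64.6e-3,    # 64.6M tokens
    "Character Understanding": 75.5e-3,  # 75.5M tokens
}

total = sum(sources.values())
print(f"{total:.1f}B")  # 172.7B, matching the Total row
```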
Bolmo models are trained on this mix for less than one epoch (~39.3B tokens).
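Dividing the ~39.3B training tokens by the 172.7B-token mix gives the fraction of an epoch covered (simple arithmetic from the figures above):

```python
# Figures from the card: tokens seen during training vs. total mix size (billions).
tokens_trained = 39.3
mix_total = 172.7

epoch_fraction = tokens_trained / mix_total
print(f"{epoch_fraction:.2f}")  # 0.23, i.e. roughly a quarter of one epoch
```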
Bolmo Mix is licensed under the Open Data Commons Attribution License v1.0 (ODC-By). It is intended for research and educational use. For more information, please see our Responsible Use Guidelines.
```bibtex
@misc{bolmo,
  title={Bolmo: Byteifying the Next Generation of Language Models},
  author={Benjamin Minixhofer and Tyler Murray and Tomasz Limisiewicz and Anna Korhonen and Luke Zettlemoyer and Noah A. Smith and Edoardo M. Ponti and Luca Soldaini and Valentin Hofmann},
  year={2025},
  eprint={2512.15586},
  archivePrefix={arXiv},
  primaryClass={cs.CL},
  url={https://arxiv.org/abs/2512.15586},
}
```