Update README.md #1
opened by lgcharpe

README.md CHANGED
@@ -2,9 +2,12 @@
 license: apache-2.0
 datasets:
 - HPLT/HPLT3.0
-- allenai/MADLAD-400
 - allenai/olmo-mix-1124
 - HuggingFaceFW/finepdfs
+- HuggingFaceTB/finemath
+- LLM360/MegaMath
+- HuggingFaceTB/stack-edu
+- HuggingFaceFW/finepdfs-edu
 language:
 - nb
 - nn
@@ -22,7 +25,8 @@ tags:
 
 This is a base (not instruction-tuned) large language model, continually pre-trained on Norwegian data starting from the English [OLMo2-13B](https://huggingface.co/allenai/OLMo-2-1124-13B) model.
 
-Our training data mixture included [HPLTv3](https://huggingface.co/datasets/HPLT/HPLT3.0) Bokmål and Nynorsk, FinePDF Bokmål and Nynorsk,
+Our training data mixture for the first stage (steps 0-24 000) included [HPLTv3](https://huggingface.co/datasets/HPLT/HPLT3.0) in Bokmål, Nynorsk, Faroese, Icelandic, Danish, and Swedish; FinePDF in Bokmål, Nynorsk, Faroese, Icelandic, Danish, and Swedish; OLMo-Mix; and a Northern Sami dataset.
+For the second stage (steps 24 000-33 000), the mixture included a filtered [HPLTv3](https://huggingface.co/datasets/HPLT/HPLT3.0) in Bokmål, Nynorsk, Faroese, Icelandic, Danish, and Swedish; FinePDF-Edu in Bokmål, Nynorsk, Icelandic, Danish, Swedish, and English; FinePDF in Faroese; Stack-Edu; MegaMath Web-Pro; FineMath 4+; InfiWebMath 4+; and the Northern Sami dataset.
 The model was trained for 33 000 steps on around 300 billion tokens. Intermediate checkpoints are published here as branches.
 
 Training was conducted as a part of the [HPLT project](https://hplt-project.org/).
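For reference, a minimal sketch of inspecting two of the datasets listed in the metadata above with the `datasets` library. The subset names used below ("nob_Latn") are assumptions, not taken from this PR; check each dataset card for the actual config names.

```python
# Hedged sketch: peek at two of the datasets listed in the card metadata.
# Streaming mode avoids downloading the full corpora. The subset names
# ("nob_Latn") are assumptions -- consult the dataset cards for real ones.
from datasets import load_dataset

# HPLT 3.0, assumed Norwegian Bokmål subset.
hplt = load_dataset("HPLT/HPLT3.0", "nob_Latn", split="train", streaming=True)
print(next(iter(hplt)))

# FinePDFs, assumed Norwegian Bokmål subset.
pdfs = load_dataset("HuggingFaceFW/finepdfs", "nob_Latn", split="train", streaming=True)
print(next(iter(pdfs)))
```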
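Since the card says intermediate checkpoints are published as branches, here is a sketch of loading one with `transformers`. The repo id is a placeholder (this PR does not name the model repository) and the branch name is an assumption; `list_repo_refs` shows the branches that actually exist.

```python
# Hedged sketch: load an intermediate checkpoint published as a branch.
# The repo id is a placeholder and "step24000" is an assumed branch name.
from huggingface_hub import HfApi
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "HPLT/your-model-name"  # placeholder, not the real repo id

# List the branches (one per published checkpoint, per the card).
refs = HfApi().list_repo_refs(repo_id)
print([branch.name for branch in refs.branches])

# Load a specific checkpoint by passing its branch name as `revision`.
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id, revision="step24000")
```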