Update README.md
README.md CHANGED
@@ -19,7 +19,9 @@ license: apache-2.0
-We are excited to introduce JT-Math-8B-Base: an 8-billion-parameter foundation model engineered for mathematical reasoning and the cornerstone of the JT-Math family.
+We are excited to introduce JT-Math-8B-Base: an 8-billion-parameter foundation model engineered for mathematical reasoning and the cornerstone of the JT-Math family.
+JT-Math-8B-Base was pre-trained on top of JT-Coder-8B-Base using an additional 210 billion tokens of high-quality mathematical and general-domain data.
+With a native 32,768-token context window, it provides a robust, scalable, and reproducible foundation for downstream fine-tuning, enabling researchers and developers to advance the frontier of math-centric AI applications. Technical details, training recipes, and reproducibility notes are available in our technical report.