Update README.md

README.md CHANGED

```diff
@@ -5,6 +5,7 @@ datasets:
 - microsoft/orca-math-word-problems-200k
 - sahil2801/CodeAlpaca-20k
 - ttbui/alpaca_data_with_html_output
+- yahma/alpaca-cleaned
 language:
 - en
 tags:
@@ -20,7 +21,7 @@ tags:
 - fast
 ---
 
-**After the success of Apex 1.5 Coder, I've built something entirely new: Axiom 1 Coder. It's 350M, but trained with 120k Orca-Math samples and FineWeb-Edu.**
+**After the success of Apex 1.5 Coder, I've built something entirely new: *Axiom 1 Coder*. It's 350M, but trained with 120k Orca-Math samples and FineWeb-Edu.**
 
 This model is based on Apex 1.6 Instruct, my newest and best model for chat and facts without coding.
 
```