---
license: apache-2.0
datasets:
- HuggingFaceFW/fineweb-edu
- microsoft/orca-math-word-problems-200k
- sahil2801/CodeAlpaca-20k
- ttbui/alpaca_data_with_html_output
- yahma/alpaca-cleaned
language:
- en
tags:
- small
- tiny
- llm
- finetuned
- instruct
- code
- coding
- math
- cpu
- fast
---
**After the success of Apex 1.5 Coder, I've built something entirely new: *Axiom 1 Coder*. It's a 350M-parameter model, trained with 120k Orca-Math samples and FineWeb-Edu.**

This model is based on Apex 1.6 Instruct, my newest and best model for chat and factual tasks outside of coding.
Stay tuned - weights and code are coming in March 2026!