---
license: apache-2.0
datasets:
  - HuggingFaceFW/fineweb-edu
  - microsoft/orca-math-word-problems-200k
  - sahil2801/CodeAlpaca-20k
  - ttbui/alpaca_data_with_html_output
  - yahma/alpaca-cleaned
language:
  - en
tags:
  - small
  - tiny
  - llm
  - finetuned
  - instruct
  - code
  - coding
  - math
  - cpu
  - fast
---

After the success of Apex 1.5 Coder, I've built something entirely new: Axiom 1 Coder. It's only 350M parameters, but it was trained on 120k Orca-Math word-problem samples and FineWeb-Edu.

This model is based on Apex 1.6 Instruct, my newest and best model for chat and factual answers (it has no coding focus).

Stay tuned: weights and code are coming in March 2026!