Aethon-1.0-Base-100M-ResearchPreview

Aethon-1.0-Base-100M-ResearchPreview is the first public base checkpoint in the Aethon family, a memory-native recurrent language model line built outside the standard transformer path.

This release is a research preview: it is intended to put a first public Aethon base checkpoint on record, to demonstrate that the architecture is trainable in practice, and to mark the start of a larger Aethon release line.
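As background for the "recurrent" claim above, the sketch below shows what a generic recurrent language-model step looks like: a persistent hidden state is updated as each token arrives, and next-token logits are read off that state. This is purely illustrative; Aethon's actual memory-native update rule is not documented here, and every name and shape below is a placeholder, not the Aethon implementation.

```python
import numpy as np

# Generic recurrent LM step, NOT Aethon's architecture: all weights,
# shapes, and the tanh update rule are illustrative placeholders.
rng = np.random.default_rng(0)
vocab, hidden = 32, 16

W_x = rng.normal(scale=0.1, size=(hidden, vocab))   # input projection
W_h = rng.normal(scale=0.1, size=(hidden, hidden))  # recurrent (memory) weights
W_o = rng.normal(scale=0.1, size=(vocab, hidden))   # output projection
b = np.zeros(hidden)

def step(h_prev, token_id):
    """Fold one token into the persistent state, emit next-token logits."""
    x = np.eye(vocab)[token_id]              # one-hot input token
    h = np.tanh(W_x @ x + W_h @ h_prev + b)  # state carries the memory
    logits = W_o @ h                         # scores over the vocabulary
    return h, logits

h = np.zeros(hidden)                         # empty memory at sequence start
for t in [3, 7, 1]:                          # toy token ids
    h, logits = step(h, t)
print(h.shape, logits.shape)                 # (16,) (32,)
```

Unlike a transformer, which attends over the full token history at every step, a recurrent model of this shape carries its history in a fixed-size state, which is the usual motivation for calling such designs memory-native.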
