Akelei-270M

[Showcase image for Akelei-270M]

This is a continued pre-train of Google's Gemma 3 270M base model on roughly 35% of the German Wikipedia. It is designed for German-speaking environments where efficiency and German language understanding are paramount.
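As a minimal usage sketch, the model should load through the standard Hugging Face transformers causal-LM API like any other Gemma 3 checkpoint; the prompt below is an arbitrary German example, not something prescribed by the card, and since this is a base (non-instruct) model it is given a prefix to continue:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mags0ft/Akelei-270M"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Base model, not instruction-tuned: provide a German prefix to continue.
prompt = "Die Akelei ist eine Pflanzengattung, die"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```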

Cite as

```bibtex
@misc{akelei270m,
  author       = {Magnus Leonard Schlinsog},
  title        = {Akelei 270M: A small-scale base language model for the German language},
  year         = {2025},
  url          = {https://huggingface.co/mags0ft/Akelei-270M},
}
```
Model size: 0.3B params (Safetensors)
Tensor types: F32 · BF16

Model tree for mags0ft/Akelei-270M: fine-tuned from the Gemma 3 270M base model (one of 119 fine-tunes of the base); 1 quantization of this model is available.

Dataset used to train mags0ft/Akelei-270M: the German Wikipedia subset described above.