Akelei-270M
This is a continued pre-train of Google's Gemma 3 270M base model, trained on roughly 35% of the German Wikipedia. The model is intended for German-speaking environments where efficiency and German language understanding are key.
Cite as
@misc{akelei270m,
  author = {Magnus Leonard Schlinsog},
  title  = {Akelei 270M: A small-scale base language model for the German language},
  year   = {2025},
  url    = {https://huggingface.co/mags0ft/Akelei-270M},
}