---
language:
- en
license: apache-2.0
library_name: transformers
pipeline_tag: text-generation
base_model: Qwen/Qwen2.5-14B-Instruct
base_model_relation: adapter
tags:
- text-generation
- mixture-of-experts
- moe
- ternary
- qwen2
- qwen2.5
- outlier
- legacy
- research
- superseded
---
# Outlier-40B — V1 (legacy)

> ⚠️ **Superseded.** This is the V1 release of Outlier-40B, preserved for reproducibility. Use Outlier-40B-V3.2 for current work.
## Why this repo still exists
- Reproducibility of early experiments and papers that cite this version.
- Historical interest — this was part of the Day-17 2026 Outlier launch.
## What changed in later versions
V3.2 and later apply the alpha-fix scalar correction (+1.61 pp at 70B, per registry entry E05) and incorporate the three-tier expert-paging memory architecture. Legacy releases include neither.
## See also
- Current: Outlier-40B-V3.2
- All Outlier models: https://huggingface.co/Outlier-Ai
## Citation

```bibtex
@misc{outlier2026,
  author       = {Kerr, Matt},
  title        = {Outlier: Ternary Mixture-of-Experts for Consumer Hardware},
  year         = {2026},
  publisher    = {Hugging Face},
  howpublished = {\url{https://huggingface.co/Outlier-Ai}}
}
```
## Links
- Website: https://outlier.host
- GitHub: https://github.com/Outlier-host/Outlier
- All models: https://huggingface.co/Outlier-Ai
- Consumer Edition collection: https://huggingface.co/collections/Outlier-Ai/outlier-consumer-edition-69e2fb4a0df119ea1747275e
- Server V3.2 collection: https://huggingface.co/collections/Outlier-Ai/outlier-server-v32-69e2fb4b71984614b3c7a4a3
- Research collection: https://huggingface.co/collections/Outlier-Ai/outlier-research-69e2fb3a71984614b3c7a279