---
title: Organization card
emoji: 🚀
colorFrom: purple
colorTo: red
sdk: static
pinned: false
---

**TempestTeam**

**Mission:**
We aim to train large-scale State Space Models (SSMs) efficiently while significantly reducing infrastructure usage. Our goal is to minimize economic and environmental impact without substantially compromising linguistic performance.

**Model:**
**Tempest-LLM** – an efficient language model based on **Mamba2**, leveraging advanced compression methods to achieve an encoding efficiency of **1.58 bits per parameter**.

**Training Approach:**
Our model benefits from a balanced multilingual training strategy, ensuring equal proficiency in:

- 🇫🇷 **French**
- 🇬🇧 **English**
- 🇪🇸 **Spanish**

This multilingual training enhances linguistic versatility and cultural adaptability across languages and contexts.

**Impact:**

- **Economic:** Reduced computational infrastructure lowers operational costs.
- **Ecological:** Lower power consumption and minimal infrastructure requirements shrink the environmental footprint.
- **Performance:** Maintains robust linguistic accuracy and fluency despite compression and optimization.

**Vision:**
TempestTeam is committed to showing that linguistic AI technologies can be both powerful and sustainable, contributing responsibly to AI innovation.
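The 1.58-bits-per-parameter figure corresponds to ternary weights: each parameter takes one of three values {-1, 0, +1}, and log2(3) ≈ 1.585 bits. The sketch below is only a minimal illustration of that idea, not TempestTeam's actual compression method; the scaling rule (per-tensor mean absolute value) and all function names are assumptions.

```python
import numpy as np

def quantize_ternary(w: np.ndarray):
    """Round weights to {-1, 0, +1} after scaling.

    The per-tensor scale (mean absolute value) is an illustrative
    choice, not necessarily what Tempest-LLM uses.
    """
    scale = np.mean(np.abs(w)) + 1e-8
    q = np.clip(np.round(w / scale), -1, 1).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    # Recover an approximation of the original weights.
    return q.astype(np.float32) * scale

rng = np.random.default_rng(0)
w = rng.normal(size=(4, 4)).astype(np.float32)
q, s = quantize_ternary(w)

print(np.unique(q))   # values drawn from {-1, 0, 1}
print(np.log2(3))     # ≈ 1.585 bits needed per ternary parameter
```

Storing only the ternary codes plus one scale per tensor is where the infrastructure savings come from: matrix multiplies against {-1, 0, +1} weights reduce to additions and subtractions.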