---
title: Organization card
emoji: π
colorFrom: purple
colorTo: red
sdk: static
pinned: false
---
**TempestTeam**

**Mission:**

We aim to train large-scale State Space Models (SSMs) efficiently while significantly reducing infrastructure usage. Our goal is to minimize economic and environmental impact without substantially compromising linguistic performance.
**Model:**

**Tempest-LLM** is an efficient language model based on **Mamba2**, leveraging advanced compression methods to achieve an encoding efficiency of **1.58 bits per parameter**.
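The 1.58-bit figure corresponds to ternary weights, since log2(3) ≈ 1.585 bits per parameter. As a rough illustration only (a minimal sketch in the style of absmean ternary quantization, as popularized by BitNet b1.58; the function name and details are illustrative, not our exact pipeline):

```python
import numpy as np

def ternary_quantize(w, eps=1e-8):
    # Illustrative absmean ternary quantizer: scale weights by their mean
    # absolute value, then round each one to {-1, 0, +1}.
    # Three states per weight -> log2(3) ~ 1.58 bits per parameter.
    scale = np.abs(w).mean() + eps
    q = np.clip(np.round(w / scale), -1, 1)
    return q, scale

w = np.array([0.42, -0.03, -1.7, 0.9])
q, scale = ternary_quantize(w)
# q contains only values from {-1.0, 0.0, 1.0}; w is approximated by q * scale
```

The dequantized product `q * scale` approximates the original weights, trading a small amount of precision for a large reduction in storage and compute.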
**Training Approach:**

Our model benefits from a balanced multilingual training strategy, ensuring equal proficiency in:

- 🇫🇷 **French**
- 🇬🇧 **English**
- 🇪🇸 **Spanish**

This multilingual training enhances linguistic versatility and cultural adaptability across different languages and contexts.
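One simple way to keep training balanced across languages is to draw each batch from one of the three corpora with equal probability, so that no language dominates the token budget. A minimal sketch, assuming such an equal-probability sampler (the function and data here are hypothetical, not our actual data loader):

```python
import random

def balanced_batches(corpora, n_batches, seed=0):
    # Hypothetical balanced sampler: each batch's language is drawn uniformly
    # at random, so every language gets roughly the same share of batches.
    rng = random.Random(seed)
    langs = list(corpora)
    for _ in range(n_batches):
        lang = rng.choice(langs)
        yield lang, corpora[lang]  # a real loader would sample a batch of text

corpora = {"fr": ["bonjour"], "en": ["hello"], "es": ["hola"]}
counts = {"fr": 0, "en": 0, "es": 0}
for lang, _ in balanced_batches(corpora, 3000):
    counts[lang] += 1
# each language receives roughly a third of the 3000 batches
```

Uniform sampling is the simplest balancing scheme; in practice one might also weight by corpus size or use temperature-based sampling, but the goal is the same: comparable exposure for each language.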
**Impact:**

- **Economic:** Reduced computational infrastructure leads to lower operational costs.
- **Ecological:** Lower power consumption and minimal infrastructure requirements decrease the environmental footprint.
- **Performance:** Maintains robust linguistic accuracy and fluency despite compression and optimization.
**Vision:**

TempestTeam is committed to showing that linguistic AI technologies can be both powerful and sustainable, contributing responsibly to AI innovation.