---
title: Reachy Mini Open Conversation
emoji: 🤖
colorFrom: indigo
colorTo: purple
sdk: docker
pinned: false
short_description: Voice conversation with Reachy Mini using open-source AI
---
# 🤖 Reachy Mini Open Conversation

A voice conversation app powered by fully open-source models:

- **STT**: [faster-whisper](https://github.com/SYSTRAN/faster-whisper) – fast speech-to-text
- **LLM**: [Ollama](https://ollama.com/) – local LLM inference (llama3.2 by default)
- **TTS**: [edge-tts](https://github.com/rany2/edge-tts) – high-quality text-to-speech
## Setup

Set these environment variables (as Space secrets for HF Spaces):

| Variable | Default | Description |
|---|---|---|
| `OLLAMA_BASE_URL` | `http://localhost:11434` | URL of your Ollama server |
| `MODEL_NAME` | `llama3.2` | Ollama model to use |
| `STT_MODEL` | `base` | faster-whisper model size (`tiny`/`base`/`small`/`medium`/`large-v3`) |
| `TTS_VOICE` | `en-US-AriaNeural` | edge-tts voice name |

> **Note**: You need a running Ollama server accessible from the Space. Set `OLLAMA_BASE_URL` to point to your remote Ollama instance.
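The app would typically pick these variables up at startup, falling back to the documented defaults when a variable is unset. A minimal sketch (the variable names mirror the table above; the exact loading code in the app may differ):

```python
import os

# Read configuration from the environment, falling back to the
# defaults documented in the table above.
OLLAMA_BASE_URL = os.getenv("OLLAMA_BASE_URL", "http://localhost:11434")
MODEL_NAME = os.getenv("MODEL_NAME", "llama3.2")
STT_MODEL = os.getenv("STT_MODEL", "base")
TTS_VOICE = os.getenv("TTS_VOICE", "en-US-AriaNeural")

print(f"Ollama: {OLLAMA_BASE_URL} (model: {MODEL_NAME})")
print(f"STT: faster-whisper '{STT_MODEL}', TTS voice: {TTS_VOICE}")
```

Setting any of the four variables as a Space secret overrides the corresponding default without a code change.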