---
title: Ollama Server
emoji: 🦙
colorFrom: blue
colorTo: green
sdk: docker
app_port: 11434
pinned: true
---
# Ollama Server on Hugging Face Spaces

A lightweight [Ollama](https://ollama.com/) inference server running on a Hugging Face Space.

## Pre-pulled Models

- `gemma2:2b`
- `alibilge/Huihui-GLM-4.6V-Flash-abliterated:q4_k_s`
- `frob/mradermacher-Llama3.3-8B-Thinking-Heretic-Claude-4.5-Opus:q8_0`

## Usage

Once the Space is running, you can interact with the Ollama API at the Space's URL.
### Pull additional models

```bash
curl https://your-space.hf.space/api/pull -d '{"model": "llama3.2"}'
```
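The pull endpoint streams progress as newline-delimited JSON (NDJSON), one status object per line. A minimal sketch of consuming that stream, using illustrative sample lines rather than a live server (the `status`, `total`, and `completed` fields follow Ollama's documented progress format):

```python
import json

# Illustrative NDJSON lines as /api/pull might emit them.
stream = [
    '{"status": "pulling manifest"}',
    '{"status": "pulling abc123", "total": 100, "completed": 50}',
    '{"status": "success"}',
]

def parse_progress(lines):
    """Yield (status, percent_complete) tuples from NDJSON progress lines.

    percent_complete is None for lines that carry no byte counts.
    """
    for line in lines:
        obj = json.loads(line)
        total = obj.get("total")
        completed = obj.get("completed", 0)
        pct = round(100 * completed / total) if total else None
        yield obj["status"], pct

events = list(parse_progress(stream))
```

A real client would iterate over the HTTP response body line by line instead of a fixed list, stopping when `status` reaches `"success"`.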
### Generate a completion

```bash
curl https://your-space.hf.space/api/generate -d '{
  "model": "frob/mradermacher-Llama3.3-8B-Thinking-Heretic-Claude-4.5-Opus:q8_0",
  "prompt": "Hello!"
}'
```
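By default `/api/generate` streams its answer as NDJSON objects, each carrying a `response` text fragment until a final object with `"done": true` (send `"stream": false` to get one JSON reply instead). A sketch of reassembling the streamed fragments, again with sample lines standing in for a live response:

```python
import json

# Illustrative NDJSON lines as a streaming /api/generate response might emit.
stream = [
    '{"response": "Hel", "done": false}',
    '{"response": "lo!", "done": false}',
    '{"response": "", "done": true}',
]

def collect(lines):
    """Concatenate streamed "response" fragments into the full completion."""
    text = ""
    for line in lines:
        obj = json.loads(line)
        text += obj.get("response", "")
        if obj.get("done"):
            break
    return text

full = collect(stream)  # "Hello!"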
## API

This Space exposes the standard Ollama REST API on port `11434`.
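That API also includes `/api/chat` for multi-turn conversations. A minimal sketch of building a chat payload, assuming the `gemma2:2b` model pre-pulled above (`"stream": false` requests a single JSON reply rather than NDJSON chunks):

```python
import json

# Minimal /api/chat request body; POST this to
# https://your-space.hf.space/api/chat with your preferred HTTP client.
payload = {
    "model": "gemma2:2b",  # one of the pre-pulled models above
    "messages": [
        {"role": "user", "content": "Hello!"},
    ],
    "stream": False,  # ask for one complete JSON response
}
body = json.dumps(payload)
```

The response's assistant reply lives under the `message` key of the returned JSON.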