---
title: Ollama API
emoji: 🐳
colorFrom: indigo
colorTo: pink
sdk: docker
app_port: 7860
pinned: false
license: mit
---

# Ollama API on Hugging Face Spaces

Run Ollama models via a REST API.

Example usage:

```bash
curl -X POST "https://enoch10jason-ollama-api.hf.space/generate" \
  -H "Authorization: Bearer YOUR_SECRET_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"prompt": "Hello from Ollama!"}'
```
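
The same request can be issued from Python. Below is a minimal client sketch using only the standard library; the `generate` helper name and the plain-text response handling are assumptions, not part of the Space's documented API:

```python
import json
import urllib.request

API_URL = "https://enoch10jason-ollama-api.hf.space/generate"
API_KEY = "YOUR_SECRET_API_KEY"  # replace with the Space's secret key

def generate(prompt: str) -> str:
    """POST a prompt to the /generate endpoint and return the raw response body."""
    payload = json.dumps({"prompt": prompt}).encode("utf-8")
    req = urllib.request.Request(
        API_URL,
        data=payload,
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode("utf-8")
```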



---

### 🔹 What each field does
- `title`: Title shown on your Space page.  
- `emoji`: Icon for your Space.  
- `colorFrom`, `colorTo`: Background gradient.  
- `sdk: docker`: Tells Hugging Face this is a Docker Space.  
- `app_port: 7860`: Required, because your FastAPI app runs on port 7860.  
- `license`: Optional but good to include (`mit`, `apache-2.0`, etc.).
