---
title: deepfake-fastapi
emoji: 🚀
colorFrom: blue
colorTo: green
sdk: docker
---
# Model FastAPI

A FastAPI service that provides two capabilities:

- Deepfake image detection using a SigLIP image-classification backbone with a LoRA adapter.
- A news-aware chatbot that selectively performs web search via Tavily and responds with verified sources when evidence is found.
## Project Layout

- Main app entry: `main.py`
- Deepfake detector: `app/detector.py`, `app/core/detector/`, `app/services/detector/`, `model/output/siglip-lora-optimized/`
- Chatbot: `app/chatbot.py`, `app/core/chatbot/`, `app/services/chatbot/`, `app/schemas/chat.py`
- Database migration (chat history table): `supabase/migrations/`
## Requirements

- Python 3.13+
- The uv package manager (https://docs.astral.sh/uv/)
- Access to the SigLIP base model and the LoRA adapter stored at `model/output/siglip-lora-optimized/`
## Environment Variables

- `DATABASE_URL` (or `SUPABASE_DATABASE_URL`): PostgreSQL connection string for chat history (used when `save_to_db=true`).
- `GROQ_API_KEY`: required by the Groq LLM used for responses.
- `OPENROUTER_API_KEY`: required by the OpenRouter model used for classification/query rewriting.
- `TAVILY_API_KEY`: required for Tavily search.
- Optional: set `CUDA_VISIBLE_DEVICES` as needed for GPU inference.
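For local runs, these variables can be supplied via a `.env` file. A sketch with placeholder values (not real credentials; the connection string format is the standard PostgreSQL URL):

```env
DATABASE_URL=postgresql://user:password@localhost:5432/chatdb
GROQ_API_KEY=your-groq-key
OPENROUTER_API_KEY=your-openrouter-key
TAVILY_API_KEY=your-tavily-key
```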
## Setup (local)

```bash
uv sync
uv run main.py  # starts FastAPI on 0.0.0.0:7860
```

The detector loads the SigLIP base model and applies the LoRA adapter from `model/output/siglip-lora-optimized/`.
## API

- `POST /detect`
  - Form-data file field: `file` (image). Returns `predicted_class` (index), `predicted_label` (from `id2label`), `prediction` (real/fake, thresholded at P(real) >= 0.90), `confidence`, and the class probabilities.
- `POST /chat`
  - JSON body: `{"query": "...", "session_id": "optional", "save_to_db": true|false}`
  - Auto-classifies whether search is needed, optionally queries Tavily, then responds. Returns `response.content`, `session_id`, `used_search`, and `search_reason`.
- `DELETE /chat/{session_id}`
  - Clears chat history (both in-memory guest sessions and DB rows).
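The `/detect` thresholding rule above can be sketched as follows. This is an illustrative function, not the project's actual code; the function and argument names are hypothetical.

```python
def label_from_probs(probs: list[float], real_index: int,
                     threshold: float = 0.90) -> tuple[str, float]:
    """Apply the P(real) >= 0.90 rule: only a sufficiently confident
    'real' probability yields the 'real' label; everything else is 'fake'."""
    p_real = probs[real_index]
    prediction = "real" if p_real >= threshold else "fake"
    return prediction, p_real
```

For example, `label_from_probs([0.12, 0.88], real_index=1)` yields `("fake", 0.88)`, since 0.88 falls below the 0.90 cutoff.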
## Quick cURL examples

```bash
# Deepfake detection
curl -X POST "http://localhost:7860/detect" \
  -F "file=@path/to/image.jpg"

# Chat (letting the service decide whether to search)
curl -X POST "http://localhost:7860/chat" \
  -H "Content-Type: application/json" \
  -d '{"query": "Was the latest SpaceX launch successful?", "save_to_db": false}'
```
## Docker

A Dockerfile is provided that uses the uv base image; it installs dependencies with `uv sync` and runs `uv run main.py`.

```bash
docker build -t model-fast-api .
docker run -p 7860:7860 --env-file .env model-fast-api
```
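The provided Dockerfile is along these lines. This is a sketch, not the exact file; in particular, the uv base-image tag is an assumption.

```dockerfile
# uv's official image bundles Python and the uv CLI (tag is an assumption)
FROM ghcr.io/astral-sh/uv:python3.13-bookworm-slim
WORKDIR /app
COPY . .
RUN uv sync
EXPOSE 7860
CMD ["uv", "run", "main.py"]
```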
## Notes

- The news-verification prompt returns source links only when the claim is supported by the retrieved results; otherwise it replies with UNDETERMINED and no links.
- If chat history persistence is disabled (`save_to_db=false`), sessions are stored in memory and are lost on restart.
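The in-memory guest-session behavior above follows a common pattern; a minimal sketch (names are hypothetical, and the actual implementation may differ):

```python
from collections import defaultdict

# session_id -> ordered list of (role, content) messages; cleared by
# DELETE /chat/{session_id} and lost whenever the process restarts.
_guest_sessions: defaultdict[str, list[tuple[str, str]]] = defaultdict(list)

def append_message(session_id: str, role: str, content: str) -> None:
    """Record one chat turn for a guest session."""
    _guest_sessions[session_id].append((role, content))

def get_history(session_id: str) -> list[tuple[str, str]]:
    """Return a copy of the session's messages (empty if unknown)."""
    return list(_guest_sessions.get(session_id, []))

def clear_session(session_id: str) -> None:
    """Drop a session entirely, as the DELETE endpoint does."""
    _guest_sessions.pop(session_id, None)
```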