---
title: FlowPilot
emoji: 📬
colorFrom: blue
colorTo: green
sdk: docker
app_port: 7860
short_description: Gmail-first small business workflow automation with FastAPI.
---
# FlowPilot
FlowPilot is a Gmail-first automation layer for small business owners. The project includes:
- A FastAPI backend with analysis, workflow suggestion, workflow build, deploy, upload, status, and escalation endpoints.
- A primitive-based workflow engine that compiles and executes JSON workflows.
- A Chrome extension scaffold that injects a sidebar into Gmail and walks through the onboarding flow.
- Lightweight demo storage and tests for the core owner flow.
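The primitive-based engine described above compiles JSON into executable steps. A minimal sketch of that idea in Python — the primitive names and JSON shape here are hypothetical, not FlowPilot's actual schema:

```python
import json

# Hypothetical primitive registry: each primitive takes the running
# context dict plus its step parameters. (Illustrative names only.)
PRIMITIVES = {
    "match_subject": lambda ctx, p: p["keyword"].lower() in ctx["subject"].lower(),
    "tag": lambda ctx, p: ctx.setdefault("tags", []).append(p["label"]),
}

def compile_workflow(raw: str):
    """Parse the JSON and resolve each step's primitive up front."""
    steps = json.loads(raw)["steps"]
    return [(PRIMITIVES[s["primitive"]], s.get("params", {})) for s in steps]

def run(compiled, ctx):
    for fn, params in compiled:
        if fn(ctx, params) is False:  # a False-returning primitive halts the run
            break
    return ctx

workflow = (
    '{"steps": ['
    '{"primitive": "match_subject", "params": {"keyword": "invoice"}},'
    '{"primitive": "tag", "params": {"label": "billing"}}]}'
)
ctx = run(compile_workflow(workflow), {"subject": "Invoice #42 overdue"})
# ctx now carries the "billing" tag because the subject matched.
```

The split between `compile_workflow` and `run` mirrors the compile/execute distinction in the bullet above: validation and primitive lookup happen once, execution can happen per email.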
## Project Structure

```
backend/
extension/
tests/
Dockerfile
README.md
```
## Local Development

Install dependencies and run the backend:

```bash
pip install -r backend/requirements.txt
uvicorn backend.main:app --reload
```
Create a `.env` in the repo root first:

```bash
nano .env
```
Set these values when you want live Groq responses:

```env
AI_PROVIDER=groq
GROQ_API_KEY=your_groq_key
GROQ_MODEL=llama-3.3-70b-versatile
GROQ_BASE_URL=https://api.groq.com/openai/v1
GROQ_TIMEOUT_SECONDS=8
```
If `GROQ_API_KEY` is blank, FlowPilot falls back to local deterministic logic so the app and tests still work.
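That fallback can be pictured as a simple provider switch. A sketch under the assumption that the check is just "key present or not" — the function name is illustrative, not the backend's actual API:

```python
import os

def pick_provider() -> str:
    """Use Groq only when explicitly selected and a key is actually set."""
    if os.environ.get("AI_PROVIDER") == "groq" and os.environ.get("GROQ_API_KEY", "").strip():
        return "groq"
    return "local"  # deterministic fallback keeps the app and tests working offline

os.environ["AI_PROVIDER"] = "groq"
os.environ["GROQ_API_KEY"] = ""          # blank key -> deterministic local logic
offline = pick_provider()
os.environ["GROQ_API_KEY"] = "gsk_demo"  # key set -> live Groq calls
online = pick_provider()
```

Treating a blank string the same as an unset variable (via `.strip()`) matters here, because `GROQ_API_KEY=` in a `.env` file yields an empty string, not a missing key.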
## Hugging Face Space
This repo is ready for a Docker Space. The container listens on port 7860, and the metadata at the top of this README tells Hugging Face to treat it as a Docker app.
Typical flow:

```bash
git clone https://huggingface.co/spaces/technophyle/flow-pilot
cd flow-pilot
```

Copy this project into the cloned Space repo, then push:

```bash
git add .
git commit -m "Switch FlowPilot to Hugging Face Space with Groq"
git push
```
In the Hugging Face Space settings, add this secret:

- `GROQ_API_KEY`

Optional Space variables:

```env
AI_PROVIDER=groq
GROQ_MODEL=llama-3.3-70b-versatile
GROQ_BASE_URL=https://api.groq.com/openai/v1
GROQ_TIMEOUT_SECONDS=8
ANALYZE_WITH_AI=false
```
For GitHub Actions auto-deploys, add these repository settings:

- Secret: `HF_TOKEN`
- Variable: `HF_SPACE_REPO=technophyle/flow-pilot`
## Extension
Load `extension/` as an unpacked Chrome extension. The sidebar currently points to `http://localhost:8000/api` for local work, so update that base URL when you connect it to your hosted Space backend.
## Notes
- Backend storage is currently lightweight demo storage suitable for iteration.
- Groq is used for live AI calls through its OpenAI-compatible chat completions API.
- The `/api/analyze` endpoint can still stay fast and deterministic when `ANALYZE_WITH_AI=false`.
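Since `ANALYZE_WITH_AI` arrives as a string from the environment, the backend has to parse it into a boolean before gating the AI path. A hedged sketch of that gate — the helper name and accepted truthy values are illustrative:

```python
import os

TRUTHY = {"1", "true", "yes", "on"}

def analyze_with_ai() -> bool:
    """Env vars are strings; treat anything outside the truthy set as False."""
    return os.environ.get("ANALYZE_WITH_AI", "false").strip().lower() in TRUTHY

os.environ["ANALYZE_WITH_AI"] = "false"
fast_path = not analyze_with_ai()  # False flag -> deterministic /api/analyze path
```

Parsing explicitly like this avoids the classic pitfall where `bool("false")` evaluates to `True` because any non-empty string is truthy in Python.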