# Nova Autonomous LLM System

This directory scaffolds a next-generation autonomous LLM stack suitable for container deployment under supervisord. It includes a gateway, a basic router, and a tools MCP server, and is designed to integrate with vLLM and external public endpoints.
## Key components

- `plane/api/gateway.py`: OpenAI-compatible proxy and route registry
- `plane/orchestration/router.py`: task routing stub for multi-backend selection
- `plane/tools/mcp_server.py`: minimal MCP-style tool registry and executor
- `config/external_routes.yaml`: external/public URL candidates (Cloudflare and local)
- `requirements-nova.txt`: minimal runtime dependencies
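To illustrate the kind of interface the tools server provides, here is a minimal sketch of an MCP-style tool registry and executor. The names (`ToolRegistry`, `register`, `execute`, the `echo` tool) are illustrative assumptions, not the actual API of `plane/tools/mcp_server.py`:

```python
# Hypothetical sketch of an MCP-style tool registry; not the real module's API.
from typing import Any, Callable, Dict


class ToolRegistry:
    """Maps tool names to plain Python callables and dispatches calls to them."""

    def __init__(self) -> None:
        self._tools: Dict[str, Callable[..., Any]] = {}

    def register(self, name: str) -> Callable[[Callable[..., Any]], Callable[..., Any]]:
        """Decorator that records a callable under the given tool name."""
        def decorator(fn: Callable[..., Any]) -> Callable[..., Any]:
            self._tools[name] = fn
            return fn
        return decorator

    def execute(self, name: str, **kwargs: Any) -> Any:
        """Look up a registered tool by name and invoke it with keyword arguments."""
        if name not in self._tools:
            raise KeyError(f"unknown tool: {name}")
        return self._tools[name](**kwargs)


registry = ToolRegistry()


@registry.register("echo")
def echo(text: str) -> str:
    # Trivial example tool: returns its input unchanged.
    return text
```

A caller would then invoke `registry.execute("echo", text="hello")` to run a tool by name.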
## Start locally

- vLLM: `bash elizabeth/deployment_configs/serve_vllm_no_steering.sh`
- supervisord: `supervisord -c supervisord.conf`, then check with `supervisorctl status`
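For reference, a `supervisord.conf` managing these services might contain program entries along these lines. The program names, module paths, and ports below are assumptions for illustration, not the repository's actual configuration:

```ini
; Hypothetical program entries; adjust commands, ports, and paths to match the repo.
[program:gateway]
command=python -m plane.api.gateway
environment=PORT="8080",UPSTREAM_OPENAI_BASE="http://127.0.0.1:8000"
autorestart=true

[program:mcp_tools]
command=python -m plane.tools.mcp_server
environment=PORT="8081"
autorestart=true
```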
## Environment

- `UPSTREAM_OPENAI_BASE`: upstream OpenAI-compatible base URL (default `http://127.0.0.1:8000`)
- `EXTERNAL_ROUTES_FILE`: path to the external routes YAML file
- `PORT`: per-service port override
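A service would typically read these variables with fallbacks to the documented defaults. This is a sketch of that pattern; the `8080` fallback for `PORT` and the routes-file fallback path are assumptions, not values taken from the codebase:

```python
# Sketch: read the documented environment variables with hedged defaults.
import os

# Default matches the documented upstream base URL.
UPSTREAM_OPENAI_BASE = os.environ.get("UPSTREAM_OPENAI_BASE", "http://127.0.0.1:8000")

# Fallback path is an assumption based on the repo layout described above.
EXTERNAL_ROUTES_FILE = os.environ.get("EXTERNAL_ROUTES_FILE", "config/external_routes.yaml")

# Per-service port override; 8080 is an assumed default, not specified by the docs.
PORT = int(os.environ.get("PORT", "8080"))
```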