# Munger Engine API

This project converts the Munger Stock Analysis Strategy into a deployable API for Hugging Face Spaces.
## Quick Start

### Local Development

1. Install dependencies:

   ```bash
   npm install
   ```

2. Run the development server:

   ```bash
   npm run dev
   ```

   *Note: If behind a corporate proxy or experiencing SSL issues, use:*

   ```bash
   npm run dev:insecure
   ```

3. Endpoints will be available at `http://localhost:3000/api/v1/...`
### API Endpoints

> [!TIP]
> You can import the included `munger-api.postman_collection.json` file into Postman to quickly test all endpoints.

| Method | Endpoint | Description |
| :--- | :--- | :--- |
| `GET` | `/api/v1/health` | Check system status. |
| `POST` | `/api/v1/sync` | Trigger a scan. Payload: `{ "force": true, "symbols": ["AAPL"] }`. |
| `GET` | `/api/v1/signals` | Get the list of filtered stocks. Query params: `signal` (e.g. `BUY_TRIGGER`), `interesting=true` (returns `BUY_TRIGGER` + `WATCHLIST`). |
| `GET` | `/api/v1/ticker/[symbol]` | Get detailed analysis for a specific stock. |
| `GET` | `/api/v1/portfolio` | Get Alpaca Paper Trading positions. |
### Usage Examples

**Trigger Scan (Full):**

```bash
curl -X POST http://localhost:3000/api/v1/sync \
  -H "Content-Type: application/json" \
  -d '{"force": true}'
```

**Trigger Scan (Specific Symbols - Ideal for Testing):**

```bash
curl -X POST http://localhost:3000/api/v1/sync \
  -H "Content-Type: application/json" \
  -d '{"symbols": ["ERIE", "AAPL"], "force": true}'
```

**Get Signals:**

```bash
curl http://localhost:3000/api/v1/signals
```

**Get Ticker Details:**

```bash
curl http://localhost:3000/api/v1/ticker/AAPL
```
## Order Execution Engine

The API uses a deterministic **Trade Plan Builder** for signals marked as `BUY_TRIGGER`.

### Advanced Logic Blueprint

* **Playbook:** `BUY_TRIGGER_REBOUND_CONFIRM` (breakout-style entry).
* **Risk Model:** Wilder's ATR(14) with a 1:3 risk-reward ratio.
  * *Risk Unit (R):* 2.0 * ATR.
* **Entry Order:** `STOP_LIMIT` (Day).
  * *Stop Price:* Trigger Level + Tick ($0.01).
  * *Limit Price:* Entry Stop + Slippage Buffer (0.75%).
  * *Expiry:* 5 trading days.
* **Protective Stop:** `STOP` order (GTC).
  * *Price:* Entry - 1R.
* **Take Profit:** `LIMIT` order (GTC).
  * *Price:* Entry + 3R.
* **Position Sizing:** Portfolio-aware (Alpaca integration).
  * Fetches equity and buying power.
  * Risks **0.5%** of equity per trade.
  * Caps position size at **10%** of equity.
  * Checks buying-power constraints.
* **Output:** Returns a comprehensive JSON `TradePlan` including specific order parameters, risk calculation details, and sizing constraints.
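The arithmetic in the blueprint above can be sketched roughly as follows. This is a minimal illustration, not the actual module: `buildTradePlan`, `TradePlan`, and the field names are hypothetical, and the real builder also carries order parameters and constraint details omitted here.

```typescript
// Hypothetical sketch of the Trade Plan Builder math described above.
interface TradePlan {
  entryStop: number;   // STOP_LIMIT trigger: trigger level + $0.01 tick
  entryLimit: number;  // entry stop + 0.75% slippage buffer
  stopLoss: number;    // entry - 1R (GTC STOP)
  takeProfit: number;  // entry + 3R (GTC LIMIT), 1:3 risk-reward
  qty: number;         // shares, portfolio-aware sizing
}

function buildTradePlan(
  triggerLevel: number,
  atr14: number,       // Wilder's ATR(14)
  equity: number,
  buyingPower: number,
): TradePlan {
  const riskUnit = 2.0 * atr14;                 // R = 2.0 * ATR
  const entryStop = triggerLevel + 0.01;        // trigger + one tick
  const entryLimit = entryStop * 1.0075;        // 0.75% slippage buffer
  const stopLoss = entryStop - riskUnit;        // entry - 1R
  const takeProfit = entryStop + 3 * riskUnit;  // entry + 3R

  // Size so that a full stop-out loses 0.5% of equity...
  const riskBudget = 0.005 * equity;
  let qty = Math.floor(riskBudget / riskUnit);
  // ...capped at 10% of equity notional and available buying power.
  const maxNotional = Math.min(0.1 * equity, buyingPower);
  qty = Math.min(qty, Math.floor(maxNotional / entryLimit));

  return { entryStop, entryLimit, stopLoss, takeProfit, qty };
}
```

Note how the 10% notional cap (or buying power) can bind before the 0.5% risk budget does; the real builder reports which constraint was active.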
## Deployment to Hugging Face Spaces

1. Create a new **Docker** Space on Hugging Face.
2. Upload this repository.
3. Set the following **Secret** variables in your Space settings:
   * `ALPACA_KEY`
   * `ALPACA_SECRET`
   * `HF_TOKEN` (optional, for future dataset integration)

The Dockerfile is configured to run the Next.js application on port 7860.
## Hugging Face Space Access

The API is deployed and accessible at:

`https://dromerosm-munger-engine.hf.space`

**Authentication Required**

The API is protected by an API key. All requests to `/api/*` endpoints must include the `x-api-key` header:

```
x-api-key: <YOUR_API_KEY>
```

**Example:**

```bash
curl -H "x-api-key: 5664955..." https://dromerosm-munger-engine.hf.space/api/v1/health
```
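For programmatic access, the same authenticated call can be made from Node 18+ (which ships a global `fetch`). This is an illustrative sketch: `signalsRequest` and `getSignals` are hypothetical helpers, not part of the project.

```typescript
// Hypothetical client sketch for the deployed Space.
const BASE_URL = "https://dromerosm-munger-engine.hf.space";

// Build the URL and headers for GET /api/v1/signals.
function signalsRequest(apiKey: string, interestingOnly = true) {
  const url = new URL("/api/v1/signals", BASE_URL);
  if (interestingOnly) url.searchParams.set("interesting", "true");
  // Every /api/* call must carry the x-api-key header.
  return { url: url.toString(), headers: { "x-api-key": apiKey } };
}

async function getSignals(apiKey: string) {
  const { url, headers } = signalsRequest(apiKey);
  const res = await fetch(url, { headers });
  if (!res.ok) throw new Error(`API error: ${res.status}`);
  return res.json();
}
```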
## Configuration

* **Watchlist**: Currently uses `scripts/sp500_symbols.json`.
* **Persistence**: Data is stored in `data/stocks.json`. **Note:** On standard HF Spaces, this data is ephemeral and will reset on restart. For production, consider using HF Datasets or an external DB.
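Because the store is a flat JSON file that vanishes on restart, readers of it should tolerate the missing-file case. A minimal sketch (illustrative only; `loadStore` is a hypothetical helper, and the real code may handle this differently):

```typescript
import { existsSync, readFileSync } from "node:fs";

// Illustrative: data/stocks.json may not exist after a Space restart,
// so fall back to an empty store instead of crashing on first read.
function loadStore(path = "data/stocks.json"): Record<string, unknown> {
  if (!existsSync(path)) return {};
  return JSON.parse(readFileSync(path, "utf8"));
}
```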