---
title: Synapse-Base Inference API
emoji: 🧠
colorFrom: blue
colorTo: purple
sdk: docker
pinned: true
license: cc-by-nc-4.0
---

# 🧠 Synapse-Base Inference API

High-performance chess move prediction API powered by Synapse-Base v3.0.

## 🎯 Features

- **Deep Search Algorithm**: Alpha-beta pruning with move ordering
- **CPU Optimized**: Runs efficiently on 2 vCPU + 16 GB RAM
- **REST API**: Simple POST endpoint for move generation
- **Model**: 38.1M-parameter hybrid CNN-Transformer

## 📡 API Endpoint

### `POST /get-move`

**Request:**

```json
{
  "fen": "rnbqkbnr/pppppppp/8/8/8/8/PPPPPPPP/RNBQKBNR w KQkq - 0 1",
  "depth": 3,
  "time_limit": 5000
}
```

**Response:**

```json
{
  "best_move": "e2e4",
  "evaluation": 0.25,
  "depth_searched": 3,
  "nodes_evaluated": 15234,
  "time_taken": 1247
}
```

## 🔧 Parameters

- **fen** (required): Current board position in FEN notation
- **depth** (optional): Search depth (1-5, default: 3)
- **time_limit** (optional): Maximum search time in milliseconds (default: 5000)

## 💻 Local Testing

```bash
docker build -t synapse-inference .
docker run -p 7860:7860 synapse-inference
```

## 📊 Performance

- **Average Response Time**: 1-3 seconds per move
- **Memory Usage**: ~4 GB RAM
- **Concurrent Requests**: Up to 4 simultaneous

## ⚠️ Rate Limits

Free tier: 100 requests/hour per IP

---

Built with ❤️ by GambitFlow
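
## 🧪 Example Client

A minimal Python client sketch for the `/get-move` endpoint. The base URL assumes the local Docker run above (port 7860); the helper names `build_request` and `get_move` are illustrative, not part of the API — only the request and response fields come from this README.

```python
import json
import urllib.request

# Matches the local Docker run above; adjust for your deployment.
BASE_URL = "http://localhost:7860"


def build_request(fen, depth=3, time_limit=5000):
    """Assemble the /get-move request body with the documented fields."""
    return {"fen": fen, "depth": depth, "time_limit": time_limit}


def get_move(fen, depth=3, time_limit=5000):
    """POST a position to /get-move and return the parsed JSON response."""
    body = json.dumps(build_request(fen, depth, time_limit)).encode("utf-8")
    req = urllib.request.Request(
        f"{BASE_URL}/get-move",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


if __name__ == "__main__":
    start_fen = "rnbqkbnr/pppppppp/8/8/8/8/PPPPPPPP/RNBQKBNR w KQkq - 0 1"
    result = get_move(start_fen)
    print(result["best_move"], result["evaluation"])
```

Keep `depth` at 3 or below for interactive use; per the performance notes above, each extra ply can push response times past the 1-3 second range.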