# Project V1 Backend
## Overview
This backend implements the Genesis Engine for mathematical universe simulation, axiom evolution, theorem derivation, and persistent history tracking. It is built with FastAPI, SQLAlchemy, and SymPy.
## Core Logic & Algorithms
- **AlphaGeometry, Lean 4, and Coq Integration:**
  - The backend can invoke external proof engines for advanced theorem proving and verification.
  - Adapters: `backend/core/alphageometry_adapter.py`, `lean_adapter.py`, `coq_adapter.py`.
  - Usage example:
    ```python
    from backend.core.alphageometry_adapter import run_alphageometry

    output = run_alphageometry("path/to/input_file")
    ```
  - Similar usage applies to `run_lean4` and `run_coq`.
  - Ensure the external tools are installed and their paths configured correctly (see the adapters for details).
- **Universe Generation:** Create universes with custom types and axioms.
- **Axiom Evolution:** Add, evolve, and track axioms with lineage and versioning.
- **Theorem Derivation:** Use symbolic logic (SymPy) to derive theorems from axioms and store proofs.
- **History Tracking:** All changes to universes and axioms are versioned and timestamped.
- **Neuro-Symbolic Network:** Train and use a neural network (PyTorch) to guide proof search and theory growth.
- **Quantum-Inspired Algorithms:** Classical simulation of Grover’s search and other quantum algorithms for proof exploration.
- **Cross-Universe Analysis:** Compare multiple universes to find shared axioms, theorems, and patterns. Results are stored in the database.
  - Advanced analysis algorithms can be added to detect invariants, patterns, and relationships across universes. Results are queryable via the API and stored for further research.
- **3D Visualization & Query Interface:** Backend endpoints provide graph data for universes, axioms, and theorems to support interactive frontend visualization.
- **Query Engine:** API endpoints answer complex mathematical questions and generate universe/theorem summaries for research and visualization.
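The actual derivation machinery lives in `backend/core/`; as a rough illustration of the SymPy-based idea (the symbols and axioms below are invented for the example, not the backend's actual encoding), entailment can be checked by testing whether the axioms together with the negated candidate theorem are unsatisfiable:

```python
from sympy import symbols
from sympy.logic.boolalg import And, Implies, Not
from sympy.logic.inference import satisfiable

p, q, r = symbols("p q r")

# Toy axioms: p -> q and q -> r; candidate theorem: p -> r
axioms = [Implies(p, q), Implies(q, r)]
theorem = Implies(p, r)

# The axioms entail the theorem iff (axioms AND NOT theorem) is unsatisfiable;
# satisfiable() returns False for an unsatisfiable formula
entailed = satisfiable(And(*axioms, Not(theorem))) is False  # -> True
```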
## API Endpoints
- `POST /universes` — Create a new universe (with optional axioms and type)
- `GET /universes` — List all universes
- `GET /universes/{universe_id}/history` — Retrieve universe history and axiom lineage
- `GET /axioms/{universe_id}` — List axioms for a universe
- `POST /axioms` — Add a new axiom
- `POST /axioms/evolve` — Evolve an axiom (with lineage)
- `POST /theorems/derive` — Derive a theorem from axioms
- `GET /theorems/{universe_id}` — List theorems for a universe
- `POST /neuro/train` — Train the neuro-symbolic network
- `POST /neuro/predict` — Predict with the neuro-symbolic network
- `POST /neuro/guide` — Guide proof search using the neuro-symbolic network
- `POST /quantum/grover` — Run Grover’s search algorithm simulation
- `POST /analysis/cross_universe` — Run cross-universe analysis and retrieve shared axioms/theorems
- `GET /visualization/universe/{universe_id}` — Get graph data for a single universe
- `GET /visualization/universes` — Get graph data for all universes
- `GET /query/universe_summary/{universe_id}` — Get a summary of a universe (axioms, theorems, counts)
- `GET /query/axiom_usage/{axiom_id}` — Get usage of an axiom in theorems
## Usage Example
```http
# Create a universe
POST /universes
{
  "name": "Group Theory",
  "description": "Universe for group theory",
  "universe_type": "group_theory",
  "axioms": ["Closure", "Associativity", "Identity", "Inverse"]
}

# Add an axiom
POST /axioms
{
  "universe_id": 1,
  "statement": "Commutativity"
}

# Evolve an axiom
POST /axioms/evolve
{
  "axiom_id": 2,
  "new_statement": "Commutativity (strong)"
}

# Derive a theorem
POST /theorems/derive
{
  "universe_id": 1,
  "axiom_ids": [1, 2],
  "statement": "Closure Commutativity"
}

# Train the neuro-symbolic network
POST /neuro/train
{
  "training_data": [[0.1, 0.2, ...], [0.3, 0.4, ...]],
  "labels": [0, 1],
  "epochs": 10
}

# Predict with the neuro-symbolic network
POST /neuro/predict
{
  "input_data": [[0.1, 0.2, ...], [0.3, 0.4, ...]]
}

# Guide proof search
POST /neuro/guide
{
  "universe_id": 1,
  "axiom_ids": [1, 2, 3]
}

# Run Grover’s search
POST /quantum/grover
{
  "database_size": 16,
  "target_idx": 5,
  "iterations": 3
}

# Run cross-universe analysis
POST /analysis/cross_universe
{
  "universe_ids": [1, 2, 3]
}

# Get graph data for a universe
GET /visualization/universe/1

# Get graph data for all universes
GET /visualization/universes

# Get a universe summary
GET /query/universe_summary/1

# Get axiom usage
GET /query/axiom_usage/2
```
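The same calls can be made from Python using only the standard library. This client sketch is hypothetical: the base URL, the `X-API-Key` header name, and the helper names are assumptions for illustration, not part of the backend's documented interface (see `backend/api/auth.py` for the actual authentication scheme):

```python
import json
import urllib.request

BASE_URL = "http://localhost:8000"   # assumed default uvicorn address
API_KEY = "your-api-key"             # assumed header-based auth, see backend/api/auth.py

def build_universe_payload(name, description, universe_type, axioms):
    """Assemble the JSON body for POST /universes."""
    return {
        "name": name,
        "description": description,
        "universe_type": universe_type,
        "axioms": axioms,
    }

def create_universe(payload):
    """POST the payload to /universes and return the decoded JSON response."""
    req = urllib.request.Request(
        f"{BASE_URL}/universes",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json", "X-API-Key": API_KEY},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

With the server running, `create_universe(build_universe_payload("Group Theory", "Universe for group theory", "group_theory", ["Closure", "Associativity", "Identity", "Inverse"]))` mirrors the first request above.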
## Developer Guide
- All core logic is in `backend/core/`.
- Database models are in `backend/db/models.py`.
- API endpoints are in `backend/api/routes.py`.
- Cross-universe analysis logic is in `backend/core/cross_universe_analysis.py`.
- The analysis API endpoints are in `backend/api/analysis_routes.py`.
- Tests are in `backend/tests/`; analysis tests are in `backend/tests/test_analysis.py`.
- Environment variables are set in `.env`.
## Running & Testing
1. Install dependencies: `pip install -r requirements.txt`
2. Start the server: `uvicorn backend.app:app --reload`
3. Run the tests: `pytest backend/tests/`
## Deployment & Maintenance
### Docker
Build and run the backend in a container:
```sh
docker build -t projectv1-backend .
docker run -p 8000:8000 --env-file backend/.env projectv1-backend
```
### CI/CD
A GitHub Actions workflow in `.github/workflows/ci.yml` runs the test suite on every push and pull request.
### Maintenance
- Monitor logs and error reports for failures and performance regressions.
- Regularly update dependencies and apply security patches.
- Scale with Docker and orchestration tools as needed.
## Contributing
- Follow the existing code style and add tests for new features.
---
## Production Monitoring & Logging
- **Sentry Integration:**
  - Sentry is integrated for error monitoring. To enable it, set the `SENTRY_DSN` environment variable in your `.env` file.
  - Install Sentry with `pip install sentry-sdk` (already included in requirements).
  - Adjust `traces_sample_rate` in `backend/core/logging_config.py` for your needs.
- **Prometheus/Grafana:**
  - For advanced metrics, consider adding [Prometheus FastAPI Instrumentator](https://github.com/trallard/fastapi_prometheus) and exporting metrics to Grafana.
  - Example: `pip install prometheus-fastapi-instrumentator`
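The two pieces fit together roughly as follows. This is a standalone configuration sketch, not the project's actual wiring (the real Sentry setup lives in `backend/core/logging_config.py`), and the `0.2` sample rate is illustrative:

```python
import os

import sentry_sdk
from fastapi import FastAPI
from prometheus_fastapi_instrumentator import Instrumentator

# Error monitoring: sentry_sdk.init is a no-op when SENTRY_DSN is unset
sentry_sdk.init(dsn=os.getenv("SENTRY_DSN"), traces_sample_rate=0.2)

app = FastAPI()

# Metrics: instrument all routes and serve Prometheus metrics at /metrics
Instrumentator().instrument(app).expose(app)
```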
## Database Optimization
- All major foreign keys and frequently queried fields are indexed (see `backend/db/models.py`).
- For large-scale deployments, consider query profiling and further index tuning based on real-world usage.
## Security Best Practices
- API key authentication is required for all endpoints (see `backend/api/auth.py`).
- Store secrets in `.env` and never commit them to version control.
- Regularly update dependencies for security patches.
- Use HTTPS in production.
- Limit database and API access by IP/firewall as needed.
## Troubleshooting
- **Common Issues:**
  - *Database connection errors*: Check your DB URL and credentials in `.env`.
  - *Missing dependencies*: Run `pip install -r requirements.txt`.
  - *Sentry not reporting*: Ensure `SENTRY_DSN` is set and `sentry-sdk` is installed.
  - *API key errors*: Make sure your request includes the correct API key header.
- **Logs:**
  - All errors and important events are logged. Check your server logs for details.
## External Resources
- [FastAPI Documentation](https://fastapi.tiangolo.com/)
- [SQLAlchemy Documentation](https://docs.sqlalchemy.org/)
- [Sentry for Python](https://docs.sentry.io/platforms/python/)
- [Prometheus FastAPI Instrumentator](https://github.com/trallard/fastapi_prometheus)
- [PyTorch](https://pytorch.org/)
- [SymPy](https://www.sympy.org/)
- [Docker](https://docs.docker.com/)
- [GitHub Actions](https://docs.github.com/en/actions)