# Project V1 Backend

## Overview

This backend implements the Genesis Engine for mathematical universe simulation: axiom evolution, theorem derivation, and persistent history tracking. It is built with FastAPI, SQLAlchemy, and SymPy.

## Core Logic & Algorithms
- AlphaGeometry, Lean 4, and Coq Integration:
  - The backend can now run external proof engines for advanced theorem proving and verification.
  - Adapters: `backend/core/alphageometry_adapter.py`, `lean_adapter.py`, `coq_adapter.py`.
  - Usage example:

    ```python
    from backend.core.alphageometry_adapter import run_alphageometry

    output = run_alphageometry("path/to/input_file")
    ```

  - `run_lean4` and `run_coq` are used the same way.
  - Make sure the external tools are downloaded and their paths are correct (see the adapters for details).
- Universe Generation: Create universes with custom types and axioms.
- Axiom Evolution: Add, evolve, and track axioms with lineage and versioning.
- Theorem Derivation: Use symbolic logic (SymPy) to derive theorems from axioms and store proofs.
- History Tracking: All changes to universes and axioms are versioned and timestamped.
- Neuro-Symbolic Network: Train and use a neural network (PyTorch) to guide proof search and theory growth.
- Quantum-Inspired Algorithms: Classical simulation of Grover’s search and other quantum algorithms for proof exploration.
- Cross-Universe Analysis: Compare multiple universes to find shared axioms, theorems, and patterns. Results are stored in the database.
- Advanced analysis algorithms can be added to detect invariants, patterns, and relationships across universes. Results are queryable via the API and stored for further research.
- 3D Visualization & Query Interface: Backend endpoints provide graph data for universes, axioms, and theorems to support interactive frontend visualization.
- Query Engine: API endpoints answer complex mathematical questions and generate universe/theorem summaries for research and visualization.
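To make the quantum-inspired feature concrete, here is a minimal classical simulation of Grover's search using the same parameters as the `POST /quantum/grover` endpoint (`database_size`, `target_idx`, `iterations`). This is an illustrative sketch; the backend's actual implementation may differ:

```python
import math

def grover_search(database_size: int, target_idx: int, iterations: int):
    """Classically simulate Grover's search over a flat database."""
    # Start in a uniform superposition over all indices
    amp = [1 / math.sqrt(database_size)] * database_size
    for _ in range(iterations):
        # Oracle step: flip the sign of the target amplitude
        amp[target_idx] = -amp[target_idx]
        # Diffusion step: invert every amplitude about the mean
        mean = sum(amp) / database_size
        amp = [2 * mean - a for a in amp]
    # Measurement probabilities (amplitude squared)
    return [a * a for a in amp]

probs = grover_search(database_size=16, target_idx=5, iterations=3)
# After about (pi/4) * sqrt(16) = 3 iterations, index 5 dominates
```

With 16 entries, roughly `(pi/4) * sqrt(16) = 3` iterations concentrate nearly all the probability on the target index, which is why the example request below uses `"iterations": 3`.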
## API Endpoints

- `POST /universes` — Create a new universe (with optional axioms and type)
- `GET /universes` — List all universes
- `GET /universes/{universe_id}/history` — Retrieve universe history and axiom lineage
- `GET /axioms/{universe_id}` — List axioms for a universe
- `POST /axioms` — Add a new axiom
- `POST /axioms/evolve` — Evolve an axiom (with lineage)
- `POST /theorems/derive` — Derive a theorem from axioms
- `GET /theorems/{universe_id}` — List theorems for a universe
- `POST /neuro/train` — Train the neuro-symbolic network
- `POST /neuro/predict` — Predict with the neuro-symbolic network
- `POST /neuro/guide` — Guide proof search using the neuro-symbolic network
- `POST /quantum/grover` — Run Grover's search algorithm simulation
- `POST /analysis/cross_universe` — Run cross-universe analysis and retrieve shared axioms/theorems
- `GET /visualization/universe/{universe_id}` — Get graph data for a single universe
- `GET /visualization/universes` — Get graph data for all universes
- `GET /query/universe_summary/{universe_id}` — Get a summary of a universe (axioms, theorems, counts)
- `GET /query/axiom_usage/{axiom_id}` — Get usage of an axiom in theorems
## Usage Example

```
# Create a universe
POST /universes
{
  "name": "Group Theory",
  "description": "Universe for group theory",
  "universe_type": "group_theory",
  "axioms": ["Closure", "Associativity", "Identity", "Inverse"]
}

# Add an axiom
POST /axioms
{
  "universe_id": 1,
  "statement": "Commutativity"
}

# Evolve an axiom
POST /axioms/evolve
{
  "axiom_id": 2,
  "new_statement": "Commutativity (strong)"
}

# Derive a theorem
POST /theorems/derive
{
  "universe_id": 1,
  "axiom_ids": [1, 2],
  "statement": "Closure Commutativity"
}

# Train the neuro-symbolic network
POST /neuro/train
{
  "training_data": [[0.1, 0.2, ...], [0.3, 0.4, ...]],
  "labels": [0, 1],
  "epochs": 10
}

# Predict with the neuro-symbolic network
POST /neuro/predict
{
  "input_data": [[0.1, 0.2, ...], [0.3, 0.4, ...]]
}

# Guide proof search
POST /neuro/guide
{
  "universe_id": 1,
  "axiom_ids": [1, 2, 3]
}

# Run Grover's search
POST /quantum/grover
{
  "database_size": 16,
  "target_idx": 5,
  "iterations": 3
}

# Run cross-universe analysis
POST /analysis/cross_universe
{
  "universe_ids": [1, 2, 3]
}

# Get graph data for a universe
GET /visualization/universe/1

# Get graph data for all universes
GET /visualization/universes

# Get a universe summary
GET /query/universe_summary/1

# Get axiom usage
GET /query/axiom_usage/2
```
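The same calls can be made from Python's standard library. A minimal sketch of building an authenticated request (the `X-API-Key` header name is an assumption here; check `backend/api/auth.py` for the header the backend actually expects):

```python
import json
from urllib import request

BASE_URL = "http://localhost:8000"
API_KEY = "your-api-key"  # hypothetical value; see backend/api/auth.py

def build_request(method, path, body=None):
    """Build an authenticated JSON request for the backend API."""
    data = json.dumps(body).encode() if body is not None else None
    return request.Request(
        BASE_URL + path,
        data=data,
        method=method,
        headers={"Content-Type": "application/json", "X-API-Key": API_KEY},
    )

req = build_request("POST", "/universes", {
    "name": "Group Theory",
    "description": "Universe for group theory",
    "universe_type": "group_theory",
    "axioms": ["Closure", "Associativity", "Identity", "Inverse"],
})
# Send with request.urlopen(req) once the server is running
```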
## Developer Guide

- All core logic is in `backend/core/`.
- Database models are in `backend/db/models.py`.
- API endpoints are in `backend/api/routes.py`.
- Cross-universe analysis logic is in `backend/core/cross_universe_analysis.py`.
- The API endpoint for analysis is in `backend/api/analysis_routes.py`.
- Tests are in `backend/tests/`.
- Tests for analysis are in `backend/tests/test_analysis.py`.
- Environment variables are set in `.env`.
## Running & Testing

- Install dependencies: `pip install -r requirements.txt`
- Start the server: `uvicorn backend.app:app --reload`
- Run tests: `pytest backend/tests/`
## Deployment & Maintenance

### Docker

Build and run the backend in a container:

```bash
docker build -t projectv1-backend .
docker run -p 8000:8000 --env-file backend/.env projectv1-backend
```
### CI/CD

A GitHub Actions workflow in `.github/workflows/ci.yml` runs the test suite on every push and pull request.
### Maintenance

- Monitor logs and error rates for performance and stability issues.
- Regularly update dependencies and apply security patches.
- Scale with Docker and orchestration tools as needed.
## Contributing

- Follow the existing code style and add tests for new features.
## Production Monitoring & Logging

Sentry Integration:

- Sentry is integrated for error monitoring. To enable it, set the `SENTRY_DSN` environment variable in your `.env` file.
- Install Sentry with `pip install sentry-sdk` (already included in `requirements.txt`).
- Adjust `traces_sample_rate` in `backend/core/logging_config.py` for your needs.
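For reference, a typical `sentry_sdk` initialization looks like the following. This is a sketch of the kind of setup `backend/core/logging_config.py` likely contains, not a copy of it:

```python
import os

import sentry_sdk

# DSN comes from the SENTRY_DSN variable in .env; with no DSN set,
# sentry_sdk.init is a no-op and nothing is reported.
sentry_sdk.init(
    dsn=os.getenv("SENTRY_DSN"),
    traces_sample_rate=0.2,  # fraction of transactions traced; tune per environment
)
```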
Prometheus/Grafana:

- For advanced metrics, consider adding the Prometheus FastAPI Instrumentator and exporting metrics to Grafana.
- Example: `pip install prometheus-fastapi-instrumentator`
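If you add the instrumentator, wiring it into the FastAPI app is short. A sketch using the library's documented API (adapt it to wherever `app` is created, presumably `backend/app.py`):

```python
from fastapi import FastAPI
from prometheus_fastapi_instrumentator import Instrumentator

app = FastAPI()

# Collect default HTTP metrics (latency, request counts, status codes)
# and serve them at /metrics for Prometheus to scrape.
Instrumentator().instrument(app).expose(app)
```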
## Database Optimization

- All major foreign keys and frequently queried fields are indexed (see `backend/db/models.py`).
- For large-scale deployments, consider query profiling and further index tuning based on real-world usage.
## Security Best Practices

- API key authentication is required for all endpoints (see `backend/api/auth.py`).
- Store secrets in `.env` and never commit them to version control.
- Regularly update dependencies for security patches.
- Use HTTPS in production.
- Limit database and API access by IP/firewall as needed.
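When implementing or reviewing API key checks, compare keys in constant time. A small sketch of such a helper (hypothetical; not necessarily how `backend/api/auth.py` implements it):

```python
import hmac

def api_key_is_valid(provided: str, expected: str) -> bool:
    """Compare API keys in constant time to avoid timing side channels."""
    return hmac.compare_digest(provided.encode(), expected.encode())

# A route dependency would reject the request (401/403) when this returns False
assert api_key_is_valid("s3cret", "s3cret")
assert not api_key_is_valid("guess", "s3cret")
```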
## Troubleshooting

Common Issues:

- Database connection errors: check your DB URL and credentials in `.env`.
- Missing dependencies: run `pip install -r requirements.txt`.
- Sentry not reporting: ensure `SENTRY_DSN` is set and `sentry-sdk` is installed.
- API key errors: make sure your request includes the correct API key header.

Logs:

- All errors and important events are logged. Check your server logs for details.