---
title: Physics-Informed Bayesian Optimization
emoji: ⚗️
colorFrom: blue
colorTo: green
sdk: gradio
sdk_version: "5.23.0"
app_file: app.py
pinned: true
license: mit
tags:
- bayesian-optimization
- physics-informed
- experiment-design
- materials-science
- gaussian-process
---
# Physics-Informed Bayesian Optimization Platform (PIBO)
A platform for designing experiments using physics-informed surrogate models with Bayesian optimization. The core idea: **use physical models as structured priors for Gaussian Processes**, so the GP learns residuals between physics predictions and real observations, dramatically improving sample efficiency.
## Architecture
```
physics_informed_bo/
├── config.py                 # Configuration classes
├── models/                   # Surrogate models
│   ├── base.py               # Abstract base class
│   ├── physics_model.py      # Physics model wrapper + GPyTorch mean function
│   ├── gp_model.py           # Standard GP & Physics-Informed GP
│   ├── hybrid_model.py       # Hybrid surrogate (physics + GP)
│   └── multi_fidelity.py     # Multi-fidelity model (physics=low, data=high)
├── priors/                   # Prior management
│   ├── data_prior.py         # Initial data management
│   ├── physics_prior.py      # Physics model + constraints
│   └── prior_manager.py      # Orchestrates prior combination
├── optimizers/               # Optimizer backends
│   ├── base_optimizer.py     # Abstract optimizer
│   ├── botorch_optimizer.py  # BoTorch backend (primary)
│   ├── ax_optimizer.py       # AX Platform backend
│   ├── bofire_optimizer.py   # BoFire backend
│   └── factory.py            # Optimizer factory
├── experiment/               # Experiment design
│   ├── parameter_space.py    # Parameter space definitions
│   ├── designer.py           # Main experiment designer API
│   └── campaign.py           # Full campaign management
├── utils/                    # Utilities
│   ├── visualization.py      # Plotting functions
│   └── diagnostics.py        # Model diagnostics
└── examples/                 # Usage examples
    ├── minimal_example.py    # Quick start (~30 lines)
    ├── polymer_optimization.py  # Full polymer design example
    └── multi_fidelity_example.py
```
## Core Concept
Traditional BO uses a GP with a constant or zero mean function. This platform replaces that with a **physics model as the GP mean function**:
```
f(x) = physics_model(x) + GP_residual(x)
```
Here `GP_residual ~ GP(0, k(x, x'))` only needs to learn the discrepancy between the physics prediction and reality. Benefits:
1. **Sample efficiency**: Physics captures the trend, GP only needs to learn deviations
2. **Extrapolation**: Physics model provides reasonable predictions outside observed data
3. **Constraint awareness**: Physical constraints are naturally incorporated
4. **Graceful degradation**: System works with physics-only (no data), hybrid, or GP-only modes
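The residual formulation can be sketched in a few lines of plain PyTorch, without any of this library's classes. With a non-zero prior mean `m`, the GP posterior mean is `m(x*) + k*ᵀ(K + σ²I)⁻¹(y − m(X))`, so when the observations agree with the physics the learned correction vanishes. The `physics_mean` toy model and all numbers below are illustrative assumptions, not part of the package:

```python
import torch

def physics_mean(X):
    # Hypothetical physics model: Arrhenius-like trend (illustrative only)
    return torch.exp(-5.0 / X[:, 0]) * X[:, 1] ** 0.5

def rbf_kernel(A, B, lengthscale=1.0):
    # Squared-exponential kernel between two sets of points
    return torch.exp(-0.5 * torch.cdist(A, B) ** 2 / lengthscale**2)

def posterior_mean(X_train, y_train, X_test, noise=1e-4):
    # GP posterior mean with a physics mean function: the GP models only
    # the residual y - physics_mean(X); the physics prediction is added back.
    K = rbf_kernel(X_train, X_train) + noise * torch.eye(len(X_train))
    k_star = rbf_kernel(X_test, X_train)
    resid = y_train - physics_mean(X_train)
    alpha = torch.linalg.solve(K, resid)
    return physics_mean(X_test) + k_star @ alpha

# When the data agree with the physics exactly, the residual correction is
# zero and the posterior reduces to the physics model everywhere.
X_train = torch.tensor([[1.0, 4.0], [2.0, 9.0]])
y_train = physics_mean(X_train)
X_test = torch.tensor([[1.5, 6.0]])
pred = posterior_mean(X_train, y_train, X_test)
print(torch.allclose(pred, physics_mean(X_test), atol=1e-3))  # → True
```

This is also why extrapolation degrades gracefully: far from the data, `k_star` shrinks toward zero and predictions revert to the physics model rather than to a constant.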
## Quick Start
```python
import torch
from physics_informed_bo import ExperimentDesigner, ParameterSpace

# Define your physics model
def my_physics_model(X):
    temp, pressure = X[:, 0], X[:, 1]
    return torch.exp(-5000 / temp) * pressure**0.5

# Define parameter space
space = ParameterSpace()
space.add_continuous("temperature", 300, 600, units="K")
space.add_continuous("pressure", 1, 50, units="bar")

# Create designer with physics + initial data
designer = ExperimentDesigner(
    parameter_space=space,
    physics_fn=my_physics_model,
    initial_data=(X_init, y_init),  # Your initial experiments
)

# Get next experiment suggestions
next_experiments = designer.suggest(n=3)

# After running experiments, update
designer.update(X_new, y_new)
```
## Full Campaign Example
```python
from physics_informed_bo import OptimizationCampaign, ParameterSpace
from physics_informed_bo.config import OptimizationConfig, AcquisitionType

config = OptimizationConfig(
    acquisition_type=AcquisitionType.PHYSICS_INFORMED_EI,
    max_iterations=30,
)

campaign = OptimizationCampaign(
    name="my_experiment",
    parameter_space=space,
    physics_fn=my_physics_model,
    initial_data=(X_init, y_init),
    config=config,
)

# Automated loop
results = campaign.run_automated(objective_fn=run_experiment)

# Or human-in-the-loop
suggestion = campaign.suggest_next()
# ... run experiment manually ...
campaign.report_result(suggestion[0], measured_value)
```
## Surrogate Model Modes
The platform automatically selects the best mode based on available information:
| Data Available | Physics Model | Mode Selected | Description |
|---|---|---|---|
| None | Yes | `physics_only` | Pure physics predictions |
| < 20 points | Yes | `physics_as_mean` | Physics as GP mean, GP learns residual |
| 20-50 points | Yes | `weighted_ensemble` | Adaptive weighting of physics + GP |
| Any | No | `gp_only` | Standard GP (data-driven only) |
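The table's selection rule can be written as a small dispatch function. This is a hypothetical sketch (`select_mode` is not a documented API), and the table does not specify what happens with more than 50 points and a physics model; the sketch assumes the adaptive weighting simply continues:

```python
def select_mode(n_points: int, has_physics: bool) -> str:
    """Pick a surrogate mode from data count and physics availability."""
    if not has_physics:
        return "gp_only"          # data-driven only, any amount of data
    if n_points == 0:
        return "physics_only"     # no data yet: pure physics predictions
    if n_points < 20:
        return "physics_as_mean"  # physics as GP mean, GP learns residual
    # 20+ points: adaptive weighting of physics and GP predictions
    # (assumption: the table only covers 20-50, extended here to >50)
    return "weighted_ensemble"

print(select_mode(0, True))    # → physics_only
print(select_mode(10, True))   # → physics_as_mean
print(select_mode(35, True))   # → weighted_ensemble
print(select_mode(35, False))  # → gp_only
```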
## Optimizer Backends
### BoTorch (Default)
- Full BoTorch acquisition function suite (EI, UCB, KG, NEI)
- Custom `PhysicsInformedEI` that penalizes physically implausible regions
- Batch optimization support
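One plausible reading of "penalizes physically implausible regions" is to scale analytic expected improvement by a plausibility weight in [0, 1] derived from the physics constraints. The sketch below is an assumption about the mechanism, not the library's actual `PhysicsInformedEI`, and uses plain torch rather than BoTorch's acquisition classes:

```python
import torch
from torch.distributions import Normal

def physics_informed_ei(mu, sigma, best_f, plausibility):
    """Analytic EI (maximization) scaled by a plausibility weight in [0, 1]."""
    std_normal = Normal(0.0, 1.0)
    z = (mu - best_f) / sigma
    ei = (mu - best_f) * std_normal.cdf(z) + sigma * torch.exp(std_normal.log_prob(z))
    # Candidates the physics model deems implausible (weight near 0) are
    # suppressed regardless of how promising the GP posterior looks.
    return ei * plausibility

mu = torch.tensor([1.0, 1.0])
sigma = torch.tensor([0.5, 0.5])
plaus = torch.tensor([1.0, 0.0])  # second candidate violates the physics
acq = physics_informed_ei(mu, sigma, best_f=0.0, plausibility=plaus)
print(acq[1].item())  # → 0.0: fully implausible candidate gets zero acquisition
```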
### AX Platform
- Structured experiment management
- Human-in-the-loop support
- Trial tracking and analysis
### BoFire
- Chemistry/materials-focused features
- Mixture constraints (sum-to-one)
- Multi-objective optimization
- Categorical and molecular parameters
## Physics Constraints
```python
from physics_informed_bo.priors import PhysicsPrior

physics = PhysicsPrior(physics_fn=my_model)

# Add thermodynamic constraint
physics.add_constraint(
    name="gibbs_feasibility",
    constraint_fn=lambda X: compute_gibbs(X),  # <= 0 is feasible
    constraint_type="inequality",
)

# Add mass balance constraint
physics.add_constraint(
    name="mass_balance",
    constraint_fn=lambda X: X.sum(dim=-1) - 1.0,  # == 0
    constraint_type="equality",
)
```
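Checking a candidate against constraints of this form reduces to `g(X) <= tol` for inequalities and `|h(X)| <= tol` for equalities. The helper below is an illustrative sketch, not a `PhysicsPrior` method:

```python
import torch

def is_feasible(X, inequality_fns=(), equality_fns=(), tol=1e-6):
    # A candidate is feasible when every inequality constraint evaluates
    # to <= tol and every equality constraint is within +/- tol of zero.
    ok = torch.ones(X.shape[0], dtype=torch.bool)
    for g in inequality_fns:
        ok &= g(X) <= tol
    for h in equality_fns:
        ok &= h(X).abs() <= tol
    return ok

# Mixture example: component fractions must sum to one (mass balance above)
X = torch.tensor([[0.3, 0.7], [0.6, 0.6]])
mask = is_feasible(X, equality_fns=[lambda X: X.sum(dim=-1) - 1.0])
# mask[0] is True (sums to 1.0), mask[1] is False (sums to 1.2)
```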
## Installation
```bash
pip install torch gpytorch botorch numpy pandas matplotlib
# Optional backends
pip install ax-platform # For AX
pip install bofire # For BoFire
```
## Key Dependencies
- **PyTorch**: Tensor computation and autograd
- **GPyTorch**: Gaussian Process models
- **BoTorch**: Bayesian optimization acquisition functions
- **AX Platform** (optional): Experiment management
- **BoFire** (optional): Chemistry-focused BO