diff --git a/README.md b/README.md
index 88a15139f664e5fe56fe67037c9cca1a701559bb..fbf376fd1382a1b6d6e6d0b9af1d214def5d5880 100644
--- a/README.md
+++ b/README.md
@@ -1 +1,159 @@
-See examples in ideal_poly_volume_toolkit/examples.
+# Ideal Polyhedra Volume Toolkit
+
+A Python toolkit for computing and optimizing volumes of ideal hyperbolic polyhedra using Delaunay triangulation, hull projection, and fast/exact Lobachevsky functions.
+
+## Installation
+
+Install the package in development mode:
+
+```bash
+pip install -e .
+```
+
+Dependencies: `numpy`, `scipy`, `mpmath`, `torch`
+
+## Project Structure
+
+```
+ideal_poly_volume_toolkit/
+├── ideal_poly_volume_toolkit/   # Core package
+│   ├── geometry.py              # Core geometry and volume computation functions
+│   ├── visualization.py         # 3D visualization utilities
+│   ├── rivin_holonomy.py        # Penner-Rivin holonomy computation
+│   ├── pointset_to_fuchsian.py  # Full Fuchsian group pipeline
+│   └── __init__.py
+│
+├── bin/                         # Command-line tools and GUI
+│   ├── gui.py                   # 🎨 Interactive Gradio web interface
+│   ├── optimize_polyhedron.py   # General optimization wrapper
+│   ├── analyze_distribution.py  # Distribution analysis wrapper
+│   └── README.md
+│
+├── examples/                    # Organized example scripts
+│   ├── distributions/           # Distribution analysis examples
+│   │   ├── tetrahedron/         # Tetrahedron volume distributions
+│   │   ├── five_vertex/         # 5-vertex polyhedra distributions
+│   │   ├── six_vertex/          # 6-vertex polyhedra distributions
+│   │   └── euclidean/           # Euclidean tetrahedra analysis
+│   ├── optimization/            # Optimization examples by vertex count
+│   │   ├── 7vertex/             # 7-vertex optimization (octahedron variants)
+│   │   ├── 12vertex/            # 12-vertex optimization
+│   │   ├── 20vertex/            # 20-vertex optimization (icosahedron)
+│   │   └── platonic/            # Platonic solid analysis
+│   ├── visualization/           # Visualization scripts
+│   └── analysis/                # Statistical and theoretical analysis
+│
+├── scripts/                     # Active research/development scripts
+├── results/                     # Output files
+│   ├── data/                    # JSON result files
+│   ├── plots/                   # PNG visualization outputs
+│   └── logs/                    # Optimization logs
+└── docs/                        # Documentation
+    ├── RESULTS_SUMMARY.md
+    └── PLATONIC_MAXIMALITY_RESULTS.md
+```
+
+## Core Functionality
+
+### `ideal_poly_volume_toolkit.geometry` - Volume Computation
+
+- **Stereographic projection**: `lift_to_sphere_with_inf()`, `inverse_stereographic_from_sphere_pts()`
+- **Triangulation**: `delaunay_triangulation_indices()`, `hull_tris_projected_back()`
+- **Lobachevsky function**: `lob_fast()` (PyTorch autodiff), `lob_exact()` (mpmath high-precision)
+- **Volume computation**:
+  - `triangle_volume_from_points()` - Single triangle volume
+  - `ideal_poly_volume_via_delaunay()` - Full polyhedron via Delaunay
+  - `ideal_poly_volume_via_hull_project_back()` - Full polyhedron via convex hull
+
+### `ideal_poly_volume_toolkit.rivin_holonomy` - Penner-Rivin Algorithm
+
+- **Holonomy computation**: `generators_from_triangulation()` - Compute Fuchsian group generators
+- **Arithmeticity testing**: Check if polyhedra have arithmetic holonomy (traces in number fields)
+- **Triangulation structures**: `Triangulation` class for managing ideal triangulations
+
+### `ideal_poly_volume_toolkit.pointset_to_fuchsian` - Full Pipeline
+
+- **Group computation**: `group_from_pointset()` - Convert point sets to Fuchsian groups
+- **Trace field analysis**: `invariant_trace_field_signature()` - Analyze arithmetic properties
+- **Visualization**: `render_snapshot()` - High-quality rendering with iridescence and transparency
+- **Mesh export**: `hull_to_mesh()`, `export_mesh_obj()` - Export to OBJ format
+
+## Quick Start
+
+### 🎨 Interactive GUI (Easiest)
+
+The fastest way to get started is with the Gradio web interface:
+
+```bash
+python bin/gui.py
+```
+
+Then open your browser to `http://127.0.0.1:7860`.
+
+**Features:**
+- Interactive optimization with real-time progress
+- Distribution analysis with automatic plotting
+- 3D visualization in sphere and Poincaré ball models
+- No need to remember command-line arguments!
+
+### Command-Line Tools
+
+For scripting and batch processing, use the wrapper scripts in `bin/`:
+
+```bash
+# Optimize a 7-vertex polyhedron (10 trials)
+python bin/optimize_polyhedron.py --vertices 7 --trials 10
+
+# Analyze volume distribution for tetrahedra
+python bin/analyze_distribution.py --vertices 4 --samples 10000
+
+# Get help on any tool
+python bin/optimize_polyhedron.py --help
+```
+
+See `bin/README.md` for detailed usage and examples.
+
+### Computing a volume (Python API)
+
+```python
+import numpy as np
+from ideal_poly_volume_toolkit.geometry import ideal_poly_volume_via_delaunay
+
+# Define vertices in the complex plane (stereographic projection)
+vertices = np.array([0.0+0.0j, 1.0+0.0j, 0.5+0.866j])
+volume = ideal_poly_volume_via_delaunay(vertices)
+print(f"Volume: {volume}")
+```
+
+### Running examples
+
+Examples are organized by topic. For instance:
+
+```bash
+# 7-vertex optimization
+cd examples/optimization/7vertex
+python optimize_7vertex.py
+
+# Tetrahedron distribution analysis
+cd examples/distributions/tetrahedron
+python tetrahedron_volume_distribution.py
+
+# Visualization
+cd examples/visualization
+python visualize_golden_config.py
+```
+
+## Key Examples
+
+- **7-vertex optimization**: Testing the hypothesis that the maximum volume is an octahedron with one stellated face
+- **20-vertex optimization**: Finding maximal volume configurations for icosahedron-like polyhedra
+- **Distribution analysis**: Statistical analysis of volume distributions for various polyhedra
+- **Platonic solids**: Analysis of regular polyhedra and their perturbations
+
+## Research Results
+
+See `docs/RESULTS_SUMMARY.md` and `docs/PLATONIC_MAXIMALITY_RESULTS.md` for detailed findings.
+
+## License
+
+See LICENSE file.
diff --git a/bin/README.md b/bin/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..2baccc56b9bf680e14f63576000d71db1af51018
--- /dev/null
+++ b/bin/README.md
@@ -0,0 +1,153 @@
+# Command-Line Tools
+
+This directory contains general-purpose wrapper scripts and a GUI for common tasks.
+
+## 🎨 Interactive GUI
+
+### `gui.py`
+
+**Launch the interactive Gradio web interface:**
+
+```bash
+# Local only (default)
+python bin/gui.py
+
+# Create shareable public link
+python bin/gui.py --share
+
+# Custom port
+python bin/gui.py --port 8080
+
+# Get help
+python bin/gui.py --help
+```
+
+The GUI will open in your browser at `http://127.0.0.1:7860` (or your custom port).
+
+**Features:**
+- **Optimization Tab**: Find maximal volume configurations with adjustable parameters
+  - Number of vertices, trials, iterations
+  - Real-time progress tracking
+  - Automatic result saving
+
+- **Distribution Analysis Tab**: Sample random configurations and analyze statistics
+  - Configurable sample size
+  - Automatic histogram generation
+  - Statistics: mean, median, quartiles, std dev
+
+- **3D Visualization Tab**: Interactive polyhedron visualization
+  - Delaunay triangulation (2D complex plane)
+  - Sphere projection (3D)
+  - Poincaré ball model (3D hyperbolic geometry)
+  - Adjustable subdivision for smooth surfaces
+  - Load results directly from optimization
+
+- **About Tab**: Documentation and usage tips
+
+**Advantages:**
+- No need to remember command-line arguments
+- Visual feedback and progress bars
+- Interactive 3D plots you can rotate and zoom
+- Seamless workflow: optimize → visualize
+- Beginner-friendly interface
+
+---
+
+## Available Tools
+
+### `optimize_polyhedron.py`
+
+General-purpose optimization wrapper for finding maximal volume configurations.
+
+**Usage:**
+```bash
+# Optimize a 7-vertex polyhedron with 10 trials
+python bin/optimize_polyhedron.py --vertices 7 --trials 10
+
+# Optimize 12-vertex with more iterations
+python bin/optimize_polyhedron.py --vertices 12 --trials 20 --maxiter 300
+
+# Custom output location
+python bin/optimize_polyhedron.py --vertices 20 --trials 5 --output my_results.json
+```
+
+**Arguments:**
+- `--vertices, -v`: Number of vertices (required, must be >= 4)
+- `--trials, -t`: Number of optimization trials (default: 10)
+- `--maxiter, -m`: Max iterations per trial (default: 200)
+- `--popsize, -p`: Population size for differential evolution (default: 15)
+- `--output, -o`: Output JSON file (default: auto-generated in results/data/)
+- `--seed, -s`: Random seed base (default: 42)
+
+**Output:**
+Saves a JSON file with:
+- Best configuration found (volume, parameters, vertex coordinates)
+- Combinatorial structure (faces, edges, Euler characteristic)
+- All trial results
+
+---
+
+### `analyze_distribution.py`
+
+Analyze volume distributions by sampling random polyhedra configurations.
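Random configurations are drawn by sampling uniform points on the unit sphere and mapping them to the complex plane by stereographic projection from the north pole, as the script's `sample_random_vertex` helper (shown later in this diff) does. A minimal standalone sketch of that sampling step, with a round-trip check back onto the sphere:

```python
import numpy as np

rng = np.random.default_rng(42)

def sample_random_vertex():
    """Uniform point on the unit sphere (normalized Gaussian vector),
    stereographically projected from the north pole:
    (x, y, z) -> (x + iy) / (1 - z).
    Points too close to the pole would map near infinity, so they
    are rejected (returning None), mirroring the script's helper."""
    vec = rng.standard_normal(3)
    vec /= np.linalg.norm(vec)
    x, y, z = vec
    if z > 0.999:
        return None
    return complex(x / (1 - z), y / (1 - z))

# Draw one sample (retrying the rare near-pole rejection) and verify
# that inverting the projection lands back on the unit sphere.
w = None
while w is None:
    w = sample_random_vertex()

r2 = abs(w) ** 2
sphere_point = np.array([2 * w.real, 2 * w.imag, r2 - 1]) / (r2 + 1)
print(np.linalg.norm(sphere_point))  # ≈ 1.0
```

Because the Gaussian direction is uniform on the sphere, the induced distribution on the plane is the pushforward of the uniform measure under stereographic projection, which is what makes the sampled polyhedra "uniformly random" in the ideal sense.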
+ +**Usage:** +```bash +# Analyze 10,000 random tetrahedra +python bin/analyze_distribution.py --vertices 4 --samples 10000 + +# Analyze with custom output +python bin/analyze_distribution.py --vertices 6 --samples 50000 --output my_plot.png + +# Include reference volume and save data +python bin/analyze_distribution.py --vertices 5 --samples 20000 --reference 3.66 --data distribution_data.json +``` + +**Arguments:** +- `--vertices, -v`: Number of vertices (required, must be >= 3) +- `--samples, -n`: Number of random samples (default: 10000) +- `--seed, -s`: Random seed (default: 42) +- `--output, -o`: Output plot file (default: auto-generated in results/plots/) +- `--data, -d`: Output data JSON file (optional) +- `--reference, -r`: Reference volume to mark on plot (optional) +- `--series-terms`: Number of series terms for Lobachevsky function (default: 96) + +**Output:** +- PNG plot with histogram and box plot +- Optional JSON file with statistics and all volume samples + +--- + +## Examples + +### Quick Optimization Run +```bash +# Find best 7-vertex configuration with 5 quick trials +python bin/optimize_polyhedron.py -v 7 -t 5 --maxiter 100 +``` + +### Distribution Analysis Pipeline +```bash +# Analyze distribution, save data, and use max volume as reference +python bin/analyze_distribution.py -v 5 -n 10000 --data dist_data.json +``` + +### Reproduce Research Results +```bash +# Optimize different vertex counts systematically +for n in 5 6 7 8; do + python bin/optimize_polyhedron.py -v $n -t 20 -m 300 +done +``` + +--- + +## Output Conventions + +All outputs are saved to standardized locations: + +- **Optimization results**: `results/data/{n}vertex_optimization_TIMESTAMP.json` +- **Distribution plots**: `results/plots/{n}vertex_distribution_TIMESTAMP.png` +- **Distribution data**: `results/data/{n}vertex_distribution_TIMESTAMP.json` + +Timestamps are in format: `YYYYMMDD_HHMMSS` diff --git a/bin/analyze_distribution.py b/bin/analyze_distribution.py new file mode 
100755 index 0000000000000000000000000000000000000000..04a0e6eafd4493dcb593bd80aef0eefc56bc0595 --- /dev/null +++ b/bin/analyze_distribution.py @@ -0,0 +1,298 @@ +#!/usr/bin/env python3 +""" +General-purpose wrapper for analyzing volume distributions of ideal polyhedra. + +Usage: + python bin/analyze_distribution.py --vertices 4 --samples 10000 + python bin/analyze_distribution.py --vertices 6 --samples 50000 --output custom_plot.png +""" + +import argparse +import json +import numpy as np +import matplotlib.pyplot as plt +from datetime import datetime +from pathlib import Path +import sys + +from ideal_poly_volume_toolkit.geometry import ( + delaunay_triangulation_indices, + ideal_poly_volume_via_delaunay, +) + + +def sample_random_vertex(): + """ + Sample a uniform random point on the unit sphere and project to complex plane. + Uses stereographic projection from north pole. + """ + # Sample uniform point on sphere using Gaussian method + vec = np.random.randn(3) + vec = vec / np.linalg.norm(vec) + x, y, z = vec + + # Skip near north pole (maps to infinity) + if z > 0.999: + return None + + # Stereographic projection + w = complex(x/(1-z), y/(1-z)) + return w + + +def analyze_distribution(n_vertices, n_samples, seed=42, series_terms=96): + """ + Analyze volume distribution for n_vertices polyhedra. 
+ + Args: + n_vertices: Number of vertices (must be >= 3) + n_samples: Number of random configurations to sample + seed: Random seed + series_terms: Number of terms for Lobachevsky function approximation + + Returns: + dict with volumes and statistics + """ + np.random.seed(seed) + + # First 3 vertices are fixed to break symmetry + fixed_vertices = [complex(0, 0), complex(1, 0), complex(0, 1)] + n_random = n_vertices - 3 + + if n_random < 0: + raise ValueError("Need at least 3 vertices") + + volumes = [] + print(f"Sampling {n_samples} random {n_vertices}-vertex configurations...") + + for i in range(n_samples): + if (i + 1) % (n_samples // 10) == 0: + print(f" Progress: {i + 1}/{n_samples} ({100*(i+1)/n_samples:.1f}%)") + + # Build configuration + vertices = fixed_vertices.copy() + + # Add random vertices + for _ in range(n_random): + v = sample_random_vertex() + if v is None: + continue # Skip degenerate samples + + # Skip if too close to existing vertices + too_close = False + for existing in vertices: + if abs(v - existing) < 0.01: + too_close = True + break + if too_close: + continue + + vertices.append(v) + + # Only proceed if we have the right number of vertices + if len(vertices) != n_vertices: + continue + + # Compute volume + try: + vertices_np = np.array(vertices, dtype=np.complex128) + vol = ideal_poly_volume_via_delaunay( + vertices_np, mode='fast', series_terms=series_terms + ) + + # Sanity check + if vol > 0 and vol < 1000: + volumes.append(vol) + except: + pass # Skip invalid configurations + + volumes = np.array(volumes) + + if len(volumes) == 0: + raise ValueError("No valid configurations found!") + + print(f"\nSuccessfully analyzed {len(volumes)} valid configurations") + + return { + 'volumes': volumes, + 'n_samples_requested': n_samples, + 'n_valid': len(volumes), + 'mean': np.mean(volumes), + 'median': np.median(volumes), + 'std': np.std(volumes), + 'min': np.min(volumes), + 'max': np.max(volumes), + 'q25': np.percentile(volumes, 25), + 'q75': 
np.percentile(volumes, 75), + } + + +def plot_distribution(volumes, stats, n_vertices, output_file, reference_volume=None): + """Create histogram plot of volume distribution.""" + fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(14, 5)) + + # Histogram + ax1.hist(volumes, bins=50, density=True, alpha=0.7, + color='steelblue', edgecolor='black', linewidth=0.5) + ax1.axvline(stats['mean'], color='red', linestyle='--', linewidth=2, + label=f"Mean: {stats['mean']:.4f}") + ax1.axvline(stats['median'], color='green', linestyle='--', linewidth=2, + label=f"Median: {stats['median']:.4f}") + + if reference_volume is not None: + ax1.axvline(reference_volume, color='orange', linestyle='--', linewidth=2, + label=f"Reference: {reference_volume:.4f}") + + ax1.set_xlabel('Volume', fontsize=12) + ax1.set_ylabel('Density', fontsize=12) + ax1.set_title(f'{n_vertices}-Vertex Ideal Polyhedra Volume Distribution', fontsize=14) + ax1.legend(fontsize=10) + ax1.grid(True, alpha=0.3) + + # Box plot + ax2.boxplot([volumes], vert=True, patch_artist=True, + boxprops=dict(facecolor='lightblue', alpha=0.7), + medianprops=dict(color='red', linewidth=2), + flierprops=dict(marker='o', markerfacecolor='gray', markersize=4, alpha=0.5)) + ax2.set_ylabel('Volume', fontsize=12) + ax2.set_title('Volume Distribution (Box Plot)', fontsize=14) + ax2.set_xticklabels([f'{n_vertices} vertices']) + ax2.grid(True, alpha=0.3, axis='y') + + plt.tight_layout() + plt.savefig(output_file, dpi=150, bbox_inches='tight') + print(f"Plot saved to: {output_file}") + plt.close() + + +def main(): + parser = argparse.ArgumentParser( + description='Analyze volume distributions of ideal polyhedra', + formatter_class=argparse.RawDescriptionHelpFormatter, + epilog=""" +Examples: + %(prog)s --vertices 4 --samples 10000 + %(prog)s --vertices 6 --samples 50000 --output my_analysis.png + %(prog)s --vertices 5 --samples 20000 --reference 3.66 + """ + ) + + parser.add_argument('--vertices', '-v', type=int, required=True, + help='Number 
of vertices (must be >= 3)') + parser.add_argument('--samples', '-n', type=int, default=10000, + help='Number of random samples (default: 10000)') + parser.add_argument('--seed', '-s', type=int, default=42, + help='Random seed (default: 42)') + parser.add_argument('--output', '-o', type=str, default=None, + help='Output plot file (default: results/plots/{n}vertex_distribution_TIMESTAMP.png)') + parser.add_argument('--data', '-d', type=str, default=None, + help='Output data JSON file (optional)') + parser.add_argument('--reference', '-r', type=float, default=None, + help='Reference volume to mark on plot (optional)') + parser.add_argument('--series-terms', type=int, default=96, + help='Number of series terms for Lobachevsky function (default: 96)') + + args = parser.parse_args() + + if args.vertices < 3: + print("Error: Number of vertices must be at least 3") + sys.exit(1) + + # Setup output files + timestamp = datetime.now().strftime("%Y%m%d_%H%M%S") + + if args.output is None: + plot_file = f"results/plots/{args.vertices}vertex_distribution_{timestamp}.png" + else: + plot_file = args.output + + if args.data is not None: + data_file = args.data + else: + data_file = f"results/data/{args.vertices}vertex_distribution_{timestamp}.json" + + # Ensure output directories exist + Path(plot_file).parent.mkdir(parents=True, exist_ok=True) + if args.data is not None: + Path(data_file).parent.mkdir(parents=True, exist_ok=True) + + print("=" * 70) + print("Ideal Polyhedron Volume Distribution Analysis") + print("=" * 70) + print(f"Vertices: {args.vertices}") + print(f"Samples: {args.samples}") + print(f"Random seed: {args.seed}") + print(f"Plot output: {plot_file}") + if args.data: + print(f"Data output: {data_file}") + print("=" * 70) + print() + + # Run analysis + results = analyze_distribution( + args.vertices, + args.samples, + seed=args.seed, + series_terms=args.series_terms + ) + + # Print statistics + print("\n" + "=" * 70) + print("STATISTICS:") + print("=" * 70) + 
print(f"Valid configs: {results['n_valid']:,} / {results['n_samples_requested']:,}") + print(f"Mean volume: {results['mean']:.8f}") + print(f"Median volume: {results['median']:.8f}") + print(f"Std deviation: {results['std']:.8f}") + print(f"Min volume: {results['min']:.8f}") + print(f"Max volume: {results['max']:.8f}") + print(f"25th percentile: {results['q25']:.8f}") + print(f"75th percentile: {results['q75']:.8f}") + + if args.reference is not None: + print(f"\nReference volume: {args.reference:.8f}") + print(f"Mean/Reference: {results['mean']/args.reference:.4f}") + print(f"Max/Reference: {results['max']/args.reference:.4f}") + + # Create plot + plot_distribution( + results['volumes'], + results, + args.vertices, + plot_file, + reference_volume=args.reference + ) + + # Save data if requested + if args.data is not None: + output_data = { + 'metadata': { + 'timestamp': datetime.now().isoformat(), + 'n_vertices': args.vertices, + 'n_samples_requested': args.samples, + 'n_valid': results['n_valid'], + 'seed': args.seed, + 'series_terms': args.series_terms, + }, + 'statistics': { + 'mean': float(results['mean']), + 'median': float(results['median']), + 'std': float(results['std']), + 'min': float(results['min']), + 'max': float(results['max']), + 'q25': float(results['q25']), + 'q75': float(results['q75']), + }, + 'volumes': results['volumes'].tolist(), + } + + with open(data_file, 'w') as f: + json.dump(output_data, f, indent=2) + + print(f"\nData saved to: {data_file}") + + print("=" * 70) + + +if __name__ == '__main__': + main() diff --git a/bin/gui.py b/bin/gui.py new file mode 100755 index 0000000000000000000000000000000000000000..b350d2ef86949be077cfb0d90d1c99be961b08ba --- /dev/null +++ b/bin/gui.py @@ -0,0 +1,893 @@ +#!/usr/bin/env python3 +""" +Gradio GUI for Ideal Polyhedron Volume Toolkit + +Interactive interface for: +- Optimizing polyhedra +- Analyzing distributions +- 3D visualization in sphere and Poincaré ball models + +Usage: + python bin/gui.py # 
Local only (127.0.0.1:7860) + python bin/gui.py --share # Create shareable public link + python bin/gui.py --port 8080 # Custom port +""" + +import gradio as gr +import numpy as np +import json +import io +import matplotlib.pyplot as plt +import argparse +from datetime import datetime +from pathlib import Path +from PIL import Image + +from ideal_poly_volume_toolkit.geometry import ( + delaunay_triangulation_indices, + triangle_volume_from_points_torch, + ideal_poly_volume_via_delaunay, +) +from ideal_poly_volume_toolkit.visualization import ( + plot_polyhedron_klein, + plot_polyhedron_poincare, + plot_delaunay_2d, + create_polyhedron_mesh, +) +from ideal_poly_volume_toolkit.rivin_holonomy import ( + Triangulation, + generators_from_triangulation, +) +from ideal_poly_volume_toolkit.symmetry import ( + compute_symmetry_group, + format_symmetry_report, +) + +import torch +from scipy.optimize import differential_evolution + +# Note: GPU is slower than CPU for this problem due to small tensor sizes +# and transfer overhead, so we use CPU explicitly +DEVICE = torch.device('cpu') + + +# ============================================================================ +# Optimization Functions +# ============================================================================ + +def spherical_to_complex(theta, phi): + """Convert spherical coordinates to complex via stereographic projection.""" + return np.tan(theta/2) * np.exp(1j * phi) + + +def compute_volume(params, n_vertices): + """Compute volume for a polyhedron with n_vertices. + + Performance optimizations: + - Reduced series_terms to 64 (good balance of speed/accuracy) + - Single torch tensor conversion + - Parallel evaluation via differential_evolution workers + + Args: + params: Optimization parameters (theta, phi pairs) + n_vertices: Total number of vertices + + Returns: + Negative volume (for minimization) + + Note: + The polishing step (L-BFGS-B) uses finite differences for gradients. 
+ PyTorch autodiff could be used but the Delaunay triangulation is + not differentiable, making gradients unreliable near topology changes. + """ + complex_points = [complex(0, 0), complex(1, 0), complex(0, 1)] + n_params = n_vertices - 3 + + for i in range(n_params): + theta = params[2*i] + phi = params[2*i + 1] + z = spherical_to_complex(theta, phi) + complex_points.append(z) + + Z_np = np.array(complex_points, dtype=np.complex128) + + try: + idx = delaunay_triangulation_indices(Z_np) + except: + return 1000.0 + + # Single torch conversion (CPU is faster than GPU for small tensors) + Z_torch = torch.tensor(Z_np, dtype=torch.complex128, device=DEVICE) + + total_volume = 0 + for (i, j, k) in idx: + try: + vol = triangle_volume_from_points_torch( + Z_torch[i], Z_torch[j], Z_torch[k], series_terms=64 + ) + total_volume += vol.item() + except: + return 1000.0 + + return -total_volume + + +def run_optimization(n_vertices, n_trials, max_iter, pop_size, seed, progress=gr.Progress()): + """Run optimization with progress tracking. + + Uses parallel workers (all CPU cores) for faster optimization. 
+ """ + import os + from functools import partial + + # Validate and convert inputs + try: + n_vertices = int(n_vertices) + if n_vertices < 4: + return "Error: Number of vertices must be at least 4", None + if n_vertices > 100: + return "Error: Number of vertices limited to 100 for practical computation time", None + except (ValueError, TypeError): + return "Error: Number of vertices must be an integer", None + + n_cpus = os.cpu_count() + progress(0, desc=f"Starting optimization (using {n_cpus} CPU cores)...") + + n_free_vertices = n_vertices - 3 + n_params = n_free_vertices * 2 + bounds = [(0.1, np.pi - 0.1), (0, 2*np.pi)] * n_free_vertices + + # Adaptive settings for large vertex counts + # For high dimensions, reduce popsize to avoid excessive evaluations + adaptive_popsize = min(pop_size, max(10, 15 - (n_vertices - 7) // 3)) + + best_volume = 0 + best_params = None + all_volumes = [] + + # Create picklable objective function using partial + # (lambdas can't be pickled for multiprocessing) + objective_func = partial(compute_volume, n_vertices=n_vertices) + + for trial in range(n_trials): + progress((trial + 1) / n_trials, desc=f"Trial {trial + 1}/{n_trials}") + + result = differential_evolution( + objective_func, + bounds, + maxiter=max_iter, + popsize=adaptive_popsize, + seed=seed + trial, + polish=True, + disp=False, + workers=-1, # Use all CPU cores for parallel evaluation + updating='deferred' # Better for parallel workers + ) + + volume = -result.fun + all_volumes.append(volume) + + if volume > best_volume: + best_volume = volume + best_params = result.x + + # Reconstruct best configuration + complex_points = [complex(0, 0), complex(1, 0), complex(0, 1)] + for i in range(n_free_vertices): + theta = best_params[2*i] + phi = best_params[2*i + 1] + z = spherical_to_complex(theta, phi) + complex_points.append(z) + + Z_np = np.array(complex_points, dtype=np.complex128) + idx = delaunay_triangulation_indices(Z_np) + + # Compute statistics + stats = { + 
'n_vertices': n_vertices, + 'n_faces': len(idx), + 'best_volume': float(best_volume), + 'mean_volume': float(np.mean(all_volumes)), + 'std_volume': float(np.std(all_volumes)), + 'all_volumes': [float(v) for v in all_volumes], + } + + # Save result + timestamp = datetime.now().strftime("%Y%m%d_%H%M%S") + output_file = f"results/data/{n_vertices}vertex_optimization_{timestamp}.json" + Path(output_file).parent.mkdir(parents=True, exist_ok=True) + + with open(output_file, 'w') as f: + json.dump({ + 'metadata': {'timestamp': datetime.now().isoformat()}, + 'best': { + 'volume': stats['best_volume'], + 'params': best_params.tolist(), + 'vertices_real': Z_np.real.tolist(), + 'vertices_imag': Z_np.imag.tolist(), + }, + 'statistics': stats, + }, f, indent=2) + + # Create summary text + summary = f""" +## Optimization Results + +**Configuration:** +- Vertices: {n_vertices} +- Trials: {n_trials} +- Iterations per trial: {max_iter} + +**Best Result:** +- Volume: {best_volume:.8f} +- Faces: {len(idx)} + +**Statistics over all trials:** +- Mean: {stats['mean_volume']:.8f} +- Std Dev: {stats['std_volume']:.8f} +- Best/Mean: {best_volume/stats['mean_volume']:.4f} + +**Saved to:** `{output_file}` +""" + + # Return data as dict for state management + opt_data = { + 'vertices': Z_np, + 'triangulation': idx, + 'stats': stats + } + + return summary, opt_data + + +# ============================================================================ +# Distribution Analysis Functions +# ============================================================================ + +def analyze_volume_distribution(n_vertices, n_samples, seed, progress=gr.Progress()): + """Analyze volume distribution with progress tracking. + + Note: n_vertices is the number of random vertices to add. 
+ Total vertices = n_vertices + 3 fixed (0, 1, i) + 1 at infinity = n_vertices + 4 + """ + np.random.seed(seed) + + # Fixed vertices: 0, 1, i (infinity is implicit in the volume computation) + fixed_vertices = [complex(0, 0), complex(1, 0), complex(0, 1)] + n_random = n_vertices + total_vertices = n_vertices + 3 # Will have infinity implicitly + + volumes = [] + + def sample_random_vertex(): + vec = np.random.randn(3) + vec = vec / np.linalg.norm(vec) + x, y, z = vec + if z > 0.999: + return None + w = complex(x/(1-z), y/(1-z)) + return w + + for i in range(n_samples): + if (i + 1) % 100 == 0: + progress((i + 1) / n_samples, desc=f"Sampling: {i + 1}/{n_samples}") + + vertices = fixed_vertices.copy() + + for _ in range(n_random): + v = sample_random_vertex() + if v is None or any(abs(v - existing) < 0.01 for existing in vertices): + continue + vertices.append(v) + + if len(vertices) != total_vertices: + continue + + try: + vertices_np = np.array(vertices, dtype=np.complex128) + vol = ideal_poly_volume_via_delaunay(vertices_np, mode='fast', series_terms=96) + if 0 < vol < 1000: + volumes.append(vol) + except: + pass + + volumes = np.array(volumes) + + # Create histogram + fig, ax = plt.subplots(figsize=(10, 6)) + ax.hist(volumes, bins=50, density=True, alpha=0.7, color='steelblue', edgecolor='black') + ax.axvline(np.mean(volumes), color='red', linestyle='--', linewidth=2, + label=f'Mean: {np.mean(volumes):.4f}') + ax.axvline(np.median(volumes), color='green', linestyle='--', linewidth=2, + label=f'Median: {np.median(volumes):.4f}') + ax.set_xlabel('Volume', fontsize=12) + ax.set_ylabel('Density', fontsize=12) + ax.set_title(f'{total_vertices}-Vertex Volume Distribution ({len(volumes)} samples)', fontsize=14) + ax.legend() + ax.grid(True, alpha=0.3) + + # Save plot to BytesIO and convert to PIL Image for Gradio + buf = io.BytesIO() + plt.tight_layout() + plt.savefig(buf, format='png', dpi=150) + buf.seek(0) + plt.close() + + # Convert BytesIO to PIL Image for Gradio + 
img = Image.open(buf) + + # Statistics summary + summary = f""" +## Distribution Analysis Results + +**Configuration:** +- Random vertices: {n_random} +- Fixed vertices: 3 (at 0, 1, i) +- **Total vertices: {total_vertices}** (+ ∞) +- Samples requested: {n_samples} +- Valid samples: {len(volumes)} + +**Statistics:** +- Mean: {np.mean(volumes):.8f} +- Median: {np.median(volumes):.8f} +- Std Dev: {np.std(volumes):.8f} +- Min: {np.min(volumes):.8f} +- Max: {np.max(volumes):.8f} +- Q25: {np.percentile(volumes, 25):.8f} +- Q75: {np.percentile(volumes, 75):.8f} +""" + + return summary, img + + +# ============================================================================ +# Visualization Functions +# ============================================================================ + +def visualize_configuration(vertices_real, vertices_imag, vis_type, subdivisions): + """Create visualization based on user selection.""" + if not vertices_real or not vertices_imag: + return None, "Please provide vertices" + + try: + # Parse vertices + real_parts = [float(x.strip()) for x in vertices_real.split(',')] + imag_parts = [float(x.strip()) for x in vertices_imag.split(',')] + + if len(real_parts) != len(imag_parts): + return None, "Real and imaginary parts must have same length" + + vertices = np.array([complex(r, i) for r, i in zip(real_parts, imag_parts)]) + + if vis_type == "Delaunay (2D)": + idx = delaunay_triangulation_indices(vertices) + fig = plot_delaunay_2d(vertices, idx) + return fig, f"Showing Delaunay triangulation with {len(idx)} faces" + + elif vis_type == "Klein Ball (3D)": + fig = plot_polyhedron_klein(vertices, subdivisions=subdivisions) + return fig, f"Showing Klein model (subdivision level: {subdivisions})" + + elif vis_type == "Poincaré Ball (3D)": + fig = plot_polyhedron_poincare(vertices, subdivisions=subdivisions) + return fig, f"Showing Poincaré ball model (subdivision level: {subdivisions})" + + except Exception as e: + return None, f"Error: {str(e)}" + + +def 
load_from_optimization(opt_data): + """Load vertices from optimization results.""" + if opt_data is None: + return "", "" + + # opt_data is a dict with vertices + vertices = opt_data.get('vertices', None) + if vertices is None: + return "", "" + + real_str = ", ".join(f"{v.real:.6f}" for v in vertices) + imag_str = ", ".join(f"{v.imag:.6f}" for v in vertices) + + return real_str, imag_str + + +# ============================================================================ +# Holonomy/Arithmeticity Functions +# ============================================================================ + +def compute_holonomy_analysis(vertices_real, vertices_imag, progress=gr.Progress()): + """Compute holonomy generators and check arithmeticity.""" + if not vertices_real or not vertices_imag: + return "Please provide vertices", None + + try: + # Parse vertices + real_parts = [float(x.strip()) for x in vertices_real.split(',')] + imag_parts = [float(x.strip()) for x in vertices_imag.split(',')] + + if len(real_parts) != len(imag_parts): + return "Real and imaginary parts must have same length", None + + vertices = np.array([complex(r, i) for r, i in zip(real_parts, imag_parts)]) + + progress(0.2, desc="Computing Delaunay triangulation...") + + # Get triangulation + idx = delaunay_triangulation_indices(vertices) + F = len(idx) + + progress(0.4, desc="Building adjacency structure...") + + # Build adjacency structure + adjacency = {} + edge_id_map = {} + edge_id = 0 + + for i, tri_i in enumerate(idx): + for side_i in range(3): + v1_i, v2_i = tri_i[side_i], tri_i[(side_i + 1) % 3] + edge = tuple(sorted([v1_i, v2_i])) + + # Find matching triangle + for j, tri_j in enumerate(idx): + if i == j: + continue + for side_j in range(3): + v1_j, v2_j = tri_j[side_j], tri_j[(side_j + 1) % 3] + if set([v1_j, v2_j]) == set([v1_i, v2_i]): + if (i, side_i) not in adjacency: + if edge not in edge_id_map: + edge_id_map[edge] = edge_id + edge_id += 1 + adjacency[(i, side_i)] = (j, side_j, 
edge_id_map[edge]) + + progress(0.6, desc="Computing holonomy generators...") + + # Define order and orientation + order = {t: [0, 1, 2] for t in range(F)} + orientation = {} + for edge, eid in edge_id_map.items(): + for (t, s), (u, su, e) in adjacency.items(): + if e == eid: + orientation[eid] = ((t, s), (u, su)) + break + + # Create triangulation + T = Triangulation(F, adjacency, order, orientation) + + # Zero shears for ideal polyhedra + Z = {eid: 0.0 for eid in range(edge_id)} + + # Compute generators + gens = generators_from_triangulation(T, Z, root=0) + + progress(0.8, desc="Analyzing traces...") + + # Analyze traces + trace_analysis = [] + traces = [] + integral_count = 0 + + for i, (u, v, tokens, M) in enumerate(gens): + trace = M[0][0] + M[1][1] + traces.append(trace) + + nearest_int = round(trace) + distance = abs(trace - nearest_int) + is_close = distance < 0.01 + + if is_close: + integral_count += 1 + + trace_analysis.append({ + 'generator': i, + 'edge': (u, v), + 'trace': float(trace), + 'nearest_int': int(nearest_int), + 'distance': float(distance), + 'is_close': is_close + }) + + progress(1.0, desc="Complete!") + + # Create summary + summary = f""" +## Holonomy Analysis Results + +**Configuration:** +- Vertices: {len(vertices)} +- Triangular faces: {F} +- Number of generators: {len(gens)} + +**Arithmeticity Test:** +- Generators with integral traces: {integral_count}/{len(gens)} +- Percentage: {100*integral_count/len(gens):.1f}% + +**Interpretation:** +""" + if integral_count == len(gens): + summary += "✅ **ALL TRACES ARE INTEGERS!**\n\n" + summary += "This polyhedron is **ARITHMETIC** - it has deep number-theoretic structure!\n" + summary += "The holonomy lies in PSL(2,O_K) for some number field K." 
+ elif integral_count > len(gens) * 0.7: + summary += "⚠️ **MOST TRACES ARE CLOSE TO INTEGERS**\n\n" + summary += f"This suggests possible arithmetic structure with {integral_count}/{len(gens)} integral traces.\n" + summary += "May be commensurable with an arithmetic group." + else: + summary += "❌ **NOT ARITHMETIC**\n\n" + summary += "Only a few traces are close to integers. This is likely a generic configuration." + + summary += "\n\n## Trace Details:\n\n" + summary += "| Generator | Edge | Trace | Nearest Int | Distance | Status |\n" + summary += "|-----------|------|-------|-------------|----------|--------|\n" + + for ta in trace_analysis: + status = "✅ INTEGRAL" if ta['is_close'] else "❌" + summary += f"| {ta['generator']} | {ta['edge'][0]}-{ta['edge'][1]} | " + summary += f"{ta['trace']:.6f} | {ta['nearest_int']} | " + summary += f"{ta['distance']:.6f} | {status} |\n" + + # Create plot + fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(12, 5)) + + # Plot traces + gen_nums = [ta['generator'] for ta in trace_analysis] + trace_vals = [ta['trace'] for ta in trace_analysis] + colors = ['green' if ta['is_close'] else 'red' for ta in trace_analysis] + + ax1.bar(gen_nums, trace_vals, color=colors, alpha=0.7, edgecolor='black') + ax1.axhline(y=0, color='k', linestyle='-', linewidth=0.5) + ax1.set_xlabel('Generator', fontsize=12) + ax1.set_ylabel('Trace', fontsize=12) + ax1.set_title('Holonomy Generator Traces', fontsize=14) + ax1.grid(True, alpha=0.3) + + # Plot distances from integers + distances = [ta['distance'] for ta in trace_analysis] + ax2.bar(gen_nums, distances, color=colors, alpha=0.7, edgecolor='black') + ax2.axhline(y=0.01, color='orange', linestyle='--', linewidth=2, + label='Threshold (0.01)') + ax2.set_xlabel('Generator', fontsize=12) + ax2.set_ylabel('Distance from nearest integer', fontsize=12) + ax2.set_title('Integrality Test', fontsize=14) + ax2.legend() + ax2.grid(True, alpha=0.3) + ax2.set_yscale('log') + + plt.tight_layout() + + # Save plot to 
BytesIO and convert to PIL Image for Gradio + buf = io.BytesIO() + plt.savefig(buf, format='png', dpi=150) + buf.seek(0) + plt.close() + + # Convert BytesIO to PIL Image for Gradio + img = Image.open(buf) + + return summary, img + + except Exception as e: + return f"Error: {str(e)}", None + + +def compute_symmetry_analysis(vertices_real, vertices_imag): + """Compute symmetry group of the polyhedron.""" + if not vertices_real or not vertices_imag: + return "Please provide vertices" + + try: + # Parse vertices + real_parts = [float(x.strip()) for x in vertices_real.split(',')] + imag_parts = [float(x.strip()) for x in vertices_imag.split(',')] + + if len(real_parts) != len(imag_parts): + return "Real and imaginary parts must have same length" + + vertices = np.array([complex(r, i) for r, i in zip(real_parts, imag_parts)]) + + # Lift to 3D (Klein model in the ball) + from ideal_poly_volume_toolkit.visualization import lift_to_sphere_with_inf + vertices_3d = lift_to_sphere_with_inf(vertices) + + # Compute symmetry group + sym_info = compute_symmetry_group(vertices_3d) + + # Format report + report = format_symmetry_report(sym_info) + + return report + + except Exception as e: + return f"Error: {str(e)}" + + +# ============================================================================ +# Gradio Interface +# ============================================================================ + +def create_gui(): + """Create the main Gradio interface.""" + + with gr.Blocks(title="Ideal Polyhedron Volume Toolkit", theme=gr.themes.Soft()) as demo: + # Shared state for passing optimization results to visualization + opt_result_state = gr.State(None) + gr.Markdown(""" + # 🔺 Ideal Polyhedron Volume Toolkit + + Interactive GUI for computing and optimizing volumes of ideal hyperbolic polyhedra. 
+ """) + + with gr.Tabs(): + # ================================================================ + # Tab 1: Optimization + # ================================================================ + with gr.Tab("🎯 Optimization"): + gr.Markdown("Find maximal volume configurations for ideal polyhedra") + + with gr.Row(): + with gr.Column(): + opt_vertices = gr.Number(value=7, label="Number of Vertices", + minimum=4, maximum=100, + info="Recommended: 4-30 (higher is much slower)") + opt_trials = gr.Slider(1, 50, value=10, step=1, + label="Number of Trials") + opt_maxiter = gr.Slider(50, 500, value=150, step=50, + label="Max Iterations per Trial", + info="150-200 is usually sufficient") + opt_popsize = gr.Slider(10, 30, value=15, step=5, + label="Population Size") + opt_seed = gr.Number(value=42, label="Random Seed") + + opt_button = gr.Button("Run Optimization", variant="primary") + + with gr.Column(): + opt_output = gr.Markdown("Results will appear here...") + + opt_button.click( + run_optimization, + inputs=[opt_vertices, opt_trials, opt_maxiter, opt_popsize, opt_seed], + outputs=[opt_output, opt_result_state] + ) + + # ================================================================ + # Tab 2: Distribution Analysis + # ================================================================ + with gr.Tab("📊 Distribution Analysis"): + gr.Markdown(""" + Analyze volume distributions by sampling random configurations. + + **Note:** Vertices are added to fixed base (0, 1, i, ∞). + So 4 random vertices = 7 total vertices. 
+ """) + + with gr.Row(): + with gr.Column(): + dist_vertices = gr.Slider(1, 10, value=4, step=1, + label="Number of Random Vertices") + dist_samples = gr.Slider(100, 50000, value=5000, step=100, + label="Number of Samples") + dist_seed = gr.Number(value=42, label="Random Seed") + + dist_button = gr.Button("Analyze Distribution", variant="primary") + + with gr.Column(): + dist_output = gr.Markdown("Results will appear here...") + dist_plot = gr.Image(label="Distribution") + + dist_button.click( + analyze_volume_distribution, + inputs=[dist_vertices, dist_samples, dist_seed], + outputs=[dist_output, dist_plot] + ) + + # ================================================================ + # Tab 3: 3D Visualization + # ================================================================ + with gr.Tab("🔮 3D Visualization"): + gr.Markdown("Visualize polyhedra in different models") + + with gr.Row(): + with gr.Column(): + gr.Markdown("### Input Vertices") + gr.Markdown("Enter vertices as comma-separated values in the complex plane") + + vis_real = gr.Textbox( + label="Real parts", + value="0, 1, 0, 0.5", + placeholder="0, 1, 0.5, -0.5" + ) + vis_imag = gr.Textbox( + label="Imaginary parts", + value="0, 0, 1, 0.866", + placeholder="0, 0, 0.866, 0.866" + ) + + with gr.Row(): + load_opt_button = gr.Button("Load from Optimization", size="sm") + + gr.Markdown("### Visualization Options") + vis_type = gr.Radio( + ["Delaunay (2D)", "Klein Ball (3D)", "Poincaré Ball (3D)"], + value="Klein Ball (3D)", + label="Visualization Type" + ) + vis_subdivisions = gr.Slider(0, 5, value=3, step=1, + label="Subdivision Level (3D only)", + info="Higher = smoother curves (slower rendering)") + + vis_button = gr.Button("Generate Visualization", variant="primary") + + with gr.Column(): + vis_plot = gr.Plot(label="Visualization") + vis_status = gr.Textbox(label="Status", interactive=False) + + vis_button.click( + visualize_configuration, + inputs=[vis_real, vis_imag, vis_type, vis_subdivisions], + 
outputs=[vis_plot, vis_status] + ) + + load_opt_button.click( + load_from_optimization, + inputs=[opt_result_state], + outputs=[vis_real, vis_imag] + ) + + # ================================================================ + # Tab 4: Arithmeticity / Holonomy + # ================================================================ + with gr.Tab("🔬 Arithmeticity"): + gr.Markdown("Check if a polyhedron is arithmetic using Penner-Rivin holonomy") + + with gr.Row(): + with gr.Column(): + gr.Markdown("### Input Vertices") + arith_real = gr.Textbox( + label="Real parts", + value="0, 1, 0, 0.5", + placeholder="0, 1, 0.5, -0.5" + ) + arith_imag = gr.Textbox( + label="Imaginary parts", + value="0, 0, 1, 0.866", + placeholder="0, 0, 0.866, 0.866" + ) + + with gr.Row(): + load_opt_arith_button = gr.Button("Load from Optimization", size="sm") + + arith_button = gr.Button("Compute Holonomy & Check Arithmeticity", variant="primary") + symmetry_button = gr.Button("Compute Symmetry Group", variant="secondary") + + with gr.Column(): + arith_output = gr.Markdown("Results will appear here...") + arith_plot = gr.Image(label="Trace Analysis") + + arith_button.click( + compute_holonomy_analysis, + inputs=[arith_real, arith_imag], + outputs=[arith_output, arith_plot] + ) + + symmetry_button.click( + compute_symmetry_analysis, + inputs=[arith_real, arith_imag], + outputs=[arith_output] + ) + + load_opt_arith_button.click( + load_from_optimization, + inputs=[opt_result_state], + outputs=[arith_real, arith_imag] + ) + + gr.Markdown(""" + ### About Arithmeticity + + A hyperbolic 3-manifold is **arithmetic** if its holonomy representation lies in PSL(2, O_K) + where O_K is the ring of integers in a number field K. + + For ideal polyhedra, this can be tested by computing: + 1. Holonomy generators (Penner-Rivin algorithm) + 2. Traces of these generators + 3. 
Checking if traces are integers (or lie in a number field) + + **Arithmetic polyhedra have deep number-theoretic significance!** + + If all traces are integers, the configuration is arithmetic and related to + special lattices in hyperbolic space. + """) + + # ================================================================ + # Tab 5: About + # ================================================================ + with gr.Tab("ℹ️ About"): + gr.Markdown(""" + ## About This Tool + + This GUI provides an interactive interface to the **Ideal Polyhedron Volume Toolkit**. + + ### Features + + - **Optimization**: Find maximal volume configurations using differential evolution + - **Distribution Analysis**: Sample random configurations and analyze volume distributions + - **3D Visualization**: View polyhedra in multiple models: + - Delaunay triangulation in complex plane (2D) + - Stereographic projection on unit sphere (3D) + - Poincaré ball model (3D hyperbolic geometry) + - **Arithmeticity Testing**: Check if polyhedra have arithmetic holonomy (Penner-Rivin) + + ### Mathematical Background + + Ideal polyhedra are polyhedra in hyperbolic 3-space with all vertices at infinity. + Their volumes can be computed using: + - Delaunay triangulation of vertex positions + - Lobachevsky's formula for ideal tetrahedra + - Stereographic projection from the complex plane + + ### Usage Tips + + 1. Start with **Optimization** to find interesting configurations + 2. Use **Load from Optimization** in the Visualization tab to see results + 3. Adjust **subdivision level** for smoother 3D visualizations + 4. 
Compare sphere and Poincaré ball models to understand hyperbolic geometry + + ### Documentation + + - See `README.md` for installation and Python API + - See `bin/README.md` for command-line tools + - See `examples/` for research scripts + + --- + + **Version:** 0.3.0 + **License:** MIT + """) + + gr.Markdown("---") + gr.Markdown("*Ideal Polyhedron Volume Toolkit GUI*") + + return demo + + +if __name__ == "__main__": + parser = argparse.ArgumentParser( + description="Launch Gradio GUI for Ideal Polyhedron Volume Toolkit", + formatter_class=argparse.RawDescriptionHelpFormatter, + epilog=""" +Examples: + %(prog)s # Launch locally on 127.0.0.1:7860 + %(prog)s --share # Create shareable public link + %(prog)s --port 8080 # Use custom port + %(prog)s --share --port 8080 # Share with custom port + """ + ) + + parser.add_argument('--share', action='store_true', + help='Create a shareable public Gradio link') + parser.add_argument('--port', type=int, default=7860, + help='Port to run the server on (default: 7860)') + parser.add_argument('--server-name', type=str, default="127.0.0.1", + help='Server name/IP to bind to (default: 127.0.0.1)') + + args = parser.parse_args() + + demo = create_gui() + + print("=" * 70) + print("🎨 Ideal Polyhedron Volume Toolkit - GUI") + print("=" * 70) + if args.share: + print("Creating shareable public link...") + print("⚠️ WARNING: Public links expose your local server to the internet") + else: + print(f"Launching local server at http://{args.server_name}:{args.port}") + print("=" * 70) + + demo.launch( + share=args.share, + server_name=args.server_name, + server_port=args.port + ) diff --git a/bin/optimize_polyhedron.py b/bin/optimize_polyhedron.py new file mode 100755 index 0000000000000000000000000000000000000000..a0ada7dd9b82957830b2a18174ccbc4e3023ea98 --- /dev/null +++ b/bin/optimize_polyhedron.py @@ -0,0 +1,263 @@ +#!/usr/bin/env python3 +""" +General-purpose wrapper for optimizing ideal polyhedron volumes. 
+
+Usage:
+    python bin/optimize_polyhedron.py --vertices 7 --trials 10
+    python bin/optimize_polyhedron.py --vertices 12 --trials 20 --output custom_name.json
+"""
+
+import argparse
+import json
+import numpy as np
+import torch
+from datetime import datetime
+from pathlib import Path
+import sys
+
+from ideal_poly_volume_toolkit.geometry import (
+    delaunay_triangulation_indices,
+    triangle_volume_from_points_torch,
+)
+from scipy.optimize import differential_evolution
+
+
+def spherical_to_complex(theta, phi):
+    """Convert spherical coordinates to complex via stereographic projection."""
+    return np.tan(theta/2) * np.exp(1j * phi)
+
+
+def compute_volume(params, n_vertices):
+    """
+    Compute volume for a polyhedron with n_vertices.
+
+    First 3 vertices are fixed to break symmetry:
+    - z1 = 0
+    - z2 = 1
+    - z3 = i
+
+    Remaining (n_vertices - 3) vertices are parameterized by spherical coords.
+    """
+    # Fixed vertices
+    complex_points = [complex(0, 0), complex(1, 0), complex(0, 1)]
+
+    # Parameterized vertices (2 params each: theta, phi)
+    n_params = n_vertices - 3
+    for i in range(n_params):
+        theta = params[2*i]
+        phi = params[2*i + 1]
+        z = spherical_to_complex(theta, phi)
+        complex_points.append(z)
+
+    Z_np = np.array(complex_points, dtype=np.complex128)
+
+    # Delaunay triangulation
+    try:
+        idx = delaunay_triangulation_indices(Z_np)
+    except Exception:
+        return 1000.0  # Penalty for degenerate configuration
+
+    # Convert to torch for volume computation
+    Z_torch = torch.tensor(Z_np, dtype=torch.complex128)
+
+    # Compute total volume
+    total_volume = 0
+    for (i, j, k) in idx:
+        try:
+            vol = triangle_volume_from_points_torch(
+                Z_torch[i], Z_torch[j], Z_torch[k], series_terms=96
+            )
+            total_volume += vol.item()
+        except Exception:
+            return 1000.0  # Penalty for invalid configuration
+
+    return -total_volume  # Negative for minimization
+
+
+def analyze_structure(Z_np, idx):
+    """Analyze the combinatorial structure of the triangulation."""
+    n_vertices = len(Z_np)
+    n_faces = len(idx)
+
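The edge count and Euler characteristic that `analyze_structure` derives next can be sanity-checked on the smallest triangulated sphere, a tetrahedron — a minimal standalone sketch with no toolkit imports:

```python
from itertools import combinations

# Tetrahedron on vertices {0, 1, 2, 3}: every 3-subset of vertices is a face.
faces = list(combinations(range(4), 3))
edges = {frozenset(pair) for tri in faces for pair in combinations(tri, 2)}

V, E, F = 4, len(edges), len(faces)
print(V, E, F, V - E + F)  # 4 6 4 2 -- Euler characteristic 2, as for any sphere
```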
+ # Count edges + edges = set() + for (i, j, k) in idx: + edges.add((min(i,j), max(i,j))) + edges.add((min(i,k), max(i,k))) + edges.add((min(j,k), max(j,k))) + n_edges = len(edges) + + # Vertex degrees + vertex_degrees = [0] * n_vertices + for edge in edges: + vertex_degrees[edge[0]] += 1 + vertex_degrees[edge[1]] += 1 + + # Euler characteristic check + euler_char = n_vertices - n_edges + n_faces + + return { + 'n_vertices': n_vertices, + 'n_faces': n_faces, + 'n_edges': n_edges, + 'euler_characteristic': euler_char, + 'vertex_degrees': sorted(vertex_degrees), + } + + +def reconstruct_vertices(params, n_vertices): + """Reconstruct complex vertices from parameters.""" + complex_points = [complex(0, 0), complex(1, 0), complex(0, 1)] + + n_params = n_vertices - 3 + for i in range(n_params): + theta = params[2*i] + phi = params[2*i + 1] + z = spherical_to_complex(theta, phi) + complex_points.append(z) + + return np.array(complex_points, dtype=np.complex128) + + +def main(): + parser = argparse.ArgumentParser( + description='Optimize ideal polyhedron volumes', + formatter_class=argparse.RawDescriptionHelpFormatter, + epilog=""" +Examples: + %(prog)s --vertices 7 --trials 10 + %(prog)s --vertices 12 --trials 20 --maxiter 300 + %(prog)s --vertices 20 --trials 5 --output results/data/my_20vertex.json + """ + ) + + parser.add_argument('--vertices', '-v', type=int, required=True, + help='Number of vertices (must be >= 4)') + parser.add_argument('--trials', '-t', type=int, default=10, + help='Number of optimization trials (default: 10)') + parser.add_argument('--maxiter', '-m', type=int, default=200, + help='Max iterations per trial (default: 200)') + parser.add_argument('--popsize', '-p', type=int, default=15, + help='Population size for differential evolution (default: 15)') + parser.add_argument('--output', '-o', type=str, default=None, + help='Output JSON file (default: results/data/{n}vertex_optimization_TIMESTAMP.json)') + parser.add_argument('--seed', '-s', type=int, 
default=42, + help='Random seed base (default: 42)') + + args = parser.parse_args() + + if args.vertices < 4: + print("Error: Number of vertices must be at least 4") + sys.exit(1) + + # Setup output file + if args.output is None: + timestamp = datetime.now().strftime("%Y%m%d_%H%M%S") + output_file = f"results/data/{args.vertices}vertex_optimization_{timestamp}.json" + else: + output_file = args.output + + # Ensure output directory exists + Path(output_file).parent.mkdir(parents=True, exist_ok=True) + + # Calculate number of parameters (2 per free vertex) + n_free_vertices = args.vertices - 3 + n_params = n_free_vertices * 2 + bounds = [(0.1, np.pi - 0.1), (0, 2*np.pi)] * n_free_vertices + + print("=" * 70) + print(f"Ideal Polyhedron Volume Optimization") + print("=" * 70) + print(f"Vertices: {args.vertices}") + print(f"Free vertices: {n_free_vertices} (parameterized)") + print(f"Parameters: {n_params} (spherical coordinates)") + print(f"Trials: {args.trials}") + print(f"Max iterations: {args.maxiter}") + print(f"Population: {args.popsize}") + print(f"Output: {output_file}") + print("=" * 70) + + best_volume = 0 + best_params = None + all_results = [] + + for trial in range(args.trials): + print(f"\nTrial {trial + 1}/{args.trials}...") + + result = differential_evolution( + lambda p: compute_volume(p, args.vertices), + bounds, + maxiter=args.maxiter, + popsize=args.popsize, + seed=args.seed + trial, + polish=True, + disp=False + ) + + volume = -result.fun + print(f" Volume: {volume:.8f}") + print(f" Success: {result.success}") + print(f" Iterations: {result.nit}") + + all_results.append({ + 'trial': trial + 1, + 'volume': float(volume), + 'params': result.x.tolist(), + 'success': bool(result.success), + 'iterations': int(result.nit), + 'function_evals': int(result.nfev) + }) + + if volume > best_volume: + best_volume = volume + best_params = result.x + print(f" → NEW BEST!") + + # Analyze best configuration + print("\n" + "=" * 70) + print("BEST RESULT:") + 
print("=" * 70) + print(f"Volume: {best_volume:.10f}") + + # Reconstruct and analyze + Z_np = reconstruct_vertices(best_params, args.vertices) + idx = delaunay_triangulation_indices(Z_np) + structure = analyze_structure(Z_np, idx) + + print(f"\nCombinatorial structure:") + print(f" Vertices: {structure['n_vertices']}") + print(f" Edges: {structure['n_edges']}") + print(f" Faces: {structure['n_faces']}") + print(f" Euler char: {structure['euler_characteristic']} (should be 2 for sphere)") + print(f" Vertex degrees: {structure['vertex_degrees']}") + + # Save results + output_data = { + 'metadata': { + 'timestamp': datetime.now().isoformat(), + 'n_vertices': args.vertices, + 'n_trials': args.trials, + 'maxiter': args.maxiter, + 'popsize': args.popsize, + 'seed_base': args.seed, + }, + 'best': { + 'volume': float(best_volume), + 'params': best_params.tolist(), + 'vertices_real': Z_np.real.tolist(), + 'vertices_imag': Z_np.imag.tolist(), + 'structure': structure, + 'triangulation': [list(map(int, tri)) for tri in idx], + }, + 'all_trials': all_results, + } + + with open(output_file, 'w') as f: + json.dump(output_data, f, indent=2) + + print(f"\nResults saved to: {output_file}") + print("=" * 70) + + +if __name__ == '__main__': + main() diff --git a/bin/results/data/20vertex_optimization_20251026_203700.json b/bin/results/data/20vertex_optimization_20251026_203700.json new file mode 100644 index 0000000000000000000000000000000000000000..6bed76c8ac9714de29c2299762f6d8bf3a78f6ec --- /dev/null +++ b/bin/results/data/20vertex_optimization_20251026_203700.json @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:6d5c59c2eff34286466db53ecd01ec78225d18bf90c079c9e135f94943acc02e +size 2270 diff --git a/bin/results/data/7vertex_optimization_20251026_191114.json b/bin/results/data/7vertex_optimization_20251026_191114.json new file mode 100644 index 0000000000000000000000000000000000000000..6a90bfbeb4d9af64e47fade7bdf7179cc687f074 --- /dev/null +++ 
b/bin/results/data/7vertex_optimization_20251026_191114.json @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:95afdc4df0c3c803fd61f1e8aa8575595273114afa20650b3f135b837ea44eff +size 1141 diff --git a/bin/results/data/7vertex_optimization_20251026_193813.json b/bin/results/data/7vertex_optimization_20251026_193813.json new file mode 100644 index 0000000000000000000000000000000000000000..bd96a52f76c341dcd536f62c4d7f17534b5ad6ce --- /dev/null +++ b/bin/results/data/7vertex_optimization_20251026_193813.json @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:e689212237436e5f0e27cfed6a331d4f35c0acefc2a191c543801a9ea00a2f28 +size 1141 diff --git a/bin/results/data/7vertex_optimization_20251026_194915.json b/bin/results/data/7vertex_optimization_20251026_194915.json new file mode 100644 index 0000000000000000000000000000000000000000..35856822820b8b64c527a70b7042fc2c01f694bb --- /dev/null +++ b/bin/results/data/7vertex_optimization_20251026_194915.json @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:cabef18150170f696ba1de13d3a78c94846c8da6dc504f7b31e3fa5bedc919f1 +size 1141 diff --git a/bin/results/data/7vertex_optimization_20251026_195701.json b/bin/results/data/7vertex_optimization_20251026_195701.json new file mode 100644 index 0000000000000000000000000000000000000000..43608208720a7027885db7af212e79315fe3ecbd --- /dev/null +++ b/bin/results/data/7vertex_optimization_20251026_195701.json @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:6b10bdad76be2cae8ae2013d4fde01eb7469d09e98e0b92048287d11914854a7 +size 894 diff --git a/bin/results/data/7vertex_optimization_20251026_200737.json b/bin/results/data/7vertex_optimization_20251026_200737.json new file mode 100644 index 0000000000000000000000000000000000000000..ac8db4991ea27946907693918311453d07b4b32a --- /dev/null +++ b/bin/results/data/7vertex_optimization_20251026_200737.json @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 
+oid sha256:3595713fd3c48b86f2ea9c81c567d340f075f014ff27004ec16454f1e2850463 +size 894 diff --git a/bin/results/data/7vertex_optimization_20251026_201344.json b/bin/results/data/7vertex_optimization_20251026_201344.json new file mode 100644 index 0000000000000000000000000000000000000000..a8e97a6b49536b70e2999ef7311705e0895f5fd6 --- /dev/null +++ b/bin/results/data/7vertex_optimization_20251026_201344.json @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:c0dd44c390c87270a8449ce4fa672e5931975d7cc657c678de8a678eaa753041 +size 894 diff --git a/bin/results/data/7vertex_optimization_20251026_202024.json b/bin/results/data/7vertex_optimization_20251026_202024.json new file mode 100644 index 0000000000000000000000000000000000000000..4413b0ecfcb3c105c0a972a0fb5dad203111ea5e --- /dev/null +++ b/bin/results/data/7vertex_optimization_20251026_202024.json @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:1b508899ef59f5dde600783de57f0299f17b852d8dfae29deec2aa8f22c25943 +size 894 diff --git a/bin/results/data/7vertex_optimization_20251026_205947.json b/bin/results/data/7vertex_optimization_20251026_205947.json new file mode 100644 index 0000000000000000000000000000000000000000..8290f0dd559df8ea85bcc06cd9e95b0f2c9f9c1c --- /dev/null +++ b/bin/results/data/7vertex_optimization_20251026_205947.json @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:b19d5cb571808690d50c3eab47b7054b243250d5a6db1252d74df50d7318a938 +size 901 diff --git a/bin/results/data/9vertex_optimization_20251026_205959.json b/bin/results/data/9vertex_optimization_20251026_205959.json new file mode 100644 index 0000000000000000000000000000000000000000..0b8d0c36f5c336d7501855bc5e3809332b118298 --- /dev/null +++ b/bin/results/data/9vertex_optimization_20251026_205959.json @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:5daec075aa4a4fe8e0765f38daf454e16a6df26bdb092b0c6c56f40e7fddd204 +size 1109 diff --git 
a/bin/results/data/9vertex_optimization_20251026_210116.json b/bin/results/data/9vertex_optimization_20251026_210116.json new file mode 100644 index 0000000000000000000000000000000000000000..2660ef4c1eed37a5654ef749a6dd5b4a61bf5338 --- /dev/null +++ b/bin/results/data/9vertex_optimization_20251026_210116.json @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:0ee32067e12f06496d538600e41a2204da27e1f978596b0e05d6007a53a82f7f +size 1200 diff --git a/PLATONIC_MAXIMALITY_RESULTS.md b/docs/PLATONIC_MAXIMALITY_RESULTS.md similarity index 100% rename from PLATONIC_MAXIMALITY_RESULTS.md rename to docs/PLATONIC_MAXIMALITY_RESULTS.md diff --git a/RESULTS_SUMMARY.md b/docs/RESULTS_SUMMARY.md similarity index 100% rename from RESULTS_SUMMARY.md rename to docs/RESULTS_SUMMARY.md diff --git a/examples/README.md b/examples/README.md new file mode 100644 index 0000000000000000000000000000000000000000..0d570b07e49a5c61db84f7c04e61252ed38e50a6 --- /dev/null +++ b/examples/README.md @@ -0,0 +1,44 @@ +# Examples Directory + +This directory contains organized example scripts demonstrating the use of the ideal polyhedra volume toolkit. 
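Many of these examples bottom out in the Lobachevsky function Λ(θ) = ½ Σ_{n≥1} sin(2nθ)/n². As a package-free sanity check, the volume of the regular ideal tetrahedron, 3Λ(π/3) ≈ 1.0149416, can be recovered from a truncated series in plain NumPy (this mirrors the kind of series `lob_fast()`'s `series_terms` parameter suggests, though the toolkit's exact implementation may differ; the 5000-term cutoff is an arbitrary choice):

```python
import numpy as np

def lobachevsky(theta, terms=5000):
    """Partial sum of the Lobachevsky series (1/2) * sum_n sin(2n*theta)/n^2."""
    n = np.arange(1, terms + 1)
    return 0.5 * np.sum(np.sin(2 * n * theta) / n**2)

# Regular ideal tetrahedron: all six dihedral angles are pi/3.
vol = 3 * lobachevsky(np.pi / 3)
print(f"{vol:.7f}")  # ~1.0149416
```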
+ +## Directory Structure + +### `distributions/` +Analysis of volume distributions for various polyhedra: +- `tetrahedron/` - 4-vertex polyhedra volume distributions (10 files) +- `five_vertex/` - 5-vertex polyhedra distributions +- `six_vertex/` - 6-vertex polyhedra distributions +- `euclidean/` - Euclidean tetrahedra analysis and fitting + +### `optimization/` +Optimization scripts organized by vertex count: +- `7vertex/` - 7-vertex optimization (octahedron with stellated face hypothesis) - 9 scripts +- `12vertex/` - 12-vertex optimization - 3 scripts +- `20vertex/` - 20-vertex optimization (icosahedron-like) - 6 scripts +- `platonic/` - Platonic solid analysis and perturbations + +### `visualization/` +Visualization scripts (5 scripts): +- Sphere projection visualizations +- Golden ratio configurations +- Maximal volume configurations +- Volume landscapes + +### `analysis/` +Statistical and theoretical analysis scripts (14 scripts): +- Beta distribution theory +- Central limit theorem analysis +- Combinatorial mixture analysis +- Special configuration analysis + +## Running Examples + +All examples can be run from their respective directories: + +```bash +cd examples/optimization/7vertex +python optimize_7vertex.py +``` + +Or from the project root using absolute imports (package must be installed with `pip install -e .`). 
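The optimization examples write JSON files with the schema produced by `bin/optimize_polyhedron.py` (a `best` record holding `volume`, `vertices_real`, `vertices_imag`, among other fields). A minimal sketch of reloading one — the inline string and its values stand in for a real file under `results/data/`:

```python
import json
import numpy as np

# Inline stand-in for a results/data/*.json file written by optimize_polyhedron.py
raw = """{"best": {"volume": 3.5,
                   "vertices_real": [0.0, 1.0, 0.0, 0.5],
                   "vertices_imag": [0.0, 0.0, 1.0, 0.3]}}"""
best = json.loads(raw)["best"]

# Rebuild the complex vertex set (the vertex at infinity stays implicit)
vertices = np.asarray(best["vertices_real"]) + 1j * np.asarray(best["vertices_imag"])
print(len(vertices), best["volume"])
```

The real/imaginary lists can be pasted directly into the GUI's "Real parts" / "Imaginary parts" fields for visualization or arithmeticity testing.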
diff --git a/ideal_poly_volume_toolkit/examples/analytical_challenge_simple.py b/examples/analysis/analytical_challenge_simple.py similarity index 100% rename from ideal_poly_volume_toolkit/examples/analytical_challenge_simple.py rename to examples/analysis/analytical_challenge_simple.py diff --git a/ideal_poly_volume_toolkit/examples/analytical_mean_challenge.py b/examples/analysis/analytical_mean_challenge.py similarity index 100% rename from ideal_poly_volume_toolkit/examples/analytical_mean_challenge.py rename to examples/analysis/analytical_mean_challenge.py diff --git a/ideal_poly_volume_toolkit/examples/analyze_both_configs.py b/examples/analysis/analyze_both_configs.py similarity index 100% rename from ideal_poly_volume_toolkit/examples/analyze_both_configs.py rename to examples/analysis/analyze_both_configs.py diff --git a/ideal_poly_volume_toolkit/examples/analyze_distribution_shape.py b/examples/analysis/analyze_distribution_shape.py similarity index 100% rename from ideal_poly_volume_toolkit/examples/analyze_distribution_shape.py rename to examples/analysis/analyze_distribution_shape.py diff --git a/ideal_poly_volume_toolkit/examples/analyze_special_configs.py b/examples/analysis/analyze_special_configs.py similarity index 100% rename from ideal_poly_volume_toolkit/examples/analyze_special_configs.py rename to examples/analysis/analyze_special_configs.py diff --git a/ideal_poly_volume_toolkit/examples/beta_distribution_theory.py b/examples/analysis/beta_distribution_theory.py similarity index 100% rename from ideal_poly_volume_toolkit/examples/beta_distribution_theory.py rename to examples/analysis/beta_distribution_theory.py diff --git a/ideal_poly_volume_toolkit/examples/beta_fit_analysis.py b/examples/analysis/beta_fit_analysis.py similarity index 100% rename from ideal_poly_volume_toolkit/examples/beta_fit_analysis.py rename to examples/analysis/beta_fit_analysis.py diff --git a/ideal_poly_volume_toolkit/examples/check_lob_math.py 
b/examples/analysis/check_lob_math.py similarity index 100% rename from ideal_poly_volume_toolkit/examples/check_lob_math.py rename to examples/analysis/check_lob_math.py diff --git a/ideal_poly_volume_toolkit/examples/check_statistical_precision.py b/examples/analysis/check_statistical_precision.py similarity index 100% rename from ideal_poly_volume_toolkit/examples/check_statistical_precision.py rename to examples/analysis/check_statistical_precision.py diff --git a/ideal_poly_volume_toolkit/examples/clt_analysis.py b/examples/analysis/clt_analysis.py similarity index 100% rename from ideal_poly_volume_toolkit/examples/clt_analysis.py rename to examples/analysis/clt_analysis.py diff --git a/ideal_poly_volume_toolkit/examples/combinatorial_mixture_analysis.py b/examples/analysis/combinatorial_mixture_analysis.py similarity index 100% rename from ideal_poly_volume_toolkit/examples/combinatorial_mixture_analysis.py rename to examples/analysis/combinatorial_mixture_analysis.py diff --git a/ideal_poly_volume_toolkit/examples/concentration_location_analysis.py b/examples/analysis/concentration_location_analysis.py similarity index 100% rename from ideal_poly_volume_toolkit/examples/concentration_location_analysis.py rename to examples/analysis/concentration_location_analysis.py diff --git a/create_distribution_comparison_plot.py b/examples/analysis/create_distribution_comparison_plot.py similarity index 100% rename from create_distribution_comparison_plot.py rename to examples/analysis/create_distribution_comparison_plot.py diff --git a/ideal_poly_volume_toolkit/examples/sanity_check_5_7_vertices.py b/examples/analysis/sanity_check_5_7_vertices.py similarity index 100% rename from ideal_poly_volume_toolkit/examples/sanity_check_5_7_vertices.py rename to examples/analysis/sanity_check_5_7_vertices.py diff --git a/euclidean_distribution_fitting.py b/examples/distributions/euclidean/euclidean_distribution_fitting.py similarity index 100% rename from 
euclidean_distribution_fitting.py rename to examples/distributions/euclidean/euclidean_distribution_fitting.py diff --git a/euclidean_fit_analysis.py b/examples/distributions/euclidean/euclidean_fit_analysis.py similarity index 100% rename from euclidean_fit_analysis.py rename to examples/distributions/euclidean/euclidean_fit_analysis.py diff --git a/euclidean_tetrahedron_distribution.py b/examples/distributions/euclidean/euclidean_tetrahedron_distribution.py similarity index 100% rename from euclidean_tetrahedron_distribution.py rename to examples/distributions/euclidean/euclidean_tetrahedron_distribution.py diff --git a/ideal_poly_volume_toolkit/examples/five_vertex_distribution.py b/examples/distributions/five_vertex/five_vertex_distribution.py similarity index 100% rename from ideal_poly_volume_toolkit/examples/five_vertex_distribution.py rename to examples/distributions/five_vertex/five_vertex_distribution.py diff --git a/debug_six_vertex.py b/examples/distributions/six_vertex/debug_six_vertex.py similarity index 100% rename from debug_six_vertex.py rename to examples/distributions/six_vertex/debug_six_vertex.py diff --git a/ideal_poly_volume_toolkit/examples/six_vertex_distribution.py b/examples/distributions/six_vertex/six_vertex_distribution.py similarity index 100% rename from ideal_poly_volume_toolkit/examples/six_vertex_distribution.py rename to examples/distributions/six_vertex/six_vertex_distribution.py diff --git a/ideal_poly_volume_toolkit/examples/ideal_tetrahedron.py b/examples/distributions/tetrahedron/ideal_tetrahedron.py similarity index 100% rename from ideal_poly_volume_toolkit/examples/ideal_tetrahedron.py rename to examples/distributions/tetrahedron/ideal_tetrahedron.py diff --git a/ideal_poly_volume_toolkit/examples/quick_tetrahedron_analysis.py b/examples/distributions/tetrahedron/quick_tetrahedron_analysis.py similarity index 100% rename from ideal_poly_volume_toolkit/examples/quick_tetrahedron_analysis.py rename to 
examples/distributions/tetrahedron/quick_tetrahedron_analysis.py diff --git a/ideal_poly_volume_toolkit/examples/run_tetrahedron_distribution.py b/examples/distributions/tetrahedron/run_tetrahedron_distribution.py similarity index 100% rename from ideal_poly_volume_toolkit/examples/run_tetrahedron_distribution.py rename to examples/distributions/tetrahedron/run_tetrahedron_distribution.py diff --git a/ideal_poly_volume_toolkit/examples/tetrahedron_volume_distribution.py b/examples/distributions/tetrahedron/tetrahedron_volume_distribution.py similarity index 100% rename from ideal_poly_volume_toolkit/examples/tetrahedron_volume_distribution.py rename to examples/distributions/tetrahedron/tetrahedron_volume_distribution.py diff --git a/examples/distributions/tetrahedron/tetrahedron_volume_histogram.png b/examples/distributions/tetrahedron/tetrahedron_volume_histogram.png new file mode 100644 index 0000000000000000000000000000000000000000..aa8ff1e4b3a20a5c7a49e41dc59f6f1f9ff23d6e --- /dev/null +++ b/examples/distributions/tetrahedron/tetrahedron_volume_histogram.png @@ -0,0 +1,3 @@ +version https://git-lfs.github.com/spec/v1 +oid sha256:5239d843cc9362cd1e26f3f13b9ba878e8703c8db95dab17a13658edf9ebf88f +size 53905 diff --git a/ideal_poly_volume_toolkit/examples/analyze_12vertex_combinatorics.py b/examples/optimization/12vertex/analyze_12vertex_combinatorics.py similarity index 100% rename from ideal_poly_volume_toolkit/examples/analyze_12vertex_combinatorics.py rename to examples/optimization/12vertex/analyze_12vertex_combinatorics.py diff --git a/ideal_poly_volume_toolkit/examples/analyze_12vertex_results.py b/examples/optimization/12vertex/analyze_12vertex_results.py similarity index 100% rename from ideal_poly_volume_toolkit/examples/analyze_12vertex_results.py rename to examples/optimization/12vertex/analyze_12vertex_results.py diff --git a/ideal_poly_volume_toolkit/examples/visualize_maximal_12vertex.py b/examples/optimization/12vertex/visualize_maximal_12vertex.py 
similarity index 100% rename from ideal_poly_volume_toolkit/examples/visualize_maximal_12vertex.py rename to examples/optimization/12vertex/visualize_maximal_12vertex.py diff --git a/check_arithmetic_holonomy.py b/examples/optimization/20vertex/check_arithmetic_holonomy.py similarity index 97% rename from check_arithmetic_holonomy.py rename to examples/optimization/20vertex/check_arithmetic_holonomy.py index 2309fa828d59c56dcf4f8fcfe656fd6fcb4e715a..2408ddd675d1391d4bb7cf38c65d6ce6aeda48da 100644 --- a/check_arithmetic_holonomy.py +++ b/examples/optimization/20vertex/check_arithmetic_holonomy.py @@ -1,9 +1,7 @@ import numpy as np -import sys -sys.path.append('/Users/igorrivin/devel/platonic') -from rivin_holonomy import Triangulation, generators_from_triangulation import json from scipy.spatial import Delaunay +from ideal_poly_volume_toolkit.rivin_holonomy import Triangulation, generators_from_triangulation def compute_holonomy_traces(vertices_complex, triangles): """ diff --git a/check_local_maxima_arithmetic.py b/examples/optimization/20vertex/check_local_maxima_arithmetic.py similarity index 98% rename from check_local_maxima_arithmetic.py rename to examples/optimization/20vertex/check_local_maxima_arithmetic.py index fa76f7f33c761d4209dbc333efc7771a82654b7a..8ab872fc1fda4d5496249638be7036c0f069d378 100644 --- a/check_local_maxima_arithmetic.py +++ b/examples/optimization/20vertex/check_local_maxima_arithmetic.py @@ -6,11 +6,9 @@ Tests the conjecture that local maxima are more likely to be arithmetic. 
import numpy as np import json -import sys -sys.path.append('/Users/igorrivin/devel/platonic') -from rivin_holonomy import Triangulation, generators_from_triangulation -from scipy.spatial import Delaunay import os +from scipy.spatial import Delaunay +from ideal_poly_volume_toolkit.rivin_holonomy import Triangulation, generators_from_triangulation def build_triangulation_from_config(vertices_dict): """Convert vertex configuration to triangulation for holonomy computation.""" diff --git a/check_single_config_arithmetic.py b/examples/optimization/20vertex/check_single_config_arithmetic.py similarity index 95% rename from check_single_config_arithmetic.py rename to examples/optimization/20vertex/check_single_config_arithmetic.py index f83194b5c4260815d8c992d671e7d51bf55f2c95..f17b9bfddb34e03e70e9fc0c7a99654255a9d08b 100644 --- a/check_single_config_arithmetic.py +++ b/examples/optimization/20vertex/check_single_config_arithmetic.py @@ -3,7 +3,8 @@ import json import sys -sys.path.append('/Users/igorrivin/devel/platonic') +# Import from local module +sys.path.insert(0, '.') from check_local_maxima_arithmetic import check_arithmeticity # Load the current best configuration diff --git a/optimize_20vertex_background.py b/examples/optimization/20vertex/optimize_20vertex_background.py similarity index 100% rename from optimize_20vertex_background.py rename to examples/optimization/20vertex/optimize_20vertex_background.py diff --git a/optimize_20vertex_save_local_maxima.py b/examples/optimization/20vertex/optimize_20vertex_save_local_maxima.py similarity index 100% rename from optimize_20vertex_save_local_maxima.py rename to examples/optimization/20vertex/optimize_20vertex_save_local_maxima.py diff --git a/optimize_20vertex_scipy.py b/examples/optimization/20vertex/optimize_20vertex_scipy.py similarity index 100% rename from optimize_20vertex_scipy.py rename to examples/optimization/20vertex/optimize_20vertex_scipy.py diff --git a/analyze_7vertex_result.py 
b/examples/optimization/7vertex/analyze_7vertex_result.py similarity index 100% rename from analyze_7vertex_result.py rename to examples/optimization/7vertex/analyze_7vertex_result.py diff --git a/debug_7vertex.py b/examples/optimization/7vertex/debug_7vertex.py similarity index 100% rename from debug_7vertex.py rename to examples/optimization/7vertex/debug_7vertex.py diff --git a/find_7vertex_local_maxima.py b/examples/optimization/7vertex/find_7vertex_local_maxima.py similarity index 100% rename from find_7vertex_local_maxima.py rename to examples/optimization/7vertex/find_7vertex_local_maxima.py diff --git a/find_7vertex_maxima_quick.py b/examples/optimization/7vertex/find_7vertex_maxima_quick.py similarity index 100% rename from find_7vertex_maxima_quick.py rename to examples/optimization/7vertex/find_7vertex_maxima_quick.py diff --git a/fix_7vertex_analysis.py b/examples/optimization/7vertex/fix_7vertex_analysis.py similarity index 100% rename from fix_7vertex_analysis.py rename to examples/optimization/7vertex/fix_7vertex_analysis.py diff --git a/optimize_7vertex.py b/examples/optimization/7vertex/optimize_7vertex.py similarity index 100% rename from optimize_7vertex.py rename to examples/optimization/7vertex/optimize_7vertex.py diff --git a/optimize_7vertex_quick.py b/examples/optimization/7vertex/optimize_7vertex_quick.py similarity index 100% rename from optimize_7vertex_quick.py rename to examples/optimization/7vertex/optimize_7vertex_quick.py diff --git a/test_7vertex_variations.py b/examples/optimization/7vertex/test_7vertex_variations.py similarity index 100% rename from test_7vertex_variations.py rename to examples/optimization/7vertex/test_7vertex_variations.py diff --git a/visualize_7vertex.py b/examples/optimization/7vertex/visualize_7vertex.py similarity index 100% rename from visualize_7vertex.py rename to examples/optimization/7vertex/visualize_7vertex.py diff --git a/ideal_poly_volume_toolkit.examples.optimize_lbfgs_delaunay.py 
b/examples/optimization/ideal_poly_volume_toolkit.examples.optimize_lbfgs_delaunay.py similarity index 100% rename from ideal_poly_volume_toolkit.examples.optimize_lbfgs_delaunay.py rename to examples/optimization/ideal_poly_volume_toolkit.examples.optimize_lbfgs_delaunay.py diff --git a/ideal_poly_volume_toolkit/examples/multiple_optimization_trials.py b/examples/optimization/multiple_optimization_trials.py similarity index 100% rename from ideal_poly_volume_toolkit/examples/multiple_optimization_trials.py rename to examples/optimization/multiple_optimization_trials.py diff --git a/ideal_poly_volume_toolkit/examples/optimize_lbfgs_delaunay.py b/examples/optimization/optimize_lbfgs_delaunay.py similarity index 100% rename from ideal_poly_volume_toolkit/examples/optimize_lbfgs_delaunay.py rename to examples/optimization/optimize_lbfgs_delaunay.py diff --git a/ideal_poly_volume_toolkit/examples/optimize_lbfgs_delaunay_5points.py b/examples/optimization/optimize_lbfgs_delaunay_5points.py similarity index 100% rename from ideal_poly_volume_toolkit/examples/optimize_lbfgs_delaunay_5points.py rename to examples/optimization/optimize_lbfgs_delaunay_5points.py diff --git a/ideal_poly_volume_toolkit/examples/optimize_lbfgs_delaunay_debug.py b/examples/optimization/optimize_lbfgs_delaunay_debug.py similarity index 100% rename from ideal_poly_volume_toolkit/examples/optimize_lbfgs_delaunay_debug.py rename to examples/optimization/optimize_lbfgs_delaunay_debug.py diff --git a/ideal_poly_volume_toolkit/examples/optimize_lbfgs_delaunay_fixed.py b/examples/optimization/optimize_lbfgs_delaunay_fixed.py similarity index 100% rename from ideal_poly_volume_toolkit/examples/optimize_lbfgs_delaunay_fixed.py rename to examples/optimization/optimize_lbfgs_delaunay_fixed.py diff --git a/ideal_poly_volume_toolkit/examples/optimize_lbfgs_delaunay_free.py b/examples/optimization/optimize_lbfgs_delaunay_free.py similarity index 100% rename from 
ideal_poly_volume_toolkit/examples/optimize_lbfgs_delaunay_free.py rename to examples/optimization/optimize_lbfgs_delaunay_free.py diff --git a/ideal_poly_volume_toolkit/examples/optimize_sgd_delaunay.py b/examples/optimization/optimize_sgd_delaunay.py similarity index 100% rename from ideal_poly_volume_toolkit/examples/optimize_sgd_delaunay.py rename to examples/optimization/optimize_sgd_delaunay.py diff --git a/ideal_poly_volume_toolkit/examples/optimize_simplex.py b/examples/optimization/optimize_simplex.py similarity index 100% rename from ideal_poly_volume_toolkit/examples/optimize_simplex.py rename to examples/optimization/optimize_simplex.py diff --git a/ideal_poly_volume_toolkit/examples/optimize_simplex_correct.py b/examples/optimization/optimize_simplex_correct.py similarity index 100% rename from ideal_poly_volume_toolkit/examples/optimize_simplex_correct.py rename to examples/optimization/optimize_simplex_correct.py diff --git a/ideal_poly_volume_toolkit/examples/compute_regular_icosahedron.py b/examples/optimization/platonic/compute_regular_icosahedron.py similarity index 100% rename from ideal_poly_volume_toolkit/examples/compute_regular_icosahedron.py rename to examples/optimization/platonic/compute_regular_icosahedron.py diff --git a/explore_dodecahedron_perturbation.py b/examples/optimization/platonic/explore_dodecahedron_perturbation.py similarity index 100% rename from explore_dodecahedron_perturbation.py rename to examples/optimization/platonic/explore_dodecahedron_perturbation.py diff --git a/ideal_poly_volume_toolkit/examples/optimize_icosahedron_test.py b/examples/optimization/platonic/optimize_icosahedron_test.py similarity index 100% rename from ideal_poly_volume_toolkit/examples/optimize_icosahedron_test.py rename to examples/optimization/platonic/optimize_icosahedron_test.py diff --git a/ideal_poly_volume_toolkit/examples/regular_platonics.py b/examples/optimization/platonic/regular_platonics.py similarity index 100% rename from 
ideal_poly_volume_toolkit/examples/regular_platonics.py rename to examples/optimization/platonic/regular_platonics.py diff --git a/ideal_poly_volume_toolkit/examples/verify_octahedron_configs.py b/examples/optimization/platonic/verify_octahedron_configs.py similarity index 100% rename from ideal_poly_volume_toolkit/examples/verify_octahedron_configs.py rename to examples/optimization/platonic/verify_octahedron_configs.py diff --git a/ideal_poly_volume_toolkit/examples/plot_volume_landscape.py b/examples/visualization/plot_volume_landscape.py similarity index 100% rename from ideal_poly_volume_toolkit/examples/plot_volume_landscape.py rename to examples/visualization/plot_volume_landscape.py diff --git a/ideal_poly_volume_toolkit/examples/visualize_golden_config.py b/examples/visualization/visualize_golden_config.py similarity index 100% rename from ideal_poly_volume_toolkit/examples/visualize_golden_config.py rename to examples/visualization/visualize_golden_config.py diff --git a/ideal_poly_volume_toolkit/examples/visualize_golden_simple.py b/examples/visualization/visualize_golden_simple.py similarity index 100% rename from ideal_poly_volume_toolkit/examples/visualize_golden_simple.py rename to examples/visualization/visualize_golden_simple.py diff --git a/ideal_poly_volume_toolkit/examples/visualize_maximal_config.py b/examples/visualization/visualize_maximal_config.py similarity index 100% rename from ideal_poly_volume_toolkit/examples/visualize_maximal_config.py rename to examples/visualization/visualize_maximal_config.py diff --git a/ideal_poly_volume_toolkit/examples/visualize_sphere_projection.py b/examples/visualization/visualize_sphere_projection.py similarity index 100% rename from ideal_poly_volume_toolkit/examples/visualize_sphere_projection.py rename to examples/visualization/visualize_sphere_projection.py diff --git a/ideal_poly_volume_toolkit/examples/ideal_poly_volume_toolkit.examples.optimize_lbfgs_delaunay.py 
b/ideal_poly_volume_toolkit/examples/ideal_poly_volume_toolkit.examples.optimize_lbfgs_delaunay.py deleted file mode 100644 index 834c48f107f0231573bd1294e8bfac80202a69de..0000000000000000000000000000000000000000 --- a/ideal_poly_volume_toolkit/examples/ideal_poly_volume_toolkit.examples.optimize_lbfgs_delaunay.py +++ /dev/null @@ -1,67 +0,0 @@ -import argparse, numpy as np, torch, time -from ideal_poly_volume_toolkit.geometry import delaunay_triangulation_indices, triangle_volume_from_points_torch - -def random_angles(K, rng): return 2*np.pi*rng.random(K) - -def build_Z(thetas): - Z = torch.empty(thetas.numel()+2, dtype=torch.complex128, device=thetas.device) - Z[0] = 1+0j; Z[1] = 0+0j - Z[2:] = torch.exp(1j*thetas.to(torch.complex128)) - return Z - -def main(): - ap = argparse.ArgumentParser() - ap.add_argument('--seed', type=int, default=0) - ap.add_argument('--iters', type=int, default=75) - ap.add_argument('--series', type=int, default=96) - ap.add_argument('--print-every', type=int, default=5) - ap.add_argument('--device', type=str, default='cpu') - args = ap.parse_args() - - rng = np.random.default_rng(args.seed) - K = 3 - thetas = torch.tensor(random_angles(K, rng), dtype=torch.float64, device=args.device, requires_grad=True) - opt = torch.optim.LBFGS([thetas], lr=1.0, max_iter=20, line_search_fn='strong_wolfe') - - # Precompute triangulation connectivity outside the graph each outer iter - def closure_once(idx): - def _c(): - opt.zero_grad(set_to_none=True) - Z_t = build_Z(thetas) # complex torch, depends on real angles - total = torch.zeros((), dtype=torch.complex128, device=thetas.device) - for (i,j,k) in idx: - total = total + triangle_volume_from_points_torch(Z_t[i], Z_t[j], Z_t[k], - series_terms=args.series) - loss = -total.real # maximize volume - loss.backward() - return loss - return _c - - hist = []; t0 = time.time() - for it in range(1, args.iters+1): - # Rebuild Delaunay triangles using current (detached) positions - with torch.no_grad(): - 
Z_np = build_Z(thetas).detach().cpu().numpy() - idx = delaunay_triangulation_indices(Z_np) - - loss = opt.step(closure_once(idx)) - with torch.no_grad(): hist.append(float(-loss.item())) - if it % args.print_every == 0 or it in (1, args.iters): - print(f'[{it:03d}] fast volume ~ {hist[-1]:.10f} (tris={idx.shape[0]})') - - t1 = time.time() - - # Final exact eval - with torch.no_grad(): - Zf = build_Z(thetas).detach().cpu().numpy() - from ideal_poly_volume_toolkit.geometry import ideal_poly_volume_via_delaunay as eval_del - vol_exact = eval_del(Zf, mode='eval_only', dps=250) - - print('\\n=== Optimization (Delaunay) done ===') - print(f'iters={args.iters}, time={t1-t0:.2f}s') - print(f'final fast volume ~ {hist[-1]:.12f}') - print(f'final exact volume {vol_exact:.12f}') - print('final angles (rad):', thetas.detach().cpu().numpy()) - -if __name__ == '__main__': - main() diff --git a/ideal_poly_volume_toolkit/pointset_to_fuchsian.py b/ideal_poly_volume_toolkit/pointset_to_fuchsian.py new file mode 100644 index 0000000000000000000000000000000000000000..737d0149a9aec3588e5e72f2ff490efc27a376a6 --- /dev/null +++ b/ideal_poly_volume_toolkit/pointset_to_fuchsian.py @@ -0,0 +1,630 @@ +""" +pointset_to_fuchsian.py +----------------------- +Pipeline: + (a) Start from a finite point set P in the complex plane (R^2). + (b) Back-project to the unit sphere S^2 via inverse stereographic projection. + (c) Compute the convex hull in R^3 to get an ideal polyhedron combinatorics. + (d) Triangulate faces, compute complex shear–bend parameters via cross-ratios, + feed shears (real parts) to the Penner–Rivin algorithm to get the Fuchsian group. + (e) Since we get cross-ratios, recover exterior dihedral angles and (by a pulling triangulation) + the hyperbolic volume via the Bloch–Wigner dilogarithm. 
+
+Requirements:
+    - numpy
+    - scipy (for 3D convex hull)
+    - mpmath (for Bloch–Wigner)
+    - trimesh, pyrender (optional: meshing & rendering)
+    - rivin_holonomy.py in the same package (Penner–Rivin core)
+"""
+
+from typing import List, Tuple, Dict, Any, Optional
+import numpy as np
+import math, cmath
+
+# Sentinel for infinity
+INF = None
+
+# Optional deps
+try:
+    from scipy.spatial import ConvexHull
+    _HAS_SCIPY = True
+except Exception:
+    _HAS_SCIPY = False
+
+try:
+    import mpmath as mp
+    _HAS_MPMATH = True
+except Exception:
+    _HAS_MPMATH = False
+
+# local (package-relative, so the module also works after `pip install -e .`)
+from .rivin_holonomy import Triangulation, generators_from_triangulation
+
+# ---------- Stereographic and cross ratios ----------
+
+def stereographic_inverse(z: complex) -> np.ndarray:
+    if z is None or z is INF:
+        return np.array([0.0, 0.0, 1.0], dtype=float)
+    x, y = z.real, z.imag
+    d = x*x + y*y + 1.0
+    return np.array([2*x/d, 2*y/d, (x*x + y*y - 1.0)/d], dtype=float)
+
+def stereographic(p: np.ndarray) -> Optional[complex]:
+    x, y, z = float(p[0]), float(p[1]), float(p[2])
+    # North pole maps to infinity (None)
+    if abs(z - 1.0) < 1e-14:
+        return None
+    denom = 1.0 - z
+    if abs(denom) < 1e-15:
+        return None
+    return complex(x/denom, y/denom)
+
+def cross_ratio(z1: complex, z2: complex, z3: complex, z4: complex) -> complex:
+    return (z1 - z3) * (z2 - z4) / ((z1 - z4) * (z2 - z3))
+
+def shear_bend_from_four(zx: complex, za: complex, zy: complex, zb: complex) -> complex:
+    CR = cross_ratio(za, zx, zb, zy)
+    if abs(CR.imag) < 1e-15 and CR.real < 0:
+        CR += 1e-15j
+    return cmath.log(CR)
+
+def cr_axby(a, x, b, y):
+    """
+    Möbius-invariant cross ratio CR(a,x; b,y) = M(a)/M(b), M(z)=(z-x)/(z-y),
+    allowing any of a,x,b,y to be INF (None). Returns complex.
+    """
+    # Degenerate edge (x=y) is not expected here.
+    # Handle cases where either x or y is INF first.
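The two projections above are mutual inverses away from the north pole: `stereographic_inverse` sends a planar point onto the unit sphere, and `stereographic` sends it back. A standalone round-trip sketch of those formulas (restated here with plain tuples rather than importing the package):

```python
# Round-trip check for the stereographic chart used by the module:
# z = 0 maps to the south pole (0,0,-1); the north pole (0,0,1) plays infinity.
def stereographic_inverse(z: complex):
    x, y = z.real, z.imag
    d = x*x + y*y + 1.0
    return (2*x/d, 2*y/d, (x*x + y*y - 1.0)/d)

def stereographic(p):
    x, y, z = p
    if abs(z - 1.0) < 1e-14:
        return None              # north pole -> infinity
    return complex(x, y) / (1.0 - z)

w = 0.3 - 1.7j
p = stereographic_inverse(w)
assert abs(sum(c*c for c in p) - 1.0) < 1e-12   # lands on the unit sphere
assert abs(stereographic(p) - w) < 1e-12        # round-trips to the same point
```

The sphere condition follows from the identity 4x² + 4y² + (x²+y²-1)² = (x²+y²+1)².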
+ if x is INF and y is INF: + raise ValueError("Degenerate configuration: x=y=INF") + if x is INF: + # M(z) ~ 1/(z - y) + if a is INF or b is INF: + # With x=INF, 'a' or 'b' being INF does not occur in our triangulated setup. + raise ValueError("Unexpected INF pattern with x=INF.") + return (b - y) / (a - y) + if y is INF: + # M(z) ~ (z - x) + if a is INF or b is INF: + raise ValueError("Unexpected INF pattern with y=INF.") + return (a - x) / (b - x) + # Now x,y are finite + if a is INF and b is INF: + return 1.0 + 0.0j # M(a)=M(b)=1 + if a is INF: + # 1 / ((b-x)/(b-y)) = (b - y)/(b - x) + return (b - y) / (b - x) + if b is INF: + # (a-x)/(a-y) + return (a - x) / (a - y) + # All finite: standard expression + return ((a - x)*(b - y)) / ((a - y)*(b - x)) + +def shear_bend_from_four_safe(zx, za, zy, zb): + """ + Safe complex shear-bend with symbolic INF support: + tau = log CR(za, zx; zb, zy). + """ + CR = cr_axby(za, zx, zb, zy) + # Avoid branch cut on negative real axis for numeric stability + if abs(getattr(CR, "imag", 0.0)) < 1e-15 and getattr(CR, "real", 0.0) < 0: + CR += 1e-15j + return cmath.log(CR) + +# ---------- Convex hull & faces ---------- + +def convex_hull_from_points_on_sphere(pts3: np.ndarray) -> Dict[str, Any]: + if not _HAS_SCIPY: + raise RuntimeError("SciPy (scipy.spatial.ConvexHull) is required for hull computation.") + hull = ConvexHull(pts3) + planes = {} + for fi, (a,b,c,d) in enumerate(hull.equations): + n = np.array([a,b,c], dtype=float) + norm = np.linalg.norm(n) + if norm == 0: continue + n = n / norm + key = tuple(np.round(np.append(n, d/norm), 8)) + planes.setdefault(key, []).append(hull.simplices[fi]) + faces = [] + for tris in planes.values(): + edge_count = {} + for tri in tris: + v = list(tri) + for e in [(v[0],v[1]),(v[1],v[2]),(v[2],v[0])]: + edge_count[e] = edge_count.get(e,0)+1 + boundary = [e for e,cnt in edge_count.items() if cnt==1] + nxt = {} + for u,v in boundary: nxt[u]=v + if not boundary: continue + start = boundary[0][0]; 
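The symbolic INF branches of `cr_axby` are the limits of the finite cross ratio as the corresponding argument goes to infinity. A standalone numerical check of three of those reductions, using a large real number `R` as a stand-in for the point at infinity (illustration only; the module itself uses the `INF`/`None` sentinel):

```python
def cr(a, x, b, y):
    # finite-point cross ratio CR(a,x; b,y) = ((a-x)(b-y)) / ((a-y)(b-x))
    return ((a - x)*(b - y)) / ((a - y)*(b - x))

a, x, b, y = 2+1j, -1+0.5j, 0.3-2j, 1+1j
R = 1e9  # stand-in for the point at infinity

# a -> INF reduces to (b-y)/(b-x)
assert abs(cr(R, x, b, y) - (b - y)/(b - x)) < 1e-6
# x -> INF reduces to (b-y)/(a-y)
assert abs(cr(a, R, b, y) - (b - y)/(a - y)) < 1e-6
# y -> INF reduces to (a-x)/(b-x)
assert abs(cr(a, x, b, R) - (a - x)/(b - x)) < 1e-6
```

Each assertion matches the corresponding `if … is INF` branch in `cr_axby`, with error of order 1/R.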
poly=[start]; seen={start}
+        while poly[-1] in nxt and nxt[poly[-1]] not in seen:
+            poly.append(nxt[poly[-1]]); seen.add(poly[-1])
+        if len(poly)>=3: faces.append(poly)
+    edge_set=set()
+    for f in faces:
+        for i in range(len(f)):
+            edge_set.add(tuple(sorted((f[i], f[(i+1)%len(f)]))))
+    return {"vertices": pts3, "faces": faces, "edges": edge_set}
+
+# ---------- Triangulation ----------
+
+def triangulate_faces_with_face_id(faces: List[List[int]]):
+    tris=[]; tri_face_id=[]
+    for fi,f in enumerate(faces):
+        if len(f)==3: tris.append(tuple(f)); tri_face_id.append(fi)
+        else:
+            a0=f[0]
+            for i in range(1,len(f)-1):
+                tris.append((a0,f[i],f[i+1])); tri_face_id.append(fi)
+    return tris, tri_face_id
+
+def build_penner_rivin_from_hull(pts3: np.ndarray, faces: List[List[int]]):
+    N=pts3.shape[0]
+    z=np.array([stereographic(pts3[i]) for i in range(N)], dtype=complex)
+    tris, tri_face_id = triangulate_faces_with_face_id(faces)
+    Tverts=[tuple(t) for t in tris]
+
+    def canonical_edge(u,v): return (u,v) if u < v else (v,u)
+    # NOTE: the remainder of this function (edge indexing, shear computation,
+    # and assembly of the Triangulation) is truncated in the source diff;
+    # callers expect `return T, Z, info` with info keys "stereo", "triangles",
+    # "tri_face_id", and "edge_id".
+
+def group_from_pointset(points: List[complex]) -> Dict[str,Any]:
+    pts3 = np.vstack([stereographic_inverse(z) for z in points])
+    hull = convex_hull_from_points_on_sphere(pts3)
+    T, Z, info = build_penner_rivin_from_hull(pts3, hull["faces"])
+    gens = generators_from_triangulation(T, Z, root=0)
+    return {"Gamma_generators":[M for (_,_,_,M) in gens], "T":T, "Z":Z, "info":info, "faces":hull["faces"]}
+
+def trace(M): return M[0][0]+M[1][1]
+
+def invariant_trace_field_signature(generators):
+    vals=[]
+    for M in generators:
+        M2=[[M[0][0]*M[0][0] + M[0][1]*M[1][0], M[0][0]*M[0][1] + M[0][1]*M[1][1]],
+            [M[1][0]*M[0][0] + M[1][1]*M[1][0], M[1][0]*M[0][1] + M[1][1]*M[1][1]]]
+        vals.append(trace(M2))
+    def near_int(x,tol=1e-9):
+        r=round(x); return (abs(x-r) < tol)
+    # NOTE: the tail of this function (aggregating near_int over vals into the
+    # trace-field signature) is truncated in the source diff.
+
+def exterior_dihedral_angles(info) -> Dict[Tuple[int,int], float]:
+    z=info["stereo"]; triangles=info["triangles"]; tri_face_id=info["tri_face_id"]; edge_id=info["edge_id"]
+    dihedral={}
+    for (u,v),eid in edge_id.items():
+        occ=[]
+        for t_idx, tri in enumerate(triangles):
+            for s in range(3):
+                a,b=tri[s%3],
tri[(s+1)%3] + if {a,b}=={u,v}: occ.append((t_idx,s)) + if len(occ)!=2: continue + (t1,s1),(t2,s2)=occ + if tri_face_id[t1]==tri_face_id[t2]: + dihedral[tuple(sorted((u,v)))]=0.0; continue + tri1=triangles[t1]; tri2=triangles[t2] + x,y=tri1[s1%3], tri1[(s1+1)%3] + a=tri1[(s1+2)%3]; b=tri2[(s2+2)%3] + tau=shear_bend_from_four(z[x], z[a], z[y], z[b]) + dihedral[tuple(sorted((u,v)))]=abs(float(tau.imag)) + return dihedral + +def bloch_wigner(z: complex) -> float: + if not _HAS_MPMATH: + raise RuntimeError("mpmath is required for volume computation.") + return float(mp.im(mp.polylog(2, z)) + mp.arg(1 - z)*mp.log(abs(z))) + +def tetra_shape(z0, z1, z2, z3) -> complex: + # Use INF-safe cross ratio if available + CR = cr_axby(z0, z1, z2, z3) + if abs(CR.imag) < 1e-15 and CR.real < 0: CR += 1e-15j + return CR + +def volume_from_hull(pts3: np.ndarray, faces: List[List[int]], apex: Optional[int]=None) -> float: + if not _HAS_MPMATH: raise RuntimeError("mpmath is required for volume computation.") + z=[stereographic(p) for p in pts3] + tris,_=triangulate_faces_with_face_id(faces) + if apex is None: apex=0 + total=0.0 + for (a,b,c) in tris: + if apex in (a,b,c): continue + shp=tetra_shape(z[apex], z[a], z[b], z[c]) + total += abs(bloch_wigner(shp)) + return total + +def volume_from_pointset(points: List[complex], apex_index: Optional[int]=None, add_north_pole: bool=False) -> float: + """Compute volume using all hull simplices directly.""" + from scipy.spatial import ConvexHull + pts3 = np.vstack([stereographic_inverse(z) for z in points]) + if add_north_pole: + pts3 = np.vstack([pts3, np.array([[0.0,0.0,1.0]])]) + hull = ConvexHull(pts3) + + # Use all simplices + z = [stereographic(p) for p in pts3] + if apex_index is None: + apex_index = 0 + + total = 0.0 + for tri in hull.simplices: + a, b, c = tri + if apex_index in (a, b, c): + continue + shape = tetra_shape(z[apex_index], z[a], z[b], z[c]) + vol = abs(bloch_wigner(shape)) + total += vol + + return total + +# ---------- 
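The volume routines above sum `abs(bloch_wigner(shape))` over tetrahedra, where `bloch_wigner` uses `mpmath.polylog`. A useful cross-check that needs only the standard library comes from the Lobachevsky function Λ(θ) = ½ Σ_{n≥1} sin(2nθ)/n²: the regular ideal tetrahedron (all dihedral angles π/3, shape z = e^{iπ/3}, the maximum of the Bloch–Wigner function) has volume 3Λ(π/3) ≈ 1.0149416. A sketch under that identity; the truncation parameter `terms` is an illustration, not part of the module:

```python
import math

def lobachevsky(theta: float, terms: int = 200_000) -> float:
    # Lobachevsky function via its Fourier series: Λ(θ) = (1/2) Σ sin(2nθ)/n².
    # Truncation error is bounded by (1/2) Σ_{n>terms} 1/n² ≈ 1/(2*terms).
    return 0.5 * sum(math.sin(2*n*theta) / (n*n) for n in range(1, terms + 1))

# Volume of the regular ideal tetrahedron: 3·Λ(π/3) = D(e^{iπ/3}) ≈ 1.0149416.
vol_regular_ideal_tet = 3.0 * lobachevsky(math.pi / 3)
assert abs(vol_regular_ideal_tet - 1.0149416) < 1e-4
```

This value is a handy unit test for any `bloch_wigner` implementation: feed it z = e^{iπ/3} and compare.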
Meshing & model conversion ---------- + +try: + import trimesh as _trimesh + _HAS_TRIMESH=True +except Exception: + _HAS_TRIMESH=False + +def klein_to_poincare(V: np.ndarray) -> np.ndarray: + V=np.asarray(V,dtype=float) + r2=np.sum(V*V,axis=1) + s=np.sqrt(np.maximum(0.0,1.0-r2)) + denom=(1.0+s)[:,None] + mask=(np.abs(r2-1.0)<1e-14) + denom[mask]=1.0 + return V/denom + +def triangulated_mesh_from_hull(pts3, faces, refinement=0, model="klein", as_trimesh=False): + V=np.asarray(pts3,dtype=float).copy() + tris,_=triangulate_faces_with_face_id(faces) + F=np.array(tris,dtype=int) + def refine_once(V,F): + Vlist=V.tolist(); edge_mid={} + def midpoint(i,j): + a,b=min(i,j),max(i,j); key=(a,b) + if key in edge_mid: return edge_mid[key] + m=0.5*(V[a]+V[b]); idx=len(Vlist); Vlist.append(m.tolist()); edge_mid[key]=idx; return idx + newF=[] + for (i,j,k) in F: + ij=midpoint(i,j); jk=midpoint(j,k); ki=midpoint(k,i) + newF.extend([(i,ij,ki),(ij,j,jk),(ki,jk,k),(ij,jk,ki)]) + return np.array(Vlist,dtype=float), np.array(newF,dtype=int) + for _ in range(max(0,refinement)): + V,F=refine_once(V,F) + if model.lower().startswith("poin"): V=klein_to_poincare(V) + if as_trimesh and _HAS_TRIMESH: return _trimesh.Trimesh(vertices=V, faces=F, process=False) + return V,F + +def export_mesh_obj(filename: str, V: np.ndarray, F: np.ndarray): + with open(filename,"w",encoding="utf-8") as f: + for x,y,z in V: f.write(f"v {x:.10f} {y:.10f} {z:.10f}\n") + for a,b,c in F: f.write(f"f {a+1} {b+1} {c+1}\n") + return filename + +def hull_to_mesh(points: List[complex], add_north_pole=False, refinement=0, model="poincare", as_trimesh=True): + pts3=np.vstack([stereographic_inverse(z) for z in points]) + if add_north_pole: + pts3=np.vstack([pts3, np.array([[0.0,0.0,1.0]])]) + hull=convex_hull_from_points_on_sphere(pts3) + return triangulated_mesh_from_hull(pts3, hull["faces"], refinement=refinement, model=model, as_trimesh=as_trimesh) + +# ---------- Volume gradients ---------- + +def 
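`klein_to_poincare` above acts radially: a Klein-model point at Euclidean radius r maps to Poincaré radius r/(1 + √(1 − r²)), with the centre and the boundary sphere fixed. A scalar sketch of that radial law (the helper name is ours; the module's function is the vectorized NumPy version):

```python
import math

def klein_to_poincare_radius(r: float) -> float:
    # Radial form of the Klein -> Poincaré model map: r -> r / (1 + sqrt(1 - r^2)).
    return r / (1.0 + math.sqrt(1.0 - r * r))

assert klein_to_poincare_radius(0.0) == 0.0                 # centre is fixed
assert abs(klein_to_poincare_radius(1.0) - 1.0) < 1e-15     # boundary is fixed
assert klein_to_poincare_radius(0.8) < 0.8                  # interior moves inward
```

The inward displacement is why the `model="poincare"` meshes look rounder: straight Klein-model faces become spherical caps.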
_log_two_sin_angle_from_complex(w, eps=1e-15): + """ + For w != real, log|2 sin Arg(w)| = log( 2 |Im w| / |w| ). + Branch-free (uses only real logs). 'eps' protects degeneracies. + """ + if w is INF: + # Should not occur for shape factors in valid configurations + return 0.0 + return math.log(2.0 * max(abs(w.imag), eps) / max(abs(w), eps)) + +def _G_of_shape(z): + """ + dVol = Im( G(z) * dz ), where + G(z) = (-sα + sβ)/z - sβ/(z-1) + sγ/(1-z), + with s• = log|2 sin(angle•)| evaluated branch-free via Im/|.|. + """ + # the three angles' log-sine factors + s_alpha = _log_two_sin_angle_from_complex(z) + s_beta = _log_two_sin_angle_from_complex(1.0 - 1.0/z if z != 0 else 1.0+0.0j) + s_gamma = _log_two_sin_angle_from_complex(1.0 - z) + + def inv(u): + if u is INF: # 1/∞ -> 0 + return 0.0 + 0.0j + if isinstance(u, complex): + if abs(u) < 1e-15: # safe reciprocal + return u.conjugate() / (abs(u)**2 + 1e-30) + return 1.0/u + return 1.0/complex(u) + + return (-s_alpha + s_beta) * inv(z) - s_beta * inv(z - 1.0) + s_gamma * inv(1.0 - z) + +def _dlog_cr_coeffs(a, x, b, y): + """ + Coefficients (S_a, S_x, S_b, S_y) such that d log z = S_a da + S_x dx + S_b db + S_y dy, + for z = CR(a,x; b,y), with INF handled. 
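`_log_two_sin_angle_from_complex` relies on the identity log|2 sin(arg w)| = log(2|Im w|/|w|), which needs only real logarithms and so avoids any branch choice. A standalone check of the identity, restated without the `eps` guards:

```python
import cmath, math

def log_two_sin_angle(w: complex) -> float:
    # Branch-free form: log|2 sin(arg w)| = log(2 |Im w| / |w|),
    # valid for any non-real w, since sin(arg w) = Im(w)/|w|.
    return math.log(2.0 * abs(w.imag) / abs(w))

w = 1.3 + 0.7j
direct = math.log(abs(2.0 * math.sin(cmath.phase(w))))
assert abs(log_two_sin_angle(w) - direct) < 1e-12
```

The module's `eps` clamps merely keep the expression finite as w degenerates to the real axis.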
+ """ + def inv_diff(u, v): + if u is INF or v is INF: + return 0.0 + 0.0j # 1/(∞ - finite) = 0 + d = (u - v) + if abs(d) < 1e-15: + return d.conjugate() / (abs(d)**2 + 1e-30) + return 1.0 / d + + # Reduced forms for INF cases + if a is INF and b is INF: + return 0j, 0j, 0j, 0j + if a is INF: + # z = (b - y)/(b - x) + Sa = 0j + Sb = inv_diff(b, y) - inv_diff(b, x) + Sx = +inv_diff(b, x) + Sy = -inv_diff(b, y) + return Sa, Sx, Sb, Sy + if b is INF: + # z = (a - x)/(a - y) + Sb = 0j + Sa = inv_diff(a, x) - inv_diff(a, y) + Sx = -inv_diff(a, x) + Sy = +inv_diff(a, y) + return Sa, Sx, Sb, Sy + if x is INF: + # z = (b - y)/(a - y) + Sx = 0j + Sb = inv_diff(b, y) + Sa = -inv_diff(a, y) + Sy = -inv_diff(b, y) + inv_diff(a, y) + return Sa, Sx, Sb, Sy + if y is INF: + # z = (a - x)/(b - x) + Sy = 0j + Sa = inv_diff(a, x) + Sb = -inv_diff(b, x) + Sx = -inv_diff(a, x) + inv_diff(b, x) + return Sa, Sx, Sb, Sy + + # General finite case + Sa = inv_diff(a, x) - inv_diff(a, y) + Sx = -inv_diff(a, x) + inv_diff(b, x) + Sb = inv_diff(b, y) - inv_diff(b, x) + Sy = -inv_diff(b, y) + inv_diff(a, y) + return Sa, Sx, Sb, Sy + +def volume_and_gradient_from_hull(pts3, faces, apex=None, variable_mask=None): + """ + Volume and analytic gradient wrt planar coords of vertices. + pts3: (N,3) on S^2; faces: polygon cycles (convex hull); apex: pulling vertex index. + Returns (V, grad) where grad[i] = (∂V/∂x_i, ∂V/∂y_i). + """ + # Note: We should use the hull simplices directly, not faces + # This is a compatibility function - prefer volume_and_gradient_from_hull_simplices + from scipy.spatial import ConvexHull + hull = ConvexHull(pts3) + return volume_and_gradient_from_hull_simplices(pts3, hull, apex, variable_mask) + +def volume_and_gradient_from_hull_simplices(pts3, hull, apex=None, variable_mask=None): + """ + Volume and analytic gradient using hull simplices directly. 
+ """ + zC = [stereographic(p) for p in pts3] # complex chart; may be INF + N = len(zC) + if apex is None: + apex = 0 + tris = hull.simplices # Use all simplices directly + + V = 0.0 + grad = np.zeros((N, 2), dtype=float) + + for (a_idx, b_idx, c_idx) in tris: + if apex in (a_idx, b_idx, c_idx): + continue + z0, z1, z2, z3 = zC[apex], zC[a_idx], zC[b_idx], zC[c_idx] + + # shape for tetra (z0,z1,z2,z3) using CR(z2,z0; z3,z1) + z = cr_axby(z2, z0, z3, z1) + + # volume via Bloch–Wigner (value only; gradient doesn't need Li2) + V += abs(bloch_wigner(z)) + + # dVol = Im( G(z)*dz ), dz = z * d log z, d log z = S_a da + S_x dx + S_b db + S_y dy + G = _G_of_shape(z) + Sa, Sx, Sb, Sy = _dlog_cr_coeffs(z2, z0, z3, z1) + + W_a = G * z * Sa + W_x = G * z * Sx + W_b = G * z * Sb + W_y = G * z * Sy + + # Pack: Im(W)*dx + Re(W)*dy for each vertex + if variable_mask is None or variable_mask[b_idx]: + grad[b_idx, 0] += W_a.imag + grad[b_idx, 1] += W_a.real + if variable_mask is None or variable_mask[apex]: + grad[apex, 0] += W_x.imag + grad[apex, 1] += W_x.real + if variable_mask is None or variable_mask[c_idx]: + grad[c_idx, 0] += W_b.imag + grad[c_idx, 1] += W_b.real + if variable_mask is None or variable_mask[a_idx]: + grad[a_idx, 0] += W_y.imag + grad[a_idx, 1] += W_y.real + + return V, grad + +def volume_and_gradient_from_pointset(points, apex_index=None, add_north_pole=False, variable_mask=None): + """ + Wrapper: planar points -> hull -> volume and gradient. 
+ """ + from scipy.spatial import ConvexHull + pts3 = np.vstack([stereographic_inverse(z) for z in points]) + if add_north_pole: + pts3 = np.vstack([pts3, np.array([[0.0, 0.0, 1.0]])]) + if variable_mask is not None: + variable_mask = variable_mask + [False] # North pole is fixed + hull = ConvexHull(pts3) + return volume_and_gradient_from_hull_simplices(pts3, hull, apex=apex_index, variable_mask=variable_mask) + +# ---------- Renderer with style knobs ---------- + +try: + import pyrender as _pyrender + _HAS_PYRENDER=True +except Exception: + _HAS_PYRENDER=False + +def _lambda_to_rgb_nm(lam_nm: float): + lam=lam_nm + if lam<380 or lam>700: return (0.0,0.0,0.0) + if lam<440: + t=(lam-380)/(440-380); r,g,b=(-t,0.0,1.0) + elif lam<490: + t=(lam-440)/(490-440); r,g,b=(0.0,t,1.0) + elif lam<510: + t=(lam-490)/(510-490); r,g,b=(0.0,1.0,1.0-t) + elif lam<580: + t=(lam-510)/(580-510); r,g,b=(t,1.0,0.0) + elif lam<645: + t=(lam-580)/(645-580); r,g,b=(1.0,1.0-t,0.0) + else: + t=(lam-645)/(700-645); r,g,b=(1.0,0.0,0.0) + if lam<420: factor=0.3+0.7*(lam-380)/(420-380) + elif lam>645: factor=0.3+0.7*(700-lam)/(700-645) + else: factor=1.0 + gamma=0.8 + def correct(c): c=max(0.0,min(1.0,c*factor)); return c**gamma + return (correct(r),correct(g),correct(b)) + +def _soap_iridescence_vertex_colors(mesh, camera_pos: np.ndarray, + n_film: float = 1.33, n_env: float = 1.0, + t0_nm: float = 500.0, t_amp_nm: float = 300.0, + swirl_scale: float = 5.0, + wavelengths_nm: Optional[List[float]] = None, + alpha: float = 0.35): + V=mesh.vertices; N=mesh.vertex_normals + if len(N)==0: + mesh.rezero(); mesh.remove_duplicate_faces(); N=mesh.vertex_normals + cam=np.asarray(camera_pos,dtype=float).reshape(3) + Vdir = cam[None,:]-V; L=np.linalg.norm(Vdir,axis=1)+1e-12; Vdir=Vdir/L[:,None] + Nn=N/(np.linalg.norm(N,axis=1)[:,None]+1e-12) + cos_theta=np.abs(np.sum(Nn*Vdir,axis=1)) + x,y,z=V[:,0],V[:,1],V[:,2] + phi=np.arctan2(y,x) + rho=np.linalg.norm(V,axis=1)+1e-12 + 
theta=np.arccos(np.clip(z/rho,-1,1)) + t_nm=t0_nm + t_amp_nm*(np.sin(swirl_scale*phi)*np.cos(0.5*swirl_scale*theta)) + lambdas=np.array(wavelengths_nm if wavelengths_nm else [450.0,530.0,610.0],dtype=float) + r12=((n_env-n_film)/(n_env+n_film))**2; r23=r12; sqrt_term=math.sqrt(max(0.0,r12*r23)) + C=np.zeros((V.shape[0],3),dtype=float) + for lam_nm in lambdas: + delta=(4.0*math.pi*n_film/lam_nm)*t_nm*cos_theta + R=r12+r23+2.0*sqrt_term*np.cos(delta); R=np.clip(R,0.0,1.0) + rgb=np.array(_lambda_to_rgb_nm(lam_nm))[None,:]; C += (R[:,None])*rgb + Cmax=np.maximum(1e-8, C.max(axis=1,keepdims=True)); C=np.clip(C/Cmax,0.0,1.0) + A=np.full((V.shape[0],1), np.clip(alpha,0.0,1.0), dtype=float) + rgba=np.concatenate([C,A],axis=1); return (rgba*255.0+0.5).astype(np.uint8) + +def _look_at(eye, target, up=np.array([0,0,1.0])): + eye=np.asarray(eye,dtype=float).reshape(3); target=np.asarray(target,dtype=float).reshape(3); up=np.asarray(up,dtype=float).reshape(3) + z=eye-target; z=z/(np.linalg.norm(z)+1e-12); x=np.cross(up,z); x=x/(np.linalg.norm(x)+1e-12); y=np.cross(z,x) + T=np.eye(4); T[:3,:3]=np.vstack([x,y,z]).T; T[:3,3]=eye; return T + +def render_snapshot(points: List[complex], + add_north_pole: bool = False, + refinement: int = 1, + model: str = "poincare", + width: int = 1280, height: int = 960, + background: Tuple[float,float,float,float] = (1.0,1.0,1.0,1.0), + camera_pos: Optional[Tuple[float,float,float]] = None, + target: Optional[Tuple[float,float,float]] = None, + transparency_alpha: float = 0.35, + light_setup: str = "three_point", + soap_iridescence: bool = True, + # style knobs + base_rgba: Tuple[float,float,float,float] = (0.6,0.8,1.0,0.35), + metallic: float = 0.05, roughness: float = 0.05, + key_intensity: float = 4.0, fill_intensity: float = 1.5, rim_intensity: float = 2.5, + ring_count: int = 8, ring_radius: float = 4.5, ring_height: float = 2.0, ring_intensity: float = 2.0, + film_n: float = 1.33, film_t0_nm: float = 500.0, film_t_amp_nm: float = 300.0, 
film_swirl_scale: float = 5.0, + film_wavelengths_nm: Optional[List[float]] = None, + outfile: str = "render.png") -> str: + if not (_HAS_TRIMESH and _HAS_PYRENDER): + raise RuntimeError("render_snapshot requires trimesh and pyrender (pip install trimesh pyrender).") + mesh = hull_to_mesh(points, add_north_pole=add_north_pole, refinement=refinement, model=model, as_trimesh=True) + if camera_pos is None: camera_pos=(2.7,2.1,1.6) + if target is None: target=(0.0,0.0,0.0) + if soap_iridescence: + vcols=_soap_iridescence_vertex_colors(mesh, np.array(camera_pos), + n_film=film_n, t0_nm=film_t0_nm, t_amp_nm=film_t_amp_nm, + swirl_scale=film_swirl_scale, wavelengths_nm=film_wavelengths_nm, + alpha=transparency_alpha) + mesh.visual.vertex_colors=vcols + material = _pyrender.MetallicRoughnessMaterial(baseColorFactor=[1.0,1.0,1.0,max(0.05,transparency_alpha)], + metallicFactor=metallic, roughnessFactor=roughness, + alphaMode="BLEND", doubleSided=True) + else: + base=np.array(base_rgba,dtype=float); base[3]=transparency_alpha + mesh.visual.vertex_colors=(base[None,:]*255.0+0.5).astype(np.uint8) + material = _pyrender.MetallicRoughnessMaterial(baseColorFactor=[float(base[0]),float(base[1]),float(base[2]),max(0.05,transparency_alpha)], + metallicFactor=metallic, roughnessFactor=roughness, + alphaMode="BLEND", doubleSided=True) + pm = _pyrender.Mesh.from_trimesh(mesh, material=material, smooth=True) + scene = _pyrender.Scene(bg_color=np.array(background,dtype=float)) + scene.add(pm) + cam=_pyrender.PerspectiveCamera(yfov=np.deg2rad(45.0)) + scene.add(cam, pose=_look_at(np.array(camera_pos), np.array(target))) + def add_three_point(scene): + key=_pyrender.DirectionalLight(color=np.ones(3), intensity=float(key_intensity)) + fill=_pyrender.DirectionalLight(color=np.ones(3), intensity=float(fill_intensity)) + rim=_pyrender.DirectionalLight(color=np.ones(3), intensity=float(rim_intensity)) + scene.add(key, pose=_look_at(np.array([3,2,2]), np.zeros(3))) + scene.add(fill, 
pose=_look_at(np.array([-2,3,1]), np.zeros(3))) + scene.add(rim, pose=_look_at(np.array([-3,-2,2.5]), np.zeros(3))) + def add_ring(scene, n=ring_count, radius=ring_radius, z=ring_height, intensity=ring_intensity): + for k in range(int(n)): + ang=2*math.pi*k/max(1,int(n)); pos=np.array([radius*math.cos(ang), radius*math.sin(ang), z]) + scene.add(_pyrender.PointLight(color=np.ones(3), intensity=float(intensity)), pose=_look_at(pos, np.zeros(3))) + if light_setup=="ring": add_ring(scene) + else: add_three_point(scene) + r=_pyrender.OffscreenRenderer(viewport_width=width, viewport_height=height) + color,_=r.render(scene, flags=_pyrender.RenderFlags.RGBA) + import imageio; imageio.imwrite(outfile, color); r.delete(); return outfile diff --git a/ideal_poly_volume_toolkit/rivin_holonomy.py b/ideal_poly_volume_toolkit/rivin_holonomy.py new file mode 100644 index 0000000000000000000000000000000000000000..e04e5863df46a276f7f6e0eace66152492fc4a37 --- /dev/null +++ b/ideal_poly_volume_toolkit/rivin_holonomy.py @@ -0,0 +1,193 @@ +""" +rivin_holonomy.py +----------------- +Reference implementation of the Penner–Rivin holonomy algorithm for an ideal triangulation +with optional shear data. Pure-Python (no external deps). + +Model: +- Triangles are labeled 0..F-1, each with sides 0,1,2 in cyclic order (counterclockwise). +- Adjacency maps each oriented side (t, s) to (u, su, eid), where u is the triangle glued across side s, + su is the side index in u, and eid is a global id for the underlying edge. +- Ori gives, for each global eid, a chosen orientation as ((t_from, s_from), (t_to, s_to)). +- Shears Z: dict eid -> real number (use 0.0 for zero-shear case). + +Outputs: +- Generators as 2x2 matrices in SL(2,R) (project to PSL(2,R)). 
+""" + +import math +from collections import deque, defaultdict +from typing import Dict, List, Tuple, Optional, Any + +def matmul(A, B): + return [[A[0][0]*B[0][0] + A[0][1]*B[1][0], A[0][0]*B[0][1] + A[0][1]*B[1][1]], + [A[1][0]*B[0][0] + A[1][1]*B[1][0], A[1][0]*B[0][1] + A[1][1]*B[1][1]]] + +def matI(): + return [[1.0, 0.0], [0.0, 1.0]] + +def matR(): + return [[1.0, 1.0], [-1.0, 0.0]] + +def matL(): + return [[0.0, -1.0], [1.0, 1.0]] + +def matX(z): + ez = math.exp(z/2.0) + return [[0.0, -ez], [1.0/ez, 0.0]] + +class Triangulation: + def __init__(self, F, adjacency, order, orientation): + """ + F: number of triangles + adjacency: dict (t, s) -> (u, su, eid) + order: dict t -> list [0,1,2] (cyclic order of sides in triangle t) + orientation: dict eid -> ((t_from, s_from), (t_to, s_to)) + """ + self.F = F + self.Adj = dict(adjacency) + self.Order = {t: list(order[t]) for t in order} + self.Ori = dict(orientation) + # basic sanity + for (t,s), (u,su,eid) in self.Adj.items(): + rt = self.Adj.get((u,su)) + assert rt is not None and rt[0]==t and rt[1]==s and rt[2]==eid, "Adjacency must be symmetric" + for t in range(F): + assert t in self.Order and len(self.Order[t])==3, "Order missing for triangle {}".format(t) + + def dual_graph(self): + """Return dual graph structure: dict t -> list of (u, s_in_t, s_in_u, eid).""" + G = defaultdict(list) + for (t,s), (u,su,eid) in self.Adj.items(): + G[t].append((u, s, su, eid)) + return G + +def bfs_spanning_tree(G, root=0): + parent = {root: None} + parent_edge = {root: None} # for v!=root: (eid, side_in_v, side_in_parent) + Q = deque([root]) + while Q: + v = Q.popleft() + for (w, sv, sw, eid) in G[v]: + if w not in parent: + parent[w] = v + parent_edge[w] = (eid, sw, sv) + Q.append(w) + return parent, parent_edge + +def path_in_tree(parent, v, u): + """Return list of vertices along the unique path v -> ... 
-> u in the tree.""" + Pv = [] + x = v + while x is not None: + Pv.append(x); x = parent[x] + Pu = [] + x = u + while x is not None: + Pu.append(x); x = parent[x] + i = len(Pv)-1; j = len(Pu)-1 + while i>=0 and j>=0 and Pv[i]==Pu[j]: + i -= 1; j -= 1 + up = Pv[:i+1] + down = list(reversed(Pu[:j+1])) + return up + [Pv[i+1]] + down + +def turn(order, side_in, side_out): + i = order.index(side_in); j = order.index(side_out) + d = (j - i) % 3 + if d == 1: return 'L' + if d == 2: return 'R' + raise ValueError("Degenerate turn inside triangle") + +def tokens_for_path(T: Triangulation, parent, parent_edge, path): + """ + Convert a path of triangles into a token list of 'L'/'R' and ('X', eid, sign). + """ + tokens = [] + side_in_prev = None + for k in range(len(path)-1): + a, b = path[k], path[k+1] + if parent[b] == a: + eid, side_in_b, side_in_a = parent_edge[b] # from a to b + elif parent[a] == b: + eid, side_in_a, side_in_b = parent_edge[a] # from a to b + else: + raise RuntimeError("Non-tree step in tokens_for_path") + if side_in_prev is not None: + tokens.append(('T', a, side_in_prev, side_in_a)) + (tf, sf), (tt, st) = T.Ori[eid] + if (tf == a and tt == b): sign = +1 + elif (tf == b and tt == a): sign = -1 + else: raise RuntimeError("Edge orientation inconsistent with step") + tokens.append(('X', eid, sign)) + side_in_prev = side_in_b + return _resolve_turns(T, tokens) + +def _resolve_turns(T: Triangulation, tokens): + resolved = [] + for tok in tokens: + if isinstance(tok, tuple) and tok and tok[0]=='T': + _, tri, s_in, s_out = tok + resolved.append(turn(T.Order[tri], s_in, s_out)) + else: + resolved.append(tok) + return resolved + +def tokens_to_matrix(tokens, Z): + M = matI() + for tok in tokens: + if tok == 'L': + M = matmul(M, matL()) + elif tok == 'R': + M = matmul(M, matR()) + else: + _, eid, sign = tok + z = Z.get(eid, 0.0) + if sign < 0: z = -z + M = matmul(M, matX(z)) + return M + +def generators_from_triangulation(T: Triangulation, Z, root=0): + """ + 
Compute a generating set for the holonomy: + one generator per non-tree dual edge. + Returns: list of (u, v, tokens, matrix) + """ + G = T.dual_graph() + parent, parent_edge = bfs_spanning_tree(G, root=root) + + tree_edges = set() + for v, p in parent.items(): + if p is None: continue + tree_edges.add(tuple(sorted((v, p)))) + + all_edges = set() + for v, lst in G.items(): + for (w, sv, sw, eid) in lst: + if v < w: + all_edges.add((v, w, eid)) + + gens = [] + for (u, v, eid) in all_edges: + if tuple(sorted((u, v))) in tree_edges: + continue + path_u = path_in_tree(parent, root, u) + path_v = path_in_tree(parent, root, v) + + tok1 = tokens_for_path(T, parent, parent_edge, path_u) # root -> u + (tf, sf), (tt, st) = T.Ori[eid] + if (tf == u and tt == v): sign = +1 + elif (tf == v and tt == u): sign = -1 + else: raise RuntimeError("Orientation mismatch on non-tree edge") + tok_mid = [('X', eid, sign)] + tok2 = tokens_for_path(T, parent, parent_edge, list(reversed(path_v))) # v -> root + + tokens = tok1 + tok_mid + tok2 + M = tokens_to_matrix(tokens, Z) + gens.append((u, v, tokens, M)) + return gens + +if __name__ == "__main__": + print("rivin_holonomy.py loaded. Define a Triangulation(F, adjacency, order, orientation),") + print("provide a shear dict Z (edge_id -> real), then call generators_from_triangulation(T, Z, root=0).") + print("Each generator entry is (u, v, tokens, 2x2 matrix).") diff --git a/ideal_poly_volume_toolkit/symmetry.py b/ideal_poly_volume_toolkit/symmetry.py new file mode 100644 index 0000000000000000000000000000000000000000..8c6e5523fc741e11c1b17b2e3ecf187b5277bd0f --- /dev/null +++ b/ideal_poly_volume_toolkit/symmetry.py @@ -0,0 +1,161 @@ +""" +Symmetry group computation for ideal polyhedra using nauty/pynauty. + +Computes the automorphism group of the 1-skeleton (graph) of the polyhedron. 
+""" + +import numpy as np +from scipy.spatial import ConvexHull + +try: + import pynauty + PYNAUTY_AVAILABLE = True +except ImportError: + PYNAUTY_AVAILABLE = False + + +def compute_symmetry_group(vertices_3d): + """ + Compute the automorphism group of a polyhedron's 1-skeleton. + + Args: + vertices_3d: N x 3 array of vertex coordinates (e.g., from lift_to_sphere_with_inf) + + Returns: + dict with keys: + - 'group_size': Total size of automorphism group + - 'num_generators': Number of group generators + - 'generators': List of permutations generating the group + - 'orbits': Orbit partition of vertices + - 'num_orbits': Number of vertex orbits + - 'group_name': Human-readable group identification (if recognized) + - 'available': Whether pynauty is available + + Raises: + ImportError: If pynauty is not installed + """ + if not PYNAUTY_AVAILABLE: + return { + 'available': False, + 'error': 'pynauty not installed. Install with: pip install pynauty' + } + + # Get convex hull to extract 1-skeleton (edge graph) + hull = ConvexHull(vertices_3d) + + # Build edge list from hull faces (simplices) + edges = set() + for simplex in hull.simplices: + # Each face is a triangle with 3 edges + v0, v1, v2 = simplex + edges.add(tuple(sorted([v0, v1]))) + edges.add(tuple(sorted([v1, v2]))) + edges.add(tuple(sorted([v2, v0]))) + + # Build adjacency list for pynauty + n_vertices = len(vertices_3d) + adjacency = {i: [] for i in range(n_vertices)} + for v1, v2 in edges: + adjacency[v1].append(v2) + adjacency[v2].append(v1) + + # Create pynauty graph + g = pynauty.Graph(number_of_vertices=n_vertices, directed=False, adjacency_dict=adjacency) + + # Compute automorphism group + # Returns: (generators, grpsize1, grpsize2, orbits, numorbits) + generators, grpsize1, grpsize2, orbits, numorbits = pynauty.autgrp(g) + + # Calculate total group size + # nauty represents size as grpsize1 × 10^grpsize2 + if isinstance(grpsize2, (int, float)): + group_size = int(grpsize1 * (10 ** grpsize2)) + else: + 
group_size = int(grpsize1) + + # Identify common symmetry groups + group_name = identify_group(group_size, n_vertices) + + return { + 'available': True, + 'group_size': group_size, + 'num_generators': len(generators), + 'generators': generators, + 'orbits': orbits, + 'num_orbits': numorbits, + 'group_name': group_name, + 'vertex_degrees': [len(adjacency[i]) for i in range(n_vertices)], + 'num_edges': len(edges), + 'num_faces': len(hull.simplices), + } + + +def identify_group(size, n_vertices=None): + """ + Identify common symmetry groups by order. + + Args: + size: Order of the group + n_vertices: Number of vertices (helps with identification) + + Returns: + Human-readable group name + """ + common_groups = { + 1: "C₁ (trivial, no symmetry)", + 2: "C₂ (single reflection/inversion)", + 3: "C₃ (3-fold rotation)", + 4: "V₄ (Klein four-group) or C₄ (4-fold rotation)", + 6: "S₃ ≅ D₃ (triangle symmetries)", + 8: "D₄ (square symmetries) or other order-8 group", + 12: "A₄ (tetrahedral rotations) or D₆ (hexagon symmetries)", + 24: "S₄ (octahedral symmetries) or other order-24 group", + 48: "Oh (full octahedral symmetries)", + 60: "A₅ (icosahedral rotations)", + 120: "Ih (full icosahedral symmetries) or S₅", + } + + if size in common_groups: + return common_groups[size] + else: + # Check for dihedral groups D_n (order 2n) + if size % 2 == 0: + n = size // 2 + return f"possibly D_{n} (dihedral) or other order-{size} group" + # Check for cyclic groups C_n + return f"possibly C_{size} (cyclic) or other order-{size} group" + + +def format_symmetry_report(sym_info): + """ + Format symmetry information as a readable string. 
+ + Args: + sym_info: Dictionary returned by compute_symmetry_group + + Returns: + Formatted string report + """ + if not sym_info.get('available', False): + return f"❌ Symmetry computation unavailable: {sym_info.get('error', 'Unknown error')}" + + report = [] + report.append("## Symmetry Group Analysis\n") + report.append(f"**Group Order:** {sym_info['group_size']}") + report.append(f"**Identification:** {sym_info['group_name']}") + report.append(f"**Generators:** {sym_info['num_generators']}") + report.append(f"**Vertex Orbits:** {sym_info['num_orbits']}") + report.append("") + report.append("**Graph Properties:**") + report.append(f"- Vertices: {len(sym_info['vertex_degrees'])}") + report.append(f"- Edges: {sym_info['num_edges']}") + report.append(f"- Faces: {sym_info['num_faces']}") + report.append(f"- Vertex degrees: {sym_info['vertex_degrees']}") + + if sym_info['num_generators'] > 0 and sym_info['num_generators'] <= 5: + report.append("") + report.append("**Generators (as vertex permutations):**") + for i, gen in enumerate(sym_info['generators'], 1): + report.append(f"- σ_{i}: {gen}") + + return "\n".join(report) diff --git a/ideal_poly_volume_toolkit/visualization.py b/ideal_poly_volume_toolkit/visualization.py new file mode 100644 index 0000000000000000000000000000000000000000..eb6a53deb7818502a98da112866ab0c428ecbc8c --- /dev/null +++ b/ideal_poly_volume_toolkit/visualization.py @@ -0,0 +1,398 @@ +""" +3D Visualization utilities for ideal polyhedra. + +Supports: +- Poincaré ball model visualization +- Sphere projection with subdivision +- Interactive plots using plotly +""" + +import numpy as np +import plotly.graph_objects as go +from scipy.spatial import ConvexHull + + +def lift_to_sphere_with_inf(W: np.ndarray) -> np.ndarray: + """ + Lift complex points to sphere via stereographic projection. 
+ + Args: + W: Complex array of points + + Returns: + N x 3 array of points on unit sphere + """ + P = np.zeros((W.shape[0], 3), dtype=np.float64) + is_inf = ~np.isfinite(W.real) | ~np.isfinite(W.imag) + F = ~is_inf + w = W[F] + r2 = (w.real**2 + w.imag**2) + denom = r2 + 1.0 + P[F, 0] = 2.0 * w.real / denom + P[F, 1] = 2.0 * w.imag / denom + P[F, 2] = (r2 - 1.0) / denom + P[is_inf] = np.array([0.0, 0.0, 1.0]) + return P + + +def subdivide_triangle_euclidean(v1, v2, v3, depth=1): + """ + Recursively subdivide a triangle using Euclidean (straight line) midpoints. + + This is used for subdividing in the Klein model (unit ball with Euclidean geometry). + + Args: + v1, v2, v3: Triangle vertices (3D points in the ball) + depth: Number of subdivision levels + + Returns: + List of subdivided triangular faces + """ + if depth == 0: + return [np.array([v1, v2, v3])] + + # Compute Euclidean midpoints (straight lines in Klein model) + m12 = (v1 + v2) / 2.0 + m23 = (v2 + v3) / 2.0 + m31 = (v3 + v1) / 2.0 + + # Recursively subdivide 4 new triangles + triangles = [] + triangles.extend(subdivide_triangle_euclidean(v1, m12, m31, depth - 1)) + triangles.extend(subdivide_triangle_euclidean(v2, m23, m12, depth - 1)) + triangles.extend(subdivide_triangle_euclidean(v3, m31, m23, depth - 1)) + triangles.extend(subdivide_triangle_euclidean(m12, m23, m31, depth - 1)) + + return triangles + + +def klein_to_poincare(K: np.ndarray) -> np.ndarray: + """ + Map Klein ball model to Poincaré ball model. + + The Klein model uses the unit ball with Euclidean (straight line) geodesics. + The Poincaré model uses the same ball with hyperbolic (curved) geodesics. 
+ + Formula: If k is a point in Klein ball with |k| < 1, then + p = k / (1 + sqrt(1 - |k|^2)) + + Args: + K: N x 3 array of points in Klein ball + + Returns: + N x 3 array of points in Poincaré ball + """ + r_squared = np.sum(K**2, axis=1) + + # Clip to avoid numerical issues near boundary + r_squared = np.clip(r_squared, 0, 0.9999) + + # Klein to Poincaré transformation + denom = 1.0 + np.sqrt(1.0 - r_squared) + + result = K / denom[:, np.newaxis] + + return result + + +def create_polyhedron_mesh(vertices_complex, subdivisions=2): + """ + Create a subdivided mesh for visualization. + + Algorithm: + 1. Lift to sphere (gives Klein model in the ball) + 2. Get convex hull faces + 3. Subdivide each face using Euclidean midpoints (Klein model) + 4. Map subdivided vertices Klein → Poincaré + + Args: + vertices_complex: Complex array of vertices + subdivisions: Number of subdivision levels + + Returns: + dict with 'klein' and 'poincare' meshes + """ + # Step 1: Lift to sphere (this gives us the Klein model in the ball) + klein_vertices = lift_to_sphere_with_inf(vertices_complex) + + # Step 2: Compute convex hull (this is the Klein model of the polyhedron) + hull = ConvexHull(klein_vertices) + + # Step 3 & 4: Subdivide each face in Klein, then map to Poincaré + subdivided_triangles_klein = [] + subdivided_triangles_poincare = [] + + for simplex in hull.simplices: + v1, v2, v3 = klein_vertices[simplex] + + # Subdivide in Klein model (Euclidean straight-line subdivision) + sub_tris_klein = subdivide_triangle_euclidean(v1, v2, v3, depth=subdivisions) + subdivided_triangles_klein.extend(sub_tris_klein) + + # Map each subdivided triangle to Poincaré ball + for tri_klein in sub_tris_klein: + tri_poincare = klein_to_poincare(tri_klein) + subdivided_triangles_poincare.append(tri_poincare) + + return { + 'klein': { + 'triangles': subdivided_triangles_klein, + 'vertices': klein_vertices, + 'original_faces': hull.simplices + }, + 'poincare': { + 'triangles': 
subdivided_triangles_poincare, + 'vertices': klein_to_poincare(klein_vertices), + 'original_faces': hull.simplices + } + } + + +def plot_polyhedron_klein(vertices_complex, subdivisions=2, title="Ideal Polyhedron (Klein Model)"): + """ + Create interactive 3D plot of polyhedron in Klein ball model. + + Args: + vertices_complex: Complex array of vertices + subdivisions: Number of subdivision levels + title: Plot title + + Returns: + plotly Figure object + """ + mesh = create_polyhedron_mesh(vertices_complex, subdivisions) + triangles = mesh['klein']['triangles'] + + # Collect all vertices and triangle indices for Mesh3d + vertices_list = [] + indices_i, indices_j, indices_k = [], [], [] + vertex_map = {} + + for tri in triangles: + tri_indices = [] + for i in range(3): + vertex_tuple = tuple(tri[i]) + if vertex_tuple not in vertex_map: + vertex_map[vertex_tuple] = len(vertices_list) + vertices_list.append(tri[i]) + tri_indices.append(vertex_map[vertex_tuple]) + + # Add triangle indices + indices_i.append(tri_indices[0]) + indices_j.append(tri_indices[1]) + indices_k.append(tri_indices[2]) + + vertices_array = np.array(vertices_list) + + # Create figure + fig = go.Figure() + + # Add polyhedron as a mesh surface + fig.add_trace(go.Mesh3d( + x=vertices_array[:, 0], + y=vertices_array[:, 1], + z=vertices_array[:, 2], + i=indices_i, + j=indices_j, + k=indices_k, + color='lightblue', + opacity=0.7, + flatshading=False, + name='Polyhedron', + hoverinfo='skip' + )) + + # Add vertices + vertices = mesh['klein']['vertices'] + fig.add_trace(go.Scatter3d( + x=vertices[:, 0], y=vertices[:, 1], z=vertices[:, 2], + mode='markers', + marker=dict(size=8, color='red'), + name='Vertices', + hovertext=[f'Vertex {i}' for i in range(len(vertices))] + )) + + # Add transparent ball for reference + u = np.linspace(0, 2 * np.pi, 30) + v = np.linspace(0, np.pi, 20) + x_ball = np.outer(np.cos(u), np.sin(v)) + y_ball = np.outer(np.sin(u), np.sin(v)) + z_ball = np.outer(np.ones(np.size(u)), 
np.cos(v)) + + fig.add_trace(go.Surface( + x=x_ball, y=y_ball, z=z_ball, + opacity=0.1, + colorscale=[[0, 'lightgray'], [1, 'lightgray']], + showscale=False, + name='Unit Ball', + hoverinfo='skip' + )) + + # Layout + fig.update_layout( + title=title, + scene=dict( + xaxis=dict(range=[-1.2, 1.2], title='X'), + yaxis=dict(range=[-1.2, 1.2], title='Y'), + zaxis=dict(range=[-1.2, 1.2], title='Z'), + aspectmode='cube' + ), + showlegend=True, + width=800, + height=800 + ) + + return fig + + +def plot_polyhedron_poincare(vertices_complex, subdivisions=2, title="Ideal Polyhedron (Poincaré Ball)"): + """ + Create interactive 3D plot of polyhedron in Poincaré ball model. + + Args: + vertices_complex: Complex array of vertices + subdivisions: Number of subdivision levels + title: Plot title + + Returns: + plotly Figure object + """ + mesh = create_polyhedron_mesh(vertices_complex, subdivisions) + triangles = mesh['poincare']['triangles'] + + # Collect all vertices and triangle indices for Mesh3d + vertices_list = [] + indices_i, indices_j, indices_k = [], [], [] + vertex_map = {} + + for tri in triangles: + tri_indices = [] + for i in range(3): + vertex_tuple = tuple(tri[i]) + if vertex_tuple not in vertex_map: + vertex_map[vertex_tuple] = len(vertices_list) + vertices_list.append(tri[i]) + tri_indices.append(vertex_map[vertex_tuple]) + + # Add triangle indices + indices_i.append(tri_indices[0]) + indices_j.append(tri_indices[1]) + indices_k.append(tri_indices[2]) + + vertices_array = np.array(vertices_list) + + # Create figure + fig = go.Figure() + + # Add polyhedron as a mesh surface + fig.add_trace(go.Mesh3d( + x=vertices_array[:, 0], + y=vertices_array[:, 1], + z=vertices_array[:, 2], + i=indices_i, + j=indices_j, + k=indices_k, + color='lightblue', + opacity=0.7, + flatshading=False, + name='Polyhedron', + hoverinfo='skip' + )) + + # Add vertices + vertices = mesh['poincare']['vertices'] + fig.add_trace(go.Scatter3d( + x=vertices[:, 0], y=vertices[:, 1], z=vertices[:, 
2], + mode='markers', + marker=dict(size=8, color='red'), + name='Vertices', + hovertext=[f'Vertex {i}' for i in range(len(vertices))] + )) + + # Add unit sphere boundary + u = np.linspace(0, 2 * np.pi, 30) + v = np.linspace(0, np.pi, 20) + x_sphere = np.outer(np.cos(u), np.sin(v)) + y_sphere = np.outer(np.sin(u), np.sin(v)) + z_sphere = np.outer(np.ones(np.size(u)), np.cos(v)) + + fig.add_trace(go.Surface( + x=x_sphere, y=y_sphere, z=z_sphere, + opacity=0.1, + colorscale=[[0, 'lightgray'], [1, 'lightgray']], + showscale=False, + name='Unit Ball', + hoverinfo='skip' + )) + + # Layout + fig.update_layout( + title=title, + scene=dict( + xaxis=dict(range=[-1.2, 1.2], title='X'), + yaxis=dict(range=[-1.2, 1.2], title='Y'), + zaxis=dict(range=[-1.2, 1.2], title='Z'), + aspectmode='cube' + ), + showlegend=True, + width=800, + height=800 + ) + + return fig + + +def plot_delaunay_2d(vertices_complex, triangulation_indices, title="Delaunay Triangulation"): + """ + Create 2D plot of Delaunay triangulation in complex plane. 
+ + Args: + vertices_complex: Complex array of vertices + triangulation_indices: Array of triangle indices + title: Plot title + + Returns: + plotly Figure object + """ + fig = go.Figure() + + # Plot triangulation edges + for tri in triangulation_indices: + i, j, k = tri + vertices_tri = vertices_complex[[i, j, k, i]] # Close the triangle + + fig.add_trace(go.Scatter( + x=vertices_tri.real, + y=vertices_tri.imag, + mode='lines', + line=dict(color='blue', width=1), + showlegend=False, + hoverinfo='skip' + )) + + # Plot vertices + fig.add_trace(go.Scatter( + x=vertices_complex.real, + y=vertices_complex.imag, + mode='markers+text', + marker=dict(size=10, color='red'), + text=[f'{i}' for i in range(len(vertices_complex))], + textposition='top center', + name='Vertices', + hovertext=[f'Vertex {i}: {z:.3f}' for i, z in enumerate(vertices_complex)] + )) + + # Layout + fig.update_layout( + title=title, + xaxis_title='Real', + yaxis_title='Imaginary', + width=700, + height=700, + showlegend=True, + hovermode='closest' + ) + + fig.update_xaxes(scaleanchor="y", scaleratio=1) + + return fig diff --git a/pyproject.toml b/pyproject.toml index 7231d0137f46352e37e2e8e336e8d82a9f4a04d8..ea1b813d07caa12976406bb0e8abf86f54fe471a 100644 --- a/pyproject.toml +++ b/pyproject.toml @@ -8,4 +8,9 @@ version = "0.3.0" description = "Ideal hyperbolic polyhedron volume: Delaunay + hull-project, fast+exact Lobachevsky, LBFGS examples." 
 readme = "README.md"
 requires-python = ">=3.9"
-dependencies = ["numpy","scipy","mpmath","torch"]
+dependencies = ["numpy","scipy","mpmath","torch","matplotlib","gradio","trimesh","plotly","pynauty"]
+
+[tool.setuptools.packages.find]
+where = ["."]
+include = ["ideal_poly_volume_toolkit*"]
+exclude = ["examples*", "results*", "scripts*", "docs*"]
diff --git a/results/README.md b/results/README.md
new file mode 100644
index 0000000000000000000000000000000000000000..ba3943812b19e152512b982c4ded334303d5fe3e
--- /dev/null
+++ b/results/README.md
@@ -0,0 +1,24 @@
+# Results Directory
+
+This directory contains all output files generated by the toolkit.
+
+## Structure
+
+- `plots/` - PNG visualization outputs (23 files)
+  - Volume distribution plots
+  - Configuration visualizations
+  - Comparative analysis plots
+
+- `data/` - JSON configuration files (6 files)
+  - Optimal configurations
+  - Local maxima data
+  - Analysis results
+
+- `logs/` - Optimization and execution logs (4 files)
+  - 7-vertex optimization logs
+  - 20-vertex optimization logs
+  - Analysis output logs
+
+## Note
+
+These files are generated outputs and may be regenerated by running the corresponding scripts in the `examples/` directory.
diff --git a/20vertex_arithmetic_analysis.json b/results/data/20vertex_arithmetic_analysis.json
similarity index 100%
rename from 20vertex_arithmetic_analysis.json
rename to results/data/20vertex_arithmetic_analysis.json
diff --git a/20vertex_local_maxima.json b/results/data/20vertex_local_maxima.json
similarity index 100%
rename from 20vertex_local_maxima.json
rename to results/data/20vertex_local_maxima.json
diff --git a/20vertex_maximal_final.json b/results/data/20vertex_maximal_final.json
similarity index 100%
rename from 20vertex_maximal_final.json
rename to results/data/20vertex_maximal_final.json
diff --git a/20vertex_maximal_intermediate.json b/results/data/20vertex_maximal_intermediate.json
similarity index 100%
rename from 20vertex_maximal_intermediate.json
rename to results/data/20vertex_maximal_intermediate.json
diff --git a/results/data/5vertex_optimization_20251026_184501.json b/results/data/5vertex_optimization_20251026_184501.json
new file mode 100644
index 0000000000000000000000000000000000000000..54b0127c9121f056993e018ebf8832d033e67f29
--- /dev/null
+++ b/results/data/5vertex_optimization_20251026_184501.json
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:3af6db33382263fc128390706f70d1320123125bd338a15fa2aebf5338a126a4
+size 1865
diff --git a/ideal_poly_volume_toolkit_autograd_patch.zip b/results/data/ideal_poly_volume_toolkit_autograd_patch.zip
similarity index 100%
rename from ideal_poly_volume_toolkit_autograd_patch.zip
rename to results/data/ideal_poly_volume_toolkit_autograd_patch.zip
diff --git a/optimal_configurations.json b/results/data/optimal_configurations.json
similarity index 100%
rename from optimal_configurations.json
rename to results/data/optimal_configurations.json
diff --git a/volume_distribution_summary.json b/results/data/volume_distribution_summary.json
similarity index 100%
rename from volume_distribution_summary.json
rename to results/data/volume_distribution_summary.json
diff --git a/20vertex_local_maxima_log.txt b/results/logs/20vertex_local_maxima_log.txt
similarity index 100%
rename from 20vertex_local_maxima_log.txt
rename to results/logs/20vertex_local_maxima_log.txt
diff --git a/20vertex_log.txt b/results/logs/20vertex_log.txt
similarity index 100%
rename from 20vertex_log.txt
rename to results/logs/20vertex_log.txt
diff --git a/20vertex_scipy_log.txt b/results/logs/20vertex_scipy_log.txt
similarity index 100%
rename from 20vertex_scipy_log.txt
rename to results/logs/20vertex_scipy_log.txt
diff --git a/PATCH_geometry.py.txt b/results/logs/PATCH_geometry.py.txt
similarity index 100%
rename from PATCH_geometry.py.txt
rename to results/logs/PATCH_geometry.py.txt
diff --git a/results/plots/4vertex_distribution_20251026_184627.png b/results/plots/4vertex_distribution_20251026_184627.png
new file mode 100644
index 0000000000000000000000000000000000000000..25417e67c9297b87958fed18e99b780ddaad98da
--- /dev/null
+++ b/results/plots/4vertex_distribution_20251026_184627.png
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:5e3e4f1c0de331034972e8f127d34f4f461d78d96f14f3a8ad3d22e23acec1eb
+size 67857
diff --git a/7vertex_triangulation.png b/results/plots/7vertex_triangulation.png
similarity index 100%
rename from 7vertex_triangulation.png
rename to results/plots/7vertex_triangulation.png
diff --git a/analytical_challenge.png b/results/plots/analytical_challenge.png
similarity index 100%
rename from analytical_challenge.png
rename to results/plots/analytical_challenge.png
diff --git a/beta_distribution_theory.png b/results/plots/beta_distribution_theory.png
similarity index 100%
rename from beta_distribution_theory.png
rename to results/plots/beta_distribution_theory.png
diff --git a/beta_fit_analysis.png b/results/plots/beta_fit_analysis.png
similarity index 100%
rename from beta_fit_analysis.png
rename to results/plots/beta_fit_analysis.png
diff --git a/clt_analysis.png b/results/plots/clt_analysis.png
similarity index 100%
rename from clt_analysis.png
rename to results/plots/clt_analysis.png
diff --git a/combinatorial_mixture_analysis.png b/results/plots/combinatorial_mixture_analysis.png
similarity index 100%
rename from combinatorial_mixture_analysis.png
rename to results/plots/combinatorial_mixture_analysis.png
diff --git a/concentration_location_analysis.png b/results/plots/concentration_location_analysis.png
similarity index 100%
rename from concentration_location_analysis.png
rename to results/plots/concentration_location_analysis.png
diff --git a/debug_six_vertex.png b/results/plots/debug_six_vertex.png
similarity index 100%
rename from debug_six_vertex.png
rename to results/plots/debug_six_vertex.png
diff --git a/dodecahedron_perturbation.png b/results/plots/dodecahedron_perturbation.png
similarity index 100%
rename from dodecahedron_perturbation.png
rename to results/plots/dodecahedron_perturbation.png
diff --git a/euclidean_best_fits.png b/results/plots/euclidean_best_fits.png
similarity index 100%
rename from euclidean_best_fits.png
rename to results/plots/euclidean_best_fits.png
diff --git a/euclidean_fit_diagnostics.png b/results/plots/euclidean_fit_diagnostics.png
similarity index 100%
rename from euclidean_fit_diagnostics.png
rename to results/plots/euclidean_fit_diagnostics.png
diff --git a/euclidean_qq_plots.png b/results/plots/euclidean_qq_plots.png
similarity index 100%
rename from euclidean_qq_plots.png
rename to results/plots/euclidean_qq_plots.png
diff --git a/euclidean_tetrahedron_distribution.png b/results/plots/euclidean_tetrahedron_distribution.png
similarity index 100%
rename from euclidean_tetrahedron_distribution.png
rename to results/plots/euclidean_tetrahedron_distribution.png
diff --git a/five_vertex_distribution.png b/results/plots/five_vertex_distribution.png
similarity index 100%
rename from five_vertex_distribution.png
rename to results/plots/five_vertex_distribution.png
diff --git a/golden_config_plane.png b/results/plots/golden_config_plane.png
similarity index 100%
rename from golden_config_plane.png
rename to results/plots/golden_config_plane.png
diff --git a/golden_config_sphere.png b/results/plots/golden_config_sphere.png
similarity index 100%
rename from golden_config_sphere.png
rename to results/plots/golden_config_sphere.png
diff --git a/maximal_12vertex_plane.png b/results/plots/maximal_12vertex_plane.png
similarity index 100%
rename from maximal_12vertex_plane.png
rename to results/plots/maximal_12vertex_plane.png
diff --git a/maximal_config_plane.png b/results/plots/maximal_config_plane.png
similarity index 100%
rename from maximal_config_plane.png
rename to results/plots/maximal_config_plane.png
diff --git a/six_vertex_distribution.png b/results/plots/six_vertex_distribution.png
similarity index 100%
rename from six_vertex_distribution.png
rename to results/plots/six_vertex_distribution.png
diff --git a/tetrahedron_distribution_analysis.png b/results/plots/tetrahedron_distribution_analysis.png
similarity index 100%
rename from tetrahedron_distribution_analysis.png
rename to results/plots/tetrahedron_distribution_analysis.png
diff --git a/ideal_poly_volume_toolkit/examples/tetrahedron_volume_distribution.png b/results/plots/tetrahedron_volume_distribution.png
similarity index 100%
rename from ideal_poly_volume_toolkit/examples/tetrahedron_volume_distribution.png
rename to results/plots/tetrahedron_volume_distribution.png
diff --git a/ideal_poly_volume_toolkit/examples/tetrahedron_volume_histogram.png b/results/plots/tetrahedron_volume_histogram.png
similarity index 100%
rename from ideal_poly_volume_toolkit/examples/tetrahedron_volume_histogram.png
rename to results/plots/tetrahedron_volume_histogram.png
diff --git a/volume_distributions_summary.png b/results/plots/volume_distributions_summary.png
similarity index 100%
rename from volume_distributions_summary.png
rename to results/plots/volume_distributions_summary.png
diff --git a/ideal_poly_volume_toolkit/examples/debug_gradient_sign.py b/scripts/debug_gradient_sign.py
similarity index 100%
rename from ideal_poly_volume_toolkit/examples/debug_gradient_sign.py
rename to scripts/debug_gradient_sign.py
diff --git a/ideal_poly_volume_toolkit/examples/debug_original.py b/scripts/debug_original.py
similarity index 100%
rename from ideal_poly_volume_toolkit/examples/debug_original.py
rename to scripts/debug_original.py
diff --git a/ideal_poly_volume_toolkit/examples/test_from_bad_init.py b/scripts/test_from_bad_init.py
similarity index 100%
rename from ideal_poly_volume_toolkit/examples/test_from_bad_init.py
rename to scripts/test_from_bad_init.py
diff --git a/ideal_poly_volume_toolkit/examples/test_gradients.py b/scripts/test_gradients.py
similarity index 100%
rename from ideal_poly_volume_toolkit/examples/test_gradients.py
rename to scripts/test_gradients.py
diff --git a/ideal_poly_volume_toolkit/examples/test_lbfgs_issue.py b/scripts/test_lbfgs_issue.py
similarity index 100%
rename from ideal_poly_volume_toolkit/examples/test_lbfgs_issue.py
rename to scripts/test_lbfgs_issue.py
diff --git a/ideal_poly_volume_toolkit/examples/verify_lob_derivative.py b/scripts/verify_lob_derivative.py
similarity index 100%
rename from ideal_poly_volume_toolkit/examples/verify_lob_derivative.py
rename to scripts/verify_lob_derivative.py
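For reviewers, a minimal sketch of how the `plot_delaunay_2d` helper added in this diff is driven. The random point set and the scipy Delaunay step are illustrative assumptions; only the function name and argument shapes come from the diff itself:

```python
# Sketch: preparing inputs in the shape plot_delaunay_2d expects.
# The point set and the scipy triangulation are assumptions for illustration.
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(0)
pts = rng.standard_normal((6, 2))                # six random planar points
vertices_complex = pts[:, 0] + 1j * pts[:, 1]    # complex-plane representation
triangulation_indices = Delaunay(pts).simplices  # (n_triangles, 3) index array

# With the package installed (pip install -e .), the figure would be built as:
# from ideal_poly_volume_toolkit.visualization import plot_delaunay_2d
# fig = plot_delaunay_2d(vertices_complex, triangulation_indices)
# fig.show()
print(triangulation_indices.shape[1])  # 3 vertex indices per triangle
```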