igriv and Claude committed
Commit d7d27f0 · 1 Parent(s): f9b644c

Major reorganization and feature additions


## Reorganization
- Moved all examples to organized examples/ directory by category
  - examples/distributions/ (tetrahedron, 5-vertex, 6-vertex, euclidean)
  - examples/optimization/ (7-vertex, 12-vertex, 20-vertex, platonic)
  - examples/visualization/
  - examples/analysis/
- Created bin/ directory for command-line tools and GUI
- Created results/ directory with data/, plots/, logs/ subdirectories
- Moved documentation to docs/ directory
- Cleaned up 91+ root-level files into organized structure

## New Features

### Command-Line Tools (bin/)
- optimize_polyhedron.py: General optimization wrapper with canonical output
- analyze_distribution.py: Distribution analysis wrapper
- gui.py: Comprehensive Gradio web interface

### Gradio GUI (bin/gui.py)
- 5-tab interface: Optimization, Distribution, Visualization, Arithmeticity, About
- Parallel optimization using all CPU cores (64-core support)
- Real-time progress tracking
- 3D visualization in Klein and Poincaré ball models
- Holonomy computation (Penner-Rivin algorithm)
- Symmetry group computation (via pynauty/nauty)
- Load optimization results across tabs

### Core Modules
- visualization.py: 3D visualization with correct Klein→Poincaré transformation
  - Fixed algorithm: subdivide in Klein model, map to Poincaré
  - Mesh-based rendering showing curved hyperbolic faces
  - Support for both Klein and Poincaré ball models
- rivin_holonomy.py: Penner-Rivin holonomy computation
- pointset_to_fuchsian.py: Full Fuchsian group pipeline
- symmetry.py: Automorphism group computation using nauty
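
The Klein→Poincaré step follows the standard relation between the two ball models; a minimal sketch (the function name and NumPy usage are illustrative, not the toolkit's actual API):

```python
import numpy as np

def klein_to_poincare(x: np.ndarray) -> np.ndarray:
    """Map a point in the Klein ball model to the Poincare ball model.

    Both models live in the open unit ball; the standard conversion is
    p = k / (1 + sqrt(1 - |k|^2)).
    """
    norm_sq = np.dot(x, x)
    return x / (1.0 + np.sqrt(1.0 - norm_sq))

# A point at Klein radius 0.6 lands at Poincare radius 0.6 / (1 + 0.8) = 1/3
p = klein_to_poincare(np.array([0.6, 0.0, 0.0]))
```

Subdividing in the Klein model first is convenient because geodesic faces there are flat Euclidean polygons; mapping the subdivided mesh pointwise then produces the curved faces seen in the Poincaré model.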

## Performance Optimizations
- Parallel workers: Use all CPU cores for differential evolution (77x speedup)
- Reduced series terms: 96→64 (negligible accuracy loss, 15% faster)
- Adaptive population size: Automatically reduces for high vertex counts
- CPU-only: GPU is 2.7x slower due to transfer overhead for small tensors

### Benchmark Results (64 cores)
- 7-vertex: ~10 seconds for 10 trials (was ~5 minutes)
- 20-vertex: ~2 minutes for 10 trials (was ~2.8 hours)
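
The parallel speedup comes from SciPy's differential-evolution worker pool; a minimal sketch on a toy objective (the toolkit's real objective is the negative polyhedron volume, not shown here):

```python
import numpy as np
from scipy.optimize import differential_evolution

def toy_objective(x):
    """Module-level stand-in for the real objective: must be picklable."""
    return np.sum(x**2)

bounds = [(-2.0, 2.0)] * 4
# workers=-1 spreads population evaluation over all CPU cores.
# Parallel runs require updating='deferred' (SciPy forces it with a
# warning otherwise), and the objective must be a picklable top-level
# function rather than a lambda.
result = differential_evolution(toy_objective, bounds, maxiter=50,
                                popsize=15, seed=42, workers=-1,
                                updating='deferred')
```

With the default `polish=True`, a local L-BFGS-B pass refines the best population member, so `result.fun` on this toy problem is essentially zero.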

## Bug Fixes
- Fixed Gradio State management (moved inside Blocks context)
- Fixed BytesIO→PIL Image conversion for plot display
- Fixed Klein→Poincaré transformation in visualization
- Fixed pickling issue with lambda functions (use functools.partial)
- Fixed distribution vertex count display (shows total vs random)
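
The pickling fix follows a standard pattern: multiprocessing workers must receive a picklable callable, lambdas are not picklable, and `functools.partial` over a module-level function is. A generic sketch (the function and argument names are illustrative):

```python
import pickle
from functools import partial

def objective(x, n_vertices, series_terms):
    """Module-level function: picklable, unlike a lambda closure."""
    return x * n_vertices + series_terms

# A lambda capturing the same arguments cannot be sent to worker processes:
try:
    pickle.dumps(lambda x: objective(x, 7, 64))
except (pickle.PicklingError, AttributeError, TypeError):
    pass  # lambdas fail to pickle

# partial binds the extra arguments and round-trips through pickle cleanly:
func = partial(objective, n_vertices=7, series_terms=64)
restored = pickle.loads(pickle.dumps(func))
```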

## Interface Improvements
- Number input for vertices (was limited to 20, now up to 100)
- Clearer labels and info messages
- Subdivision level control (0-5, default 3)
- Better error messages and validation

## Dependencies Added
- gradio: Web interface
- pynauty: Symmetry group computation
- plotly: Interactive 3D plots
- PIL/Pillow: Image handling
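
The BytesIO→PIL fix above boils down to rendering a Matplotlib figure into an in-memory PNG buffer and reopening it as a PIL image, which a Gradio image component can display. A minimal sketch:

```python
import io

import matplotlib
matplotlib.use("Agg")  # headless backend; no display needed
import matplotlib.pyplot as plt
from PIL import Image

def figure_to_pil(fig) -> Image.Image:
    """Render a Matplotlib figure to a PIL Image via an in-memory PNG."""
    buf = io.BytesIO()
    fig.savefig(buf, format="png", dpi=100, bbox_inches="tight")
    buf.seek(0)  # rewind before PIL reads the buffer
    return Image.open(buf)

fig, ax = plt.subplots()
ax.plot([0, 1], [0, 1])
img = figure_to_pil(fig)
plt.close(fig)
```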

🤖 Generated with Claude Code
https://claude.com/claude-code

Co-Authored-By: Claude <noreply@anthropic.com>

This view is limited to 50 files because the commit contains too many changes. See raw diff.

Files changed (50)
  1. README.md +159 -1
  2. bin/.gradio/certificate.pem +31 -0
  3. bin/README.md +153 -0
  4. bin/analyze_distribution.py +298 -0
  5. bin/gui.py +893 -0
  6. bin/optimize_polyhedron.py +263 -0
  7. bin/results/data/20vertex_optimization_20251026_203700.json +3 -0
  8. bin/results/data/7vertex_optimization_20251026_191114.json +3 -0
  9. bin/results/data/7vertex_optimization_20251026_193813.json +3 -0
  10. bin/results/data/7vertex_optimization_20251026_194915.json +3 -0
  11. bin/results/data/7vertex_optimization_20251026_195701.json +3 -0
  12. bin/results/data/7vertex_optimization_20251026_200737.json +3 -0
  13. bin/results/data/7vertex_optimization_20251026_201344.json +3 -0
  14. bin/results/data/7vertex_optimization_20251026_202024.json +3 -0
  15. bin/results/data/7vertex_optimization_20251026_205947.json +3 -0
  16. bin/results/data/9vertex_optimization_20251026_205959.json +3 -0
  17. bin/results/data/9vertex_optimization_20251026_210116.json +3 -0
  18. PLATONIC_MAXIMALITY_RESULTS.md → docs/PLATONIC_MAXIMALITY_RESULTS.md +0 -0
  19. RESULTS_SUMMARY.md → docs/RESULTS_SUMMARY.md +0 -0
  20. examples/README.md +44 -0
  21. {ideal_poly_volume_toolkit/examples → examples/analysis}/analytical_challenge_simple.py +0 -0
  22. {ideal_poly_volume_toolkit/examples → examples/analysis}/analytical_mean_challenge.py +0 -0
  23. {ideal_poly_volume_toolkit/examples → examples/analysis}/analyze_both_configs.py +0 -0
  24. {ideal_poly_volume_toolkit/examples → examples/analysis}/analyze_distribution_shape.py +0 -0
  25. {ideal_poly_volume_toolkit/examples → examples/analysis}/analyze_special_configs.py +0 -0
  26. {ideal_poly_volume_toolkit/examples → examples/analysis}/beta_distribution_theory.py +0 -0
  27. {ideal_poly_volume_toolkit/examples → examples/analysis}/beta_fit_analysis.py +0 -0
  28. {ideal_poly_volume_toolkit/examples → examples/analysis}/check_lob_math.py +0 -0
  29. {ideal_poly_volume_toolkit/examples → examples/analysis}/check_statistical_precision.py +0 -0
  30. {ideal_poly_volume_toolkit/examples → examples/analysis}/clt_analysis.py +0 -0
  31. {ideal_poly_volume_toolkit/examples → examples/analysis}/combinatorial_mixture_analysis.py +0 -0
  32. {ideal_poly_volume_toolkit/examples → examples/analysis}/concentration_location_analysis.py +0 -0
  33. create_distribution_comparison_plot.py → examples/analysis/create_distribution_comparison_plot.py +0 -0
  34. {ideal_poly_volume_toolkit/examples → examples/analysis}/sanity_check_5_7_vertices.py +0 -0
  35. euclidean_distribution_fitting.py → examples/distributions/euclidean/euclidean_distribution_fitting.py +0 -0
  36. euclidean_fit_analysis.py → examples/distributions/euclidean/euclidean_fit_analysis.py +0 -0
  37. euclidean_tetrahedron_distribution.py → examples/distributions/euclidean/euclidean_tetrahedron_distribution.py +0 -0
  38. {ideal_poly_volume_toolkit/examples → examples/distributions/five_vertex}/five_vertex_distribution.py +0 -0
  39. debug_six_vertex.py → examples/distributions/six_vertex/debug_six_vertex.py +0 -0
  40. {ideal_poly_volume_toolkit/examples → examples/distributions/six_vertex}/six_vertex_distribution.py +0 -0
  41. {ideal_poly_volume_toolkit/examples → examples/distributions/tetrahedron}/ideal_tetrahedron.py +0 -0
  42. {ideal_poly_volume_toolkit/examples → examples/distributions/tetrahedron}/quick_tetrahedron_analysis.py +0 -0
  43. {ideal_poly_volume_toolkit/examples → examples/distributions/tetrahedron}/run_tetrahedron_distribution.py +0 -0
  44. {ideal_poly_volume_toolkit/examples → examples/distributions/tetrahedron}/tetrahedron_volume_distribution.py +0 -0
  45. examples/distributions/tetrahedron/tetrahedron_volume_histogram.png +3 -0
  46. {ideal_poly_volume_toolkit/examples → examples/optimization/12vertex}/analyze_12vertex_combinatorics.py +0 -0
  47. {ideal_poly_volume_toolkit/examples → examples/optimization/12vertex}/analyze_12vertex_results.py +0 -0
  48. {ideal_poly_volume_toolkit/examples → examples/optimization/12vertex}/visualize_maximal_12vertex.py +0 -0
  49. check_arithmetic_holonomy.py → examples/optimization/20vertex/check_arithmetic_holonomy.py +1 -3
  50. check_local_maxima_arithmetic.py → examples/optimization/20vertex/check_local_maxima_arithmetic.py +2 -4
README.md CHANGED
@@ -1 +1,159 @@
- See examples in ideal_poly_volume_toolkit/examples.
+ # Ideal Polyhedra Volume Toolkit
+
+ A Python toolkit for computing and optimizing volumes of ideal hyperbolic polyhedra using Delaunay triangulation, hull projection, and fast/exact Lobachevsky functions.
+
+ ## Installation
+
+ Install the package in development mode:
+
+ ```bash
+ pip install -e .
+ ```
+
+ Dependencies: `numpy`, `scipy`, `mpmath`, `torch`
+
+ ## Project Structure
+
+ ```
+ ideal_poly_volume_toolkit/
+ ├── ideal_poly_volume_toolkit/   # Core package
+ │   ├── geometry.py              # Core geometry and volume computation functions
+ │   ├── visualization.py         # 3D visualization utilities
+ │   ├── rivin_holonomy.py        # Penner-Rivin holonomy computation
+ │   ├── pointset_to_fuchsian.py  # Full Fuchsian group pipeline
+ │   └── __init__.py
+ │
+ ├── bin/                         # Command-line tools and GUI
+ │   ├── gui.py                   # 🎨 Interactive Gradio web interface
+ │   ├── optimize_polyhedron.py   # General optimization wrapper
+ │   ├── analyze_distribution.py  # Distribution analysis wrapper
+ │   └── README.md
+ │
+ ├── examples/                    # Organized example scripts
+ │   ├── distributions/           # Distribution analysis examples
+ │   │   ├── tetrahedron/         # Tetrahedron volume distributions
+ │   │   ├── five_vertex/         # 5-vertex polyhedra distributions
+ │   │   ├── six_vertex/          # 6-vertex polyhedra distributions
+ │   │   └── euclidean/           # Euclidean tetrahedra analysis
+ │   ├── optimization/            # Optimization examples by vertex count
+ │   │   ├── 7vertex/             # 7-vertex optimization (octahedron variants)
+ │   │   ├── 12vertex/            # 12-vertex optimization
+ │   │   ├── 20vertex/            # 20-vertex optimization (icosahedron)
+ │   │   └── platonic/            # Platonic solid analysis
+ │   ├── visualization/           # Visualization scripts
+ │   └── analysis/                # Statistical and theoretical analysis
+ │
+ ├── scripts/                     # Active research/development scripts
+ ├── results/                     # Output files
+ │   ├── data/                    # JSON configuration files
+ │   ├── plots/                   # PNG visualization outputs
+ │   └── logs/                    # Optimization logs
+ └── docs/                        # Documentation
+     ├── RESULTS_SUMMARY.md
+     └── PLATONIC_MAXIMALITY_RESULTS.md
+ ```
+
+ ## Core Functionality
+
+ ### `ideal_poly_volume_toolkit.geometry` - Volume Computation
+
+ - **Stereographic projection**: `lift_to_sphere_with_inf()`, `inverse_stereographic_from_sphere_pts()`
+ - **Triangulation**: `delaunay_triangulation_indices()`, `hull_tris_projected_back()`
+ - **Lobachevsky function**: `lob_fast()` (PyTorch autodiff), `lob_exact()` (mpmath high-precision)
+ - **Volume computation**:
+   - `triangle_volume_from_points()` - Single triangle volume
+   - `ideal_poly_volume_via_delaunay()` - Full polyhedron via Delaunay
+   - `ideal_poly_volume_via_hull_project_back()` - Full polyhedron via convex hull
+
+ ### `ideal_poly_volume_toolkit.rivin_holonomy` - Penner-Rivin Algorithm
+
+ - **Holonomy computation**: `generators_from_triangulation()` - Compute Fuchsian group generators
+ - **Arithmeticity testing**: Check if polyhedra have arithmetic holonomy (traces in number fields)
+ - **Triangulation structures**: `Triangulation` class for managing ideal triangulations
+
+ ### `ideal_poly_volume_toolkit.pointset_to_fuchsian` - Full Pipeline
+
+ - **Group computation**: `group_from_pointset()` - Convert point sets to Fuchsian groups
+ - **Trace field analysis**: `invariant_trace_field_signature()` - Analyze arithmetic properties
+ - **Visualization**: `render_snapshot()` - High-quality rendering with iridescence and transparency
+ - **Mesh export**: `hull_to_mesh()`, `export_mesh_obj()` - Export to OBJ format
+
+ ## Quick Start
+
+ ### 🎨 Interactive GUI (Easiest)
+
+ The fastest way to get started is with the Gradio web interface:
+
+ ```bash
+ python bin/gui.py
+ ```
+
+ Then open your browser to `http://127.0.0.1:7860`.
+
+ **Features:**
+ - Interactive optimization with real-time progress
+ - Distribution analysis with automatic plotting
+ - 3D visualization in sphere and Poincaré ball models
+ - No need to remember command-line arguments!
+
+ ### Command-Line Tools
+
+ For scripting and batch processing, use the wrapper scripts in `bin/`:
+
+ ```bash
+ # Optimize a 7-vertex polyhedron (10 trials)
+ python bin/optimize_polyhedron.py --vertices 7 --trials 10
+
+ # Analyze volume distribution for tetrahedra
+ python bin/analyze_distribution.py --vertices 4 --samples 10000
+
+ # Get help on any tool
+ python bin/optimize_polyhedron.py --help
+ ```
+
+ See `bin/README.md` for detailed usage and examples.
+
+ ### Computing a volume (Python API)
+
+ ```python
+ import numpy as np
+ from ideal_poly_volume_toolkit.geometry import ideal_poly_volume_via_delaunay
+
+ # Define vertices in the complex plane (stereographic projection)
+ vertices = np.array([0.0+0.0j, 1.0+0.0j, 0.5+0.866j])
+ volume = ideal_poly_volume_via_delaunay(vertices)
+ print(f"Volume: {volume}")
+ ```
+
+ ### Running examples
+
+ Examples are organized by topic. For instance:
+
+ ```bash
+ # 7-vertex optimization
+ cd examples/optimization/7vertex
+ python optimize_7vertex.py
+
+ # Tetrahedron distribution analysis
+ cd examples/distributions/tetrahedron
+ python tetrahedron_volume_distribution.py
+
+ # Visualization
+ cd examples/visualization
+ python visualize_golden_config.py
+ ```
+
+ ## Key Examples
+
+ - **7-vertex optimization**: Testing the hypothesis that the maximum volume is an octahedron with one stellated face
+ - **20-vertex optimization**: Finding maximal volume configurations for icosahedron-like polyhedra
+ - **Distribution analysis**: Statistical analysis of volume distributions for various polyhedra
+ - **Platonic solids**: Analysis of regular polyhedra and their perturbations
+
+ ## Research Results
+
+ See `docs/RESULTS_SUMMARY.md` and `docs/PLATONIC_MAXIMALITY_RESULTS.md` for detailed findings.
+
+ ## License
+
+ See LICENSE file.
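
The fast Lobachevsky evaluation referenced above is, in the standard formulation, a truncated Fourier series for Л(θ) = ½ Σ_{n≥1} sin(2nθ)/n². A NumPy sketch (the function name and truncation length are illustrative, not the toolkit's actual `lob_fast()` implementation):

```python
import numpy as np

def lob_series(theta: float, n_terms: int = 96) -> float:
    """Truncated series for the Lobachevsky function:
    Lob(theta) = (1/2) * sum_{n>=1} sin(2*n*theta) / n**2."""
    n = np.arange(1, n_terms + 1)
    return 0.5 * np.sum(np.sin(2 * n * theta) / n**2)

# The regular ideal tetrahedron has dihedral angles pi/3 and volume
# 3 * Lob(pi/3) ~= 1.0149416 (a well-known hyperbolic-geometry constant).
vol = 3 * lob_series(np.pi / 3, n_terms=20000)
```

The truncation error is bounded by the series tail ½ Σ_{n>N} 1/n² ≈ 1/(2N), which is why dropping from 96 to 64 terms costs little accuracy.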
bin/.gradio/certificate.pem ADDED
@@ -0,0 +1,31 @@
+ -----BEGIN CERTIFICATE-----
+ MIIFazCCA1OgAwIBAgIRAIIQz7DSQONZRGPgu2OCiwAwDQYJKoZIhvcNAQELBQAw
+ TzELMAkGA1UEBhMCVVMxKTAnBgNVBAoTIEludGVybmV0IFNlY3VyaXR5IFJlc2Vh
+ cmNoIEdyb3VwMRUwEwYDVQQDEwxJU1JHIFJvb3QgWDEwHhcNMTUwNjA0MTEwNDM4
+ WhcNMzUwNjA0MTEwNDM4WjBPMQswCQYDVQQGEwJVUzEpMCcGA1UEChMgSW50ZXJu
+ ZXQgU2VjdXJpdHkgUmVzZWFyY2ggR3JvdXAxFTATBgNVBAMTDElTUkcgUm9vdCBY
+ MTCCAiIwDQYJKoZIhvcNAQEBBQADggIPADCCAgoCggIBAK3oJHP0FDfzm54rVygc
+ h77ct984kIxuPOZXoHj3dcKi/vVqbvYATyjb3miGbESTtrFj/RQSa78f0uoxmyF+
+ 0TM8ukj13Xnfs7j/EvEhmkvBioZxaUpmZmyPfjxwv60pIgbz5MDmgK7iS4+3mX6U
+ A5/TR5d8mUgjU+g4rk8Kb4Mu0UlXjIB0ttov0DiNewNwIRt18jA8+o+u3dpjq+sW
+ T8KOEUt+zwvo/7V3LvSye0rgTBIlDHCNAymg4VMk7BPZ7hm/ELNKjD+Jo2FR3qyH
+ B5T0Y3HsLuJvW5iB4YlcNHlsdu87kGJ55tukmi8mxdAQ4Q7e2RCOFvu396j3x+UC
+ B5iPNgiV5+I3lg02dZ77DnKxHZu8A/lJBdiB3QW0KtZB6awBdpUKD9jf1b0SHzUv
+ KBds0pjBqAlkd25HN7rOrFleaJ1/ctaJxQZBKT5ZPt0m9STJEadao0xAH0ahmbWn
+ OlFuhjuefXKnEgV4We0+UXgVCwOPjdAvBbI+e0ocS3MFEvzG6uBQE3xDk3SzynTn
+ jh8BCNAw1FtxNrQHusEwMFxIt4I7mKZ9YIqioymCzLq9gwQbooMDQaHWBfEbwrbw
+ qHyGO0aoSCqI3Haadr8faqU9GY/rOPNk3sgrDQoo//fb4hVC1CLQJ13hef4Y53CI
+ rU7m2Ys6xt0nUW7/vGT1M0NPAgMBAAGjQjBAMA4GA1UdDwEB/wQEAwIBBjAPBgNV
+ HRMBAf8EBTADAQH/MB0GA1UdDgQWBBR5tFnme7bl5AFzgAiIyBpY9umbbjANBgkq
+ hkiG9w0BAQsFAAOCAgEAVR9YqbyyqFDQDLHYGmkgJykIrGF1XIpu+ILlaS/V9lZL
+ ubhzEFnTIZd+50xx+7LSYK05qAvqFyFWhfFQDlnrzuBZ6brJFe+GnY+EgPbk6ZGQ
+ 3BebYhtF8GaV0nxvwuo77x/Py9auJ/GpsMiu/X1+mvoiBOv/2X/qkSsisRcOj/KK
+ NFtY2PwByVS5uCbMiogziUwthDyC3+6WVwW6LLv3xLfHTjuCvjHIInNzktHCgKQ5
+ ORAzI4JMPJ+GslWYHb4phowim57iaztXOoJwTdwJx4nLCgdNbOhdjsnvzqvHu7Ur
+ TkXWStAmzOVyyghqpZXjFaH3pO3JLF+l+/+sKAIuvtd7u+Nxe5AW0wdeRlN8NwdC
+ jNPElpzVmbUq4JUagEiuTDkHzsxHpFKVK7q4+63SM1N95R1NbdWhscdCb+ZAJzVc
+ oyi3B43njTOQ5yOf+1CceWxG1bQVs5ZufpsMljq4Ui0/1lvh+wjChP4kqKOJ2qxq
+ 4RgqsahDYVvTH9w7jXbyLeiNdd8XM2w9U/t7y0Ff/9yi0GE44Za4rF2LN9d11TPA
+ mRGunUHBcnWEvgJBQl9nJEiU0Zsnvgc/ubhPgXRR4Xq37Z0j4r7g1SgEEzwxA57d
+ emyPxgcYxn/eR44/KJ4EBs+lVDR3veyJm+kXQ99b21/+jh5Xos1AnX5iItreGCc=
+ -----END CERTIFICATE-----
bin/README.md ADDED
@@ -0,0 +1,153 @@
+ # Command-Line Tools
+
+ This directory contains general-purpose wrapper scripts and a GUI for common tasks.
+
+ ## 🎨 Interactive GUI
+
+ ### `gui.py`
+
+ **Launch the interactive Gradio web interface:**
+
+ ```bash
+ # Local only (default)
+ python bin/gui.py
+
+ # Create shareable public link
+ python bin/gui.py --share
+
+ # Custom port
+ python bin/gui.py --port 8080
+
+ # Get help
+ python bin/gui.py --help
+ ```
+
+ The GUI will open in your browser at `http://127.0.0.1:7860` (or your custom port).
+
+ **Features:**
+ - **Optimization Tab**: Find maximal volume configurations with adjustable parameters
+   - Number of vertices, trials, iterations
+   - Real-time progress tracking
+   - Automatic result saving
+
+ - **Distribution Analysis Tab**: Sample random configurations and analyze statistics
+   - Configurable sample size
+   - Automatic histogram generation
+   - Statistics: mean, median, quartiles, std dev
+
+ - **3D Visualization Tab**: Interactive polyhedron visualization
+   - Delaunay triangulation (2D complex plane)
+   - Sphere projection (3D)
+   - Poincaré ball model (3D hyperbolic geometry)
+   - Adjustable subdivision for smooth surfaces
+   - Load results directly from optimization
+
+ - **About Tab**: Documentation and usage tips
+
+ **Advantages:**
+ - No need to remember command-line arguments
+ - Visual feedback and progress bars
+ - Interactive 3D plots you can rotate and zoom
+ - Seamless workflow: optimize → visualize
+ - Beginner-friendly interface
+
+ ---
+
+ ## Available Tools
+
+ ### `optimize_polyhedron.py`
+
+ General-purpose optimization wrapper for finding maximal volume configurations.
+
+ **Usage:**
+ ```bash
+ # Optimize a 7-vertex polyhedron with 10 trials
+ python bin/optimize_polyhedron.py --vertices 7 --trials 10
+
+ # Optimize 12-vertex with more iterations
+ python bin/optimize_polyhedron.py --vertices 12 --trials 20 --maxiter 300
+
+ # Custom output location
+ python bin/optimize_polyhedron.py --vertices 20 --trials 5 --output my_results.json
+ ```
+
+ **Arguments:**
+ - `--vertices, -v`: Number of vertices (required, must be >= 4)
+ - `--trials, -t`: Number of optimization trials (default: 10)
+ - `--maxiter, -m`: Max iterations per trial (default: 200)
+ - `--popsize, -p`: Population size for differential evolution (default: 15)
+ - `--output, -o`: Output JSON file (default: auto-generated in results/data/)
+ - `--seed, -s`: Random seed base (default: 42)
+
+ **Output:**
+ Saves a JSON file with:
+ - Best configuration found (volume, parameters, vertex coordinates)
+ - Combinatorial structure (faces, edges, Euler characteristic)
+ - All trial results
+
+ ---
+
+ ### `analyze_distribution.py`
+
+ Analyze volume distributions by sampling random polyhedra configurations.
+
+ **Usage:**
+ ```bash
+ # Analyze 10,000 random tetrahedra
+ python bin/analyze_distribution.py --vertices 4 --samples 10000
+
+ # Analyze with custom output
+ python bin/analyze_distribution.py --vertices 6 --samples 50000 --output my_plot.png
+
+ # Include reference volume and save data
+ python bin/analyze_distribution.py --vertices 5 --samples 20000 --reference 3.66 --data distribution_data.json
+ ```
+
+ **Arguments:**
+ - `--vertices, -v`: Number of vertices (required, must be >= 3)
+ - `--samples, -n`: Number of random samples (default: 10000)
+ - `--seed, -s`: Random seed (default: 42)
+ - `--output, -o`: Output plot file (default: auto-generated in results/plots/)
+ - `--data, -d`: Output data JSON file (optional)
+ - `--reference, -r`: Reference volume to mark on plot (optional)
+ - `--series-terms`: Number of series terms for Lobachevsky function (default: 96)
+
+ **Output:**
+ - PNG plot with histogram and box plot
+ - Optional JSON file with statistics and all volume samples
+
+ ---
+
+ ## Examples
+
+ ### Quick Optimization Run
+ ```bash
+ # Find best 7-vertex configuration with 5 quick trials
+ python bin/optimize_polyhedron.py -v 7 -t 5 --maxiter 100
+ ```
+
+ ### Distribution Analysis Pipeline
+ ```bash
+ # Analyze distribution, save data, and use max volume as reference
+ python bin/analyze_distribution.py -v 5 -n 10000 --data dist_data.json
+ ```
+
+ ### Reproduce Research Results
+ ```bash
+ # Optimize different vertex counts systematically
+ for n in 5 6 7 8; do
+     python bin/optimize_polyhedron.py -v $n -t 20 -m 300
+ done
+ ```
+
+ ---
+
+ ## Output Conventions
+
+ All outputs are saved to standardized locations:
+
+ - **Optimization results**: `results/data/{n}vertex_optimization_TIMESTAMP.json`
+ - **Distribution plots**: `results/plots/{n}vertex_distribution_TIMESTAMP.png`
+ - **Distribution data**: `results/data/{n}vertex_distribution_TIMESTAMP.json`
+
+ Timestamps are in the format `YYYYMMDD_HHMMSS`.
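
A sketch of how such timestamped paths can be generated (the helper name is illustrative, mirroring the convention above):

```python
from datetime import datetime
from pathlib import Path

def result_path(kind: str, n_vertices: int, ext: str) -> Path:
    """Build a results/{kind}/{n}vertex_optimization_TIMESTAMP.{ext} path
    with a YYYYMMDD_HHMMSS timestamp, creating the directory if needed."""
    timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    out_dir = Path("results") / kind
    out_dir.mkdir(parents=True, exist_ok=True)
    return out_dir / f"{n_vertices}vertex_optimization_{timestamp}.{ext}"

path = result_path("data", 7, "json")
```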
bin/analyze_distribution.py ADDED
@@ -0,0 +1,298 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ #!/usr/bin/env python3
2
+ """
3
+ General-purpose wrapper for analyzing volume distributions of ideal polyhedra.
4
+
5
+ Usage:
6
+ python bin/analyze_distribution.py --vertices 4 --samples 10000
7
+ python bin/analyze_distribution.py --vertices 6 --samples 50000 --output custom_plot.png
8
+ """
9
+
10
+ import argparse
11
+ import json
12
+ import numpy as np
13
+ import matplotlib.pyplot as plt
14
+ from datetime import datetime
15
+ from pathlib import Path
16
+ import sys
17
+
18
+ from ideal_poly_volume_toolkit.geometry import (
19
+ delaunay_triangulation_indices,
20
+ ideal_poly_volume_via_delaunay,
21
+ )
22
+
23
+
24
+ def sample_random_vertex():
25
+ """
26
+ Sample a uniform random point on the unit sphere and project to complex plane.
27
+ Uses stereographic projection from north pole.
28
+ """
29
+ # Sample uniform point on sphere using Gaussian method
30
+ vec = np.random.randn(3)
31
+ vec = vec / np.linalg.norm(vec)
32
+ x, y, z = vec
33
+
34
+ # Skip near north pole (maps to infinity)
35
+ if z > 0.999:
36
+ return None
37
+
38
+ # Stereographic projection
39
+ w = complex(x/(1-z), y/(1-z))
40
+ return w
41
+
42
+
43
+ def analyze_distribution(n_vertices, n_samples, seed=42, series_terms=96):
44
+ """
45
+ Analyze volume distribution for n_vertices polyhedra.
46
+
47
+ Args:
48
+ n_vertices: Number of vertices (must be >= 3)
49
+ n_samples: Number of random configurations to sample
50
+ seed: Random seed
51
+ series_terms: Number of terms for Lobachevsky function approximation
52
+
53
+ Returns:
54
+ dict with volumes and statistics
55
+ """
56
+ np.random.seed(seed)
57
+
58
+ # First 3 vertices are fixed to break symmetry
59
+ fixed_vertices = [complex(0, 0), complex(1, 0), complex(0, 1)]
60
+ n_random = n_vertices - 3
61
+
62
+ if n_random < 0:
63
+ raise ValueError("Need at least 3 vertices")
64
+
65
+ volumes = []
66
+ print(f"Sampling {n_samples} random {n_vertices}-vertex configurations...")
67
+
68
+ for i in range(n_samples):
69
+ if (i + 1) % (n_samples // 10) == 0:
70
+ print(f" Progress: {i + 1}/{n_samples} ({100*(i+1)/n_samples:.1f}%)")
71
+
72
+ # Build configuration
73
+ vertices = fixed_vertices.copy()
74
+
75
+ # Add random vertices
76
+ for _ in range(n_random):
77
+ v = sample_random_vertex()
78
+ if v is None:
79
+ continue # Skip degenerate samples
80
+
81
+ # Skip if too close to existing vertices
82
+ too_close = False
83
+ for existing in vertices:
84
+ if abs(v - existing) < 0.01:
85
+ too_close = True
86
+ break
87
+ if too_close:
88
+ continue
89
+
90
+ vertices.append(v)
91
+
92
+ # Only proceed if we have the right number of vertices
93
+ if len(vertices) != n_vertices:
94
+ continue
95
+
96
+ # Compute volume
97
+ try:
98
+ vertices_np = np.array(vertices, dtype=np.complex128)
99
+ vol = ideal_poly_volume_via_delaunay(
100
+ vertices_np, mode='fast', series_terms=series_terms
101
+ )
102
+
103
+ # Sanity check
104
+ if vol > 0 and vol < 1000:
105
+ volumes.append(vol)
106
+ except:
107
+ pass # Skip invalid configurations
108
+
109
+ volumes = np.array(volumes)
110
+
111
+ if len(volumes) == 0:
112
+ raise ValueError("No valid configurations found!")
113
+
114
+ print(f"\nSuccessfully analyzed {len(volumes)} valid configurations")
115
+
116
+ return {
117
+ 'volumes': volumes,
118
+ 'n_samples_requested': n_samples,
119
+ 'n_valid': len(volumes),
120
+ 'mean': np.mean(volumes),
121
+ 'median': np.median(volumes),
122
+ 'std': np.std(volumes),
123
+ 'min': np.min(volumes),
124
+ 'max': np.max(volumes),
125
+ 'q25': np.percentile(volumes, 25),
126
+ 'q75': np.percentile(volumes, 75),
127
+ }
128
+
129
+
130
+ def plot_distribution(volumes, stats, n_vertices, output_file, reference_volume=None):
131
+ """Create histogram plot of volume distribution."""
132
+ fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(14, 5))
133
+
134
+ # Histogram
135
+ ax1.hist(volumes, bins=50, density=True, alpha=0.7,
136
+ color='steelblue', edgecolor='black', linewidth=0.5)
137
+ ax1.axvline(stats['mean'], color='red', linestyle='--', linewidth=2,
138
+ label=f"Mean: {stats['mean']:.4f}")
139
+ ax1.axvline(stats['median'], color='green', linestyle='--', linewidth=2,
140
+ label=f"Median: {stats['median']:.4f}")
141
+
142
+ if reference_volume is not None:
143
+ ax1.axvline(reference_volume, color='orange', linestyle='--', linewidth=2,
144
+ label=f"Reference: {reference_volume:.4f}")
145
+
146
+ ax1.set_xlabel('Volume', fontsize=12)
147
+ ax1.set_ylabel('Density', fontsize=12)
148
+ ax1.set_title(f'{n_vertices}-Vertex Ideal Polyhedra Volume Distribution', fontsize=14)
149
+ ax1.legend(fontsize=10)
150
+ ax1.grid(True, alpha=0.3)
151
+
152
+ # Box plot
153
+ ax2.boxplot([volumes], vert=True, patch_artist=True,
154
+ boxprops=dict(facecolor='lightblue', alpha=0.7),
155
+ medianprops=dict(color='red', linewidth=2),
156
+ flierprops=dict(marker='o', markerfacecolor='gray', markersize=4, alpha=0.5))
157
+ ax2.set_ylabel('Volume', fontsize=12)
158
+ ax2.set_title('Volume Distribution (Box Plot)', fontsize=14)
159
+ ax2.set_xticklabels([f'{n_vertices} vertices'])
160
+ ax2.grid(True, alpha=0.3, axis='y')
161
+
162
+ plt.tight_layout()
163
+ plt.savefig(output_file, dpi=150, bbox_inches='tight')
164
+ print(f"Plot saved to: {output_file}")
165
+ plt.close()
166
+
167
+
168
+ def main():
169
+ parser = argparse.ArgumentParser(
170
+ description='Analyze volume distributions of ideal polyhedra',
171
+ formatter_class=argparse.RawDescriptionHelpFormatter,
172
+ epilog="""
173
+ Examples:
174
+ %(prog)s --vertices 4 --samples 10000
175
+ %(prog)s --vertices 6 --samples 50000 --output my_analysis.png
176
+ %(prog)s --vertices 5 --samples 20000 --reference 3.66
177
+ """
178
+ )
179
+
180
+ parser.add_argument('--vertices', '-v', type=int, required=True,
181
+ help='Number of vertices (must be >= 3)')
182
+ parser.add_argument('--samples', '-n', type=int, default=10000,
183
+ help='Number of random samples (default: 10000)')
184
+ parser.add_argument('--seed', '-s', type=int, default=42,
185
+ help='Random seed (default: 42)')
186
+ parser.add_argument('--output', '-o', type=str, default=None,
187
+ help='Output plot file (default: results/plots/{n}vertex_distribution_TIMESTAMP.png)')
188
+ parser.add_argument('--data', '-d', type=str, default=None,
189
+ help='Output data JSON file (optional)')
190
+ parser.add_argument('--reference', '-r', type=float, default=None,
191
+ help='Reference volume to mark on plot (optional)')
192
+ parser.add_argument('--series-terms', type=int, default=96,
193
+ help='Number of series terms for Lobachevsky function (default: 96)')
194
+
195
+ args = parser.parse_args()
196
+
197
+ if args.vertices < 3:
198
+ print("Error: Number of vertices must be at least 3")
199
+ sys.exit(1)
200
+
201
+ # Setup output files
202
+ timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
203
+
204
+ if args.output is None:
205
+ plot_file = f"results/plots/{args.vertices}vertex_distribution_{timestamp}.png"
206
+ else:
207
+ plot_file = args.output
208
+
209
+ if args.data is not None:
210
+ data_file = args.data
211
+ else:
212
+ data_file = f"results/data/{args.vertices}vertex_distribution_{timestamp}.json"
213
+
214
+ # Ensure output directories exist
215
+ Path(plot_file).parent.mkdir(parents=True, exist_ok=True)
216
+ if args.data is not None:
217
+ Path(data_file).parent.mkdir(parents=True, exist_ok=True)
218
+
219
+ print("=" * 70)
220
+ print("Ideal Polyhedron Volume Distribution Analysis")
221
+ print("=" * 70)
222
+ print(f"Vertices: {args.vertices}")
223
+ print(f"Samples: {args.samples}")
224
+ print(f"Random seed: {args.seed}")
225
+ print(f"Plot output: {plot_file}")
226
+ if args.data:
227
+ print(f"Data output: {data_file}")
228
+     print("=" * 70)
+     print()
+
+     # Run analysis
+     results = analyze_distribution(
+         args.vertices,
+         args.samples,
+         seed=args.seed,
+         series_terms=args.series_terms
+     )
+
+     # Print statistics
+     print("\n" + "=" * 70)
+     print("STATISTICS:")
+     print("=" * 70)
+     print(f"Valid configs: {results['n_valid']:,} / {results['n_samples_requested']:,}")
+     print(f"Mean volume: {results['mean']:.8f}")
+     print(f"Median volume: {results['median']:.8f}")
+     print(f"Std deviation: {results['std']:.8f}")
+     print(f"Min volume: {results['min']:.8f}")
+     print(f"Max volume: {results['max']:.8f}")
+     print(f"25th percentile: {results['q25']:.8f}")
+     print(f"75th percentile: {results['q75']:.8f}")
+
+     if args.reference is not None:
+         print(f"\nReference volume: {args.reference:.8f}")
+         print(f"Mean/Reference: {results['mean']/args.reference:.4f}")
+         print(f"Max/Reference: {results['max']/args.reference:.4f}")
+
+     # Create plot
+     plot_distribution(
+         results['volumes'],
+         results,
+         args.vertices,
+         plot_file,
+         reference_volume=args.reference
+     )
+
+     # Save data if requested
+     if args.data is not None:
+         output_data = {
+             'metadata': {
+                 'timestamp': datetime.now().isoformat(),
+                 'n_vertices': args.vertices,
+                 'n_samples_requested': args.samples,
+                 'n_valid': results['n_valid'],
+                 'seed': args.seed,
+                 'series_terms': args.series_terms,
+             },
+             'statistics': {
+                 'mean': float(results['mean']),
+                 'median': float(results['median']),
+                 'std': float(results['std']),
+                 'min': float(results['min']),
+                 'max': float(results['max']),
+                 'q25': float(results['q25']),
+                 'q75': float(results['q75']),
+             },
+             'volumes': results['volumes'].tolist(),
+         }
+
+         with open(data_file, 'w') as f:
+             json.dump(output_data, f, indent=2)
+
+         print(f"\nData saved to: {data_file}")
+
+     print("=" * 70)
+
+
+ if __name__ == '__main__':
+     main()
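The JSON written by `--data` can be consumed without the toolkit installed, since it is plain `metadata` / `statistics` / `volumes` blocks. A minimal, stdlib-only sketch of reading such a file back (the record here is made-up sample data in the same schema, and the file path is hypothetical):

```python
import json
import os
import tempfile

# A record in the same shape analyze_distribution.py writes
# (metadata / statistics / volumes); these numbers are illustrative only.
record = {
    "metadata": {"n_vertices": 4, "n_samples_requested": 10, "n_valid": 8, "seed": 42},
    "statistics": {"mean": 2.5, "median": 2.4, "std": 0.3,
                   "min": 1.9, "max": 3.1, "q25": 2.2, "q75": 2.8},
    "volumes": [1.9, 2.1, 2.2, 2.4, 2.4, 2.6, 2.8, 3.1],
}

path = os.path.join(tempfile.gettempdir(), "dist_demo.json")  # hypothetical path
with open(path, "w") as f:
    json.dump(record, f, indent=2)

with open(path) as f:
    loaded = json.load(f)

# Mirrors the summary the script prints to stdout
print(f"Valid configs: {loaded['metadata']['n_valid']:,} / "
      f"{loaded['metadata']['n_samples_requested']:,}")
print(f"Mean volume: {loaded['statistics']['mean']:.8f}")
assert loaded["volumes"] == record["volumes"]
```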
bin/gui.py ADDED
@@ -0,0 +1,893 @@
+ #!/usr/bin/env python3
+ """
+ Gradio GUI for Ideal Polyhedron Volume Toolkit
+
+ Interactive interface for:
+ - Optimizing polyhedra
+ - Analyzing distributions
+ - 3D visualization in Klein and Poincaré ball models
+
+ Usage:
+     python bin/gui.py              # Local only (127.0.0.1:7860)
+     python bin/gui.py --share      # Create shareable public link
+     python bin/gui.py --port 8080  # Custom port
+ """
+
+ import gradio as gr
+ import numpy as np
+ import json
+ import io
+ import matplotlib.pyplot as plt
+ import argparse
+ from datetime import datetime
+ from pathlib import Path
+ from PIL import Image
+
+ from ideal_poly_volume_toolkit.geometry import (
+     delaunay_triangulation_indices,
+     triangle_volume_from_points_torch,
+     ideal_poly_volume_via_delaunay,
+ )
+ from ideal_poly_volume_toolkit.visualization import (
+     plot_polyhedron_klein,
+     plot_polyhedron_poincare,
+     plot_delaunay_2d,
+     create_polyhedron_mesh,
+ )
+ from ideal_poly_volume_toolkit.rivin_holonomy import (
+     Triangulation,
+     generators_from_triangulation,
+ )
+ from ideal_poly_volume_toolkit.symmetry import (
+     compute_symmetry_group,
+     format_symmetry_report,
+ )
+
+ import torch
+ from scipy.optimize import differential_evolution
+
+ # Note: GPU is slower than CPU for this problem due to small tensor sizes
+ # and transfer overhead, so we use CPU explicitly
+ DEVICE = torch.device('cpu')
+
+
+ # ============================================================================
+ # Optimization Functions
+ # ============================================================================
+
+ def spherical_to_complex(theta, phi):
+     """Convert spherical coordinates to complex via stereographic projection."""
+     return np.tan(theta/2) * np.exp(1j * phi)
+
+
+ def compute_volume(params, n_vertices):
+     """Compute volume for a polyhedron with n_vertices.
+
+     Performance optimizations:
+     - Reduced series_terms to 64 (good balance of speed/accuracy)
+     - Single torch tensor conversion
+     - Parallel evaluation via differential_evolution workers
+
+     Args:
+         params: Optimization parameters (theta, phi pairs)
+         n_vertices: Total number of vertices
+
+     Returns:
+         Negative volume (for minimization)
+
+     Note:
+         The polishing step (L-BFGS-B) uses finite differences for gradients.
+         PyTorch autodiff could be used, but the Delaunay triangulation is
+         not differentiable, making gradients unreliable near topology changes.
+     """
+     complex_points = [complex(0, 0), complex(1, 0), complex(0, 1)]
+     n_params = n_vertices - 3
+
+     for i in range(n_params):
+         theta = params[2*i]
+         phi = params[2*i + 1]
+         z = spherical_to_complex(theta, phi)
+         complex_points.append(z)
+
+     Z_np = np.array(complex_points, dtype=np.complex128)
+
+     try:
+         idx = delaunay_triangulation_indices(Z_np)
+     except Exception:
+         return 1000.0  # Penalty for degenerate configurations
+
+     # Single torch conversion (CPU is faster than GPU for small tensors)
+     Z_torch = torch.tensor(Z_np, dtype=torch.complex128, device=DEVICE)
+
+     total_volume = 0.0
+     for (i, j, k) in idx:
+         try:
+             vol = triangle_volume_from_points_torch(
+                 Z_torch[i], Z_torch[j], Z_torch[k], series_terms=64
+             )
+             total_volume += vol.item()
+         except Exception:
+             return 1000.0
+
+     return -total_volume
+
+
+ def run_optimization(n_vertices, n_trials, max_iter, pop_size, seed, progress=gr.Progress()):
+     """Run optimization with progress tracking.
+
+     Uses parallel workers (all CPU cores) for faster optimization.
+     """
+     import os
+     from functools import partial
+
+     # Validate and convert inputs
+     try:
+         n_vertices = int(n_vertices)
+         if n_vertices < 4:
+             return "Error: Number of vertices must be at least 4", None
+         if n_vertices > 100:
+             return "Error: Number of vertices limited to 100 for practical computation time", None
+     except (ValueError, TypeError):
+         return "Error: Number of vertices must be an integer", None
+
+     n_cpus = os.cpu_count()
+     progress(0, desc=f"Starting optimization (using {n_cpus} CPU cores)...")
+
+     n_free_vertices = n_vertices - 3
+     n_params = n_free_vertices * 2
+     bounds = [(0.1, np.pi - 0.1), (0, 2*np.pi)] * n_free_vertices
+
+     # Adaptive settings for large vertex counts:
+     # for high dimensions, reduce popsize to avoid excessive evaluations
+     adaptive_popsize = min(pop_size, max(10, 15 - (n_vertices - 7) // 3))
+
+     best_volume = 0
+     best_params = None
+     all_volumes = []
+
+     # Create a picklable objective function using partial
+     # (lambdas can't be pickled for multiprocessing)
+     objective_func = partial(compute_volume, n_vertices=n_vertices)
+
+     for trial in range(n_trials):
+         progress((trial + 1) / n_trials, desc=f"Trial {trial + 1}/{n_trials}")
+
+         result = differential_evolution(
+             objective_func,
+             bounds,
+             maxiter=max_iter,
+             popsize=adaptive_popsize,
+             seed=seed + trial,
+             polish=True,
+             disp=False,
+             workers=-1,            # Use all CPU cores for parallel evaluation
+             updating='deferred'    # Better for parallel workers
+         )
+
+         volume = -result.fun
+         all_volumes.append(volume)
+
+         if volume > best_volume:
+             best_volume = volume
+             best_params = result.x
+
+     # Reconstruct best configuration
+     complex_points = [complex(0, 0), complex(1, 0), complex(0, 1)]
+     for i in range(n_free_vertices):
+         theta = best_params[2*i]
+         phi = best_params[2*i + 1]
+         z = spherical_to_complex(theta, phi)
+         complex_points.append(z)
+
+     Z_np = np.array(complex_points, dtype=np.complex128)
+     idx = delaunay_triangulation_indices(Z_np)
+
+     # Compute statistics
+     stats = {
+         'n_vertices': n_vertices,
+         'n_faces': len(idx),
+         'best_volume': float(best_volume),
+         'mean_volume': float(np.mean(all_volumes)),
+         'std_volume': float(np.std(all_volumes)),
+         'all_volumes': [float(v) for v in all_volumes],
+     }
+
+     # Save result
+     timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
+     output_file = f"results/data/{n_vertices}vertex_optimization_{timestamp}.json"
+     Path(output_file).parent.mkdir(parents=True, exist_ok=True)
+
+     with open(output_file, 'w') as f:
+         json.dump({
+             'metadata': {'timestamp': datetime.now().isoformat()},
+             'best': {
+                 'volume': stats['best_volume'],
+                 'params': best_params.tolist(),
+                 'vertices_real': Z_np.real.tolist(),
+                 'vertices_imag': Z_np.imag.tolist(),
+             },
+             'statistics': stats,
+         }, f, indent=2)
+
+     # Create summary text
+     summary = f"""
+ ## Optimization Results
+
+ **Configuration:**
+ - Vertices: {n_vertices}
+ - Trials: {n_trials}
+ - Iterations per trial: {max_iter}
+
+ **Best Result:**
+ - Volume: {best_volume:.8f}
+ - Faces: {len(idx)}
+
+ **Statistics over all trials:**
+ - Mean: {stats['mean_volume']:.8f}
+ - Std Dev: {stats['std_volume']:.8f}
+ - Best/Mean: {best_volume/stats['mean_volume']:.4f}
+
+ **Saved to:** `{output_file}`
+ """
+
+     # Return data as dict for state management
+     opt_data = {
+         'vertices': Z_np,
+         'triangulation': idx,
+         'stats': stats
+     }
+
+     return summary, opt_data
+
+
+ # ============================================================================
+ # Distribution Analysis Functions
+ # ============================================================================
+
+ def analyze_volume_distribution(n_vertices, n_samples, seed, progress=gr.Progress()):
+     """Analyze volume distribution with progress tracking.
+
+     Note: n_vertices is the number of random vertices to add.
+     Total vertices = n_vertices + 3 fixed (0, 1, i) + 1 at infinity = n_vertices + 4.
+     """
+     np.random.seed(seed)
+
+     # Fixed vertices: 0, 1, i (infinity is implicit in the volume computation)
+     fixed_vertices = [complex(0, 0), complex(1, 0), complex(0, 1)]
+     n_random = n_vertices
+     total_vertices = n_vertices + 3  # Finite vertices only; infinity is implicit
+
+     volumes = []
+
+     def sample_random_vertex():
+         # Uniform point on the sphere, then stereographic projection to C
+         vec = np.random.randn(3)
+         vec = vec / np.linalg.norm(vec)
+         x, y, z = vec
+         if z > 0.999:
+             return None  # Too close to the projection pole
+         w = complex(x/(1-z), y/(1-z))
+         return w
+
+     for i in range(n_samples):
+         if (i + 1) % 100 == 0:
+             progress((i + 1) / n_samples, desc=f"Sampling: {i + 1}/{n_samples}")
+
+         vertices = fixed_vertices.copy()
+
+         for _ in range(n_random):
+             v = sample_random_vertex()
+             if v is None or any(abs(v - existing) < 0.01 for existing in vertices):
+                 continue
+             vertices.append(v)
+
+         if len(vertices) != total_vertices:
+             continue
+
+         try:
+             vertices_np = np.array(vertices, dtype=np.complex128)
+             vol = ideal_poly_volume_via_delaunay(vertices_np, mode='fast', series_terms=96)
+             if 0 < vol < 1000:
+                 volumes.append(vol)
+         except Exception:
+             pass  # Skip degenerate configurations
+
+     volumes = np.array(volumes)
+
+     # Create histogram
+     fig, ax = plt.subplots(figsize=(10, 6))
+     ax.hist(volumes, bins=50, density=True, alpha=0.7, color='steelblue', edgecolor='black')
+     ax.axvline(np.mean(volumes), color='red', linestyle='--', linewidth=2,
+                label=f'Mean: {np.mean(volumes):.4f}')
+     ax.axvline(np.median(volumes), color='green', linestyle='--', linewidth=2,
+                label=f'Median: {np.median(volumes):.4f}')
+     ax.set_xlabel('Volume', fontsize=12)
+     ax.set_ylabel('Density', fontsize=12)
+     ax.set_title(f'{total_vertices}-Vertex Volume Distribution ({len(volumes)} samples)', fontsize=14)
+     ax.legend()
+     ax.grid(True, alpha=0.3)
+
+     # Save plot to BytesIO and convert to a PIL Image for Gradio
+     buf = io.BytesIO()
+     plt.tight_layout()
+     plt.savefig(buf, format='png', dpi=150)
+     buf.seek(0)
+     plt.close()
+
+     img = Image.open(buf)
+
+     # Statistics summary
+     summary = f"""
+ ## Distribution Analysis Results
+
+ **Configuration:**
+ - Random vertices: {n_random}
+ - Fixed vertices: 3 (at 0, 1, i)
+ - **Total vertices: {total_vertices}** (+ ∞)
+ - Samples requested: {n_samples}
+ - Valid samples: {len(volumes)}
+
+ **Statistics:**
+ - Mean: {np.mean(volumes):.8f}
+ - Median: {np.median(volumes):.8f}
+ - Std Dev: {np.std(volumes):.8f}
+ - Min: {np.min(volumes):.8f}
+ - Max: {np.max(volumes):.8f}
+ - Q25: {np.percentile(volumes, 25):.8f}
+ - Q75: {np.percentile(volumes, 75):.8f}
+ """
+
+     return summary, img
+
+
+ # ============================================================================
+ # Visualization Functions
+ # ============================================================================
+
+ def visualize_configuration(vertices_real, vertices_imag, vis_type, subdivisions):
+     """Create visualization based on user selection."""
+     if not vertices_real or not vertices_imag:
+         return None, "Please provide vertices"
+
+     try:
+         # Parse vertices
+         real_parts = [float(x.strip()) for x in vertices_real.split(',')]
+         imag_parts = [float(x.strip()) for x in vertices_imag.split(',')]
+
+         if len(real_parts) != len(imag_parts):
+             return None, "Real and imaginary parts must have same length"
+
+         vertices = np.array([complex(r, i) for r, i in zip(real_parts, imag_parts)])
+
+         if vis_type == "Delaunay (2D)":
+             idx = delaunay_triangulation_indices(vertices)
+             fig = plot_delaunay_2d(vertices, idx)
+             return fig, f"Showing Delaunay triangulation with {len(idx)} faces"
+
+         elif vis_type == "Klein Ball (3D)":
+             fig = plot_polyhedron_klein(vertices, subdivisions=subdivisions)
+             return fig, f"Showing Klein model (subdivision level: {subdivisions})"
+
+         elif vis_type == "Poincaré Ball (3D)":
+             fig = plot_polyhedron_poincare(vertices, subdivisions=subdivisions)
+             return fig, f"Showing Poincaré ball model (subdivision level: {subdivisions})"
+
+     except Exception as e:
+         return None, f"Error: {str(e)}"
+
+
+ def load_from_optimization(opt_data):
+     """Load vertices from optimization results."""
+     if opt_data is None:
+         return "", ""
+
+     # opt_data is a dict with vertices
+     vertices = opt_data.get('vertices', None)
+     if vertices is None:
+         return "", ""
+
+     real_str = ", ".join(f"{v.real:.6f}" for v in vertices)
+     imag_str = ", ".join(f"{v.imag:.6f}" for v in vertices)
+
+     return real_str, imag_str
+
+
+ # ============================================================================
+ # Holonomy/Arithmeticity Functions
+ # ============================================================================
+
+ def compute_holonomy_analysis(vertices_real, vertices_imag, progress=gr.Progress()):
+     """Compute holonomy generators and check arithmeticity."""
+     if not vertices_real or not vertices_imag:
+         return "Please provide vertices", None
+
+     try:
+         # Parse vertices
+         real_parts = [float(x.strip()) for x in vertices_real.split(',')]
+         imag_parts = [float(x.strip()) for x in vertices_imag.split(',')]
+
+         if len(real_parts) != len(imag_parts):
+             return "Real and imaginary parts must have same length", None
+
+         vertices = np.array([complex(r, i) for r, i in zip(real_parts, imag_parts)])
+
+         progress(0.2, desc="Computing Delaunay triangulation...")
+
+         # Get triangulation
+         idx = delaunay_triangulation_indices(vertices)
+         F = len(idx)
+
+         progress(0.4, desc="Building adjacency structure...")
+
+         # Build adjacency structure
+         adjacency = {}
+         edge_id_map = {}
+         edge_id = 0
+
+         for i, tri_i in enumerate(idx):
+             for side_i in range(3):
+                 v1_i, v2_i = tri_i[side_i], tri_i[(side_i + 1) % 3]
+                 edge = tuple(sorted([v1_i, v2_i]))
+
+                 # Find matching triangle
+                 for j, tri_j in enumerate(idx):
+                     if i == j:
+                         continue
+                     for side_j in range(3):
+                         v1_j, v2_j = tri_j[side_j], tri_j[(side_j + 1) % 3]
+                         if set([v1_j, v2_j]) == set([v1_i, v2_i]):
+                             if (i, side_i) not in adjacency:
+                                 if edge not in edge_id_map:
+                                     edge_id_map[edge] = edge_id
+                                     edge_id += 1
+                                 adjacency[(i, side_i)] = (j, side_j, edge_id_map[edge])
+
+         progress(0.6, desc="Computing holonomy generators...")
+
+         # Define order and orientation
+         order = {t: [0, 1, 2] for t in range(F)}
+         orientation = {}
+         for edge, eid in edge_id_map.items():
+             for (t, s), (u, su, e) in adjacency.items():
+                 if e == eid:
+                     orientation[eid] = ((t, s), (u, su))
+                     break
+
+         # Create triangulation
+         T = Triangulation(F, adjacency, order, orientation)
+
+         # Zero shears for ideal polyhedra
+         Z = {eid: 0.0 for eid in range(edge_id)}
+
+         # Compute generators
+         gens = generators_from_triangulation(T, Z, root=0)
+
+         progress(0.8, desc="Analyzing traces...")
+
+         # Analyze traces
+         trace_analysis = []
+         traces = []
+         integral_count = 0
+
+         for i, (u, v, tokens, M) in enumerate(gens):
+             trace = M[0][0] + M[1][1]
+             traces.append(trace)
+
+             nearest_int = round(trace)
+             distance = abs(trace - nearest_int)
+             is_close = distance < 0.01
+
+             if is_close:
+                 integral_count += 1
+
+             trace_analysis.append({
+                 'generator': i,
+                 'edge': (u, v),
+                 'trace': float(trace),
+                 'nearest_int': int(nearest_int),
+                 'distance': float(distance),
+                 'is_close': is_close
+             })
+
+         progress(1.0, desc="Complete!")
+
+         # Create summary
+         summary = f"""
+ ## Holonomy Analysis Results
+
+ **Configuration:**
+ - Vertices: {len(vertices)}
+ - Triangular faces: {F}
+ - Number of generators: {len(gens)}
+
+ **Arithmeticity Test:**
+ - Generators with integral traces: {integral_count}/{len(gens)}
+ - Percentage: {100*integral_count/len(gens):.1f}%
+
+ **Interpretation:**
+ """
+         if integral_count == len(gens):
+             summary += "✅ **ALL TRACES ARE INTEGERS!**\n\n"
+             summary += "This polyhedron is **ARITHMETIC** - it has deep number-theoretic structure!\n"
+             summary += "The holonomy lies in PSL(2, O_K) for some number field K."
+         elif integral_count > len(gens) * 0.7:
+             summary += "⚠️ **MOST TRACES ARE CLOSE TO INTEGERS**\n\n"
+             summary += f"This suggests possible arithmetic structure with {integral_count}/{len(gens)} integral traces.\n"
+             summary += "May be commensurable with an arithmetic group."
+         else:
+             summary += "❌ **NOT ARITHMETIC**\n\n"
+             summary += "Only a few traces are close to integers. This is likely a generic configuration."
+
+         summary += "\n\n## Trace Details:\n\n"
+         summary += "| Generator | Edge | Trace | Nearest Int | Distance | Status |\n"
+         summary += "|-----------|------|-------|-------------|----------|--------|\n"
+
+         for ta in trace_analysis:
+             status = "✅ INTEGRAL" if ta['is_close'] else "❌"
+             summary += f"| {ta['generator']} | {ta['edge'][0]}-{ta['edge'][1]} | "
+             summary += f"{ta['trace']:.6f} | {ta['nearest_int']} | "
+             summary += f"{ta['distance']:.6f} | {status} |\n"
+
+         # Create plot
+         fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(12, 5))
+
+         # Plot traces
+         gen_nums = [ta['generator'] for ta in trace_analysis]
+         trace_vals = [ta['trace'] for ta in trace_analysis]
+         colors = ['green' if ta['is_close'] else 'red' for ta in trace_analysis]
+
+         ax1.bar(gen_nums, trace_vals, color=colors, alpha=0.7, edgecolor='black')
+         ax1.axhline(y=0, color='k', linestyle='-', linewidth=0.5)
+         ax1.set_xlabel('Generator', fontsize=12)
+         ax1.set_ylabel('Trace', fontsize=12)
+         ax1.set_title('Holonomy Generator Traces', fontsize=14)
+         ax1.grid(True, alpha=0.3)
+
+         # Plot distances from integers
+         distances = [ta['distance'] for ta in trace_analysis]
+         ax2.bar(gen_nums, distances, color=colors, alpha=0.7, edgecolor='black')
+         ax2.axhline(y=0.01, color='orange', linestyle='--', linewidth=2,
+                     label='Threshold (0.01)')
+         ax2.set_xlabel('Generator', fontsize=12)
+         ax2.set_ylabel('Distance from nearest integer', fontsize=12)
+         ax2.set_title('Integrality Test', fontsize=14)
+         ax2.legend()
+         ax2.grid(True, alpha=0.3)
+         ax2.set_yscale('log')
+
+         plt.tight_layout()
+
+         # Save plot to BytesIO and convert to a PIL Image for Gradio
+         buf = io.BytesIO()
+         plt.savefig(buf, format='png', dpi=150)
+         buf.seek(0)
+         plt.close()
+
+         img = Image.open(buf)
+
+         return summary, img
+
+     except Exception as e:
+         return f"Error: {str(e)}", None
+
+
+ def compute_symmetry_analysis(vertices_real, vertices_imag):
+     """Compute symmetry group of the polyhedron."""
+     if not vertices_real or not vertices_imag:
+         return "Please provide vertices"
+
+     try:
+         # Parse vertices
+         real_parts = [float(x.strip()) for x in vertices_real.split(',')]
+         imag_parts = [float(x.strip()) for x in vertices_imag.split(',')]
+
+         if len(real_parts) != len(imag_parts):
+             return "Real and imaginary parts must have same length"
+
+         vertices = np.array([complex(r, i) for r, i in zip(real_parts, imag_parts)])
+
+         # Lift to 3D (Klein model in the ball)
+         from ideal_poly_volume_toolkit.visualization import lift_to_sphere_with_inf
+         vertices_3d = lift_to_sphere_with_inf(vertices)
+
+         # Compute symmetry group
+         sym_info = compute_symmetry_group(vertices_3d)
+
+         # Format report
+         report = format_symmetry_report(sym_info)
+
+         return report
+
+     except Exception as e:
+         return f"Error: {str(e)}"
+
+
+ # ============================================================================
+ # Gradio Interface
+ # ============================================================================
+
+ def create_gui():
+     """Create the main Gradio interface."""
+
+     with gr.Blocks(title="Ideal Polyhedron Volume Toolkit", theme=gr.themes.Soft()) as demo:
+         # Shared state for passing optimization results to visualization
+         opt_result_state = gr.State(None)
+         gr.Markdown("""
+         # 🔺 Ideal Polyhedron Volume Toolkit
+
+         Interactive GUI for computing and optimizing volumes of ideal hyperbolic polyhedra.
+         """)
+
+         with gr.Tabs():
+             # ================================================================
+             # Tab 1: Optimization
+             # ================================================================
+             with gr.Tab("🎯 Optimization"):
+                 gr.Markdown("Find maximal volume configurations for ideal polyhedra")
+
+                 with gr.Row():
+                     with gr.Column():
+                         opt_vertices = gr.Number(value=7, label="Number of Vertices",
+                                                  minimum=4, maximum=100,
+                                                  info="Recommended: 4-30 (higher is much slower)")
+                         opt_trials = gr.Slider(1, 50, value=10, step=1,
+                                                label="Number of Trials")
+                         opt_maxiter = gr.Slider(50, 500, value=150, step=50,
+                                                 label="Max Iterations per Trial",
+                                                 info="150-200 is usually sufficient")
+                         opt_popsize = gr.Slider(10, 30, value=15, step=5,
+                                                 label="Population Size")
+                         opt_seed = gr.Number(value=42, label="Random Seed")
+
+                         opt_button = gr.Button("Run Optimization", variant="primary")
+
+                     with gr.Column():
+                         opt_output = gr.Markdown("Results will appear here...")
+
+                 opt_button.click(
+                     run_optimization,
+                     inputs=[opt_vertices, opt_trials, opt_maxiter, opt_popsize, opt_seed],
+                     outputs=[opt_output, opt_result_state]
+                 )
+
+             # ================================================================
+             # Tab 2: Distribution Analysis
+             # ================================================================
+             with gr.Tab("📊 Distribution Analysis"):
+                 gr.Markdown("""
+                 Analyze volume distributions by sampling random configurations.
+
+                 **Note:** Vertices are added to fixed base (0, 1, i, ∞).
+                 So 4 random vertices = 7 total vertices.
+                 """)
+
+                 with gr.Row():
+                     with gr.Column():
+                         dist_vertices = gr.Slider(1, 10, value=4, step=1,
+                                                   label="Number of Random Vertices")
+                         dist_samples = gr.Slider(100, 50000, value=5000, step=100,
+                                                  label="Number of Samples")
+                         dist_seed = gr.Number(value=42, label="Random Seed")
+
+                         dist_button = gr.Button("Analyze Distribution", variant="primary")
+
+                     with gr.Column():
+                         dist_output = gr.Markdown("Results will appear here...")
+                         dist_plot = gr.Image(label="Distribution")
+
+                 dist_button.click(
+                     analyze_volume_distribution,
+                     inputs=[dist_vertices, dist_samples, dist_seed],
+                     outputs=[dist_output, dist_plot]
+                 )
+
+             # ================================================================
+             # Tab 3: 3D Visualization
+             # ================================================================
+             with gr.Tab("🔮 3D Visualization"):
+                 gr.Markdown("Visualize polyhedra in different models")
+
+                 with gr.Row():
+                     with gr.Column():
+                         gr.Markdown("### Input Vertices")
+                         gr.Markdown("Enter vertices as comma-separated values in the complex plane")
+
+                         vis_real = gr.Textbox(
+                             label="Real parts",
+                             value="0, 1, 0, 0.5",
+                             placeholder="0, 1, 0.5, -0.5"
+                         )
+                         vis_imag = gr.Textbox(
+                             label="Imaginary parts",
+                             value="0, 0, 1, 0.866",
+                             placeholder="0, 0, 0.866, 0.866"
+                         )
+
+                         with gr.Row():
+                             load_opt_button = gr.Button("Load from Optimization", size="sm")
+
+                         gr.Markdown("### Visualization Options")
+                         vis_type = gr.Radio(
+                             ["Delaunay (2D)", "Klein Ball (3D)", "Poincaré Ball (3D)"],
+                             value="Klein Ball (3D)",
+                             label="Visualization Type"
+                         )
+                         vis_subdivisions = gr.Slider(0, 5, value=3, step=1,
+                                                      label="Subdivision Level (3D only)",
+                                                      info="Higher = smoother curves (slower rendering)")
+
+                         vis_button = gr.Button("Generate Visualization", variant="primary")
+
+                     with gr.Column():
+                         vis_plot = gr.Plot(label="Visualization")
+                         vis_status = gr.Textbox(label="Status", interactive=False)
+
+                 vis_button.click(
+                     visualize_configuration,
+                     inputs=[vis_real, vis_imag, vis_type, vis_subdivisions],
+                     outputs=[vis_plot, vis_status]
+                 )
+
+                 load_opt_button.click(
+                     load_from_optimization,
+                     inputs=[opt_result_state],
+                     outputs=[vis_real, vis_imag]
+                 )
+
+             # ================================================================
+             # Tab 4: Arithmeticity / Holonomy
+             # ================================================================
+             with gr.Tab("🔬 Arithmeticity"):
+                 gr.Markdown("Check if a polyhedron is arithmetic using Penner-Rivin holonomy")
+
+                 with gr.Row():
+                     with gr.Column():
+                         gr.Markdown("### Input Vertices")
+                         arith_real = gr.Textbox(
+                             label="Real parts",
+                             value="0, 1, 0, 0.5",
+                             placeholder="0, 1, 0.5, -0.5"
+                         )
+                         arith_imag = gr.Textbox(
+                             label="Imaginary parts",
+                             value="0, 0, 1, 0.866",
+                             placeholder="0, 0, 0.866, 0.866"
+                         )
+
+                         with gr.Row():
+                             load_opt_arith_button = gr.Button("Load from Optimization", size="sm")
+
+                         arith_button = gr.Button("Compute Holonomy & Check Arithmeticity", variant="primary")
+                         symmetry_button = gr.Button("Compute Symmetry Group", variant="secondary")
+
+                     with gr.Column():
+                         arith_output = gr.Markdown("Results will appear here...")
+                         arith_plot = gr.Image(label="Trace Analysis")
+
+                 arith_button.click(
+                     compute_holonomy_analysis,
+                     inputs=[arith_real, arith_imag],
+                     outputs=[arith_output, arith_plot]
+                 )
+
+                 symmetry_button.click(
+                     compute_symmetry_analysis,
+                     inputs=[arith_real, arith_imag],
+                     outputs=[arith_output]
+                 )
+
+                 load_opt_arith_button.click(
+                     load_from_optimization,
+                     inputs=[opt_result_state],
+                     outputs=[arith_real, arith_imag]
+                 )
+
+                 gr.Markdown("""
+                 ### About Arithmeticity
+
+                 A hyperbolic 3-manifold is **arithmetic** if its holonomy representation lies in PSL(2, O_K)
+                 where O_K is the ring of integers in a number field K.
+
+                 For ideal polyhedra, this can be tested by computing:
+                 1. Holonomy generators (Penner-Rivin algorithm)
+                 2. Traces of these generators
+                 3. Checking if traces are integers (or lie in a number field)
+
+                 **Arithmetic polyhedra have deep number-theoretic significance!**
+
+                 If all traces are integers, the configuration is arithmetic and related to
+                 special lattices in hyperbolic space.
+                 """)
+
+             # ================================================================
+             # Tab 5: About
+             # ================================================================
+             with gr.Tab("ℹ️ About"):
+                 gr.Markdown("""
+                 ## About This Tool
+
+                 This GUI provides an interactive interface to the **Ideal Polyhedron Volume Toolkit**.
+
+                 ### Features
+
+                 - **Optimization**: Find maximal volume configurations using differential evolution
+                 - **Distribution Analysis**: Sample random configurations and analyze volume distributions
+                 - **3D Visualization**: View polyhedra in multiple models:
+                   - Delaunay triangulation in complex plane (2D)
+                   - Stereographic projection on unit sphere (3D)
+                   - Poincaré ball model (3D hyperbolic geometry)
+                 - **Arithmeticity Testing**: Check if polyhedra have arithmetic holonomy (Penner-Rivin)
+
+                 ### Mathematical Background
+
+                 Ideal polyhedra are polyhedra in hyperbolic 3-space with all vertices at infinity.
+                 Their volumes can be computed using:
+                 - Delaunay triangulation of vertex positions
+                 - Lobachevsky's formula for ideal tetrahedra
+                 - Stereographic projection from the complex plane
+
+                 ### Usage Tips
+
+                 1. Start with **Optimization** to find interesting configurations
+                 2. Use **Load from Optimization** in the Visualization tab to see results
+                 3. Adjust **subdivision level** for smoother 3D visualizations
+                 4. Compare sphere and Poincaré ball models to understand hyperbolic geometry
+
+                 ### Documentation
+
+                 - See `README.md` for installation and Python API
+                 - See `bin/README.md` for command-line tools
+                 - See `examples/` for research scripts
+
+                 ---
+
+                 **Version:** 0.3.0
+                 **License:** MIT
+                 """)
+
+         gr.Markdown("---")
+         gr.Markdown("*Ideal Polyhedron Volume Toolkit GUI*")
+
+     return demo
+
+
+ if __name__ == "__main__":
+     parser = argparse.ArgumentParser(
+         description="Launch Gradio GUI for Ideal Polyhedron Volume Toolkit",
+         formatter_class=argparse.RawDescriptionHelpFormatter,
+         epilog="""
+ Examples:
+   %(prog)s                      # Launch locally on 127.0.0.1:7860
+   %(prog)s --share              # Create shareable public link
+   %(prog)s --port 8080          # Use custom port
+   %(prog)s --share --port 8080  # Share with custom port
+ """
+     )
+
+     parser.add_argument('--share', action='store_true',
+                         help='Create a shareable public Gradio link')
+     parser.add_argument('--port', type=int, default=7860,
+                         help='Port to run the server on (default: 7860)')
+     parser.add_argument('--server-name', type=str, default="127.0.0.1",
+                         help='Server name/IP to bind to (default: 127.0.0.1)')
+
+     args = parser.parse_args()
+
+     demo = create_gui()
+
+     print("=" * 70)
+     print("🎨 Ideal Polyhedron Volume Toolkit - GUI")
+     print("=" * 70)
+     if args.share:
+         print("Creating shareable public link...")
+         print("⚠️ WARNING: Public links expose your local server to the internet")
+     else:
+         print(f"Launching local server at http://{args.server_name}:{args.port}")
+     print("=" * 70)
+
+     demo.launch(
+         share=args.share,
+         server_name=args.server_name,
+         server_port=args.port
+     )
bin/optimize_polyhedron.py ADDED
@@ -0,0 +1,263 @@
+ #!/usr/bin/env python3
+ """
+ General-purpose wrapper for optimizing ideal polyhedron volumes.
+
+ Usage:
+     python bin/optimize_polyhedron.py --vertices 7 --trials 10
+     python bin/optimize_polyhedron.py --vertices 12 --trials 20 --output custom_name.json
+ """
+
+ import argparse
+ import json
+ import numpy as np
+ import torch
+ from datetime import datetime
+ from pathlib import Path
+ import sys
+
+ from ideal_poly_volume_toolkit.geometry import (
+     delaunay_triangulation_indices,
+     triangle_volume_from_points_torch,
+ )
+ from scipy.optimize import differential_evolution
+
+
+ def spherical_to_complex(theta, phi):
+     """Convert spherical coordinates to complex via stereographic projection."""
+     return np.tan(theta/2) * np.exp(1j * phi)
+
+
+ def compute_volume(params, n_vertices):
+     """
+     Compute volume for a polyhedron with n_vertices.
+
+     First 3 vertices are fixed to break symmetry:
+     - z1 = 0
+     - z2 = 1
+     - z3 = i
+
+     Remaining (n_vertices - 3) vertices are parameterized by spherical coords.
+     """
+     # Fixed vertices
+     complex_points = [complex(0, 0), complex(1, 0), complex(0, 1)]
+
+     # Parameterized vertices (2 params each: theta, phi)
+     n_free = n_vertices - 3
+     for i in range(n_free):
+         theta = params[2*i]
+         phi = params[2*i + 1]
+         z = spherical_to_complex(theta, phi)
+         complex_points.append(z)
+
+     Z_np = np.array(complex_points, dtype=np.complex128)
+
+     # Delaunay triangulation
+     try:
+         idx = delaunay_triangulation_indices(Z_np)
+     except Exception:
+         return 1000.0  # Penalty for degenerate configuration
+
+     # Convert to torch for volume computation
+     Z_torch = torch.tensor(Z_np, dtype=torch.complex128)
+
+     # Compute total volume
+     total_volume = 0.0
+     for (i, j, k) in idx:
+         try:
+             vol = triangle_volume_from_points_torch(
+                 Z_torch[i], Z_torch[j], Z_torch[k], series_terms=96
+             )
+             total_volume += vol.item()
+         except Exception:
+             return 1000.0  # Penalty for invalid configuration
+
+     return -total_volume  # Negative for minimization
+
+
+ def analyze_structure(Z_np, idx):
+     """Analyze the combinatorial structure of the triangulation."""
+     n_vertices = len(Z_np)
+     n_faces = len(idx)
+
+     # Count edges
+     edges = set()
+     for (i, j, k) in idx:
+         edges.add((min(i, j), max(i, j)))
+         edges.add((min(i, k), max(i, k)))
+         edges.add((min(j, k), max(j, k)))
+     n_edges = len(edges)
+
+     # Vertex degrees
+     vertex_degrees = [0] * n_vertices
+     for edge in edges:
+         vertex_degrees[edge[0]] += 1
+         vertex_degrees[edge[1]] += 1
+
+     # Euler characteristic check
+     euler_char = n_vertices - n_edges + n_faces
+
+     return {
+         'n_vertices': n_vertices,
+         'n_faces': n_faces,
+         'n_edges': n_edges,
+         'euler_characteristic': euler_char,
+         'vertex_degrees': sorted(vertex_degrees),
+     }
+
+
+ def reconstruct_vertices(params, n_vertices):
+     """Reconstruct complex vertices from parameters."""
+     complex_points = [complex(0, 0), complex(1, 0), complex(0, 1)]
+
+     n_free = n_vertices - 3
+     for i in range(n_free):
+         theta = params[2*i]
+         phi = params[2*i + 1]
+         z = spherical_to_complex(theta, phi)
+         complex_points.append(z)
+
+     return np.array(complex_points, dtype=np.complex128)
+
+
+ def main():
+     parser = argparse.ArgumentParser(
+         description='Optimize ideal polyhedron volumes',
+         formatter_class=argparse.RawDescriptionHelpFormatter,
+         epilog="""
+ Examples:
+   %(prog)s --vertices 7 --trials 10
+   %(prog)s --vertices 12 --trials 20 --maxiter 300
+   %(prog)s --vertices 20 --trials 5 --output results/data/my_20vertex.json
+ """
+     )
+
+     parser.add_argument('--vertices', '-v', type=int, required=True,
+                         help='Number of vertices (must be >= 4)')
+     parser.add_argument('--trials', '-t', type=int, default=10,
+                         help='Number of optimization trials (default: 10)')
+     parser.add_argument('--maxiter', '-m', type=int, default=200,
+                         help='Max iterations per trial (default: 200)')
+     parser.add_argument('--popsize', '-p', type=int, default=15,
+                         help='Population size for differential evolution (default: 15)')
+     parser.add_argument('--output', '-o', type=str, default=None,
+                         help='Output JSON file (default: results/data/{n}vertex_optimization_TIMESTAMP.json)')
+     parser.add_argument('--seed', '-s', type=int, default=42,
+                         help='Random seed base (default: 42)')
+
+     args = parser.parse_args()
+
+     if args.vertices < 4:
+         print("Error: Number of vertices must be at least 4")
+         sys.exit(1)
+
+     # Setup output file
+     if args.output is None:
+         timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
+         output_file = f"results/data/{args.vertices}vertex_optimization_{timestamp}.json"
+     else:
+         output_file = args.output
+
+     # Ensure output directory exists
+     Path(output_file).parent.mkdir(parents=True, exist_ok=True)
+
+     # Calculate number of parameters (2 per free vertex)
+     n_free_vertices = args.vertices - 3
+     n_params = n_free_vertices * 2
+     bounds = [(0.1, np.pi - 0.1), (0, 2*np.pi)] * n_free_vertices
+
+     print("=" * 70)
+     print("Ideal Polyhedron Volume Optimization")
+     print("=" * 70)
+     print(f"Vertices: {args.vertices}")
+     print(f"Free vertices: {n_free_vertices} (parameterized)")
+     print(f"Parameters: {n_params} (spherical coordinates)")
+     print(f"Trials: {args.trials}")
+     print(f"Max iterations: {args.maxiter}")
+     print(f"Population: {args.popsize}")
+     print(f"Output: {output_file}")
+     print("=" * 70)
+
+     best_volume = 0
+     best_params = None
+     all_results = []
+
+     for trial in range(args.trials):
+         print(f"\nTrial {trial + 1}/{args.trials}...")
+
+         result = differential_evolution(
+             lambda p: compute_volume(p, args.vertices),
+             bounds,
+             maxiter=args.maxiter,
+             popsize=args.popsize,
+             seed=args.seed + trial,
+             polish=True,
+             disp=False
+         )
+
+         volume = -result.fun
+         print(f"  Volume: {volume:.8f}")
+         print(f"  Success: {result.success}")
+         print(f"  Iterations: {result.nit}")
+
+         all_results.append({
+             'trial': trial + 1,
+             'volume': float(volume),
+             'params': result.x.tolist(),
+             'success': bool(result.success),
+             'iterations': int(result.nit),
+             'function_evals': int(result.nfev)
+         })
+
+         if volume > best_volume:
+             best_volume = volume
+             best_params = result.x
+             print("  → NEW BEST!")
+
+     if best_params is None:
+         print("\nNo trial produced a positive volume; all configurations were degenerate.")
+         sys.exit(1)
+
+     # Analyze best configuration
+     print("\n" + "=" * 70)
+     print("BEST RESULT:")
+     print("=" * 70)
+     print(f"Volume: {best_volume:.10f}")
+
+     # Reconstruct and analyze
+     Z_np = reconstruct_vertices(best_params, args.vertices)
+     idx = delaunay_triangulation_indices(Z_np)
+     structure = analyze_structure(Z_np, idx)
+
+     print("\nCombinatorial structure:")
+     print(f"  Vertices: {structure['n_vertices']}")
+     print(f"  Edges: {structure['n_edges']}")
+     print(f"  Faces: {structure['n_faces']}")
+     print(f"  Euler char: {structure['euler_characteristic']} (should be 2 for sphere)")
+     print(f"  Vertex degrees: {structure['vertex_degrees']}")
+
+     # Save results
+     output_data = {
+         'metadata': {
+             'timestamp': datetime.now().isoformat(),
+             'n_vertices': args.vertices,
+             'n_trials': args.trials,
+             'maxiter': args.maxiter,
+             'popsize': args.popsize,
+             'seed_base': args.seed,
+         },
+         'best': {
+             'volume': float(best_volume),
+             'params': best_params.tolist(),
+             'vertices_real': Z_np.real.tolist(),
+             'vertices_imag': Z_np.imag.tolist(),
+             'structure': structure,
+             'triangulation': [list(map(int, tri)) for tri in idx],
+         },
+         'all_trials': all_results,
+     }
+
+     with open(output_file, 'w') as f:
+         json.dump(output_data, f, indent=2)
+
+     print(f"\nResults saved to: {output_file}")
+     print("=" * 70)
+
+
+ if __name__ == '__main__':
+     main()
bin/results/data/20vertex_optimization_20251026_203700.json ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:6d5c59c2eff34286466db53ecd01ec78225d18bf90c079c9e135f94943acc02e
+ size 2270
bin/results/data/7vertex_optimization_20251026_191114.json ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:95afdc4df0c3c803fd61f1e8aa8575595273114afa20650b3f135b837ea44eff
+ size 1141
bin/results/data/7vertex_optimization_20251026_193813.json ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:e689212237436e5f0e27cfed6a331d4f35c0acefc2a191c543801a9ea00a2f28
+ size 1141
bin/results/data/7vertex_optimization_20251026_194915.json ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:cabef18150170f696ba1de13d3a78c94846c8da6dc504f7b31e3fa5bedc919f1
+ size 1141
bin/results/data/7vertex_optimization_20251026_195701.json ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:6b10bdad76be2cae8ae2013d4fde01eb7469d09e98e0b92048287d11914854a7
+ size 894
bin/results/data/7vertex_optimization_20251026_200737.json ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:3595713fd3c48b86f2ea9c81c567d340f075f014ff27004ec16454f1e2850463
+ size 894
bin/results/data/7vertex_optimization_20251026_201344.json ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:c0dd44c390c87270a8449ce4fa672e5931975d7cc657c678de8a678eaa753041
+ size 894
bin/results/data/7vertex_optimization_20251026_202024.json ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:1b508899ef59f5dde600783de57f0299f17b852d8dfae29deec2aa8f22c25943
+ size 894
bin/results/data/7vertex_optimization_20251026_205947.json ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:b19d5cb571808690d50c3eab47b7054b243250d5a6db1252d74df50d7318a938
+ size 901
bin/results/data/9vertex_optimization_20251026_205959.json ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:5daec075aa4a4fe8e0765f38daf454e16a6df26bdb092b0c6c56f40e7fddd204
+ size 1109
bin/results/data/9vertex_optimization_20251026_210116.json ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:0ee32067e12f06496d538600e41a2204da27e1f978596b0e05d6007a53a82f7f
+ size 1200
PLATONIC_MAXIMALITY_RESULTS.md → docs/PLATONIC_MAXIMALITY_RESULTS.md RENAMED
File without changes
RESULTS_SUMMARY.md → docs/RESULTS_SUMMARY.md RENAMED
File without changes
examples/README.md ADDED
@@ -0,0 +1,44 @@
+ # Examples Directory
+
+ This directory contains organized example scripts demonstrating the use of the ideal polyhedra volume toolkit.
+
+ ## Directory Structure
+
+ ### `distributions/`
+ Analysis of volume distributions for various polyhedra:
+ - `tetrahedron/` - 4-vertex polyhedra volume distributions (10 files)
+ - `five_vertex/` - 5-vertex polyhedra distributions
+ - `six_vertex/` - 6-vertex polyhedra distributions
+ - `euclidean/` - Euclidean tetrahedra analysis and fitting
+
+ ### `optimization/`
+ Optimization scripts organized by vertex count:
+ - `7vertex/` - 7-vertex optimization (octahedron with stellated face hypothesis) - 9 scripts
+ - `12vertex/` - 12-vertex optimization - 3 scripts
+ - `20vertex/` - 20-vertex optimization (icosahedron-like) - 6 scripts
+ - `platonic/` - Platonic solid analysis and perturbations
+
+ ### `visualization/`
+ Visualization scripts (5 scripts):
+ - Sphere projection visualizations
+ - Golden ratio configurations
+ - Maximal volume configurations
+ - Volume landscapes
+
+ ### `analysis/`
+ Statistical and theoretical analysis scripts (14 scripts):
+ - Beta distribution theory
+ - Central limit theorem analysis
+ - Combinatorial mixture analysis
+ - Special configuration analysis
+
+ ## Running Examples
+
+ All examples can be run from their respective directories:
+
+ ```bash
+ cd examples/optimization/7vertex
+ python optimize_7vertex.py
+ ```
+
+ Or from the project root using absolute imports (package must be installed with `pip install -e .`).
{ideal_poly_volume_toolkit/examples → examples/analysis}/analytical_challenge_simple.py RENAMED
File without changes
{ideal_poly_volume_toolkit/examples → examples/analysis}/analytical_mean_challenge.py RENAMED
File without changes
{ideal_poly_volume_toolkit/examples → examples/analysis}/analyze_both_configs.py RENAMED
File without changes
{ideal_poly_volume_toolkit/examples → examples/analysis}/analyze_distribution_shape.py RENAMED
File without changes
{ideal_poly_volume_toolkit/examples → examples/analysis}/analyze_special_configs.py RENAMED
File without changes
{ideal_poly_volume_toolkit/examples → examples/analysis}/beta_distribution_theory.py RENAMED
File without changes
{ideal_poly_volume_toolkit/examples → examples/analysis}/beta_fit_analysis.py RENAMED
File without changes
{ideal_poly_volume_toolkit/examples → examples/analysis}/check_lob_math.py RENAMED
File without changes
{ideal_poly_volume_toolkit/examples → examples/analysis}/check_statistical_precision.py RENAMED
File without changes
{ideal_poly_volume_toolkit/examples → examples/analysis}/clt_analysis.py RENAMED
File without changes
{ideal_poly_volume_toolkit/examples → examples/analysis}/combinatorial_mixture_analysis.py RENAMED
File without changes
{ideal_poly_volume_toolkit/examples → examples/analysis}/concentration_location_analysis.py RENAMED
File without changes
create_distribution_comparison_plot.py → examples/analysis/create_distribution_comparison_plot.py RENAMED
File without changes
{ideal_poly_volume_toolkit/examples → examples/analysis}/sanity_check_5_7_vertices.py RENAMED
File without changes
euclidean_distribution_fitting.py → examples/distributions/euclidean/euclidean_distribution_fitting.py RENAMED
File without changes
euclidean_fit_analysis.py → examples/distributions/euclidean/euclidean_fit_analysis.py RENAMED
File without changes
euclidean_tetrahedron_distribution.py → examples/distributions/euclidean/euclidean_tetrahedron_distribution.py RENAMED
File without changes
{ideal_poly_volume_toolkit/examples → examples/distributions/five_vertex}/five_vertex_distribution.py RENAMED
File without changes
debug_six_vertex.py → examples/distributions/six_vertex/debug_six_vertex.py RENAMED
File without changes
{ideal_poly_volume_toolkit/examples → examples/distributions/six_vertex}/six_vertex_distribution.py RENAMED
File without changes
{ideal_poly_volume_toolkit/examples → examples/distributions/tetrahedron}/ideal_tetrahedron.py RENAMED
File without changes
{ideal_poly_volume_toolkit/examples → examples/distributions/tetrahedron}/quick_tetrahedron_analysis.py RENAMED
File without changes
{ideal_poly_volume_toolkit/examples → examples/distributions/tetrahedron}/run_tetrahedron_distribution.py RENAMED
File without changes
{ideal_poly_volume_toolkit/examples → examples/distributions/tetrahedron}/tetrahedron_volume_distribution.py RENAMED
File without changes
examples/distributions/tetrahedron/tetrahedron_volume_histogram.png ADDED
Git LFS Details
  • SHA256: 5239d843cc9362cd1e26f3f13b9ba878e8703c8db95dab17a13658edf9ebf88f
  • Pointer size: 130 Bytes
  • Size of remote file: 53.9 kB
{ideal_poly_volume_toolkit/examples → examples/optimization/12vertex}/analyze_12vertex_combinatorics.py RENAMED
File without changes
{ideal_poly_volume_toolkit/examples → examples/optimization/12vertex}/analyze_12vertex_results.py RENAMED
File without changes
{ideal_poly_volume_toolkit/examples → examples/optimization/12vertex}/visualize_maximal_12vertex.py RENAMED
File without changes
check_arithmetic_holonomy.py → examples/optimization/20vertex/check_arithmetic_holonomy.py RENAMED
@@ -1,9 +1,7 @@
  import numpy as np
- import sys
- sys.path.append('/Users/igorrivin/devel/platonic')
- from rivin_holonomy import Triangulation, generators_from_triangulation
  import json
  from scipy.spatial import Delaunay

  def compute_holonomy_traces(vertices_complex, triangles):
      """
+ from ideal_poly_volume_toolkit.rivin_holonomy import Triangulation, generators_from_triangulation
check_local_maxima_arithmetic.py → examples/optimization/20vertex/check_local_maxima_arithmetic.py RENAMED
@@ -6,11 +6,9 @@ Tests the conjecture that local maxima are more likely to be arithmetic.

  import numpy as np
  import json
- import sys
- sys.path.append('/Users/igorrivin/devel/platonic')
- from rivin_holonomy import Triangulation, generators_from_triangulation
- from scipy.spatial import Delaunay
  import os
+ from scipy.spatial import Delaunay
+ from ideal_poly_volume_toolkit.rivin_holonomy import Triangulation, generators_from_triangulation

  def build_triangulation_from_config(vertices_dict):
      """Convert vertex configuration to triangulation for holonomy computation."""