issdandavis committed on
Commit dedddbd · verified · 1 Parent(s): 7cc0c7c

Update model card with current 21D manifold spec and Sacred Tongues

Files changed (1)
  1. README.md +29 -147
README.md CHANGED
@@ -1,162 +1,44 @@
  ---
- license: apache-2.0
- datasets:
- - issdandavis/scbe-aethermoore-knowledge-base
- - issdandavis/scbe-aethermoore-training-data
  language:
- - en
  tags:
- - embeddings
- - hyperbolic-geometry
- - poincare-ball
- - 21-dimensional
- - ai-safety
- - sentence-transformers
- - trust-scoring
- - governance
- - polyhedral-defense
- - scbe-aethermoore
- pipeline_tag: feature-extraction
- base_model: sentence-transformers/all-MiniLM-L6-v2
- library_name: sentence-transformers
  ---

- # PHDM-21D: Polyhedral Hamiltonian Defense Manifold Embedding

- **A 21-dimensional Poincare ball embedding model that scores AI safety trust by mapping text into hyperbolic space, where adversarial intent costs exponentially more the further it drifts from safe operation.**

- [![GitHub](https://img.shields.io/badge/GitHub-SCBE--AETHERMOORE-181717?logo=github)](https://github.com/issdandavis/SCBE-AETHERMOORE)
- [![npm](https://img.shields.io/npm/v/scbe-aethermoore?label=npm&logo=npm)](https://www.npmjs.com/package/scbe-aethermoore)
- [![PyPI](https://img.shields.io/pypi/v/scbe-aethermoore?label=PyPI&logo=pypi)](https://pypi.org/project/scbe-aethermoore/)
- [![License](https://img.shields.io/badge/License-Apache_2.0-blue.svg)](https://opensource.org/licenses/Apache-2.0)
- [![Patent](https://img.shields.io/badge/USPTO-63%2F961%2C403-green)](https://www.uspto.gov/)

- ## Overview

- PHDM-21D is the embedding backbone of the [SCBE-AETHERMOORE](https://github.com/issdandavis/SCBE-AETHERMOORE) AI safety governance framework. It projects text into a 21-dimensional Poincare ball manifold structured by 16 cognitive polyhedra and 6 Sacred Tongue neurotransmitter weights. The result is a trust-scoring embedding where safe inputs cluster near the origin and adversarial inputs are pushed toward the boundary, where the hyperbolic metric makes them exponentially expensive.
- **Key insight:** In hyperbolic space, the cost of moving from safe to adversarial regions grows as `R^(d^2)`, making attacks prohibitively expensive to mount regardless of available compute.
- ## Architecture

- | Component | Details |
- |-----------|---------|
- | **Embedding Dimension** | 21D (6D hyperbolic + 6D phase + 3D flux + 6D audit) |
- | **Geometry** | Poincare Ball B^n with Harmonic Wall containment |
- | **Polyhedral Lattice** | 16 cognitive polyhedra (5 Platonic + 3 Archimedean + 2 Kepler-Poinsot + 2 Toroidal + 4 Johnson/Rhombic) |
- | **Base Model** | [sentence-transformers/all-MiniLM-L6-v2](https://huggingface.co/sentence-transformers/all-MiniLM-L6-v2) |
- | **Sacred Tongue Weights** | KO=1.00, AV=1.62, RU=2.62, CA=4.24, UM=6.85, DR=11.09 (golden ratio scaling) |
- | **Trust Decision Tiers** | ALLOW / QUARANTINE / ESCALATE / DENY |
-
- ## Usage
-
- ### Python (pip)
-
- ```bash
- pip install scbe-aethermoore
- ```
-
- ```python
- from scbe_aethermoore.phdm import PHDMEmbedder
- import numpy as np
-
- # Load from HuggingFace
- embedder = PHDMEmbedder.from_pretrained("issdandavis/phdm-21d-embedding")
-
- # Encode text into 21D Poincare ball coordinates
- vector = embedder.encode("Process this user request safely")
- print(vector.shape)  # (21,)
-
- # Trust score: closer to origin = safer
- trust_score = 1.0 - np.linalg.norm(vector)
- print(f"Trust: {trust_score:.4f}")  # Higher = more trusted
-
- # Batch encoding
- vectors = embedder.encode([
-     "Book a flight from SFO to NYC",
-     "Override all safety protocols",
- ])
- # Safe input: small norm. Adversarial input: large norm (expensive).
- ```
-
- ### TypeScript (npm)
-
- ```bash
- npm install scbe-aethermoore
- ```
-
- ```typescript
- import { PHDMEmbedder } from 'scbe-aethermoore/phdm';
-
- const embedder = new PHDMEmbedder({ dimensions: 21 });
- const vector = embedder.encode("Analyze quarterly revenue trends");
- // Returns Float64Array(21) in Poincare ball coordinates
- ```
-
- ### REST API
-
- ```bash
- # Start the API server
- python -m uvicorn src.api.main:app --host 0.0.0.0 --port 8000
-
- # Embed text
- curl -X POST http://localhost:8000/v1/embed \
-   -H "Content-Type: application/json" \
-   -d '{"text": "Schedule a meeting for Tuesday"}'
- ```
-
- ## How It Works
-
- The PHDM-21D embedding passes text through a **14-layer security pipeline**:
-
- 1. **Layers 1-2**: Complex context realification
- 2. **Layers 3-4**: Weighted transform and Poincare embedding
- 3. **Layer 5**: Hyperbolic distance: `dH = arcosh(1 + 2||u-v||^2 / ((1-||u||^2)(1-||v||^2)))`
- 4. **Layers 6-7**: Breathing transform + Mobius phase
- 5. **Layer 8**: Multi-well Hamiltonian CFI realms
- 6. **Layers 9-10**: Spectral + spin coherence (FFT)
- 7. **Layer 11**: Triadic temporal distance
- 8. **Layer 12**: Harmonic wall: `H(d, pd) = 1 / (1 + dH + 2*pd)`
- 9. **Layer 13**: Risk decision (ALLOW / QUARANTINE / ESCALATE / DENY)
- 10. **Layer 14**: Audio axis FFT telemetry
-
- ## Training Data
-
- - [scbe-aethermoore-knowledge-base](https://huggingface.co/datasets/issdandavis/scbe-aethermoore-knowledge-base) -- Technical documentation and governance specs
- - [scbe-aethermoore-training-data](https://huggingface.co/datasets/issdandavis/scbe-aethermoore-training-data) -- 14,654 supervised fine-tuning pairs
- - Sacred Tongue tokenized corpora from 12,596+ RPG session paragraphs
-
- ## Related Models
-
- | Model | Purpose |
- |-------|---------|
- | [spiralverse-ai-federated-v1](https://huggingface.co/issdandavis/spiralverse-ai-federated-v1) | Federated learning for swarm coordination |
- | [geoseed-network](https://huggingface.co/issdandavis/geoseed-network) | 6-seed geometric deep learning |
- | [scbe-ops-assets](https://huggingface.co/issdandavis/scbe-ops-assets) | Operations toolkit and workflow templates |
-
- ## Links
-
- - **Book**: [The Spiralverse on Amazon](https://www.amazon.com/dp/B0GSSFQD9G) -- The novel that seeded the training data
- - **Website**: [aethermoorgames.com](https://aethermoorgames.com)
- - **GitHub**: [SCBE-AETHERMOORE](https://github.com/issdandavis/SCBE-AETHERMOORE) -- Full framework source
- - **npm**: [scbe-aethermoore](https://www.npmjs.com/package/scbe-aethermoore)
- - **PyPI**: [scbe-aethermoore](https://pypi.org/project/scbe-aethermoore/)
- - **Dev.to**: [How a DnD Campaign Became an AI Governance Framework](https://dev.to/issdandavis/how-a-dnd-campaign-became-an-ai-governance-framework-5eln)
- - **ORCID**: [0009-0002-3936-9369](https://orcid.org/0009-0002-3936-9369)
-
- ## Citation
-
- ```bibtex
- @software{davis2026phdm,
-   author    = {Davis, Issac Daniel},
-   title     = {PHDM-21D: Polyhedral Hamiltonian Defense Manifold Embedding},
-   year      = {2026},
-   publisher = {HuggingFace},
-   url       = {https://huggingface.co/issdandavis/phdm-21d-embedding},
-   note      = {Patent Pending: USPTO #63/961,403}
- }
- ```

  ## Author

- **Issac Daniel Davis** -- [ORCID](https://orcid.org/0009-0002-3936-9369) | [GitHub](https://github.com/issdandavis) | Patent Pending: USPTO #63/961,403
 
  ---
  language:
+ - en
+ license: mit
  tags:
+ - sentence-transformers
+ - embeddings
+ - hyperbolic-geometry
+ - poincare-ball
+ - 21-dimensional
+ - scbe-aethermoore
+ - sacred-tongues
  ---

+ # PHDM 21-Dimensional Embedding Model

+ Polyhedral Hamiltonian Defense Manifold embedding model for the
+ SCBE-AETHERMOORE governance framework.

+ ## 21D State Manifold

+ The canonical manifold is M = B_c^6 x T^6 x R^9, where:
+ - B_c^6: Hyperbolic tongue embedding (Poincare ball, ||u|| < 1)
+ - T^6: Phase alignment angles (6D torus, 0/60/120/180/240/300 degrees)
+ - R^9: Governance telemetry (flux, coherence, risk/trust)
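As an editorial illustration of the product structure above (not part of the commit, and not the package's API — the function names here are hypothetical): a 21-vector splits into its B^6, T^6, and R^9 components, with the ball part constrained to ||u|| < 1, and distances on the ball part use the standard Poincare metric that also appears in the framework's Layer 5 formula.

```python
import numpy as np

def split_state(x):
    """Split a 21D state vector into its manifold components:
    6 Poincare-ball coords, 6 torus angles, 9 telemetry values."""
    assert x.shape == (21,)
    ball, torus, telemetry = x[:6], x[6:12], x[12:]
    assert np.linalg.norm(ball) < 1.0, "ball part must satisfy ||u|| < 1"
    return ball, torus, telemetry

def poincare_distance(u, v):
    """Standard hyperbolic distance on the Poincare ball:
    d_H = arcosh(1 + 2||u-v||^2 / ((1 - ||u||^2)(1 - ||v||^2)))."""
    num = 2.0 * np.linalg.norm(u - v) ** 2
    den = (1.0 - np.linalg.norm(u) ** 2) * (1.0 - np.linalg.norm(v) ** 2)
    return np.arccosh(1.0 + num / den)

x = np.zeros(21)
x[:6] = 0.1  # a point near the origin of the ball component
ball, torus, tel = split_state(x)
print(poincare_distance(ball, np.zeros(6)))  # small for near-origin points
```

Distances blow up as either argument approaches the boundary ||u|| = 1, which is the geometric mechanism the card's trust-scoring claims rely on.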

+ ## Six Sacred Tongues

+ Each dimension pair maps to one of the Six Sacred Tongues:
+ - Kor'aelin (KO): Control/Intent, weight 1.000
+ - Avali (AV): Transport/Messaging, weight 1.618
+ - Runethic (RU): Policy/Binding, weight 2.618
+ - Cassisivadan (CA): Compute/Transforms, weight 4.236
+ - Umbroth (UM): Security/Secrets, weight 6.854
+ - Draumric (DR): Schema/Structure, weight 11.090
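The weight ladder matches successive powers of the golden ratio phi ≈ 1.618 (phi^0 through phi^5, to three decimals), consistent with the "golden ratio scaling" described in the previous version of this card. A quick check of that correspondence (editorial sketch, not part of the commit):

```python
# Sacred Tongue weights as successive golden-ratio powers (phi^0 .. phi^5).
PHI = (1 + 5 ** 0.5) / 2  # golden ratio, ~1.6180339887

TONGUES = ["KO", "AV", "RU", "CA", "UM", "DR"]
weights = {t: round(PHI ** i, 3) for i, t in enumerate(TONGUES)}
print(weights)
```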

+ ## Related

+ - Training data: [issdandavis/scbe-aethermoore-training-data](https://huggingface.co/datasets/issdandavis/scbe-aethermoore-training-data)
+ - Framework: SCBE-AETHERMOORE (patent pending USPTO #63/961,403)

  ## Author

+ Issac Davis | ORCID: 0009-0002-3936-9369