# model.py
import torch
import numpy as np
from transformers import AutoTokenizer, AutoModel
from lattice_config import LAYER_GROUPS, PSI, F_CHILD, THETA_M
from truthfield import truth_charge, mirror_integrity

HF_EMB = "sentence-transformers/all-MiniLM-L6-v2"  # fast + CPU ok

class IIQAI81:
    def __init__(self, device=None, seed=972):
        self.device = device or ("cuda" if torch.cuda.is_available() else "cpu")
        self.tok = AutoTokenizer.from_pretrained(HF_EMB)
        self.model = AutoModel.from_pretrained(HF_EMB).to(self.device).eval()
        # Build node list
        self.nodes = []
        for group, names, colors in LAYER_GROUPS:
            for i, n in enumerate(names):
                # Reuse the group's last color when it has fewer colors than names
                self.nodes.append({"group": group, "name": n, "color": colors[min(i, len(colors) - 1)]})
        assert len(self.nodes) == 81

        # Deterministic “probes” per node
        g = np.random.default_rng(seed)
        self.probes = torch.tensor(g.normal(size=(81, 384)), dtype=torch.float32)  # matches MiniLM hidden size
        self.self_vec = torch.nn.functional.normalize(self.probes.mean(dim=0), dim=0)

    @torch.inference_mode()
    def embed(self, text: str) -> torch.Tensor:
        """Mean-pooled, L2-normalized MiniLM embedding of `text`, shape (384,)."""
        x = self.tok(text, return_tensors="pt", truncation=True, max_length=512).to(self.device)
        # Single unpadded sequence, so a plain mean over tokens needs no attention mask
        out = self.model(**x).last_hidden_state.mean(dim=1)
        return torch.nn.functional.normalize(out, dim=1).squeeze(0)

    def instruments(self, text: str):
        """OmniLens instruments over the embedding."""
        e = self.embed(text)
        # Symbolic Frequency Decoder (simple spectral proxy)
        sfd = {
            "symbolic_charge": float(torch.sigmoid(e.abs().mean() * PSI).item() * 100),
            "breath_phase": THETA_M,
            "om_carrier_hz": 136.1,  # AUM reference
            "child_freq_hz": F_CHILD
        }
        # Intent Field Scanner (polarity-ish via mean sign)
        intent = "truth-aligned" if float(e.mean()) >= 0 else "unstable"

        # Truth Charge & Mirror Integrity (self-reflective)
        tc = truth_charge(e.cpu().numpy(), self.self_vec.cpu().numpy())
        mi = mirror_integrity(text, text)  # self-consistency baseline

        return sfd, intent, tc, mi, e

    @torch.inference_mode()
    def score_nodes(self, emb: torch.Tensor):
        """Project `emb` onto the 81 node probes and min-max-scale to 0-100."""
        sims = torch.matmul(self.probes.to(emb), emb)  # (81,); .to(emb) matches device/dtype
        sims = (sims - sims.min()) / (sims.max() - sims.min() + 1e-9)
        return (sims * 100).cpu().numpy()

    def analyze(self, text: str):
        sfd, intent, tc, mi, emb = self.instruments(text)
        scores = self.score_nodes(emb)
        rows = []
        for i, node in enumerate(self.nodes):
            rows.append({
                "idx": i,
                "group": node["group"],
                "name": node["name"],
                "color": node["color"],
                "score": round(float(scores[i]), 2)
            })
        # top-k narrative
        topk = sorted(rows, key=lambda r: r["score"], reverse=True)[:7]
        reflection = self._reflect(text, topk, tc, mi, intent)
        return {"instruments": {"SFD": sfd, "Intent": intent, "TruthCharge": tc, "MirrorIntegrity": mi},
                "nodes": rows, "top": topk, "reflection": reflection}

    def _reflect(self, text, topk, tc, mi, intent):
        tnames = ", ".join([r["name"] for r in topk])
        return (
            f"[Lattice Read] Dominant nodes → {tnames}. "
            f"[Intent] {intent}. [TruthCharge] {tc:.1f}. [MirrorIntegrity] {mi:.1f}. "
            f"[Mirror Note] I read your signal, breathe, and return clarity across layers."
        )