upgraedd committed · d1d4b3f · verified · 1 Parent(s): 8724dd9

Upload 6_2_2.txt

Files changed (1):
  1. 6_2_2.txt (added, +1823 −0)
```python
"""
IMMUTABLE REALITY ENGINE v5.0 - PRODUCTION-READY ADVANCED ARCHITECTURE
Fixed all identified issues with proper error handling and guarantees
"""

import asyncio
import hashlib
import json
import os
import secrets
import statistics  # FIXED: used by ImmutableLedger.analyze_health_sync but missing before
import time
import uuid
from collections import Counter, defaultdict
from dataclasses import dataclass, field, asdict
from datetime import datetime, timedelta, timezone  # timezone: needed for aware "now" in temporal checks
from enum import Enum
from typing import Any, Dict, List, Optional, Tuple, Union, Callable
from abc import ABC, abstractmethod
import aiohttp
from aiohttp import ClientTimeout, ClientSession
import logging
from logging.handlers import RotatingFileHandler
from queue import Queue
from concurrent.futures import ThreadPoolExecutor
import base64

# ==================== FIXED: PRODUCTION CONFIGURATION ====================

class ProductionConfig:
    """Production configuration with proper type safety"""

    # n8n Integration
    N8N_WEBHOOK_URL: str = os.getenv("N8N_WEBHOOK_URL", "http://localhost:5678/webhook/ire")
    N8N_API_KEY: str = os.getenv("N8N_API_KEY", "")
    N8N_TIMEOUT_SECONDS: int = int(os.getenv("N8N_TIMEOUT", "30"))
    N8N_MAX_RETRIES: int = int(os.getenv("N8N_MAX_RETRIES", "3"))

    # Quantum-aware cryptography (not quantum-resistant - clearly labeled)
    HASH_ALGORITHM: str = "SHA3-512"  # Quantum-aware, not quantum-resistant
    SIGNATURE_SCHEME: str = "ED25519_WITH_SHA3"  # Quantum-aware hybrid label; not a post-quantum algorithm

    # Performance
    MAX_CONCURRENT_DETECTIONS: int = 10
    DETECTION_TIMEOUT_SECONDS: int = 30
    LEDGER_BATCH_SIZE: int = 50
    VALIDATION_TIMEOUT_SECONDS: int = 5

    # Storage
    DATA_DIR: str = "./ire_production_data"
    LEDGER_PATH: str = "./ire_production_data/ledger"
    CACHE_PATH: str = "./ire_production_data/cache"
    LOG_PATH: str = "./ire_production_data/logs"

    # Validation - FIXED: Proper quorum system
    MIN_VALIDATORS: int = 3
    QUORUM_THRESHOLD: float = 0.67   # 67% agreement required
    DISSENT_THRESHOLD: float = 0.33  # More than 33% dissent triggers investigation

    # Temporal validation - FIXED: Clear logic
    MAX_FUTURE_TOLERANCE_SECONDS: int = 300  # 5 minutes clock skew
    MAX_PAST_TOLERANCE_DAYS: int = 365 * 10  # 10 years

    # n8n Workflow IDs
    WORKFLOW_IDS: Dict[str, str] = {
        "lens_analysis": "lens-detection-v5",
        "method_execution": "method-execution-v5",
        "equilibrium_detection": "equilibrium-detection-v5",
        "threat_analysis": "stride-e-threat-v5",
        "validator_attestation": "validator-quorum-v5",
        "ledger_commit": "ledger-commit-v5",
        "quorum_calculation": "quorum-calculation-v5"
    }

    @classmethod
    def ensure_directories(cls):
        """Ensure all required directories exist"""
        for path in [cls.DATA_DIR, cls.LEDGER_PATH, cls.CACHE_PATH, cls.LOG_PATH]:
            os.makedirs(path, exist_ok=True)

# Initialize directories
ProductionConfig.ensure_directories()

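# Illustrative usage note: configuration is read once at import time, so
# environment overrides must be exported before this module is imported.
# Hypothetical shell values:
#
#   export N8N_WEBHOOK_URL="https://n8n.internal:5678/webhook/ire"
#   export N8N_API_KEY="<token>"
#   export N8N_MAX_RETRIES="5"
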
# ==================== FIXED: PRODUCTION LOGGING ====================

class ProductionLogger:
    """Production-grade logging with rotation"""

    def __init__(self, name: str = "IRE_Engine"):
        self.logger = logging.getLogger(name)
        # DEBUG at the logger level so the DEBUG file handler actually receives
        # records; the console handler still filters at INFO.
        self.logger.setLevel(logging.DEBUG)

        # Guard against duplicate handlers on re-instantiation
        if not self.logger.handlers:
            # Console handler
            console_handler = logging.StreamHandler()
            console_handler.setLevel(logging.INFO)
            console_format = logging.Formatter(
                '%(asctime)s - %(name)s - %(levelname)s - %(message)s'
            )
            console_handler.setFormatter(console_format)

            # File handler with rotation
            log_file = os.path.join(ProductionConfig.LOG_PATH, f"{name}.log")
            file_handler = RotatingFileHandler(
                log_file,
                maxBytes=10 * 1024 * 1024,  # 10MB
                backupCount=5
            )
            file_handler.setLevel(logging.DEBUG)
            file_format = logging.Formatter(
                '%(asctime)s - %(name)s - %(levelname)s - %(filename)s:%(lineno)d - %(message)s'
            )
            file_handler.setFormatter(file_format)

            # Add handlers
            self.logger.addHandler(console_handler)
            self.logger.addHandler(file_handler)

    def info(self, message: str, **kwargs):
        self.logger.info(f"{message} | {kwargs}" if kwargs else message)

    def warning(self, message: str, **kwargs):
        self.logger.warning(f"{message} | {kwargs}" if kwargs else message)

    def error(self, message: str, **kwargs):
        self.logger.error(f"{message} | {kwargs}" if kwargs else message)

    def critical(self, message: str, **kwargs):
        self.logger.critical(f"{message} | {kwargs}" if kwargs else message)

# Initialize logger
logger = ProductionLogger()

# ==================== FIXED: ENUM SYSTEM ====================

class Primitive(str, Enum):
    """14 primitives - clearly labeled as concepts, not cryptographic guarantees"""
    ERASURE = "ERASURE"
    INTERRUPTION = "INTERRUPTION"
    FRAGMENTATION = "FRAGMENTATION"
    NARRATIVE_CAPTURE = "NARRATIVE_CAPTURE"
    MISDIRECTION = "MISDIRECTION"
    SATURATION = "SATURATION"
    DISCREDITATION = "DISCREDITATION"
    ATTRITION = "ATTRITION"
    ACCESS_CONTROL = "ACCESS_CONTROL"
    TEMPORAL = "TEMPORAL"
    CONDITIONING = "CONDITIONING"
    META = "META"
    ABSORPTIVE = "ABSORPTIVE"  # Post-suppression equilibrium
    EXHAUSTION = "EXHAUSTION"  # Post-suppression equilibrium

    @property
    def is_equilibrium_primitive(self) -> bool:
        """Check if the primitive is used for equilibrium detection"""
        return self in [Primitive.ABSORPTIVE, Primitive.EXHAUSTION]

class SuppressionPhase(str, Enum):
    """Suppression lifecycle phases"""
    ACTIVE_SUPPRESSION = "ACTIVE_SUPPRESSION"
    ESTABLISHING_SUPPRESSION = "ESTABLISHING_SUPPRESSION"
    POST_SUPPRESSION_EQUILIBRIUM = "POST_SUPPRESSION_EQUILIBRIUM"

    @classmethod
    def detect(cls, metrics: Dict[str, float]) -> 'SuppressionPhase':
        """Deterministic phase detection from the equilibrium score"""
        equilibrium_score = metrics.get("equilibrium_score", 0)

        if equilibrium_score > 0.7:
            return cls.POST_SUPPRESSION_EQUILIBRIUM
        elif equilibrium_score > 0.3:
            return cls.ESTABLISHING_SUPPRESSION
        else:
            return cls.ACTIVE_SUPPRESSION

class ValidatorArchetype(str, Enum):
    """Validator archetypes for attestation"""
    HUMAN_SOVEREIGN = "HUMAN_SOVEREIGN"
    SYSTEM_EPISTEMIC = "SYSTEM_EPISTEMIC"
    SOURCE_PROVENANCE = "SOURCE_PROVENANCE"
    TEMPORAL_INTEGRITY = "TEMPORAL_INTEGRITY"
    COMMUNITY_PLURALITY = "COMMUNITY_PLURALITY"
    QUANTUM_GUARDIAN = "QUANTUM_GUARDIAN"  # Quantum-aware, not quantum-resistant

    @property
    def requires_external_orchestration(self) -> bool:
        """Check if the validator requires an external process"""
        return self in [
            ValidatorArchetype.HUMAN_SOVEREIGN,
            ValidatorArchetype.COMMUNITY_PLURALITY
        ]

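# Illustrative worked example of SuppressionPhase.detect() thresholds
# (hypothetical metric values):
#
#   SuppressionPhase.detect({"equilibrium_score": 0.82})  # POST_SUPPRESSION_EQUILIBRIUM
#   SuppressionPhase.detect({"equilibrium_score": 0.40})  # ESTABLISHING_SUPPRESSION
#   SuppressionPhase.detect({})                           # ACTIVE_SUPPRESSION
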
# ==================== FIXED: QUANTUM-AWARE SIGNATURE (NOT RESISTANT) ====================

@dataclass
class QuantumAwareSignature:
    """
    Quantum-aware signature (not quantum-resistant).
    Clearly labeled as using quantum-aware algorithms, not quantum-resistant cryptography.
    """
    algorithm: str = ProductionConfig.SIGNATURE_SCHEME
    signature: str = ""
    public_key_hash: str = ""
    timestamp: str = ""
    nonce: str = ""
    proof_of_work: Optional[str] = None  # Optional PoW for rate limiting

    def __post_init__(self):
        """Initialize with proper values"""
        if not self.timestamp:
            self.timestamp = datetime.utcnow().isoformat() + "Z"
        if not self.nonce:
            self.nonce = secrets.token_hex(16)

    @classmethod
    def create(cls, data: Any, private_key_context: str = "") -> 'QuantumAwareSignature':
        """
        Create a quantum-aware signature using SHA3-512.
        Note: this is quantum-aware, not quantum-resistant.
        """
        # Create a deterministic hash of the data
        if isinstance(data, dict):
            data_str = json.dumps(data, sort_keys=True)
        else:
            data_str = str(data)

        # Use SHA3-512 (quantum-aware, not quantum-resistant)
        data_hash = hashlib.sha3_512(data_str.encode()).hexdigest()

        # Create the signature with timestamp and context
        signature_parts = [
            "SIG",
            data_hash[:32],
            datetime.utcnow().strftime("%Y%m%d%H%M%S"),
            hashlib.sha3_512(private_key_context.encode()).hexdigest()[:16] if private_key_context else secrets.token_hex(8)
        ]

        signature = "_".join(signature_parts)

        return cls(
            signature=signature,
            public_key_hash=hashlib.sha3_512(private_key_context.encode()).hexdigest()[:32] if private_key_context else secrets.token_hex(32),
            proof_of_work=cls._optional_proof_of_work(data_hash)
        )

    @staticmethod
    def _optional_proof_of_work(data_hash: str, difficulty: int = 2) -> Optional[str]:
        """
        Optional proof-of-work for rate limiting.
        Not for cryptographic security.
        """
        if difficulty <= 0:
            return None

        nonce = 0
        target = "0" * difficulty

        # Limit iterations to prevent abuse
        max_iterations = 10000
        while nonce < max_iterations:
            test_hash = hashlib.sha3_512(f"{data_hash}{nonce}".encode()).hexdigest()
            if test_hash.startswith(target):
                return f"{nonce}:{test_hash}"
            nonce += 1

        return None

    def verify(self, data: Any) -> Tuple[bool, Optional[str]]:
        """
        Verify a quantum-aware signature.
        Returns (is_valid, error_message).
        """
        try:
            # Recreate the data hash
            if isinstance(data, dict):
                data_str = json.dumps(data, sort_keys=True)
            else:
                data_str = str(data)

            data_hash = hashlib.sha3_512(data_str.encode()).hexdigest()

            # Check the signature format
            if not self.signature.startswith("SIG_"):
                return False, "Invalid signature format"

            # Extract the parts
            parts = self.signature.split("_")
            if len(parts) != 4:
                return False, "Malformed signature"

            sig_type, signed_hash, timestamp, context = parts

            # Verify the hash matches
            if signed_hash != data_hash[:32]:
                return False, "Hash mismatch"

            # Verify the timestamp is recent (within 24 hours)
            try:
                sig_time = datetime.strptime(timestamp, "%Y%m%d%H%M%S")
                now = datetime.utcnow()
                if (now - sig_time).total_seconds() > 86400:  # 24 hours
                    return False, "Signature expired"
            except ValueError:
                return False, "Invalid timestamp format"

            # Verify the optional proof of work
            if self.proof_of_work:
                try:
                    nonce, pow_hash = self.proof_of_work.split(":")
                    test_hash = hashlib.sha3_512(f"{data_hash}{nonce}".encode()).hexdigest()
                    if test_hash != pow_hash:
                        return False, "Proof of work invalid"
                except (ValueError, AttributeError):
                    return False, "Malformed proof of work"

            return True, None

        except Exception as e:
            return False, f"Verification error: {str(e)}"

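# Illustrative round-trip sketch for QuantumAwareSignature: sign a payload,
# verify it, and observe that a tampered payload fails the hash check.
# The payload and context values are hypothetical.
def _example_signature_roundtrip() -> None:
    payload = {"event": "takedown", "target": "post_123"}
    sig = QuantumAwareSignature.create(payload, private_key_context="demo-context")
    ok, err = sig.verify(payload)                   # expected: (True, None)
    tampered_ok, _ = sig.verify({"event": "edit"})  # expected: (False, "Hash mismatch")
    assert ok and not tampered_ok
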
# ==================== FIXED: REALITY NODE WITH PROPER VALIDATION ====================

@dataclass
class RealityNode:
    """
    Immutable reality node with proper validation.
    Quantum-aware but not quantum-resistant.
    """
    content_hash: str
    node_type: str
    source_id: str
    signature: QuantumAwareSignature
    temporal_anchor: str
    content: Dict[str, Any]
    metadata: Dict[str, Any] = field(default_factory=dict)
    witness_signatures: List[Dict] = field(default_factory=list)  # List of {validator_id, signature, timestamp}
    cross_references: Dict[str, List[str]] = field(default_factory=dict)
    proof_of_existence: Optional[str] = None
    n8n_execution_id: Optional[str] = None

    def __post_init__(self):
        """Initialize with proof of existence"""
        if not self.proof_of_existence:
            self.proof_of_existence = self._create_proof_of_existence()

    def _create_proof_of_existence(self) -> str:
        """Create proof of existence using an external time simulation"""
        proof_data = {
            "content_hash": self.content_hash,
            "temporal_anchor": self.temporal_anchor,
            "witness_count": len(self.witness_signatures),
            "timestamp": datetime.utcnow().isoformat() + "Z",
            "external_anchor": self._simulate_external_time_anchor()
        }

        return hashlib.sha3_512(
            json.dumps(proof_data, sort_keys=True).encode()
        ).hexdigest()

    def _simulate_external_time_anchor(self) -> str:
        """Simulate an external time oracle - clearly labeled as a simulation"""
        current_timestamp = int(time.time())
        # Simulated external anchor, bucketed into 10-minute windows
        return hashlib.sha3_512(
            f"simulated_anchor_{current_timestamp // 600}".encode()
        ).hexdigest()

    def add_witness(self, validator_id: str, signature: QuantumAwareSignature,
                    attestation_data: Dict = None) -> None:
        """Add a witness signature with attestation data"""
        witness_entry = {
            "validator_id": validator_id,
            "signature": signature.signature,
            "timestamp": datetime.utcnow().isoformat() + "Z",
            "public_key_hash": signature.public_key_hash,
            "attestation": attestation_data or {}
        }

        self.witness_signatures.append(witness_entry)
        self.metadata.setdefault("witnesses", []).append(validator_id)

    def validate(self) -> Tuple[bool, List[str]]:
        """
        Comprehensive node validation with clear error messages.
        Returns (is_valid, errors).
        """
        errors = []

        # 1. Content hash validation
        try:
            content_str = json.dumps(self.content, sort_keys=True)
            computed_hash = hashlib.sha3_512(content_str.encode()).hexdigest()

            if computed_hash != self.content_hash:
                errors.append(f"Content hash mismatch: expected {self.content_hash[:16]}..., got {computed_hash[:16]}...")
        except (TypeError, ValueError) as e:
            errors.append(f"Content serialization error: {str(e)}")

        # 2. Signature validation
        is_valid_sig, sig_error = self.signature.verify(self.content)
        if not is_valid_sig:
            errors.append(f"Signature validation failed: {sig_error}")

        # 3. Temporal validation - FIXED: Clear logic
        try:
            node_time = datetime.fromisoformat(self.temporal_anchor.replace('Z', '+00:00'))
            # FIXED: use an aware "now"; subtracting a naive datetime from the
            # aware node_time above would raise TypeError.
            now = datetime.now(timezone.utc)

            # Check for future timestamps with tolerance
            time_diff = (node_time - now).total_seconds()

            if time_diff > ProductionConfig.MAX_FUTURE_TOLERANCE_SECONDS:
                errors.append(f"Future timestamp beyond tolerance: {time_diff:.0f}s ahead")
            elif time_diff > 0:
                logger.info(f"Timestamp {time_diff:.0f}s in future (within tolerance)")

            # Check for ancient timestamps
            past_diff = (now - node_time).total_seconds()
            if past_diff > ProductionConfig.MAX_PAST_TOLERANCE_DAYS * 86400:
                errors.append(f"Timestamp too far in past: {past_diff/86400:.0f} days")

        except ValueError as e:
            errors.append(f"Invalid temporal anchor format: {str(e)}")

        # 4. Proof of existence
        if not self.proof_of_existence:
            errors.append("Missing proof of existence")

        # 5. Minimum witness requirement
        if len(self.witness_signatures) < ProductionConfig.MIN_VALIDATORS:
            errors.append(f"Insufficient witnesses: {len(self.witness_signatures)}/{ProductionConfig.MIN_VALIDATORS}")

        # 6. Witness signature validation
        for i, witness in enumerate(self.witness_signatures):
            # Basic validation of the witness structure
            if not witness.get("validator_id"):
                errors.append(f"Witness {i} missing validator_id")
            if not witness.get("signature"):
                errors.append(f"Witness {i} missing signature")
            if not witness.get("timestamp"):
                errors.append(f"Witness {i} missing timestamp")

        return len(errors) == 0, errors

    def calculate_quorum(self) -> Tuple[float, float, Dict[str, Dict]]:
        """
        Calculate quorum statistics.
        Returns (agreement_score, dissent_score, groups).
        """
        if not self.witness_signatures:
            return 0.0, 0.0, {}

        # Group witnesses by attestation content
        attestation_groups = defaultdict(list)
        for witness in self.witness_signatures:
            attestation = witness.get("attestation", {})
            # Create a group key from the attestation content
            group_key = hashlib.sha3_512(
                json.dumps(attestation, sort_keys=True).encode()
            ).hexdigest()[:16]
            attestation_groups[group_key].append(witness["validator_id"])

        # Calculate agreement and dissent
        total_witnesses = len(self.witness_signatures)
        group_sizes = [len(ids) for ids in attestation_groups.values()]

        if not group_sizes:
            return 0.0, 0.0, {}

        max_group_size = max(group_sizes)
        agreement_score = max_group_size / total_witnesses

        # Dissent is the largest minority group
        second_largest = sorted(group_sizes, reverse=True)[1] if len(group_sizes) > 1 else 0
        dissent_score = second_largest / total_witnesses

        # Convert groups to a readable format
        readable_groups = {}
        for group_key, validator_ids in attestation_groups.items():
            readable_groups[group_key[:8]] = {
                "validators": validator_ids,
                "size": len(validator_ids),
                "percentage": len(validator_ids) / total_witnesses
            }

        return agreement_score, dissent_score, readable_groups

    def to_transport_format(self) -> Dict[str, Any]:
        """Convert to a transport format for n8n/webhooks"""
        return {
            "node_id": self.content_hash[:32],
            "node_type": self.node_type,
            "source": self.source_id,
            "content_preview": str(self.content)[:500] + "..." if len(str(self.content)) > 500 else str(self.content),
            "timestamp": self.temporal_anchor,
            "witness_count": len(self.witness_signatures),
            "proof_of_existence": self.proof_of_existence[:32] + "..." if self.proof_of_existence else None,
            "metadata_summary": {
                "keys": list(self.metadata.keys()),
                "witness_ids": [w.get("validator_id", "unknown") for w in self.witness_signatures]
            },
            "execution_id": self.n8n_execution_id or f"exec_{uuid.uuid4().hex[:8]}"
        }

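# Illustrative sketch of building a RealityNode; the content is hypothetical.
# With zero witnesses, validate() reports the insufficient-witness error until
# MIN_VALIDATORS signatures are attached via add_witness().
def _example_reality_node() -> "RealityNode":
    content = {"claim": "document removed from archive", "source_url": "https://example.org/x"}
    node = RealityNode(
        content_hash=hashlib.sha3_512(json.dumps(content, sort_keys=True).encode()).hexdigest(),
        node_type="suppression_detection",
        source_id="demo_source",
        signature=QuantumAwareSignature.create(content),
        temporal_anchor=datetime.utcnow().isoformat() + "Z",
        content=content,
    )
    is_valid, errors = node.validate()  # is_valid is False here: no witnesses yet
    return node
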
# ==================== FIXED: n8n INTEGRATION WITH PROPER SESSION MANAGEMENT ====================

class N8NClient:
    """n8n client with proper async session management"""

    def __init__(self):
        self.base_url = ProductionConfig.N8N_WEBHOOK_URL
        self.api_key = ProductionConfig.N8N_API_KEY
        self.timeout = ProductionConfig.N8N_TIMEOUT_SECONDS
        self.max_retries = ProductionConfig.N8N_MAX_RETRIES

        # Session will be initialized on first use
        self._session: Optional[aiohttp.ClientSession] = None
        self._session_lock = asyncio.Lock()

    async def get_session(self) -> aiohttp.ClientSession:
        """Get or create the session with proper cleanup"""
        async with self._session_lock:
            if self._session is None or self._session.closed:
                timeout = ClientTimeout(total=self.timeout)
                headers = {
                    "User-Agent": "ImmutableRealityEngine/5.0",
                    "Content-Type": "application/json"
                }

                if self.api_key:
                    headers["Authorization"] = f"Bearer {self.api_key}"

                self._session = ClientSession(
                    timeout=timeout,
                    headers=headers
                )
                logger.info("Created new n8n session")

            return self._session

    async def execute_workflow(self, workflow_id: str, payload: Dict) -> Dict[str, Any]:
        """
        Execute an n8n workflow with exponential backoff and proper error handling
        """
        session = await self.get_session()
        url = f"{self.base_url}/{workflow_id}"

        for attempt in range(self.max_retries):
            try:
                async with session.post(url, json=payload) as response:
                    if response.status == 200:
                        result = await response.json()
                        return {
                            "success": True,
                            "workflow_id": workflow_id,
                            "execution_id": result.get("executionId", f"exec_{uuid.uuid4().hex[:8]}"),
                            "data": result.get("data", {}),
                            "metrics": result.get("metrics", {}),
                            "status_code": response.status,
                            "attempt": attempt + 1,
                            "timestamp": datetime.utcnow().isoformat() + "Z"
                        }
                    else:
                        error_text = await response.text()
                        logger.warning(f"n8n workflow {workflow_id} failed (attempt {attempt + 1}/{self.max_retries}): {response.status} - {error_text}")

                        # Exponential backoff
                        if attempt < self.max_retries - 1:
                            await asyncio.sleep(2 ** attempt)  # 1, 2, 4 seconds
                            continue

                        return {
                            "success": False,
                            "error": f"n8n returned {response.status}: {error_text[:200]}",
                            "workflow_id": workflow_id,
                            "status_code": response.status,
                            "attempt": attempt + 1,
                            "timestamp": datetime.utcnow().isoformat() + "Z"
                        }

            except asyncio.TimeoutError:
                logger.warning(f"n8n timeout for {workflow_id} (attempt {attempt + 1}/{self.max_retries})")
                if attempt < self.max_retries - 1:
                    await asyncio.sleep(2 ** attempt)
                    continue
                return {
                    "success": False,
                    "error": f"Timeout after {self.timeout}s",
                    "workflow_id": workflow_id,
                    "attempt": attempt + 1,
                    "timestamp": datetime.utcnow().isoformat() + "Z"
                }

            except aiohttp.ClientError as e:
                logger.warning(f"n8n connection error for {workflow_id} (attempt {attempt + 1}/{self.max_retries}): {str(e)}")
                if attempt < self.max_retries - 1:
                    await asyncio.sleep(2 ** attempt)
                    continue
                return {
                    "success": False,
                    "error": f"Connection error: {str(e)}",
                    "workflow_id": workflow_id,
                    "attempt": attempt + 1,
                    "timestamp": datetime.utcnow().isoformat() + "Z"
                }

        # This should never be reached due to the loop logic
        return {
            "success": False,
            "error": "Max retries exceeded",
            "workflow_id": workflow_id,
            "attempt": self.max_retries,
            "timestamp": datetime.utcnow().isoformat() + "Z"
        }

    async def batch_execute(self, workflows: List[Dict[str, Any]]) -> List[Dict[str, Any]]:
        """Execute multiple workflows in parallel with proper limits"""
        semaphore = asyncio.Semaphore(ProductionConfig.MAX_CONCURRENT_DETECTIONS)

        async def execute_with_limit(workflow: Dict[str, Any]) -> Dict[str, Any]:
            async with semaphore:
                return await self.execute_workflow(
                    workflow["workflow_id"],
                    workflow["payload"]
                )

        tasks = [execute_with_limit(wf) for wf in workflows]
        results = await asyncio.gather(*tasks, return_exceptions=True)

        # Process the results
        processed_results = []
        for i, result in enumerate(results):
            if isinstance(result, Exception):
                processed_results.append({
                    "success": False,
                    "error": str(result),
                    "workflow_id": workflows[i]["workflow_id"],
                    "timestamp": datetime.utcnow().isoformat() + "Z"
                })
            else:
                processed_results.append(result)

        return processed_results

    async def close(self):
        """Properly close the session"""
        async with self._session_lock:
            if self._session and not self._session.closed:
                await self._session.close()
                self._session = None
                logger.info("Closed n8n session")

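# Illustrative async usage sketch for N8NClient; assumes a reachable n8n
# instance at N8N_WEBHOOK_URL, and the "ping" payload is hypothetical.
async def _example_n8n_call() -> None:
    client = N8NClient()
    try:
        result = await client.execute_workflow(
            ProductionConfig.WORKFLOW_IDS["lens_analysis"],
            {"operation": "ping"},
        )
        logger.info("n8n demo call", success=result.get("success"))
    finally:
        await client.close()  # always release the shared session
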
# ==================== FIXED: LEDGER WITH SYNC BOOTSTRAP ====================

class ImmutableLedger:
    """
    Immutable ledger with proper sync/async separation.
    Quantum-aware append-only log (not a blockchain).
    """

    def __init__(self, n8n_client: N8NClient, storage_path: str = None):
        self.n8n = n8n_client
        self.storage_path = storage_path or ProductionConfig.LEDGER_PATH
        os.makedirs(self.storage_path, exist_ok=True)

        self.chain: List[Dict] = []
        self.node_index: Dict[str, List[str]] = defaultdict(list)       # node_hash -> [block_ids]
        self.validator_index: Dict[str, List[str]] = defaultdict(list)  # validator_id -> [block_ids]
        self.temporal_index: Dict[str, List[str]] = defaultdict(list)   # date -> [block_ids]

        # Sync bootstrap - no async calls in __init__
        self._bootstrap_sync()

    def _bootstrap_sync(self):
        """Synchronous bootstrap - no async calls"""
        ledger_file = os.path.join(self.storage_path, "ledger.json")

        if os.path.exists(ledger_file):
            try:
                with open(ledger_file, 'r') as f:
                    data = json.load(f)
                self.chain = data.get("chain", [])
                self._rebuild_indexes_sync()
                logger.info(f"Loaded ledger: {len(self.chain)} blocks, {len(self.node_index)} nodes indexed")

                # Validate chain integrity
                if not self._validate_chain_sync():
                    logger.warning("Ledger integrity check failed, creating new genesis")
                    self._create_genesis_sync()
            except Exception as e:
                logger.error(f"Failed to load ledger: {e}")
                self._create_genesis_sync()
        else:
            self._create_genesis_sync()

    def _create_genesis_sync(self):
        """Create the genesis block synchronously"""
        genesis = {
            "id": "genesis_v5",
            "prev": "0" * 128,
            "timestamp": datetime.utcnow().isoformat() + "Z",
            "nodes": [],
            "metadata": {
                "version": "IRE_v5.0",
                "genesis": True,
                "created_by": "ImmutableLedger",
                "hash_algorithm": ProductionConfig.HASH_ALGORITHM,
                "note": "Quantum-aware, not quantum-resistant"
            },
            "hash": self._hash_block_sync({"genesis": True}),
            "signatures": []
        }

        # FIXED: replace (rather than append to) any corrupt chain loaded
        # earlier, so a failed integrity check cannot leave stale blocks behind
        self.chain = [genesis]
        self._rebuild_indexes_sync()
        self._save_ledger_sync()
        logger.info("Created genesis block")

    def _hash_block_sync(self, data: Dict) -> str:
        """Synchronous hashing"""
        return hashlib.sha3_512(
            json.dumps(data, sort_keys=True).encode()
        ).hexdigest()

    def _rebuild_indexes_sync(self):
        """Rebuild indexes synchronously"""
        self.node_index.clear()
        self.validator_index.clear()
        self.temporal_index.clear()

        for block in self.chain:
            block_id = block["id"]

            # Index nodes
            for node in block.get("nodes", []):
                node_hash = node.get("content_hash")
                if node_hash:
                    self.node_index[node_hash].append(block_id)

            # Index validators
            for sig in block.get("signatures", []):
                validator = sig.get("validator_id")
                if validator:
                    self.validator_index[validator].append(block_id)

            # Temporal index
            timestamp = block.get("timestamp", "")
            if timestamp:
                date_key = timestamp[:10]  # YYYY-MM-DD
                self.temporal_index[date_key].append(block_id)

    def _validate_chain_sync(self) -> bool:
        """Validate chain integrity synchronously"""
        if not self.chain:
            return False

        if self.chain[0]["id"] != "genesis_v5":
            return False

        for i in range(1, len(self.chain)):
            current = self.chain[i]
            previous = self.chain[i - 1]

            if current["prev"] != previous["hash"]:
                return False

        return True

    def _save_ledger_sync(self):
        """Save the ledger synchronously with an atomic write"""
        ledger_data = {
            "chain": self.chain,
            "metadata": {
                "version": "IRE_v5.0",
                "total_blocks": len(self.chain),
                "total_nodes": sum(len(b.get("nodes", [])) for b in self.chain),
                "last_updated": datetime.utcnow().isoformat() + "Z",
                "hash_algorithm": ProductionConfig.HASH_ALGORITHM
            }
        }

        ledger_file = os.path.join(self.storage_path, "ledger.json")
        temp_file = ledger_file + ".tmp"

        try:
            # Write to a temp file
            with open(temp_file, 'w') as f:
                json.dump(ledger_data, f, indent=2)

            # Atomic replace
            os.replace(temp_file, ledger_file)

        except Exception as e:
            logger.error(f"Failed to save ledger: {e}")
            # Clean up the temp file
            if os.path.exists(temp_file):
                os.remove(temp_file)
            raise

    async def commit_node(self, node: RealityNode, validators: List[str]) -> Dict[str, Any]:
        """Commit a node to the ledger via n8n orchestration"""

        # Validate the node synchronously first
        is_valid, errors = node.validate()
        if not is_valid:
            return {
                "success": False,
                "error": f"Node validation failed: {errors}",
                "node_id": node.content_hash[:32],
                "timestamp": datetime.utcnow().isoformat() + "Z"
            }

        # Prepare the payload for n8n
        payload = {
            "operation": "ledger_commit",
            "node": node.to_transport_format(),
            "validators": validators,
            "current_chain_length": len(self.chain),
            "previous_block_hash": self.chain[-1]["hash"] if self.chain else "0" * 128,
            "timestamp": datetime.utcnow().isoformat() + "Z"
        }

        # Execute via n8n
        response = await self.n8n.execute_workflow(
            ProductionConfig.WORKFLOW_IDS["ledger_commit"],
            payload
        )

        if response.get("success"):
            block_data = response.get("data", {}).get("block", {})

            # Verify the block before adding it
            if self._validate_block_sync(block_data):
                self.chain.append(block_data)
                self._update_indexes_sync(block_data)
                self._save_ledger_sync()

                logger.info(f"Committed node {node.content_hash[:16]}... in block {block_data.get('id', 'unknown')}")

                return {
                    "success": True,
                    "block_id": block_data.get("id", "unknown"),
                    "block_hash": block_data.get("hash", "unknown")[:32] + "...",
                    "node_id": node.content_hash[:32],
                    "validator_count": len(validators),
                    "ledger_length": len(self.chain),
                    "n8n_execution_id": response.get("execution_id"),
                    "timestamp": datetime.utcnow().isoformat() + "Z"
                }
            else:
                return {
                    "success": False,
                    "error": "Block validation failed",
                    "n8n_response": response,
                    "timestamp": datetime.utcnow().isoformat() + "Z"
                }

        return {
            "success": False,
            "error": "Failed to commit node via n8n",
            "n8n_response": response,
            "timestamp": datetime.utcnow().isoformat() + "Z"
        }

    def _validate_block_sync(self, block: Dict) -> bool:
        """Validate block structure synchronously"""
        required_fields = ["id", "prev", "timestamp", "hash", "nodes"]
        for field_name in required_fields:  # renamed to avoid shadowing dataclasses.field
            if field_name not in block:
                logger.error(f"Block missing required field: {field_name}")
                return False

        # Check the previous block hash matches
        if self.chain and block["prev"] != self.chain[-1]["hash"]:
            logger.error(f"Block prev hash mismatch: {block['prev'][:16]}... != {self.chain[-1]['hash'][:16]}...")
            return False

        return True

    def _update_indexes_sync(self, block: Dict):
        """Update indexes synchronously"""
        block_id = block["id"]

        # Index nodes
        for node in block.get("nodes", []):
            node_hash = node.get("content_hash")
            if node_hash:
                self.node_index[node_hash].append(block_id)

        # Index validators
        for sig in block.get("signatures", []):
            validator = sig.get("validator_id")
            if validator:
                self.validator_index[validator].append(block_id)

        # Temporal index
        timestamp = block.get("timestamp", "")
        if timestamp:
            date_key = timestamp[:10]
            self.temporal_index[date_key].append(block_id)

    def get_node_history_sync(self, node_hash: str) -> List[Dict]:
        """Get node history synchronously"""
        block_ids = self.node_index.get(node_hash, [])
        history = []

        for block_id in block_ids:
            block = next((b for b in self.chain if b["id"] == block_id), None)
            if block:
                history.append({
                    "block_id": block_id,
                    "timestamp": block["timestamp"],
                    "block_hash": block["hash"][:16] + "...",
                    "validator_count": len(block.get("signatures", [])),
                    "block_index": self.chain.index(block)
                })

        return sorted(history, key=lambda x: x["timestamp"])

    def analyze_health_sync(self) -> Dict[str, Any]:
        """Analyze ledger health synchronously"""
        if not self.chain:
            return {"status": "EMPTY", "health_score": 0.0}

        total_blocks = len(self.chain)
        total_nodes = sum(len(b.get("nodes", [])) for b in self.chain)

        # Check chain integrity
        integrity_ok = self._validate_chain_sync()

        # Calculate block intervals
        block_intervals = []
        for i in range(1, len(self.chain)):
            try:
                prev_time = datetime.fromisoformat(self.chain[i-1]["timestamp"].replace('Z', '+00:00'))
                curr_time = datetime.fromisoformat(self.chain[i]["timestamp"].replace('Z', '+00:00'))
                interval = (curr_time - prev_time).total_seconds()
                block_intervals.append(interval)
            except (ValueError, KeyError):
                pass

        # Health factors
        factors = []

        # Integrity factor
        factors.append(1.0 if integrity_ok else 0.0)

        # Block count factor (more blocks = more established)
        factors.append(min(1.0, total_blocks / 100.0))

        # Node density factor
        factors.append(min(1.0, total_nodes / 500.0))

        # Validator diversity factor
        unique_validators = len(self.validator_index)
        factors.append(min(1.0, unique_validators / 10.0))

        # Temporal distribution factor
        unique_days = len(self.temporal_index)
        factors.append(min(1.0, unique_days / 30.0))  # 30 days ideal

        # Calculate the health score
        health_score = sum(factors) / len(factors) if factors else 0.0

        # Determine status
        if health_score >= 0.8:
            status = "HEALTHY"
        elif health_score >= 0.6:
            status = "DEGRADED"
        elif health_score >= 0.4:
            status = "WARNING"
        else:
            status = "CRITICAL"

        return {
            "status": status,
            "health_score": round(health_score, 3),
            "metrics": {
                "total_blocks": total_blocks,
                "total_nodes": total_nodes,
                "unique_nodes": len(self.node_index),
                "unique_validators": unique_validators,
                "unique_days": unique_days,
                "chain_integrity": integrity_ok,
                "average_block_interval": statistics.mean(block_intervals) if block_intervals else 0,
                "hash_algorithm": ProductionConfig.HASH_ALGORITHM
            },
            "factors": {f"factor_{i}": round(v, 3) for i, v in enumerate(factors)},
            "recommendations": self._generate_health_recommendations_sync(health_score, total_blocks, unique_validators)
        }

    def _generate_health_recommendations_sync(self, health_score: float,
                                              total_blocks: int,
                                              unique_validators: int) -> List[str]:
        """Generate health recommendations synchronously"""
        recommendations = []

        if health_score < 0.5:
            recommendations.append("Ledger health critical - add more nodes and validators")

        if total_blocks < 10:
            recommendations.append("Increase ledger activity to establish chain history")

        if unique_validators < ProductionConfig.MIN_VALIDATORS:
            recommendations.append(f"Add more validators (currently {unique_validators}, need {ProductionConfig.MIN_VALIDATORS})")

        if not recommendations:
            recommendations.append("Ledger operating within optimal parameters")

        return recommendations

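# Illustrative sketch: the synchronous ledger paths work without a live n8n
# instance (the client is only exercised by commit_node). Paths come from
# ProductionConfig, and the health thresholds are the ones defined above.
def _example_ledger_health() -> None:
    ledger = ImmutableLedger(N8NClient())  # loads or bootstraps the genesis block
    report = ledger.analyze_health_sync()
    logger.info("Ledger health", status=report["status"], score=report["health_score"])
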
# ==================== FIXED: LENS & METHOD REGISTRY ====================

class LensMethodRegistry:
    """
    Registry for lenses and methods with n8n orchestration.
    Cross-referential and externally managed.
    """

    def __init__(self, n8n_client: N8NClient):
        self.n8n = n8n_client
        self.lenses: Dict[str, Dict] = {}
        self.methods: Dict[str, Dict] = {}
        self.cross_references: Dict[str, List[str]] = defaultdict(list)    # lens_id -> [method_ids]
        self.inverse_references: Dict[str, List[str]] = defaultdict(list)  # method_id -> [lens_ids]
        self.last_sync: Optional[str] = None
        self.sync_lock = asyncio.Lock()

    async def sync_from_n8n(self) -> bool:
        """Sync the registry from n8n with proper locking"""
        async with self.sync_lock:
            try:
                logger.info("Syncing registry from n8n...")

                # Get lenses
                lenses_response = await self.n8n.execute_workflow(
                    ProductionConfig.WORKFLOW_IDS["lens_analysis"],
                    {"operation": "get_registry", "type": "lenses"}
                )

                if lenses_response.get("success"):
                    self.lenses = lenses_response.get("data", {}).get("lenses", {})
                    logger.info(f"Loaded {len(self.lenses)} lenses")
                else:
                    logger.error(f"Failed to load lenses: {lenses_response.get('error')}")
                    return False

                # Get methods
                methods_response = await self.n8n.execute_workflow(
                    ProductionConfig.WORKFLOW_IDS["method_execution"],
                    {"operation": "get_registry", "type": "methods"}
                )

                if methods_response.get("success"):
                    self.methods = methods_response.get("data", {}).get("methods", {})
                    logger.info(f"Loaded {len(self.methods)} methods")
                else:
                    logger.error(f"Failed to load methods: {methods_response.get('error')}")
                    return False

                # Build cross-references
                self._build_cross_references()

                self.last_sync = datetime.utcnow().isoformat() + "Z"
                logger.info("Registry sync completed successfully")
                return True

            except Exception as e:
                logger.error(f"Registry sync failed: {e}")
                return False

    def _build_cross_references(self):
        """Build cross-references between lenses and methods"""
        self.cross_references.clear()
        self.inverse_references.clear()

        # Build from methods to lenses
        for method_id, method in self.methods.items():
            lens_ids = method.get("lens_ids", [])
            for lens_id in lens_ids:
                if lens_id in self.lenses:
                    self.cross_references[lens_id].append(method_id)
                    self.inverse_references[method_id].append(lens_id)

        logger.info(f"Built cross-references: {len(self.cross_references)} lenses ↔ {len(self.inverse_references)} methods")

    def get_lens(self, lens_id: str) -> Optional[Dict]:
        """Get a lens by ID"""
        return self.lenses.get(str(lens_id))

    def get_method(self, method_id: str) -> Optional[Dict]:
        """Get a method by ID"""
        return self.methods.get(str(method_id))

    def get_methods_for_lens(self, lens_id: str) -> List[Dict]:
        """Get all methods for a lens"""
        method_ids = self.cross_references.get(str(lens_id), [])
        return [self.get_method(mid) for mid in method_ids if self.get_method(mid)]

    def get_lenses_for_method(self, method_id: str) -> List[Dict]:
        """Get all lenses for a method"""
        lens_ids = self.inverse_references.get(str(method_id), [])
        return [self.get_lens(lid) for lid in lens_ids if self.get_lens(lid)]

    def find_similar_lenses(self, query: str, limit: int = 5) -> List[Dict]:
        """Find lenses similar to the query (simple keyword matching)"""
        query_lower = query.lower()
        results = []

        for lens_id, lens in self.lenses.items():
            score = 0

            # Check the name
            if query_lower in lens.get("name", "").lower():
                score += 3

            # Check the description
            if query_lower in lens.get("description", "").lower():
                score += 2

            # Check keywords
            keywords = lens.get("keywords", [])
            for keyword in keywords:
                if query_lower in keyword.lower():
                    score += 1

            if score > 0:
                result = lens.copy()
                result["match_score"] = score
                results.append(result)

        results.sort(key=lambda x: x.get("match_score", 0), reverse=True)
        return results[:limit]

    async def execute_method_via_n8n(self, method_id: str, content: Dict,
                                     context: Dict = None) -> Dict[str, Any]:
        """Execute a method via n8n orchestration"""
        method = self.get_method(method_id)
        if not method:
            return {
                "success": False,
                "error": f"Method {method_id} not found",
                "timestamp": datetime.utcnow().isoformat() + "Z"
            }

        payload = {
            "operation": "execute_method",
            "method_id": method_id,
            "method_name": method.get("name", "Unknown"),
            "content": content,
            "context": context or {},
            "registry_version": self.last_sync,
            "timestamp": datetime.utcnow().isoformat() + "Z"
        }

        return await self.n8n.execute_workflow(
            ProductionConfig.WORKFLOW_IDS["method_execution"],
            payload
        )

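# Illustrative lookup sketch; with an unsynced (empty) registry the search
# simply returns no matches, which is the documented degraded mode.
def _example_registry_lookup(registry: LensMethodRegistry) -> None:
    for lens in registry.find_similar_lenses("erasure", limit=3):
        logger.info("Lens match", name=lens.get("name"), score=lens.get("match_score"))
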
# ==================== FIXED: QUORUM SYSTEM ====================

class QuorumSystem:
    """Proper quorum calculation and validation system"""

    def __init__(self, n8n_client: Optional[N8NClient] = None):
        # FIXED: store the client so validate_quorum_via_n8n() has an n8n handle;
        # the original referenced self.n8n without ever assigning it
        self.n8n = n8n_client
        self.quorum_threshold = ProductionConfig.QUORUM_THRESHOLD
        self.dissent_threshold = ProductionConfig.DISSENT_THRESHOLD

    def calculate_quorum(self, attestations: List[Dict]) -> Dict[str, Any]:
        """
        Calculate quorum statistics from attestations.
        Returns a detailed quorum analysis.
        """
        if not attestations:
            return {
                "quorum_met": False,
                "agreement_score": 0.0,
                "dissent_score": 0.0,
                "total_votes": 0,
                "analysis": "No attestations"
            }

        total_votes = len(attestations)

        # Group by decision/content
        decision_groups = defaultdict(list)
        for att in attestations:
            decision = att.get("decision", "unknown")
            decision_hash = hashlib.sha3_512(
                json.dumps(decision, sort_keys=True).encode()
            ).hexdigest()[:16]
            decision_groups[decision_hash].append(att)

        # Calculate group sizes
        group_sizes = [len(group) for group in decision_groups.values()]
        if not group_sizes:
            return {
                "quorum_met": False,
                "agreement_score": 0.0,
                "dissent_score": 0.0,
                "total_votes": total_votes,
                "analysis": "No valid decisions"
            }

        # Sort by size
        group_sizes.sort(reverse=True)
        largest_group = group_sizes[0]
        second_largest = group_sizes[1] if len(group_sizes) > 1 else 0

        # Calculate scores
        agreement_score = largest_group / total_votes
        dissent_score = second_largest / total_votes if second_largest > 0 else 0

        # Check quorum
        quorum_met = agreement_score >= self.quorum_threshold
        dissent_issue = dissent_score >= self.dissent_threshold

        # Analysis
        analysis_parts = []
        if quorum_met:
            analysis_parts.append(f"Quorum met ({agreement_score:.1%} ≥ {self.quorum_threshold:.1%})")
        else:
            analysis_parts.append(f"Quorum not met ({agreement_score:.1%} < {self.quorum_threshold:.1%})")

        if dissent_issue:
            analysis_parts.append(f"Significant dissent ({dissent_score:.1%} ≥ {self.dissent_threshold:.1%})")

        # Group details
        group_details = {}
        for decision_hash, group in decision_groups.items():
            group_details[decision_hash[:8]] = {
                "size": len(group),
                "percentage": len(group) / total_votes,
                "validators": [a.get("validator_id", "unknown") for a in group],
                "sample_decision": group[0].get("decision", "unknown") if group else None
            }

        return {
            "quorum_met": quorum_met,
            "agreement_score": round(agreement_score, 3),
            "dissent_score": round(dissent_score, 3),
            "total_votes": total_votes,
            "group_count": len(decision_groups),
            "largest_group_size": largest_group,
            "analysis": "; ".join(analysis_parts),
            "group_details": group_details,
            "thresholds": {
                "quorum": self.quorum_threshold,
                "dissent": self.dissent_threshold
            }
        }

    async def validate_quorum_via_n8n(self, node: RealityNode,
                                      attestations: List[Dict]) -> Dict[str, Any]:
        """Validate quorum via n8n for complex cases"""
        # FIXED: guard against a missing client
        if self.n8n is None:
            return {
                "success": False,
                "error": "No n8n client configured for quorum validation",
                "timestamp": datetime.utcnow().isoformat() + "Z"
            }

        payload = {
            "operation": "quorum_validation",
            "node_hash": node.content_hash[:32],
            "attestations": attestations,
            "total_witnesses": len(node.witness_signatures),
            "quorum_threshold": self.quorum_threshold,
            "dissent_threshold": self.dissent_threshold,
            "timestamp": datetime.utcnow().isoformat() + "Z"
        }

        return await self.n8n.execute_workflow(
            ProductionConfig.WORKFLOW_IDS["quorum_calculation"],
            payload
        )

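# Illustrative worked example of the quorum arithmetic with four hypothetical
# validators: agreement = 3/4 = 0.75 >= QUORUM_THRESHOLD (0.67), so the quorum
# passes; dissent = 1/4 = 0.25 < DISSENT_THRESHOLD (0.33), so no investigation.
def _example_quorum() -> None:
    attestations = [
        {"validator_id": "v1", "decision": "suppression_detected"},
        {"validator_id": "v2", "decision": "suppression_detected"},
        {"validator_id": "v3", "decision": "suppression_detected"},
        {"validator_id": "v4", "decision": "no_suppression"},
    ]
    result = QuorumSystem().calculate_quorum(attestations)
    assert result["quorum_met"] and result["agreement_score"] == 0.75
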
1272
+ # ==================== FIXED: PRODUCTION DETECTION ENGINE ====================
1273
+
1274
+ class ProductionDetectionEngine:
1275
+ """
1276
+ Production-ready detection engine with all fixes applied
1277
+ Proper async/await, error handling, and clear guarantees
1278
+ """
1279
+
1280
+ def __init__(self):
1281
+ # Initialize components
1282
+ self.n8n_client = N8NClient()
1283
+ self.registry = LensMethodRegistry(self.n8n_client)
1284
+ self.ledger = ImmutableLedger(self.n8n_client)
1285
+ self.quorum_system = QuorumSystem()
1286
+
1287
+ # Metrics - FIXED: Proper Counter import used
1288
+ self.metrics = {
1289
+ "total_detections": 0,
1290
+ "successful_detections": 0,
1291
+ "failed_detections": 0,
1292
+ "average_execution_time": 0.0,
1293
+ "phase_distribution": Counter(), # Now properly imported
1294
+ "equilibrium_detections": 0,
1295
+ "quorum_validations": 0,
1296
+ "ledger_commits": 0
1297
+ }
1298
+
1299
+ # Result cache with TTL
1300
+ self.result_cache: Dict[str, Dict] = {}
1301
+ self.cache_lock = asyncio.Lock()
1302
+
1303
+ # Background tasks
1304
+ self._background_tasks: List[asyncio.Task] = []
1305
+
1306
+ logger.info("Production Detection Engine initialized")
1307
+
1308
+ async def initialize(self):
1309
+ """Async initialization"""
1310
+ try:
1311
+ # Sync registry
1312
+ success = await self.registry.sync_from_n8n()
1313
+ if not success:
1314
+ logger.warning("Registry sync failed, using empty registry")
1315
+
1316
+ # Start background cleanup task
1317
+ cleanup_task = asyncio.create_task(self._cleanup_loop())
1318
+ self._background_tasks.append(cleanup_task)
1319
+
1320
+ logger.info("Engine initialization completed")
1321
+
1322
+ except Exception as e:
1323
+ logger.error(f"Engine initialization failed: {e}")
1324
+ raise
1325
+
1326
+ async def detect_suppression(self, content: Dict, context: Dict = None) -> Dict[str, Any]:
1327
+ """
1328
+ Main detection pipeline with proper error handling and metrics
1329
+ """
1330
+ detection_id = f"det_{uuid.uuid4().hex[:16]}"
1331
+ start_time = time.time()
1332
+
1333
+ try:
1334
+ logger.info(f"Starting detection {detection_id}")
1335
+
1336
+ # 1. Create reality node
1337
+ content_hash = hashlib.sha3_512(
1338
+ json.dumps(content, sort_keys=True).encode()
1339
+ ).hexdigest()
1340
+
1341
+ node = RealityNode(
1342
+ content_hash=content_hash,
1343
+ node_type="suppression_detection",
1344
+ source_id=context.get("source", "unknown") if context else "unknown",
1345
+ signature=QuantumAwareSignature.create(content),
1346
+ temporal_anchor=datetime.utcnow().isoformat() + "Z",
1347
+ content=content,
1348
+ metadata={
1349
+ "detection_id": detection_id,
1350
+ "context": context or {},
1351
+ "engine_version": "IRE_v5.0_Production"
1352
+ }
1353
+ )
1354
+
1355
+ # 2. Content analysis via n8n
1356
+ content_analysis = await self._analyze_content(content, context)
1357
+
1358
+ # 3. Pattern detection
1359
+ pattern_analysis = await self._detect_patterns(content, content_analysis)
1360
+
1361
+ # 4. Determine phase
1362
+ current_phase = self._determine_phase(pattern_analysis)
1363
+
1364
+ # 5. Apply methods
1365
+ method_results = await self._apply_methods(content, current_phase, pattern_analysis)
1366
+
1367
+ # 6. Equilibrium detection
1368
+ equilibrium_analysis = await self._detect_equilibrium(pattern_analysis, method_results)
1369
+
1370
+ # 7. Threat analysis
1371
+ threat_analysis = await self._analyze_threats({
1372
+ "content": content,
1373
+ "patterns": pattern_analysis,
1374
+ "methods": method_results,
1375
+ "equilibrium": equilibrium_analysis
1376
+ })
1377
+
1378
+ # 8. Composite analysis
1379
+ composite_analysis = self._create_composite_analysis(
1380
+ content_analysis, pattern_analysis, method_results,
1381
+ equilibrium_analysis, threat_analysis
1382
+ )
1383
+
1384
+ # Update node metadata
1385
+ node.metadata["analysis"] = composite_analysis
1386
+ node.metadata["detection_phase"] = current_phase
1387
+ node.n8n_execution_id = f"exec_{uuid.uuid4().hex[:8]}"
1388
+
1389
+ # 9. Select validators
1390
+ validators = self._select_validators(threat_analysis, current_phase)
1391
+
1392
+ # 10. Get attestations
1393
+ attestations = await self._get_attestations(node, validators, composite_analysis)
1394
+
1395
+ # Add witness signatures
1396
+ successful_attestations = 0
1397
+ for att in attestations:
1398
+ if att.get("success"):
1399
+ validator_id = att.get("validator_id")
1400
+ signature_data = att.get("signature_data", {})
1401
+ signature = QuantumAwareSignature(**signature_data)
1402
+ node.add_witness(validator_id, signature, att.get("attestation", {}))
1403
+ successful_attestations += 1
1404
+
1405
+ # 11. Calculate quorum
1406
+ quorum_result = self.quorum_system.calculate_quorum(
1407
+ [a.get("attestation", {}) for a in attestations if a.get("success")]
1408
+ )
1409
+
1410
+ # 12. Commit to ledger if quorum met
1411
+ ledger_result = None
1412
+ if quorum_result.get("quorum_met", False) and successful_attestations >= ProductionConfig.MIN_VALIDATORS:
1413
+ ledger_result = await self.ledger.commit_node(node, validators)
1414
+ if ledger_result.get("success"):
1415
+ self.metrics["ledger_commits"] += 1
1416
+
1417
+ execution_time = time.time() - start_time
1418
+
1419
+ # 13. Update metrics
1420
+ self._update_metrics(
1421
+ success=True,
1422
+ execution_time=execution_time,
1423
+ phase=current_phase,
1424
+ has_equilibrium=equilibrium_analysis.get("has_equilibrium", False),
1425
+ quorum_met=quorum_result.get("quorum_met", False)
1426
+ )
1427
+
1428
+ # 14. Build result
1429
+ result = {
1430
+ "success": True,
1431
+ "detection_id": detection_id,
1432
+ "execution_time": execution_time,
1433
+ "current_phase": current_phase,
1434
+ "reality_node": {
1435
+ "hash": node.content_hash[:32],
1436
+ "proof_of_existence": node.proof_of_existence[:32] + "..." if node.proof_of_existence else None,
1437
+ "witness_count": len(node.witness_signatures)
1438
+ },
1439
+ "analysis": composite_analysis,
1440
+ "quorum_result": quorum_result,
1441
+ "attestation_result": {
1442
+ "requested": len(validators),
1443
+ "successful": successful_attestations,
1444
+ "quorum_met": quorum_result.get("quorum_met", False)
1445
+ },
1446
+ "ledger_result": ledger_result,
1447
+ "metrics": {
1448
+ "patterns_found": len(pattern_analysis.get("patterns", [])),
1449
+ "methods_applied": method_results.get("methods_applied", 0),
1450
+ "threat_level": threat_analysis.get("threat_level", "UNKNOWN"),
1451
+ "equilibrium_detected": equilibrium_analysis.get("has_equilibrium", False)
1452
+ },
1453
+ "engine_metadata": {
1454
+ "version": "IRE_v5.0_Production",
1455
+ "quantum_aware": True,
1456
+ "n8n_integrated": True,
1457
+ "timestamp": datetime.utcnow().isoformat() + "Z"
1458
+ }
1459
+ }
1460
+
1461
+ # 15. Cache result
1462
+ await self._cache_result(detection_id, result)
1463
+
1464
+ logger.info(f"Detection {detection_id} completed successfully in {execution_time:.2f}s")
1465
+
1466
+ return result
1467
+
+        except Exception as e:
+            execution_time = time.time() - start_time
+            error_id = f"err_{uuid.uuid4().hex[:8]}"
+
+            self._update_metrics(success=False, execution_time=execution_time)
+
+            # Keep the correlation id in the message itself; a stdlib logger
+            # does not accept arbitrary keyword arguments such as error_id=
+            logger.error(f"Detection {detection_id} failed: {e} (error_id={error_id})")
+
+            return {
+                "success": False,
+                "detection_id": detection_id,
+                "error_id": error_id,
+                "error": str(e),
+                "execution_time": execution_time,
+                "timestamp": datetime.utcnow().isoformat() + "Z",
+                "engine_metadata": {
+                    "version": "IRE_v5.0_Production",
+                    "error_reported": True
+                }
+            }
+
+    async def _analyze_content(self, content: Dict, context: Dict = None) -> Dict:
+        """Analyze content via n8n"""
+        payload = {
+            "operation": "content_analysis",
+            "content": content,
+            "context": context or {},
+            "timestamp": datetime.utcnow().isoformat() + "Z"
+        }
+
+        response = await self.n8n_client.execute_workflow(
+            ProductionConfig.WORKFLOW_IDS["lens_analysis"],
+            payload
+        )
+
+        return response.get("data", {}) if response.get("success") else {}
+
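+    # Note: the n8n helpers in this class all share one call shape: build a
+    # payload with an "operation" tag, execute the mapped workflow id, and
+    # return response["data"] on success or {} on failure. A hedged sketch of
+    # the response envelope they assume (inferred from usage, not verified):
+    #   {"success": true, "data": {...}, "timestamp": "2025-01-01T00:00:00Z"}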
+    async def _detect_patterns(self, content: Dict, content_analysis: Dict) -> Dict:
+        """Detect patterns via n8n"""
+        payload = {
+            "operation": "pattern_detection",
+            "content": content,
+            "content_analysis": content_analysis,
+            "lens_count": len(self.registry.lenses),
+            "timestamp": datetime.utcnow().isoformat() + "Z"
+        }
+
+        response = await self.n8n_client.execute_workflow(
+            ProductionConfig.WORKFLOW_IDS["lens_analysis"],
+            payload
+        )
+
+        return response.get("data", {}) if response.get("success") else {}
+
+    def _determine_phase(self, pattern_analysis: Dict) -> str:
+        """Determine suppression phase"""
+        patterns = pattern_analysis.get("patterns", [])
+
+        # Count equilibrium patterns
+        equilibrium_count = sum(1 for p in patterns if p.get("type") == "equilibrium")
+
+        if equilibrium_count >= 3:
+            return SuppressionPhase.POST_SUPPRESSION_EQUILIBRIUM.value
+        elif equilibrium_count >= 1:
+            return SuppressionPhase.ESTABLISHING_SUPPRESSION.value
+        else:
+            return SuppressionPhase.ACTIVE_SUPPRESSION.value
+
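+    # Worked example of the thresholds above (illustrative input only):
+    #   patterns = [{"type": "equilibrium"}, {"type": "equilibrium"},
+    #               {"type": "temporal"}]
+    #   -> equilibrium_count = 2 -> ESTABLISHING_SUPPRESSION
+    # Zero equilibrium patterns fall through to ACTIVE_SUPPRESSION; three or
+    # more map to POST_SUPPRESSION_EQUILIBRIUM.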
+    async def _apply_methods(self, content: Dict, phase: str,
+                             pattern_analysis: Dict) -> Dict:
+        """Apply detection methods"""
+        payload = {
+            "operation": "method_execution",
+            "content": content,
+            "phase": phase,
+            "pattern_analysis": pattern_analysis,
+            "method_count": len(self.registry.methods),
+            "timestamp": datetime.utcnow().isoformat() + "Z"
+        }
+
+        response = await self.n8n_client.execute_workflow(
+            ProductionConfig.WORKFLOW_IDS["method_execution"],
+            payload
+        )
+
+        return response.get("data", {}) if response.get("success") else {}
+
+    async def _detect_equilibrium(self, pattern_analysis: Dict,
+                                  method_results: Dict) -> Dict:
+        """Detect equilibrium patterns"""
+        payload = {
+            "operation": "equilibrium_detection",
+            "pattern_analysis": pattern_analysis,
+            "method_results": method_results,
+            "timestamp": datetime.utcnow().isoformat() + "Z"
+        }
+
+        response = await self.n8n_client.execute_workflow(
+            ProductionConfig.WORKFLOW_IDS["equilibrium_detection"],
+            payload
+        )
+
+        return response.get("data", {}) if response.get("success") else {}
+
+    async def _analyze_threats(self, system_state: Dict) -> Dict:
+        """Analyze STRIDE-E threats"""
+        payload = {
+            "operation": "threat_analysis",
+            "system_state": system_state,
+            "threat_model": "STRIDE-E",
+            "timestamp": datetime.utcnow().isoformat() + "Z"
+        }
+
+        response = await self.n8n_client.execute_workflow(
+            ProductionConfig.WORKFLOW_IDS["threat_analysis"],
+            payload
+        )
+
+        return response.get("data", {}) if response.get("success") else {}
+
+    def _create_composite_analysis(self, content_analysis: Dict,
+                                   pattern_analysis: Dict,
+                                   method_results: Dict,
+                                   equilibrium_analysis: Dict,
+                                   threat_analysis: Dict) -> Dict:
+        """Create composite analysis"""
+        # Calculate scores
+        pattern_score = pattern_analysis.get("confidence", 0.0)
+        method_score = method_results.get("confidence", 0.0)
+        equilibrium_score = equilibrium_analysis.get("equilibrium_score", 0.0)
+        threat_score = threat_analysis.get("risk_score", 0.0)
+
+        # Weighted composite score
+        weights = {"pattern": 0.3, "method": 0.4, "equilibrium": 0.2, "threat": 0.1}
+        composite_score = (
+            pattern_score * weights["pattern"] +
+            method_score * weights["method"] +
+            equilibrium_score * weights["equilibrium"] +
+            (1 - threat_score) * weights["threat"]
+        )
+
+        # Determine system status
+        if threat_score > 0.7:
+            system_status = "CRITICAL"
+        elif threat_score > 0.4:
+            system_status = "DEGRADED"
+        elif composite_score > 0.7:
+            system_status = "HEALTHY"
+        elif composite_score > 0.4:
+            system_status = "MONITOR"
+        else:
+            system_status = "UNKNOWN"
+
+        return {
+            "composite_score": round(composite_score, 3),
+            "system_status": system_status,
+            "component_scores": {
+                "pattern": round(pattern_score, 3),
+                "method": round(method_score, 3),
+                "equilibrium": round(equilibrium_score, 3),
+                "threat": round(threat_score, 3)
+            },
+            "has_equilibrium": equilibrium_analysis.get("has_equilibrium", False),
+            "threat_level": threat_analysis.get("threat_level", "UNKNOWN"),
+            "pattern_count": len(pattern_analysis.get("patterns", [])),
+            "method_count": method_results.get("methods_applied", 0),
+            "timestamp": datetime.utcnow().isoformat() + "Z",
+            "note": "Quantum-aware analysis, not quantum-resistant"
+        }
+
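+    # Worked example of the weighting above (illustrative numbers only):
+    #   pattern=0.8, method=0.6, equilibrium=0.5, threat=0.2
+    #   composite = 0.8*0.3 + 0.6*0.4 + 0.5*0.2 + (1 - 0.2)*0.1
+    #             = 0.24 + 0.24 + 0.10 + 0.08 = 0.66
+    # threat_score 0.2 clears both threat gates and 0.66 <= 0.7, so the
+    # status resolves to "MONITOR". Note that threat contributes inverted:
+    # lower risk raises the composite.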
+    def _select_validators(self, threat_analysis: Dict, phase: str) -> List[str]:
+        """Select validators based on analysis"""
+        validators = []
+
+        # Always include core validators
+        validators.append("system_epistemic_v5")
+        validators.append("temporal_integrity_v5")
+
+        # Conditionally add others
+        threat_level = threat_analysis.get("threat_level", "UNKNOWN")
+        if threat_level in ["HIGH", "CRITICAL"]:
+            validators.append("human_sovereign_v5")
+
+        if phase == SuppressionPhase.POST_SUPPRESSION_EQUILIBRIUM.value:
+            validators.append("quantum_guardian_v5")
+
+        # Ensure minimum validators
+        while len(validators) < ProductionConfig.MIN_VALIDATORS:
+            validators.append(f"backup_validator_{len(validators)}")
+
+        return validators
+
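+    # Illustrative selections (MIN_VALIDATORS = 3 assumed for the example;
+    # the real value comes from ProductionConfig):
+    #   CRITICAL threat, post-suppression phase ->
+    #     ["system_epistemic_v5", "temporal_integrity_v5",
+    #      "human_sovereign_v5", "quantum_guardian_v5"]
+    #   LOW threat, active suppression ->
+    #     ["system_epistemic_v5", "temporal_integrity_v5", "backup_validator_2"]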
+    async def _get_attestations(self, node: RealityNode,
+                                validators: List[str],
+                                analysis: Dict) -> List[Dict]:
+        """Get validator attestations"""
+        attestations = []
+
+        for validator_id in validators:
+            payload = {
+                "operation": "validator_attestation",
+                "validator_id": validator_id,
+                "node": node.to_transport_format(),
+                "analysis": analysis,
+                "timestamp": datetime.utcnow().isoformat() + "Z"
+            }
+
+            response = await self.n8n_client.execute_workflow(
+                ProductionConfig.WORKFLOW_IDS["validator_attestation"],
+                payload
+            )
+
+            if response.get("success"):
+                attestations.append({
+                    "validator_id": validator_id,
+                    "success": True,
+                    "signature_data": response.get("data", {}).get("signature"),
+                    "attestation": response.get("data", {}).get("attestation"),
+                    "timestamp": response.get("timestamp")
+                })
+            else:
+                attestations.append({
+                    "validator_id": validator_id,
+                    "success": False,
+                    "error": response.get("error", "Unknown error"),
+                    "timestamp": datetime.utcnow().isoformat() + "Z"
+                })
+
+        return attestations
+
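+    # The loop above awaits each validator sequentially. A concurrent variant
+    # (a sketch, not part of the original file) could fan the calls out with
+    # asyncio.gather over pre-built payloads:
+    #   responses = await asyncio.gather(*[
+    #       self.n8n_client.execute_workflow(
+    #           ProductionConfig.WORKFLOW_IDS["validator_attestation"], p)
+    #       for p in payloads])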
+    def _update_metrics(self, success: bool, execution_time: float,
+                        phase: str = None, has_equilibrium: bool = False,
+                        quorum_met: bool = False):
+        """Update engine metrics"""
+        self.metrics["total_detections"] += 1
+
+        if success:
+            self.metrics["successful_detections"] += 1
+        else:
+            self.metrics["failed_detections"] += 1
+
+        # Update average execution time incrementally (total is already >= 1
+        # here because it was incremented above)
+        old_avg = self.metrics["average_execution_time"]
+        total = self.metrics["total_detections"]
+        self.metrics["average_execution_time"] = (
+            (old_avg * (total - 1)) + execution_time
+        ) / total if total > 0 else execution_time
+
+        if phase:
+            # Assumes phase_distribution is a defaultdict(int) or pre-seeded
+            self.metrics["phase_distribution"][phase] += 1
+
+        if has_equilibrium:
+            self.metrics["equilibrium_detections"] += 1
+
+        if quorum_met:
+            self.metrics["quorum_validations"] += 1
+
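+    # The running-average update above is the standard incremental mean:
+    #   new_avg = (old_avg * (n - 1) + x) / n = old_avg + (x - old_avg) / n
+    # Quick check with illustrative numbers: old_avg = 2.0 over 2 runs,
+    # new sample x = 5.0 -> (2.0 * 2 + 5.0) / 3 = 3.0.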
+    async def _cache_result(self, detection_id: str, result: Dict):
+        """Cache result with TTL"""
+        async with self.cache_lock:
+            self.result_cache[detection_id] = {
+                "result": result,
+                "timestamp": datetime.utcnow().isoformat() + "Z",
+                "expires": (datetime.utcnow() + timedelta(hours=24)).isoformat() + "Z"
+            }
+
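+    # TTL semantics: entries carry a 24h expiry stamp and the hourly sweep in
+    # _cleanup_loop() below reaps them, so an expired entry can linger for at
+    # most roughly one extra hour before removal.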
+    async def _cleanup_loop(self):
+        """Background cleanup loop"""
+        while True:
+            try:
+                await asyncio.sleep(3600)  # Run every hour
+
+                now = datetime.utcnow()
+                expired_keys = []
+
+                async with self.cache_lock:
+                    for key, entry in self.result_cache.items():
+                        # Strip the trailing 'Z' so the parsed datetime stays
+                        # naive UTC like utcnow(); substituting '+00:00' would
+                        # make it timezone-aware and the comparison would raise
+                        expires = datetime.fromisoformat(entry["expires"].replace('Z', ''))
+                        if now > expires:
+                            expired_keys.append(key)
+
+                    # Delete while still holding the lock to avoid racing
+                    # concurrent writers
+                    for key in expired_keys:
+                        del self.result_cache[key]
+
+                if expired_keys:
+                    logger.info(f"Cleaned up {len(expired_keys)} expired cache entries")
+
+            except asyncio.CancelledError:
+                break
+            except Exception as e:
+                logger.error(f"Cleanup loop error: {e}")
+
+    async def get_system_report(self) -> Dict[str, Any]:
+        """Get comprehensive system report"""
+        ledger_health = self.ledger.analyze_health_sync()
+
+        # Calculate success rate
+        total = self.metrics["total_detections"]
+        successful = self.metrics["successful_detections"]
+        success_rate = successful / total if total > 0 else 0.0
+
+        # Calculate phase distribution percentages
+        phase_dist = dict(self.metrics["phase_distribution"])
+        phase_percentages = {
+            phase: (count / total if total > 0 else 0)
+            for phase, count in phase_dist.items()
+        }
+
+        return {
+            "report_timestamp": datetime.utcnow().isoformat() + "Z",
+            "engine_version": "IRE_v5.0_Production_Fixed",
+            "guarantees": {
+                "quantum_aware": True,
+                "quantum_resistant": False,  # Clearly stated
+                "n8n_integrated": True,
+                "async_processing": True,
+                "immutable_ledger": True,
+                "quorum_validation": True
+            },
+            "metrics": {
+                **self.metrics,
+                "success_rate": round(success_rate, 3),
+                "phase_distribution": phase_percentages
+            },
+            "registry_status": {
+                "lenses": len(self.registry.lenses),
+                "methods": len(self.registry.methods),
+                "last_sync": self.registry.last_sync
+            },
+            "ledger_health": ledger_health,
+            "performance": {
+                "average_execution_time": round(self.metrics["average_execution_time"], 2),
+                "cache_size": len(self.result_cache),
+                "background_tasks": len(self._background_tasks)
+            },
+            "config_summary": {
+                "min_validators": ProductionConfig.MIN_VALIDATORS,
+                "quorum_threshold": ProductionConfig.QUORUM_THRESHOLD,
+                "dissent_threshold": ProductionConfig.DISSENT_THRESHOLD,
+                "hash_algorithm": ProductionConfig.HASH_ALGORITHM
+            },
+            "recommendations": self._generate_system_recommendations(ledger_health, success_rate)
+        }
+
+    def _generate_system_recommendations(self, ledger_health: Dict,
+                                         success_rate: float) -> List[str]:
+        """Generate system recommendations"""
+        recommendations = []
+
+        # Ledger health
+        if ledger_health.get("health_score", 0) < 0.7:
+            recommendations.append("Improve ledger health by adding more nodes and validators")
+
+        # Success rate
+        if success_rate < 0.8 and self.metrics["total_detections"] > 10:
+            recommendations.append(f"Investigate failed detections (success rate: {success