Archie committed on
Commit 40d7073 · 0 parent(s)

Fix dimension/dimensions bug and positional insert/search args

This view is limited to 50 files because the commit contains too many changes.
Files changed (50)
  1. .gitignore +2 -0
  2. BUGFIX-README.md +33 -0
  3. HOOKS.md +221 -0
  4. PACKAGE_SUMMARY.md +409 -0
  5. README.md +2228 -0
  6. bin/cli.js +0 -0
  7. bin/mcp-server.js +0 -0
  8. dist/analysis/complexity.d.ts +52 -0
  9. dist/analysis/complexity.d.ts.map +1 -0
  10. dist/analysis/complexity.js +146 -0
  11. dist/analysis/index.d.ts +15 -0
  12. dist/analysis/index.d.ts.map +1 -0
  13. dist/analysis/index.js +38 -0
  14. dist/analysis/patterns.d.ts +71 -0
  15. dist/analysis/patterns.d.ts.map +1 -0
  16. dist/analysis/patterns.js +243 -0
  17. dist/analysis/security.d.ts +51 -0
  18. dist/analysis/security.d.ts.map +1 -0
  19. dist/analysis/security.js +139 -0
  20. dist/core/adaptive-embedder.d.ts +156 -0
  21. dist/core/adaptive-embedder.d.ts.map +1 -0
  22. dist/core/adaptive-embedder.js +837 -0
  23. dist/core/agentdb-fast.d.ts +149 -0
  24. dist/core/agentdb-fast.d.ts.map +1 -0
  25. dist/core/agentdb-fast.js +301 -0
  26. dist/core/ast-parser.d.ts +108 -0
  27. dist/core/ast-parser.d.ts.map +1 -0
  28. dist/core/ast-parser.js +602 -0
  29. dist/core/attention-fallbacks.d.ts +321 -0
  30. dist/core/attention-fallbacks.d.ts.map +1 -0
  31. dist/core/attention-fallbacks.js +552 -0
  32. dist/core/cluster-wrapper.d.ts +148 -0
  33. dist/core/cluster-wrapper.d.ts.map +1 -0
  34. dist/core/cluster-wrapper.js +271 -0
  35. dist/core/coverage-router.d.ts +88 -0
  36. dist/core/coverage-router.d.ts.map +1 -0
  37. dist/core/coverage-router.js +315 -0
  38. dist/core/diff-embeddings.d.ts +93 -0
  39. dist/core/diff-embeddings.d.ts.map +1 -0
  40. dist/core/diff-embeddings.js +334 -0
  41. dist/core/gnn-wrapper.d.ts +143 -0
  42. dist/core/gnn-wrapper.d.ts.map +1 -0
  43. dist/core/gnn-wrapper.js +213 -0
  44. dist/core/graph-algorithms.d.ts +83 -0
  45. dist/core/graph-algorithms.d.ts.map +1 -0
  46. dist/core/graph-algorithms.js +514 -0
  47. dist/core/graph-wrapper.d.ts +147 -0
  48. dist/core/graph-wrapper.d.ts.map +1 -0
  49. dist/core/graph-wrapper.js +299 -0
  50. dist/core/index.d.ts +48 -0
.gitignore ADDED
@@ -0,0 +1,2 @@
+ *.wasm
+ *.db
BUGFIX-README.md ADDED
@@ -0,0 +1,33 @@
+ # RuVector — Bug Fixes
+
+ This is a patched version of [ruvector](https://github.com/ruvector/ruvector) with two critical bugs fixed.
+
+ ## Bugs Fixed
+
+ ### Bug 1: CLI `create` command fails with "Missing field `dimensions`"
+
+ **Symptom:** `npx ruvector create ./db -d 384` fails with `Missing field 'dimensions'`
+
+ **Root Cause:** The CLI passes `{ dimension: 384 }` (singular) to the `VectorDB` constructor, but the native Rust binding (`@ruvector/core`) expects `{ dimensions: 384 }` (plural).
+
+ **Fix:** The `VectorDBWrapper` constructor now normalizes `dimension` → `dimensions` automatically. (`dist/index.js`)
+
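The normalization step can be sketched as a small option-mapping function. This is a hypothetical illustration of the idea, not the actual wrapper code in `dist/index.js`:

```javascript
// Hypothetical sketch: map the CLI's `dimension` (singular) onto the
// `dimensions` (plural) field that the native binding expects.
function normalizeOptions(options) {
  const normalized = { ...options };
  if (normalized.dimension !== undefined && normalized.dimensions === undefined) {
    normalized.dimensions = normalized.dimension;
    delete normalized.dimension;
  }
  return normalized;
}
```

Callers that already pass `dimensions` are left untouched, so both spellings keep working.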
+ ### Bug 2: JS API insert fails with "Dimension mismatch: expected 384, got 0"
+
+ **Symptom:** `await db.insert([...384 floats...], metadata)` fails with a dimension mismatch even though the vector has the correct length.
+
+ **Root Cause:** The `insert()` method only accepted object-style args `insert({vector, metadata})`, but users naturally call it with positional args `insert(vector, metadata)`. When a Float32Array was passed as the first arg, `entry.vector` was `undefined`, creating an empty Float32Array(0).
+
+ **Fix:** Both `insert()` and `search()` now accept positional arguments in addition to object-style:
+ - `db.insert(vector, metadata)` — positional style (new)
+ - `db.insert({vector, metadata})` — object style (still works)
+ - `db.search(vector, k)` — positional style (new)
+ - `db.search({vector, k})` — object style (still works)
+
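The argument detection behind this kind of fix can be sketched as follows. This is a hypothetical illustration (the real methods also forward the normalized entry to the native binding):

```javascript
// Hypothetical sketch: if the first argument is an array or typed array,
// treat the call as positional; otherwise assume object style.
function normalizeInsertArgs(a, b) {
  if (Array.isArray(a) || ArrayBuffer.isView(a)) {
    return { vector: a, metadata: b }; // insert(vector, metadata)
  }
  return a; // insert({ vector, metadata })
}

function normalizeSearchArgs(a, b) {
  if (Array.isArray(a) || ArrayBuffer.isView(a)) {
    return { vector: a, k: b }; // search(vector, k)
  }
  return a; // search({ vector, k })
}
```

Either call style then reaches the binding as the same `{vector, ...}` object, which avoids the empty `Float32Array(0)` described above.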
+ ## Files Modified
+
+ - `dist/index.js` — VectorDBWrapper class (constructor, insert, search methods)
+
+ ## Original Repository
+
+ https://github.com/ruvector/ruvector
HOOKS.md ADDED
@@ -0,0 +1,221 @@
+ # RuVector Hooks for Claude Code
+
+ Self-learning intelligence hooks that enhance Claude Code with Q-learning, vector memory, and automatic agent routing.
+
+ ## Quick Start
+
+ ```bash
+ # Full setup: hooks + pretrain + optimized agents
+ npx ruvector hooks init --pretrain --build-agents quality
+
+ # Or step by step:
+ npx ruvector hooks init          # Setup hooks
+ npx ruvector hooks pretrain      # Analyze repository
+ npx ruvector hooks build-agents  # Generate agent configs
+ ```
+
+ ## What It Does
+
+ RuVector hooks integrate with Claude Code to provide:
+
+ | Feature | Description |
+ |---------|-------------|
+ | **Agent Routing** | Suggests the best agent for each file type based on learned patterns |
+ | **Co-edit Patterns** | Predicts "likely next files" from git history |
+ | **Vector Memory** | Semantic recall of project context |
+ | **Command Analysis** | Risk assessment for bash commands |
+ | **Self-Learning** | Q-learning improves suggestions over time |
+
+ ## Commands
+
+ ### Initialization
+
+ ```bash
+ # Full configuration
+ npx ruvector hooks init
+
+ # With pretrain and agent building
+ npx ruvector hooks init --pretrain --build-agents security
+
+ # Minimal (basic hooks only)
+ npx ruvector hooks init --minimal
+
+ # Options
+ --force           # Overwrite existing settings
+ --minimal         # Basic hooks only
+ --pretrain        # Run pretrain after init
+ --build-agents    # Generate optimized agents (quality|speed|security|testing|fullstack)
+ --no-claude-md    # Skip CLAUDE.md creation
+ --no-permissions  # Skip permissions config
+ --no-env          # Skip environment variables
+ --no-gitignore    # Skip .gitignore update
+ --no-mcp          # Skip MCP server config
+ --no-statusline   # Skip status line config
+ ```
+
+ ### Pretrain
+
+ Analyze your repository to bootstrap intelligence:
+
+ ```bash
+ npx ruvector hooks pretrain
+
+ # Options
+ --depth <n>   # Git history depth (default: 100)
+ --verbose     # Show detailed progress
+ --skip-git    # Skip git history analysis
+ --skip-files  # Skip file structure analysis
+ ```
+
+ **What it learns:**
+ - File type → Agent mapping (`.rs` → rust-developer)
+ - Co-edit patterns from git history
+ - Directory → Agent mapping
+ - Project context memories
+
+ ### Build Agents
+
+ Generate optimized `.claude/agents/` configurations:
+
+ ```bash
+ npx ruvector hooks build-agents --focus quality
+
+ # Focus modes
+ --focus quality    # Code quality, best practices (default)
+ --focus speed      # Rapid development, prototyping
+ --focus security   # OWASP, input validation, encryption
+ --focus testing    # TDD, comprehensive coverage
+ --focus fullstack  # Balanced frontend/backend/database
+
+ # Options
+ --output <dir>     # Output directory (default: .claude/agents)
+ --format <fmt>     # yaml, json, or md (default: yaml)
+ --include-prompts  # Include system prompts in agent configs
+ ```
+
+ ### Verification & Diagnostics
+
+ ```bash
+ # Check if hooks are working
+ npx ruvector hooks verify
+
+ # Diagnose and fix issues
+ npx ruvector hooks doctor
+ npx ruvector hooks doctor --fix
+ ```
+
+ ### Data Management
+
+ ```bash
+ # View statistics
+ npx ruvector hooks stats
+
+ # Export intelligence data
+ npx ruvector hooks export -o backup.json
+ npx ruvector hooks export --include-all
+
+ # Import intelligence data
+ npx ruvector hooks import backup.json
+ npx ruvector hooks import backup.json --merge
+ ```
+
+ ### Memory Operations
+
+ ```bash
+ # Store context in vector memory
+ npx ruvector hooks remember "API uses JWT auth" -t project
+
+ # Semantic search memory
+ npx ruvector hooks recall "authentication"
+
+ # Route a task to best agent
+ npx ruvector hooks route "implement user login"
+ ```
+
+ ## Hook Events
+
+ | Event | Trigger | RuVector Action |
+ |-------|---------|-----------------|
+ | **PreToolUse** | Before Edit/Write/Bash | Agent routing, file analysis, command risk |
+ | **PostToolUse** | After Edit/Write/Bash | Q-learning update, pattern recording |
+ | **SessionStart** | Conversation begins | Load intelligence, display stats |
+ | **Stop** | Conversation ends | Save learning data |
+ | **UserPromptSubmit** | User sends message | Context suggestions |
+ | **PreCompact** | Before context compaction | Preserve important context |
+ | **Notification** | Any notification | Track events for learning |
+
+ ## Generated Files
+
+ After running `hooks init`:
+
+ ```
+ your-project/
+ ├── .claude/
+ │   ├── settings.json        # Hooks configuration
+ │   ├── statusline.sh        # Status bar script
+ │   └── agents/              # Generated agents (with --build-agents)
+ │       ├── rust-specialist.yaml
+ │       ├── typescript-specialist.yaml
+ │       ├── test-architect.yaml
+ │       └── project-coordinator.yaml
+ ├── .ruvector/
+ │   └── intelligence.json    # Learning data
+ ├── CLAUDE.md                # Project documentation
+ └── .gitignore               # Updated with .ruvector/
+ ```
+
+ ## Environment Variables
+
+ | Variable | Default | Description |
+ |----------|---------|-------------|
+ | `RUVECTOR_INTELLIGENCE_ENABLED` | `true` | Enable/disable intelligence |
+ | `RUVECTOR_LEARNING_RATE` | `0.1` | Q-learning rate (0.0-1.0) |
+ | `RUVECTOR_MEMORY_BACKEND` | `rvlite` | Memory storage backend |
+ | `INTELLIGENCE_MODE` | `treatment` | A/B testing mode |
+
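For intuition on what `RUVECTOR_LEARNING_RATE` controls, here is the standard tabular Q-learning update rule that this kind of self-learning is based on. This is an illustrative sketch only, not RuVector's actual implementation; the discount factor `GAMMA` and the state/action key format are invented for the example, while `ALPHA` matches the documented default of 0.1:

```javascript
// Standard tabular Q-learning update (illustrative only):
// Q(s,a) <- Q(s,a) + alpha * (reward + gamma * max_a' Q(s',a') - Q(s,a))
const ALPHA = 0.1; // matches the RUVECTOR_LEARNING_RATE default
const GAMMA = 0.9; // hypothetical discount factor for the example

function qUpdate(q, state, action, reward, nextState) {
  const key = `${state}|${action}`;
  const current = q.get(key) ?? 0;
  // Best estimated value achievable from the next state.
  let bestNext = 0;
  for (const [k, v] of q) {
    if (k.startsWith(`${nextState}|`) && v > bestNext) bestNext = v;
  }
  q.set(key, current + ALPHA * (reward + GAMMA * bestNext - current));
  return q.get(key);
}
```

A higher learning rate makes each edit outcome shift the routing estimates faster; the default 0.1 moves each estimate a tenth of the way toward the observed target per update.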
+ ## Example Output
+
+ ### Agent Routing
+ ```
+ 🧠 Intelligence Analysis:
+    📁 src/api/routes.ts
+    🤖 Recommended: typescript-developer (85% confidence)
+       → learned from 127 .ts files in repo
+    📎 Likely next files:
+       - src/api/handlers.ts (12 co-edits)
+       - src/types/api.ts (8 co-edits)
+ ```
+
+ ### Command Analysis
+ ```
+ 🧠 Command Analysis:
+    📦 Category: rust
+    🏷️ Type: test
+    ✅ Risk: LOW
+ ```
+
+ ## Best Practices
+
+ 1. **Run pretrain on existing repos** — Bootstrap intelligence before starting work
+ 2. **Use focus modes** — Match agent generation to your current task
+ 3. **Export before major changes** — Backup learning data
+ 4. **Let it learn** — Intelligence improves with each edit
+
+ ## Troubleshooting
+
+ ```bash
+ # Check setup
+ npx ruvector hooks verify
+
+ # Fix common issues
+ npx ruvector hooks doctor --fix
+
+ # Reset and reinitialize
+ npx ruvector hooks init --force --pretrain
+ ```
+
+ ## Links
+
+ - [RuVector GitHub](https://github.com/ruvnet/ruvector)
+ - [npm Package](https://www.npmjs.com/package/ruvector)
+ - [Claude Code Documentation](https://docs.anthropic.com/claude-code)
PACKAGE_SUMMARY.md ADDED
@@ -0,0 +1,409 @@
+ # ruvector Package Summary
+
+ ## Overview
+
+ The main `ruvector` package provides a unified interface for high-performance vector database operations in Node.js, with automatic platform detection and smart fallback between native (Rust) and WASM implementations.
+
+ ## Package Structure
+
+ ```
+ /workspaces/ruvector/npm/packages/ruvector/
+ ├── src/                        # TypeScript source
+ │   ├── index.ts                # Smart loader with platform detection
+ │   └── types.ts                # TypeScript type definitions
+ ├── dist/                       # Compiled JavaScript and types
+ │   ├── index.js                # Main entry point
+ │   ├── index.d.ts              # Type definitions
+ │   ├── types.js                # Compiled types
+ │   └── types.d.ts              # Type definitions
+ ├── bin/
+ │   └── cli.js                  # CLI tool
+ ├── test/
+ │   ├── mock-implementation.js  # Mock VectorDB for testing
+ │   ├── standalone-test.js      # Package structure tests
+ │   └── integration.js          # Integration tests
+ ├── examples/
+ │   ├── api-usage.js            # API usage examples
+ │   └── cli-demo.sh             # CLI demonstration
+ ├── package.json                # NPM package configuration
+ ├── tsconfig.json               # TypeScript configuration
+ └── README.md                   # Package documentation
+ ```
+
+ ## Key Features
+
+ ### 1. Smart Platform Detection
+
+ The package automatically detects and loads the best available implementation:
+
+ ```typescript
+ // Tries to load in this order:
+ // 1. @ruvector/core (native Rust, fastest)
+ // 2. @ruvector/wasm (WebAssembly, universal fallback)
+
+ import { VectorDB, getImplementationType, isNative, isWasm } from 'ruvector';
+
+ console.log(getImplementationType()); // 'native' or 'wasm'
+ console.log(isNative());              // true if using native
+ console.log(isWasm());                // true if using WASM
+ ```
+
+ ### 2. Complete TypeScript Support
+
+ Full type definitions for all APIs:
+
+ ```typescript
+ interface VectorEntry {
+   id: string;
+   vector: number[];
+   metadata?: Record<string, any>;
+ }
+
+ interface SearchQuery {
+   vector: number[];
+   k?: number;
+   filter?: Record<string, any>;
+   threshold?: number;
+ }
+
+ interface SearchResult {
+   id: string;
+   score: number;
+   vector: number[];
+   metadata?: Record<string, any>;
+ }
+
+ interface DbOptions {
+   dimension: number;
+   metric?: 'cosine' | 'euclidean' | 'dot';
+   path?: string;
+   autoPersist?: boolean;
+   hnsw?: {
+     m?: number;
+     efConstruction?: number;
+     efSearch?: number;
+   };
+ }
+ ```
+
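The `metric` option in `DbOptions` selects the distance function used for scoring. For reference, cosine similarity (the metric used in the examples below) can be computed like this; the sketch is independent of the library itself:

```javascript
// Cosine similarity between two equal-length vectors:
// cos(a, b) = (a . b) / (||a|| * ||b||)
function cosineSimilarity(a, b) {
  if (a.length !== b.length) throw new Error('Dimension mismatch');
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}
```

Identical directions score 1.0 and orthogonal vectors score 0.0, which matches the `score` values shown in the search examples later in this document.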
+ ### 3. VectorDB API
+
+ Comprehensive vector database operations:
+
+ ```typescript
+ const db = new VectorDB({
+   dimension: 384,
+   metric: 'cosine'
+ });
+
+ // Insert operations
+ db.insert({ id: 'doc1', vector: [...], metadata: {...} });
+ db.insertBatch([...entries]);
+
+ // Search operations
+ const results = db.search({
+   vector: [...],
+   k: 10,
+   threshold: 0.7
+ });
+
+ // CRUD operations
+ const entry = db.get('doc1');
+ db.updateMetadata('doc1', { updated: true });
+ db.delete('doc1');
+
+ // Database management
+ const stats = db.stats();
+ db.save('./mydb.vec');
+ db.load('./mydb.vec');
+ db.buildIndex();
+ db.optimize();
+ ```
+
+ ### 4. CLI Tools
+
+ Command-line interface for database operations:
+
+ ```bash
+ # Create database
+ ruvector create mydb.vec --dimension 384 --metric cosine
+
+ # Insert vectors
+ ruvector insert mydb.vec vectors.json --batch-size 1000
+
+ # Search
+ ruvector search mydb.vec --vector "[0.1,0.2,...]" --top-k 10
+
+ # Statistics
+ ruvector stats mydb.vec
+
+ # Benchmark
+ ruvector benchmark --num-vectors 10000 --num-queries 1000
+
+ # Info
+ ruvector info
+ ```
+
+ ## API Reference
+
+ ### Constructor
+
+ ```typescript
+ new VectorDB(options: DbOptions): VectorDB
+ ```
+
+ ### Methods
+
+ - `insert(entry: VectorEntry): void` - Insert single vector
+ - `insertBatch(entries: VectorEntry[]): void` - Batch insert
+ - `search(query: SearchQuery): SearchResult[]` - Search similar vectors
+ - `get(id: string): VectorEntry | null` - Get by ID
+ - `delete(id: string): boolean` - Delete vector
+ - `updateMetadata(id: string, metadata: Record<string, any>): void` - Update metadata
+ - `stats(): DbStats` - Get database statistics
+ - `save(path?: string): void` - Save to disk
+ - `load(path: string): void` - Load from disk
+ - `clear(): void` - Clear all vectors
+ - `buildIndex(): void` - Build HNSW index
+ - `optimize(): void` - Optimize database
+
+ ### Utility Functions
+
+ - `getImplementationType(): 'native' | 'wasm'` - Get current implementation
+ - `isNative(): boolean` - Check if using native
+ - `isWasm(): boolean` - Check if using WASM
+ - `getVersion(): { version: string, implementation: string }` - Get version info
+
+ ## Dependencies
+
+ ### Production Dependencies
+
+ - `commander` (^11.1.0) - CLI framework
+ - `chalk` (^4.1.2) - Terminal styling
+ - `ora` (^5.4.1) - Spinners and progress
+
+ ### Optional Dependencies
+
+ - `@ruvector/core` (^0.1.1) - Native Rust bindings (when available)
+ - `@ruvector/wasm` (^0.1.1) - WebAssembly module (fallback)
+
+ ### Dev Dependencies
+
+ - `typescript` (^5.3.3) - TypeScript compiler
+ - `@types/node` (^20.10.5) - Node.js type definitions
+
+ ## Package.json Configuration
+
+ ```json
+ {
+   "name": "ruvector",
+   "version": "0.1.1",
+   "main": "dist/index.js",
+   "types": "dist/index.d.ts",
+   "bin": {
+     "ruvector": "./bin/cli.js"
+   },
+   "scripts": {
+     "build": "tsc",
+     "test": "node test/standalone-test.js"
+   }
+ }
+ ```
+
+ ## Build Process
+
+ ```bash
+ # Install dependencies
+ npm install
+
+ # Build TypeScript
+ npm run build
+
+ # Run tests
+ npm test
+
+ # Package for NPM
+ npm pack
+ ```
+
+ ## Testing
+
+ The package includes comprehensive tests:
+
+ ### 1. Standalone Test (`test/standalone-test.js`)
+
+ Tests package structure and API using the mock implementation:
+ - Package structure validation
+ - TypeScript type definitions
+ - VectorDB API functionality
+ - CLI structure
+ - Smart loader logic
+
+ ### 2. Integration Test (`test/integration.js`)
+
+ Tests integration with real implementations when available.
+
+ ### 3. Mock Implementation (`test/mock-implementation.js`)
+
+ JavaScript-based VectorDB implementation for testing and demonstration purposes.
+
+ ## Examples
+
+ ### API Usage (`examples/api-usage.js`)
+
+ Demonstrates:
+ - Basic CRUD operations
+ - Batch operations
+ - Semantic search
+ - Different distance metrics
+ - Performance benchmarking
+ - Persistence
+
+ ### CLI Demo (`examples/cli-demo.sh`)
+
+ Bash script demonstrating CLI tools.
+
+ ## Usage Examples
+
+ ### Simple Vector Search
+
+ ```javascript
+ const { VectorDB } = require('ruvector');
+
+ const db = new VectorDB({ dimension: 3 });
+
+ db.insertBatch([
+   { id: 'cat', vector: [0.9, 0.1, 0.1], metadata: { animal: 'cat' } },
+   { id: 'dog', vector: [0.1, 0.9, 0.1], metadata: { animal: 'dog' } },
+   { id: 'tiger', vector: [0.8, 0.2, 0.15], metadata: { animal: 'tiger' } }
+ ]);
+
+ const results = db.search({
+   vector: [0.9, 0.1, 0.1],
+   k: 2
+ });
+
+ console.log(results);
+ // [
+ //   { id: 'cat', score: 1.0, ... },
+ //   { id: 'tiger', score: 0.97, ... }
+ // ]
+ ```
+
+ ### Semantic Document Search
+
+ ```javascript
+ const db = new VectorDB({ dimension: 768, metric: 'cosine' });
+
+ // Insert documents with embeddings (from your embedding model)
+ db.insertBatch([
+   { id: 'doc1', vector: embedding1, metadata: { title: 'AI Guide' } },
+   { id: 'doc2', vector: embedding2, metadata: { title: 'Web Dev' } }
+ ]);
+
+ // Search with query embedding
+ const results = db.search({
+   vector: queryEmbedding,
+   k: 10,
+   threshold: 0.7
+ });
+ ```
+
+ ### Persistence
+
+ ```javascript
+ const db = new VectorDB({
+   dimension: 384,
+   path: './vectors.db',
+   autoPersist: true
+ });
+
+ // Changes automatically saved
+ db.insert({ id: 'doc1', vector: [...] });
+
+ // Or manual save
+ db.save('./backup.db');
+
+ // Load from disk
+ db.load('./vectors.db');
+ ```
+
+ ## Performance Characteristics
+
+ ### Mock Implementation (JavaScript)
+ - Insert: ~1M vectors/sec (batch)
+ - Search: ~400 queries/sec (1000 vectors, k=10)
+
+ ### Native Implementation (Rust)
+ - Insert: ~10M+ vectors/sec (batch)
+ - Search: ~100K+ queries/sec with HNSW index
+ - 150x faster than pgvector
+
+ ### WASM Implementation
+ - Insert: ~1M+ vectors/sec (batch)
+ - Search: ~10K+ queries/sec with HNSW index
+ - ~10x faster than pure JavaScript
+
+ ## Integration with Other Packages
+
+ This package serves as the main interface and coordinates between:
+
+ 1. **@ruvector/core** - Native Rust bindings (napi-rs)
+    - Platform-specific native modules
+    - Maximum performance
+    - Optional dependency
+
+ 2. **@ruvector/wasm** - WebAssembly module
+    - Universal compatibility
+    - Near-native performance
+    - Fallback implementation
+
+ ## Error Handling
+
+ The package provides clear error messages when implementations are unavailable:
+
+ ```
+ Failed to load ruvector: Neither native nor WASM implementation available.
+ Native error: Cannot find module '@ruvector/core'
+ WASM error: Cannot find module '@ruvector/wasm'
+ ```
+
+ ## Environment Variables
+
+ - `RUVECTOR_DEBUG=1` - Enable debug logging for implementation loading
+
+ ## Next Steps
+
+ To complete the package ecosystem:
+
+ 1. **Create @ruvector/core**
+    - napi-rs bindings to Rust code
+    - Platform-specific builds (Linux, macOS, Windows)
+    - Native module packaging
+
+ 2. **Create @ruvector/wasm**
+    - wasm-pack build from Rust code
+    - WebAssembly module
+    - Universal compatibility layer
+
+ 3. **Update Dependencies**
+    - Add @ruvector/core as optionalDependency
+    - Add @ruvector/wasm as dependency
+    - Configure proper fallback chain
+
+ 4. **Publishing**
+    - Publish all three packages to npm
+    - Set up CI/CD for builds
+    - Create platform-specific releases
+
+ ## Version
+
+ Current version: **0.1.1**
+
+ ## License
+
+ MIT
+
+ ## Repository
+
+ https://github.com/ruvnet/ruvector
README.md ADDED
@@ -0,0 +1,2228 @@
+ # ruvector
+
+ [![npm version](https://badge.fury.io/js/ruvector.svg)](https://www.npmjs.com/package/ruvector)
+ [![License: MIT](https://img.shields.io/badge/License-MIT-blue.svg)](https://opensource.org/licenses/MIT)
+ [![Node Version](https://img.shields.io/node/v/ruvector)](https://nodejs.org)
+ [![Downloads](https://img.shields.io/npm/dm/ruvector)](https://www.npmjs.com/package/ruvector)
+ [![Build Status](https://img.shields.io/badge/build-passing-brightgreen.svg)](https://github.com/ruvnet/ruvector)
+ [![Performance](https://img.shields.io/badge/latency-<0.5ms-green.svg)](https://github.com/ruvnet/ruvector)
+ [![GitHub Stars](https://img.shields.io/github/stars/ruvnet/ruvector?style=social)](https://github.com/ruvnet/ruvector)
+
+ **The fastest vector database for Node.js—built in Rust, runs everywhere**
+
+ Ruvector is a next-generation vector database that brings **enterprise-grade semantic search** to Node.js applications. Unlike cloud-only solutions or Python-first databases, Ruvector is designed specifically for JavaScript/TypeScript developers who need **blazing-fast vector similarity search** without the complexity of external services.
+
+ > 🚀 **Sub-millisecond queries** • 🎯 **52,000+ inserts/sec** • 💾 **~50 bytes per vector** • 🌍 **Runs anywhere**
+
+ Built by [rUv](https://ruv.io) with production-grade Rust performance and intelligent platform detection—**automatically uses native bindings when available, falls back to WebAssembly when needed**.
+
+ 🌐 **[Visit ruv.io](https://ruv.io)** | 📦 **[GitHub](https://github.com/ruvnet/ruvector)** | 📚 **[Documentation](https://github.com/ruvnet/ruvector/tree/main/docs)**
+
+ ---
+
+ ## 🧠 Claude Code Intelligence v2.0
+
+ **Self-learning intelligence for Claude Code** — RuVector provides optimized hooks with ONNX embeddings, AST analysis, and coverage-aware routing.
+
+ ```bash
+ # One-command setup with pretrain and agent generation
+ npx ruvector hooks init --pretrain --build-agents quality
+ ```
+
+ ### Core Features
+ - 🎯 **Smart Agent Routing** — Q-learning optimized suggestions with 80%+ accuracy
+ - 📚 **9-Phase Pretrain** — AST, diff, coverage, neural, and graph analysis
+ - 🤖 **Agent Builder** — Generates optimized `.claude/agents/` configs
+ - 🔗 **Co-edit Patterns** — Learns file relationships from git history
+ - 💾 **Vector Memory** — HNSW-indexed semantic recall (150x faster)
+
+ ### New in v2.0
+ - ⚡ **ONNX WASM Embeddings** — all-MiniLM-L6-v2 (384d) runs locally, no API needed
+ - 🌳 **AST Analysis** — Symbol extraction, complexity metrics, import graphs
+ - 📊 **Diff Embeddings** — Semantic change classification with risk scoring
+ - 🧪 **Coverage Routing** — Test coverage-aware agent selection
+ - 🔍 **Graph Algorithms** — MinCut boundaries, Louvain communities, Spectral clustering
+ - 🛡️ **Security Scanning** — Parallel vulnerability pattern detection
+ - 🎯 **RAG Context** — Semantic retrieval with HNSW indexing
+
+ ### Performance
+ | Backend | Read Time | Speedup |
+ |---------|-----------|---------|
+ | ONNX inference | ~400ms | baseline |
+ | HNSW search | ~0.045ms | 8,800x |
+ | Memory cache | ~0.01ms | **40,000x** |
+
+ 📖 **[Full Hooks Documentation →](https://github.com/ruvnet/ruvector/blob/main/npm/packages/ruvector/HOOKS.md)**
+
+ ### MCP Server Integration
+
+ RuVector includes an MCP server for Claude Code with 30+ tools:
+
+ ```bash
+ # Add to Claude Code
+ claude mcp add ruvector -- npx ruvector mcp start
+ ```
+
+ **Available MCP Tools:**
+ - `hooks_route`, `hooks_route_enhanced` — Agent routing with signals
+ - `hooks_ast_analyze`, `hooks_ast_complexity` — Code structure analysis
+ - `hooks_diff_analyze`, `hooks_diff_classify` — Change classification
+ - `hooks_coverage_route`, `hooks_coverage_suggest` — Test-aware routing
+ - `hooks_graph_mincut`, `hooks_graph_cluster` — Code boundaries
+ - `hooks_security_scan` — Vulnerability detection
+ - `hooks_rag_context` — Semantic context retrieval
+ - `hooks_attention_info`, `hooks_gnn_info` — Neural capabilities
+
+ ---
+
+ ## 🌟 Why Ruvector?
+
+ ### The Problem with Existing Vector Databases
+
+ Most vector databases force you to choose between three painful trade-offs:
+
+ 1. **Cloud-Only Services** (Pinecone, Weaviate Cloud) - Expensive, vendor lock-in, latency issues, API rate limits
+ 2. **Python-First Solutions** (ChromaDB, Faiss) - Poor Node.js support, require separate Python processes
+ 3. **Self-Hosted Complexity** (Milvus, Qdrant) - Heavy infrastructure, Docker orchestration, operational overhead
+
+ **Ruvector eliminates these trade-offs.**
+
+ ### The Ruvector Advantage
+
+ Ruvector is purpose-built for **modern JavaScript/TypeScript applications** that need vector search:
+
+ 🎯 **Native Node.js Integration**
+ - Drop-in npm package—no Docker, no Python, no external services
+ - Full TypeScript support with complete type definitions
+ - Automatic platform detection with native Rust bindings
+ - Seamless WebAssembly fallback for universal compatibility
+
+ ⚡ **Production-Grade Performance**
+ - **52,000+ inserts/second** with native Rust (10x faster than Python alternatives)
+ - **<0.5ms query latency** with HNSW indexing and SIMD optimizations
+ - **~50 bytes per vector** with advanced memory optimization
+ - Scales from edge devices to millions of vectors
+
+ 🧠 **Built for AI Applications**
+ - Optimized for LLM embeddings (OpenAI, Cohere, Hugging Face)
108
+ - Perfect for RAG (Retrieval-Augmented Generation) systems
109
+ - Agent memory and semantic caching
110
+ - Real-time recommendation engines
111
+
112
+ 🌍 **Universal Deployment**
113
+ - **Linux, macOS, Windows** with native performance
114
+ - **Browser support** via WebAssembly (experimental)
115
+ - **Edge computing** and serverless environments
116
+ - **Alpine Linux** and non-glibc systems supported
117
+
118
+ 💰 **Zero Operational Costs**
119
+ - No cloud API fees or usage limits
120
+ - No infrastructure to manage
121
+ - No separate database servers
122
+ - Open source MIT license
123
+
124
+ ### Key Advantages
125
+
126
+ - ⚡ **Blazing Fast**: <0.5ms p50 latency with native Rust, 10-50ms with WASM fallback
127
+ - 🎯 **Automatic Platform Detection**: Uses native when available, falls back to WASM seamlessly
128
+ - 🧠 **AI-Native**: Built specifically for embeddings, RAG, semantic search, and agent memory
129
+ - 🔧 **CLI Tools Included**: Full command-line interface for database management
130
+ - 🌍 **Universal Deployment**: Works on all platforms—Linux, macOS, Windows, even browsers
131
+ - 💾 **Memory Efficient**: ~50 bytes per vector with advanced quantization
132
+ - 🚀 **Production Ready**: Battle-tested algorithms with comprehensive benchmarks
133
+ - 🔓 **Open Source**: MIT licensed, community-driven
134
+
135
+ ## 🚀 Quick Start Tutorial
136
+
137
+ ### Step 1: Installation
138
+
139
+ Install Ruvector with a single npm command:
140
+
141
+ ```bash
142
+ npm install ruvector
143
+ ```
144
+
145
+ **What happens during installation:**
146
+ - npm automatically detects your platform (Linux, macOS, Windows)
147
+ - Downloads the correct native binary for maximum performance
148
+ - Falls back to WebAssembly if native binaries aren't available
149
+ - No additional setup, Docker, or external services required
150
+
151
+ **Windows Installation (without build tools):**
152
+ ```bash
153
+ # Skip native compilation, use WASM fallback
154
+ npm install ruvector --ignore-scripts
155
+
156
+ # The ONNX WASM runtime (7.4MB) works without build tools
157
+ # Memory cache provides 40,000x speedup over inference
158
+ ```
159
+
160
+ **Verify installation:**
161
+ ```bash
162
+ npx ruvector info
163
+ ```
164
+
165
+ You should see your platform and implementation type (native Rust or WASM fallback).
166
+
167
+ ### Step 2: Your First Vector Database
168
+
169
+ Let's create a simple vector database and perform basic operations. This example demonstrates the complete CRUD (Create, Read, Update, Delete) workflow:
170
+
171
+ ```javascript
172
+ const { VectorDb } = require('ruvector');
173
+
174
+ async function tutorial() {
175
+ // Step 2.1: Create a new vector database
176
+ // The 'dimensions' parameter must match your embedding model
177
+ // Common sizes: 128, 384 (sentence-transformers), 768 (BERT), 1536 (OpenAI)
178
+ const db = new VectorDb({
179
+ dimensions: 128, // Vector size - MUST match your embeddings
180
+ maxElements: 10000, // Maximum vectors (can grow automatically)
181
+ storagePath: './my-vectors.db' // Persist to disk (omit for in-memory)
182
+ });
183
+
184
+ console.log('✅ Database created successfully');
185
+
186
+ // Step 2.2: Insert vectors
187
+ // In real applications, these would come from an embedding model
188
+ const documents = [
189
+ { id: 'doc1', text: 'Artificial intelligence and machine learning' },
190
+ { id: 'doc2', text: 'Deep learning neural networks' },
191
+ { id: 'doc3', text: 'Natural language processing' },
192
+ ];
193
+
194
+ for (const doc of documents) {
195
+ // Generate random vector for demonstration
196
+ // In production: use OpenAI, Cohere, or sentence-transformers
197
+ const vector = new Float32Array(128).map(() => Math.random());
198
+
199
+ await db.insert({
200
+ id: doc.id,
201
+ vector: vector,
202
+ metadata: {
203
+ text: doc.text,
204
+ timestamp: Date.now(),
205
+ category: 'AI'
206
+ }
207
+ });
208
+
209
+ console.log(`✅ Inserted: ${doc.id}`);
210
+ }
211
+
212
+ // Step 2.3: Search for similar vectors
213
+ // Create a query vector (in production, this would be from your search query)
214
+ const queryVector = new Float32Array(128).map(() => Math.random());
215
+
216
+ const results = await db.search({
217
+ vector: queryVector,
218
+ k: 5, // Return top 5 most similar vectors
219
+ threshold: 0.7 // Only return results with similarity > 0.7
220
+ });
221
+
222
+ console.log('\n🔍 Search Results:');
223
+ results.forEach((result, index) => {
224
+ console.log(`${index + 1}. ${result.id} - Score: ${result.score.toFixed(3)}`);
225
+ console.log(` Text: ${result.metadata.text}`);
226
+ });
227
+
228
+ // Step 2.4: Retrieve a specific vector
229
+ const retrieved = await db.get('doc1');
230
+ if (retrieved) {
231
+ console.log('\n📄 Retrieved document:', retrieved.metadata.text);
232
+ }
233
+
234
+ // Step 2.5: Get database statistics
235
+ const count = await db.len();
236
+ console.log(`\n📊 Total vectors in database: ${count}`);
237
+
238
+ // Step 2.6: Delete a vector
239
+ const deleted = await db.delete('doc1');
240
+ console.log(`\n🗑️ Deleted doc1: ${deleted ? 'Success' : 'Not found'}`);
241
+
242
+ // Final count
243
+ const finalCount = await db.len();
244
+ console.log(`📊 Final count: ${finalCount}`);
245
+ }
246
+
247
+ // Run the tutorial
248
+ tutorial().catch(console.error);
249
+ ```
250
+
251
+ **Example output** (similarity scores will vary, since the demo vectors are random):
252
+ ```
253
+ ✅ Database created successfully
254
+ ✅ Inserted: doc1
255
+ ✅ Inserted: doc2
256
+ ✅ Inserted: doc3
257
+
258
+ 🔍 Search Results:
259
+ 1. doc2 - Score: 0.892
260
+ Text: Deep learning neural networks
261
+ 2. doc1 - Score: 0.856
262
+ Text: Artificial intelligence and machine learning
263
+ 3. doc3 - Score: 0.801
264
+ Text: Natural language processing
265
+
266
+ 📄 Retrieved document: Artificial intelligence and machine learning
267
+
268
+ 📊 Total vectors in database: 3
269
+
270
+ 🗑️ Deleted doc1: Success
271
+ 📊 Final count: 2
272
+ ```
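In production you would replace `Math.random()` with a real embedding model (OpenAI, Cohere, sentence-transformers). For demos that need reproducible scores, a deterministic stand-in works; the sketch below hashes character trigrams into a unit-length vector. `toyEmbed` is a hypothetical helper for illustration only — it is not a real embedding model and not part of the ruvector API.

```javascript
// Toy stand-in for a real embedding model (hypothetical helper, demo only).
// It hashes character trigrams into a fixed-size vector and L2-normalizes it,
// so identical texts always map to identical vectors. Use OpenAI, Cohere, or
// sentence-transformers for real semantic embeddings.
function toyEmbed(text, dimensions = 128) {
  const vec = new Float32Array(dimensions);
  for (let i = 0; i < text.length - 2; i++) {
    const gram = text.slice(i, i + 3);
    let h = 2166136261; // FNV-1a hash over the trigram
    for (const ch of gram) {
      h ^= ch.charCodeAt(0);
      h = Math.imul(h, 16777619);
    }
    vec[Math.abs(h) % dimensions] += 1;
  }
  let norm = Math.hypot(...vec);
  if (norm === 0) norm = 1;
  return vec.map(v => v / norm);
}

const a = toyEmbed('Artificial intelligence and machine learning');
const b = toyEmbed('Artificial intelligence and machine learning');
console.log(a.length);                      // 128
console.log(a.every((v, i) => v === b[i])); // deterministic: true
```

Because identical texts map to identical vectors, re-running the tutorial with `toyEmbed` produces stable similarity scores instead of random ones.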
273
+
274
+ ### Step 3: TypeScript Tutorial
275
+
276
+ Ruvector provides full TypeScript support with complete type safety. Here's how to use it:
277
+
278
+ ```typescript
279
+ import { VectorDb, VectorEntry, SearchQuery, SearchResult } from 'ruvector';
280
+
281
+ // Step 3.1: Define your custom metadata type
282
+ interface DocumentMetadata {
283
+ title: string;
284
+ content: string;
285
+ author: string;
286
+ date: Date;
287
+ tags: string[];
288
+ }
289
+
290
+ async function typescriptTutorial() {
291
+ // Step 3.2: Create typed database
292
+ const db = new VectorDb({
293
+ dimensions: 384, // sentence-transformers/all-MiniLM-L6-v2
294
+ maxElements: 10000,
295
+ storagePath: './typed-vectors.db'
296
+ });
297
+
298
+ // Step 3.3: Type-safe vector entry
299
+ const entry: VectorEntry<DocumentMetadata> = {
300
+ id: 'article-001',
301
+ vector: new Float32Array(384), // Your embedding here
302
+ metadata: {
303
+ title: 'Introduction to Vector Databases',
304
+ content: 'Vector databases enable semantic search...',
305
+ author: 'Jane Doe',
306
+ date: new Date('2024-01-15'),
307
+ tags: ['database', 'AI', 'search']
308
+ }
309
+ };
310
+
311
+ // Step 3.4: Insert with type checking
312
+ await db.insert(entry);
313
+ console.log('✅ Inserted typed document');
314
+
315
+ // Step 3.5: Type-safe search
316
+ const query: SearchQuery = {
317
+ vector: new Float32Array(384),
318
+ k: 10,
319
+ threshold: 0.8
320
+ };
321
+
322
+ // Step 3.6: Fully typed results
323
+ const results: SearchResult<DocumentMetadata>[] = await db.search(query);
324
+
325
+ // TypeScript knows the exact shape of metadata
326
+ results.forEach(result => {
327
+ console.log(`Title: ${result.metadata.title}`);
328
+ console.log(`Author: ${result.metadata.author}`);
329
+ console.log(`Tags: ${result.metadata.tags.join(', ')}`);
330
+ console.log(`Similarity: ${result.score.toFixed(3)}\n`);
331
+ });
332
+
333
+ // Step 3.7: Type-safe retrieval
334
+ const doc = await db.get('article-001');
335
+ if (doc) {
336
+ // TypeScript autocomplete works perfectly here
337
+ const publishYear = doc.metadata.date.getFullYear();
338
+ console.log(`Published in ${publishYear}`);
339
+ }
340
+ }
341
+
342
+ typescriptTutorial().catch(console.error);
343
+ ```
344
+
345
+ **TypeScript Benefits:**
346
+ - ✅ Full autocomplete for all methods and properties
347
+ - ✅ Compile-time type checking prevents errors
348
+ - ✅ IDE IntelliSense shows documentation
349
+ - ✅ Custom metadata types for your use case
350
+ - ✅ No `any` types - fully typed throughout
351
+
352
+ ## 🎯 Platform Detection
353
+
354
+ Ruvector automatically detects the best implementation for your platform:
355
+
356
+ ```javascript
357
+ const { getImplementationType, isNative, isWasm } = require('ruvector');
358
+
359
+ console.log(getImplementationType()); // 'native' or 'wasm'
360
+ console.log(isNative()); // true if using native Rust
361
+ console.log(isWasm()); // true if using WebAssembly fallback
362
+
363
+ // Performance varies by implementation:
364
+ // Native (Rust): <0.5ms latency, 50K+ ops/sec
365
+ // WASM fallback: 10-50ms latency, ~1K ops/sec
366
+ ```
367
+
368
+ ## 🔧 CLI Tools
369
+
370
+ Ruvector includes a full command-line interface for database management:
371
+
372
+ ### Create Database
373
+
374
+ ```bash
375
+ # Create a new vector database
376
+ npx ruvector create mydb.vec --dimensions 384 --metric cosine
377
+
378
+ # Options:
379
+ # --dimensions, -d Vector dimensionality (required)
380
+ # --metric, -m Distance metric (cosine, euclidean, dot)
381
+ # --max-elements Maximum number of vectors (default: 10000)
382
+ ```
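The `cosine` metric scores vectors by the angle between them. As a reference for interpreting search scores — this is the textbook formula, not ruvector's internal implementation:

```javascript
// Cosine similarity: dot(a, b) / (||a|| * ||b||).
// 1 means the vectors point in the same direction, 0 means orthogonal.
// Reference sketch of the metric, not ruvector's internal code.
function cosine(a, b) {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

console.log(cosine([1, 0], [1, 0])); // 1
console.log(cosine([1, 0], [0, 1])); // 0
```

A score near 1 means near-identical direction (high semantic similarity); a score near 0 means the vectors are unrelated.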
383
+
384
+ ### Insert Vectors
385
+
386
+ ```bash
387
+ # Insert vectors from JSON file
388
+ npx ruvector insert mydb.vec vectors.json
389
+
390
+ # JSON format:
391
+ # [
392
+ # { "id": "doc1", "vector": [0.1, 0.2, ...], "metadata": {...} },
393
+ # { "id": "doc2", "vector": [0.3, 0.4, ...], "metadata": {...} }
394
+ # ]
395
+ ```
396
+
397
+ ### Search Vectors
398
+
399
+ ```bash
400
+ # Search for similar vectors
401
+ npx ruvector search mydb.vec --vector "[0.1,0.2,0.3,...]" --top-k 10
402
+
403
+ # Options:
404
+ # --vector, -v Query vector (JSON array)
405
+ # --top-k, -k Number of results (default: 10)
406
+ # --threshold Minimum similarity score
407
+ ```
408
+
409
+ ### Database Statistics
410
+
411
+ ```bash
412
+ # Show database statistics
413
+ npx ruvector stats mydb.vec
414
+
415
+ # Output:
416
+ # Total vectors: 10,000
417
+ # Dimensions: 384
418
+ # Metric: cosine
419
+ # Memory usage: ~500 KB
420
+ # Index type: HNSW
421
+ ```
422
+
423
+ ### Benchmarking
424
+
425
+ ```bash
426
+ # Run performance benchmark
427
+ npx ruvector benchmark --num-vectors 10000 --num-queries 1000
428
+
429
+ # Options:
430
+ # --num-vectors Number of vectors to insert
431
+ # --num-queries Number of search queries
432
+ # --dimensions Vector dimensionality (default: 128)
433
+ ```
434
+
435
+ ### System Information
436
+
437
+ ```bash
438
+ # Show platform and implementation info
439
+ npx ruvector info
440
+
441
+ # Output:
442
+ # Platform: linux-x64-gnu
443
+ # Implementation: native (Rust)
444
+ # GNN Module: Available
445
+ # Node.js: v18.17.0
446
+ # Performance: <0.5ms p50 latency
447
+ ```
448
+
449
+ ### Install Optional Packages
450
+
451
+ Ruvector supports optional packages that extend functionality. Use the `install` command to add them:
452
+
453
+ ```bash
454
+ # List available packages
455
+ npx ruvector install
456
+
457
+ # Output:
458
+ # Available Ruvector Packages:
459
+ #
460
+ # gnn not installed
461
+ # Graph Neural Network layers, tensor compression, differentiable search
462
+ # npm: @ruvector/gnn
463
+ #
464
+ # core ✓ installed
465
+ # Core vector database with native Rust bindings
466
+ # npm: @ruvector/core
467
+
468
+ # Install specific package
469
+ npx ruvector install gnn
470
+
471
+ # Install all optional packages
472
+ npx ruvector install --all
473
+
474
+ # Interactive selection
475
+ npx ruvector install -i
476
+ ```
477
+
478
+ The install command auto-detects your package manager (npm, yarn, pnpm, bun).
479
+
480
+ ### GNN Commands
481
+
482
+ Ruvector includes Graph Neural Network (GNN) capabilities for advanced tensor compression and differentiable search.
483
+
484
+ #### GNN Info
485
+
486
+ ```bash
487
+ # Show GNN module information
488
+ npx ruvector gnn info
489
+
490
+ # Output:
491
+ # GNN Module Information
492
+ # Status: Available
493
+ # Platform: linux
494
+ # Architecture: x64
495
+ #
496
+ # Available Features:
497
+ # • RuvectorLayer - GNN layer with multi-head attention
498
+ # • TensorCompress - Adaptive tensor compression (5 levels)
499
+ # • differentiableSearch - Soft attention-based search
500
+ # • hierarchicalForward - Multi-layer GNN processing
501
+ ```
502
+
503
+ #### GNN Layer
504
+
505
+ ```bash
506
+ # Create and test a GNN layer
507
+ npx ruvector gnn layer -i 128 -h 256 --test
508
+
509
+ # Options:
510
+ # -i, --input-dim Input dimension (required)
511
+ # -h, --hidden-dim Hidden dimension (required)
512
+ # -a, --heads Number of attention heads (default: 4)
513
+ # -d, --dropout Dropout rate (default: 0.1)
514
+ # --test Run a test forward pass
515
+ # -o, --output Save layer config to JSON file
516
+ ```
517
+
518
+ #### GNN Compress
519
+
520
+ ```bash
521
+ # Compress embeddings using adaptive tensor compression
522
+ npx ruvector gnn compress -f embeddings.json -l pq8 -o compressed.json
523
+
524
+ # Options:
525
+ # -f, --file Input JSON file with embeddings (required)
526
+ # -l, --level Compression level: none|half|pq8|pq4|binary (default: auto)
527
+ # -a, --access-freq Access frequency for auto compression (default: 0.5)
528
+ # -o, --output Output file for compressed data
529
+
530
+ # Compression levels:
531
+ # none (freq > 0.8) - Full precision, hot data
532
+ # half (freq > 0.4) - ~50% savings, warm data
533
+ # pq8 (freq > 0.1) - ~8x compression, cool data
534
+ # pq4 (freq > 0.01) - ~16x compression, cold data
535
+ # binary (freq <= 0.01) - ~32x compression, archive
536
+ ```
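The auto mode's threshold policy listed above can be sketched as a simple selection function — a sketch mirroring the documented thresholds, not the library's code:

```javascript
// Auto compression-level selection based on access frequency,
// mirroring the thresholds documented above (sketch, not library code).
function chooseLevel(accessFreq) {
  if (accessFreq > 0.8) return 'none';   // hot: full precision
  if (accessFreq > 0.4) return 'half';   // warm: ~50% savings
  if (accessFreq > 0.1) return 'pq8';    // cool: ~8x compression
  if (accessFreq > 0.01) return 'pq4';   // cold: ~16x compression
  return 'binary';                       // archive: ~32x compression
}

console.log(chooseLevel(0.5)); // half (the CLI's default --access-freq is 0.5)
```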
537
+
538
+ #### GNN Search
539
+
540
+ ```bash
541
+ # Differentiable search with soft attention
542
+ npx ruvector gnn search -q "[1.0,0.0,0.0]" -c candidates.json -k 5
543
+
544
+ # Options:
545
+ # -q, --query Query vector as JSON array (required)
546
+ # -c, --candidates Candidates file - JSON array of vectors (required)
547
+ # -k, --top-k Number of results (default: 5)
548
+ # -t, --temperature Softmax temperature (default: 1.0)
549
+ ```
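Differentiable search replaces a hard top-k cutoff with a temperature-controlled softmax over similarity scores, so the ranking stays differentiable. A minimal sketch, assuming dot-product scoring (the GNN module's actual scoring may differ):

```javascript
// Soft attention over candidates: softmax(dot(query, c) / temperature).
// Lower temperature sharpens weight onto the best match; higher flattens it.
// Conceptual sketch of `gnn search`, assuming dot-product scoring.
function softSearch(query, candidates, temperature = 1.0) {
  const scores = candidates.map(c =>
    c.reduce((s, v, i) => s + v * query[i], 0) / temperature
  );
  const max = Math.max(...scores);            // subtract max for stability
  const exps = scores.map(s => Math.exp(s - max));
  const sum = exps.reduce((a, b) => a + b, 0);
  return exps.map(e => e / sum);              // weights sum to 1
}

const weights = softSearch([1, 0, 0], [[1, 0, 0], [0, 1, 0], [0.9, 0.1, 0]]);
console.log(weights); // highest weight on the exact match
```

At `--temperature 1.0` the weights stay spread out across near matches; lowering the temperature concentrates almost all weight on the single best candidate.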
550
+
551
+ ### Attention Commands
552
+
553
+ Ruvector includes high-performance attention mechanisms for transformer-based operations, hyperbolic embeddings, and graph attention.
554
+
555
+ ```bash
556
+ # Install the attention module (optional)
557
+ npm install @ruvector/attention
558
+ ```
559
+
560
+ #### Attention Mechanisms Reference
561
+
562
+ | Mechanism | Type | Complexity | When to Use |
563
+ |-----------|------|------------|-------------|
564
+ | **DotProductAttention** | Core | O(n²) | Standard scaled dot-product attention for transformers |
565
+ | **MultiHeadAttention** | Core | O(n²) | Parallel attention heads for capturing different relationships |
566
+ | **FlashAttention** | Core | O(n²) IO-optimized | Memory-efficient attention for long sequences |
567
+ | **HyperbolicAttention** | Core | O(n²) | Hierarchical data, tree-like structures, taxonomies |
568
+ | **LinearAttention** | Core | O(n) | Very long sequences where O(n²) is prohibitive |
569
+ | **MoEAttention** | Core | O(n*k) | Mixture of Experts routing, specialized attention |
570
+ | **GraphRoPeAttention** | Graph | O(n²) | Graph data with rotary position embeddings |
571
+ | **EdgeFeaturedAttention** | Graph | O(n²) | Graphs with rich edge features/attributes |
572
+ | **DualSpaceAttention** | Graph | O(n²) | Combined Euclidean + hyperbolic representation |
573
+ | **LocalGlobalAttention** | Graph | O(n*k) | Large graphs with local + global context |
574
+
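The core mechanism behind `DotProductAttention` (and most O(n²) entries above) is scaled dot-product attention: weights = softmax(q·kᵢ/√d), output = Σᵢ weightsᵢ·vᵢ. A single-query reference sketch of the textbook formula, not the module's native kernel:

```javascript
// Scaled dot-product attention for one query vector:
// weights = softmax(q . k_i / sqrt(d)), output = sum_i weights_i * v_i.
// Reference sketch of the standard formula, not the module's native kernel.
function attend(q, keys, values) {
  const d = q.length;
  const scores = keys.map(k =>
    k.reduce((s, v, i) => s + v * q[i], 0) / Math.sqrt(d)
  );
  const max = Math.max(...scores);
  const exps = scores.map(s => Math.exp(s - max));
  const sum = exps.reduce((a, b) => a + b, 0);
  const w = exps.map(e => e / sum);
  // Weighted sum of the value vectors
  return values[0].map((_, j) => w.reduce((s, wi, i) => s + wi * values[i][j], 0));
}

const out = attend([1, 0], [[1, 0], [0, 1]], [[10, 0], [0, 10]]);
console.log(out.length); // 2
```

The output leans toward the value whose key best matches the query, which is the behavior all the mechanisms above refine in different ways.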
575
+ #### Attention Info
576
+
577
+ ```bash
578
+ # Show attention module information
579
+ npx ruvector attention info
580
+
581
+ # Output:
582
+ # Attention Module Information
583
+ # Status: Available
584
+ # Version: 0.1.0
585
+ # Platform: linux
586
+ # Architecture: x64
587
+ #
588
+ # Core Attention Mechanisms:
589
+ # • DotProductAttention - Scaled dot-product attention
590
+ # • MultiHeadAttention - Multi-head self-attention
591
+ # • FlashAttention - Memory-efficient IO-aware attention
592
+ # • HyperbolicAttention - Poincaré ball attention
593
+ # • LinearAttention - O(n) linear complexity attention
594
+ # • MoEAttention - Mixture of Experts attention
595
+ ```
596
+
597
+ #### Attention List
598
+
599
+ ```bash
600
+ # List all available attention mechanisms
601
+ npx ruvector attention list
602
+
603
+ # With verbose details
604
+ npx ruvector attention list -v
605
+ ```
606
+
607
+ #### Attention Benchmark
608
+
609
+ ```bash
610
+ # Benchmark attention mechanisms
611
+ npx ruvector attention benchmark -d 256 -n 100 -i 100
612
+
613
+ # Options:
614
+ # -d, --dimension Vector dimension (default: 256)
615
+ # -n, --num-vectors Number of vectors (default: 100)
616
+ # -i, --iterations Benchmark iterations (default: 100)
617
+ # -t, --types Attention types to benchmark (default: dot,flash,linear)
618
+
619
+ # Example output:
620
+ # Dimension: 256
621
+ # Vectors: 100
622
+ # Iterations: 100
623
+ #
624
+ # dot: 0.012ms/op (84,386 ops/sec)
625
+ # flash: 0.012ms/op (82,844 ops/sec)
626
+ # linear: 0.066ms/op (15,259 ops/sec)
627
+ ```
628
+
629
+ #### Hyperbolic Operations
630
+
631
+ ```bash
632
+ # Calculate Poincaré distance between two points
633
+ npx ruvector attention hyperbolic -a distance -v "[0.1,0.2,0.3]" -b "[0.4,0.5,0.6]"
634
+
635
+ # Project vector to Poincaré ball
636
+ npx ruvector attention hyperbolic -a project -v "[1.5,2.0,0.8]"
637
+
638
+ # Möbius addition in hyperbolic space
639
+ npx ruvector attention hyperbolic -a mobius-add -v "[0.1,0.2]" -b "[0.3,0.4]"
640
+
641
+ # Exponential map (tangent space → Poincaré ball)
642
+ npx ruvector attention hyperbolic -a exp-map -v "[0.1,0.2,0.3]"
643
+
644
+ # Options:
645
+ # -a, --action Action: distance|project|mobius-add|exp-map|log-map
646
+ # -v, --vector Input vector as JSON array (required)
647
+ # -b, --vector-b Second vector for binary operations
648
+ # -c, --curvature Poincaré ball curvature (default: 1.0)
649
+ ```
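For interpreting the `distance` action: the Poincaré distance at curvature 1 is d(u, v) = arcosh(1 + 2·||u - v||² / ((1 - ||u||²)(1 - ||v||²))). A reference sketch of this standard formula (not the CLI's implementation); both points must lie strictly inside the unit ball:

```javascript
// Poincaré ball distance (curvature 1):
// d(u, v) = arcosh(1 + 2 * |u - v|^2 / ((1 - |u|^2) * (1 - |v|^2)))
// Both points must satisfy |x| < 1 (strictly inside the unit ball).
// Reference sketch of the standard formula, not the CLI's implementation.
function poincareDistance(u, v) {
  const sq = x => x.reduce((s, xi) => s + xi * xi, 0);
  const diff = u.map((ui, i) => ui - v[i]);
  const arg = 1 + (2 * sq(diff)) / ((1 - sq(u)) * (1 - sq(v)));
  return Math.acosh(arg);
}

console.log(poincareDistance([0.1, 0.2, 0.3], [0.1, 0.2, 0.3])); // 0
console.log(poincareDistance([0, 0], [0.5, 0])); // ≈ 1.0986 (vs 0.5 Euclidean)
```

Distances blow up as points approach the boundary of the ball, which is what lets hyperbolic space embed deep hierarchies with low distortion.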
650
+
651
+ #### When to Use Each Attention Type
652
+
653
+ | Use Case | Recommended Attention | Reason |
654
+ |----------|----------------------|--------|
655
+ | **Standard NLP/Transformers** | MultiHeadAttention | Industry standard, well-tested |
656
+ | **Long Documents (>4K tokens)** | FlashAttention or LinearAttention | Memory efficient |
657
+ | **Hierarchical Classification** | HyperbolicAttention | Captures tree-like structures |
658
+ | **Knowledge Graphs** | GraphRoPeAttention | Position-aware graph attention |
659
+ | **Multi-Relational Graphs** | EdgeFeaturedAttention | Leverages edge attributes |
660
+ | **Taxonomy/Ontology Search** | DualSpaceAttention | Best of both Euclidean + hyperbolic |
661
+ | **Large-Scale Graphs** | LocalGlobalAttention | Efficient local + global context |
662
+ | **Model Routing/MoE** | MoEAttention | Expert selection and routing |
663
+
664
+ ### ⚡ ONNX WASM Embeddings (v2.0)
665
+
666
+ RuVector includes a pure JavaScript ONNX runtime for local embeddings - no Python, no API calls, no build tools required.
667
+
668
+ ```bash
669
+ # Embeddings work out of the box
670
+ npx ruvector hooks remember "important context" -t project
671
+ npx ruvector hooks recall "context query"
672
+ npx ruvector hooks rag-context "how does auth work"
673
+ ```
674
+
675
+ **Model**: all-MiniLM-L6-v2 (384 dimensions, 23MB)
676
+ - Downloads automatically on first use
677
+ - Cached in `.ruvector/models/`
678
+ - SIMD-accelerated when available
679
+
680
+ **Performance:**
681
+ | Operation | Time | Notes |
682
+ |-----------|------|-------|
683
+ | Model load | ~2s | First use only |
684
+ | Embedding | ~50ms | Per text chunk |
685
+ | HNSW search | 0.045ms | 150x faster than brute force |
686
+ | Cache hit | 0.01ms | 40,000x faster than inference |
687
+
688
+ **Fallback Chain:**
689
+ 1. Native SQLite → best persistence
690
+ 2. WASM SQLite → cross-platform
691
+ 3. Memory Cache → fastest (no persistence)
692
+
693
+ ### 🧠 Self-Learning Hooks v2.0
694
+
695
+ Ruvector includes **self-learning intelligence hooks** for Claude Code integration with ONNX embeddings, AST analysis, and coverage-aware routing.
696
+
697
+ #### Initialize Hooks
698
+
699
+ ```bash
700
+ # Initialize hooks in your project
701
+ npx ruvector hooks init
702
+
703
+ # Options:
704
+ # --force Overwrite existing configuration
705
+ # --minimal Minimal configuration (no optional hooks)
706
+ # --pretrain Initialize + pretrain from git history
707
+ # --build-agents quality Generate optimized agent configs
708
+ ```
709
+
710
+ This creates `.claude/settings.json` with pre-configured hooks and `CLAUDE.md` with comprehensive documentation.
711
+
712
+ #### Session Management
713
+
714
+ ```bash
715
+ # Start a session (load intelligence data)
716
+ npx ruvector hooks session-start
717
+
718
+ # End a session (save learned patterns)
719
+ npx ruvector hooks session-end
720
+ ```
721
+
722
+ #### Pre/Post Edit Hooks
723
+
724
+ ```bash
725
+ # Before editing a file - get agent recommendations
726
+ npx ruvector hooks pre-edit src/index.ts
727
+ # Output: 🤖 Recommended: typescript-developer (85% confidence)
728
+
729
+ # After editing - record success/failure for learning
730
+ npx ruvector hooks post-edit src/index.ts --success
731
+ npx ruvector hooks post-edit src/index.ts --error "Type error on line 42"
732
+ ```
733
+
734
+ #### Pre/Post Command Hooks
735
+
736
+ ```bash
737
+ # Before running a command - risk analysis
738
+ npx ruvector hooks pre-command "npm test"
739
+ # Output: ✅ Risk: LOW, Category: test
740
+
741
+ # After running - record outcome
742
+ npx ruvector hooks post-command "npm test" --success
743
+ npx ruvector hooks post-command "npm test" --error "3 tests failed"
744
+ ```
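The exact heuristics behind the risk ratings are internal to ruvector; the toy classifier below only illustrates the idea of pattern-based command triage. Every rule here is a hypothetical example, not ruvector's actual logic:

```javascript
// Toy illustration of pre-command risk triage (hypothetical rules --
// ruvector's real heuristics are internal and more sophisticated).
function triage(command) {
  if (/\brm\s+-rf\b|--force|\bdrop\s+table\b/i.test(command)) {
    return { risk: 'HIGH', category: 'destructive' };
  }
  if (/^npm\s+(test|run\s+test)/.test(command)) {
    return { risk: 'LOW', category: 'test' };
  }
  if (/^git\s+push/.test(command)) {
    return { risk: 'MEDIUM', category: 'publish' };
  }
  return { risk: 'LOW', category: 'general' };
}

console.log(triage('npm test')); // { risk: 'LOW', category: 'test' }
```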
745
+
746
+ #### Agent Routing
747
+
748
+ ```bash
749
+ # Get agent recommendation for a task
750
+ npx ruvector hooks route "fix the authentication bug in login.ts"
751
+ # Output: 🤖 Recommended: security-specialist (92% confidence)
752
+
753
+ npx ruvector hooks route "add unit tests for the API"
754
+ # Output: 🤖 Recommended: tester (88% confidence)
755
+ ```
756
+
757
+ #### Memory Operations
758
+
759
+ ```bash
760
+ # Store context in vector memory
761
+ npx ruvector hooks remember "API uses JWT tokens with 1h expiry" --type decision
762
+ npx ruvector hooks remember "Database schema in docs/schema.md" --type reference
763
+
764
+ # Semantic search memory
765
+ npx ruvector hooks recall "authentication mechanism"
766
+ # Returns relevant stored memories
767
+ ```
768
+
769
+ #### Context Suggestions
770
+
771
+ ```bash
772
+ # Get relevant context for current task
773
+ npx ruvector hooks suggest-context
774
+ # Output: Based on recent files, suggests relevant context
775
+ ```
776
+
777
+ #### Intelligence Statistics
778
+
779
+ ```bash
780
+ # Show learned patterns and statistics
781
+ npx ruvector hooks stats
782
+
783
+ # Output:
784
+ # Patterns: 156 learned
785
+ # Success rate: 87%
786
+ # Top agents: rust-developer, tester, reviewer
787
+ # Memory entries: 42
788
+ ```
789
+
790
+ #### Swarm Recommendations
791
+
792
+ ```bash
793
+ # Get agent recommendation for task type
794
+ npx ruvector hooks swarm-recommend "code-review"
795
+ # Output: Recommended agents for code review task
796
+ ```
797
+
798
+ #### AST Analysis (v2.0)
799
+
800
+ ```bash
801
+ # Analyze file structure, symbols, imports, complexity
802
+ npx ruvector hooks ast-analyze src/index.ts --json
803
+
804
+ # Get complexity metrics for multiple files
805
+ npx ruvector hooks ast-complexity src/*.ts --threshold 15
806
+ # Flags files exceeding cyclomatic complexity threshold
807
+ ```
808
+
809
+ #### Diff & Risk Analysis (v2.0)
810
+
811
+ ```bash
812
+ # Analyze commit with semantic embeddings and risk scoring
813
+ npx ruvector hooks diff-analyze HEAD
814
+ # Output: risk score, category, affected files
815
+
816
+ # Classify change type (feature, bugfix, refactor, docs, test)
817
+ npx ruvector hooks diff-classify
818
+
819
+ # Find similar past commits via embeddings
820
+ npx ruvector hooks diff-similar -k 5
821
+
822
+ # Git churn analysis (hot spots)
823
+ npx ruvector hooks git-churn --days 30
824
+ ```
825
+
826
+ #### Coverage-Aware Routing (v2.0)
827
+
828
+ ```bash
829
+ # Get coverage-aware routing for a file
830
+ npx ruvector hooks coverage-route src/api.ts
831
+ # Output: agent weights based on test coverage
832
+
833
+ # Suggest tests for files based on coverage gaps
834
+ npx ruvector hooks coverage-suggest src/*.ts
835
+ ```
836
+
837
+ #### Graph Analysis (v2.0)
838
+
839
+ ```bash
840
+ # Find optimal code boundaries (MinCut algorithm)
841
+ npx ruvector hooks graph-mincut src/*.ts
842
+
843
+ # Detect code communities (Louvain/Spectral clustering)
844
+ npx ruvector hooks graph-cluster src/*.ts --method louvain
845
+ ```
846
+
847
+ #### Security & RAG (v2.0)
848
+
849
+ ```bash
850
+ # Parallel security vulnerability scan
851
+ npx ruvector hooks security-scan src/*.ts
852
+
853
+ # RAG-enhanced context retrieval
854
+ npx ruvector hooks rag-context "how does auth work"
855
+
856
+ # Enhanced routing with all signals
857
+ npx ruvector hooks route-enhanced "fix bug" --file src/api.ts
858
+ ```
859
+
860
+ #### Hooks Configuration
861
+
862
+ The hooks integrate with Claude Code via `.claude/settings.json`:
863
+
864
+ ```json
865
+ {
866
+ "env": {
867
+ "RUVECTOR_INTELLIGENCE_ENABLED": "true",
868
+ "RUVECTOR_LEARNING_RATE": "0.1",
869
+ "RUVECTOR_AST_ENABLED": "true",
870
+ "RUVECTOR_DIFF_EMBEDDINGS": "true",
871
+ "RUVECTOR_COVERAGE_ROUTING": "true",
872
+ "RUVECTOR_GRAPH_ALGORITHMS": "true",
873
+ "RUVECTOR_SECURITY_SCAN": "true"
874
+ },
875
+ "hooks": {
876
+ "PreToolUse": [
877
+ {
878
+ "matcher": "Edit|Write|MultiEdit",
879
+ "hooks": [{ "type": "command", "command": "npx ruvector hooks pre-edit \"$TOOL_INPUT_file_path\"" }]
880
+ },
881
+ {
882
+ "matcher": "Bash",
883
+ "hooks": [{ "type": "command", "command": "npx ruvector hooks pre-command \"$TOOL_INPUT_command\"" }]
884
+ }
885
+ ],
886
+ "PostToolUse": [
887
+ {
888
+ "matcher": "Edit|Write|MultiEdit",
889
+ "hooks": [{ "type": "command", "command": "npx ruvector hooks post-edit \"$TOOL_INPUT_file_path\"" }]
890
+ }
891
+ ],
892
+ "SessionStart": [{ "hooks": [{ "type": "command", "command": "npx ruvector hooks session-start" }] }],
893
+ "Stop": [{ "hooks": [{ "type": "command", "command": "npx ruvector hooks session-end" }] }]
894
+ }
895
+ }
896
+ ```
897
+
898
+ #### How Self-Learning Works
899
+
900
+ 1. **Pattern Recording**: Every edit and command is recorded with context
901
+ 2. **Q-Learning**: Success/failure updates agent routing weights
902
+ 3. **AST Analysis**: Code complexity informs agent selection
903
+ 4. **Diff Embeddings**: Change patterns improve risk assessment
904
+ 5. **Coverage Routing**: Test coverage guides testing priorities
905
+ 6. **Vector Memory**: Decisions and references stored for semantic recall (HNSW indexed)
906
+ 7. **Continuous Improvement**: The more you use it, the smarter it gets
907
+
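Step 2 above amounts to a bandit-style Q-update: each success or failure nudges an agent's routing weight by the configured learning rate (`RUVECTOR_LEARNING_RATE`, default 0.1 in the hooks configuration). A conceptual sketch, not ruvector's internal code:

```javascript
// Bandit-style Q-update for agent routing (conceptual sketch, not
// ruvector's internal code). Reward is 1 on success, 0 on failure;
// the learning rate controls how fast routing weights adapt.
const LEARNING_RATE = 0.1;
const q = { 'rust-developer': 0.5, 'tester': 0.5 };

function recordOutcome(agent, success) {
  const reward = success ? 1 : 0;
  q[agent] += LEARNING_RATE * (reward - q[agent]);
}

function bestAgent() {
  return Object.keys(q).reduce((a, b) => (q[a] >= q[b] ? a : b));
}

// Repeated successes for one agent shift routing toward it
for (let i = 0; i < 10; i++) recordOutcome('tester', true);
console.log(bestAgent()); // tester
```

Over many recorded outcomes each weight converges toward that agent's observed success rate, which is why routing accuracy improves with use.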
908
+ ## 📊 Performance Benchmarks
909
+
910
+ Tested on AMD Ryzen 9 5950X, 128-dimensional vectors:
911
+
912
+ ### Native Performance (Rust)
913
+
914
+ | Operation | Throughput | Latency (p50) | Latency (p99) |
915
+ |-----------|------------|---------------|---------------|
916
+ | Insert | 52,341 ops/sec | 0.019 ms | 0.045 ms |
917
+ | Search (k=10) | 11,234 ops/sec | 0.089 ms | 0.156 ms |
918
+ | Search (k=100) | 8,932 ops/sec | 0.112 ms | 0.203 ms |
919
+ | Delete | 45,678 ops/sec | 0.022 ms | 0.051 ms |
920
+
921
+ **Memory Usage**: ~50 bytes per 128-dim vector (including index)
922
+
923
+ ### Comparison with Alternatives
924
+
925
+ | Database | Insert (ops/sec) | Search (ops/sec) | Memory per Vector | Node.js | Browser |
926
+ |----------|------------------|------------------|-------------------|---------|---------|
927
+ | **Ruvector (Native)** | **52,341** | **11,234** | **50 bytes** | ✅ | ❌ |
928
+ | **Ruvector (WASM)** | **~1,000** | **~100** | **50 bytes** | ✅ | ✅ |
929
+ | Faiss (HNSW) | 38,200 | 9,800 | 68 bytes | ❌ | ❌ |
930
+ | Hnswlib | 41,500 | 10,200 | 62 bytes | ✅ | ❌ |
931
+ | ChromaDB | ~1,000 | ~20 | 150 bytes | ✅ | ❌ |
932
+
933
+ *Benchmarks measured with 100K vectors, 128 dimensions, k=10*
934
+
935
+ ## 🔍 Comparison with Other Vector Databases
936
+
937
+ Comprehensive comparison of Ruvector against popular vector database solutions:
938
+
939
+ | Feature | Ruvector | Pinecone | Qdrant | Weaviate | Milvus | ChromaDB | Faiss |
940
+ |---------|----------|----------|--------|----------|--------|----------|-------|
941
+ | **Deployment** |
942
+ | Installation | `npm install` ✅ | Cloud API ☁️ | Docker 🐳 | Docker 🐳 | Docker/K8s 🐳 | `pip install` 🐍 | `pip install` 🐍 |
943
+ | Node.js Native | ✅ First-class | ❌ API only | ⚠️ HTTP API | ⚠️ HTTP API | ⚠️ HTTP API | ❌ Python | ❌ Python |
944
+ | Setup Time | < 1 minute | 5-10 minutes | 10-30 minutes | 15-30 minutes | 30-60 minutes | 5 minutes | 5 minutes |
945
+ | Infrastructure | None required | Managed cloud | Self-hosted | Self-hosted | Self-hosted | Embedded | Embedded |
946
+ | **Performance** |
947
+ | Query Latency (p50) | **<0.5ms** | ~2-5ms | ~1-2ms | ~2-3ms | ~3-5ms | ~50ms | ~1ms |
948
+ | Insert Throughput | **52,341 ops/sec** | ~10,000 ops/sec | ~20,000 ops/sec | ~15,000 ops/sec | ~25,000 ops/sec | ~1,000 ops/sec | ~40,000 ops/sec |
949
+ | Memory per Vector (128d) | **50 bytes** | ~80 bytes | 62 bytes | ~100 bytes | ~70 bytes | 150 bytes | 68 bytes |
950
+ | Recall @ k=10 | 95%+ | 93% | 94% | 92% | 96% | 85% | 97% |
951
+ | **Platform Support** |
952
+ | Linux | ✅ Native | ☁️ API | ✅ Docker | ✅ Docker | ✅ Docker | ✅ Python | ✅ Python |
953
+ | macOS | ✅ Native | ☁️ API | ✅ Docker | ✅ Docker | ✅ Docker | ✅ Python | ✅ Python |
954
+ | Windows | ✅ Native | ☁️ API | ✅ Docker | ✅ Docker | ⚠️ WSL2 | ✅ Python | ✅ Python |
955
+ | Browser/WASM | ✅ Yes | ❌ No | ❌ No | ❌ No | ❌ No | ❌ No | ❌ No |
956
+ | ARM64 | ✅ Native | ☁️ API | ✅ Yes | ✅ Yes | ⚠️ Limited | ✅ Yes | ✅ Yes |
957
+ | Alpine Linux | ✅ WASM | ☁️ API | ⚠️ Build from source | ⚠️ Build from source | ❌ No | ✅ Yes | ✅ Yes |
958
+ | **Features** |
959
+ | Distance Metrics | Cosine, L2, Dot | Cosine, L2, Dot | 11 metrics | 10 metrics | 8 metrics | L2, Cosine, IP | L2, IP, Cosine |
960
+ | Filtering | ✅ Metadata | ✅ Advanced | ✅ Advanced | ✅ Advanced | ✅ Advanced | ✅ Basic | ❌ Limited |
961
+ | Persistence | ✅ File-based | ☁️ Managed | ✅ Disk | ✅ Disk | ✅ Disk | ✅ DuckDB | ❌ Memory |
962
+ | Indexing | HNSW | Proprietary | HNSW | HNSW | IVF/HNSW | HNSW | IVF/HNSW |
963
+ | Quantization | ✅ PQ | ✅ Yes | ✅ Scalar | ✅ PQ | ✅ PQ/SQ | ❌ No | ✅ PQ |
964
+ | Batch Operations | ✅ Yes | ✅ Yes | ✅ Yes | ✅ Yes | ✅ Yes | ✅ Yes | ✅ Yes |
965
+ | **Developer Experience** |
966
+ | TypeScript Types | ✅ Full | ✅ Generated | ⚠️ Community | ⚠️ Community | ⚠️ Community | ⚠️ Partial | ❌ No |
967
+ | Documentation | ✅ Excellent | ✅ Excellent | ✅ Good | ✅ Good | ✅ Good | ✅ Good | ⚠️ Technical |
968
+ | Examples | ✅ Many | ✅ Many | ✅ Good | ✅ Good | ✅ Many | ✅ Good | ⚠️ Limited |
969
+ | CLI Tools | ✅ Included | ⚠️ Limited | ✅ Yes | ✅ Yes | ✅ Yes | ⚠️ Basic | ❌ No |
970
+ | **Operations** |
971
+ | Monitoring | ✅ Metrics | ✅ Dashboard | ✅ Prometheus | ✅ Prometheus | ✅ Prometheus | ⚠️ Basic | ❌ No |
972
+ | Backups | ✅ File copy | ☁️ Automatic | ✅ Snapshots | ✅ Snapshots | ✅ Snapshots | ✅ File copy | ❌ Manual |
973
+ | High Availability | ⚠️ App-level | ✅ Built-in | ✅ Clustering | ✅ Clustering | ✅ Clustering | ❌ No | ❌ No |
974
+ | Auto-Scaling | ⚠️ App-level | ✅ Automatic | ⚠️ Manual | ⚠️ Manual | ⚠️ K8s HPA | ❌ No | ❌ No |
975
+ | **Cost** |
976
+ | Pricing Model | Free (MIT) | Pay-per-use | Free (Apache) | Free (BSD) | Free (Apache) | Free (Apache) | Free (MIT) |
977
+ | Monthly Cost (1M vectors) | **$0** | ~$70-200 | ~$20-50 (infra) | ~$30-60 (infra) | ~$50-100 (infra) | $0 | $0 |
978
+ | Monthly Cost (10M vectors) | **$0** | ~$500-1000 | ~$100-200 (infra) | ~$150-300 (infra) | ~$200-400 (infra) | $0 | $0 |
979
+ | API Rate Limits | None | Yes | None | None | None | None | None |
980
+ | **Use Cases** |
981
+ | RAG Systems | ✅ Excellent | ✅ Excellent | ✅ Excellent | ✅ Excellent | ✅ Excellent | ✅ Good | ⚠️ Limited |
982
+ | Serverless | ✅ Perfect | ✅ Good | ❌ No | ❌ No | ❌ No | ⚠️ Possible | ⚠️ Possible |
983
+ | Edge Computing | ✅ Excellent | ❌ No | ❌ No | ❌ No | ❌ No | ❌ No | ⚠️ Possible |
984
+ | Production Scale (100M+) | ⚠️ Single node | ✅ Yes | ✅ Yes | ✅ Yes | ✅ Excellent | ⚠️ Limited | ⚠️ Manual |
985
+ | Embedded Apps | ✅ Excellent | ❌ No | ❌ No | ❌ No | ❌ No | ⚠️ Possible | ✅ Good |
986
+
987
+ ### When to Choose Ruvector
988
+
989
+ ✅ **Perfect for:**
990
+ - **Node.js/TypeScript applications** needing embedded vector search
991
+ - **Serverless and edge computing** where external services aren't practical
992
+ - **Rapid prototyping and development** with minimal setup time
993
+ - **RAG systems** with LangChain, LlamaIndex, or custom implementations
994
+ - **Cost-sensitive projects** that can't afford cloud API pricing
995
+ - **Offline-first applications** requiring local vector search
996
+ - **Browser-based AI** with WebAssembly fallback
997
+ - **Small to medium scale** (up to 10M vectors per instance)
998
+
999
+ ⚠️ **Consider alternatives for:**
1000
+ - **Massive scale (100M+ vectors)** - Consider Pinecone, Milvus, or Qdrant clusters
1001
+ - **Multi-tenancy requirements** - Weaviate or Qdrant offer better isolation
1002
+ - **Distributed systems** - Milvus provides better horizontal scaling
1003
+ - **Zero-ops cloud solution** - Pinecone handles all infrastructure
1004
+
1005
+ ### Why Choose Ruvector Over...
1006
+
1007
+ **vs Pinecone:**
1008
+ - ✅ No API costs (save $1000s/month)
1009
+ - ✅ No network latency (10x faster queries)
1010
+ - ✅ No vendor lock-in
1011
+ - ✅ Works offline and in restricted environments
1012
+ - ❌ No managed multi-region clusters
1013
+
1014
+ **vs ChromaDB:**
1015
+ - ✅ 50x faster queries (native Rust vs Python)
1016
+ - ✅ True Node.js support (not HTTP API)
1017
+ - ✅ Better TypeScript integration
1018
+ - ✅ Lower memory usage
1019
+ - ❌ Smaller ecosystem and community
1020
+
1021
+ **vs Qdrant:**
1022
+ - ✅ Zero infrastructure setup
1023
+ - ✅ Embedded in your app (no Docker)
1024
+ - ✅ Better for serverless environments
1025
+ - ✅ Native Node.js bindings
1026
+ - ❌ No built-in clustering or HA
1027
+
1028
+ **vs Faiss:**
1029
+ - ✅ Full Node.js support (Faiss ships only C++ and Python bindings)
1030
+ - ✅ Easier API and better developer experience
1031
+ - ✅ Built-in persistence and metadata
1032
+ - ⚠️ Slightly lower recall at comparable query speed
1033
+
1034
+ ## 🎯 Real-World Tutorials
1035
+
1036
+ ### Tutorial 1: Building a RAG System with OpenAI
1037
+
1038
+ **What you'll learn:** Create a production-ready Retrieval-Augmented Generation system that enhances LLM responses with relevant context from your documents.
1039
+
1040
+ **Prerequisites:**
1041
+ ```bash
1042
+ npm install ruvector openai
1043
+ export OPENAI_API_KEY="your-api-key-here"
1044
+ ```
1045
+
1046
+ **Complete Implementation:**
1047
+
1048
+ ```javascript
1049
+ const { VectorDb } = require('ruvector');
1050
+ const OpenAI = require('openai');
1051
+
1052
+ class RAGSystem {
1053
+ constructor() {
1054
+ // Initialize OpenAI client
1055
+ this.openai = new OpenAI({
1056
+ apiKey: process.env.OPENAI_API_KEY
1057
+ });
1058
+
1059
+ // Create vector database for OpenAI embeddings
1060
+ // text-embedding-ada-002 produces 1536-dimensional vectors
1061
+ this.db = new VectorDb({
1062
+ dimensions: 1536,
1063
+ maxElements: 100000,
1064
+ storagePath: './rag-knowledge-base.db'
1065
+ });
1066
+
1067
+ console.log('✅ RAG System initialized');
1068
+ }
1069
+
1070
+ // Step 1: Index your knowledge base
1071
+ async indexDocuments(documents) {
1072
+ console.log(`📚 Indexing ${documents.length} documents...`);
1073
+
1074
+ for (let i = 0; i < documents.length; i++) {
1075
+ const doc = documents[i];
1076
+
1077
+ // Generate embedding for the document
1078
+ const response = await this.openai.embeddings.create({
1079
+ model: 'text-embedding-ada-002',
1080
+ input: doc.content
1081
+ });
1082
+
1083
+ // Store in vector database
1084
+ await this.db.insert({
1085
+ id: doc.id || `doc_${i}`,
1086
+ vector: new Float32Array(response.data[0].embedding),
1087
+ metadata: {
1088
+ title: doc.title,
1089
+ content: doc.content,
1090
+ source: doc.source,
1091
+ date: doc.date || new Date().toISOString()
1092
+ }
1093
+ });
1094
+
1095
+ console.log(` ✅ Indexed: ${doc.title}`);
1096
+ }
1097
+
1098
+ const count = await this.db.len();
1099
+ console.log(`\n✅ Indexed ${count} documents total`);
1100
+ }
1101
+
1102
+ // Step 2: Retrieve relevant context for a query
1103
+ async retrieveContext(query, k = 3) {
1104
+ console.log(`🔍 Searching for: "${query}"`);
1105
+
1106
+ // Generate embedding for the query
1107
+ const response = await this.openai.embeddings.create({
1108
+ model: 'text-embedding-ada-002',
1109
+ input: query
1110
+ });
1111
+
1112
+ // Search for similar documents
1113
+ const results = await this.db.search({
1114
+ vector: new Float32Array(response.data[0].embedding),
1115
+ k: k,
1116
+ threshold: 0.7 // Only use highly relevant results
1117
+ });
1118
+
1119
+ console.log(`📄 Found ${results.length} relevant documents\n`);
1120
+
1121
+ return results.map(r => ({
1122
+ content: r.metadata.content,
1123
+ title: r.metadata.title,
1124
+ score: r.score
1125
+ }));
1126
+ }
1127
+
1128
+ // Step 3: Generate answer with retrieved context
1129
+ async answer(question) {
1130
+ // Retrieve relevant context
1131
+ const context = await this.retrieveContext(question, 3);
1132
+
1133
+ if (context.length === 0) {
1134
+ return "I don't have enough information to answer that question.";
1135
+ }
1136
+
1137
+ // Build prompt with context
1138
+ const contextText = context
1139
+ .map((doc, i) => `[${i + 1}] ${doc.title}\n${doc.content}`)
1140
+ .join('\n\n');
1141
+
1142
+ const prompt = `Answer the question based on the following context. If the context doesn't contain the answer, say so.
1143
+
1144
+ Context:
1145
+ ${contextText}
1146
+
1147
+ Question: ${question}
1148
+
1149
+ Answer:`;
1150
+
1151
+ console.log('🤖 Generating answer...\n');
1152
+
1153
+ // Generate completion
1154
+ const completion = await this.openai.chat.completions.create({
1155
+ model: 'gpt-4',
1156
+ messages: [
1157
+ { role: 'system', content: 'You are a helpful assistant that answers questions based on provided context.' },
1158
+ { role: 'user', content: prompt }
1159
+ ],
1160
+ temperature: 0.3 // Lower temperature for more factual responses
1161
+ });
1162
+
1163
+ return {
1164
+ answer: completion.choices[0].message.content,
1165
+ sources: context.map(c => c.title)
1166
+ };
1167
+ }
1168
+ }
1169
+
1170
+ // Example Usage
1171
+ async function main() {
1172
+ const rag = new RAGSystem();
1173
+
1174
+ // Step 1: Index your knowledge base
1175
+ const documents = [
1176
+ {
1177
+ id: 'doc1',
1178
+ title: 'Ruvector Introduction',
1179
+ content: 'Ruvector is a high-performance vector database for Node.js built in Rust. It provides sub-millisecond query latency and supports over 52,000 inserts per second.',
1180
+ source: 'documentation'
1181
+ },
1182
+ {
1183
+ id: 'doc2',
1184
+ title: 'Vector Databases Explained',
1185
+ content: 'Vector databases store data as high-dimensional vectors, enabling semantic similarity search. They are essential for AI applications like RAG systems and recommendation engines.',
1186
+ source: 'blog'
1187
+ },
1188
+ {
1189
+ id: 'doc3',
1190
+ title: 'HNSW Algorithm',
1191
+ content: 'Hierarchical Navigable Small World (HNSW) is a graph-based algorithm for approximate nearest neighbor search. It provides excellent recall with low latency.',
1192
+ source: 'research'
1193
+ }
1194
+ ];
1195
+
1196
+ await rag.indexDocuments(documents);
1197
+
1198
+ // Step 2: Ask questions
1199
+ console.log('\n' + '='.repeat(60) + '\n');
1200
+
1201
+ const result = await rag.answer('What is Ruvector and what are its performance characteristics?');
1202
+
1203
+ console.log('📝 Answer:', result.answer);
1204
+ console.log('\n📚 Sources:', result.sources.join(', '));
1205
+ }
1206
+
1207
+ main().catch(console.error);
1208
+ ```
1209
+
1210
+ **Expected Output:**
1211
+ ```
1212
+ ✅ RAG System initialized
1213
+ 📚 Indexing 3 documents...
1214
+ ✅ Indexed: Ruvector Introduction
1215
+ ✅ Indexed: Vector Databases Explained
1216
+ ✅ Indexed: HNSW Algorithm
1217
+
1218
+ ✅ Indexed 3 documents total
1219
+
1220
+ ============================================================
1221
+
1222
+ 🔍 Searching for: "What is Ruvector and what are its performance characteristics?"
1223
+ 📄 Found 2 relevant documents
1224
+
1225
+ 🤖 Generating answer...
1226
+
1227
+ 📝 Answer: Ruvector is a high-performance vector database built in Rust for Node.js applications. Its key performance characteristics include:
1228
+ - Sub-millisecond query latency
1229
+ - Over 52,000 inserts per second
1230
+ - Optimized for semantic similarity search
1231
+
1232
+ 📚 Sources: Ruvector Introduction, Vector Databases Explained
1233
+ ```
1234
+
1235
+ **Production Tips:**
1236
+ - ✅ Use batch embedding for better throughput (OpenAI supports up to 2048 texts)
1237
+ - ✅ Implement caching for frequently asked questions
1238
+ - ✅ Add error handling for API rate limits
1239
+ - ✅ Monitor token usage and costs
1240
+ - ✅ Regularly update your knowledge base
1241
+
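The first production tip above (batch embedding) can be sketched with a small helper. `chunkTexts` and `embedBatch` below are hypothetical helpers written for this sketch, not part of the OpenAI SDK; they assume the same `openai` client instance as in the tutorial:

```javascript
// Split texts into batches; the OpenAI embeddings endpoint accepts
// up to 2048 inputs per request.
function chunkTexts(texts, batchSize = 2048) {
  const batches = [];
  for (let i = 0; i < texts.length; i += batchSize) {
    batches.push(texts.slice(i, i + batchSize));
  }
  return batches;
}

// Hypothetical batch embedder: one API call per batch instead of
// one call per document, using the client from the tutorial above.
async function embedBatch(openai, texts) {
  const vectors = [];
  for (const batch of chunkTexts(texts)) {
    const response = await openai.embeddings.create({
      model: 'text-embedding-ada-002',
      input: batch // the embeddings API accepts an array of inputs
    });
    for (const item of response.data) {
      vectors.push(new Float32Array(item.embedding));
    }
  }
  return vectors;
}
```

Batching cuts the number of round-trips from one per document to one per 2048 documents, which is usually the dominant cost when indexing a large knowledge base.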
1242
+ ---
1243
+
1244
+ ### Tutorial 2: Semantic Search Engine
1245
+
1246
+ **What you'll learn:** Build a semantic search engine that understands meaning, not just keywords.
1247
+
1248
+ **Prerequisites:**
1249
+ ```bash
1250
+ npm install ruvector @xenova/transformers
1251
+ ```
1252
+
1253
+ **Complete Implementation:**
1254
+
1255
+ ```javascript
1256
+ const { VectorDb } = require('ruvector');
1257
+ const { pipeline } = require('@xenova/transformers');
1258
+
1259
+ class SemanticSearchEngine {
1260
+ constructor() {
1261
+ this.db = null;
1262
+ this.embedder = null;
1263
+ }
1264
+
1265
+ // Step 1: Initialize the embedding model
1266
+ async initialize() {
1267
+ console.log('🚀 Initializing semantic search engine...');
1268
+
1269
+ // Load sentence-transformers model (runs locally, no API needed!)
1270
+ console.log('📥 Loading embedding model...');
1271
+ this.embedder = await pipeline(
1272
+ 'feature-extraction',
1273
+ 'Xenova/all-MiniLM-L6-v2'
1274
+ );
1275
+
1276
+ // Create vector database (384 dimensions for all-MiniLM-L6-v2)
1277
+ this.db = new VectorDb({
1278
+ dimensions: 384,
1279
+ maxElements: 50000,
1280
+ storagePath: './semantic-search.db'
1281
+ });
1282
+
1283
+ console.log('✅ Search engine ready!\n');
1284
+ }
1285
+
1286
+ // Step 2: Generate embeddings
1287
+ async embed(text) {
1288
+ const output = await this.embedder(text, {
1289
+ pooling: 'mean',
1290
+ normalize: true
1291
+ });
1292
+
1293
+ // Convert to Float32Array
1294
+ return new Float32Array(output.data);
1295
+ }
1296
+
1297
+ // Step 3: Index documents
1298
+ async indexDocuments(documents) {
1299
+ console.log(`📚 Indexing ${documents.length} documents...`);
1300
+
1301
+ for (const doc of documents) {
1302
+ const vector = await this.embed(doc.content);
1303
+
1304
+ await this.db.insert({
1305
+ id: doc.id,
1306
+ vector: vector,
1307
+ metadata: {
1308
+ title: doc.title,
1309
+ content: doc.content,
1310
+ category: doc.category,
1311
+ url: doc.url
1312
+ }
1313
+ });
1314
+
1315
+ console.log(` ✅ ${doc.title}`);
1316
+ }
1317
+
1318
+ const count = await this.db.len();
1319
+ console.log(`\n✅ Indexed ${count} documents\n`);
1320
+ }
1321
+
1322
+ // Step 4: Semantic search
1323
+ async search(query, options = {}) {
1324
+ const {
1325
+ k = 5,
1326
+ category = null,
1327
+ threshold = 0.3
1328
+ } = options;
1329
+
1330
+ console.log(`🔍 Searching for: "${query}"`);
1331
+
1332
+ // Generate query embedding
1333
+ const queryVector = await this.embed(query);
1334
+
1335
+ // Search vector database
1336
+ const results = await this.db.search({
1337
+ vector: queryVector,
1338
+ k: k * 2, // Get more results for filtering
1339
+ threshold: threshold
1340
+ });
1341
+
1342
+ // Filter by category if specified
1343
+ let filtered = results;
1344
+ if (category) {
1345
+ filtered = results.filter(r => r.metadata.category === category);
1346
+ }
1347
+
1348
+ // Return top k after filtering
1349
+ const final = filtered.slice(0, k);
1350
+
1351
+ console.log(`📄 Found ${final.length} results\n`);
1352
+
1353
+ return final.map(r => ({
1354
+ id: r.id,
1355
+ title: r.metadata.title,
1356
+ content: r.metadata.content,
1357
+ category: r.metadata.category,
1358
+ score: r.score,
1359
+ url: r.metadata.url
1360
+ }));
1361
+ }
1362
+
1363
+ // Step 5: Find similar documents
1364
+ async findSimilar(documentId, k = 5) {
1365
+ const doc = await this.db.get(documentId);
1366
+
1367
+ if (!doc) {
1368
+ throw new Error(`Document ${documentId} not found`);
1369
+ }
1370
+
1371
+ const results = await this.db.search({
1372
+ vector: doc.vector,
1373
+ k: k + 1 // +1 because the document itself will be included
1374
+ });
1375
+
1376
+ // Remove the document itself from results
1377
+ return results
1378
+ .filter(r => r.id !== documentId)
1379
+ .slice(0, k);
1380
+ }
1381
+ }
1382
+
1383
+ // Example Usage
1384
+ async function main() {
1385
+ const engine = new SemanticSearchEngine();
1386
+ await engine.initialize();
1387
+
1388
+ // Sample documents (in production, load from your database)
1389
+ const documents = [
1390
+ {
1391
+ id: '1',
1392
+ title: 'Understanding Neural Networks',
1393
+ content: 'Neural networks are computing systems inspired by biological neural networks. They learn to perform tasks by considering examples.',
1394
+ category: 'AI',
1395
+ url: '/docs/neural-networks'
1396
+ },
1397
+ {
1398
+ id: '2',
1399
+ title: 'Introduction to Machine Learning',
1400
+ content: 'Machine learning is a subset of artificial intelligence that provides systems the ability to learn and improve from experience.',
1401
+ category: 'AI',
1402
+ url: '/docs/machine-learning'
1403
+ },
1404
+ {
1405
+ id: '3',
1406
+ title: 'Web Development Best Practices',
1407
+ content: 'Modern web development involves responsive design, performance optimization, and accessibility considerations.',
1408
+ category: 'Web',
1409
+ url: '/docs/web-dev'
1410
+ },
1411
+ {
1412
+ id: '4',
1413
+ title: 'Deep Learning Applications',
1414
+ content: 'Deep learning has revolutionized computer vision, natural language processing, and speech recognition.',
1415
+ category: 'AI',
1416
+ url: '/docs/deep-learning'
1417
+ }
1418
+ ];
1419
+
1420
+ // Index documents
1421
+ await engine.indexDocuments(documents);
1422
+
1423
+ // Example 1: Basic semantic search
1424
+ console.log('Example 1: Basic Search\n' + '='.repeat(60));
1425
+ const results1 = await engine.search('AI and neural nets');
1426
+ results1.forEach((result, i) => {
1427
+ console.log(`${i + 1}. ${result.title} (Score: ${result.score.toFixed(3)})`);
1428
+ console.log(` ${result.content.slice(0, 80)}...`);
1429
+ console.log(` Category: ${result.category}\n`);
1430
+ });
1431
+
1432
+ // Example 2: Category-filtered search
1433
+ console.log('\nExample 2: Category-Filtered Search\n' + '='.repeat(60));
1434
+ const results2 = await engine.search('learning algorithms', {
1435
+ category: 'AI',
1436
+ k: 3
1437
+ });
1438
+ results2.forEach((result, i) => {
1439
+ console.log(`${i + 1}. ${result.title} (Score: ${result.score.toFixed(3)})`);
1440
+ });
1441
+
1442
+ // Example 3: Find similar documents
1443
+ console.log('\n\nExample 3: Find Similar Documents\n' + '='.repeat(60));
1444
+ const similar = await engine.findSimilar('1', 2);
1445
+ console.log('Documents similar to "Understanding Neural Networks":');
1446
+ similar.forEach((doc, i) => {
1447
+ console.log(`${i + 1}. ${doc.metadata.title} (Score: ${doc.score.toFixed(3)})`);
1448
+ });
1449
+ }
1450
+
1451
+ main().catch(console.error);
1452
+ ```
1453
+
1454
+ **Key Features:**
1455
+ - ✅ Runs completely locally (no API keys needed)
1456
+ - ✅ Understands semantic meaning, not just keywords
1457
+ - ✅ Category filtering for better results
1458
+ - ✅ "Find similar" functionality
1459
+ - ✅ Fast: ~10ms query latency
1460
+
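The `pooling: 'mean'` and `normalize: true` options used in `embed()` above correspond to averaging the model's token embeddings and scaling the result to unit length. A plain-JS illustration of that post-processing (not the transformers.js internals):

```javascript
// Mean-pool a [tokens x dims] matrix of token embeddings and
// L2-normalize the result, mirroring { pooling: 'mean', normalize: true }.
function meanPoolNormalize(tokenEmbeddings) {
  const dims = tokenEmbeddings[0].length;
  const pooled = new Float32Array(dims);
  for (const token of tokenEmbeddings) {
    for (let i = 0; i < dims; i++) {
      pooled[i] += token[i] / tokenEmbeddings.length;
    }
  }
  let norm = 0;
  for (let i = 0; i < dims; i++) norm += pooled[i] * pooled[i];
  norm = Math.sqrt(norm);
  for (let i = 0; i < dims; i++) pooled[i] /= norm;
  return pooled;
}
```

Unit-length vectors are why cosine similarity works well as the distance metric here: for normalized vectors it reduces to a dot product.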
1461
+ ---
1462
+
1463
+ ### Tutorial 3: AI Agent Memory System
1464
+
1465
+ **What you'll learn:** Implement a memory system for AI agents that remembers past experiences and learns from them.
1466
+
1467
+ **Complete Implementation:**
1468
+
1469
+ ```javascript
1470
+ const { VectorDb } = require('ruvector');
1471
+
1472
+ class AgentMemory {
1473
+ constructor(agentId) {
1474
+ this.agentId = agentId;
1475
+
1476
+ // Create separate databases for different memory types
1477
+ this.episodicMemory = new VectorDb({
1478
+ dimensions: 768,
1479
+ storagePath: `./memory/${agentId}-episodic.db`
1480
+ });
1481
+
1482
+ this.semanticMemory = new VectorDb({
1483
+ dimensions: 768,
1484
+ storagePath: `./memory/${agentId}-semantic.db`
1485
+ });
1486
+
1487
+ console.log(`🧠 Memory system initialized for agent: ${agentId}`);
1488
+ }
1489
+
1490
+ // Step 1: Store an experience (episodic memory)
1491
+ async storeExperience(experience) {
1492
+ const {
1493
+ state,
1494
+ action,
1495
+ result,
1496
+ reward,
1497
+ embedding
1498
+ } = experience;
1499
+
1500
+ const experienceId = `exp_${Date.now()}_${Math.random()}`;
1501
+
1502
+ await this.episodicMemory.insert({
1503
+ id: experienceId,
1504
+ vector: new Float32Array(embedding),
1505
+ metadata: {
1506
+ state: state,
1507
+ action: action,
1508
+ result: result,
1509
+ reward: reward,
1510
+ timestamp: Date.now(),
1511
+ type: 'episodic'
1512
+ }
1513
+ });
1514
+
1515
+ console.log(`💾 Stored experience: ${action} -> ${result} (reward: ${reward})`);
1516
+ return experienceId;
1517
+ }
1518
+
1519
+ // Step 2: Store learned knowledge (semantic memory)
1520
+ async storeKnowledge(knowledge) {
1521
+ const {
1522
+ concept,
1523
+ description,
1524
+ embedding,
1525
+ confidence = 1.0
1526
+ } = knowledge;
1527
+
1528
+ const knowledgeId = `know_${Date.now()}`;
1529
+
1530
+ await this.semanticMemory.insert({
1531
+ id: knowledgeId,
1532
+ vector: new Float32Array(embedding),
1533
+ metadata: {
1534
+ concept: concept,
1535
+ description: description,
1536
+ confidence: confidence,
1537
+ learned: Date.now(),
1538
+ uses: 0,
1539
+ type: 'semantic'
1540
+ }
1541
+ });
1542
+
1543
+ console.log(`📚 Learned: ${concept}`);
1544
+ return knowledgeId;
1545
+ }
1546
+
1547
+ // Step 3: Recall similar experiences
1548
+ async recallExperiences(currentState, k = 5) {
1549
+ console.log(`🔍 Recalling similar experiences...`);
1550
+
1551
+ const results = await this.episodicMemory.search({
1552
+ vector: new Float32Array(currentState.embedding),
1553
+ k: k,
1554
+ threshold: 0.6 // Only recall reasonably similar experiences
1555
+ });
1556
+
1557
+ // Sort by reward to prioritize successful experiences
1558
+ const sorted = results.sort((a, b) => b.metadata.reward - a.metadata.reward);
1559
+
1560
+ console.log(`📝 Recalled ${sorted.length} relevant experiences`);
1561
+
1562
+ return sorted.map(r => ({
1563
+ state: r.metadata.state,
1564
+ action: r.metadata.action,
1565
+ result: r.metadata.result,
1566
+ reward: r.metadata.reward,
1567
+ similarity: r.score
1568
+ }));
1569
+ }
1570
+
1571
+ // Step 4: Query knowledge base
1572
+ async queryKnowledge(query, k = 3) {
1573
+ const results = await this.semanticMemory.search({
1574
+ vector: new Float32Array(query.embedding),
1575
+ k: k
1576
+ });
1577
+
1578
+ // Update usage statistics
1579
+ for (const result of results) {
1580
+ const knowledge = await this.semanticMemory.get(result.id);
1581
+ if (knowledge) {
1582
+ knowledge.metadata.uses += 1;
1583
+ // In production, update the entry
1584
+ }
1585
+ }
1586
+
1587
+ return results.map(r => ({
1588
+ concept: r.metadata.concept,
1589
+ description: r.metadata.description,
1590
+ confidence: r.metadata.confidence,
1591
+ relevance: r.score
1592
+ }));
1593
+ }
1594
+
1595
+ // Step 5: Reflect and learn from experiences
1596
+ async reflect() {
1597
+ console.log('\n🤔 Reflecting on experiences...');
1598
+
1599
+ // Get all experiences
1600
+ const totalExperiences = await this.episodicMemory.len();
1601
+ console.log(`📊 Total experiences: ${totalExperiences}`);
1602
+
1603
+ // Analyze success rate
1604
+ // In production, you'd aggregate experiences and extract patterns
1605
+ console.log('💡 Analysis complete');
1606
+
1607
+ return {
1608
+ totalExperiences: totalExperiences,
1609
+ knowledgeItems: await this.semanticMemory.len()
1610
+ };
1611
+ }
1612
+
1613
+ // Step 6: Get memory statistics
1614
+ async getStats() {
1615
+ return {
1616
+ episodicMemorySize: await this.episodicMemory.len(),
1617
+ semanticMemorySize: await this.semanticMemory.len(),
1618
+ agentId: this.agentId
1619
+ };
1620
+ }
1621
+ }
1622
+
1623
+ // Example Usage: Simulated agent learning to navigate
1624
+ async function main() {
1625
+ const agent = new AgentMemory('agent-001');
1626
+
1627
+ // Simulate embedding function (in production, use a real model)
1628
+ function embed(text) {
1629
+ return Array(768).fill(0).map(() => Math.random());
1630
+ }
1631
+
1632
+ console.log('\n' + '='.repeat(60));
1633
+ console.log('PHASE 1: Learning from experiences');
1634
+ console.log('='.repeat(60) + '\n');
1635
+
1636
+ // Store some experiences
1637
+ await agent.storeExperience({
1638
+ state: { location: 'room1', goal: 'room3' },
1639
+ action: 'move_north',
1640
+ result: 'reached room2',
1641
+ reward: 0.5,
1642
+ embedding: embed('navigating from room1 to room2')
1643
+ });
1644
+
1645
+ await agent.storeExperience({
1646
+ state: { location: 'room2', goal: 'room3' },
1647
+ action: 'move_east',
1648
+ result: 'reached room3',
1649
+ reward: 1.0,
1650
+ embedding: embed('navigating from room2 to room3')
1651
+ });
1652
+
1653
+ await agent.storeExperience({
1654
+ state: { location: 'room1', goal: 'room3' },
1655
+ action: 'move_south',
1656
+ result: 'hit wall',
1657
+ reward: -0.5,
1658
+ embedding: embed('failed navigation attempt')
1659
+ });
1660
+
1661
+ // Store learned knowledge
1662
+ await agent.storeKnowledge({
1663
+ concept: 'navigation_strategy',
1664
+ description: 'Moving north then east is efficient for reaching room3 from room1',
1665
+ embedding: embed('navigation strategy knowledge'),
1666
+ confidence: 0.9
1667
+ });
1668
+
1669
+ console.log('\n' + '='.repeat(60));
1670
+ console.log('PHASE 2: Applying memory');
1671
+ console.log('='.repeat(60) + '\n');
1672
+
1673
+ // Agent encounters a similar situation
1674
+ const currentState = {
1675
+ location: 'room1',
1676
+ goal: 'room3',
1677
+ embedding: embed('navigating from room1 to room3')
1678
+ };
1679
+
1680
+ // Recall relevant experiences
1681
+ const experiences = await agent.recallExperiences(currentState, 3);
1682
+
1683
+ console.log('\n📖 Recalled experiences:');
1684
+ experiences.forEach((exp, i) => {
1685
+ console.log(`${i + 1}. Action: ${exp.action} | Result: ${exp.result} | Reward: ${exp.reward} | Similarity: ${exp.similarity.toFixed(3)}`);
1686
+ });
1687
+
1688
+ // Query relevant knowledge
1689
+ const knowledge = await agent.queryKnowledge({
1690
+ embedding: embed('how to navigate efficiently')
1691
+ }, 2);
1692
+
1693
+ console.log('\n📚 Relevant knowledge:');
1694
+ knowledge.forEach((k, i) => {
1695
+ console.log(`${i + 1}. ${k.concept}: ${k.description} (confidence: ${k.confidence})`);
1696
+ });
1697
+
1698
+ console.log('\n' + '='.repeat(60));
1699
+ console.log('PHASE 3: Reflection');
1700
+ console.log('='.repeat(60) + '\n');
1701
+
1702
+ // Reflect on learning
1703
+ await agent.reflect();
1704
+ const memoryStats = await agent.getStats();
1705
+
1706
+ console.log('\n📊 Memory Statistics:');
1707
+ console.log(` Episodic memories: ${memoryStats.episodicMemorySize}`);
1708
+ console.log(` Semantic knowledge: ${memoryStats.semanticMemorySize}`);
1709
+ console.log(` Agent ID: ${memoryStats.agentId}`);
1710
+ }
1711
+
1712
+ main().catch(console.error);
1713
+ ```
1714
+
1715
+ **Expected Output:**
1716
+ ```
1717
+ 🧠 Memory system initialized for agent: agent-001
1718
+
1719
+ ============================================================
1720
+ PHASE 1: Learning from experiences
1721
+ ============================================================
1722
+
1723
+ 💾 Stored experience: move_north -> reached room2 (reward: 0.5)
1724
+ 💾 Stored experience: move_east -> reached room3 (reward: 1.0)
1725
+ 💾 Stored experience: move_south -> hit wall (reward: -0.5)
1726
+ 📚 Learned: navigation_strategy
1727
+
1728
+ ============================================================
1729
+ PHASE 2: Applying memory
1730
+ ============================================================
1731
+
1732
+ 🔍 Recalling similar experiences...
1733
+ 📝 Recalled 3 relevant experiences
1734
+
1735
+ 📖 Recalled experiences:
1736
+ 1. Action: move_east | Result: reached room3 | Reward: 1.0 | Similarity: 0.892
1737
+ 2. Action: move_north | Result: reached room2 | Reward: 0.5 | Similarity: 0.876
1738
+ 3. Action: move_south | Result: hit wall | Reward: -0.5 | Similarity: 0.654
1739
+
1740
+ 📚 Relevant knowledge:
1741
+ 1. navigation_strategy: Moving north then east is efficient for reaching room3 from room1 (confidence: 0.9)
1742
+
1743
+ ============================================================
1744
+ PHASE 3: Reflection
1745
+ ============================================================
1746
+
1747
+ 🤔 Reflecting on experiences...
1748
+ 📊 Total experiences: 3
1749
+ 💡 Analysis complete
1750
+
1751
+ 📊 Memory Statistics:
1752
+ Episodic memories: 3
1753
+ Semantic knowledge: 1
1754
+ Agent ID: agent-001
1755
+ ```
1756
+
1757
+ **Use Cases:**
1758
+ - ✅ Reinforcement learning agents
1759
+ - ✅ Chatbot conversation history
1760
+ - ✅ Game AI that learns from gameplay
1761
+ - ✅ Personal assistant memory
1762
+ - ✅ Robotic navigation systems
1763
+
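The `reflect()` method above leaves pattern extraction as a stub. One minimal analysis is a success rate over the recalled rewards; `summarizeRewards` is a hypothetical helper written for this sketch (it treats positive reward as success), not part of Ruvector:

```javascript
// Aggregate a list of experience rewards into simple statistics.
// Assumption of this sketch: reward > 0 counts as a success.
function summarizeRewards(rewards) {
  const total = rewards.length;
  const successes = rewards.filter(r => r > 0).length;
  const sum = rewards.reduce((a, b) => a + b, 0);
  return {
    total,
    successRate: total > 0 ? successes / total : 0,
    meanReward: total > 0 ? sum / total : 0
  };
}

// With the three experiences from the example above (0.5, 1.0, -0.5):
console.log(summarizeRewards([0.5, 1.0, -0.5]));
```

In a fuller implementation, `reflect()` could run this over experiences grouped by action and promote high-success patterns into semantic memory via `storeKnowledge()`.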
1764
+ ## 🏗️ API Reference
1765
+
1766
+ ### Constructor
1767
+
1768
+ ```typescript
1769
+ new VectorDb(options: {
1770
+ dimensions: number; // Vector dimensionality (required)
1771
+ maxElements?: number; // Max vectors (default: 10000)
1772
+ storagePath?: string; // Persistent storage path
1773
+ ef_construction?: number; // HNSW construction parameter (default: 200)
1774
+ m?: number; // HNSW M parameter (default: 16)
1775
+ distanceMetric?: string; // 'cosine', 'euclidean', or 'dot' (default: 'cosine')
1776
+ })
1777
+ ```
1778
+
1779
+ ### Methods
1780
+
1781
+ #### insert(entry: VectorEntry): Promise<string>
1782
+ Insert a vector into the database.
1783
+
1784
+ ```javascript
1785
+ const id = await db.insert({
1786
+ id: 'doc_1',
1787
+ vector: new Float32Array([0.1, 0.2, 0.3, ...]),
1788
+ metadata: { title: 'Document 1' }
1789
+ });
1790
+ ```
1791
+
1792
+ #### search(query: SearchQuery): Promise<SearchResult[]>
1793
+ Search for similar vectors.
1794
+
1795
+ ```javascript
1796
+ const results = await db.search({
1797
+ vector: new Float32Array([0.1, 0.2, 0.3, ...]),
1798
+ k: 10,
1799
+ threshold: 0.7
1800
+ });
1801
+ ```
1802
+
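With the default cosine metric, the tutorials above treat `score` as a similarity (higher = closer, with 1 meaning identical direction), so `threshold: 0.7` keeps only close matches. A plain-JS sketch of the underlying computation, for intuition rather than Ruvector's internal code:

```javascript
// Cosine similarity between two vectors of equal length.
// Returns a value in roughly [-1, 1]; 1 = same direction.
function cosineSimilarity(a, b) {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

console.log(cosineSimilarity([1, 0], [1, 0])); // 1 (identical direction)
console.log(cosineSimilarity([1, 0], [0, 1])); // 0 (orthogonal)
```

This is why a threshold around 0.6–0.7 is a reasonable starting point for "relevant" in the tutorials: orthogonal (unrelated) vectors score near 0.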
1803
+ #### get(id: string): Promise<VectorEntry | null>
1804
+ Retrieve a vector by ID.
1805
+
1806
+ ```javascript
1807
+ const entry = await db.get('doc_1');
1808
+ if (entry) {
1809
+ console.log(entry.vector, entry.metadata);
1810
+ }
1811
+ ```
1812
+
1813
+ #### delete(id: string): Promise<boolean>
1814
+ Remove a vector from the database.
1815
+
1816
+ ```javascript
1817
+ const deleted = await db.delete('doc_1');
1818
+ console.log(deleted ? 'Deleted' : 'Not found');
1819
+ ```
1820
+
1821
+ #### len(): Promise<number>
1822
+ Get the total number of vectors.
1823
+
1824
+ ```javascript
1825
+ const count = await db.len();
1826
+ console.log(`Total vectors: ${count}`);
1827
+ ```
1828
+
1829
+ ## 🎨 Advanced Configuration
1830
+
1831
+ ### HNSW Parameters
1832
+
1833
+ ```javascript
1834
+ const db = new VectorDb({
1835
+ dimensions: 384,
1836
+ maxElements: 1000000,
1837
+ ef_construction: 200, // Higher = better recall, slower build
1838
+ m: 16, // Higher = better recall, more memory
1839
+ storagePath: './large-db.db'
1840
+ });
1841
+ ```
1842
+
1843
+ **Parameter Guidelines:**
1844
+ - `ef_construction`: 100-400 (higher = better recall, slower indexing)
1845
+ - `m`: 8-64 (higher = better recall, more memory)
1846
+ - Default values work well for most use cases
1847
+
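As a rough way to reason about the memory cost of `m`: an HNSW index stores the raw vector data plus graph links per element. The estimate below follows the common hnswlib-style approximation and is an assumption of this sketch, not Ruvector's exact accounting:

```javascript
// Approximate HNSW memory per vector:
//   raw float32 data : dimensions * 4 bytes
//   layer-0 links    : 2 * m neighbors * 4 bytes (uint32 ids)
// Upper layers add a small extra factor this sketch ignores.
function approxBytesPerVector(dimensions, m) {
  return dimensions * 4 + 2 * m * 4;
}

console.log(approxBytesPerVector(384, 16)); // 1664 bytes, unquantized
```

Quantization (e.g. PQ, listed under Features above) compresses the raw-data term substantially, which is how per-vector footprints far below the raw float32 size are reached.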
1848
+ ### Distance Metrics
1849
+
1850
+ ```javascript
1851
+ // Cosine similarity (default, best for normalized vectors)
1852
+ const db1 = new VectorDb({
1853
+ dimensions: 128,
1854
+ distanceMetric: 'cosine'
1855
+ });
1856
+
1857
+ // Euclidean distance (L2, best for spatial data)
1858
+ const db2 = new VectorDb({
1859
+ dimensions: 128,
1860
+ distanceMetric: 'euclidean'
1861
+ });
1862
+
1863
+ // Dot product (best for pre-normalized vectors)
1864
+ const db3 = new VectorDb({
1865
+ dimensions: 128,
1866
+ distanceMetric: 'dot'
1867
+ });
1868
+ ```
1869
+
1870
### Persistence

```javascript
// Auto-save to disk
const persistent = new VectorDb({
  dimensions: 128,
  storagePath: './persistent.db'
});

// In-memory only (faster, but data lost on exit)
const temporary = new VectorDb({
  dimensions: 128
  // No storagePath = in-memory
});
```

## 📦 Platform Support

The package automatically installs the correct implementation for your platform:

### Native (Rust) - Best Performance
- **Linux**: x64, ARM64 (GNU libc)
- **macOS**: x64 (Intel), ARM64 (Apple Silicon)
- **Windows**: x64 (MSVC)

Performance: **<0.5ms latency**, **50K+ ops/sec**

### WASM Fallback - Universal Compatibility
- Any platform where the native module isn't available
- Browser environments (experimental)
- Alpine Linux (musl) and other non-glibc systems

Performance: **10-50ms latency**, **~1K ops/sec**

**Node.js 18+ required** for all platforms.

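If you want to fail fast on unsupported runtimes, a small version guard like the following works (the `meetsNodeRequirement` helper is an illustrative sketch, not a ruvector export):

```javascript
// Guard against running on a Node.js major version older than required.
function meetsNodeRequirement(version = process.versions.node, minMajor = 18) {
  const major = Number(version.split('.')[0]);
  return Number.isInteger(major) && major >= minMajor;
}

if (!meetsNodeRequirement()) {
  console.error(`ruvector requires Node.js 18+, found ${process.version}`);
}
```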
1906
## 🔧 Building from Source

If you need to rebuild the native module:

```bash
# Install Rust toolchain
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh

# Clone repository
git clone https://github.com/ruvnet/ruvector.git
cd ruvector

# Build native module
cd npm/packages/core
npm run build:napi

# Build wrapper package
cd ../ruvector
npm install
npm run build

# Run tests
npm test
```

**Requirements:**
- Rust 1.77+
- Node.js 18+
- Cargo

## 🌍 Ecosystem

### Related Packages

- **[ruvector-core](https://www.npmjs.com/package/ruvector-core)** - Core native bindings (lower-level API)
- **[ruvector-wasm](https://www.npmjs.com/package/ruvector-wasm)** - WebAssembly implementation for browsers
- **[ruvector-cli](https://www.npmjs.com/package/ruvector-cli)** - Standalone CLI tools
- **[@ruvector/rvf](https://www.npmjs.com/package/@ruvector/rvf)** - RVF cognitive container SDK
- **[@ruvector/rvf-wasm](https://www.npmjs.com/package/@ruvector/rvf-wasm)** - RVF WASM build for browsers, Deno, and edge
- **[rvlite](https://www.npmjs.com/package/rvlite)** - Lightweight vector database with SQL, SPARQL, and Cypher

### Platform-Specific Packages (auto-installed)

- **[ruvector-core-linux-x64-gnu](https://www.npmjs.com/package/ruvector-core-linux-x64-gnu)**
- **[ruvector-core-linux-arm64-gnu](https://www.npmjs.com/package/ruvector-core-linux-arm64-gnu)**
- **[ruvector-core-darwin-x64](https://www.npmjs.com/package/ruvector-core-darwin-x64)**
- **[ruvector-core-darwin-arm64](https://www.npmjs.com/package/ruvector-core-darwin-arm64)**
- **[ruvector-core-win32-x64-msvc](https://www.npmjs.com/package/ruvector-core-win32-x64-msvc)**

---

1957
## RVF Cognitive Containers

Ruvector integrates with [RVF (RuVector Format)](https://github.com/ruvnet/ruvector/tree/main/crates/rvf) — a universal binary substrate that stores vectors, models, graphs, compute kernels, and attestation in a single `.rvf` file.

### Enable RVF Backend

```bash
# Install the optional RVF package
npm install @ruvector/rvf

# Set backend via environment variable
export RUVECTOR_BACKEND=rvf

# Or detect automatically (native -> rvf -> wasm fallback)
npx ruvector info
```

```typescript
import { getImplementationType, isRvf } from 'ruvector';

console.log(getImplementationType()); // 'native' | 'rvf' | 'wasm'
console.log(isRvf()); // true if RVF backend is active
```

### RVF CLI Commands

Eight RVF-specific subcommands are available through the `ruvector` CLI:

```bash
# Create an RVF store
npx ruvector rvf create mydb.rvf -d 384 --metric cosine

# Ingest vectors from JSON
npx ruvector rvf ingest mydb.rvf --input vectors.json --format json

# Query nearest neighbors
npx ruvector rvf query mydb.rvf --vector "[0.1,0.2,...]" --k 10

# File status and segment listing
npx ruvector rvf status mydb.rvf
npx ruvector rvf segments mydb.rvf

# COW branching — derive a child file
npx ruvector rvf derive mydb.rvf --output child.rvf

# Compact and reclaim space
npx ruvector rvf compact mydb.rvf

# Export to JSON
npx ruvector rvf export mydb.rvf --output dump.json
```

### RVF Platform Support

| Platform | Runtime | Backend |
|----------|---------|---------|
| Linux x86_64 / aarch64 | Node.js 18+ | Native (N-API) |
| macOS x86_64 / arm64 | Node.js 18+ | Native (N-API) |
| Windows x86_64 | Node.js 18+ | Native (N-API) |
| Any | Deno | WASM (`@ruvector/rvf-wasm`) |
| Any | Browser | WASM (`@ruvector/rvf-wasm`) |
| Any | Cloudflare Workers | WASM (`@ruvector/rvf-wasm`) |

### Download Example .rvf Files

45 pre-built example files are available (~11 MB total):

```bash
# Download a specific example
curl -LO https://raw.githubusercontent.com/ruvnet/ruvector/main/examples/rvf/output/basic_store.rvf

# Popular examples:
# basic_store.rvf (152 KB) — 1,000 vectors, dim 128
# semantic_search.rvf (755 KB) — Semantic search with HNSW
# rag_pipeline.rvf (303 KB) — RAG pipeline embeddings
# agent_memory.rvf (32 KB) — AI agent memory store
# self_booting.rvf (31 KB) — Self-booting with kernel
# progressive_index.rvf (2.5 MB) — Large-scale HNSW index

# Generate all examples locally
cd crates/rvf && cargo run --example generate_all
```

Full catalog: [examples/rvf/output/](https://github.com/ruvnet/ruvector/tree/main/examples/rvf/output)

2042
### Working Examples: Cognitive Containers

#### Self-Booting Microservice

A single `.rvf` file that contains vectors AND a bootable Linux kernel:

```bash
# Build and run the self-booting example
cd crates/rvf && cargo run --example self_booting
# Output:
# Ingested 50 vectors (128 dims)
# Pre-kernel query: top-5 results OK (nearest ID=25)
# Kernel: 4,640 bytes embedded (x86_64, Hermit)
# Witness chain: 5 entries, all verified
# File: bootable.rvf (31 KB) — data + runtime in one file
```

```rust
// The pattern: vectors + kernel + witness in one file
let mut store = RvfStore::create("bootable.rvf", options)?;
store.ingest_batch(&vectors, &ids, None)?;
store.embed_kernel(KernelArch::X86_64 as u8, KernelType::Hermit as u8,
    0x0018, &kernel_image, 8080, Some("console=ttyS0 quiet"))?;
// Result: drop on a VM and it boots as a query service
```

#### Linux Microkernel Distribution

A 20-package Linux distro with SSH keys and a kernel in a single file:

```bash
cd crates/rvf && cargo run --example linux_microkernel
# Output:
# Installed 20 packages as vector embeddings
# Kernel embedded: Linux x86_64 (4,640 bytes)
# SSH keys: Ed25519, signed and verified
# Witness chain: 22 entries (1 per package + kernel + SSH)
# File: microkernel.rvf (14 KB) — immutable bootable system
```

Features: package search by embedding similarity, Ed25519-signed SSH keys, witness-audited installs, and COW-derived child images for atomic updates.

#### Claude Code AI Appliance

A sealed, bootable AI development environment:

```bash
cd crates/rvf && cargo run --example claude_code_appliance
# Output:
# 20 dev packages (rust, node, python, docker, ...)
# Kernel: Linux x86_64 with SSH on port 2222
# eBPF: XDP distance program for fast-path lookups
# Witness chain: 6 entries, all verified
# Crypto: Ed25519 signature
# File: claude_code_appliance.rvf (17 KB)
```

#### CLI Full Lifecycle

```bash
# Create → Ingest → Query → Derive → Inspect
rvf create vectors.rvf --dimension 384
rvf ingest vectors.rvf --input data.json --format json
rvf query vectors.rvf --vector "0.1,0.2,..." --k 10
rvf derive vectors.rvf child.rvf --type filter
rvf inspect vectors.rvf

# Embed kernel and launch as microVM
rvf embed-kernel vectors.rvf --image bzImage
rvf launch vectors.rvf --port 8080

# Verify tamper-evident witness chain
rvf verify-witness vectors.rvf
rvf verify-attestation vectors.rvf
```

#### Integration Tests (46 passing)

```bash
cd crates/rvf
cargo test --workspace
# attestation .............. 6 passed
# crypto ................... 10 passed
# computational_container .. 8 passed
# cow_branching ............ 8 passed
# cross_platform ........... 6 passed
# lineage .................. 4 passed
# smoke .................... 4 passed
# Total: 46/46 passed
```

2133
## 🐛 Troubleshooting

### Native Module Not Loading

If you see "Cannot find module 'ruvector-core-*'":

```bash
# Reinstall with optional dependencies
npm install --include=optional ruvector

# Verify platform
npx ruvector info

# Check Node.js version (18+ required)
node --version
```

### WASM Fallback Performance

If you're on the WASM fallback and need better performance:

1. **Install the native toolchain** for your platform
2. **Rebuild the native module**: `npm rebuild ruvector`
3. **Verify native mode**: `npx ruvector info` should show "native (Rust)"

### Platform Compatibility

- **Alpine Linux**: Uses the WASM fallback (musl is not supported by the native build)
- **Windows ARM**: Not yet supported; uses the WASM fallback
- **Node.js < 18**: Not supported; upgrade to Node.js 18+

## 📚 Documentation

- 🏠 [Homepage](https://ruv.io)
- 📦 [GitHub Repository](https://github.com/ruvnet/ruvector)
- 📚 [Full Documentation](https://github.com/ruvnet/ruvector/tree/main/docs)
- 🚀 [Getting Started Guide](https://github.com/ruvnet/ruvector/blob/main/docs/guide/GETTING_STARTED.md)
- 📖 [API Reference](https://github.com/ruvnet/ruvector/blob/main/docs/api/NODEJS_API.md)
- 🎯 [Performance Tuning](https://github.com/ruvnet/ruvector/blob/main/docs/optimization/PERFORMANCE_TUNING_GUIDE.md)
- 🐛 [Issue Tracker](https://github.com/ruvnet/ruvector/issues)
- 💬 [Discussions](https://github.com/ruvnet/ruvector/discussions)

2175
## 🤝 Contributing

We welcome contributions! See [CONTRIBUTING.md](https://github.com/ruvnet/ruvector/blob/main/docs/development/CONTRIBUTING.md) for guidelines.

### Quick Start

1. Fork the repository
2. Create a feature branch: `git checkout -b feature/amazing-feature`
3. Commit changes: `git commit -m 'Add amazing feature'`
4. Push to branch: `git push origin feature/amazing-feature`
5. Open a Pull Request

## 🌐 Community & Support

- **GitHub**: [github.com/ruvnet/ruvector](https://github.com/ruvnet/ruvector) - ⭐ Star and follow
- **Discord**: [Join our community](https://discord.gg/ruvnet) - Chat with developers
- **Twitter**: [@ruvnet](https://twitter.com/ruvnet) - Follow for updates
- **Issues**: [Report bugs](https://github.com/ruvnet/ruvector/issues)

### Enterprise Support

Need custom development or consulting?

📧 [enterprise@ruv.io](mailto:enterprise@ruv.io)

## 📜 License

**MIT License** - see [LICENSE](https://github.com/ruvnet/ruvector/blob/main/LICENSE) for details.

Free for commercial and personal use.

## 🙏 Acknowledgments

Built with battle-tested technologies:

- **HNSW**: Hierarchical Navigable Small World graphs
- **SIMD**: Hardware-accelerated vector operations via simsimd
- **Rust**: Memory-safe, zero-cost abstractions
- **NAPI-RS**: High-performance Node.js bindings
- **WebAssembly**: Universal browser compatibility

---

<div align="center">

**Built with ❤️ by [rUv](https://ruv.io)**

[![npm](https://img.shields.io/npm/v/ruvector.svg)](https://www.npmjs.com/package/ruvector)
[![GitHub Stars](https://img.shields.io/github/stars/ruvnet/ruvector?style=social)](https://github.com/ruvnet/ruvector)
[![Twitter](https://img.shields.io/twitter/follow/ruvnet?style=social)](https://twitter.com/ruvnet)

**[Get Started](https://github.com/ruvnet/ruvector/blob/main/docs/guide/GETTING_STARTED.md)** • **[Documentation](https://github.com/ruvnet/ruvector/tree/main/docs)** • **[API Reference](https://github.com/ruvnet/ruvector/blob/main/docs/api/NODEJS_API.md)** • **[Contributing](https://github.com/ruvnet/ruvector/blob/main/docs/development/CONTRIBUTING.md)**

</div>
bin/cli.js ADDED
The diff for this file is too large to render. See raw diff
 
bin/mcp-server.js ADDED
The diff for this file is too large to render. See raw diff
 
dist/analysis/complexity.d.ts ADDED
@@ -0,0 +1,52 @@
/**
 * Complexity Analysis Module - Consolidated code complexity metrics
 *
 * Single source of truth for cyclomatic complexity and code metrics.
 * Used by native-worker.ts and parallel-workers.ts
 */
export interface ComplexityResult {
    file: string;
    lines: number;
    nonEmptyLines: number;
    cyclomaticComplexity: number;
    functions: number;
    avgFunctionSize: number;
    maxFunctionComplexity?: number;
}
export interface ComplexityThresholds {
    complexity: number;
    functions: number;
    lines: number;
    avgSize: number;
}
export declare const DEFAULT_THRESHOLDS: ComplexityThresholds;
/**
 * Analyze complexity of a single file
 */
export declare function analyzeFile(filePath: string, content?: string): ComplexityResult;
/**
 * Analyze complexity of multiple files
 */
export declare function analyzeFiles(files: string[], maxFiles?: number): ComplexityResult[];
/**
 * Check if complexity exceeds thresholds
 */
export declare function exceedsThresholds(result: ComplexityResult, thresholds?: ComplexityThresholds): boolean;
/**
 * Get complexity rating
 */
export declare function getComplexityRating(complexity: number): 'low' | 'medium' | 'high' | 'critical';
/**
 * Filter files exceeding thresholds
 */
export declare function filterComplex(results: ComplexityResult[], thresholds?: ComplexityThresholds): ComplexityResult[];
declare const _default: {
    DEFAULT_THRESHOLDS: ComplexityThresholds;
    analyzeFile: typeof analyzeFile;
    analyzeFiles: typeof analyzeFiles;
    exceedsThresholds: typeof exceedsThresholds;
    getComplexityRating: typeof getComplexityRating;
    filterComplex: typeof filterComplex;
};
export default _default;
//# sourceMappingURL=complexity.d.ts.map
dist/analysis/complexity.d.ts.map ADDED
@@ -0,0 +1 @@
+ {"version":3,"file":"complexity.d.ts","sourceRoot":"","sources":["../../src/analysis/complexity.ts"],"names":[],"mappings":"AAAA;;;;;GAKG;AAIH,MAAM,WAAW,gBAAgB;IAC/B,IAAI,EAAE,MAAM,CAAC;IACb,KAAK,EAAE,MAAM,CAAC;IACd,aAAa,EAAE,MAAM,CAAC;IACtB,oBAAoB,EAAE,MAAM,CAAC;IAC7B,SAAS,EAAE,MAAM,CAAC;IAClB,eAAe,EAAE,MAAM,CAAC;IACxB,qBAAqB,CAAC,EAAE,MAAM,CAAC;CAChC;AAED,MAAM,WAAW,oBAAoB;IACnC,UAAU,EAAE,MAAM,CAAC;IACnB,SAAS,EAAE,MAAM,CAAC;IAClB,KAAK,EAAE,MAAM,CAAC;IACd,OAAO,EAAE,MAAM,CAAC;CACjB;AAED,eAAO,MAAM,kBAAkB,EAAE,oBAKhC,CAAC;AAEF;;GAEG;AACH,wBAAgB,WAAW,CAAC,QAAQ,EAAE,MAAM,EAAE,OAAO,CAAC,EAAE,MAAM,GAAG,gBAAgB,CAsDhF;AAED;;GAEG;AACH,wBAAgB,YAAY,CAAC,KAAK,EAAE,MAAM,EAAE,EAAE,QAAQ,GAAE,MAAY,GAAG,gBAAgB,EAAE,CAExF;AAED;;GAEG;AACH,wBAAgB,iBAAiB,CAC/B,MAAM,EAAE,gBAAgB,EACxB,UAAU,GAAE,oBAAyC,GACpD,OAAO,CAOT;AAED;;GAEG;AACH,wBAAgB,mBAAmB,CAAC,UAAU,EAAE,MAAM,GAAG,KAAK,GAAG,QAAQ,GAAG,MAAM,GAAG,UAAU,CAK9F;AAED;;GAEG;AACH,wBAAgB,aAAa,CAC3B,OAAO,EAAE,gBAAgB,EAAE,EAC3B,UAAU,GAAE,oBAAyC,GACpD,gBAAgB,EAAE,CAEpB;;;;;;;;;AAED,wBAOE"}
dist/analysis/complexity.js ADDED
@@ -0,0 +1,146 @@
"use strict";
/**
 * Complexity Analysis Module - Consolidated code complexity metrics
 *
 * Single source of truth for cyclomatic complexity and code metrics.
 * Used by native-worker.ts and parallel-workers.ts
 */
var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    var desc = Object.getOwnPropertyDescriptor(m, k);
    if (!desc || ("get" in desc ? !m.__esModule : desc.writable || desc.configurable)) {
        desc = { enumerable: true, get: function() { return m[k]; } };
    }
    Object.defineProperty(o, k2, desc);
}) : (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    o[k2] = m[k];
}));
var __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {
    Object.defineProperty(o, "default", { enumerable: true, value: v });
}) : function(o, v) {
    o["default"] = v;
});
var __importStar = (this && this.__importStar) || (function () {
    var ownKeys = function(o) {
        ownKeys = Object.getOwnPropertyNames || function (o) {
            var ar = [];
            for (var k in o) if (Object.prototype.hasOwnProperty.call(o, k)) ar[ar.length] = k;
            return ar;
        };
        return ownKeys(o);
    };
    return function (mod) {
        if (mod && mod.__esModule) return mod;
        var result = {};
        if (mod != null) for (var k = ownKeys(mod), i = 0; i < k.length; i++) if (k[i] !== "default") __createBinding(result, mod, k[i]);
        __setModuleDefault(result, mod);
        return result;
    };
})();
Object.defineProperty(exports, "__esModule", { value: true });
exports.DEFAULT_THRESHOLDS = void 0;
exports.analyzeFile = analyzeFile;
exports.analyzeFiles = analyzeFiles;
exports.exceedsThresholds = exceedsThresholds;
exports.getComplexityRating = getComplexityRating;
exports.filterComplex = filterComplex;
const fs = __importStar(require("fs"));
exports.DEFAULT_THRESHOLDS = {
    complexity: 10,
    functions: 30,
    lines: 500,
    avgSize: 50,
};
/**
 * Analyze complexity of a single file
 */
function analyzeFile(filePath, content) {
    try {
        const fileContent = content ?? (fs.existsSync(filePath) ? fs.readFileSync(filePath, 'utf-8') : '');
        if (!fileContent) {
            return { file: filePath, lines: 0, nonEmptyLines: 0, cyclomaticComplexity: 1, functions: 0, avgFunctionSize: 0 };
        }
        const lines = fileContent.split('\n');
        const nonEmptyLines = lines.filter(l => l.trim().length > 0).length;
        // Count branching statements for cyclomatic complexity
        const branches = (fileContent.match(/\bif\b/g)?.length || 0) +
            (fileContent.match(/\belse\b/g)?.length || 0) +
            (fileContent.match(/\bfor\b/g)?.length || 0) +
            (fileContent.match(/\bwhile\b/g)?.length || 0) +
            (fileContent.match(/\bswitch\b/g)?.length || 0) +
            (fileContent.match(/\bcase\b/g)?.length || 0) +
            (fileContent.match(/\bcatch\b/g)?.length || 0) +
            (fileContent.match(/\?\?/g)?.length || 0) +
            (fileContent.match(/&&/g)?.length || 0) +
            (fileContent.match(/\|\|/g)?.length || 0) +
            (fileContent.match(/\?[^:]/g)?.length || 0); // Ternary
        const cyclomaticComplexity = branches + 1;
        // Count functions
        const functionPatterns = [
            /function\s+\w+/g,
            /\w+\s*=\s*(?:async\s*)?\(/g,
            /\w+\s*:\s*(?:async\s*)?\(/g,
            /(?:async\s+)?(?:public|private|protected)?\s+\w+\s*\([^)]*\)\s*[:{]/g,
        ];
        let functions = 0;
        for (const pattern of functionPatterns) {
            functions += (fileContent.match(pattern) || []).length;
        }
        // Deduplicate by rough estimate
        functions = Math.ceil(functions / 2);
        const avgFunctionSize = functions > 0 ? Math.round(nonEmptyLines / functions) : nonEmptyLines;
        return {
            file: filePath,
            lines: lines.length,
            nonEmptyLines,
            cyclomaticComplexity,
            functions,
            avgFunctionSize,
        };
    }
    catch {
        return { file: filePath, lines: 0, nonEmptyLines: 0, cyclomaticComplexity: 1, functions: 0, avgFunctionSize: 0 };
    }
}
/**
 * Analyze complexity of multiple files
 */
function analyzeFiles(files, maxFiles = 100) {
    return files.slice(0, maxFiles).map(f => analyzeFile(f));
}
/**
 * Check if complexity exceeds thresholds
 */
function exceedsThresholds(result, thresholds = exports.DEFAULT_THRESHOLDS) {
    return (result.cyclomaticComplexity > thresholds.complexity ||
        result.functions > thresholds.functions ||
        result.lines > thresholds.lines ||
        result.avgFunctionSize > thresholds.avgSize);
}
/**
 * Get complexity rating
 */
function getComplexityRating(complexity) {
    if (complexity <= 5)
        return 'low';
    if (complexity <= 10)
        return 'medium';
    if (complexity <= 20)
        return 'high';
    return 'critical';
}
/**
 * Filter files exceeding thresholds
 */
function filterComplex(results, thresholds = exports.DEFAULT_THRESHOLDS) {
    return results.filter(r => exceedsThresholds(r, thresholds));
}
exports.default = {
    DEFAULT_THRESHOLDS: exports.DEFAULT_THRESHOLDS,
    analyzeFile,
    analyzeFiles,
    exceedsThresholds,
    getComplexityRating,
    filterComplex,
};
dist/analysis/index.d.ts ADDED
@@ -0,0 +1,15 @@
/**
 * Analysis Module - Consolidated code analysis utilities
 *
 * Single source of truth for:
 * - Security scanning
 * - Complexity analysis
 * - Pattern extraction
 */
export * from './security';
export * from './complexity';
export * from './patterns';
export { default as security } from './security';
export { default as complexity } from './complexity';
export { default as patterns } from './patterns';
//# sourceMappingURL=index.d.ts.map
dist/analysis/index.d.ts.map ADDED
@@ -0,0 +1 @@
+ {"version":3,"file":"index.d.ts","sourceRoot":"","sources":["../../src/analysis/index.ts"],"names":[],"mappings":"AAAA;;;;;;;GAOG;AAEH,cAAc,YAAY,CAAC;AAC3B,cAAc,cAAc,CAAC;AAC7B,cAAc,YAAY,CAAC;AAG3B,OAAO,EAAE,OAAO,IAAI,QAAQ,EAAE,MAAM,YAAY,CAAC;AACjD,OAAO,EAAE,OAAO,IAAI,UAAU,EAAE,MAAM,cAAc,CAAC;AACrD,OAAO,EAAE,OAAO,IAAI,QAAQ,EAAE,MAAM,YAAY,CAAC"}
dist/analysis/index.js ADDED
@@ -0,0 +1,38 @@
"use strict";
/**
 * Analysis Module - Consolidated code analysis utilities
 *
 * Single source of truth for:
 * - Security scanning
 * - Complexity analysis
 * - Pattern extraction
 */
var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    var desc = Object.getOwnPropertyDescriptor(m, k);
    if (!desc || ("get" in desc ? !m.__esModule : desc.writable || desc.configurable)) {
        desc = { enumerable: true, get: function() { return m[k]; } };
    }
    Object.defineProperty(o, k2, desc);
}) : (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    o[k2] = m[k];
}));
var __exportStar = (this && this.__exportStar) || function(m, exports) {
    for (var p in m) if (p !== "default" && !Object.prototype.hasOwnProperty.call(exports, p)) __createBinding(exports, m, p);
};
var __importDefault = (this && this.__importDefault) || function (mod) {
    return (mod && mod.__esModule) ? mod : { "default": mod };
};
Object.defineProperty(exports, "__esModule", { value: true });
exports.patterns = exports.complexity = exports.security = void 0;
__exportStar(require("./security"), exports);
__exportStar(require("./complexity"), exports);
__exportStar(require("./patterns"), exports);
// Re-export defaults for convenience
var security_1 = require("./security");
Object.defineProperty(exports, "security", { enumerable: true, get: function () { return __importDefault(security_1).default; } });
var complexity_1 = require("./complexity");
Object.defineProperty(exports, "complexity", { enumerable: true, get: function () { return __importDefault(complexity_1).default; } });
var patterns_1 = require("./patterns");
Object.defineProperty(exports, "patterns", { enumerable: true, get: function () { return __importDefault(patterns_1).default; } });
dist/analysis/patterns.d.ts ADDED
@@ -0,0 +1,71 @@
/**
 * Pattern Extraction Module - Consolidated code pattern detection
 *
 * Single source of truth for extracting functions, imports, exports, etc.
 * Used by native-worker.ts and parallel-workers.ts
 */
export interface PatternMatch {
    type: 'function' | 'class' | 'import' | 'export' | 'todo' | 'variable' | 'type';
    match: string;
    file: string;
    line?: number;
}
export interface FilePatterns {
    file: string;
    language: string;
    functions: string[];
    classes: string[];
    imports: string[];
    exports: string[];
    todos: string[];
    variables: string[];
}
/**
 * Detect language from file extension
 */
export declare function detectLanguage(file: string): string;
/**
 * Extract function names from content
 */
export declare function extractFunctions(content: string): string[];
/**
 * Extract class names from content
 */
export declare function extractClasses(content: string): string[];
/**
 * Extract import statements from content
 */
export declare function extractImports(content: string): string[];
/**
 * Extract export statements from content
 */
export declare function extractExports(content: string): string[];
/**
 * Extract TODO/FIXME comments from content
 */
export declare function extractTodos(content: string): string[];
/**
 * Extract all patterns from a file
 */
export declare function extractAllPatterns(filePath: string, content?: string): FilePatterns;
/**
 * Extract patterns from multiple files
 */
export declare function extractFromFiles(files: string[], maxFiles?: number): FilePatterns[];
/**
 * Convert FilePatterns to PatternMatch array (for native-worker compatibility)
 */
export declare function toPatternMatches(patterns: FilePatterns): PatternMatch[];
declare const _default: {
    detectLanguage: typeof detectLanguage;
    extractFunctions: typeof extractFunctions;
    extractClasses: typeof extractClasses;
    extractImports: typeof extractImports;
    extractExports: typeof extractExports;
    extractTodos: typeof extractTodos;
    extractAllPatterns: typeof extractAllPatterns;
    extractFromFiles: typeof extractFromFiles;
    toPatternMatches: typeof toPatternMatches;
};
export default _default;
//# sourceMappingURL=patterns.d.ts.map
dist/analysis/patterns.d.ts.map ADDED
@@ -0,0 +1 @@
+ {"version":3,"file":"patterns.d.ts","sourceRoot":"","sources":["../../src/analysis/patterns.ts"],"names":[],"mappings":"AAAA;;;;;GAKG;AAIH,MAAM,WAAW,YAAY;IAC3B,IAAI,EAAE,UAAU,GAAG,OAAO,GAAG,QAAQ,GAAG,QAAQ,GAAG,MAAM,GAAG,UAAU,GAAG,MAAM,CAAC;IAChF,KAAK,EAAE,MAAM,CAAC;IACd,IAAI,EAAE,MAAM,CAAC;IACb,IAAI,CAAC,EAAE,MAAM,CAAC;CACf;AAED,MAAM,WAAW,YAAY;IAC3B,IAAI,EAAE,MAAM,CAAC;IACb,QAAQ,EAAE,MAAM,CAAC;IACjB,SAAS,EAAE,MAAM,EAAE,CAAC;IACpB,OAAO,EAAE,MAAM,EAAE,CAAC;IAClB,OAAO,EAAE,MAAM,EAAE,CAAC;IAClB,OAAO,EAAE,MAAM,EAAE,CAAC;IAClB,KAAK,EAAE,MAAM,EAAE,CAAC;IAChB,SAAS,EAAE,MAAM,EAAE,CAAC;CACrB;AAED;;GAEG;AACH,wBAAgB,cAAc,CAAC,IAAI,EAAE,MAAM,GAAG,MAAM,CAUnD;AAED;;GAEG;AACH,wBAAgB,gBAAgB,CAAC,OAAO,EAAE,MAAM,GAAG,MAAM,EAAE,CA2B1D;AAED;;GAEG;AACH,wBAAgB,cAAc,CAAC,OAAO,EAAE,MAAM,GAAG,MAAM,EAAE,CAmBxD;AAED;;GAEG;AACH,wBAAgB,cAAc,CAAC,OAAO,EAAE,MAAM,GAAG,MAAM,EAAE,CAmBxD;AAED;;GAEG;AACH,wBAAgB,cAAc,CAAC,OAAO,EAAE,MAAM,GAAG,MAAM,EAAE,CAuBxD;AAED;;GAEG;AACH,wBAAgB,YAAY,CAAC,OAAO,EAAE,MAAM,GAAG,MAAM,EAAE,CAUtD;AAED;;GAEG;AACH,wBAAgB,kBAAkB,CAAC,QAAQ,EAAE,MAAM,EAAE,OAAO,CAAC,EAAE,MAAM,GAAG,YAAY,CA0BnF;AAED;;GAEG;AACH,wBAAgB,gBAAgB,CAAC,KAAK,EAAE,MAAM,EAAE,EAAE,QAAQ,GAAE,MAAY,GAAG,YAAY,EAAE,CAExF;AAED;;GAEG;AACH,wBAAgB,gBAAgB,CAAC,QAAQ,EAAE,YAAY,GAAG,YAAY,EAAE,CAoBvE;;;;;;;;;;;;AAED,wBAUE"}
dist/analysis/patterns.js ADDED
@@ -0,0 +1,243 @@
"use strict";
/**
 * Pattern Extraction Module - Consolidated code pattern detection
 *
 * Single source of truth for extracting functions, imports, exports, etc.
 * Used by native-worker.ts and parallel-workers.ts
 */
var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    var desc = Object.getOwnPropertyDescriptor(m, k);
    if (!desc || ("get" in desc ? !m.__esModule : desc.writable || desc.configurable)) {
        desc = { enumerable: true, get: function() { return m[k]; } };
    }
    Object.defineProperty(o, k2, desc);
}) : (function(o, m, k, k2) {
    if (k2 === undefined) k2 = k;
    o[k2] = m[k];
}));
var __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {
    Object.defineProperty(o, "default", { enumerable: true, value: v });
}) : function(o, v) {
    o["default"] = v;
});
var __importStar = (this && this.__importStar) || (function () {
    var ownKeys = function(o) {
        ownKeys = Object.getOwnPropertyNames || function (o) {
            var ar = [];
            for (var k in o) if (Object.prototype.hasOwnProperty.call(o, k)) ar[ar.length] = k;
            return ar;
        };
        return ownKeys(o);
    };
    return function (mod) {
        if (mod && mod.__esModule) return mod;
        var result = {};
        if (mod != null) for (var k = ownKeys(mod), i = 0; i < k.length; i++) if (k[i] !== "default") __createBinding(result, mod, k[i]);
        __setModuleDefault(result, mod);
        return result;
    };
})();
Object.defineProperty(exports, "__esModule", { value: true });
exports.detectLanguage = detectLanguage;
exports.extractFunctions = extractFunctions;
exports.extractClasses = extractClasses;
exports.extractImports = extractImports;
exports.extractExports = extractExports;
exports.extractTodos = extractTodos;
exports.extractAllPatterns = extractAllPatterns;
exports.extractFromFiles = extractFromFiles;
exports.toPatternMatches = toPatternMatches;
const fs = __importStar(require("fs"));
/**
 * Detect language from file extension
 */
function detectLanguage(file) {
    const ext = file.split('.').pop()?.toLowerCase() || '';
    const langMap = {
        ts: 'typescript', tsx: 'typescript', js: 'javascript', jsx: 'javascript',
        rs: 'rust', py: 'python', go: 'go', java: 'java', rb: 'ruby',
        cpp: 'cpp', c: 'c', h: 'c', hpp: 'cpp', cs: 'csharp',
        md: 'markdown', json: 'json', yaml: 'yaml', yml: 'yaml',
        sql: 'sql', sh: 'shell', bash: 'shell', zsh: 'shell',
    };
    return langMap[ext] || ext || 'unknown';
}
/**
 * Extract function names from content
 */
function extractFunctions(content) {
    const patterns = [
        /function\s+(\w+)/g,
        /const\s+(\w+)\s*=\s*(?:async\s*)?\([^)]*\)\s*=>/g,
        /let\s+(\w+)\s*=\s*(?:async\s*)?\([^)]*\)\s*=>/g,
        /(?:async\s+)?(?:public|private|protected)?\s+(\w+)\s*\([^)]*\)\s*[:{]/g,
        /(\w+)\s*:\s*(?:async\s*)?\([^)]*\)\s*=>/g,
        /def\s+(\w+)\s*\(/g, // Python
        /fn\s+(\w+)\s*[<(]/g, // Rust
        /func\s+(\w+)\s*\(/g, // Go
    ];
    const funcs = new Set();
    const reserved = new Set(['if', 'for', 'while', 'switch', 'catch', 'try', 'else', 'return', 'new', 'class', 'function', 'async', 'await']);
    for (const pattern of patterns) {
        const regex = new RegExp(pattern.source, pattern.flags);
        let match;
        while ((match = regex.exec(content)) !== null) {
            const name = match[1];
            if (name && !reserved.has(name) && name.length > 1) {
                funcs.add(name);
            }
        }
    }
    return Array.from(funcs);
}
/**
 * Extract class names from content
 */
function extractClasses(content) {
    const patterns = [
        /class\s+(\w+)/g,
        /interface\s+(\w+)/g,
101
+ /type\s+(\w+)\s*=/g,
102
+ /enum\s+(\w+)/g,
103
+ /struct\s+(\w+)/g,
104
+ ];
105
+ const classes = new Set();
106
+ for (const pattern of patterns) {
107
+ const regex = new RegExp(pattern.source, pattern.flags);
108
+ let match;
109
+ while ((match = regex.exec(content)) !== null) {
110
+ if (match[1])
111
+ classes.add(match[1]);
112
+ }
113
+ }
114
+ return Array.from(classes);
115
+ }
116
+ /**
117
+ * Extract import statements from content
118
+ */
119
+ function extractImports(content) {
120
+ const patterns = [
121
+ /import\s+.*?from\s+['"]([^'"]+)['"]/g,
122
+ /import\s+['"]([^'"]+)['"]/g,
123
+ /require\s*\(['"]([^'"]+)['"]\)/g,
124
+ /from\s+(\w+)\s+import/g, // Python
125
+ /use\s+(\w+(?:::\w+)*)/g, // Rust
126
+ ];
127
+ const imports = [];
128
+ for (const pattern of patterns) {
129
+ const regex = new RegExp(pattern.source, pattern.flags);
130
+ let match;
131
+ while ((match = regex.exec(content)) !== null) {
132
+ if (match[1])
133
+ imports.push(match[1]);
134
+ }
135
+ }
136
+ return [...new Set(imports)];
137
+ }
138
+ /**
139
+ * Extract export statements from content
140
+ */
141
+ function extractExports(content) {
142
+ const patterns = [
143
+ /export\s+(?:default\s+)?(?:class|function|const|let|var|interface|type|enum)\s+(\w+)/g,
144
+ /export\s*\{\s*([^}]+)\s*\}/g,
145
+ /module\.exports\s*=\s*(\w+)/g,
146
+ /exports\.(\w+)\s*=/g,
147
+ /pub\s+(?:fn|struct|enum|type)\s+(\w+)/g, // Rust
148
+ ];
149
+ const exports = [];
150
+ for (const pattern of patterns) {
151
+ const regex = new RegExp(pattern.source, pattern.flags);
152
+ let match;
153
+ while ((match = regex.exec(content)) !== null) {
154
+ if (match[1]) {
155
+ // Handle grouped exports: export { a, b, c }
156
+ const names = match[1].split(',').map(s => s.trim().split(/\s+as\s+/)[0].trim());
157
+ exports.push(...names.filter(n => n && /^\w+$/.test(n)));
158
+ }
159
+ }
160
+ }
161
+ return [...new Set(exports)];
162
+ }
163
+ /**
164
+ * Extract TODO/FIXME comments from content
165
+ */
166
+ function extractTodos(content) {
167
+ const pattern = /\/\/\s*(TODO|FIXME|HACK|XXX|BUG|NOTE):\s*(.+)/gi;
168
+ const todos = [];
169
+ let match;
170
+ while ((match = pattern.exec(content)) !== null) {
171
+ todos.push(`${match[1]}: ${match[2].trim()}`);
172
+ }
173
+ return todos;
174
+ }
175
+ /**
176
+ * Extract all patterns from a file
177
+ */
178
+ function extractAllPatterns(filePath, content) {
179
+ try {
180
+ const fileContent = content ?? (fs.existsSync(filePath) ? fs.readFileSync(filePath, 'utf-8') : '');
181
+ return {
182
+ file: filePath,
183
+ language: detectLanguage(filePath),
184
+ functions: extractFunctions(fileContent),
185
+ classes: extractClasses(fileContent),
186
+ imports: extractImports(fileContent),
187
+ exports: extractExports(fileContent),
188
+ todos: extractTodos(fileContent),
189
+ variables: [], // Could add variable extraction if needed
190
+ };
191
+ }
192
+ catch {
193
+ return {
194
+ file: filePath,
195
+ language: detectLanguage(filePath),
196
+ functions: [],
197
+ classes: [],
198
+ imports: [],
199
+ exports: [],
200
+ todos: [],
201
+ variables: [],
202
+ };
203
+ }
204
+ }
205
+ /**
206
+ * Extract patterns from multiple files
207
+ */
208
+ function extractFromFiles(files, maxFiles = 100) {
209
+ return files.slice(0, maxFiles).map(f => extractAllPatterns(f));
210
+ }
211
+ /**
212
+ * Convert FilePatterns to PatternMatch array (for native-worker compatibility)
213
+ */
214
+ function toPatternMatches(patterns) {
215
+ const matches = [];
216
+ for (const func of patterns.functions) {
217
+ matches.push({ type: 'function', match: func, file: patterns.file });
218
+ }
219
+ for (const cls of patterns.classes) {
220
+ matches.push({ type: 'class', match: cls, file: patterns.file });
221
+ }
222
+ for (const imp of patterns.imports) {
223
+ matches.push({ type: 'import', match: imp, file: patterns.file });
224
+ }
225
+ for (const exp of patterns.exports) {
226
+ matches.push({ type: 'export', match: exp, file: patterns.file });
227
+ }
228
+ for (const todo of patterns.todos) {
229
+ matches.push({ type: 'todo', match: todo, file: patterns.file });
230
+ }
231
+ return matches;
232
+ }
233
+ exports.default = {
234
+ detectLanguage,
235
+ extractFunctions,
236
+ extractClasses,
237
+ extractImports,
238
+ extractExports,
239
+ extractTodos,
240
+ extractAllPatterns,
241
+ extractFromFiles,
242
+ toPatternMatches,
243
+ };
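The extractor above clones each regex (`new RegExp(pattern.source, pattern.flags)`) so `lastIndex` starts fresh per file, and filters reserved words before adding a name. A standalone sketch of that behavior (the `extractFunctionsDemo` name and the trimmed-down pattern list are illustrative, not part of the package):

```javascript
// Illustrative mirror of the extractFunctions logic in dist/analysis/patterns.js:
// clone each global regex, scan all matches, skip reserved words and 1-char names.
const patterns = [
  /function\s+(\w+)/g,
  /const\s+(\w+)\s*=\s*(?:async\s*)?\([^)]*\)\s*=>/g,
  /def\s+(\w+)\s*\(/g, // Python
];
const reserved = new Set(['if', 'for', 'while', 'return', 'function']);

function extractFunctionsDemo(content) {
  const funcs = new Set();
  for (const pattern of patterns) {
    // Fresh RegExp so a stateful global regex never carries lastIndex across files
    const regex = new RegExp(pattern.source, pattern.flags);
    let match;
    while ((match = regex.exec(content)) !== null) {
      const name = match[1];
      if (name && !reserved.has(name) && name.length > 1) funcs.add(name);
    }
  }
  return Array.from(funcs);
}

const names = extractFunctionsDemo('function alpha() {}\nconst beta = async () => {};');
// names ≈ ['alpha', 'beta'] — one hit per declaration style, deduplicated via Set
```

Cloning before `exec` is the important detail: reusing the shared pattern objects directly would resume scanning from the previous file's `lastIndex` and silently drop matches.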
dist/analysis/security.d.ts ADDED
@@ -0,0 +1,51 @@
1
+ /**
2
+ * Security Analysis Module - Consolidated security scanning
3
+ *
4
+ * Single source of truth for security patterns and vulnerability detection.
5
+ * Used by native-worker.ts and parallel-workers.ts
6
+ */
7
+ export interface SecurityPattern {
8
+ pattern: RegExp;
9
+ rule: string;
10
+ severity: 'low' | 'medium' | 'high' | 'critical';
11
+ message: string;
12
+ suggestion?: string;
13
+ }
14
+ export interface SecurityFinding {
15
+ file: string;
16
+ line: number;
17
+ severity: 'low' | 'medium' | 'high' | 'critical';
18
+ rule: string;
19
+ message: string;
20
+ match?: string;
21
+ suggestion?: string;
22
+ }
23
+ /**
24
+ * Default security patterns for vulnerability detection
25
+ */
26
+ export declare const SECURITY_PATTERNS: SecurityPattern[];
27
+ /**
28
+ * Scan a single file for security issues
29
+ */
30
+ export declare function scanFile(filePath: string, content?: string, patterns?: SecurityPattern[]): SecurityFinding[];
31
+ /**
32
+ * Scan multiple files for security issues
33
+ */
34
+ export declare function scanFiles(files: string[], patterns?: SecurityPattern[], maxFiles?: number): SecurityFinding[];
35
+ /**
36
+ * Get severity score (for sorting/filtering)
37
+ */
38
+ export declare function getSeverityScore(severity: string): number;
39
+ /**
40
+ * Sort findings by severity (highest first)
41
+ */
42
+ export declare function sortBySeverity(findings: SecurityFinding[]): SecurityFinding[];
43
+ declare const _default: {
44
+ SECURITY_PATTERNS: SecurityPattern[];
45
+ scanFile: typeof scanFile;
46
+ scanFiles: typeof scanFiles;
47
+ getSeverityScore: typeof getSeverityScore;
48
+ sortBySeverity: typeof sortBySeverity;
49
+ };
50
+ export default _default;
51
+ //# sourceMappingURL=security.d.ts.map
dist/analysis/security.d.ts.map ADDED
@@ -0,0 +1 @@
1
+ {"version":3,"file":"security.d.ts","sourceRoot":"","sources":["../../src/analysis/security.ts"],"names":[],"mappings":"AAAA;;;;;GAKG;AAIH,MAAM,WAAW,eAAe;IAC9B,OAAO,EAAE,MAAM,CAAC;IAChB,IAAI,EAAE,MAAM,CAAC;IACb,QAAQ,EAAE,KAAK,GAAG,QAAQ,GAAG,MAAM,GAAG,UAAU,CAAC;IACjD,OAAO,EAAE,MAAM,CAAC;IAChB,UAAU,CAAC,EAAE,MAAM,CAAC;CACrB;AAED,MAAM,WAAW,eAAe;IAC9B,IAAI,EAAE,MAAM,CAAC;IACb,IAAI,EAAE,MAAM,CAAC;IACb,QAAQ,EAAE,KAAK,GAAG,QAAQ,GAAG,MAAM,GAAG,UAAU,CAAC;IACjD,IAAI,EAAE,MAAM,CAAC;IACb,OAAO,EAAE,MAAM,CAAC;IAChB,KAAK,CAAC,EAAE,MAAM,CAAC;IACf,UAAU,CAAC,EAAE,MAAM,CAAC;CACrB;AAED;;GAEG;AACH,eAAO,MAAM,iBAAiB,EAAE,eAAe,EA0B9C,CAAC;AAEF;;GAEG;AACH,wBAAgB,QAAQ,CACtB,QAAQ,EAAE,MAAM,EAChB,OAAO,CAAC,EAAE,MAAM,EAChB,QAAQ,GAAE,eAAe,EAAsB,GAC9C,eAAe,EAAE,CA4BnB;AAED;;GAEG;AACH,wBAAgB,SAAS,CACvB,KAAK,EAAE,MAAM,EAAE,EACf,QAAQ,GAAE,eAAe,EAAsB,EAC/C,QAAQ,GAAE,MAAY,GACrB,eAAe,EAAE,CAQnB;AAED;;GAEG;AACH,wBAAgB,gBAAgB,CAAC,QAAQ,EAAE,MAAM,GAAG,MAAM,CAQzD;AAED;;GAEG;AACH,wBAAgB,cAAc,CAAC,QAAQ,EAAE,eAAe,EAAE,GAAG,eAAe,EAAE,CAE7E;;;;;;;;AAED,wBAME"}
dist/analysis/security.js ADDED
@@ -0,0 +1,139 @@
1
+ "use strict";
2
+ /**
3
+ * Security Analysis Module - Consolidated security scanning
4
+ *
5
+ * Single source of truth for security patterns and vulnerability detection.
6
+ * Used by native-worker.ts and parallel-workers.ts
7
+ */
8
+ var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
9
+ if (k2 === undefined) k2 = k;
10
+ var desc = Object.getOwnPropertyDescriptor(m, k);
11
+ if (!desc || ("get" in desc ? !m.__esModule : desc.writable || desc.configurable)) {
12
+ desc = { enumerable: true, get: function() { return m[k]; } };
13
+ }
14
+ Object.defineProperty(o, k2, desc);
15
+ }) : (function(o, m, k, k2) {
16
+ if (k2 === undefined) k2 = k;
17
+ o[k2] = m[k];
18
+ }));
19
+ var __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {
20
+ Object.defineProperty(o, "default", { enumerable: true, value: v });
21
+ }) : function(o, v) {
22
+ o["default"] = v;
23
+ });
24
+ var __importStar = (this && this.__importStar) || (function () {
25
+ var ownKeys = function(o) {
26
+ ownKeys = Object.getOwnPropertyNames || function (o) {
27
+ var ar = [];
28
+ for (var k in o) if (Object.prototype.hasOwnProperty.call(o, k)) ar[ar.length] = k;
29
+ return ar;
30
+ };
31
+ return ownKeys(o);
32
+ };
33
+ return function (mod) {
34
+ if (mod && mod.__esModule) return mod;
35
+ var result = {};
36
+ if (mod != null) for (var k = ownKeys(mod), i = 0; i < k.length; i++) if (k[i] !== "default") __createBinding(result, mod, k[i]);
37
+ __setModuleDefault(result, mod);
38
+ return result;
39
+ };
40
+ })();
41
+ Object.defineProperty(exports, "__esModule", { value: true });
42
+ exports.SECURITY_PATTERNS = void 0;
43
+ exports.scanFile = scanFile;
44
+ exports.scanFiles = scanFiles;
45
+ exports.getSeverityScore = getSeverityScore;
46
+ exports.sortBySeverity = sortBySeverity;
47
+ const fs = __importStar(require("fs"));
48
+ /**
49
+ * Default security patterns for vulnerability detection
50
+ */
51
+ exports.SECURITY_PATTERNS = [
52
+ // Critical: Hardcoded secrets
53
+ { pattern: /password\s*=\s*['"][^'"]+['"]/gi, rule: 'no-hardcoded-password', severity: 'critical', message: 'Hardcoded password detected', suggestion: 'Use environment variables or secret management' },
54
+ { pattern: /api[_-]?key\s*=\s*['"][^'"]+['"]/gi, rule: 'no-hardcoded-apikey', severity: 'critical', message: 'Hardcoded API key detected', suggestion: 'Use environment variables' },
55
+ { pattern: /secret\s*=\s*['"][^'"]+['"]/gi, rule: 'no-hardcoded-secret', severity: 'critical', message: 'Hardcoded secret detected', suggestion: 'Use environment variables or secret management' },
56
+ { pattern: /private[_-]?key\s*=\s*['"][^'"]+['"]/gi, rule: 'no-hardcoded-private-key', severity: 'critical', message: 'Hardcoded private key detected', suggestion: 'Use secure key management' },
57
+ // High: Code execution risks
58
+ { pattern: /eval\s*\(/g, rule: 'no-eval', severity: 'high', message: 'Avoid eval() - code injection risk', suggestion: 'Use safer alternatives like JSON.parse()' },
59
+ { pattern: /exec\s*\(/g, rule: 'no-exec', severity: 'high', message: 'Avoid exec() - command injection risk', suggestion: 'Use execFile or spawn with args array' },
60
+ { pattern: /Function\s*\(/g, rule: 'no-function-constructor', severity: 'high', message: 'Avoid Function constructor - code injection risk' },
61
+ { pattern: /child_process.*exec\(/g, rule: 'no-shell-exec', severity: 'high', message: 'Shell execution detected', suggestion: 'Use execFile or spawn instead' },
62
+ // High: SQL injection
63
+ { pattern: /SELECT\s+.*\s+FROM.*\+/gi, rule: 'sql-injection-risk', severity: 'high', message: 'Potential SQL injection - string concatenation in query', suggestion: 'Use parameterized queries' },
64
+ { pattern: /`SELECT.*\$\{/gi, rule: 'sql-injection-template', severity: 'high', message: 'Template literal in SQL query', suggestion: 'Use parameterized queries' },
65
+ // Medium: XSS risks
66
+ { pattern: /dangerouslySetInnerHTML/g, rule: 'xss-risk', severity: 'medium', message: 'XSS risk: dangerouslySetInnerHTML', suggestion: 'Sanitize content before rendering' },
67
+ { pattern: /innerHTML\s*=/g, rule: 'no-inner-html', severity: 'medium', message: 'Avoid innerHTML - XSS risk', suggestion: 'Use textContent or sanitize content' },
68
+ { pattern: /document\.write\s*\(/g, rule: 'no-document-write', severity: 'medium', message: 'Avoid document.write - XSS risk' },
69
+ // Medium: Other risks
70
+ { pattern: /\$\{.*\}/g, rule: 'template-injection', severity: 'low', message: 'Template literal detected - verify no injection' },
71
+ { pattern: /new\s+RegExp\s*\([^)]*\+/g, rule: 'regex-injection', severity: 'medium', message: 'Dynamic RegExp - potential ReDoS risk', suggestion: 'Validate/sanitize regex input' },
72
+ { pattern: /\.on\s*\(\s*['"]error['"]/g, rule: 'unhandled-error', severity: 'low', message: 'Error handler detected - verify proper error handling' },
73
+ ];
74
+ /**
75
+ * Scan a single file for security issues
76
+ */
77
+ function scanFile(filePath, content, patterns = exports.SECURITY_PATTERNS) {
78
+ const findings = [];
79
+ try {
80
+ const fileContent = content ?? (fs.existsSync(filePath) ? fs.readFileSync(filePath, 'utf-8') : '');
81
+ if (!fileContent)
82
+ return findings;
83
+ for (const { pattern, rule, severity, message, suggestion } of patterns) {
84
+ const regex = new RegExp(pattern.source, pattern.flags);
85
+ let match;
86
+ while ((match = regex.exec(fileContent)) !== null) {
87
+ const lineNum = fileContent.slice(0, match.index).split('\n').length;
88
+ findings.push({
89
+ file: filePath,
90
+ line: lineNum,
91
+ severity,
92
+ rule,
93
+ message,
94
+ match: match[0].slice(0, 50),
95
+ suggestion,
96
+ });
97
+ }
98
+ }
99
+ }
100
+ catch {
101
+ // Skip unreadable files
102
+ }
103
+ return findings;
104
+ }
105
+ /**
106
+ * Scan multiple files for security issues
107
+ */
108
+ function scanFiles(files, patterns = exports.SECURITY_PATTERNS, maxFiles = 100) {
109
+ const findings = [];
110
+ for (const file of files.slice(0, maxFiles)) {
111
+ findings.push(...scanFile(file, undefined, patterns));
112
+ }
113
+ return findings;
114
+ }
115
+ /**
116
+ * Get severity score (for sorting/filtering)
117
+ */
118
+ function getSeverityScore(severity) {
119
+ switch (severity) {
120
+ case 'critical': return 4;
121
+ case 'high': return 3;
122
+ case 'medium': return 2;
123
+ case 'low': return 1;
124
+ default: return 0;
125
+ }
126
+ }
127
+ /**
128
+ * Sort findings by severity (highest first)
129
+ */
130
+ function sortBySeverity(findings) {
131
+ return [...findings].sort((a, b) => getSeverityScore(b.severity) - getSeverityScore(a.severity));
132
+ }
133
+ exports.default = {
134
+ SECURITY_PATTERNS: exports.SECURITY_PATTERNS,
135
+ scanFile,
136
+ scanFiles,
137
+ getSeverityScore,
138
+ sortBySeverity,
139
+ };
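`scanFile` derives a 1-based line number for each hit with `content.slice(0, match.index).split('\n').length`, and truncates the matched text to 50 characters. A self-contained sketch of that arithmetic using one of the critical patterns (the `findLines` helper is illustrative, not an exported API):

```javascript
// Illustrative mirror of scanFile's line-number arithmetic in dist/analysis/security.js:
// the 1-based line of a match is the count of newline-split segments before match.index.
const SECRET = /password\s*=\s*['"][^'"]+['"]/gi; // one of the 'critical' patterns above

function findLines(content, pattern) {
  const regex = new RegExp(pattern.source, pattern.flags); // fresh lastIndex per scan
  const hits = [];
  let match;
  while ((match = regex.exec(content)) !== null) {
    hits.push({
      line: content.slice(0, match.index).split('\n').length, // 1-based line number
      match: match[0].slice(0, 50),                            // truncated like scanFile
    });
  }
  return hits;
}

const src = 'const ok = 1;\nconst password = "hunter2";\n';
const hits = findLines(src, SECRET);
// hits → one finding on line 2: the slice before the match contains exactly one '\n'
```

Splitting the prefix is O(match.index) per hit, which is fine at this scale; scanners over very large files typically precompute newline offsets instead.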
dist/core/adaptive-embedder.d.ts ADDED
@@ -0,0 +1,156 @@
1
+ /**
2
+ * AdaptiveEmbedder - Micro-LoRA Style Optimization for ONNX Embeddings
3
+ *
4
+ * Applies continual learning techniques to frozen ONNX embeddings:
5
+ *
6
+ * 1. MICRO-LORA ADAPTERS
7
+ * - Low-rank projection layers (rank 2-8) on top of frozen embeddings
8
+ * - Domain-specific fine-tuning with minimal parameters
9
+ * - ~0.1% of base model parameters
10
+ *
11
+ * 2. CONTRASTIVE LEARNING
12
+ * - Files edited together → embeddings closer
13
+ * - Semantic clustering from trajectories
14
+ * - Online learning from user behavior
15
+ *
16
+ * 3. EWC++ (Elastic Weight Consolidation)
17
+ * - Prevents catastrophic forgetting
18
+ * - Consolidates important adaptations
19
+ * - Fisher information regularization
20
+ *
21
+ * 4. MEMORY-AUGMENTED RETRIEVAL
22
+ * - Episodic memory for context-aware embeddings
23
+ * - Attention over past similar embeddings
24
+ * - Domain prototype learning
25
+ *
26
+ * Architecture:
27
+ * ONNX(text) → [frozen 384d] → LoRA_A → LoRA_B → [adapted 384d]
28
+ * (384×r) (r×384)
29
+ */
30
+ export interface AdaptiveConfig {
31
+ /** LoRA rank (lower = fewer params, higher = more expressive) */
32
+ loraRank?: number;
33
+ /** Learning rate for online updates */
34
+ learningRate?: number;
35
+ /** EWC regularization strength */
36
+ ewcLambda?: number;
37
+ /** Number of domain prototypes to maintain */
38
+ numPrototypes?: number;
39
+ /** Enable contrastive learning from co-edits */
40
+ contrastiveLearning?: boolean;
41
+ /** Temperature for contrastive loss */
42
+ contrastiveTemp?: number;
43
+ /** Memory capacity for episodic retrieval */
44
+ memoryCapacity?: number;
45
+ }
46
+ export interface LoRAWeights {
47
+ A: number[][];
48
+ B: number[][];
49
+ bias?: number[];
50
+ }
51
+ export interface DomainPrototype {
52
+ domain: string;
53
+ centroid: number[];
54
+ count: number;
55
+ variance: number;
56
+ }
57
+ export interface AdaptiveStats {
58
+ baseModel: string;
59
+ dimension: number;
60
+ loraRank: number;
61
+ loraParams: number;
62
+ adaptations: number;
63
+ prototypes: number;
64
+ memorySize: number;
65
+ ewcConsolidations: number;
66
+ contrastiveUpdates: number;
67
+ }
68
+ export declare class AdaptiveEmbedder {
69
+ private config;
70
+ private lora;
71
+ private prototypes;
72
+ private episodic;
73
+ private onnxReady;
74
+ private dimension;
75
+ private adaptationCount;
76
+ private ewcCount;
77
+ private contrastiveCount;
78
+ private coEditBuffer;
79
+ constructor(config?: AdaptiveConfig);
80
+ /**
81
+ * Initialize ONNX backend
82
+ */
83
+ init(): Promise<void>;
84
+ /**
85
+ * Generate adaptive embedding
86
+ * Pipeline: ONNX → LoRA → Prototype Adjustment → Episodic Augmentation
87
+ */
88
+ embed(text: string, options?: {
89
+ domain?: string;
90
+ useEpisodic?: boolean;
91
+ storeInMemory?: boolean;
92
+ }): Promise<number[]>;
93
+ /**
94
+ * Batch embed with adaptation
95
+ */
96
+ embedBatch(texts: string[], options?: {
97
+ domain?: string;
98
+ }): Promise<number[][]>;
99
+ /**
100
+ * Learn from co-edit pattern (contrastive learning)
101
+ * Files edited together should have similar embeddings
102
+ */
103
+ learnCoEdit(file1: string, content1: string, file2: string, content2: string): Promise<number>;
104
+ /**
105
+ * Process co-edit batch with contrastive loss
106
+ */
107
+ private processCoEditBatch;
108
+ /**
109
+ * Learn from trajectory outcome (reinforcement-like)
110
+ */
111
+ learnFromOutcome(context: string, action: string, success: boolean, quality?: number): Promise<void>;
112
+ /**
113
+ * EWC consolidation - prevent forgetting important adaptations
114
+ * OPTIMIZED: Works with Float32Array episodic entries
115
+ */
116
+ consolidate(): Promise<void>;
117
+ /**
118
+ * Fallback hash embedding
119
+ */
120
+ private hashEmbed;
121
+ private normalize;
122
+ /**
123
+ * Get statistics
124
+ */
125
+ getStats(): AdaptiveStats;
126
+ /**
127
+ * Export learned weights
128
+ */
129
+ export(): {
130
+ lora: LoRAWeights;
131
+ prototypes: DomainPrototype[];
132
+ stats: AdaptiveStats;
133
+ };
134
+ /**
135
+ * Import learned weights
136
+ */
137
+ import(data: {
138
+ lora?: LoRAWeights;
139
+ prototypes?: DomainPrototype[];
140
+ }): void;
141
+ /**
142
+ * Reset adaptations
143
+ */
144
+ reset(): void;
145
+ /**
146
+ * Get LoRA cache statistics
147
+ */
148
+ getCacheStats(): {
149
+ size: number;
150
+ maxSize: number;
151
+ };
152
+ }
153
+ export declare function getAdaptiveEmbedder(config?: AdaptiveConfig): AdaptiveEmbedder;
154
+ export declare function initAdaptiveEmbedder(config?: AdaptiveConfig): Promise<AdaptiveEmbedder>;
155
+ export default AdaptiveEmbedder;
156
+ //# sourceMappingURL=adaptive-embedder.d.ts.map
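The adapter pipeline documented above — frozen 384-d ONNX vector through LoRA_A (384×r) and LoRA_B (r×384) with a residual add — reduces to `output = input + scale * (input · A · B)`. A toy-sized sketch with flattened `Float32Array` matrices as in `MicroLoRA` (dim=4, rank=2; the constant fill values are illustrative, not the module's Xavier-style initialization):

```javascript
// Toy LoRA residual forward: output = input + scale * (input · A · B).
// Flattened row-major layout matches MicroLoRA: A[d * rank + r], B[r * dim + d].
const dim = 4, rank = 2, scale = 0.1;
const A = new Float32Array(dim * rank).fill(0.5); // down-projection, dim → rank
const B = new Float32Array(rank * dim).fill(0.1); // up-projection, rank → dim

function loraForward(input) {
  // hidden = input · A  (dim → rank)
  const hidden = new Float32Array(rank);
  for (let r = 0; r < rank; r++) {
    let sum = 0;
    for (let d = 0; d < dim; d++) sum += input[d] * A[d * rank + r];
    hidden[r] = sum;
  }
  // out = input + scale * (hidden · B)  (rank → dim, residual connection)
  const out = new Float32Array(dim);
  for (let d = 0; d < dim; d++) {
    let delta = 0;
    for (let r = 0; r < rank; r++) delta += hidden[r] * B[r * dim + d];
    out[d] = input[d] + scale * delta;
  }
  return out;
}

const out = loraForward(new Float32Array([1, 1, 1, 1]));
// hidden[r] = 4·0.5 = 2; delta = 2·(2·0.1) = 0.4; out[d] = 1 + 0.1·0.4 ≈ 1.04
```

Because `B` starts near zero in the real module, the adapter is initially an identity map and only drifts from the base embedding as updates accumulate — which is why the embedding cache must be cleared after every weight change.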
dist/core/adaptive-embedder.d.ts.map ADDED
@@ -0,0 +1 @@
1
+ {"version":3,"file":"adaptive-embedder.d.ts","sourceRoot":"","sources":["../../src/core/adaptive-embedder.ts"],"names":[],"mappings":"AAAA;;;;;;;;;;;;;;;;;;;;;;;;;;;;GA4BG;AAQH,MAAM,WAAW,cAAc;IAC7B,iEAAiE;IACjE,QAAQ,CAAC,EAAE,MAAM,CAAC;IAClB,uCAAuC;IACvC,YAAY,CAAC,EAAE,MAAM,CAAC;IACtB,kCAAkC;IAClC,SAAS,CAAC,EAAE,MAAM,CAAC;IACnB,8CAA8C;IAC9C,aAAa,CAAC,EAAE,MAAM,CAAC;IACvB,gDAAgD;IAChD,mBAAmB,CAAC,EAAE,OAAO,CAAC;IAC9B,uCAAuC;IACvC,eAAe,CAAC,EAAE,MAAM,CAAC;IACzB,6CAA6C;IAC7C,cAAc,CAAC,EAAE,MAAM,CAAC;CACzB;AAED,MAAM,WAAW,WAAW;IAC1B,CAAC,EAAE,MAAM,EAAE,EAAE,CAAC;IACd,CAAC,EAAE,MAAM,EAAE,EAAE,CAAC;IACd,IAAI,CAAC,EAAE,MAAM,EAAE,CAAC;CACjB;AAED,MAAM,WAAW,eAAe;IAC9B,MAAM,EAAE,MAAM,CAAC;IACf,QAAQ,EAAE,MAAM,EAAE,CAAC;IACnB,KAAK,EAAE,MAAM,CAAC;IACd,QAAQ,EAAE,MAAM,CAAC;CAClB;AAED,MAAM,WAAW,aAAa;IAC5B,SAAS,EAAE,MAAM,CAAC;IAClB,SAAS,EAAE,MAAM,CAAC;IAClB,QAAQ,EAAE,MAAM,CAAC;IACjB,UAAU,EAAE,MAAM,CAAC;IACnB,WAAW,EAAE,MAAM,CAAC;IACpB,UAAU,EAAE,MAAM,CAAC;IACnB,UAAU,EAAE,MAAM,CAAC;IACnB,iBAAiB,EAAE,MAAM,CAAC;IAC1B,kBAAkB,EAAE,MAAM,CAAC;CAC5B;AA8pBD,qBAAa,gBAAgB;IAC3B,OAAO,CAAC,MAAM,CAA2B;IACzC,OAAO,CAAC,IAAI,CAAY;IACxB,OAAO,CAAC,UAAU,CAAkB;IACpC,OAAO,CAAC,QAAQ,CAAiB;IACjC,OAAO,CAAC,SAAS,CAAkB;IACnC,OAAO,CAAC,SAAS,CAAe;IAGhC,OAAO,CAAC,eAAe,CAAa;IACpC,OAAO,CAAC,QAAQ,CAAa;IAC7B,OAAO,CAAC,gBAAgB,CAAa;IAGrC,OAAO,CAAC,YAAY,CAA+E;gBAEvF,MAAM,GAAE,cAAmB;IAiBvC;;OAEG;IACG,IAAI,IAAI,OAAO,CAAC,IAAI,CAAC;IAO3B;;;OAGG;IACG,KAAK,CAAC,IAAI,EAAE,MAAM,EAAE,OAAO,CAAC,EAAE;QAClC,MAAM,CAAC,EAAE,MAAM,CAAC;QAChB,WAAW,CAAC,EAAE,OAAO,CAAC;QACtB,aAAa,CAAC,EAAE,OAAO,CAAC;KACzB,GAAG,OAAO,CAAC,MAAM,EAAE,CAAC;IAmCrB;;OAEG;IACG,UAAU,CAAC,KAAK,EAAE,MAAM,EAAE,EAAE,OAAO,CAAC,EAAE;QAC1C,MAAM,CAAC,EAAE,MAAM,CAAC;KACjB,GAAG,OAAO,CAAC,MAAM,EAAE,EAAE,CAAC;IAsBvB;;;OAGG;IACG,WAAW,CAAC,KAAK,EAAE,MAAM,EAAE,QAAQ,EAAE,MAAM,EAAE,KAAK,EAAE,MAAM,EAAE,QAAQ,EAAE,MAAM,GAAG,OAAO,CAAC,MAAM,CAAC;IAkBpG;;OAEG;IACH,OAAO,CAAC,kBAAkB;IA+B1B;;OAEG;IACG,gBAAgB,CACpB,OAAO,EAAE,MAAM,EACf,MAAM,EAAE,MAAM,EACd,OAAO,EAAE,OAAO,EAChB,OAAO,GAAE,MAAY,GACpB,OAAO,CAAC,IAAI,CAAC;IAiBhB;;;OAGG;IACG,WAAW,IAAI,OAAO,CAAC,IAAI,CAAC;IAmBlC;;OAEG;IACH,OAAO,CAAC,SAAS;IAoBjB,OAAO,CAAC,SAAS;IAKjB;;OAEG;IACH,QAAQ,IAAI,aAAa;IAczB;;OAEG;IACH,MAAM,IAAI;QACR,IAAI,EAAE,WAAW,CAAC;QAClB,UAAU,EAAE,eAAe,EAAE,CAAC;QAC9B,KAAK,EAAE,aAAa,CAAC;KACtB;IAQD;;OAEG;IACH,MAAM,CAAC,IAAI,EAAE;QAAE,IAAI,CAAC,EAAE,WAAW,CAAC;QAAC,UAAU,CAAC,EAAE,eAAe,EAAE,CAAA;KAAE,GAAG,IAAI;IAS1E;;OAEG;IACH,KAAK,IAAI,IAAI;IAUb;;OAEG;IACH,aAAa,IAAI;QAAE,IAAI,EAAE,MAAM,CAAC;QAAC,OAAO,EAAE,MAAM,CAAA;KAAE;CAGnD;AAQD,wBAAgB,mBAAmB,CAAC,MAAM,CAAC,EAAE,cAAc,GAAG,gBAAgB,CAK7E;AAED,wBAAsB,oBAAoB,CAAC,MAAM,CAAC,EAAE,cAAc,GAAG,OAAO,CAAC,gBAAgB,CAAC,CAI7F;AAED,eAAe,gBAAgB,CAAC"}
dist/core/adaptive-embedder.js ADDED
@@ -0,0 +1,837 @@
1
+ "use strict";
2
+ /**
3
+ * AdaptiveEmbedder - Micro-LoRA Style Optimization for ONNX Embeddings
4
+ *
5
+ * Applies continual learning techniques to frozen ONNX embeddings:
6
+ *
7
+ * 1. MICRO-LORA ADAPTERS
8
+ * - Low-rank projection layers (rank 2-8) on top of frozen embeddings
9
+ * - Domain-specific fine-tuning with minimal parameters
10
+ * - ~0.1% of base model parameters
11
+ *
12
+ * 2. CONTRASTIVE LEARNING
13
+ * - Files edited together → embeddings closer
14
+ * - Semantic clustering from trajectories
15
+ * - Online learning from user behavior
16
+ *
17
+ * 3. EWC++ (Elastic Weight Consolidation)
18
+ * - Prevents catastrophic forgetting
19
+ * - Consolidates important adaptations
20
+ * - Fisher information regularization
21
+ *
22
+ * 4. MEMORY-AUGMENTED RETRIEVAL
23
+ * - Episodic memory for context-aware embeddings
24
+ * - Attention over past similar embeddings
25
+ * - Domain prototype learning
26
+ *
27
+ * Architecture:
28
+ * ONNX(text) → [frozen 384d] → LoRA_A → LoRA_B → [adapted 384d]
29
+ * (384×r) (r×384)
30
+ */
31
+ Object.defineProperty(exports, "__esModule", { value: true });
32
+ exports.AdaptiveEmbedder = void 0;
33
+ exports.getAdaptiveEmbedder = getAdaptiveEmbedder;
34
+ exports.initAdaptiveEmbedder = initAdaptiveEmbedder;
35
+ const onnx_embedder_1 = require("./onnx-embedder");
36
+ // ============================================================================
37
+ // Optimized Micro-LoRA Layer with Float32Array and Caching
38
+ // ============================================================================
39
+ /**
40
+ * Low-rank adaptation layer for embeddings (OPTIMIZED)
41
+ * Implements: output = input + scale * (input @ A @ B)
42
+ *
43
+ * Optimizations:
44
+ * - Float32Array for 2-3x faster math operations
45
+ * - Flattened matrices for cache-friendly access
46
+ * - Pre-allocated buffers to avoid GC pressure
47
+ * - LRU embedding cache for repeated inputs
48
+ */
49
+ class MicroLoRA {
50
+ constructor(dim, rank, scale = 0.1) {
51
+ // EWC Fisher information (importance weights)
52
+ this.fisherA = null;
53
+ this.fisherB = null;
54
+ this.savedA = null;
55
+ this.savedB = null;
56
+ // LRU cache for repeated embeddings (key: hash, value: output)
57
+ this.cache = new Map();
58
+ this.cacheMaxSize = 256;
59
+ this.dim = dim;
60
+ this.rank = rank;
61
+ this.scale = scale;
62
+ // Initialize with small random values (Xavier-like)
63
+ const stdA = Math.sqrt(2 / (dim + rank));
64
+ const stdB = Math.sqrt(2 / (rank + dim)) * 0.01; // B starts near zero
65
+ this.A = this.initFlatMatrix(dim, rank, stdA);
66
+ this.B = this.initFlatMatrix(rank, dim, stdB);
67
+ // Pre-allocate buffers
68
+ this.hiddenBuffer = new Float32Array(rank);
69
+ this.outputBuffer = new Float32Array(dim);
70
+ }
71
+ initFlatMatrix(rows, cols, std) {
72
+ const arr = new Float32Array(rows * cols);
73
+ for (let i = 0; i < arr.length; i++) {
74
+ arr[i] = (Math.random() - 0.5) * 2 * std;
75
+ }
76
+ return arr;
77
+ }
78
+ /**
79
+ * Fast hash for cache key (FNV-1a variant)
80
+ */
81
+ hashInput(input) {
82
+ let h = 2166136261;
83
+ const len = Math.min(input.length, 32); // Sample first 32 for speed
84
+ for (let i = 0; i < len; i++) {
85
+ h ^= Math.floor(input[i] * 10000);
86
+ h = Math.imul(h, 16777619);
87
+ }
88
+ return h.toString(36);
89
+ }
90
+ /**
91
+ * Forward pass: input + scale * (input @ A @ B)
92
+ * OPTIMIZED with Float32Array and loop unrolling
93
+ */
94
+ forward(input) {
95
+ // Check cache first
96
+ const cacheKey = this.hashInput(input);
97
+ const cached = this.cache.get(cacheKey);
98
+ if (cached) {
99
+ return Array.from(cached);
100
+ }
101
+ // Zero the hidden buffer
102
+ this.hiddenBuffer.fill(0);
103
+ // Compute input @ A (dim → rank) - SIMD-friendly loop
104
+ // Unroll by 4 for better pipelining
105
+ const dim4 = this.dim - (this.dim % 4);
106
+ for (let r = 0; r < this.rank; r++) {
107
+ let sum = 0;
108
+ const rOffset = r;
109
+ // Unrolled loop
110
+ for (let d = 0; d < dim4; d += 4) {
111
+ const aIdx = d * this.rank + rOffset;
112
+ sum += input[d] * this.A[aIdx];
113
+ sum += input[d + 1] * this.A[aIdx + this.rank];
114
+ sum += input[d + 2] * this.A[aIdx + 2 * this.rank];
115
+ sum += input[d + 3] * this.A[aIdx + 3 * this.rank];
116
+ }
117
+ // Remainder
118
+ for (let d = dim4; d < this.dim; d++) {
119
+ sum += input[d] * this.A[d * this.rank + rOffset];
120
+ }
121
+ this.hiddenBuffer[r] = sum;
122
+ }
123
+ // Compute hidden @ B (rank → dim) and add residual
124
+ // Copy input to output buffer first
125
+ for (let d = 0; d < this.dim; d++) {
126
+ this.outputBuffer[d] = input[d];
127
+ }
128
+ // Add scaled LoRA contribution
129
+ for (let d = 0; d < this.dim; d++) {
130
+ let delta = 0;
131
+ for (let r = 0; r < this.rank; r++) {
132
+ delta += this.hiddenBuffer[r] * this.B[r * this.dim + d];
133
+ }
134
+ this.outputBuffer[d] += this.scale * delta;
135
+ }
136
+ // Cache result (LRU eviction if full)
137
+ if (this.cache.size >= this.cacheMaxSize) {
138
+ const firstKey = this.cache.keys().next().value;
139
+ if (firstKey)
140
+ this.cache.delete(firstKey);
141
+ }
142
+ this.cache.set(cacheKey, new Float32Array(this.outputBuffer));
143
+ return Array.from(this.outputBuffer);
144
+ }
145
+ /**
146
+ * Clear cache (call after weight updates)
147
+ */
148
+ clearCache() {
149
+ this.cache.clear();
150
+ }
151
+ /**
152
+ * Backward pass with contrastive loss
153
+ * Pulls positive pairs closer, pushes negatives apart
154
+ * OPTIMIZED: Uses Float32Array buffers
155
+ */
156
+ backward(anchor, positive, negatives, lr, ewcLambda = 0) {
157
+ if (!positive && negatives.length === 0)
158
+ return 0;
159
+ // Clear cache since weights will change
160
+ this.clearCache();
161
+ // Compute adapted embeddings
162
+ const anchorOut = this.forward(anchor);
163
+ const positiveOut = positive ? this.forward(positive) : null;
164
+ const negativeOuts = negatives.map(n => this.forward(n));
165
+ // Contrastive loss with temperature scaling
166
+ const temp = 0.07;
167
+ let loss = 0;
168
+ if (positiveOut) {
169
+ // Positive similarity
170
+ const posSim = this.cosineSimilarity(anchorOut, positiveOut) / temp;
171
+ // Negative similarities
172
+ const negSims = negativeOuts.map(n => this.cosineSimilarity(anchorOut, n) / temp);
173
+ // InfoNCE loss
174
+ const maxSim = Math.max(posSim, ...negSims);
175
+ const expPos = Math.exp(posSim - maxSim);
176
+ const expNegs = negSims.reduce((sum, s) => sum + Math.exp(s - maxSim), 0);
177
+ loss = -Math.log(expPos / (expPos + expNegs) + 1e-8);
178
+ // Compute gradients (simplified)
179
+ const gradScale = lr * this.scale;
180
+ // Update A based on gradient direction (flattened access)
181
+ for (let d = 0; d < this.dim; d++) {
182
+ for (let r = 0; r < this.rank; r++) {
183
+ const idx = d * this.rank + r;
184
+ // Gradient from positive (pull closer)
185
+ const pOutR = r < positiveOut.length ? positiveOut[r] : 0;
186
+ const aOutR = r < anchorOut.length ? anchorOut[r] : 0;
187
+ const gradA = anchor[d] * (pOutR - aOutR) * gradScale;
188
+ this.A[idx] += gradA;
189
+ // EWC regularization
190
+ if (ewcLambda > 0 && this.fisherA && this.savedA) {
191
+ this.A[idx] -= ewcLambda * this.fisherA[idx] * (this.A[idx] - this.savedA[idx]);
192
+ }
193
+ }
194
+ }
195
+ // Update B (flattened access)
196
+ for (let r = 0; r < this.rank; r++) {
197
+ const anchorR = r < anchor.length ? anchor[r] : 0;
198
+ for (let d = 0; d < this.dim; d++) {
199
+ const idx = r * this.dim + d;
200
+ const gradB = anchorR * (positiveOut[d] - anchorOut[d]) * gradScale * 0.1;
201
+ this.B[idx] += gradB;
202
+ if (ewcLambda > 0 && this.fisherB && this.savedB) {
203
+ this.B[idx] -= ewcLambda * this.fisherB[idx] * (this.B[idx] - this.savedB[idx]);
204
+ }
205
+ }
206
+ }
207
+ }
208
+ return loss;
209
+ }
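The temperature-scaled InfoNCE loss used in `backward()` can be checked in isolation. A minimal sketch with the same max-subtraction for numerical stability (similarity values are illustrative, not taken from the module):

```javascript
// InfoNCE: -log( exp(s_pos/T) / (exp(s_pos/T) + sum_i exp(s_neg_i/T)) ),
// computed with the max subtracted for numerical stability.
function infoNCE(posSim, negSims, temp = 0.07) {
  const pos = posSim / temp;
  const negs = negSims.map(s => s / temp);
  const maxSim = Math.max(pos, ...negs);
  const expPos = Math.exp(pos - maxSim);
  const expNegs = negs.reduce((sum, s) => sum + Math.exp(s - maxSim), 0);
  return -Math.log(expPos / (expPos + expNegs) + 1e-8);
}

// The loss shrinks as the positive pair becomes more separable from negatives.
const hard = infoNCE(0.5, [0.4, 0.45]); // positive barely above negatives
const easy = infoNCE(0.9, [0.1, 0.2]);  // positive clearly above negatives
```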
210
+ /**
211
+ * EWC consolidation - save current weights and compute Fisher information
212
+ * OPTIMIZED: Uses Float32Array
213
+ */
214
+ consolidate(embeddings) {
215
+ // Save current weights
216
+ this.savedA = new Float32Array(this.A);
217
+ this.savedB = new Float32Array(this.B);
218
+ // Estimate Fisher information (diagonal approximation)
219
+ this.fisherA = new Float32Array(this.dim * this.rank);
220
+ this.fisherB = new Float32Array(this.rank * this.dim); // stays zero: estimating B's Fisher would need the hidden activations
221
+ const numEmb = embeddings.length;
222
+ for (const emb of embeddings) {
223
+ // Accumulate squared gradients as Fisher estimate
224
+ for (let d = 0; d < this.dim; d++) {
225
+ const embD = emb[d] * emb[d] / numEmb;
226
+ for (let r = 0; r < this.rank; r++) {
227
+ this.fisherA[d * this.rank + r] += embD;
228
+ }
229
+ }
230
+ }
231
+ // Clear cache after consolidation
232
+ this.clearCache();
233
+ }
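The EWC term applied in `backward()` subtracts `lambda * fisher * (theta - saved)` per weight, which is the gradient of the quadratic penalty `0.5 * lambda * fisher * (theta - saved)^2`. A scalar sketch (names hypothetical) of how it anchors a weight toward its consolidated value:

```javascript
// One gradient step on the EWC penalty alone:
// theta <- theta - lambda * fisher * (theta - saved)
function ewcStep(theta, saved, fisher, lambda) {
  return theta - lambda * fisher * (theta - saved);
}

// A weight that drifted from its consolidated value (0.5) is pulled back;
// a weight with zero Fisher information is left unconstrained.
let drifted = 0.9;
for (let i = 0; i < 50; i++) drifted = ewcStep(drifted, 0.5, 1.0, 0.1);
const free = ewcStep(0.9, 0.5, 0.0, 0.1); // fisher = 0 -> unchanged
```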
234
+ /**
235
+ * Optimized cosine similarity with loop unrolling
236
+ */
237
+ cosineSimilarity(a, b) {
238
+ let dot = 0, normA = 0, normB = 0;
239
+ const len = Math.min(a.length, b.length);
240
+ // Unrolled loop for speed
241
+ const len4 = len - (len % 4);
242
+ for (let i = 0; i < len4; i += 4) {
243
+ dot += a[i] * b[i] + a[i + 1] * b[i + 1] + a[i + 2] * b[i + 2] + a[i + 3] * b[i + 3];
244
+ normA += a[i] * a[i] + a[i + 1] * a[i + 1] + a[i + 2] * a[i + 2] + a[i + 3] * a[i + 3];
245
+ normB += b[i] * b[i] + b[i + 1] * b[i + 1] + b[i + 2] * b[i + 2] + b[i + 3] * b[i + 3];
246
+ }
247
+ // Remainder
248
+ for (let i = len4; i < len; i++) {
249
+ dot += a[i] * b[i];
250
+ normA += a[i] * a[i];
251
+ normB += b[i] * b[i];
252
+ }
253
+ return dot / (Math.sqrt(normA * normB) + 1e-8);
254
+ }
255
+ getParams() {
256
+ return this.dim * this.rank + this.rank * this.dim;
257
+ }
258
+ getCacheStats() {
259
+ return {
260
+ size: this.cache.size,
261
+ maxSize: this.cacheMaxSize,
262
+ hitRate: 0, // Would need hit counter for accurate tracking
263
+ };
264
+ }
265
+ /**
266
+ * Export weights as 2D arrays for serialization
267
+ */
268
+ export() {
269
+ // Convert flattened Float32Array back to 2D number[][]
270
+ const A = [];
271
+ for (let d = 0; d < this.dim; d++) {
272
+ const row = [];
273
+ for (let r = 0; r < this.rank; r++) {
274
+ row.push(this.A[d * this.rank + r]);
275
+ }
276
+ A.push(row);
277
+ }
278
+ const B = [];
279
+ for (let r = 0; r < this.rank; r++) {
280
+ const row = [];
281
+ for (let d = 0; d < this.dim; d++) {
282
+ row.push(this.B[r * this.dim + d]);
283
+ }
284
+ B.push(row);
285
+ }
286
+ return { A, B };
287
+ }
288
+ /**
289
+ * Import weights from 2D arrays
290
+ */
291
+ import(weights) {
292
+ // Convert 2D number[][] to flattened Float32Array
293
+ for (let d = 0; d < this.dim && d < weights.A.length; d++) {
294
+ for (let r = 0; r < this.rank && r < weights.A[d].length; r++) {
295
+ this.A[d * this.rank + r] = weights.A[d][r];
296
+ }
297
+ }
298
+ for (let r = 0; r < this.rank && r < weights.B.length; r++) {
299
+ for (let d = 0; d < this.dim && d < weights.B[r].length; d++) {
300
+ this.B[r * this.dim + d] = weights.B[r][d];
301
+ }
302
+ }
303
+ // Clear cache after import
304
+ this.clearCache();
305
+ }
306
+ }
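MicroLoRA's forward pass computes `out = x + scale * (x A) B` with A (dim x rank) and B (rank x dim) stored flattened row-major. A naive reference version of the same index layout, without the unrolling or caching, may help when verifying the optimized loops above:

```javascript
// Reference LoRA forward: out = x + scale * (x A) B, with flattened
// row-major layout A[d * rank + r], B[r * dim + d] as in MicroLoRA.
function loraForward(x, A, B, dim, rank, scale) {
  const hidden = new Float32Array(rank);
  for (let r = 0; r < rank; r++) {
    let sum = 0;
    for (let d = 0; d < dim; d++) sum += x[d] * A[d * rank + r];
    hidden[r] = sum;
  }
  const out = new Float32Array(dim);
  for (let d = 0; d < dim; d++) {
    let delta = 0;
    for (let r = 0; r < rank; r++) delta += hidden[r] * B[r * dim + d];
    out[d] = x[d] + scale * delta;
  }
  return out;
}

// With B all zeros (the usual LoRA init), the forward pass is the identity.
const x = [1, 2, 3, 4];
const A = new Float32Array(4 * 2).fill(0.1);
const B = new Float32Array(2 * 4); // zeros
const out = loraForward(x, A, B, 4, 2, 0.5);
```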
307
+ // ============================================================================
308
+ // Domain Prototype Learning (OPTIMIZED with Float32Array)
309
+ // ============================================================================
310
+ class PrototypeMemory {
311
+ constructor(maxPrototypes = 50, dimension = 384) {
312
+ this.prototypes = new Map();
313
+ this.maxPrototypes = maxPrototypes;
314
+ this.scratchBuffer = new Float32Array(dimension);
315
+ }
316
+ /**
317
+ * Update prototype with new embedding (online mean update)
318
+ * OPTIMIZED: Uses Float32Array internally
319
+ */
320
+ update(domain, embedding) {
321
+ const existing = this.prototypes.get(domain);
322
+ if (existing) {
323
+ // Online mean update: new_mean = old_mean + (x - old_mean) / n
324
+ const n = existing.count + 1;
325
+ const invN = 1 / n;
326
+ // Unrolled update loop
327
+ const len = Math.min(embedding.length, existing.centroid.length);
328
+ const len4 = len - (len % 4);
329
+ for (let i = 0; i < len4; i += 4) {
330
+ const d0 = embedding[i] - existing.centroid[i];
331
+ const d1 = embedding[i + 1] - existing.centroid[i + 1];
332
+ const d2 = embedding[i + 2] - existing.centroid[i + 2];
333
+ const d3 = embedding[i + 3] - existing.centroid[i + 3];
334
+ existing.centroid[i] += d0 * invN;
335
+ existing.centroid[i + 1] += d1 * invN;
336
+ existing.centroid[i + 2] += d2 * invN;
337
+ existing.centroid[i + 3] += d3 * invN;
338
+ existing.variance += d0 * (embedding[i] - existing.centroid[i]);
339
+ existing.variance += d1 * (embedding[i + 1] - existing.centroid[i + 1]);
340
+ existing.variance += d2 * (embedding[i + 2] - existing.centroid[i + 2]);
341
+ existing.variance += d3 * (embedding[i + 3] - existing.centroid[i + 3]);
342
+ }
343
+ for (let i = len4; i < len; i++) {
344
+ const delta = embedding[i] - existing.centroid[i];
345
+ existing.centroid[i] += delta * invN;
346
+ existing.variance += delta * (embedding[i] - existing.centroid[i]);
347
+ }
348
+ existing.count = n;
349
+ }
350
+ else {
351
+ // Create new prototype
352
+ if (this.prototypes.size >= this.maxPrototypes) {
353
+ // Remove least used prototype
354
+ let minCount = Infinity;
355
+ let minKey = '';
356
+ for (const [key, proto] of this.prototypes) {
357
+ if (proto.count < minCount) {
358
+ minCount = proto.count;
359
+ minKey = key;
360
+ }
361
+ }
362
+ this.prototypes.delete(minKey);
363
+ }
364
+ this.prototypes.set(domain, {
365
+ domain,
366
+ centroid: Array.from(embedding),
367
+ count: 1,
368
+ variance: 0,
369
+ });
370
+ }
371
+ }
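The online mean recurrence above (`new_mean = old_mean + (x - old_mean) / n`) reproduces the batch mean without storing past samples. A scalar sketch of the same update applied per vector component:

```javascript
// Streaming mean, the same recurrence PrototypeMemory.update applies
// to each centroid component.
function onlineMean(samples) {
  let mean = 0;
  let n = 0;
  for (const x of samples) {
    n += 1;
    mean += (x - mean) / n;
  }
  return mean;
}

const samples = [2, 4, 6, 8];
const streaming = onlineMean(samples);
const batch = samples.reduce((a, b) => a + b, 0) / samples.length;
```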
372
+ /**
373
+ * Find closest prototype and return domain-adjusted embedding
374
+ * OPTIMIZED: Single-pass similarity scan over all prototypes
375
+ */
376
+ adjust(embedding) {
377
+ if (this.prototypes.size === 0) {
378
+ return { adjusted: Array.from(embedding), domain: null, confidence: 0 };
379
+ }
380
+ let bestSim = -Infinity;
381
+ let bestProto = null;
382
+ for (const proto of this.prototypes.values()) {
383
+ const sim = this.cosineSimilarityFast(embedding, proto.centroid);
384
+ if (sim > bestSim) {
385
+ bestSim = sim;
386
+ bestProto = proto;
387
+ }
388
+ }
389
+ if (!bestProto || bestSim < 0.5) {
390
+ return { adjusted: Array.from(embedding), domain: null, confidence: 0 };
391
+ }
392
+ // Adjust embedding toward prototype (soft assignment)
393
+ const alpha = 0.1 * bestSim;
394
+ const oneMinusAlpha = 1 - alpha;
395
+ const adjusted = new Array(embedding.length);
396
+ // Unrolled adjustment
397
+ const len = embedding.length;
398
+ const len4 = len - (len % 4);
399
+ for (let i = 0; i < len4; i += 4) {
400
+ adjusted[i] = embedding[i] * oneMinusAlpha + bestProto.centroid[i] * alpha;
401
+ adjusted[i + 1] = embedding[i + 1] * oneMinusAlpha + bestProto.centroid[i + 1] * alpha;
402
+ adjusted[i + 2] = embedding[i + 2] * oneMinusAlpha + bestProto.centroid[i + 2] * alpha;
403
+ adjusted[i + 3] = embedding[i + 3] * oneMinusAlpha + bestProto.centroid[i + 3] * alpha;
404
+ }
405
+ for (let i = len4; i < len; i++) {
406
+ adjusted[i] = embedding[i] * oneMinusAlpha + bestProto.centroid[i] * alpha;
407
+ }
408
+ return {
409
+ adjusted,
410
+ domain: bestProto.domain,
411
+ confidence: bestSim,
412
+ };
413
+ }
414
+ /**
415
+ * Fast cosine similarity with loop unrolling
416
+ */
417
+ cosineSimilarityFast(a, b) {
418
+ let dot = 0, normA = 0, normB = 0;
419
+ const len = Math.min(a.length, b.length);
420
+ const len4 = len - (len % 4);
421
+ for (let i = 0; i < len4; i += 4) {
422
+ dot += a[i] * b[i] + a[i + 1] * b[i + 1] + a[i + 2] * b[i + 2] + a[i + 3] * b[i + 3];
423
+ normA += a[i] * a[i] + a[i + 1] * a[i + 1] + a[i + 2] * a[i + 2] + a[i + 3] * a[i + 3];
424
+ normB += b[i] * b[i] + b[i + 1] * b[i + 1] + b[i + 2] * b[i + 2] + b[i + 3] * b[i + 3];
425
+ }
426
+ for (let i = len4; i < len; i++) {
427
+ dot += a[i] * b[i];
428
+ normA += a[i] * a[i];
429
+ normB += b[i] * b[i];
430
+ }
431
+ return dot / (Math.sqrt(normA * normB) + 1e-8);
432
+ }
433
+ getPrototypes() {
434
+ return Array.from(this.prototypes.values());
435
+ }
436
+ export() {
437
+ return this.getPrototypes();
438
+ }
439
+ import(prototypes) {
440
+ this.prototypes.clear();
441
+ for (const p of prototypes) {
442
+ this.prototypes.set(p.domain, p);
443
+ }
444
+ }
445
+ }
446
+ class EpisodicMemory {
447
+ constructor(capacity = 1000, dimension = 384) {
448
+ this.entries = [];
449
+ this.capacity = capacity;
450
+ this.dimension = dimension;
451
+ this.augmentBuffer = new Float32Array(dimension);
452
+ this.weightsBuffer = new Float32Array(Math.min(capacity, 16)); // Max k
453
+ }
454
+ add(embedding, context) {
455
+ if (this.entries.length >= this.capacity) {
456
+ // Find and remove least used entry (O(n) but infrequent)
457
+ let minIdx = 0;
458
+ let minCount = this.entries[0].useCount;
459
+ for (let i = 1; i < this.entries.length; i++) {
460
+ if (this.entries[i].useCount < minCount) {
461
+ minCount = this.entries[i].useCount;
462
+ minIdx = i;
463
+ }
464
+ }
465
+ this.entries.splice(minIdx, 1);
466
+ }
467
+ // Convert to Float32Array and pre-compute norm
468
+ // Float32Array's copy constructor accepts both number[] and Float32Array
+ const emb = new Float32Array(embedding);
471
+ let normSq = 0;
472
+ for (let i = 0; i < emb.length; i++) {
473
+ normSq += emb[i] * emb[i];
474
+ }
475
+ this.entries.push({
476
+ embedding: emb,
477
+ context,
478
+ timestamp: Date.now(),
479
+ useCount: 0,
480
+ normSquared: normSq,
481
+ });
482
+ }
483
+ /**
484
+ * Retrieve similar past embeddings for context augmentation
485
+ * OPTIMIZED: Uses pre-computed norms for fast similarity
486
+ */
487
+ retrieve(query, k = 5) {
488
+ if (this.entries.length === 0)
489
+ return [];
490
+ // Pre-compute query norm
491
+ let queryNormSq = 0;
492
+ for (let i = 0; i < query.length; i++) {
493
+ queryNormSq += query[i] * query[i];
494
+ }
495
+ const queryNorm = Math.sqrt(queryNormSq);
496
+ // Score all entries
497
+ const scored = [];
498
+ for (const entry of this.entries) {
499
+ // Fast dot product with loop unrolling
500
+ let dot = 0;
501
+ const len = Math.min(query.length, entry.embedding.length);
502
+ const len4 = len - (len % 4);
503
+ for (let i = 0; i < len4; i += 4) {
504
+ dot += query[i] * entry.embedding[i];
505
+ dot += query[i + 1] * entry.embedding[i + 1];
506
+ dot += query[i + 2] * entry.embedding[i + 2];
507
+ dot += query[i + 3] * entry.embedding[i + 3];
508
+ }
509
+ for (let i = len4; i < len; i++) {
510
+ dot += query[i] * entry.embedding[i];
511
+ }
512
+ const similarity = dot / (queryNorm * Math.sqrt(entry.normSquared) + 1e-8);
513
+ scored.push({ entry, similarity });
514
+ }
515
+ // Small result set: sort everything and return it all
516
+ if (scored.length <= k) {
517
+ scored.sort((a, b) => b.similarity - a.similarity);
518
+ for (const s of scored)
519
+ s.entry.useCount++;
520
+ return scored.map(s => s.entry);
521
+ }
522
+ // Full sort then slice top-k (a true quickselect would avoid sorting the tail)
523
+ scored.sort((a, b) => b.similarity - a.similarity);
524
+ const topK = scored.slice(0, k);
525
+ for (const s of topK)
526
+ s.entry.useCount++;
527
+ return topK.map(s => s.entry);
528
+ }
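`retrieve()` avoids recomputing stored-vector norms by caching the squared norm at insert time, so each query costs one dot product per entry. A small sketch of that scheme (the `entry` literal is illustrative):

```javascript
// Cosine similarity using a squared norm cached when the vector was stored,
// mirroring EpisodicMemory's normSquared field.
function dot(a, b) {
  let s = 0;
  for (let i = 0; i < a.length; i++) s += a[i] * b[i];
  return s;
}
function cosineCached(query, entry) {
  const queryNorm = Math.sqrt(dot(query, query));
  return dot(query, entry.embedding) /
    (queryNorm * Math.sqrt(entry.normSquared) + 1e-8);
}

const entry = { embedding: [3, 4], normSquared: 25 }; // ||e||^2 cached at add()
const parallel = cosineCached([6, 8], entry);   // same direction -> ~1
const orthogonal = cosineCached([-4, 3], entry); // perpendicular -> ~0
```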
529
+ /**
530
+ * Augment embedding with episodic memory (attention-like)
531
+ * OPTIMIZED: Uses pre-allocated buffers
532
+ */
533
+ augment(embedding, k = 3) {
534
+ const similar = this.retrieve(embedding, k);
535
+ if (similar.length === 0)
536
+ return Array.from(embedding);
537
+ // Pre-compute query norm
538
+ let queryNormSq = 0;
539
+ for (let i = 0; i < embedding.length; i++) {
540
+ queryNormSq += embedding[i] * embedding[i];
541
+ }
542
+ const queryNorm = Math.sqrt(queryNormSq);
543
+ // Compute weights
544
+ let sumWeights = 1; // Start with 1 for query
545
+ for (let j = 0; j < similar.length; j++) {
546
+ // Fast dot product for similarity
547
+ let dot = 0;
548
+ const emb = similar[j].embedding;
549
+ const len = Math.min(embedding.length, emb.length);
550
+ for (let i = 0; i < len; i++) {
551
+ dot += embedding[i] * emb[i];
552
+ }
553
+ const sim = dot / (queryNorm * Math.sqrt(similar[j].normSquared) + 1e-8);
554
+ const weight = Math.exp(sim / 0.1);
555
+ this.weightsBuffer[j] = weight;
556
+ sumWeights += weight;
557
+ }
558
+ const invSumWeights = 1 / sumWeights;
559
+ // Weighted average
560
+ const dim = embedding.length;
561
+ for (let i = 0; i < dim; i++) {
562
+ let sum = embedding[i]; // Query contribution
563
+ for (let j = 0; j < similar.length; j++) {
564
+ sum += this.weightsBuffer[j] * similar[j].embedding[i];
565
+ }
566
+ this.augmentBuffer[i] = sum * invSumWeights;
567
+ }
568
+ return Array.from(this.augmentBuffer.subarray(0, dim));
569
+ }
570
+ size() {
571
+ return this.entries.length;
572
+ }
573
+ clear() {
574
+ this.entries = [];
575
+ }
576
+ }
577
+ // ============================================================================
578
+ // Adaptive Embedder (Main Class)
579
+ // ============================================================================
580
+ class AdaptiveEmbedder {
581
+ constructor(config = {}) {
582
+ this.onnxReady = false;
583
+ this.dimension = 384;
584
+ // Stats
585
+ this.adaptationCount = 0;
586
+ this.ewcCount = 0;
587
+ this.contrastiveCount = 0;
588
+ // Co-edit buffer for contrastive learning
589
+ this.coEditBuffer = [];
590
+ this.config = {
591
+ loraRank: config.loraRank ?? 4,
592
+ learningRate: config.learningRate ?? 0.01,
593
+ ewcLambda: config.ewcLambda ?? 0.1,
594
+ numPrototypes: config.numPrototypes ?? 50,
595
+ contrastiveLearning: config.contrastiveLearning ?? true,
596
+ contrastiveTemp: config.contrastiveTemp ?? 0.07,
597
+ memoryCapacity: config.memoryCapacity ?? 1000,
598
+ };
599
+ // Pass dimension for pre-allocation of Float32Array buffers
600
+ this.lora = new MicroLoRA(this.dimension, this.config.loraRank);
601
+ this.prototypes = new PrototypeMemory(this.config.numPrototypes, this.dimension);
602
+ this.episodic = new EpisodicMemory(this.config.memoryCapacity, this.dimension);
603
+ }
604
+ /**
605
+ * Initialize ONNX backend
606
+ */
607
+ async init() {
608
+ if ((0, onnx_embedder_1.isOnnxAvailable)()) {
609
+ await (0, onnx_embedder_1.initOnnxEmbedder)();
610
+ this.onnxReady = true;
611
+ }
612
+ }
613
+ /**
614
+ * Generate adaptive embedding
615
+ * Pipeline: ONNX → LoRA → Prototype Adjustment → Episodic Augmentation
616
+ */
617
+ async embed(text, options) {
618
+ // Step 1: Get base ONNX embedding
619
+ let baseEmb;
620
+ if (this.onnxReady) {
621
+ const result = await (0, onnx_embedder_1.embed)(text);
622
+ baseEmb = result.embedding;
623
+ }
624
+ else {
625
+ // Fallback to hash embedding
626
+ baseEmb = this.hashEmbed(text);
627
+ }
628
+ // Step 2: Apply LoRA adaptation
629
+ let adapted = this.lora.forward(baseEmb);
630
+ // Step 3: Prototype adjustment (if domain specified)
631
+ if (options?.domain) {
632
+ this.prototypes.update(options.domain, adapted);
633
+ }
634
+ const { adjusted } = this.prototypes.adjust(adapted);
635
+ adapted = adjusted;
636
+ // Step 4: Episodic memory augmentation
637
+ if (options?.useEpisodic !== false) {
638
+ adapted = this.episodic.augment(adapted);
639
+ }
640
+ // Step 5: Store in episodic memory
641
+ if (options?.storeInMemory !== false) {
642
+ this.episodic.add(adapted, text.slice(0, 100));
643
+ }
644
+ // Normalize
645
+ return this.normalize(adapted);
646
+ }
647
+ /**
648
+ * Batch embed with adaptation
649
+ */
650
+ async embedBatch(texts, options) {
651
+ const results = [];
652
+ if (this.onnxReady) {
653
+ const baseResults = await (0, onnx_embedder_1.embedBatch)(texts);
654
+ for (let i = 0; i < baseResults.length; i++) {
655
+ let adapted = this.lora.forward(baseResults[i].embedding);
656
+ if (options?.domain) {
657
+ this.prototypes.update(options.domain, adapted);
658
+ }
659
+ const { adjusted } = this.prototypes.adjust(adapted);
660
+ results.push(this.normalize(adjusted));
661
+ }
662
+ }
663
+ else {
664
+ for (const text of texts) {
665
+ results.push(await this.embed(text, options));
666
+ }
667
+ }
668
+ return results;
669
+ }
670
+ /**
671
+ * Learn from co-edit pattern (contrastive learning)
672
+ * Files edited together should have similar embeddings
673
+ */
674
+ async learnCoEdit(file1, content1, file2, content2) {
675
+ if (!this.config.contrastiveLearning)
676
+ return 0;
677
+ // Get embeddings
678
+ const emb1 = await this.embed(content1.slice(0, 512), { storeInMemory: false });
679
+ const emb2 = await this.embed(content2.slice(0, 512), { storeInMemory: false });
680
+ // Store in buffer for batch learning
681
+ this.coEditBuffer.push({ file1, emb1, file2, emb2 });
682
+ // Process batch when buffer is full
683
+ if (this.coEditBuffer.length >= 16) {
684
+ return this.processCoEditBatch();
685
+ }
686
+ return 0;
687
+ }
688
+ /**
689
+ * Process co-edit batch with contrastive loss
690
+ */
691
+ processCoEditBatch() {
692
+ if (this.coEditBuffer.length < 2)
693
+ return 0;
694
+ let totalLoss = 0;
695
+ for (const { emb1, emb2 } of this.coEditBuffer) {
696
+ // Use other pairs as negatives
697
+ const negatives = this.coEditBuffer
698
+ .filter(p => p.emb1 !== emb1)
699
+ .slice(0, 4)
700
+ .map(p => p.emb1);
701
+ // Backward pass with contrastive loss
702
+ const loss = this.lora.backward(emb1, emb2, negatives, this.config.learningRate, this.config.ewcLambda);
703
+ totalLoss += loss;
704
+ this.contrastiveCount++;
705
+ }
706
+ const batchSize = this.coEditBuffer.length;
+ this.coEditBuffer = [];
707
+ this.adaptationCount++;
708
+ return totalLoss / batchSize;
709
+ }
710
+ /**
711
+ * Learn from trajectory outcome (reinforcement-like)
712
+ */
713
+ async learnFromOutcome(context, action, success, quality = 0.5) {
714
+ const contextEmb = await this.embed(context, { storeInMemory: false });
715
+ const actionEmb = await this.embed(action, { storeInMemory: false });
716
+ if (success && quality > 0.7) {
717
+ // Positive outcome - pull embeddings closer
718
+ this.lora.backward(contextEmb, actionEmb, [], this.config.learningRate * quality, this.config.ewcLambda);
719
+ this.adaptationCount++;
720
+ }
721
+ }
722
+ /**
723
+ * EWC consolidation - prevent forgetting important adaptations
724
+ * OPTIMIZED: Works with Float32Array episodic entries
725
+ */
726
+ async consolidate() {
727
+ // Collect current episodic memories for Fisher estimation
728
+ const embeddings = [];
729
+ const entries = this.episodic.entries || [];
730
+ // Get last 100 entries for Fisher estimation
731
+ const recentEntries = entries.slice(-100);
732
+ for (const entry of recentEntries) {
733
+ if (entry.embedding instanceof Float32Array) {
734
+ embeddings.push(entry.embedding);
735
+ }
736
+ }
737
+ if (embeddings.length > 10) {
738
+ this.lora.consolidate(embeddings);
739
+ this.ewcCount++;
740
+ }
741
+ }
742
+ /**
743
+ * Fallback hash embedding
744
+ */
745
+ hashEmbed(text) {
746
+ const embedding = new Array(this.dimension).fill(0);
747
+ const tokens = text.toLowerCase().split(/\s+/);
748
+ for (let t = 0; t < tokens.length; t++) {
749
+ const token = tokens[t];
750
+ const posWeight = 1 / (1 + t * 0.1);
751
+ for (let i = 0; i < token.length; i++) {
752
+ const code = token.charCodeAt(i);
753
+ const h1 = (code * 31 + i * 17 + t * 7) % this.dimension;
754
+ const h2 = (code * 37 + i * 23 + t * 11) % this.dimension;
755
+ embedding[h1] += posWeight;
756
+ embedding[h2] += posWeight * 0.5;
757
+ }
758
+ }
759
+ return this.normalize(embedding);
760
+ }
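The fallback hash embedder is deterministic and position-weighted: the same text always maps to the same unit vector. A standalone sketch of the same scheme (smaller dimension chosen for illustration):

```javascript
// Simplified hashEmbed: two hashed positions per character, a weight that
// decays with token position, and an L2-normalized result.
function hashEmbed(text, dim = 16) {
  const v = new Array(dim).fill(0);
  const tokens = text.toLowerCase().split(/\s+/);
  for (let t = 0; t < tokens.length; t++) {
    const w = 1 / (1 + t * 0.1); // earlier tokens weigh more
    for (let i = 0; i < tokens[t].length; i++) {
      const code = tokens[t].charCodeAt(i);
      v[(code * 31 + i * 17 + t * 7) % dim] += w;
      v[(code * 37 + i * 23 + t * 11) % dim] += w * 0.5;
    }
  }
  const norm = Math.sqrt(v.reduce((a, b) => a + b * b, 0));
  return norm > 0 ? v.map(x => x / norm) : v;
}

const a = hashEmbed('foo bar');
const b = hashEmbed('foo bar');
```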
761
+ normalize(v) {
762
+ const norm = Math.sqrt(v.reduce((a, b) => a + b * b, 0));
763
+ return norm > 0 ? v.map(x => x / norm) : v;
764
+ }
765
+ /**
766
+ * Get statistics
767
+ */
768
+ getStats() {
769
+ return {
770
+ baseModel: 'all-MiniLM-L6-v2',
771
+ dimension: this.dimension,
772
+ loraRank: this.config.loraRank,
773
+ loraParams: this.lora.getParams(),
774
+ adaptations: this.adaptationCount,
775
+ prototypes: this.prototypes.getPrototypes().length,
776
+ memorySize: this.episodic.size(),
777
+ ewcConsolidations: this.ewcCount,
778
+ contrastiveUpdates: this.contrastiveCount,
779
+ };
780
+ }
781
+ /**
782
+ * Export learned weights
783
+ */
784
+ export() {
785
+ return {
786
+ lora: this.lora.export(),
787
+ prototypes: this.prototypes.export(),
788
+ stats: this.getStats(),
789
+ };
790
+ }
791
+ /**
792
+ * Import learned weights
793
+ */
794
+ import(data) {
795
+ if (data.lora) {
796
+ this.lora.import(data.lora);
797
+ }
798
+ if (data.prototypes) {
799
+ this.prototypes.import(data.prototypes);
800
+ }
801
+ }
802
+ /**
803
+ * Reset adaptations
804
+ */
805
+ reset() {
806
+ this.lora = new MicroLoRA(this.dimension, this.config.loraRank);
807
+ this.prototypes = new PrototypeMemory(this.config.numPrototypes, this.dimension);
808
+ this.episodic.clear();
809
+ this.adaptationCount = 0;
810
+ this.ewcCount = 0;
811
+ this.contrastiveCount = 0;
812
+ this.coEditBuffer = [];
813
+ }
814
+ /**
815
+ * Get LoRA cache statistics
816
+ */
817
+ getCacheStats() {
818
+ return this.lora.getCacheStats?.() ?? { size: 0, maxSize: 256 };
819
+ }
820
+ }
821
+ exports.AdaptiveEmbedder = AdaptiveEmbedder;
822
+ // ============================================================================
823
+ // Factory & Singleton
824
+ // ============================================================================
825
+ let instance = null;
826
+ function getAdaptiveEmbedder(config) {
827
+ if (!instance) {
828
+ instance = new AdaptiveEmbedder(config);
829
+ }
830
+ return instance;
831
+ }
832
+ async function initAdaptiveEmbedder(config) {
833
+ const embedder = getAdaptiveEmbedder(config);
834
+ await embedder.init();
835
+ return embedder;
836
+ }
837
+ exports.default = AdaptiveEmbedder;
dist/core/agentdb-fast.d.ts ADDED
@@ -0,0 +1,149 @@
1
+ /**
2
+ * AgentDB Fast - High-performance in-process alternative to AgentDB CLI
3
+ *
4
+ * The AgentDB CLI has ~2.3s startup overhead due to npx initialization.
5
+ * This module provides 50-200x faster operations by using in-process calls.
6
+ *
7
+ * Features:
8
+ * - In-memory episode storage with LRU eviction
9
+ * - Vector similarity search using @ruvector/core
10
+ * - Compatible API with AgentDB's episode/trajectory interfaces
11
+ */
12
+ /**
13
+ * Episode entry for trajectory storage
14
+ */
15
+ export interface Episode {
16
+ id: string;
17
+ state: number[];
18
+ action: string | number;
19
+ reward: number;
20
+ nextState: number[];
21
+ done: boolean;
22
+ metadata?: Record<string, any>;
23
+ timestamp?: number;
24
+ }
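A plain object satisfying the Episode shape above (all values illustrative, not from the package):

```javascript
// Literal matching the Episode interface: required state/action/reward/
// nextState/done fields, optional metadata and timestamp.
const episode = {
  id: 'ep-001',
  state: [0.1, 0.2, 0.3],
  action: 'refactor',
  reward: 0.8,
  nextState: [0.15, 0.25, 0.35],
  done: false,
  metadata: { file: 'src/index.ts' },
  timestamp: Date.now(),
};
```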
25
+ /**
26
+ * Trajectory (sequence of episodes)
27
+ */
28
+ export interface Trajectory {
29
+ id: string;
30
+ episodes: Episode[];
31
+ totalReward: number;
32
+ metadata?: Record<string, any>;
33
+ }
34
+ /**
35
+ * Search result for episode queries
36
+ */
37
+ export interface EpisodeSearchResult {
38
+ episode: Episode;
39
+ similarity: number;
40
+ trajectoryId?: string;
41
+ }
42
+ /**
43
+ * Fast in-memory AgentDB implementation
44
+ */
45
+ export declare class FastAgentDB {
46
+ private episodes;
47
+ private trajectories;
48
+ private vectorDb;
49
+ private dimensions;
50
+ private maxEpisodes;
51
+ private episodeOrder;
52
+ /**
53
+ * Create a new FastAgentDB instance
54
+ *
55
+ * @param dimensions - Vector dimensions for state embeddings
56
+ * @param maxEpisodes - Maximum episodes to store (LRU eviction)
57
+ */
58
+ constructor(dimensions?: number, maxEpisodes?: number);
59
+ /**
60
+ * Initialize the vector database
61
+ */
62
+ private initVectorDb;
63
+ /**
64
+ * Store an episode
65
+ *
66
+ * @param episode - Episode to store
67
+ * @returns Episode ID
68
+ */
69
+ storeEpisode(episode: Omit<Episode, 'id'> & {
70
+ id?: string;
71
+ }): Promise<string>;
72
+ /**
73
+ * Store multiple episodes in batch
74
+ */
75
+ storeEpisodes(episodes: (Omit<Episode, 'id'> & {
76
+ id?: string;
77
+ })[]): Promise<string[]>;
78
+ /**
79
+ * Retrieve an episode by ID
80
+ */
81
+ getEpisode(id: string): Promise<Episode | null>;
82
+ /**
83
+ * Search for similar episodes by state
84
+ *
85
+ * @param queryState - State vector to search for
86
+ * @param k - Number of results to return
87
+ * @returns Similar episodes sorted by similarity
88
+ */
89
+ searchByState(queryState: number[] | Float32Array, k?: number): Promise<EpisodeSearchResult[]>;
90
+ /**
91
+ * Fallback similarity search using brute-force cosine similarity
92
+ */
93
+ private fallbackSearch;
94
+ /**
95
+ * Compute cosine similarity between two vectors
96
+ */
97
+ private cosineSimilarity;
98
+ /**
99
+ * Store a trajectory (sequence of episodes)
100
+ */
101
+ storeTrajectory(episodes: (Omit<Episode, 'id'> & {
102
+ id?: string;
103
+ })[], metadata?: Record<string, any>): Promise<string>;
104
+ /**
105
+ * Get a trajectory by ID
106
+ */
107
+ getTrajectory(id: string): Promise<Trajectory | null>;
108
+ /**
109
+ * Get top trajectories by total reward
110
+ */
111
+ getTopTrajectories(k?: number): Promise<Trajectory[]>;
112
+ /**
113
+ * Sample random episodes (for experience replay)
114
+ */
115
+ sampleEpisodes(n: number): Promise<Episode[]>;
116
+ /**
117
+ * Get database statistics
118
+ */
119
+ getStats(): {
120
+ episodeCount: number;
121
+ trajectoryCount: number;
122
+ dimensions: number;
123
+ maxEpisodes: number;
124
+ vectorDbAvailable: boolean;
125
+ };
126
+ /**
127
+ * Clear all data
128
+ */
129
+ clear(): void;
130
+ /**
131
+ * Generate a unique ID
132
+ */
133
+ private generateId;
134
+ }
135
+ /**
136
+ * Create a fast AgentDB instance
137
+ */
138
+ export declare function createFastAgentDB(dimensions?: number, maxEpisodes?: number): FastAgentDB;
139
+ /**
140
+ * Get the default FastAgentDB instance
141
+ */
142
+ export declare function getDefaultAgentDB(): FastAgentDB;
143
+ declare const _default: {
144
+ FastAgentDB: typeof FastAgentDB;
145
+ createFastAgentDB: typeof createFastAgentDB;
146
+ getDefaultAgentDB: typeof getDefaultAgentDB;
147
+ };
148
+ export default _default;
149
+ //# sourceMappingURL=agentdb-fast.d.ts.map
dist/core/agentdb-fast.d.ts.map ADDED
@@ -0,0 +1 @@
1
+ {"version":3,"file":"agentdb-fast.d.ts","sourceRoot":"","sources":["../../src/core/agentdb-fast.ts"],"names":[],"mappings":"AAAA;;;;;;;;;;GAUG;AA6BH;;GAEG;AACH,MAAM,WAAW,OAAO;IACtB,EAAE,EAAE,MAAM,CAAC;IACX,KAAK,EAAE,MAAM,EAAE,CAAC;IAChB,MAAM,EAAE,MAAM,GAAG,MAAM,CAAC;IACxB,MAAM,EAAE,MAAM,CAAC;IACf,SAAS,EAAE,MAAM,EAAE,CAAC;IACpB,IAAI,EAAE,OAAO,CAAC;IACd,QAAQ,CAAC,EAAE,MAAM,CAAC,MAAM,EAAE,GAAG,CAAC,CAAC;IAC/B,SAAS,CAAC,EAAE,MAAM,CAAC;CACpB;AAED;;GAEG;AACH,MAAM,WAAW,UAAU;IACzB,EAAE,EAAE,MAAM,CAAC;IACX,QAAQ,EAAE,OAAO,EAAE,CAAC;IACpB,WAAW,EAAE,MAAM,CAAC;IACpB,QAAQ,CAAC,EAAE,MAAM,CAAC,MAAM,EAAE,GAAG,CAAC,CAAC;CAChC;AAED;;GAEG;AACH,MAAM,WAAW,mBAAmB;IAClC,OAAO,EAAE,OAAO,CAAC;IACjB,UAAU,EAAE,MAAM,CAAC;IACnB,YAAY,CAAC,EAAE,MAAM,CAAC;CACvB;AAED;;GAEG;AACH,qBAAa,WAAW;IACtB,OAAO,CAAC,QAAQ,CAAmC;IACnD,OAAO,CAAC,YAAY,CAAsC;IAC1D,OAAO,CAAC,QAAQ,CAAa;IAC7B,OAAO,CAAC,UAAU,CAAS;IAC3B,OAAO,CAAC,WAAW,CAAS;IAC5B,OAAO,CAAC,YAAY,CAAgB;IAEpC;;;;;OAKG;gBACS,UAAU,GAAE,MAAY,EAAE,WAAW,GAAE,MAAe;IAKlE;;OAEG;YACW,YAAY;IAe1B;;;;;OAKG;IACG,YAAY,CAAC,OAAO,EAAE,IAAI,CAAC,OAAO,EAAE,IAAI,CAAC,GAAG;QAAE,EAAE,CAAC,EAAE,MAAM,CAAA;KAAE,GAAG,OAAO,CAAC,MAAM,CAAC;IAoCnF;;OAEG;IACG,aAAa,CAAC,QAAQ,EAAE,CAAC,IAAI,CAAC,OAAO,EAAE,IAAI,CAAC,GAAG;QAAE,EAAE,CAAC,EAAE,MAAM,CAAA;KAAE,CAAC,EAAE,GAAG,OAAO,CAAC,MAAM,EAAE,CAAC;IAS3F;;OAEG;IACG,UAAU,CAAC,EAAE,EAAE,MAAM,GAAG,OAAO,CAAC,OAAO,GAAG,IAAI,CAAC;IAarD;;;;;;OAMG;IACG,aAAa,CACjB,UAAU,EAAE,MAAM,EAAE,GAAG,YAAY,EACnC,CAAC,GAAE,MAAW,GACb,OAAO,CAAC,mBAAmB,EAAE,CAAC;IAgCjC;;OAEG;IACH,OAAO,CAAC,cAAc;IAetB;;OAEG;IACH,OAAO,CAAC,gBAAgB;IAexB;;OAEG;IACG,eAAe,CACnB,QAAQ,EAAE,CAAC,IAAI,CAAC,OAAO,EAAE,IAAI,CAAC,GAAG;QAAE,EAAE,CAAC,EAAE,MAAM,CAAA;KAAE,CAAC,EAAE,EACnD,QAAQ,CAAC,EAAE,MAAM,CAAC,MAAM,EAAE,GAAG,CAAC,GAC7B,OAAO,CAAC,MAAM,CAAC;IAyBlB;;OAEG;IACG,aAAa,CAAC,EAAE,EAAE,MAAM,GAAG,OAAO,CAAC,UAAU,GAAG,IAAI,CAAC;IAI3D;;OAEG;IACG,kBAAkB,CAAC,CAAC,GAAE,MAAW,GAAG,OAAO,CAAC,UAAU,EAAE,CAAC;IAM/D;;OAEG;IACG,cAAc,CAAC,CAAC,EAAE,MAAM,GAAG,OAAO,CAAC,OAAO,EAAE,CAAC;IAYnD;;OAEG;IACH,QAAQ,IAA
I;QACV,YAAY,EAAE,MAAM,CAAC;QACrB,eAAe,EAAE,MAAM,CAAC;QACxB,UAAU,EAAE,MAAM,CAAC;QACnB,WAAW,EAAE,MAAM,CAAC;QACpB,iBAAiB,EAAE,OAAO,CAAC;KAC5B;IAUD;;OAEG;IACH,KAAK,IAAI,IAAI;IAMb;;OAEG;IACH,OAAO,CAAC,UAAU;CAGnB;AAED;;GAEG;AACH,wBAAgB,iBAAiB,CAC/B,UAAU,GAAE,MAAY,EACxB,WAAW,GAAE,MAAe,GAC3B,WAAW,CAEb;AAKD;;GAEG;AACH,wBAAgB,iBAAiB,IAAI,WAAW,CAK/C;;;;;;AAED,wBAIE"}
dist/core/agentdb-fast.js ADDED
@@ -0,0 +1,301 @@
+ "use strict";
+ /**
+ * AgentDB Fast - High-performance in-process alternative to AgentDB CLI
+ *
+ * The AgentDB CLI has ~2.3s startup overhead due to npx initialization.
+ * This module provides 50-200x faster operations by using in-process calls.
+ *
+ * Features:
+ * - In-memory episode storage with LRU eviction
+ * - Vector similarity search using @ruvector/core
+ * - Compatible API with AgentDB's episode/trajectory interfaces
+ */
+ Object.defineProperty(exports, "__esModule", { value: true });
+ exports.FastAgentDB = void 0;
+ exports.createFastAgentDB = createFastAgentDB;
+ exports.getDefaultAgentDB = getDefaultAgentDB;
+ // Lazy load ruvector core
+ let coreModule = null;
+ function getCoreModule() {
+ if (coreModule)
+ return coreModule;
+ try {
+ coreModule = require('@ruvector/core');
+ return coreModule;
+ }
+ catch {
+ // Fallback to ruvector if core not available
+ try {
+ coreModule = require('ruvector');
+ return coreModule;
+ }
+ catch (e) {
+ throw new Error(`Neither @ruvector/core nor ruvector is available: ${e.message}`);
+ }
+ }
+ }
+ /**
+ * Fast in-memory AgentDB implementation
+ */
+ class FastAgentDB {
+ /**
+ * Create a new FastAgentDB instance
+ *
+ * @param dimensions - Vector dimensions for state embeddings
+ * @param maxEpisodes - Maximum episodes to store (LRU eviction)
+ */
+ constructor(dimensions = 128, maxEpisodes = 100000) {
+ this.episodes = new Map();
+ this.trajectories = new Map();
+ this.vectorDb = null;
+ this.episodeOrder = []; // For LRU eviction
+ this.dimensions = dimensions;
+ this.maxEpisodes = maxEpisodes;
+ }
+ /**
+ * Initialize the vector database
+ */
+ async initVectorDb() {
+ if (this.vectorDb)
+ return;
+ try {
+ const core = getCoreModule();
+ this.vectorDb = new core.VectorDB({
+ dimensions: this.dimensions,
+ distanceMetric: 'Cosine',
+ });
+ }
+ catch (e) {
+ // Vector DB not available, use fallback similarity
+ console.warn(`VectorDB not available, using fallback similarity: ${e.message}`);
+ }
+ }
+ /**
+ * Store an episode
+ *
+ * @param episode - Episode to store
+ * @returns Episode ID
+ */
+ async storeEpisode(episode) {
+ await this.initVectorDb();
+ const id = episode.id ?? this.generateId();
+ const fullEpisode = {
+ ...episode,
+ id,
+ timestamp: episode.timestamp ?? Date.now(),
+ };
+ // LRU eviction if needed
+ if (this.episodes.size >= this.maxEpisodes) {
+ const oldestId = this.episodeOrder.shift();
+ if (oldestId) {
+ this.episodes.delete(oldestId);
+ }
+ }
+ this.episodes.set(id, fullEpisode);
+ this.episodeOrder.push(id);
+ // Index in vector DB if available
+ if (this.vectorDb && fullEpisode.state.length === this.dimensions) {
+ try {
+ await this.vectorDb.insert({
+ id,
+ vector: new Float32Array(fullEpisode.state),
+ });
+ }
+ catch {
+ // Ignore indexing errors
+ }
+ }
+ return id;
+ }
+ /**
+ * Store multiple episodes in batch
+ */
+ async storeEpisodes(episodes) {
+ const ids = [];
+ for (const episode of episodes) {
+ const id = await this.storeEpisode(episode);
+ ids.push(id);
+ }
+ return ids;
+ }
+ /**
+ * Retrieve an episode by ID
+ */
+ async getEpisode(id) {
+ const episode = this.episodes.get(id);
+ if (episode) {
+ // Update LRU order
+ const idx = this.episodeOrder.indexOf(id);
+ if (idx > -1) {
+ this.episodeOrder.splice(idx, 1);
+ this.episodeOrder.push(id);
+ }
+ }
+ return episode ?? null;
+ }
+ /**
+ * Search for similar episodes by state
+ *
+ * @param queryState - State vector to search for
+ * @param k - Number of results to return
+ * @returns Similar episodes sorted by similarity
+ */
+ async searchByState(queryState, k = 10) {
+ await this.initVectorDb();
+ const query = Array.isArray(queryState) ? queryState : Array.from(queryState);
+ // Use vector DB if available
+ if (this.vectorDb && query.length === this.dimensions) {
+ try {
+ const results = await this.vectorDb.search({
+ vector: new Float32Array(query),
+ k,
+ });
+ return results
+ .map((r) => {
+ const episode = this.episodes.get(r.id);
+ if (!episode)
+ return null;
+ return {
+ episode,
+ similarity: 1 - r.score, // Convert distance to similarity
+ };
+ })
+ .filter((r) => r !== null);
+ }
+ catch {
+ // Fall through to fallback
+ }
+ }
+ // Fallback: brute-force cosine similarity
+ return this.fallbackSearch(query, k);
+ }
+ /**
+ * Fallback similarity search using brute-force cosine similarity
+ */
+ fallbackSearch(query, k) {
+ const results = [];
+ for (const episode of this.episodes.values()) {
+ if (episode.state.length !== query.length)
+ continue;
+ const similarity = this.cosineSimilarity(query, episode.state);
+ results.push({ episode, similarity });
+ }
+ return results
+ .sort((a, b) => b.similarity - a.similarity)
+ .slice(0, k);
+ }
+ /**
+ * Compute cosine similarity between two vectors
+ */
+ cosineSimilarity(a, b) {
+ let dotProduct = 0;
+ let normA = 0;
+ let normB = 0;
+ for (let i = 0; i < a.length; i++) {
+ dotProduct += a[i] * b[i];
+ normA += a[i] * a[i];
+ normB += b[i] * b[i];
+ }
+ const denom = Math.sqrt(normA) * Math.sqrt(normB);
+ return denom === 0 ? 0 : dotProduct / denom;
+ }
+ /**
+ * Store a trajectory (sequence of episodes)
+ */
+ async storeTrajectory(episodes, metadata) {
+ const trajectoryId = this.generateId();
+ const storedEpisodes = [];
+ let totalReward = 0;
+ for (const episode of episodes) {
+ const id = await this.storeEpisode(episode);
+ const stored = await this.getEpisode(id);
+ if (stored) {
+ storedEpisodes.push(stored);
+ totalReward += stored.reward;
+ }
+ }
+ const trajectory = {
+ id: trajectoryId,
+ episodes: storedEpisodes,
+ totalReward,
+ metadata,
+ };
+ this.trajectories.set(trajectoryId, trajectory);
+ return trajectoryId;
+ }
+ /**
+ * Get a trajectory by ID
+ */
+ async getTrajectory(id) {
+ return this.trajectories.get(id) ?? null;
+ }
+ /**
+ * Get top trajectories by total reward
+ */
+ async getTopTrajectories(k = 10) {
+ return Array.from(this.trajectories.values())
+ .sort((a, b) => b.totalReward - a.totalReward)
+ .slice(0, k);
+ }
+ /**
+ * Sample random episodes (for experience replay)
+ */
+ async sampleEpisodes(n) {
+ const allEpisodes = Array.from(this.episodes.values());
+ const sampled = [];
+ for (let i = 0; i < Math.min(n, allEpisodes.length); i++) {
+ const idx = Math.floor(Math.random() * allEpisodes.length);
+ sampled.push(allEpisodes[idx]);
+ }
+ return sampled;
+ }
+ /**
+ * Get database statistics
+ */
+ getStats() {
+ return {
+ episodeCount: this.episodes.size,
+ trajectoryCount: this.trajectories.size,
+ dimensions: this.dimensions,
+ maxEpisodes: this.maxEpisodes,
+ vectorDbAvailable: this.vectorDb !== null,
+ };
+ }
+ /**
+ * Clear all data
+ */
+ clear() {
+ this.episodes.clear();
+ this.trajectories.clear();
+ this.episodeOrder = [];
+ }
+ /**
+ * Generate a unique ID
+ */
+ generateId() {
+ return `${Date.now()}-${Math.random().toString(36).substr(2, 9)}`;
+ }
+ }
+ exports.FastAgentDB = FastAgentDB;
+ /**
+ * Create a fast AgentDB instance
+ */
+ function createFastAgentDB(dimensions = 128, maxEpisodes = 100000) {
+ return new FastAgentDB(dimensions, maxEpisodes);
+ }
+ // Singleton instance for convenience
+ let defaultInstance = null;
+ /**
+ * Get the default FastAgentDB instance
+ */
+ function getDefaultAgentDB() {
+ if (!defaultInstance) {
+ defaultInstance = new FastAgentDB();
+ }
+ return defaultInstance;
+ }
+ exports.default = {
+ FastAgentDB,
+ createFastAgentDB,
+ getDefaultAgentDB,
+ };
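When `@ruvector/core` is unavailable, `searchByState` falls back to ranking every stored episode by brute-force cosine similarity. A minimal self-contained sketch of that fallback ranking (the helper names `rankBySimilarity` and the sample episodes are illustrative, not part of the module):

```javascript
// Brute-force cosine-similarity ranking, mirroring FastAgentDB.fallbackSearch.
function cosineSimilarity(a, b) {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  const denom = Math.sqrt(normA) * Math.sqrt(normB);
  return denom === 0 ? 0 : dot / denom;
}

// Keep only dimension-matched episodes, score, sort descending, take top k.
function rankBySimilarity(query, episodes, k) {
  return episodes
    .filter((e) => e.state.length === query.length)
    .map((e) => ({ episode: e, similarity: cosineSimilarity(query, e.state) }))
    .sort((a, b) => b.similarity - a.similarity)
    .slice(0, k);
}

const episodes = [
  { id: 'a', state: [1, 0], reward: 1 },
  { id: 'b', state: [0, 1], reward: 0 },
  { id: 'c', state: [1, 1], reward: 0.5 },
];
const top = rankBySimilarity([1, 0], episodes, 2);
console.log(top.map((r) => r.episode.id)); // [ 'a', 'c' ]
```

This is O(n · d) per query, which is why the module prefers the indexed VectorDB path when the dimensions match.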
dist/core/ast-parser.d.ts ADDED
@@ -0,0 +1,108 @@
+ /**
+ * AST Parser - Tree-sitter based code parsing
+ *
+ * Provides real AST parsing for accurate code analysis,
+ * replacing regex-based heuristics with proper parsing.
+ *
+ * Supports: TypeScript, JavaScript, Python, Rust, Go, Java, C/C++
+ */
+ export declare function isTreeSitterAvailable(): boolean;
+ export interface ASTNode {
+ type: string;
+ text: string;
+ startPosition: {
+ row: number;
+ column: number;
+ };
+ endPosition: {
+ row: number;
+ column: number;
+ };
+ children: ASTNode[];
+ parent?: string;
+ }
+ export interface FunctionInfo {
+ name: string;
+ params: string[];
+ returnType?: string;
+ async: boolean;
+ exported: boolean;
+ startLine: number;
+ endLine: number;
+ complexity: number;
+ calls: string[];
+ }
+ export interface ClassInfo {
+ name: string;
+ extends?: string;
+ implements: string[];
+ methods: FunctionInfo[];
+ properties: string[];
+ exported: boolean;
+ startLine: number;
+ endLine: number;
+ }
+ export interface ImportInfo {
+ source: string;
+ default?: string;
+ named: string[];
+ namespace?: string;
+ type: 'esm' | 'commonjs' | 'dynamic';
+ }
+ export interface ExportInfo {
+ name: string;
+ type: 'default' | 'named' | 'all';
+ source?: string;
+ }
+ export interface FileAnalysis {
+ file: string;
+ language: string;
+ imports: ImportInfo[];
+ exports: ExportInfo[];
+ functions: FunctionInfo[];
+ classes: ClassInfo[];
+ variables: string[];
+ types: string[];
+ complexity: number;
+ lines: number;
+ parseTime: number;
+ }
+ export declare class CodeParser {
+ private parser;
+ private initialized;
+ init(): Promise<boolean>;
+ /**
+ * Detect language from file extension
+ */
+ detectLanguage(file: string): string;
+ /**
+ * Parse a file and return the AST
+ */
+ parse(file: string, content?: string): Promise<ASTNode | null>;
+ private convertNode;
+ /**
+ * Analyze a file for functions, classes, imports, etc.
+ */
+ analyze(file: string, content?: string): Promise<FileAnalysis>;
+ private analyzeTree;
+ private parseImport;
+ private parseExport;
+ private parseFunction;
+ private parseClass;
+ private findChild;
+ private getIdentifierName;
+ private calculateComplexity;
+ private analyzeWithRegex;
+ /**
+ * Get all symbols (functions, classes, types) in a file
+ */
+ getSymbols(file: string): Promise<string[]>;
+ /**
+ * Get the call graph for a file
+ */
+ getCallGraph(file: string): Promise<Map<string, string[]>>;
+ }
+ export declare function getCodeParser(): CodeParser;
+ export declare function initCodeParser(): Promise<CodeParser>;
+ export default CodeParser;
+ //# sourceMappingURL=ast-parser.d.ts.map
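The `complexity` fields declared in `FunctionInfo` and `FileAnalysis` are computed by counting branching constructs, as in the implementation's `calculateComplexity`. A self-contained sketch of that counting (the sample `snippet` is illustrative):

```javascript
// Cyclomatic-style complexity: 1 plus one per branching keyword/operator,
// mirroring CodeParser.calculateComplexity's pattern list.
function calculateComplexity(code) {
  const patterns = [
    /\bif\b/g,
    /\belse\b/g,
    /\bfor\b/g,
    /\bwhile\b/g,
    /\bcase\b/g,
    /\bcatch\b/g,
    /\?\s*[^:]/g, // ternary
    /&&/g,
    /\|\|/g,
  ];
  let complexity = 1;
  for (const pattern of patterns) {
    complexity += (code.match(pattern) || []).length;
  }
  return complexity;
}

// Two ifs, one for, one && -> 1 + 2 + 1 + 1 = 5
const snippet = 'if (a && b) { for (const x of xs) { if (x) y(); } }';
console.log(calculateComplexity(snippet)); // 5
```

Note this is a textual approximation: it counts keywords even inside strings and comments, which is the accepted trade-off of the regex path.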
dist/core/ast-parser.d.ts.map ADDED
@@ -0,0 +1 @@
+ {"version":3,"file":"ast-parser.d.ts","sourceRoot":"","sources":["../../src/core/ast-parser.ts"],"names":[],"mappings":"AAAA;;;;;;;GAOG;AAgEH,wBAAgB,qBAAqB,IAAI,OAAO,CAO/C;AAMD,MAAM,WAAW,OAAO;IACtB,IAAI,EAAE,MAAM,CAAC;IACb,IAAI,EAAE,MAAM,CAAC;IACb,aAAa,EAAE;QAAE,GAAG,EAAE,MAAM,CAAC;QAAC,MAAM,EAAE,MAAM,CAAA;KAAE,CAAC;IAC/C,WAAW,EAAE;QAAE,GAAG,EAAE,MAAM,CAAC;QAAC,MAAM,EAAE,MAAM,CAAA;KAAE,CAAC;IAC7C,QAAQ,EAAE,OAAO,EAAE,CAAC;IACpB,MAAM,CAAC,EAAE,MAAM,CAAC;CACjB;AAED,MAAM,WAAW,YAAY;IAC3B,IAAI,EAAE,MAAM,CAAC;IACb,MAAM,EAAE,MAAM,EAAE,CAAC;IACjB,UAAU,CAAC,EAAE,MAAM,CAAC;IACpB,KAAK,EAAE,OAAO,CAAC;IACf,QAAQ,EAAE,OAAO,CAAC;IAClB,SAAS,EAAE,MAAM,CAAC;IAClB,OAAO,EAAE,MAAM,CAAC;IAChB,UAAU,EAAE,MAAM,CAAC;IACnB,KAAK,EAAE,MAAM,EAAE,CAAC;CACjB;AAED,MAAM,WAAW,SAAS;IACxB,IAAI,EAAE,MAAM,CAAC;IACb,OAAO,CAAC,EAAE,MAAM,CAAC;IACjB,UAAU,EAAE,MAAM,EAAE,CAAC;IACrB,OAAO,EAAE,YAAY,EAAE,CAAC;IACxB,UAAU,EAAE,MAAM,EAAE,CAAC;IACrB,QAAQ,EAAE,OAAO,CAAC;IAClB,SAAS,EAAE,MAAM,CAAC;IAClB,OAAO,EAAE,MAAM,CAAC;CACjB;AAED,MAAM,WAAW,UAAU;IACzB,MAAM,EAAE,MAAM,CAAC;IACf,OAAO,CAAC,EAAE,MAAM,CAAC;IACjB,KAAK,EAAE,MAAM,EAAE,CAAC;IAChB,SAAS,CAAC,EAAE,MAAM,CAAC;IACnB,IAAI,EAAE,KAAK,GAAG,UAAU,GAAG,SAAS,CAAC;CACtC;AAED,MAAM,WAAW,UAAU;IACzB,IAAI,EAAE,MAAM,CAAC;IACb,IAAI,EAAE,SAAS,GAAG,OAAO,GAAG,KAAK,CAAC;IAClC,MAAM,CAAC,EAAE,MAAM,CAAC;CACjB;AAED,MAAM,WAAW,YAAY;IAC3B,IAAI,EAAE,MAAM,CAAC;IACb,QAAQ,EAAE,MAAM,CAAC;IACjB,OAAO,EAAE,UAAU,EAAE,CAAC;IACtB,OAAO,EAAE,UAAU,EAAE,CAAC;IACtB,SAAS,EAAE,YAAY,EAAE,CAAC;IAC1B,OAAO,EAAE,SAAS,EAAE,CAAC;IACrB,SAAS,EAAE,MAAM,EAAE,CAAC;IACpB,KAAK,EAAE,MAAM,EAAE,CAAC;IAChB,UAAU,EAAE,MAAM,CAAC;IACnB,KAAK,EAAE,MAAM,CAAC;IACd,SAAS,EAAE,MAAM,CAAC;CACnB;AAMD,qBAAa,UAAU;IACrB,OAAO,CAAC,MAAM,CAAa;IAC3B,OAAO,CAAC,WAAW,CAAS;IAEtB,IAAI,IAAI,OAAO,CAAC,OAAO,CAAC;IAW9B;;OAEG;IACH,cAAc,CAAC,IAAI,EAAE,MAAM,GAAG,MAAM;IAyBpC;;OAEG;IACG,KAAK,CAAC,IAAI,EAAE,MAAM,EAAE,OAAO,CAAC,EAAE,MAAM,GAAG,OAAO,CAAC,OAAO,GAAG,IAAI,CAAC;IAkBpE,OAAO,CAAC,WAAW;IAUnB;;OAEG;IACG,OAAO,CAAC,IAAI,EAAE,MAAM,EAAE,OAAO,CAAC,EAAE,MAAM,GAAG
,OAAO,CAAC,YAAY,CAAC;IAmBpE,OAAO,CAAC,WAAW;IAuEnB,OAAO,CAAC,WAAW;IAmCnB,OAAO,CAAC,WAAW;IAenB,OAAO,CAAC,aAAa;IAsDrB,OAAO,CAAC,UAAU;IA8ClB,OAAO,CAAC,SAAS;IAUjB,OAAO,CAAC,iBAAiB;IAWzB,OAAO,CAAC,mBAAmB;IAoB3B,OAAO,CAAC,gBAAgB;IA8GxB;;OAEG;IACG,UAAU,CAAC,IAAI,EAAE,MAAM,GAAG,OAAO,CAAC,MAAM,EAAE,CAAC;IAUjD;;OAEG;IACG,YAAY,CAAC,IAAI,EAAE,MAAM,GAAG,OAAO,CAAC,GAAG,CAAC,MAAM,EAAE,MAAM,EAAE,CAAC,CAAC;CAgBjE;AAQD,wBAAgB,aAAa,IAAI,UAAU,CAK1C;AAED,wBAAsB,cAAc,IAAI,OAAO,CAAC,UAAU,CAAC,CAI1D;AAED,eAAe,UAAU,CAAC"}
dist/core/ast-parser.js ADDED
@@ -0,0 +1,602 @@
1
+ "use strict";
2
+ /**
3
+ * AST Parser - Tree-sitter based code parsing
4
+ *
5
+ * Provides real AST parsing for accurate code analysis,
6
+ * replacing regex-based heuristics with proper parsing.
7
+ *
8
+ * Supports: TypeScript, JavaScript, Python, Rust, Go, Java, C/C++
9
+ */
10
+ var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
11
+ if (k2 === undefined) k2 = k;
12
+ var desc = Object.getOwnPropertyDescriptor(m, k);
13
+ if (!desc || ("get" in desc ? !m.__esModule : desc.writable || desc.configurable)) {
14
+ desc = { enumerable: true, get: function() { return m[k]; } };
15
+ }
16
+ Object.defineProperty(o, k2, desc);
17
+ }) : (function(o, m, k, k2) {
18
+ if (k2 === undefined) k2 = k;
19
+ o[k2] = m[k];
20
+ }));
21
+ var __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {
22
+ Object.defineProperty(o, "default", { enumerable: true, value: v });
23
+ }) : function(o, v) {
24
+ o["default"] = v;
25
+ });
26
+ var __importStar = (this && this.__importStar) || (function () {
27
+ var ownKeys = function(o) {
28
+ ownKeys = Object.getOwnPropertyNames || function (o) {
29
+ var ar = [];
30
+ for (var k in o) if (Object.prototype.hasOwnProperty.call(o, k)) ar[ar.length] = k;
31
+ return ar;
32
+ };
33
+ return ownKeys(o);
34
+ };
35
+ return function (mod) {
36
+ if (mod && mod.__esModule) return mod;
37
+ var result = {};
38
+ if (mod != null) for (var k = ownKeys(mod), i = 0; i < k.length; i++) if (k[i] !== "default") __createBinding(result, mod, k[i]);
39
+ __setModuleDefault(result, mod);
40
+ return result;
41
+ };
42
+ })();
43
+ Object.defineProperty(exports, "__esModule", { value: true });
44
+ exports.CodeParser = void 0;
45
+ exports.isTreeSitterAvailable = isTreeSitterAvailable;
46
+ exports.getCodeParser = getCodeParser;
47
+ exports.initCodeParser = initCodeParser;
48
+ const fs = __importStar(require("fs"));
49
+ const path = __importStar(require("path"));
50
+ // Try to load tree-sitter
51
+ let Parser = null;
52
+ let languages = new Map();
53
+ let parserError = null;
54
+ async function loadTreeSitter() {
55
+ if (Parser)
56
+ return true;
57
+ if (parserError)
58
+ return false;
59
+ try {
60
+ // Dynamic require to avoid TypeScript errors
61
+ Parser = require('tree-sitter');
62
+ return true;
63
+ }
64
+ catch (e) {
65
+ parserError = new Error(`tree-sitter not installed: ${e.message}\n` +
66
+ `Install with: npm install tree-sitter tree-sitter-typescript tree-sitter-javascript tree-sitter-python`);
67
+ return false;
68
+ }
69
+ }
70
+ async function loadLanguage(lang) {
71
+ if (languages.has(lang))
72
+ return languages.get(lang);
73
+ const langPackages = {
74
+ typescript: 'tree-sitter-typescript',
75
+ javascript: 'tree-sitter-javascript',
76
+ python: 'tree-sitter-python',
77
+ rust: 'tree-sitter-rust',
78
+ go: 'tree-sitter-go',
79
+ java: 'tree-sitter-java',
80
+ c: 'tree-sitter-c',
81
+ cpp: 'tree-sitter-cpp',
82
+ ruby: 'tree-sitter-ruby',
83
+ php: 'tree-sitter-php',
84
+ };
85
+ const pkg = langPackages[lang];
86
+ if (!pkg)
87
+ return null;
88
+ try {
89
+ const langModule = await Promise.resolve(`${pkg}`).then(s => __importStar(require(s)));
90
+ const language = langModule.default || langModule;
91
+ // Handle TypeScript which exports tsx and typescript
92
+ if (lang === 'typescript' && language.typescript) {
93
+ languages.set(lang, language.typescript);
94
+ languages.set('tsx', language.tsx);
95
+ return language.typescript;
96
+ }
97
+ languages.set(lang, language);
98
+ return language;
99
+ }
100
+ catch {
101
+ return null;
102
+ }
103
+ }
104
+ function isTreeSitterAvailable() {
105
+ try {
106
+ require.resolve('tree-sitter');
107
+ return true;
108
+ }
109
+ catch {
110
+ return false;
111
+ }
112
+ }
113
+ // ============================================================================
114
+ // Parser
115
+ // ============================================================================
116
+ class CodeParser {
117
+ constructor() {
118
+ this.parser = null;
119
+ this.initialized = false;
120
+ }
121
+ async init() {
122
+ if (this.initialized)
123
+ return true;
124
+ const loaded = await loadTreeSitter();
125
+ if (!loaded)
126
+ return false;
127
+ this.parser = new Parser();
128
+ this.initialized = true;
129
+ return true;
130
+ }
131
+ /**
132
+ * Detect language from file extension
133
+ */
134
+ detectLanguage(file) {
135
+ const ext = path.extname(file).toLowerCase();
136
+ const langMap = {
137
+ '.ts': 'typescript',
138
+ '.tsx': 'tsx',
139
+ '.js': 'javascript',
140
+ '.jsx': 'javascript',
141
+ '.mjs': 'javascript',
142
+ '.cjs': 'javascript',
143
+ '.py': 'python',
144
+ '.rs': 'rust',
145
+ '.go': 'go',
146
+ '.java': 'java',
147
+ '.c': 'c',
148
+ '.h': 'c',
149
+ '.cpp': 'cpp',
150
+ '.cc': 'cpp',
151
+ '.cxx': 'cpp',
152
+ '.hpp': 'cpp',
153
+ '.rb': 'ruby',
154
+ '.php': 'php',
155
+ };
156
+ return langMap[ext] || 'unknown';
157
+ }
158
+ /**
159
+ * Parse a file and return the AST
160
+ */
161
+ async parse(file, content) {
162
+ if (!this.initialized) {
163
+ await this.init();
164
+ }
165
+ if (!this.parser)
166
+ return null;
167
+ const lang = this.detectLanguage(file);
168
+ const language = await loadLanguage(lang);
169
+ if (!language)
170
+ return null;
171
+ this.parser.setLanguage(language);
172
+ const code = content ?? (fs.existsSync(file) ? fs.readFileSync(file, 'utf8') : '');
173
+ const tree = this.parser.parse(code);
174
+ return this.convertNode(tree.rootNode);
175
+ }
176
+ convertNode(node) {
177
+ return {
178
+ type: node.type,
179
+ text: node.text,
180
+ startPosition: node.startPosition,
181
+ endPosition: node.endPosition,
182
+ children: node.children?.map((c) => this.convertNode(c)) || [],
183
+ };
184
+ }
185
+ /**
186
+ * Analyze a file for functions, classes, imports, etc.
187
+ */
188
+ async analyze(file, content) {
189
+ const start = performance.now();
190
+ const lang = this.detectLanguage(file);
191
+ const code = content ?? (fs.existsSync(file) ? fs.readFileSync(file, 'utf8') : '');
192
+ // Try tree-sitter first, fall back to regex
193
+ if (this.initialized && this.parser) {
194
+ const language = await loadLanguage(lang);
195
+ if (language) {
196
+ this.parser.setLanguage(language);
197
+ const tree = this.parser.parse(code);
198
+ return this.analyzeTree(file, lang, tree.rootNode, code, start);
199
+ }
200
+ }
201
+ // Regex fallback
202
+ return this.analyzeWithRegex(file, lang, code, start);
203
+ }
204
+ analyzeTree(file, lang, root, code, start) {
205
+ const imports = [];
206
+ const exports = [];
207
+ const functions = [];
208
+ const classes = [];
209
+ const variables = [];
210
+ const types = [];
211
+ const visit = (node) => {
212
+ // Imports
213
+ if (node.type === 'import_statement' || node.type === 'import_declaration') {
214
+ const imp = this.parseImport(node, lang);
215
+ if (imp)
216
+ imports.push(imp);
217
+ }
218
+ // Exports
219
+ if (node.type.includes('export')) {
220
+ const exp = this.parseExport(node, lang);
221
+ if (exp)
222
+ exports.push(exp);
223
+ }
224
+ // Functions
225
+ if (node.type.includes('function') || node.type === 'method_definition' || node.type === 'arrow_function') {
226
+ const fn = this.parseFunction(node, code, lang);
227
+ if (fn)
228
+ functions.push(fn);
229
+ }
230
+ // Classes
231
+ if (node.type === 'class_declaration' || node.type === 'class') {
232
+ const cls = this.parseClass(node, code, lang);
233
+ if (cls)
234
+ classes.push(cls);
235
+ }
236
+ // Variables
237
+ if (node.type === 'variable_declarator' || node.type === 'assignment') {
238
+ const name = this.getIdentifierName(node);
239
+ if (name)
240
+ variables.push(name);
241
+ }
242
+ // Type definitions
243
+ if (node.type === 'type_alias_declaration' || node.type === 'interface_declaration') {
244
+ const name = this.getIdentifierName(node);
245
+ if (name)
246
+ types.push(name);
247
+ }
248
+ // Recurse
249
+ for (const child of node.children || []) {
250
+ visit(child);
251
+ }
252
+ };
253
+ visit(root);
254
+ const lines = code.split('\n').length;
255
+ const complexity = this.calculateComplexity(code);
256
+ return {
257
+ file,
258
+ language: lang,
259
+ imports,
260
+ exports,
261
+ functions,
262
+ classes,
263
+ variables,
264
+ types,
265
+ complexity,
266
+ lines,
267
+ parseTime: performance.now() - start,
268
+ };
269
+ }
270
+ parseImport(node, lang) {
271
+ try {
272
+ const source = this.findChild(node, 'string')?.text?.replace(/['"]/g, '') || '';
273
+ const named = [];
274
+ let defaultImport;
275
+ let namespace;
276
+ // Find import specifiers
277
+ const specifiers = this.findChild(node, 'import_clause') || node;
278
+ for (const child of specifiers.children || []) {
279
+ if (child.type === 'identifier') {
280
+ defaultImport = child.text;
281
+ }
282
+ else if (child.type === 'namespace_import') {
283
+ namespace = this.getIdentifierName(child) || undefined;
284
+ }
285
+ else if (child.type === 'named_imports') {
286
+ for (const spec of child.children || []) {
287
+ if (spec.type === 'import_specifier') {
288
+ named.push(this.getIdentifierName(spec) || '');
289
+ }
290
+ }
291
+ }
292
+ }
293
+ return {
294
+ source,
295
+ default: defaultImport,
296
+ named: named.filter(Boolean),
297
+ namespace,
298
+ type: 'esm',
299
+ };
300
+ }
301
+ catch {
302
+ return null;
303
+ }
304
+ }
305
+ parseExport(node, lang) {
306
+ try {
307
+ if (node.type === 'export_statement') {
308
+ const declaration = this.findChild(node, 'declaration');
309
+ if (declaration) {
310
+ const name = this.getIdentifierName(declaration);
311
+ return { name: name || 'default', type: node.text.includes('default') ? 'default' : 'named' };
312
+ }
313
+ }
314
+ return null;
315
+ }
316
+ catch {
317
+ return null;
318
+ }
319
+ }
320
+ parseFunction(node, code, lang) {
321
+ try {
322
+ const name = this.getIdentifierName(node) || '<anonymous>';
323
+ const params = [];
324
+ let returnType;
325
+ const isAsync = node.text.includes('async');
326
+ const isExported = node.parent?.type?.includes('export');
327
+ // Get parameters
328
+ const paramsNode = this.findChild(node, 'formal_parameters') || this.findChild(node, 'parameters');
329
+ if (paramsNode) {
330
+ for (const param of paramsNode.children || []) {
331
+ if (param.type === 'identifier' || param.type === 'required_parameter') {
332
+ params.push(this.getIdentifierName(param) || '');
333
+ }
334
+ }
335
+ }
336
+ // Get return type
337
+ const returnNode = this.findChild(node, 'type_annotation');
338
+ if (returnNode) {
339
+ returnType = returnNode.text.replace(/^:\s*/, '');
340
+ }
341
+ // Calculate complexity
342
+ const bodyText = this.findChild(node, 'statement_block')?.text || '';
343
+ const complexity = this.calculateComplexity(bodyText);
344
+ // Find function calls
345
+ const calls = [];
346
+ const callRegex = /(\w+)\s*\(/g;
347
+ let match;
348
+ while ((match = callRegex.exec(bodyText)) !== null) {
349
+ if (!['if', 'for', 'while', 'switch', 'catch', 'function'].includes(match[1])) {
350
+ calls.push(match[1]);
351
+ }
352
+ }
353
+ return {
354
+ name,
355
+ params: params.filter(Boolean),
356
+ returnType,
357
+ async: isAsync,
358
+ exported: isExported,
359
+ startLine: node.startPosition.row + 1,
360
+ endLine: node.endPosition.row + 1,
361
+ complexity,
362
+ calls: [...new Set(calls)],
363
+ };
364
+ }
365
+ catch {
366
+ return null;
367
+ }
368
+ }
369
+ parseClass(node, code, lang) {
370
+ try {
371
+ const name = this.getIdentifierName(node) || '<anonymous>';
372
+ let extendsClass;
373
+ const implementsList = [];
374
+ const methods = [];
375
+ const properties = [];
376
+ // Get extends/implements
377
+ const heritage = this.findChild(node, 'class_heritage');
378
+ if (heritage) {
379
+ const extendsNode = this.findChild(heritage, 'extends_clause');
380
+ if (extendsNode) {
381
+ extendsClass = this.getIdentifierName(extendsNode) || undefined;
382
+ }
383
+ }
384
+ // Get methods and properties
385
+ const body = this.findChild(node, 'class_body');
386
+ if (body) {
387
+ for (const member of body.children || []) {
388
+ if (member.type === 'method_definition') {
389
+ const method = this.parseFunction(member, code, lang);
390
+ if (method)
391
+ methods.push(method);
392
+ }
393
+ else if (member.type === 'field_definition' || member.type === 'public_field_definition') {
394
+ const propName = this.getIdentifierName(member);
395
+ if (propName)
396
+ properties.push(propName);
397
+ }
398
+ }
399
+ }
400
+ return {
401
+ name,
402
+ extends: extendsClass,
403
+ implements: implementsList,
404
+ methods,
405
+ properties,
406
+ exported: node.parent?.type?.includes('export'),
407
+ startLine: node.startPosition.row + 1,
408
+ endLine: node.endPosition.row + 1,
409
+ };
410
+ }
411
+ catch {
412
+ return null;
413
+ }
414
+ }
415
+ findChild(node, type) {
416
+ if (!node.children)
417
+ return null;
418
+ for (const child of node.children) {
419
+ if (child.type === type)
420
+ return child;
421
+ const found = this.findChild(child, type);
422
+ if (found)
423
+ return found;
424
+ }
425
+ return null;
426
+ }
427
+ getIdentifierName(node) {
428
+ if (node.type === 'identifier')
429
+ return node.text;
430
+ if (!node.children)
431
+ return null;
432
+ for (const child of node.children) {
433
+ if (child.type === 'identifier' || child.type === 'property_identifier') {
434
+ return child.text;
435
+ }
436
+ }
437
+ return null;
438
+ }
439
+ calculateComplexity(code) {
440
+ const patterns = [
441
+ /\bif\b/g,
442
+ /\belse\b/g,
443
+ /\bfor\b/g,
444
+ /\bwhile\b/g,
445
+ /\bcase\b/g,
446
+ /\bcatch\b/g,
447
+ /\?\s*[^:]/g, // ternary
448
+ /&&/g,
449
+ /\|\|/g,
450
+ ];
451
+ let complexity = 1;
452
+ for (const pattern of patterns) {
453
+ complexity += (code.match(pattern) || []).length;
454
+ }
455
+ return complexity;
456
+ }
457
+ analyzeWithRegex(file, lang, code, start) {
458
+ const lines = code.split('\n');
459
+ const imports = [];
460
+ const exports = [];
461
+ const functions = [];
462
+ const classes = [];
463
+ const variables = [];
464
+ const types = [];
465
+ // Regex patterns
466
+ const importRegex = /import\s+(?:(\w+)\s*,?\s*)?(?:\{([^}]+)\}\s*)?(?:\*\s+as\s+(\w+)\s*)?from\s+['"]([^'"]+)['"]/g;
467
+ const requireRegex = /(?:const|let|var)\s+(?:(\w+)|\{([^}]+)\})\s*=\s*require\s*\(['"]([^'"]+)['"]\)/g;
468
+ const exportRegex = /export\s+(?:(default)\s+)?(?:(class|function|const|let|var|interface|type)\s+)?(\w+)?/g;
469
+ const functionRegex = /(?:export\s+)?(?:async\s+)?function\s+(\w+)\s*\(([^)]*)\)/g;
470
+ const arrowRegex = /(?:const|let|var)\s+(\w+)\s*=\s*(?:async\s*)?\([^)]*\)\s*=>/g;
471
+ const classRegex = /(?:export\s+)?class\s+(\w+)(?:\s+extends\s+(\w+))?/g;
472
+ const typeRegex = /(?:export\s+)?(?:type|interface)\s+(\w+)/g;
473
+ // Parse imports
474
+ let match;
475
+ while ((match = importRegex.exec(code)) !== null) {
476
+ imports.push({
477
+ source: match[4],
478
+ default: match[1],
479
+                named: match[2] ? match[2].split(',').map(s => s.trim().split(/\s+as\s+/)[0]) : [],
+                namespace: match[3],
+                type: 'esm',
+            });
+        }
+        while ((match = requireRegex.exec(code)) !== null) {
+            imports.push({
+                source: match[3],
+                default: match[1],
+                named: match[2] ? match[2].split(',').map(s => s.trim()) : [],
+                type: 'commonjs',
+            });
+        }
+        // Parse exports
+        while ((match = exportRegex.exec(code)) !== null) {
+            if (match[3]) {
+                exports.push({
+                    name: match[3],
+                    type: match[1] === 'default' ? 'default' : 'named',
+                });
+            }
+        }
+        // Parse functions
+        while ((match = functionRegex.exec(code)) !== null) {
+            functions.push({
+                name: match[1],
+                params: match[2].split(',').map(p => p.trim().split(/[:\s]/)[0]).filter(Boolean),
+                async: code.substring(match.index - 10, match.index).includes('async'),
+                exported: code.substring(match.index - 10, match.index).includes('export'),
+                startLine: code.substring(0, match.index).split('\n').length,
+                endLine: 0,
+                complexity: 1,
+                calls: [],
+            });
+        }
+        while ((match = arrowRegex.exec(code)) !== null) {
+            functions.push({
+                name: match[1],
+                params: [],
+                async: code.substring(match.index, match.index + 50).includes('async'),
+                exported: false,
+                startLine: code.substring(0, match.index).split('\n').length,
+                endLine: 0,
+                complexity: 1,
+                calls: [],
+            });
+        }
+        // Parse classes
+        while ((match = classRegex.exec(code)) !== null) {
+            classes.push({
+                name: match[1],
+                extends: match[2],
+                implements: [],
+                methods: [],
+                properties: [],
+                exported: code.substring(match.index - 10, match.index).includes('export'),
+                startLine: code.substring(0, match.index).split('\n').length,
+                endLine: 0,
+            });
+        }
+        // Parse types
+        while ((match = typeRegex.exec(code)) !== null) {
+            types.push(match[1]);
+        }
+        return {
+            file,
+            language: lang,
+            imports,
+            exports,
+            functions,
+            classes,
+            variables,
+            types,
+            complexity: this.calculateComplexity(code),
+            lines: lines.length,
+            parseTime: performance.now() - start,
+        };
+    }
+    /**
+     * Get all symbols (functions, classes, types) in a file
+     */
+    async getSymbols(file) {
+        const analysis = await this.analyze(file);
+        return [
+            ...analysis.functions.map(f => f.name),
+            ...analysis.classes.map(c => c.name),
+            ...analysis.types,
+            ...analysis.variables,
+        ];
+    }
+    /**
+     * Get the call graph for a file
+     */
+    async getCallGraph(file) {
+        const analysis = await this.analyze(file);
+        const graph = new Map();
+        for (const fn of analysis.functions) {
+            graph.set(fn.name, fn.calls);
+        }
+        for (const cls of analysis.classes) {
+            for (const method of cls.methods) {
+                graph.set(`${cls.name}.${method.name}`, method.calls);
+            }
+        }
+        return graph;
+    }
+}
+exports.CodeParser = CodeParser;
+// ============================================================================
+// Singleton
+// ============================================================================
+let parserInstance = null;
+function getCodeParser() {
+    if (!parserInstance) {
+        parserInstance = new CodeParser();
+    }
+    return parserInstance;
+}
+async function initCodeParser() {
+    const parser = getCodeParser();
+    await parser.init();
+    return parser;
+}
+exports.default = CodeParser;
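The lazy-singleton pattern behind `getCodeParser()` and `initCodeParser()` above can be sketched in isolation. The `FakeParser` class and function names below are illustrative stand-ins, not part of the package:

```javascript
// Minimal sketch of the lazy-singleton pattern used by getCodeParser()/initCodeParser().
// FakeParser stands in for CodeParser; it is illustrative only.
class FakeParser {
  constructor() {
    this.initialized = false;
  }
  async init() {
    this.initialized = true;
    return this;
  }
}

let instance = null;

// First call constructs the instance; later calls return the same object.
function getParser() {
  if (!instance) {
    instance = new FakeParser();
  }
  return instance;
}

// Convenience wrapper: fetch the singleton and run its async init().
async function initParser() {
  const parser = getParser();
  await parser.init();
  return parser;
}
```

Because every caller shares one instance, `init()` only needs to complete once, no matter how many modules request the parser.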
dist/core/attention-fallbacks.d.ts ADDED
@@ -0,0 +1,321 @@
+/**
+ * Attention Fallbacks - Safe wrapper around @ruvector/attention with automatic array conversion
+ *
+ * This wrapper handles the array type conversion automatically, allowing users
+ * to pass either regular arrays or Float32Arrays.
+ *
+ * @ruvector/attention requires Float32Array inputs.
+ * This wrapper handles the conversion automatically.
+ */
+/**
+ * Attention output interface
+ */
+export interface AttentionOutput {
+    /** Output vector as regular array */
+    values: number[];
+    /** Output as Float32Array for performance-critical code */
+    raw: Float32Array;
+}
+/**
+ * Multi-head attention mechanism
+ *
+ * This wrapper automatically converts array inputs to Float32Array.
+ */
+export declare class MultiHeadAttention {
+    private inner;
+    readonly dim: number;
+    readonly numHeads: number;
+    /**
+     * Create a new multi-head attention instance
+     *
+     * @param dim - Embedding dimension (must be divisible by numHeads)
+     * @param numHeads - Number of attention heads
+     */
+    constructor(dim: number, numHeads: number);
+    /**
+     * Compute multi-head attention
+     *
+     * @param query - Query vector
+     * @param keys - Array of key vectors
+     * @param values - Array of value vectors
+     * @returns Attention output
+     *
+     * @example
+     * ```typescript
+     * const mha = new MultiHeadAttention(64, 4);
+     *
+     * // Works with regular arrays
+     * const result1 = mha.compute([...64 values], [[...64], [...64]], [[...64], [...64]]);
+     *
+     * // Also works with Float32Array
+     * const q = new Float32Array(64);
+     * const k = [new Float32Array(64)];
+     * const v = [new Float32Array(64)];
+     * const result2 = mha.compute(q, k, v);
+     * ```
+     */
+    compute(query: number[] | Float32Array, keys: (number[] | Float32Array)[], values: (number[] | Float32Array)[]): AttentionOutput;
+    /**
+     * Compute and return raw Float32Array (faster, no conversion)
+     */
+    computeRaw(query: Float32Array, keys: Float32Array[], values: Float32Array[]): Float32Array;
+    get headDim(): number;
+}
+/**
+ * Flash attention with tiled computation
+ */
+export declare class FlashAttention {
+    private inner;
+    readonly dim: number;
+    readonly blockSize: number;
+    /**
+     * Create a new flash attention instance
+     *
+     * @param dim - Embedding dimension
+     * @param blockSize - Block size for tiled computation (default: 512)
+     */
+    constructor(dim: number, blockSize?: number);
+    /**
+     * Compute flash attention
+     */
+    compute(query: number[] | Float32Array, keys: (number[] | Float32Array)[], values: (number[] | Float32Array)[]): AttentionOutput;
+    computeRaw(query: Float32Array, keys: Float32Array[], values: Float32Array[]): Float32Array;
+}
+/**
+ * Hyperbolic attention in Poincare ball model
+ */
+export declare class HyperbolicAttention {
+    private inner;
+    readonly dim: number;
+    readonly curvature: number;
+    /**
+     * Create a new hyperbolic attention instance
+     *
+     * @param dim - Embedding dimension
+     * @param curvature - Hyperbolic curvature (typically 1.0)
+     */
+    constructor(dim: number, curvature?: number);
+    /**
+     * Compute hyperbolic attention
+     */
+    compute(query: number[] | Float32Array, keys: (number[] | Float32Array)[], values: (number[] | Float32Array)[]): AttentionOutput;
+    computeRaw(query: Float32Array, keys: Float32Array[], values: Float32Array[]): Float32Array;
+}
+/**
+ * Linear attention (Performer-style) with O(n) complexity
+ */
+export declare class LinearAttention {
+    private inner;
+    readonly dim: number;
+    readonly numFeatures: number;
+    /**
+     * Create a new linear attention instance
+     *
+     * @param dim - Embedding dimension
+     * @param numFeatures - Number of random features
+     */
+    constructor(dim: number, numFeatures: number);
+    /**
+     * Compute linear attention
+     */
+    compute(query: number[] | Float32Array, keys: (number[] | Float32Array)[], values: (number[] | Float32Array)[]): AttentionOutput;
+    computeRaw(query: Float32Array, keys: Float32Array[], values: Float32Array[]): Float32Array;
+}
+/**
+ * Local-global attention (Longformer-style)
+ */
+export declare class LocalGlobalAttention {
+    private inner;
+    readonly dim: number;
+    readonly localWindow: number;
+    readonly globalTokens: number;
+    /**
+     * Create a new local-global attention instance
+     *
+     * @param dim - Embedding dimension
+     * @param localWindow - Size of local attention window
+     * @param globalTokens - Number of global attention tokens
+     */
+    constructor(dim: number, localWindow: number, globalTokens: number);
+    /**
+     * Compute local-global attention
+     */
+    compute(query: number[] | Float32Array, keys: (number[] | Float32Array)[], values: (number[] | Float32Array)[]): AttentionOutput;
+    computeRaw(query: Float32Array, keys: Float32Array[], values: Float32Array[]): Float32Array;
+}
+/**
+ * MoE configuration
+ */
+export interface MoEConfig {
+    dim: number;
+    numExperts: number;
+    topK: number;
+    expertCapacity?: number;
+}
+/**
+ * Mixture of Experts attention
+ */
+export declare class MoEAttention {
+    private inner;
+    readonly config: MoEConfig;
+    /**
+     * Create a new MoE attention instance
+     *
+     * @param config - MoE configuration
+     */
+    constructor(config: MoEConfig);
+    /**
+     * Create with simple parameters
+     */
+    static simple(dim: number, numExperts: number, topK: number): MoEAttention;
+    /**
+     * Compute MoE attention
+     */
+    compute(query: number[] | Float32Array, keys: (number[] | Float32Array)[], values: (number[] | Float32Array)[]): AttentionOutput;
+    computeRaw(query: Float32Array, keys: Float32Array[], values: Float32Array[]): Float32Array;
+}
+/**
+ * Project a vector into the Poincare ball
+ */
+export declare function projectToPoincareBall(vector: number[] | Float32Array, curvature?: number): number[];
+/**
+ * Compute hyperbolic (Poincare) distance between two points
+ */
+export declare function poincareDistance(a: number[] | Float32Array, b: number[] | Float32Array, curvature?: number): number;
+/**
+ * Mobius addition in hyperbolic space
+ */
+export declare function mobiusAddition(a: number[] | Float32Array, b: number[] | Float32Array, curvature?: number): number[];
+/**
+ * Exponential map from tangent space to hyperbolic space
+ */
+export declare function expMap(base: number[] | Float32Array, tangent: number[] | Float32Array, curvature?: number): number[];
+/**
+ * Logarithmic map from hyperbolic space to tangent space
+ */
+export declare function logMap(base: number[] | Float32Array, point: number[] | Float32Array, curvature?: number): number[];
+/**
+ * Check if attention module is available
+ */
+export declare function isAttentionAvailable(): boolean;
+/**
+ * Get attention module version
+ */
+export declare function getAttentionVersion(): string | null;
+/**
+ * Graph attention with Rotary Position Embeddings
+ * Excellent for code AST and dependency graphs
+ */
+export declare class GraphRoPeAttention {
+    private inner;
+    readonly dim: number;
+    readonly numHeads: number;
+    readonly maxSeqLen: number;
+    constructor(dim: number, numHeads?: number, maxSeqLen?: number);
+    compute(query: number[] | Float32Array, keys: (number[] | Float32Array)[], values: (number[] | Float32Array)[], positions?: number[]): AttentionOutput;
+}
+/**
+ * Edge-featured attention for graphs with edge attributes
+ * Useful for weighted dependency graphs
+ */
+export declare class EdgeFeaturedAttention {
+    private inner;
+    readonly dim: number;
+    readonly edgeDim: number;
+    constructor(dim: number, edgeDim?: number);
+    compute(query: number[] | Float32Array, keys: (number[] | Float32Array)[], values: (number[] | Float32Array)[], edgeFeatures?: (number[] | Float32Array)[]): AttentionOutput;
+}
+/**
+ * Dual-space attention (Euclidean + Hyperbolic)
+ * Best of both worlds for hierarchical + semantic similarity
+ */
+export declare class DualSpaceAttention {
+    private inner;
+    readonly dim: number;
+    readonly curvature: number;
+    readonly alpha: number;
+    constructor(dim: number, curvature?: number, alpha?: number);
+    compute(query: number[] | Float32Array, keys: (number[] | Float32Array)[], values: (number[] | Float32Array)[]): AttentionOutput;
+}
+/**
+ * Basic dot-product attention
+ */
+export declare class DotProductAttention {
+    private inner;
+    readonly dim: number;
+    constructor(dim: number);
+    compute(query: number[] | Float32Array, keys: (number[] | Float32Array)[], values: (number[] | Float32Array)[]): AttentionOutput;
+}
+/**
+ * Compute attention in parallel across multiple queries
+ */
+export declare function parallelAttentionCompute(queries: (number[] | Float32Array)[], keys: (number[] | Float32Array)[], values: (number[] | Float32Array)[], attentionType?: 'dot' | 'multi-head' | 'flash' | 'hyperbolic' | 'linear'): Promise<number[][]>;
+/**
+ * Batch attention compute for multiple query-key-value sets
+ */
+export declare function batchAttentionCompute(batches: Array<{
+    query: number[] | Float32Array;
+    keys: (number[] | Float32Array)[];
+    values: (number[] | Float32Array)[];
+}>, attentionType?: 'dot' | 'multi-head' | 'flash' | 'hyperbolic' | 'linear'): Promise<number[][]>;
+/**
+ * Async flash attention with callback
+ */
+export declare function computeFlashAttentionAsync(query: number[] | Float32Array, keys: (number[] | Float32Array)[], values: (number[] | Float32Array)[]): Promise<number[]>;
+/**
+ * Async hyperbolic attention
+ */
+export declare function computeHyperbolicAttentionAsync(query: number[] | Float32Array, keys: (number[] | Float32Array)[], values: (number[] | Float32Array)[], curvature?: number): Promise<number[]>;
+/**
+ * Adam optimizer for attention training
+ */
+export declare class AdamOptimizer {
+    private inner;
+    constructor(learningRate?: number, beta1?: number, beta2?: number);
+    step(gradients: number[] | Float32Array, params: number[] | Float32Array): number[];
+}
+/**
+ * InfoNCE contrastive loss
+ */
+export declare function infoNceLoss(anchor: number[] | Float32Array, positive: number[] | Float32Array, negatives: (number[] | Float32Array)[], temperature?: number): number;
+/**
+ * Hard negative mining for contrastive learning
+ */
+export declare function mineHardNegatives(anchor: number[] | Float32Array, candidates: (number[] | Float32Array)[], topK?: number): number[][];
+/**
+ * Benchmark attention implementations
+ */
+export declare function benchmarkAttention(dim: number, seqLen: number, iterations?: number): Promise<Record<string, {
+    avgMs: number;
+    minMs: number;
+    maxMs: number;
+}>>;
+declare const _default: {
+    DotProductAttention: typeof DotProductAttention;
+    MultiHeadAttention: typeof MultiHeadAttention;
+    FlashAttention: typeof FlashAttention;
+    HyperbolicAttention: typeof HyperbolicAttention;
+    LinearAttention: typeof LinearAttention;
+    LocalGlobalAttention: typeof LocalGlobalAttention;
+    MoEAttention: typeof MoEAttention;
+    GraphRoPeAttention: typeof GraphRoPeAttention;
+    EdgeFeaturedAttention: typeof EdgeFeaturedAttention;
+    DualSpaceAttention: typeof DualSpaceAttention;
+    parallelAttentionCompute: typeof parallelAttentionCompute;
+    batchAttentionCompute: typeof batchAttentionCompute;
+    computeFlashAttentionAsync: typeof computeFlashAttentionAsync;
+    computeHyperbolicAttentionAsync: typeof computeHyperbolicAttentionAsync;
+    AdamOptimizer: typeof AdamOptimizer;
+    infoNceLoss: typeof infoNceLoss;
+    mineHardNegatives: typeof mineHardNegatives;
+    projectToPoincareBall: typeof projectToPoincareBall;
+    poincareDistance: typeof poincareDistance;
+    mobiusAddition: typeof mobiusAddition;
+    expMap: typeof expMap;
+    logMap: typeof logMap;
+    isAttentionAvailable: typeof isAttentionAvailable;
+    getAttentionVersion: typeof getAttentionVersion;
+    benchmarkAttention: typeof benchmarkAttention;
+};
+export default _default;
+//# sourceMappingURL=attention-fallbacks.d.ts.map
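The `AttentionOutput` interface above deliberately pairs a plain `number[]` with the underlying `Float32Array`, so callers can pick convenience or zero-copy speed. A standalone sketch of producing that shape (the `wrapOutput` helper is hypothetical, not part of the package):

```javascript
// Hypothetical helper that wraps a Float32Array result in the
// { values: number[], raw: Float32Array } shape declared by AttentionOutput.
function wrapOutput(raw) {
  return {
    values: Array.from(raw), // plain array for easy inspection/serialization
    raw,                     // the original typed array, no copy, for hot paths
  };
}

const out = wrapOutput(new Float32Array([0.25, 0.5]));
```

The `raw` field shares memory with the original typed array, so performance-critical callers avoid the conversion cost that `values` pays.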
dist/core/attention-fallbacks.d.ts.map ADDED
@@ -0,0 +1 @@
+ {"version":3,"file":"attention-fallbacks.d.ts","sourceRoot":"","sources":["../../src/core/attention-fallbacks.ts"],"names":[],"mappings":"AAAA;;;;;;;;GAQG;AA8CH;;GAEG;AACH,MAAM,WAAW,eAAe;IAC9B,qCAAqC;IACrC,MAAM,EAAE,MAAM,EAAE,CAAC;IACjB,2DAA2D;IAC3D,GAAG,EAAE,YAAY,CAAC;CACnB;AAED;;;;GAIG;AACH,qBAAa,kBAAkB;IAC7B,OAAO,CAAC,KAAK,CAAM;IACnB,SAAgB,GAAG,EAAE,MAAM,CAAC;IAC5B,SAAgB,QAAQ,EAAE,MAAM,CAAC;IAEjC;;;;;OAKG;gBACS,GAAG,EAAE,MAAM,EAAE,QAAQ,EAAE,MAAM;IAOzC;;;;;;;;;;;;;;;;;;;;;OAqBG;IACH,OAAO,CACL,KAAK,EAAE,MAAM,EAAE,GAAG,YAAY,EAC9B,IAAI,EAAE,CAAC,MAAM,EAAE,GAAG,YAAY,CAAC,EAAE,EACjC,MAAM,EAAE,CAAC,MAAM,EAAE,GAAG,YAAY,CAAC,EAAE,GAClC,eAAe;IAYlB;;OAEG;IACH,UAAU,CACR,KAAK,EAAE,YAAY,EACnB,IAAI,EAAE,YAAY,EAAE,EACpB,MAAM,EAAE,YAAY,EAAE,GACrB,YAAY;IAIf,IAAI,OAAO,IAAI,MAAM,CAEpB;CACF;AAED;;GAEG;AACH,qBAAa,cAAc;IACzB,OAAO,CAAC,KAAK,CAAM;IACnB,SAAgB,GAAG,EAAE,MAAM,CAAC;IAC5B,SAAgB,SAAS,EAAE,MAAM,CAAC;IAElC;;;;;OAKG;gBACS,GAAG,EAAE,MAAM,EAAE,SAAS,GAAE,MAAY;IAOhD;;OAEG;IACH,OAAO,CACL,KAAK,EAAE,MAAM,EAAE,GAAG,YAAY,EAC9B,IAAI,EAAE,CAAC,MAAM,EAAE,GAAG,YAAY,CAAC,EAAE,EACjC,MAAM,EAAE,CAAC,MAAM,EAAE,GAAG,YAAY,CAAC,EAAE,GAClC,eAAe;IAYlB,UAAU,CACR,KAAK,EAAE,YAAY,EACnB,IAAI,EAAE,YAAY,EAAE,EACpB,MAAM,EAAE,YAAY,EAAE,GACrB,YAAY;CAGhB;AAED;;GAEG;AACH,qBAAa,mBAAmB;IAC9B,OAAO,CAAC,KAAK,CAAM;IACnB,SAAgB,GAAG,EAAE,MAAM,CAAC;IAC5B,SAAgB,SAAS,EAAE,MAAM,CAAC;IAElC;;;;;OAKG;gBACS,GAAG,EAAE,MAAM,EAAE,SAAS,GAAE,MAAY;IAOhD;;OAEG;IACH,OAAO,CACL,KAAK,EAAE,MAAM,EAAE,GAAG,YAAY,EAC9B,IAAI,EAAE,CAAC,MAAM,EAAE,GAAG,YAAY,CAAC,EAAE,EACjC,MAAM,EAAE,CAAC,MAAM,EAAE,GAAG,YAAY,CAAC,EAAE,GAClC,eAAe;IAYlB,UAAU,CACR,KAAK,EAAE,YAAY,EACnB,IAAI,EAAE,YAAY,EAAE,EACpB,MAAM,EAAE,YAAY,EAAE,GACrB,YAAY;CAGhB;AAED;;GAEG;AACH,qBAAa,eAAe;IAC1B,OAAO,CAAC,KAAK,CAAM;IACnB,SAAgB,GAAG,EAAE,MAAM,CAAC;IAC5B,SAAgB,WAAW,EAAE,MAAM,CAAC;IAEpC;;;;;OAKG;gBACS,GAAG,EAAE,MAAM,EAAE,WAAW,EAAE,MAAM;IAO5C;;OAEG;IACH,OAAO,CACL,KAAK,EAAE,MAAM,EAAE,GAAG,YAAY,EAC9B,IAAI,EAAE,CAAC,MAAM,EAAE,GAAG,YAAY,CAAC,EAAE,EACjC,MAAM,EAAE,CAAC,MAAM,EAAE,GAAG,YAAY,CAAC
,EAAE,GAClC,eAAe;IAYlB,UAAU,CACR,KAAK,EAAE,YAAY,EACnB,IAAI,EAAE,YAAY,EAAE,EACpB,MAAM,EAAE,YAAY,EAAE,GACrB,YAAY;CAGhB;AAED;;GAEG;AACH,qBAAa,oBAAoB;IAC/B,OAAO,CAAC,KAAK,CAAM;IACnB,SAAgB,GAAG,EAAE,MAAM,CAAC;IAC5B,SAAgB,WAAW,EAAE,MAAM,CAAC;IACpC,SAAgB,YAAY,EAAE,MAAM,CAAC;IAErC;;;;;;OAMG;gBACS,GAAG,EAAE,MAAM,EAAE,WAAW,EAAE,MAAM,EAAE,YAAY,EAAE,MAAM;IAQlE;;OAEG;IACH,OAAO,CACL,KAAK,EAAE,MAAM,EAAE,GAAG,YAAY,EAC9B,IAAI,EAAE,CAAC,MAAM,EAAE,GAAG,YAAY,CAAC,EAAE,EACjC,MAAM,EAAE,CAAC,MAAM,EAAE,GAAG,YAAY,CAAC,EAAE,GAClC,eAAe;IAYlB,UAAU,CACR,KAAK,EAAE,YAAY,EACnB,IAAI,EAAE,YAAY,EAAE,EACpB,MAAM,EAAE,YAAY,EAAE,GACrB,YAAY;CAGhB;AAED;;GAEG;AACH,MAAM,WAAW,SAAS;IACxB,GAAG,EAAE,MAAM,CAAC;IACZ,UAAU,EAAE,MAAM,CAAC;IACnB,IAAI,EAAE,MAAM,CAAC;IACb,cAAc,CAAC,EAAE,MAAM,CAAC;CACzB;AAED;;GAEG;AACH,qBAAa,YAAY;IACvB,OAAO,CAAC,KAAK,CAAM;IACnB,SAAgB,MAAM,EAAE,SAAS,CAAC;IAElC;;;;OAIG;gBACS,MAAM,EAAE,SAAS;IAW7B;;OAEG;IACH,MAAM,CAAC,MAAM,CAAC,GAAG,EAAE,MAAM,EAAE,UAAU,EAAE,MAAM,EAAE,IAAI,EAAE,MAAM,GAAG,YAAY;IAI1E;;OAEG;IACH,OAAO,CACL,KAAK,EAAE,MAAM,EAAE,GAAG,YAAY,EAC9B,IAAI,EAAE,CAAC,MAAM,EAAE,GAAG,YAAY,CAAC,EAAE,EACjC,MAAM,EAAE,CAAC,MAAM,EAAE,GAAG,YAAY,CAAC,EAAE,GAClC,eAAe;IAYlB,UAAU,CACR,KAAK,EAAE,YAAY,EACnB,IAAI,EAAE,YAAY,EAAE,EACpB,MAAM,EAAE,YAAY,EAAE,GACrB,YAAY;CAGhB;AAID;;GAEG;AACH,wBAAgB,qBAAqB,CACnC,MAAM,EAAE,MAAM,EAAE,GAAG,YAAY,EAC/B,SAAS,GAAE,MAAY,GACtB,MAAM,EAAE,CAIV;AAED;;GAEG;AACH,wBAAgB,gBAAgB,CAC9B,CAAC,EAAE,MAAM,EAAE,GAAG,YAAY,EAC1B,CAAC,EAAE,MAAM,EAAE,GAAG,YAAY,EAC1B,SAAS,GAAE,MAAY,GACtB,MAAM,CAGR;AAED;;GAEG;AACH,wBAAgB,cAAc,CAC5B,CAAC,EAAE,MAAM,EAAE,GAAG,YAAY,EAC1B,CAAC,EAAE,MAAM,EAAE,GAAG,YAAY,EAC1B,SAAS,GAAE,MAAY,GACtB,MAAM,EAAE,CAIV;AAED;;GAEG;AACH,wBAAgB,MAAM,CACpB,IAAI,EAAE,MAAM,EAAE,GAAG,YAAY,EAC7B,OAAO,EAAE,MAAM,EAAE,GAAG,YAAY,EAChC,SAAS,GAAE,MAAY,GACtB,MAAM,EAAE,CAIV;AAED;;GAEG;AACH,wBAAgB,MAAM,CACpB,IAAI,EAAE,MAAM,EAAE,GAAG,YAAY,EAC7B,KAAK,EAAE,MAAM,EAAE,GAAG,YAAY,EAC9B,SAAS,GAAE,MAAY,GACtB,MAAM,EAAE,CAIV;AAED;;GAEG;AACH,wBAAgB,oBAAoB,IAAI,OAAO,CAO9C;AAED;;GAEG;AACH,wBAAgB,mBAAmB,
IAAI,MAAM,GAAG,IAAI,CAOnD;AAMD;;;GAGG;AACH,qBAAa,kBAAkB;IAC7B,OAAO,CAAC,KAAK,CAAM;IACnB,SAAgB,GAAG,EAAE,MAAM,CAAC;IAC5B,SAAgB,QAAQ,EAAE,MAAM,CAAC;IACjC,SAAgB,SAAS,EAAE,MAAM,CAAC;gBAEtB,GAAG,EAAE,MAAM,EAAE,QAAQ,GAAE,MAAU,EAAE,SAAS,GAAE,MAAa;IAQvE,OAAO,CACL,KAAK,EAAE,MAAM,EAAE,GAAG,YAAY,EAC9B,IAAI,EAAE,CAAC,MAAM,EAAE,GAAG,YAAY,CAAC,EAAE,EACjC,MAAM,EAAE,CAAC,MAAM,EAAE,GAAG,YAAY,CAAC,EAAE,EACnC,SAAS,CAAC,EAAE,MAAM,EAAE,GACnB,eAAe;CASnB;AAED;;;GAGG;AACH,qBAAa,qBAAqB;IAChC,OAAO,CAAC,KAAK,CAAM;IACnB,SAAgB,GAAG,EAAE,MAAM,CAAC;IAC5B,SAAgB,OAAO,EAAE,MAAM,CAAC;gBAEpB,GAAG,EAAE,MAAM,EAAE,OAAO,GAAE,MAAW;IAO7C,OAAO,CACL,KAAK,EAAE,MAAM,EAAE,GAAG,YAAY,EAC9B,IAAI,EAAE,CAAC,MAAM,EAAE,GAAG,YAAY,CAAC,EAAE,EACjC,MAAM,EAAE,CAAC,MAAM,EAAE,GAAG,YAAY,CAAC,EAAE,EACnC,YAAY,CAAC,EAAE,CAAC,MAAM,EAAE,GAAG,YAAY,CAAC,EAAE,GACzC,eAAe;CASnB;AAED;;;GAGG;AACH,qBAAa,kBAAkB;IAC7B,OAAO,CAAC,KAAK,CAAM;IACnB,SAAgB,GAAG,EAAE,MAAM,CAAC;IAC5B,SAAgB,SAAS,EAAE,MAAM,CAAC;IAClC,SAAgB,KAAK,EAAE,MAAM,CAAC;gBAElB,GAAG,EAAE,MAAM,EAAE,SAAS,GAAE,MAAY,EAAE,KAAK,GAAE,MAAY;IAQrE,OAAO,CACL,KAAK,EAAE,MAAM,EAAE,GAAG,YAAY,EAC9B,IAAI,EAAE,CAAC,MAAM,EAAE,GAAG,YAAY,CAAC,EAAE,EACjC,MAAM,EAAE,CAAC,MAAM,EAAE,GAAG,YAAY,CAAC,EAAE,GAClC,eAAe;CAQnB;AAED;;GAEG;AACH,qBAAa,mBAAmB;IAC9B,OAAO,CAAC,KAAK,CAAM;IACnB,SAAgB,GAAG,EAAE,MAAM,CAAC;gBAEhB,GAAG,EAAE,MAAM;IAMvB,OAAO,CACL,KAAK,EAAE,MAAM,EAAE,GAAG,YAAY,EAC9B,IAAI,EAAE,CAAC,MAAM,EAAE,GAAG,YAAY,CAAC,EAAE,EACjC,MAAM,EAAE,CAAC,MAAM,EAAE,GAAG,YAAY,CAAC,EAAE,GAClC,eAAe;CAQnB;AAMD;;GAEG;AACH,wBAAsB,wBAAwB,CAC5C,OAAO,EAAE,CAAC,MAAM,EAAE,GAAG,YAAY,CAAC,EAAE,EACpC,IAAI,EAAE,CAAC,MAAM,EAAE,GAAG,YAAY,CAAC,EAAE,EACjC,MAAM,EAAE,CAAC,MAAM,EAAE,GAAG,YAAY,CAAC,EAAE,EACnC,aAAa,GAAE,KAAK,GAAG,YAAY,GAAG,OAAO,GAAG,YAAY,GAAG,QAAuB,GACrF,OAAO,CAAC,MAAM,EAAE,EAAE,CAAC,CASrB;AAED;;GAEG;AACH,wBAAsB,qBAAqB,CACzC,OAAO,EAAE,KAAK,CAAC;IACb,KAAK,EAAE,MAAM,EAAE,GAAG,YAAY,CAAC;IAC/B,IAAI,EAAE,CAAC,MAAM,EAAE,GAAG,YAAY,CAAC,EAAE,CAAC;IAClC,MAAM,EAAE,CAAC,MAAM,EAAE,GAAG,YAAY,CAAC,EAAE,CAAC;CACrC,CAAC,EACF,aAAa,GAAE,KAAK,GA
AG,YAAY,GAAG,OAAO,GAAG,YAAY,GAAG,QAAuB,GACrF,OAAO,CAAC,MAAM,EAAE,EAAE,CAAC,CASrB;AAED;;GAEG;AACH,wBAAgB,0BAA0B,CACxC,KAAK,EAAE,MAAM,EAAE,GAAG,YAAY,EAC9B,IAAI,EAAE,CAAC,MAAM,EAAE,GAAG,YAAY,CAAC,EAAE,EACjC,MAAM,EAAE,CAAC,MAAM,EAAE,GAAG,YAAY,CAAC,EAAE,GAClC,OAAO,CAAC,MAAM,EAAE,CAAC,CAanB;AAED;;GAEG;AACH,wBAAgB,+BAA+B,CAC7C,KAAK,EAAE,MAAM,EAAE,GAAG,YAAY,EAC9B,IAAI,EAAE,CAAC,MAAM,EAAE,GAAG,YAAY,CAAC,EAAE,EACjC,MAAM,EAAE,CAAC,MAAM,EAAE,GAAG,YAAY,CAAC,EAAE,EACnC,SAAS,GAAE,MAAY,GACtB,OAAO,CAAC,MAAM,EAAE,CAAC,CAcnB;AAMD;;GAEG;AACH,qBAAa,aAAa;IACxB,OAAO,CAAC,KAAK,CAAM;gBAEP,YAAY,GAAE,MAAc,EAAE,KAAK,GAAE,MAAY,EAAE,KAAK,GAAE,MAAc;IAKpF,IAAI,CAAC,SAAS,EAAE,MAAM,EAAE,GAAG,YAAY,EAAE,MAAM,EAAE,MAAM,EAAE,GAAG,YAAY,GAAG,MAAM,EAAE;CAIpF;AAED;;GAEG;AACH,wBAAgB,WAAW,CACzB,MAAM,EAAE,MAAM,EAAE,GAAG,YAAY,EAC/B,QAAQ,EAAE,MAAM,EAAE,GAAG,YAAY,EACjC,SAAS,EAAE,CAAC,MAAM,EAAE,GAAG,YAAY,CAAC,EAAE,EACtC,WAAW,GAAE,MAAa,GACzB,MAAM,CAQR;AAED;;GAEG;AACH,wBAAgB,iBAAiB,CAC/B,MAAM,EAAE,MAAM,EAAE,GAAG,YAAY,EAC/B,UAAU,EAAE,CAAC,MAAM,EAAE,GAAG,YAAY,CAAC,EAAE,EACvC,IAAI,GAAE,MAAU,GACf,MAAM,EAAE,EAAE,CAKZ;AAMD;;GAEG;AACH,wBAAsB,kBAAkB,CACtC,GAAG,EAAE,MAAM,EACX,MAAM,EAAE,MAAM,EACd,UAAU,GAAE,MAAY,GACvB,OAAO,CAAC,MAAM,CAAC,MAAM,EAAE;IAAE,KAAK,EAAE,MAAM,CAAC;IAAC,KAAK,EAAE,MAAM,CAAC;IAAC,KAAK,EAAE,MAAM,CAAA;CAAE,CAAC,CAAC,CAG1E;;;;;;;;;;;;;;;;;;;;;;;;;;;;AAED,wBAqCE"}
dist/core/attention-fallbacks.js ADDED
@@ -0,0 +1,552 @@
+"use strict";
+/**
+ * Attention Fallbacks - Safe wrapper around @ruvector/attention with automatic array conversion
+ *
+ * This wrapper handles the array type conversion automatically, allowing users
+ * to pass either regular arrays or Float32Arrays.
+ *
+ * @ruvector/attention requires Float32Array inputs.
+ * This wrapper handles the conversion automatically.
+ */
+Object.defineProperty(exports, "__esModule", { value: true });
+exports.AdamOptimizer = exports.DotProductAttention = exports.DualSpaceAttention = exports.EdgeFeaturedAttention = exports.GraphRoPeAttention = exports.MoEAttention = exports.LocalGlobalAttention = exports.LinearAttention = exports.HyperbolicAttention = exports.FlashAttention = exports.MultiHeadAttention = void 0;
+exports.projectToPoincareBall = projectToPoincareBall;
+exports.poincareDistance = poincareDistance;
+exports.mobiusAddition = mobiusAddition;
+exports.expMap = expMap;
+exports.logMap = logMap;
+exports.isAttentionAvailable = isAttentionAvailable;
+exports.getAttentionVersion = getAttentionVersion;
+exports.parallelAttentionCompute = parallelAttentionCompute;
+exports.batchAttentionCompute = batchAttentionCompute;
+exports.computeFlashAttentionAsync = computeFlashAttentionAsync;
+exports.computeHyperbolicAttentionAsync = computeHyperbolicAttentionAsync;
+exports.infoNceLoss = infoNceLoss;
+exports.mineHardNegatives = mineHardNegatives;
+exports.benchmarkAttention = benchmarkAttention;
+// Lazy load to avoid import errors if not installed
+let attentionModule = null;
+let loadError = null;
+function getAttentionModule() {
+    if (attentionModule)
+        return attentionModule;
+    if (loadError)
+        throw loadError;
+    try {
+        attentionModule = require('@ruvector/attention');
+        return attentionModule;
+    }
+    catch (e) {
+        loadError = new Error(`@ruvector/attention is not installed or failed to load: ${e.message}\n` +
+            `Install with: npm install @ruvector/attention`);
+        throw loadError;
+    }
+}
+/**
+ * Convert any array-like input to Float32Array
+ */
+function toFloat32Array(input) {
+    if (input instanceof Float32Array) {
+        return input;
+    }
+    return new Float32Array(input);
+}
+/**
+ * Convert nested arrays to Float32Arrays
+ */
+function toFloat32Arrays(inputs) {
+    return inputs.map(arr => toFloat32Array(arr));
+}
+/**
+ * Convert Float32Array result back to regular array if needed
+ */
+function fromFloat32Array(input) {
+    return Array.from(input);
+}
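The conversion helpers above are small enough to exercise in isolation. Re-implemented standalone for illustration, a round trip through them preserves values (up to float32 precision), and already-typed inputs pass through without a copy:

```javascript
// Standalone copies of the conversion helpers above, for illustration.
function toFloat32Array(input) {
  // Already-typed inputs are returned as-is (no copy).
  return input instanceof Float32Array ? input : new Float32Array(input);
}
function fromFloat32Array(input) {
  return Array.from(input);
}

const typed = toFloat32Array([1, 2.5, -3]); // plain array -> Float32Array
const plain = fromFloat32Array(typed);      // Float32Array -> plain array
const same = toFloat32Array(typed) === typed; // true: no copy for typed input
```

Note that values not exactly representable in 32-bit floats (e.g. `0.1`) will come back slightly rounded; `1`, `2.5`, and `-3` survive the round trip exactly.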
+/**
+ * Multi-head attention mechanism
+ *
+ * This wrapper automatically converts array inputs to Float32Array.
+ */
+class MultiHeadAttention {
+    /**
+     * Create a new multi-head attention instance
+     *
+     * @param dim - Embedding dimension (must be divisible by numHeads)
+     * @param numHeads - Number of attention heads
+     */
+    constructor(dim, numHeads) {
+        const attention = getAttentionModule();
+        this.inner = new attention.MultiHeadAttention(dim, numHeads);
+        this.dim = dim;
+        this.numHeads = numHeads;
+    }
+    /**
+     * Compute multi-head attention
+     *
+     * @param query - Query vector
+     * @param keys - Array of key vectors
+     * @param values - Array of value vectors
+     * @returns Attention output
+     *
+     * @example
+     * ```typescript
+     * const mha = new MultiHeadAttention(64, 4);
+     *
+     * // Works with regular arrays
+     * const result1 = mha.compute([...64 values], [[...64], [...64]], [[...64], [...64]]);
+     *
+     * // Also works with Float32Array
+     * const q = new Float32Array(64);
+     * const k = [new Float32Array(64)];
+     * const v = [new Float32Array(64)];
+     * const result2 = mha.compute(q, k, v);
+     * ```
+     */
+    compute(query, keys, values) {
+        const raw = this.inner.compute(toFloat32Array(query), toFloat32Arrays(keys), toFloat32Arrays(values));
+        return {
+            values: fromFloat32Array(raw),
+            raw
+        };
+    }
+    /**
+     * Compute and return raw Float32Array (faster, no conversion)
+     */
+    computeRaw(query, keys, values) {
+        return this.inner.compute(query, keys, values);
+    }
+    get headDim() {
+        return this.dim / this.numHeads;
+    }
+}
+exports.MultiHeadAttention = MultiHeadAttention;
+/**
+ * Flash attention with tiled computation
+ */
+class FlashAttention {
+    /**
+     * Create a new flash attention instance
+     *
+     * @param dim - Embedding dimension
+     * @param blockSize - Block size for tiled computation (default: 512)
+     */
+    constructor(dim, blockSize = 512) {
+        const attention = getAttentionModule();
+        this.inner = new attention.FlashAttention(dim, blockSize);
+        this.dim = dim;
+        this.blockSize = blockSize;
+    }
+    /**
+     * Compute flash attention
+     */
+    compute(query, keys, values) {
+        const raw = this.inner.compute(toFloat32Array(query), toFloat32Arrays(keys), toFloat32Arrays(values));
+        return {
+            values: fromFloat32Array(raw),
+            raw
+        };
+    }
+    computeRaw(query, keys, values) {
+        return this.inner.compute(query, keys, values);
+    }
+}
+exports.FlashAttention = FlashAttention;
+/**
+ * Hyperbolic attention in Poincare ball model
+ */
+class HyperbolicAttention {
+    /**
+     * Create a new hyperbolic attention instance
+     *
+     * @param dim - Embedding dimension
+     * @param curvature - Hyperbolic curvature (typically 1.0)
+     */
+    constructor(dim, curvature = 1.0) {
+        const attention = getAttentionModule();
+        this.inner = new attention.HyperbolicAttention(dim, curvature);
+        this.dim = dim;
+        this.curvature = curvature;
+    }
+    /**
+     * Compute hyperbolic attention
+     */
+    compute(query, keys, values) {
+        const raw = this.inner.compute(toFloat32Array(query), toFloat32Arrays(keys), toFloat32Arrays(values));
+        return {
+            values: fromFloat32Array(raw),
+            raw
+        };
+    }
+    computeRaw(query, keys, values) {
+        return this.inner.compute(query, keys, values);
+    }
+}
+exports.HyperbolicAttention = HyperbolicAttention;
+/**
+ * Linear attention (Performer-style) with O(n) complexity
+ */
+class LinearAttention {
+    /**
+     * Create a new linear attention instance
+     *
+     * @param dim - Embedding dimension
+     * @param numFeatures - Number of random features
+     */
+    constructor(dim, numFeatures) {
+        const attention = getAttentionModule();
+        this.inner = new attention.LinearAttention(dim, numFeatures);
+        this.dim = dim;
+        this.numFeatures = numFeatures;
+    }
+    /**
+     * Compute linear attention
+     */
+    compute(query, keys, values) {
+        const raw = this.inner.compute(toFloat32Array(query), toFloat32Arrays(keys), toFloat32Arrays(values));
+        return {
+            values: fromFloat32Array(raw),
+            raw
+        };
+    }
+    computeRaw(query, keys, values) {
+        return this.inner.compute(query, keys, values);
+    }
+}
+exports.LinearAttention = LinearAttention;
+/**
+ * Local-global attention (Longformer-style)
+ */
+class LocalGlobalAttention {
+    /**
+     * Create a new local-global attention instance
+     *
+     * @param dim - Embedding dimension
+     * @param localWindow - Size of local attention window
+     * @param globalTokens - Number of global attention tokens
+     */
+    constructor(dim, localWindow, globalTokens) {
+        const attention = getAttentionModule();
+        this.inner = new attention.LocalGlobalAttention(dim, localWindow, globalTokens);
+        this.dim = dim;
+        this.localWindow = localWindow;
+        this.globalTokens = globalTokens;
+    }
+    /**
+     * Compute local-global attention
+     */
+    compute(query, keys, values) {
+        const raw = this.inner.compute(toFloat32Array(query), toFloat32Arrays(keys), toFloat32Arrays(values));
+        return {
+            values: fromFloat32Array(raw),
+            raw
+        };
+    }
+    computeRaw(query, keys, values) {
+        return this.inner.compute(query, keys, values);
+    }
+}
+exports.LocalGlobalAttention = LocalGlobalAttention;
+/**
+ * Mixture of Experts attention
+ */
+class MoEAttention {
+    /**
+     * Create a new MoE attention instance
+     *
+     * @param config - MoE configuration
+     */
+    constructor(config) {
+        const attention = getAttentionModule();
+        this.inner = new attention.MoEAttention({
+            dim: config.dim,
+            num_experts: config.numExperts,
+            top_k: config.topK,
+            expert_capacity: config.expertCapacity ?? 1.25,
+        });
+        this.config = config;
+    }
+    /**
+     * Create with simple parameters
+     */
+    static simple(dim, numExperts, topK) {
+        return new MoEAttention({ dim, numExperts, topK });
+    }
+    /**
+     * Compute MoE attention
+     */
+    compute(query, keys, values) {
+        const raw = this.inner.compute(toFloat32Array(query), toFloat32Arrays(keys), toFloat32Arrays(values));
+        return {
+            values: fromFloat32Array(raw),
+            raw
+        };
+    }
+    computeRaw(query, keys, values) {
+        return this.inner.compute(query, keys, values);
+    }
+}
+exports.MoEAttention = MoEAttention;
+// Hyperbolic math utilities
+/**
+ * Project a vector into the Poincare ball
+ */
+function projectToPoincareBall(vector, curvature = 1.0) {
+    const attention = getAttentionModule();
+    const result = attention.projectToPoincareBall(toFloat32Array(vector), curvature);
+    return fromFloat32Array(result);
+}
+/**
+ * Compute hyperbolic (Poincare) distance between two points
+ */
+function poincareDistance(a, b, curvature = 1.0) {
+    const attention = getAttentionModule();
+    return attention.poincareDistance(toFloat32Array(a), toFloat32Array(b), curvature);
+}
306
+ /**
307
+ * Mobius addition in hyperbolic space
308
+ */
309
+ function mobiusAddition(a, b, curvature = 1.0) {
310
+ const attention = getAttentionModule();
311
+ const result = attention.mobiusAddition(toFloat32Array(a), toFloat32Array(b), curvature);
312
+ return fromFloat32Array(result);
313
+ }
314
+ /**
315
+ * Exponential map from tangent space to hyperbolic space
316
+ */
317
+ function expMap(base, tangent, curvature = 1.0) {
318
+ const attention = getAttentionModule();
319
+ const result = attention.expMap(toFloat32Array(base), toFloat32Array(tangent), curvature);
320
+ return fromFloat32Array(result);
321
+ }
322
+ /**
323
+ * Logarithmic map from hyperbolic space to tangent space
324
+ */
325
+ function logMap(base, point, curvature = 1.0) {
326
+ const attention = getAttentionModule();
327
+ const result = attention.logMap(toFloat32Array(base), toFloat32Array(point), curvature);
328
+ return fromFloat32Array(result);
329
+ }
330
+ /**
331
+ * Check if attention module is available
332
+ */
333
+ function isAttentionAvailable() {
334
+ try {
335
+ getAttentionModule();
336
+ return true;
337
+ }
338
+ catch {
339
+ return false;
340
+ }
341
+ }
342
+ /**
343
+ * Get attention module version
344
+ */
345
+ function getAttentionVersion() {
346
+ try {
347
+ const attention = getAttentionModule();
348
+ return attention.version?.() ?? null;
349
+ }
350
+ catch {
351
+ return null;
352
+ }
353
+ }
354
+ // ============================================================================
355
+ // Graph-based Attention (for code structure)
356
+ // ============================================================================
357
+ /**
358
+ * Graph attention with Rotary Position Embeddings
359
+ * Excellent for code AST and dependency graphs
360
+ */
361
+ class GraphRoPeAttention {
362
+ constructor(dim, numHeads = 4, maxSeqLen = 4096) {
363
+ const attention = getAttentionModule();
364
+ this.inner = new attention.GraphRoPeAttention(dim, numHeads, maxSeqLen);
365
+ this.dim = dim;
366
+ this.numHeads = numHeads;
367
+ this.maxSeqLen = maxSeqLen;
368
+ }
369
+ compute(query, keys, values, positions) {
370
+ const raw = this.inner.compute(toFloat32Array(query), toFloat32Arrays(keys), toFloat32Arrays(values), positions ? new Int32Array(positions) : undefined);
371
+ return { values: fromFloat32Array(raw), raw };
372
+ }
373
+ }
374
+ exports.GraphRoPeAttention = GraphRoPeAttention;
375
+ /**
376
+ * Edge-featured attention for graphs with edge attributes
377
+ * Useful for weighted dependency graphs
378
+ */
379
+ class EdgeFeaturedAttention {
380
+ constructor(dim, edgeDim = 16) {
381
+ const attention = getAttentionModule();
382
+ this.inner = new attention.EdgeFeaturedAttention(dim, edgeDim);
383
+ this.dim = dim;
384
+ this.edgeDim = edgeDim;
385
+ }
386
+ compute(query, keys, values, edgeFeatures) {
387
+ const raw = this.inner.compute(toFloat32Array(query), toFloat32Arrays(keys), toFloat32Arrays(values), edgeFeatures ? toFloat32Arrays(edgeFeatures) : undefined);
388
+ return { values: fromFloat32Array(raw), raw };
389
+ }
390
+ }
391
+ exports.EdgeFeaturedAttention = EdgeFeaturedAttention;
392
+ /**
393
+ * Dual-space attention (Euclidean + Hyperbolic)
394
+ * Best of both worlds for hierarchical + semantic similarity
395
+ */
396
+ class DualSpaceAttention {
397
+ constructor(dim, curvature = 1.0, alpha = 0.5) {
398
+ const attention = getAttentionModule();
399
+ this.inner = new attention.DualSpaceAttention(dim, curvature, alpha);
400
+ this.dim = dim;
401
+ this.curvature = curvature;
402
+ this.alpha = alpha;
403
+ }
404
+ compute(query, keys, values) {
405
+ const raw = this.inner.compute(toFloat32Array(query), toFloat32Arrays(keys), toFloat32Arrays(values));
406
+ return { values: fromFloat32Array(raw), raw };
407
+ }
408
+ }
409
+ exports.DualSpaceAttention = DualSpaceAttention;
410
+ /**
411
+ * Basic dot-product attention
412
+ */
413
+ class DotProductAttention {
414
+ constructor(dim) {
415
+ const attention = getAttentionModule();
416
+ this.inner = new attention.DotProductAttention(dim);
417
+ this.dim = dim;
418
+ }
419
+ compute(query, keys, values) {
420
+ const raw = this.inner.compute(toFloat32Array(query), toFloat32Arrays(keys), toFloat32Arrays(values));
421
+ return { values: fromFloat32Array(raw), raw };
422
+ }
423
+ }
424
+ exports.DotProductAttention = DotProductAttention;
425
+ // ============================================================================
426
+ // Parallel/Batch Attention Compute
427
+ // ============================================================================
428
+ /**
429
+ * Compute attention in parallel across multiple queries
430
+ */
431
+ async function parallelAttentionCompute(queries, keys, values, attentionType = 'multi-head') {
432
+ const attention = getAttentionModule();
433
+ const results = await attention.parallelAttentionCompute(toFloat32Arrays(queries), toFloat32Arrays(keys), toFloat32Arrays(values), attentionType);
434
+ return results.map((r) => fromFloat32Array(r));
435
+ }
436
+ /**
437
+ * Batch attention compute for multiple query-key-value sets
438
+ */
439
+ async function batchAttentionCompute(batches, attentionType = 'multi-head') {
440
+ const attention = getAttentionModule();
441
+ const nativeBatches = batches.map(b => ({
442
+ query: toFloat32Array(b.query),
443
+ keys: toFloat32Arrays(b.keys),
444
+ values: toFloat32Arrays(b.values),
445
+ }));
446
+ const results = await attention.batchAttentionCompute(nativeBatches, attentionType);
447
+ return results.map((r) => fromFloat32Array(r));
448
+ }
449
+ /**
450
+ * Async flash attention with callback
451
+ */
452
+ function computeFlashAttentionAsync(query, keys, values) {
453
+ const attention = getAttentionModule();
454
+ return new Promise((resolve, reject) => {
455
+ attention.computeFlashAttentionAsync(toFloat32Array(query), toFloat32Arrays(keys), toFloat32Arrays(values), (err, result) => {
456
+ if (err)
457
+ reject(err);
458
+ else
459
+ resolve(fromFloat32Array(result));
460
+ });
461
+ });
462
+ }
463
+ /**
464
+ * Async hyperbolic attention
465
+ */
466
+ function computeHyperbolicAttentionAsync(query, keys, values, curvature = 1.0) {
467
+ const attention = getAttentionModule();
468
+ return new Promise((resolve, reject) => {
469
+ attention.computeHyperbolicAttentionAsync(toFloat32Array(query), toFloat32Arrays(keys), toFloat32Arrays(values), curvature, (err, result) => {
470
+ if (err)
471
+ reject(err);
472
+ else
473
+ resolve(fromFloat32Array(result));
474
+ });
475
+ });
476
+ }
477
+ // ============================================================================
478
+ // Training Utilities (for SONA integration)
479
+ // ============================================================================
480
+ /**
481
+ * Adam optimizer for attention training
482
+ */
483
+ class AdamOptimizer {
484
+ constructor(learningRate = 0.001, beta1 = 0.9, beta2 = 0.999) {
485
+ const attention = getAttentionModule();
486
+ this.inner = new attention.AdamOptimizer(learningRate, beta1, beta2);
487
+ }
488
+ step(gradients, params) {
489
+ const result = this.inner.step(toFloat32Array(gradients), toFloat32Array(params));
490
+ return fromFloat32Array(result);
491
+ }
492
+ }
493
+ exports.AdamOptimizer = AdamOptimizer;
494
+ /**
495
+ * InfoNCE contrastive loss
496
+ */
497
+ function infoNceLoss(anchor, positive, negatives, temperature = 0.07) {
498
+ const attention = getAttentionModule();
499
+ return attention.InfoNceLoss.compute(toFloat32Array(anchor), toFloat32Array(positive), toFloat32Arrays(negatives), temperature);
500
+ }
501
+ /**
502
+ * Hard negative mining for contrastive learning
503
+ */
504
+ function mineHardNegatives(anchor, candidates, topK = 5) {
505
+ const attention = getAttentionModule();
506
+ const miner = new attention.HardNegativeMiner(topK);
507
+ const results = miner.mine(toFloat32Array(anchor), toFloat32Arrays(candidates));
508
+ return results.map((r) => fromFloat32Array(r));
509
+ }
510
+ // ============================================================================
511
+ // Benchmarking
512
+ // ============================================================================
513
+ /**
514
+ * Benchmark attention implementations
515
+ */
516
+ async function benchmarkAttention(dim, seqLen, iterations = 100) {
517
+ const attention = getAttentionModule();
518
+ return attention.benchmarkAttention(dim, seqLen, iterations);
519
+ }
520
+ exports.default = {
521
+ // Core attention types
522
+ DotProductAttention,
523
+ MultiHeadAttention,
524
+ FlashAttention,
525
+ HyperbolicAttention,
526
+ LinearAttention,
527
+ LocalGlobalAttention,
528
+ MoEAttention,
529
+ // Graph attention types
530
+ GraphRoPeAttention,
531
+ EdgeFeaturedAttention,
532
+ DualSpaceAttention,
533
+ // Parallel/batch compute
534
+ parallelAttentionCompute,
535
+ batchAttentionCompute,
536
+ computeFlashAttentionAsync,
537
+ computeHyperbolicAttentionAsync,
538
+ // Training utilities
539
+ AdamOptimizer,
540
+ infoNceLoss,
541
+ mineHardNegatives,
542
+ // Hyperbolic math
543
+ projectToPoincareBall,
544
+ poincareDistance,
545
+ mobiusAddition,
546
+ expMap,
547
+ logMap,
548
+ // Utilities
549
+ isAttentionAvailable,
550
+ getAttentionVersion,
551
+ benchmarkAttention,
552
+ };
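All of the wrapper classes in this file follow the same convention: coerce inputs with `toFloat32Array`/`toFloat32Arrays`, call the native `compute`, and return both a plain-array `values` and the untouched `raw` Float32Array. The sketch below illustrates that return shape with a plain softmax dot-product attention written in pure JavaScript; it is an illustration only, not the native module's implementation, and the helper names merely mirror the ones used above.

```javascript
// Coercion helpers mirroring the wrapper's convention.
function toFloat32Array(v) {
  return v instanceof Float32Array ? v : Float32Array.from(v);
}
function fromFloat32Array(a) {
  return Array.from(a);
}

// softmax(q . k_j) weighted sum over values -- returns the same
// { values, raw } shape the wrappers above return.
function dotProductAttention(query, keys, values) {
  const q = toFloat32Array(query);
  const scores = keys.map((k) => {
    const kk = toFloat32Array(k);
    let s = 0;
    for (let i = 0; i < q.length; i++) s += q[i] * kk[i];
    return s;
  });
  // Numerically stable softmax over the scores.
  const max = Math.max(...scores);
  const exps = scores.map((s) => Math.exp(s - max));
  const sum = exps.reduce((a, b) => a + b, 0);
  const weights = exps.map((e) => e / sum);
  // Weighted sum of the value vectors.
  const raw = new Float32Array(q.length);
  values.forEach((v, j) => {
    const vv = toFloat32Array(v);
    for (let i = 0; i < vv.length; i++) raw[i] += weights[j] * vv[i];
  });
  return { values: fromFloat32Array(raw), raw };
}

const out = dotProductAttention([1, 0], [[1, 0], [0, 1]], [[2, 0], [0, 2]]);
console.log(out.values);
```

Callers that only need plain arrays read `out.values`; `out.raw` avoids a copy when the result feeds straight into another native call.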
dist/core/cluster-wrapper.d.ts ADDED
@@ -0,0 +1,148 @@
+/**
+ * Cluster Wrapper - Distributed coordination for multi-agent systems
+ *
+ * Wraps @ruvector/cluster for Raft consensus, auto-sharding,
+ * and distributed memory across agents.
+ */
+export declare function isClusterAvailable(): boolean;
+export interface ClusterNode {
+    id: string;
+    address: string;
+    role: 'leader' | 'follower' | 'candidate';
+    status: 'healthy' | 'unhealthy' | 'unknown';
+    lastHeartbeat: number;
+}
+export interface ShardInfo {
+    id: number;
+    range: [number, number];
+    node: string;
+    size: number;
+    status: 'active' | 'migrating' | 'offline';
+}
+export interface ClusterConfig {
+    nodeId: string;
+    address: string;
+    peers?: string[];
+    shards?: number;
+    replicationFactor?: number;
+}
+/**
+ * Distributed cluster for multi-agent coordination
+ */
+export declare class RuvectorCluster {
+    private inner;
+    private nodeId;
+    private isLeader;
+    constructor(config: ClusterConfig);
+    /**
+     * Start the cluster node
+     */
+    start(): Promise<void>;
+    /**
+     * Stop the cluster node gracefully
+     */
+    stop(): Promise<void>;
+    /**
+     * Join an existing cluster
+     */
+    join(peerAddress: string): Promise<boolean>;
+    /**
+     * Leave the cluster
+     */
+    leave(): Promise<void>;
+    /**
+     * Get current node info
+     */
+    getNodeInfo(): ClusterNode;
+    /**
+     * Get all cluster nodes
+     */
+    getNodes(): ClusterNode[];
+    /**
+     * Check if this node is the leader
+     */
+    isClusterLeader(): boolean;
+    /**
+     * Get the current leader
+     */
+    getLeader(): ClusterNode | null;
+    /**
+     * Put a value in distributed storage
+     */
+    put(key: string, value: any): Promise<boolean>;
+    /**
+     * Get a value from distributed storage
+     */
+    get(key: string): Promise<any | null>;
+    /**
+     * Delete a value from distributed storage
+     */
+    delete(key: string): Promise<boolean>;
+    /**
+     * Atomic compare-and-swap
+     */
+    compareAndSwap(key: string, expected: any, newValue: any): Promise<boolean>;
+    /**
+     * Get shard information
+     */
+    getShards(): ShardInfo[];
+    /**
+     * Get the shard for a key
+     */
+    getShardForKey(key: string): ShardInfo;
+    /**
+     * Trigger shard rebalancing
+     */
+    rebalance(): Promise<void>;
+    /**
+     * Acquire a distributed lock
+     */
+    lock(name: string, timeout?: number): Promise<string | null>;
+    /**
+     * Release a distributed lock
+     */
+    unlock(name: string, token: string): Promise<boolean>;
+    /**
+     * Extend a lock's TTL
+     */
+    extendLock(name: string, token: string, extension?: number): Promise<boolean>;
+    /**
+     * Subscribe to a channel
+     */
+    subscribe(channel: string, callback: (message: any) => void): () => void;
+    /**
+     * Publish to a channel
+     */
+    publish(channel: string, message: any): Promise<number>;
+    /**
+     * Register an agent with the cluster
+     */
+    registerAgent(agentId: string, capabilities: string[]): Promise<boolean>;
+    /**
+     * Find agents with a capability
+     */
+    findAgents(capability: string): Promise<string[]>;
+    /**
+     * Assign a task to an agent
+     */
+    assignTask(taskId: string, agentId: string, task: any): Promise<boolean>;
+    /**
+     * Complete a task
+     */
+    completeTask(taskId: string, result: any): Promise<boolean>;
+    /**
+     * Get cluster statistics
+     */
+    stats(): {
+        nodes: number;
+        shards: number;
+        leader: string | null;
+        healthy: boolean;
+    };
+}
+/**
+ * Create a cluster node for agent coordination
+ */
+export declare function createCluster(config: ClusterConfig): RuvectorCluster;
+export default RuvectorCluster;
+//# sourceMappingURL=cluster-wrapper.d.ts.map
dist/core/cluster-wrapper.d.ts.map ADDED
@@ -0,0 +1 @@
+{"version":3,"file":"cluster-wrapper.d.ts","sourceRoot":"","sources":["../../src/core/cluster-wrapper.ts"],"names":[],"mappings":"AAAA;;;;;GAKG;AAqBH,wBAAgB,kBAAkB,IAAI,OAAO,CAO5C;AAED,MAAM,WAAW,WAAW;IAC1B,EAAE,EAAE,MAAM,CAAC;IACX,OAAO,EAAE,MAAM,CAAC;IAChB,IAAI,EAAE,QAAQ,GAAG,UAAU,GAAG,WAAW,CAAC;IAC1C,MAAM,EAAE,SAAS,GAAG,WAAW,GAAG,SAAS,CAAC;IAC5C,aAAa,EAAE,MAAM,CAAC;CACvB;AAED,MAAM,WAAW,SAAS;IACxB,EAAE,EAAE,MAAM,CAAC;IACX,KAAK,EAAE,CAAC,MAAM,EAAE,MAAM,CAAC,CAAC;IACxB,IAAI,EAAE,MAAM,CAAC;IACb,IAAI,EAAE,MAAM,CAAC;IACb,MAAM,EAAE,QAAQ,GAAG,WAAW,GAAG,SAAS,CAAC;CAC5C;AAED,MAAM,WAAW,aAAa;IAC5B,MAAM,EAAE,MAAM,CAAC;IACf,OAAO,EAAE,MAAM,CAAC;IAChB,KAAK,CAAC,EAAE,MAAM,EAAE,CAAC;IACjB,MAAM,CAAC,EAAE,MAAM,CAAC;IAChB,iBAAiB,CAAC,EAAE,MAAM,CAAC;CAC5B;AAED;;GAEG;AACH,qBAAa,eAAe;IAC1B,OAAO,CAAC,KAAK,CAAM;IACnB,OAAO,CAAC,MAAM,CAAS;IACvB,OAAO,CAAC,QAAQ,CAAkB;gBAEtB,MAAM,EAAE,aAAa;IAgBjC;;OAEG;IACG,KAAK,IAAI,OAAO,CAAC,IAAI,CAAC;IAI5B;;OAEG;IACG,IAAI,IAAI,OAAO,CAAC,IAAI,CAAC;IAI3B;;OAEG;IACG,IAAI,CAAC,WAAW,EAAE,MAAM,GAAG,OAAO,CAAC,OAAO,CAAC;IAIjD;;OAEG;IACG,KAAK,IAAI,OAAO,CAAC,IAAI,CAAC;IAQ5B;;OAEG;IACH,WAAW,IAAI,WAAW;IAI1B;;OAEG;IACH,QAAQ,IAAI,WAAW,EAAE;IAIzB;;OAEG;IACH,eAAe,IAAI,OAAO;IAK1B;;OAEG;IACH,SAAS,IAAI,WAAW,GAAG,IAAI;IAQ/B;;OAEG;IACG,GAAG,CAAC,GAAG,EAAE,MAAM,EAAE,KAAK,EAAE,GAAG,GAAG,OAAO,CAAC,OAAO,CAAC;IAIpD;;OAEG;IACG,GAAG,CAAC,GAAG,EAAE,MAAM,GAAG,OAAO,CAAC,GAAG,GAAG,IAAI,CAAC;IAK3C;;OAEG;IACG,MAAM,CAAC,GAAG,EAAE,MAAM,GAAG,OAAO,CAAC,OAAO,CAAC;IAI3C;;OAEG;IACG,cAAc,CAAC,GAAG,EAAE,MAAM,EAAE,QAAQ,EAAE,GAAG,EAAE,QAAQ,EAAE,GAAG,GAAG,OAAO,CAAC,OAAO,CAAC;IAYjF;;OAEG;IACH,SAAS,IAAI,SAAS,EAAE;IAIxB;;OAEG;IACH,cAAc,CAAC,GAAG,EAAE,MAAM,GAAG,SAAS;IAItC;;OAEG;IACG,SAAS,IAAI,OAAO,CAAC,IAAI,CAAC;IAQhC;;OAEG;IACG,IAAI,CAAC,IAAI,EAAE,MAAM,EAAE,OAAO,GAAE,MAAc,GAAG,OAAO,CAAC,MAAM,GAAG,IAAI,CAAC;IAIzE;;OAEG;IACG,MAAM,CAAC,IAAI,EAAE,MAAM,EAAE,KAAK,EAAE,MAAM,GAAG,OAAO,CAAC,OAAO,CAAC;IAI3D;;OAEG;IACG,UAAU,CAAC,IAAI,EAAE,MAAM,EAAE,KAAK,EAAE,MAAM,EAAE,SAAS,GAAE,MAAc,GAAG,OAAO,CAAC,OAAO,CAAC;IAQ1F;;OAEG;IACH,SAAS,CAAC,OAAO,EAAE,MAAM,EAAE,QAAQ,EAAE,CAAC,OAAO,EAAE,GAAG,KAAK,IAAI,GAAG,MAAM,IAAI;IAMxE;;OAEG;IACG,OAAO,CAAC,OAAO,EAAE,MAAM,EAAE,OAAO,EAAE,GAAG,GAAG,OAAO,CAAC,MAAM,CAAC;IAQ7D;;OAEG;IACG,aAAa,CAAC,OAAO,EAAE,MAAM,EAAE,YAAY,EAAE,MAAM,EAAE,GAAG,OAAO,CAAC,OAAO,CAAC;IAS9E;;OAEG;IACG,UAAU,CAAC,UAAU,EAAE,MAAM,GAAG,OAAO,CAAC,MAAM,EAAE,CAAC;IAcvD;;OAEG;IACG,UAAU,CAAC,MAAM,EAAE,MAAM,EAAE,OAAO,EAAE,MAAM,EAAE,IAAI,EAAE,GAAG,GAAG,OAAO,CAAC,OAAO,CAAC;IAgB9E;;OAEG;IACG,YAAY,CAAC,MAAM,EAAE,MAAM,EAAE,MAAM,EAAE,GAAG,GAAG,OAAO,CAAC,OAAO,CAAC;IAgBjE;;OAEG;IACH,KAAK,IAAI;QACP,KAAK,EAAE,MAAM,CAAC;QACd,MAAM,EAAE,MAAM,CAAC;QACf,MAAM,EAAE,MAAM,GAAG,IAAI,CAAC;QACtB,OAAO,EAAE,OAAO,CAAC;KAClB;CAGF;AAED;;GAEG;AACH,wBAAgB,aAAa,CAAC,MAAM,EAAE,aAAa,GAAG,eAAe,CAEpE;AAED,eAAe,eAAe,CAAC"}
dist/core/cluster-wrapper.js ADDED
@@ -0,0 +1,271 @@
+"use strict";
+/**
+ * Cluster Wrapper - Distributed coordination for multi-agent systems
+ *
+ * Wraps @ruvector/cluster for Raft consensus, auto-sharding,
+ * and distributed memory across agents.
+ */
+Object.defineProperty(exports, "__esModule", { value: true });
+exports.RuvectorCluster = void 0;
+exports.isClusterAvailable = isClusterAvailable;
+exports.createCluster = createCluster;
+let clusterModule = null;
+let loadError = null;
+function getClusterModule() {
+    if (clusterModule)
+        return clusterModule;
+    if (loadError)
+        throw loadError;
+    try {
+        clusterModule = require('@ruvector/cluster');
+        return clusterModule;
+    }
+    catch (e) {
+        loadError = new Error(`@ruvector/cluster not installed: ${e.message}\n` +
+            `Install with: npm install @ruvector/cluster`);
+        throw loadError;
+    }
+}
+function isClusterAvailable() {
+    try {
+        getClusterModule();
+        return true;
+    }
+    catch {
+        return false;
+    }
+}
+/**
+ * Distributed cluster for multi-agent coordination
+ */
+class RuvectorCluster {
+    constructor(config) {
+        this.isLeader = false;
+        const cluster = getClusterModule();
+        this.nodeId = config.nodeId;
+        this.inner = new cluster.Cluster({
+            nodeId: config.nodeId,
+            address: config.address,
+            peers: config.peers ?? [],
+            shards: config.shards ?? 16,
+            replicationFactor: config.replicationFactor ?? 2,
+        });
+    }
+    // ===========================================================================
+    // Cluster Lifecycle
+    // ===========================================================================
+    /**
+     * Start the cluster node
+     */
+    async start() {
+        await this.inner.start();
+    }
+    /**
+     * Stop the cluster node gracefully
+     */
+    async stop() {
+        await this.inner.stop();
+    }
+    /**
+     * Join an existing cluster
+     */
+    async join(peerAddress) {
+        return this.inner.join(peerAddress);
+    }
+    /**
+     * Leave the cluster
+     */
+    async leave() {
+        await this.inner.leave();
+    }
+    // ===========================================================================
+    // Node Management
+    // ===========================================================================
+    /**
+     * Get current node info
+     */
+    getNodeInfo() {
+        return this.inner.getNodeInfo();
+    }
+    /**
+     * Get all cluster nodes
+     */
+    getNodes() {
+        return this.inner.getNodes();
+    }
+    /**
+     * Check if this node is the leader
+     */
+    isClusterLeader() {
+        this.isLeader = this.inner.isLeader();
+        return this.isLeader;
+    }
+    /**
+     * Get the current leader
+     */
+    getLeader() {
+        return this.inner.getLeader();
+    }
+    // ===========================================================================
+    // Distributed Operations
+    // ===========================================================================
+    /**
+     * Put a value in distributed storage
+     */
+    async put(key, value) {
+        return this.inner.put(key, JSON.stringify(value));
+    }
+    /**
+     * Get a value from distributed storage
+     */
+    async get(key) {
+        const result = await this.inner.get(key);
+        return result ? JSON.parse(result) : null;
+    }
+    /**
+     * Delete a value from distributed storage
+     */
+    async delete(key) {
+        return this.inner.delete(key);
+    }
+    /**
+     * Atomic compare-and-swap
+     */
+    async compareAndSwap(key, expected, newValue) {
+        return this.inner.compareAndSwap(key, JSON.stringify(expected), JSON.stringify(newValue));
+    }
+    // ===========================================================================
+    // Sharding
+    // ===========================================================================
+    /**
+     * Get shard information
+     */
+    getShards() {
+        return this.inner.getShards();
+    }
+    /**
+     * Get the shard for a key
+     */
+    getShardForKey(key) {
+        return this.inner.getShardForKey(key);
+    }
+    /**
+     * Trigger shard rebalancing
+     */
+    async rebalance() {
+        await this.inner.rebalance();
+    }
+    // ===========================================================================
+    // Distributed Locks
+    // ===========================================================================
+    /**
+     * Acquire a distributed lock
+     */
+    async lock(name, timeout = 30000) {
+        return this.inner.lock(name, timeout);
+    }
+    /**
+     * Release a distributed lock
+     */
+    async unlock(name, token) {
+        return this.inner.unlock(name, token);
+    }
+    /**
+     * Extend a lock's TTL
+     */
+    async extendLock(name, token, extension = 30000) {
+        return this.inner.extendLock(name, token, extension);
+    }
+    // ===========================================================================
+    // Pub/Sub
+    // ===========================================================================
+    /**
+     * Subscribe to a channel
+     */
+    subscribe(channel, callback) {
+        return this.inner.subscribe(channel, (msg) => {
+            callback(JSON.parse(msg));
+        });
+    }
+    /**
+     * Publish to a channel
+     */
+    async publish(channel, message) {
+        return this.inner.publish(channel, JSON.stringify(message));
+    }
+    // ===========================================================================
+    // Agent Coordination
+    // ===========================================================================
+    /**
+     * Register an agent with the cluster
+     */
+    async registerAgent(agentId, capabilities) {
+        return this.put(`agent:${agentId}`, {
+            id: agentId,
+            capabilities,
+            node: this.nodeId,
+            registeredAt: Date.now(),
+        });
+    }
+    /**
+     * Find agents with a capability
+     */
+    async findAgents(capability) {
+        const agents = await this.inner.scan('agent:*');
+        const matching = [];
+        for (const key of agents) {
+            const agent = await this.get(key);
+            if (agent?.capabilities?.includes(capability)) {
+                matching.push(agent.id);
+            }
+        }
+        return matching;
+    }
+    /**
+     * Assign a task to an agent
+     */
+    async assignTask(taskId, agentId, task) {
+        const assigned = await this.put(`task:${taskId}`, {
+            id: taskId,
+            agent: agentId,
+            task,
+            status: 'assigned',
+            assignedAt: Date.now(),
+        });
+        if (assigned) {
+            await this.publish(`agent:${agentId}:tasks`, { type: 'new_task', taskId });
+        }
+        return assigned;
+    }
+    /**
+     * Complete a task
+     */
+    async completeTask(taskId, result) {
+        const task = await this.get(`task:${taskId}`);
+        if (!task)
+            return false;
+        return this.put(`task:${taskId}`, {
+            ...task,
+            status: 'completed',
+            result,
+            completedAt: Date.now(),
+        });
+    }
+    // ===========================================================================
+    // Stats
+    // ===========================================================================
+    /**
+     * Get cluster statistics
+     */
+    stats() {
+        return this.inner.stats();
+    }
+}
+exports.RuvectorCluster = RuvectorCluster;
+/**
+ * Create a cluster node for agent coordination
+ */
+function createCluster(config) {
+    return new RuvectorCluster(config);
+}
+exports.default = RuvectorCluster;
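`getClusterModule()` uses a lazy-load-with-cached-error pattern: the optional native dependency is only required on first use, a failed `require` is cached so the (slow) resolution is not retried, and `isClusterAvailable()` turns the probe into a boolean. A self-contained sketch of that pattern with the loader injected (the factory name `makeLazyLoader` is an illustrative invention, not part of the package):

```javascript
// Generic version of the lazy-load pattern above: cache the module on
// success, cache the wrapping Error on failure, never retry the load.
function makeLazyLoader(loadFn) {
  let mod = null;
  let loadError = null;
  const get = () => {
    if (mod) return mod;
    if (loadError) throw loadError;
    try {
      mod = loadFn();
      return mod;
    } catch (e) {
      loadError = new Error(`optional module not installed: ${e.message}`);
      throw loadError;
    }
  };
  // Availability probe: swallow the cached error into a boolean.
  const isAvailable = () => {
    try { get(); return true; } catch { return false; }
  };
  return { get, isAvailable };
}
```

Because the error object is cached, every later `get()` throws the exact same Error instance, which keeps the install hint stable and avoids repeated filesystem lookups for a dependency that is known to be missing.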
dist/core/coverage-router.d.ts ADDED
@@ -0,0 +1,88 @@
+/**
+ * Coverage Router - Test coverage-aware agent routing
+ *
+ * Uses test coverage data to make smarter routing decisions:
+ * - Prioritize testing for uncovered code
+ * - Route to tester agent for low-coverage files
+ * - Suggest test files for modified code
+ */
+export interface CoverageData {
+    file: string;
+    lines: {
+        total: number;
+        covered: number;
+        percentage: number;
+    };
+    functions: {
+        total: number;
+        covered: number;
+        percentage: number;
+    };
+    branches: {
+        total: number;
+        covered: number;
+        percentage: number;
+    };
+    uncoveredLines: number[];
+    uncoveredFunctions: string[];
+}
+export interface CoverageSummary {
+    files: Map<string, CoverageData>;
+    overall: {
+        lines: number;
+        functions: number;
+        branches: number;
+    };
+    lowCoverageFiles: string[];
+    uncoveredFiles: string[];
+}
+export interface TestSuggestion {
+    file: string;
+    testFile: string;
+    reason: string;
+    priority: 'high' | 'medium' | 'low';
+    coverage: number;
+    uncoveredFunctions: string[];
+}
+/**
+ * Parse Istanbul/NYC JSON coverage report
+ */
+export declare function parseIstanbulCoverage(coveragePath: string): CoverageSummary;
+/**
+ * Find coverage report in project
+ */
+export declare function findCoverageReport(projectRoot?: string): string | null;
+/**
+ * Get coverage data for a specific file
+ */
+export declare function getFileCoverage(file: string, summary?: CoverageSummary): CoverageData | null;
+/**
+ * Suggest tests for files based on coverage
+ */
+export declare function suggestTests(files: string[], summary?: CoverageSummary): TestSuggestion[];
+/**
+ * Determine if a file needs the tester agent based on coverage
+ */
+export declare function shouldRouteToTester(file: string, summary?: CoverageSummary): {
+    route: boolean;
+    reason: string;
+    coverage: number;
+};
+/**
+ * Get coverage-aware routing weight for agent selection
+ */
+export declare function getCoverageRoutingWeight(file: string, summary?: CoverageSummary): {
+    coder: number;
+    tester: number;
+    reviewer: number;
+};
+declare const _default: {
+    parseIstanbulCoverage: typeof parseIstanbulCoverage;
+    findCoverageReport: typeof findCoverageReport;
+    getFileCoverage: typeof getFileCoverage;
+    suggestTests: typeof suggestTests;
+    shouldRouteToTester: typeof shouldRouteToTester;
+    getCoverageRoutingWeight: typeof getCoverageRoutingWeight;
+};
+export default _default;
+//# sourceMappingURL=coverage-router.d.ts.map
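The `getCoverageRoutingWeight` declaration above returns relative weights for the coder, tester, and reviewer agents. The declaration file does not show the implementation, so the sketch below is a hypothetical illustration of the idea only: as a file's line coverage drops, weight shifts from the coder toward the tester. The 50%/80% thresholds and the specific weights are illustrative assumptions, not the shipped values.

```javascript
// Hypothetical coverage-weighted routing in the spirit of
// getCoverageRoutingWeight(). Input is a line-coverage percentage (0-100);
// output is a coder/tester/reviewer weight triple summing to 1.
function coverageRoutingWeight(lineCoverage) {
  if (lineCoverage < 50) {
    // Low coverage: bias strongly toward writing tests first.
    return { coder: 0.3, tester: 0.6, reviewer: 0.1 };
  }
  if (lineCoverage < 80) {
    // Moderate coverage: split between implementation and tests.
    return { coder: 0.5, tester: 0.35, reviewer: 0.15 };
  }
  // Well-covered file: implementation and review dominate.
  return { coder: 0.7, tester: 0.1, reviewer: 0.2 };
}
```

A router would multiply these weights into whatever other routing signals it has (file type, diff size, agent load) before picking an agent.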
dist/core/coverage-router.d.ts.map ADDED
@@ -0,0 +1 @@
+ {"version":3,"file":"coverage-router.d.ts","sourceRoot":"","sources":["../../src/core/coverage-router.ts"],"names":[],"mappings":"AAAA;;;;;;;GAOG;AAKH,MAAM,WAAW,YAAY;IAC3B,IAAI,EAAE,MAAM,CAAC;IACb,KAAK,EAAE;QACL,KAAK,EAAE,MAAM,CAAC;QACd,OAAO,EAAE,MAAM,CAAC;QAChB,UAAU,EAAE,MAAM,CAAC;KACpB,CAAC;IACF,SAAS,EAAE;QACT,KAAK,EAAE,MAAM,CAAC;QACd,OAAO,EAAE,MAAM,CAAC;QAChB,UAAU,EAAE,MAAM,CAAC;KACpB,CAAC;IACF,QAAQ,EAAE;QACR,KAAK,EAAE,MAAM,CAAC;QACd,OAAO,EAAE,MAAM,CAAC;QAChB,UAAU,EAAE,MAAM,CAAC;KACpB,CAAC;IACF,cAAc,EAAE,MAAM,EAAE,CAAC;IACzB,kBAAkB,EAAE,MAAM,EAAE,CAAC;CAC9B;AAED,MAAM,WAAW,eAAe;IAC9B,KAAK,EAAE,GAAG,CAAC,MAAM,EAAE,YAAY,CAAC,CAAC;IACjC,OAAO,EAAE;QACP,KAAK,EAAE,MAAM,CAAC;QACd,SAAS,EAAE,MAAM,CAAC;QAClB,QAAQ,EAAE,MAAM,CAAC;KAClB,CAAC;IACF,gBAAgB,EAAE,MAAM,EAAE,CAAC;IAC3B,cAAc,EAAE,MAAM,EAAE,CAAC;CAC1B;AAED,MAAM,WAAW,cAAc;IAC7B,IAAI,EAAE,MAAM,CAAC;IACb,QAAQ,EAAE,MAAM,CAAC;IACjB,MAAM,EAAE,MAAM,CAAC;IACf,QAAQ,EAAE,MAAM,GAAG,QAAQ,GAAG,KAAK,CAAC;IACpC,QAAQ,EAAE,MAAM,CAAC;IACjB,kBAAkB,EAAE,MAAM,EAAE,CAAC;CAC9B;AAED;;GAEG;AACH,wBAAgB,qBAAqB,CAAC,YAAY,EAAE,MAAM,GAAG,eAAe,CA0F3E;AAED;;GAEG;AACH,wBAAgB,kBAAkB,CAAC,WAAW,GAAE,MAAsB,GAAG,MAAM,GAAG,IAAI,CAiBrF;AAED;;GAEG;AACH,wBAAgB,eAAe,CAAC,IAAI,EAAE,MAAM,EAAE,OAAO,CAAC,EAAE,eAAe,GAAG,YAAY,GAAG,IAAI,CAqB5F;AAED;;GAEG;AACH,wBAAgB,YAAY,CAAC,KAAK,EAAE,MAAM,EAAE,EAAE,OAAO,CAAC,EAAE,eAAe,GAAG,cAAc,EAAE,CAwEzF;AAED;;GAEG;AACH,wBAAgB,mBAAmB,CAAC,IAAI,EAAE,MAAM,EAAE,OAAO,CAAC,EAAE,eAAe,GAAG;IAC5E,KAAK,EAAE,OAAO,CAAC;IACf,MAAM,EAAE,MAAM,CAAC;IACf,QAAQ,EAAE,MAAM,CAAC;CAClB,CAgCA;AAED;;GAEG;AACH,wBAAgB,wBAAwB,CAAC,IAAI,EAAE,MAAM,EAAE,OAAO,CAAC,EAAE,eAAe,GAAG;IACjF,KAAK,EAAE,MAAM,CAAC;IACd,MAAM,EAAE,MAAM,CAAC;IACf,QAAQ,EAAE,MAAM,CAAC;CAClB,CAuBA;;;;;;;;;AAED,wBAOE"}
dist/core/coverage-router.js ADDED
@@ -0,0 +1,315 @@
+ "use strict";
+ /**
+ * Coverage Router - Test coverage-aware agent routing
+ *
+ * Uses test coverage data to make smarter routing decisions:
+ * - Prioritize testing for uncovered code
+ * - Route to tester agent for low-coverage files
+ * - Suggest test files for modified code
+ */
+ var __createBinding = (this && this.__createBinding) || (Object.create ? (function(o, m, k, k2) {
+ if (k2 === undefined) k2 = k;
+ var desc = Object.getOwnPropertyDescriptor(m, k);
+ if (!desc || ("get" in desc ? !m.__esModule : desc.writable || desc.configurable)) {
+ desc = { enumerable: true, get: function() { return m[k]; } };
+ }
+ Object.defineProperty(o, k2, desc);
+ }) : (function(o, m, k, k2) {
+ if (k2 === undefined) k2 = k;
+ o[k2] = m[k];
+ }));
+ var __setModuleDefault = (this && this.__setModuleDefault) || (Object.create ? (function(o, v) {
+ Object.defineProperty(o, "default", { enumerable: true, value: v });
+ }) : function(o, v) {
+ o["default"] = v;
+ });
+ var __importStar = (this && this.__importStar) || (function () {
+ var ownKeys = function(o) {
+ ownKeys = Object.getOwnPropertyNames || function (o) {
+ var ar = [];
+ for (var k in o) if (Object.prototype.hasOwnProperty.call(o, k)) ar[ar.length] = k;
+ return ar;
+ };
+ return ownKeys(o);
+ };
+ return function (mod) {
+ if (mod && mod.__esModule) return mod;
+ var result = {};
+ if (mod != null) for (var k = ownKeys(mod), i = 0; i < k.length; i++) if (k[i] !== "default") __createBinding(result, mod, k[i]);
+ __setModuleDefault(result, mod);
+ return result;
+ };
+ })();
+ Object.defineProperty(exports, "__esModule", { value: true });
+ exports.parseIstanbulCoverage = parseIstanbulCoverage;
+ exports.findCoverageReport = findCoverageReport;
+ exports.getFileCoverage = getFileCoverage;
+ exports.suggestTests = suggestTests;
+ exports.shouldRouteToTester = shouldRouteToTester;
+ exports.getCoverageRoutingWeight = getCoverageRoutingWeight;
+ const fs = __importStar(require("fs"));
+ const path = __importStar(require("path"));
+ /**
+ * Parse Istanbul/NYC JSON coverage report
+ */
+ function parseIstanbulCoverage(coveragePath) {
+ const files = new Map();
+ const lowCoverageFiles = [];
+ const uncoveredFiles = [];
+ let totalLines = 0, coveredLines = 0;
+ let totalFunctions = 0, coveredFunctions = 0;
+ let totalBranches = 0, coveredBranches = 0;
+ try {
+ const coverage = JSON.parse(fs.readFileSync(coveragePath, 'utf8'));
+ for (const [file, data] of Object.entries(coverage)) {
+ // Skip test files
+ if (file.includes('.test.') || file.includes('.spec.') || file.includes('__tests__')) {
+ continue;
+ }
+ // Parse statement coverage
+ const statements = Object.values(data.s || {});
+ const linesCovered = statements.filter(n => n > 0).length;
+ const linesTotal = statements.length;
+ // Parse function coverage
+ const functions = Object.values(data.f || {});
+ const fnCovered = functions.filter(n => n > 0).length;
+ const fnTotal = functions.length;
+ // Parse branch coverage
+ const branches = Object.values(data.b || {}).flat();
+ const brCovered = branches.filter(n => n > 0).length;
+ const brTotal = branches.length;
+ // Find uncovered lines
+ const uncoveredLines = [];
+ for (const [line, count] of Object.entries(data.s || {})) {
+ if (count === 0) {
+ uncoveredLines.push(parseInt(line));
+ }
+ }
+ // Find uncovered functions
+ const uncoveredFunctions = [];
+ const fnMap = data.fnMap || {};
+ for (const [fnId, count] of Object.entries(data.f || {})) {
+ if (count === 0 && fnMap[fnId]) {
+ uncoveredFunctions.push(fnMap[fnId].name || `function_${fnId}`);
+ }
+ }
+ const linePercentage = linesTotal > 0 ? (linesCovered / linesTotal) * 100 : 100;
+ const fnPercentage = fnTotal > 0 ? (fnCovered / fnTotal) * 100 : 100;
+ const brPercentage = brTotal > 0 ? (brCovered / brTotal) * 100 : 100;
+ files.set(file, {
+ file,
+ lines: { total: linesTotal, covered: linesCovered, percentage: linePercentage },
+ functions: { total: fnTotal, covered: fnCovered, percentage: fnPercentage },
+ branches: { total: brTotal, covered: brCovered, percentage: brPercentage },
+ uncoveredLines,
+ uncoveredFunctions,
+ });
+ totalLines += linesTotal;
+ coveredLines += linesCovered;
+ totalFunctions += fnTotal;
+ coveredFunctions += fnCovered;
+ totalBranches += brTotal;
+ coveredBranches += brCovered;
+ if (linePercentage < 50) {
+ lowCoverageFiles.push(file);
+ }
+ if (linePercentage === 0 && linesTotal > 0) {
+ uncoveredFiles.push(file);
+ }
+ }
+ }
+ catch (e) {
+ // Return empty summary on error
+ }
+ return {
+ files,
+ overall: {
+ lines: totalLines > 0 ? (coveredLines / totalLines) * 100 : 0,
+ functions: totalFunctions > 0 ? (coveredFunctions / totalFunctions) * 100 : 0,
+ branches: totalBranches > 0 ? (coveredBranches / totalBranches) * 100 : 0,
+ },
+ lowCoverageFiles,
+ uncoveredFiles,
+ };
+ }
+ /**
+ * Find coverage report in project
+ */
+ function findCoverageReport(projectRoot = process.cwd()) {
+ const possiblePaths = [
+ 'coverage/coverage-final.json',
+ 'coverage/coverage-summary.json',
+ '.nyc_output/coverage.json',
+ 'coverage.json',
+ 'coverage/lcov.info',
+ ];
+ for (const p of possiblePaths) {
+ const fullPath = path.join(projectRoot, p);
+ if (fs.existsSync(fullPath)) {
+ return fullPath;
+ }
+ }
+ return null;
+ }
+ /**
+ * Get coverage data for a specific file
+ */
+ function getFileCoverage(file, summary) {
+ if (!summary) {
+ const reportPath = findCoverageReport();
+ if (!reportPath)
+ return null;
+ summary = parseIstanbulCoverage(reportPath);
+ }
+ // Try exact match first
+ if (summary.files.has(file)) {
+ return summary.files.get(file);
+ }
+ // Try matching by basename
+ const basename = path.basename(file);
+ for (const [key, data] of summary.files) {
+ if (key.endsWith(file) || key.endsWith(basename)) {
+ return data;
+ }
+ }
+ return null;
+ }
+ /**
+ * Suggest tests for files based on coverage
+ */
+ function suggestTests(files, summary) {
+ if (!summary) {
+ const reportPath = findCoverageReport();
+ if (reportPath) {
+ summary = parseIstanbulCoverage(reportPath);
+ }
+ }
+ const suggestions = [];
+ for (const file of files) {
+ const coverage = summary ? getFileCoverage(file, summary) : null;
+ // Determine test file path
+ const ext = path.extname(file);
+ const base = path.basename(file, ext);
+ const dir = path.dirname(file);
+ const possibleTestFiles = [
+ path.join(dir, `${base}.test${ext}`),
+ path.join(dir, `${base}.spec${ext}`),
+ path.join(dir, '__tests__', `${base}.test${ext}`),
+ path.join('test', `${base}.test${ext}`),
+ path.join('tests', `${base}.test${ext}`),
+ ];
+ const existingTestFile = possibleTestFiles.find(t => fs.existsSync(t));
+ const testFile = existingTestFile || possibleTestFiles[0];
+ if (!coverage) {
+ suggestions.push({
+ file,
+ testFile,
+ reason: 'No coverage data - needs test file',
+ priority: 'high',
+ coverage: 0,
+ uncoveredFunctions: [],
+ });
+ }
+ else if (coverage.lines.percentage < 30) {
+ suggestions.push({
+ file,
+ testFile,
+ reason: `Very low coverage (${coverage.lines.percentage.toFixed(1)}%)`,
+ priority: 'high',
+ coverage: coverage.lines.percentage,
+ uncoveredFunctions: coverage.uncoveredFunctions,
+ });
+ }
+ else if (coverage.lines.percentage < 70) {
+ suggestions.push({
+ file,
+ testFile,
+ reason: `Low coverage (${coverage.lines.percentage.toFixed(1)}%)`,
+ priority: 'medium',
+ coverage: coverage.lines.percentage,
+ uncoveredFunctions: coverage.uncoveredFunctions,
+ });
+ }
+ else if (coverage.uncoveredFunctions.length > 0) {
+ suggestions.push({
+ file,
+ testFile,
+ reason: `${coverage.uncoveredFunctions.length} untested functions`,
+ priority: 'low',
+ coverage: coverage.lines.percentage,
+ uncoveredFunctions: coverage.uncoveredFunctions,
+ });
+ }
+ }
+ return suggestions.sort((a, b) => {
+ const priorityOrder = { high: 0, medium: 1, low: 2 };
+ return priorityOrder[a.priority] - priorityOrder[b.priority];
+ });
+ }
+ /**
+ * Determine if a file needs the tester agent based on coverage
+ */
+ function shouldRouteToTester(file, summary) {
+ const coverage = getFileCoverage(file, summary);
+ if (!coverage) {
+ return {
+ route: true,
+ reason: 'No test coverage data available',
+ coverage: 0,
+ };
+ }
+ if (coverage.lines.percentage < 50) {
+ return {
+ route: true,
+ reason: `Low coverage: ${coverage.lines.percentage.toFixed(1)}%`,
+ coverage: coverage.lines.percentage,
+ };
+ }
+ if (coverage.uncoveredFunctions.length > 3) {
+ return {
+ route: true,
+ reason: `${coverage.uncoveredFunctions.length} untested functions`,
+ coverage: coverage.lines.percentage,
+ };
+ }
+ return {
+ route: false,
+ reason: `Adequate coverage: ${coverage.lines.percentage.toFixed(1)}%`,
+ coverage: coverage.lines.percentage,
+ };
+ }
+ /**
+ * Get coverage-aware routing weight for agent selection
+ */
+ function getCoverageRoutingWeight(file, summary) {
+ const coverage = getFileCoverage(file, summary);
+ if (!coverage) {
+ // No coverage = prioritize testing
+ return { coder: 0.3, tester: 0.5, reviewer: 0.2 };
+ }
+ const pct = coverage.lines.percentage;
+ if (pct < 30) {
+ // Very low - strongly prioritize testing
+ return { coder: 0.2, tester: 0.6, reviewer: 0.2 };
+ }
+ else if (pct < 60) {
+ // Low - moderate testing priority
+ return { coder: 0.4, tester: 0.4, reviewer: 0.2 };
+ }
+ else if (pct < 80) {
+ // Okay - balanced
+ return { coder: 0.5, tester: 0.3, reviewer: 0.2 };
+ }
+ else {
+ // Good - focus on code quality
+ return { coder: 0.5, tester: 0.2, reviewer: 0.3 };
+ }
+ }
+ exports.default = {
+ parseIstanbulCoverage,
+ findCoverageReport,
+ getFileCoverage,
+ suggestTests,
+ shouldRouteToTester,
+ getCoverageRoutingWeight,
+ };
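The tiered weights in `getCoverageRoutingWeight` above can be sketched on their own. The snippet below is a hypothetical stand-alone restatement of the threshold table (it is not the exported API; `routingWeight` and the `null` convention for "no data" are assumptions for illustration), useful for sanity-checking the tiers:

```javascript
// Sketch of the coverage-based routing-weight tiers used by
// getCoverageRoutingWeight. `pct` is the line-coverage percentage;
// `null` stands in for "no coverage data available".
function routingWeight(pct) {
  if (pct === null) return { coder: 0.3, tester: 0.5, reviewer: 0.2 }; // no data: prioritize testing
  if (pct < 30) return { coder: 0.2, tester: 0.6, reviewer: 0.2 };     // very low
  if (pct < 60) return { coder: 0.4, tester: 0.4, reviewer: 0.2 };     // low
  if (pct < 80) return { coder: 0.5, tester: 0.3, reviewer: 0.2 };     // okay
  return { coder: 0.5, tester: 0.2, reviewer: 0.3 };                   // good: shift to review
}

console.log(routingWeight(25).tester);   // 0.6
console.log(routingWeight(95).reviewer); // 0.3
```

Note that the weights in every tier sum to 1.0, so they can be used directly as a probability distribution over agents.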
dist/core/diff-embeddings.d.ts ADDED
@@ -0,0 +1,93 @@
+ /**
+ * Diff Embeddings - Semantic encoding of git diffs
+ *
+ * Generates embeddings for code changes to enable:
+ * - Change classification (feature, bugfix, refactor)
+ * - Similar change detection
+ * - Risk assessment
+ * - Review prioritization
+ */
+ export interface DiffHunk {
+ file: string;
+ oldStart: number;
+ oldLines: number;
+ newStart: number;
+ newLines: number;
+ content: string;
+ additions: string[];
+ deletions: string[];
+ }
+ export interface DiffAnalysis {
+ file: string;
+ hunks: DiffHunk[];
+ totalAdditions: number;
+ totalDeletions: number;
+ complexity: number;
+ riskScore: number;
+ category: 'feature' | 'bugfix' | 'refactor' | 'docs' | 'test' | 'config' | 'unknown';
+ embedding?: number[];
+ }
+ export interface CommitAnalysis {
+ hash: string;
+ message: string;
+ author: string;
+ date: string;
+ files: DiffAnalysis[];
+ totalAdditions: number;
+ totalDeletions: number;
+ riskScore: number;
+ embedding?: number[];
+ }
+ /**
+ * Parse a unified diff into hunks
+ */
+ export declare function parseDiff(diff: string): DiffHunk[];
+ /**
+ * Classify a change based on patterns
+ */
+ export declare function classifyChange(diff: string, message?: string): 'feature' | 'bugfix' | 'refactor' | 'docs' | 'test' | 'config' | 'unknown';
+ /**
+ * Calculate risk score for a diff
+ */
+ export declare function calculateRiskScore(analysis: DiffAnalysis): number;
+ /**
+ * Analyze a single file diff
+ */
+ export declare function analyzeFileDiff(file: string, diff: string, message?: string): Promise<DiffAnalysis>;
+ /**
+ * Get diff for a commit
+ */
+ export declare function getCommitDiff(commitHash?: string): string;
+ /**
+ * Get diff for staged changes
+ */
+ export declare function getStagedDiff(): string;
+ /**
+ * Get diff for unstaged changes
+ */
+ export declare function getUnstagedDiff(): string;
+ /**
+ * Analyze a commit
+ */
+ export declare function analyzeCommit(commitHash?: string): Promise<CommitAnalysis>;
+ /**
+ * Find similar past commits based on diff embeddings
+ */
+ export declare function findSimilarCommits(currentDiff: string, recentCommits?: number, topK?: number): Promise<Array<{
+ hash: string;
+ similarity: number;
+ message: string;
+ }>>;
+ declare const _default: {
+ parseDiff: typeof parseDiff;
+ classifyChange: typeof classifyChange;
+ calculateRiskScore: typeof calculateRiskScore;
+ analyzeFileDiff: typeof analyzeFileDiff;
+ analyzeCommit: typeof analyzeCommit;
+ getCommitDiff: typeof getCommitDiff;
+ getStagedDiff: typeof getStagedDiff;
+ getUnstagedDiff: typeof getUnstagedDiff;
+ findSimilarCommits: typeof findSimilarCommits;
+ };
+ export default _default;
+ //# sourceMappingURL=diff-embeddings.d.ts.map
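The `classifyChange` declaration above checks the commit message first. As a minimal sketch (assuming only the message-based branch; `classifyByMessage` is a hypothetical helper, not part of the package), the precedence of the patterns can be shown like this:

```javascript
// Sketch of the message-pattern branch of classifyChange: the first
// matching pattern wins, so "fix" beats "add" even if both appear.
function classifyByMessage(message) {
  const m = message.toLowerCase();
  if (/\b(fix|bug|issue|error|crash|patch)\b/.test(m)) return 'bugfix';
  if (/\b(feat|feature|add|new|implement)\b/.test(m)) return 'feature';
  if (/\b(refactor|clean|improve|optimize)\b/.test(m)) return 'refactor';
  return 'unknown';
}

console.log(classifyByMessage('fix: crash on save'));   // bugfix
console.log(classifyByMessage('add login page'));       // feature
console.log(classifyByMessage('refactor parser loop')); // refactor
```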
dist/core/diff-embeddings.d.ts.map ADDED
@@ -0,0 +1 @@
+ {"version":3,"file":"diff-embeddings.d.ts","sourceRoot":"","sources":["../../src/core/diff-embeddings.ts"],"names":[],"mappings":"AAAA;;;;;;;;GAQG;AAKH,MAAM,WAAW,QAAQ;IACvB,IAAI,EAAE,MAAM,CAAC;IACb,QAAQ,EAAE,MAAM,CAAC;IACjB,QAAQ,EAAE,MAAM,CAAC;IACjB,QAAQ,EAAE,MAAM,CAAC;IACjB,QAAQ,EAAE,MAAM,CAAC;IACjB,OAAO,EAAE,MAAM,CAAC;IAChB,SAAS,EAAE,MAAM,EAAE,CAAC;IACpB,SAAS,EAAE,MAAM,EAAE,CAAC;CACrB;AAED,MAAM,WAAW,YAAY;IAC3B,IAAI,EAAE,MAAM,CAAC;IACb,KAAK,EAAE,QAAQ,EAAE,CAAC;IAClB,cAAc,EAAE,MAAM,CAAC;IACvB,cAAc,EAAE,MAAM,CAAC;IACvB,UAAU,EAAE,MAAM,CAAC;IACnB,SAAS,EAAE,MAAM,CAAC;IAClB,QAAQ,EAAE,SAAS,GAAG,QAAQ,GAAG,UAAU,GAAG,MAAM,GAAG,MAAM,GAAG,QAAQ,GAAG,SAAS,CAAC;IACrF,SAAS,CAAC,EAAE,MAAM,EAAE,CAAC;CACtB;AAED,MAAM,WAAW,cAAc;IAC7B,IAAI,EAAE,MAAM,CAAC;IACb,OAAO,EAAE,MAAM,CAAC;IAChB,MAAM,EAAE,MAAM,CAAC;IACf,IAAI,EAAE,MAAM,CAAC;IACb,KAAK,EAAE,YAAY,EAAE,CAAC;IACtB,cAAc,EAAE,MAAM,CAAC;IACvB,cAAc,EAAE,MAAM,CAAC;IACvB,SAAS,EAAE,MAAM,CAAC;IAClB,SAAS,CAAC,EAAE,MAAM,EAAE,CAAC;CACtB;AAED;;GAEG;AACH,wBAAgB,SAAS,CAAC,IAAI,EAAE,MAAM,GAAG,QAAQ,EAAE,CAsDlD;AAED;;GAEG;AACH,wBAAgB,cAAc,CAAC,IAAI,EAAE,MAAM,EAAE,OAAO,GAAE,MAAW,GAAG,SAAS,GAAG,QAAQ,GAAG,UAAU,GAAG,MAAM,GAAG,MAAM,GAAG,QAAQ,GAAG,SAAS,CAsB7I;AAED;;GAEG;AACH,wBAAgB,kBAAkB,CAAC,QAAQ,EAAE,YAAY,GAAG,MAAM,CA2BjE;AAED;;GAEG;AACH,wBAAsB,eAAe,CAAC,IAAI,EAAE,MAAM,EAAE,IAAI,EAAE,MAAM,EAAE,OAAO,GAAE,MAAW,GAAG,OAAO,CAAC,YAAY,CAAC,CAsC7G;AAED;;GAEG;AACH,wBAAgB,aAAa,CAAC,UAAU,GAAE,MAAe,GAAG,MAAM,CASjE;AAED;;GAEG;AACH,wBAAgB,aAAa,IAAI,MAAM,CAStC;AAED;;GAEG;AACH,wBAAgB,eAAe,IAAI,MAAM,CASxC;AAED;;GAEG;AACH,wBAAsB,aAAa,CAAC,UAAU,GAAE,MAAe,GAAG,OAAO,CAAC,cAAc,CAAC,CAwDxF;AAED;;GAEG;AACH,wBAAsB,kBAAkB,CACtC,WAAW,EAAE,MAAM,EACnB,aAAa,GAAE,MAAW,EAC1B,IAAI,GAAE,MAAU,GACf,OAAO,CAAC,KAAK,CAAC;IAAE,IAAI,EAAE,MAAM,CAAC;IAAC,UAAU,EAAE,MAAM,CAAC;IAAC,OAAO,EAAE,MAAM,CAAA;CAAE,CAAC,CAAC,CAgCvE;;;;;;;;;;;;AAmBD,wBAUE"}
dist/core/diff-embeddings.js ADDED
@@ -0,0 +1,334 @@
+ "use strict";
+ /**
+ * Diff Embeddings - Semantic encoding of git diffs
+ *
+ * Generates embeddings for code changes to enable:
+ * - Change classification (feature, bugfix, refactor)
+ * - Similar change detection
+ * - Risk assessment
+ * - Review prioritization
+ */
+ Object.defineProperty(exports, "__esModule", { value: true });
+ exports.parseDiff = parseDiff;
+ exports.classifyChange = classifyChange;
+ exports.calculateRiskScore = calculateRiskScore;
+ exports.analyzeFileDiff = analyzeFileDiff;
+ exports.getCommitDiff = getCommitDiff;
+ exports.getStagedDiff = getStagedDiff;
+ exports.getUnstagedDiff = getUnstagedDiff;
+ exports.analyzeCommit = analyzeCommit;
+ exports.findSimilarCommits = findSimilarCommits;
+ const child_process_1 = require("child_process");
+ const onnx_embedder_1 = require("./onnx-embedder");
+ /**
+ * Parse a unified diff into hunks
+ */
+ function parseDiff(diff) {
+ const hunks = [];
+ const lines = diff.split('\n');
+ let currentFile = '';
+ let currentHunk = null;
+ for (const line of lines) {
+ // File header
+ if (line.startsWith('diff --git')) {
+ const match = line.match(/diff --git a\/(.+) b\/(.+)/);
+ if (match) {
+ currentFile = match[2];
+ }
+ }
+ // Hunk header
+ if (line.startsWith('@@')) {
+ if (currentHunk) {
+ hunks.push(currentHunk);
+ }
+ const match = line.match(/@@ -(\d+),?(\d*) \+(\d+),?(\d*) @@/);
+ if (match) {
+ currentHunk = {
+ file: currentFile,
+ oldStart: parseInt(match[1]),
+ oldLines: parseInt(match[2] || '1'),
+ newStart: parseInt(match[3]),
+ newLines: parseInt(match[4] || '1'),
+ content: '',
+ additions: [],
+ deletions: [],
+ };
+ }
+ }
+ else if (currentHunk) {
+ // Content lines
+ if (line.startsWith('+') && !line.startsWith('+++')) {
+ currentHunk.additions.push(line.substring(1));
+ currentHunk.content += line + '\n';
+ }
+ else if (line.startsWith('-') && !line.startsWith('---')) {
+ currentHunk.deletions.push(line.substring(1));
+ currentHunk.content += line + '\n';
+ }
+ else if (line.startsWith(' ')) {
+ currentHunk.content += line + '\n';
+ }
+ }
+ }
+ if (currentHunk) {
+ hunks.push(currentHunk);
+ }
+ return hunks;
+ }
+ /**
+ * Classify a change based on patterns
+ */
+ function classifyChange(diff, message = '') {
+ const lowerMessage = message.toLowerCase();
+ const lowerDiff = diff.toLowerCase();
+ // Check message patterns
+ if (/\b(fix|bug|issue|error|crash|patch)\b/.test(lowerMessage))
+ return 'bugfix';
+ if (/\b(feat|feature|add|new|implement)\b/.test(lowerMessage))
+ return 'feature';
+ if (/\b(refactor|clean|improve|optimize)\b/.test(lowerMessage))
+ return 'refactor';
+ if (/\b(doc|readme|comment|jsdoc)\b/.test(lowerMessage))
+ return 'docs';
+ if (/\b(test|spec|coverage)\b/.test(lowerMessage))
+ return 'test';
+ if (/\b(config|ci|cd|build|deps)\b/.test(lowerMessage))
+ return 'config';
+ // Check diff patterns
+ if (/\.(md|txt|rst)$/.test(diff))
+ return 'docs';
+ if (/\.(test|spec)\.[jt]sx?/.test(diff))
+ return 'test';
+ if (/\.(json|ya?ml|toml|ini)$/.test(diff))
+ return 'config';
+ // Check content patterns
+ if (/\bcatch\b|\btry\b|\berror\b/.test(lowerDiff) && /\bfix\b/.test(lowerDiff))
+ return 'bugfix';
+ if (/\bfunction\b|\bclass\b|\bexport\b/.test(lowerDiff))
+ return 'feature';
+ return 'unknown';
+ }
+ /**
+ * Calculate risk score for a diff
+ */
+ function calculateRiskScore(analysis) {
+ let risk = 0;
+ // Size risk
+ const totalChanges = analysis.totalAdditions + analysis.totalDeletions;
+ if (totalChanges > 500)
+ risk += 0.3;
+ else if (totalChanges > 200)
+ risk += 0.2;
+ else if (totalChanges > 50)
+ risk += 0.1;
+ // Complexity risk
+ if (analysis.complexity > 20)
+ risk += 0.2;
+ else if (analysis.complexity > 10)
+ risk += 0.1;
+ // File type risk
+ if (analysis.file.includes('auth') || analysis.file.includes('security'))
+ risk += 0.2;
+ if (analysis.file.includes('database') || analysis.file.includes('migration'))
+ risk += 0.15;
+ if (analysis.file.includes('api') || analysis.file.includes('endpoint'))
+ risk += 0.1;
+ // Pattern risk (deletions of error handling, etc.)
+ for (const hunk of analysis.hunks) {
+ for (const del of hunk.deletions) {
+ if (/\bcatch\b|\berror\b|\bvalidat/.test(del))
+ risk += 0.1;
+ if (/\bif\b.*\bnull\b|\bundefined\b/.test(del))
+ risk += 0.05;
+ }
+ }
+ return Math.min(1, risk);
+ }
+ /**
+ * Analyze a single file diff
+ */
+ async function analyzeFileDiff(file, diff, message = '') {
+ const hunks = parseDiff(diff).filter(h => h.file === file || h.file === '');
+ const totalAdditions = hunks.reduce((sum, h) => sum + h.additions.length, 0);
+ const totalDeletions = hunks.reduce((sum, h) => sum + h.deletions.length, 0);
+ // Calculate complexity (branch keywords in additions)
+ let complexity = 0;
+ for (const hunk of hunks) {
+ for (const add of hunk.additions) {
+ if (/\bif\b|\belse\b|\bfor\b|\bwhile\b|\bswitch\b|\bcatch\b|\?/.test(add)) {
+ complexity++;
+ }
+ }
+ }
+ const category = classifyChange(diff, message);
+ const analysis = {
+ file,
+ hunks,
+ totalAdditions,
+ totalDeletions,
+ complexity,
+ riskScore: 0,
+ category,
+ };
+ analysis.riskScore = calculateRiskScore(analysis);
+ // Generate embedding for the diff
+ if ((0, onnx_embedder_1.isReady)()) {
+ const diffText = hunks.map(h => h.content).join('\n');
+ const result = await (0, onnx_embedder_1.embed)(`${category} change in ${file}: ${diffText.substring(0, 500)}`);
+ analysis.embedding = result.embedding;
+ }
+ return analysis;
+ }
+ /**
+ * Get diff for a commit
+ */
+ function getCommitDiff(commitHash = 'HEAD') {
+ try {
+ return (0, child_process_1.execSync)(`git show ${commitHash} --format="" 2>/dev/null`, {
+ encoding: 'utf8',
+ maxBuffer: 10 * 1024 * 1024,
+ });
+ }
+ catch {
+ return '';
+ }
+ }
+ /**
+ * Get diff for staged changes
+ */
+ function getStagedDiff() {
+ try {
+ return (0, child_process_1.execSync)('git diff --cached 2>/dev/null', {
+ encoding: 'utf8',
+ maxBuffer: 10 * 1024 * 1024,
+ });
+ }
+ catch {
+ return '';
+ }
+ }
+ /**
+ * Get diff for unstaged changes
+ */
+ function getUnstagedDiff() {
+ try {
+ return (0, child_process_1.execSync)('git diff 2>/dev/null', {
+ encoding: 'utf8',
+ maxBuffer: 10 * 1024 * 1024,
+ });
+ }
+ catch {
+ return '';
+ }
+ }
+ /**
+ * Analyze a commit
+ */
+ async function analyzeCommit(commitHash = 'HEAD') {
+ const diff = getCommitDiff(commitHash);
+ // Get commit metadata
+ let message = '', author = '', date = '';
+ try {
+ const info = (0, child_process_1.execSync)(`git log -1 --format="%s|%an|%aI" ${commitHash} 2>/dev/null`, {
+ encoding: 'utf8',
+ }).trim();
+ [message, author, date] = info.split('|');
+ }
+ catch { }
+ // Parse hunks and group by file
+ const hunks = parseDiff(diff);
+ const fileHunks = new Map();
+ for (const hunk of hunks) {
+ if (!fileHunks.has(hunk.file)) {
+ fileHunks.set(hunk.file, []);
+ }
+ fileHunks.get(hunk.file).push(hunk);
+ }
+ // Analyze each file
+ const files = [];
+ for (const [file, fileHunkList] of fileHunks) {
+ const fileDiff = fileHunkList.map(h => h.content).join('\n');
+ const analysis = await analyzeFileDiff(file, diff, message);
+ files.push(analysis);
+ }
+ const totalAdditions = files.reduce((sum, f) => sum + f.totalAdditions, 0);
+ const totalDeletions = files.reduce((sum, f) => sum + f.totalDeletions, 0);
+ const riskScore = files.length > 0
+ ? files.reduce((sum, f) => sum + f.riskScore, 0) / files.length
+ : 0;
+ // Generate commit embedding
+ let embedding;
+ if ((0, onnx_embedder_1.isReady)()) {
+ const commitText = `${message}\n\nFiles changed: ${files.map(f => f.file).join(', ')}\n+${totalAdditions} -${totalDeletions}`;
+ const result = await (0, onnx_embedder_1.embed)(commitText);
+ embedding = result.embedding;
+ }
+ return {
+ hash: commitHash,
+ message,
+ author,
+ date,
+ files,
+ totalAdditions,
+ totalDeletions,
+ riskScore,
+ embedding,
+ };
+ }
+ /**
+ * Find similar past commits based on diff embeddings
+ */
+ async function findSimilarCommits(currentDiff, recentCommits = 50, topK = 5) {
+ if (!(0, onnx_embedder_1.isReady)()) {
+ await (0, onnx_embedder_1.initOnnxEmbedder)();
+ }
+ // Get current diff embedding
+ const currentEmbedding = (await (0, onnx_embedder_1.embed)(currentDiff.substring(0, 1000))).embedding;
+ // Get recent commits
+ let commits = [];
+ try {
+ commits = (0, child_process_1.execSync)(`git log -${recentCommits} --format="%H" 2>/dev/null`, {
+ encoding: 'utf8',
+ }).trim().split('\n');
+ }
+ catch {
+ return [];
+ }
+ // Analyze and compare
+ const results = [];
+ for (const hash of commits.slice(0, Math.min(commits.length, recentCommits))) {
+ const analysis = await analyzeCommit(hash);
+ if (analysis.embedding) {
+ const similarity = cosineSimilarity(currentEmbedding, analysis.embedding);
+ results.push({ hash, similarity, message: analysis.message });
+ }
+ }
+ return results
+ .sort((a, b) => b.similarity - a.similarity)
+ .slice(0, topK);
+ }
+ function cosineSimilarity(a, b) {
+ if (a.length !== b.length)
+ return 0;
+ let dotProduct = 0;
+ let normA = 0;
+ let normB = 0;
+ for (let i = 0; i < a.length; i++) {
+ dotProduct += a[i] * b[i];
+ normA += a[i] * a[i];
+ normB += b[i] * b[i];
+ }
+ const magnitude = Math.sqrt(normA) * Math.sqrt(normB);
+ return magnitude === 0 ? 0 : dotProduct / magnitude;
+ }
+ exports.default = {
+ parseDiff,
+ classifyChange,
+ calculateRiskScore,
+ analyzeFileDiff,
+ analyzeCommit,
+ getCommitDiff,
+ getStagedDiff,
+ getUnstagedDiff,
+ findSimilarCommits,
+ };
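The hunk-header parsing in `parseDiff` above hinges on one regex. The sketch below isolates just that step (`parseHunkHeader` is a hypothetical helper written for illustration, not part of the module):

```javascript
// Sketch of the hunk-header regex used by parseDiff above.
// "@@ -10,5 +12,7 @@" means: old file from line 10 (5 lines),
// new file from line 12 (7 lines). A missing count defaults to 1.
const HUNK_RE = /@@ -(\d+),?(\d*) \+(\d+),?(\d*) @@/;

function parseHunkHeader(line) {
  const m = line.match(HUNK_RE);
  if (!m) return null;
  return {
    oldStart: parseInt(m[1]),
    oldLines: parseInt(m[2] || '1'), // count omitted => single-line hunk
    newStart: parseInt(m[3]),
    newLines: parseInt(m[4] || '1'),
  };
}

console.log(parseHunkHeader('@@ -10,5 +12,7 @@'));
// { oldStart: 10, oldLines: 5, newStart: 12, newLines: 7 }
console.log(parseHunkHeader('@@ -3 +3 @@').oldLines); // 1
```

The `m[2] || '1'` fallback matters: `git` omits the line count for single-line hunks, and without the default `parseInt('')` would yield `NaN`.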
dist/core/gnn-wrapper.d.ts ADDED
@@ -0,0 +1,143 @@
+ /**
+ * GNN Wrapper - Safe wrapper around @ruvector/gnn with automatic array conversion
+ *
+ * This wrapper handles the array type conversion automatically, allowing users
+ * to pass either regular arrays or Float32Arrays.
+ *
+ * The native @ruvector/gnn requires Float32Array for maximum performance.
+ * This wrapper converts any input type to Float32Array automatically.
+ *
+ * Performance Tips:
+ * - Pass Float32Array directly for zero-copy performance
+ * - Use toFloat32Array/toFloat32ArrayBatch for pre-conversion
+ * - Avoid repeated conversions in hot paths
+ */
+ /**
+ * Convert any array-like input to Float32Array (native requires Float32Array)
+ * Optimized paths:
+ * - Float32Array: zero-copy return
+ * - Float64Array: efficient typed array copy
+ * - Array: direct Float32Array construction
+ */
+ export declare function toFloat32Array(input: number[] | Float32Array | Float64Array): Float32Array;
+ /**
+ * Convert array of arrays to array of Float32Arrays
+ */
+ export declare function toFloat32ArrayBatch(input: (number[] | Float32Array | Float64Array)[]): Float32Array[];
+ /**
+ * Search result from differentiable search
+ */
+ export interface DifferentiableSearchResult {
+ /** Indices of top-k candidates */
+ indices: number[];
+ /** Soft weights for top-k candidates */
+ weights: number[];
+ }
+ /**
+ * Differentiable search using soft attention mechanism
+ *
+ * This wrapper automatically converts inputs to Float32Array.
+ *
+ * @param query - Query vector (array or Float32Array)
+ * @param candidates - List of candidate vectors (arrays or Float32Arrays)
+ * @param k - Number of top results to return
+ * @param temperature - Temperature for softmax (lower = sharper, higher = smoother)
+ * @returns Search result with indices and soft weights
+ *
+ * @example
+ * ```typescript
+ * import { differentiableSearch } from 'ruvector/core/gnn-wrapper';
+ *
+ * // Works with regular arrays (auto-converted to Float32Array)
+ * const result1 = differentiableSearch([1, 0, 0], [[1, 0, 0], [0, 1, 0]], 2, 1.0);
+ *
+ * // For best performance, use Float32Array directly (zero-copy)
+ * const query = new Float32Array([1, 0, 0]);
+ * const candidates = [new Float32Array([1, 0, 0]), new Float32Array([0, 1, 0])];
+ * const result2 = differentiableSearch(query, candidates, 2, 1.0);
+ * ```
+ */
+ export declare function differentiableSearch(query: number[] | Float32Array | Float64Array, candidates: (number[] | Float32Array | Float64Array)[], k: number, temperature?: number): DifferentiableSearchResult;
+ /**
+ * GNN Layer for HNSW topology
+ */
+ export declare class RuvectorLayer {
+ private inner;
+ /**
+ * Create a new Ruvector GNN layer
+ *
+ * @param inputDim - Dimension of input node embeddings
+ * @param hiddenDim - Dimension of hidden representations
+ * @param heads - Number of attention heads
+ * @param dropout - Dropout rate (0.0 to 1.0)
+ */
+ constructor(inputDim: number, hiddenDim: number, heads: number, dropout?: number);
+ /**
+ * Forward pass through the GNN layer
+ *
+ * @param nodeEmbedding - Current node's embedding
+ * @param neighborEmbeddings - Embeddings of neighbor nodes
+ * @param edgeWeights - Weights of edges to neighbors
+ * @returns Updated node embedding as Float32Array
+ */
+ forward(nodeEmbedding: number[] | Float32Array, neighborEmbeddings: (number[] | Float32Array)[], edgeWeights: number[] | Float32Array): Float32Array;
+ /**
+ * Serialize the layer to JSON
+ */
+ toJson(): string;
+ /**
+ * Deserialize the layer from JSON
+ */
+ static fromJson(json: string): RuvectorLayer;
+ }
+ /**
+ * Tensor compressor with adaptive level selection
+ */
+ export declare class TensorCompress {
+ private inner;
+ constructor();
+ /**
+ * Compress an embedding based on access frequency
+ *
+ * @param embedding - Input embedding vector
+ * @param accessFreq - Access frequency (0.0 to 1.0)
+ * @returns Compressed tensor as JSON string
+ */
+ compress(embedding: number[] | Float32Array, accessFreq: number): string;
+ /**
+ * Decompress a compressed tensor
+ *
+ * @param compressedJson - Compressed tensor JSON
+ * @returns Decompressed embedding
+ */
+ decompress(compressedJson: string): number[];
+ }
+ /**
+ * Hierarchical forward pass through GNN layers
+ *
+ * @param query - Query vector
+ * @param layerEmbeddings - Embeddings organized by layer
+ * @param gnnLayersJson - JSON array of serialized GNN layers
+ * @returns Final embedding after hierarchical processing as Float32Array
+ */
+ export declare function hierarchicalForward(query: number[] | Float32Array, layerEmbeddings: (number[] | Float32Array)[][], gnnLayersJson: string[]): Float32Array;
+ /**
+ * Get compression level for a given access frequency
+ */
+ export declare function getCompressionLevel(accessFreq: number): string;
+ /**
+ * Check if GNN module is available
+ */
+ export declare function isGnnAvailable(): boolean;
+ declare const _default: {
+ differentiableSearch: typeof differentiableSearch;
+ RuvectorLayer: typeof RuvectorLayer;
+ TensorCompress: typeof TensorCompress;
+ hierarchicalForward: typeof hierarchicalForward;
+ getCompressionLevel: typeof getCompressionLevel;
+ isGnnAvailable: typeof isGnnAvailable;
+ toFloat32Array: typeof toFloat32Array;
+ toFloat32ArrayBatch: typeof toFloat32ArrayBatch;
+ };
+ export default _default;
+ //# sourceMappingURL=gnn-wrapper.d.ts.map
dist/core/gnn-wrapper.d.ts.map ADDED
@@ -0,0 +1 @@
1
+ {"version":3,"file":"gnn-wrapper.d.ts","sourceRoot":"","sources":["../../src/core/gnn-wrapper.ts"],"names":[],"mappings":"AAAA;;;;;;;;;;;;;GAaG;AAsBH;;;;;;GAMG;AACH,wBAAgB,cAAc,CAAC,KAAK,EAAE,MAAM,EAAE,GAAG,YAAY,GAAG,YAAY,GAAG,YAAY,CAK1F;AAED;;GAEG;AACH,wBAAgB,mBAAmB,CAAC,KAAK,EAAE,CAAC,MAAM,EAAE,GAAG,YAAY,GAAG,YAAY,CAAC,EAAE,GAAG,YAAY,EAAE,CAMrG;AAED;;GAEG;AACH,MAAM,WAAW,0BAA0B;IACzC,kCAAkC;IAClC,OAAO,EAAE,MAAM,EAAE,CAAC;IAClB,wCAAwC;IACxC,OAAO,EAAE,MAAM,EAAE,CAAC;CACnB;AAED;;;;;;;;;;;;;;;;;;;;;;;GAuBG;AACH,wBAAgB,oBAAoB,CAClC,KAAK,EAAE,MAAM,EAAE,GAAG,YAAY,GAAG,YAAY,EAC7C,UAAU,EAAE,CAAC,MAAM,EAAE,GAAG,YAAY,GAAG,YAAY,CAAC,EAAE,EACtD,CAAC,EAAE,MAAM,EACT,WAAW,GAAE,MAAY,GACxB,0BAA0B,CAQ5B;AAED;;GAEG;AACH,qBAAa,aAAa;IACxB,OAAO,CAAC,KAAK,CAAM;IAEnB;;;;;;;OAOG;gBACS,QAAQ,EAAE,MAAM,EAAE,SAAS,EAAE,MAAM,EAAE,KAAK,EAAE,MAAM,EAAE,OAAO,GAAE,MAAY;IAKrF;;;;;;;OAOG;IACH,OAAO,CACL,aAAa,EAAE,MAAM,EAAE,GAAG,YAAY,EACtC,kBAAkB,EAAE,CAAC,MAAM,EAAE,GAAG,YAAY,CAAC,EAAE,EAC/C,WAAW,EAAE,MAAM,EAAE,GAAG,YAAY,GACnC,YAAY;IAQf;;OAEG;IACH,MAAM,IAAI,MAAM;IAIhB;;OAEG;IACH,MAAM,CAAC,QAAQ,CAAC,IAAI,EAAE,MAAM,GAAG,aAAa;CAM7C;AAED;;GAEG;AACH,qBAAa,cAAc;IACzB,OAAO,CAAC,KAAK,CAAM;;IAOnB;;;;;;OAMG;IACH,QAAQ,CAAC,SAAS,EAAE,MAAM,EAAE,GAAG,YAAY,EAAE,UAAU,EAAE,MAAM,GAAG,MAAM;IAIxE;;;;;OAKG;IACH,UAAU,CAAC,cAAc,EAAE,MAAM,GAAG,MAAM,EAAE;CAG7C;AAED;;;;;;;GAOG;AACH,wBAAgB,mBAAmB,CACjC,KAAK,EAAE,MAAM,EAAE,GAAG,YAAY,EAC9B,eAAe,EAAE,CAAC,MAAM,EAAE,GAAG,YAAY,CAAC,EAAE,EAAE,EAC9C,aAAa,EAAE,MAAM,EAAE,GACtB,YAAY,CAOd;AAED;;GAEG;AACH,wBAAgB,mBAAmB,CAAC,UAAU,EAAE,MAAM,GAAG,MAAM,CAG9D;AAED;;GAEG;AACH,wBAAgB,cAAc,IAAI,OAAO,CAOxC;;;;;;;;;;;AAED,wBAUE"}
dist/core/gnn-wrapper.js ADDED
@@ -0,0 +1,213 @@
1
+ "use strict";
2
+ /**
3
+ * GNN Wrapper - Safe wrapper around @ruvector/gnn with automatic array conversion
4
+ *
5
+ * This wrapper handles the array type conversion automatically, allowing users
6
+ * to pass either regular arrays or Float32Arrays.
7
+ *
8
+ * The native @ruvector/gnn requires Float32Array for maximum performance.
9
+ * This wrapper converts any input type to Float32Array automatically.
10
+ *
11
+ * Performance Tips:
12
+ * - Pass Float32Array directly for zero-copy performance
13
+ * - Use toFloat32Array/toFloat32ArrayBatch for pre-conversion
14
+ * - Avoid repeated conversions in hot paths
15
+ */
16
+ Object.defineProperty(exports, "__esModule", { value: true });
17
+ exports.TensorCompress = exports.RuvectorLayer = void 0;
18
+ exports.toFloat32Array = toFloat32Array;
19
+ exports.toFloat32ArrayBatch = toFloat32ArrayBatch;
20
+ exports.differentiableSearch = differentiableSearch;
21
+ exports.hierarchicalForward = hierarchicalForward;
22
+ exports.getCompressionLevel = getCompressionLevel;
23
+ exports.isGnnAvailable = isGnnAvailable;
24
+ // Lazy load to avoid import errors if not installed
25
+ let gnnModule = null;
26
+ let loadError = null;
27
+ function getGnnModule() {
28
+ if (gnnModule)
29
+ return gnnModule;
30
+ if (loadError)
31
+ throw loadError;
32
+ try {
33
+ gnnModule = require('@ruvector/gnn');
34
+ return gnnModule;
35
+ }
36
+ catch (e) {
37
+ loadError = new Error(`@ruvector/gnn is not installed or failed to load: ${e.message}\n` +
38
+ `Install with: npm install @ruvector/gnn`);
39
+ throw loadError;
40
+ }
41
+ }
42
+ /**
43
+ * Convert any array-like input to Float32Array (native requires Float32Array)
44
+ * Optimized paths:
45
+ * - Float32Array: zero-copy return
46
+ * - Float64Array: efficient typed array copy
47
+ * - Array: direct Float32Array construction
48
+ */
49
+ function toFloat32Array(input) {
50
+ if (input instanceof Float32Array)
51
+ return input;
52
+ if (input instanceof Float64Array)
53
+ return new Float32Array(input);
54
+ if (Array.isArray(input))
55
+ return new Float32Array(input);
56
+ return new Float32Array(Array.from(input));
57
+ }
58
+ /**
59
+ * Convert array of arrays to array of Float32Arrays
60
+ */
61
+ function toFloat32ArrayBatch(input) {
62
+ const result = new Array(input.length);
63
+ for (let i = 0; i < input.length; i++) {
64
+ result[i] = toFloat32Array(input[i]);
65
+ }
66
+ return result;
67
+ }
68
+ /**
69
+ * Differentiable search using soft attention mechanism
70
+ *
71
+ * This wrapper automatically converts array inputs to Float32Array.
72
+ *
73
+ * @param query - Query vector (array or Float32Array)
74
+ * @param candidates - List of candidate vectors (arrays or Float32Arrays)
75
+ * @param k - Number of top results to return
76
+ * @param temperature - Temperature for softmax (lower = sharper, higher = smoother)
77
+ * @returns Search result with indices and soft weights
78
+ *
79
+ * @example
80
+ * ```typescript
81
+ * import { differentiableSearch } from 'ruvector/core/gnn-wrapper';
82
+ *
83
+ * // Works with regular arrays (auto-converted to Float32Array)
84
+ * const result1 = differentiableSearch([1, 0, 0], [[1, 0, 0], [0, 1, 0]], 2, 1.0);
85
+ *
86
+ * // For best performance, use Float32Array directly (zero-copy)
87
+ * const query = new Float32Array([1, 0, 0]);
88
+ * const candidates = [new Float32Array([1, 0, 0]), new Float32Array([0, 1, 0])];
89
+ * const result2 = differentiableSearch(query, candidates, 2, 1.0);
90
+ * ```
91
+ */
92
+ function differentiableSearch(query, candidates, k, temperature = 1.0) {
93
+ const gnn = getGnnModule();
94
+ // Convert to Float32Array (native Rust expects Float32Array for performance)
95
+ const queryFloat32 = toFloat32Array(query);
96
+ const candidatesFloat32 = toFloat32ArrayBatch(candidates);
97
+ return gnn.differentiableSearch(queryFloat32, candidatesFloat32, k, temperature);
98
+ }
99
+ /**
100
+ * GNN Layer for HNSW topology
101
+ */
102
+ class RuvectorLayer {
103
+ /**
104
+ * Create a new Ruvector GNN layer
105
+ *
106
+ * @param inputDim - Dimension of input node embeddings
107
+ * @param hiddenDim - Dimension of hidden representations
108
+ * @param heads - Number of attention heads
109
+ * @param dropout - Dropout rate (0.0 to 1.0)
110
+ */
111
+ constructor(inputDim, hiddenDim, heads, dropout = 0.1) {
112
+ const gnn = getGnnModule();
113
+ this.inner = new gnn.RuvectorLayer(inputDim, hiddenDim, heads, dropout);
114
+ }
115
+ /**
116
+ * Forward pass through the GNN layer
117
+ *
118
+ * @param nodeEmbedding - Current node's embedding
119
+ * @param neighborEmbeddings - Embeddings of neighbor nodes
120
+ * @param edgeWeights - Weights of edges to neighbors
121
+ * @returns Updated node embedding as Float32Array
122
+ */
123
+ forward(nodeEmbedding, neighborEmbeddings, edgeWeights) {
124
+ return this.inner.forward(toFloat32Array(nodeEmbedding), toFloat32ArrayBatch(neighborEmbeddings), toFloat32Array(edgeWeights));
125
+ }
126
+ /**
127
+ * Serialize the layer to JSON
128
+ */
129
+ toJson() {
130
+ return this.inner.toJson();
131
+ }
132
+ /**
133
+ * Deserialize the layer from JSON
134
+ */
135
+ static fromJson(json) {
136
+ const gnn = getGnnModule();
137
+ const layer = new RuvectorLayer(1, 1, 1, 0); // Dummy constructor
138
+ layer.inner = gnn.RuvectorLayer.fromJson(json);
139
+ return layer;
140
+ }
141
+ }
142
+ exports.RuvectorLayer = RuvectorLayer;
143
+ /**
144
+ * Tensor compressor with adaptive level selection
145
+ */
146
+ class TensorCompress {
147
+ constructor() {
148
+ const gnn = getGnnModule();
149
+ this.inner = new gnn.TensorCompress();
150
+ }
151
+ /**
152
+ * Compress an embedding based on access frequency
153
+ *
154
+ * @param embedding - Input embedding vector
155
+ * @param accessFreq - Access frequency (0.0 to 1.0)
156
+ * @returns Compressed tensor as JSON string
157
+ */
158
+ compress(embedding, accessFreq) {
159
+ return this.inner.compress(toFloat32Array(embedding), accessFreq);
160
+ }
161
+ /**
162
+ * Decompress a compressed tensor
163
+ *
164
+ * @param compressedJson - Compressed tensor JSON
165
+ * @returns Decompressed embedding
166
+ */
167
+ decompress(compressedJson) {
168
+ return this.inner.decompress(compressedJson);
169
+ }
170
+ }
171
+ exports.TensorCompress = TensorCompress;
172
+ /**
173
+ * Hierarchical forward pass through GNN layers
174
+ *
175
+ * @param query - Query vector
176
+ * @param layerEmbeddings - Embeddings organized by layer
177
+ * @param gnnLayersJson - JSON array of serialized GNN layers
178
+ * @returns Final embedding after hierarchical processing as Float32Array
179
+ */
180
+ function hierarchicalForward(query, layerEmbeddings, gnnLayersJson) {
181
+ const gnn = getGnnModule();
182
+ return gnn.hierarchicalForward(toFloat32Array(query), layerEmbeddings.map(layer => toFloat32ArrayBatch(layer)), gnnLayersJson);
183
+ }
184
+ /**
185
+ * Get compression level for a given access frequency
186
+ */
187
+ function getCompressionLevel(accessFreq) {
188
+ const gnn = getGnnModule();
189
+ return gnn.getCompressionLevel(accessFreq);
190
+ }
191
+ /**
192
+ * Check if GNN module is available
193
+ */
194
+ function isGnnAvailable() {
195
+ try {
196
+ getGnnModule();
197
+ return true;
198
+ }
199
+ catch {
200
+ return false;
201
+ }
202
+ }
203
+ exports.default = {
204
+ differentiableSearch,
205
+ RuvectorLayer,
206
+ TensorCompress,
207
+ hierarchicalForward,
208
+ getCompressionLevel,
209
+ isGnnAvailable,
210
+ // Export conversion helpers for performance optimization
211
+ toFloat32Array,
212
+ toFloat32ArrayBatch,
213
+ };
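The conversion helpers above can be exercised without installing `@ruvector/gnn`; this standalone sketch copies the wrapper's `toFloat32Array` as it appears in the compiled output (the real module additionally lazy-loads the native binding) and checks the zero-copy fast path:

```javascript
// Standalone copy of the wrapper's toFloat32Array helper, for illustration only.
// Mirrors dist/core/gnn-wrapper.js; no @ruvector/gnn install is required here.
function toFloat32Array(input) {
  if (input instanceof Float32Array) return input;        // zero-copy fast path
  if (input instanceof Float64Array) return new Float32Array(input);
  if (Array.isArray(input)) return new Float32Array(input);
  return new Float32Array(Array.from(input));             // any other iterable
}

const plain = toFloat32Array([1, 2, 3]);          // regular array is copied
const f32 = new Float32Array([1, 0, 0]);
const same = toFloat32Array(f32);                 // Float32Array passes through

console.log(plain instanceof Float32Array); // true
console.log(same === f32);                   // true: no copy was made
```

This is why the JSDoc recommends passing `Float32Array` directly in hot paths: the identity check on the fast path avoids any allocation.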
dist/core/graph-algorithms.d.ts ADDED
@@ -0,0 +1,83 @@
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
 
1
+ /**
2
+ * Graph Algorithms - MinCut, Spectral Clustering, Community Detection
3
+ *
4
+ * Provides graph partitioning and clustering algorithms for:
5
+ * - Code module detection
6
+ * - Dependency clustering
7
+ * - Architecture analysis
8
+ * - Refactoring suggestions
9
+ */
10
+ export interface Graph {
11
+ nodes: string[];
12
+ edges: Array<{
13
+ from: string;
14
+ to: string;
15
+ weight?: number;
16
+ }>;
17
+ adjacency: Map<string, Map<string, number>>;
18
+ }
19
+ export interface Partition {
20
+ groups: string[][];
21
+ cutWeight: number;
22
+ modularity: number;
23
+ }
24
+ export interface SpectralResult {
25
+ clusters: Map<string, number>;
26
+ eigenvalues: number[];
27
+ coordinates: Map<string, number[]>;
28
+ }
29
+ /**
30
+ * Build adjacency representation from edges
31
+ */
32
+ export declare function buildGraph(nodes: string[], edges: Array<{
33
+ from: string;
34
+ to: string;
35
+ weight?: number;
36
+ }>): Graph;
37
+ /**
38
+ * Minimum Cut (Stoer-Wagner algorithm)
39
+ *
40
+ * Finds the minimum weight cut that partitions the graph into two parts.
41
+ * Useful for finding loosely coupled module boundaries.
42
+ */
43
+ export declare function minCut(graph: Graph): Partition;
44
+ /**
45
+ * Spectral Clustering (using power iteration)
46
+ *
47
+ * Uses graph Laplacian eigenvectors for clustering.
48
+ * Good for finding natural clusters in code dependencies.
49
+ */
50
+ export declare function spectralClustering(graph: Graph, k?: number): SpectralResult;
51
+ /**
52
+ * Louvain Community Detection
53
+ *
54
+ * Greedy modularity optimization for finding communities.
55
+ * Good for detecting natural module boundaries.
56
+ */
57
+ export declare function louvainCommunities(graph: Graph): Map<string, number>;
58
+ /**
59
+ * Calculate modularity of a partition
60
+ */
61
+ export declare function calculateModularity(graph: Graph, partition: string[][]): number;
62
+ /**
63
+ * Find bridges (edges whose removal disconnects components)
64
+ */
65
+ export declare function findBridges(graph: Graph): Array<{
66
+ from: string;
67
+ to: string;
68
+ }>;
69
+ /**
70
+ * Find articulation points (nodes whose removal disconnects components)
71
+ */
72
+ export declare function findArticulationPoints(graph: Graph): string[];
73
+ declare const _default: {
74
+ buildGraph: typeof buildGraph;
75
+ minCut: typeof minCut;
76
+ spectralClustering: typeof spectralClustering;
77
+ louvainCommunities: typeof louvainCommunities;
78
+ calculateModularity: typeof calculateModularity;
79
+ findBridges: typeof findBridges;
80
+ findArticulationPoints: typeof findArticulationPoints;
81
+ };
82
+ export default _default;
83
+ //# sourceMappingURL=graph-algorithms.d.ts.map
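The `Graph` shape declared above can be illustrated with a small standalone sketch; the `buildGraph` body below mirrors the compiled implementation in `dist/core/graph-algorithms.js` (undirected: each edge is mirrored in the adjacency map):

```javascript
// Minimal re-implementation of buildGraph (mirrors dist/core/graph-algorithms.js),
// shown here to demonstrate the adjacency shape the declarations describe.
function buildGraph(nodes, edges) {
  const adjacency = new Map();
  for (const node of nodes) adjacency.set(node, new Map());
  for (const { from, to, weight = 1 } of edges) {
    if (!adjacency.has(from)) adjacency.set(from, new Map());
    if (!adjacency.has(to)) adjacency.set(to, new Map());
    adjacency.get(from).set(to, weight);   // undirected: store both directions
    adjacency.get(to).set(from, weight);
  }
  return { nodes, edges, adjacency };
}

const g = buildGraph(['a', 'b', 'c'], [
  { from: 'a', to: 'b' },               // weight defaults to 1
  { from: 'b', to: 'c', weight: 2 },
]);

console.log(g.adjacency.get('b').get('a')); // 1
console.log(g.adjacency.get('c').get('b')); // 2 (reverse direction mirrored)
```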
dist/core/graph-algorithms.d.ts.map ADDED
@@ -0,0 +1 @@
1
+ {"version":3,"file":"graph-algorithms.d.ts","sourceRoot":"","sources":["../../src/core/graph-algorithms.ts"],"names":[],"mappings":"AAAA;;;;;;;;GAQG;AAEH,MAAM,WAAW,KAAK;IACpB,KAAK,EAAE,MAAM,EAAE,CAAC;IAChB,KAAK,EAAE,KAAK,CAAC;QAAE,IAAI,EAAE,MAAM,CAAC;QAAC,EAAE,EAAE,MAAM,CAAC;QAAC,MAAM,CAAC,EAAE,MAAM,CAAA;KAAE,CAAC,CAAC;IAC5D,SAAS,EAAE,GAAG,CAAC,MAAM,EAAE,GAAG,CAAC,MAAM,EAAE,MAAM,CAAC,CAAC,CAAC;CAC7C;AAED,MAAM,WAAW,SAAS;IACxB,MAAM,EAAE,MAAM,EAAE,EAAE,CAAC;IACnB,SAAS,EAAE,MAAM,CAAC;IAClB,UAAU,EAAE,MAAM,CAAC;CACpB;AAED,MAAM,WAAW,cAAc;IAC7B,QAAQ,EAAE,GAAG,CAAC,MAAM,EAAE,MAAM,CAAC,CAAC;IAC9B,WAAW,EAAE,MAAM,EAAE,CAAC;IACtB,WAAW,EAAE,GAAG,CAAC,MAAM,EAAE,MAAM,EAAE,CAAC,CAAC;CACpC;AAED;;GAEG;AACH,wBAAgB,UAAU,CACxB,KAAK,EAAE,MAAM,EAAE,EACf,KAAK,EAAE,KAAK,CAAC;IAAE,IAAI,EAAE,MAAM,CAAC;IAAC,EAAE,EAAE,MAAM,CAAC;IAAC,MAAM,CAAC,EAAE,MAAM,CAAA;CAAE,CAAC,GAC1D,KAAK,CAiBP;AAED;;;;;GAKG;AACH,wBAAgB,MAAM,CAAC,KAAK,EAAE,KAAK,GAAG,SAAS,CAwG9C;AAED;;;;;GAKG;AACH,wBAAgB,kBAAkB,CAAC,KAAK,EAAE,KAAK,EAAE,CAAC,GAAE,MAAU,GAAG,cAAc,CA0G9E;AAED;;;;;GAKG;AACH,wBAAgB,kBAAkB,CAAC,KAAK,EAAE,KAAK,GAAG,GAAG,CAAC,MAAM,EAAE,MAAM,CAAC,CAiGpE;AAED;;GAEG;AACH,wBAAgB,mBAAmB,CAAC,KAAK,EAAE,KAAK,EAAE,SAAS,EAAE,MAAM,EAAE,EAAE,GAAG,MAAM,CAgC/E;AAED;;GAEG;AACH,wBAAgB,WAAW,CAAC,KAAK,EAAE,KAAK,GAAG,KAAK,CAAC;IAAE,IAAI,EAAE,MAAM,CAAC;IAAC,EAAE,EAAE,MAAM,CAAA;CAAE,CAAC,CAsC7E;AAED;;GAEG;AACH,wBAAgB,sBAAsB,CAAC,KAAK,EAAE,KAAK,GAAG,MAAM,EAAE,CA+C7D;;;;;;;;;;AA+FD,wBAQE"}
dist/core/graph-algorithms.js ADDED
@@ -0,0 +1,514 @@
1
+ "use strict";
2
+ /**
3
+ * Graph Algorithms - MinCut, Spectral Clustering, Community Detection
4
+ *
5
+ * Provides graph partitioning and clustering algorithms for:
6
+ * - Code module detection
7
+ * - Dependency clustering
8
+ * - Architecture analysis
9
+ * - Refactoring suggestions
10
+ */
11
+ Object.defineProperty(exports, "__esModule", { value: true });
12
+ exports.buildGraph = buildGraph;
13
+ exports.minCut = minCut;
14
+ exports.spectralClustering = spectralClustering;
15
+ exports.louvainCommunities = louvainCommunities;
16
+ exports.calculateModularity = calculateModularity;
17
+ exports.findBridges = findBridges;
18
+ exports.findArticulationPoints = findArticulationPoints;
19
+ /**
20
+ * Build adjacency representation from edges
21
+ */
22
+ function buildGraph(nodes, edges) {
23
+ const adjacency = new Map();
24
+ for (const node of nodes) {
25
+ adjacency.set(node, new Map());
26
+ }
27
+ for (const { from, to, weight = 1 } of edges) {
28
+ if (!adjacency.has(from))
29
+ adjacency.set(from, new Map());
30
+ if (!adjacency.has(to))
31
+ adjacency.set(to, new Map());
32
+ // Undirected graph - add both directions
33
+ adjacency.get(from).set(to, weight);
34
+ adjacency.get(to).set(from, weight);
35
+ }
36
+ return { nodes, edges, adjacency };
37
+ }
38
+ /**
39
+ * Minimum Cut (Stoer-Wagner algorithm)
40
+ *
41
+ * Finds the minimum weight cut that partitions the graph into two parts.
42
+ * Useful for finding loosely coupled module boundaries.
43
+ */
44
+ function minCut(graph) {
45
+ const n = graph.nodes.length;
46
+ if (n < 2) {
47
+ return { groups: [graph.nodes], cutWeight: 0, modularity: 0 };
48
+ }
49
+ // Copy adjacency for modification
50
+ const adj = new Map();
51
+ for (const [node, neighbors] of graph.adjacency) {
52
+ adj.set(node, new Map(neighbors));
53
+ }
54
+ let minCutWeight = Infinity;
55
+ let bestPartition = [];
56
+ const merged = new Map(); // Track merged nodes
57
+ for (const node of graph.nodes) {
58
+ merged.set(node, [node]);
59
+ }
60
+ let remaining = [...graph.nodes];
61
+ // Stoer-Wagner phases
62
+ while (remaining.length > 1) {
63
+ // Maximum adjacency search
64
+ const inA = new Set([remaining[0]]);
65
+ const weights = new Map();
66
+ for (const node of remaining) {
67
+ if (!inA.has(node)) {
68
+ weights.set(node, adj.get(remaining[0])?.get(node) || 0);
69
+ }
70
+ }
71
+ let lastAdded = remaining[0];
72
+ let beforeLast = remaining[0];
73
+ while (inA.size < remaining.length) {
74
+ // Find node with maximum weight to A
75
+ let maxWeight = -Infinity;
76
+ let maxNode = '';
77
+ for (const [node, weight] of weights) {
78
+ if (!inA.has(node) && weight > maxWeight) {
79
+ maxWeight = weight;
80
+ maxNode = node;
81
+ }
82
+ }
83
+ if (!maxNode)
84
+ break;
85
+ beforeLast = lastAdded;
86
+ lastAdded = maxNode;
87
+ inA.add(maxNode);
88
+ // Update weights
89
+ for (const [neighbor, w] of adj.get(maxNode) || []) {
90
+ if (!inA.has(neighbor)) {
91
+ weights.set(neighbor, (weights.get(neighbor) || 0) + w);
92
+ }
93
+ }
94
+ }
95
+ // Cut of the phase
96
+ const cutWeight = weights.get(lastAdded) || 0;
97
+ if (cutWeight < minCutWeight) {
98
+ minCutWeight = cutWeight;
99
+ const lastGroup = merged.get(lastAdded) || [lastAdded];
100
+ const otherNodes = remaining.filter(n => n !== lastAdded).flatMap(n => merged.get(n) || [n]);
101
+ bestPartition = [lastGroup, otherNodes];
102
+ }
103
+ // Merge last two nodes
104
+ if (remaining.length > 1) {
105
+ // Merge lastAdded into beforeLast
106
+ const mergedNodes = [...(merged.get(beforeLast) || []), ...(merged.get(lastAdded) || [])];
107
+ merged.set(beforeLast, mergedNodes);
108
+ // Update adjacency
109
+ for (const [neighbor, w] of adj.get(lastAdded) || []) {
110
+ if (neighbor !== beforeLast) {
111
+ const current = adj.get(beforeLast)?.get(neighbor) || 0;
112
+ adj.get(beforeLast)?.set(neighbor, current + w);
113
+ adj.get(neighbor)?.set(beforeLast, current + w);
114
+ }
115
+ }
116
+ // Remove lastAdded
117
+ remaining = remaining.filter(n => n !== lastAdded);
118
+ adj.delete(lastAdded);
119
+ for (const [, neighbors] of adj) {
120
+ neighbors.delete(lastAdded);
121
+ }
122
+ }
123
+ }
124
+ const modularity = calculateModularity(graph, bestPartition);
125
+ return {
126
+ groups: bestPartition.filter(g => g.length > 0),
127
+ cutWeight: minCutWeight,
128
+ modularity,
129
+ };
130
+ }
131
+ /**
132
+ * Spectral Clustering (using power iteration)
133
+ *
134
+ * Uses graph Laplacian eigenvectors for clustering.
135
+ * Good for finding natural clusters in code dependencies.
136
+ */
137
+ function spectralClustering(graph, k = 2) {
138
+ const n = graph.nodes.length;
139
+ const nodeIndex = new Map(graph.nodes.map((node, i) => [node, i]));
140
+ const clusters = new Map();
141
+ if (n === 0) {
142
+ return { clusters, eigenvalues: [], coordinates: new Map() };
143
+ }
144
+ // Build Laplacian matrix (D - A)
145
+ const degree = new Float64Array(n);
146
+ const laplacian = Array(n).fill(null).map(() => Array(n).fill(0));
147
+ for (const [node, neighbors] of graph.adjacency) {
148
+ const i = nodeIndex.get(node);
149
+ let d = 0;
150
+ for (const [neighbor, weight] of neighbors) {
151
+ const j = nodeIndex.get(neighbor);
152
+ laplacian[i][j] = -weight;
153
+ d += weight;
154
+ }
155
+ degree[i] = d;
156
+ laplacian[i][i] = d;
157
+ }
158
+ // Normalized Laplacian: D^(-1/2) L D^(-1/2)
159
+ for (let i = 0; i < n; i++) {
160
+ for (let j = 0; j < n; j++) {
161
+ if (degree[i] > 0 && degree[j] > 0) {
162
+ laplacian[i][j] /= Math.sqrt(degree[i] * degree[j]);
163
+ }
164
+ }
165
+ }
166
+ // Power iteration to find eigenvectors
167
+ const eigenvectors = [];
168
+ const eigenvalues = [];
169
+ for (let ev = 0; ev < Math.min(k, n); ev++) {
170
+ let vector = new Float64Array(n);
171
+ for (let i = 0; i < n; i++) {
172
+ vector[i] = Math.random();
173
+ }
174
+ normalize(vector);
175
+ // Deflation: orthogonalize against previous eigenvectors
176
+ for (const prev of eigenvectors) {
177
+ const dot = dotProduct(vector, new Float64Array(prev));
178
+ for (let i = 0; i < n; i++) {
179
+ vector[i] -= dot * prev[i];
180
+ }
181
+ }
182
+ normalize(vector);
183
+ // Power iteration
184
+ for (let iter = 0; iter < 100; iter++) {
185
+ const newVector = new Float64Array(n);
186
+ for (let i = 0; i < n; i++) {
187
+ for (let j = 0; j < n; j++) {
188
+ newVector[i] += laplacian[i][j] * vector[j];
189
+ }
190
+ }
191
+ // Deflation
192
+ for (const prev of eigenvectors) {
193
+ const dot = dotProduct(newVector, new Float64Array(prev));
194
+ for (let i = 0; i < n; i++) {
195
+ newVector[i] -= dot * prev[i];
196
+ }
197
+ }
198
+ normalize(newVector);
199
+ vector = newVector;
200
+ }
201
+ // Compute eigenvalue
202
+ let eigenvalue = 0;
203
+ for (let i = 0; i < n; i++) {
204
+ let sum = 0;
205
+ for (let j = 0; j < n; j++) {
206
+ sum += laplacian[i][j] * vector[j];
207
+ }
208
+ eigenvalue += vector[i] * sum;
209
+ }
210
+ eigenvectors.push(Array.from(vector));
211
+ eigenvalues.push(eigenvalue);
212
+ }
213
+ // K-means clustering on eigenvector coordinates
214
+ const coordinates = new Map();
215
+ for (let i = 0; i < n; i++) {
216
+ coordinates.set(graph.nodes[i], eigenvectors.map(ev => ev[i]));
217
+ }
218
+ // Simple k-means
219
+ const clusterAssignment = kMeans(graph.nodes.map(node => coordinates.get(node)), k);
220
+ for (let i = 0; i < n; i++) {
221
+ clusters.set(graph.nodes[i], clusterAssignment[i]);
222
+ }
223
+ return { clusters, eigenvalues, coordinates };
224
+ }
225
+ /**
226
+ * Louvain Community Detection
227
+ *
228
+ * Greedy modularity optimization for finding communities.
229
+ * Good for detecting natural module boundaries.
230
+ */
231
+ function louvainCommunities(graph) {
232
+ const communities = new Map();
233
+ let communityId = 0;
234
+ // Initialize: each node in its own community
235
+ for (const node of graph.nodes) {
236
+ communities.set(node, communityId++);
237
+ }
238
+ // Total edge weight
239
+ let m = 0;
240
+ for (const { weight = 1 } of graph.edges) {
241
+ m += weight;
242
+ }
243
+ m /= 2; // Undirected
244
+ if (m === 0)
245
+ return communities;
246
+ // Node weights (sum of edge weights)
247
+ const nodeWeight = new Map();
248
+ for (const node of graph.nodes) {
249
+ let w = 0;
250
+ for (const [, weight] of graph.adjacency.get(node) || []) {
251
+ w += weight;
252
+ }
253
+ nodeWeight.set(node, w);
254
+ }
255
+ // Community weights
256
+ const communityWeight = new Map();
257
+ for (const node of graph.nodes) {
258
+ const c = communities.get(node);
259
+ communityWeight.set(c, (communityWeight.get(c) || 0) + (nodeWeight.get(node) || 0));
260
+ }
261
+ // Iterate until no improvement
262
+ let improved = true;
263
+ while (improved) {
264
+ improved = false;
265
+ for (const node of graph.nodes) {
266
+ const currentCommunity = communities.get(node);
267
+ const ki = nodeWeight.get(node) || 0;
268
+ // Calculate modularity gain for moving to neighbor communities
269
+ let bestCommunity = currentCommunity;
270
+ let bestGain = 0;
271
+ const neighborCommunities = new Set();
272
+ for (const [neighbor] of graph.adjacency.get(node) || []) {
273
+ neighborCommunities.add(communities.get(neighbor));
274
+ }
275
+ for (const targetCommunity of neighborCommunities) {
276
+ if (targetCommunity === currentCommunity)
277
+ continue;
278
+ // Calculate edge weight to target community
279
+ let ki_in = 0;
280
+ for (const [neighbor, weight] of graph.adjacency.get(node) || []) {
281
+ if (communities.get(neighbor) === targetCommunity) {
282
+ ki_in += weight;
283
+ }
284
+ }
285
+ const sumTot = communityWeight.get(targetCommunity) || 0;
286
+ const gain = ki_in / m - (ki * sumTot) / (2 * m * m);
287
+ if (gain > bestGain) {
288
+ bestGain = gain;
289
+ bestCommunity = targetCommunity;
290
+ }
291
+ }
292
+ // Move node if beneficial
293
+ if (bestCommunity !== currentCommunity) {
294
+ communities.set(node, bestCommunity);
295
+ // Update community weights
296
+ communityWeight.set(currentCommunity, (communityWeight.get(currentCommunity) || 0) - ki);
297
+ communityWeight.set(bestCommunity, (communityWeight.get(bestCommunity) || 0) + ki);
298
+ improved = true;
299
+ }
300
+ }
301
+ }
302
+ // Renumber communities to be contiguous
303
+ const renumber = new Map();
304
+ let newId = 0;
305
+ for (const [node, c] of communities) {
306
+ if (!renumber.has(c)) {
307
+ renumber.set(c, newId++);
308
+ }
309
+ communities.set(node, renumber.get(c));
310
+ }
311
+ return communities;
312
+ }
313
+ /**
314
+ * Calculate modularity of a partition
315
+ */
316
+ function calculateModularity(graph, partition) {
317
+ let m = 0;
318
+ for (const { weight = 1 } of graph.edges) {
319
+ m += weight;
320
+ }
321
+ m /= 2;
322
+ if (m === 0)
323
+ return 0;
324
+ let modularity = 0;
325
+ for (const group of partition) {
326
+ const groupSet = new Set(group);
327
+ // Edges within group
328
+ let inGroup = 0;
329
+ let degreeSum = 0;
330
+ for (const node of group) {
331
+ for (const [neighbor, weight] of graph.adjacency.get(node) || []) {
332
+ if (groupSet.has(neighbor)) {
333
+ inGroup += weight;
334
+ }
335
+ degreeSum += weight;
336
+ }
337
+ }
338
+ inGroup /= 2; // Count each edge once
339
+ modularity += inGroup / m - Math.pow(degreeSum / (2 * m), 2);
340
+ }
341
+ return modularity;
342
+ }
343
+ /**
344
+ * Find bridges (edges whose removal disconnects components)
345
+ */
346
+ function findBridges(graph) {
347
+ const bridges = [];
348
+ const visited = new Set();
349
+ const discovery = new Map();
350
+ const low = new Map();
351
+ const parent = new Map();
352
+ let time = 0;
353
+ function dfs(node) {
354
+ visited.add(node);
355
+ discovery.set(node, time);
356
+ low.set(node, time);
357
+ time++;
358
+ for (const [neighbor] of graph.adjacency.get(node) || []) {
359
+ if (!visited.has(neighbor)) {
360
+ parent.set(neighbor, node);
361
+ dfs(neighbor);
362
+ low.set(node, Math.min(low.get(node), low.get(neighbor)));
363
+ if (low.get(neighbor) > discovery.get(node)) {
364
+ bridges.push({ from: node, to: neighbor });
365
+ }
366
+ }
367
+ else if (neighbor !== parent.get(node)) {
368
+ low.set(node, Math.min(low.get(node), discovery.get(neighbor)));
369
+ }
370
+ }
371
+ }
372
+ for (const node of graph.nodes) {
373
+ if (!visited.has(node)) {
374
+ parent.set(node, null);
375
+ dfs(node);
376
+ }
377
+ }
378
+ return bridges;
379
+ }
380
+ /**
381
+ * Find articulation points (nodes whose removal disconnects components)
382
+ */
383
+ function findArticulationPoints(graph) {
384
+ const points = [];
385
+ const visited = new Set();
386
+ const discovery = new Map();
387
+ const low = new Map();
388
+ const parent = new Map();
389
+ let time = 0;
390
+ function dfs(node) {
391
+ visited.add(node);
392
+ discovery.set(node, time);
393
+ low.set(node, time);
394
+ time++;
395
+ let children = 0;
396
+ for (const [neighbor] of graph.adjacency.get(node) || []) {
397
+ if (!visited.has(neighbor)) {
398
+ children++;
399
+ parent.set(neighbor, node);
400
+ dfs(neighbor);
401
+ low.set(node, Math.min(low.get(node), low.get(neighbor)));
402
+ // Root with 2+ children or non-root with low[v] >= disc[u]
403
+ if ((parent.get(node) === null && children > 1) ||
+ (parent.get(node) !== null && low.get(neighbor) >= discovery.get(node))) {
+ if (!points.includes(node)) {
+ points.push(node);
+ }
+ }
+ }
+ else if (neighbor !== parent.get(node)) {
+ low.set(node, Math.min(low.get(node), discovery.get(neighbor)));
+ }
+ }
+ }
+ for (const node of graph.nodes) {
+ if (!visited.has(node)) {
+ parent.set(node, null);
+ dfs(node);
+ }
+ }
+ return points;
+ }
+ // Helper functions
+ function normalize(v) {
+ let sum = 0;
+ for (let i = 0; i < v.length; i++) {
+ sum += v[i] * v[i];
+ }
+ const norm = Math.sqrt(sum);
+ if (norm > 0) {
+ for (let i = 0; i < v.length; i++) {
+ v[i] /= norm;
+ }
+ }
+ }
+ function dotProduct(a, b) {
+ let sum = 0;
+ for (let i = 0; i < a.length; i++) {
+ sum += a[i] * b[i];
+ }
+ return sum;
+ }
+ function kMeans(points, k, maxIter = 100) {
+ const n = points.length;
+ if (n === 0 || k === 0)
+ return [];
+ const dim = points[0].length;
+ // Random initialization
+ const centroids = [];
+ const used = new Set();
+ while (centroids.length < Math.min(k, n)) {
+ const idx = Math.floor(Math.random() * n);
+ if (!used.has(idx)) {
+ used.add(idx);
+ centroids.push([...points[idx]]);
+ }
+ }
+ const assignment = new Array(n).fill(0);
+ for (let iter = 0; iter < maxIter; iter++) {
+ // Assign points to nearest centroid
+ let changed = false;
+ for (let i = 0; i < n; i++) {
+ let minDist = Infinity;
+ let minC = 0;
+ for (let c = 0; c < centroids.length; c++) {
+ let dist = 0;
+ for (let d = 0; d < dim; d++) {
+ dist += Math.pow(points[i][d] - centroids[c][d], 2);
+ }
+ if (dist < minDist) {
+ minDist = dist;
+ minC = c;
+ }
+ }
+ if (assignment[i] !== minC) {
+ assignment[i] = minC;
+ changed = true;
+ }
+ }
+ if (!changed)
+ break;
+ // Update centroids
+ const counts = new Array(k).fill(0);
+ for (let c = 0; c < centroids.length; c++) {
+ for (let d = 0; d < dim; d++) {
+ centroids[c][d] = 0;
+ }
+ }
+ for (let i = 0; i < n; i++) {
+ const c = assignment[i];
+ counts[c]++;
+ for (let d = 0; d < dim; d++) {
+ centroids[c][d] += points[i][d];
+ }
+ }
+ for (let c = 0; c < centroids.length; c++) {
+ if (counts[c] > 0) {
+ for (let d = 0; d < dim; d++) {
+ centroids[c][d] /= counts[c];
+ }
+ }
+ }
+ }
+ return assignment;
+ }
+ exports.default = {
+ buildGraph,
+ minCut,
+ spectralClustering,
+ louvainCommunities,
+ calculateModularity,
+ findBridges,
+ findArticulationPoints,
+ };
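The `normalize` and `dotProduct` helpers in this file combine into cosine similarity: normalize two vectors in place, and their dot product is the cosine of the angle between them. A standalone sketch (the helpers are copied from the file above; the sample vectors are illustrative):

```javascript
// Copies of the helpers above: normalize scales a vector to unit length in
// place; dotProduct of two unit vectors is their cosine similarity.
function normalize(v) {
  let sum = 0;
  for (let i = 0; i < v.length; i++) {
    sum += v[i] * v[i];
  }
  const norm = Math.sqrt(sum);
  if (norm > 0) {
    for (let i = 0; i < v.length; i++) {
      v[i] /= norm;
    }
  }
}

function dotProduct(a, b) {
  let sum = 0;
  for (let i = 0; i < a.length; i++) {
    sum += a[i] * b[i];
  }
  return sum;
}

const a = [3, 4];
const b = [4, 3];
normalize(a); // [0.6, 0.8]
normalize(b); // [0.8, 0.6]
console.log(dotProduct(a, b).toFixed(2)); // → "0.96"
```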
dist/core/graph-wrapper.d.ts ADDED
@@ -0,0 +1,147 @@
+ /**
+ * Graph Wrapper - Hypergraph database for code relationships
+ *
+ * Wraps @ruvector/graph-node for dependency analysis, co-edit patterns,
+ * and code structure understanding.
+ */
+ export declare function isGraphAvailable(): boolean;
+ export interface Node {
+ id: string;
+ labels: string[];
+ properties: Record<string, any>;
+ }
+ export interface Edge {
+ id?: string;
+ from: string;
+ to: string;
+ type: string;
+ properties?: Record<string, any>;
+ }
+ export interface Hyperedge {
+ id?: string;
+ nodes: string[];
+ type: string;
+ properties?: Record<string, any>;
+ }
+ export interface CypherResult {
+ columns: string[];
+ rows: any[][];
+ }
+ export interface PathResult {
+ nodes: Node[];
+ edges: Edge[];
+ length: number;
+ }
+ /**
+ * Graph Database for code relationships
+ */
+ export declare class CodeGraph {
+ private inner;
+ private storagePath?;
+ constructor(options?: {
+ storagePath?: string;
+ inMemory?: boolean;
+ });
+ /**
+ * Create a node (file, function, class, etc.)
+ */
+ createNode(id: string, labels: string[], properties?: Record<string, any>): Node;
+ /**
+ * Get a node by ID
+ */
+ getNode(id: string): Node | null;
+ /**
+ * Update node properties
+ */
+ updateNode(id: string, properties: Record<string, any>): boolean;
+ /**
+ * Delete a node
+ */
+ deleteNode(id: string): boolean;
+ /**
+ * Find nodes by label
+ */
+ findNodesByLabel(label: string): Node[];
+ /**
+ * Create an edge (import, call, reference, etc.)
+ */
+ createEdge(from: string, to: string, type: string, properties?: Record<string, any>): Edge;
+ /**
+ * Get edges from a node
+ */
+ getOutgoingEdges(nodeId: string, type?: string): Edge[];
+ /**
+ * Get edges to a node
+ */
+ getIncomingEdges(nodeId: string, type?: string): Edge[];
+ /**
+ * Delete an edge
+ */
+ deleteEdge(edgeId: string): boolean;
+ /**
+ * Create a hyperedge connecting multiple nodes
+ */
+ createHyperedge(nodes: string[], type: string, properties?: Record<string, any>): Hyperedge;
+ /**
+ * Get hyperedges containing a node
+ */
+ getHyperedges(nodeId: string, type?: string): Hyperedge[];
+ /**
+ * Execute a Cypher query
+ */
+ cypher(query: string, params?: Record<string, any>): CypherResult;
+ /**
+ * Find shortest path between nodes
+ */
+ shortestPath(from: string, to: string, maxDepth?: number): PathResult | null;
+ /**
+ * Get all paths between nodes (up to maxPaths)
+ */
+ allPaths(from: string, to: string, maxDepth?: number, maxPaths?: number): PathResult[];
+ /**
+ * Get neighbors of a node
+ */
+ neighbors(nodeId: string, depth?: number): Node[];
+ /**
+ * Calculate PageRank for nodes
+ */
+ pageRank(iterations?: number, dampingFactor?: number): Map<string, number>;
+ /**
+ * Find connected components
+ */
+ connectedComponents(): string[][];
+ /**
+ * Detect communities (Louvain algorithm)
+ */
+ communities(): Map<string, number>;
+ /**
+ * Calculate betweenness centrality
+ */
+ betweennessCentrality(): Map<string, number>;
+ /**
+ * Save graph to storage
+ */
+ save(): void;
+ /**
+ * Load graph from storage
+ */
+ load(): void;
+ /**
+ * Clear all data
+ */
+ clear(): void;
+ /**
+ * Get graph statistics
+ */
+ stats(): {
+ nodes: number;
+ edges: number;
+ hyperedges: number;
+ };
+ }
+ /**
+ * Create a code dependency graph from file analysis
+ */
+ export declare function createCodeDependencyGraph(storagePath?: string): CodeGraph;
+ export default CodeGraph;
+ //# sourceMappingURL=graph-wrapper.d.ts.map
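The `PathResult` shape declared above pairs parallel `nodes` and `edges` arrays with a hop count. A minimal sketch of consuming one — the sample path and the `formatPath` helper are hypothetical, not part of the package API:

```javascript
// Hypothetical PathResult matching the shape declared above: two File nodes
// connected by a single IMPORTS edge.
const path = {
  nodes: [
    { id: 'src/a.ts', labels: ['File'], properties: {} },
    { id: 'src/b.ts', labels: ['File'], properties: {} },
  ],
  edges: [
    { from: 'src/a.ts', to: 'src/b.ts', type: 'IMPORTS', properties: {} },
  ],
  length: 1,
};

// Render the node chain plus the hop count as a breadcrumb string.
function formatPath(p) {
  const chain = p.nodes.map((n) => n.id).join(' -> ');
  return `${chain} (${p.length} hop${p.length === 1 ? '' : 's'})`;
}

console.log(formatPath(path)); // → "src/a.ts -> src/b.ts (1 hop)"
```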
dist/core/graph-wrapper.d.ts.map ADDED
@@ -0,0 +1 @@
+ {"version":3,"file":"graph-wrapper.d.ts","sourceRoot":"","sources":["../../src/core/graph-wrapper.ts"],"names":[],"mappings":"AAAA;;;;;GAKG;AAqBH,wBAAgB,gBAAgB,IAAI,OAAO,CAO1C;AAED,MAAM,WAAW,IAAI;IACnB,EAAE,EAAE,MAAM,CAAC;IACX,MAAM,EAAE,MAAM,EAAE,CAAC;IACjB,UAAU,EAAE,MAAM,CAAC,MAAM,EAAE,GAAG,CAAC,CAAC;CACjC;AAED,MAAM,WAAW,IAAI;IACnB,EAAE,CAAC,EAAE,MAAM,CAAC;IACZ,IAAI,EAAE,MAAM,CAAC;IACb,EAAE,EAAE,MAAM,CAAC;IACX,IAAI,EAAE,MAAM,CAAC;IACb,UAAU,CAAC,EAAE,MAAM,CAAC,MAAM,EAAE,GAAG,CAAC,CAAC;CAClC;AAED,MAAM,WAAW,SAAS;IACxB,EAAE,CAAC,EAAE,MAAM,CAAC;IACZ,KAAK,EAAE,MAAM,EAAE,CAAC;IAChB,IAAI,EAAE,MAAM,CAAC;IACb,UAAU,CAAC,EAAE,MAAM,CAAC,MAAM,EAAE,GAAG,CAAC,CAAC;CAClC;AAED,MAAM,WAAW,YAAY;IAC3B,OAAO,EAAE,MAAM,EAAE,CAAC;IAClB,IAAI,EAAE,GAAG,EAAE,EAAE,CAAC;CACf;AAED,MAAM,WAAW,UAAU;IACzB,KAAK,EAAE,IAAI,EAAE,CAAC;IACd,KAAK,EAAE,IAAI,EAAE,CAAC;IACd,MAAM,EAAE,MAAM,CAAC;CAChB;AAED;;GAEG;AACH,qBAAa,SAAS;IACpB,OAAO,CAAC,KAAK,CAAM;IACnB,OAAO,CAAC,WAAW,CAAC,CAAS;gBAEjB,OAAO,GAAE;QAAE,WAAW,CAAC,EAAE,MAAM,CAAC;QAAC,QAAQ,CAAC,EAAE,OAAO,CAAA;KAAO;IAatE;;OAEG;IACH,UAAU,CAAC,EAAE,EAAE,MAAM,EAAE,MAAM,EAAE,MAAM,EAAE,EAAE,UAAU,GAAE,MAAM,CAAC,MAAM,EAAE,GAAG,CAAM,GAAG,IAAI;IAKpF;;OAEG;IACH,OAAO,CAAC,EAAE,EAAE,MAAM,GAAG,IAAI,GAAG,IAAI;IAUhC;;OAEG;IACH,UAAU,CAAC,EAAE,EAAE,MAAM,EAAE,UAAU,EAAE,MAAM,CAAC,MAAM,EAAE,GAAG,CAAC,GAAG,OAAO;IAIhE;;OAEG;IACH,UAAU,CAAC,EAAE,EAAE,MAAM,GAAG,OAAO;IAI/B;;OAEG;IACH,gBAAgB,CAAC,KAAK,EAAE,MAAM,GAAG,IAAI,EAAE;IAavC;;OAEG;IACH,UAAU,CAAC,IAAI,EAAE,MAAM,EAAE,EAAE,EAAE,MAAM,EAAE,IAAI,EAAE,MAAM,EAAE,UAAU,GAAE,MAAM,CAAC,MAAM,EAAE,GAAG,CAAM,GAAG,IAAI;IAK9F;;OAEG;IACH,gBAAgB,CAAC,MAAM,EAAE,MAAM,EAAE,IAAI,CAAC,EAAE,MAAM,GAAG,IAAI,EAAE;IAWvD;;OAEG;IACH,gBAAgB,CAAC,MAAM,EAAE,MAAM,EAAE,IAAI,CAAC,EAAE,MAAM,GAAG,IAAI,EAAE;IAWvD;;OAEG;IACH,UAAU,CAAC,MAAM,EAAE,MAAM,GAAG,OAAO;IAQnC;;OAEG;IACH,eAAe,CAAC,KAAK,EAAE,MAAM,EAAE,EAAE,IAAI,EAAE,MAAM,EAAE,UAAU,GAAE,MAAM,CAAC,MAAM,EAAE,GAAG,CAAM,GAAG,SAAS;IAK/F;;OAEG;IACH,aAAa,CAAC,MAAM,EAAE,MAAM,EAAE,IAAI,CAAC,EAAE,MAAM,GAAG,SAAS,EAAE;IAczD;;OAEG;IACH,MAAM,CAAC,KAAK,EAAE,MAAM,EAAE,MAAM,GAAE,MAAM,CAAC,MAAM,EAAE,GAAG,CAAM,GAAG,YAAY;IAQrE;;OAEG;IACH,YAAY,CAAC,IAAI,EAAE,MAAM,EAAE,EAAE,EAAE,MAAM,EAAE,QAAQ,GAAE,MAAW,GAAG,UAAU,GAAG,IAAI;IAoBhF;;OAEG;IACH,QAAQ,CAAC,IAAI,EAAE,MAAM,EAAE,EAAE,EAAE,MAAM,EAAE,QAAQ,GAAE,MAAU,EAAE,QAAQ,GAAE,MAAW,GAAG,UAAU,EAAE;IAmB7F;;OAEG;IACH,SAAS,CAAC,MAAM,EAAE,MAAM,EAAE,KAAK,GAAE,MAAU,GAAG,IAAI,EAAE;IAapD;;OAEG;IACH,QAAQ,CAAC,UAAU,GAAE,MAAW,EAAE,aAAa,GAAE,MAAa,GAAG,GAAG,CAAC,MAAM,EAAE,MAAM,CAAC;IAKpF;;OAEG;IACH,mBAAmB,IAAI,MAAM,EAAE,EAAE;IAIjC;;OAEG;IACH,WAAW,IAAI,GAAG,CAAC,MAAM,EAAE,MAAM,CAAC;IAKlC;;OAEG;IACH,qBAAqB,IAAI,GAAG,CAAC,MAAM,EAAE,MAAM,CAAC;IAS5C;;OAEG;IACH,IAAI,IAAI,IAAI;IAOZ;;OAEG;IACH,IAAI,IAAI,IAAI;IAOZ;;OAEG;IACH,KAAK,IAAI,IAAI;IAIb;;OAEG;IACH,KAAK,IAAI;QAAE,KAAK,EAAE,MAAM,CAAC;QAAC,KAAK,EAAE,MAAM,CAAC;QAAC,UAAU,EAAE,MAAM,CAAA;KAAE;CAG9D;AAED;;GAEG;AACH,wBAAgB,yBAAyB,CAAC,WAAW,CAAC,EAAE,MAAM,GAAG,SAAS,CAEzE;AAED,eAAe,SAAS,CAAC"}
dist/core/graph-wrapper.js ADDED
@@ -0,0 +1,299 @@
+ "use strict";
+ /**
+ * Graph Wrapper - Hypergraph database for code relationships
+ *
+ * Wraps @ruvector/graph-node for dependency analysis, co-edit patterns,
+ * and code structure understanding.
+ */
+ Object.defineProperty(exports, "__esModule", { value: true });
+ exports.CodeGraph = void 0;
+ exports.isGraphAvailable = isGraphAvailable;
+ exports.createCodeDependencyGraph = createCodeDependencyGraph;
+ let graphModule = null;
+ let loadError = null;
+ function getGraphModule() {
+ if (graphModule)
+ return graphModule;
+ if (loadError)
+ throw loadError;
+ try {
+ graphModule = require('@ruvector/graph-node');
+ return graphModule;
+ }
+ catch (e) {
+ loadError = new Error(`@ruvector/graph-node not installed: ${e.message}\n` +
+ `Install with: npm install @ruvector/graph-node`);
+ throw loadError;
+ }
+ }
+ function isGraphAvailable() {
+ try {
+ getGraphModule();
+ return true;
+ }
+ catch {
+ return false;
+ }
+ }
+ /**
+ * Graph Database for code relationships
+ */
+ class CodeGraph {
+ constructor(options = {}) {
+ const graph = getGraphModule();
+ this.storagePath = options.storagePath;
+ this.inner = new graph.GraphDatabase({
+ storagePath: options.storagePath,
+ inMemory: options.inMemory ?? true,
+ });
+ }
+ // ===========================================================================
+ // Node Operations
+ // ===========================================================================
+ /**
+ * Create a node (file, function, class, etc.)
+ */
+ createNode(id, labels, properties = {}) {
+ this.inner.createNode(id, labels, JSON.stringify(properties));
+ return { id, labels, properties };
+ }
+ /**
+ * Get a node by ID
+ */
+ getNode(id) {
+ const result = this.inner.getNode(id);
+ if (!result)
+ return null;
+ return {
+ id: result.id,
+ labels: result.labels,
+ properties: result.properties ? JSON.parse(result.properties) : {},
+ };
+ }
+ /**
+ * Update node properties
+ */
+ updateNode(id, properties) {
+ return this.inner.updateNode(id, JSON.stringify(properties));
+ }
+ /**
+ * Delete a node
+ */
+ deleteNode(id) {
+ return this.inner.deleteNode(id);
+ }
+ /**
+ * Find nodes by label
+ */
+ findNodesByLabel(label) {
+ const results = this.inner.findNodesByLabel(label);
+ return results.map((r) => ({
+ id: r.id,
+ labels: r.labels,
+ properties: r.properties ? JSON.parse(r.properties) : {},
+ }));
+ }
+ // ===========================================================================
+ // Edge Operations
+ // ===========================================================================
+ /**
+ * Create an edge (import, call, reference, etc.)
+ */
+ createEdge(from, to, type, properties = {}) {
+ const id = this.inner.createEdge(from, to, type, JSON.stringify(properties));
+ return { id, from, to, type, properties };
+ }
+ /**
+ * Get edges from a node
+ */
+ getOutgoingEdges(nodeId, type) {
+ const results = this.inner.getOutgoingEdges(nodeId, type);
+ return results.map((r) => ({
+ id: r.id,
+ from: r.from,
+ to: r.to,
+ type: r.type,
+ properties: r.properties ? JSON.parse(r.properties) : {},
+ }));
+ }
+ /**
+ * Get edges to a node
+ */
+ getIncomingEdges(nodeId, type) {
+ const results = this.inner.getIncomingEdges(nodeId, type);
+ return results.map((r) => ({
+ id: r.id,
+ from: r.from,
+ to: r.to,
+ type: r.type,
+ properties: r.properties ? JSON.parse(r.properties) : {},
+ }));
+ }
+ /**
+ * Delete an edge
+ */
+ deleteEdge(edgeId) {
+ return this.inner.deleteEdge(edgeId);
+ }
+ // ===========================================================================
+ // Hyperedge Operations (for co-edit patterns)
+ // ===========================================================================
+ /**
+ * Create a hyperedge connecting multiple nodes
+ */
+ createHyperedge(nodes, type, properties = {}) {
+ const id = this.inner.createHyperedge(nodes, type, JSON.stringify(properties));
+ return { id, nodes, type, properties };
+ }
+ /**
+ * Get hyperedges containing a node
+ */
+ getHyperedges(nodeId, type) {
+ const results = this.inner.getHyperedges(nodeId, type);
+ return results.map((r) => ({
+ id: r.id,
+ nodes: r.nodes,
+ type: r.type,
+ properties: r.properties ? JSON.parse(r.properties) : {},
+ }));
+ }
+ // ===========================================================================
+ // Query Operations
+ // ===========================================================================
+ /**
+ * Execute a Cypher query
+ */
+ cypher(query, params = {}) {
+ const result = this.inner.cypher(query, JSON.stringify(params));
+ return {
+ columns: result.columns,
+ rows: result.rows,
+ };
+ }
+ /**
+ * Find shortest path between nodes
+ */
+ shortestPath(from, to, maxDepth = 10) {
+ const result = this.inner.shortestPath(from, to, maxDepth);
+ if (!result)
+ return null;
+ return {
+ nodes: result.nodes.map((n) => ({
+ id: n.id,
+ labels: n.labels,
+ properties: n.properties ? JSON.parse(n.properties) : {},
+ })),
+ edges: result.edges.map((e) => ({
+ id: e.id,
+ from: e.from,
+ to: e.to,
+ type: e.type,
+ properties: e.properties ? JSON.parse(e.properties) : {},
+ })),
+ length: result.length,
+ };
+ }
+ /**
+ * Get all paths between nodes (up to maxPaths)
+ */
+ allPaths(from, to, maxDepth = 5, maxPaths = 10) {
+ const results = this.inner.allPaths(from, to, maxDepth, maxPaths);
+ return results.map((r) => ({
+ nodes: r.nodes.map((n) => ({
+ id: n.id,
+ labels: n.labels,
+ properties: n.properties ? JSON.parse(n.properties) : {},
+ })),
+ edges: r.edges.map((e) => ({
+ id: e.id,
+ from: e.from,
+ to: e.to,
+ type: e.type,
+ properties: e.properties ? JSON.parse(e.properties) : {},
+ })),
+ length: r.length,
+ }));
+ }
+ /**
+ * Get neighbors of a node
+ */
+ neighbors(nodeId, depth = 1) {
+ const results = this.inner.neighbors(nodeId, depth);
+ return results.map((n) => ({
+ id: n.id,
+ labels: n.labels,
+ properties: n.properties ? JSON.parse(n.properties) : {},
+ }));
+ }
+ // ===========================================================================
+ // Graph Algorithms
+ // ===========================================================================
+ /**
+ * Calculate PageRank for nodes
+ */
+ pageRank(iterations = 20, dampingFactor = 0.85) {
+ const result = this.inner.pageRank(iterations, dampingFactor);
+ return new Map(Object.entries(result));
+ }
+ /**
+ * Find connected components
+ */
+ connectedComponents() {
+ return this.inner.connectedComponents();
+ }
+ /**
+ * Detect communities (Louvain algorithm)
+ */
+ communities() {
+ const result = this.inner.communities();
+ return new Map(Object.entries(result));
+ }
+ /**
+ * Calculate betweenness centrality
+ */
+ betweennessCentrality() {
+ const result = this.inner.betweennessCentrality();
+ return new Map(Object.entries(result));
+ }
+ // ===========================================================================
+ // Persistence
+ // ===========================================================================
+ /**
+ * Save graph to storage
+ */
+ save() {
+ if (!this.storagePath) {
+ throw new Error('No storage path configured');
+ }
+ this.inner.save();
+ }
+ /**
+ * Load graph from storage
+ */
+ load() {
+ if (!this.storagePath) {
+ throw new Error('No storage path configured');
+ }
+ this.inner.load();
+ }
+ /**
+ * Clear all data
+ */
+ clear() {
+ this.inner.clear();
+ }
+ /**
+ * Get graph statistics
+ */
+ stats() {
+ return this.inner.stats();
+ }
+ }
+ exports.CodeGraph = CodeGraph;
+ /**
+ * Create a code dependency graph from file analysis
+ */
+ function createCodeDependencyGraph(storagePath) {
+ return new CodeGraph({ storagePath, inMemory: !storagePath });
+ }
+ exports.default = CodeGraph;
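Throughout the wrapper above, properties cross the native boundary as JSON strings: `JSON.stringify` on the way in, `JSON.parse` with a `{}` fallback on the way out. That round-trip can be sketched standalone (`packProperties`/`unpackProperties` are illustrative names, not part of the package API):

```javascript
// Sketch of the property round-trip CodeGraph performs around the native
// layer: serialize on write, parse on read, defaulting to {} when absent.
function packProperties(properties = {}) {
  return JSON.stringify(properties);
}

function unpackProperties(raw) {
  return raw ? JSON.parse(raw) : {};
}

const stored = packProperties({ path: 'src/index.ts', loc: 120 });
console.log(unpackProperties(stored)); // path and loc restored
console.log(unpackProperties(undefined)); // → {}
```

This mirrors why every read path in the wrapper guards with `r.properties ? JSON.parse(r.properties) : {}` — a node created without properties still yields a usable empty object.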
dist/core/index.d.ts ADDED
@@ -0,0 +1,48 @@
+ /**
+ * Core module exports
+ *
+ * These wrappers provide safe, type-flexible interfaces to the underlying
+ * native packages, handling array type conversions automatically.
+ */
+ export * from './gnn-wrapper';
+ export * from './attention-fallbacks';
+ export * from './agentdb-fast';
+ export * from './sona-wrapper';
+ export * from './intelligence-engine';
+ export * from './onnx-embedder';
+ export * from './onnx-optimized';
+ export * from './parallel-intelligence';
+ export * from './parallel-workers';
+ export * from './router-wrapper';
+ export * from './graph-wrapper';
+ export * from './cluster-wrapper';
+ export * from './ast-parser';
+ export * from './diff-embeddings';
+ export * from './coverage-router';
+ export * from './graph-algorithms';
+ export * from './tensor-compress';
+ export * from './learning-engine';
+ export * from './adaptive-embedder';
+ export * from './neural-embeddings';
+ export * from './neural-perf';
+ export * from './rvf-wrapper';
+ export * from '../analysis';
+ export { default as gnnWrapper } from './gnn-wrapper';
+ export { default as attentionFallbacks } from './attention-fallbacks';
+ export { default as agentdbFast } from './agentdb-fast';
+ export { default as Sona } from './sona-wrapper';
+ export { default as IntelligenceEngine } from './intelligence-engine';
+ export { default as OnnxEmbedder } from './onnx-embedder';
+ export { default as OptimizedOnnxEmbedder } from './onnx-optimized';
+ export { default as ParallelIntelligence } from './parallel-intelligence';
+ export { default as ExtendedWorkerPool } from './parallel-workers';
+ export { default as SemanticRouter } from './router-wrapper';
+ export { default as CodeGraph } from './graph-wrapper';
+ export { default as RuvectorCluster } from './cluster-wrapper';
+ export { default as CodeParser } from './ast-parser';
+ export { CodeParser as ASTParser } from './ast-parser';
+ export { default as TensorCompress } from './tensor-compress';
+ export { default as LearningEngine } from './learning-engine';
+ export { default as AdaptiveEmbedder } from './adaptive-embedder';
+ export { default as NeuralSubstrate } from './neural-embeddings';
+ //# sourceMappingURL=index.d.ts.map