# aurekai/model-memory

Public model memory archive for the Aurekai platform. Stores compiled model representations, SAE dictionaries, and semantic embeddings for zero-shot operational orchestration.

## Overview

Model memory serves as the central knowledge base for Aurekai's runtime, enabling semantic querying and model-based decision making. This repository hosts:

- **Compiled Model Binaries**: Pre-compiled Aurekai model representations (`.akmodel`, `.bfmodel`)
- **SAE Dictionaries**: Sparse autoencoder dictionaries for model interpretability (`.aksae`, `.bfsae`)  
- **Semantic Embeddings**: Cached embeddings for fast semantic search across operators
- **Manifest Metadata**: Aurekai and legacy Bonfyre format manifests

## Quick Start

```bash
# Download latest model memory archive
curl -L https://huggingface.co/aurekai/model-memory/resolve/main/aurekai-model-memory-qwen3-8b-20260502.tar.gz -o model-memory.tar.gz

# Extract
tar -xzf model-memory.tar.gz

# Use with Aurekai runtime
export AUREKAI_MODEL_MEMORY=$(pwd)/model-memory
akai run <recipe> --model-cache --semantic-search
```
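After downloading, the archive can be checked against the repository's [SHA256SUMS](./SHA256SUMS) file before extraction. A minimal sketch of the verification logic (the helper names are illustrative, not part of the Aurekai tooling):

```python
import hashlib
import os

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Stream a file through SHA-256 and return its hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def verify(path: str, sums_file: str) -> bool:
    """Check `path` against a SHA256SUMS-style file
    (lines of '<hex digest>  <filename>')."""
    name = os.path.basename(path)
    with open(sums_file) as f:
        for line in f:
            parts = line.split()
            if len(parts) == 2 and parts[1].lstrip("*") == name:
                return parts[0] == sha256_of(path)
    raise KeyError(f"{name} not listed in {sums_file}")
```

Note that checksum entries are keyed by filename, so verify against the archive's original name rather than a renamed download. The standard `sha256sum -c SHA256SUMS --ignore-missing` does the same job from the shell.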

## Format Specifications

### Aurekai-First Formats (.ak*)

- **`.akmodel`**: Aurekai-native compiled model format
  - Used by Aurekai runtime for direct inference
  - Optimized for semantic routing and operator selection
  
- **`.aksae`**: Aurekai-native SAE dictionary format
  - Sparse autoencoder coefficients in Aurekai serialization
  - Default SAE for model interpretability

- **`.akfpqx`**: Aurekai-native FPQx alignment format
  - Feature-to-proxy quantization alignments
  - Model-to-model alignment data

### Legacy Bonfyre Formats (.bf*)

For backward compatibility, this repository includes legacy Bonfyre format equivalents:
- `.bfmodel` → Bonfyre binary model representation
- `.bfsae` → Bonfyre SAE dictionary format
- `.bffpqx` → Bonfyre FPQx alignment format

## Available Models

### Qwen3 8B (qwen3-8b)

- **Release**: 2026-05-02
- **Archive**: `aurekai-model-memory-qwen3-8b-20260502.tar.gz`
- **Checksums**: See [SHA256SUMS](./SHA256SUMS)
- **Formats**:
  - `qwen3-8b.akmodel` + `qwen3-8b.bfmodel`
  - `default.aksae` + `default.bfsae`
  - `qwen3-to-llama3.akfpqx` + `qwen3-to-llama3.bffpqx`

## Integration with Aurekai

### Environment Variables

```bash
export AUREKAI_MODEL_MEMORY=/path/to/model-memory
export AUREKAI_SAE_DEFAULT=model-memory/default.aksae
export AUREKAI_EMBEDDINGS_CACHE=/tmp/aurekai-embeddings
```

### In Aurekai Config

```json
{
  "model_memory": {
    "path": "./model-memory",
    "formats": ["akmodel", "bfmodel"],
    "sae_dicts": ["default.aksae", "default.bfsae"],
    "fpqx_alignments": ["qwen3-to-llama3.akfpqx"]
  }
}
```
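A quick way to sanity-check a config before launching the runtime is to confirm that every SAE dictionary and FPQx alignment it names actually exists under the model-memory path. A sketch, assuming the key names from the snippet above (the real loader may resolve paths differently):

```python
import json
import os

def check_model_memory(config_path: str) -> list:
    """Return the SAE/FPQx files named in the config that are missing
    from the model-memory directory. Key names follow the JSON snippet
    above; this is an illustrative check, not the runtime's own loader."""
    with open(config_path) as f:
        cfg = json.load(f)["model_memory"]
    root = cfg["path"]
    wanted = cfg.get("sae_dicts", []) + cfg.get("fpqx_alignments", [])
    return [name for name in wanted
            if not os.path.exists(os.path.join(root, name))]
```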

## Manifests

- **`aurekai.manifest.json`**: Aurekai public manifest with SAE and FPQx inventory
- **`bonfyre.manifest.json`**: Legacy Bonfyre manifest for backward compatibility

Both manifests are included in each release and describe:
- Available models and their paths
- SAE dictionary mappings (Aurekai → Legacy)
- FPQx alignment pairs for cross-model translation
- Operator compatibility and runtime requirements
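The manifest schema itself isn't documented here; purely as an illustration of the kind of inventory both files carry (every field name below is hypothetical):

```json
{
  "models": [
    {"id": "qwen3-8b", "path": "qwen3-8b.akmodel", "legacy_path": "qwen3-8b.bfmodel"}
  ],
  "sae_dicts": {
    "default.aksae": "default.bfsae"
  },
  "fpqx_alignments": [
    {"source": "qwen3-8b", "target": "llama3", "path": "qwen3-to-llama3.akfpqx"}
  ]
}
```

Consult the `aurekai.manifest.json` shipped in each release for the authoritative layout.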

## Performance Notes

- Model memory archives are published in both gzip (`.tar.gz`) and zstd (`.tar.zst`) compressed forms
- Prefer the `.tar.zst` archive for faster decompression on systems with zstd support
- Extract to an SSD for the best semantic search performance

## License

Licensed under the Aurekai Open Source License. See [LICENSE](https://github.com/aurekai/aurekai/blob/main/LICENSE) in the main Aurekai repository.

## Related

- **Main Aurekai Repo**: https://github.com/aurekai/aurekai
- **SAE Dictionaries**: https://huggingface.co/aurekai/sae-dictionaries
- **FPQx Alignments**: https://huggingface.co/aurekai/fpqx-alignments
- **Semantic Cache Benchmarks**: https://huggingface.co/aurekai/semantic-cache-bench

## Support

For issues or questions:
- GitHub Discussions: https://github.com/aurekai/aurekai/discussions
- GitHub Issues: https://github.com/aurekai/aurekai/issues