Davros182 committed · verified
Commit 3210980 · Parent(s): b7f3ec1

Create README.md

Files changed (1): README.md (+132)
---
tags:
- AGI
- SentientAI
---
# DreamStack - Symbolic Cognition Simulator

A Python script that implements symbolic cognition using the `llama-cpp-python` library and a stacked equation structure.

## Overview

DreamStack simulates symbolic cognition using this stacked equation structure:

```
D₀ = A₀ · f₀(x) + B₀ · g₀(x)
D₁ = A₁ · f₁(g₀) + B₁ · g₁(g₀)
D₂ = A₂ · f₂(g₁) + B₂ · g₂(g₁)
... up to a user-defined recursion depth
```

Each level contains:
- `fₙ(x)`: the first output (the "real" output)
- `gₙ(x)`: the next 3 alternate outputs (the "imaginary" outputs)
- each imaginary output is then recursively processed as a new prompt

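The recursion described above can be sketched in plain Python. This is a hypothetical illustration, not the script's actual API: `generate` stands in for an LLM call, and all names are ours.

```python
# Hypothetical sketch of the DreamStack recursion (illustrative names,
# not the script's real API). `generate` stands in for an LLM call.
def dream(prompt, depth, generate, branches=3, levels=None):
    """Produce one real output and `branches` imaginary outputs for `prompt`,
    then recurse on each imaginary output until `depth` is exhausted."""
    if levels is None:
        levels = []
    if depth < 0:
        return levels
    real = generate(prompt)  # f_n(x): the "real" output
    imaginary = [generate(f"Alternative {i}: {prompt}")  # g_n(x): alternates
                 for i in range(1, branches + 1)]
    levels.append({"prompt": prompt, "real": real, "imaginary": imaginary})
    for alt in imaginary:  # each imaginary output seeds a deeper prompt
        dream(f"Based on: {alt}", depth - 1, generate, branches, levels)
    return levels
```

With depth 1 this yields four levels: the root plus one per imaginary branch.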
## Prerequisites

1. **Python 3.8+** installed on your system
2. **GGUF model file** (e.g., `mistral-7b-instruct-v0.2.Q4_K_M.gguf`)

## Installation

1. **Install dependencies:**
   ```bash
   pip install -r requirements.txt
   ```

2. **Download a GGUF model:**
   - Download a GGUF-format model (e.g., from Hugging Face)
   - Place it in a `models/` directory relative to the script
   - Default expected path: `./models/mistral-7b-instruct-v0.2.Q4_K_M.gguf`

## Usage

1. **Run the script:**
   ```bash
   python dreamstack.py
   ```

2. **Follow the prompts:**
   - Enter your initial prompt
   - Specify the recursion depth (default: 3)

3. **View results:**
   - Results are displayed in the terminal
   - JSON results are saved to `dreamstack_results.json`

## Configuration

You can modify the model configuration in the `DreamStack` class:

```python
dreamstack = DreamStack(
    model_path="./models/your-model.gguf",
    n_ctx=2048,    # Context window size
    n_threads=4    # Number of threads
)
```

## Output Format

The script generates a JSON structure like:

```json
{
  "D_0": {
    "prompt": "user prompt",
    "real": "f₀ output",
    "imaginary": ["g₀₁", "g₀₂", "g₀₃"]
  },
  "D_1": {
    "prompt": "Based on: g₀₁",
    "real": "f₁ output",
    "imaginary": ["g₁₁", "g₁₂", "g₁₃"]
  },
  ...
}
```

## Example

```bash
$ python dreamstack.py

=== DreamStack - Symbolic Cognition Simulator ===

Enter your prompt: What is the meaning of life?
Enter recursion depth (default: 3): 2

Loading model from ./models/mistral-7b-instruct-v0.2.Q4_K_M.gguf...
Model loaded successfully!

=== Starting DreamStack Processing ===
Initial prompt: What is the meaning of life?
Depth: 2

--- Generating D_0 ---
Prompt: What is the meaning of life?
Generating real output...
Real output: The meaning of life is to find purpose and fulfillment...
Generating imaginary output 1/3...
Imaginary output 1: From a philosophical perspective...
Generating imaginary output 2/3...
Imaginary output 2: In terms of biological evolution...
Generating imaginary output 3/3...
Imaginary output 3: From a spiritual viewpoint...

=== Processing D_1 ===
...
```

## Requirements

- `llama-cpp-python>=0.2.0`
- `typing-extensions>=4.0.0`

## Notes

- The script requires a GGUF-format model file
- Processing time depends on model size and recursion depth
- Results are automatically saved in JSON format
- Each layer generates 1 real and 3 imaginary outputs
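Because each prompt produces 1 real and 3 imaginary generations, and every imaginary output is fed back in as a new prompt, the LLM-call count grows geometrically with depth. A back-of-the-envelope helper, assuming depth `d` means levels 0 through `d` (this branching model is our reading of the scheme above, not measured from the script):

```python
def total_calls(depth, branches=3):
    """Rough LLM-call count for a run: each of the sum(branches**k)
    prompts in the recursion tree costs 1 real + `branches` imaginary
    generations."""
    prompts = sum(branches ** k for k in range(depth + 1))
    return prompts * (1 + branches)
```

Under that assumption, depth 2 already means 13 prompts and 52 generations, which is why recursion depth dominates runtime.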