---
license: mit
tags:
- pytorch
- safetensors
- threshold-logic
- neuromorphic
- encoder
---

# threshold-priorityencoder8

An 8-to-3 priority encoder built from threshold-logic neurons. It outputs the 3-bit binary encoding of the highest-priority active input, together with a valid flag.

## Function

`priority_encode(i7..i0) -> (y2, y1, y0, valid)`

- i7 = highest priority, i0 = lowest priority
- y2,y1,y0 = 3-bit binary encoding of highest active input index
- valid = 1 if any input is active
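The behavior above can be sketched as a plain-Python reference. The function name and tuple convention here are illustrative, not part of the shipped model:

```python
def priority_encode_ref(bits):
    """bits = (i7, ..., i0); i7 has the highest priority."""
    for pos, b in enumerate(bits):   # pos 0 -> i7, pos 7 -> i0
        if b:
            idx = 7 - pos            # index of the highest active input
            return ((idx >> 2) & 1, (idx >> 1) & 1, idx & 1, 1)
    return (0, 0, 0, 0)              # no input active: valid = 0
```

For example, with only i5 high the highest active index is 5 (binary 101), so the outputs are (1, 0, 1, 1).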

## Architecture

**Layer 1: 8 neurons (h7..h0)**

Each hk detects "ik is the highest active input":
- hk fires when ik=1 AND all higher-priority inputs are 0
- weight vectors below are ordered [i7, i6, ..., i0]
- h7: weights [1,0,0,0,0,0,0,0], bias -1
- h6: weights [-1,1,0,0,0,0,0,0], bias -1
- ...
- h0: weights [-1,-1,-1,-1,-1,-1,-1,1], bias -1

**Layer 2: 4 neurons**

- y2 = h7 OR h6 OR h5 OR h4
- y1 = h7 OR h6 OR h3 OR h2
- y0 = h7 OR h5 OR h3 OR h1
- v (valid) = 1 if any h is active
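The two layers above can be built directly from the listed weights. This sketch uses a hard step activation (a neuron fires when its pre-activation is >= 0) and constructs the matrices in code rather than loading model.safetensors, so it makes no assumptions about the checkpoint's tensor names:

```python
import torch

def step(x):
    # Threshold activation: fire when w.x + b >= 0
    return (x >= 0).float()

# Layer 1: h_k fires iff i_k = 1 and every higher-priority input is 0.
# Rows ordered h7..h0; columns ordered i7..i0, matching the card.
W1 = torch.zeros(8, 8)
for r in range(8):           # r = 0 -> h7, ..., r = 7 -> h0
    W1[r, :r] = -1.0         # inhibit all higher-priority inputs
    W1[r, r] = 1.0           # excite the neuron's own input
b1 = torch.full((8,), -1.0)

# Layer 2: OR gates over the one-hot h vector (unit weights, bias -1).
# h vector positions: 0 -> h7, 1 -> h6, ..., 7 -> h0.
W2 = torch.zeros(4, 8)
W2[0, [0, 1, 2, 3]] = 1.0    # y2 = h7 OR h6 OR h5 OR h4
W2[1, [0, 1, 4, 5]] = 1.0    # y1 = h7 OR h6 OR h3 OR h2
W2[2, [0, 2, 4, 6]] = 1.0    # y0 = h7 OR h5 OR h3 OR h1
W2[3, :] = 1.0               # v  = any h active
b2 = torch.full((4,), -1.0)

def priority_encode(bits):   # bits = (i7, ..., i0)
    x = torch.tensor(bits, dtype=torch.float32)
    h = step(W1 @ x + b1)
    return tuple(int(o) for o in step(W2 @ h + b2))
```

Because layer 1 produces a one-hot (or all-zero) h vector, the OR gates in layer 2 never see more than one active input, which is what makes the plain unit-weight encoding correct.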

## Parameters

| Property | Value |
|---|---|
| Inputs | 8 |
| Outputs | 4 |
| Neurons | 12 |
| Layers | 2 |
| Parameters | 108 |
| Magnitude | 68 |
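These counts follow from the layer sizes: every neuron carries 8 weights and 1 bias, and, assuming "Magnitude" means the summed absolute value of all weights and biases (the card does not define it), the 68 checks out as well:

```python
# Parameters: 12 neurons, each with 8 weights + 1 bias.
params = 8 * 9 + 4 * 9                  # layer 1 (72) + layer 2 (36)

# Magnitude, assumed to be sum of |w| + |b| over the network.
# Layer 1: h7 has 1 nonzero unit weight, h6 has 2, ..., h0 has 8.
layer1_mag = sum(range(1, 9)) + 8       # 36 weights + 8 biases = 44
# Layer 2: y2, y1, y0 each have 4 unit weights, v has 8.
layer2_mag = (4 + 4 + 4 + 8) + 4        # 20 weights + 4 biases = 24
```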

## Usage

```python
from safetensors.torch import load_file
import torch

w = load_file('model.safetensors')

# (see model.py for full implementation)

# Example: i5 is highest active (index 5 = 101)
# priority_encode(0,0,1,0,0,0,0,0, w) -> (1, 0, 1, 1)
```

## License

MIT