Threshold Logic Circuits
Collection
Boolean gates, voting functions, modular arithmetic, and adders as threshold networks.
248 items
Hamming(7,4) encoder. Transforms 4 data bits into a 7-bit codeword with single-error correction capability.
d1   d2   d3   d4
 │    │    │    │
 │    │    │    ├────────────────────────► c7 = d4
 │    │    ├───────────────────► c6 = d3
 │    ├──────────────► c5 = d2
 ├─────────► c3 = d1
 │    │    │    │
 ▼    ▼    │    ▼
┌───────────────┐
│ d1 XOR d2 XOR │──────► c1 = p1
│      d4       │
└───────────────┘
 │         │    │
 ▼         ▼    ▼
┌───────────────┐
│ d1 XOR d3 XOR │──────► c2 = p2
│      d4       │
└───────────────┘
      │    │    │
      ▼    ▼    ▼
┌───────────────┐
│ d2 XOR d3 XOR │──► c4 = p3
│      d4       │
└───────────────┘
Richard Hamming invented this code in 1950. It encodes 4 data bits into 7 bits such that any single-bit error can be detected and corrected.
Codeword structure:
| Position | 1 | 2 | 3 | 4 | 5 | 6 | 7 |
|---|---|---|---|---|---|---|---|
| Bit | p1 | p2 | d1 | p3 | d2 | d3 | d4 |
| Type | parity | parity | data | parity | data | data | data |
Parity equations:

p1 = d1 ⊕ d2 ⊕ d4
p2 = d1 ⊕ d3 ⊕ d4
p3 = d2 ⊕ d3 ⊕ d4

Each parity bit requires a 3-input XOR. In threshold logic:
XOR(a,b,c) = XOR(XOR(a,b), c)
a   b
│   │
└─┬─┘
  ▼
┌───────┐
│  XOR  │  (2 layers)
└───────┘
  │
  │   c
  └─┬─┘
    ▼
┌───────┐
│  XOR  │  (2 more layers)
└───────┘
    │
    ▼
XOR(a,b,c)
Total depth: 4 layers per parity. All three parities compute in parallel.
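One way to realize this cascade with step-threshold neurons is the classic OR/AND construction; the weights and thresholds below are illustrative assumptions for a sketch, not the trained model's parameters:

```python
def step(x):
    # Heaviside threshold: fires (1) when the weighted sum is >= 0
    return int(x >= 0)

def xor2(a, b):
    """2-input XOR as a depth-2 threshold network (one possible weight choice)."""
    h_or  = step(a + b - 1)        # layer 1: OR  (fires when a + b >= 1)
    h_and = step(a + b - 2)        # layer 1: AND (fires when a + b >= 2)
    return step(h_or - h_and - 1)  # layer 2: OR and not AND

def xor3(a, b, c):
    """3-input XOR via two cascaded 2-input XORs (depth 4)."""
    return xor2(xor2(a, b), c)

# The three parity bits for data word 1011 (d1=1, d2=0, d3=1, d4=1):
d1, d2, d3, d4 = 1, 0, 1, 1
p1, p2, p3 = xor3(d1, d2, d4), xor3(d1, d3, d4), xor3(d2, d3, d4)
# p1, p2, p3 = 0, 1, 0
```

Each `xor2` uses three threshold neurons across two layers, so the cascaded `xor3` reaches the stated depth of 4.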
| Property | Value |
|---|---|
| Data bits (k) | 4 |
| Codeword bits (n) | 7 |
| Parity bits (n-k) | 3 |
| Minimum distance | 3 |
| Error correction | 1 bit |
| Error detection | 2 bits |
Data: 1011
p1 = 1 ⊕ 0 ⊕ 1 = 0
p2 = 1 ⊕ 1 ⊕ 1 = 1
p3 = 0 ⊕ 1 ⊕ 1 = 0
Codeword: 0110011  (layout: p1 p2 d1 p3 d2 d3 d4)
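The same encoding can be written as multiplication by a generator matrix over GF(2). The matrix below is derived here from the parity equations and the codeword layout; it is an illustration, not taken from the model files:

```python
import numpy as np

# Generator matrix for Hamming(7,4), columns in codeword order p1 p2 d1 p3 d2 d3 d4
G = np.array([
    # p1 p2 d1 p3 d2 d3 d4
    [1,  1, 1, 0, 0, 0, 0],  # d1 contributes to p1, p2, and position 3
    [1,  0, 0, 1, 1, 0, 0],  # d2 contributes to p1, p3, and position 5
    [0,  1, 0, 1, 0, 1, 0],  # d3 contributes to p2, p3, and position 6
    [1,  1, 0, 1, 0, 0, 1],  # d4 contributes to p1, p2, p3, and position 7
])

data = np.array([1, 0, 1, 1])      # d1 d2 d3 d4
codeword = data @ G % 2
print(''.join(map(str, codeword)))  # 0110011
```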
| Component | Neurons | Parameters |
|---|---|---|
| p1 (3-way XOR) | 6 | 22 |
| p2 (3-way XOR) | 6 | 22 |
| p3 (3-way XOR) | 6 | 22 |
| d1-d4 pass-through | 4 | 20 |
| Total | 22 | 86 |
Layers: 4 (two cascaded XOR stages)
| Data | Codeword | HW |
|---|---|---|
| 0000 | 0000000 | 0 |
| 1000 | 1110000 | 3 |
| 0100 | 1001100 | 3 |
| 1100 | 0111100 | 4 |
| 0010 | 0101010 | 3 |
| 1010 | 1011010 | 4 |
| 0110 | 1100110 | 4 |
| 1110 | 0010110 | 3 |
| 0001 | 1101001 | 4 |
| 1001 | 0011001 | 3 |
| 0101 | 0100101 | 3 |
| 1101 | 1010101 | 4 |
| 0011 | 1000011 | 3 |
| 1011 | 0110011 | 4 |
| 0111 | 0001111 | 4 |
| 1111 | 1111111 | 7 |
Note: Minimum Hamming distance between any two codewords is 3.
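The minimum-distance claim can be checked exhaustively over all 16 codewords; the `encode` helper below simply restates the parity equations:

```python
from itertools import combinations, product

def encode(d1, d2, d3, d4):
    """Hamming(7,4) codeword in layout p1 p2 d1 p3 d2 d3 d4."""
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return (p1, p2, d1, p3, d2, d3, d4)

codewords = [encode(*bits) for bits in product([0, 1], repeat=4)]
dmin = min(sum(x != y for x, y in zip(c1, c2))
           for c1, c2 in combinations(codewords, 2))
print(dmin)  # 3
```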
from safetensors.torch import load_file

# Load the trained threshold-network weights
w = load_file('model.safetensors')

def hamming74_encode(d1, d2, d3, d4):
    """Encode 4 data bits to a 7-bit Hamming codeword.

    Reference implementation of the parity equations;
    see model.py for the threshold-network version.
    """
    p1 = d1 ^ d2 ^ d4
    p2 = d1 ^ d3 ^ d4
    p3 = d2 ^ d3 ^ d4
    return [p1, p2, d1, p3, d2, d3, d4]

# Encode data word 1011
codeword = hamming74_encode(1, 0, 1, 1)
# Returns [0, 1, 1, 0, 0, 1, 1]
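A decoder is not part of this repository, but the single-error correction the code provides works by syndrome decoding: each parity check covers the positions whose binary index contains that parity bit, so the three check results read out the error position directly. A minimal sketch, assuming a 7-element list in the same p1 p2 d1 p3 d2 d3 d4 layout:

```python
def correct(cw):
    """Flip at most one bit to recover the nearest valid codeword."""
    cw = list(cw)
    s1 = cw[0] ^ cw[2] ^ cw[4] ^ cw[6]  # check over positions 1, 3, 5, 7
    s2 = cw[1] ^ cw[2] ^ cw[5] ^ cw[6]  # check over positions 2, 3, 6, 7
    s3 = cw[3] ^ cw[4] ^ cw[5] ^ cw[6]  # check over positions 4, 5, 6, 7
    pos = s1 + 2 * s2 + 4 * s3          # syndrome = error position (0 = clean)
    if pos:
        cw[pos - 1] ^= 1
    return cw

received = [0, 1, 1, 0, 1, 1, 1]  # 0110011 with position 5 flipped
print(correct(received))          # [0, 1, 1, 0, 0, 1, 1]
```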
threshold-hamming74encoder/
├── model.safetensors
├── model.py
├── config.json
└── README.md
MIT