phanerozoic committed cc5fa68 (verified) · 1 parent: 563eb1b

Upload README.md with huggingface_hub

---
license: mit
tags:
- formal-verification
- coq
- threshold-logic
- neuromorphic
- modular-arithmetic
---

# tiny-mod8-verified

Formally verified MOD-8 circuit: a single-layer threshold network that computes the Hamming weight of its 8-bit input modulo 8 with 100% accuracy.

## Architecture

| Component | Value |
|-----------|-------|
| Inputs | 8 |
| Outputs | 8 (one per residue class) |
| Neurons | 8 (one per residue 0-7) |
| Parameters | 72 (8 × 9) |
| Weights | [1, 1, 1, 1, 1, 1, 1, -7] |
| Bias | 0 |
| Activation | Heaviside step |

## Key Properties

- 100% accuracy (256/256 inputs correct)
- Coq-proven correctness
- Algebraic weight pattern: resets every 8 positions
- Computes Hamming weight mod 8
- Compatible with neuromorphic hardware

## Algebraic Pattern

MOD-8 uses a cumulative weight pattern with a reset at position 8:
- Positions 1-7: weight = 1
- Position 8: weight = 1 - 8 = -7

This creates a cumulative sum that cycles mod 8: since -7 ≡ 1 (mod 8), the weighted sum is always congruent to the Hamming weight modulo 8.

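This congruence can be checked exhaustively with a few lines of Python. The sketch below uses the weight vector from the Architecture table directly, so it runs without the shipped safetensors file:

```python
from itertools import product

# Weight pattern from the model card: seven 1s, then a reset weight of 1 - 8 = -7.
WEIGHTS = [1, 1, 1, 1, 1, 1, 1, -7]

def weighted_sum(bits):
    """Linear pre-activation of the circuit (bias is 0)."""
    return sum(w * b for w, b in zip(WEIGHTS, bits))

# Because -7 is congruent to 1 mod 8, the weighted sum is congruent to the
# Hamming weight mod 8 -- verified here over all 256 possible 8-bit inputs.
assert all(
    weighted_sum(bits) % 8 == sum(bits) % 8
    for bits in product([0, 1], repeat=8)
)
```

Note that the raw sum itself ranges over -7..8, so a final reduction mod 8 is needed to read off the residue.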
## Usage

```python
import torch
from safetensors.torch import load_file

weights = load_file('mod8.safetensors')

def mod8_circuit(bits):
    # bits: list of 8 binary values
    inputs = torch.tensor([float(b) for b in bits])
    weighted_sum = (inputs * weights['weight']).sum() + weights['bias']
    # The raw sum can be negative (e.g. -7 when only position 8 is set),
    # so reduce mod 8 to recover the residue.
    return int(weighted_sum.item()) % 8

# Test
print(mod8_circuit([1,1,1,1,1,1,1,1]))  # 8 mod 8 = 0
print(mod8_circuit([1,1,1,1,1,1,1,0]))  # 7 mod 8 = 7
```

## Verification

**Coq Theorem**:
```coq
Theorem mod8_correct_residue_0 : forall x0 x1 x2 x3 x4 x5 x6 x7,
  mod8_is_zero [x0; x1; x2; x3; x4; x5; x6; x7] =
  Z.eqb ((Z.of_nat (hamming_weight [x0; x1; x2; x3; x4; x5; x6; x7])) mod 8) 0.
```

Proven axiom-free using algebraic weight patterns.

Full proof: [coq-circuits/Modular/Mod8.v](https://github.com/CharlesCNorton/coq-circuits/blob/main/coq/Modular/Mod8.v)

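For readers without a Coq toolchain, the theorem can be mirrored in Python. This is only a sanity-check sketch: `mod8_is_zero` below stands in for the circuit's residue-0 output by testing the pre-activation for exact zero, which is an assumption about the implementation, not the proven Coq definition:

```python
from itertools import product

# Residue-0 weights from the Architecture table.
WEIGHTS = [1, 1, 1, 1, 1, 1, 1, -7]

def mod8_is_zero(bits):
    # With these weights the pre-activation lies in -7..8 and equals 0
    # exactly when the Hamming weight is congruent to 0 mod 8 (h = 0 or 8).
    return sum(w * b for w, b in zip(WEIGHTS, bits)) == 0

def hamming_weight(bits):
    return sum(bits)

# Python mirror of mod8_correct_residue_0, checked over all 256 inputs.
assert all(
    mod8_is_zero(bits) == (hamming_weight(bits) % 8 == 0)
    for bits in product([0, 1], repeat=8)
)
```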
## Residue Distribution

For 8-bit inputs (256 total):
- Residue 0: 2 inputs
- Residue 1: 8 inputs
- Residue 2: 28 inputs
- Residue 3: 56 inputs
- Residue 4: 70 inputs
- Residue 5: 56 inputs
- Residue 6: 28 inputs
- Residue 7: 8 inputs

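These counts are just binomial coefficients grouped by Hamming weight mod 8 (residue 0 collects both weight 0 and weight 8), which a few lines of Python reproduce:

```python
from math import comb

# Count 8-bit inputs by Hamming weight mod 8: residue r collects C(8, h)
# for every Hamming weight h in 0..8 with h % 8 == r.
counts = [0] * 8
for h in range(9):
    counts[h % 8] += comb(8, h)

print(counts)       # [2, 8, 28, 56, 70, 56, 28, 8]
print(sum(counts))  # 256
```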
## Citation

```bibtex
@software{tiny_mod8_verified_2025,
  title={tiny-mod8-verified: Formally Verified MOD-8 Circuit},
  author={Norton, Charles},
  url={https://huggingface.co/phanerozoic/tiny-mod8-verified},
  year={2025}
}
```