tiny-mod8-verified

Formally verified MOD-8 circuit: a single-layer threshold network that computes the Hamming weight of its input modulo 8 with 100% accuracy.

Architecture

Component    Value
Inputs       8
Outputs      1 (per residue class)
Neurons      8 (one per residue 0-7)
Parameters   72 (8 × 9)
Weights      [1, 1, 1, 1, 1, 1, 1, -7]
Bias         0
Activation   Heaviside step
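The shape above (8 inputs, 8 threshold neurons, 72 parameters) can be sketched in PyTorch as follows. This is a minimal illustration of the stated architecture, not the released checkpoint; the class name and initialization are placeholders:

```python
import torch
import torch.nn as nn

class Mod8Circuit(nn.Module):
    """Single threshold layer: 8 inputs -> 8 residue neurons (illustrative sketch)."""
    def __init__(self):
        super().__init__()
        # 8 x 8 weights + 8 biases = 72 parameters, matching the table above
        self.linear = nn.Linear(8, 8)

    def forward(self, bits: torch.Tensor) -> torch.Tensor:
        # Heaviside step activation: 1.0 where the pre-activation is >= 0, else 0.0
        return torch.heaviside(self.linear(bits), values=torch.ones(1))

model = Mod8Circuit()
out = model(torch.zeros(8))
print(out.shape)  # torch.Size([8]) -- one output per residue class
```

With the verified weights loaded in place of the random initialization, exactly one of the 8 outputs fires for each input.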

Key Properties

  • 100% accuracy (256/256 inputs correct)
  • Coq-proven correctness
  • Algebraic weight pattern: resets every 8 positions
  • Computes Hamming weight mod 8
  • Compatible with neuromorphic hardware

Algebraic Pattern

MOD-8 uses an all-ones weight pattern with a reset at position 8:

  • Positions 1-7: weight = 1
  • Position 8: weight = 1-8 = -7

This creates a cumulative sum that cycles mod 8.
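The cycling behaviour is easy to check numerically. In this sketch, mod8_weights is an illustrative helper (not part of the released model) that generates the pattern for any length:

```python
def mod8_weights(n: int) -> list[int]:
    """Weight pattern: 1 at every position, except each 8th position,
    which resets with 1 - 8 = -7."""
    return [-7 if (i + 1) % 8 == 0 else 1 for i in range(n)]

pattern = mod8_weights(16)
print(pattern[:8])  # [1, 1, 1, 1, 1, 1, 1, -7]

# Over an all-ones input, the running sum always equals (positions seen) mod 8
running = 0
for i, w in enumerate(pattern, start=1):
    running += w  # each input bit is 1, so it contributes its weight
    assert running == i % 8
```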

Usage

import torch
from safetensors.torch import load_file

weights = load_file('mod8.safetensors')

def mod8_circuit(bits):
    # bits: list of 8 binary values (0 or 1)
    inputs = torch.tensor([float(b) for b in bits])
    # Pre-activation of the threshold neuron: dot product with the
    # [1, 1, 1, 1, 1, 1, 1, -7] weight vector, plus the zero bias
    weighted_sum = (inputs * weights['weight']).sum() + weights['bias']
    return weighted_sum.item()

# Test
print(mod8_circuit([1, 1, 1, 1, 1, 1, 1, 1]))  # 8 ones -> 8 mod 8 = 0
print(mod8_circuit([1, 1, 1, 1, 1, 1, 1, 0]))  # 7 ones -> 7 mod 8 = 7

Verification

Coq Theorem:

Theorem mod8_correct_residue_0 : forall x0 x1 x2 x3 x4 x5 x6 x7,
  mod8_is_zero [x0; x1; x2; x3; x4; x5; x6; x7] =
  Z.eqb ((Z.of_nat (hamming_weight [x0; x1; x2; x3; x4; x5; x6; x7])) mod 8) 0.

Proven axiom-free using algebraic weight patterns.
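As a lightweight sanity check (not a substitute for the Coq proof), the residue-0 statement can be brute-forced in Python over all 256 inputs; hamming_weight here simply mirrors the definition used in the theorem:

```python
from itertools import product

def hamming_weight(bits):
    # Number of 1-bits in the input
    return sum(bits)

# Inputs whose Hamming weight is divisible by 8: residue class 0
zero_residue = [bits for bits in product((0, 1), repeat=8)
                if hamming_weight(bits) % 8 == 0]

# Only the all-zeros and all-ones inputs land in residue 0
print(len(zero_residue))  # 2
assert zero_residue == [(0,) * 8, (1,) * 8]
```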

Full proof: coq-circuits/Modular/Mod8.v

Residue Distribution

For 8-bit inputs (256 total):

  • Residue 0: 2 inputs
  • Residue 1: 8 inputs
  • Residue 2: 28 inputs
  • Residue 3: 56 inputs
  • Residue 4: 70 inputs
  • Residue 5: 56 inputs
  • Residue 6: 28 inputs
  • Residue 7: 8 inputs
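These counts follow directly from binomial coefficients: residue r collects the 8-bit inputs whose Hamming weight w (0 ≤ w ≤ 8) satisfies w mod 8 = r, so residue 0 gets both the all-zeros and all-ones inputs. A quick check:

```python
from math import comb

# Number of 8-bit inputs whose Hamming weight w satisfies w % 8 == r
distribution = {r: sum(comb(8, w) for w in range(9) if w % 8 == r)
                for r in range(8)}

print(distribution)
# {0: 2, 1: 8, 2: 28, 3: 56, 4: 70, 5: 56, 6: 28, 7: 8}
assert sum(distribution.values()) == 256
```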

Citation

@software{tiny_mod8_verified_2025,
  title={tiny-mod8-verified: Formally Verified MOD-8 Circuit},
  author={Norton, Charles},
  url={https://huggingface.co/phanerozoic/tiny-mod8-verified},
  year={2025}
}