tiny-FullAdder-verified

Formally verified full adder circuit. Adds three 1-bit inputs (a, b, carry_in), producing sum and carry_out outputs with 100% accuracy.

Architecture

Component    Value
Inputs       3 (a, b, carry_in)
Outputs      2 (sum, carry_out)
Neurons      9 (2 HalfAdders + OR)
Parameters   27
Depth        3 layers
Activation   Heaviside step

Key Properties

  • 100% accuracy (8/8 input combinations correct)
  • Coq-proven correctness
  • Compositional design: 2 HalfAdders + OR gate
  • Building block for ripple-carry adders
  • Compatible with neuromorphic hardware

Circuit Design

Composed from two half adders and an OR gate:

HalfAdder1: (a, b) -> (s1, c1)
HalfAdder2: (s1, carry_in) -> (sum, c2)
carry_out = OR(c1, c2)

Components:

  • 2 HalfAdders: 4 neurons each (XOR + AND)
  • 1 OR gate: 1 neuron
  • Total: 9 neurons, 27 parameters, depth 3
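Each half adder builds XOR from an OR neuron and a NAND neuron feeding a final AND, plus a separate AND neuron for its carry, which is the structure the weight names in the Usage code below reflect. A minimal pure-Python sketch with hand-picked threshold weights (illustrative values chosen here, not the parameters shipped in the safetensors file):

```python
def heaviside(x):
    # Step activation: fires (1) when the weighted sum is non-negative
    return 1 if x >= 0 else 0

def gate(w1, w2, bias, a, b):
    # One threshold neuron over two binary inputs: 2 weights + 1 bias
    return heaviside(w1 * a + w2 * b + bias)

def half_adder(a, b):
    # XOR(a, b) = AND(OR(a, b), NAND(a, b)) -- three neurons for the sum
    or_out = gate(1, 1, -1, a, b)        # fires unless both inputs are 0
    nand_out = gate(-1, -1, 1.5, a, b)   # fires unless both inputs are 1
    s = gate(1, 1, -2, or_out, nand_out) # AND of the two
    # Fourth neuron: carry = AND(a, b)
    c = gate(1, 1, -2, a, b)
    return s, c
```

With 3 parameters per neuron (2 weights + 1 bias), the 9 neurons account for the 27 parameters listed above.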

Usage

import torch
from safetensors.torch import load_file

weights = load_file('fulladder.safetensors')

def heaviside(x):
    # Step activation: 1 if the pre-activation is non-negative, else 0
    return int(x >= 0)

def full_adder(a, b, carry_in):
    # HalfAdder1: (a, b) -> (s1, c1)
    inputs_ha1 = torch.tensor([float(a), float(b)])

    or_out1 = heaviside((inputs_ha1 * weights['ha1.sum.layer1.or.weight']).sum() +
                        weights['ha1.sum.layer1.or.bias'])
    nand_out1 = heaviside((inputs_ha1 * weights['ha1.sum.layer1.nand.weight']).sum() +
                          weights['ha1.sum.layer1.nand.bias'])
    layer1_outs1 = torch.tensor([float(or_out1), float(nand_out1)])
    s1 = heaviside((layer1_outs1 * weights['ha1.sum.layer2.weight']).sum() +
                   weights['ha1.sum.layer2.bias'])
    c1 = heaviside((inputs_ha1 * weights['ha1.carry.weight']).sum() +
                   weights['ha1.carry.bias'])

    # HalfAdder2: (s1, carry_in) -> (sum, c2)
    inputs_ha2 = torch.tensor([float(s1), float(carry_in)])

    or_out2 = heaviside((inputs_ha2 * weights['ha2.sum.layer1.or.weight']).sum() +
                        weights['ha2.sum.layer1.or.bias'])
    nand_out2 = heaviside((inputs_ha2 * weights['ha2.sum.layer1.nand.weight']).sum() +
                          weights['ha2.sum.layer1.nand.bias'])
    layer1_outs2 = torch.tensor([float(or_out2), float(nand_out2)])
    sum_out = heaviside((layer1_outs2 * weights['ha2.sum.layer2.weight']).sum() +
                        weights['ha2.sum.layer2.bias'])
    c2 = heaviside((inputs_ha2 * weights['ha2.carry.weight']).sum() +
                   weights['ha2.carry.bias'])

    # Carry out: OR(c1, c2)
    carry_inputs = torch.tensor([float(c1), float(c2)])
    carry_out = heaviside((carry_inputs * weights['carry_or.weight']).sum() +
                          weights['carry_or.bias'])

    return sum_out, carry_out

# Test
print(full_adder(0, 0, 0))  # (0, 0)
print(full_adder(1, 1, 0))  # (0, 1) - 1+1=2, sum=0, carry=1
print(full_adder(1, 1, 1))  # (1, 1) - 1+1+1=3, sum=1, carry=1

Truth Table

a  b  cin   sum  cout
0  0  0     0    0
0  0  1     1    0
0  1  0     1    0
0  1  1     0    1
1  0  0     1    0
1  0  1     0    1
1  1  0     0    1
1  1  1     1    1
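Every row of the table follows from the arithmetic identity a + b + cin = 2·cout + sum. A self-contained check (`full_adder_spec` is an illustrative helper defined here, not part of the shipped model):

```python
from itertools import product

def full_adder_spec(a, b, cin):
    # Reference semantics: binary addition of three bits
    total = a + b + cin
    return total % 2, total // 2  # (sum, carry_out)

# Regenerate and verify all 8 rows of the truth table
for a, b, cin in product([0, 1], repeat=3):
    s, cout = full_adder_spec(a, b, cin)
    assert a + b + cin == 2 * cout + s
    print(a, b, cin, s, cout)
```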

Verification

Coq Theorem:

Theorem full_adder_correct : forall a b c,
  full_adder a b c = full_adder_spec a b c.

Where full_adder_spec computes:

  • sum = a XOR b XOR c
  • carry_out = (a AND b) OR (c AND (a XOR b))
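The two boolean formulas agree with plain binary addition of the three bits, which can be confirmed by enumeration (`spec_bool` is an illustrative name, not taken from the Coq development):

```python
from itertools import product

def spec_bool(a, b, c):
    # Boolean form of the spec: sum and carry as logic formulas
    s = a ^ b ^ c
    cout = (a & b) | (c & (a ^ b))
    return s, cout

# The boolean spec matches arithmetic addition on all 8 inputs
for a, b, c in product([0, 1], repeat=3):
    s, cout = spec_bool(a, b, c)
    assert (s, cout) == ((a + b + c) % 2, (a + b + c) // 2)
```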

The proof is axiom-free and additionally establishes:

  • Commutativity in first two arguments
  • Equivalence to half adder when carry_in = false
  • Exhaustive correctness on all 8 input combinations

Full proof: coq-circuits/Arithmetic/FullAdder.v

Applications

  • Building block for multi-bit ripple-carry adders
  • Component of ALU designs
  • Foundational for all integer arithmetic circuits
  • Educational hardware design
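Chaining full adders bit by bit yields a ripple-carry adder. A minimal sketch using a pure-Python stand-in for the verified network (in practice the `full_adder` function from the Usage section would take its place):

```python
def full_adder_bits(a, b, cin):
    # Stand-in for the verified full adder: binary addition of three bits
    total = a + b + cin
    return total % 2, total // 2

def ripple_carry_add(xs, ys):
    # Add two equal-length little-endian bit lists by chaining full adders:
    # each stage's carry_out feeds the next stage's carry_in
    assert len(xs) == len(ys)
    carry, out = 0, []
    for a, b in zip(xs, ys):
        s, carry = full_adder_bits(a, b, carry)
        out.append(s)
    return out, carry

# 3 + 6 = 9 on 4-bit little-endian operands
bits, carry = ripple_carry_add([1, 1, 0, 0], [0, 1, 1, 0])
# bits == [1, 0, 0, 1] (i.e. 9), carry == 0
```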

Citation

@software{tiny_fulladder_verified_2025,
  title={tiny-FullAdder-verified: Formally Verified Full Adder},
  author={Norton, Charles},
  url={https://huggingface.co/phanerozoic/tiny-FullAdder-verified},
  year={2025}
}