---
license: mit
tags:
- pytorch
- safetensors
- threshold-logic
- neuromorphic
- bit-manipulation
- ffs
---

# threshold-ffs8
8-bit find first set (FFS). Returns the 1-indexed position of the least significant set bit. Returns 0 if no bits are set.
## Circuit

```
 x7  x6  x5  x4  x3  x2  x1  x0
  │   │   │   │   │   │   │   │
  └───┴───┴───┴───┴───┴───┴───┘
                │
                ▼
    ┌───────────────────────┐
    │   CTZ + 1 (if set)    │
    │   or 0 (if all zero)  │
    └───────────────────────┘
                │
                ▼
        [f3, f2, f1, f0]
        (position 0-8)
```
## Function

```
ffs8(x7..x0) -> (f3, f2, f1, f0)

position = 8*f3 + 4*f2 + 2*f1 + f0

if input == 0: position = 0
if input != 0: position = CTZ(input) + 1
```

FFS is 1-indexed: the first bit (x0) is position 1, not 0.
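The mapping is easy to sanity-check against a standard bit-trick reference in plain Python (independent of the model; `ffs8_ref` is a name chosen here for illustration):

```python
def ffs8_ref(x: int) -> int:
    """Reference FFS for an 8-bit input: 1-indexed position of the
    least significant set bit, or 0 when no bit is set."""
    # x & -x isolates the lowest set bit; the bit_length() of that
    # power of two is exactly position + 1, and 0 when x == 0.
    return (x & -x).bit_length()

print(ffs8_ref(0x00))  # 0
print(ffs8_ref(0x04))  # 3
print(ffs8_ref(0xFF))  # 1
```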
## Truth Table (selected)
| Input (hex) | Binary | FFS | f3 f2 f1 f0 | Meaning |
|---|---|---|---|---|
| 0x00 | 00000000 | 0 | 0 0 0 0 | No bits set |
| 0x01 | 00000001 | 1 | 0 0 0 1 | Bit 0 is first |
| 0x02 | 00000010 | 2 | 0 0 1 0 | Bit 1 is first |
| 0x04 | 00000100 | 3 | 0 0 1 1 | Bit 2 is first |
| 0x08 | 00001000 | 4 | 0 1 0 0 | Bit 3 is first |
| 0x10 | 00010000 | 5 | 0 1 0 1 | Bit 4 is first |
| 0x20 | 00100000 | 6 | 0 1 1 0 | Bit 5 is first |
| 0x40 | 01000000 | 7 | 0 1 1 1 | Bit 6 is first |
| 0x80 | 10000000 | 8 | 1 0 0 0 | Bit 7 is first |
| 0xFF | 11111111 | 1 | 0 0 0 1 | Bit 0 is first |
## Relationship to CTZ

```
if input == 0:
    FFS = 0
else:
    FFS = CTZ + 1
```
FFS and CTZ are closely related:
- CTZ returns 0-7 for positions, 8 for zero input
- FFS returns 1-8 for positions, 0 for zero input
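The relationship can be exercised directly in Python (a minimal sketch; `ctz8` and `ffs8` here are loop-based stand-ins, not the network itself):

```python
def ctz8(x: int) -> int:
    # Count trailing zeros of an 8-bit value; 8 for a zero byte,
    # matching the CTZ convention above.
    if x == 0:
        return 8
    n = 0
    while x & 1 == 0:
        x >>= 1
        n += 1
    return n

def ffs8(x: int) -> int:
    # FFS derived from CTZ exactly as stated: 0 for zero input, else CTZ + 1.
    return 0 if x == 0 else ctz8(x) + 1

for x in (0x00, 0x01, 0x08, 0xFF):
    print(f"{x:#04x}: ctz={ctz8(x)} ffs={ffs8(x)}")
```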
## Mechanism

**Position detectors:** same as CTZ, these detect where the first 1 appears, scanning from the LSB.
| Signal | Fires when |
|---|---|
| p0 | x0 = 1 |
| p1 | x0 = 0, x1 = 1 |
| p2 | x0 = x1 = 0, x2 = 1 |
| ... | ... |
| p7 | x0..x6 = 0, x7 = 1 |
**Output encoding:** direct binary encoding of position + 1.
- f0 = p0 OR p2 OR p4 OR p6 (positions 0,2,4,6 β FFS 1,3,5,7)
- f1 = p1 OR p2 OR p5 OR p6 (positions 1,2,5,6 β FFS 2,3,6,7)
- f2 = p3 OR p4 OR p5 OR p6 (positions 3,4,5,6 β FFS 4,5,6,7)
- f3 = p7 (position 7 β FFS 8)
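The two-layer mechanism can be sketched with plain threshold neurons. The weights below are illustrative choices that realize the detectors and ORs described above; they are not the actual values stored in `model.safetensors`. Note the neuron count matches the parameter table: 8 detectors + 4 encoders = 12.

```python
def step(weighted_sum: float, threshold: float) -> int:
    # Threshold neuron: fires (1) when the weighted input meets the threshold.
    return 1 if weighted_sum >= threshold else 0

def ffs8_threshold(bits):
    """Two-layer threshold-logic FFS8 sketch (illustrative weights)."""
    x = list(bits)  # [x0..x7], LSB first
    # Layer 1: position detectors. p_k fires iff x_k = 1 and x_0..x_{k-1} = 0.
    # Weight x_k by +1, each lower bit by -1; threshold 1 realizes the AND-NOT.
    p = [step(x[k] - sum(x[:k]), 1) for k in range(8)]
    # Layer 2: binary encoding of position + 1 as 4 OR neurons (threshold 1).
    f0 = step(p[0] + p[2] + p[4] + p[6], 1)
    f1 = step(p[1] + p[2] + p[5] + p[6], 1)
    f2 = step(p[3] + p[4] + p[5] + p[6], 1)
    f3 = step(p[7], 1)
    return 8 * f3 + 4 * f2 + 2 * f1 + f0

print(ffs8_threshold([0, 0, 1, 0, 0, 0, 0, 0]))  # 3
print(ffs8_threshold([0, 0, 0, 0, 0, 0, 0, 0]))  # 0
```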
## Parameters

| Property | Value |
|---|---|
| Inputs | 8 |
| Outputs | 4 |
| Neurons | 12 |
| Layers | 2 |
| Parameters | 76 |
| Magnitude | 48 |
## Usage

```python
from safetensors.torch import load_file
import torch

w = load_file('model.safetensors')

def ffs8(bits):
    # bits = [x0, x1, ..., x7] (LSB first)
    inp = torch.tensor([float(b) for b in bits])
    # ... (see model.py for full implementation)

# Examples
print(ffs8([1,0,0,0,0,0,0,0]))  # 1 (first bit set is position 0)
print(ffs8([0,0,1,0,0,0,0,0]))  # 3 (first bit set is position 2)
print(ffs8([0,0,0,0,0,0,0,0]))  # 0 (no bits set)
```
## Applications
- POSIX ffs() function implementation
- Finding available slots in bitmaps
- Scheduler ready queue processing
- Memory allocator free list management
- Interrupt priority handling
## Comparison: FFS vs CTZ vs CLZ
| Function | Zero input | Non-zero input | Index base |
|---|---|---|---|
| FFS | 0 | 1 to 8 | 1-indexed |
| CTZ | 8 | 0 to 7 | 0-indexed |
| CLZ | 8 | 0 to 7 | 0-indexed from MSB |
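The table above can be reproduced for any 8-bit value with three short Python helpers (illustrative names, using standard bit tricks rather than the model):

```python
def ffs(x: int) -> int:
    # 1-indexed position of the lowest set bit; 0 for zero input.
    return (x & -x).bit_length()

def ctz(x: int) -> int:
    # Trailing zeros from the LSB; 8 for a zero byte.
    return (x & -x).bit_length() - 1 if x else 8

def clz(x: int) -> int:
    # Leading zeros from the MSB of an 8-bit value; 8 for a zero byte.
    return 8 - x.bit_length()

for x in (0x00, 0x01, 0x10, 0x80):
    print(f"{x:#04x}: ffs={ffs(x)} ctz={ctz(x)} clz={clz(x)}")
```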
## Files

```
threshold-ffs8/
├── model.safetensors
├── model.py
├── create_safetensors.py
├── config.json
└── README.md
```
## License

MIT