An experimental deep-tech and software company (Delaware C-corp, incorporated May 2026). We invent and ship across the full breadth of tech and software: deep research, runtime systems, novel substrates, infrastructure, hardware-adjacent stacks, and software products that don't fit anywhere else yet. UltraCompress, lossless 5-bit transformer compression, is our first flagship publicly shipped product. More products are in flight.

v0.6.2 on PyPI · 22 architectures verified · USPTO 64/049,511 + 64/049,517 · BUSL-1.1 + Additional Use Grant
Production-grade lossless 5-bit transformer compression across 22 architectures, dense + Mixture-of-Experts + state-space (Mamba), 0.6B to 405B parameters. Mathematically lossless customer-side reconstruction: SHA-256 over reconstructed tensor bytes matches the trainer's measurement, verified by uc verify.
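The verification idea can be sketched in a few lines of stdlib Python. This is not `uc verify`'s implementation, just the check it performs: hash the raw bytes of the reconstructed tensors and compare against the digest the trainer recorded. Function names here are illustrative.

```python
import hashlib


def tensor_digest(tensor_bytes: bytes) -> str:
    # SHA-256 over the raw reconstructed tensor bytes.
    return hashlib.sha256(tensor_bytes).hexdigest()


def verify(reconstructed: bytes, trainer_digest: str) -> bool:
    # Lossless means the customer-side digest matches the
    # trainer's measurement bit-for-bit; any deviation fails.
    return tensor_digest(reconstructed) == trainer_digest
```

A single flipped byte anywhere in the reconstruction changes the digest, so a passing check implies byte-exact reconstruction.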
(perplexity ratio = compressed PPL / bf16 baseline PPL; FineWeb-edu held-out tail; seq_len = 1024; n = 30-50; seed = 42)
| Model | Params | Type | PPL ratio | Notes |
|---|---|---|---|---|
| Phi-3-mini-4k-instruct | 3.8B | dense | 1.00262× | seq_len=128 caveat |
| Mixtral-8x7B | 47B | MoE | 1.00368× | tightest MoE result |
| Qwen3-1.7B-Base | 1.7B | dense | 1.00401× | small-decoder record |
| Qwen3-14B | 14B | dense | 1.00403× | 14B-class record |
| Yi-1.5-9B | 8.8B | dense | 1.00414× | >8B record |
| Qwen3-8B | 8B | dense | 1.00440× | 8B-class record |
| Mistral-7B-v0.3 | 7B | dense | 1.00548× | NEW this week; 9.16× tighter than prior result |
| Hermes-3-Llama-3.1-405B | 405B | dense | 1.0066× | largest dense 5-bit lossless on the Hub |
| Qwen3-0.6B | 0.6B | dense | 1.0069× | |
| OLMo-2-0425-1B | 1B | dense | 1.0073× | |
| SmolLM2-1.7B-Instruct | 1.7B | dense | 1.0075× | |
| Mamba-2.8B | 2.8B | SSM | 1.0119× | first published 5-bit lossless on a state-space model |
| Llama-3.1-8B | 8B | dense | 1.0125× | standard eval |
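The PPL-ratio column above follows directly from the definition on the caption line. A minimal sketch, with hypothetical per-token negative log-likelihoods standing in for real eval runs:

```python
import math


def perplexity(nlls: list[float]) -> float:
    # Perplexity = exp(mean token-level negative log-likelihood).
    return math.exp(sum(nlls) / len(nlls))


def ppl_ratio(compressed_nlls: list[float], baseline_nlls: list[float]) -> float:
    # The ratio reported in the table: compressed PPL / bf16 baseline PPL.
    return perplexity(compressed_nlls) / perplexity(baseline_nlls)


# Toy numbers (hypothetical, not from the benchmark):
ratio = ppl_ratio([2.31, 2.28, 2.35], [2.30, 2.28, 2.34])  # ≈ 1.00669×
```

A ratio of exactly 1.0 would mean the compressed model's eval loss matches the bf16 baseline; the table's values of 1.003-1.013× correspond to mean-NLL gaps on the order of 0.003-0.013 nats per token.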
Source + reproduce: github.com/sipsalabs/ultracompress · full benchmark page: sipsalabs.com/inference
```shell
pip install ultracompress
hf download SipsaLabs/qwen3-8b-uc-v3-bpw5 --local-dir ./qwen3-8b
uc bench ./qwen3-8b
```
api.sipsalabs.com is live and OpenAI-compatible:

```shell
# The official `openai` SDK works unchanged.
export OPENAI_BASE_URL=https://api.sipsalabs.com/v1
```
Pricing tiers + Compression-as-a-Service contracts at sipsalabs.com/pricing.
ultracompress v0.6+ is licensed under BUSL-1.1 with an Additional Use Grant: free for companies under $1M ARR, for research, and for individuals. Each release auto-converts to Apache 2.0 four years after publication. Legacy releases: legacy/0.5.x.

sipsalabs.com · github.com/sipsalabs · @SipsaLabs on X
Public verifier dashboard: sipsalabs.com/inference · Selective Disclosure Charter (what we publish vs. what we keep internal): see github.com/sipsalabs/ultracompress