Sipsa Labs, Inc. update (2026-05-11): UltraCompress v0.6.2 is on PyPI under BUSL-1.1 + Additional Use Grant (free for sub-$1M ARR companies, research, and individuals; auto-converts to Apache 2.0 four years post-release). OpenAI-compatible inference at api.sipsalabs.com. 22 architectures verified, 0.6B–405B parameters, sub-1.005× perplexity ratio on Mixtral-8x7B / Qwen3-14B / Mistral-7B. Live discussion on Hacker News. Commercial inquiries: founder@sipsalabs.com.



---
license: apache-2.0
library_name: transformers
tags:
- compression
- ultracompress
- streaming-compression
- sipsa-labs
---

Gated artifact. This compressed model is part of Sipsa Labs' production tier (>10B parameter class). Click "Request access" to submit your use case; Sip approves manually within 24h. Free for sub-$1M ARR companies, research, and individuals (BUSL-1.1 Additional Use Grant). Above $1M ARR shipping in production: see pricing or email founder@sipsalabs.com.

# phi-3.5-moe-instruct-streaming-bpw5

Compressed via the UltraCompress end-to-end pipeline (per-layer streaming + correction overlay) on a single 32GB GPU. Mean cross-architecture PPL_r across the 9-architecture matrix: 1.0066.
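The per-layer streaming idea can be illustrated generically. This is a toy sketch, not Sipsa's actual codec (whose internals are not public): the point is that compressing one layer at a time keeps peak memory at a single layer, which is how very large models can be processed on one 32GB GPU. `fake_quantize` is a hypothetical uniform-quantization stand-in.

```python
import numpy as np

def fake_quantize(weights: np.ndarray, bits: int = 5) -> np.ndarray:
    """Toy uniform round-trip quantization to `bits` bits per weight.
    Stand-in for a real codec; reconstruction error is at most scale/2."""
    lo, hi = float(weights.min()), float(weights.max())
    levels = 2 ** bits - 1
    scale = (hi - lo) / levels if hi > lo else 1.0
    q = np.round((weights - lo) / scale)
    return q * scale + lo

def stream_compress(layers):
    """Yield compressed layers one at a time; only one layer is ever resident
    in memory (in practice: load each shard from disk, write to the output)."""
    for w in layers:
        yield fake_quantize(w)

layers = [np.random.randn(4, 4).astype(np.float32) for _ in range(3)]
compressed = list(stream_compress(layers))
```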

See compression_report.json for baseline / compressed / PPL_r details.
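If you want to sanity-check a report yourself, PPL_r is simply the compressed-to-baseline perplexity ratio; a value near 1.0 means compression barely changed modeling quality. A minimal sketch, with hypothetical field names and example values, since the actual schema of compression_report.json is not documented here:

```python
# Field names and values below are assumptions for illustration only;
# consult the real compression_report.json for the actual schema.
report = {
    "baseline_ppl": 6.021,    # perplexity of the original model
    "compressed_ppl": 6.047,  # perplexity after compression
}

def ppl_ratio(baseline: float, compressed: float) -> float:
    """Perplexity ratio PPL_r = compressed PPL / baseline PPL."""
    return compressed / baseline

r = ppl_ratio(report["baseline_ppl"], report["compressed_ppl"])
print(f"PPL_r = {r:.4f}")
```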

## Reproduce

```shell
# trainer-side flags (--rank, --block_size, --train_steps) are NDA-gated;
# contact founder@sipsalabs.com for trainer access.
pip install ultracompress
uc compress --hf-id ... --bpw 5 --rank <r> --output ./out.uc
```

Codec internals and the training procedure are covered by patent provisionals USPTO 64/049,511 and 64/049,517 (filed 2026-04-25).
