How to use from Docker Model Runner

docker model run hf.co/limcheekin/BitNet-1.58bit-Models
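A minimal sketch of the workflow, assuming Docker Desktop with Model Runner enabled. The `curl` call targets the OpenAI-compatible endpoint Docker Model Runner can expose on the host; the port (12434) and path are taken from Docker's documentation and may differ depending on how host-side TCP access is configured.

```shell
# Pull the model ahead of time (optional; `docker model run` pulls on demand)
docker model pull hf.co/limcheekin/BitNet-1.58bit-Models

# Start an interactive chat session with the model
docker model run hf.co/limcheekin/BitNet-1.58bit-Models

# Alternatively, query the OpenAI-compatible API that Model Runner exposes.
# Assumption: host-side TCP access is enabled; 12434 is the default port.
curl http://localhost:12434/engines/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "hf.co/limcheekin/BitNet-1.58bit-Models",
        "messages": [{"role": "user", "content": "Hello"}]
      }'
```

The interactive `docker model run` form is the simplest way to verify the model loads; the HTTP form is what you would use from application code.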

No model card has been provided for this repository.

Downloads last month: 61
Format: GGUF
Model size: 2B params
Architecture: bitnet-b1.58


Inference Providers: this model is not currently deployed by any Inference Provider.