---
license: apache-2.0
---
# switch-base-128_qmoe
This is the [google/switch-base-128](https://huggingface.co/google/switch-base-128) model, quantized to ternary precision with the QMoE framework and stored in QMoE's custom further-compressed format.
| Please see the [QMoE repository](https://github.com/IST-DASLab/qmoe) for how to use this model. |