MLC version of arcee-ai/arcee-lite, using q0f16 quantization.
Model tree for Felladrin/mlc-q0f16-arcee-lite
Base model: arcee-ai/arcee-lite