This is an INT4-quantized version of Fara-7B built for x86 CPUs. You can deploy it on CPU-only devices.

Note: this is an unofficial build, intended for testing and development only.
