This is an INT4-quantized version of the Fara-7B model, built for x86 CPUs. You can deploy it on CPU-only devices.
Note: this is an unofficial build, intended for testing and development only.
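
As a minimal sketch of CPU deployment, the snippet below loads the model with the Hugging Face `transformers` library and runs a short generation. The repository id `your-username/fara-7b-int4` is a placeholder, and whether this checkpoint loads directly through `transformers` (rather than a dedicated INT4 runtime) is an assumption; adapt it to this repo's actual id and format.

```python
# Hypothetical loading sketch -- the repo id below is a placeholder, not this model's real id.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/fara-7b-int4"  # placeholder: replace with this repository's id
tokenizer = AutoTokenizer.from_pretrained(model_id)
# device_map="cpu" keeps all weights on the CPU, matching the x86-only target of this build.
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="cpu")

prompt = "Hello, world."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Since this is an unofficial test build, expect to verify output quality yourself before relying on it.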