Some GGML quantizations of TII's tiiuae/falcon-40b base model, for use with ggllm.cpp.