Does the model support deployment with vLLM?
Unfortunately, we don't currently support vLLM. I looked into integrating it, but it would require fundamental changes to the model's serving path, so we most likely won't add it.