vLLM support for MiniCPM-V 4.5

#1
by RichardDeetlefs - opened

When attempting to run openbmb/MiniCPM-V-4_5-AWQ with vLLM (vllm/vllm-openai:v0.10.1) in a Docker container, the server fails to start with the following error:

ValueError: Currently, MiniCPMV only supports versions 2.0, 2.5, 2.6, 4.0. Got version: (4, 5)

This indicates that vLLM v0.10.1 does not support MiniCPM-V 4.5, causing a WorkerProc initialization failure. The issue occurs with the following configuration: --model openbmb/MiniCPM-V-4_5-AWQ, --tensor-parallel-size 2, --max-model-len 32768, --load-format safetensors, --trust-remote-code.
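For reference, the failing configuration above corresponds to a docker run invocation along these lines. This is a sketch reconstructed from the listed flags; the GPU flag, port mapping, and cache-volume mount are assumptions and not taken from the original report.

```shell
# Sketch of the failing setup (reconstructed from the flags in this report).
# --gpus, -p, and -v are assumed defaults, not confirmed by the reporter.
docker run --gpus all \
    -p 8000:8000 \
    -v ~/.cache/huggingface:/root/.cache/huggingface \
    vllm/vllm-openai:v0.10.1 \
    --model openbmb/MiniCPM-V-4_5-AWQ \
    --tensor-parallel-size 2 \
    --max-model-len 32768 \
    --load-format safetensors \
    --trust-remote-code
```

With this image tag the server exits during worker initialization with the version error shown above, since the MiniCPM-V loader in v0.10.1 only recognizes versions up to 4.0.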

Either a newer vLLM release or a patch adding MiniCPM-V 4.5 to the supported version list is needed.

RichardDeetlefs changed discussion status to closed
