Tokenizer mismatch when serving with vLLM

#6
by one-man-won - opened

I'm encountering a tokenizer issue when I'm serving the model with vLLM.

  • ValueError: Tokenizer class TokenizersBackend does not exist or is not currently imported.

command: vllm serve "LiquidAI/LFM2.5-VL-1.6B" --trust-remote-code

Versions:

  • tokenizers==0.22.2
  • transformers==4.57.6
  • vllm==0.15.1

Tried upgrading transformers to >=5; the error persists.

Having the same issue with the default vLLM installation (cu128).

The recommended setup in their vLLM notebook also had package version compatibility issues.

This is likely a dependency conflict: vLLM 0.15.1 pins or pulls in a specific transformers version that gets overwritten when you upgrade separately. Reinstalling in the right order should fix it:

uv pip install "vllm==0.15.1"
uv pip install "transformers==5.0.0"

Then activate your virtual env and run:

vllm serve "LiquidAI/LFM2.5-VL-1.6B" --trust-remote-code

The key is installing transformers after vLLM so it doesn't get downgraded by vLLM's dependency resolver. This worked for me with the same setup.
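If you want to confirm what the resolver actually left installed before serving, a minimal sketch like this reads the versions straight from package metadata (the package names match the thread; the helper function is just for illustration):

```python
from importlib.metadata import version, PackageNotFoundError

def installed_versions(pkgs):
    """Return {distribution: version string, or None if not installed}."""
    out = {}
    for pkg in pkgs:
        try:
            out[pkg] = version(pkg)
        except PackageNotFoundError:
            out[pkg] = None  # the resolver removed it, or it was never installed
    return out

# Expect transformers to report 5.0.0 after the two-step install above.
print(installed_versions(("vllm", "transformers", "tokenizers")))
```

Running this right after the two `uv pip install` steps makes it obvious whether vLLM's resolver downgraded transformers again.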
