Cannot load the model in vLLM 0.17.1 (latest stable)
I tried to load this model in vLLM 0.17.1 (latest stable) and got this error:
Tokenizer class TokenizersBackend does not exist or is not currently imported.
The original, unquantized Qwen/Qwen3.5-27B has this in its tokenizer_config.json:
"tokenizer_class": "Qwen2Tokenizer"
This Intel/Qwen3.5-27B-int4-AutoRound has this in its tokenizer_config.json:
"tokenizer_class": "TokenizersBackend"
But vLLM 0.17.1 requires transformers >= 4.56.0, < 5, and the new TokenizersBackend isn't handled there.
Please ignore vLLM's requirement and upgrade transformers to the latest version.
Thanks, but for now I tried a simpler solution.
I cloned the repo locally and replaced "TokenizersBackend" with "Qwen2Tokenizer" in the tokenizer_config.json... and vLLM 0.17.1 loaded the model without errors.
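For anyone wanting to script the same workaround, here is a minimal sketch; the function name and the assumption that the repo is already cloned to a local directory are mine:

```python
import json
from pathlib import Path

def patch_tokenizer_class(repo_dir: str) -> bool:
    """Rewrite tokenizer_class in tokenizer_config.json so vLLM 0.17.1
    can resolve it; returns True if a change was made."""
    cfg_path = Path(repo_dir) / "tokenizer_config.json"
    cfg = json.loads(cfg_path.read_text())
    if cfg.get("tokenizer_class") == "TokenizersBackend":
        # Use the class from the original, unquantized Qwen model
        cfg["tokenizer_class"] = "Qwen2Tokenizer"
        cfg_path.write_text(json.dumps(cfg, indent=2))
        return True
    return False
```

Then point vLLM at the patched local directory instead of the hub repo ID.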