Fix: Chat template field in tokenizer_config.json
The error occurred while parsing the request payload, so I looked into the configuration files. I found that the 'chat_template' field in 'tokenizer_config.json' differed from the template defined in 'chat_template.json'.
Since this is a multimodal model, the chat_template needs to explicitly specify how to parse multimodal inputs. However, the value in 'tokenizer_config.json' contained no multimodal handling at all, which is incorrect.
I also checked the chat_template used by Qwen 2.5 VL, which is the base model for this one. Its chat_template matches the content of this model’s chat_template.json, but differs from the value set in tokenizer_config.json.
So I updated the chat_template field in tokenizer_config.json to match the one defined in chat_template.json.
If you have any feedback on this, feel free to share it.