Fix tokenizer: add <IMG_CONTEXT> special token while retaining original chat_template

#8

This PR cleanly fixes the `KeyError: <IMG_CONTEXT>` raised when using vLLM by registering `<IMG_CONTEXT>` as an additional special token. Unlike the previous PR, it strictly preserves the original complex `chat_template` from `tokenizer_config.json`, avoiding any model performance collapse.
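The fix can be sketched with the Hugging Face tokenizer API. Below, a tiny in-memory tokenizer stands in for the real one (which would normally be loaded with `AutoTokenizer.from_pretrained(...)`); the vocabulary and token names here are illustrative assumptions, not the model's actual vocab. The key point is that `add_special_tokens` with `additional_special_tokens` only extends the vocabulary and does not touch the `chat_template` stored in `tokenizer_config.json`.

```python
from tokenizers import Tokenizer, models, pre_tokenizers
from transformers import PreTrainedTokenizerFast

# Tiny stand-in vocab for illustration only; the real model's tokenizer
# would be loaded via AutoTokenizer.from_pretrained(...).
vocab = {"<unk>": 0, "hello": 1, "world": 2}
tok = Tokenizer(models.WordLevel(vocab, unk_token="<unk>"))
tok.pre_tokenizer = pre_tokenizers.Whitespace()
hf_tok = PreTrainedTokenizerFast(tokenizer_object=tok, unk_token="<unk>")

# The fix: register <IMG_CONTEXT> as an *additional* special token.
# This appends to the vocab without modifying hf_tok.chat_template,
# so the original template from tokenizer_config.json is preserved.
num_added = hf_tok.add_special_tokens(
    {"additional_special_tokens": ["<IMG_CONTEXT>"]}
)
img_id = hf_tok.convert_tokens_to_ids("<IMG_CONTEXT>")
print(num_added, img_id)
```

After this, looking up `<IMG_CONTEXT>` yields a valid token id instead of a `KeyError`, which is what vLLM needs when it maps the image placeholder token.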

whats2000 changed pull request status to closed
