Missing configs?

#1
by loktar - opened

Seeing the following when attempting to run the model:

vllm.entrypoints.chat_utils.ChatTemplateResolutionError: As of transformers v4.44, default chat template is no longer allowed, so you must provide a chat template if the tokenizer does not define one.
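
As a stopgap before the repo is fixed, a chat template can also be supplied at call time. A minimal sketch, assuming a recent vLLM where `LLM.chat` accepts a `chat_template` string and that a suitable `chat_template.jinja` is available locally (paths and model id below are placeholders):

```python
from vllm import LLM, SamplingParams

# Read a Jinja chat template copied from a comparable model (placeholder path).
with open("chat_template.jinja") as f:
    chat_template = f.read()

llm = LLM(model="org/model-name")  # placeholder repo id
messages = [{"role": "user", "content": "Hello!"}]

# Passing chat_template explicitly overrides whatever the tokenizer defines (or doesn't).
outputs = llm.chat(messages, SamplingParams(max_tokens=64), chat_template=chat_template)
print(outputs[0].outputs[0].text)
```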

Looks like the chat_template.jinja didn't upload and the chat_template field in tokenizer_config.json was missing; both should be fixed now.
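
A quick way to confirm the fix from the client side is to reload the tokenizer and check that a template now resolves; a minimal sketch using transformers (the repo id below is a placeholder):

```python
from transformers import AutoTokenizer

# Placeholder repo id; substitute the actual model repo.
tok = AutoTokenizer.from_pretrained("org/model-name")

# Should now be a non-empty Jinja string sourced from chat_template.jinja
# or the chat_template field in tokenizer_config.json.
print(bool(tok.chat_template))

# Rendering a trivial conversation should no longer raise a template error.
prompt = tok.apply_chat_template(
    [{"role": "user", "content": "Hello!"}],
    tokenize=False,
    add_generation_prompt=True,
)
print(prompt)
```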

Awesome! Thanks for this!

loktar changed discussion status to closed
