Custom chat template

#3
by oPnf4fMoKMz4VFq - opened

In your blog post it's recommended to use the custom template which I assume is the template file in the repo?

I tried it with llama-server --chat-template-file template, but it fails to parse it. I'm probably just misunderstanding something about the template part.

common_chat_templates_init: error: lexer: unexpected character: $
...{- end }}<|im_end|>↵{{ end }}↵{{- range $i, $_ := .Messages }}↵{{- $last := eq (...
                                           ^
common_chat_templates_init: failed to initialize chat template
common_chat_templates_init: please consider disabling jinja via --no-jinja, or using another chat template
ByteShape org

Thank you for reading the post and for paying close attention to the details ;)
The custom template is already baked into the GGUF file. If you run the GGUF with llama.cpp, that template is used by default.
The template file in the repo is for Ollama. Ollama uses Go templates rather than Jinja (which is what llama.cpp expects), so that file cannot be loaded directly with llama.cpp.
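For anyone hitting the same error: the two dialects look superficially similar but are parsed very differently. A rough illustration (the Go line is taken from the error output above; the Jinja line is a generic equivalent, not the exact template embedded in this GGUF):

```
# Go template (Ollama) — note the $-prefixed variables and .Field access,
# which a Jinja parser rejects ("unexpected character: $"):
{{- range $i, $_ := .Messages }} ... {{- end }}

# Jinja (llama.cpp) — the same kind of loop in Jinja syntax:
{% for message in messages %} ... {% endfor %}
```

So the parse failure is expected: --chat-template-file needs a Jinja template, and since the template is already embedded in the GGUF metadata, no flag is needed at all.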

Thanks a lot for clarifying, I didn't realize it's specific to Ollama. They could've standardized on Jinja and spared people the unnecessary extra work and confusion. Oh well :)

oPnf4fMoKMz4VFq changed discussion status to closed
