Bug in fine-tuning Dorna 2 using the Unsloth library
Unsloth: The tokenizer PartAI/Dorna2-Llama3.1-8B-Instruct
does not have a {% if add_generation_prompt %} for generation purposes.
Please file a bug report to the maintainers of PartAI/Dorna2-Llama3.1-8B-Instruct - thanks!
How to fix (answer from Google AI Mode):
Correct the chat template: manually edit the tokenizer_config.json file in your exported or loaded model.
Find the chat_template field.
It should contain something like: {{...}} {% if add_generation_prompt %}{{ '<|start_header_id|>assistant<|end_header_id|>\n\n' }}{% endif %} (the exact content varies by model, but the {% if add_generation_prompt %} block is the key part).
However, the template shipped with this model never references add_generation_prompt; it unconditionally appends the assistant header after every user turn:
"chat_template": "{{ '<|begin_of_text|>' }}{% if messages[0]['role'] == 'system' %}{% set loop_messages = messages[1:] %}{% set system_message = messages[0]['content'] %}{% else %}{% set loop_messages = messages %}{% endif %}{% if system_message is defined %}{{ '<|start_header_id|>system<|end_header_id|>\n\n' + system_message + '<|eot_id|>' }}{% endif %}{% for message in loop_messages %}{% set content = message['content'] %}{% if message['role'] == 'user' %}{{ '<|start_header_id|>user<|end_header_id|>\n\n' + content + '<|eot_id|><|start_header_id|>assistant<|end_header_id|>\n\n' }}{% elif message['role'] == 'assistant' %}{{ content + '<|eot_id|>' }}{% endif %}{% endfor %}"
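Rather than hand-editing tokenizer_config.json, the template can also be overridden in code after loading the tokenizer. The template string below is a sketch, not the official template: it is adapted from the one above by moving the assistant header out of the user branch and into an {% if add_generation_prompt %} block at the end, and by adding the assistant header to the assistant branch so training examples still render correctly.

```python
# Sketch of a corrected Llama-3-style template for this model (an assumption,
# adapted from the original template quoted above, not an official fix).
FIXED_TEMPLATE = (
    "{{ '<|begin_of_text|>' }}"
    # Split off an optional leading system message.
    "{% if messages[0]['role'] == 'system' %}"
    "{% set loop_messages = messages[1:] %}"
    "{% set system_message = messages[0]['content'] %}"
    "{% else %}{% set loop_messages = messages %}{% endif %}"
    "{% if system_message is defined %}"
    "{{ '<|start_header_id|>system<|end_header_id|>\n\n' + system_message + '<|eot_id|>' }}"
    "{% endif %}"
    # Each turn now carries its own header; the user branch no longer
    # pre-emits the assistant header.
    "{% for message in loop_messages %}"
    "{% if message['role'] == 'user' %}"
    "{{ '<|start_header_id|>user<|end_header_id|>\n\n' + message['content'] + '<|eot_id|>' }}"
    "{% elif message['role'] == 'assistant' %}"
    "{{ '<|start_header_id|>assistant<|end_header_id|>\n\n' + message['content'] + '<|eot_id|>' }}"
    "{% endif %}{% endfor %}"
    # The block Unsloth checks for: only emit the assistant header when the
    # caller asks for a generation prompt.
    "{% if add_generation_prompt %}"
    "{{ '<|start_header_id|>assistant<|end_header_id|>\n\n' }}"
    "{% endif %}"
)

# After loading the tokenizer (e.g. via FastLanguageModel.from_pretrained),
# overwrite its template before training or calling apply_chat_template:
#     tokenizer.chat_template = FIXED_TEMPLATE
print("{% if add_generation_prompt %}" in FIXED_TEMPLATE)
```

Setting tokenizer.chat_template only affects the in-memory tokenizer; if you save or export the model afterwards, the patched template is written back to tokenizer_config.json.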