Reasoning not being rendered properly in the LM Studio app.
#2 by samhorry - opened
I've been running into some issues with this model in LM Studio, and with the GGUF from unsloth as well. When used through frontends like Open WebUI, the model does not generate the opening tag at the beginning of its reasoning stage. I have not modified the prompt template or the system prompt. Even in the LM Studio app itself, if I edit the model's response I can see that the closing tag is present, but the opening one is not. When I tried the same model with base llama.cpp I didn't have this issue: the opening tag appeared at the start of the chain of thought like it should.