Using with Ollama

#2
by pouria98sarmasti - opened

When I run this model with Ollama, the model's responses are only about the safety of the user message.
This issue does not occur when I run the same model with llama.cpp.
What is the problem?
[screenshot: ollama.png]
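One likely cause (an assumption, not confirmed in this thread): Ollama applies the system prompt and chat template baked into the model's Modelfile, while a bare llama.cpp invocation sends no system prompt by default. If the packaged Modelfile contains a safety-oriented `SYSTEM` instruction, Ollama's responses will differ. A minimal sketch for inspecting and overriding it, using a hypothetical model name `mymodel`:

```
# Inspect the Modelfile Ollama is using (SYSTEM and TEMPLATE lines):
#   ollama show mymodel --modelfile
#
# To override, save a Modelfile like the one below and rebuild:
#   ollama create mymodel-nosys -f Modelfile

FROM mymodel

# Replace the baked-in system prompt with an empty one
SYSTEM ""
```

Comparing the `SYSTEM` and `TEMPLATE` sections against the prompt you pass to llama.cpp should show whether the extra safety framing comes from the Modelfile rather than the model weights themselves.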

pouria98sarmasti changed discussion status to closed
