not working with Ollama

#2
by ahmgam - opened

Something is not working correctly with this model in Ollama.
When I run ollama run hf.co/prithivMLmods/chandra-ocr-2-GGUF:Q5_K_M, I get the following error:
500: unable to load model: /usr/share/ollama/.ollama/models/blobs/sha256-c5e3df32bfa20d0a822fdc635d3849f5d53e2049535cd85753df4fd762bdd276

I also tried to create a custom model with the following Modelfile (similar to the Qwen 3.5 Modelfile):

FROM hf.co/prithivMLmods/chandra-ocr-2-GGUF:Q5_K_M
TEMPLATE """{{ .Prompt }}"""
RENDERER qwen3.5
PARSER qwen3.5
PARAMETER presence_penalty 1.5
PARAMETER temperature 1
PARAMETER top_k 20
PARAMETER top_p 0.95
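For reference, this is roughly how the Modelfile above would be turned into a local model (assuming it is saved as Modelfile in the current directory; the model name chandra-ocr is just an example, not anything official):

```shell
# Build a local model from the Modelfile (the name is arbitrary)
ollama create chandra-ocr -f Modelfile

# Then run the custom model by that name
ollama run chandra-ocr
```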

Can you please test it?

Hi @ahmgam ,

I didn’t test it on Ollama, but I tested it with llama.cpp and it works fine.
So, it should work well on Ollama too!
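For anyone wanting to reproduce the llama.cpp test, a minimal invocation would look something like this (assuming a recent llama.cpp build with the -hf Hugging Face shorthand; exact flags can vary by version):

```shell
# Download the quant from Hugging Face and run an interactive session
llama-cli -hf prithivMLmods/chandra-ocr-2-GGUF:Q5_K_M -p "Hello"
```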

[Screenshot of the llama.cpp test, 2026-03-27]

Clean up the broken blob:

rm -rf /usr/share/ollama/.ollama/models/blobs/sha256-c5e3df32bfa20d0a822fdc635d3849f5d53e2049535cd85753df4fd762bdd276

Then also remove the model reference:

ollama rm hf.co/prithivMLmods/chandra-ocr-2-GGUF:Q5_K_M
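Before re-pulling, it is worth confirming that the stale entry is actually gone from the local model list:

```shell
# The model should no longer appear in the output
ollama list
```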

Re-pull the model:

ollama run hf.co/prithivMLmods/chandra-ocr-2-GGUF:Q5_K_M

Try the fix!

prithivMLmods changed discussion status to closed

I repeated the same steps: I removed everything and downloaded the model again, but I am still getting the same error:

Error: 500
{"error":"unable to load model: /root/.ollama/models/blobs/sha256-c5e3df32bfa20d0a822fdc635d3849f5d53e2049535cd85753df4fd762bdd276"}

@ahmgam

Okay, I’ll check and get back to you.
