model not working

#3
by charlieroro - opened

I ran `ollama run hf.co/mradermacher/oh-dcft-v3.1-claude-3-5-sonnet-20241022-GGUF:Q4_K_M`,
but it doesn't seem to work.

```
➜ logs ollama run hf.co/mradermacher/oh-dcft-v3.1-claude-3-5-sonnet-20241022-GGUF:Q4_K_M

hello
safe

who are you
safe
```

Hmm... I've seen a similar problem pop up before: ollama seems to override the system prompt with something that triggers these safety-check responses. This is either a bug in ollama or in your ollama setup and, as far as I can see, has nothing to do with this model. As soon as you use the model with its embedded chat template, or supply your own, it works. Alternatively, use another inference engine such as llama.cpp, which doesn't have this issue.
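For reference, a minimal sketch of the llama.cpp route (the GGUF filename follows this repo's usual naming and is an assumption; verify it against the repo's file listing, and `llama-cli` assumes a recent llama.cpp build):

```sh
# Fetch the Q4_K_M quant from Hugging Face (filename is an assumption;
# check the repo's file listing for the exact name).
huggingface-cli download mradermacher/oh-dcft-v3.1-claude-3-5-sonnet-20241022-GGUF \
  oh-dcft-v3.1-claude-3-5-sonnet-20241022.Q4_K_M.gguf --local-dir .

# -cnv runs an interactive chat using the chat template embedded in the
# GGUF, so nothing injects an extra system prompt on top of it.
./llama-cli -m oh-dcft-v3.1-claude-3-5-sonnet-20241022.Q4_K_M.gguf -cnv
```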

Same problem; the model always responds "safe":

```
hello
safe
```

@hanwwh it's a problem with ollama, not this model.

Do you know how to fix it? Thanks.

Yes, among the other things already described here: you can specify your own chat template, or use the one embedded in the model.
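For example, a minimal sketch of a custom Modelfile that overrides both the system prompt and the template (the ChatML-style template, the `oh-dcft-fixed` name, and `FROM` resolving an hf.co reference are all assumptions; check the template actually embedded in the GGUF and adjust to the model's real format):

```
# Modelfile - pull the quant and override ollama's defaults.
# FROM with an hf.co reference is an assumption; if your ollama version
# does not resolve it, point FROM at a local GGUF path instead.
FROM hf.co/mradermacher/oh-dcft-v3.1-claude-3-5-sonnet-20241022-GGUF:Q4_K_M

# Replace whatever system prompt ollama would otherwise inject.
SYSTEM """You are a helpful assistant."""

# ChatML-style template (an assumption; match the model's actual format).
TEMPLATE """{{ if .System }}<|im_start|>system
{{ .System }}<|im_end|>
{{ end }}<|im_start|>user
{{ .Prompt }}<|im_end|>
<|im_start|>assistant
"""
```

Then build and run it:

```sh
ollama create oh-dcft-fixed -f Modelfile
ollama run oh-dcft-fixed
```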
