ValueError: Failed to load model

#1
by Zauza25 - opened


I get this error when trying to load this model in the WebUI. Other models work, and it's not a VRAM issue either. The same file also loads fine via koboldcpp.

LWDCLS Research org

I'm not much of an Ooba user, but maybe you can try updating the included llama.cpp version there?
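For reference, updating the llama.cpp bindings bundled with text-generation-webui usually comes down to either re-running the repo's update script or upgrading the `llama-cpp-python` package in its environment. A rough sketch (script and package names are assumptions based on a typical install; check your own setup):

```shell
# Run from inside the text-generation-webui folder, with its Python
# environment active. Package/script names may differ per install.

# Option 1: upgrade the llama.cpp Python bindings directly
pip install --upgrade llama-cpp-python

# Option 2: re-run the repo's own update script (name varies by OS/version)
# ./update_linux.sh
```

After updating, restart the WebUI and try loading the model again; a newer llama.cpp often adds support for recently introduced GGUF quant formats that older builds reject.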
