Does LM Studio support the ‘mllama’ architecture?

#80
by wilsonchou1996 - opened

LM studio 0.3.5
I tried to load this model in LM Studio, but it kept failing with "Failed to load model" and this error:
"llama.cpp error: 'error loading model architecture: unknown model architecture: 'mllama''

I'm wondering whether LM Studio supports 'mllama'?

It supports it with MLX, but not GGUF.

@bartowski Will there be GGUF support any time soon?

ollama supports the GGUF version. Isn't ollama just wrapping llama.cpp here?

Nah, ollama deviated for vision a couple of months back: they added their own way to use vision adapters and unfortunately didn't upstream it :( Nice for them to use, but yeah, llama.cpp remains without support for this.
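For anyone curious where the error comes from: llama.cpp reads the `general.architecture` string from the GGUF file's metadata and refuses to load anything it doesn't recognize, which is why LM Studio surfaces "unknown model architecture: 'mllama'". Here's a rough sketch (Python stdlib only, parsing a synthetic header rather than a real model file, and assuming `general.architecture` is the first metadata key, which is how llama.cpp's converters write it) of how that key sits in the file:

```python
import struct

GGUF_MAGIC = b"GGUF"
GGUF_TYPE_STRING = 8  # value-type id for strings in the GGUF spec

def read_architecture(data: bytes) -> str:
    """Return the value of the first metadata key, assumed to be
    'general.architecture' (written first by llama.cpp's converters)."""
    if data[:4] != GGUF_MAGIC:
        raise ValueError("not a GGUF file")
    # header: magic(4) + version(4) + tensor_count(8) + metadata_kv_count(8)
    off = 24
    key_len = struct.unpack_from("<Q", data, off)[0]; off += 8
    key = data[off:off + key_len].decode(); off += key_len
    vtype = struct.unpack_from("<I", data, off)[0]; off += 4
    if key != "general.architecture" or vtype != GGUF_TYPE_STRING:
        raise ValueError("unexpected first metadata key")
    val_len = struct.unpack_from("<Q", data, off)[0]; off += 8
    return data[off:off + val_len].decode()

def fake_gguf(arch: str) -> bytes:
    """Build a tiny synthetic GGUF header for demonstration only."""
    key, val = b"general.architecture", arch.encode()
    return (GGUF_MAGIC + struct.pack("<IQQ", 3, 0, 1)   # version 3, 0 tensors, 1 kv
            + struct.pack("<Q", len(key)) + key
            + struct.pack("<I", GGUF_TYPE_STRING)
            + struct.pack("<Q", len(val)) + val)

print(read_architecture(fake_gguf("mllama")))  # prints "mllama"
```

So the file itself is fine; the loader just has no implementation registered for that architecture string, regardless of which app wraps llama.cpp.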

Well that’s just sad. 😔
