Issue Running with Ollama #2
by divyendra - opened
Hi @Quazim0t0 - I am trying to set this up for contribution, running locally with Ollama, but it doesn't seem to work correctly even for very basic queries: every step takes forever and the generated queries are awful.
Am I missing something here?
from smolagents import CodeAgent, LiteLLMModel

localModel = LiteLLMModel(
    model_id="ollama_chat/qwen2.5-coder:32b-instruct-q4_K_M",
    api_key="xxxx",
    api_base="http://localhost:11434",
)
agent = CodeAgent(
    tools=[sql_engine],
    model=localModel,
)
I am unsure why it would take so long to generate a response. Could you try a model with fewer parameters and see if it works better? It's odd to me, since it's the exact same model just hosted locally, so I don't fully understand why there would be a difference.