How can we use the model?

#1 opened by JermemyHaschal

Hey there,
Do you have any idea how we can use the model? It obviously won't work with llama.cpp as stated here; this looks like an auto-generated model card because of the .gguf format.
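
For reference, this is roughly what one would normally try with a .gguf file (a sketch using the llama-cpp-python bindings; the file name is just a placeholder), and it's exactly the step that fails if llama.cpp has no implementation for this architecture:

```python
# Minimal sketch of the usual llama.cpp route for a GGUF quant.
# The model path is a placeholder; this only works if llama.cpp
# actually supports the model's architecture, which is the open question here.
from llama_cpp import Llama

llm = Llama(
    model_path="model-Q4_K_M.gguf",  # placeholder path to the downloaded quant
    n_ctx=2048,                      # context window to allocate
)

out = llm("Hello, how are you?", max_tokens=32)
print(out["choices"][0]["text"])
```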

Yeah, I just did a quick quant of the model using the "gguf my repo" space, so I have no idea if it works correctly. It needs other models to run plus a correct implementation, so I think we might need to wait some time for that.
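
If anyone wants to at least check whether the quant loads, something like this quick smoke test should do it (a sketch only; the repo ID and file name below are placeholders, and it assumes huggingface_hub and llama-cpp-python are installed):

```python
# Quick smoke test: download one GGUF file from the repo and try to load it.
# Repo ID and file name are placeholders; adjust them to the actual quant.
from huggingface_hub import hf_hub_download
from llama_cpp import Llama

gguf_path = hf_hub_download(
    repo_id="user/model-GGUF",     # placeholder repo ID
    filename="model-Q4_K_M.gguf",  # placeholder quant file
)

try:
    llm = Llama(model_path=gguf_path, n_ctx=512)
    print("GGUF loaded fine:", gguf_path)
except Exception as err:
    # Expected failure mode if llama.cpp has no implementation for this architecture
    print("Failed to load the quant:", err)
```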
