Doesn't work locally. (SOLVED)

#1
by GeneralAwareness - opened

I wonder what is different between ComfyUI and HF Warm, because your LoRA doesn't work locally. I used your Batman prompt and Batman isn't there; instead a fat bearded man is pouring the coffee, etc. I generated the same prompt here on HF Warm, and it showed Batman.

Any idea what is going on?

Hey @GeneralAwareness ,
The difference is in which base model is used along with the LoRA. In my examples here I used a quantized version of Flux, Q4 if I'm not mistaken.
Scrolling through CivitAI or here on HF you can find many different variants of the original Flux base model; all of them can work with a LoRA, but each will give you different results.
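To illustrate the point, here is a minimal sketch using the `diffusers` library: the repo IDs and the LoRA name below are placeholders, not the exact checkpoints from this thread. The idea is that the same LoRA paired with a different Flux base variant (full-precision vs. a quantized one) can produce noticeably different images.

```python
def generate(base_repo: str, lora_repo: str, prompt: str):
    """Load a Flux base model variant, attach a LoRA, and generate one image.

    base_repo and lora_repo are placeholders; swap in the actual
    checkpoint and LoRA you are using.
    """
    import torch
    from diffusers import FluxPipeline

    pipe = FluxPipeline.from_pretrained(base_repo, torch_dtype=torch.bfloat16)
    pipe.load_lora_weights(lora_repo)
    pipe.to("cuda")
    return pipe(prompt, num_inference_steps=28).images[0]

if __name__ == "__main__":
    # Swapping base_repo for a quantized variant (e.g. a Q4 GGUF build)
    # can change the result even with the same prompt, seed, and LoRA.
    image = generate("black-forest-labs/FLUX.1-dev",
                     "some-user/some-flux-lora",
                     "Batman pouring coffee")
```

So if a LoRA behaves differently locally, the first thing to check is whether the local base checkpoint matches the one the LoRA's examples were generated with.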

Hope that helps!

oshtz changed discussion title from Doesn't work locally. to Doesn't work locally. (SOLVED)
oshtz locked this discussion
