philipp-zettl posted an update 28 days ago
This is probably a very hot take, but here goes nothing.

With the incredibly accurate LoRAs emerging for high-quality models like FLUX, and services like fal.ai offering training within single-digit minutes (e.g. 2 min per 1000 iterations):

Why the hell are people publishing private LoRAs as public models?!
Take a look at this listing: https://huggingface.co/models?other=base_model:adapter:black-forest-labs%2FFLUX.1-dev&sort=created

I would expect people who hold an HF account to have some kind of forward thinking. Heck, do you really want to give anyone the power to create ultra-realistic images of yourself?!

Didn't we learn anything from social media?
I am puzzled...

Are you saying that it's not safe to publish a LoRA of my photos?
If so, I suppose you're right, since I too live in a country where anonymous and semi-anonymous culture has to thrive...
But are there countries where it doesn't...?


I'm more concerned about bad actors using them to create content that harms you or puts you in a bad spot by generating visual content with your face, for instance to blackmail you or damage your reputation.

I am for sure a big supporter of open source and publish everything I have the rights to. Yet, I wouldn't publish a LoRA trained on my face.

Not everyone wants to be a supervillain who keeps all the good stuff for themselves. Some people want to care and share. On the other hand, companies like Anthropic and OpenAI don't want to share. They want to have all the fun to themselves and monetize everyone who wants a piece of the fun cake.


I think you got me wrong there. I'm mostly concerned about image-generation LoRAs that are trained on your person or, for instance, on pictures of children.
Gatekeeping the secret sauce for base models is a different matter, and I totally agree with you on that part.