Drag Your GAN: Interactive Point-based Manipulation on the Generative Image Manifold
Abstract
Synthesizing visual content that meets users' needs often requires flexible and precise controllability of the pose, shape, expression, and layout of the generated objects. Existing approaches gain controllability of generative adversarial networks (GANs) via manually annotated training data or a prior 3D model, which often lack flexibility, precision, and generality. In this work, we study a powerful yet much less explored way of controlling GANs, that is, to "drag" any points of the image to precisely reach target points in a user-interactive manner, as shown in Fig.1. To achieve this, we propose DragGAN, which consists of two main components: 1) a feature-based motion supervision that drives the handle point to move towards the target position, and 2) a new point tracking approach that leverages the discriminative generator features to keep localizing the position of the handle points. Through DragGAN, anyone can deform an image with precise control over where pixels go, thus manipulating the pose, shape, expression, and layout of diverse categories such as animals, cars, humans, landscapes, etc. As these manipulations are performed on the learned generative image manifold of a GAN, they tend to produce realistic outputs even for challenging scenarios such as hallucinating occluded content and deforming shapes that consistently follow the object's rigidity. Both qualitative and quantitative comparisons demonstrate the advantage of DragGAN over prior approaches in the tasks of image manipulation and point tracking. We also showcase the manipulation of real images through GAN inversion.
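Below is a minimal, illustrative sketch of the two components the abstract names: a feature-based motion-supervision loss that nudges generator features around each handle point toward its target, and nearest-neighbor point tracking in the feature map that re-localizes the handle after every step. Everything concrete here is an assumption for illustration only: `ToyGenerator` is a tiny stand-in for StyleGAN2, and the patch radii `r1`/`r2`, learning rate, and plain L1 loss (with no mask term) are placeholder choices, not the authors' settings.

```python
# Illustrative sketch only: a toy generator stands in for StyleGAN2, and the
# hyperparameters (r1, r2, lr, number of steps) are made up for this example.
import torch
import torch.nn.functional as F


class ToyGenerator(torch.nn.Module):
    """Placeholder generator: latent w -> (RGB image, intermediate feature map)."""

    def __init__(self, latent_dim=64, feat_ch=32, size=64):
        super().__init__()
        self.size = size
        self.fc = torch.nn.Linear(latent_dim, feat_ch * (size // 4) ** 2)
        self.up = torch.nn.Sequential(
            torch.nn.Upsample(scale_factor=4, mode="bilinear", align_corners=False),
            torch.nn.Conv2d(feat_ch, feat_ch, 3, padding=1),
            torch.nn.ReLU(),
        )
        self.to_rgb = torch.nn.Conv2d(feat_ch, 3, 1)

    def forward(self, w):
        h = self.fc(w).view(w.shape[0], -1, self.size // 4, self.size // 4)
        feat = self.up(h)                      # "discriminative" feature map
        return self.to_rgb(feat), feat


def sample_feat(feat, points):
    """Bilinearly sample feature vectors at (y, x) pixel coordinates -> (N, C)."""
    _, _, H, W = feat.shape
    grid = torch.stack([points[:, 1] / (W - 1) * 2 - 1,   # x in [-1, 1]
                        points[:, 0] / (H - 1) * 2 - 1],  # y in [-1, 1]
                       dim=-1).view(1, -1, 1, 2)
    out = F.grid_sample(feat, grid, align_corners=True)   # 1 x C x N x 1
    return out[0, :, :, 0].t()


def drag_step(G, w, handles, targets, r1=3, lr=2e-3):
    """Motion supervision: one gradient step on w that moves the features
    around each handle point a small distance toward its target."""
    w = w.detach().requires_grad_(True)
    _, feat = G(w)
    feat_detached = feat.detach()
    ys, xs = torch.meshgrid(torch.arange(-r1, r1 + 1),
                            torch.arange(-r1, r1 + 1), indexing="ij")
    offsets = torch.stack([ys.flatten(), xs.flatten()], dim=-1).float()
    loss = 0.0
    for p, t in zip(handles, targets):
        d = (t - p) / ((t - p).norm() + 1e-8)  # unit direction handle -> target
        patch = offsets + p                    # pixels around the handle
        # Features at the shifted locations should match the *detached* current
        # features, which pushes the image content around p toward t.
        loss = loss + F.l1_loss(sample_feat(feat, patch + d),
                                sample_feat(feat_detached, patch))
    loss.backward()
    with torch.no_grad():
        w = w - lr * w.grad                    # plain gradient step for brevity
    return w.detach()


def track_handle(G, w, handle, ref_feat_vec, r2=6):
    """Point tracking: re-localize the handle as the nearest neighbor of its
    initial feature vector inside a small search window."""
    with torch.no_grad():
        _, feat = G(w)
        ys, xs = torch.meshgrid(torch.arange(-r2, r2 + 1),
                                torch.arange(-r2, r2 + 1), indexing="ij")
        cand = torch.stack([ys.flatten(), xs.flatten()], dim=-1).float() + handle
        dists = (sample_feat(feat, cand) - ref_feat_vec).norm(dim=1)
        return cand[dists.argmin()]


# Toy usage: drag one handle point toward a target over a few iterations.
G = ToyGenerator()
w = torch.randn(1, 64)
handle, target = torch.tensor([20.0, 20.0]), torch.tensor([30.0, 40.0])
with torch.no_grad():
    _, feat0 = G(w)
ref_feat = sample_feat(feat0, handle[None])[0]  # feature at the initial handle
for _ in range(10):
    w = drag_step(G, w, [handle], [target])
    handle = track_handle(G, w, handle, ref_feat)
```

In the actual method, the optimization is over the latent code of a pretrained StyleGAN2 and a user-provided mask keeps unedited regions fixed; the loop above only mirrors the structure described in the abstract, with the official code on the project page as the authoritative reference.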
Community
eating
Incredible!
eating
running
Great work!
hi mom
Project page: https://vcai.mpi-inf.mpg.de/projects/DragGAN/
This is so satisfying. Looking forward to a Space demo.
lion
oh wow
amazing~
wow
Good
RUN
say hi
The fashion demo is impressive!
Amazing, it's way too fast.
Holy Cow.
![IMG_3135.jpg](https://cdn-uploads.huggingface.co/production/uploads/6454d93be4952d1c6cb33f3e/sNFI5ZijYcSVuQ13if0Pf.jpeg)
How can I try out this app?
Yoo! loving it!
One of the first of its kind!
how do i use this
Amazing work!
great!!!
I'm flabbergasted. So well done, good job.
on god
This is so cool!
Awesome!!! Looking forward to the code release.
How do I use it?
EDIT: You do call it DragGAN! It's right in the abstract! How did I miss that? I feel rather silly now, sorry. Great work.
Original comment:
Very impressive. My only disappointment is that you had the opportunity to call it "DragGAN" and have a tiny dragon as a mascot.
(I am joking. Well done, all.)
They do call it DragGAN.
Where do I get it? How to install? Thanks
awesome
Amazing! can't wait
D'oh! Thank you for the correction. That was very silly of me.
Great project! We are eagerly waiting to be able to test it.
Can't wait to test this project
Could be crazy powerful with additional semantic instructions, like "make the dog's face look a little happier" or "put the model's legs closer to each other", with a slider to adjust the strength of the result, in addition to the current drag points.
can we test it
wen demo
This is like, really amazing. Also I'm waiting for a demo (I hope it isn't flooded with people)
Superb
Guys, it's not available yet; we may have to wait weeks or even months for a demo or Space. It doesn't work here because this is a paper page, not a demo or Space.
wen demo
Maybe weeks, months, or years for a release or demo.
That's awesome!
awesome, I think this is gonna be very useful!
Great, when and how should I use it?
eating
running
Amazing
Code will be released in June.
Red panda
You can try out an unofficial implementation right now at https://igpt.opengvlab.com/ if you don't want to wait until sometime in June. Its generations are quite noisy, but it's still a nice Gradio demo.
GREAT EFFORT
Amazing
Impressive!
impressive!
Models citing this paper: 1
Datasets citing this paper: 0
No dataset links to this paper.