TFG: Unified Training-Free Guidance for Diffusion Models
Abstract
Given an unconditional diffusion model and a predictor for a target property of interest (e.g., a classifier), the goal of training-free guidance is to generate samples with desirable target properties without additional training. Existing methods, though effective in various individual applications, often lack theoretical grounding and rigorous testing on extensive benchmarks. As a result, they can fail even on simple tasks, and applying them to a new problem is unavoidably difficult. This paper introduces a novel algorithmic framework that encompasses existing methods as special cases, unifying the study of training-free guidance into the analysis of an algorithm-agnostic design space. Through theoretical and empirical investigation, we propose an efficient and effective hyper-parameter search strategy that can be readily applied to any downstream task. We systematically benchmark across 7 diffusion models on 16 tasks with 40 targets, improving performance by 8.5% on average. Our framework and benchmark offer a solid foundation for conditional generation in a training-free manner.
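To make the setting concrete, here is a minimal toy sketch of the training-free guidance idea in one dimension. This is a hypothetical illustration, not the paper's actual algorithm: an unconditional "denoiser" pulls samples toward the data mean, while the gradient of a differentiable predictor (here, a Gaussian log-likelihood around a target value) nudges each reverse step toward the desired property, with no retraining of the generative model.

```python
import random
import statistics

# Toy 1-D illustration of training-free guidance (hypothetical setup,
# not the paper's algorithm): an unconditional "denoiser" contracts
# samples toward the data mean 0, while the gradient of a predictor's
# log-likelihood steers each step toward a target property value.

rng = random.Random(0)

def uncond_step(x, t, n_steps):
    # Stand-in for one unconditional reverse-diffusion step:
    # contract toward the unconditional mean 0, with noise that
    # shrinks as t decreases.
    return 0.9 * x + 0.05 * (t / n_steps) ** 0.5 * rng.gauss(0.0, 1.0)

def guidance_grad(x, target, sigma=1.0):
    # Gradient of log N(target; x, sigma^2) w.r.t. x: the predictor
    # signal that defines the desired property.
    return (target - x) / sigma ** 2

def sample(target, n_steps=200, scale=0.5):
    x = rng.gauss(0.0, 1.0)  # start from pure noise
    for t in range(n_steps, 0, -1):
        x = uncond_step(x, t, n_steps)
        x += scale * guidance_grad(x, target)  # training-free guidance
    return x

guided = [sample(target=3.0) for _ in range(50)]
print(statistics.mean(guided))  # pulled toward the target value 3.0
```

Setting `scale=0.0` recovers unguided sampling, which stays near the unconditional mean; the `scale` hyper-parameter plays the role of the guidance strength that the paper's search strategy would tune per task.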