Noise Search

Inference-time search algorithms that optimize the initial 'noise' of diffusion-based text-to-image models toward arbitrary objectives. Tested algorithms include Stochastic Hill Climbing (SHC), Simulated Annealing (SA), and random sampling.

We show that deliberate search algorithms like SHC and SA achieve higher scores than the naive random sampling commonly used in image generation.

See ./NoiseSearch-paper.pdf for more info.
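
As a rough sketch of the idea (not the exact setup used in the paper), the code below runs Stochastic Hill Climbing over the initial latent noise of a Stable Diffusion pipeline from the `diffusers` library, with an optional simulated-annealing acceptance rule. The model choice, proposal rule, and hyperparameters are illustrative assumptions; the objective here is negative JPEG file size, one of the objectives considered in this repo.

```python
import io
import math
import random

import torch
from diffusers import StableDiffusionPipeline

# Illustrative model choice -- the paper's exact pipeline may differ.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")
prompt = "a photo of an astronaut riding a horse"


def objective(image) -> float:
    """Negative JPEG file size: higher means a more compressible image."""
    buf = io.BytesIO()
    image.save(buf, format="JPEG", quality=90)
    return -len(buf.getvalue())


def generate(latents):
    """Run the full diffusion process from a fixed initial-noise tensor."""
    return pipe(prompt, latents=latents, num_inference_steps=20).images[0]


def search(steps=50, step_size=0.2, anneal=False, temp=1.0):
    """Stochastic hill climbing over the initial noise; simulated annealing if anneal=True."""
    shape = (1, pipe.unet.config.in_channels, 64, 64)
    current = torch.randn(shape, device="cuda", dtype=torch.float16)
    current_score = objective(generate(current))

    for i in range(steps):
        # Propose nearby noise, rescaled so it stays approximately unit Gaussian.
        proposal = (current + step_size * torch.randn_like(current)) / math.sqrt(1 + step_size**2)
        score = objective(generate(proposal))

        delta = score - current_score
        t = temp * (1 - i / steps) if anneal else 0.0  # linear cooling schedule for SA
        if delta > 0 or (t > 0 and random.random() < math.exp(delta / t)):
            current, current_score = proposal, score

    return current, current_score
```

Under this framing, random sampling is simply drawing a fresh noise tensor at every step and keeping the best one; local proposals around the current noise are the only thing SHC and SA add.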

Concept

Below is the optimization landscape of various objective functions (ImageReward, CLIPScore, JPEG compression) with respect to two dimensions of the noise space. The landscapes tend to be locally smooth, which suggests that local optimization algorithms like SHC and SA should achieve better scores than random sampling.

[Figure: optimization landscapes]
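
As a sketch of how such a slice could be produced (illustrative, not necessarily the procedure behind the figure above), one can fix a base noise tensor, pick two random orthogonal directions in noise space, and evaluate the objective on a grid spanned by them. The `objective` and `generate` helpers are the same illustrative ones as in the search sketch above.

```python
import torch


def landscape_slice(base, objective, generate, extent=3.0, resolution=9):
    """Evaluate objective(generate(noise)) over a 2D slice of the noise space."""
    # Two random directions, orthogonalized and normalized.
    d1 = torch.randn_like(base)
    d1 = d1 / d1.norm()
    d2 = torch.randn_like(base)
    d2 = d2 - (d2 * d1).sum() * d1
    d2 = d2 / d2.norm()

    coords = torch.linspace(-extent, extent, resolution).tolist()
    grid = torch.zeros(resolution, resolution)
    for i, a in enumerate(coords):
        for j, b in enumerate(coords):
            noise = base + a * d1 + b * d2
            grid[i, j] = objective(generate(noise))
    return grid  # e.g. plot as a heatmap with matplotlib
```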

Results

Below is the average score across the different objective functions as the number of sampling steps increases. We see that SHC and SA outperform random sampling.

[Figure: search progress]

We also see that these inference-time search algorithms significantly outperform even larger models.

[Figure: benchmark score table]

Examples

Below are examples of images generated by different methods for different optimization objectives.

[Figure: ImageReward examples]

[Figure: CLIP examples]

[Figure: JPEG examples]
