Energy-based models (EBMs) are a powerful class of probabilistic generative
models due to their flexibility and interpretability. However, relationships
between potential flows and explicit EBMs remain underexplored, while
contrastive divergence training via implicit Markov chain Monte Carlo (MCMC)
sampling is often unstable and expensive in high-dimensional settings. In this
paper, we propose Variational Potential Flow Bayes (VPFB), a new energy-based
generative framework that eliminates the need for implicit MCMC sampling and
does not rely on auxiliary networks or cooperative training. VPFB learns an
energy-parameterized potential flow by constructing a flow-driven density
homotopy that is matched to the data distribution through a variational loss
minimizing the Kullback-Leibler divergence between the flow-driven and marginal
homotopies. This principled formulation enables robust and efficient generative
modeling while preserving the interpretability of EBMs. Experimental results on
image generation, interpolation, out-of-distribution detection, and
compositional generation confirm the effectiveness of VPFB, showing that our
method performs competitively with existing approaches in terms of sample
quality and versatility across diverse generative modeling tasks.
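
As a rough sketch of the objective described above (the notation below is ours, not the paper's): write $E_\theta$ for the learned energy, let the potential flow $v_t = -\nabla E_\theta$ transport samples so that the flow-driven homotopy $\rho_t^\theta$ obeys the continuity equation $\partial_t \rho_t^\theta + \nabla \cdot (\rho_t^\theta v_t) = 0$, and let $\bar\rho_t$ denote the marginal homotopy interpolating between a tractable prior $\bar\rho_0$ and the data distribution $\bar\rho_1$. A variational loss of the kind the abstract describes could then take the schematic form

\[ \mathcal{L}(\theta) = \int_0^1 D_{\mathrm{KL}}\big(\bar\rho_t \,\|\, \rho_t^\theta\big)\, \mathrm{d}t, \]

whose minimization matches the flow-driven homotopy to the marginal homotopy, and hence to the data distribution, at every time $t$, without implicit MCMC sampling.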