CMX Lunch Seminar
Dynamical sampling of probability distributions based on a model or on data (i.e., generative modeling) is a central task in scientific computing and machine learning. I will present recent work on understanding and improving such algorithms in high-dimensional settings. This includes a novel "delocalization of bias" phenomenon in Langevin dynamics, where biased methods can achieve dimension-free scaling for low-dimensional marginals while unbiased methods cannot, a finding motivated by molecular dynamics simulations. I will also briefly discuss a new unbiased affine-invariant Hamiltonian sampler that outperforms the popular samplers in the emcee package (routinely used in the astrophysics literature) in high dimensions, and introduce optimal Lipschitz-energy criteria for the design of measure transports in generative modeling of multiscale scientific data, as an alternative to the optimal kinetic energy of optimal transport. Together, these examples show how dimensional scaling can be flattened, enabling efficient stochastic algorithms for high-dimensional sampling and generative modeling in scientific applications.
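As a minimal illustration of the kind of bias at issue (not the speaker's method), the sketch below runs the unadjusted Langevin algorithm (ULA) on a standard Gaussian target in d dimensions. ULA carries an O(h) discretization bias: for this target each coordinate's stationary variance is 1/(1 - h/2) rather than 1, and for this product target the bias of a single one-dimensional marginal does not grow with d. All function and parameter names here are illustrative choices, not from the talk.

```python
import numpy as np

def ula_gaussian(d=100, h=0.1, n_steps=200_000, seed=0):
    """Unadjusted Langevin for pi(x) ∝ exp(-|x|^2/2), where
    grad log pi(x) = -x, so the ULA update is
    x <- x + h * grad log pi(x) + sqrt(2h) * noise = (1-h) x + sqrt(2h) * noise.
    Returns post-burn-in samples of coordinate 0 (a 1D marginal)."""
    rng = np.random.default_rng(seed)
    x = np.zeros(d)
    burn = n_steps // 10
    out = np.empty(n_steps - burn)
    for k in range(n_steps):
        x = (1.0 - h) * x + np.sqrt(2.0 * h) * rng.standard_normal(d)
        if k >= burn:
            out[k - burn] = x[0]
    return out

samples = ula_gaussian()
# Empirical marginal variance sits near the biased value 1/(1 - h/2),
# i.e. about 1.053 for h = 0.1, rather than the exact value 1.
print(samples.mean(), samples.var())
```

Shrinking the step size h drives the variance back toward 1, at the cost of slower mixing; unbiased alternatives such as Metropolis-adjusted Langevin remove the bias entirely but, as the abstract notes, need not enjoy the same dimension-free scaling for low-dimensional marginals.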