Events

IFML Seminar

IFML Seminar: 04/10/26 - High-Magnetization Sampling at Low Temperatures: Ising Models and Bayesian Sparse Linear Regression

Kevin Tian, assistant professor of Computer Science, UT Austin


The University of Texas at Austin
Gates Dell Complex (GDC 6.302)
2317 Speedway
Austin, TX 78712
United States


Abstract: Sparse recovery, i.e., estimating a sparse signal theta* in R^d from few noisy measurements y = X theta* + xi, is a central problem in high-dimensional statistics. Recent works [KSTZ25, MW26] have taken an algorithmic perspective on Bayesian sparse linear regression, where the goal is to sample from the posterior pi(. | X, y) under a prior pi, with Gaussian noise xi ~ N(0, sigma^2 I_n). When pi is the spike-and-slab prior [MB88, GM93], a standard model in statistics, the induced posterior sampling problem is highly multimodal and poses a well-known algorithmic challenge [GM97, CGMCFS01]. Nevertheless, [KSTZ25] give a nearly-linear time sampler that uses n ~ k^3 polylog(d) Gaussian measurements and produces a high-accuracy sample for any sigma > 0.
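For readers less familiar with the setup, the model above can be written out as follows. This is the standard spike-and-slab parameterization; the slab variance tau^2 and the spike probability q are illustrative names, and the talk's exact conventions may differ.

```latex
% Observation model (as in the abstract):
\[
  y = X\theta^* + \xi, \qquad \xi \sim \mathcal{N}(0, \sigma^2 I_n),
  \qquad X \in \mathbb{R}^{n \times d},
\]
% Spike-and-slab prior on each coordinate (standard form; q, \tau are
% illustrative parameters):
\[
  \theta_i \overset{\text{i.i.d.}}{\sim} (1 - q)\,\delta_0 + q\,\mathcal{N}(0, \tau^2),
  \qquad i = 1, \dots, d,
\]
% and the sampling target is the posterior
\[
  \pi(\theta \mid X, y) \;\propto\; \pi(\theta)\,
  \exp\!\Big(-\tfrac{1}{2\sigma^2}\,\|y - X\theta\|_2^2\Big).
\]
```

The multimodality mentioned in the abstract comes from the delta-mass "spike": each of the 2^d support patterns contributes its own posterior mode.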

We show that n ~ k^2 log^2(d) measurements suffice for polynomial-time spike-and-slab posterior sampling. En route, we develop a broader framework for exploiting sparsity in high-dimensional sampling. For Ising models, where the Gibbs measure is proportional to exp(beta f(x)) for a quadratic potential f over the hypercube {-1, 1}^d, this leads to efficient algorithms for bounded- and fixed-magnetization sampling, where the positive spin count k := |{i : x_i = 1}| is small. We give polynomial-time samplers for the Sherrington-Kirkpatrick (SK) model at subconstant temperatures 1/beta whenever k << sqrt(d), circumventing a known algorithmic obstruction in the general setting [AMS22] by restricting the magnetization. We give an analogous sampler for the Gaussian Hopfield model with sublinear degrees of freedom whenever k << d.
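Concretely, under the normalization conventions standard for the SK model (the talk's may differ), the fixed-magnetization measure restricts the Gibbs measure to a single slice of the hypercube:

```latex
% SK Gibbs measure with quadratic potential (standard normalization;
% the talk's conventions may differ):
\[
  \mu_\beta(x) \;\propto\; \exp\big(\beta f(x)\big),
  \qquad f(x) = x^\top A x, \qquad A = \tfrac{1}{\sqrt{d}}\, G,
\]
% with G a symmetric matrix of i.i.d.\ Gaussian entries.
% Fixed-magnetization sampling targets the restriction of \mu_\beta to
\[
  S_k \;=\; \big\{\, x \in \{-1, 1\}^d \;:\; |\{ i : x_i = 1 \}| = k \,\big\}.
\]
```

Restricting to S_k with k small is what lets the sparsity framework apply: the state space shrinks from 2^d to binom(d, k).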

Finally, we prove a lower bound against the Hamming contraction of the natural Markov chain for fixed-magnetization SK models, the down-up walk, when k >> sqrt(d) and beta >> 1, complemented by empirical evidence. This suggests that our methods may be approaching natural limits, and raises intriguing questions about the algorithmic thresholds for low-temperature, high-magnetization sampling in Ising models and Bayesian sparse linear regression.
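The down-up walk mentioned above can be sketched as follows. This is a minimal illustrative implementation, not the paper's algorithm: the coupling matrix J is a stand-in random symmetric matrix normalized as in the SK model, and each up move re-inserts a spin by exact conditional Gibbs sampling (which costs O(d) potential evaluations per step in this naive form).

```python
import numpy as np

def down_up_step(x, J, beta, rng):
    """One step of the down-up walk on the fixed-magnetization slice.

    x : {-1, +1} vector with exactly k coordinates equal to +1.
    Down move: pick a +1 coordinate uniformly and flip it to -1.
    Up move: flip one -1 coordinate back to +1, chosen with probability
    proportional to the Gibbs weight exp(beta * x^T J x) of the result.
    The number of +1 spins is preserved, so the chain stays on the slice.
    """
    x = x.copy()
    plus = np.flatnonzero(x == 1)
    x[rng.choice(plus)] = -1              # down move: remove a +1 spin
    minus = np.flatnonzero(x == -1)
    energies = []
    for j in minus:                       # up move: conditional Gibbs
        x[j] = 1
        energies.append(beta * (x @ J @ x))
        x[j] = -1
    energies = np.array(energies)
    w = np.exp(energies - energies.max()) # subtract max for stability
    x[rng.choice(minus, p=w / w.sum())] = 1
    return x
```

A short chain on a toy instance: draw J symmetric with entries scaled by 1/sqrt(d), start from any point on the slice, and iterate `down_up_step`. The lower bound in the abstract concerns exactly how fast (or slowly) this chain contracts in Hamming distance when k >> sqrt(d) and beta >> 1.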

Joint work with Syamantak Kumar, Purnamrita Sarkar, and Yusong Zhu.

Short Bio: Kevin Tian is an assistant professor of Computer Science at UT Austin. His areas of interest include optimization, high-dimensional statistics, and machine learning theory.

Zoom link: https://utexas.zoom.us/j/84254847215