# IFML Seminar: 11/1/24 - Tensor Networks and Phase Transitions in Machine Learning

Cristopher Moore, Resident Professor, Santa Fe Institute

The University of Texas at Austin

Gates Dell Complex (GDC 6.302)

2317 Speedway

Austin, TX 78712

United States

**Speaker Bio:** Cristopher Moore received his B.A. in Physics, Mathematics, and Integrated Science from Northwestern University, and his Ph.D. in Physics from Cornell. From 2000 to 2012 he was a professor at the University of New Mexico, with joint appointments in Computer Science and Physics. Since 2012, Moore has been a resident professor at the Santa Fe Institute. He has also held visiting positions at the Niels Bohr Institute, École Normale Supérieure, École Polytechnique, Université Paris 7, Northeastern University, the University of Michigan, and Microsoft Research.

Moore has written over 170 papers at the boundary between mathematics, physics, and computer science, ranging from quantum computing, social networks, and phase transitions in NP-complete problems and Bayesian inference, to risk assessment in criminal justice. He is an elected Fellow of the American Physical Society, the American Mathematical Society, and the American Association for the Advancement of Science. With Stephan Mertens, he is the author of *The Nature of Computation* (Oxford University Press).

**Abstract:** Suppose we observe a matrix of data with a low-rank “signal” obscured by noise. The standard way to find the signal, at least approximately, is PCA (principal component analysis): just look at the eigenvectors of the matrix. For Gaussian noise, random matrix theory tells us exactly how well this works: that is, the accuracy we can achieve as a function of the signal-to-noise ratio.
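To make the matrix setting concrete, here is a minimal numpy sketch (not from the talk) of a rank-one signal planted in symmetric Gaussian noise, recovered by taking the top eigenvector. The names `snr`, `n`, and the scaling conventions are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000      # dimension (assumed)
snr = 3.0     # signal-to-noise ratio (assumed parameter name)

# Rank-one "signal": a random unit vector x, planted as snr * x x^T
x = rng.standard_normal(n)
x /= np.linalg.norm(x)

# Symmetric Gaussian noise, scaled so its spectrum lies roughly in [-2, 2]
G = rng.standard_normal((n, n))
W = (G + G.T) / np.sqrt(2 * n)

Y = snr * np.outer(x, x) + W

# PCA: the top eigenvector of the observed matrix
eigvals, eigvecs = np.linalg.eigh(Y)
v = eigvecs[:, -1]

# Correlation between the estimate and the planted signal
overlap = abs(v @ x)
print(f"overlap = {overlap:.3f}")
```

With this scaling, random matrix theory predicts that the top eigenvector correlates with the signal once the signal-to-noise ratio exceeds a sharp threshold; well above it, as here, the overlap is close to 1.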

For tensors, such as three-index tables A_ijk, the situation is much more complex. Here there seems to be a “statistical-computational gap,” namely a regime where finding the signal is possible but exponentially hard. Physically, this corresponds to a “glass transition,” where the optimum becomes hidden behind an energy barrier. Mathematically, it means that we believe no polynomial-time algorithm exists, and that something like exhaustive search is necessary.

I’ll give evidence for this exponential hardness by showing that no algorithm remotely similar to PCA can work: more precisely, no low-degree polynomial of the input can. Along the way, I’ll give an introduction to tensor networks — a generalization of matrix products and traces that everyone, including network theorists, should know about. If we’re really lucky I’ll mention free probability and free cumulants...
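As a small illustration of the "generalization of matrix products and traces" mentioned above (my example, not the speaker's), `np.einsum` expresses tensor-network contractions directly: a matrix trace is a ring-shaped network, and contracting two copies of a 3-index tensor along shared legs yields a matrix.

```python
import numpy as np

rng = np.random.default_rng(1)
A, B, C = (rng.standard_normal((4, 4)) for _ in range(3))

# A cyclic matrix product is the simplest tensor network:
# tr(ABC) contracts the indices in a ring.
ring = np.einsum('ij,jk,ki->', A, B, C)

# A 3-index tensor T_ijk; contracting two copies of T along the
# j and k legs leaves a matrix M_il -- a small network "diagram".
T = rng.standard_normal((4, 4, 4))
M = np.einsum('ijk,ljk->il', T, T)
```

Here `M` is symmetric by construction, since swapping the two copies of `T` exchanges its indices.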

This is joint work with Tim Kunisky (Yale) and Alex Wein (UC Davis), and appeared in FOCS 2024.
