Events

IFML Seminar

IFML Seminar: 04/03/26 - Learning Mixture Models via Efficient High-dimensional Sparse Fourier Transforms

Manolis Zampetakis, Assistant Professor of Computer Science, Yale University


The University of Texas at Austin
Gates Dell Complex (GDC 6.302)
2317 Speedway
Austin, TX 78712
United States

Abstract: In this work, we give an algorithm with polynomial time and sample complexity for learning the parameters of a mixture of k spherical distributions in d dimensions. Our method succeeds whenever the component distributions have a characteristic function with sufficiently heavy tails. Somewhat surprisingly, our algorithm learns the parameters without requiring any minimum separation between the component means. This is in stark contrast to the case of spherical Gaussian mixtures, where a minimum separation is provably necessary even information-theoretically. Moreover, unlike all previous methods, our techniques apply to heavy-tailed distributions, including examples that do not even have finite covariances.

Based on joint work with Alkis Kalavasis, Pravesh Kothari, and Shuchen Li.
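To illustrate the Fourier connection in the abstract (a sketch using the standard characteristic-function identity, not material taken from the talk itself), suppose each component is a common spherical base distribution with characteristic function \(\varphi_0\), translated to an unknown mean \(\mu_j\) and mixed with weight \(w_j\). Then the mixture's characteristic function factors as

\[
\varphi_{\mathrm{mix}}(t) \;=\; \sum_{j=1}^{k} w_j \, e^{i \langle t, \mu_j \rangle} \, \varphi_0(t), \qquad t \in \mathbb{R}^d,
\]

so wherever \(\varphi_0(t)\) is bounded away from zero (this is where heavy tails of the characteristic function help), dividing by \(\varphi_0(t)\) leaves a k-sparse exponential sum whose unknown "frequencies" are the component means, which is precisely the setting of a high-dimensional sparse Fourier transform.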
 
Short Bio: Manolis Zampetakis is an Assistant Professor of Computer Science at Yale University working on the foundations of machine learning (ML), statistics, and data science, with a focus on statistical analysis from biased and missing data and on optimization methods for multi-agent environments. Before Yale, Manolis was a post-doctoral researcher in the EECS Department at UC Berkeley. He received his PhD from the EECS Department at MIT, where he was advised by Constantinos Daskalakis. He has been awarded the Google PhD Fellowship, the ACM SIGecom Doctoral Dissertation Award, and the COLT 2025 Best Paper Award.

Zoom link: https://utexas.zoom.us/j/84254847215