IFML Seminar: What Makes Data Suitable For Deep Learning?
Nadav Cohen, Assistant Professor, Tel Aviv University
The University of Texas at Austin
Abstract: Deep learning delivers unprecedented performance when applied to various data modalities, yet there are data distributions over which it utterly fails. The question of what makes a data distribution suitable for deep learning is a fundamental open problem in the field. In this talk I will present a recent theory aiming to address the problem via tools from quantum physics. The theory establishes that certain neural networks are capable of accurate prediction over a data distribution if and only if the data distribution admits low quantum entanglement under certain partitions of features. This brings forth practical methods for adapting data to neural networks, and vice versa. Experiments with widely used models on various datasets demonstrate the findings. An underlying theme of the talk will be the potential of physics to advance our understanding of the relation between deep learning and real-world data.
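To make the notion of entanglement under a feature partition concrete, here is a minimal sketch (not taken from the talk): a joint distribution over two groups of features can be viewed as a matrix, and one standard entanglement measure is the Shannon entropy of the squared singular values of its element-wise square root (the Schmidt spectrum of the corresponding quantum-like state). The function name and the specific measure are illustrative assumptions, not necessarily those used in the covered works.

```python
import numpy as np

def entanglement_entropy(p_joint):
    """Entanglement entropy of a joint distribution p(x, y) over two
    feature groups, where rows index one group and columns the other.

    Illustrative measure: entropy of the squared singular values of
    the amplitude matrix sqrt(p), i.e. the Schmidt spectrum of the
    quantum-like state that embeds the distribution."""
    # Amplitude matrix: element-wise square root of the normalized joint.
    amp = np.sqrt(p_joint / p_joint.sum())
    s = np.linalg.svd(amp, compute_uv=False)
    probs = s ** 2                  # Schmidt coefficients sum to 1
    probs = probs[probs > 1e-12]    # drop numerical zeros
    return float(-(probs * np.log(probs)).sum())

# Independent feature groups -> rank-1 amplitude matrix -> zero entanglement.
px = np.array([0.3, 0.7])
py = np.array([0.5, 0.5])
print(entanglement_entropy(np.outer(px, py)))  # -> 0.0

# Perfectly correlated binary features -> maximal 2x2 entanglement, log(2).
print(entanglement_entropy(np.eye(2) / 2))     # -> 0.693...
```

Intuitively, low entanglement under a partition means the matricized distribution is well-approximated by a low-rank matrix, which is the regime where the networks analyzed in the theory can predict accurately.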
The works covered in the talk are joint with my graduate students Noam Razin, Yotam Alexander, Nimrod De La Vega, and Tom Verbin.
Bio: Nadav Cohen is an Assistant Professor of Computer Science at Tel Aviv University. His research focuses on the theoretical and algorithmic foundations of deep learning. He earned a BSc in electrical engineering and a BSc in mathematics (both summa cum laude) at the Technion Excellence Program for Distinguished Undergraduates, followed by a PhD (direct track) in computer science at the Hebrew University of Jerusalem. Subsequently, he was a postdoctoral research scholar at the Institute for Advanced Study in Princeton. For his contributions to deep learning, Nadav received a number of awards, including the Google Doctoral Fellowship in Machine Learning, the Rothschild Postdoctoral Fellowship, the Zuckerman Postdoctoral Fellowship, and the Google Research Scholar Award.