About the talk:
I will present a novel, comprehensive theory of large-scale learning with Deep Neural Networks, based on the correspondence between Deep Learning and the Information Bottleneck framework.
Based partly on joint work with Ravid Shwartz-Ziv and Noga Zaslavsky.
About the speaker:
Dr. Naftali Tishby is a professor of Computer Science and the incumbent of the Ruth and Stan Flinkman Chair for Brain Research at the Edmond and Lily Safra Center for Brain Science (ELSC) at the Hebrew University of Jerusalem. He is one of the leaders of machine learning research and computational neuroscience in Israel, and his numerous former students hold key academic and industrial research positions all over the world.
Prof. Tishby was the founding chair of the new computer engineering program and a director of the Leibniz Research Center in Computer Science at the Hebrew University. He received his PhD in theoretical physics from the Hebrew University in 1985 and was a research staff member at MIT and Bell Labs from 1985 to 1991. He has also been a visiting professor at Princeton NECI, the University of Pennsylvania, UCSB, and IBM Research. His current research is at the interface between computer science, statistical physics, and computational neuroscience. He pioneered various applications of statistical physics and information theory in computational learning theory. More recently, he has been working on the foundations of biological information processing and the connections between dynamics and information. With his colleagues, he has introduced new theoretical frameworks for optimal adaptation and efficient information representation in biology, such as the Information Bottleneck method and the Minimum Information principle for neural coding.
Contact information:
Naftali Tishby
Professor, Hebrew University
Email: tishby (at) cs.huji.ac.il