Deep Learning: The Path to Artificial Intelligence

Time: October 20, 2014 - 7:15 pm


Location: CIBC Auditorium, Goldberg Computer Science Building, Dalhousie University


Cost: FREE


Additional Information: AI researchers seek to understand computational principles that would allow intelligent behaviours, mainly to build smarter computers, but perhaps also to better understand animal intelligence. Learning has turned out to be a key ingredient on the road to AI. It enables computers to learn about the world around us, but it also poses fundamentally hard mathematical challenges, associated with the so-called curse of dimensionality: every observation is different, and the number of possible observations is huge, even on apparently simple tasks like classifying images of characters. Deep learning was introduced to face that challenge by adding to the rich science of machine learning the notion of deep representation: the idea that better models can be learned if the machine constructs and discovers rich and abstract representations of the data. Past and future advances in deep learning hold incredible promise of technological progress on the path towards AI. This realization has strongly influenced information technology markets recently, and there are already impressive results from these investments in science and technology, mostly in computer vision and speech recognition. The talk will conclude with a glimpse of the challenges ahead in greatly expanding the horizon and level of competence of deep learning systems.

Yoshua Bengio (CS PhD, McGill University, 1991) was a post-doc with Michael Jordan at MIT and worked at AT&T Bell Labs before becoming a professor at U. Montreal. He has written two books and around 200 papers, the most cited being in the areas of deep learning, recurrent neural networks, probabilistic learning, NLP and manifold learning. He is among the most cited Canadian computer scientists and one of the scientists responsible for reviving neural networks research with deep learning in 2006. He has sat on the editorial boards of top ML journals and on the board of the NIPS foundation, holds a Canada Research Chair and an NSERC chair, is a program director of the CIFAR NCAP program and has served as program/general chair for NIPS. He is driven by his quest for AI through machine learning, involving fundamental questions on the learning of deep representations, the geometry of generalization in high dimensions, manifold learning, biologically inspired learning, and challenging applications of ML. As of June 2014, Google Scholar finds almost 17,500 citations to his work, yielding an h-index of 58.


Contact: If you have any questions, please contact David Langstroth at
