[NeurIPS 2019] Yoshua Bengio's talk: from System 1 to System 2, where is deep learning headed?

NeurIPS 2019 is being held in Vancouver, Canada. As one of the most important conferences in machine learning, NeurIPS has long been highly influential and is widely regarded as a top venue in neural computation.

This article presents the talk given at the conference by Yoshua Bengio, deep learning pioneer and Turing Award winner.

1. Summary of the report

In its early days, progress in deep learning focused mainly on learning from static datasets, primarily for perceptual tasks. Such tasks rely largely on intuition and can be performed by humans unconsciously; in the terminology of cognitive science, they are System 1 tasks. In recent years, however, shifts in research direction, new tools such as soft attention, and progress in deep reinforcement learning have opened new doors for deep learning architectures and training frameworks. These advances help address System 2 tasks, which humans must perform consciously and deliberately, such as reasoning, planning, capturing causality, and systematic generalization in natural language processing and other applications.

Moving deep learning from System 1 tasks to System 2 tasks makes the long-standing goal of discovering high-level abstract features all the more important, because System 2 abilities place higher demands on representation learning: they require discovering the kind of high-level concepts that humans manipulate skillfully with language. We believe the soft-attention mechanism is a key ingredient for reaching this goal. At each step it attends to only a few concepts and performs computation on them, because, under the consciousness prior and its associated assumptions, many high-level dependencies can be approximately captured by a sparse factor graph. Finally, the report introduces meta-learning; together with this view of representation learning from the perspective of priors and agents, it should help support powerful forms of compositional generalization in novel ways.
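To make the soft-attention idea above concrete, here is a minimal sketch of scaled dot-product soft attention in NumPy. This is a generic illustration rather than code from the report; the function name soft_attention and all shapes and toy data are our own assumptions.

```python
import numpy as np

def soft_attention(query, keys, values):
    """Scaled dot-product soft attention (a minimal sketch).

    query:  (d,)    -- what the current computation is looking for
    keys:   (n, d)  -- one key per candidate concept/element
    values: (n, d)  -- the content associated with each element
    Returns a convex combination of the values; the weights are
    soft (differentiable), so a few elements can dominate without
    a hard, non-differentiable selection.
    """
    d = query.shape[-1]
    scores = keys @ query / np.sqrt(d)       # similarity of query to each key
    weights = np.exp(scores - scores.max())  # numerically stable softmax
    weights /= weights.sum()
    return weights @ values, weights

# Toy usage: 4 candidate "concepts" with 8-dimensional embeddings.
rng = np.random.default_rng(0)
keys = rng.normal(size=(4, 8))
values = rng.normal(size=(4, 8))
query = keys[2] + 0.1 * rng.normal(size=8)   # query close to concept 2
context, w = soft_attention(query, keys, values)
print(np.round(w, 3))                        # weights concentrate on element 2
```

Because the weights come from a differentiable softmax rather than a hard selection, computation can concentrate on a handful of elements at each step, which matches the sparse, few-concepts-at-a-time style of processing that the consciousness prior posits.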

2. About the author

[Photo: Yoshua Bengio]

Yoshua Bengio is a pioneer in artificial intelligence and natural language processing. Born in France in 1964, Bengio grew up in Canada and now lives in Montreal, where he is a professor in the Department of Computer Science and Operations Research at the Université de Montréal. He received his PhD in computer science from McGill University.

Together with Geoffrey Hinton and Yann LeCun, he is regarded as one of the three researchers who drove the development of deep learning through the 1990s and 2000s. In October 2016, Bengio co-founded Element AI, an artificial intelligence incubator based in Montreal.

3. Slides from the report

[Slides from the talk: fourteen images in the original post]

(Part of this article's content was compiled from the report.)

Past review:

[NeurIPS100] The seven award-winning papers of NeurIPS 2019 are announced, with in-depth analysis of selected papers!

The TOP100 list of highly cited NeurIPS scholars over the past decade is released! These leading researchers are worth following!

[NeurIPS100] AMiner participation guide: how to navigate the 13,000-person NeurIPS conference more efficiently?
