Kaiming He made neural networks deeper, and Google made their entrance wider; deep and wide, they became the large models we see today.


The master returns to academia: Kaiming He announces he is joining MIT

The transformer encoders and decoders used in today's large models rely on residual connections, an idea that traces back to ResNet.
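The idea can be sketched in a few lines of NumPy. This is a minimal toy illustration of a residual connection (y = x + F(x)), not ResNet's actual architecture; the layer sizes and weight initialization are arbitrary assumptions for the demo.

```python
import numpy as np

def layer(x, w):
    """A toy fully connected layer with ReLU activation."""
    return np.maximum(0.0, x @ w)

def residual_block(x, w1, w2):
    """y = x + F(x): the identity skip path lets gradients flow
    straight through, which is what makes very deep stacks trainable."""
    return x + layer(layer(x, w1), w2)

rng = np.random.default_rng(0)
x = rng.standard_normal((1, 8))
w1 = rng.standard_normal((8, 8)) * 0.01  # near-zero weights: F(x) ~ 0
w2 = rng.standard_normal((8, 8)) * 0.01
y = residual_block(x, w1, w2)
# With near-zero weights the block stays close to the identity map,
# so adding more such blocks cannot make the network worse at the start.
print(np.allclose(y, x, atol=1e-2))
```

The same skip-connection pattern appears around every attention and feed-forward sublayer in a transformer.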

"With ResNet, you can effectively train deep neural networks with more than 100 layers and make networks truly deep." In his speech at the 2023 World Artificial Intelligence Conference, Tang Xiaoou praised Kaiming He's academic contribution: "Kaiming He made neural networks deeper, and Google made their entrance wider; deep and wide, they became the large models of today."

In November 2021, Kaiming He published, as first author, the paper "Masked Autoencoders Are Scalable Vision Learners," proposing a computer vision model with strong generalization performance. It became a hot topic in the computer vision community as soon as it was published.
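The core trick of that paper is to hide most of an image's patches and train the model to reconstruct them. Below is a minimal NumPy sketch of the random patch masking step only, not the paper's implementation; the patch count and dimensions are made up for the demo, though the 75% mask ratio does match the ratio reported in the paper.

```python
import numpy as np

def random_mask(patches, mask_ratio=0.75, rng=None):
    """Keep a random subset of image patches.
    The MAE encoder sees only the visible patches; the decoder
    is trained to reconstruct the masked ones."""
    rng = rng or np.random.default_rng()
    n = patches.shape[0]
    n_keep = int(n * (1 - mask_ratio))
    perm = rng.permutation(n)
    keep_idx = np.sort(perm[:n_keep])  # indices of visible patches
    return patches[keep_idx], keep_idx

# 16 toy patches, each a 4-dim embedding (hypothetical sizes).
patches = np.arange(16 * 4, dtype=float).reshape(16, 4)
visible, idx = random_mask(patches, rng=np.random.default_rng(0))
print(visible.shape)  # (4, 4): only 25% of the patches remain visible
```

Because the encoder processes only the visible quarter of the patches, training is much cheaper than running a full vision transformer over every patch.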

Newcomers to the field of AI are often surprised to find, as they explore the literature, that Kaiming He is the lead author of so many important studies. Although he has worked in industry for a long time, his attitude toward research has always been regarded as a benchmark: he publishes only a few papers each year, but they are almost invariably heavyweight.

We also often admire Kaiming He's working style: even his groundbreaking papers are concise and easy to read. He explains his "simple" ideas in the most intuitive way, without tricks or unnecessary proofs, just elegant intuition.

Now that he is returning to academia, we look forward to more remarkable work from Kaiming.


Origin blog.csdn.net/fei33423/article/details/132022185