The whole world is discussing ChatGPT, yet many people don't actually know what GPT is.

Let's break GPT down in an easy-to-understand way. GPT is short for Generative Pre-trained Transformer, which translates as a generative, pre-trained transformation model. Let's look at the three words one by one.

The first letter of GPT, G, stands for Generative. That is why we call GPT generative artificial intelligence, the so-called generative AI.

Prior to this, artificial intelligence and machine learning were largely limited to observing, analyzing, and classifying content. A classic machine learning problem is image recognition: for example, to recognize a cat, a program searches through and analyzes huge numbers of images to find the ones that match "cat".
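As a miniature illustration of that analyze-and-classify paradigm, here is a toy nearest-centroid classifier. The data and the "cat"/"dog" classes are entirely made up; real image recognition uses deep neural networks trained on huge labeled datasets, but the shape of the task is the same: look at existing data and assign a label.

```python
import numpy as np

# Toy "images": flattened 2x2 grayscale patches (values invented).
cats = np.array([[0.9, 0.8, 0.7, 0.9], [0.8, 0.9, 0.8, 0.7]])
dogs = np.array([[0.1, 0.2, 0.1, 0.3], [0.2, 0.1, 0.3, 0.2]])

def train(classes):
    # "Training" here is just averaging each class's examples
    # into a centroid that represents the class.
    return {label: imgs.mean(axis=0) for label, imgs in classes.items()}

def classify(model, image):
    # Predict the class whose centroid is closest to the image.
    return min(model, key=lambda label: np.linalg.norm(model[label] - image))

model = train({"cat": cats, "dog": dogs})
print(classify(model, np.array([0.85, 0.85, 0.75, 0.8])))  # → cat
```

The point is that the program only ever labels inputs it is shown; it never produces a new image of a cat.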

The generative AI represented by GPT is a technological breakthrough: it is an AI that can generate new content, rather than being limited to analyzing existing data. For example, it can create an image of a cat, or a text description of one.

A generative AI model can also be used to generate program code, poetry, articles, artwork, and more.
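To make "generative" concrete, here is a deliberately tiny sketch: a bigram model that learns which word tends to follow which in a toy corpus, and then generates new text instead of merely labeling it. The corpus and function names are invented for illustration; GPT does the same kind of next-word prediction, but with a huge neural network over a huge corpus.

```python
import random
from collections import defaultdict

# A tiny bigram "language model" trained on a made-up corpus.
corpus = "the cat sat on the mat the cat ate the fish".split()

def train(words):
    # Record, for each word, which words have followed it.
    follows = defaultdict(list)
    for a, b in zip(words, words[1:]):
        follows[a].append(b)
    return follows

def generate(model, start, n, seed=0):
    # Repeatedly sample a plausible next word: this *creates* text.
    random.seed(seed)
    out = [start]
    for _ in range(n):
        candidates = model.get(out[-1])
        if not candidates:
            break
        out.append(random.choice(candidates))
    return " ".join(out)

model = train(corpus)
print(generate(model, "the", 5))
```

Every sentence it emits is new, even though every transition it uses was learned from the training data; that is the generative idea in miniature.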

The ChatGPT released by OpenAI focuses on generating text content. In the future there should be generative AI in other forms, such as image art. Although the fields differ, the technical core is the same: generative artificial intelligence.

The second letter of GPT, P, stands for Pre-trained. It means the model has been trained in advance on a finite dataset, just as we might read some related books before being asked to answer a question.

The reason GPT can answer our questions so much like a human is that it has been trained on an enormous amount of data. That data was written by real people: content that humans published on the Internet before 2022.

So how is the pre-training done? It uses two techniques: one called supervised learning, and the other called reinforcement learning from human feedback (RLHF).
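Here is a rough sketch of what those two training signals look like. This is purely illustrative: the prompts, the comparison data, and the `reward` function are invented, and real RLHF trains a neural reward model on human preference rankings and then optimizes the language model against it (for example with the PPO algorithm).

```python
# 1) Supervised learning: labeled (prompt, desired response) pairs
#    written by human annotators.
supervised_data = [
    ("What is the capital of France?", "Paris."),
]

# 2) Human feedback: annotators rank candidate responses, and a
#    reward model learns to score responses so preferred ones win.
comparisons = [
    {"prompt": "Tell me a joke",
     "preferred": "A short, funny joke.",
     "rejected": "I don't know."},
]

def reward(response):
    # Stand-in scoring function; a real reward model is a trained
    # neural network, not a word count.
    return len(response.split())

def pick_best(candidates):
    # The model is gradually nudged toward responses that the
    # reward model scores highly.
    return max(candidates, key=reward)

best = pick_best(["I don't know.", "A short, funny joke."])
print(best)
```

The combination matters: supervised examples teach the model what good answers look like, and the human-feedback signal teaches it which of several plausible answers people actually prefer.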

Before ChatGPT was released, a great deal of supervised learning and reinforcement learning from human feedback had already been carried out. That is why, when we use it, the model can quickly and accurately generate coherent, engaging responses that sound human, like a well-read scholar who, asked to write an essay on the spot, finishes it in one sitting.

The third letter of GPT, T, stands for Transformer. Translated literally it means a converter or transformer, and the movie Transformers shares the name, but translating it as an electrical transformer or as the robots would be wrong. What Transformer really refers to is a low-level algorithmic architecture for machine learning: a deep neural network. The architecture was developed in 2017 by Google Brain, an artificial intelligence research team at Google. It uses a mechanism called self-attention: when making a prediction, the model can assign different weights to different parts of the input data, attending to any position in the language sequence.
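A minimal NumPy sketch of the self-attention idea follows. The shapes and random weights are made up for illustration; a real Transformer uses learned projection matrices, multiple attention heads, masking, and many stacked layers. The core computation, though, is this: every position produces a weighted mixture of every other position.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: rows become weights summing to 1.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    # Project each token embedding into a query, key, and value.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    d_k = Q.shape[-1]
    # Scaled dot-product scores: how much each position should
    # attend to every other position in the sequence.
    scores = Q @ K.T / np.sqrt(d_k)
    weights = softmax(scores, axis=-1)
    # Output = attention-weighted mixture of the value vectors.
    return weights @ V, weights

rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))            # 4 tokens, 8-dim embeddings
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
out, attn = self_attention(X, Wq, Wk, Wv)
```

Because the weights in `attn` are computed from the data itself, the model can emphasize whichever positions in the sequence are most relevant, exactly the "assign different weights to different parts of the input" behavior described above.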

The architecture also supports processing much larger datasets. Putting the three letters together gives us GPT. The GPT models were first launched by OpenAI as GPT-1 in 2018, then developed into GPT-2 in 2019 and GPT-3 in 2020, and most recently, in 2022, evolved further into InstructGPT and ChatGPT. Another driver of the GPT models' evolution has been improvements in the computing efficiency of the underlying hardware.

This enabled GPT-3 to be trained on far more data than GPT-2, giving it a more diverse knowledge base and the ability to perform a wider range of tasks.

After all this explanation, are you still a bit lost? That's all right. The scariest thing in cognition is not knowing what we don't know, so coming to know what we don't know is itself a kind of growth and progress. That is also the point of this video. We should hold cutting-edge technology in awe, and be grateful for the hard work of generations of science and technology workers at home and abroad.

It is their work that gives us the chance to enjoy such an epoch-making technological product in our lifetime. I'm Xueba; follow me and join my Xueba fan group, and let's achieve a winning life together.

Origin blog.csdn.net/CSDN6706/article/details/129793249