A beginner's neural network notes

I remember my first encounter with neural networks: I was searching the Internet for predictive models while preparing for a mathematical modeling competition. I was a complete beginner in the field then, and since time was running out I turned to modeling textbooks, whose presentation was really profound; frustrated that I couldn't understand them on my own, I handed that part off to a teammate. Now, after a long, lazy winter vacation, a modeling problem has shaken neural networks back out at me, so it's time to stop procrastinating and learn them properly.
On to the topic.
1. The history of neural networks
Before learning a discipline or a body of knowledge, we should try to understand its background; this helps us see both the trees and the forest (well, I actually forgot how that saying goes). In any case, looking back at the history before learning something is never a problem.
2. What is a neural network?
I believe the first thing that comes to mind on seeing the words "neural network" is the biological neural network. All those years of studying biology weren't wasted; let's recall the biology we learned in high school and draw an analogy.
First, the neuron itself. In a neural network, neurons go by other names, such as units or nodes. But a neuron here differs greatly from a biological one: a biological neuron is a whole structure, while here it is just a small storage space that holds a value.
Second, the dendrites. The dendrites correspond to the input layer of a neural network. The input layer, as its name suggests, is the layer where we feed in our data, and the number of input-layer nodes is determined by the data we have.
Then the axon. The axon corresponds to the hidden layers of a neural network, which transform the data received earlier before it is output, playing a transitional role. A neural network with two or more hidden layers, each containing a large number of units, is called a deep neural network. Of course, a network may have no hidden layer at all, but the appearance of hidden layers is what turned linear models into non-linear ones, which is of historic significance.
Next, the nucleus. The nucleus corresponds to the parameters of a neural network, which are used in the computation. (Parameters are arguably the most important element. In earlier models the parameters were fixed, but as the technology developed, parameters became something that can be optimized by certain methods (this involves loss functions and so on). Why don't I explain the method here? Because I don't know it yet, sob.) There is also a naming convention: the values on connections from non-bias nodes are called weights, and the values on bias nodes are called biases; together they are called parameters. So what is a bias node? In a neural-network diagram, the data is multiplied by the weights and the bias is then added on; a bias node is easy to spot because it has no incoming arrow pointing at it, meaning it is not computed from earlier data.
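The weight-times-input-plus-bias computation described above can be sketched in a few lines of plain Python. This is a minimal illustration, not code from the original post; the choice of a sigmoid activation is my own assumption.

```python
import math

def neuron_output(inputs, weights, bias):
    """One artificial neuron: the weighted sum of the inputs, plus the
    bias, passed through a non-linear activation (sigmoid, assumed here)."""
    # Multiply each input by its connection weight and sum the products.
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    # The sigmoid squashes the result into (0, 1), making the neuron non-linear.
    return 1 / (1 + math.exp(-z))

# Example: two inputs, two weights, one bias (all values are illustrative).
print(neuron_output([1.0, 2.0], [0.5, -0.25], 0.1))
```

Without the bias term, the weighted sum would always pass through zero when all inputs are zero; the bias shifts that point, which is exactly the role the bias node plays in the diagram.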
Finally, the axon terminals. The axon terminals correspond to the output layer of the neural network, with the number of terminals corresponding to the number of output nodes (not a very vivid image, ha ha, okay, I'm rambling). The number of output-layer nodes depends on the type of result we want; for example, with two output nodes the result is a two-dimensional vector.
Ah, one more bit of knowledge: the perceptron, a concept that feels really easy to get wrong. The perceptron was first proposed in 1958, and a multilayer perceptron refers to a two-layer neural network (one with a single hidden layer).
Finally, we can directly liken a neural network model to a biological neural network: data goes in, is transformed by some computation, and data comes out.
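That "data in, compute, data out" picture can be sketched as a tiny two-layer network (one hidden layer, i.e. the multilayer perceptron mentioned above) in plain Python. All the layer sizes, weights, biases, and the sigmoid activation below are illustrative assumptions of mine, not values from the post.

```python
import math

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

def layer(inputs, weights, biases):
    """Compute one layer: each output node takes a weighted sum of all
    the inputs, adds its own bias, and applies the sigmoid activation."""
    return [sigmoid(sum(x * w for x, w in zip(inputs, ws)) + b)
            for ws, b in zip(weights, biases)]

def forward(x):
    # Input layer: 3 nodes -> hidden layer: 2 nodes -> output layer: 2 nodes.
    hidden = layer(x, [[0.2, -0.1, 0.4], [0.7, 0.3, -0.5]], [0.0, 0.1])
    # Two output nodes, so the result is a two-dimensional vector.
    return layer(hidden, [[0.6, -0.2], [0.1, 0.8]], [0.05, -0.05])

print(forward([1.0, 0.5, -1.5]))  # a list of two numbers, each in (0, 1)
```

Training would mean adjusting those weight and bias values to reduce a loss function, which is exactly the parameter optimization the post mentions but does not cover.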
Of course, we can also liken it to a complicated function, a little house, and so on.
A beginner is a beginner: I not only wrote slowly but also stumbled along, and more than once wanted to give up; well, I really did give up on part of it. So after all that painstaking typing, only this part on the concepts of neural networks is left. I actually started writing the development history too, but after rereading it I deleted the whole thing; I felt it deserved to be more detailed and more fun.

Let's make progress together! I want to learn neural network programming too, so without making any big promises, I still look forward to the next update! (It's so hard.)

Whatever is understandable here is either my own or thanks to a great blogger's share;
the original link: https://www.cnblogs.com/subconscious/p/5058741.html



Origin: blog.csdn.net/weixin_45775565/article/details/104338890