Graph Neural Networks (GNN)

Disclaimer: This is an original article by the blogger, licensed under the CC 4.0 BY-SA agreement. Please attach the original source link and this statement when reproducing it.
This link: https://blog.csdn.net/you_jinpeng/article/details/102762809

Reference URL: https://www.cnblogs.com/SivilTaram/p/graph_neural_network_1.html
Paper: "A Comprehensive Survey on Graph Neural Networks"


Deriving the structural framework

For an irregular graph (as opposed to a regular grid such as an image), we can still define a convolution-like operation that computes a node's hidden state from the information carried by its surrounding nodes and edges. Suppose node v has ① its own features x_v, ② the features of the edges connected to it, x_{co[v]}, ③ the hidden states of its neighboring nodes, h^t_{ne[v]}, and ④ the features of its neighboring nodes, x_{ne[v]}. Then the node's hidden state at the next step can be computed by some function f:

h_v^{t+1} = f(x_v, x_{co[v]}, h_{ne[v]}^{t}, x_{ne[v]})
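The update above can be sketched in code. This is a minimal illustration, not the paper's actual f: the real f is a learned (and contractive) neural network, while here a fixed random linear map followed by tanh stands in for it, with neighbor and edge contributions summed so the function works for any node degree. All names and dimensions are assumptions for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 4  # feature / hidden dimension (assumed for illustration)

# Toy stand-in parameters for the learned function f.
W = rng.normal(scale=0.1, size=(DIM, 4 * DIM))

def f(x_v, x_co_v, h_ne_v, x_ne_v):
    """Compute h_v^{t+1} from node v's own features, the features of its
    incident edges, and its neighbors' hidden states and features.
    Neighbor/edge contributions are summed so f is degree-independent."""
    z = np.concatenate([x_v,
                        x_co_v.sum(axis=0),   # (2) incident-edge features
                        h_ne_v.sum(axis=0),   # (3) neighbors' hidden states
                        x_ne_v.sum(axis=0)])  # (4) neighbors' features
    return np.tanh(W @ z)

# One node v with two neighbors.
x_v = rng.normal(size=DIM)           # (1) node v's own features
x_co_v = rng.normal(size=(2, DIM))   # features of its two incident edges
h_ne_v = np.zeros((2, DIM))          # neighbors' hidden states at time t
x_ne_v = rng.normal(size=(2, DIM))   # neighbors' features

h_v_next = f(x_v, x_co_v, h_ne_v, x_ne_v)
print(h_v_next.shape)  # (4,)
```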
How do we use these hidden states once they are computed?
At this point we can set up another function g that adapts them to the downstream task:

o_v = g(h_v, x_v)

where g is known as the local output function.
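As a companion sketch, the local output function g can likewise be represented by a small network; here a plain linear readout over the concatenation of h_v and x_v stands in for it (the shapes and the linear form are assumptions, not the paper's specification).

```python
import numpy as np

rng = np.random.default_rng(1)
DIM, OUT = 4, 2  # hidden/feature dim and output dim (assumed)

U = rng.normal(scale=0.1, size=(OUT, 2 * DIM))  # toy readout weights

def g(h_v, x_v):
    """Local output function: map node v's hidden state and its own
    features to a task-specific output o_v."""
    return U @ np.concatenate([h_v, x_v])

o_v = g(rng.normal(size=DIM), rng.normal(size=DIM))
print(o_v.shape)  # (2,)
```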

So how do we find the function f? We can use a neural network to learn this function automatically. Likewise, g can also be represented by a neural network.

Observe carefully the connections between two consecutive time steps: they closely mirror the wiring of the graph. For example, at time t+1, node 1's hidden state receives node 3's hidden state from the previous time step, because node 3 is a neighbor of node 1. The propagation continues until time T_n, when every node's hidden state has converged; each node then feeds its hidden state through g to obtain its output o_v.

For different graphs, the convergence time may differ. Convergence is judged by whether the 2-norm of the difference between the hidden states at two consecutive time steps falls below a threshold ε:

||H^{t+1} - H^{t}||_{2} < \varepsilon

Once this threshold is reached, we consider the iteration to have converged.
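The convergence loop above can be sketched as follows. The hidden states of all nodes are stacked into a matrix H and updated jointly until the change between steps drops below eps. The specific update rule (tanh of a scaled adjacency product plus features) is an illustrative contractive stand-in for the learned f, not the paper's actual network; the graph and features are toy values.

```python
import numpy as np

# Toy 3-node graph: node 0 connected to nodes 1 and 2.
A = np.array([[0, 1, 1],
              [1, 0, 0],
              [1, 0, 0]], dtype=float)
X = np.eye(3)           # toy node features
H = np.zeros((3, 3))    # initial hidden states H^0
eps = 1e-6

for t in range(1000):
    # Contractive update (assumed form): the 0.2 scaling keeps the map
    # a contraction, so the iteration is guaranteed to converge.
    H_next = np.tanh(0.2 * A @ H + X)
    if np.linalg.norm(H_next - H) < eps:  # ||H^{t+1} - H^t||_2 < eps
        H = H_next
        break
    H = H_next

print(t)  # number of iterations until convergence
```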

Banach's fixed-point theorem

If you are not familiar with it, refer to this URL: https://www.cnblogs.com/SivilTaram/p/graph_neural_network_1.html
The contraction requirement can be satisfied by introducing a Jacobian matrix penalty term; see the reference link for the specific formula.
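The theorem itself is easy to see numerically: a contraction mapping on a complete metric space has a unique fixed point, and iterating the map from any starting point converges to it. A classic scalar example (chosen here for illustration, not taken from the article) is f(x) = cos(x), which is a contraction on the relevant interval:

```python
import math

# Banach fixed-point iteration: x_{n+1} = f(x_n) with f(x) = cos(x).
# Because |f'(x)| = |sin(x)| < 1 near the fixed point, the iteration
# converges to the unique solution of x = cos(x) from any start.
x = 0.0
for _ in range(100):
    x = math.cos(x)

print(round(x, 6))  # ≈ 0.739085, the fixed point of cos
```

This is exactly the mechanism the GNN relies on: because f is constrained to be a contraction, repeatedly applying it drives the hidden states to a unique fixed point regardless of initialization.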
