Graph neural networks from another angle

0. Introduction to deep learning on graphs

Traditional deep learning takes input data that lives in Euclidean space as grids or sequences. Graph data is different: the connections between neighboring nodes are diverse and nodes carry varied attributes, so the data is stored in an irregular, non-Euclidean format. Traditional CNN and LSTM networks are not suited to processing such graph data.

1. Graph neural networks

The article covers the following three aspects:
1) an introduction to graph data and its features

2) graph neural networks

3) applications of graph neural networks

1) Basic features of graph data

Graphs, the adjacency matrix of a graph, the Laplacian matrix of a graph

The Laplacian operator

Eigendecomposition of the graph Laplacian matrix (solving for its eigenvalues and eigenvectors)

The relationship between the Laplacian eigenvalues/eigenvectors and signal processing

The Fourier transform of graph data (graph signals)

The inverse Fourier transform of graph data
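
As a concrete reference for the notes above, here is a minimal numpy sketch (my own illustration, not code from the post) of the graph Laplacian, its eigendecomposition, and the graph Fourier transform and its inverse; the toy 4-node adjacency matrix is an assumption for demonstration.

```python
# Minimal sketch: graph Laplacian, eigendecomposition, graph Fourier transform.
import numpy as np

# Toy undirected 4-node graph (assumed for illustration), given by its adjacency matrix.
A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)

D = np.diag(A.sum(axis=1))          # degree matrix
L = D - A                           # (unnormalized) graph Laplacian

# Eigendecomposition L = U diag(lam) U^T; the columns of U form the graph Fourier basis.
lam, U = np.linalg.eigh(L)

x = np.array([1.0, 2.0, 3.0, 4.0])  # a graph signal: one scalar per node

x_hat = U.T @ x                     # graph Fourier transform
x_rec = U @ x_hat                   # inverse graph Fourier transform

print(np.allclose(x, x_rec))        # True: the transform pair is lossless
```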

Graph data tasks:

1) node classification

2) edge classification

3) community detection

4) graph classification

5) graph generation

General framework: filtering -> node representations (embeddings) -> pooling -> node/graph representations

In the filtering stage, node features are recomputed to give new node representations.

In the graph pooling stage, a smaller graph is produced, compressing the information in the graph data.

For node-level tasks, the general framework is built from the filtering layers.

For extracting graph-level structural information, the framework also adds pooling.

Common graph neural networks filter graph data either in the spatial domain or in the spectral (frequency) domain.

In the original GNN formulation, a node's embedding vector is produced by a feed-forward model and is determined by its neighbors; the next vector is determined by the previous vector and the changes in the neighbor vectors, and the node vectors are updated in a continual auto-regressive (iterative) fashion until they become stable.
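
A hedged sketch of this iterative update in numpy (my own illustration; the toy graph, state dimension d, and weight matrix W are assumptions): node states are recomputed from neighbor states plus the node's own input until they stop changing.

```python
# Sketch: iterate node-state updates until the vectors stabilize (fixed point).
import numpy as np

rng = np.random.default_rng(0)
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)   # assumed toy adjacency matrix
d = 8                                       # assumed state dimension
W = rng.normal(scale=0.05, size=(d, d))     # small weights so the update behaves like a contraction
X = rng.normal(size=(4, d))                 # node input features
H = np.zeros_like(X)                        # node state vectors, initialized to zero

for _ in range(200):                        # auto-regressive iteration
    H_new = np.tanh(A @ H @ W + X)          # new state from neighbor states + own features
    if np.max(np.abs(H_new - H)) < 1e-6:    # stop once the states are stable
        H = H_new
        break
    H = H_new
```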

The spectral (frequency-domain) signal-processing pipeline for graph data

When graph data is processed in the frequency domain, the filter is designed in a data-driven, self-adjusting (learnable) way.

Mapping N x d1 data to N x d2 data requires d1 * d2 * N filter parameters in total, and the matrix eigendecomposition is too costly to compute.

The eigendecomposition can be replaced by an algebraic (polynomial) fit.

By fitting the eigendecomposition with a polynomial, multiplying the graph data by a polynomial of the Laplacian matrix in the vertex domain is equivalent to the pipeline: Fourier transform of the graph data -> filtering -> inverse Fourier transform to recover the graph data.
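
A small numpy check of this equivalence (my own sketch; the toy graph and the filter coefficients theta are assumptions): applying a polynomial of the Laplacian directly in the vertex domain gives the same result as Fourier transform -> spectral filtering -> inverse transform.

```python
# Sketch: vertex-domain polynomial filtering == Fourier -> filter -> inverse Fourier.
import numpy as np

A = np.array([[0, 1, 0, 1],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [1, 0, 1, 0]], dtype=float)
L = np.diag(A.sum(axis=1)) - A
lam, U = np.linalg.eigh(L)
x = np.array([1.0, -2.0, 0.5, 3.0])                 # a graph signal

theta = [0.3, -0.2, 0.1]                            # assumed polynomial filter coefficients

# Vertex domain: y = (theta0*I + theta1*L + theta2*L^2) x, matrix products only.
y_vertex = sum(t * np.linalg.matrix_power(L, k) @ x for k, t in enumerate(theta))

# Spectral domain: transform, scale each frequency by the polynomial of its eigenvalue, invert.
g = sum(t * lam**k for k, t in enumerate(theta))    # filter response at each eigenvalue
y_spectral = U @ (g * (U.T @ x))

print(np.allclose(y_vertex, y_spectral))            # True: the two views agree
```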

From the spatial-domain perspective, a K-order polynomial filter is equivalent to operating on each node's K-hop neighborhood.

The filter can be expressed with Chebyshev polynomials (a truncated Chebyshev expansion replaces the eigendecomposition).

ChebNet, the Chebyshev graph neural network, is the network obtained by replacing the matrix eigendecomposition with Chebyshev polynomial matrix transformations.

GCN is a special case of the truncated Chebyshev network: it approximates the Chebyshev expansion with a two-term (first-order) truncation.
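
A minimal sketch of a single GCN propagation layer under that two-term truncation (my own illustration, not the post's code; the toy adjacency matrix and weight shapes are assumptions): H' = ReLU(D^{-1/2}(A + I)D^{-1/2} H W).

```python
# Sketch: one GCN layer with the renormalized adjacency matrix.
import numpy as np

def gcn_layer(A, H, W):
    """H' = ReLU(D^{-1/2} (A + I) D^{-1/2} H W)."""
    A_hat = A + np.eye(A.shape[0])                  # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))   # D^{-1/2} as a vector
    A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return np.maximum(A_norm @ H @ W, 0.0)          # ReLU activation

rng = np.random.default_rng(0)
A = np.array([[0, 1, 1],
              [1, 0, 0],
              [1, 0, 0]], dtype=float)              # assumed toy graph
H = rng.normal(size=(3, 4))                         # 3 nodes, 4 input features
W = rng.normal(size=(4, 2))                         # project to 2 hidden features
print(gcn_layer(A, H, W).shape)                     # (3, 2)
```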

GCN viewed as a filter on graph data, from the signal-processing perspective

From the spatial-domain perspective, GCN performs node feature extraction plus aggregation (the influence of neighbor nodes on a node, which effectively encodes graph structure information).

Comparing GCN with GNN: both involve neighbor node information and graph structure information. In GCN, a node's representation is determined by the previous feature matrix and the neighbor (adjacency) matrix.

In GNN, it is obtained by passing the previous result and the neighbor nodes through a feed-forward network.

GraphSAGE samples subgraphs (neighborhood sampling).

Message passing happens first; once all messages have been delivered, the node vectors are updated.
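
A hedged sketch of this sample-aggregate-update pattern in the GraphSAGE style (my own illustration; the adjacency lists, mean aggregator, and weight matrices W_self / W_neigh are assumptions).

```python
# Sketch: sample neighbors, aggregate their messages, then update every node vector.
import numpy as np

rng = np.random.default_rng(0)
neighbors = {0: [1, 2, 3], 1: [0], 2: [0, 3], 3: [0, 2]}   # assumed adjacency lists
H = rng.normal(size=(4, 8))                                # current node vectors
W_self = rng.normal(size=(8, 8))                           # assumed weights for the node itself
W_neigh = rng.normal(size=(8, 8))                          # assumed weights for the aggregated message
num_samples = 2                                            # neighbors sampled per node

messages = np.zeros_like(H)
for v, nbrs in neighbors.items():
    sampled = rng.choice(nbrs, size=min(num_samples, len(nbrs)), replace=False)
    messages[v] = H[sampled].mean(axis=0)                  # mean aggregation of sampled neighbors

# All messages are collected first, then every node vector is updated in one step.
H_new = np.tanh(H @ W_self + messages @ W_neigh)
```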

Graph pooling shrinks a large graph into a smaller one: node importance is computed and only the more important nodes are kept.
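
A minimal sketch of importance-based (top-k) pooling (my own illustration; the random graph and the learnable scoring vector p are assumptions): score the nodes, keep the k highest-scoring ones, and slice the feature and adjacency matrices down to them.

```python
# Sketch: keep only the top-k most important nodes and the subgraph they induce.
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(6, 4))                        # 6 nodes, 4 features (assumed)
A = (rng.random((6, 6)) > 0.6).astype(float)
A = np.triu(A, 1); A = A + A.T                     # symmetric adjacency, no self-loops
p = rng.normal(size=4)                             # assumed learnable scoring vector

scores = X @ p / np.linalg.norm(p)                 # node importance scores
keep = np.argsort(scores)[-3:]                     # keep the k = 3 most important nodes

X_pool = X[keep] * np.tanh(scores[keep])[:, None]  # gate the kept features by their scores
A_pool = A[np.ix_(keep, keep)]                     # induced adjacency of the kept nodes
```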

Clustering-based pooling reduces the dimension of the feature matrix in two steps: an assignment (threshold) matrix is set up to reduce the feature dimension, then the features are regenerated.

Both the adjacency matrix and the feature matrix are transformed accordingly.
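
A hedged sketch of assignment/clustering-based pooling in the DiffPool style (my own illustration; in a real model the assignment logits come from a GNN layer rather than random numbers): a soft assignment matrix S maps N nodes to a smaller set of clusters, and both the feature matrix and the adjacency matrix are regenerated at the cluster level.

```python
# Sketch: soft cluster assignment S coarsens both the feature and adjacency matrices.
import numpy as np

rng = np.random.default_rng(0)
N, d, n_clusters = 6, 4, 2                         # assumed sizes
X = rng.normal(size=(N, d))                        # node feature matrix
A = (rng.random((N, N)) > 0.6).astype(float)
A = np.triu(A, 1); A = A + A.T                     # symmetric adjacency

logits = rng.normal(size=(N, n_clusters))          # in practice produced by a GNN layer
S = np.exp(logits) / np.exp(logits).sum(axis=1, keepdims=True)  # row-wise softmax assignment

X_coarse = S.T @ X                                 # regenerated cluster-level features
A_coarse = S.T @ A @ S                             # regenerated cluster-level adjacency
print(X_coarse.shape, A_coarse.shape)              # (2, 4) (2, 2)
```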

Eigen-based (spectral) pooling: by optimizing a better Hp transformation matrix, better node features and graph structure are captured.

Eigen pooling is implemented via graph-signal processing, by truncating the Fourier coefficients.
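
A simplified sketch of the Fourier-truncation idea (my own illustration, a global simplification rather than the full eigen-pooling method): project the node features onto the graph Fourier basis and keep only the first k low-frequency coefficients.

```python
# Sketch: compress node features by truncating their graph Fourier coefficients.
import numpy as np

A = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 0, 0],
              [1, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)       # assumed 5-node graph
L = np.diag(A.sum(axis=1)) - A
lam, U = np.linalg.eigh(L)                         # eigenvectors ordered by frequency

X = np.random.default_rng(0).normal(size=(5, 3))   # node feature matrix
k = 2                                              # number of Fourier coefficients to keep

X_hat = U.T @ X                                    # graph Fourier coefficients of each feature
X_pooled = X_hat[:k]                               # keep only the k low-frequency coefficients
```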

Graph applications:

1) drug discovery: graph generation

2) graph to sequence --

3) knowledge-graph recommendation


Origin blog.csdn.net/liangwqi/article/details/104345386