[Stanford] Distance Encoding: a more powerful GNN

Personal summary:

The authors break through the limitation of the 1-Weisfeiler-Lehman (WL) test by proposing a class of structure-related features, called Distance Encoding (DE), that helps a GNN represent node sets of arbitrary size with expressive power strictly stronger than the 1-WL test. DE essentially captures the distance between the node set to be represented and each node in the graph, using graph-related measures such as shortest-path distance and generalized PageRank scores.

What are the Weisfeiler-Lehman (WL) algorithm and the WL test?
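Briefly: the 1-WL test iteratively recolors each node by combining its current color with the multiset of its neighbors' colors; two graphs whose stable color histograms differ are certainly non-isomorphic, while equal histograms are inconclusive. A minimal sketch of 1-WL color refinement (my own illustration, not code from the paper):

```python
from collections import Counter

def wl_colors(adj, num_iters=3):
    # 1-WL color refinement. Colors are kept as nested tuples rather
    # than compressed integers so that labels stay canonical across
    # different graphs (at the cost of growing color descriptions).
    colors = {v: () for v in adj}
    for _ in range(num_iters):
        colors = {v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
                  for v in adj}
    return colors

def wl_indistinguishable(adj_g, adj_h, num_iters=3):
    # Graphs with different color histograms are certainly non-isomorphic;
    # identical histograms are inconclusive -- this is the 1-WL limit.
    hist = lambda adj: Counter(wl_colors(adj, num_iters).values())
    return hist(adj_g) == hist(adj_h)

# Classic failure case: two triangles vs. one 6-cycle. Every node has
# degree 2, so 1-WL assigns every node the same color in both graphs.
two_triangles = {0: [1, 2], 1: [0, 2], 2: [0, 1],
                 3: [4, 5], 4: [3, 5], 5: [3, 4]}
six_cycle = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
print(wl_indistinguishable(two_triangles, six_cycle))  # True (inconclusive)
```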

Synopsis

Learning structural representations of node sets from graph-structured data is essential for applications ranging from node-role discovery to link prediction and molecular classification. Graph Neural Networks (GNNs) have achieved great success in structural representation learning. However:

Most GNNs are limited by the 1-Weisfeiler-Lehman (WL) test, so they may generate identical representations for structures and graphs that are actually different. Recently proposed, more powerful GNNs that mimic higher-order WL tests focus only on whole-graph representations and cannot exploit the sparsity of the graph structure to improve computational efficiency. This article proposes a class of structure-related features, called Distance Encoding (DE), to help GNNs represent node sets of any size with expressive power strictly stronger than the 1-WL test. DE essentially captures the distance between the node set to be represented and each node in the graph, using graph-related measures such as shortest-path distance and generalized PageRank scores.
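To make this concrete, here is a minimal sketch of the shortest-path-distance flavor of DE for a target node set, computed with plain BFS (the function names and the `max_dist` cap are my own illustration, not the authors' implementation):

```python
from collections import deque

def spd_encoding(adj, target_set, max_dist=3):
    """One flavor of DE: for every node, record its shortest-path
    distance to each node in the target set, capped at max_dist.
    Returns {node: tuple of capped distances}."""
    def bfs(source):
        dist = {source: 0}
        queue = deque([source])
        while queue:
            v = queue.popleft()
            for u in adj[v]:
                if u not in dist:
                    dist[u] = dist[v] + 1
                    queue.append(u)
        return dist

    per_target = [bfs(t) for t in target_set]
    return {v: tuple(min(d.get(v, max_dist), max_dist) for d in per_target)
            for v in adj}

# Example: DE for the candidate link {0, 3} in a 6-cycle.
six_cycle = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
print(spd_encoding(six_cycle, target_set=(0, 3)))
# e.g. node 1 gets (1, 2): distance 1 to node 0 and distance 2 to node 3
```

Capping distances at `max_dist` mirrors the common practice of bucketing large distances so the encoding stays a small discrete feature.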

In addition, the article proposes two general GNN frameworks for using DEs:

(1) as additional node attributes, and (2) further as controllers of message aggregation in GNNs. Both frameworks can still exploit the sparse graph structure, which keeps them tractable on large graphs (see the sketch below).
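The following schematic PyTorch layer shows how both usages could look inside one message-passing step; the layer name, shapes, and the sigmoid gate are illustrative assumptions, not the paper's exact architecture:

```python
import torch
import torch.nn as nn

class DEMessagePassing(nn.Module):
    """Schematic layer combining the two DE usages:
      (1) DE concatenated to node features as extra attributes;
      (2) DE controlling aggregation, here by weighting each edge
          with a learned function of the endpoints' encodings."""
    def __init__(self, feat_dim, de_dim, out_dim):
        super().__init__()
        self.msg = nn.Linear(feat_dim + de_dim, out_dim)   # usage (1)
        self.gate = nn.Linear(2 * de_dim, 1)               # usage (2)

    def forward(self, x, de, edge_index):
        # x: [N, feat_dim] node features, de: [N, de_dim] encodings,
        # edge_index: [2, E] (source, destination) pairs.
        src, dst = edge_index
        h = self.msg(torch.cat([x, de], dim=-1))           # DE as attribute
        w = torch.sigmoid(self.gate(                       # DE as controller
                torch.cat([de[src], de[dst]], dim=-1)))
        out = torch.zeros_like(h)
        out.index_add_(0, dst, w * h[src])                 # weighted aggregation
        return out

# Example usage on random data: 6 nodes, 12 directed edges.
x = torch.randn(6, 8)
de = torch.randn(6, 2)
edge_index = torch.randint(0, 6, (2, 12))
layer = DEMessagePassing(feat_dim=8, de_dim=2, out_dim=16)
print(layer(x, de, edge_index).shape)  # torch.Size([6, 16])
```

Because aggregation still runs over the edge list, both usages preserve the sparsity advantage noted above.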


Origin blog.csdn.net/qq_40199232/article/details/108383159