Paper: https://arxiv.org/abs/1911.06455
Implementation code: https://github.com/seongjunyun/Graph_Transformer_Networks
It helps to read the paper first and then go through the code with these notes; you will get much more out of both!
The goal of the paper is to build GTN (Graph Transformer Networks), which learn effective node representations on heterogeneous graphs.
Limitations of conventional GNN approaches:
- Most GNNs are designed for homogeneous graphs and apply the same aggregation to every edge, so they are ineffective on heterogeneous graphs.
- A simple workaround is to ignore node and edge types, but then the type information is lost.
- Another option is manually designed meta-paths: the heterogeneous graph is first transformed into homogeneous graphs defined by the meta-paths, and a GNN is then applied. The drawbacks are that each problem requires its own hand-designed meta-paths, the final result depends on which meta-paths are chosen, and choosing them requires domain knowledge.
Overall framework
- Definition of a meta-path: a meta-path is a path whose consecutive edges carry (possibly different) edge types t1, ..., tl, and its adjacency matrix is the product of the corresponding edge-type adjacency matrices.
![meta-path definition](https://img-blog.csdnimg.cn/20200118125818858.png)
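As a concrete illustration (a toy two-edge-type graph of my own, not from the paper's code), the adjacency matrix of a meta-path is obtained by multiplying the per-edge-type adjacency matrices along the path:

```python
import numpy as np

# Toy heterogeneous graph with 4 nodes and two edge types t1, t2.
A_t1 = np.array([[0, 1, 0, 0],
                 [0, 0, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 0, 0]], dtype=float)
A_t2 = np.array([[0, 0, 0, 0],
                 [0, 0, 1, 0],
                 [0, 0, 0, 0],
                 [1, 0, 0, 0]], dtype=float)

# Adjacency of the length-2 meta-path t1 -> t2:
# A_path[i, j] > 0 iff there is a path i --t1--> k --t2--> j.
A_path = A_t1 @ A_t2
print(A_path)  # nonzero at (0, 2) via node 1, and at (2, 0) via node 3
```

The multiplication order here follows "first edge type on the left"; the paper writes the product in the reverse order, which just corresponds to the transposed convention.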
Convolution layer
Code:
Size of A: K×N×N (the K edge-type adjacency matrices stacked as candidates, N nodes)
Size of W: 1×1×K (the 1×1 convolution weights, normalized with softmax over the K edge types)
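The 1×1 convolution can be read as a soft selection of an edge type: with the candidate tensor A of shape K×N×N and logits w of shape K, the output is Q = Σ_k softmax(w)_k · A_k. A minimal numpy sketch (variable names are mine, not the repo's):

```python
import numpy as np

def soft_select(A, w):
    """1x1 convolution over K candidate adjacency matrices.

    A: (K, N, N) stacked candidate adjacency matrices
    w: (K,) learnable logits; softmax(w) gives a convex
       combination, i.e. a soft choice of edge type.
    """
    alpha = np.exp(w - w.max())
    alpha = alpha / alpha.sum()            # softmax over edge types
    return np.tensordot(alpha, A, axes=1)  # weighted sum -> (N, N)

K, N = 3, 4
rng = np.random.default_rng(0)
A = rng.integers(0, 2, size=(K, N, N)).astype(float)
w = np.array([5.0, 0.0, -5.0])  # strongly prefers edge type 0
Q = soft_select(A, w)           # Q is close to A[0]
```

With sharp logits the softmax approaches a one-hot vector, so the "convolution" degenerates to picking a single edge type; with softer logits it blends several types.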
GT layer
A GT layer stacks two of the 1×1-convolution structures above and multiplies their outputs, which yields the adjacency matrix of a learned length-2 meta-path.
Code
Solving the problem that meta-path length would otherwise grow strictly with the number of layers: the identity matrix I is included among the candidate adjacency matrices, so a layer can select "no transition" and a stack of L layers can express meta-paths of any length up to L+1.
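A GT layer can then be sketched as two soft selections whose results are multiplied, with the identity matrix appended to the candidates so shorter meta-paths stay reachable (a sketch under those assumptions, not the repo's actual code):

```python
import numpy as np

def softmax(w):
    e = np.exp(w - w.max())
    return e / e.sum()

def gt_layer(A, w1, w2):
    """One Graph Transformer layer (sketch).

    A: (K, N, N) candidates, already including the identity
       matrix as the last "edge type".
    w1, w2: (K,) logits for the two 1x1-conv branches.
    """
    Q1 = np.tensordot(softmax(w1), A, axes=1)
    Q2 = np.tensordot(softmax(w2), A, axes=1)
    return Q1 @ Q2  # adjacency of the composed meta-path

N = 4
A_t1 = np.eye(N, k=1)                   # toy edge type 1
A_t2 = np.eye(N, k=-1)                  # toy edge type 2
A = np.stack([A_t1, A_t2, np.eye(N)])   # identity appended last

# Branch 1 picks t1, branch 2 picks the identity: the layer
# then reproduces (approximately) A_t1 itself, i.e. a
# length-1 meta-path despite the length-2 composition.
Q = gt_layer(A, np.array([9., -9., -9.]), np.array([-9., -9., 9.]))
```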
GTN
Code
gcn_conv:
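Once the GT layers have produced a meta-path adjacency matrix, node features are propagated over it with a GCN-style convolution, roughly H' = D̃⁻¹ Ã X W with Ã = A + I. A sketch of that propagation rule (details such as the exact normalization may differ from the repo's gcn_conv):

```python
import numpy as np

def gcn_conv(A, X, W):
    """GCN-style propagation on a learned meta-path graph (sketch).

    A: (N, N) learned adjacency matrix
    X: (N, F) node features
    W: (F, F_out) weight matrix
    """
    A_tilde = A + np.eye(A.shape[0])  # add self-loops
    deg = A_tilde.sum(axis=1)         # row degrees (>= 1, so no div by 0)
    A_norm = A_tilde / deg[:, None]   # D^-1 * A_tilde
    return A_norm @ X @ W             # aggregate neighbors, then project

N, F, F_out = 4, 3, 2
rng = np.random.default_rng(1)
A = (rng.random((N, N)) > 0.5).astype(float)
X = rng.random((N, F))
W = rng.random((F, F_out))
H = gcn_conv(A, X, W)  # (N, F_out) updated node representations
```

In GTN this convolution is applied to each channel's learned graph, and the per-channel outputs are concatenated before the final classifier.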
Experiments
Comparison between the meta-paths generated by the model and pre-defined meta-paths:
Validity of the learned meta-paths:
Highlights of the paper
- No domain knowledge or manually designed meta-paths are required: GTN learns valid meta-paths from the candidate adjacency matrices.
- Scalability.