Graph Neural Network Summary

  1. Basic concepts of graph theory and the PyG environment configuration
    https://blog.csdn.net/weixin_44133327/article/details/117958901
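    The post itself only covers installation and basic graph terminology; as a toy illustration of how a graph is represented once PyG is set up (my own sketch, not code from the post), a small undirected graph can be stored in a torch_geometric.data.Data object:

      # A 3-node undirected graph; each undirected edge appears twice in
      # edge_index, once per direction.
      import torch
      from torch_geometric.data import Data

      edge_index = torch.tensor([[0, 1, 1, 2],
                                 [1, 0, 2, 1]], dtype=torch.long)
      x = torch.tensor([[-1.0], [0.0], [1.0]])  # one scalar feature per node

      data = Data(x=x, edge_index=edge_index)
      print(data.num_nodes, data.num_edges)     # 3 4
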
  2. Message-passing graph neural networks
    https://blog.csdn.net/weixin_44133327/article/details/118060691

     In this task, you mainly learn how the MessagePassing base class works, and master the call order and roles of the propagate(), message(), aggregate(), and update() functions.
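     As a concrete illustration of that call chain (a minimal sketch of my own, not the layer built in the post), a custom layer only needs to subclass MessagePassing, call propagate() in forward(), and override message(); aggregation here is delegated to the built-in mean aggregator:

      import torch
      from torch_geometric.nn import MessagePassing

      class SimpleConv(MessagePassing):
          def __init__(self, in_channels, out_channels):
              super().__init__(aggr='mean')      # aggregate() takes the mean over neighbors
              self.lin = torch.nn.Linear(in_channels, out_channels)

          def forward(self, x, edge_index):
              x = self.lin(x)
              # propagate() internally calls message() -> aggregate() -> update()
              return self.propagate(edge_index, x=x)

          def message(self, x_j):
              # x_j holds the features of the source node of each edge
              return x_j
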
  3. Node representation learning based on graph neural networks:
    https://blog.csdn.net/weixin_44133327/article/details/118165117
  • A brief analysis of graph data
  • Node classification with an MLP
  • A brief look at graph neural networks
  • Node classification practice with GAT, a representative spatial-convolution model
  • Node classification practice with GCN, a representative spectral-convolution model (a minimal sketch follows this list)
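    As a minimal sketch of the GCN practice above (assuming a Planetoid-style citation dataset such as Cora; hyperparameters are illustrative, not necessarily those used in the post):

      import torch
      import torch.nn.functional as F
      from torch_geometric.datasets import Planetoid
      from torch_geometric.nn import GCNConv

      dataset = Planetoid(root='data/Planetoid', name='Cora')
      data = dataset[0]

      class GCN(torch.nn.Module):
          def __init__(self, hidden_channels):
              super().__init__()
              self.conv1 = GCNConv(dataset.num_features, hidden_channels)
              self.conv2 = GCNConv(hidden_channels, dataset.num_classes)

          def forward(self, x, edge_index):
              x = F.relu(self.conv1(x, edge_index))
              x = F.dropout(x, p=0.5, training=self.training)
              return self.conv2(x, edge_index)

      model = GCN(hidden_channels=16)
      optimizer = torch.optim.Adam(model.parameters(), lr=0.01, weight_decay=5e-4)

      model.train()
      for epoch in range(200):
          optimizer.zero_grad()
          out = model(data.x, data.edge_index)
          loss = F.cross_entropy(out[data.train_mask], data.y[data.train_mask])
          loss.backward()
          optimizer.step()
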
  4. Dataset classes whose data is stored entirely in memory
    https://blog.csdn.net/weixin_44133327/article/details/118283386

     A detailed explanation of the InMemoryDataset base class, which keeps an entire dataset in memory, covering its attributes, methods, and processing workflow.
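     A skeleton of a custom InMemoryDataset subclass following that recipe (a sketch: the file name 'my_graphs.pt' and the helper build_data_list() are placeholders, not names from the post):

      import torch
      from torch_geometric.data import InMemoryDataset

      class MyDataset(InMemoryDataset):
          def __init__(self, root, transform=None, pre_transform=None):
              super().__init__(root, transform, pre_transform)
              # Load the single file produced by process().
              self.data, self.slices = torch.load(self.processed_paths[0])

          @property
          def raw_file_names(self):
              return []                      # nothing to download in this sketch

          @property
          def processed_file_names(self):
              return ['my_graphs.pt']

          def download(self):
              pass                           # raw data is assumed to be local

          def process(self):
              data_list = build_data_list()  # placeholder: returns a list of Data objects
              # Collate the list into one big Data object plus slice indices,
              # then save it so later runs can skip process().
              data, slices = self.collate(data_list)
              torch.save((data, slices), self.processed_paths[0])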

  5. Node prediction and edge prediction:
    https://blog.csdn.net/weixin_44133327/article/details/118284643

    In this task, you mainly learn how to build a dataset from the Data class. Through a PlanetoidPubMed dataset class, you can understand the process of building a dataset (downloading the raw data, generating Data objects, processing and filtering them, and saving the result to file). The main idea of the edge prediction task is to generate negative samples so that the numbers of positive and negative samples are balanced; by using a two-layer GCNConv network for edge prediction, you can understand the edge prediction workflow.
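    A sketch of the balanced negative-sampling idea combined with a two-layer GCNConv encoder (the inner-product decoder and layer sizes are my own illustration, not necessarily the exact network from the post):

      import torch
      from torch_geometric.nn import GCNConv
      from torch_geometric.utils import negative_sampling

      class LinkPredictor(torch.nn.Module):
          def __init__(self, in_channels, hidden_channels, out_channels):
              super().__init__()
              self.conv1 = GCNConv(in_channels, hidden_channels)
              self.conv2 = GCNConv(hidden_channels, out_channels)

          def encode(self, x, edge_index):
              return self.conv2(torch.relu(self.conv1(x, edge_index)), edge_index)

          def decode(self, z, edge_label_index):
              # Score a candidate edge by the inner product of its two endpoints.
              src, dst = edge_label_index
              return (z[src] * z[dst]).sum(dim=-1)

      # Balance the classes: draw one non-existent edge per observed edge.
      # neg_edge_index = negative_sampling(edge_index, num_nodes=data.num_nodes,
      #                                    num_neg_samples=edge_index.size(1))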

  6. Node representation learning on very large graphs:
    https://blog.csdn.net/weixin_44133327/article/details/118400929

    In this task, you mainly learn Cluster-GCN, a node representation learning method for very large graphs: its basic steps (graph clustering, building an approximate adjacency matrix, sampling clusters, and updating parameters), its time/space complexity, and how the learned node representations are used.
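    A sketch of those steps with PyG's Cluster-GCN utilities (partition count and batch size are illustrative; ClusterData relies on the METIS partitioning backend shipped with PyG's optional extensions):

      from torch_geometric.datasets import Planetoid
      from torch_geometric.loader import ClusterData, ClusterLoader

      dataset = Planetoid(root='data/Planetoid', name='Cora')
      data = dataset[0]

      cluster_data = ClusterData(data, num_parts=50)   # 1) partition the graph into clusters
      train_loader = ClusterLoader(cluster_data,       # 2) sample several clusters per batch,
                                   batch_size=5,       #    keeping only edges inside the batch
                                   shuffle=True)       #    (the approximate adjacency matrix)

      for batch in train_loader:
          # 3) run a GNN on each sampled subgraph and update the parameters
          pass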

  7. Graph representation learning based on graph neural networks:
    https://blog.csdn.net/weixin_44133327/article/details/118502767

    In this task, you mainly learn the basic ideas of the Graph Isomorphism Network (GIN) (computing node representations, graph pooling, and a final linear transformation), and understand the basic steps of the Weisfeiler-Lehman (WL) test for graph isomorphism (iteratively relabeling nodes, hashing the labels, and comparing WL subtrees), as well as the WL Subtree Kernel method for evaluating graph similarity (run several iterations of the WL test, count label occurrences, represent each graph as a vector, and take the inner product of these vectors).
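    A sketch of a small GIN following those three steps (layer widths are illustrative; num_features and num_classes are assumed to come from whatever graph classification dataset is used):

      import torch
      from torch.nn import Linear, ReLU, Sequential
      from torch_geometric.nn import GINConv, global_add_pool

      class GIN(torch.nn.Module):
          def __init__(self, num_features, hidden, num_classes):
              super().__init__()
              self.conv1 = GINConv(Sequential(Linear(num_features, hidden), ReLU(),
                                              Linear(hidden, hidden)))
              self.conv2 = GINConv(Sequential(Linear(hidden, hidden), ReLU(),
                                              Linear(hidden, hidden)))
              self.lin = Linear(hidden, num_classes)

          def forward(self, x, edge_index, batch):
              x = self.conv1(x, edge_index)     # 1) compute node representations
              x = self.conv2(x, edge_index)
              x = global_add_pool(x, batch)     # 2) graph pooling (sum readout)
              return self.lin(x)                # 3) linear transformation to class scores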

Reprinted from: blog.csdn.net/weixin_44133327/article/details/118662465