《GAPNet: Graph Attention-based Point Neural Network for Exploiting Local Feature of Point Cloud》
research problem | How to better extract local features from point clouds |
The main idea | |
solution |
- Self-attention: an MLP first maps each point's feature into a higher-dimensional space; a channel-wise weighted sum over each point's feature then yields an attention weight for every point.
- Neighboring attention: in a manner similar to DGCNN, each point of the original cloud is taken as the center of a kNN graph; feature extraction on this kNN graph produces a local feature and an attention weight for every edge around the center point.
- Graph attention: the self-attention weight of each center point and the neighboring-attention weights of its surrounding edges are combined to fuse point information with edge information, yielding the final aggregated feature.
- Multi-head attention: the graph-attention process above is repeated in parallel to capture richer geometric information.
|
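The combination of self-attention, neighboring attention, and attention-weighted aggregation described above can be sketched as follows. This is a minimal single-head, NumPy-only illustration, not the paper's implementation: the weight matrices `W_self`/`W_edge`, the scorers `a_self`/`a_edge`, and the `tanh` nonlinearity are assumptions chosen for clarity, and the MLPs are reduced to single linear maps.

```python
import numpy as np

def knn(points, k):
    """Indices of the k nearest neighbors of each point (excluding itself)."""
    d = ((points[:, None, :] - points[None, :, :]) ** 2).sum(-1)
    return np.argsort(d, axis=1)[:, 1:k + 1]          # (N, k)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def graph_attention(points, k=4, d_out=8, seed=0):
    """Single-head GAPNet-style graph attention (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    N, d_in = points.shape
    W_self = rng.standard_normal((d_in, d_out)) * 0.1  # lifts point features
    W_edge = rng.standard_normal((d_in, d_out)) * 0.1  # lifts edge features
    a_self = rng.standard_normal(d_out) * 0.1          # self-attention scorer
    a_edge = rng.standard_normal(d_out) * 0.1          # neighboring-attention scorer

    idx = knn(points, k)                               # kNN graph, (N, k)
    x = points @ W_self                                # lifted point features, (N, d_out)
    edges = points[idx] - points[:, None, :]           # edge features x_j - x_i, (N, k, d_in)
    y = edges @ W_edge                                 # lifted edge features, (N, k, d_out)

    s = np.tanh(x @ a_self)[:, None]                   # self-attention: one weight per center, (N, 1)
    e = np.tanh(y @ a_edge)                            # neighboring attention: one weight per edge, (N, k)
    alpha = softmax(s + e, axis=1)                     # combined weights, normalized per neighborhood
    return (alpha[..., None] * y).sum(axis=1)          # attention-weighted aggregation, (N, d_out)
```

Multi-head attention would simply call `graph_attention` several times with different seeds and concatenate the resulting `(N, d_out)` features along the channel axis.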