MPAN: Multi-part Attention Network for Point Cloud Based 3D Shape Retrieval

Zirui Li, Junyu Xu, Yue Zhao, Wenhui Li, Weizhi Nie
2020 IEEE Access  
3D shape retrieval is an important research field due to its wide applications in computer vision and multimedia. With the development of deep learning, great progress has been made in recent years, and many methods have achieved promising 3D shape retrieval results. Because point cloud data effectively describes the structural information of 3D shapes, many methods based on the point cloud format have been proposed for better shape representation. However, most of them focus on extracting a global descriptor from the whole 3D shape, while local features and detailed structural information are ignored, which degrades the effectiveness of the shape descriptors. In addition, these methods ignore the correlations among different parts of the point cloud, which may introduce redundant information into the final shape descriptors. To address these issues, we propose a Multi-part Attention Network (MPAN) for point cloud based 3D shape retrieval. First, we segment a 3D shape into multiple parts using a pre-trained PointNet++ segmentation model. After extracting local features from these parts, we introduce a novel self-attention mechanism to explore the correlations between them. By considering their structural relevance, redundancy in the shape representation is removed while the effective information is retained. Finally, informative and discriminative shape descriptors, capturing both local features and spatial correlations, are generated for the 3D shape retrieval task. To validate the effectiveness of our method, we conduct several experiments on the public ShapeNetPart benchmark. Experimental results and comparisons with state-of-the-art methods demonstrate the superiority of our proposed method.

INDEX TERMS: 3D shape retrieval, self-attention, point cloud based method
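The abstract's core idea, self-attention over per-part local features to weight each part by its correlation with the others before aggregating a global descriptor, can be sketched as follows. This is a minimal single-head illustration in NumPy, not the paper's implementation: the function name `part_self_attention`, the mean-pooling aggregation, and all dimensions are assumptions, since the abstract does not specify them.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def part_self_attention(parts, Wq, Wk, Wv):
    """Single-head self-attention over per-part features (illustrative sketch).

    parts : (K, d) array, one d-dimensional local descriptor per segmented part.
    Returns a (d,) global shape descriptor in which each part's contribution
    is weighted by its correlation with the other parts.
    """
    Q, Kmat, V = parts @ Wq, parts @ Wk, parts @ Wv
    attn = softmax(Q @ Kmat.T / np.sqrt(Kmat.shape[-1]))  # (K, K) part-to-part weights
    refined = attn @ V                                    # (K, d) correlation-aware part features
    return refined.mean(axis=0)                           # aggregate into one descriptor (assumed pooling)

# Toy usage with random data; real inputs would be PointNet++ part features.
rng = np.random.default_rng(0)
K, d = 4, 8  # hypothetical number of parts and feature dimension
parts = rng.standard_normal((K, d))
Wq, Wk, Wv = (rng.standard_normal((d, d)) for _ in range(3))
desc = part_self_attention(parts, Wq, Wk, Wv)
print(desc.shape)
```

Because the attention weights are computed between all pairs of parts, a part that is highly redundant with others receives no special emphasis, which matches the abstract's stated goal of suppressing redundant information while keeping effective structure.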
doi:10.1109/access.2020.3018696