Laplacian Mesh Transformer: Dual Attention and Topology Aware Network for 3D Mesh Classification and Segmentation

Xiao-Juan Li, Jie Yang, Fang-Lue Zhang

Abstract


"Deep learning-based approaches for shape understanding and processing tasks have attracted considerable attention. Despite the great progress that has been made, the existing approaches fail to efficiently capture sophisticated structure information and critical part features simultaneously, limiting their capability of providing discriminative deep shape features. To address the above issue, we proposed a novel deep learning framework, Laplacian Mesh Transformer, to extract the critical structure and geometry features. We introduce a dual attention mechanism, where the $1^{\rm st}$ level self-attention mechanism is used to capture the structure and critical partial geometric information on the entire mesh, and the $2^{\rm nd}$ level is to learn the importance of structure and geometric information. More particularly, Laplacian spectral decomposition is adopted as our basic structure representation given its ability to describe shape topology. Our approach builds a hierarchical structure to process shape features from fine to coarse using the dual attention mechanism, which is stable under the isometric transformations. It enables an effective feature extraction that can tackle 3D meshes with complex structure and geometry efficiently in various shape analysis tasks, such as shape segmentation and classification. Extensive experiments on the standard benchmarks show that our method outperforms state-of-the-art methods."
