PointABM: Integrating Bidirectional Mamba and Multi-Head Self-Attention for Point Cloud Analysis

Published in International Conference on Intelligent Technology and Embedded Systems, 2024

In recent years, the Transformer has achieved dominance across a multitude of disciplines owing to its superior global modeling capability. More recently, the Mamba model, distinguished by its linear complexity, has emerged as a formidable challenger. Despite these advances, considerable room for improvement remains in point cloud processing, especially in embedded application contexts such as robotic navigation and autonomous vehicles. Motivated by this observation, we propose PointABM, a hybrid model that integrates Mamba and Transformer architectures to enhance global feature extraction and thereby improve 3D point cloud analysis. Specifically, we first design a Transformer block to enrich the representation of global features. We then propose a bidirectional Mamba block, which combines a traditional forward SSM with an innovative backward SSM that processes the token sequence in reverse order. Experimental results demonstrate that integrating Mamba with the Transformer significantly enhances the model's capability to analyze 3D point clouds, offering substantial improvements in both efficiency and accuracy for critical applications.
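To make the described hybrid design concrete, below is a minimal sketch of an attention-plus-bidirectional-Mamba block in PyTorch, assuming the publicly available `mamba_ssm` package. The module names, hyperparameters, and the summation used to fuse the forward and backward SSM outputs are illustrative assumptions, not the authors' exact implementation.

```python
# Sketch only: assumes PyTorch and the `mamba_ssm` package; the fused Mamba
# kernels typically require a CUDA device.
import torch
import torch.nn as nn
from mamba_ssm import Mamba


class BidirectionalMambaBlock(nn.Module):
    """Runs one SSM over the token sequence and a second SSM over the
    reversed sequence, then fuses the two outputs by summation (assumed)."""

    def __init__(self, dim: int):
        super().__init__()
        self.norm = nn.LayerNorm(dim)
        self.forward_ssm = Mamba(d_model=dim)
        self.backward_ssm = Mamba(d_model=dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (B, N, C)
        h = self.norm(x)
        fwd = self.forward_ssm(h)
        # Reverse the token order, scan, then flip back to the original order.
        bwd = self.backward_ssm(h.flip(dims=[1])).flip(dims=[1])
        return x + fwd + bwd  # residual connection


class HybridAttentionMambaBlock(nn.Module):
    """Multi-head self-attention for global feature modeling, followed by
    the bidirectional Mamba block above."""

    def __init__(self, dim: int, num_heads: int = 8):
        super().__init__()
        self.norm = nn.LayerNorm(dim)
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.mamba = BidirectionalMambaBlock(dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (B, N, C)
        h = self.norm(x)
        attn_out, _ = self.attn(h, h, h)
        x = x + attn_out       # residual over attention
        return self.mamba(x)   # bidirectional SSM over the point tokens


# Example usage: a batch of 1024 point tokens with 384-dim embeddings.
tokens = torch.randn(2, 1024, 384, device="cuda")
block = HybridAttentionMambaBlock(dim=384).cuda()
print(block(tokens).shape)  # torch.Size([2, 1024, 384])
```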


Recommended citation:

J.-W. Chen, Y.-J. Xiong*, D.-H. Zhu, J.-C. Zhang, Z. Zhou, "PointABM: Integrating Bidirectional Mamba and Multi-Head Self-Attention for Point Cloud Analysis," 2024 4th International Conference on Intelligent Technology and Embedded Systems (ICITES), IEEE, 2024.

Download Paper