EAAI Journal, 2026 · Journal Article
A Progressive Multilevel Mixing-based knowledge distillation framework for enhancing three-dimensional object detection on compressed point clouds
- Shuo Zhu
- Yongfang Wang
- Wei Chen
- Yihan Wang
Light Detection and Ranging (LiDAR)-based three-dimensional (3D) object detection has achieved notable progress and is widely applied in areas such as autonomous driving and robotics. However, point clouds often undergo compression and transmission distortion, which poses a significant challenge to existing point cloud-based object detection models. We find that detection performance declines sharply on highly compressed point clouds, owing to the scarcity of compressed point clouds in existing training data. To tackle this problem, we first propose a Progressive Multilevel Mixing (PMM) method that enables the model to learn continually from a stream of compressed point cloud datasets, in which the compressed point clouds are mixed with the source point clouds at both the scene and object levels. We then propose a knowledge distillation (KD) framework for point cloud-based object detection that selectively learns at key locations based on the characteristics of 3D tasks, effectively allowing the student model to absorb crucial information from the corresponding teacher model. Furthermore, we establish a Compressed Point Cloud Dataset for 3D Object Detection (CPC-3DOD) with a rich and diverse range of point cloud scenes and five different compression ratios. To the best of our knowledge, this is the first study to contribute a large-scale database specifically designed for compressed point cloud detection. Extensive empirical evaluations substantiate the effectiveness of our method, demonstrating marked performance improvements on the CPC-3DOD dataset. The CPC-3DOD dataset is publicly available at https://github.com/and-star/CPC-3DOD.
• A new compressed point cloud dataset is created for 3D object detection research.
• The novel mixing algorithm generates mixed point clouds to adapt models to compression.
• The distillation framework transfers rich knowledge from the teacher to the student model.
• Experimental results demonstrate the superior performance of the proposed method.
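The abstract describes mixing compressed and source point clouds at two granularities but does not spell out the algorithm; the PMM method itself (including its progressive scheduling over compression ratios) is defined in the paper body. The sketch below is only a minimal NumPy illustration, under assumptions of our own, of what scene-level mixing (randomly interleaving points from the two whole clouds) and object-level mixing (replacing the points inside one axis-aligned object box with their compressed counterparts) could look like; the function names `scene_level_mix` and `object_level_mix` are hypothetical, not the paper's API.

```python
import numpy as np

rng = np.random.default_rng(0)

def scene_level_mix(source_pts, compressed_pts, ratio=0.5):
    """Scene-level sketch: keep a random fraction `ratio` of the source
    points and the complementary fraction of the compressed points.
    Both inputs are (N, 3) arrays of xyz coordinates."""
    keep_src = rng.random(len(source_pts)) < ratio
    keep_cmp = rng.random(len(compressed_pts)) < (1.0 - ratio)
    return np.concatenate([source_pts[keep_src], compressed_pts[keep_cmp]])

def object_level_mix(source_pts, compressed_pts, box):
    """Object-level sketch: inside one axis-aligned box (lo, hi corners),
    swap the source points for the compressed points; keep the rest of
    the source scene unchanged."""
    lo, hi = box
    def in_box(pts):
        return np.all((pts >= lo) & (pts <= hi), axis=1)
    outside = source_pts[~in_box(source_pts)]
    inside_compressed = compressed_pts[in_box(compressed_pts)]
    return np.concatenate([outside, inside_compressed])
```

A real implementation would additionally handle per-point features (intensity), multiple object boxes with yaw rotation, and a schedule that progressively increases the share of compressed points across the stream of compression ratios.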