Acta Geodaetica et Cartographica Sinica ›› 2025, Vol. 54 ›› Issue (9): 1677-1686.doi: 10.11947/j.AGCS.2025.20240497

• Photogrammetry and Remote Sensing •

Robust multi-sensor fusion-based odometry method of LiDAR, millimeter-wave radar and IMU in degraded scenes

Weitong WU1(), Chi CHEN2(), Bisheng YANG2, Xiufeng HE1   

  1. School of Earth Sciences and Engineering, Hohai University, Nanjing 211100, China
  2. State Key Laboratory of Information Engineering in Surveying, Mapping and Remote Sensing, Wuhan University, Wuhan 430079, China
  • Received: 2024-12-09 Revised: 2025-07-09 Online: 2025-10-10 Published: 2025-10-10
  • Contact: Chi CHEN E-mail: weitongwu@hhu.edu.cn; chichen@whu.edu.cn
  • About author:WU Weitong (1995—), male, PhD, assistant researcher, majors in multi-sensor fusion SLAM. E-mail: weitongwu@hhu.edu.cn
  • Supported by:
    The National Natural Science Foundation of China(42401538);The National Key Research and Development Program of China(2022YFB3904101)

Abstract:

Multi-sensor fusion-based simultaneous localization and mapping (SLAM) is crucial for robust localization and accurate mapping by unmanned systems in degraded environments. In complex environments such as underground spaces and indoor corridors, LiDAR-only SLAM is unreliable: geometric feature constraints may be insufficient, and smoke or dust can corrupt the sensor's perception. Moreover, existing asynchronous multi-sensor measurement update strategies under filtering frameworks often sacrifice system accuracy. To address these challenges, this paper proposes a robust fusion odometry method for degraded scenes that integrates LiDAR, millimeter-wave radar, and inertial sensors. The method is built on an iterated error-state Kalman filter that fuses multi-source data: integrated measurements from the inertial measurement unit, LiDAR point-to-plane matching observations, and ego-velocity estimates from the millimeter-wave radar. To mitigate LiDAR localization degradation, the radar velocity measurement strengthens the constraint along the direction of travel, while truncated singular value decomposition suppresses the influence of degenerate observation directions on state updates, improving the accuracy of asynchronous sensor fusion. The method was validated on datasets from several degraded scenes, namely a highway tunnel and a fog-filled corridor. The results show that FAST-LIO2 drifted severely and nearly failed in the degraded sections, whereas the proposed method outperformed FAST-LIO2, a millimeter-wave radar-inertial odometry method, and a direct-fusion variant of the proposed method in both robustness and accuracy.
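The truncated-SVD idea described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the relative threshold, and the interface (a stacked measurement Jacobian H and residual vector r from point-to-plane matching) are assumptions. Directions of the state update whose singular values fall below a fraction of the largest one are treated as degenerate and discarded, so degraded LiDAR observations cannot push the state along unconstrained axes.

```python
import numpy as np

def tsvd_update(H, r, rel_tol=1e-2):
    """Solve H @ dx ~= r by truncated SVD.

    Singular directions with sigma_i <= rel_tol * sigma_max are
    considered degenerate (e.g. the along-tunnel direction) and
    contribute nothing to the state correction dx.
    """
    U, s, Vt = np.linalg.svd(H, full_matrices=False)
    keep = s > rel_tol * s[0]                 # well-constrained directions
    s_safe = np.where(s > 0.0, s, 1.0)        # guard against divide-by-zero
    s_inv = np.where(keep, 1.0 / s_safe, 0.0) # pseudo-inverse, truncated
    return Vt.T @ (s_inv * (U.T @ r))
```

In a full iterated error-state Kalman filter the truncation would act on the linearized update inside each iteration; here it is shown on a single least-squares solve for clarity.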
Notably, on the corridor data, the ratio of loop-closure error to trajectory length for the proposed method was 0.9%, an order of magnitude better than the direct-fusion variant and 80% better than the millimeter-wave radar-inertial odometry approach. In addition, on a highway tunnel dataset of approximately 1 km, the root mean square error of the proposed method's trajectory was 4.57 m, a 4.4% improvement over FAST-LIO2.
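The closure-error metric used above can be made concrete with a short sketch; the function name and array layout (an N×d array of trajectory positions that starts and ends at the same physical point) are illustrative assumptions.

```python
import numpy as np

def closure_error_ratio(traj):
    """Loop-closure drift as a fraction of travelled path length.

    traj: (N, d) array of estimated positions for a loop that
    physically returns to its start; the gap between the first and
    last estimate is the accumulated drift.
    """
    traj = np.asarray(traj, dtype=float)
    closure = np.linalg.norm(traj[-1] - traj[0])
    length = np.sum(np.linalg.norm(np.diff(traj, axis=0), axis=1))
    return closure / length
```

A ratio of 0.9%, as reported for the corridor data, means the estimated end point misses the true start by about 0.9 m per 100 m travelled.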

Key words: LiDAR, millimeter-wave radar, multi-sensor fusion-based SLAM, localization degradation, Kalman filter
