
Acta Geodaetica et Cartographica Sinica ›› 2025, Vol. 54 ›› Issue (3): 448-460. doi: 10.11947/j.AGCS.2025.20240251
Luguang LAI1, Dongqing ZHAO1, Linyang LI1,2, Wenzhe FAN1, Xiongqing LI1, Pengfei LI1
Received: 2024-07-05
Online: 2025-04-11
Published: 2025-04-11
Contact: Dongqing ZHAO. E-mail: llg16690994518@163.com; dongqing.zhao@hotmail.com
About author: LAI Luguang (1999—), male, PhD candidate, research interests: multi-sensor fusion simultaneous localization and mapping (SLAM). E-mail: llg16690994518@163.com
Abstract:
Traditional visual SLAM performs poorly, or fails entirely, in adverse environments with insufficient or strongly varying illumination, whereas infrared cameras are far more resistant to interference in dark or smoke-filled scenes. Infrared imagery, however, is noisy and of low quality, which severely degrades the performance of infrared SLAM. Based on the thermal-radiation imaging characteristics of infrared cameras, and accounting for the weak texture typical of structured scenes, this paper proposes a thermal-inertial odometry method that fuses point and line features. In the front end, point features are tracked with an optical-flow algorithm and screened to reject unstable tracks; the LSD algorithm is improved to extract stable line features, which are then tracked using LBD descriptors. In the back end, a tightly coupled graph-optimization model over point, line, and IMU measurements is built on a sliding window. The method is validated on both an open-source dataset and data collected in an underground parking garage. The results show that the proposed point-line thermal-inertial odometry is significantly more accurate and robust than traditional visual SLAM algorithms, helping unmanned systems achieve reliable localization in dark, weakly textured scenes.
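The front end described above screens optical-flow point tracks and forms point-to-line residuals for the back-end optimization. The sketch below illustrates two such steps in plain numpy: a forward-backward consistency check, which is one common way to reject unstable point tracks, and the endpoint-to-line distance residual typically used for line features in point-line visual-inertial odometry. The function names, the 0.5-pixel threshold, and the forward-backward screening strategy are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def filter_tracked_points(pts_prev, pts_fwd, pts_back, max_fb_error=0.5):
    """Forward-backward consistency check for optical-flow tracks.

    pts_prev : (N, 2) feature positions in the previous frame
    pts_fwd  : (N, 2) the same features tracked into the current frame
    pts_back : (N, 2) pts_fwd tracked back into the previous frame
    Returns a boolean mask: a track is kept only if tracking
    there-and-back returns close to where it started.
    """
    fb_error = np.linalg.norm(pts_prev - pts_back, axis=1)
    return fb_error < max_fb_error

def line_residual(p1, p2, line_obs):
    """Signed distances of a projected line's two endpoints to an
    observed 2D line given as (a, b, c) with a*x + b*y + c = 0 and
    a^2 + b^2 = 1 (normalized coefficients assumed)."""
    a, b, c = line_obs
    return np.array([a * p1[0] + b * p1[1] + c,
                     a * p2[0] + b * p2[1] + c])

# Example: two stable point tracks and one drifting track.
prev = np.array([[10.0, 10.0], [50.0, 40.0], [80.0, 20.0]])
fwd = prev + 1.0                  # nominal 1 px motion per axis
back = np.array([[10.1, 10.0], [50.0, 39.9], [83.0, 25.0]])
mask = filter_tracked_points(prev, fwd, back)  # last track is rejected

# Example: residual of two endpoints against the line x = 0, i.e. (1, 0, 0).
r = line_residual(np.array([2.0, 3.0]), np.array([-1.0, 4.0]),
                  (1.0, 0.0, 0.0))
```

In a real pipeline the surviving tracks and the stacked point/line residuals would feed the sliding-window graph optimization, with the line residual driving the projected line endpoints onto the observed segment.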
Luguang LAI, Dongqing ZHAO, Linyang LI, Wenzhe FAN, Xiongqing LI, Pengfei LI. A thermal-inertial odometry with point and line fusion for the weak textured dark scenes[J]. Acta Geodaetica et Cartographica Sinica, 2025, 54(3): 448-460.