[1] TAKETOMI T, UCHIYAMA H, IKEDA S. Visual SLAM algorithms:a survey from 2010 to 2016[J]. IPSJ Transactions on Computer Vision and Applications, 2017, 9(1):1-11.
[2] DI Kaichang, WAN Wenhui, ZHAO Hongying, et al. Progress and applications of visual SLAM[J]. Journal of Geodesy and Geoinformation Science, 2019, 2(2):38-49. DOI:10.11947/j.JGGS.2019.0205.
[3] LI Xingxing, WANG Xuanbin, LIAO Jianchi, et al. Semi-tightly coupled integration of multi-GNSS PPP and S-VINS for precise positioning in GNSS-challenged environments[J]. Satellite Navigation, 2021(1):1-14.
[4] LIAO Jianchi, LI Xingxing, WANG Xuanbin, et al. Enhancing navigation performance through visual-inertial odometry in GNSS-degraded environment[J]. GPS Solutions, 2021, 25(2):50.
[5] GUI Jianjun, GU Dongbing, WANG Sen, et al. A review of visual inertial odometry from filtering and optimisation perspectives[J]. Advanced Robotics, 2015, 29(20):1289-1301.
[6] SHEN Shaojie, MULGAONKAR Y, MICHAEL N, et al. Multi-sensor fusion for robust autonomous flight in indoor and outdoor environments with a rotorcraft MAV[C]//Proceedings of 2014 IEEE International Conference on Robotics and Automation. Hong Kong, China:IEEE, 2014:4974-4981.
[7] YANG Zhenfei, SHEN Shaojie. Monocular visual-inertial state estimation with online initialization and camera-IMU extrinsic calibration[J]. IEEE Transactions on Automation Science and Engineering, 2017, 14(1):39-51.
[8] TSOTSOS K, CHIUSO A, SOATTO S. Robust inference for visual-inertial sensor fusion[C]//Proceedings of 2015 IEEE International Conference on Robotics and Automation. Seattle, WA, USA:IEEE, 2015:5203-5210.
[9] MOURIKIS A I, ROUMELIOTIS S I. A multi-state constraint Kalman filter for vision-aided inertial navigation[C]//Proceedings of 2007 IEEE International Conference on Robotics and Automation. Rome, Italy:IEEE, 2007:3565-3572.
[10] DELMERICO J, SCARAMUZZA D. A benchmark comparison of monocular visual-inertial odometry algorithms for flying robots[C]//Proceedings of 2018 IEEE International Conference on Robotics and Automation. Brisbane, QLD, Australia:IEEE, 2018:2502-2509.
[11] LEUTENEGGER S, LYNEN S, BOSSE M, et al. Keyframe-based visual-inertial odometry using nonlinear optimization[J]. The International Journal of Robotics Research, 2015, 34(3):314-334.
[12] QIN Tong, LI Peiliang, SHEN Shaojie. VINS-mono:a robust and versatile monocular visual-inertial state estimator[J]. IEEE Transactions on Robotics, 2018, 34(4):1004-1020.
[13] ZHANG Ji, SINGH S. LOAM:LiDAR odometry and mapping in real-time[C]//Proceedings of 2014 Robotics:Science and Systems Conference. Berkeley, CA, USA:IEEE, 2014.
[14] SHAN Tixiao, ENGLOT B. LeGO-LOAM:lightweight and ground-optimized LiDAR odometry and mapping on variable terrain[C]//Proceedings of 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). Madrid, Spain:IEEE, 2018:4758-4765.
[15] YE Haoyang, CHEN Yuying, LIU Ming. Tightly coupled 3D LiDAR inertial odometry and mapping[C]//Proceedings of 2019 International Conference on Robotics and Automation (ICRA). Montreal, QC, Canada:IEEE, 2019:3144-3150.
[16] SHAN Tixiao, ENGLOT B, MEYERS D, et al. LIO-SAM:tightly-coupled LiDAR inertial odometry via smoothing and mapping[C]//Proceedings of 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). Las Vegas, NV, USA:IEEE, 2020:5135-5142.
[17] ZHAO Shibo, ZHANG Hengrui, WANG Peng, et al. Super odometry:IMU-centric LiDAR-visual-inertial estimator for challenging environments[C]//Proceedings of 2021 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). Prague, Czech Republic:IEEE, 2021:8729-8736.
[18] ZHANG Ji, SINGH S. Laser-visual-inertial odometry and mapping with high robustness and low drift[J]. Journal of Field Robotics, 2018, 35(8):1242-1264.
[19] SHAO Weizhao, VIJAYARANGAN S, LI Cong, et al. Stereo visual inertial LiDAR simultaneous localization and mapping[C]//Proceedings of 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). Macau, China:IEEE, 2019:370-377.
[20] ZUO Xingxing, GENEVA P, LEE W, et al. LIC-fusion:LiDAR-inertial-camera odometry[C]//Proceedings of 2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). Macau, China:IEEE, 2019:5848-5854.
[21] SHAN Tixiao, ENGLOT B, RATTI C, et al. LVI-SAM:tightly-coupled LiDAR-visual-inertial odometry via smoothing and mapping[C]//Proceedings of 2021 IEEE International Conference on Robotics and Automation (ICRA). Xi'an, China:IEEE, 2021:5692-5698.
[22] AGARWAL S, MIERLE K, The Ceres Solver Team. Ceres solver[EB/OL]. [2021-08-20]. http://ceres-solver.org.
[23] TRIGGS B, MCLAUCHLAN P F, HARTLEY R I, et al. Bundle adjustment-a modern synthesis[C]//Proceedings of 2000 International Workshop on Vision Algorithms:Theory and Practice. Corfu, Greece:Springer, 2000:298-372.
[24] WANG Xuanbin. Research on the fusion of LiDAR+VIO+GNSS for precise positioning in urban complex environment and verification of vehicle-borne experiment[D]. Wuhan:Wuhan University, 2021.
[25] HUBER P J. Robust estimation of a location parameter[J]. The Annals of Mathematical Statistics, 1964, 35(1):73-101.
[26] SIBLEY G, MATTHIES L, SUKHATME G. Sliding window filter with application to planetary landing[J]. Journal of Field Robotics, 2010, 27(5):587-608.
[27] GALVEZ-LÓPEZ D, TARDOS J D. Bags of binary words for fast place recognition in image sequences[J]. IEEE Transactions on Robotics, 2012, 28(5):1188-1197.
[28] CALONDER M, LEPETIT V, STRECHA C, et al. BRIEF:binary robust independent elementary features[C]//Proceedings of the 11th European Conference on Computer Vision. Heraklion, Greece:Springer, 2010:778-792.
[29] LEPETIT V, MORENO-NOGUER F, FUA P. EPnP:an accurate O(n) solution to the PnP problem[J]. International Journal of Computer Vision, 2009, 81(2):155-166.
[30] LI Xingxing, HAN Xinjuan, LI Xin, et al. GREAT-UPD:an open-source software for uncalibrated phase delay estimation based on multi-GNSS and multi-frequency observations[J]. GPS Solutions, 2021, 25(2):66.
[31] Unicorecomm. CLAP-B7[EB/OL]. [2021-08-20]. https://www.unicorecomm.com/products/detail/5.
[32] FURGALE P, REHDER J, SIEGWART R. Unified temporal and spatial calibration for multi-sensor systems[C]//Proceedings of 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems. Tokyo, Japan:IEEE, 2013:1280-1286.
[33] Autoware. Autoware calibration toolkit[EB/OL]. [2021-08-20]. https://gitlab.com/autowarefoundation/autoware.ai/autoware.
[34] QIN Tong, CAO Shaozu, PAN Jie, et al. A general optimization-based framework for global pose estimation with multiple sensors[EB/OL]. [2019-01-11]. https://arxiv.org/abs/1901.03642v1.
[35] HORN B K P. Closed-form solution of absolute orientation using unit quaternions[J]. Journal of the Optical Society of America A, 1987, 4(4):629-642.
[36] STURM J, ENGELHARD N, ENDRES F, et al. A benchmark for the evaluation of RGB-D SLAM systems[C]//Proceedings of 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS). Vilamoura-Algarve, Portugal:IEEE, 2012:573-580.