[1] ZHANG Q, LI B P, XU G Q, et al. Indoor environment applications for mobile robots using Kinect2.0[C]∥Proceedings of the 11th World Congress on Intelligent Control & Automation. Shenyang, China: IEEE, 2014.
[2] JWO D J, WANG S H. Adaptive fuzzy strong tracking extended Kalman filtering for GPS navigation[J]. IEEE Sensors Journal, 2007, 7(5): 778-789.
[3] KLEEMAN L, CHONG K S. Accurate odometry and error modelling for a mobile robot[C]∥Proceedings of IEEE International Conference on Robotics and Automation. Albuquerque, NM, US: IEEE, 1997: 2783-2788.
[4] MADGWICK S, HARRISON A, VAIDYANATHAN R, et al. Estimation of IMU and MARG orientation using a gradient descent algorithm[C]∥Proceedings of IEEE International Conference on Rehabilitation Robotics. Zurich, Switzerland: IEEE, 2011: 1-7.
[5] MUR-ARTAL R, TARDOS J D. Visual-inertial monocular SLAM with map reuse[J]. IEEE Robotics and Automation Letters, 2017, 2(2): 796-803.
[6] TAKETOMI T, UCHIYAMA H, IKEDA S, et al. Visual SLAM algorithms: a survey from 2010 to 2016[J]. IPSJ Transactions on Computer Vision and Applications, 2017, 9(1): 16.
[7] YOUSIF K, TAGUCHI Y, RAMALINGAM S. MonoRGBD-SLAM: simultaneous localization and mapping using both monocular and RGBD cameras[C]∥Proceedings of IEEE International Conference on Robotics and Automation. Singapore: IEEE, 2017: 4495-4502.
[8] ENGEL J, STUECKLER J, CREMERS D. Large-scale direct SLAM with stereo cameras[C]∥Proceedings of IEEE/RSJ International Conference on Intelligent Robots & Systems. Hamburg, Germany: IEEE, 2015: 1935-1942.
[9] LEUTENEGGER S, LYNEN S, BOSSE M, et al. Keyframe-based visual-inertial odometry using nonlinear optimization[J]. The International Journal of Robotics Research, 2015, 34(3): 314-334.
[10] QAYYUM U, AHSAN Q, MAHMOOD Z, et al. IMU aided RGB-D SLAM[C]∥Proceedings of International Bhurban Conference on Applied Sciences and Technology. Islamabad, Pakistan: IEEE, 2017: 337-341.
[11] LAIDLOW T, BLOESCH M, LI W B, et al. Dense RGB-D-inertial SLAM with map deformations[C]∥Proceedings of IEEE/RSJ International Conference on Intelligent Robots and Systems. Vancouver, BC, Canada: IEEE, 2017: 6741-6748.
[12] SUN T, QIN W H, SUN L B. Visual-aided inertial SLAM method based on loose coupling[J]. Measurement & Control Technology, 2019, 38(4): 22-26. (in Chinese)
[13] LAIDLOW T, BLOESCH M, LI W B, et al. Dense RGB-D-inertial SLAM with map deformations[C]∥Proceedings of 2017 IEEE/RSJ International Conference on Intelligent Robots and Systems. Vancouver, Canada: IEEE, 2017.
[14] QIN T, LI P, SHEN S. VINS-Mono: a robust and versatile monocular visual-inertial state estimator[J]. IEEE Transactions on Robotics, 2018, 34(4): 1004-1020.
[15] LEUTENEGGER S, LYNEN S, BOSSE M, et al. Keyframe-based visual-inertial odometry using nonlinear optimization[J]. The International Journal of Robotics Research, 2015, 34(3): 314-334.
[16] BRUNETTO N, SALTI S, FIORAIO N, et al. Fusion of inertial and visual measurements for RGB-D SLAM on mobile devices[C]∥Proceedings of the IEEE International Conference on Computer Vision Workshops. Santiago, Chile: IEEE, 2015: 148-156.
[17] KIM D H, HAN S B, KIM J H, et al. Visual odometry algorithm using an RGB-D sensor and IMU in a highly dynamic environment[J]. Revista de Informática Teórica e Aplicada, 2015, 34(5): 11-26.
[18] ZHU Z J, XU F, YAN C, et al. Real-time indoor scene reconstruction with RGBD and inertial input[C]∥Proceedings of 2019 IEEE International Conference on Multimedia and Expo. Shanghai, China: IEEE, 2019: 7-12.
[19] MOURIKIS A I, ROUMELIOTIS S I. A multi-state constraint Kalman filter for vision-aided inertial navigation[C]∥Proceedings of IEEE International Conference on Robotics and Automation. Roma, Italy: IEEE, 2007.
[20] YANG Z, GAO F, SHEN S. Real-time monocular dense mapping on aerial robots using visual-inertial fusion[C]∥Proceedings of 2017 IEEE International Conference on Robotics and Automation. Singapore: IEEE, 2017: 4552-4559.