[1] CADENA C, CARLONE L, CARRILLO H, et al. Past, present, and future of simultaneous localization and mapping: toward the robust-perception age[J]. IEEE Transactions on Robotics, 2016, 32(6): 1309-1332.
[2] LIU Q P, WANG Z J, WANG H. Navigation algorithm of light UAV based on stereo visual inertial navigation odometer[J]. Acta Armamentarii, 2020, 41(S2): 241-248. (in Chinese)
[3] VAN GOOR P, MAHONY R. An equivariant filter for visual inertial odometry[C]∥Proceedings of the 2021 IEEE International Conference on Robotics and Automation. Xi'an, China: IEEE, 2021: 1875-1881.
[4] GUI J J, GU D B, WANG S, et al. A review of visual inertial odometry from filtering and optimisation perspectives[J]. Advanced Robotics, 2015, 29(20): 1289-1301.
[5] DAVISON A J, REID I D, MOLTON N D, et al. MonoSLAM: real-time single camera SLAM[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2007, 29(6): 1052-1067.
[6] LI P L, QIN T, HU B T, et al. Monocular visual-inertial state estimation for mobile augmented reality[C]∥Proceedings of the 2017 IEEE International Symposium on Mixed and Augmented Reality. Nantes, France: IEEE, 2017: 11-21.
[7] QIN T, LI P L, SHEN S J. VINS-Mono: a robust and versatile monocular visual-inertial state estimator[J]. IEEE Transactions on Robotics, 2018, 34(4): 1004-1020.
[8] SHI J Y, ZHA F S, SUN L N, et al. A survey of visual-inertial SLAM for mobile robots[J]. Robot, 2020, 42(6): 828-839. (in Chinese)
[9] WU K J, GUO C X, GEORGIOU G, et al. VINS on wheels[C]∥Proceedings of the 2017 IEEE International Conference on Robotics and Automation. Singapore: IEEE, 2017: 5155-5162.
[10] HU P, WANG G, TAN Y P. Recurrent spatial pyramid CNN for optical flow estimation[J]. IEEE Transactions on Multimedia, 2018, 20(10): 2814-2823.
[11] HWANGBO M, KIM J S, KANADE T. Inertial-aided KLT feature tracking for a moving camera[C]∥Proceedings of the 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems. St. Louis, MO, US: IEEE, 2009: 1909-1916.
[12] HWANGBO M, KIM J S, KANADE T. Gyro-aided feature tracking for a moving camera[J]. International Journal of Robotics Research, 2011, 30(14): 1755-1774.
[13] ZHANG P Z, XIONG L, YU Z P, et al. VINS-PL-vehicle: points and lines-based monocular VINS combined with vehicle kinematics for indoor garage[C]∥Proceedings of the 2020 IEEE Intelligent Vehicles Symposium. Las Vegas, NV, US: IEEE, 2020: 825-830.
[14] SUN Y Q, TIAN H L. Overview of visual inertial SLAM[J]. Application Research of Computers, 2019, 36(12): 3530-3533, 3552. (in Chinese)
[15] ZHANG M M, ZUO X X, CHEN Y M, et al. Pose estimation for ground robots: on manifold representation, integration, reparameterization, and optimization[J]. IEEE Transactions on Robotics, 2021, 37(4): 1081-1099.
[16] SHAN T X, ENGLOT B, RATTI C, et al. LVI-SAM: tightly-coupled lidar-visual-inertial odometry via smoothing and mapping[C]∥Proceedings of the 2021 IEEE International Conference on Robotics and Automation. Xi'an, China: IEEE, 2021: 5692-5698.
[17] KARAMAT T B, LINS R G, GIVIGI S N, et al. Novel EKF-based vision/inertial system integration for improved navigation[J]. IEEE Transactions on Instrumentation and Measurement, 2018, 67(1): 116-125.
[18] QIAN J, ZI B, WANG D M, et al. The design and development of an omni-directional mobile robot oriented to an intelligent manufacturing system[J]. Sensors, 2017, 17(9): 2073.
[19] QUAN M X, PIAO S H, TAN M L, et al. Tightly-coupled monocular visual-odometric SLAM using wheels and a MEMS gyroscope[J]. IEEE Access, 2019, 7: 97374-97389.
[20] LUPTON T, SUKKARIEH S. Visual-inertial-aided navigation for high-dynamic motion in built environments without initial conditions[J]. IEEE Transactions on Robotics, 2012, 28(1): 61-76.