Acta Armamentarii ›› 2024, Vol. 45 ›› Issue (11): 3926-3937. doi: 10.12382/bgxb.2023.1221


Event-combined Visual-inertial Odometry Using Point and Line Features

LIU Yumin1, CAI Zhihao1,2,*, SUN Jialing1, ZHAO Jiang1, WANG Yingxun1,2

  1. School of Automation Science and Electrical Engineering, Beihang University, Beijing 100191, China
  2. Institute of Unmanned System, Beihang University, Beijing 100191, China
  • Received: 2023-12-29   Online: 2024-11-26
  • Contact: CAI Zhihao

Abstract:

Visual-inertial odometry is a key technology for robots to achieve autonomous localization. As an asynchronous vision sensor, the event camera is complementary to the traditional frame-based camera. For scenes with low-light conditions, high dynamic range, and high-speed motion, the output of the event camera is fused with traditional images, and a real-time visual-inertial odometry using point and line features is proposed in combination with the inertial measurement unit (IMU). An algorithm for generating an event image from the event stream is proposed, a point-line feature detection method combined with events is designed, and a back-end sliding-window optimization algorithm is designed based on the idea of visual-inertial tight coupling. A dataset test and a UAV flight test are conducted. The dataset results show that, compared with a visual-inertial odometry using point and line features only on traditional images, the proposed odometry reduces the positioning error by more than 22% on average in high-speed-motion scenes, and by more than 59% on average in scenes with low-light conditions and high dynamic range.
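The abstract mentions generating an event image from the event stream. The paper's exact algorithm is not given here; as a minimal sketch of the general idea, a common baseline is to accumulate the signed polarities of events falling in a time window onto a pixel grid and normalize the result, so that conventional feature detectors can run on it. The function name, event tuple layout, and normalization choice below are illustrative assumptions, not the authors' method.

```python
import numpy as np

def events_to_image(events, height, width, t_start, t_end):
    """Accumulate an event stream into a 2D event image (illustrative sketch).

    events: iterable of (t, x, y, polarity) tuples, polarity in {+1, -1}.
    Events with t_start <= t < t_end are summed per pixel; the signed
    accumulation is then normalized to an 8-bit image.
    """
    img = np.zeros((height, width), dtype=np.float64)
    for t, x, y, p in events:
        if t_start <= t < t_end:
            img[y, x] += p  # signed accumulation of polarities
    # Normalize the signed values to [0, 255] for use like a grayscale frame.
    lo, hi = img.min(), img.max()
    if hi > lo:
        img = (img - lo) / (hi - lo) * 255.0
    return img.astype(np.uint8)

# Example: four synthetic events, three inside the 0.1 s window.
events = [(0.01, 2, 3, +1), (0.02, 2, 3, +1),
          (0.05, 4, 1, -1), (0.20, 0, 0, +1)]
event_img = events_to_image(events, height=6, width=6,
                            t_start=0.0, t_end=0.1)
```

Such an image can then be fed to standard point and line feature detectors, which is one way event data and conventional frames can be combined in a single front end.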

Key words: event camera, point and line features, visual-inertial odometry, visual simultaneous localization and mapping, pose estimation
