
Acta Armamentarii ›› 2023, Vol. 44 ›› Issue (11): 3345-3358. doi: 10.12382/bgxb.2023.0547

Special Issue: Swarm Collaboration and Autonomous Technology


Unmanned Swarm Collaborative Visual SLAM Algorithm Based on Semi-direct Method

CAO Haozhe1,*, LIU Quanpan2

  1 School of National Security, People’s Public Security University of China, Beijing 100038, China
    2 Institute of Computer Application Technology, NORINCO Group, Beijing 100089, China
  • Received: 2023-06-01  Online: 2023-08-28
  • Contact: CAO Haozhe

Abstract:

Collaborative positioning and environmental awareness are cornerstones of autonomous navigation for unmanned swarms. However, owing to the limited computing, payload, bandwidth, and other resources of the small individual platforms in large-scale unmanned swarm systems, many related technologies are difficult to deploy and apply in practice. To achieve accurate positioning and environmental awareness for large-scale unmanned swarms under resource constraints, a lightweight collaborative visual SLAM algorithm based on the semi-direct method is proposed, and a semi-direct feature point tracking method that combines the optical flow method and the direct method is designed. A centralized two-way communication strategy gives the large-scale unmanned swarm system a high fault tolerance in the face of communication interference and delay. Comparative experiments were conducted on the EuRoC dataset and in a real physical environment. The results show that the real-time performance of the proposed algorithm is improved by an average of 60%, significantly outperforming other feature-based collaborative visual SLAM algorithms. In a low-quality communication environment with a packet loss rate below 40% and a communication delay below 0.1 s, the proposed algorithm achieves higher positioning accuracy and stronger robustness.
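The semi-direct idea summarized above, namely initializing a track with an optical-flow (Lucas-Kanade) alignment and then accepting or rejecting it by the direct method's photometric residual, can be illustrated with a minimal NumPy sketch. This is not the paper's implementation; all function names, parameters, and the single-global-shift simplification are illustrative assumptions:

```python
import numpy as np

def warp(img, dy, dx):
    """Sample img at (row + dy, col + dx) with bilinear interpolation."""
    h, w = img.shape
    ys = np.clip(np.arange(h) + dy, 0, h - 1)
    xs = np.clip(np.arange(w) + dx, 0, w - 1)
    y0 = np.floor(ys).astype(int)
    x0 = np.floor(xs).astype(int)
    y1 = np.minimum(y0 + 1, h - 1)
    x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]
    wx = (xs - x0)[None, :]
    return ((1 - wy) * (1 - wx) * img[np.ix_(y0, x0)]
            + (1 - wy) * wx * img[np.ix_(y0, x1)]
            + wy * (1 - wx) * img[np.ix_(y1, x0)]
            + wy * wx * img[np.ix_(y1, x1)])

def track_shift(img0, img1, iters=40):
    """Lucas-Kanade optical-flow estimate of the shift from img0 to img1,
    followed by a direct-method-style photometric consistency score."""
    gy, gx = np.gradient(img0)
    H = np.array([[np.sum(gy * gy), np.sum(gy * gx)],
                  [np.sum(gy * gx), np.sum(gx * gx)]])
    d = np.zeros(2)  # estimated displacement (dy, dx)
    for _ in range(iters):
        r = img0 - warp(img1, d[0], d[1])          # photometric residual
        b = np.array([np.sum(gy * r), np.sum(gx * r)])
        d += np.linalg.solve(H, b)                 # Gauss-Newton update
    # direct-method acceptance test: mean photometric error after alignment
    err = np.mean(np.abs(img0 - warp(img1, d[0], d[1])))
    return d, err

# Synthetic check: a Gaussian blob displaced by (1.5, -0.8) pixels.
yy, xx = np.mgrid[0:64, 0:64].astype(float)

def blob(cy, cx):
    return np.exp(-((yy - cy) ** 2 + (xx - cx) ** 2) / (2 * 6.0 ** 2))

img0, img1 = blob(32.0, 32.0), blob(33.5, 31.2)
d, err = track_shift(img0, img1)
```

On this synthetic pair the estimate converges to roughly (1.5, -0.8) with a near-zero residual. A real semi-direct front end would run such an alignment on a small patch around each feature point and discard tracks whose photometric error remains high, which is what keeps the method lightweight compared with descriptor extraction and matching.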

Key words: unmanned swarm, collaborative visual SLAM, semi-direct visual odometry, lightweight
