
Acta Armamentarii ›› 2023, Vol. 44 ›› Issue (11): 3345-3358. doi: 10.12382/bgxb.2023.0547

Special Topic: Swarm Collaboration and Autonomous Technology


Unmanned Swarm Collaborative Visual SLAM Algorithm Based on Semi-direct Method

CAO Haozhe1,*, LIU Quanpan2

  1 School of National Security, People's Public Security University of China, Beijing 100038, China
  2 Institute of Computer Application Technology, NORINCO Group, Beijing 100089, China
  • Received: 2023-06-01  Online: 2023-08-28
  • Supported by: Major Project of the High-Resolution Earth Observation System (01-Y30F05-9001-20/22); Fundamental Research Funds of People's Public Security University of China, Disciplinary Basic Theory System Project (2022JKF02019)



Abstract:

Collaborative positioning and environmental awareness are the cornerstones of autonomous navigation for unmanned swarms. However, the limited computing, payload, and bandwidth resources of the small individual platforms in large-scale unmanned swarm systems make many related technologies difficult to deploy in practice. To achieve accurate positioning and environmental awareness for large-scale unmanned swarms under resource constraints, a lightweight collaborative visual SLAM algorithm based on the semi-direct method is proposed, and a semi-direct feature point tracking method combining the optical flow method and the direct method is designed. A centralized two-way communication strategy gives the large-scale unmanned swarm system high fault tolerance in the face of communication interference and delay while maintaining both accuracy and speed. Comparative experiments were conducted on the EuRoC dataset and in a real physical environment. The results show that the real-time performance of the proposed algorithm is improved by 60% on average, significantly outperforming other feature-based collaborative visual SLAM algorithms, and that in low-quality communication environments with a packet loss rate below 40% and a communication delay below 0.1 s, the proposed algorithm achieves higher positioning accuracy and stronger robustness.
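The abstract names a semi-direct front end that combines optical-flow tracking with direct photometric alignment; the paper's own implementation is not given here. As a rough illustration of the "direct" ingredient only, the following is a minimal NumPy sketch of Gauss-Newton minimization of a patch's photometric residual between two frames, run on a synthetic image pair. All function names and the synthetic scene are assumptions of this sketch, not the authors' code.

```python
import numpy as np

def bilinear(img, xs, ys):
    """Sample img at float coordinates (xs, ys) with bilinear interpolation."""
    x0 = np.floor(xs).astype(int)
    y0 = np.floor(ys).astype(int)
    ax, ay = xs - x0, ys - y0
    return ((1 - ax) * (1 - ay) * img[y0, x0]
            + ax * (1 - ay) * img[y0, x0 + 1]
            + (1 - ax) * ay * img[y0 + 1, x0]
            + ax * ay * img[y0 + 1, x0 + 1])

def align_patch(ref, cur, cx, cy, half=7, iters=30):
    """Estimate the 2-D translation t of a square patch centred at (cx, cy)
    by Gauss-Newton minimisation of the photometric residual
    sum_p [cur(p + t) - ref(p)]^2  -- the core of a direct method."""
    ys, xs = np.mgrid[cy - half:cy + half + 1, cx - half:cx + half + 1]
    template = ref[ys, xs].astype(float)
    gy, gx = np.gradient(cur.astype(float))   # image gradients (d/dy, d/dx)
    t = np.zeros(2)
    for _ in range(iters):
        r = bilinear(cur, xs + t[0], ys + t[1]) - template   # residual
        J = np.stack([bilinear(gx, xs + t[0], ys + t[1]).ravel(),
                      bilinear(gy, xs + t[0], ys + t[1]).ravel()], axis=1)
        step = np.linalg.solve(J.T @ J, J.T @ r.ravel())     # normal equations
        t -= step
        if np.linalg.norm(step) < 1e-8:
            break
    return t

# Synthetic smooth scene and a sub-pixel-shifted second frame.
def scene(x, y):
    return np.sin(0.30 * x) * np.cos(0.22 * y) + 0.5 * np.sin(0.17 * x + 0.11 * y)

Y, X = np.mgrid[0:64, 0:64].astype(float)
true_shift = np.array([1.3, -0.7])            # (tx, ty) in pixels
ref = scene(X, Y)
cur = scene(X - true_shift[0], Y - true_shift[1])

t = align_patch(ref, cur, 32, 32)             # recovers approximately (1.3, -0.7)
```

A semi-direct front end would seed such patches from sparse tracked features (e.g. pyramidal optical flow) and apply this photometric refinement only around them, rather than over the whole image, which is what keeps the method lightweight.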

Key words: unmanned swarm, collaborative visual SLAM, semi-direct visual odometry, lightweight

CLC Number: