Fully neuromorphic vision and control for autonomous drone flight
Science Robotics (IF 26.1), Pub Date: 2024-05-15, DOI: 10.1126/scirobotics.adi0591
F. Paredes-Vallés, J. J. Hagenaars, J. Dupeyroux, S. Stroobants, Y. Xu, G. C. H. E. de Croon

Biological sensing and processing is asynchronous and sparse, leading to low-latency and energy-efficient perception and action. In robotics, neuromorphic hardware for event-based vision and spiking neural networks promises to exhibit similar characteristics. However, robotic implementations have been limited to basic tasks with low-dimensional sensory inputs and motor actions because of the restricted network size in current embedded neuromorphic processors and the difficulties of training spiking neural networks. Here, we present a fully neuromorphic vision-to-control pipeline for controlling a flying drone. Specifically, we trained a spiking neural network that accepts raw event-based camera data and outputs low-level control actions for performing autonomous vision-based flight. The vision part of the network, consisting of five layers and 28,800 neurons, maps incoming raw events to ego-motion estimates and was trained with self-supervised learning on real event data. The control part consists of a single decoding layer and was learned with an evolutionary algorithm in a drone simulator. Robotic experiments show a successful sim-to-real transfer of the fully learned neuromorphic pipeline. The drone could accurately control its ego-motion, allowing for hovering, landing, and maneuvering sideways—even while yawing at the same time. The neuromorphic pipeline runs on board on Intel’s Loihi neuromorphic processor with an execution frequency of 200 hertz, consuming 0.94 watt of idle power and a mere additional 7 to 12 milliwatts when running the network. These results illustrate the potential of neuromorphic sensing and processing for enabling insect-sized intelligent robots.
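To make the described architecture concrete, below is a minimal Python sketch of such a vision-to-control pipeline: discrete-time leaky integrate-and-fire (LIF) layers map binned event-camera input to a spiking code, and a single linear decoding layer turns that code into low-level commands. All layer sizes, decay constants, and names here are illustrative assumptions; the paper's actual five-layer, 28,800-neuron network, its self-supervised training, and the Loihi deployment are not reproduced.

```python
# Minimal sketch of a spiking vision-to-control pipeline with NumPy and
# discrete-time leaky integrate-and-fire (LIF) neurons. Layer sizes and
# constants are illustrative assumptions, not the paper's architecture.
import numpy as np

class LIFLayer:
    """LIF layer: v[t+1] = decay * v[t] + W @ spikes_in, emitting a spike
    and hard-resetting wherever v crosses the firing threshold."""
    def __init__(self, n_in, n_out, decay=0.9, threshold=1.0, seed=0):
        rng = np.random.default_rng(seed)
        self.w = rng.normal(0.0, 1.0 / np.sqrt(n_in), (n_out, n_in))
        self.decay, self.threshold = decay, threshold
        self.v = np.zeros(n_out)  # membrane potentials

    def step(self, spikes_in):
        self.v = self.decay * self.v + self.w @ spikes_in
        spikes_out = (self.v >= self.threshold).astype(float)
        self.v = np.where(spikes_out > 0, 0.0, self.v)  # reset on spike
        return spikes_out

# Vision part: stacked LIF layers mapping a binned event-camera frame to a
# spiking ego-motion code (placeholder sizes, not the paper's 28,800 neurons).
vision = [LIFLayer(1024, 256), LIFLayer(256, 128), LIFLayer(128, 64)]

# Control part: one linear decoding layer turning the spiking code into
# low-level commands; in the paper this layer was found with an
# evolutionary algorithm in a drone simulator.
w_decode = np.random.default_rng(1).normal(0.0, 0.1, (4, 64))

def control_step(event_bin):
    """One 5 ms tick (~200 Hz): binned events in, motor commands out."""
    s = event_bin
    for layer in vision:
        s = layer.step(s)
    return w_decode @ s  # 4 low-level control outputs

events = np.random.default_rng(2).random(1024) < 0.05  # sparse event bin
print(control_step(events).shape)  # (4,)
```

Separating a learned spiking encoder from a single evolved decoding layer keeps the hard-to-train part (the spiking vision network) trainable offline on real event data, while the small control readout can be optimized cheaply in simulation.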

Updated: 2024-05-15