Introduction
Animals, including insects and birds, navigate diverse habitats, from open landscapes to densely cluttered environments like forests. A critical aspect of their locomotion, particularly in flight, is efficient and safe obstacle avoidance. This necessitates maintaining safe distances from objects, identifying traversable gaps, and executing rapid maneuvers to avoid collisions. Understanding the underlying computational mechanisms of this virtuosic flight control is a major challenge in biology and robotics. While various hypotheses exist regarding how animals process optic flow (OF) information for obstacle avoidance—including strategies involving balancing OF across different eye regions, integrating motion across the visual field, focusing on OF contrast, optimizing spatial sensitivity, or learning associations between active vision and object size—a comprehensive, unifying mechanism remains elusive. The remarkable navigational feats of insects, achieved with brains of minuscule size, suggest that the underlying neural processes are computationally efficient and parsimonious. This efficiency makes these principles attractive for implementation in resource-constrained autonomous agents, prompting this study to investigate whether asynchronous processing of OF, inspired by insect visual systems, enables robust collision-free navigation in diverse environments.
Literature Review
Existing research on insect and bird flight control and obstacle avoidance primarily uses experimental settings like flight tunnels with controlled obstacle arrangements or gaps. These studies have highlighted the crucial role of optic flow (OF), the apparent motion of the surroundings on the animal's retina during movement. OF, being directly related to time-to-collision, is processed by the animal's brain to ensure collision-free flight. Several mechanisms have been proposed for this processing, including balancing OF across eye hemispheres, global visual field integration, foreground-background contrast analysis, optimized spatial sensitivity, and learned associations between active vision and object size. However, a single mechanism explaining how animals handle the complexities of navigation in various environments remains unclear. Studies on bees and flies reveal behaviors like gap selection based on size and brightness, speed modulation based on environmental clutter, and corridor centering, but the underlying neural computations are still debated. The use of artificial agents, especially neuromorphic ones that mirror the asynchronous nature of insect visual processing, allows testing these proposed mechanisms in a controlled and systematic way.
Methodology
This study employed a closed-loop neuromorphic approach using an insect-inspired robot to investigate the efficacy of asynchronous OF processing for collision-free navigation. The neuromorphic hardware used event-based cameras and spiking neural networks (SNNs) that process information asynchronously. An event-based camera sends data only when a luminance change occurs, ensuring high temporal resolution. The SNN, built from spiking neurons, mimics the asynchronous communication of biological neurons. The design of the neuromorphic agent was constrained by the known properties of the fruit fly's visual motion pathway, specifically the T4 and T5 neurons, which are thought to sit at the output of elementary local motion processing. The model consists of spiking elementary motion detectors (sEMDs) to extract OF, followed by an inverse soft winner-take-all (WTA) network that searches for regions of low apparent motion, indicating obstacle-free directions. The agent then turns toward these directions. The model was evaluated through closed-loop simulations and real-world experiments. The experiments involved navigation in various environments mimicking those used in biological studies: navigating a box, centering in corridors, crossing gaps, and meandering in cluttered environments. The sEMD's response was characterized by comparing its normalized velocity tuning curves with those of Drosophila's T4 and T5 neurons. Real-world experiments used a robotic platform with an event-based camera, a SpiNN-3 board for SNN simulation, and an AERnode FPGA board for communication and motor control.
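To illustrate the sEMD idea described above, here is a minimal, non-spiking sketch of a facilitate-and-sample motion detector. It is not the paper's implementation: the alpha-shaped facilitation kernel, the time constant `tau`, and the spike-count cap `max_spikes` are illustrative assumptions chosen so that the response peaks at an intermediate travel time, yielding the bell-shaped velocity tuning attributed to T4/T5-like processing.

```python
import math

def semd_burst(dt, tau=0.005, max_spikes=12):
    """Hypothetical facilitate-and-sample sEMD sketch.

    An event at one pixel starts an alpha-shaped facilitation trace;
    the neighbouring pixel's event samples it dt seconds later, and the
    sampled value sets the length of the output spike burst. The burst
    is largest when dt matches tau, so very slow AND very fast motion
    both yield weak responses (bell-shaped velocity tuning).
    All parameters here are illustrative assumptions.
    """
    if dt <= 0:
        return 0  # null direction or simultaneous events: no response
    # Alpha kernel (dt/tau) * e^(1 - dt/tau): rises, peaks at dt == tau, decays.
    facilitation = (dt / tau) * math.exp(1.0 - dt / tau)
    return round(max_spikes * facilitation)
```

Sampling `semd_burst` over a range of inter-event delays reproduces the qualitative bell shape the study reports when comparing against Drosophila tuning curves.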
Key Findings
The spiking elementary motion detector (sEMD) in the neuromorphic system showed a bell-shaped velocity tuning curve similar to that of Drosophila's T4/T5 neurons, confirming its biological plausibility. The system demonstrated robust obstacle avoidance across a wide range of illumination and contrast conditions. In closed-loop simulations and real-world experiments, the robot successfully performed several tasks analogous to those observed in insects: corridor centering, gap crossing, and navigation in densely cluttered environments. The robot's performance in a corridor was similar to that of blowflies, with velocity increasing with corridor width and a tendency to center, particularly in narrow corridors. A control experiment with visual input removed showed that the centering behavior is driven by vision, not by intrinsic robot movement. In cluttered environments (obstacle density 0-38%), the robot achieved a 97% success rate, primarily due to velocity modulation based on global OF. Higher obstacle densities resulted in more collisions, especially at constant velocity. The robot successfully selected larger gaps when presented with a choice, similar to bee behavior, using a probabilistic integration mechanism. Success depended strongly on the velocity control mechanism: adaptive velocity substantially improved the success rate over a fixed-velocity scenario.
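The velocity modulation credited with the high success rate can be sketched as a simple feedback rule: drive speed down as the summed apparent motion across the visual field grows (clutter nearby) and back up in open space. This is a rate-based stand-in, not the paper's spiking controller; the gain, speed limits, and averaging scheme are assumptions for illustration.

```python
def adapt_velocity(of_magnitudes, v_max=1.0, v_min=0.1, gain=0.5):
    """Illustrative global-OF speed controller (parameters are assumed).

    of_magnitudes: per-direction optic-flow magnitudes across the field.
    Returns a forward velocity that shrinks as mean apparent motion
    grows, clamped to [v_min, v_max].
    """
    # Mean OF over the visual field; guard against an empty reading.
    global_of = sum(of_magnitudes) / max(len(of_magnitudes), 1)
    v = v_max / (1.0 + gain * global_of)  # more flow -> slower flight
    return max(v_min, min(v_max, v))
```

With zero flow the agent cruises at `v_max`; dense surroundings push it toward `v_min`, mirroring the clutter-dependent speed modulation reported for bees and reproduced by the robot.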
Discussion
The results demonstrate the effectiveness of a single, bio-inspired mechanism—asynchronous processing of optic flow—for robust obstacle avoidance in diverse environments. This finding aligns with the exceptional efficiency of insect navigation. The similarities between the robot's behavior (meandering, gap selection, speed modulation, and corridor centering) and that of flying insects like bees and flies suggest the model captures essential aspects of insect navigation strategies. Differences, such as the robot’s inability to traverse gaps as small as bees, might be attributed to the robot's passive vision strategy (saccadic turns) compared to bees' active vision (scanning maneuvers). The velocity modulation based on global OF, which significantly improved obstacle avoidance, is a crucial element and suggests a simple feedback control loop could regulate this in insects. The agent’s corridor-centering behavior may be explained by seeking regions of lowest apparent motion, a hypothesis that requires further investigation and comparison with the OF balancing hypothesis. The neuromorphic implementation, using a relatively small network (around 4k neurons and 300k synapses), demonstrates the computational parsimony of these strategies. This approach also demonstrates the potential of neuromorphic hardware for robotics applications, offering power efficiency and real-time capabilities compared to conventional methods.
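The "seek regions of lowest apparent motion" steering rule discussed above can be sketched as a softmin over per-direction OF magnitudes, a rate-based stand-in for the spiking inverse soft WTA in the model. The azimuth encoding, the sharpness parameter `beta`, and the weighted-mean readout are illustrative assumptions, not the paper's exact mechanism.

```python
import math

def steering_direction(azimuths, of_magnitudes, beta=5.0):
    """Rate-based sketch of an inverse soft WTA readout (assumed form).

    azimuths: viewing directions in degrees (negative = left).
    of_magnitudes: apparent-motion magnitude per direction.
    Each direction is weighted by exp(-beta * magnitude), so
    low-flow (distant/open) directions dominate the turn command.
    Returns the weighted mean azimuth as the steering target.
    """
    weights = [math.exp(-beta * m) for m in of_magnitudes]
    total = sum(weights)
    return sum(a * w for a, w in zip(azimuths, weights)) / total
```

In a corridor, flow magnitudes on both walls are symmetric at the midline, so the command settles at zero; an imbalance steers the agent away from the high-flow (nearer) side, which is consistent with the centering behavior observed in the experiments.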
Conclusion
This study presents a bio-inspired neuromorphic system for obstacle avoidance that successfully mimics key aspects of insect navigation in diverse environments. The system’s robust performance across various scenarios, achieved using a computationally efficient network and a single set of parameters, highlights the potential of bio-inspired designs in robotics. Future research should explore the integration of goal-directed behaviors into the model to further refine its resemblance to natural systems and improve its real-world applicability. Investigating the effect of different sensor modalities, like incorporating radar, could also improve robustness in challenging natural environments.
Limitations
The study's focus on simplified environments might not fully capture the complexities of natural settings. The robot's limited visual field could affect its performance in some scenarios, and the simplified model of insect vision may not encompass all aspects of biological navigation. The current model lacks goal-directed behavior, which is a crucial aspect of insect navigation in real-world scenarios. Further research is needed to incorporate goal-directed behaviors and test the system in more complex, natural environments.