"...this is the closest they've ever been to creating a robot so similar to the real thing..."

Compared to man-made machines, animals show remarkable behavioral performance in many respects. Although there has been tremendous progress in building (autonomous) robotic systems over the last decades, these systems still often fail miserably:

In contrast, if you have ever tried to catch a flying fly with your hand, or watched a dragonfly intercept prey in mid-flight at 1000 frames per second, you will realize that even insects, despite their comparatively small nervous systems, outperform current robotic systems in terms of computational and energy efficiency, speed, and robustness across different environmental contexts. Hence, from an engineer's perspective, the mechanisms underlying behavioral control in insects are of great potential interest; or, to phrase it in the wise words of Grandmaster Miyagi:


Collision avoidance based on insect-inspired processing of visual motion information

A prerequisite for behavioral control in animals, as well as in autonomous robotic systems, is the acquisition and processing of environmental information. Flying insects such as honey bees or flies mainly rely on the information their visual system provides when performing behavioral tasks such as avoiding collisions, approaching targets, or navigating in cluttered environments. An important source of information in this context is visual motion, as it provides cues about (a) self-motion, (b) moving objects, and (c) the three-dimensional layout of the environment.

In the visual system of flying insects, certain types of motion-sensitive interneurons have been identified that respond to visual motion in a direction-selective way. Although neuroscientists are still unravelling the specific physiological and computational mechanisms of these cells, their response characteristics can be modeled by so-called Hassenstein-Reichardt correlators, also known as Elementary Motion Detectors (EMDs). The EMD concept was initially proposed by Bernhard Hassenstein and Werner Reichardt in the late 1950s to explain optomotor turning responses in weevils. Later studies, comparing EMD responses with electrophysiological recordings of motion-sensitive interneurons, showed that EMDs can explain, to a large extent, how the responses in the fly's visual motion pathway depend on the parameters of a moving visual stimulus.
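To make the correlator idea concrete, here is a minimal sketch of an opponent Hassenstein-Reichardt detector in Python. It is an illustrative toy, not the model used in the studies mentioned above: the function name, the use of a first-order low-pass filter as the delay stage, and the parameter values are my own assumptions.

```python
import numpy as np

def emd_response(left, right, tau=2.0, dt=1.0):
    """Toy Hassenstein-Reichardt correlator (EMD), opponent version.

    left, right: 1-D luminance signals from two neighbouring
    photoreceptors, one sample per frame.
    tau: time constant of the first-order low-pass filter used here
    as the delay stage (an assumption of this sketch), same units as dt.
    Returns the per-frame opponent output; positive values indicate
    motion in the left-to-right direction.
    """
    alpha = dt / (tau + dt)               # low-pass filter coefficient
    lp_left = np.zeros_like(left, dtype=float)
    lp_right = np.zeros_like(right, dtype=float)
    for t in range(1, len(left)):         # first-order low-pass = "delay" arm
        lp_left[t] = lp_left[t - 1] + alpha * (left[t] - lp_left[t - 1])
        lp_right[t] = lp_right[t - 1] + alpha * (right[t] - lp_right[t - 1])
    # Correlate the delayed signal of each arm with the undelayed signal
    # of its neighbour, then subtract the mirror-symmetric half-detector.
    return lp_left * right - lp_right * left
```

Feeding the two inputs with a drifting sine grating (the right receptor lagging the left in phase) yields a positive mean response; swapping the inputs, i.e. reversing the motion direction, flips the sign.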

One interesting property of motion-sensitive interneurons and EMDs is that their responses depend on the nearness of objects in the environment during (translational) self-motion. While navigating in cluttered environments, it is crucial for an autonomous agent to have some representation of its relative distance to obstacles in order to avoid collisions. Robotic systems nowadays typically solve this task with active sensors (e.g. laser range finders) or computationally expensive algorithms (e.g. Lucas-Kanade optic flow estimation). However, in my former lab we were able to show that it is possible to navigate autonomously in cluttered environments, avoiding collisions purely on the basis of nearness estimates obtained from visual motion via EMDs. From a technical perspective, this biomimetic approach has several advantages over conventional systems:

  1. due to its highly parallel architecture and the low resolution of the camera frames, it is very fast
  2. it has low computational and energy requirements, making it suitable for integration into robotic systems where weight and power budgets are critical (e.g. flying drones)
  3. it can be implemented using only low-cost hardware (i.e. a Raspberry Pi 3 and a webcam)
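The nearness-based steering idea described above can be illustrated with a short sketch. This is not the algorithm from my former lab, only a plausible toy rule under one assumption taken from the text: during pure translation, optic-flow magnitude (and hence the magnitude of EMD responses) grows with nearness, so the half of the visual field with the larger summed response magnitude is, on average, the nearer one. The function name and the normalized turn command are hypothetical.

```python
import numpy as np

def steering_from_emd(emd_map):
    """Toy collision-avoidance steering rule from an EMD response map.

    emd_map: 2-D array of opponent EMD outputs for horizontal motion,
    sampled during translational self-motion. |EMD response| is used
    as a nearness proxy (assumption stated in the lead-in).
    Returns a turn command in [-1, 1]; positive = turn right,
    i.e. away from nearer obstacles on the left.
    """
    h, w = emd_map.shape
    left_nearness = np.abs(emd_map[:, : w // 2]).sum()
    right_nearness = np.abs(emd_map[:, w // 2:]).sum()
    # Turn away from the side with the larger summed nearness estimate;
    # the small constant avoids division by zero for an empty scene.
    total = left_nearness + right_nearness + 1e-12
    return (left_nearness - right_nearness) / total
```

With strong responses (near obstacles) on the left half of the map, the rule commands a right turn, and vice versa; the magnitude of the command grows with the left/right imbalance.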

BIVAC - Bio-Inspired Visual Collision Avoidance System

My current work at the Biomechatronics Group at CITEC (Bielefeld, Germany)...
