60. Biologically Inspired Robots
Jean-Arcady Meyer, Agnès Guillot

After having stressed the difference between bio-inspired and biomimetic robots, this chapter successively describes bio-inspired morphologies, sensors, and actuators. Then, control architectures that, beyond mere reflexes, implement cognitive abilities like memory or planning, or adaptive processes like learning, evolution, and development, are described. Finally, the chapter also reports related work on energetic autonomy, collective robotics, and biohybrid robots.

60.1 General Background
60.2 Bio-inspired Morphologies
60.3 Bio-inspired Sensors
    60.3.1 Vision
    60.3.2 Audition
    60.3.3 Touch
    60.3.4 Smell
    60.3.5 Taste
    60.3.6 Internal Sensors
60.4 Bio-inspired Actuators
    60.4.1 Locomotion
    60.4.2 Grasping
    60.4.3 Drilling
60.5 Bio-inspired Control Architectures
    60.5.1 Behavior-Based Robotics
    60.5.2 Learning Robots
    60.5.3 Evolving Robots
    60.5.4 Developing Robots
60.6 Energetic Autonomy
60.7 Collective Robotics
60.8 Biohybrid Robots
60.9 Discussion
60.10 Conclusion
References

60.1 General Background

Human inventors and engineers have always found in Nature's products an inexhaustible source of inspiration. About 2400 years ago, for instance, Archytas of Tarentum allegedly built a kind of flying machine, a wooden pigeon balanced by a weight suspended from a pulley, and set in motion by compressed air escaping from a valve. Likewise, circa 105 AD, the Chinese eunuch Ts'ai Lun is credited with inventing paper, after watching a wasp create its nest. More recently, Antoni Gaudi's design of the still-unfinished Sagrada Familia cathedral in Barcelona displays countless borrowings from mineral and vegetal exuberance.

Although a similar tendency underlay all attempts at building automata or protorobots up to the middle of the last century [60.1], in recent decades roboticists have borrowed much more from mathematics, mechanics, electronics, and computer science than from biology. On the one hand, this approach undoubtedly solidified the technical foundations of the discipline and led to the production of highly successful products, especially in the field of industrial robotics. On the other hand, it served to better appreciate the gap that still separates a robot from an animal, at least when qualities of autonomy and adaptation are sought. As such qualities are required in a continually growing field of applications – from planetary exploration to domestic uses – a spectacular reversal of interest towards living creatures can be noticed in current-day robotics, to the point that natural inspiration has been called the new wave of robotics [60.2].
Undoubtedly, this new wave would not have been possible without the synergies generated by recent
advances in biology – where so-called integrative approaches now produce a huge amount of data and models directly exploitable by roboticists – and in technology – with the massive availability of low-cost and power-efficient computing systems, and with the development of new materials exhibiting new properties. This will be demonstrated in this chapter, which first reviews recent research efforts in bio-inspired morphologies, sensors, and actuators. Then, control architectures that, beyond mere reflexes, implement cognitive abilities – like memory or planning – or adaptive processes – like learning, evolution, and development – will be described. Finally, the chapter will also report related work on energetic autonomy, collective robotics, and biohybrid robots.

It should be noted that this chapter will describe both bio-inspired and biomimetic realizations. In fact, these two terms characterize, respectively, the extremities of a continuum in which, on one side, engineers seek to reproduce some natural result, but not necessarily the underlying means, while, on the other side, they seek to reproduce both the result and the means. Thus, bio-inspired robotics tends to adapt, within traditional engineering approaches, principles abstracted from the observation of a living creature, whereas biomimetic robotics tends to replace classical engineering solutions with mechanisms or processes reproduced in as much detail as possible from the observation of that creature. In practice, any specific application usually lies somewhere between these two extremities. Be that as it may, because biomimetic realizations are always bio-inspired, whereas the reverse is not necessarily true, qualifying expressions like bio-inspired or biologically inspired will be used preferentially in this chapter.

60.2 Bio-inspired Morphologies

Although not comparable to that of real creatures, the diversity of bio-inspired morphologies that may be found in the realm of robotics is nevertheless quite impressive. Currently, a huge number of robots populate terrestrial, aquatic, and aerial environments and look like animals as diverse as dogs, kangaroos, sharks, dragonflies, or jellyfishes, not to mention humans (Fig. 60.1).

In nature, the morphology of an animal fits its ecology and behavior. In robotics applications, bio-inspired morphologies are seldom imposed by functional considerations. Rather, as close a resemblance as possible to a given animal is usually sought per se, as in animatronics applications for the entertainment industry. However, several other applications are motivated by the functional objective of facilitating human–robot interactions, thus allowing, for instance, children or elderly people to adopt artificial pets and enjoy their company. Such interactions are facilitated in the case of so-called anthropopathic or human-friendly robots, such as Kismet at MIT [60.3] or WE-4RII at Waseda University [60.4], which are able to perceive and respond to human emotions, and themselves express apparent emotions influencing their actions and behavior (Fig. 60.2a,b).

Likewise, the Uando robot of Osaka University [60.5] is controlled by air actuators providing 43 degrees of freedom. The android can make facial expressions, eye, head, and body movements, and gestures with its arms and hands.
Touch sensors with sensitivity to variable pressures are mounted under its clothing and silicone skin, while floor sensors and omnidirectional vision sensors serve to recognize where people are in order to make eye contact while addressing them during conversation. Moreover, it can respond to the content and prosody of a human partner's speech by varying what it says and the pitch of its voice (Fig. 60.2c). See Chap. 58 for more references on human-friendly robots.

Another active research area in which functional considerations play a major role is that of shape-shifting robots that can dynamically reconfigure their morphology according to internal or external circumstances. Biological inspiration stems from organisms that can regrow lost appendages, like the tail in lizards, or from transitions in developmental stages, like morphogenetic changes in batrachians. For instance, the base topology of the Conro self-reconfigurable robot developed in the Polymorphic Robotics Laboratory at USC-ISI is simply connected, as in a snake, but the system can reconfigure itself in order to grow a set of legs or other specialized appendages (Fig. 60.3) thanks to a dedicated hormone-like adaptive communication protocol [60.6, 7].

Chapter 39 is devoted to distributed and cellular robots and provides other examples of such reconfigurable robots.
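To give a concrete flavor of the hormone-like communication idea mentioned above, here is a minimal sketch, in Python, of modules that relay a topology-tagged message along a chain and choose their role and gait phase from purely local information. It only illustrates the general principle under assumed names and rules; the actual CONRO hormone protocol described in [60.6, 7] is substantially richer.

# Minimal sketch of a hormone-style message-passing scheme for a modular
# robot, loosely inspired by the CONRO idea described above. Module names,
# message fields, and the reconfiguration rule are illustrative assumptions,
# not the published protocol.

from dataclasses import dataclass
from typing import Optional


@dataclass
class Hormone:
    kind: str        # e.g. "adopt_leg_gait" or "adopt_snake_gait"
    hops: int = 0    # how far the message has travelled from its source


@dataclass
class Module:
    """One body segment; it only knows its immediate neighbour."""
    name: str
    next: Optional["Module"] = None
    role: str = "spine"
    phase: float = 0.0

    def receive(self, h: Hormone) -> None:
        # Local rule: the module decides its role from the hormone type and
        # its hop count alone -- no global identifiers or central controller.
        if h.kind == "adopt_leg_gait" and h.hops % 3 == 0:
            self.role = "leg"
        else:
            self.role = "spine"
        # Stagger oscillator phases along the body for a travelling-wave gait.
        self.phase = 0.25 * h.hops
        # Relay the hormone to the next module, incrementing the hop counter.
        if self.next is not None:
            self.next.receive(Hormone(h.kind, h.hops + 1))


# Build a six-module chain and broadcast a reconfiguration hormone from the head.
modules = [Module(f"m{i}") for i in range(6)]
for a, b in zip(modules, modules[1:]):
    a.next = b

modules[0].receive(Hormone("adopt_leg_gait"))
for m in modules:
    print(m.name, m.role, m.phase)

Because every decision is taken locally from the relayed message, the same rule keeps working when modules are added, removed, or reconnected, which is the point of hormone-style control in self-reconfigurable robots.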
Fig. 60.1 A collection of zoomorphic robots

Fig. 60.2a–c Three humanoid robots. (a) Kismet © Rodney Brooks, Computer Science and Artificial Intelligence Lab, MIT. (b) WE-4RII © Atsuo Takanishi Lab, Waseda University. (c) Uando © Hiroshi Ishiguro, ATR Intelligent Robotics and Communication Lab, Osaka University
Fig. 60.3 A sequence reconfiguring a CONRO robot from a snake to a T-shaped creature with two legs. © Wei-Min Shen, Polymorphic Robotics Laboratory, Univ. Southern California

60.3 Bio-inspired Sensors

60.3.1 Vision

Bio-inspired visual sensors in robotics range from very simple photosensitive devices, which mostly serve to implement phototaxis, to complex binocular devices used for more cognitive tasks like object recognition. Phototaxis is seldom the focus of dedicated research. It is usually implemented merely to force a robot to move and exhibit other capacities such as obstacle avoidance or inter-robot communication.

Several visual systems calling upon optic-flow monitoring are particularly useful in the context of navigation tasks and are implemented in a variety of robots. This is the case with the work done in Marseilles' Biorobotics Laboratory, which seeks to understand how the organization of the housefly's compound eye, and the neural processing of the visual information obtained during flight, endow this insect with various reflexes mandatory for its survival. The biological knowledge thus acquired was exploited to implement optoelectronic devices allowing a terrestrial robot to wander in its environment while avoiding obstacles [60.8], or tethered aerial robots to track a contrasting target [60.9] or to automatically perform terrain following, take-off, or landing [60.10] (Fig. 60.4).

Fig. 60.4a–c Optoelectronic devices inspired by the housefly's compound eye. (a) Device for obstacle avoidance. (b) Device for target tracking. (c) Device for terrain following, take-off, and landing. © CNRS Photothèque, Nicolas Franceschini, UMR6152 - Mouvement et Perception - Marseille

The desert ant Cataglyphis, while probably merging optic-flow and odometry monitoring to evaluate its travel distances, is able to use its compound eyes to perceive the polarization pattern of the sky and infer its orientation. This affords it accurate navigation capacities that make it possible to explore its desert habitat for hundreds of meters while foraging, and to return to its nest along an almost straight line, despite the absence of conspicuous landmarks and despite the impossibility of laying pheromones on the ground that would not almost immediately evaporate. Inspired by the insect's navigation system, mechanisms for path integration and visual piloting have been successfully employed for mobile robot navigation in the Sahara desert [60.11].
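The homing ability just described rests on path integration: continuously summing small displacement vectors, each given by a compass heading and a travelled distance, so that the negated sum always points back to the nest. The following is a minimal sketch of that computation under the assumption that a heading source and an odometric step length are available; the function and variable names are illustrative, not taken from the cited robot.

# Minimal sketch of insect-style path integration, assuming the robot can
# read a compass heading (the ant's sky-polarization compass) and a step
# length (odometry / optic flow). Names and values are illustrative.

import math


def integrate_path(steps):
    """Accumulate (heading_rad, distance) steps into a position estimate."""
    x = y = 0.0
    for heading, dist in steps:
        x += dist * math.cos(heading)
        y += dist * math.sin(heading)
    return x, y


def homing_command(x, y):
    """Return the bearing and distance that point straight back to the nest."""
    return math.atan2(-y, -x), math.hypot(x, y)


# A wandering outbound trip followed by the computed straight homing vector.
outbound = [(0.0, 2.0), (math.pi / 3, 1.5), (-math.pi / 4, 3.0)]
x, y = integrate_path(outbound)
bearing, dist = homing_command(x, y)
print(f"position ({x:.2f}, {y:.2f}) -> home bearing {bearing:.2f} rad, {dist:.2f} m")

The appeal of this scheme is that it needs no landmarks or map; its weakness, as in the ant, is that heading and odometry errors accumulate, which is why it is usually combined with visual piloting near the goal.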
Among the robotic realizations that are targeted at humanoid vision, some aim at integrating information provided by foveal and peripheral cameras. Ude et al. [60.12], in particular, describe a system that uses shape and color to detect and pursue objects through peripheral vision and then recognizes the object through a more detailed analysis of higher-resolution foveal images. The classification is inferred from a video stream rather than from a single image and, when a desired object is recognized, the robot reaches for it and ignores other objects (Fig. 60.5). Common alternatives to the use of two cameras per eye consist of using space-variant vision and, in particular, log-polar images. As an example, Metta [60.13] describes an attentional system that should be extended with modules for object recognition, trajectory tracking, and naive physics understanding during the natural interaction of a robot with the environment. Other examples of robotic applications of perceptual processes underlying human vision are provided in Chap. 63 on perceptual robotics.

Fig. 60.5 (a) Four cameras implement foveal and peripheral vision in the head of the humanoid robot DB. Foveal cameras are above peripheral cameras. © JST, ATR. Robot developed by SARCOS. (b) The HRP2 humanoid robot © Kawada Industries Inc./National Institute of Advanced Industrial Science and Technology (AIST)

Vision-based simultaneous localization and mapping (SLAM) systems have also been implemented on humanoid robots, with the aim of increasing the autonomy of these machines. In particular, Davison et al. [60.14] used the HRP2 robot (Fig. 60.5) to demonstrate real-time SLAM capacities during agile combinations of walking and turning motions, using the robot's internal inertial sensors to monitor a type of three-dimensional odometry that reduced the local rate of increase in uncertainty within the SLAM map. The authors speculate that the availability of traditional odometry on all of the robot's degrees of freedom will allow more long-term motion constraints to be imposed and exploited by the SLAM algorithm, based on knowledge of possible robot configurations. Additional references to SLAM techniques are to be found in Chap. 37.

As another step towards autonomy in humanoid robots, mapping and planning capacities may be combined. Michel et al. [60.15], for instance, demonstrate that a real-time vision-based sensing system and an adaptive footstep planner allow a Honda ASIMO robot to autonomously traverse dynamic environments containing unpredictably moving obstacles.
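The space-variant, log-polar representation mentioned above can be illustrated with a short resampling routine: pixel density is highest at the centre of gaze and falls off exponentially toward the periphery, so most of the image budget is spent where the fovea looks. The sketch below is a simplified, assumed implementation (ring and wedge counts and the nearest-neighbour sampling are illustrative choices), not the mapping used by any particular robot cited here.

# Minimal sketch of space-variant (log-polar) resampling of a camera image,
# the kind of foveated representation mentioned above. Parameter names and
# values are illustrative; retina-like sensors fix them in hardware.

import numpy as np


def log_polar(image: np.ndarray, n_rings: int = 64, n_wedges: int = 128,
              r_min: float = 2.0) -> np.ndarray:
    """Resample a grayscale image (H x W) onto a log-polar grid centred on the image."""
    h, w = image.shape
    cy, cx = (h - 1) / 2.0, (w - 1) / 2.0
    r_max = min(cy, cx)
    # Ring radii grow exponentially: fine sampling near the fovea, coarse in the periphery.
    radii = r_min * (r_max / r_min) ** (np.arange(n_rings) / (n_rings - 1))
    angles = np.linspace(0.0, 2.0 * np.pi, n_wedges, endpoint=False)
    rr, aa = np.meshgrid(radii, angles, indexing="ij")
    ys = np.clip(np.round(cy + rr * np.sin(aa)).astype(int), 0, h - 1)
    xs = np.clip(np.round(cx + rr * np.cos(aa)).astype(int), 0, w - 1)
    return image[ys, xs]          # shape: (n_rings, n_wedges)


# A 256x256 synthetic frame becomes a 64x128 log-polar map: roughly 8x fewer
# pixels overall, yet full resolution is kept at the centre of gaze.
frame = np.random.rand(256, 256)
print(log_polar(frame).shape)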
60.3.2 Audition

Like vision, the sense of hearing in animals has been implemented on several robots to exhibit mere phonotaxis behavior or more complex capacities such as object recognition.

At the University of Edinburgh, numerous research efforts are devoted to understanding the sensory–motor pathways and mechanisms that underlie positive or negative phonotaxis behavior in crickets, through the implementation of various models on diverse robots such as the Khepera shown in Fig. 60.6. In particular, an analogue very-large-scale integrated (VLSI) circuit models the auditory mechanism that enables a female cricket to meet a conspecific male or to evade a bat (by the calling song or the echolocation calls they produce, respectively). The results suggest that the mechanism outputs a directional signal to sounds ahead at the calling-song frequency and to sounds behind at echolocation frequencies, and that this combination of responses simplifies later neural processing in the cricket [60.16]. This processing is the subject of complementary modeling efforts in which spiking neuron controllers are also tested on robots, thus allowing the exploration of the functionality of the identified neurons in the insect, including the possible roles of multiple sensory fibers, mutually inhibitory connections, and brain neurons with pattern-filtering properties. Such robotic implementations also make the investigation of multimodal influences on the behavior possible, via the inclusion of an optomotor stabilization response and the demonstration that this may improve auditory tracking, particularly under conditions of random disturbance [60.17].
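A minimal way to picture the directional response just described is a bilateral steering rule that compares the two ears and switches sign with frequency band: approach for the calling-song band, evade for bat-like ultrasound. The sketch below makes that assumption explicit; the band limits, gains, and function names are illustrative and far simpler than the analogue VLSI circuit and spiking-neuron models discussed above.

# Minimal sketch of a bilateral phonotaxis/anti-phonotaxis rule of the kind
# studied in the cricket work above: steer toward the louder ear for the
# male's calling-song band, and away from it for bat-like ultrasound.
# Frequency bands and gains are illustrative assumptions.

def steering_command(freq_hz: float, left_level: float, right_level: float) -> float:
    """Return a turn rate: positive = turn left, negative = turn right."""
    diff = left_level - right_level          # interaural intensity difference
    if 4000.0 <= freq_hz <= 6000.0:          # calling-song band: approach
        return 0.5 * diff
    if freq_hz >= 20000.0:                   # ultrasound band: evade
        return -1.0 * diff
    return 0.0                               # ignore everything else


# Louder on the left at 4.7 kHz -> turn left (approach); at 30 kHz -> turn right (evade).
print(steering_command(4700.0, 0.8, 0.3))
print(steering_command(30000.0, 0.8, 0.3))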