
Robotics and Autonomous Systems 29 (1999) 227–242

On robots and flies: Modeling the visual orientation behavior of flies

Susanne A. Huber*, Matthias O. Franz, Heinrich H. Bülthoff
Max Planck Institute for Biological Cybernetics, Spemannstr. 38, 72076 Tübingen, Germany

Received 16 February 1998; received in revised form 20 July 1998
Communicated by F.C.A. Groen

Abstract

Although artificial and biological systems face similar sensorimotor control problems, until today only a few attempts have been made to implement specific biological control structures on robots. Nevertheless, the process of designing the sensorimotor control of a robot can contribute to our understanding of these mechanisms and can provide the basis of a critical evaluation of existing biological models. Flies have developed a specialized visuomotor control for tasks such as course stabilization, fixation and approach towards stationary objects, tracking of moving objects and landing, which are based on the analysis of visual motion information. Theoretical and experimental results suggest that in flies the visuomotor control for course stabilization as well as fixation and approach towards stationary objects may be implemented at least partially by one common sensory circuit. We present agents with a visuomotor controller that regulates the two behaviors of course stabilization and object fixation. To test this controller under real world conditions, we implemented it on a miniature robot. We have been able to show that in addition to course stabilization and object fixation, the robot also approaches stationary objects. ©1999 Elsevier Science B.V. All rights reserved.

Keywords: Visual information processing; Sensorimotor control; Fly behavior; Object fixation; Autonomous agents

1. Introduction

1.1. Biology for robotics

In many aspects of technological applications an increasingly important goal is to build robust autonomous robots that are able to achieve tasks in a dynamic and unpredictable environment. Robots are particularly needed where humans are not able to survive and remote control is difficult to implement (e.g., repair of sewage systems, expedition to Mars,

* Corresponding author. Present address: University of Zurich, Institute of Psychology, Attenhoferstr. 9, CH-8032 Zurich, Switzerland. Tel.: +1-634-21-38; fax: +1-634-49-29. E-mail address: [email protected] (S.A. Huber)

volcanoes or deep sea explorations). By studying natural systems, much can be learnt about the design of technical solutions.

Recently, behavioral mechanisms of biological systems have specifically been considered more closely for the design of autonomous mobile robots. The suggestion of Tinbergen [46] on behavioral modules for the description of complex behavior in biological systems has turned out to serve as a good model for robust control of behavior in autonomous robots. Evolutionary, developmental and learning processes have been applied to adaptive control of autonomous agents [17,26]. In addition, several researchers have stressed that the environment plays an important role in the adequate organization of behavior and thereby reduces the complexity of internal mechanisms [3,4].

0921-8890/99/$ – see front matter ©1999 Elsevier Science B.V. All rights reserved.
PII: S0921-8890(99)00055-X


Until today, however, these concepts have been applied to the field of robotics only on a very abstract level and many problems still remain unsolved. Models of biological systems have been used for robot control, and actual sensorimotor mechanisms of biological systems have been implemented into robots (e.g., [2,8,11,50]), although little is known about the functionality of many sensorimotor processes in biological systems. Autonomous robots and simulated agents, e.g., have been designed with orientation mechanisms that are inspired by the behavior of flies. Visuomotor controllers have been developed that use visual motion information. Mostly these agents use the visual motion information for obstacle avoidance [12,18,51] or tracking of other objects [8,35].

1.2. Robots for biology

The development of robots that are inspired by living organisms does not only have a major influence on the development of new technologies but is also a promising complementary research approach for the study of biological systems. The simulation of simple life forms and the implementation of behavioral models on robots may be especially useful to our understanding of biological systems, e.g., to test models of biological information processing.

Information processing has been studied in insects for many years because insects show stimulus–response characteristics that allow insight into the mechanisms of visual processing as well as into the interaction of sensory input and motor output. The visually controlled orientation behavior of flies, in particular, has been studied extensively under controlled conditions with walking or tethered flying flies [21,24,31,42] as well as in free flight situations [6,10,33,48,52].

Results from fly research suggest that the two behaviors of course stabilization and approach towards stationary objects may be realized at least partially by one common behavioral module [5,9,23,24,42]. Reichardt and Poggio [42] used differential equations to describe these orientation behaviors of flies. In our work, we use the robotics approach to investigate this hypothesis. Our goal is to build a single visuomotor controller that regulates both behaviors. We modeled the subsequent processing steps in the

visual system of flies in detail. Like the robots of Franceschini et al. [18] and others, the agent uses visual motion information of a 360° horizontal field of view for behavior control. We can demonstrate that our computer-simulated agent is able to control the two behaviors of course stabilization and approach towards stationary objects by evaluating the motion information with a single visuomotor controller. To test the behaviors under real world conditions, we implement the control structure on a miniature robot, the Khepera™ [36].

2. Visuomotor control of a computer-simulated agent

2.1. Model of the visual system

The large compound eyes of the fruitfly Drosophila melanogaster each consist of about 700 ommatidia. The visual signals on the retinae are processed in three neural layers: the neuropils lamina (L), medulla (M) and lobula complex, which again can be divided into the anterior lobula (LO) and the posterior lobula plate (LP) (Fig. 1).

Results from fly research suggest that the neurons in these layers are responsible for contrast enhancement, reduction of signal redundancy, signal amplification, motion detection and evaluation of the motion signals [21,27,28,45]. We have implemented these levels of visual information processing in a simplified model (Fig. 2).

2.1.1. Spatial lowpass filtering in the retina
The agent had a horizontal array of sensors with a 360° field of view which scanned the visual world at the horizon. The visual input to the sensors was spatially lowpass filtered by Gaussian filters (σ = 3.8°). As in Drosophila [21], the optical axes of the sensors had an angular distance of Δφ = 4.6°. Hence the array had 78 sensors.
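For illustration, this receptor stage can be sketched in a few lines of Python (a sketch, not the authors' code; all names are ours, and only the filter width σ = 3.8° and the 78-sensor ring come from the text):

```python
import numpy as np

N_SENSORS = 78        # 360 deg ring at 4.6 deg inter-sensor spacing
SIGMA_DEG = 3.8       # width of the Gaussian acceptance function

def retina_response(luminance):
    """Sample a 1D panoramic luminance profile with a ring of
    Gaussian-blurred sensors looking at the horizon."""
    n_world = len(luminance)
    azimuth = np.arange(n_world) * 360.0 / n_world
    response = np.empty(N_SENSORS)
    for j in range(N_SENSORS):
        center = j * 360.0 / N_SENSORS
        # angular distance on the ring, wrapping at +/-180 deg
        d = np.abs((azimuth - center + 180.0) % 360.0 - 180.0)
        w = np.exp(-0.5 * (d / SIGMA_DEG) ** 2)
        response[j] = np.sum(w * luminance) / np.sum(w)
    return response
```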

2.1.2. Redundancy reduction and amplification of the signal in the lamina
In flies, large monopolar cells (LMCs) in the lamina are known to be responsible for local contrast enhancement both by reducing redundant parts of the signal and by signal amplification [34,45]. We have modeled


Fig. 1. Cross-section through the fly's brain with large compound eyes (redrawn from [28]), the retina (R), lamina (L), medulla (M), lobula (LO), lobula plate (LP) and the cervical connective (CC). The lamina and medulla are connected via the external chiasm (CHE) and the medulla and lobula complex via the internal chiasm (CHI).

Fig. 2. (a) Sketch of the agent with ring sensor (R), simplified models of the three layers of the visual system: lamina (L), medulla (M), lobula plate (LP), and the transmission weights (ω_ll, ω_lr, ω_rl, ω_rr) that couple the outputs of the large field units (β_l, β_r) to the motor system. (b) Model of the visuomotor controller with the functional processing steps that take place in the different layers of the agent's visual system ((*) the bandpass filter is only realized on the Khepera™ robot, see Section 4).

the temporal aspects of the LMC cells by applying a temporal highpass filter H (τ_H = 20.0 steps, where one step corresponds to one simulation cycle), which eliminates temporally redundant parts of the signal, i.e. signals which are steady or slowly changing in time. The signals are then linearly amplified to the full range of 256 gray-level values.
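A sketch of this lamina stage, assuming a first-order discrete-time highpass (the text fixes only τ_H = 20 steps and the amplification to 256 gray levels; the filter form and rescaling are our assumptions):

```python
import numpy as np

class LaminaHighpass:
    """Temporal highpass: subtract a running lowpass estimate
    (time constant tau in simulation steps) from the input, then
    amplify to the full 256 gray-level range."""
    def __init__(self, n_sensors, tau=20.0):
        self.alpha = 1.0 / tau
        self.lowpass = np.zeros(n_sensors)

    def step(self, signal):
        self.lowpass += self.alpha * (signal - self.lowpass)
        hp = signal - self.lowpass      # steady signals are removed
        span = hp.max() - hp.min()
        if span == 0.0:
            return np.zeros_like(hp)
        return 255.0 * (hp - hp.min()) / span   # linear amplification
```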

2.1.3. Motion detection and evaluation in the medulla and lobula plate
Compared to human eyes, the resolution of the compound eyes is very coarse. Thus, the perception of shape is more difficult. There is strong evidence that for visual orientation the detection of motion plays a prominent role (e.g., [27,40]).


For motion perception in insects, Hassenstein and Reichardt [27] proposed a detector which correlates the temporal modulation of image intensities in two neighboring ommatidia. The detector model has two mirror-symmetrical subunits. In each subunit the signals of two input channels are multiplied after the signals have been filtered by two lowpass filters with different time constants. We have modeled the lowpass filters of the motion detectors with the time constants τ_1 = 1.5 steps and τ_2 = 5.0 steps. The outputs of the two subunits are then subtracted to obtain the direction of the motion stimulus. From electrophysiological studies it has been concluded that the subtraction stage of the movement detector is localized on the dendrites of the large field tangential neurons within the lobula plate of the fly (review: [29]).

These neurons integrate the motion signals over large receptive fields, and are specialized in terms of certain motion patterns. For example, the horizontal equatorial cells (HSE cells) respond maximally to horizontal progressive motion in the frontolateral field of view. The HSE cells receive input from motion detectors that respond more strongly to a pattern moving horizontally from front to back (progressive motion) than from back to front (regressive motion). The asymmetry results from the fact that the time course of these motion detector subunits is not completely mirror symmetric [16]. The asymmetric response of the detector subunits is modeled by a gain of 1.0 for progressive and 0.7 for regressive motion.
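The detector array can be sketched as follows (our own sign convention: positive outputs denote motion towards higher sensor indices; the time constants and the regressive gain 0.7 come from the text):

```python
import numpy as np

class ReichardtDetectors:
    """Array of correlation-type motion detectors between
    neighboring sensors (Hassenstein-Reichardt model)."""
    def __init__(self, n_sensors, tau1=1.5, tau2=5.0,
                 gain_prog=1.0, gain_reg=0.7):
        self.a1, self.a2 = 1.0 / tau1, 1.0 / tau2
        self.lp1 = np.zeros(n_sensors)   # fast lowpass per channel
        self.lp2 = np.zeros(n_sensors)   # slow lowpass per channel
        self.gain_prog, self.gain_reg = gain_prog, gain_reg

    def step(self, x):
        self.lp1 += self.a1 * (x - self.lp1)
        self.lp2 += self.a2 * (x - self.lp2)
        # mirror-symmetrical subunits: multiply, then subtract
        d = self.lp2[:-1] * self.lp1[1:] - self.lp1[:-1] * self.lp2[1:]
        # asymmetric gain for the two motion directions
        return np.where(d > 0, self.gain_prog * d, self.gain_reg * d)
```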

Modeling the HSE cells, two large field units integrate the motion information over a 184° field of view in the right and left hemisphere with an overlapping region of 8° in the front. The sensitivity distribution S(j) (where j is the sensor index; the sensors ±1 are oriented at the visual angles φ = ±2.3° off the heading direction and the sensors ±39 at φ = ±177.1°) is modeled by the function

S(j) = a j^b e^(−cj)    (1)

with a = 0.625, b = 0.7 and c = 0.15 and j ∈ [1, 39] (see also Fig. 3). S(j) is bilaterally symmetrical for the two integration units (S(j) = S(−j)) and the maximum of S(j) is at sensor 5 (at φ_max = 23°). The outputs (β_l, β_r) of the integration units are coupled via transmission weights to the motor system.
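Eq. (1) and the weighted spatial integration translate directly into code (array layout and function names are our assumptions):

```python
import numpy as np

A, B, C = 0.625, 0.7, 0.15
N_HALF = 39                       # sensors per hemisphere
j = np.arange(1, N_HALF + 1)

# spatial sensitivity S(j) = a * j**b * exp(-c * j), Eq. (1);
# its maximum lies near j = b/c, i.e. around sensor 5
S = A * j ** B * np.exp(-C * j)

def large_field_outputs(motion_left, motion_right):
    """Weighted integration of the local motion signals of each
    hemisphere (arrays of length 39, frontal sensor first)."""
    beta_l = np.dot(S, motion_left)
    beta_r = np.dot(S, motion_right)
    return beta_l, beta_r
```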

Fig. 3. The spatial sensitivity distribution of both large field units.

2.2. Model of the motor system

The agent was modeled as a simple kinematic system with two motors, ignoring its mass and inertia. This approximation can be made in the modeling of the visuomotor control of flies because, after an initial acceleration, the fly reaches a constant velocity within a short time (< 10 ms), as the force produced by the wings is balanced by the increasing air friction. The motors had a distance of c = 1 u, given in units u of the agent's width (1 u = 0.25 cm). The force vectors produced by the wings of the fruitfly Drosophila have an estimated perpendicular distance from the center of the fly of about 0.2–0.3 cm (body-length: 0.3 cm) [21].

The velocities v_l = (0, v_l) and v_r = (0, v_r) (Fig. 4) of the motors were proportional to the force of the two motors. Each motor produced a constant velocity v_0 = 0.1 u/step which was modulated by the outputs of the processed visual information:

v_l = v_0 − T(s_l),    v_r = v_0 − T(s_r).    (2)

The signals s_l and s_r result from the visual motion information via the control signals m_l and m_r and intrinsic noise n_l and n_r (Fig. 5):

s_l = k m_l + n_l,    s_r = k m_r + n_r,    (3)

where k is a proportionality factor. The control signals m_l and m_r are explained in detail in Section 2.3.

As the force produced by the wings of the fly never becomes negative, the velocities v_l and v_r are always greater than or equal to 0 u/step. This was achieved by

Page 5: On robots and flies: Modeling the visual orientation ... · On robots and flies: Modeling the visual orientation behavior of flies ... land. Tel.: +1-634-21-38; ... tectors that

S.A. Huber et al. / Robotics and Autonomous Systems 29 (1999) 227–242 231

Fig. 4. Motor system of the simulated agent. The motors are separated by a distance c and lead to velocities v_l and v_r.

Fig. 5. Noise signal n_l of the left motor (Gaussian distribution with σ = 0.64°/step).

the sigmoid function T(s) for the left and right motor, respectively:

T(s) = −v_0   if s < −v_0,
T(s) = s      if |s| ≤ v_0,    (4)
T(s) = v_0    if s > v_0.

The system had two degrees of freedom: translation in the heading direction and rotation around the vertical body-axis. The translatory and rotatory velocities were

v_t = (v_r + v_l)/2    and    ψ̇ = (v_r − v_l)/c.    (5)
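In code, Eqs. (2)–(5) amount to a few lines (a sketch; the paper gives no value for the proportionality factor k, so the value below is arbitrary):

```python
import numpy as np

V0 = 0.1     # basic velocity in u/step
C = 1.0      # motor separation in agent widths u
K = 1.0      # proportionality factor k; value not given in the paper

def motor_step(m_l, m_r, noise_sigma=0.0, rng=np.random.default_rng()):
    """Map the control signals (m_l, m_r) to translatory speed v_t
    and turning rate psi_dot, Eqs. (2)-(5)."""
    n_l, n_r = rng.normal(0.0, noise_sigma, 2)   # intrinsic noise, Eq. (3)
    s_l, s_r = K * m_l + n_l, K * m_r + n_r
    T = lambda s: np.clip(s, -V0, V0)            # saturation, Eq. (4)
    v_l, v_r = V0 - T(s_l), V0 - T(s_r)          # Eq. (2), always >= 0
    v_t = (v_r + v_l) / 2.0                      # translation, Eq. (5)
    psi_dot = (v_r - v_l) / C                    # rotation, Eq. (5)
    return v_t, psi_dot
```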

2.3. Coupling of sensor and motor systems

2.3.1. Orientation towards stationary objects
In behavioral studies with flies, Reichardt and Poggio [38,42] showed that the fly orients itself towards a single black stripe in an otherwise homogeneous arena. This so-called fixation behavior is in contrast to a fly's behavior in a visually homogeneous environment, where it turns in all directions with equal probability. Due to intrinsic noise the motor system continuously produces torque, resulting in turning movements in either direction. The noise has a Gaussian distribution [42].

One explanation of the fixation behavior is based on the fact that the turning response of flies in open-loop experiments is stronger if a stripe moves from front to back than in the other direction [31], due to the asymmetric response of the HS-neurons. As the noisy torque signals of the motor system lead to movements of an object's retinal image, the resulting image flow will cause the fly to orient towards the object, because the positive response to progressive motion is stronger than the negative response to regressive motion.

Besides the large field cells, the small field cells [13–15,28,30] are also assumed to participate in object fixation and additionally in the tracking of moving objects. These cells respond selectively to motion in small areas of their receptive field. As these cells have not been included in our model controller, we will not describe them in further detail here.

In experiments where two stripes (at various angular separations Δψ_s, up to 180°) are presented, the fly does not show any preference for either one of the stripes. For a separation of the stripes of Δψ_s < 40°, flies fixate at the middle point between the two stripes. At around Δψ_s = 40°–60°, there is a transition from one stable orientation to two stable orientations. For Δψ_s > 60°, flies fixate on one stripe for a certain amount of time, then switch to the second stripe, fixate and switch back, etc. They fixate upon each stripe for approximately the same amount of time. Flies, however, do not fixate directly on the center of each stripe but fixate a point off the center such that the fly is slightly oriented towards the other stripe. Especially for Δψ_s = 60°, there is a high variation in the fixation behavior of individual flies, and the fixation point lies between about 2° and 15° off the center of the stripe. No such scatter was observed for Δψ_s = 80°–180°.


At an angular separation of 180° an almost exact fixation on the center of the stripe has been observed [38,41,42].

To test the model of the fixation behavior of flies, the signals that result from the large field integration units were coupled proportionally to the motor system:

m_l^f(t) = ω_ll β_l + ω_lr β_r,
m_r^f(t) = ω_rl β_l + ω_rr β_r.    (6)

The matrix

W = ( ω_ll  ω_lr )   (  0.9  −0.4 )
    ( ω_rl  ω_rr ) = ( −0.4   0.9 )    (7)

contains the transmission weights for the coupling of the outputs β_l and β_r of the two large field integration units to the left and right motor. We assumed bilateral symmetry for the sensorimotor coupling.

2.3.2. Optomotor response
While flying, flies are continuously stabilizing their course. In order to compensate for disturbances, which cause large rotatory image motion, they execute turning movements, the so-called optomotor response, along the direction of the image motion. This behavior may be implemented by cells that integrate the difference of the output signals from the horizontal cells in the two optic lobes [31,43]. A motor signal, simply proportional to the difference of the image motion in the two eyes, has two major disadvantages:
1. The response of the Reichardt motion detector increases with increasing image motion up to an optimum and decreases again for faster motion. Due to this response characteristic, response signals are ambiguous with respect to pattern velocity. It is impossible to decide from the visual signal alone whether the image motion resulted from a pattern that moves at low or very high speed. Full compensation is not possible.

2. In general, a purely proportional controller (without memory) may cause oscillations, because the compensation behavior is active only until the retinal image is stabilized. As the disturbance persists, image flow will again be detected and the compensation behavior is once more active until the retinal image is stabilized again, etc.

This second point, however, does not seem to be a disadvantage for the control system of flies. Warzecha and Egelhaaf [49] could show that, due to the response characteristic of the motion detector, large-amplitude fluctuations in the image motion, generated when the optomotor system becomes unstable, are transmitted with a small gain; this leads to only relatively small turning responses and thus small image motion.

To model the optomotor response behavior, the signals β_l and β_r that result from the large field units were integrated over time. The realization of this stage in biological systems is most probably different from a pure, linear integration [43], but is not modeled in more detail here.

In simulation the integration was replaced by a summation:

B_l(t) = Σ_{t′=−∞}^{t} β_l(t′),    B_r(t) = Σ_{t′=−∞}^{t} β_r(t′).    (8)

The coupling of these signals to the motor system leads to the control signals:

m_l^or(t) = ω_ll B_l(t) + ω_lr B_r(t),
m_r^or(t) = ω_rl B_l(t) + ω_rr B_r(t),    (9)

with the transmission weights from Eq. (7).

2.3.3. Optomotor response and object fixation
Experimental results suggest that the behaviors for course stabilization and approach towards stationary objects of flies may be implemented at least partially by a common sensory circuit [5,23,24,42]. In order to test this we designed agents with a visuomotor controller that regulated both the optomotor response and the fixation behavior. For the controller the control signals for the optomotor response m^or and object fixation m^f were combined into a proportional-integral controller:

m_l(t) = k_f m_l^f(t) + k_or m_l^or(t),
m_r(t) = k_f m_r^f(t) + k_or m_r^or(t),    (10)

with the constant factors k_f = 0.5 u/step and k_or = 5.0 · 10^−4 u/step. A similar processing step is used in computer simulations to model the figure-ground discrimination in the visual system of flies [43] and has also been described to explain data of the oculomotor signals in primates, where a pulse of activity is transformed into a sum of the pulse and its integral before being transferred to the motor neurons [7].
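Putting Eqs. (6)–(10) together, the complete controller can be sketched as a small stateful object (only the gains and weights are taken from the text; everything else is our assumption):

```python
import numpy as np

W = np.array([[0.9, -0.4],
              [-0.4, 0.9]])     # Eq. (7)
K_F, K_OR = 0.5, 5.0e-4         # k_f and k_or from Eq. (10)

class VisuomotorController:
    """Proportional-integral controller of Eq. (10): the
    proportional path yields fixation (Eq. (6)), the integral
    path the optomotor response (Eqs. (8) and (9))."""
    def __init__(self):
        self.B = np.zeros(2)            # running sums B_l, B_r

    def step(self, beta_l, beta_r):
        beta = np.array([beta_l, beta_r])
        self.B += beta                  # summation, Eq. (8)
        m_f = W @ beta                  # fixation term, Eq. (6)
        m_or = W @ self.B               # optomotor term, Eq. (9)
        m_l, m_r = K_F * m_f + K_OR * m_or   # Eq. (10)
        return m_l, m_r
```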


Fig. 6. Sketch of the simulated arena: (a) side view of the arena with a sinusoidal pattern and (b) with one black stripe, and (c) top view. The stripe is at 0°, the initial orientation of the agent is at ψ_s.

3. Experiments with the simulated agent

3.1. Fixation behavior

To compare the performance of the fixation behavior of the agent with that of flies in the studies of Reichardt and Poggio [38,39,42], we ran three experiments. In all three, the agent was fixed in the middle of a drum and therefore had only one degree of freedom, the rotation around its vertical body axis. The angular velocity of the turning response ψ̇ is given by Eq. (5).

Experiment 1 (Fixation on a single stripe). A single black stripe (angular width 17.3°) was presented on the wall of the drum in an otherwise white surrounding (Fig. 6(a) and (b)). The initial orientation of the agent was ψ_s = 41.4°. The simulation ran for 10 000 time steps.

Result. Due to torque fluctuations caused by intrinsic noise of the system, the retinal image of the stripe was not stationary. The agent turned towards the object because the turning response was stronger to progressive than to regressive motion. Within 1100 steps the agent oriented towards the stripe. For the rest of the simulation the agent was able to fixate on the stripe (Fig. 7). The slightly overlapping receptive fields of the integration units were necessary for the agent to keep the stripe in the front, because agents which lacked this characteristic were not able to fixate on the stripe in front of them.

Experiment 2 (Fixation on two stripes). In this experiment we ran five simulations with two black stripes (angular width 17.3° each) at an angular

Fig. 7. Histogram of the agent's orientation angle ψ in a drum with one stripe at ψ_s = 0.0°: ψ̄ = 0.95° ± 11.21°.

separation of Δψ_s = 40°, 60°, 80°, 100°, or 180°, respectively, on the otherwise white wall of the drum. In order to simulate the fixation behavior for two stripes, the agent was turned away from one stripe after every 2000 time steps by a large rotation angle of |ψ| = 72° for Δψ_s < 180° and of |ψ| = 144.0° for Δψ_s = 180°. The direction of the rotation was chosen randomly. The simulations ran for 50 000 steps each.

Result. Similar to the experiments with flies, a transition from one stable orientation to two stable orientations took place. At an angular separation of the two stripes of Δψ_s = 40° the histogram shows a single peak in the middle between the two stripes (Fig. 8). For Δψ_s > 40° the agent either fixated on one or the other stripe. The influence of the sensitivity function


Fig. 8. (a) Fixation behavior of a fly for two stripes (redrawn from [42]) with angular separations of Δψ_s = 40°, 60°, 80°, 100°, and 180°. Histogram of the error angle ψ between the heading direction of the fly and the stripes in the drum. (b) Fixation behavior of a simulated agent. Histogram of the agent's orientation angle ψ in a drum. Angles of the peaks and standard deviations of the bimodal Gaussian distribution are given in the diagram.

of the large field integration units is essential for this behavior. When agents were tested with a uniform sensitivity distribution, they fixated for all Δψ_s on a point in the middle between the stripes, where the contrast distribution and thus the image motion on the left and right eye are equal. Yet, with the sensitivity distribution S(j) (Eq. (1)), the optical flow resulting from the stripe nearest to the heading direction had a higher influence on the turning response; the agent thus fixated on that stripe.

For Δψ_s = 60°, 80° and 100°, the bimodal histogram¹ of the orientation angles shows that the agent did not fixate directly on the center of the stripe, because the estimated angular separation of the peaks of the histogram (Fig. 8) is smaller than the angular separation of the two stripes. For Δψ_s = 180° the agent directly fixated on the center of the stripes.

Experiment 3 (Fixation on a stripe on a textured surrounding pattern). On the walls of the drum a stripe (contrast C = 1.0) was presented within a surrounding sinusoidal pattern (C = 0.5 and λ = 36°).

¹ To obtain the maxima and variance of the bimodal histogram (Fig. 8) of orientation angles, the sum of two Gaussian distributions is fitted to the data; a sketch of such a fit follows.
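Such a fit can be sketched with SciPy's curve_fit, here on synthetic stand-in angles rather than the recorded data:

```python
import numpy as np
from scipy.optimize import curve_fit

def bimodal(x, a1, mu1, s1, a2, mu2, s2):
    """Sum of two Gaussians (footnote 1); parameter names are ours."""
    return (a1 * np.exp(-0.5 * ((x - mu1) / s1) ** 2)
            + a2 * np.exp(-0.5 * ((x - mu2) / s2) ** 2))

rng = np.random.default_rng(0)
angles = np.concatenate([rng.normal(-30, 11, 5000),
                         rng.normal(30, 11, 5000)])      # stand-in data
counts, edges = np.histogram(angles, bins=60)
centers = 0.5 * (edges[:-1] + edges[1:])
params, _ = curve_fit(bimodal, centers, counts,
                      p0=[counts.max(), -30, 10, counts.max(), 30, 10])
```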

The starting orientation of the agent was at ψ_s = 61.8°. The simulation ran for 10 000 steps.

Result. The agent fixated on this stripe after 1300 steps, although due to its own turning response the motion signal that resulted from the sinusoidal pattern was non-zero (Fig. 9). Due to the contrast dependency of the motion detectors, the signal that resulted from

Fig. 9. Histogram of the agent's orientation angle ψ in a drum with a sinusoidal pattern of low contrast and one prominent stripe (at ψ_s = 0.0°): ψ̄ = 1.99° ± 11.24°.


Fig. 10. Optomotor response. (a) Histogram of the agent's optomotor response. The agent is fixed at the center of a rotating drum. Rotation of drum: ψ̇_d = 2.9°/step; mean rotation of agent: ψ̇ = 2.8 ± 1.0°/step. (b) The agent's optomotor response is dependent on the angular speed of the arena.

the prominent stripe was much larger than that from the sinusoidal pattern, resulting in a turning response towards the stripe.

3.2. Optomotor response behavior

Experiment 4. As in the previous experiment, the agent was fixed in the middle of a drum and could only turn around its vertical body axis. In order to test the optomotor response, a sinusoidal pattern (λ = 36°) (Fig. 6(c)) rotated with a constant angular velocity ψ̇_d. The simulations ran for 10 000 time steps with different velocities ψ̇_d (Fig. 10(b)).

Result. In order to stabilize the retinal image and thus its orientation in the drum, the agent produced on average a compensatory turning response of ψ̇ ± σ (Fig. 10). The agent was able to compensate for 97% of the arena's rotation when the angular velocity was lower than 7°/step. Due to the sigmoid function (Eq. (4)) the agent was not able to compensate to this extent for higher velocities of the drum.

Experiment 5. As Warzecha and Egelhaaf [39,49] suggested a simple proportional controller for the generation of the optomotor response behavior, we also tested the performance of the agent with a proportional controller:

m_l(t) = k_f m_l^f(t),    m_r(t) = k_f m_r^f(t),    (11)

with the constant factor k_f = 0.5 u/step.

Result. With a purely proportional controller the agent was able to compensate for only 35% of the external rotation of the drum (ψ̇ = 1.0 ± 1.3°/step). However, the standard deviation was only slightly larger than with the proportional-integral controller.

4. Implementation on the robot

The controller was implemented on a mobile robot, the Khepera™ (Fig. 11). The imaging system on the robot consists of a conical mirror mounted above a small video camera. The optical axis of the video camera is oriented to the center of the cone. This configuration provides a 360° horizontal field of view extending from 10° below to 10° above the horizon [19]. The image was sampled on five circles along the horizon within a vertical aperture of 2.1°. The samples were averaged vertically to provide robustness against inaccuracies in the imaging system. Then the samples were horizontally lowpass filtered using a Gaussian filter, resulting in 96 sensors on the horizontal ring, 48 for each eye. The resolution was higher than in the simulation.
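The sampling stage might look roughly as follows (a sketch; the actual pixel geometry of the Khepera vision module and the circle radii are not given in the paper):

```python
import numpy as np

def horizon_samples(image, center, radii, n_azimuth=96):
    """Sample the camera image of the conical mirror on several
    circles around the mirror axis and average them vertically."""
    cy, cx = center
    theta = 2.0 * np.pi * np.arange(n_azimuth) / n_azimuth
    rings = []
    for r in radii:                  # five circles along the horizon
        y = np.round(cy + r * np.sin(theta)).astype(int)
        x = np.round(cx + r * np.cos(theta)).astype(int)
        rings.append(image[y, x])
    return np.mean(rings, axis=0)    # vertical averaging
```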

In the processing of the video images, we modelled both the temporal aspects of the LMCs and their spatial aspects, the latter by using a spatial bandpass filter obtained by predictive coding, a procedure known from image compression (e.g., [53]).


Fig. 11. Khepera™ robot with vision module. The optical axis of the camera is oriented vertically upwards, receiving mainly visual input from the image on the conical mirror.

Table 1
Predictive coding: weights of the lateral inhibition area with center pixel 0 and lateral pixels ±1, ±2, ±3 (obtained from 2000 images recorded during typical trajectories of the robot)

Pixel     0        ±1       ±2      ±3
Weight    1.000    −0.510   0.003   0.007

The weighting function of the filter is formed by three input units (m = 3) in either direction. Table 1 gives the average filter characteristic that was obtained from 2000 images recorded during typical trajectories of the robot. The visual stimuli were then temporally highpass filtered and amplified. Filtering as well as motion detection were the same as during simulation.
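Applied to the sensor ring, the Table 1 weights define a circular convolution kernel (function name and wrap-around handling are our assumptions):

```python
import numpy as np

# lateral inhibition weights from Table 1, pixels -3 ... +3
KERNEL = np.array([0.007, 0.003, -0.510, 1.000, -0.510, 0.003, 0.007])

def spatial_bandpass(sensors):
    """Predictive-coding stage: convolve the sensor ring with the
    lateral inhibition kernel, wrapping around the 360 deg ring."""
    k = len(KERNEL) // 2
    padded = np.concatenate([sensors[-k:], sensors, sensors[:k]])
    return np.convolve(padded, KERNEL, mode="valid")
```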

The transmission weights that couple the outputs of the large field integration units β_l and β_r to the motor system were the same as during simulation (Eq. (7)), and the control signals resulted from Eq. (10). The constant factors were set to k_f = 4.0 mm/s and k_or = 4.0 · 10^−3 mm/s. Intrinsic noise signals with a Gaussian distribution (Fig. 12) caused random rotations around the vertical axis. The velocity of the motors could be set stepwise to ±8n mm/s with n = 1, ..., 10 for the left and right motor. The basic velocity was set to v_0 = 40 mm/s, which was modulated by the visual motion signals. The robot was updated at a rate of 12 Hz.

Fig. 12. Gaussian distribution of the turning response due to the noise that is added to the motor signals.

5. Experiments with the robot

5.1. Optomotor response behavior

Experiment 6. In this situation, as in the fixation experiments with flies, the robot was only able to turn around its vertical axis within a circular arena (diameter: 45 cm). For the optomotor response a pattern with black and white stripes was painted onto the wall of the arena (λ = 51.4°). Instead of a constant rotatory bias of the drum, asymmetric motor signals were added to the left and right motor: ∓24 mm/s (this corresponds to ψ̇ = ±4.2°/frame).

Result. In the case of an asymmetric motor signal that led to a rotation of the robot about the vertical body-axis, the robot produced a compensatory turning response by the internally generated control signals, compensating for 97% of the motor asymmetry (Fig. 13(a)).

5.2. Fixation behavior

Experiment 7. To test the fixation behavior, a single black stripe (angular width: 25.7°) was painted onto the white wall of the arena. At the beginning of the experiment, the stripe was at an orientation of ψ = 57° off the heading direction.


Fig. 13. (a) Histogram of the optomotor response in the presence of a motor asymmetry of the left and right motor (v^a = (v_l^a, v_r^a) = (−24, +24) mm/s) (9700 steps). Compensatory motor signals of the left and right motor resulted in v̄_l = 23.2 ± 7.2 mm/s and v̄_r = −v̄_l. (b) Histogram (9700 steps) of the visual angle φ of the stripe's retinal image: φ̄ = −0.29° ± 10.36°.

Instead of measuring the absolute orientation of the robot in the drum, we used the visual angle of the stripe's retinal image.

Result. In an arena with a single black stripe, the robot was able to orient itself towards the stripe after about 8 s and fixate on it (Fig. 13(b)). This was in accordance with the result from the simulation.

5.3. Approach behavior

Experiment 8. In the next experiment the robot was started from various positions and with various orientations in the arena (Fig. 14). The robot had two degrees of freedom: translation in the heading direction and rotation around the vertical axis. The rotatory and translatory velocities were set according to Eq. (5).

Result. When the robot had two degrees of freedom, i.e. was able to both translate and rotate, it approached the stripe in all tested cases (Fig. 14). The movement of the robot was guided by the motion perceived at the border of the stripe, since no motion information can be derived from the uniform black stripe. Therefore, the agent often approached one border of the stripe instead of heading directly towards the middle of the stripe. The trajectories were obtained by a visual tracking system with a video camera recording the arena from a bird's-eye view. The system tracked a red marker on top of the robot.

6. Discussion

We have investigated whether one visuomotor controller is sufficient to generate both optomotor response and fixation behavior. We could in fact show that this is possible. Both the simulated agent and the Khepera™ robot showed optomotor response and fixation behavior depending on the particular environmental conditions. In addition, the robot approached prominent objects in the arena if it was able to move around. In this section we discuss the sensorimotor control of the simulated agent and the robot and give suggestions for further improvements.

6.1. Visual system

The visual signals were spatially lowpass filtered in order to model the spatial sensor layout at the receptor layer. Predictive coding as suggested by Srinivasan et al. [34,45], together with temporal highpass filtering, was applied in order to reduce redundant information in the image. The signal was then amplified such that it matched the full range of contrast sensitivity. The image processing procedure could be further improved, because the spatial inhibitory surround that results from a filter designed with the predictive coding technique was slightly stronger than actually observed in the fly's lamina. More recent theories take another approach to extract the maximal possible spatiotemporal information for a given sensory system [47]. They


Fig. 14. Approach of a single stripe in an arena. The robot starts with the orientations relative to the center point of the stripe Δψ = −60°, 60°, 65°, and 70°.

suggest a procedure to optimize a neural filter for an image sequence that the system normally encounters such that the information flow through the visual system is maximized. This may be implemented as a next step to model the lamina of flies in more detail. In addition, adaptive processes in the photoreceptors may be implemented to extend the range of luminances over which the sensors can operate. These processes will become essential if the system experiences large variations in lighting conditions, which was not the case in this study.

Missler and Kamangar [35] modeled the information processing of the fly's visual system for a tracking system. However, to design a controller that regulates the two behaviors of optomotor response and object fixation under real world conditions, some characteristics of the fly's visual system are essential which we included but which were missing in their model: (i) the reduction of signal redundancy in the lamina by lateral inhibition, (ii) the characteristic of the motion detectors to respond more strongly to progressive motion (from front to back) than to regressive motion

(from back to front), and (iii) the overlapping receptive fields as well as the sensitivity distribution of the tangential cells in the lobula plate.

6.2. Visuomotor control

Our work has demonstrated that with the controller the agents succeeded in both tasks of optomotor response and object fixation.

Experimental results on flies suggest a temporal integration stage of the output signals from the horizontal cells for the optomotor response [31,43]. We have shown that for our agent, integration of the image motion signals leads to an optomotor response behavior comparable to that of flies. Purely proportional control signals are not sufficient to compensate for the drum's rotation. Nevertheless, as in flies [39,49], no unstable behavior results, due to the response characteristic of the motion detectors.

The following perceptual properties are essential to generate a fixation behavior similar to that in


flies: (i) an asymmetric response of the motion detectors to progressive and regressive motion, and (ii) the non-uniform spatial sensitivity function of the large field integration units with their overlapping fields in front of the agent.

6.2.1. Simulated agent
The fixation behavior of the simulated agent is comparable to that of flies. Similar to the transition of behavior in flies, the agent changed the fixation behavior from the fixation of the middle point between the stripes at Δψ_s = 40° to the fixation of the two single stripes for Δψ_s > 60°, where the agent fixated on either of the stripes for a certain amount of time. Analogous to the findings in flies, for Δψ_s = 60°, 80° and 100°, the estimated angular separation of the peaks of the histogram (Fig. 8(b)) was on average smaller than the angular separation of the two stripes. When the agent fixated upon one of the two stripes, the motion signal that resulted from the second stripe in the lateral visual field produced a turning response in its direction. This behavior strongly depends on the non-uniform sensitivity function of the large field units and on the angular separation of the two stripes. For example, a uniform sensitivity distribution would cause a fixation on the center point between the two stripes. By the use of a threshold function for the outputs of the large field units, it may be possible to alter the agent's fixation behavior in the presence of two stripes. When fixating upon one stripe in the heading direction, the response to the second, laterally placed stripe is smaller and could be reduced further by a threshold function. Like flies, the agent fixates on the center of the stripes for Δψ_s = 180°.

Differential equations have been used [37,42] to describe the optomotor response and fixation behaviors of flies under the same condition of one dynamic degree of freedom: rotation around the vertical axis. The phenomenological equation of motion describes in a first approximation the reaction of the fly under normal fixation by the instantaneous values of the pattern position and velocity (see also [38,41]). In contrast to our controller, the stages of image preprocessing, motion detection and processing of the visual motion information that lead to information about pattern position and velocity are not part of these simulations. Nevertheless, in the differential equation the shape of the distribution of the position-dependent term for

the one-stripe fixation behavior is similar to the spatial sensitivity distribution S(j) of the large field units that integrate local motion information spatially in our model controller. Both indicate the attractiveness or the reaction strength towards the stripe located at a certain spatial position on the retina. In fact, the simulations of the two-stripe fixation behavior with differential equations and with our controller lead to comparable results.

6.2.2. Robot
The implementation of the control structure onto the robot was straightforward, and the results showed that the simulations represent the real world conditions of the robot very well. We slightly changed the proportionality factor for the proportional and integral parts of the controller and reduced the time constant of the lowpass filter that simulated the transmission time through the controller. The robot was able to fixate on the black stripe and also to stabilize the retinal image by the optomotor response. Moreover, with two degrees of freedom the robot even approached prominent objects in the scene.

Our controller for the robot and the simulated agent is a simplification of the control structure of flies. One large simplification is that the robot is moving on a 2D plane. We do not model flying insects [22,48] but agents with two degrees of freedom: the rotation around the vertical body axis and the translation in the heading direction. The controller also does not take into account the fly's head movement relative to its body [20,32,48].

Other robots inspired by the visual motion processing of flies have been designed mainly for obstacle avoidance. Franceschini et al. [18] used a robot in a real environment that extracts the visual motion information from a ring sensor. Their robot has analog optoelectronic circuits on board to detect the local motion signals. The robot used the motion information for obstacle avoidance and a light sensor for the approach of a distant goal. The movements of the robot were piecewise linear translations with constant velocity and saccadic rotations, using the fact that during pure translations the image motion for objects in the distance is smaller than for objects close to the robot. Robots have been designed [51] for wall following in a corridor that execute translatory and rotatory movements at the same time. Again the translatory image


flow signals the distance to the walls. In order to obtain the translatory flow, the number of rotations of the wheels is used to correct the image flow for rotatory motion. For the optomotor response and object approach no pure translatory motion signal is necessary. Therefore, our robot executes rotations and translations at the same time, with translatory and rotatory speeds that vary according to the image motion, e.g., large progressive image motion reduces the forward speed.

7. Conclusion

We combined models of the visual information processing system in flies [21,27,28,34,45] with models of the control structure of their orientation behaviors [5,6,9,10,42,43] and transferred the simulations onto a robot that moves in a real environment. Testing the behavior of a robot in a real environment, we have been able to show that for optomotor response and object fixation, it is not necessary to implement separate modules for the two behaviors. One controller regulates both the optomotor response and the fixation of stationary objects. Which of the two behaviors predominates depends on the particular environment. The definition of a basic set of behaviors that enables the agent to accomplish a given task is a fundamental problem for the design of behavior-based architectures. The designer has to predict all possible interactions between behavioral modules and the environment. Thus the decomposition of the control architecture into behavioral modules has to be done very carefully.

Our results agree with results from previous simulations using differential equations [9,42,43] and experiments on flies (e.g., [5,42]) in that the large field cells alone are most probably not responsible for the full object fixation behavior. Full object fixation is mediated by other cells, as for example the small field cells [13], which respond selectively to motion in small areas of their receptive field, or by other position-induced mechanisms. This is especially the case for the tracking of moving objects, e.g., a male fly chasing another fly or a fly approaching a flower moving in the wind. The stabilizing role of the optomotor system does not allow for the fixation of moving targets against a featured background if fixation and optomotor system are coupled as in our controller [9,33,48,52]. Additional

sensorimotor control is necessary for full object fixation, e.g., in the presence of two or more objects (stripes) and especially in a non-stationary environment.

For further studies on autonomous robots and fly behavior, the implementation of our controller on an analog VLSI chip would be very useful. On such chips, motion detection mechanisms have been successfully developed [1,25,44]. These chips integrate both the photosensors and the motion computation on a single chip. As in biological systems, a parallel processing of visual signals is thus possible. These chips are very small and represent an alternative to conventional computer vision systems; however, there are still problems intrinsic to this approach. The sensors show a poor signal-to-noise ratio, and the performance decreases significantly at low contrast and low illumination levels. However, for the simulation of biological processes this does not have to be a disadvantage, as noise is intrinsic to all biological neural systems and is also an important feature, because it allows switching between behaviors which otherwise could get stuck in local minima.

Acknowledgements

We would like to thank H. Krapp for many stimulating discussions on fly vision and fly behavior. We thank him and also A. Borst for helpful comments on preliminary versions of this paper.

References

[1] N. Ancona, T. Poggio, Optical flow from 1D correlation: application to a simple TTC detector, Memo no. 1375, AI Laboratory, MIT, Cambridge, MA, 1993.

[2] R.D. Beer, Intelligence as Adaptive Behavior, Academic Press, San Diego, CA, 1990.

[3] V. Braitenberg, Vehicles: Experiments in Synthetic Psychology, MIT Press, Cambridge, MA, 1984.

[4] R.A. Brooks, A robust layered control system for a mobile robot, IEEE Journal of Robotics and Automation 2 (1) (1986) 14–23.

[5] H. Bülthoff, Drosophila mutants disturbed in visual orientation: II. Mutants affected in movement and position computation, Biological Cybernetics 45 (1982) 71–77.

[6] H.H. Bülthoff, T. Poggio, C. Wehrhahn, 3D analysis of the flight trajectory of flies Drosophila melanogaster, Zeitschrift für Naturforschung 35c (1980) 811–815.


[7] R.H.S. Carpenter, Movements of the Eyes, Pion, London, 1979.

[8] D.T. Cliff, Neural networks for visual tracking in an artificial fly, in: Towards a Practice for Autonomous Systems: Proceedings of the First European Conference on Artificial Life, MIT Press, Cambridge, MA, 1992, pp. 78–87.

[9] T.S. Collett, Angular tracking and the optomotor response: An analysis of the visual reflex interaction in a hoverfly, Journal of Comparative Physiology 140 (1980) 145–158.

[10] T.S. Collett, M.F. Land, Visual control of flight behavior in the hoverfly Syritta pipiens L., Journal of Comparative Physiology 99 (1975) 1–66.

[11] H. Cruse, C. Bartling, G. Cymbalyk, J. Dean, M. Deifert, A modular artificial neural network for controlling a 6-legged walking system, Biological Cybernetics 72 (1995) 421–430.

[12] A.P. Duchon, W.H. Warren, Robot navigation from a Gibsonian viewpoint, in: Proceedings of the IEEE International Conference on Systems, Man and Cybernetics, IEEE Press, Piscataway, NJ, 1994, pp. 2272–2277.

[13] M. Egelhaaf, On the neural basis of figure-ground discrimination by relative motion in the visual system of the fly. I: Behavioral constraints imposed on the neural network and the role of the optomotor system, Biological Cybernetics 52 (1985) 123–140.

[14] M. Egelhaaf, On the neural basis of figure-ground discrimination by relative motion in the visual system of the fly. II: Figure-detection cells, a new class of visual interneurons, Biological Cybernetics 52 (1985) 195–209.

[15] M. Egelhaaf, On the neural basis of figure-ground discrimination by relative motion in the visual system of the fly. III: Possible input circuitries and behavioral significance of the FD-cells, Biological Cybernetics 52 (1985) 267–280.

[16] M. Egelhaaf, A. Borst, W. Reichardt, Computational structure of a biological motion-detection system as revealed by local detector analysis in the fly's nervous system, Journal of the Optical Society of America A 6 (7) (1989) 1070–1087.

[17] D. Floreano, F. Mondada, Automatic creation of an autonomous agent: Genetic evolution of a neural-network driven robot, in: From Animals to Animats 3: Proceedings of the Third International Conference on Simulation of Adaptive Behavior, MIT Press, Cambridge, MA, 1994, pp. 421–430.

[18] N. Franceschini, J.M. Pichon, C. Blanes, From insect vision to robot vision, Philosophical Transactions of the Royal Society of London B 337 (1992) 283–294.

[19] M.O. Franz, B. Schölkopf, M.A. Mallot, H.H. Bülthoff, Learning view graphs for robot navigation, Autonomous Robots 5 (1) (1998) 111–125.

[20] G. Geiger, T. Poggio, On head and body movements of flying flies, Biological Cybernetics 25 (1977) 177–180.

[21] K.G. Götz, Optomotorische Untersuchung des visuellen Systems einiger Augenmutanten der Fruchtfliege Drosophila, Kybernetik 2 (2) (1964) 77–92.

[22] K.G. Götz, Flight control in Drosophila by visual perception of motion, Kybernetik 4 (1968) 199–208.

[23] K.G. Götz, The optomotor equilibrium of the Drosophila navigation system, Journal of Comparative Physiology 99 (1975) 187–210.

[24] K.G. Götz, H. Wenking, Visual control of locomotion in the walking fruitfly Drosophila, Journal of Comparative Physiology 85 (1973) 235–266.

[25] R.R. Harrison, C. Koch, An analog VLSI model of the fly elementary motion detector, in: Advances in Neural Information Processing Systems, vol. 10, Morgan Kaufmann, San Mateo, CA, 1998, pp. 880–886.

[26] I. Harvey, P. Husbands, D. Cliff, Seeing the light: Artificial evolution, real vision, in: From Animals to Animats 3: Proceedings of the Third International Conference on Simulation of Adaptive Behavior, MIT Press, Cambridge, MA, 1994, pp. 392–401.

[27] B. Hassenstein, W. Reichardt, Systemtheoretische Analyse der Zeit-, Reihenfolgen- und Vorzeichenauswertung bei der Bewegungsperzeption des Rüsselkäfers Chlorophanus, Zeitschrift für Naturforschung 11b (1956) 513–524.

[28] K. Hausen, Motion sensitive interneurons in the optomotor system of the fly, Biological Cybernetics 45 (1982) 143–156.

[29] K. Hausen, Decoding of retinal image flow in insects, in: F.A. Miles, J. Wallman (Eds.), Visual Motion and its Role in the Stabilization of Gaze, Elsevier Science, Amsterdam, 1993, pp. 203–235.

[30] K. Hausen, C. Wehrhahn, Neural circuits mediating visual flight control in flies. I. Quantitative comparison of neural and behavioral response characteristics, Journal of Neuroscience 9 (11) (1989) 3828–3836.

[31] M. Heisenberg, R. Wolf, Vision in Drosophila, Springer, Berlin, 1984.

[32] M.F. Land, Head movement of flies during visually guided flight, Nature 243 (1973) 299–300.

[33] M.F. Land, T.S. Collett, Chasing behavior in houseflies Fannia canicularis, Journal of Comparative Physiology 89 (1974) 331–357.

[34] S.B. Laughlin, Form and function in retinal processing, Trends in Neurosciences 10 (11) (1987) 478–483.

[35] J.M. Missler, F.A. Kamangar, A neural network for pursuit tracking inspired by the fly visual system, Neural Networks 8 (3) (1995) 463–480.

[36] F. Mondada, E. Franzi, P. Ienne, Mobile robot miniaturization: A tool for investigation in control algorithms, in: Experimental Robotics III: Proceedings of the Third International Symposium on Experimental Robotics, Springer, London, 1994.

[37] M. Poggio, T. Poggio, Cooperative physics of fly swarms: An emergent behavior, AI Memo no. 1512, AI Laboratory, MIT, Cambridge, MA, 1994.

[38] T. Poggio, W. Reichardt, A theory of the pattern induced flight orientation of the fly Musca domestica, Kybernetik 12 (1973) 185–203.

[39] T. Poggio, W. Reichardt, Characterization of nonlinear interactions in the fly's visual system, in: W. Reichardt, T. Poggio (Eds.), Theoretical Approaches in Neurobiology, MIT Press, Cambridge, MA, 1981, pp. 64–84.

[40] W. Reichardt, Autocorrelation, a principle for the evaluation of sensory information by the central nervous system, in: W.A. Rosenblith (Ed.), Sensory Communication, MIT Press/Wiley, New York, 1961, pp. 303–317.


[41] W. Reichardt, T. Poggio, A theory of the pattern induced flight orientation of the fly Musca domestica II, Biological Cybernetics 18 (1975) 69–80.

[42] W. Reichardt, T. Poggio, Visual control of orientation behavior in the fly. Part 1: A quantitative analysis, Quarterly Reviews of Biophysics 9 (3) (1976) 311–375.

[43] W. Reichardt, T. Poggio, K. Hausen, Figure-ground discrimination by relative movement in the visual system of the fly. Part II: Towards the neural circuitry, Biological Cybernetics 46 (Suppl.) (1983) 1–30.

[44] R. Sarpeshkar, J. Kramer, G. Indiveri, C. Koch, Analog VLSI architectures for motion processing: from fundamental limits to system applications, Proceedings of the IEEE 84 (7) (1996) 969–982.

[45] M.V. Srinivasan, S.B. Laughlin, A. Dubs, Predictive coding: A fresh view of inhibition in the retina, Proceedings of the Royal Society of London B 216 (1982) 427–459.

[46] N. Tinbergen, Instinktlehre — vergleichende Erforschung angeborenen Verhaltens, Parey Verlag, Berlin, 1953.

[47] J.H. van Hateren, Theoretical predictions of spatiotemporal receptive fields of fly LMCs, and experimental validation, Journal of Comparative Physiology A 171 (1992) 157–170.

[48] H. Wagner, Flight performance and visual control of flight of the free-flying housefly (Musca domestica L.). III. Interactions between angular movement induced by wide- and smallfield stimuli, Philosophical Transactions of the Royal Society of London B 312 (1986) 581–595.

[49] A.-K. Warzecha, M. Egelhaaf, Intrinsic properties of biological motion detectors prevent the optomotor control system from getting unstable, Philosophical Transactions of the Royal Society of London B 351 (1996) 1579–1591.

[50] B. Webb, Using robots to model animals: a cricket test, Robotics and Autonomous Systems 16 (1995) 117–134.

[51] K. Weber, S. Venkatesh, M.V. Srinivasan, Insect inspired behaviours for the autonomous control of mobile robots, in: M.V. Srinivasan, S. Venkatesh (Eds.), From Living Eyes to Seeing Machines, Oxford University Press, Oxford, 1997.

[52] C. Wehrhahn, T. Poggio, H.H. Bülthoff, Tracking and chasing in houseflies (Musca), Biological Cybernetics 45 (1982) 123–130.

[53] R.C. Gonzales, R.E. Woods, Digital Image Processing, Addison-Wesley, Reading, MA, 1992.

Susanne A. Huber is currently a research fellow at the Department of Psychology, University of Zurich, Switzerland. In 1993 she received a Diploma in Physics from the Eberhard-Karls-Universität in Tübingen, Germany. She holds a Ph.D. degree in the Natural Sciences from the Eberhard-Karls-Universität in Tübingen. From 1993 to 1997 she worked as a research scientist at the Max-Planck-Institute for Biological Cybernetics in Tübingen. Since 1998 she has worked at the General and Developmental Psychology Group at the University of Zurich. Her research interests include behavior-based robotics as well as children's development of skilled behavior and knowledge about the physical world.

Matthias O. Franz graduated with an M.Sc. in Atmospheric Sciences from SUNY at Stony Brook, NY, in 1994. In 1995, he received a Diploma in Physics from the Eberhard-Karls-Universität, Tübingen, Germany, where he also finished his Ph.D. in Computer Science in 1998. His thesis research on minimalistic visual navigation strategies was done at the Max-Planck-Institut für biologische Kybernetik, Tübingen. He is currently working as a researcher at DaimlerChrysler Research & Technology, Ulm, Germany. His scientific interests include optic flow, autonomous navigation, and natural image statistics.

Heinrich Bülthoff is a scientific member of the Max-Planck-Gesellschaft and Director at the Max-Planck-Institute for Biological Cybernetics in Tübingen. He directs a group of about 35 biologists, computer scientists, mathematicians, physicists and psychologists working on psychophysical and computational aspects of higher-level processes in the following areas: object and face recognition, sensory-motor integration, spatial cognition, autonomous navigation and artificial life, computer graphics psychophysics, and perception and behavior in virtual environments. Professor Bülthoff is involved in many international collaborations on these topics and is a member of the following scientific societies: Psychonomics Society since 1996, Computer Society of the IEEE since 1988, Society of Neuroscience since 1987, Association for Research in Vision and Ophthalmology since 1984. He holds a Ph.D. degree in the Natural Sciences from the Eberhard-Karls-Universität in Tübingen. From 1980 to 1985 he worked as a research scientist at the Max-Planck-Institute for Biological Cybernetics in Tübingen. He then moved to the USA to work at the Massachusetts Institute of Technology in Cambridge and later became Professor at Brown University in Providence. After he had returned to the Max-Planck-Institute in 1993, he became an Honorary Professor at the Eberhard-Karls-Universität in 1996.