Identifying Challenges for Aerial Robots Operating in Near-Earth Environments

Keith W. Sevcik, William E. Green and Paul Y. Oh∗

Drexel University, Philadelphia, PA

[email protected], [email protected], [email protected]

Abstract

Homeland security missions executed in near-Earth environments are often time consuming, labor intensive and possibly dangerous. Aerial robots performing tasks such as bomb detection, search-and-rescue and reconnaissance could be used to conserve resources and minimize risk to personnel. Flying in environments which are heavily populated with obstacles yields many challenges. Little data exists to guide the design of vehicles and sensor suites operating in these environments. This paper explores the challenges encountered in implementing several different sensing technologies in near-Earth environments. The results of applying these technologies to control a robotic blimp are presented to direct future work.

1 Introduction

Homeland security missions bring new and unfamiliar territories which must be patrolled and kept safe. Caves, forests and other near-Earth environments, along with urban structures such as buildings and tunnels, are difficult and time consuming to safeguard. Furthermore, search-and-rescue missions are most often dangerous and require large, diverse task forces [2]. Robots offer a means to offset this demand in resources and personnel. Much of the research effort has been in applying ground-based robots [8]; however, flying or hovering offers capabilities unachievable by ground-based robots.

Small-scale aerial platforms, such as micro-air-vehicles (MAV's),1 are capable of flying in environments heavily populated with obstacles and can assist in such missions. However, there are several constraints for MAV's and small unmanned aerial vehicles (UAV's) that conventional UAV's, such as the Predator, do not face. For example, equipping MAV's with larger-scale navigational sensor suites, such as inertial measurement units (IMU's), global positioning systems (GPS) and pressure sensors, is not feasible due to payload limitations. Furthermore, GPS-based methods will not work in buildings, tunnels or caves because satellite signals are occluded. The net effect is that small, lightweight (i.e. less than 100 g) alternative sensor suites are required for aerial vehicles flying in near-Earth environments.

∗ IEEE Member. Address all correspondence to this author. This work was supported in part by the National Science Foundation CAREER award IIS 0347430.

1 MAV's are defined as aerial vehicles capable of safe, controlled flight in near-Earth environments. For example, vehicles such as those used in [3], while small, move too fast to navigate areas densely populated with obstacles.

Figure 1: A 30 inch diameter blimp carrying a 14 gram mini wireless camera can provide surveillance images for use in disaster scenarios.

The assessment and evaluation of such sensor suites demands an aerial platform which is small and can fly safely and slowly in near-Earth environments. Commercial vehicles currently being developed by Honeywell, BAE Systems and Piasecki Aircraft are capable of maneuvering in areas rich with obstacles. However, they are not yet available as research platforms. Nonetheless, collision avoidance and autonomous navigation sensor suites will be needed and can be developed in parallel. A simple and safe platform, such as a blimp, can serve as a test bed for sensor suite evaluation. Figure 1 shows a blimp with a 30 inch diameter (allowing it to fit through standard doorways) and a payload capacity of around 60 g. This is enough to carry a miniature wireless camera or stereo pair, compact sensors and other small electronic packages.

Prior work has demonstrated the ability to control and navigate aerial vehicles utilizing a variety of sensing techniques. Vision-based guidance and control has been demonstrated by [3]. Optic flow sensors studied in [1] have been used to perform autonomous tasks with MAV's. Localization and guidance using wireless motes has been achieved in [12]. However, the difficulties faced in near-Earth environments tend to segregate these sensing methods, making them effective for accomplishing only specific tasks. Little has been done to evaluate these technologies from a single, consistent platform.

This paper illustrates how these sensing techniques can be applied to a blimp. Section 2 discusses a blimp's platform characteristics and dynamics. Section 3 demonstrates the use of optic flow sensors, computer vision and wireless motes. Section 4 describes an indoor aerial robot competition conceived to identify outstanding challenges. Finally, section 5 concludes by summarizing and discussing future work.

2 Aerial Platform

Several aerial platforms have been experimented with and evaluated. Rotorcraft, such as helicopters or ducted fan units [7], can hover but are extremely difficult to control. Fixed-wing aircraft can be designed to fly at extremely slow speeds [9], but are limited by their payload capacities. Lighter-than-air vehicles, in contrast, are easy to fly, inexpensive, and capable of hovering.

2.1 Lighter-Than-Air Vehicles

Helium is the most common gas used in blimps today, with a lifting capacity of 1.02 kg/m3 at standard temperature and pressure. The blimp holds roughly 0.17 m3 of helium, giving it a theoretical lifting capacity of 174 g. Experimental results show an actual lifting capacity of 200 g. The total mass of the balloon, gondola, fins and mounting tape is 135.8 g. Therefore, the maximum payload that can be carried by the blimp is 64.2 g. This is substantially greater than the payload of typical near-Earth MAV's, making it an ideal platform for testing a variety of sensors.
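
As a sanity check, the payload budget follows directly from these numbers. A trivial sketch (values copied from the text above):

```python
# Payload budget for the blimp described above (all values from the text).
HELIUM_LIFT_KG_PER_M3 = 1.02      # lifting capacity of helium at STP
envelope_volume_m3 = 0.17         # approximate helium volume of the blimp

theoretical_lift_g = HELIUM_LIFT_KG_PER_M3 * envelope_volume_m3 * 1000.0
measured_lift_g = 200.0           # experimentally measured gross lift
airframe_mass_g = 135.8           # balloon, gondola, fins and mounting tape

print(f"theoretical lift:  {theoretical_lift_g:.1f} g")                  # ~173.4 g
print(f"available payload: {measured_lift_g - airframe_mass_g:.1f} g")   # 64.2 g
```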

The blimp has two electric motors with attached propellers positioned on the gondola which allow forward and backward movement. These two motors can also pivot via a radio-controlled (RC) servo to provide an upward or downward angle to the thrust vector, as depicted in Figure 2. This allows the blimp to increase or decrease its altitude, respectively. Yaw (i.e. rotation about the vertical axis) is controlled by an electric motor and propeller placed in the blimp's rear fin.

The general approach for modeling a blimp followed by [13], [14] and [15] assumes that:

1. The airship can be modeled as a rigid body, thereby neglecting aeroelastic effects.

2. The volume and mass of the airship can be considered constant.

This model is often applied to much larger blimps that use control surfaces to direct the craft. Since the system under investigation is much smaller, the following assumptions can be made to simplify the model:

3. The blimp is symmetric about the XZ plane.

4. The blimp is moving slowly enough, and is designed in such a way, that the aerodynamic forces are negligible.

Therefore, the dynamics of the blimp can be written as:

$$M\dot{V} = F_d + F_g + F_p$$

where

$V = [\,V_x \; V_y \; V_z \; \omega_x \; \omega_y \; \omega_z\,]^T$ (velocities along and angular rates about the body axes)
$M$ = 6x6 mass and inertia matrix
$F_d$ = dynamic force vector (Coriolis and centrifugal terms)
$F_g$ = gravity and buoyancy vector
$F_p$ = propulsive force vector

The remainder of the system definition closely follows the derivation presented in [13]. All equations of motion are defined about a reference frame fixed to the body of the blimp whose origin is located at the center of buoyancy, which is assumed to be coincident with the center of volume. The center of gravity of the airship is defined relative to the center of buoyancy. The mass matrix accounts for all masses and inertias present in the system, including virtual terms associated with the apparent added inertia of a blimp. The dynamic force vector $F_d$ is defined as follows:

$$F_d = \begin{bmatrix}
-m_z V_z \omega_y + m_y V_y \omega_z \\
-m_x V_x \omega_z + m_z V_z \omega_x \\
-m_y V_y \omega_x + m_x V_x \omega_y \\
-(J_z - J_y)\,\omega_z \omega_y + J_{xz}\,\omega_x \omega_y + (m_z - m_y) V_y V_z \\
-(J_x - J_z)\,\omega_x \omega_z + J_{xz}(\omega_z^2 - \omega_x^2) + (m_x - m_z) V_x V_z \\
-(J_y - J_x)\,\omega_y \omega_x - J_{xz}\,\omega_z \omega_y + (m_y - m_x) V_x V_y
\end{bmatrix}$$

The gravity and buoyancy vector $F_g$ is given by:

$$F_g = \begin{bmatrix}
k_x(mg - B) \\
k_y(mg - B) \\
k_z(mg - B) \\
a_z k_y B \\
(-a_z k_x + a_x k_z) B \\
-k_y a_x B
\end{bmatrix}$$

where $k_x$, $k_y$ and $k_z$ are the components of a unit vector in the direction of gravity, $B$ is the buoyancy force, and $a_x$ and $a_z$ locate the center of gravity relative to the center of buoyancy. Finally, the propulsive force vector $F_p$ for this specific actuation scheme is given by:

$$F_p = \begin{bmatrix}
T_p \cos\mu \\
T_t \\
-T_p \sin\mu \\
T_t\, d_{tz} \\
T_p (d_z \cos\mu - d_x \sin\mu) \\
T_t\, d_{tx}
\end{bmatrix}$$

where

$T_p$ = force from the thrust propellers
$T_t$ = force from the turning propeller
$\mu$ = angle of inclination of the thrust propellers
$d_x, d_z$ = x and z location of the thrust propellers
$d_{tx}, d_{tz}$ = x and z location of the turning propeller

Utilizing these equations of motion, it is possible to apply an input force to the thrust propellers and the turning propeller and observe the resulting linear and angular velocities. By tuning the various constants used to characterize the system, the model can be made to closely approximate the behavior of the real-world system.
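
To make the model concrete, the following minimal sketch (not the authors' implementation) transcribes $F_d$ and $F_p$ into Python and takes one forward-Euler step of $M\dot{V} = F_d + F_g + F_p$. The mass matrix, parameter values and function names are assumptions, and $F_g$ is passed in precomputed for a fixed attitude.

```python
import numpy as np

def dynamic_force(V, mx, my, mz, Jx, Jy, Jz, Jxz):
    """F_d: Coriolis and centrifugal terms transcribed from Section 2.1.
    mx..mz are effective per-axis masses (including apparent added mass)."""
    Vx, Vy, Vz, wx, wy, wz = V
    return np.array([
        -mz * Vz * wy + my * Vy * wz,
        -mx * Vx * wz + mz * Vz * wx,
        -my * Vy * wx + mx * Vx * wy,
        -(Jz - Jy) * wz * wy + Jxz * wx * wy + (mz - my) * Vy * Vz,
        -(Jx - Jz) * wx * wz + Jxz * (wz**2 - wx**2) + (mx - mz) * Vx * Vz,
        -(Jy - Jx) * wy * wx - Jxz * wz * wy + (my - mx) * Vx * Vy,
    ])

def propulsive_force(Tp, Tt, mu, dx, dz, dtx, dtz):
    """F_p for the pivoting thrust propellers (angle mu) and the tail prop."""
    return np.array([
        Tp * np.cos(mu),
        Tt,
        -Tp * np.sin(mu),
        Tt * dtz,
        Tp * (dz * np.cos(mu) - dx * np.sin(mu)),
        Tt * dtx,
    ])

def euler_step(V, M, Fg, Tp, Tt, mu, masses, inertias, geometry, dt=0.02):
    """One forward-Euler integration step of M V' = Fd + Fg + Fp."""
    Fd = dynamic_force(V, *masses, *inertias)
    Fp = propulsive_force(Tp, Tt, mu, *geometry)
    return V + np.linalg.solve(M, Fd + Fg + Fp) * dt
```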

2.2 PC-to-RC

In order to allow the blimp to be autonomously controlled by a ground-based PC, a PC-to-RC circuit was constructed [10]. Figure 3 shows how the circuit is interfaced with the PC and a standard 4-channel RC transmitter. This setup allows digital commands sent from the PC to be converted into pulse width modulated (PWM) signals. PWM signals can then be sent wirelessly to the blimp's onboard receiver.

Figure 2: Blimp diagram.

Figure 3: A PC-to-RC circuit converts digital commands to RC signals. Commands are then sent wirelessly to the blimp through a RC transmitter.

The control software running on the PC generates 8-bit numbers for each of the 4 channels on the transmitter. The numbers correspond to the length of the PWM signal. Pulse lengths vary from 1 to 2 ms, where 1.5 ms usually represents the neutral position of a RC servo. The microcontroller, integrated into the PC-to-RC circuit, receives the numbers and generates the pulses to be sent to the RC transmitter. The pulses are grouped into frames, with a frame containing one pulse for each channel. Figure 5 shows the signal that would be sent to a 4-channel transmitter.
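
The mapping from 8-bit channel values to pulse widths is simple to sketch. The snippet below is illustrative only: the paper does not specify the PC-to-microcontroller link (see [10]), so the one-byte-per-channel serial framing, port name and baud rate are assumptions, and pyserial is used as a stand-in transport.

```python
import serial  # pyserial; the actual PC-to-microcontroller link is an assumption

def channel_to_pulse_ms(value):
    """Map an 8-bit channel command (0-255) onto a 1-2 ms pulse width;
    128 lands near the 1.5 ms neutral position of an RC servo."""
    return 1.0 + value / 255.0

def send_frame(port, channels):
    """Send one frame: one 8-bit number per channel (hypothetical protocol).
    The microcontroller expands each number into a PWM pulse and passes
    the assembled frame to the transmitter's buddy port."""
    assert len(channels) == 4 and all(0 <= c <= 255 for c in channels)
    port.write(bytes(channels))

# Usage sketch: upward tilt on the thrust servo, neutral elsewhere.
# port = serial.Serial("/dev/ttyS0", 9600)
# send_frame(port, [192, 128, 128, 128])
```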

The frames sent from the microcontroller are received through the buddy port on the transmitter. Traditionally, the buddy port is used to allow a trainer to take over control from an amateur under their tutelage. This port can also be used to allow the computer to take control of the transmitter. Autonomous control can then be achieved based on information gathered about the surrounding environment.

Figure 4: Optic flow is used to sense when an obstacle is within close proximity of the blimp. The blimp avoids the collision by giving full throttle to the yawing motor.

Figure 5: Signal from microcontroller to transmitter.

3 Sensors

Intelligence obtained from sensors allows the robot's control system to make sophisticated decisions. In addition to traditional sensors such as sonar, infrared (IR) and vision, biomimetic sensors can be constructed as lightweight packages. Integrating such hardware can produce a robust sensor suite for near-Earth environments.

3.1 Biomimetic Sensing

Insects make heavy use of vision, especially optic flow, for perceiving the environment [4]. Optic flow refers to the apparent movement of texture in the visual field relative to the insect's velocity. Insects perform a variety of tasks in complex environments by using their natural optic flow sensing capabilities. While in flight, for example, objects which are in close proximity to the insect have higher optic flow magnitudes. Thus, flying insects, such as fruit flies [11] and dragonflies, avoid imminent collisions by saccading (or turning) away from regions of high optic flow.

Capturing such sensing techniques in a packaged sensor is a vast research area. Neuromorphic chips have been available for many years [6]. However, to achieve the desired weight of 1-2 grams, mixed-mode and mixed-signal VLSI techniques [5] are used to develop compact circuits that directly perform the computations necessary to measure optic flow [1].

Centeye has developed the one-dimensional Ladybug optic flow microsensor based on such techniques. A lens focuses an image of the environment onto a focal plane chip which contains photoreceptors and other circuitry necessary to compute optic flow. Low-level feature detectors respond to different spatial or temporal entities in the environment, such as edges, spots, or corners. The elementary motion detector (EMD) is the most basic circuit that senses visual motion, though its output may not be in a ready-to-use form. Fusion circuitry fuses information from the EMD's to reduce errors, increase robustness, and produce a meaningful representation of the optic flow for specific applications.

The resulting sensor, including optics, imaging, processing, and I/O, weighs 4.8 grams. This sensor grabs frames at up to 1.4 kHz, measures optic flow up to 20 rad/s (4-bit output), and functions even when texture contrast is just several percent. Integrating insect flight patterns with Centeye's hardware, collision avoidance was demonstrated using the blimp (see Figure 4). Although Centeye's optic flow sensors are not yet available commercially, Agilent Technologies' ADNS-2051 optical sensor can be utilized to achieve similar results.
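
The saccading behavior of Figure 4 reduces to a simple threshold rule. A minimal sketch follows, assuming a pair of Ladybug-style sensors angled to either side of the flight path and simple actuator callbacks; all names and the threshold value are hypothetical, with the threshold kept below the sensor's 20 rad/s measurement ceiling.

```python
FLOW_THRESHOLD = 15.0  # rad/s; placeholder below the sensor's 20 rad/s ceiling

def avoidance_step(left_flow, right_flow, set_yaw, set_thrust):
    """Saccade away from regions of high optic flow, mimicking fly behavior.

    left_flow/right_flow: optic flow magnitudes (rad/s) from sensors angled
    to either side of the flight path. set_yaw takes -1.0 (full left) to
    1.0 (full right); set_thrust takes 0.0 to 1.0. All hypothetical hooks.
    """
    if max(left_flow, right_flow) > FLOW_THRESHOLD:
        # Nearby texture produces high flow: full throttle on the yawing
        # motor, turning away from the side with the higher magnitude.
        set_yaw(1.0 if left_flow > right_flow else -1.0)
        set_thrust(0.0)
    else:
        set_yaw(0.0)
        set_thrust(0.5)  # cruise
```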

3.2 Computer Vision

To perform more sophisticated vision techniques such as line following, a wireless image acquisition system is required. RC Toys' Eyecam2 provides a reliable wireless video feed when utilized indoors. It is about as small as a US quarter coin, weighs just 15 grams and transmits color video on the 2.4 GHz frequency. The output from the receiver is composite video, which can be digitized with Hauppauge's USB-Live3 in order to plug-and-play into a PC.

Figure 6: A wireless camera is coupled with a computer vision algorithm to achieve line following.

To demonstrate line following, the blimp was placed over a black line with a white background. A program was created to process the video feed. The video was thresholded into a simple black and white image. Code was written to calculate the location of the centroid of the line within the image plane. PD control was then implemented to direct the blimp along the line (see Figure 6). Realistically, such ideal environments will not be encountered. However, the same path following techniques can be applied if the location of the blimp is known.
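
The pipeline just described (threshold, centroid, PD) is compact enough to sketch. The snippet below uses OpenCV as a stand-in for the original, unspecified image processing code; the threshold value and gains are placeholders to be tuned.

```python
import cv2

def line_follow_step(frame_bgr, prev_error, kp=0.8, kd=0.3):
    """One control cycle: threshold the image, locate the line centroid,
    and compute a PD yaw command. Threshold and gains are placeholders."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    # Dark line on a white background: invert so the line is foreground.
    _, mask = cv2.threshold(gray, 100, 255, cv2.THRESH_BINARY_INV)
    m = cv2.moments(mask)
    if m["m00"] == 0:
        return 0.0, prev_error  # line lost: hold heading
    cx = m["m10"] / m["m00"]    # centroid column in the image plane
    # Error: normalized horizontal offset of the line from image center.
    error = (cx - mask.shape[1] / 2.0) / (mask.shape[1] / 2.0)
    yaw_cmd = kp * error + kd * (error - prev_error)
    return yaw_cmd, error
```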

3.3 Wireless Mote Localization

Wireless motes provide a means for localizing the blimp. The term "motes" refers to a general class of technologies aimed at having small, robust and versatile sensors that are easily deployable over a wide area. Such sensor networks could be distributed in factories to monitor manufacturing conditions, spread over fields to log environmental conditions for agriculture, or mixed into concrete to actively measure building stresses and vibrations.

2 http://www.rctoys.com/eyecam.php
3 http://www.hauppauge.com

The smartdust series of motes manufactured by Crossbow Technologies4 consists of small wireless transceivers which can be interfaced with any sensor. Crossbow offers two common packages, the MICA2 and the MICA2DOT. At the core of these motes is an ATmega128L AVR microprocessor. This microprocessor executes all of the code programmed into the mote.

Code is written for the TinyOS operating system. TinyOS is an event-driven operating system that handles low-level microprocessor and radio networking tasks. This intrinsic networking ability allows for quick development of networks of wireless motes. The motes decide the most efficient network arrangement, resulting in an ad hoc network. The TinyOS architecture also supports multihopping, allowing two motes out of range of each other to pass their information through intermediate motes.

The radio module used by the MICA2 and MICA2DOT provides a measurement of the strength of received signals. The signal strength between a mobile mote attached to the blimp and wireless motes on the ground can be used to determine the relative position of the robot. If the locations of the ground-based motes are known, the robot can be localized. Such a strategy could be used to determine the absolute position of an aerial vehicle, the location of a vehicle relative to a target, or the position of an aerial vehicle relative to ground-based robots carrying motes.

To demonstrate this capability, a program was written to cause one of the motes to act as a beacon. Ground-based motes that detected this beacon were programmed to relay the strength of the received signal to a computer base station. These strengths were displayed using a Visual Basic GUI which indicated motes in proximity to the beacon (see Figure 7).
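
A minimal sketch of the base-station side, assuming each ground mote relays (node id, RSSI) pairs: picking the strongest responder reproduces the proximity display of Figure 7, and a power-weighted centroid gives a crude position estimate. The data structures are hypothetical.

```python
def nearest_node(rssi_by_node):
    """Ground mote reporting the strongest beacon signal (Figure 7 display)."""
    return max(rssi_by_node, key=rssi_by_node.get)

def weighted_centroid(rssi_by_node, node_positions):
    """Crude position estimate: centroid of known mote positions weighted
    by received power. A real deployment would calibrate a path-loss model."""
    weights = {n: 10.0 ** (rssi / 10.0) for n, rssi in rssi_by_node.items()}
    total = sum(weights.values())
    x = sum(w * node_positions[n][0] for n, w in weights.items()) / total
    y = sum(w * node_positions[n][1] for n, w in weights.items()) / total
    return x, y

# Usage sketch with RSSI in dBm from four table-mounted motes:
# readings = {"m1": -62.0, "m2": -48.0, "m3": -71.0, "m4": -55.0}
# positions = {"m1": (0, 0), "m2": (1, 0), "m3": (0, 1), "m4": (1, 1)}
# print(nearest_node(readings), weighted_centroid(readings, positions))
```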

4 Aerial Robot Competition

In investigating different sensing methodologies, several questions arose about the use of aerial robots in near-Earth environments. Problems such as the use of distributed versus local computing, the effect of environmental obscurants, and the range, resolution and robustness of sensors were common across different sensing technologies and aerial platforms. An annual indoor aerial robot competition was conceived to help identify these issues and encourage innovative solutions. The challenges addressed would increase in complexity, with the goal of achieving full autonomy by the year 2015.

4 http://www.xbow.com

Figure 7: Signal strength is measured between a mobile node attached to the blimp and fixed nodes placed on a table. As the blimp passes by, the graphic corresponding to the nearest node is lit.

In May 2005, Drexel University organized the first indoor aerial robot competition. The inaugural competition, featuring undergraduate teams from Drexel University and Swarthmore College (advised by Professor Bruce Maxwell), focused on both autonomous navigation and target identification in urban-like areas5.

4.1 Autonomous Collision Avoidance

One of the major challenges of autonomous flight in near-Earth environments is the limited availability of GPS. This was mimicked by hosting the competition indoors. The autonomous collision avoidance section utilized a 90 x 20 foot space populated with obstacles such as telephone poles and wire, urban structures, trees, etc. (see Figure 8). While these obstacles were symbolic of an outdoor setting, hosting the competition indoors prevents the use of GPS in future competitions as well. The obstacles were overlaid on a white cloth, and a black line ran through the course to denote a collision-free path. Teams had to implement a real-time line following algorithm that was invariant to changing lighting conditions (i.e. a glass roof enabled sunlight to illuminate portions of the course) and to noise from indoor video transmission. Towards the end of the course, robots were met with a low-speed fan to simulate wind disturbances. Points were awarded based on how far through the course robots were able to travel.

5 Thanks to Professors Hong Zhang and Rungun Nathan from Rowan and Villanova Universities, respectively, for judging the competition.

Figure 8: Swarthmore College's blimp following the collision-free path.

4.2 Teleoperated Target Identification

The other section of the competition consisted of several mock victims spaced out in a 90 x 50 foot area. These victims were positioned as if unconscious, perhaps as a result of a chemical or biological agent released through the ventilation system of an office building (see Figure 9). Using a wireless camera mounted on the blimp's gondola, teams utilized teleoperated control to identify survivors and deploy markers (symbolic of radio beacons) pinpointing their locations before hazmat teams could arrive. Blimp operators were only permitted to view video images transmitted wirelessly from the blimp's camera and could not directly view the search area. Points in this section were awarded based on marker proximity to survivors.


Figure 9: In the search-and-rescue portion, teams had to locate victims by viewing images transmitted from the robot's wireless camera.

4.3 Results

The difficulty of the line following section was evident after practice runs for each team. To compensate for this, each team was allotted two restarts (i.e. the blimp could be placed back in the position where it last lost the line). With the incorporation of this rule, both teams were able to follow the line until reaching the fan area, a distance of 75 feet. Once confronted with low-speed wind currents, each team's blimp was immediately blown off course, unable to demonstrate gust stabilization. The target identification task also proved to be difficult. Teams were only able to locate and mark 1 to 4 victims out of a possible 8. In addition to the scores accumulated in the collision avoidance and target identification sections, each team was also judged on the design of both the flight system and the marker deployment mechanism. The overall winner of the 2005 competition was Drexel University.

The key challenges identified in the inaugural competition were found mostly in the line following section. For example, sunlight shone sporadically on the course, resulting in large gradients which affected the performance of the computer vision algorithms. Also, wireless video transmission indoors is degraded, but still usable at short distances (i.e. < 100 feet). Furthermore, stabilizing an aerial robot in the presence of wind gusts remains a prevalent challenge.

In the teleoperated portion of the competition, teams found it difficult to interpret the raw video transmitted from the blimp's wireless camera. A bird's eye view is oftentimes unfamiliar to the operator and may require some image processing (e.g. object recognition) techniques to identify victims, tables, chairs, etc. During the teleoperated portion of the course, one of the teams lost control of their blimp when it was flown over a portion of the course that had been heated by sunlight. This observation identified thermals as a major concern for aerial robots operating in near-Earth environments.

5 Conclusions

The design of a sensor suite for a MAV varies greatly from the sensor suites utilized on traditional UAV's. Flying below tree tops or in and around urban structures prevents the use of GPS. Furthermore, devices such as IMU's and gyros often strain the payload capacities of small, lightweight aircraft. Design then focuses on achieving fundamental autonomous tasks such as altitude control and obstacle avoidance using the smallest packages possible. However, even the most highly-developed control system will fail when presented with unforeseen obstacles. Telephone wires, for example, are extremely thin, but could easily be fatal to a MAV. Such near-Earth environment impediments demand the use of varied sensing technologies to ensure robustness. Through fusion of optic flow sensing, vision-based guidance and wireless network localization, aerial vehicles are provided with a diverse sensor suite capable of addressing the issues faced.

This paper demonstrates the porting of these techniques onto a robotic blimp, which provides a robust, versatile platform whose dynamics are well understood and documented. To begin to characterize these sensor suites, future work must measure the reactions of these sensors to variables introduced in a controlled near-Earth environment. To facilitate controller design, experimental results must be duplicated in simulated models. With well understood models and corroborating physical data, design can then move towards making MAV's fully autonomous in near-Earth environments.

References

[1] Barrows, G., "Mixed-Mode VLSI Optic Flow Sensors for Micro Air Vehicles", Ph.D. Dissertation, University of Maryland, College Park, MD, Dec. 1999.

[2] Blitch, J., "World Trade Center Search-and-Rescue Robots", Plenary Session, IEEE Int. Conf. on Robotics and Automation, Washington, D.C., May 2002.

[3] Ettinger, S.M., Nechyba, M.C., Ifju, P.G., Waszak, M., "Vision-Guided Flight Stability and Control for Micro Air Vehicles", IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, Lausanne, Switzerland, pp. 2134-2140, October 2002.

[4] Gibson, J.J., The Ecological Approach to Visual Perception, Houghton Mifflin, 1950.

[5] Harrison, R., Koch, C., "An analog VLSI implementation of a visual interneuron: enhanced sensory processing through biophysical modeling", International Journal of Neural Systems, vol. 9, pp. 391-395, 1999.

[6] Higgins, C., "Sensory architectures for biologically-inspired autonomous robotics", The Biological Bulletin, vol. 200, pp. 235-242, April 2001.

[7] Hamel, T., Mahony, R., Chriette, A., "Visual Servo Trajectory Tracking for a Four Rotor VTOL Aerial Vehicle", IEEE Int. Conf. on Robotics and Automation (ICRA), Washington, D.C., pp. 2781-2786, 2002.

[8] Murphy, R., et al., "Mobility and sensing demands in USAR", IEEE Industrial Electronics Conference (IECON), vol. 1, pp. 138-142, 2000.

[9] Nicoud, J.D., Zufferey, J.C., "Toward Indoor Flying Robots", IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, Lausanne, pp. 787-792, October 2002.

[10] Sevcik, K., Oh, P., "PC to RC Interface", Servo Magazine, July 2004.

[11] Tammero, L.F., Dickinson, M.H., "The influence of visual landscape on the free flight behavior of the fruit fly Drosophila melanogaster", Journal of Experimental Biology, vol. 205, pp. 327-343, 2002.

[12] Corke, P., Peterson, R., Rus, D., "Networked Robots: Flying Robot Navigation using a Sensor Net", in Int. Symp. on Robotics Research (ISRR), 2003.

[13] Varella Gomes, S.B., Ramos, J.G., "Airship Dynamics Modeling for Autonomous Operation", Proceedings of the 1998 IEEE Int. Conf. on Robotics and Automation, Leuven, Belgium, pp. 3462-3467, 1998.

[14] Kim, J., Keller, J., Kumar, V., "Design and Verification of Controllers for Airships", Proceedings of the 2003 IEEE/RSJ Int. Conf. on Intelligent Robots and Systems, Las Vegas, Nevada, pp. 54-60, 2003.

[15] Hygounenc, E., Jung, I.-K., Soueres, P., Lacroix, S., "The Autonomous Blimp Project of LAAS-CNRS: Achievements in Flight Control and Terrain Mapping", International Journal of Robotics Research, vol. 23, pp. 463-512, 2004.
