

A Flexible Real-Time Control System for Autonomous Vehicles

Johannes Meyer, Armin Strobel
Institute of Flight Systems and Automatic Control, Technische Universität Darmstadt, Germany¹

Abstract

In this paper we present a framework for the real-time control of lightweight autonomous vehicles, comprising a proposed hardware and software design. The system can be used for many kinds of vehicles and offers high computing power and flexibility with respect to the control algorithms and additional application-dependent tasks. It was originally developed to control a small quad-rotor UAV, where stringent restrictions on the weight and size of the hardware components exist, but has since been transferred to a fixed-wing UAV and a ground vehicle for indoor and outdoor search and rescue missions. The modular structure and the use of a standard PC architecture at an early stage simplify the reuse of components and the fast integration of new features.

Figure 1: Quadrotor UAV controlled by the proposed system

1 Introduction

In recent years the interest in autonomous unmanned vehicles has grown significantly across a variety of applications. The spectrum reaches from surveillance and reconnaissance [1, 2], environmental monitoring [3] and the collection of geospatial data [4] to the support of search and rescue forces in disaster mitigation and prevention [5, 6]. Depending on the situational requirements, either unmanned ground vehicles (UGV), aerial vehicles (UAV), surface vehicles (USV) or even underwater vehicles (UUV), or a mix of these, are considered. The usage of unmanned vehicles has a high potential for reducing the risks for humans in hazardous scenarios and helps to save time and money compared to existing approaches. This especially holds for autonomous systems that are able to accomplish a predefined mission, or at least subtasks, without human interaction.

From a system engineer's perspective, many challenges need to be tackled when designing and implementing such a system. Even though many disparities exist in the mechanical design, propulsion, sensors and actuators of different types of vehicles, some problems are conspicuously similar. Basic prerequisites for an autonomous vehicle are the ability to localize itself, to build an internal model of what happens in the environment based on sensor readings, to plan its future steps and to control its actuators effectively. All of these components need to play together in a robust and reliable way, and in most cases time constraints exist to a greater or lesser extent. Another important aspect that needs to be addressed is the communication between the vehicle and an operator station, or between different vehicles.

In this paper we present a framework for the design of such a control system. It was originally developed as an onboard controller for a small quadrotor aircraft of approximately 1.2 kg takeoff weight (Fig. 1), but was soon adapted for integration into other vehicles as well. In the meantime it has been successfully deployed in a fixed-wing airplane and a ground vehicle used for search and rescue missions. In contrast to specialized microcontroller-based designs, we decided on a solution that relies on a commercial off-the-shelf onboard computer with PC architecture at a very early stage in the control process. Only basic sensor and actuator I/O is done by a microcontroller, which communicates with the computer over an Ethernet connection. The choice of this link technology was motivated by the high availability of single board computers

¹ http://www.fsr.tu-darmstadt.de/


equipped with that interface, and by its high bandwidth and real-time capability when packet collisions do not have to be dealt with. A modular software design makes it easy to replace only some of the components, or to simply load another configuration file to obtain a different system behavior, e.g. when moving to another vehicle. Many existing UVs are equipped with onboard computers as well, but we are not aware of any other projects that integrate them into the lowest control loop on platforms of a size similar to ours.

Limitations related to payload, power consumption and timing constraints have to be taken into account when designing an onboard control system. The developer has to find a suitable trade-off between the increased capabilities and flexibility and those limiting factors. Weight and size limitations are most important on small and lightweight vehicles, especially aerial vehicles. Lighter systems usually have less accurate sensors and possess less computing power. The vehicles also have to carry batteries for their energy supply, so each new piece of hardware leads to an additional increase in weight and a reduction of the available runtime. However, during the last years there has been substantial progress on the market of small and lightweight single board computers which consume little energy and can be used as capable onboard computers. Another frequently underestimated parameter is the tolerable transport latency in the control loop. Small vehicles usually have faster dynamics and higher eigenfrequencies, so latencies of more than a few milliseconds can have a negative impact on control quality and even cause instabilities in some systems.

Onboard systems for the control of unmanned vehicles are a well-researched area. While larger systems like the cars used for the DARPA Urban Challenge, with fewer constraints in size and weight, often use off-the-shelf components, e.g. from industrial automation [7], specialized autopilot solutions exist for small-scale UAVs and model aircraft [8]. They have sensors, processors and peripheral circuits integrated into one single board, and their control laws are more or less fixed except for parameter tuning. The integrated functions are usually restricted to waypoint navigation as well as attitude and air speed hold; some support auto-takeoff and landing. A number of middleware solutions for mobile robots exist that run in a Java virtual machine or as native tasks in Unix-based systems [9, 10], but none of them supports hard real-time operation; they instead rely on underlying controllers implemented in hardware.

In the following section we give a short overview of the proposed system structure and the different target vehicles. In section 3 the hardware components and their interaction are described, and section 4 introduces the associated modular software framework. Afterwards, some results are given in section 5 and the paper is concluded in section 6.

2 System Overview

In contrast to most existing solutions, which are purely based on microcontroller boards, our main goal was to come up with a hardware and software framework that simplifies the realization of new navigation and control algorithms while at the same time offering enough computing power for the additional onboard processing of computer vision, obstacle avoidance, mapping and cooperative control schemes. The choice of a capable onboard PC and Linux as the operating system, together with a real-time enabled kernel, leads to a powerful solution to this problem. Unfortunately, most available embedded PC boards do not incorporate common sensor interfaces like analog ports, SPI and I2C buses directly. For this reason we could not eliminate the need for a microcontroller board, which serves as a hardware abstraction for sensors and actuators and interfaces with the computer via an Ethernet link. However, this does not imply severe constraints with respect to flexibility, as the interface communication is very abstract and all the processing is done by the onboard computer.

2.1 Target Platforms

2.1.1 Quadrotor

The first platform is a quadrotor helicopter developed at TU Darmstadt (Fig. 1). Quadrotors are able to take off and land vertically and can hover at a fixed position without rudders or pitch control, which motivates their application in search and rescue missions [11]. The propulsion system, using four independently controlled motors and propellers, allows the carriage of comparatively heavy payloads. Our airframe can carry up to 500 g of cameras and other sensors and has a total weight of 1200 to 1400 g including the control system and batteries, for an endurance of approximately 20 minutes. With a diameter of 80 cm it can easily be deployed in outdoor missions as well as in indoor scenarios. As the rotational senses of two adjacent drives differ, the moments about all three axes and the total thrust can be controlled independently by simply varying the speeds of the individual motors.

For image acquisition a Logitech QuickCam Pro 9000 camera is mounted on the quadrotor, which transmits video images to the ground station via the onboard computer using the wireless network. Additionally, up to five frames per second are stored on onboard flash media for after-mission analysis, including references to the available navigational data.
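The independent control of thrust and the three axis moments amounts to a simple linear mixing step. The following sketch illustrates the idea for a plus-configuration quadrotor in normalized, illustrative units; the concrete allocation used on the real vehicle is not given here.

```python
# Sketch of quadrotor control allocation (plus configuration assumed,
# all quantities in normalized, illustrative units).
def mix(thrust, roll, pitch, yaw):
    """Map total thrust and the three axis moments to four motor commands.

    The front/rear pair and the left/right pair spin in opposite senses,
    so a common speed offset of one pair against the other produces yaw
    without changing the total thrust.
    """
    front = thrust + pitch + yaw
    rear  = thrust - pitch + yaw
    left  = thrust + roll  - yaw
    right = thrust - roll  - yaw
    return [front, right, rear, left]
```

A pure thrust command drives all four motors equally, while a pitch command creates a front/rear differential that leaves the total thrust unchanged.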

2.1.2 Fixed-wing aircraft

Our hardware and software have also been used in a fixed-wing, propeller-driven airplane. The airframe (Fig. 2) is based on the Graupner Elektro Kadett, an almost-ready-to-fly consumer model made of wood with a wing span of 1.6 m. Fully equipped with a brushless motor, a four-cell


LiPo battery (4.8 Ah) and our hardware platform, it weighs about 2.3 kg.

Figure 2: Airplane

2.1.3 Ground vehicle

The ground vehicle is based on a 1:8 scale R/C monster truck model (Fig. 3). It has been modified with four-wheel steering and an additional gear for driving slowly and with increased precision. On top of this chassis sits a box containing the interface board, with the same sensors as on our flying systems plus additional odometers in every wheel. The laser scanners and cameras (one daylight and one thermal camera) are directly connected to a high-performance mobile PC unit. This vehicle was used in the RoboCup Rescue competition [12], where the goal is to autonomously find victims in a simulated disaster scenario.

Figure 3: Ground Vehicle

Figure 4: Detailed view of the onboard hardware

Figure 5: Structure of the hardware components

3 Hardware Description

3.1 Computing Unit

The structure of our onboard hardware is shown in Fig. 5. Nearly every PC or compatible platform that comes with a supported Ethernet controller can be used as the computing unit. It runs a Xenomai real-time enabled Linux kernel with RTnet as an alternative Ethernet driver [13]. For our quadrotor and airplane we use a small single board computer with an Intel Atom Z530 processor in Pico-ITX format (100 × 72 mm). With a total weight of 400 g for all electronic components, the system fulfills the restrictions of our flying vehicles. The ground vehicle is equipped with a Mini-ITX Core 2 Duo mainboard and an additional CUDA-compatible GPU, which enables parallel image processing and accurate simultaneous localization and mapping, and runs the complete software framework with communication and behavior control.

3.2 Link Technology

Two main issues have to be considered when selecting a link technology for connecting sensors with an SBC, namely


the real-time capability and the achievable bandwidth and latency. Additional issues to consider are the expected availability on small embedded PC units now and in the near future, and the existence of a suitable driver implementation (Table 1).

Property              Ethernet              USB                          Serial ports   SPI
Real-time operation   using RTnet/Xenomai   no implementation available  possible       possible
Typical bandwidth     10/100 MBit/s         12/480 MBit/s                < 1 MBit/s     10 MBit/s and more
Availability          long term             long term                    mid term       unknown
Spreading             very good             very good                    often          poor

Table 1: Expected properties of link technologies

We decided to use Ethernet with RTnet/Xenomai as the real-time driver implementation on the PC side. MAC controllers are either already integrated in the microcontroller or can easily be connected as a peripheral device.

3.3 Interface Board

The computing unit is connected via Ethernet to the interface board, which sends sensor data on request and receives the output commands for the actuators. The interface board is equipped with a Philips/NXP LPC2138 microcontroller, a high-speed 16 bit A/D converter, a 6 degree-of-freedom inertial measurement unit (IMU), an ENC28J60 Ethernet controller and additional components for power management. Supplementary sensors and actuators can be connected to the analog inputs or via external ports according to the requirements of the respective vehicle. The available ports include two bidirectional serial ports, an SPI port, two I2C connectors and several GPIO ports.

A special expansion board for aerial vehicles is equipped with a barometer, a differential pressure sensor for measuring airspeed and a 3D magnetometer. These devices have been placed on the expansion board because they are very sensitive to electromagnetic fields and need to be located where little interference is expected.

For safety and legal reasons we have also connected a module that reads a PPM signal from a standard model aircraft R/C receiver and has several outputs for servos. This allows the basic functions of all vehicles to be actuated directly with a remote control. As a result, the airplane can be flown by a safety pilot even in the case of a total failure of the interface board or the PC unit.

The two serial ports on the interface board are usually connected to a GPS receiver and a Maxstream XBee data modem. The data modem complements the wireless network of the PC unit and is used for long-range, reliable communication with limited bandwidth, in order to transmit status and mission information and commands.

3.3.1 Firmware

The interface board initiates sensor measurements on request only and sends the results back to the onboard computer. Which sensor information is requested is determined by a bit field in the request packet. After all sensor data have been read, they are packed into corresponding messages and forwarded to the computing unit. For the serial ports the interface board buffers all incoming data and relays it on request. No pre-computation is done on the interface board, e.g. conversion of raw data to real physical quantities or filtering. However, we plan to sample well above the cut-off frequency of the IMU sensor and implement a digital filter in the next release.
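The request-driven sampling can be sketched with a small bit-field helper. The sensor-to-bit assignments below are invented for illustration and do not reflect the actual packet layout:

```python
# Hypothetical sensor-select bit assignments (not from the paper).
SENSORS = {"imu": 0, "adc": 1, "baro": 2, "mag": 3, "airspeed": 4}

def build_request(*names):
    """Onboard computer side: pack requested sensors into a bit field."""
    field = 0
    for n in names:
        field |= 1 << SENSORS[n]
    return field

def parse_request(field):
    """Firmware side: decide which sensors to sample in this cycle."""
    return [n for n, bit in SENSORS.items() if field & (1 << bit)]
```

The same byte thus serves as both the request encoding and the firmware's sampling schedule for one cycle.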

3.3.2 Hardware Modularity

All connected expansion boards, sensors and actuators are identified by the interface board during an initialization phase. Information about the available devices can be requested. Various additional sensors are already supported, among them ultrasound range finders, temperature sensors and wheel encoders. On the output side, different brushless motor controllers and PWM or RS485 servos are recognized as actuators.

4 Software Framework

This section presents details about the software framework in general and some of the components we implemented for autonomous vehicles.

Figure 6: Software component structure and associated data flows

It is well known that the control of complex autonomous vehicles requires a structured approach to the design of the control software. Integrating the whole functionality in a monolithic core would complicate maintenance significantly, especially when multiple developers are involved. The Open Robot Control Software (Orocos) provides the basic functionality for a modular component-based design [14]. Orocos comes with a real-time capable core library, the Real-Time Toolkit (RTT),

Page 5: A Flexible Real-Time Control System for …A Flexible Real-Time Control System for Autonomous Vehicles. Johannes Meyer, Armin Strobel Institute of Flight Systems and Automatic Control,

and several useful libraries and ready-to-use components. RTT serves as an abstraction layer and uses the concept of components, which are entities with distinct functionality providing data flow ports, attributes, properties, methods, commands and events as their external interface. By using RTT the developer does not have to cope with real-time thread management and thread-safe data exchange directly, but can implement the required functionality in a more descriptive manner. RTT generally requires a real-time enabled kernel like Xenomai or RTAI as the target platform, but also runs in native Linux mode or even on Windows hosts, without real-time guarantees.

4.1 General Structure

The currently realized components and their data flow for our quadrotor application are shown in Fig. 6. The structure can easily be transferred to other autonomous vehicles. Every block represents an Orocos component with a well-defined interface, and the components can be composed individually depending on the application. During the initial setup phase the components are connected to each other and assigned to an activity. RTT activities include periodic execution, which is mainly applied in the control loop, and event-triggered or non-periodic execution, used for high-level mission control and the communication component, amongst others. The setup can also change during runtime, e.g. when the control source changes from autonomous guidance to manual control or vice versa.

The main control loop realizes the essential functionality for the mobility of the vehicle. From the software point of view it is split up into four parts: the hardware interface, sensor abstraction, navigation/modeling and the controller. Each of these parts is represented by a dedicated component. The interface component directly interacts with the system via network communication and driver APIs and provides methods and events for accessing analog inputs and outputs and other external interfaces. There is a strong coupling between this component and the subsequent sensor components, which abstract the hardware details on an individual sensor level and provide data flow ports with values converted to meaningful physical quantities. As direct feedback of sensor information is not sufficient for control in many cases, an additional modeling component is introduced, which infers non-observable state variables from the sensor readings by using probabilistic models. The controller uses the measured or estimated signals to calculate the commands for actuators like servos and motors. Obviously, the implementation of the controller highly depends on the vehicle's kinematics and dynamics and needs to be adapted in the case of major changes or a transfer to another platform.

On top of the basic control loop, a supervisory component continuously monitors the outputs and reacts to exceptional events by switching to manual control mode or executing an emergency procedure. The guidance component is an aggregate of several different control blocks which

are switched depending on the current mission state. These modules generate a nominal trajectory or setpoints for the underlying controller during autonomous flight. The mission manager can execute preplanned complex missions described by a hierarchical state automaton. During manually controlled drive or flight, the commands are read from the external R/C receiver in real-time and fed directly to the actuators or the controller.

The communication module collects data from various other components and uses the interface to the radio modem to send a periodic status message to listening agents. Furthermore, the vehicle's state can be controlled by a set of commands which are forwarded to the autonomous guidance and mission control components. The counterpart of this interface is realized in our ground control software. It uses the same code as the communication component, so that new functionality can be introduced at a single point. However, we will not go into details about the ground segment in this paper. We also currently do not use the CORBA features of Orocos for communication, for the sake of compatibility with previous projects.

Depending on the available payload, additional components are loaded for interfacing cameras and laser range finders. Until now, this information has not been processed in real-time but is forwarded to other processes running on the same system. For indoor 2D navigation we implemented a simultaneous localization and mapping (SLAM) algorithm which has been successfully deployed in a search and rescue scenario in the RoboCup Rescue competition. The robot is able to autonomously find possibly injured humans and signs of hazardous materials using an object detector based on histograms of oriented gradients [15, 16] and evidence from the thermal camera.
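The port-based component wiring described in this section can be illustrated with a minimal sketch. Orocos RTT itself is a C++ library; the Python analogue below, with invented component names and an arbitrary scale factor and gain, only demonstrates how a sensor component converts raw readings and feeds a controller through data flow ports:

```python
# Python analogue of RTT-style components with data flow ports
# (component names, scaling and gain are invented for illustration).
class Port:
    """A single-buffer data flow port."""
    def __init__(self):
        self.value = None
    def write(self, v):
        self.value = v
    def read(self):
        return self.value

class SensorComponent:
    """Abstracts one sensor: converts raw readings to a physical quantity."""
    def __init__(self):
        self.out = Port()
    def update(self, raw):
        self.out.write(raw * 0.01)  # assumed raw-to-SI scale factor

class ControllerComponent:
    """Reads the measurement port and writes an actuator command port."""
    def __init__(self, measurement_port):
        self.inp = measurement_port
        self.cmd = Port()
    def update(self, setpoint):
        error = setpoint - self.inp.read()
        self.cmd.write(2.0 * error)  # plain P controller for illustration

sensor = SensorComponent()
ctrl = ControllerComponent(sensor.out)   # wiring is done at setup time
sensor.update(100)   # raw measurement arrives
ctrl.update(1.5)     # periodic controller step
```

Because the components only know each other's ports, either side can be replaced or rewired at setup time without touching the other, which is the property the configuration-file approach exploits.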

4.2 Component description

4.2.1 Hardware Interface

The hardware component periodically sends a request packet to the microcontroller board. For each sensor the user can configure an individual rate divider, so that only relevant data is transmitted in each timestep. The received information is forwarded to the sensor components for further processing. In most cases this includes the transformation of raw values to SI units and coordinate transformations. Other modules, like the GPS receiver or the radio modem, communicate directly with the connected devices, and the serial data streams are forwarded over the Ethernet link. As soon as the controller has finished its processing step, an event is emitted and the interface component writes out the new actuator commands immediately.

For networking, RTnet is used as a real-time protocol stack which seamlessly integrates into Xenomai/RTAI. RTnet supports many popular NIC chipsets. As only one master and one slave device are present on the cable, there is no need for special media access control as in other applications of real-time networking.
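The per-sensor rate dividers can be sketched as follows; the divider values and the 200 Hz base rate are illustrative assumptions:

```python
# Sketch of per-sensor rate dividers on a fixed base rate (divider
# values are illustrative; at a 200 Hz base loop they would yield
# 200 Hz IMU, 50 Hz barometer and 5 Hz GPS requests).
DIVIDERS = {"imu": 1, "baro": 4, "gps": 40}

def sensors_due(tick):
    """Return the sensors to request in this control-loop tick."""
    return [name for name, div in DIVIDERS.items() if tick % div == 0]
```

The request bit field for a given tick then only selects the sensors that are due, keeping the per-cycle packet small.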


4.2.2 Modeling and Navigation

For most vehicles, self-localization is the most important modeling step. An Extended Kalman Filter (EKF) estimates the 3D position, velocities and Euler angles from the IMU sensor data according to the strapdown algorithm [17, 18]. Additional sensor information is needed to prevent the navigation solution from drifting. For outdoor operation, the GPS receiver is used as a source for the approximate position and velocities in the earth-fixed reference frame, and the Earth's magnetic field serves as a heading reference. The height estimation is further improved by pressure measurements from the barometric sensor. For indoor localization, positional and directional feedback is delivered by the external SLAM module using a 2D laser range finder.
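As a greatly simplified illustration of the fusion principle, a scalar Kalman filter can correct an inertially predicted height with a barometric measurement. The actual strapdown EKF estimates position, velocity and attitude jointly; all numbers below are illustrative:

```python
# Scalar stand-in for the height channel: inertial prediction plus
# barometric correction (Q and R values are illustrative only).
def predict(h, P, v, dt, Q):
    """Propagate height estimate h (variance P) with vertical speed v."""
    return h + v * dt, P + Q

def update(h, P, z, R):
    """Correct the estimate with a barometric height measurement z."""
    K = P / (P + R)                  # Kalman gain
    return h + K * (z - h), (1 - K) * P

h, P = 0.0, 1.0                               # initial estimate and variance
h, P = predict(h, P, v=1.0, dt=0.1, Q=0.01)   # inertial step
h, P = update(h, P, z=0.5, R=1.01)            # baro pulls the estimate up
```

With equal prediction and measurement variance the gain is 0.5, so the corrected estimate lands halfway between the inertial prediction and the barometric reading.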

4.2.3 Controller

As mentioned above, each type of vehicle requires an individual controller structure. Our quadrotor uses a set of four PID controllers for stabilization and controlled flight, namely one for each Euler angle and one for the vertical speed, whose outputs are superposed afterwards to obtain the overall motor commands. Further cascades can be activated on demand that control the horizontal speed, the position and the height of the vehicle. The other vehicles use simpler approaches, e.g. a linear speed controller and a steering angle control for directional control in the case of our unmanned ground vehicle.

An interesting alternative to direct implementation in C++ is the graphical development of controller components. The Orocos Simulink Toolbox uses MathWorks Real-Time Workshop to generate Orocos-compatible code from models in Simulink. This approach allows even users with little programming experience to experiment with different control concepts and parameter tuning. We use this tool extensively, together with a quadrotor testbed, for research and education.
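A minimal discrete PID step, of the kind used once per Euler angle and for the vertical speed, can be sketched as follows; the gains and timestep are illustrative, not the tuned values of the real system:

```python
# Minimal discrete PID step (illustrative gains, not tuned values).
class PID:
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, setpoint, measured, dt):
        err = setpoint - measured
        self.integral += err * dt                  # integral term state
        deriv = (err - self.prev_err) / dt         # derivative term
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

roll = PID(kp=1.0, ki=0.0, kd=0.0)                       # one loop per axis
out = roll.step(setpoint=0.2, measured=0.0, dt=0.005)    # 5 ms timestep
```

The four loop outputs are then superposed into the individual motor commands, as described above.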

4.2.4 Autonomy and Mission Control

An autonomous mission is defined by a sequence of basic mission elements, which can also contain loops and conditional branches. The current mission state is controlled by the mission manager, which basically implements a finite-state machine and permanently checks whether the preconditions for a transition are fulfilled or an exception has occurred. Missions are loaded from a simple XML file or can be transmitted from the ground station via the communication link.

Depending on the state, the mission manager optionally activates an associated autonomous guidance component. These components implement the basic behavior patterns for the vehicle, e.g. takeoff, landing, waypoint navigation or other special maneuvers for an aerial vehicle, and generate control commands for the underlying controllers. More than

one component can be active at the same time as long as their outputs do not contradict each other.
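The mission manager's precondition-checked transitions can be sketched as a small state-transition table; the states and events below are invented for illustration:

```python
# Sketch of the mission-manager idea: a finite-state machine whose
# transitions fire when their precondition event has occurred
# (states and events are invented for illustration).
TRANSITIONS = {
    ("takeoff", "climb_done"): "waypoint",
    ("waypoint", "last_reached"): "land",
    ("waypoint", "battery_low"): "land",   # exceptional event
}

def step(state, events):
    """Advance the mission state if a transition precondition holds."""
    for (src, cond), dst in TRANSITIONS.items():
        if src == state and cond in events:
            return dst
    return state
```

A table like this maps naturally onto a simple XML mission description, with one element per transition.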

4.2.5 Communication

The communication component provides the external interface of the control system. We use a message-based binary protocol based on the UBX protocol [19] known from GPS chipsets by u-blox AG. Every message is tagged with a timestamp and an identifier for its source and destination. A standard IP connection or the radio modem connected to the interface board are available as the underlying transport layer. The communication component continuously broadcasts status messages and reacts to a set of known commands. In a future step we plan to extend modularity by providing methods for the online registration of commands, so that each component can define its own external interface without touching the communication component.
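A UBX-style frame with sync bytes, a length-prefixed payload and an 8-bit Fletcher checksum can be sketched as follows; the simplified field layout (single-byte message ID and length) is an assumption, not the authors' exact protocol:

```python
# Sketch of a UBX-style binary frame: sync bytes, length-prefixed
# payload, 8-bit Fletcher checksum. The single-byte ID and length
# fields are a simplification for illustration.
def checksum(data):
    """8-bit Fletcher checksum as used by the UBX protocol family."""
    ck_a = ck_b = 0
    for b in data:
        ck_a = (ck_a + b) & 0xFF
        ck_b = (ck_b + ck_a) & 0xFF
    return ck_a, ck_b

def frame(msg_id, payload):
    """Wrap a payload into a framed, checksummed binary message."""
    body = bytes([msg_id, len(payload)]) + payload
    ck_a, ck_b = checksum(body)
    return b"\xb5\x62" + body + bytes([ck_a, ck_b])
```

The receiver resynchronizes on the two sync bytes and drops any frame whose recomputed checksum does not match the trailing bytes, which makes the format robust on a lossy radio link.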

4.2.6 Logging and Replay

The logging component permanently monitors the data flow and writes a binary log file to flash media. This dataset can be imported into Matlab for diagnostic purposes in the case of failures, or for debriefing after the mission. When running the software in a special replay mode, data is reinjected into the system from the log file in real-time, which simplifies debugging and testing without real hardware or sensor data available.
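The replay mechanism boils down to re-injecting timestamped records while preserving their relative timing. A sketch of the delay computation, with invented millisecond timestamps and without actually sleeping:

```python
# Sketch of the replay idea: timestamped log records are re-injected in
# order; shown here is only the inter-record delay computation
# (timestamps in milliseconds, values invented for illustration).
log = [(0, "imu"), (5, "imu"), (20, "gps")]

def replay_delays(records):
    """Wait times between records that preserve the original timing."""
    delays, prev = [], records[0][0]
    for t, _ in records:
        delays.append(t - prev)
        prev = t
    return delays
```

In a real replay the component would sleep for each computed delay before writing the record to the same ports the hardware interface normally feeds.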

5 Results

The vehicles introduced in section 2 were successfully controlled by the proposed system. Obviously, the quadrotor makes the highest demands on real-time performance and behaves critically if latencies are too high. However, we could achieve good flight performance even under harsh wind conditions. The control loop runs at 200 Hz with average latencies of 1.2 ms, including data exchange with the microcontroller board, navigation filtering, control and logging. The onboard computer uses approximately 5 percent of its CPU time, so that plenty of room remains for other tasks. The quadrotor won first place at the Outdoor Autonomy Competition of the European Micro Air Vehicle conference, held in the Netherlands in late 2009.

The system demonstrated its flexibility when transferred to the fixed-wing UAV; only the flight controller had to be adapted to the new environment. By resorting to Simulink and Real-Time Workshop, it took no more than a few minutes to compile a previously implemented model which had been tested in a simulation environment.

Some small modifications were made for the ground vehicle, where in addition to the inertial sensor data a velocity feedback from odometry is used as an observable parameter and integrated into the Kalman filter. For operating indoors without GPS coverage, position and orientation information is delivered by the SLAM algorithm

Page 7: A Flexible Real-Time Control System for …A Flexible Real-Time Control System for Autonomous Vehicles. Johannes Meyer, Armin Strobel Institute of Flight Systems and Automatic Control,

running in an external module. Beside of that the overallhard- and software structure remained the same.
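Fusing an odometry velocity into the navigation filter amounts to one extra measurement update. The fragment below is a minimal sketch of this idea for a one-dimensional constant-velocity state; the state model, noise parameters and function names are illustrative assumptions, not the filter actually running on the vehicles:

```python
import numpy as np

def predict(x, P, dt, q=0.1):
    """Constant-velocity prediction for state x = [position, velocity]."""
    F = np.array([[1.0, dt], [0.0, 1.0]])
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],
                      [dt**2 / 2, dt]])          # discrete white-noise model
    return F @ x, F @ P @ F.T + Q

def update_velocity(x, P, v_meas, r=0.05):
    """Fuse an odometry velocity reading via measurement matrix H = [0, 1]."""
    H = np.array([[0.0, 1.0]])
    S = H @ P @ H.T + r                          # innovation covariance
    K = P @ H.T / S                              # Kalman gain, shape (2, 1)
    x = x + (K * (v_meas - H @ x)).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P
```

The same pattern generalizes to the full strapdown navigation state: the odometry measurement only adds rows to H, while prediction and the remaining updates stay untouched.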

6 Conclusion

In this paper we presented a control system which brings the advantages of a full-featured development environment and robotic middleware software down to small-scale systems and is suitable for controlling very heterogeneous types of autonomous vehicles. The integration of all software components from low-level control to high-level autonomy and mission planning in a single framework allows fast development cycles and simplifies debugging and offline testing. Especially the navigation, mission management and communication components have proven to be easily transferable to different platforms. We plan to make our code available to the public in the near future. We also work on interfaces to other robotic middleware suites such as ROS [20] or Player/Stage [21]. These frameworks are supported by strong communities, and a broad spectrum of ready-to-use tools and components exists.

Acknowledgements

This work has been funded, in part, by GRK 1362 of the German Research Foundation (DFG).

References

[1] M. Quigley, M.A. Goodrich, S. Griffiths, A. Eldredge, and R.W. Beard. Target acquisition, localization, and surveillance using a fixed-wing mini-UAV and gimbaled camera. In IEEE International Conference on Robotics and Automation, volume 3, page 2600. Citeseer, 2005.

[2] B. Coifman, M. McCord, R.G. Mishalani, M. Iswalt, and Y. Ji. Roadway traffic monitoring from an unmanned aerial vehicle. In IEEE Intelligent Transport Systems, volume 153, page 11, 2006.

[3] A. Elfes, S.S. Bueno, M. Bergerman, J.J.G. Ramos, and S.B.V. Gomes. Project AURORA: development of an autonomous unmanned remote monitoring robotic airship. Journal of the Brazilian Computer Society, 4, 1998.

[4] S.S. Wegener, S.M. Schoenung, J. Totah, D. Sullivan, J. Frank, F. Enomoto, C. Frost, C. Theodore, et al. UAV autonomous operations for airborne science missions. In Proceedings of the AIAA 3rd Unmanned Unlimited Technical Conference, Workshop and Exhibit, 2004.

[5] L. Merino, F. Caballero, R.M. Dios, and A. Ollero. Cooperative fire detection using unmanned aerial vehicles. In IEEE International Conference on Robotics and Automation, volume 2, page 1884, 2005.

[6] K. Daniel, B. Dusza, A. Lewandowski, and C. Wietfeld. AirShield: A System-of-Systems MUAV Remote Sensing Architecture for Disaster Response. In IEEE International Systems Conference 2009 (SysCon), Vancouver, pages 196–200. IEEE, March 2009.

[7] S. Thrun, M. Montemerlo, H. Dahlkamp, D. Stavens, A. Aron, J. Diebel, P. Fong, J. Gale, M. Halpenny, G. Hoffmann, et al. Stanley: The robot that won the DARPA Grand Challenge. Journal of Field Robotics, 23(9):661–692, 2006.

[8] H. Chao, Y. Cao, and Y.Q. Chen. Autopilots for small fixed-wing unmanned air vehicles: A survey. In IEEE International Conference on Mechatronics and Automation, Harbin, China, 2007.

[9] J. Kramer and M. Scheutz. Development environments for autonomous mobile robots: A survey. Autonomous Robots, 22(2):101–132, 2007.

[10] N. Mohamed, J. Al-Jaroodi, and I. Jawhar. Middleware for robotics: A survey. In IEEE Conference on Robotics, Automation and Mechatronics, pages 736–742, 2008.

[11] G.M. Hoffmann, H. Huang, S.L. Waslander, and C.J. Tomlin. Quadrotor helicopter flight dynamics and control: Theory and experiment. In AIAA Guidance, Navigation and Control Conference, 2007.

[12] Micha Andriluka, Martin Friedmann, Stefan Kohlbrecher, Johannes Meyer, Karen Petersen, Christian Reinl, Peter Schauss, Paul Schnitzspan, Armin Strobel, Dirk Thomas, and Oskar von Stryk. RoboCupRescue 2009 - Robot League Team: Darmstadt Rescue Robot Team (Germany). Technical report, Technische Universität Darmstadt, 2009.

[13] J. Kiszka, B. Wagner, Y. Zhang, and J. Broenink. RTnet - a flexible hard real-time networking framework. In 10th IEEE International Conference on Emerging Technologies and Factory Automation, pages 19–22, 2005.

[14] P. Soetens and H. Bruyninckx. Realtime hybrid task-based control for robots and machine tools. In IEEE International Conference on Robotics and Automation, volume 1, page 259, 2005.

[15] C. Wojek, G. Dorkó, A. Schulz, and B. Schiele. Sliding-Windows for Rapid Object Class Localization: A Parallel Technique. In DAGM-Symposium, pages 71–81, 2008.

[16] M. Andriluka, S. Roth, and B. Schiele. People-Tracking-by-Detection and People-Detection-by-Tracking. In IEEE Conference on Computer Vision and Pattern Recognition, 2008.

[17] P.G. Savage. Strapdown inertial navigation integration algorithm design part 1: Attitude algorithms. Journal of Guidance, Control and Dynamics, 21:19–28, 1998.

[18] P.G. Savage. Strapdown inertial navigation integration algorithm design part 2: Velocity and position algorithms. Journal of Guidance, Control and Dynamics, 21:208–221, 1998.

[19] ANTARIS 4 UBX protocol specification. http://www.u-blox.com/de/download-center.html?task=summary&cid=46&catid=97.

[20] M. Quigley, B. Gerkey, K. Conley, J. Faust, T. Foote, J. Leibs, E. Berger, R. Wheeler, and A. Ng. ROS: an open-source Robot Operating System. In Open-Source Software Workshop of the International Conference on Robotics and Automation, 2009.

[21] B. Gerkey, R.T. Vaughan, and A. Howard. The Player/Stage project: Tools for multi-robot and distributed sensor systems. In Proceedings of the 11th International Conference on Advanced Robotics, pages 317–323. Citeseer, 2003.