
Hardware-in-the-loop simulation for visual target tracking of octorotor UAV

Bambang Rilanto Trilaksono, Ryan Triadhitama, Widyawardana Adiprawita and Artiko Wibowo
Department of Electrical Engineering and Informatics, Bandung Institute of Technology, Bandung, Indonesia, and

Anavatti Sreenatha
School of Aerospace, Mechanical and Civil Engineering, University of New South Wales at ADFA, Canberra, Australia

Abstract

Purpose – The purpose of this paper is to present the development of hardware-in-the-loop simulation (HILS) for visual target tracking of an octorotor unmanned aerial vehicle (UAV) with onboard computer vision.

Design/methodology/approach – HILS for visual target tracking of an octorotor UAV is developed by integrating real embedded computer vision hardware and a camera with a software simulation of the UAV dynamics, flight control and navigation systems run on Simulink. Visualization of the visual target tracking is developed using FlightGear. The computer vision system is used to recognize and track a moving target using feature correlation between captured scene images and object images stored in the database. Features of the captured images are extracted using the speed-up robust feature (SURF) algorithm, and subsequently matched with features extracted from the object image using the fast library for approximate nearest neighbor (FLANN) algorithm. A Kalman filter is applied to predict the position of the moving target on the image plane. The integrated HILS environment is developed to allow real-time testing and evaluation of onboard embedded computer vision for the UAV's visual target tracking.

Findings – Utilization of HILS is found to be useful in evaluating the functionality and performance of the real machine vision software and hardware prior to its operation in a flight test. Integrating computer vision with a UAV enables the construction of an unmanned system with the capability of tracking a moving object.

Practical implications – HILS for visual target tracking of a UAV as described in this paper could be applied in practice to minimize trial and error in tuning the various parameters of the machine vision algorithm as well as of the autopilot and navigation system. It could also reduce development costs, in addition to reducing the risk of crashing the UAV in a flight test.

Originality/value – A HILS integrated environment for an octorotor UAV's visual target tracking, for real-time testing and evaluation of onboard computer vision, is proposed. Another contribution involves the implementation of the SURF, FLANN and Kalman filter algorithms on an onboard embedded PC and their integration with the navigation and flight control systems, which enables the UAV to track a moving object.

Keywords Octorotor UAV, SURF, FLANN, Computer vision, HILS, Simulation, Computer hardware

Paper type Research paper

Nomenclature

Symbols
u, v, w = velocity components in the body coordinate system, m/s
p, q, r = angular rates, rad/s
φ, θ, ψ = Euler angles, rad
Ωn = nth rotor speed, rad/s

Definitions, acronyms and abbreviations
UAV = unmanned aerial vehicle
HILS = hardware-in-the-loop simulation
SURF = speed-up robust feature
FLANN = fast library for approximate nearest neighbor
GPS = global positioning system
SIFT = scale invariant feature transform
UDP = user datagram protocol
DCM = direction cosine matrix

Introduction

Development of computer vision systems has been rapidly increasing for various purposes, including object or target recognition, and many algorithms have been proposed to meet the needs of the areas that rely on computer vision. Supported by the rapid development of digital processors, real-time computer vision systems are becoming increasingly feasible to implement. This allows the creation of unmanned systems intelligent enough to mimic human vision in navigating the environment, which leads naturally to the idea of combining the practicality of an unmanned aerial vehicle (UAV) with the sophistication of computer vision. However, for safety reasons, a hardware-in-the-loop simulation (HILS) needs to be developed before the computer vision system is installed and flight tested in the UAV. HILS for such a purpose can be developed by combining the real computer vision software and hardware with models and simulations of the UAV flight dynamics, flight control and navigation systems.

In this paper, the development of HILS for visual target tracking of an octorotor UAV is presented. The HILS is developed to test the reliability of the computer vision hardware and to tune the controller parameters (Adiprawita et al., 2007). Conducting HILS can minimize the risk of failure in field tests and reduce trial-and-error experiments in tuning the parameters of the computer vision, flight control and navigation systems. The HILS we developed consists of real computer vision hardware, Simulink blocks that simulate the dynamics of the UAV and its flight control and navigation systems, FlightGear for visualizing the dynamics and the object tracking performance, a camera and an LCD projector. To allow the UAV to track a moving object, digital scene matching area correlation (DSMAC) is developed. DSMAC correlates the captured scene images with the target images in the database (Wibowo et al., 2009). Using DSMAC, the vision system recognizes and tracks a moving target through feature correlation between the scene image and database images containing pictures of the target. Features are extracted using the speed-up robust feature (SURF) algorithm and matched with features from the database images using the fast library for approximate nearest neighbor (FLANN) algorithm. The position of the target is estimated using a Kalman filter, which provides the reference for the navigation module to process and subsequently forward to the UAV's flight controller.

The main purpose of this paper is to provide the insight gained in implementing a computer vision system for moving object recognition and tracking, integrated with navigation and control systems simulation in an octorotor UAV HILS environment. While a number of HILS developed for UAVs appear in the literature, to the best of the authors' knowledge (Gans et al., 2009) is the only work dealing with HILS for visual tracking of a moving object. In Gans et al. (2009), however, the computer vision processing is carried out at the ground station, whereas in the present paper the image processing is performed onboard. In addition, the present paper applies the SURF, FLANN and Kalman filter algorithms for improved visual tracking performance.

The organization of this paper is as follows. Section 2 discusses the physical description of the octorotor UAV used in our experiment, the mathematical equations of the octorotor dynamics derived from first-principle modeling, the navigation and control architecture implemented in the octorotor, and the performance of the controlled octorotor dynamics. Section 3 describes the computer vision system architecture and algorithms and their hardware implementation; the MATLAB Simulink model of the navigation and control system applied in the octorotor is discussed, and the HILS architecture is presented. Section 4 presents the HILS experiment results. Conclusions are drawn in Section 5.

Octorotor UAV

In this section, the architecture, hardware components and overall system of the octorotor UAV are described, and the dynamics model of the octorotor used in the simulation is presented.

Physical description

The octorotor UAV, which is equipped with eight rotors, can be viewed as a further development of the quadrotor UAV. A propeller and brushless motor are attached at each tip, rotating at the same speed and in the same direction as the adjacent rotor (Figure 1). It behaves much like a helicopter: it can hover, fly forward and backward, or strafe left and right. This movement flexibility makes the platform ideal for target tracking, because it can move in any direction with controlled speed and heading. The attitude of the aircraft is controlled by setting the speeds of the propellers in certain combinations (Figure 2). The octorotor has a higher payload capability than the quadrotor, which allows an onboard computer vision system. It also increases robustness: if one or more propellers fail, the remaining rotors can be reconfigured so that the UAV maintains safe flight.

The UAV is equipped with two microprocessors, an ARM Cortex M3 and an ATMEGA 168, which implement the navigation and flight control systems, respectively. It uses wireless communication to send the aircraft states to a ground station, where they are monitored on a PC, and it receives global positioning system (GPS) waypoint coordinates for autonomous waypoint navigation. For image processing, an embedded PC is added and connected to the navigation system. An analog CMOS camera is mounted under the platform; its analog signal is converted to a digital image by a frame grabber for further processing. The vision system is implemented on a PCM 4390 embedded PC from Advantech and connected to the navigation module over a serial link. In addition, the video is sent to the ground station through a wireless video sender for monitoring. Tilt compensation on the gimbal servo keeps the camera pointing a certain number of degrees downward despite the pitching of the UAV. The hardware architecture of the UAV is shown in Figure 3, and the embedded PC for the vision system, which is tested in this HILS, is shown in Figure 4.

Mathematical model of octorotor UAV

The equations of motion describing the octorotor dynamics are required in the software part of the HILS. A nonlinear dynamic model was obtained from a first-principle approach and linearized about the hover operating point, which is a good starting point for describing the dynamics of the UAV. The first-principle approach observes the forces and moments experienced by the aircraft; using Newton's second law and the Coriolis theorem, the nonlinear equations representing the behavior of the UAV can be derived (Tawab, 2010). The variables involved in the UAV model consist of three translational velocities in the x, y, z coordinates and three angular positions relative to the Earth inertial reference. The forces taken into account in the derivation are gravitational and gyroscopic forces and the forces produced by the propellers. The equations are then linearized using Taylor expansion and the small-perturbation method about the hover condition to produce a linear model of the octorotor. The response of the linear model was compared with real measured data and shown to represent the behavior of the octorotor dynamics quite well (Tawab, 2010). The linear model of the UAV is given in state space form:

$$\dot{x} = Ax + Bu, \qquad y = Cx + Du \qquad (1)$$

Figure 1 Octorotor UAV platform

Figure 2 Propeller speed configuration for octorotor attitude control. Notes: (a) yaw (anticlockwise direction); (b) yaw (clockwise direction); (c) climb up; (d) roll (clockwise direction); (e) pitch (anticlockwise direction); (f) pitch (clockwise direction); (g) climb down; (h) roll (anticlockwise direction)

Figure 3 Hardware architecture of the octorotor

where the matrices A, B, C and D, respectively, are:

$$A = \begin{bmatrix}
0 & 0 & -0.0001 & 0 & -0.4 & 0 & 0 & 9.81 & 0\\
0 & 0 & 0 & 0.4 & 0 & -0.19 & -9.81 & 0 & 0\\
0.0001 & 0 & 0 & 0 & 0.19 & 0 & 0 & 0.0193 & 0\\
0 & 0 & 0 & 0 & 0 & -0.0001 & 0 & 0 & 0\\
0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0\\
0 & 0 & 0 & 0 & 0 & 0 & 0 & 0 & 0\\
0 & 0 & 0 & 1 & 0 & 0.002 & 2.9\times10^{-7} & 0 & 0\\
0 & 0 & 0 & 0 & 1 & 0 & 0 & 0 & 0\\
0 & 0 & 0 & 0 & 0 & 1 & 0.0001 & 0 & 0
\end{bmatrix}$$

$$B = \begin{bmatrix}
0 & 0 & 0 & 0 & 0 & 0 & 0 & 0\\
0 & 0 & 0 & 0 & 0 & 0 & 0 & 0\\
0.019 & 0.019 & 0.019 & 0.019 & 0.019 & 0.019 & 0.019 & 0.019\\
4.4\times10^{-9} & 4.4\times10^{-9} & 0.1839 & 0.1839 & 4.4\times10^{-9} & 4.4\times10^{-9} & 0.1839 & 0.1839\\
0.1839 & 0.1839 & 0 & 0 & 0.1839 & 0.1839 & 0 & 0\\
-0.0585 & -0.0585 & -0.0585 & -0.0585 & -0.0585 & -0.0585 & -0.0585 & -0.0585\\
0 & 0 & 0 & 0 & 0 & 0 & 0 & 0\\
0 & 0 & 0 & 0 & 0 & 0 & 0 & 0\\
0 & 0 & 0 & 0 & 0 & 0 & 0 & 0
\end{bmatrix}$$

$$C = \begin{bmatrix} I_{8} & 0_{8\times1}\\ 0_{1\times8} & 0 \end{bmatrix}, \qquad D = 0,$$

Figure 4 Embedded PC for computer vision system

with the state x and input u defined as:

$$x = \begin{bmatrix} u & v & w & p & q & r & \phi & \theta & \psi \end{bmatrix}^{T}, \qquad u = \begin{bmatrix} \Omega_1 & \Omega_2 & \Omega_3 & \Omega_4 & \Omega_5 & \Omega_6 & \Omega_7 & \Omega_8 \end{bmatrix}^{T}. \qquad (2)$$

The octorotor dynamics model is compared with data obtained from experiments by applying the propeller speed inputs used in the experiment to the dynamics model in the simulation. The response of each state is then compared with the filtered measured data from the experiment. The results are shown in Figure 5 for the angular rates and Figure 6 for the angular positions.
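The state space model above can be exercised directly in simulation. The following is a minimal illustrative sketch in Python (the paper's simulation itself runs in Simulink); a forward-Euler integrator is assumed, and the matrices A and B from equation (1) are passed in rather than retyped here.

import numpy as np

def simulate_lti(A, B, x0, u_of_t, dt=0.01, t_end=40.0):
    """Integrate x' = Ax + Bu from x0; return time and state histories."""
    n = int(t_end / dt)
    t = np.arange(n) * dt
    x = np.zeros((n, len(x0)))
    x[0] = x0
    for k in range(n - 1):
        u = u_of_t(t[k])                          # eight rotor speeds (rad/s)
        x[k + 1] = x[k] + dt * (A @ x[k] + B @ u)
    return t, x

# Example: response to a small equal perturbation of all rotor speeds
# about hover. States are ordered [u, v, w, p, q, r, phi, theta, psi]
# as in equation (2).
# t, x = simulate_lti(A, B, np.zeros(9), lambda t: np.full(8, 1.0))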

Navigation and control architecture

Our vision-based navigation system generates reference signals from the tracking error relative to the target. The aim is to keep the target at the center of the image plane. Error along the image y-axis is compensated by forward or backward movement at a speed that depends on the magnitude of the error, while error along the x-axis is compensated by turning the aircraft toward the target.

The existing flight controller of the UAV accepts absolute pitch, roll and yaw angles. The commanded speed and turning rate are therefore translated by the navigation module into pitch and yaw angle set points for the flight controller. Translational movements of the octorotor are controlled through the attitude of the aircraft: to move forward or backward, the aircraft is pitched down or up by a certain angle, respectively. The velocity controller adjusts the pitch angle and levels the aircraft once it has reached the desired speed. Because pitching the aircraft can decrease the altitude, an altitude hold is applied to keep the UAV at a set altitude. The yaw angle command for the flight controller is obtained by integrating a turning rate proportional to the x-axis error.

The pitch, roll and yaw controllers take the form of cascaded PI controllers: the inner loop controls the pitch, roll or yaw rate, while the outer loop controls the absolute angle. A gyroscope provides the feedback signal; since it only measures the angular rate of each state, its output is integrated to obtain the absolute attitude angles. A cascaded PD controller is used for altitude hold: the inner loop controls the climb rate and the outer loop controls the height of the UAV, with an accelerometer and a pressure sensor providing the climb rate and altitude feedback, respectively. For simulation purposes, the PID controller parameters are tuned so that the closed-loop system mimics the dynamics of the real aircraft as closely as possible. The navigation and flight controller structures are shown in Figures 11 and 12, respectively, and a simplified sketch is given below.
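To make the architecture concrete, the following illustrative Python sketch shows the navigation law and one cascaded PI attitude loop. It is not the authors' code: the image size, all gain values and the proportional form of the speed controller are assumptions for illustration only, and sign conventions are omitted.

IMG_CX, IMG_CY = 160, 120                  # assumed image center (320x240 frame)
K_X, K_Y, K_SPEED = 0.002, 0.01, 0.1       # hypothetical navigation gains

def navigation(px, py, vx, yaw_sp, dt):
    """Map the target's pixel position to pitch/yaw angle setpoints (rad)."""
    speed_cmd = K_Y * (IMG_CY - py)        # y-axis error -> speed command
    pitch_sp = K_SPEED * (speed_cmd - vx)  # speed controller -> pitch setpoint
    yaw_sp += K_X * (px - IMG_CX) * dt     # x-axis error -> turn rate, integrated
    return pitch_sp, yaw_sp

class PI:
    """A single PI term; two in cascade form one attitude controller."""
    def __init__(self, kp, ki):
        self.kp, self.ki, self.integral = kp, ki, 0.0
    def step(self, error, dt):
        self.integral += error * dt
        return self.kp * error + self.ki * self.integral

angle_loop = PI(kp=4.0, ki=0.5)            # outer loop: angle -> rate setpoint
rate_loop = PI(kp=2.0, ki=0.1)             # inner loop: rate -> motor command

def attitude_control(angle_sp, angle, rate, dt):
    rate_sp = angle_loop.step(angle_sp - angle, dt)
    return rate_loop.step(rate_sp - rate, dt)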

Computer vision system and HILS

In this section, the computer vision system used for the target tracking application is described. The algorithm for finding the desired target is presented, along with the simulation platform comprising the octorotor dynamics, navigation and flight controller models, and the visualization of the octorotor dynamics, moving target and scenery.

Figure 5 Angular rate responses of dynamic model against measured signals (yaw, roll and pitch rate comparisons; angular rate (rad/s) versus time (s); legend: Measure, Model)

Computer vision system

The computer vision system measures the position of the target, providing the feedback signal that keeps the object at the center of the image plane. The vision system consists of four major stages. First, features from the scene and the object database image are extracted using the SURF algorithm. Next, the extracted features from both images are matched using FLANN to correlate objects in the scene image with those from the image database. To determine the position of the object in image coordinates, the center of mass of the correlated features is calculated. Kalman filtering is then applied to track the moving target across captured frames: the Kalman filter predicts the object's position and corrects the measured position. The pixel coordinates are sent to the navigation system for further processing by the navigation module. A diagram of the computer vision system is shown in Figure 7. Brief descriptions of each part of the computer vision system are given below.
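The four stages can be read as a simple processing loop. The sketch below is illustrative Python (the actual system runs on the embedded PC) showing only the control flow; load_database_features, extract_surf and match_and_locate are placeholders for the steps detailed in the following subsections, and kalman_step for the filter described afterwards. match_and_locate is assumed to return the target centroid, or None when the object is not found.

import cv2

def vision_loop(send_to_navigation):
    capture = cv2.VideoCapture(0)                     # frame grabber device
    obj_kps, obj_desc = load_database_features()      # placeholder: target features
    while True:
        ok, frame = capture.read()                    # 1. image capturing
        if not ok:
            break
        kps, desc = extract_surf(frame)               # 2. SURF feature extraction
        centroid = match_and_locate(obj_desc, kps, desc)  # 3. matching + center of mass
        px, py = kalman_step(centroid)                # 4. Kalman predict/correct
        send_to_navigation(px, py)                    # pixel coordinates to navigation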

Figure 6 Angular position responses of dynamic model against measured signals (yaw, pitch and roll angle comparisons; angle (rad) versus time (s); legend: Measure, Model)

SURF algorithm

SURF can be viewed as a further development of the scale invariant feature transform (SIFT) algorithm and was proposed by Bay et al. (2008). It approximates every calculation in the SIFT algorithm without sacrificing the quality of the feature descriptors. SURF consists of interest point detection, interest point localization and feature description.

SURF uses the integral image as an intermediate image representation to allow fast computation of box-type convolution filters (Figure 8), as proposed by Viola and Jones (2001). The value of the integral image at a location (x, y) is the sum of the intensity values of the input image within the rectangle from the origin to (x, y). With this representation, the sum of intensities inside any rectangular area can be computed with only three additions/subtractions. Interest points are found by detecting blob-like structures at locations where the determinant is maximal. The detector is based on the determinant of an approximation of the Hessian matrix, defined at a location x = (x, y) and scale σ as:

$$H(\mathbf{x},\sigma) = \begin{bmatrix} L_{xx}(\mathbf{x},\sigma) & L_{xy}(\mathbf{x},\sigma)\\ L_{xy}(\mathbf{x},\sigma) & L_{yy}(\mathbf{x},\sigma) \end{bmatrix} \qquad (3)$$

Detection is performed at various image scales, known as an image pyramid, so that the same interest points can be detected repeatedly at different scales of the input image. SURF up-scales the filter, which can therefore be precomputed, instead of repeatedly reducing the image size at each scale, which would take more time to downsample the image and could eliminate its high-frequency components. With this method, image analysis can be done with less computational effort.

The SURF descriptor is similar to SIFT's: it describes the intensity distribution within the neighborhood of an interest point (Lowe, 2004). The descriptor is based on the Haar wavelet responses in the x- and y-directions, and the integral image again reduces the time required to compute these responses at each interest point location. For faster indexing, interest points can be separated by the sign of the Laplacian, which distinguishes whether a detected interest point is a bright blob on a dark background or vice versa.
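The two ingredients above translate into very little code. The following illustrative Python sketch uses OpenCV; note that SURF is patented and is only available in opencv-contrib builds (cv2.xfeatures2d), and the file name and Hessian threshold are placeholders.

import cv2

img = cv2.imread("scene.png", cv2.IMREAD_GRAYSCALE)

# Integral image: entry (y, x) holds the sum of all pixels above and left of it.
ii = cv2.integral(img)                     # shape (h + 1, w + 1)

def box_sum(x1, y1, x2, y2):
    """Sum of intensities over [x1, x2) x [y1, y2): three adds/subtracts."""
    return int(ii[y2, x2]) - int(ii[y1, x2]) - int(ii[y2, x1]) + int(ii[y1, x1])

# SURF interest points and 64-dimensional descriptors.
surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
keypoints, descriptors = surf.detectAndCompute(img, None)
# Matching can later be restricted to keypoints with the same Laplacian sign.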

FLANN feature matching

A common method for feature matching is the nearest neighbor algorithm, which classifies new data by their similarity to labeled data as defined by Euclidean distance. The method is easy to implement and allows parallel computation, but it demands considerable computation and memory, so a fast approximate method is needed for real-time object recognition and tracking.

FLANN is an efficient method for approximating the nearest neighbor algorithm, proposed by Muja and Lowe (2009). It uses a modified kd-tree construction called multiple randomized kd-trees. In the original kd-tree construction, the data are split in half at each level on the dimension exhibiting the greatest variance; a randomized kd-tree instead splits on a dimension chosen at random from among the first few dimensions showing the greatest variance. Multiple kd-trees allow feature matching to be computed in parallel and minimize the search region of the nearest neighbor algorithm.
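A minimal matching sketch, again in illustrative Python with OpenCV's FLANN wrapper: obj_desc comes from the database image and scene_kps/scene_desc from the current frame (as in the SURF sketch above); the 0.7 ratio threshold is a common choice, not a value from the paper. The center-of-mass position computation described earlier is included.

import numpy as np
import cv2

FLANN_INDEX_KDTREE = 1                               # randomized kd-trees
flann = cv2.FlannBasedMatcher(dict(algorithm=FLANN_INDEX_KDTREE, trees=5),
                              dict(checks=50))       # search effort parameter

matches = flann.knnMatch(obj_desc, scene_desc, k=2)  # two nearest neighbors
good = [m for m, n in matches if m.distance < 0.7 * n.distance]  # ratio test

# Object position on the image plane: centroid of the matched scene keypoints
# (None when the object is not found in the current frame).
pts = np.float32([scene_kps[m.trainIdx].pt for m in good])
centroid = pts.mean(axis=0) if len(pts) else None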

Kalman filtering

In computer vision, the Kalman filter is used to predict the object's movement in each frame and to correct the measured object position. As discussed above, the object position is determined by calculating the center of mass of the correlated features; owing to camera noise, however, some features are not correlated repeatedly in every frame, so the measured position can differ from frame to frame. This is treated as measurement noise. In some cases the object is not recognized in the captured image or has moved out of the camera's view; the Kalman filter is then used to predict the object position. In our HILS, the motion of the object is tracked using a fourth-order Kalman filter, designed under the assumption that the object moves in the x- and y-directions at constant speed and that its movement dynamics are modeled by a linear state space representation. This assumption simplifies the process and measurement models used to derive the Kalman filter, yet is accurate enough for the purpose of visual target tracking.

Hardware-in-the-loop simulation

The purpose of the HILS is to evaluate the performance of the computer vision system when it is integrated with the navigation and flight control systems of the UAV to construct a vision-based flight control system. HILS is a real-time simulation in which the signals coming in and out represent signals from the real world. Another advantage of such simulation is that parameters can be tuned by trial and error without fear of crashing the aircraft. After successful tests in HILS, the computer vision hardware can be installed onboard the octorotor platform.

Figure 7 Diagram of computer vision system (image capturing and database image → SURF feature extraction → feature matching → object position calculation → Kalman filtering → sent to navigation)

Figure 8 Box-type filter used as approximation of Hessian matrix. Notes: grey regions are equal to 0, while black and white are equal to 1 and –1, respectively

Simulation of the UAV dynamics is carried out using MATLAB Simulink, based on the state space model obtained previously. The onboard control and navigation modules are also simulated to evaluate the performance of object tracking with the existing navigation and controller modules. The computer vision hardware is connected to Simulink and sends pixel coordinates over a UDP connection; the received data are unpacked for further processing by the navigation system block, which sends angle set points to the flight controller. In addition to autonomous vision-based control, the simulated UAV can also be controlled with a joystick, for manual control or in case of emergency. A minimal sketch of this UDP link is given at the end of this subsection.

To visualize the UAV dynamics, FlightGear simulator v2.0.0 is used and a 3D model of the real octorotor was created (Figure 9). Simulink provides a block for interacting with FlightGear to allow communication with the flight dynamics model. One advantage of FlightGear is its multiplayer feature, in which one instance visualizes the octorotor dynamics while another plays the target to be tracked.

As described previously, there are three main blocks in our simulation: the navigation, flight control and plant model blocks. The Simulink model structure is shown in Figure 10.

The navigation structure is shown in Figure 11. The flight control block contains the controllers for each UAV attitude and the altitude hold; the flight controller structure is shown in Figure 12. The state space model of the UAV dynamics described previously is used to model the UAV dynamics actuated by the eight propeller speeds.

The HILS structure is shown in Figure 13. The simulation forms a closed-loop system consisting of real-time embedded hardware and software that closely approximates real vision-based object tracking. The scene visualization generated by FlightGear serves as the virtual environment for the target tracking simulation. A car, which is taken as the target, is generated from another computer and appears in the UAV's scene visualization. A camera captures the scene visualization, which is then processed by the embedded computer vision system, and the target position is sent to the UAV simulation (Figure 14). The response of the UAV simulation is then visualized in FlightGear, resulting in a closed-loop simulation environment.
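On the vision side, the UDP link mentioned above reduces to a few lines. This is an illustrative Python sketch; the packet layout (two little-endian doubles for the pixel coordinates) and the host address are assumptions, and the receiving UDP block on the Simulink host would be configured to match.

import socket
import struct

SIM_HOST = ("192.168.0.10", 25000)         # hypothetical Simulink host and port
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_target(px, py):
    """Send the target's pixel coordinates to the simulation host."""
    sock.sendto(struct.pack("<2d", float(px), float(py)), SIM_HOST)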

Simulation result and discussion

The results of the computer vision system's capability in finding the desired target, and of the visual target tracking simulation, are presented in this section.

Figure 9 3D model of octorotor used in HILS

Figure 10 Simulink simulation structure (navigation block, flight control subsystem and state-space plant model, with inputs from the vision system and joystick)

Figure 12 Flight controller structure (cascaded loops built from PID controller, integrator, gain and angle conversion blocks). Notes: (a) altitude hold controller; (b) roll controller; (c) pitch controller; (d) yaw controller

Figure 11 Visual navigation structure (x and y pixel coordinates from computer vision compared with the desired image coordinates; gains, a speed controller and an integrator produce the pitch and yaw angle setpoints)


Object recognition and tracking result

An important requirement of the computer vision system is the ability to recognize the target in various conditions and to track its movement wherever it moves. The object recognition system must be able to recognize the target from normal illumination down to a certain level of low illumination, due to cloudy weather or pursuit under the shadow of a building or bridge. It must also be able to recognize the target in different orientations and perspectives. In this experiment, the computer vision system is tested by recognizing a pre-specified car in an outdoor environment.

The computer vision system is asked to recognize a car, taken as the target, among other cars (Figure 15). Images from different positions and perspectives were captured, and the brightness was reduced to simulate low-illumination conditions. The object can be recognized under different perspectives and camera orientations, and even when the object is only partially visible the computer vision system can still recognize the target. The lines in each picture show the features matched against the database features. Processing each frame took about 100 ms with five images in the database. The results are shown in Figure 16.

Vision-based navigation result

The scenario used in this simulation is as follows. The UAV is first controlled manually using the joystick, and navigation is switched to vision once the target is in sight. The UAV first keeps the stationary target at the center of the image; the car is then moved along a certain trajectory so that the tracking performance of the UAV can be observed. In this simulation, the UAV is kept flying at an altitude of 40 feet, and the car speed does not exceed 15 km/h (Figure 17).

Figure 13 HILS structure (embedded PC for the vision system with CMOS analog camera and USB frame grabber; a projector and screen display the FlightGear scene rendered by the MATLAB host, with data exchanged over UDP through a hub)

Figure 14 HILS setup

Figure 15 Features extracted from image database

Several experiments were carried out under various environmental conditions, such as low visibility, rain, and tracking at dawn or dusk; the results are shown in Figure 18, where the car and octorotor trajectories are plotted for each simulated condition. From the simulation results, we can deduce that the computer vision performance is adequate for the visual target tracking application. The UAV tracked the car well even though the vision was disturbed by different lighting and the object was partly obscured by fog and rain drops. Object loss occurred once, but the Kalman filter overcame it by predicting the object position; once the camera captured the object again, tracking continued. Regarding the octorotor's performance in visual target tracking, the flight controller and navigation are adequate for this application as long as the target velocity does not exceed the octorotor's maximum speed of 15 km/h. The tracking algorithm within the navigation module successfully tracks the target by keeping it at the center of the image plane. As shown in Figure 18, the attitude control performance is acceptable for target tracking under light maneuvers, and the altitude of the octorotor is held well despite the UAV's maneuvers to keep the target in sight.

Figure 16 Feature correlation results

Figure 17 Simulation of visual target tracking

Figure 18 Tracking trajectories in several lighting conditions (car and UAV trajectories, altitude (feet) against latitude and longitude, for normal, rainy, dawn or dusk, and foggy conditions)

Conclusion

A HILS environment was constructed for visual target tracking of an octorotor UAV, aimed at real-time testing of embedded computer vision hardware. Utilization of HILS for visual target tracking helps designers rigorously test embedded computer vision hardware in real time and tune various vision system design parameters without the need for high-risk flight tests in the field. Using this HILS structure, visual target tracking can be simulated with the dynamics model equations in a simulation environment before a real flight test; such a HILS can be extended by inserting real autopilot hardware to produce a simulation environment that mimics the real conditions even more closely. The SURF algorithm proved adequate for object recognition and feature correlation, since it provides scale-, illumination- and rotation-invariant, robust features with faster computation time, and FLANN can be used as an approximation of the nearest neighbor search with less computational load. This combination results in a computer vision system that is adequate for visual target tracking on an octorotor UAV.

Acknowledgements

The authors would like to thank Noviantoro Sadewo, who provided the information about the octorotor platform and instrumentation, Taufiq Hilal Tawab, who derived the state space equations for the octorotor dynamics, and Rusdiana Hakim, who built the octorotor itself.

References

Adiprawita, W., Suwandi, A. and Sembiring, J. (2007), "Hardware in the loop simulator in UAV rapid development life cycle", Proceedings of the International Conference on Intelligent Unmanned Systems (ICIUS 2007), Bali, Indonesia, 24-25 October.

Bay, H., Ess, A., Tuytelaars, T. and Van Gool, L. (2008), "SURF: speeded up robust features", Computer Vision and Image Understanding (CVIU), Vol. 110 No. 3, pp. 346-59.

Gans, N.R., Dixon, W.E., Lind, R. and Kurdila, A. (2009), "A hardware-in-the-loop simulation platform for vision-based control of unmanned air vehicles", Mechatronics, Vol. 19, pp. 1043-56.

Jung, D. and Tsiotras, P. (2007), Modelling and Hardware-in-the-loop Simulation for a Small Unmanned Aerial Vehicle, Georgia Institute of Technology, Atlanta, GA.

Lowe, D.G. (2004), "Distinctive image features from scale-invariant keypoints", International Journal of Computer Vision, Vol. 60 No. 2, pp. 91-110.

Muja, M. and Lowe, D. (2009), Approximate Nearest Neighbors with Automatic Algorithm Configuration, Computer Science Department, University of British Columbia, Vancouver.

Tawab, T.H. (2010), "Modelling and simulation of octorotor UAV", Bachelor thesis, Institut Teknologi Bandung, Bandung.

Viola, P. and Jones, M. (2001), "Rapid object detection using a boosted cascade of simple features", paper presented at the IEEE Conference on Computer Vision and Pattern Recognition, Kauai, HI.

Watanabe, Y., Lesire, C., Piquereau, A., Fabiani, P., Sanfourche, M. and Le Besnerais, G. (2010), Systems Development and Flight Experiment of Vision-based Simultaneous Navigation and Tracking, ONERA, Meudon.

Wibowo, A. and Daniel, F.D. (2009), Development of Hardware-in-the-loop Simulation in a Missile Guidance Using Digital Scene Matching Area Correlation (DSMAC), Institut Teknologi Bandung, Bandung.


Corresponding author

Bambang Rilanto Trilaksono can be contacted at: [email protected]
