
Localization and Control of the Quadrotor Using IMU and Mono Camera

Vahid Khorani, Alireza Mohammad Shahri, Arash Shirkhorshidi, Seyyed Mohammad Reza Mirfakhar

Vahid Khorani is with the Mechatronic Research Laboratory (MRL), Islamic Azad University, Qazvin Branch, Iran (e-mail: [email protected]).
Alireza Mohammad Shahri is an assistant professor at the Electrical Engineering department of Iran University of Science and Technology, IUST, Tehran, Iran (e-mail: [email protected]).
Arash Shirkhorshidi is with the Mechatronic Research Laboratory (MRL), Islamic Azad University, Qazvin Branch, Iran (e-mail: [email protected]).
Seyyed Mohammad Reza Mirfakhar is with the Mechatronic Research Laboratory (MRL), Islamic Azad University, Qazvin Branch, Iran (e-mail: [email protected]).

Abstract — This paper proposes a method to use a vision-aided inertial navigation system (V-INS) on a quadrotor flying robot. The proposed method uses a light mono camera to detect fixed land-marks, calculates their distances from the robot, and uses this information to compute the position of the robot. These observations are then fused with a low-cost inertial measurement unit (IMU) using a Kalman filter (K-F) to obtain more precise and faster information about the robot position.

I. INTRODUCTION

Recently, inertial navigation systems (INSs) have been used for estimating the position of vehicles such as airplanes, helicopters, automobiles, etc. [1]. At the core of most INSs lies an inertial measurement unit (IMU) that measures linear accelerations and rotational velocities. By integrating these signals in real time, an INS is capable of tracking the position, velocity, and attitude of a vehicle. This dead-reckoning process, however, cannot be used over extended periods of time because the errors in the computed estimates continuously increase. This is due to the noise and biases present in the inertial measurements. For this reason, current INSs rely on an absolute sensor such as GPS or a camera in order to receive periodic corrections. In most cases, a Kalman filter (K-F) estimator is used for optimally combining the IMU and GPS measurements [2].

Nevertheless, GPS signals are not available indoors, and a standalone IMU, especially a low-cost one, suffers from one serious limitation: errors in the estimated navigation states rapidly accumulate with time due to the highly nonlinear drift of the inertial sensors' measurements [3]. Using a stereo camera to calculate image depth and estimate the position of the robot is another method for navigating a robot indoors [4]; in this case, vision techniques are used to calculate the observations. High price, heavy camera sets and the need for powerful processors to process the images are some of the difficulties that limit the use of such cameras on flying robots.

A quadrotor is a light flying robot, so there are limitations on the equipment it can carry because of weight. Recently, GPS-aided INSs have been developed for quadrotor navigation [5]. For indoor quadrotor flight, light and reliable techniques that run on low-power processors are needed.

In this paper a new method is developed that uses a mono-camera-aided INS for navigating a quadrotor flying robot. The proposed method uses a land-mark at a fixed place to estimate the position of the robot during indoor flights, and a K-F is used to fuse the IMU and camera observations. The method uses a light mono camera, a light low-power processor and a low-cost MEMS IMU, which are suitable choices for a light quadrotor.

II. USING IMU TO CALCULATE POSITION

For inertial navigation of a flying robot, the Euler angles are first calculated using a three-dimensional gyroscope set. The calculated angles are corrected using a three-dimensional accelerometer and a three-dimensional compass sensor set. These angles are then used to obtain the rotation matrix. Finally, it is possible to calculate the position of the robot using the rotation matrix and the three-dimensional accelerometer set. The position calculated in this section will be corrected using the position obtained from the camera.

Two frames are used: the first frame, named the 'body-frame', is rigidly connected to the center of gravity of the quadrotor, and the second frame, named the 'earth-frame', is fixed to the earth. Fig. 1 shows the earth-frame and body-frame used in inertial navigation.

Fig. 1: earth-frame and body-frame

The Euler angles are calculated using equation (1) [6]:

$$\begin{bmatrix} \dot{\phi} \\ \dot{\theta} \\ \dot{\psi} \end{bmatrix} = \begin{bmatrix} 1 & \sin\phi\tan\theta & \cos\phi\tan\theta \\ 0 & \cos\phi & -\sin\phi \\ 0 & \sin\phi/\cos\theta & \cos\phi/\cos\theta \end{bmatrix} \begin{bmatrix} p \\ q \\ r \end{bmatrix} \qquad (1)$$

where $\phi$, $\theta$ and $\psi$ are the ZYX Euler angles and $p$, $q$ and $r$ are the angular rates measured using the three gyroscopes.

The rotation matrix is obtained from the Euler angles [6]:

$$R = \begin{bmatrix} \cos\theta\cos\psi & \sin\phi\sin\theta\cos\psi - \cos\phi\sin\psi & \cos\phi\sin\theta\cos\psi + \sin\phi\sin\psi \\ \cos\theta\sin\psi & \sin\phi\sin\theta\sin\psi + \cos\phi\cos\psi & \cos\phi\sin\theta\sin\psi - \sin\phi\cos\psi \\ -\sin\theta & \sin\phi\cos\theta & \cos\phi\cos\theta \end{bmatrix} \qquad (2)$$

where $R$ is the rotation matrix that transforms the measured accelerations from the body-frame to the earth-frame.

The accelerations measured using the accelerometers are transferred to the earth-frame:

$$[a_x \;\; a_y \;\; a_z]^T = R\,[a_{x_b} \;\; a_{y_b} \;\; a_{z_b}]^T \qquad (3)$$

where $a_{x_b}$, $a_{y_b}$ and $a_{z_b}$ are the accelerations measured using the three accelerometers and $a_x$, $a_y$ and $a_z$ are the robot accelerations in the earth-frame.

Integrating the robot accelerations in the earth-frame yields the robot position. This position diverges over time because of odometric errors, so an absolute position measurement is needed to correct the estimates. In this paper, a mono camera is used to obtain the robot position and correct the inertial navigation results.
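To make the dead-reckoning chain of equations (1)-(3) concrete, the sketch below implements one strapdown step in Python. It is a minimal illustration, not the authors' code: the function and variable names are assumptions, the integration is simple forward Euler, and a gravity term (implicit in the paper) is removed explicitly after equation (3).

```python
import numpy as np

def euler_rates(phi, theta, p, q, r):
    """Equation (1): map body rates (p, q, r) to ZYX Euler angle rates."""
    t, c, s = np.tan(theta), np.cos, np.sin
    return np.array([
        p + q * s(phi) * t + r * c(phi) * t,
        q * c(phi) - r * s(phi),
        (q * s(phi) + r * c(phi)) / np.cos(theta),
    ])

def rotation_matrix(phi, theta, psi):
    """Equation (2): ZYX rotation matrix from body-frame to earth-frame."""
    c, s = np.cos, np.sin
    return np.array([
        [c(theta)*c(psi), s(phi)*s(theta)*c(psi) - c(phi)*s(psi), c(phi)*s(theta)*c(psi) + s(phi)*s(psi)],
        [c(theta)*s(psi), s(phi)*s(theta)*s(psi) + c(phi)*c(psi), c(phi)*s(theta)*s(psi) - s(phi)*c(psi)],
        [-s(theta),       s(phi)*c(theta),                        c(phi)*c(theta)],
    ])

def strapdown_step(state, gyro, accel, g, dt):
    """One dead-reckoning step: integrate angles, then velocity, then position.

    state = (angles, vel, pos) as NumPy arrays; gyro = (p, q, r); accel is the
    body-frame accelerometer reading as a NumPy array.
    """
    angles, vel, pos = state
    angles = angles + euler_rates(angles[0], angles[1], *gyro) * dt
    R = rotation_matrix(*angles)
    a_earth = R @ accel - np.array([0.0, 0.0, g])  # equation (3), plus gravity removal
    vel = vel + a_earth * dt
    pos = pos + vel * dt
    return (angles, vel, pos)
```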

III. CAMERA AND VISION TECHNIQUES

In this project a UEye mono camera with a TV lens is used to detect color land-marks.

A. Color Spaces Conversions

The first step in any image-processing application is converting between color spaces. The default color space of most cameras is RGB, which is cheap to work with since conversions to and from it use minimal system resources. But RGB has a drawback: color and intensity information are mixed across its three channels, which causes problems when processing colored images. The solution is the HSL color space, which rearranges the geometry of RGB in an attempt to be more perceptually relevant than the Cartesian representation, so colors are classified in a simpler way than in RGB.
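As an illustration of this conversion step, the snippet below uses OpenCV, which names the space HLS and loads frames as BGR, to convert a frame and threshold a colored land-mark. The threshold bounds are illustrative assumptions, not values from the paper.

```python
import cv2
import numpy as np

# OpenCV loads frames as BGR and names the hue/lightness/saturation space HLS.
frame = cv2.imread("frame.png")
hls = cv2.cvtColor(frame, cv2.COLOR_BGR2HLS)

# Illustrative bounds for a red land-mark; the real values depend on
# the marker color and the lighting conditions.
lower = np.array([0, 60, 80])
upper = np.array([10, 200, 255])
mask = cv2.inRange(hls, lower, upper)  # binary blob image of matching pixels
```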

B. Detecting Color Land-marks

The basic level of this project is built on vision technology and its ability to find objects and land-marks. A mono camera is used to find color land-marks in order to calculate the distance of the robot from each detected land-mark, which is done by detecting blobs in the captured image.

C. Estimating Object Position

After detecting the object, its position in the image frame must be tracked. In this project a set of methods is used to calculate an accurate estimate of the object position. First, all moments up to third order of a polygonal shape are calculated [7], then some spatial moments are retrieved. The moments may then be used to calculate the gravity center of the shape, its area, main axes and various other shape characteristics. The moments computed here have order

$$n = p + q \le 3 \qquad (4)$$

where $p$ and $q$ are the orders of the retrieved moment along the image axes. The spatial moments are retrieved as

$$m_{pq} = \sum_{x}\sum_{y} I(x,y)\, x^{p} y^{q} \qquad (5)$$

where $m_{pq}$ is the spatial moment and $I(x,y)$ is the intensity of the pixel at $(x,y)$. After that, the central moments are calculated:

$$\mu_{pq} = \sum_{x}\sum_{y} I(x,y)\,(x - \bar{x})^{p} (y - \bar{y})^{q} \qquad (6)$$

where $\mu_{pq}$ is the central moment and $\bar{x}$ and $\bar{y}$ are the coordinates of the gravity center:

$$\bar{x} = \frac{m_{10}}{m_{00}}\,, \qquad \bar{y} = \frac{m_{01}}{m_{00}} \qquad (7)$$

The position of the object in the image can then be obtained using the following formulas:

$$X = \frac{m_{10}}{\mu_{00}}\,, \qquad Y = \frac{m_{01}}{\mu_{00}} \qquad (8)$$

where $X$ and $Y$ show the position of the object in the image. After that, the positions of the object borders must be calculated. The positions which are needed are:
- maximum right-top X
- maximum right-top Y
- minimum left-down X
- minimum left-down Y

By having these positions, the accurate distance between the camera and the land-mark can be calculated using the equations introduced in the following parts.
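A minimal sketch of this blob-measurement step using OpenCV's built-in moment computation (assuming the OpenCV 4 `findContours` signature and the `mask` from the previous snippet); it returns the gravity center of equation (7) and the border positions listed above.

```python
import cv2

def blob_centroid_and_bounds(mask):
    """Centroid via equations (5)-(7) and the blob's border positions."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    blob = max(contours, key=cv2.contourArea)          # largest blob = land-mark
    m = cv2.moments(blob)                              # spatial and central moments
    cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"]  # gravity center, eq. (7)
    x, y, w, h = cv2.boundingRect(blob)                # border positions of the blob
    return (cx, cy), (x, y, x + w, y + h)
```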

IV. USING CAMERA TO CALCULATE POSITION

In this project the land-marks are circles with a diameter of 30 cm. At first, the distance between the camera and the land-mark is calculated when the land-mark is assumed to be in the center of the image and orthogonal to the camera. Then, the more complicated situations where the land-mark is not in the center of the image are studied.


A. Calculating the Distance between the Camera and the Land-Mark when the Land-Mark is in the Center of the Image and Orthogonal to the Camera

When the land-mark is exactly in the center of the image and is orthogonal to the camera (as seen in Fig. 2), it is seen as a circle with a certain diameter, which is used to calculate its distance from the center of the camera.

Fig. 2

In Fig. 2, the frame made up of the axes $x_c$, $y_c$ and $z_c$ is named the camera-frame and is located at the center of the camera. $FOV_x$ and $FOV_y$ are camera specifications, $r$ is the radius of the land-mark, $X_I$ and $Y_I$ are the sizes of the image sides (in pixels) and $r_p$ is the number of pixels covered by the land-mark radius. The distance between the center of the camera and the land-mark is obtained using the equation below:

$$D_l = \frac{r\, X_I}{2\, r_p \tan(FOV_x/2)} \qquad (9)$$

B. Calculating the Distance between the Camera and the Land-Mark when it is not in the Center of the Image

A land-mark that is not in the center of the image (Fig. 3) can be treated as a land-mark in the center of a camera that has been rotated by an angle $\theta_x$ in the yaw direction.

Fig. 3

The angle $\theta_x$ is obtained using the equation below:

$$\theta_x = \arctan\!\left(\frac{2\, d_p}{X_I}\tan\frac{FOV_x}{2}\right) \qquad (10)$$

where $d_p$ is the pixel offset of the land-mark center from the image center. If the camera already has a yaw angle $\psi$, in the new situation it is assumed to have the angle:

$$\psi' = \psi + \theta_x \qquad (11)$$

In this new situation the relation between the camera and the land-mark can be shown using Fig. 4. Here the land-mark is not orthogonal to the camera, and it is necessary to calculate its real diameter.

Fig. 4

As seen in Fig. 4, the red land-mark is what the camera sees, and it is possible to measure the diameter of the red land-mark. The relation between the diameter of the real land-mark and the observed (black) one can be represented using the equation below:

$$2\, r' = 2\, r \cos\theta_x \qquad (12)$$

Finally, the distance between the land-mark and the camera is obtained using the equation below:

$$D_l = \frac{r \cos\theta_x\, X_I}{2\, r_p \tan(FOV_x/2)} \qquad (13)$$

C. Calculating Position

As seen in Fig. 5, using the distance between the center of the camera and the land-mark, the distance between the center of the camera and the center of the image is calculated. In this figure the center of the image is labeled by the image-frame, which contains the axes $x_i$, $y_i$ and $z_i$; the position of the land-mark in the image-frame is shown by $x_{li}$ and $y_{li}$, and the distance between the center of the camera and the center of the image-frame is shown by $D$.

Fig. 5

$x_{li}$, $y_{li}$ and $D$ are calculated using the equations below:
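The distance computation of equations (9)-(13) can be sketched as follows. The function below is a pinhole-geometry illustration under the stated assumptions (a 30 cm circular land-mark, horizontal offset only), with names such as r_p and d_p taken from the definitions above; it is not the authors' implementation.

```python
import math

def landmark_distance(r_p, d_p, X_I, fov_x, r=0.15):
    """Distance to a 30 cm circular land-mark (r = 0.15 m) from its blob size.

    r_p:   land-mark radius in pixels
    d_p:   pixel offset of the blob center from the image center
    X_I:   image width in pixels
    fov_x: horizontal field of view in radians
    """
    theta_x = math.atan(2.0 * d_p / X_I * math.tan(fov_x / 2.0))  # eq. (10)
    # eq. (13): distance corrected for the off-center viewing angle
    dist = r * math.cos(theta_x) * X_I / (2.0 * r_p * math.tan(fov_x / 2.0))
    return dist, theta_x
```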


$$D = D_l\sqrt{1 - \sin^{2}\theta_x - \sin^{2}\theta_y}\,, \qquad x_{li} = D_l \sin\theta_x\,, \qquad y_{li} = D_l \sin\theta_y \qquad (14)$$

where $\theta_y$ is the corresponding angle in the pitch direction, obtained like $\theta_x$ in equation (10) but using $FOV_y$ and $Y_I$.

Three frames, the image-frame, the camera-frame and the earth-frame (shown in Fig. 6), are studied in this section to calculate the position of the camera in the earth-frame.

Fig. 6

The homogeneous transform matrix between the image-frame and the camera-frame is obtained using the equation below:

$$T_i^c = \begin{bmatrix} R_i^c & P_i^c \\ 0_{1\times 3} & 1 \end{bmatrix} = \begin{bmatrix} 0 & 0 & 1 & D \\ 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix} \qquad (15)$$

where $R_i^c$ is the rotation matrix and $P_i^c$ is the position of the center of the image-frame in the camera-frame.

Using equation (15), the position of the land-mark is calculated in the camera-frame:

$$P_l^c = R_i^c \begin{bmatrix} x_{li} \\ y_{li} \\ 0 \end{bmatrix} + P_i^c \qquad (16)$$

The rotation matrix between the camera-frame and the earth-frame is obtained using equation (2):

$$R_c^e = R \qquad (17)$$

In this project it is assumed that before any motion of the robot, the camera-frame is located at the center of the earth-frame and its orientation is the same as the earth-frame. So the position of the land-mark before the robot's motions is calculated using the equation below:

$$P_l^e(k=0) = P_l^c(k=0) \qquad (18)$$

Finally, the position of the camera during the motions is obtained using the equation below:

$$P_c^e(k) = P_l^e(k=0) - R_c^e(k)\, P_l^c(k) \qquad (19)$$

By assuming that the camera-frame and the body-frame of the quadrotor are the same, $P_c^e$ shows the position of the quadrotor in the earth-frame.
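A short sketch of the frame chain in equations (15)-(19): the array values follow equation (15), while the function names are illustrative assumptions.

```python
import numpy as np

# Equation (15): fixed rotation from image-frame to camera-frame.
R_ic = np.array([[0, 0, 1],
                 [1, 0, 0],
                 [0, 1, 0]], dtype=float)

def landmark_in_camera_frame(x_li, y_li, D):
    """Equation (16): land-mark position expressed in the camera-frame."""
    P_ic = np.array([D, 0.0, 0.0])  # image-frame origin in camera-frame, eq. (15)
    return R_ic @ np.array([x_li, y_li, 0.0]) + P_ic

def camera_position(P_le0, R_ce, P_lc):
    """Equation (19): camera (= quadrotor) position in the earth-frame.

    P_le0: land-mark position captured before any motion, eq. (18)
    R_ce:  camera-to-earth rotation from equation (2) at time step k
    P_lc:  current land-mark position in the camera-frame, eq. (16)
    """
    return P_le0 - R_ce @ P_lc
```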

V. USING KALMAN FILTER TO ESTIMATE IMU ERRORS

Two kinds of sensors are used in this project to complete the instrumentation using sensor-fusion techniques. The inertial navigation system is faster than the vision part, but it has odometric errors that diverge over time. The vision technique is not fast enough, but it is an absolute sensor and has no odometric errors. Other absolute sensors used in this project are compass sensors and tilt sensors, the latter using the IMU accelerometers to obtain the Euler angles [8, 9]. The absolute sensors are used as observations to estimate the inertial navigation system's errors using a Kalman filter.

The inertial navigation system error model has 15 states, as seen below [10]:

$$x = [\delta x,\ \delta y,\ \delta z,\ \delta v_x,\ \delta v_y,\ \delta v_z,\ \delta\phi,\ \delta\theta,\ \delta\psi,\ \delta\omega_x,\ \delta\omega_y,\ \delta\omega_z,\ \delta a_x,\ \delta a_y,\ \delta a_z]^T \qquad (20)$$

which consists of position, velocity and attitude errors and 3-axis gyro and accelerometer drifts. The error process model is [10]:

$$\dot{x}(t) = F(t)\, x(t) + w(t) \qquad (21)$$

where $w(t)$ is the system noise and $F(t)$ is the dynamic matrix [10].

The discrete state equation is:

$$x_k = \Phi_{k,k-1}\, x_{k-1} + w_k \qquad (22)$$

where $x_k$ is the state parameter vector, $w_k$ is the process model noise vector and $\Phi_{k,k-1}$ is the state transition matrix.

The differences between the positions, velocities and attitudes measured by the absolute sensors and by the inertial navigation system form a measurement error vector. This measurement error vector is related to the error model states as seen in the equation below:

$$z_k = H_k\, x_k + v_k \qquad (23)$$

where $z_k$ is the measurement error vector obtained from the observations, $H_k$ is the observation matrix, and $v_k$ is the noise of the measurement model.

The Kalman filter estimates the inertial navigation system errors over time using equations (22) and (23) [11].
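The filter loop built on equations (22) and (23) can be sketched as a standard discrete Kalman predict/update cycle [11]. This is the generic textbook form, not the authors' 15-state implementation; the matrix names mirror the symbols above.

```python
import numpy as np

def kalman_step(x, P, Phi, Q, z, H, R_meas):
    """One predict/update cycle of the error-state filter, eqs. (22)-(23) [11].

    x, P:    error-state estimate and its covariance
    Phi, Q:  state transition matrix and process noise covariance
    z, H:    measurement error vector and observation matrix
    R_meas:  measurement noise covariance
    """
    # Predict with the discrete process model, equation (22).
    x = Phi @ x
    P = Phi @ P @ Phi.T + Q
    # Update with the measurement model, equation (23).
    S = H @ P @ H.T + R_meas
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P
```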

VI. QUADROTOR POSITION AND ATTITUDE CONTROL

In this part, the basics of quadrotor modeling are introduced and then used to determine an approach to control the robot orientation and position using the information gathered by the navigation system.


Fig. 7

The dynamic model of the quadrotor is described in two parts [12]: the first part covers the linear accelerations and the second part the angular accelerations. The equations below represent the dynamic model of the quadrotor:

$$m\begin{bmatrix} \ddot{x} \\ \ddot{y} \\ \ddot{z} \end{bmatrix} = \begin{bmatrix} 0 \\ 0 \\ -mg \end{bmatrix} + R \begin{bmatrix} 0 \\ 0 \\ \sum_{i=1}^{4} F_i \end{bmatrix}, \qquad I\begin{bmatrix} \dot{p} \\ \dot{q} \\ \dot{r} \end{bmatrix} = -\begin{bmatrix} p \\ q \\ r \end{bmatrix} \times I \begin{bmatrix} p \\ q \\ r \end{bmatrix} + \begin{bmatrix} L(F_2 - F_4) \\ L(F_3 - F_1) \\ M_1 - M_2 + M_3 - M_4 \end{bmatrix} \qquad (24)$$

where $[\ddot{x}\ \ddot{y}\ \ddot{z}]^T$ is the linear acceleration vector, $[\dot{p}\ \dot{q}\ \dot{r}]^T$ is the angular acceleration vector, $F_i$ is the propeller force and $M_i$ the moment produced by rotor $i$, $L$ is the distance from each rotor to the center of mass, $R$ is the rotation matrix from the body-frame to the earth-frame and $m$ is the mass.

A. Attitude Control

In attitude control of the quadrotor flying robot, the main requirements are the speed and accuracy of the measurements. A PD controller is used to control the orientations of the quadrotor in this project. The PD controller is able to control the quadrotor successfully using a fast control loop, including motors with fast dynamics and fast sensors that measure the orientations accurately. The first step in controlling the quadrotor is writing a SISO model for it, since it is possible to control the four motors using four separate controllers in order to control the robot orientation. The equation below represents how to convert the MIMO model into four SISO models:

$$\begin{bmatrix} \omega_{1,des} \\ \omega_{2,des} \\ \omega_{3,des} \\ \omega_{4,des} \end{bmatrix} = \begin{bmatrix} 1 & 0 & -1 & 1 \\ 1 & 1 & 0 & -1 \\ 1 & 0 & 1 & 1 \\ 1 & -1 & 0 & -1 \end{bmatrix} \begin{bmatrix} \omega_h + \Delta\omega_F \\ \Delta\omega_\phi \\ \Delta\omega_\theta \\ \Delta\omega_\psi \end{bmatrix} \qquad (25)$$

where $\omega_{i,des}$ is the desired speed of motor $i$, $\Delta\omega_\phi$, $\Delta\omega_\theta$ and $\Delta\omega_\psi$ are the outputs of the $\phi$, $\theta$ and $\psi$ controllers respectively, $\Delta\omega_F$ is the collective thrust correction and $\omega_h$ is the hovering speed.

The three PD controllers used in this project are represented in the equations below [12]:

$$\Delta\omega_\phi = k_{p,\phi}(\phi_{des} - \phi) + k_{d,\phi}(\dot{\phi}_{des} - \dot{\phi})$$
$$\Delta\omega_\theta = k_{p,\theta}(\theta_{des} - \theta) + k_{d,\theta}(\dot{\theta}_{des} - \dot{\theta})$$
$$\Delta\omega_\psi = k_{p,\psi}(\psi_{des} - \psi) + k_{d,\psi}(\dot{\psi}_{des} - \dot{\psi}) \qquad (26)$$

B. Position Control

Using the dynamic model of the quadrotor introduced in equation (24), the roll and pitch angles are calculated in order to control the position:

$$\phi_{des} = \frac{1}{g}\left(\ddot{x}_{des}\sin\psi - \ddot{y}_{des}\cos\psi\right), \qquad \theta_{des} = \frac{1}{g}\left(\ddot{x}_{des}\cos\psi + \ddot{y}_{des}\sin\psi\right) \qquad (27)$$

where $\phi_{des}$ and $\theta_{des}$ are the desired Euler angles needed to reach the desired accelerations, and $\ddot{x}_{des}$ and $\ddot{y}_{des}$ are the desired accelerations, calculated using the equation below:

$$\ddot{x}_{des} = \ddot{x}_T + k_d(\dot{x}_T - \dot{x}) + k_p(x_T - x) \qquad (28)$$

where the subscript $T$ denotes the trajectory to be tracked. For $\ddot{y}_{des}$ and $\ddot{z}_{des}$ the same equation can be used.
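Putting equations (25)-(27) together, the sketch below shows one way to wire the controllers. The sign pattern follows the mixing matrix of equation (25); all gains and function names are illustrative assumptions.

```python
import math

def pd(err, derr, kp, kd):
    """PD law used for each attitude channel, equation (26)."""
    return kp * err + kd * derr

def motor_speeds(w_h, dw_F, dw_phi, dw_theta, dw_psi):
    """Equation (25): mix the SISO controller outputs into four motor speeds."""
    base = w_h + dw_F
    return [base - dw_theta + dw_psi,
            base + dw_phi   - dw_psi,
            base + dw_theta + dw_psi,
            base - dw_phi   - dw_psi]

def desired_attitude(ax_des, ay_des, psi, g=9.81):
    """Equation (27): desired roll and pitch from the desired accelerations."""
    phi_des   = (ax_des * math.sin(psi) - ay_des * math.cos(psi)) / g
    theta_des = (ax_des * math.cos(psi) + ay_des * math.sin(psi)) / g
    return phi_des, theta_des
```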

VII. TEST RESULTS

The described methods were tested on a robot implemented in MRL. Fig. 8 shows the quadrotor, on which the camera and the IMU were installed beside an RB100 [13] PC for testing the introduced methods.

Fig. 8: quadrotor platform designed in MRL

Fig. 9 shows the land-mark used in this project. The position of the land-mark was fixed; it was installed on the wall at a height of one meter from the ground.

Fig. 9: land-mark used in this project


Fig. 10 shows the accelerations and Euler angles measured using the IMU.

Fig. 10: measurements gathered using the IMU

Fig. 11 to Fig. 13 show the position estimated using the proposed method, and Fig. 14 zooms in on the position curves to show the steps of the method. As can be seen in these figures, the position measured using the mono camera alone is very slow and its error is high. The Kalman filter uses the vision observations to update the position calculated using the IMU, and the resulting filtered estimate is an acceptable measurement. The estimated path followed by the quadrotor is shown in Fig. 15.

Fig. 11: X position (meter) versus time (sec); a. path measured using camera, b. path estimated using Kalman filter

Fig. 12: Y position (meter) versus time (sec); a. path measured using camera, b. path estimated using Kalman filter

Fig. 13: Z position (meter) versus time (sec); a. path measured using camera, b. path estimated using Kalman filter

Fig. 14: detail of the Y position curves: desired position, position measured using camera, position estimated using K-F and finally filtered position



Fig. 15: a. path measured using camera, b. path estimated using Kalman filter

VIII. CONCLUSION

In this paper a method was proposed for estimating the position of a quadrotor flying robot using a vision-aided inertial navigation system (V-INS). The proposed method uses a mono camera and a low-cost inertial measurement unit (IMU) to estimate the position of the robot via a Kalman filter (K-F). Test results showed that this method yields more precise and faster position information than the camera alone.

ACKNOWLEDGMENT

The authors gratefully acknowledge Mr. Naim Ajil Foroushan, who researches quadrotors at MRL. His support and help are truly appreciated.

REFERENCES

[1] Chatfield, A.B., Fundamentals of High Accuracy Inertial Navigation. Reston: American Institute of Aeronautics and Astronautics, 1997.
[2] Mirzaei, F.M. and S.I. Roumeliotis, "A Kalman Filter-Based Algorithm for IMU-Camera Calibration: Observability Analysis and Performance Evaluation," IEEE Transactions on Robotics, 2008, pp. 1-14.
[3] "Adaptive Fuzzy Prediction of Low-Cost Inertial-Based Positioning Errors," IEEE Transactions on Fuzzy Systems, 2007, 15: pp. 519-529.
[4] Pangercic, D., R.B. Rusu, and M. Beetz, "3D-based monocular SLAM for mobile agents navigating in indoor environments," in IEEE International Conference on Emerging Technologies and Factory Automation (ETFA 2008), Hamburg, 2008, pp. 839-845.
[5] Wendel, J., et al., "An integrated GPS/MEMS-IMU navigation system for an autonomous helicopter," Elsevier, 2006, pp. 527-533.
[6] Titterton, D.H. and J.L. Weston, Strapdown Inertial Navigation Technology, IEE Series 17. The Institution of Electrical Engineers (IEE).
[7] Strachan, N.J.C., P. Nesvadba, and A.R. Allen, "A method for working out the moments of a polygon using an integration technique," Elsevier, 1990, 11(5): pp. 351-354.
[8] Tuck, K., Tilt Sensing Using Linear Accelerometers, Freescale Semiconductor, 2007, pp. 1-7.
[9] Caruso, M.J., Applications of Magnetic Sensors for Low Cost Compass Systems, Honeywell, SSEC.
[10] Weiguang, G., et al., "Application of Adaptive Kalman Filtering Algorithm in IMU/GPS Integrated Navigation System," Geo-spatial Information Science, 2007, 10(1).
[11] Welch, G. and G. Bishop, An Introduction to the Kalman Filter. University of North Carolina at Chapel Hill, Department of Computer Science, 2001.
[12] Michael, N., et al., "The GRASP Multiple Micro-UAV Testbed," IEEE Robotics & Automation Magazine, 2010, 17(3): pp. 56-65.
[13] http://www.roboard.com
