
Autonomous Flight Control for RC Helicopter Using a Wireless Camera

Yue Bao¹, Syuhei Saito² and Yutaro Koya¹
¹Tokyo City University, ²Canon Inc., Japan

1. Introduction

In recent years, there has been much research on the autonomous flight control of micro radio-controlled (RC) helicopters. Some of it concerns the flight control of unmanned helicopters (Sugeno et al., 1996) (Nakamura et al., 2001). The approach using a fuzzy control system consisting of IF-THEN control rules satisfies the flight control requirements of an unmanned helicopter, such as hovering, takeoff, rotating, and landing. For autonomous flight control, it is necessary to estimate the three-dimensional position and posture of the micro RC helicopter.

Position and posture estimation methods for the autonomous flight control of an RC helicopter using GPS (Global Positioning System), IMU (Inertial Measurement Unit), laser range finders (Amida et al., 1998), image processing, and so on have been proposed. However, the GPS-based method cannot be used in places that cannot receive radio waves from the satellites, so it cannot be used for flight control in a room. Methods that use sensors such as an IMU or a laser range finder can be used indoors, but many expensive sensors or receivers have to be arranged in the room beforehand, so these methods are not efficient. On the other hand, methods based on images from a camera can be used both outdoors and indoors and are low in cost. However, such methods need many artificial markers installed in the surroundings, and the speed of image input and image processing cannot keep up with the movement and vibration of an RC helicopter. A method that estimates the three-dimensional position and posture of an RC helicopter by stereo measurement with two or more cameras installed on the ground has also been proposed. In this case, the moving range of the RC helicopter is limited to the place where the cameras are installed, and a high-resolution camera must be used to cover the whole moving range (Ohtake et al., 2009).

The authors are studying autonomous flight of an RC helicopter with a small wireless camera and a simple artificial marker set on the ground. This method does not need expensive sensors, receivers, or cameras in the flight environment. Furthermore, a wider-ranging flight becomes possible if natural feature points are detected from the image obtained by the camera on the RC helicopter. This chapter contains the following contents:


a. Input method of images from a small wireless camera mounted on the RC helicopter.
b. Extraction method of feature points from an image of the flight environment taken with the camera on the RC helicopter.
c. Calculation method of the three-dimensional position and posture of the RC helicopter by image processing.
d. Experiment of autonomous flight of the RC helicopter using fuzzy logic control.

2. Composition of system

An overview of the micro RC helicopter with coaxial counter-rotating blades used in our experiment is shown in Fig.1. Since this RC helicopter cancels the torque acting on the body by the opposing torques of the upper and lower propellers, it can fly with less shaking than a usual RC helicopter. The composition of our experimental system for automatic guidance of the RC helicopter is shown in Fig.2. A small wireless camera is attached to the RC helicopter as shown in Fig.3; the image of the ground is acquired with this camera, sent to the receiver on the ground, and then sent to the computer through a video capture device.

    Fig. 1. Micro RC helicopter with the coaxial contra-rotating rotors

    Fig. 2. Composition of automatic guidance system


    Fig. 3. RC helicopter equipped with a micro wireless camera

    Fig. 4. An artificial marker

    Fig. 5. Coordinate axes and attitude angles


The position and posture of the RC helicopter are computed by image processing on the computer set on the ground. This image processing uses the position and shape of the artificial marker in the camera image, like Fig.4. The three-dimensional position of the RC helicopter $(x(t), y(t), z(t))$, the changing speed of the position $(\dot{x}(t), \dot{y}(t), \dot{z}(t))$, the attitude angles, and the changing speed of the attitude angles can be obtained by this calculation. Fig.5 shows the relation between these coordinate axes and attitude angles.

This micro RC helicopter is controlled by four control signals: Aileron, Elevator, Rudder, and Throttle. The control rules of the fuzzy logic control are decided by using the measurement data mentioned above. The control signals are sent to the micro RC helicopter through a digital-analog converter.

3. Image processing

3.1 Image input


The micro wireless camera attached to the RC helicopter captures images by interlaced scanning. If the camera takes an image while the RC helicopter is flying, the vibration of the RC helicopter is faster than the frame rate of the camera, so the captured image becomes blurred by the interlacing, like Fig.6. We devised a method that skips the odd-numbered lines (or even-numbered lines) of the input image to acquire a clear input image while the RC helicopter is flying.

Fig. 6. A blurred image acquired by the wireless camera
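The line-skipping idea can be sketched as follows. This is a minimal illustration, assuming each captured frame arrives as a NumPy array; the function name and the choice of keeping the even field are ours, not from the chapter.

```python
import numpy as np

def deinterlace_keep_even(frame: np.ndarray) -> np.ndarray:
    """Keep only the even-numbered scan lines of an interlaced frame and
    duplicate each kept line so the image keeps its original height."""
    even_field = frame[0::2]                      # drop the odd-numbered lines
    doubled = np.repeat(even_field, 2, axis=0)    # stretch back to full height
    return doubled[:frame.shape[0]]
```

Each frame from the video capture would be passed through such a function before marker detection and feature point extraction.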

3.2 Feature point extraction

Feature point detection is defined in terms of local neighborhood operations, such as edge and corner detection, applied to an image. The Harris operator (Harris and Stephens, 1988) (Schmid et al., 1998) and the SUSAN operator (Smith and Brady, 1997) are well known as common feature detectors.


Methods (Neumann and You, 1999) (Bao and Komiya, 2008) that estimate position and attitude by using natural feature points or a marker in the input image have also been proposed. Our prototype experimental system uses the Harris operator, which can extract the same feature points at a higher rate than other feature point extraction methods (Schmid et al., 1998). First, we obtain a grayscale image from the camera. Consider taking an image patch over an area $(u, v)$ from an image $I$ and shifting it by $(x, y)$. The Harris matrix $M$ can be found by taking the second derivative of the sum of squared differences between these two patches around $(x, y) = (0, 0)$. $M$ is given by:

$$ M = \begin{bmatrix} G \otimes I_x^2 & G \otimes I_x I_y \\ G \otimes I_x I_y & G \otimes I_y^2 \end{bmatrix} \tag{1} $$

Let the standard deviation of the Gaussian function $G$ used for smoothing with the Gaussian filter be $\sigma$, and let $I_x$, $I_y$ be the image derivatives. The strength of a corner is decided by this second-derivative structure. Here, the eigenvalues of $M$ are $(\lambda_1, \lambda_2)$, and from their values the following inference can be made.

If $\lambda_1 \approx 0$ and $\lambda_2 \approx 0$, then there is no feature at this pixel $(x, y)$. If either $\lambda_1$ or $\lambda_2$ is a large positive value, then an edge is found. If $\lambda_1$ and $\lambda_2$ are both large positive values, then a corner is found.

Because the exact calculation of the eigenvalues in Harris's method would increase the computational load, the following functions $R$ were proposed instead:

$$ R = \det(M) - k\,(\mathrm{tr}(M))^2 \tag{2} $$

$$ R = \lambda_1 \lambda_2 - k\,(\lambda_1 + \lambda_2)^2 \tag{3} $$

Here $\det$ denotes the determinant, $\mathrm{tr}$ denotes the sum of the diagonal elements of a matrix, and $k$ is a value decided empirically.
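As a concrete illustration, the corner response of equations (1) and (2) can be computed as in the sketch below. This is our own minimal NumPy/SciPy version, not the chapter's implementation; the value k = 0.04 is only a typical empirical choice.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def harris_response(gray: np.ndarray, sigma: float = 1.0, k: float = 0.04) -> np.ndarray:
    """Per-pixel corner strength R = det(M) - k * tr(M)^2 (equations (1)-(2))."""
    gray = gray.astype(np.float64)
    Iy, Ix = np.gradient(gray)                 # image derivatives
    Ixx = gaussian_filter(Ix * Ix, sigma)      # elements of M, smoothed by G
    Iyy = gaussian_filter(Iy * Iy, sigma)
    Ixy = gaussian_filter(Ix * Iy, sigma)
    det_M = Ixx * Iyy - Ixy * Ixy
    tr_M = Ixx + Iyy
    return det_M - k * tr_M ** 2               # large positive R marks a corner
```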

The Kanade-Tomasi corner detector (Shi and Tomasi, 1994) uses $\min(\lambda_1, \lambda_2)$ as the measure of a feature point. For example, Fig.7 shows feature point detection using the Harris operator on a photographed image. The Harris operator mainly detects corner points in the image as feature points.

The position of a feature point can be estimated from its position relative to the artificial marker in the camera image, after coordinate transformation. The flight control area of the RC helicopter can be expanded (see Fig.8) by using the information of natural feature points around the artificial marker. The Harris operator is suitable for detecting such natural feature points. Our system saves the areas including the natural feature points as templates when the artificial marker is detected. In the range where the image of the artificial marker can be taken, the system uses the position information of the artificial marker. If the system cannot take the image of the artificial marker, the position of the helicopter is estimated by template matching between the areas of natural feature points and the saved template areas.
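A minimal sketch of the template-matching fallback is shown below, assuming OpenCV is used for the normalized cross-correlation; the function name and the idea of returning a score for the caller to threshold are our additions, not the chapter's implementation.

```python
import cv2
import numpy as np

def locate_saved_feature(gray: np.ndarray, template: np.ndarray):
    """Search the current frame for a saved natural-feature template.

    Both images are expected as 8-bit grayscale arrays. Returns the top-left
    corner of the best match and its correlation score, so the caller can
    fall back to the artificial marker when the score is too low."""
    result = cv2.matchTemplate(gray, template, cv2.TM_CCOEFF_NORMED)
    _, best_score, _, best_loc = cv2.minMaxLoc(result)
    return best_loc, best_score
```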


    Fig. 7. Feature point extraction by a Harris operator

    Fig. 8. The expansion of flight area


    3.3 Detection of a marker

    Fig. 9. The flow chart of marker detection

The image of the marker photographed with the wireless camera on the micro RC helicopter is shown in Fig.4. The yaw angle can be calculated from the center of the marker and the position of the triangular cut of the marker, and the pitch angle and roll angle can be acquired by coordinate transformation between the camera coordinate system and the world coordinate system. The flow chart of the image processing for marker extraction and the calculation of position and posture is shown in Fig.9. First, binarization is performed on the input image from the camera installed on the micro RC helicopter. Then the marker is searched for in the image, and the outline of the marker is extracted. If the outline cannot be extracted, an image is acquired from the camera again and the marker is searched for again.

The marker center is searched for after the outline of the marker is extracted. The search method is shown in Fig.10. The maximum and minimum values of the x-coordinate and y-coordinate on the extracted outline are found, and their middle values are taken as the coordinates of the marker center. The lengths of the major axis and the minor axis of the marker are calculated from the distances between the center coordinates of the marker and the pixels on the outline of the marker. The calculation method of the major axis and the minor axis is shown in Fig.11. When the center coordinate is defined as $P(x_C, y_C)$ and the coordinate of a pixel on the outline is defined as $I(x, y)$, the distance $\overline{PI}$ from the center to the outline pixel is calculated by equation (4):

$$ \overline{PI} = \sqrt{(x - x_C)^2 + (y - y_C)^2} \tag{4} $$

$\overline{PI}$ is calculated for all pixels on the outline to obtain its maximum value $G_1$ and minimum value $G_2$. The segment of length $G_1$ is defined as the major axis $\overline{PO}$, and the segment of length $G_2$ is defined as the minor axis $\overline{PQ}$. The position and posture of the micro RC helicopter are then calculated by the method shown in Section 4.
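The center and axis search just described can be sketched as follows; this is an illustrative NumPy version assuming the outline is already available as an array of pixel coordinates, with a function name of our own choosing.

```python
import numpy as np

def marker_center_and_axes(outline: np.ndarray):
    """From the (N, 2) outline pixels of the marker, return the center P,
    the major-axis length G1 and the minor-axis length G2 (equation (4))."""
    x_c = (outline[:, 0].max() + outline[:, 0].min()) / 2.0
    y_c = (outline[:, 1].max() + outline[:, 1].min()) / 2.0
    # Distance from the center P to every outline pixel I.
    dist = np.hypot(outline[:, 0] - x_c, outline[:, 1] - y_c)
    return (x_c, y_c), dist.max(), dist.min()
```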


Since the vibration of the RC helicopter is faster than the frame rate of the camera, the image taken by the camera attached to the micro RC helicopter becomes blurred by the interlacing, like Fig.6, and an exact result cannot be obtained by the usual image processing. The image processing that we devised scans the image while skipping the odd-numbered (or even-numbered) lines, only in the y-axis direction, during pixel scanning. The scanning method is shown in Fig.12. If the pixels being processed are on odd-numbered lines, the even-numbered lines are skipped, only in the y-axis direction. In the x-axis direction, the system scans one pixel at a time, like the usual pixel scanning. By this method, stable outline tracking can be performed without being disturbed by the interlace blur.


    Fig. 12. The method of outline tracking
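The stepping rule can be illustrated with the following sketch, which only shows how the neighbourhood around the current pixel is scanned (y steps of 2 pixels, x steps of 1 pixel). A full outline tracker would also keep track of the tracking direction; the function and constant names here are ours, not the chapter's.

```python
# Neighbour offsets for outline tracking: steps in the y direction are 2 pixels,
# so the odd (or even) field lines are skipped, while x steps stay 1 pixel.
NEIGHBOUR_STEPS = [(-1, 0), (1, 0),                  # x direction: 1-pixel steps
                   (0, -2), (0, 2),                  # y direction: skip one line
                   (-1, -2), (1, -2), (-1, 2), (1, 2)]

def next_outline_pixel(binary, x, y):
    """Return the first neighbouring edge pixel around (x, y) in the
    thresholded image `binary`, scanned with the field-skipping steps."""
    height, width = binary.shape
    for dx, dy in NEIGHBOUR_STEPS:
        nx, ny = x + dx, y + dy
        if 0 <= nx < width and 0 <= ny < height and binary[ny, nx]:
            return nx, ny
    return None
```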

4. Calculation of position and posture

4.1 Calculation of the position coordinates of RC helicopter

In order to measure the position of the RC helicopter, it is first necessary to calculate the focal distance of the wireless camera on the RC helicopter. If the size of the marker and the z-coordinate of the RC helicopter are known, the focal distance $f$ of the wireless camera can be calculated easily. The three-dimensional coordinates of the moving RC helicopter are then calculated by using the focal distance, the center of gravity, and the radius of the marker in the image. The image plane of the camera and the plane of the circular marker are parallel when the RC helicopter is hovering above the circular marker, like Fig.13.

Consider the case where the marker is in the center of the photographed image. The radius of the circular marker in the camera image is defined as $D_1$, and the center coordinates of the circular marker in the image are defined as $(X_{C1}, Y_{C1})$. The radius of the actual marker is defined as $d_1$, and the center coordinates of the actual marker in the world coordinate system are defined as $(x_1, y_1, z_1)$. Then the focal distance $f$ of the camera can be calculated from the following two equations, which follow from the parallel relation:

$$ z_1 : d_1 = f : D_1 \tag{5} $$

$$ f = \frac{z_1 D_1}{d_1} \tag{6} $$


When the RC helicopter has moved, the radius of the marker in the image after moving is defined as $D_2$, the center coordinates of the marker in the image after moving are defined as $(X_{C2}, Y_{C2})$, and the center coordinates of the actual marker are defined as $(x_2, y_2, z_2)$. Then the following equation is obtained:

$$ z_2 : d_1 = f : D_2 \tag{7} $$

Here, since the focal distance $f$ and the radius $d_1$ of the actual marker do not change, the following relation is obtained from equation (5) and equation (7):

$$ D_1 : D_2 = z_2 : z_1 \tag{8} $$

$z_2$ can be acquired from the following equation. Moreover, $x_2$ and $y_2$ can be calculated from the following equations, which again follow from the parallel relation. Therefore, the coordinates of the helicopter after moving are computable by using equation (9), equation (10), and equation (11), together with the focal distance of the camera.

$$ z_2 = \frac{D_1 z_1}{D_2} \tag{9} $$

$$ x_2 = \frac{X_{C2}\, z_2}{f} \tag{10} $$

$$ y_2 = \frac{Y_{C2}\, z_2}{f} \tag{11} $$
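Equations (6) and (9)-(11) translate directly into code. The following is a small sketch with our own function names; the division by the focal distance in (10) and (11) reflects the pinhole projection relation used above.

```python
def focal_distance(z1, D1, d1):
    """Equation (6): focal distance from a calibration shot taken while the
    helicopter hovers at a known height z1 above the marker."""
    return z1 * D1 / d1

def helicopter_position(D1, z1, D2, Xc2, Yc2, f):
    """Position of the helicopter after moving, equations (9)-(11).

    D1, z1  : marker radius in the image and height at calibration
    D2      : marker radius in the image after moving
    Xc2, Yc2: marker center in the image after moving (camera coordinates)
    f       : focal distance from equation (6)"""
    z2 = D1 * z1 / D2      # equation (9)
    x2 = Xc2 * z2 / f      # equation (10)
    y2 = Yc2 * z2 / f      # equation (11)
    return x2, y2, z2
```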


    Fig. 13. The location calculation method of RC helicopter


4.2 Calculation of the attitude angle of RC helicopter

The relation between the attitude angle of the RC helicopter and the marker image in the camera coordinate system is shown in Fig.14. When the RC helicopter is hovering above the circular marker, the marker image in the camera coordinate system is a true circle, like the actual marker. If the RC helicopter leans, the marker in the camera coordinate system becomes an ellipse. To calculate the attitude angle, first the triangular cut part of the circular marker is extracted as a direction feature point. Then the deformation of the marker image is corrected, and the yaw angle is calculated from the relation between the center of the circular marker and the location of the direction feature point. The pitch angle and the roll angle are calculated by performing a coordinate transformation from the camera coordinate system to the world coordinate system using the deformation rate of the marker in the image from the wireless camera.


    Fig. 14. Relation between attitude angle of RC helicopter and image in wireless camera.

    Calculation of a yaw angle

The yaw angle can be calculated from the relative positions of the center of the circular marker image and the direction feature point of the circular marker image. However, when the marker image is deformed into an ellipse, an exact value of the yaw angle cannot be obtained directly; the yaw angle has to be calculated after correcting the deformation of the circular marker. Since the length of the major axis of the ellipse does not change before and after the deformation of the marker, the angle $\alpha$ between the $x$ axis and the major axis can be calculated correctly even if the shape of the marker is not corrected.

As shown in Fig.15, the center of the marker is defined as point $P$, the major axis of the marker is defined as $\overline{PO}$, and the foot of the perpendicular dropped from point $O$ to the $x$ axis is defined as $C$. If the angle $\angle OPC$ is defined as $\alpha'$, the following equation holds:

$$ \alpha' = \arctan\frac{\overline{OC}}{\overline{PC}} \tag{12} $$

Here, when the major axis is in the 1st quadrant, like Fig.15(a), $\alpha$ is equal to $\alpha'$, and when the major axis is in the 2nd quadrant, $\alpha$ is calculated by subtracting $\alpha'$


from 180 degrees, like Fig.15(b). If the $x$-coordinate of point $O$ is defined as $x_O$, the value of $\alpha$ is calculated by the following equation:

$$ \alpha = \begin{cases} \alpha' & (x_O \geq 0) \\ 180^{\circ} - \alpha' & (x_O < 0) \end{cases} \tag{13} $$


When the line segment between point $A$ (the direction feature point) and the center of the marker is defined as $\overline{PA}$, the angle $\beta$ between the line segment $\overline{PA}$ and the major axis $\overline{PO}$ is calculated by the following equation:

$$ \beta = \arctan\frac{\overline{AS}}{\overline{PS}} \tag{16} $$

Finally, the yaw angle is calculated by adding $\beta$ to $\alpha$.
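A compact sketch of the yaw computation is given below. It is our own formulation: $\alpha$ follows equations (12) and (13), while $\beta$ is obtained directly as the signed angle between PA and PO using atan2, which plays the role of the arctan(AS/PS) construction of equation (16) without needing the intermediate point S.

```python
import math

def yaw_angle(P, O, A):
    """Yaw angle alpha + beta from the marker center P, the major-axis
    endpoint O and the direction feature point A (all (x, y) image points)."""
    # alpha: angle between the x axis and the major axis PO, equations (12)-(13).
    alpha = math.degrees(math.atan2(abs(O[1] - P[1]), abs(O[0] - P[0])))
    if O[0] - P[0] < 0:                      # major axis in the 2nd quadrant
        alpha = 180.0 - alpha
    # beta: signed angle from the major axis PO to the segment PA.
    beta = math.degrees(math.atan2(A[1] - P[1], A[0] - P[0])
                        - math.atan2(O[1] - P[1], O[0] - P[0]))
    return alpha + beta
```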


    Fig. 16. An angle between the direction feature point and the major axis

    Calculation of pitch angle and roll angle

By using the deformation rate of the marker in the image, the pitch angle and the roll angle can be calculated by performing a coordinate transformation from the camera coordinate system to the world coordinate system. To get the pitch angle and the roll angle, we used a weak perspective projection for the coordinate transformation (Bao et al., 2003).

Fig.17 shows the principle of the weak perspective projection. The image obtained by photographing a plane figure in three-dimensional space with a camera is defined as $I$, and the original configuration of the plane figure is defined as $T$. The relation between $I$ and $T$ is obtained using the weak perspective projection transformation by the following two projection steps.

a. $T'$ is acquired by a parallel projection of $T$ onto a plane $P$ parallel to the camera image surface $C$.
b. $I$ is acquired by a central projection of $T'$ onto $C$.

The attitude angle $\theta'$ is acquired using the relation between $I$ and $T$. The angle $\theta'$ shown in Fig.18 expresses the angle between the original marker and the marker in the camera coordinate system. In that case, the major axis $G_1$ and the minor axis $G_2$ of the marker image appear as in Fig.19.


Fig. 17. The conceptual diagram of the weak perspective projection

Fig. 18. The schematic diagram of the attitude angle $\theta'$



    Fig. 19. Calculation of an attitude angle

Fig. 19 shows the calculation method of $\theta'$. $\overline{PQ}$ is transformed into $\overline{LP}$ when an inverse parallel projection along the optical axis of the camera is applied to the minor axis $\overline{PQ}$. Since the original shape of the marker is a true circle, $\overline{LP}$ becomes equal to the length $G_1$ of the major axis in the camera coordinate system. $\theta'$ is calculated by the following equation:

$$ \theta' = \arcsin\frac{G_2}{G_1} \tag{17} $$

To get the segment $\overline{TU}$, $\overline{SU}$ is projected orthogonally onto the flat surface parallel to $\overline{PQ}$. $\overline{PQ}$ and $\overline{TU}$ are parallel, and $\overline{LP}$ and $\overline{SU}$ are also parallel. Therefore, the relation between $\theta$ and $\theta'$ is given by equation (18), and the inclination $\theta$ of the camera can be calculated by equation (19):

$$ \theta = \theta' \tag{18} $$

$$ \theta = \arcsin\frac{G_2}{G_1} \tag{19} $$
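In code, the inclination of equations (17)-(19) is a one-line computation; the clamping of the ratio is our own guard against pixel noise making G2 slightly larger than G1.

```python
import math

def camera_inclination(G1, G2):
    """Camera inclination theta = arcsin(G2 / G1) in degrees, equations (17)-(19)."""
    ratio = min(1.0, G2 / G1)     # guard against G2 > G1 caused by pixel noise
    return math.degrees(math.asin(ratio))
```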

    5. Control of RC helicopter

Control of the RC helicopter is performed based on the position and posture of the marker acquired as in Section 4. During autonomous hovering flight, the position


data of the RC helicopter are obtained by tracking the marker from a fixed height. The fuzzy rules of the Throttle control input signal during autonomous flight are defined as follows.

If $z(t)$ is PB and $\dot{z}(t)$ is PB, then Throttle is NB.
If $z(t)$ is PB and $\dot{z}(t)$ is ZO, then Throttle is NS.
If $z(t)$ is PB and $\dot{z}(t)$ is NB, then Throttle is ZO.
If $z(t)$ is ZO and $\dot{z}(t)$ is PB, then Throttle is NS.
If $z(t)$ is ZO and $\dot{z}(t)$ is ZO, then Throttle is ZO.
If $z(t)$ is ZO and $\dot{z}(t)$ is NB, then Throttle is PS.
If $z(t)$ is NB and $\dot{z}(t)$ is PB, then Throttle is ZO.
If $z(t)$ is NB and $\dot{z}(t)$ is ZO, then Throttle is PS.
If $z(t)$ is NB and $\dot{z}(t)$ is NB, then Throttle is PB.

The fuzzy rules for Aileron, Elevator, and Rudder are designed in the same way as for Throttle. Each control input $u(t)$ is acquired from the membership functions and the fuzzy rules. The adaptation value $\omega_i$ and the control input $u(t)$ of a fuzzy rule are calculated from the following equations:

$$ \omega_i = \prod_{k=1}^{n} A_{ki}(x_k) \tag{20} $$

$$ u(t) = \frac{\sum_{i=1}^{r} \omega_i c_i}{\sum_{i=1}^{r} \omega_i} \tag{21} $$

Here, $i$ is the index of a fuzzy rule, $n$ is the number of input variables, $r$ is the number of fuzzy rules, $A_{ki}$ is a membership function, $x_k$ is the input variable of the membership function, and $c_i$ is the output value assigned to rule $i$ (Tanaka, 1994) (Wang et al., 1997).
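The nine Throttle rules together with equations (20) and (21) can be put into code as below. This is an illustrative sketch: the triangular membership break points and the crisp consequent values are placeholders chosen by us, since the chapter does not list the actual membership functions.

```python
def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Placeholder fuzzy sets for the height error z(t) and its rate dz(t).
Z_SETS  = {"NB": (-2.0, -1.0, 0.0), "ZO": (-1.0, 0.0, 1.0), "PB": (0.0, 1.0, 2.0)}
DZ_SETS = {"NB": (-2.0, -1.0, 0.0), "ZO": (-1.0, 0.0, 1.0), "PB": (0.0, 1.0, 2.0)}

# Crisp consequent values c_i for the Throttle labels (placeholders).
CONSEQUENT = {"NB": -1.0, "NS": -0.5, "ZO": 0.0, "PS": 0.5, "PB": 1.0}

# The nine rules from the text: (z label, dz label, Throttle label).
RULES = [("PB", "PB", "NB"), ("PB", "ZO", "NS"), ("PB", "NB", "ZO"),
         ("ZO", "PB", "NS"), ("ZO", "ZO", "ZO"), ("ZO", "NB", "PS"),
         ("NB", "PB", "ZO"), ("NB", "ZO", "PS"), ("NB", "NB", "PB")]

def throttle(z, dz):
    """Throttle control input u(t) from equations (20) and (21)."""
    numerator = denominator = 0.0
    for z_label, dz_label, out_label in RULES:
        w = tri(z, *Z_SETS[z_label]) * tri(dz, *DZ_SETS[dz_label])  # eq. (20)
        numerator += w * CONSEQUENT[out_label]
        denominator += w
    return numerator / denominator if denominator > 0 else 0.0      # eq. (21)
```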

    6. Experiments

In order to check whether the position and posture parameters can be calculated correctly, we compared actual measurement results with the calculated results in several experiments. The experiments were performed indoors. In the first experiment, the wireless camera shown in Fig.20 was set at a known three-dimensional position, and the marker was put on the ground as in Fig.21. The marker was photographed by this wireless camera. A personal computer calculated the position and the posture of the wireless camera, and the calculated parameters were compared with the actual parameters.

Table 1 shows the specification of the wireless camera, and Table 2 shows the specification of the personal computer. A marker of 19 cm radius was used in the experiments, because a marker of this size can be captured easily when this type of wireless camera, which has a resolution of 640x480 pixels, photographs it at a height between 1 m and 2 m. Table 3 shows the experimental results for the z-axis coordinate. Table 4 shows the experimental results for the moving distance. Table 5 shows the experimental results for the yaw angle $(\alpha + \beta)$. Table 6 shows the experimental results for the angle $\theta$. Although there are some errors in the computed results, the values are close to the actual measurements.


    Fig. 20. The wireless camera

    Fig. 21. The first experiment

Maker: RF SYSTEM lab.
Part number: Micro Scope RC-12
Image sensor: 270,000 pixels, 1/4 inch, color CMOS
Lens: 0.8 mm pin lens
Scan mode: Interlace
Effective distance: 30 m
Battery charging time: About 45 minutes
Size: 15 x 18 x 35 (mm)
Weight: 14.7 g

Table 1. The specification of the wireless camera

Maker: Hewlett Packard
Model name: Compaq nx9030
OS: Windows XP
CPU: Intel Pentium M 1.60 GHz
Memory: 768 MB

Table 2. The specification of PC


Actual distance (mm):     800   1000   1200   1400
Calculated value (mm):    785    980   1225   1372

Table 3. The experimental results of z axis coordinates

Actual moving distance (mm):               50   -50   100   -100
Computed value of x axis coordinate (mm):  31   -33    78    -75
Computed value of y axis coordinate (mm):  29   -33   101    -89

Table 4. The experimental results of moving distance

Actual angle (degree):       45   135   225   315
Calculated value (degree):   64   115   254   350

Table 5. The experimental results of yaw angle $(\alpha + \beta)$

Actual angle (degree):        0    10    20    40
Calculated value (degree):   12    28    36    44

Table 6. The experimental results of the angle $\theta$

In the next experiment, we attached the wireless camera to the RC helicopter and checked whether the position and posture parameters could be calculated during flight. Table 7 shows the specification of the RC helicopter used for the experiment. A ground image like Fig.22 was photographed with the wireless camera attached to the RC helicopter during the flight. The marker was detected by the image processing program following the procedure of Fig.9. Binarization was performed on the image input from the wireless camera, and the outline of the marker was extracted as in Fig.23. The direction feature point was detected from the image of the ground photographed by the wireless camera, as in Fig.24.

Fig.25 shows the measurement results on the display of the personal computer used for the calculation. The measured values in Fig.25 were x-coordinate = 319, y-coordinate = 189, z-coordinate = 837, $\alpha$ = 10.350105 degrees, $\beta$ = -2.065881 degrees, and $\theta'$ = 37.685916 degrees. Since our proposed image input method, which reduces the blurring, was used, the position and the posture were obtainable during flight. The absolute position and posture of the RC helicopter could not be measured by any other instrument during the flight, but we confirmed by visual observation that the position and the posture were acquired almost correctly.

Length: 360 mm (body), 62 mm (frame)
Width: 90 mm
Height: 160 mm
Gross load: 195 g
Diameter of main rotor: 350 mm
Gear ratio: 9.857:1
Motor: XRB Coreless Motor

Table 7. The specification of RC helicopter


    Fig. 22. An image photographed by the wireless camera

    Fig. 23. The result of marker detection

    Fig. 24. The result of feature point extraction


    Fig. 25. The measurement results during flight

Finally, the autonomous flight control experiment of the RC helicopter was performed by detecting the marker, calculating the position and the posture, and applying the fuzzy control. Fig. 26 shows a series of scenes of a hovering flight of the RC helicopter. The results of the image processing can be checked on the display of the personal computer. From the experimental results, the marker was detected and the direction feature point was extracted correctly during the autonomous flight. However, when the spatial relation between the marker and the RC helicopter was unsuitable, the detection of position and posture became unstable and the autonomous flight failed. We will improve the performance of the autonomous flight control of the RC helicopter by stabilizing the feature point detection and the position estimation.

    7. Conclusion

This chapter described autonomous flight control for a micro RC helicopter flying indoors. It is based on three-dimensional measurement using a micro wireless camera attached to the micro RC helicopter and a circular marker placed on the ground. First, a simple method of measuring the self position and posture of the micro RC helicopter was proposed. In this method, when the wireless camera attached to the RC helicopter takes an image of the circular marker, the major axis and the minor axis of the circular marker image are obtained. Because this circular marker has a cut part, the direction of the circular marker image can be acquired by extracting the cut part as a direction feature point of the circular marker.


Fig. 26. The experiment of autonomous flight (successive scenes, Time 1 to Time 4)

Therefore, the relation between the circular marker image and the actual circular marker can be acquired by a coordinate transformation using the above data. In this way, the three-dimensional self position and posture of the micro RC helicopter can be acquired with image processing and weak perspective projection. Then, we designed a flight control system that performs fuzzy control based on the three-dimensional position and posture of the micro RC helicopter. The micro RC helicopter is controlled by tracking the circular marker with its direction feature point during the flight.

In order to confirm the effectiveness of our proposed method, the position and the posture were calculated in an experiment using an image photographed with a wireless camera fixed at a known three-dimensional position. The experimental results confirmed that the calculated values were close to the actually measured values. An autonomous flight control experiment was performed to confirm that our proposed image input method is effective when using a micro wireless camera attached to the micro RC helicopter. From the results of the autonomous flight control experiment, the marker was detected in real time during the flight, and it was confirmed that autonomous flight of the micro RC helicopter is possible. However, when the spatial relation between the marker and the RC helicopter was


unsuitable, the detection of position and posture became unstable and the autonomous flight failed. We will improve the system so that the autonomous flight control of the RC helicopter becomes more stable.

8. References

Amida, O.; Kanade, T. & Miller, J.R. (1998). Vision-Based Autonomous Helicopter Research at Carnegie Mellon Robotics Institute 1991-1997, American Helicopter Society International Conf., Heli Japan.
Harris, C. & Stephens, M. (1988). A Combined Corner and Edge Detector, Proc. 4th Alvey Vision Conf., pp.147-151.
Nakamura, S.; Kataoka, K. & Sugeno, M. (2001). A Study on Autonomous Landing of an Unmanned Helicopter Using Active Vision and GPS, J. RSJ, Vol.18, No.2, pp.252-260.
Neumann, U. & You, S. (1999). Natural Feature Tracking for Augmented-reality, IEEE Transactions on Multimedia, Vol.1, No.1, pp.53-64.
Ohtake, H.; Iimura, K. & Tanaka, K. (2009). Fuzzy Control of Micro RC Helicopter with Coaxial Counter-rotating Blades, Journal of Japan Society for Fuzzy Theory and Intelligent Informatics, Vol.21, No.1, pp.100-106.
Schmid, C.; Mohr, R. & Bauckhage, C. (1998). Comparing and Evaluating Interest Points, Proc. 6th Int. Conf. on Computer Vision, pp.230-235.
Shi, J. & Tomasi, C. (1994). Good Features to Track, Proc. IEEE Conf. Comput. Vision Patt. Recogn., pp.593-600.
Smith, S. M. & Brady, J. M. (1997). SUSAN - A New Approach to Low Level Image Processing, Int. J. Comput. Vis., Vol.23, No.1, pp.45-78.
Sugeno, M. et al. (1996). Intelligent Control of an Unmanned Helicopter based on Fuzzy Logic, Proc. of American Helicopter Society 51st Annual Forum, Texas.
Tanaka, K. (1994). Advanced Fuzzy Control, Kyoritsu Shuppan Co., Ltd., Japan.
Wang, G.; Fujiwara, N. & Bao, Y. (1997). Automatic Guidance of Vehicle Using Fuzzy Control (1st Report): Identification of General Fuzzy Steering Model and Automatic Guidance of Cars, Systems, Control and Information, Vol.10, No.9, pp.470-479.
Bao, Y.; Takayuki, N. & Akasaka, H. (2003). Weak Perspective Projection Invariant Pattern Recognition without Gravity Center Calculation, Journal of IIEEJ, Vol.32, No.5, pp.659-666.
Bao, Y. & Komiya, M. (2008). An improvement Moravec Operator for rotated image, Proc. of the ADVANTY 2008 SYMPOSIUM, pp.133-138.
