
Research Article

Pose Determination for Malfunctioned Satellites Based on Depth Information

Feng Yu , Yi Zhao, and Yanhua Zhang

College of Astronautics, Nanjing University of Aeronautics and Astronautics, 210016, China

Correspondence should be addressed to Feng Yu; [email protected]

Received 9 November 2018; Revised 22 February 2019; Accepted 1 April 2019; Published 11 June 2019

Academic Editor: Enrico C. Lorenzini

Copyright © 2019 Feng Yu et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Autonomous on-orbit servicing is a future space activity that can be utilized to extend satellite life. Relative pose estimation for a malfunctioned satellite is one of the key technologies needed to achieve robotic on-orbit servicing. In this paper, a relative pose determination method using point clouds is presented for the final phase of the rendezvous and docking of malfunctioned satellites. The method consists of three parts: (1) planes are extracted from the point cloud by utilizing the random sample consensus algorithm. (2) The eigenvector matrix and the diagonal eigenvalue matrix are calculated by decomposing the point cloud distribution matrix of the extracted plane. The eigenvalues are utilized to recognize rectangular planes, and the eigenvector matrix is the attitude rotation matrix from the sensor to the plane. A solution to the resulting multisolution problem is also presented. (3) An extended Kalman filter is designed to estimate the translational states, the rotational states, the location of the mass center, and the moment-of-inertia ratios. Because the method only utilizes local features without observing the whole satellite, it is suitable for the final phase of rendezvous and docking. The algorithm is validated by a series of mathematical simulations.

1. Introduction

The on-orbit servicing technology, which is a very challenging research topic, is commonly regarded as a means to extend the life of malfunctioned satellites. Although several on-orbit servicing demonstration cases have been performed [1], an autonomous on-orbit servicing operation under realistic conditions has not been carried out in space up to now. A series of technologies including autonomous navigation, control, and robotic arms are still immature and should be further developed.

Because the relative pose measurement for a malfunctioned satellite is the precondition of on-orbit servicing [2], many researchers have made great efforts to solve this important problem. The relative navigation of TECSAS/DEOS, studied at the center for robotics and machinery at the German Aerospace Center (DLR), is realized by a stereo vision system. In the automatic mode, a number of images are taken, dumped to the ground segment, and then processed to estimate the relative position and the target motion [3]. The Argon system utilizes two cameras with different view sizes and a flash LIDAR. Complex image processing techniques are developed to measure the relative position and relative attitude of noncooperative targets [4, 5]. Volpe et al. proposed an on-orbit relative pose and shape estimation with the use of a monocular camera and a distance sensor [6, 7]. In China, Caixiu proposed a method using a single camera to measure the relative pose by utilizing a solar panel triangle structure and a circular docking ring [8]. Miao et al. utilized solar panel components as the identification objects for noncooperative spacecraft pose determination [9]. Zhang et al. presented a relative pose measurement algorithm using a circle and a noncoplanar feature point [10]. Huang et al. proposed a monocular visual servoing control method using only the edge lines of satellite brackets [11]. According to the above research, visual navigation is the main relative pose determination method in the relative state estimation for malfunctioned satellites [12]. However, the visual system sometimes suffers poor imaging quality in space due to the special optical environment, which leads to difficulty in subsequent processing. What is worse, the surface of a malfunctioned satellite is often covered by multilayer

Hindawi International Journal of Aerospace Engineering, Volume 2019, Article ID 6895628, 15 pages. https://doi.org/10.1155/2019/6895628


insulation (MLI) materials for thermal protection. Because of this, optical features such as texture change according to the direction of lighting and view; the image features may not be obviously identified [13].

Therefore, other viable image processing methods or sensors have been utilized by some researchers. Terui et al. used binocular stereo vision to calculate the dense 3D points of a target satellite [13], and the relative pose was obtained by using Iterative Closest Point (ICP). LIDAR has been employed for full 6DOF pose estimation [14]. Gómez Martínez et al. proposed a pose estimation method for a rocket body in orbit using a time-of-flight (TOF) camera [15]. Hillenbrand and Lampariello realized the motion estimation and model identification of free-floating targets by utilizing range data measurements [16]. The aforementioned studies intrinsically utilize the target's geometrical shape by processing a point cloud of 3D coordinates mapping the sensed target surface. The ICP method is commonly employed to calculate the target pose parameters.

In contrast, Lichter and Dubowsky proposed a method to estimate the state, shape, and inertia parameters of noncooperative targets based on a series of distance images [17]. This method includes three parts: firstly, the coarse target pose is estimated by condensing the range data of the target; secondly, the full dynamic states and inertial parameters are estimated by using Kalman filters; and finally, the shape of the target is estimated by using the estimated pose information and the raw range data. This method is effective only when the whole target is observed, but it is difficult to obtain a complete distance image of the target at close distance. Therefore, the authors advised that a team of range imagers be distributed uniformly to capture the whole target surface, and that the relative pose between sensors mounted on different satellites be known with high accuracy. Literature [15] identified the rocket's geometrical properties, including the cylindrical body, nozzle, side tanks, and fairing, from the point cloud during target pose initialization.

Inspired by literatures [15, 17], this paper proposes a new method to estimate the pose of a malfunctioned satellite based on rectangular features extracted from a point cloud. A complete distance image of the target is not necessary for this method; only local rectangular features are extracted to measure the relative position and attitude. Simulation results show the effectiveness of the proposed method.

2. Rectangular Plane Extraction, Recognition, and Pose Determination

2.1. Plane Extraction from Point Cloud. Because satellites are man-made objects, there are some regular components on satellites, such as the main body, solar panels, and the docking ring. Although these components are not designed specifically for measuring the relative pose of malfunctioned satellites, they have obvious geometric features for recognition and can be utilized to realize relative navigation. The rectangular feature is one of the typical geometric shapes on malfunctioned satellites; for example, the main body and solar panels of most satellites are composed of a variety of rectangles. Therefore, this paper proposes to select rectangular planes as the identification object.

The random sample consensus (RANSAC) algorithm is utilized to extract planes from the point cloud dataset. The algorithm can be described as follows: firstly, select a mathematical model; secondly, find the so-called minimum set from the original point cloud sample, and then estimate the parameters of the model by using this minimum set of points; thirdly, check the consistency of the remaining points with the estimated model, and add the qualified points into the point set; and finally, reestimate the parameters of the model using all the points that conform to it.

According to the aforementioned RANSAC algorithm, the specific steps for extracting a plane from the point cloud are proposed as follows:

(1) Select three points P1(x1, y1, z1), P2(x2, y2, z2), and P3(x3, y3, z3) randomly from the point cloud dataset

(2) Calculate the parameters of the plane S according to P1, P2, and P3

(3) Suppose the plane thickness is ε, and calculate the distance from any point in the point cloud to the plane:

d_i = \frac{\lvert A x_i + B y_i + C z_i + D \rvert}{\sqrt{A^2 + B^2 + C^2}}. (1)

(4) Find out how many points satisfy d_i < ε

(5) Repeat step 1~step 4 K times, and select the plane that includes the most points, where K can be calculated by the following formula:

K = \frac{\log (1 - \Phi)}{\log \left( 1 - (1 - \tau)^3 \right)}, (2)

where τ is the proportion of outliers in the point cloud and Φ is the probability that the optimal plane can be found after K calculations
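The five steps above can be sketched in Python with numpy. This is a minimal illustration, not the authors' VC++ implementation; the function names and the final least-squares refit (via SVD of the centered inliers) are our own additions:

```python
import numpy as np

def fit_plane(p1, p2, p3):
    """Plane through three points: returns unit normal (A, B, C) and offset D."""
    n = np.cross(p2 - p1, p3 - p1)
    n = n / np.linalg.norm(n)
    return n, -np.dot(n, p1)

def ransac_plane(points, eps, phi=0.99, tau=0.5, rng=None):
    """Steps (1)-(5): sample minimal sets, keep the plane with the most inliers.
    The iteration count K follows equation (2): K = log(1-phi)/log(1-(1-tau)^3)."""
    rng = np.random.default_rng(rng)
    K = int(np.ceil(np.log(1 - phi) / np.log(1 - (1 - tau) ** 3)))
    best_inliers = np.zeros(len(points), dtype=bool)
    for _ in range(K):
        idx = rng.choice(len(points), 3, replace=False)
        p1, p2, p3 = points[idx]
        if np.linalg.norm(np.cross(p2 - p1, p3 - p1)) < 1e-12:
            continue  # degenerate (collinear) minimal set
        normal, d = fit_plane(p1, p2, p3)
        dist = np.abs(points @ normal + d)   # equation (1), unit normal
        inliers = dist < eps                 # step (4): plane "thickness" test
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # refit on all consistent points: least-variance direction of the inliers
    q = points[best_inliers]
    normal = np.linalg.svd(q - q.mean(axis=0))[2][-1]
    return normal, best_inliers
```

For example, with τ = 0.5 and Φ = 0.99, equation (2) gives K = 35 iterations.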

2.2. Rectangular Plane Recognition. Some components of a malfunctioned satellite are composed of rectangular planes, and the dimensions of these planes are known for a specific malfunctioned satellite. After a plane is extracted from the point cloud, the point cloud distribution matrix is utilized to identify whether it is a rectangular plane and to determine its dimensions.

Before calculating the point cloud distribution matrix, the location of the geometric centroid of the extracted plane can be determined as

r_m = \frac{1}{n} \sum_{i=1}^{n} r_i, (3)

where n is the number of points in the point cloud, r_i is the position of the ith point in the sensor coordinate system, and r_m is the centroid location of the extracted plane.



In literature [17], the attitude is determined based on the second moment information of the point cloud, which is called the geometric moment matrix. In this paper, however, the point cloud distribution matrix is proposed, and this matrix is decomposed to calculate the attitude and identify the shape. In essence, both the geometric moment matrix and the point cloud distribution matrix reflect the shape information of the point cloud, but the latter requires less processing power. The point cloud distribution matrix is computed as

J = \frac{1}{n} \sum_{i=1}^{n} (r_i - r_m)(r_i - r_m)^T. (4)

The eigenvector matrix and the diagonal eigenvalue matrix can be obtained by utilizing a matrix decomposition operation as

J = R \Lambda R^T, (5)

where Λ is the diagonal eigenvalue matrix of J and R is the eigenvector matrix.

The eigenvalue matrix reflects the point cloud distribution in the extracted plane's body coordinates, which is described in Figure 1. The origin of the coordinate system is the center of the extracted plane; the X-axis, Y-axis, and Z-axis are parallel to the principal geometric axes. If the extracted plane is a rectangle, the X-axis and Y-axis are parallel to the short and long sides of the rectangular plane, respectively, and the corresponding side lengths can be denoted as b and a, respectively. The eigenvalue matrix reflects the dimension parameters of the extracted plane. Therefore, Λ is invariable for a specific plane; that is to say, it does not change with the relative pose.

Similarly, in the body coordinate system, it is obvious that the eigenvalue matrix can be defined as

\Lambda = \frac{1}{n} \sum_{i=1}^{n} (d_i - d_m)(d_i - d_m)^T, (6)

where n is the number of points in the point cloud, d_i is the position of the ith point in the body coordinate system, and d_m is the origin of the body coordinate system, which can also be defined as

d_m = \frac{1}{n} \sum_{i=1}^{n} d_i. (7)

It is supposed that the feature points obey a uniform distribution in the feature plane coordinate system and that the measurement noise of the point cloud is white Gaussian noise. Therefore, d_i can be rewritten as

d_i = \bar{d}_i + \varepsilon_i, (8)

where \bar{d}_i is the true position of the ith point in the body coordinate system and \varepsilon_i is the noise.

Therefore, equation (6) can also be rewritten as

\Lambda = \frac{1}{n} \sum_{i=1}^{n}
\begin{bmatrix}
(d_i^x - d_m^x)^2 & (d_i^x - d_m^x)(d_i^y - d_m^y) & (d_i^x - d_m^x)(d_i^z - d_m^z) \\
(d_i^x - d_m^x)(d_i^y - d_m^y) & (d_i^y - d_m^y)^2 & (d_i^y - d_m^y)(d_i^z - d_m^z) \\
(d_i^x - d_m^x)(d_i^z - d_m^z) & (d_i^y - d_m^y)(d_i^z - d_m^z) & (d_i^z - d_m^z)^2
\end{bmatrix}, (9)

where d_i^x, d_i^y, and d_i^z are the three components of d_i and d_m^x, d_m^y, and d_m^z are the three components of d_m.

Suppose I_y = \frac{1}{n} \sum_{i=1}^{n} (d_i^y - d_m^y)^2, and then

E[I_y] = E\left[ \frac{1}{n} \sum_{i=1}^{n} \left( (d_i^y)^2 + (d_m^y)^2 - 2 d_i^y d_m^y \right) \right], (10)

where E[I_y] is the mathematical expectation of I_y.

Substitute equation (8) into (10); then, equation (10) can be expressed as

E[I_y] = E\left[ \frac{1}{n} \sum_{i=1}^{n} (\bar{d}_i^y)^2 \right] + E\left[ \frac{1}{n} \sum_{i=1}^{n} (\varepsilon_i^y)^2 \right] + 2 E\left[ \frac{1}{n} \sum_{i=1}^{n} \bar{d}_i^y \varepsilon_i^y \right] - 2 E\left[ \frac{1}{n} \sum_{i=1}^{n} d_m^y (\bar{d}_i^y + \varepsilon_i^y) \right] + E\left[ \frac{1}{n} \sum_{i=1}^{n} (d_m^y)^2 \right]. (11)

Figure 1: The body coordinate system (origin O at the plane center; the X-axis and Y-axis are parallel to the sides of lengths b and a, respectively).



If it is supposed that the extracted plane is a rectangular plane and the number of points n is large enough, the following equations hold:

E\left[ \frac{1}{n} \sum_{i=1}^{n} (d_m^y)^2 \right] \approx 0, (12)

E\left[ \frac{1}{n} \sum_{i=1}^{n} d_m^y (\bar{d}_i^y + \varepsilon_i^y) \right] \approx 0, (13)

E\left[ \frac{1}{n} \sum_{i=1}^{n} \bar{d}_i^y \varepsilon_i^y \right] \approx 0, (14)

E\left[ \frac{1}{n} \sum_{i=1}^{n} (\varepsilon_i^y)^2 \right] = \sigma_y^2, (15)

E\left[ \frac{1}{n} \sum_{i=1}^{n} (\bar{d}_i^y)^2 \right] = \frac{b^2}{12}, (16)

where σ_y is the mean square deviation of the noise ε_i^y and b is the side length of the edge that is parallel to the Y-axis.

Equations (12), (13), (14), (15), and (16) are substituted into equation (11), and then

E[I_y] = \frac{b^2}{12} + \sigma_y^2. (17)

Similarly, E[I_x] = a^2/12 + \sigma_x^2 and E[I_z] = \sigma_z^2.

Suppose I_{xy} = \frac{1}{n} \sum_{i=1}^{n} (d_i^x - d_m^x)(d_i^y - d_m^y), and then

E[I_{xy}] = E\left[ \frac{1}{n} \sum_{i=1}^{n} d_i^x d_i^y \right] - E\left[ \frac{1}{n} \sum_{i=1}^{n} d_i^x d_m^y \right] - E\left[ \frac{1}{n} \sum_{i=1}^{n} d_i^y d_m^x \right] + E\left[ \frac{1}{n} \sum_{i=1}^{n} d_m^x d_m^y \right]. (18)

It is regarded that d_i^x and d_i^y are independent of each other. Therefore, the following equation holds:

E[I_{xy}] = 0. (19)

Similarly, E[I_{xz}] = 0 and E[I_{yz}] = 0.

It can be seen from the above derivation that the eigenvalue matrix Λ of the rectangular point cloud plane is only related to the size of the rectangular plane and the noise of the sensor. If the mean square deviation of the sensor noise is known, the parameters a and b can be estimated. Therefore, the extracted rectangular surface of the satellite can be identified by calculating the eigenvalue matrix Λ.
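The recognition procedure of this section can be sketched numerically: compute the centroid (equation (3)) and distribution matrix (equation (4)), decompose it (equation (5)), and recover the side lengths from equation (17) and its analogue for I_x. A minimal sketch with our own function name, assuming the noise standard deviations σ are known (zero by default):

```python
import numpy as np

def recognize_rectangle(points, sigma=np.zeros(3)):
    """Centroid r_m (eq. (3)), distribution matrix J (eq. (4)), and its
    eigendecomposition J = R Lambda R^T (eq. (5)).  Side lengths follow
    E[I_x] = a^2/12 + sigma_x^2 and E[I_y] = b^2/12 + sigma_y^2 (eq. (17))."""
    r_m = points.mean(axis=0)              # equation (3)
    d = points - r_m
    J = d.T @ d / len(points)              # equation (4)
    lam, R = np.linalg.eigh(J)             # equation (5), ascending eigenvalues
    order = np.argsort(lam)[::-1]          # long side, short side, plane normal
    lam, R = lam[order], R[:, order]
    a = np.sqrt(max(12.0 * (lam[0] - sigma[0] ** 2), 0.0))  # long side length
    b = np.sqrt(max(12.0 * (lam[1] - sigma[1] ** 2), 0.0))  # short side length
    return r_m, R, lam, a, b
```

For the 10 cm × 18 cm plane used in Section 4, the eigenvalues approach 27, 8.3333, and 0 as predicted. Note that the eigenvector matrix returned by the decomposition is orthogonal but may have determinant −1; one column must be flipped before using it as a rotation matrix, which is one face of the multisolution problem treated in Section 2.3.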

2.3. Coarse Pose Determination. The matrix decomposition operation presented in equation (5) can provide the attitude rotation matrix. The eigenvector matrix R is the attitude rotation matrix from the sensor coordinate system to the body coordinate system of the extracted rectangular plane. It is assumed in this paper that the sensor coordinate system is the same as the chaser body coordinate system.

Thus, the quaternion of the body coordinate system with respect to the chaser body coordinate system, which is denoted as q_rc, can be calculated as

q_{rc0} = \frac{1}{2} \left( 1 + R_{xx} + R_{yy} + R_{zz} \right)^{1/2},
q_{rc1} = \frac{1}{4 q_{rc0}} \left( R_{yz} - R_{zy} \right),
q_{rc2} = \frac{1}{4 q_{rc0}} \left( R_{zx} - R_{xz} \right),
q_{rc3} = \frac{1}{4 q_{rc0}} \left( R_{xy} - R_{yx} \right), (20)

where R_{xx}, R_{yy}, R_{zz}, R_{yz}, R_{zy}, R_{zx}, R_{xz}, R_{xy}, and R_{yx} are the elements of the attitude rotation matrix R.
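Equation (20) can be transcribed directly; a sketch in Python (scalar-first quaternion; valid when 1 + R_{xx} + R_{yy} + R_{zz} > 0, i.e. the rotation angle is below 180 degrees — a numerically robust implementation would branch on the largest diagonal element):

```python
import numpy as np

def quat_from_rotation(R):
    """Equation (20): scalar-first quaternion [q0, q1, q2, q3] from the
    attitude rotation matrix R (direction cosine matrix convention)."""
    q0 = 0.5 * np.sqrt(1.0 + R[0, 0] + R[1, 1] + R[2, 2])
    q1 = (R[1, 2] - R[2, 1]) / (4.0 * q0)
    q2 = (R[2, 0] - R[0, 2]) / (4.0 * q0)
    q3 = (R[0, 1] - R[1, 0]) / (4.0 * q0)
    return np.array([q0, q1, q2, q3])
```

For example, the direction cosine matrix of a 90-degree rotation about the Z-axis maps to the quaternion [√2/2, 0, 0, √2/2].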

If the eigenvalues of J are arranged as the middle, the largest, and the smallest along the diagonal of Λ, there are 4 cases for establishing the body coordinate system, depending on the axial directions, due to the symmetric structure of the extracted rectangular plane. The 4 cases are shown in Figure 2.

It is supposed that case (1) is the coordinate system at the present time (see Figure 2). Any one of the 4 cases of the body coordinate system may appear during the next matrix decomposition operation. The relationship between the target attitude q_ti and the attitude quaternion q_rc can be described as

q_{ti} = q_{ci} q_{rc} q_{tr}, (21)

where q_{ci} is the attitude quaternion of the chaser satellite and q_{tr} is the quaternion from the body coordinate system of the extracted rectangular plane to the target body coordinate system, which is known.

Because the target attitude quaternion q_ti can be predicted in the filter designed in Section 3, the predicted \hat{q}_{rc} can be derived as

\hat{q}_{rc} = q_{ic} \hat{q}_{ti} q_{rt}. (22)

The predicted \hat{q}_{rc} can be utilized to solve the multisolution problem presented in Figure 2. Define the difference quaternion between the predicted \hat{q}_{rc} and the measured q_{rc} as

\delta q_{rc} = \hat{q}_{rc}^{-1} q_{rc}. (23)



If the scalar part of δq_rc is close to 1, case (1) occurs. If the scalar part of δq_rc is close to 0, one of cases (2)~(4) occurs, and the body coordinate system of the extracted plane needs to be changed to case (1) as

\bar{q}_{rc} = q_{rc} q', (24)

where \bar{q}_{rc} is the adjusted quaternion and q' is the quaternion utilized to adjust the attitude quaternion q_{rc}, satisfying

q' = u \cdot \sin \frac{\theta}{2} + \cos \frac{\theta}{2}, (25)

where u is the direction of the Euler axis and θ is equal to 180 degrees.

To change cases (2), (3), and (4) to case (1), the coordinate system should be rotated 180 degrees around the X-, Y-, and Z-axes, respectively. The corresponding quaternions are q' = [0 0 1 0]^T, q' = [1 0 0 0]^T, and q' = [0 1 0 0]^T, respectively.

If there are 2 or more extracted rectangular planes in a single measurement, a similar problem exists. Suppose there are 2 extracted rectangular planes whose attitude quaternions are denoted as q_{rc1} and q_{rc2}, respectively. The corresponding target attitudes q_{ti1} and q_{ti2} can be derived according to equation (21), and the difference quaternion between q_{ti1} and q_{ti2} can be obtained by using equation (23). If the scalar part of the difference quaternion is close to 0, the operation presented in equations (24) and (25) should be performed.
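The disambiguation logic of equations (23)-(25) can be sketched as follows. This is our own illustration, not the authors' code: we use scalar-first Hamilton quaternions, where the three 180-degree corrections are [0, 1, 0, 0], [0, 0, 1, 0], and [0, 0, 0, 1] (the paper's component ordering for q' may differ), and we simply pick the correction whose result best aligns with the filter prediction:

```python
import numpy as np

def quat_mul(p, q):
    """Hamilton product of scalar-first quaternions [q0, q1, q2, q3]."""
    p0, pv = p[0], p[1:]
    q0, qv = q[0], q[1:]
    return np.concatenate([[p0 * q0 - pv @ qv],
                           p0 * qv + q0 * pv + np.cross(pv, qv)])

def quat_conj(q):
    return np.array([q[0], -q[1], -q[2], -q[3]])

# 180-degree rotations about X, Y, Z (scalar-first), per equation (25)
CASE_CORRECTIONS = [np.array([0., 1., 0., 0.]),
                    np.array([0., 0., 1., 0.]),
                    np.array([0., 0., 0., 1.])]

def disambiguate(q_meas, q_pred):
    """Resolve the four-fold ambiguity of Figure 2: form the difference
    quaternion with the prediction (eq. (23)); keep the candidate whose
    scalar part is largest in magnitude (close to 1 means case (1))."""
    best = q_meas
    for qc in CASE_CORRECTIONS:
        cand = quat_mul(q_meas, qc)  # equation (24)
        if abs(quat_mul(quat_conj(q_pred), cand)[0]) > \
           abs(quat_mul(quat_conj(q_pred), best)[0]):
            best = cand
    return best
```

If the measurement already matches the prediction, it is returned unchanged; a flipped case is mapped back onto the predicted orientation (up to quaternion sign, since q and −q represent the same rotation).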

3. Estimation of the Rotational and Translational Parameters

3.1. State Equations. When two satellites are nearly in the same circular orbit at a close distance, the Hill equations can perfectly describe the translational motion dynamics as

\begin{bmatrix} \dot{x} \\ \dot{\upsilon} \end{bmatrix} =
\begin{bmatrix} 0_{3\times3} & I_{3\times3} \\ F_1 & F_2 \end{bmatrix}
\begin{bmatrix} x \\ \upsilon \end{bmatrix} +
\begin{bmatrix} 0_{3\times3} \\ I_{3\times3} \end{bmatrix} w_1, (26)

where

F_1 = \begin{bmatrix} 0 & 0 & 0 \\ 0 & -n^2 & 0 \\ 0 & 0 & 3n^2 \end{bmatrix},
F_2 = \begin{bmatrix} 0 & 0 & 2n \\ 0 & 0 & 0 \\ -2n & 0 & 0 \end{bmatrix}, (27)

n is the average orbital angular rate of the malfunctioned satellite, and x and υ are the relative position and velocity between the target and the chaser, which

Figure 2: Description of the multisolution problem (the four possible axis assignments of the extracted plane's body coordinate system).



is described in the target orbital frame. w_1 is the disturbance force on a unit mass, which is modeled as Gaussian white noise.
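Equations (26)-(27) can be sketched numerically as follows. The function names and the simple Euler integrator are our own, and the disturbance w_1 is omitted:

```python
import numpy as np

def hill_state_matrices(n):
    """Continuous-time system matrix of equation (26), with F1 and F2 from
    equation (27); the state is [x, v] in the target orbital frame and n is
    the orbital angular rate."""
    F1 = np.array([[0.0, 0.0, 0.0],
                   [0.0, -n ** 2, 0.0],
                   [0.0, 0.0, 3 * n ** 2]])
    F2 = np.array([[0.0, 0.0, 2 * n],
                   [0.0, 0.0, 0.0],
                   [-2 * n, 0.0, 0.0]])
    return np.block([[np.zeros((3, 3)), np.eye(3)],
                     [F1, F2]])

def propagate(state, n, dt, steps):
    """Euler propagation of the unforced Hill dynamics (w1 = 0)."""
    A = hill_state_matrices(n)
    for _ in range(steps):
        state = state + dt * (A @ state)
    return state
```

A pure along-track offset with zero relative velocity — the paper's scenario of a chaser about 25 m behind the target — is an equilibrium of these dynamics, so it propagates unchanged.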

As the target satellite is a malfunctioned satellite, its angular velocity cannot be obtained directly. The attitude kinematics and dynamics can be written as

\dot{q}_t = \frac{1}{2} q_t \otimes \bar{\omega},
\dot{\omega}_t = \bar{I}_t^{-1} \left( -\omega_t \times \bar{I}_t \omega_t \right) + \bar{I}_t^{-1} w_2, (28)

where q_t is the target attitude quaternion, \bar{\omega} = [0 \; \omega_t^T]^T, \omega_t is the target angular velocity, and \bar{I}_t is the moment-of-inertia ratio matrix, which can be described as

\bar{I}_t = \begin{bmatrix}
1 & I_{xy}/I_{xx} & I_{xz}/I_{xx} \\
I_{xy}/I_{xx} & I_{yy}/I_{xx} & I_{yz}/I_{xx} \\
I_{xz}/I_{xx} & I_{yz}/I_{xx} & I_{zz}/I_{xx}
\end{bmatrix}
= \begin{bmatrix}
1 & \bar{I}_{xy} & \bar{I}_{xz} \\
\bar{I}_{xy} & \bar{I}_{yy} & \bar{I}_{yz} \\
\bar{I}_{xz} & \bar{I}_{yz} & \bar{I}_{zz}
\end{bmatrix}, (29)

and w_2 is the ratio of the disturbing torque to I_{xx}, which is modeled as Gaussian white noise.
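The two parts of equation (28) can be sketched as follows (our own function names; scalar-first quaternions; torque-free case with w_2 = 0). Note that scaling the inertia matrix by 1/I_{xx} leaves the torque-free Euler equation unchanged, which is why only the ratios are observable:

```python
import numpy as np

def quat_kinematics(q, w):
    """Equation (28): qdot = 0.5 * q ⊗ [0, w], scalar-first Hamilton product."""
    q0, qv = q[0], q[1:]
    return 0.5 * np.concatenate([[-qv @ w], q0 * w + np.cross(qv, w)])

def omega_dynamics(w, It):
    """Equation (28): wdot = It^{-1} (-w × It w), with It the normalized
    moment-of-inertia ratio matrix of equation (29) (w2 = 0)."""
    return np.linalg.solve(It, -np.cross(w, It @ w))
```

A spin about a principal axis gives zero angular acceleration, while the quaternion rate for the identity attitude reduces to half the angular velocity in the vector part.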

3.2. Measurement Equations. The geometrical relationship between the two satellites is described in Figure 3. O_c x_c y_c z_c is the chaser body coordinate system. r_m^k is the location of the geometric centroid of an extracted rectangular plane, expressed in the chaser body coordinate system, where k denotes the sequence number of the extracted rectangular plane. O_r x_r y_r z_r is the target nominal body coordinate system. r_b^k is the location of the geometric centroid of the extracted rectangular plane in the frame O_r x_r y_r z_r. Because the target is a malfunctioned satellite, the satellite structure model is known; therefore, the relative pose between the extracted rectangular plane and the frame O_r x_r y_r z_r is known, so r_b^k is a known value. Because the mass center changes due to fuel consumption, the location error of the mass center is denoted as b^k. O_t x_t y_t z_t is the target body coordinate system, whose origin O_t is at the true mass center, and its axes x_t, y_t, and z_t are parallel to the axes x_r, y_r, and z_r, respectively. x is the displacement vector from the target mass center to the chaser mass center, expressed in the target orbital frame. The origin of the orbital frame lies in the mass center of the service satellite; the x_o axis is in the flight direction, the z_o axis is in the direction of the Earth center, and the y_o axis, which completes the right-handed orthogonal coordinate system, is in the opposite direction of the angular momentum of the orbit.

According to the geometrical relationship presented in Figure 3, the displacement x can be described as

x = R_t^{ot} \left( r_b^k + b^k \right) - R_{oc}^{ot} R_c^{oc} \left( r_m^k + \upsilon_k \right), (30)

where R_t^{ot} is the attitude rotation matrix from the frame O_t x_t y_t z_t to the target orbital frame, R_{oc}^{ot} is the rotation matrix from the chaser orbital frame to the target orbital frame, R_c^{oc} is the rotation matrix from the frame O_c x_c y_c z_c to the chaser orbital frame, which can be determined by the chaser attitude determination system, and υ_k is the observation noise of the geometric centroid of the extracted rectangular plane.

Because the two satellites are nearly in the same circular orbit at a close distance, the rotation matrix R_{oc}^{ot} can be regarded as the unit matrix [18]:

R_{oc}^{ot} \approx I. (31)

Therefore, equation (30) can be rewritten as

R_{oc}^{ot} R_c^{oc} r_m^k \approx R_c^{oc} r_m^k \approx -x + R_t^{oc} r_b^k + R_t^{oc} b^k - R_c^{oc} \upsilon_k = -x + R_i^{oc} R_t^i r_b^k + R_i^{oc} R_t^i b^k + \upsilon_k, (32)

where R_i^{oc} is the rotation matrix from the inertial frame to the chaser orbital frame, which can be determined according to the orbit information, and R_t^i is the target attitude matrix.

It is obvious that b^k is a constant vector and can be modeled as

\dot{b}^k = 0. (33)

As the target is a malfunctioned satellite, which is noncooperative, no information including the attitude parameters

Figure 3: Geometrical relationship between two satellites.



can be provided to the service spacecraft. An available way to measure the target attitude is to combine the relative attitude and the chaser attitude. According to equation (21), the attitude measurement is

\tilde{q}_{ti} = q_{ci} \left( q_{rc} \eta_k \right) q_{tr}, (34)

where η_k is the observation noise.

A set of feature rectangles is tracked by the observer. As the target tumbles in orbit, new rectangular features are observed and must be added to the state vector, while some features disappear and can only be propagated in the filter. This strategy has been proven effective in literatures [6, 7, 19].

3.3. EKF Filter Design. Define the state variables as

X = \left[ x^T \; \upsilon^T \; \bar{I}^T \; q_t^T \; \omega_t^T \; b^T \right]^T, (35)

where \bar{I} = [\bar{I}_{yy}, \bar{I}_{zz}, \bar{I}_{xy}, \bar{I}_{xz}, \bar{I}_{yz}]^T and b denotes the location errors of all the tracked feature rectangles.

The dynamic equations are made up of equations (26), (28), and (33). Due to the nonlinearity of the attitude kinematics and dynamic equations, a linearization operation can be performed. Then, the new state vector is described as

\tilde{X} = \left[ \delta x^T \; \delta\upsilon^T \; \delta\bar{I}^T \; \delta q_{t,13}^T \; \delta\omega_t^T \; \delta b^T \right]^T, (36)

where \delta x = x - \hat{x}, \delta\upsilon = \upsilon - \hat{\upsilon}, \delta b = b - \hat{b}, \delta\bar{I} = \bar{I} - \hat{\bar{I}}, \delta\omega_t = \omega_t - \hat{\omega}_t, \delta q_t = \hat{q}_t^{-1} \otimes q_t, and \delta q_t = [1 \; \delta q_{t,13}^T]^T. Because \delta q_0 is not an independent variable and has only second-order variations, \delta q_0 is ignored in the linearization procedure.

Define the state transfer matrix \Phi_{k/k-1}:

\Phi_{k/k-1} = I + A T, (37)

where T is the discrete time step and A is the Jacobian matrix of partial derivatives of the dynamic equations with respect to the state variables X, that is,

A = \begin{bmatrix}
0_{3\times3} & I_{3\times3} & 0_{3\times5} & 0_{3\times3} & 0_{3\times3} & 0_{3\times3m} \\
F_1 & F_2 & 0_{3\times5} & 0_{3\times3} & 0_{3\times3} & 0_{3\times3m} \\
0_{5\times3} & 0_{5\times3} & 0_{5\times5} & 0_{5\times3} & 0_{5\times3} & 0_{5\times3m} \\
0_{3\times3} & 0_{3\times3} & 0_{3\times5} & -[\hat{\omega}_t \times] & 0.5 I_{3\times3} & 0_{3\times3m} \\
0_{3\times3} & 0_{3\times3} & F_4 & 0_{3\times3} & F_3 & 0_{3\times3m} \\
0_{3m\times3} & 0_{3m\times3} & 0_{3m\times5} & 0_{3m\times3} & 0_{3m\times3} & 0_{3m\times3m}
\end{bmatrix}, (38)

where F_1 and F_2 are defined in Section 3.1, F_3 = \hat{\bar{I}}_t^{-1} \left( [\hat{\bar{I}}_t \hat{\omega}_t \times] - [\hat{\omega}_t \times] \hat{\bar{I}}_t \right), and

F_4 = \hat{\bar{I}}_t^{-1} \begin{bmatrix}
0 & 0 & m_2 & m_3 & 0 \\
m_2 & 0 & m_1 & 0 & m_3 \\
0 & m_3 & 0 & m_1 & m_2
\end{bmatrix}
- [\hat{\omega}_t \times] \begin{bmatrix}
0 & 0 & \hat{\omega}_t^2 & \hat{\omega}_t^3 & 0 \\
\hat{\omega}_t^2 & 0 & \hat{\omega}_t^1 & 0 & \hat{\omega}_t^3 \\
0 & \hat{\omega}_t^3 & 0 & \hat{\omega}_t^1 & \hat{\omega}_t^2
\end{bmatrix}, (39)

[m_1 \; m_2 \; m_3]^T = \hat{\bar{I}}_t^{-1} [\hat{\omega}_t \times] \hat{\bar{I}}_t \hat{\omega}_t, and m is the number of tracked features.

Similarly, in equations (32) and (34), a linearization operation is also performed as

Z_{1,k} = R_c^{oc} r_m^k + \hat{x} - R_i^{oc} \hat{R}_t^i r_b^k - R_i^{oc} \hat{R}_t^i \hat{b}^k = -\delta x + R_i^{oc} \hat{R}_t^i \delta b^k - \left( R_i^{oc} \hat{R}_t^i [r_b^k \times] + R_i^{oc} \hat{R}_t^i [\hat{b}^k \times] \right) \delta q_{t,13} + \upsilon_k,

Z_{2,k} = \left[ \hat{q}_t^{-1} q_{ci} \left( q_{rc} \eta_k \right) q_{tr} \right]_{13} = \delta q_{t,13} + \eta_k. (40)

Therefore, Z = \left[ Z_{1,1}^T \; Z_{2,1}^T \; \cdots \; Z_{1,k}^T \; Z_{2,k}^T \; \cdots \right]^T and \varepsilon = \left[ \upsilon_1^T \; \eta_1^T \; \cdots \; \upsilon_k^T \; \eta_k^T \; \cdots \right]^T, and the observation matrix can be defined as

H = \begin{bmatrix}
-I_{3\times3} & 0_{3\times3} & 0_{3\times5} & -\left( R_i^{oc} \hat{R}_t^i [r_b^1 \times] + R_i^{oc} \hat{R}_t^i [\hat{b}^1 \times] \right) & 0_{3\times3} & R_i^{oc} \hat{R}_t^i \;\; 0_{3\times3(m-1)} \\
0_{3\times3} & 0_{3\times3} & 0_{3\times5} & I_{3\times3} & 0_{3\times3} & 0_{3\times3} \;\; 0_{3\times3(m-1)} \\
\vdots & \vdots & \vdots & \vdots & \vdots & \vdots \\
-I_{3\times3} & 0_{3\times3} & 0_{3\times5} & -\left( R_i^{oc} \hat{R}_t^i [r_b^k \times] + R_i^{oc} \hat{R}_t^i [\hat{b}^k \times] \right) & 0_{3\times3} & 0_{3\times3(k-1)} \;\; R_i^{oc} \hat{R}_t^i \;\; 0_{3\times3(m-k)} \\
0_{3\times3} & 0_{3\times3} & 0_{3\times5} & I_{3\times3} & 0_{3\times3} & 0_{3\times3(k-1)} \;\; 0_{3\times3} \;\; 0_{3\times3(m-k)} \\
\vdots & \vdots & \vdots & \vdots & \vdots & \vdots
\end{bmatrix}. (41)



The detailed procedure of the EKF can be summarized as follows:

(1) Initialization:

\hat{X}_0 = E[X_0],
P_0 = E\left[ \left( X_0 - E[X_0] \right) \left( X_0 - E[X_0] \right)^T \right]. (42)

(2) Time update:

\hat{X}_{k/k-1} = f\left( \hat{X}_{k-1}, w_{k-1} \right),
P_{k/k-1} = \Phi_{k/k-1} P_{k-1} \Phi_{k/k-1}^T + Q_{k-1}. (43)

(3) Measurement update:

K_k = P_{k/k-1} H_k^T \left( H_k P_{k/k-1} H_k^T + R_k \right)^{-1},
\delta\hat{X}_k = K_k Z,
P_k = \left( I - K_k H_k \right) P_{k/k-1}. (44)

The estimated states \hat{x}, \hat{\upsilon}, \hat{b}, \hat{q}_k, \hat{\omega}_k, and \hat{\bar{I}}_k can be updated immediately after the measurement update.
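The cycle of equations (43)-(44) can be sketched generically in Python. This is our own skeleton, not the paper's full filter with its 17 + 3m states and error-quaternion bookkeeping; f, F, h, and H stand for the state transition, its Jacobian Φ, the measurement model, and the observation matrix:

```python
import numpy as np

def ekf_step(x, P, z, f, F, h, H, Q, R):
    """One EKF cycle per equations (43)-(44): time update with transition f
    and Jacobian F, then measurement update with model h and Jacobian H."""
    # time update, equation (43)
    x = f(x)
    P = F @ P @ F.T + Q
    # measurement update, equation (44)
    S = H @ P @ H.T + R                 # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)      # Kalman gain
    x = x + K @ (z - h(x))              # state correction
    P = (np.eye(len(x)) - K @ H) @ P    # covariance update
    return x, P
```

For a scalar constant state with unit prior and unit measurement noise, one cycle halves the covariance and moves the estimate halfway toward the measurement, as expected from the gain K = 1/2.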

4. Simulation and Evaluation

In order to evaluate the effectiveness of extracting a point cloud plane, a rectangular point cloud plane is simulated by the simulation software. The point cloud is randomly generated obeying a uniform distribution in a rectangular plane, which ranges from -5 cm to 5 cm along the X-axis, ranges from -9 cm to 9 cm along the Y-axis, and is fixed at 6.5 cm along the Z-axis. It is also simulated that there are 2500 points including 100, 500, and 1250 outliers, respectively, in the rectangular plane. The results are shown in Figures 4-7.

In this simulation, the repeat number is set to 300 times, and the distance threshold is 0.005. The extraction results are shown in Table 1.

It can be seen in Table 1 that the plane can be extracted exactly from the point cloud with outliers. However, the processing speed of the plane extraction slows down as the number of outliers increases.

The number of points in the point cloud declines as the distance between the target and the sensor increases. In order to validate the effectiveness of the rectangular feature recognition formula as the number of points varies, the number of points in the rectangular plane is set to 10000, 5000, 2500, 1250, and 625, respectively. The geometric centroid and the point cloud distribution matrix of the extracted plane are calculated, and the three eigenvalues

Figure 4: The original point cloud.
Figure 5: Point cloud with 100 outliers.
Figure 6: Point cloud with 500 outliers.
Figure 7: Point cloud with 1250 outliers.


are obtained by using the matrix decomposition operation. The theoretical value of the geometric centroid is (0, 0, 6.5), and the theoretical eigenvalues are 8.3333, 27, and 0, respectively. The computed geometric centroids and eigenvalues of the rectangular point cloud are presented in Table 2. It can be seen that neither the geometric centroid nor the eigenvalues are obviously affected by the number of points.
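The theoretical eigenvalues follow from the variance of a uniform distribution: for half-widths a and b the in-plane variances are a²/3 and b²/3, which for a = 5 cm and b = 9 cm gives 8.3333 and 27, with the third eigenvalue zero because the points are coplanar. The feature computation can be sketched as:

```python
import numpy as np

def plane_features(points):
    """Geometric centroid plus eigen-decomposition of the point cloud
    distribution (covariance) matrix; the eigenvalues identify the
    rectangle, the eigenvectors give the plane's orientation."""
    centroid = points.mean(axis=0)
    d = points - centroid
    C = d.T @ d / len(points)
    eigvals, eigvecs = np.linalg.eigh(C)  # eigenvalues in ascending order
    return centroid, eigvals, eigvecs
```

Running this on a densely sampled rectangle recovers the centroid (0, 0, 6.5) and eigenvalues close to (0, 8.3333, 27), matching Table 2.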

Then, random noise of 0.01 cm, 0.1 cm, and 1 cm is added to the rectangular point cloud plane containing 2500 points. It can be seen from Table 3 that the location error does not increase obviously as the added random noise grows, because the number of points is large enough.

The theoretical eigenvalues and the eigenvalues computed by the matrix decomposition operation are presented in Table 4. The computed eigenvalues are nearly identical to the theoretical ones. Therefore, the effectiveness of the proposed recognition algorithm is validated by Tables 3 and 4.

In order to validate the effectiveness of the relative navigation method, digital simulation software has been written in VC++. In this simulation, it is assumed that the target and the chaser are in the same orbit with different orbital phases, with the chaser located about 25 m behind the target satellite. The inertia matrix of the target satellite is set to I_t = diag(30, 15, 20) kg·m². The initial angular velocity of the target is [0, 0, 1] °/s. The point cloud data is randomly simulated using a uniform distribution on the target surface with added Gaussian noise. The number of points is 1800, and the standard deviation of the Gaussian error is [0.1, 0.1, 0.1] m. The true mass-center location error b is set to [0.1, 0.1, 0.1] m.

The PCL (Point Cloud Library) is employed to extract planes. The proposed method is then used to recognize the rectangular feature plane and to determine the relative position and attitude between that plane and the chaser. The relative position and attitude errors are shown in Figures 8-10.

It can be seen from Figure 8 that the accuracy of the relative position is better than 0.01 m, with triaxial maximum errors of about 0.01 m on each axis. The attitude error with the multisolution problem is shown in Figure 9, and the attitude error after removing the multisolution problem is shown in Figure 10. When the multisolution problem occurs, the error jumps to about 180°; no such phenomenon appears in Figure 10. Therefore, the method proposed in this paper solves the attitude multisolution problem caused by the symmetry of the rectangular plane. The attitude error presented in Figure 10 is better than 0.2°, with triaxial maximum errors of about 0.13°, 0.12°, and 0.16°.

In the relative navigation filter, x(0) is set to the first observation of the relative position, and υ(0) is set to 0. q_t(0) is set to the first observation of the target attitude, and ω_t(0) is set to the target angular velocity corrupted by initial errors. The covariance matrix is P_0 = diag(σ²_x, σ²_υ, σ²_b, σ²_{qt,13}, σ²_{ωt}, σ²_I), where σ²_x = [1², 1², 1²] m², σ²_υ = [0.1², 0.1², 0.1²] (m/s)², σ²_b = [0.1², 0.1², 0.1²] m², σ²_{qt,13} = [0.1², 0.1², 0.1²] rad², σ²_{ωt} = 10⁻⁴ × [1², 1², 1²] (rad/s)², and σ²_I = [0.1², 0.1², 0.1², 0.1², 0.1²]. The system noise is zero-mean Gaussian, and the system noise variance matrix is Q = 10⁻¹² I_{6×6}.

It can be seen from Figures 11-13 that not only the relative position and velocity but also the mass-center location error of the malfunctioned satellite can be estimated. According to the simulation curves, the relative position and velocity errors converge in a very short time, and the fluctuation after convergence is very small. The accuracy of the three axis positions is better than 1.4 mm, 3.2 mm, and 1.4 mm. The triaxial velocity

Table 2: The geometric centroid and eigenvalues.

Points    Geometric centroid            Eigenvalues
10000     (0.0154, -0.0390, 6.5000)     (8.4102, 26.7072, 0)
5000      (-0.0079, -0.0090, 6.5000)    (8.3106, 27.3320, 0)
2500      (-0.0801, -0.0033, 6.5000)    (8.5595, 26.7923, 0)
1250      (0.1542, -0.0438, 6.5000)     (8.4415, 26.4924, 0)
625       (0.0906, -0.3373, 6.5000)     (8.2372, 27.7920, 0)

Table 3: The location error of the geometric centroid.

Noise (cm)    Geometric centroid (cm)       Location error (cm)
0             (-0.0801, -0.0033, 6.5000)    (-0.0801, -0.0033, 0)
0.01          (0.0817, -0.0833, 6.4999)     (0.0817, -0.0833, 0.0001)
0.1           (0.0541, 0.0661, 6.4984)      (0.0541, 0.0661, 0.0016)
1             (-0.0261, 0.0210, 6.5254)     (-0.0261, 0.0210, 0.0254)

Table 4: The eigenvalues.

Noise (cm)    Eigenvalues (simulation)      Eigenvalues (theory)
0             (8.5595, 26.7923, 0)          (8.3333, 27, 0)
0.01          (8.5898, 26.9769, 0.0001)     (8.3334, 27.0001, 0.0001)
0.1           (8.3124, 26.9526, 0.0103)     (8.3433, 27.01, 0.01)
1             (9.4664, 27.7494, 0.9765)     (9.3333, 28, 1)

Table 1: Plane extraction from the point cloud.

Outliers    Extracted points    Time (ms)
0           2500                20
100         2400                20
500         2000                30
1250        1252                70


error is better than 0.01 mm/s on each axis, and the estimation error of the mass-center location is less than 1 mm on each axis.

The estimation errors of the attitude, the angular velocity, and the moment-of-inertia ratios of the target satellite are presented in Figures 14-16. There are some fluctuations of

Figure 8: Position errors.
Figure 9: Attitude errors with multisolution.


the malfunctioned satellite attitude, but the accuracy is better than 0.1°, and the angular velocity error is better than 0.01°/s. The moment-of-inertia ratios also converge to the true values over time.

Figure 10: Attitude errors after removing multisolution.
Figure 11: Relative position errors.
Figure 12: Relative velocity errors.
Figure 13: Location errors of the mass center.
Figure 14: Attitude errors of the target.
Figure 15: Angular velocity errors of the target.

5. Conclusion

A new method is proposed in this paper to estimate the target pose by utilizing local rectangular feature planes when the whole satellite cannot be observed. The plane extraction method, the rectangular feature recognition method, and the solution to the multisolution problem are presented. Finally, an extended Kalman filter is designed to estimate the relative position and attitude of the target satellite. This paper therefore provides a feasible relative navigation algorithm for rendezvous and docking with malfunctioned satellites.

Data Availability

The data used to support the findings of this study are included within the article. The data is generated by the simulation software described in Section 4, under the simulation conditions detailed in this manuscript. The simulation software is available from the corresponding author upon request.

Conflicts of Interest

The authors declare that there is no conflict of interest regarding the publication of this paper.

Acknowledgments

This study was supported by the Fundamental Research Funds for the Central Universities, No. NS2016084. The authors fully appreciate the financial support.

Figure 16: Moment-of-inertia ratio errors.
