
Received: 25 February 2018 Revised: 19 October 2018 Accepted: 29 October 2018

DOI: 10.1002/navi.298

ORIGINAL ARTICLE

Probabilistic graphical fusion of LiDAR, GPS, and 3D building maps for urban UAV navigation

Derek Chen¹, Grace Xingxin Gao²

¹Aerospace Engineering, University of Illinois at Urbana-Champaign, Urbana, Illinois
²Aerospace Department, University of Illinois at Urbana-Champaign, Champaign, Illinois

Correspondence
Grace Xingxin Gao, Aerospace Department, University of Illinois at Urbana-Champaign, United States.
Email: [email protected]

Abstract

Because of the recent interest in unmanned air vehicle (UAV) commercialization, there is a great need for navigation algorithms that provide accurate and robust positioning in urban environments that are often Global Positioning System (GPS) challenged or denied. In this paper, we present a probabilistic graph-based navigation algorithm resilient to GPS errors. Fusing GPS pseudorange and Light Detection and Ranging (LiDAR) odometry measurements with 3D building maps, we apply a batch estimation approach to generate a robust trajectory estimate and maps of the surrounding environment. We then leverage the maps to locate potential sources of GPS multipath and mitigate the effects of degraded pseudorange measurements on the trajectory estimate. We experimentally validate our results with flight tests conducted in GPS-challenged and GPS-denied environments.

1 INTRODUCTION

In recent years, unmanned air vehicles (UAVs) have been sought for numerous commercial applications,1,2 including consumer package delivery, aerial photography,3 infrastructure inspection,4,5 and emergency first response.6,7

In open-sky environments, Global Positioning System (GPS) receivers are most commonly used to provide accurate and globally referenced positioning for UAVs. However, many applications, such as consumer product delivery, require UAVs to operate in urban environments. In such environments, buildings and structures reflect and block GPS signals, causing multipath and low satellite visibility. These factors result in large positioning errors or cause GPS to be unavailable. Before widespread commercial adoption of these applications occurs, UAV navigation must be made safe and reliable.

In this paper, we present a sensor fusion algorithm based on a probabilistic graph model using GPS and Light Detection and Ranging (LiDAR) measurements with map data for urban UAV navigation. Our navigation algorithm is structured around odometry generated by LiDAR paired with compass measurements. To overcome drift and bias errors, GPS pseudorange measurements and LiDAR map matching are applied to anchor the LiDAR odometry-derived trajectory to the global frame. As the UAV navigates, it maintains maps of its surrounding environment, initialized from 3D building data. With each LiDAR scan, the UAV updates its maps and optimizes its trajectory states using a probabilistic graph approach. The updated environment map is then used to mitigate multipath and non-line-of-sight (NLOS) errors in the GPS measurements. This work is based on the master's thesis of the first author.8 The main contributions of this paper are as follows:

1. A probabilistic graph approach to fuse GPS pseudorange and LiDAR odometry measurements, aided by 3D building model maps, for UAV trajectory estimation and mapping.

2. An evaluation of the algorithm for urban UAV navigation through a series of flight tests in GPS-challenged and GPS-denied environments.


2 RELATED WORK

Approaches for mitigating urban GPS measurement errors range from GPS sensor fusion to aiding GPS with information from environment maps.

2.1 GPS sensor fusion

GPS receivers are often paired with other navigation sensors to improve positioning accuracy and availability. One of the most common such sensors is the inertial navigation system (INS).9,10 GPS with INS aiding has been applied, along with other sensors, to multiple vehicular11,12 and UAV13 applications and has been shown to provide position estimates where GPS is intermittently unavailable. Khaghani et al have demonstrated that fusing GPS/INS with a vehicle dynamic model leads to major improvements in navigation accuracy during periods of GPS outages.14

Klein et al rely on a navigation solution and use pseudorange measurements as aiding to navigate through periods of GPS outages.12 However, GPS/INS is still subject to growing drift and biases over long periods of GPS outages or degraded GPS. Environment-observing sensors such as LiDAR15-17 have been used to generate odometry measurements by measuring changes in the observed environment. Dill et al have shown that integrating LiDAR with INS can obtain globally anchored navigation solutions with meter-level accuracy when GPS is available.18 Soloviev has demonstrated that 2D line LiDAR scan matching can be paired with GPS for navigating in environments where GPS navigation alone can be difficult.16 More recently, these sensors have been combined in Simultaneous Localization and Mapping (SLAM) approaches,19-25 most commonly developed for indoor and GPS-denied navigation. While SLAM techniques alone allow users to navigate by observing environmental changes, long-term positioning often suffers from drift and biases. By incorporating GPS, the positions and maps generated by the SLAM algorithms are anchored to the global frame.26 Gao et al implemented fusion of INS/GPS/LiDAR measurements for urban and indoor navigation using a hybrid scan matching algorithm.17 Shepard et al implemented a navigation system combining Parallel Tracking and Mapping (PTAM)-processed camera measurements with carrier-phase differential GPS.26

2.2 Improving urban GPS with environment mapping

Recently, environment observation and modeling approaches have been used to mitigate multipath and NLOS errors in GPS measurements and to improve accuracy and availability in urban environments.

Initial approaches applied upwards-facing omnidirectional cameras to characterize the city skyline surrounding a GPS receiver's antenna and reject occluded signals.27,28

Zribi et al combined GPS measurements and odometric data with a digital road map to improve 2D vehicle positioning.29 With the recent availability of more accurate three-dimensional (3D) mapping, 3D building models have been leveraged to model environmental features that occlude or degrade GPS signal measurements.30 Groves et al apply shadow matching,31,32 a technique where 3D building models are used to predict the visibility of satellites, and match actual satellite visibility to localize the receiver.33 Yozevitch et al have applied a particle filter to shadow matching for 3D positioning of UAVs in urban regions.34 Gu et al applied ray tracing of the GPS signals reflected off of nearby buildings and incorporated them as reflected pseudoranges into the navigation solution.35

Miura et al then extended ray tracing to correct and update the 3D building model map.36 Ng et al apply 3D map knowledge and treat NLOS satellites as virtual satellites in their Direct Position Estimation approach, comparing expected signal reception and finding the highest correlation.37 Gentner et al treat multipath components from terrestrial radio positioning sources as line-of-sight (LOS) signals from virtual transmitters, improving the estimate of the receiver position while localizing the potential sources of multipath.38-40

3 UAV SYSTEM AND SENSOR MEASUREMENT MODELING

In this section, we introduce the UAV model and the various sensors used for urban navigation.

3.1 UAV trajectory modeling

We define a UAV trajectory as a set of UAV states:

$$X = \{x_0, x_1, x_2, \ldots, x_K\}. \tag{1}$$

Each UAV state, $x_k$, is denoted as the position and velocity of the UAV, in the earth-centered earth-fixed (ECEF) coordinate frame, along with its GPS receiver clock biases:

$$x_k = \begin{bmatrix} x_k & y_k & z_k & \dot{x}_k & \dot{y}_k & \dot{z}_k & b_k & \dot{b}_k \end{bmatrix}^T. \tag{2}$$

That is,

$$x_k = \begin{bmatrix} x_{k_{pos}} & x_{k_{vel}} & b_k & \dot{b}_k \end{bmatrix}^T, \tag{3}$$

where

$x_{k_{pos}}$ : position of the UAV in ECEF
$x_{k_{vel}}$ : velocity of the UAV in ECEF
$b_k$ : GPS receiver clock bias
$\dot{b}_k$ : GPS receiver clock bias drift rate.


The GPS clock bias and clock drift are included in the UAV state vector to account for the difference between the receiver clock and the GPS satellite clocks. To propagate between sequential states, a constant velocity model describes the dynamics of the UAV. The state transition matrix of the UAV is defined as follows:

$$H = \begin{bmatrix}
1 & 0 & 0 & dt & 0 & 0 & 0 & 0 \\
0 & 1 & 0 & 0 & dt & 0 & 0 & 0 \\
0 & 0 & 1 & 0 & 0 & dt & 0 & 0 \\
0 & 0 & 0 & 1 & 0 & 0 & 0 & 0 \\
0 & 0 & 0 & 0 & 1 & 0 & 0 & 0 \\
0 & 0 & 0 & 0 & 0 & 1 & 0 & 0 \\
0 & 0 & 0 & 0 & 0 & 0 & 1 & dt \\
0 & 0 & 0 & 0 & 0 & 0 & 0 & 1
\end{bmatrix}. \tag{4}$$

The state transition between UAV states is modeled with the following equation:

$$\begin{bmatrix} x_{k_{pos}} \\ x_{k_{vel}} \\ b_k \\ \dot{b}_k \end{bmatrix} = H \begin{bmatrix} x_{k-1_{pos}} \\ x_{k-1_{vel}} \\ b_{k-1} \\ \dot{b}_{k-1} \end{bmatrix}, \tag{5}$$

where dt is the time increment between states.
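As a concrete illustration, the sketch below builds the state transition matrix of Equation 4 and propagates a state per Equation 5; the numeric values and variable names are illustrative only, not from the paper.

```python
import numpy as np

def transition_matrix(dt: float) -> np.ndarray:
    """Constant-velocity state transition matrix H (Equation 4).

    State ordering: [x, y, z, x_dot, y_dot, z_dot, b, b_dot].
    """
    H = np.eye(8)
    H[0, 3] = H[1, 4] = H[2, 5] = dt  # position integrates velocity
    H[6, 7] = dt                      # clock bias integrates its drift rate
    return H

# Propagate a previous state x_{k-1} to x_k (Equation 5).
dt = 0.1                              # illustrative time increment, seconds
x_prev = np.zeros(8)
x_prev[3:6] = [1.0, 0.5, 0.0]         # illustrative ECEF velocity, m/s
x_k = transition_matrix(dt) @ x_prev
```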

3.2 LiDAR odometry

We structure our UAV navigation solution on LiDAR odometry measurements. At each measurement epoch, the LiDAR generates a 3D point cloud of its surrounding environment in its own LiDAR frame. By comparing consecutively collected point clouds and applying the iterative closest point (ICP) algorithm,41,42 we derive the relative motion of the UAV and generate these odometry measurements. Given a source and a reference point cloud, u and v, the ICP algorithm seeks to find the translation and rotation transformation that best matches the two input clouds by minimizing the correspondence residual function:

$$E(R, T) = \frac{1}{n_p} \sum_{i=1}^{n_p} \lVert u_i - R v_i - T \rVert^2, \tag{6}$$

where

$u_i$ : the ith point in 3D point cloud u
$v_i$ : the ith point in 3D point cloud v
$R$ : 3 × 3 rotation matrix
$T$ : 3 × 1 translation vector
$n_p$ : number of corresponding points.

The steps in the ICP algorithm are as follows (a minimal code sketch is given after the list):

1. Point correspondence: For each point in the source point cloud, the algorithm finds the associated point in the reference point cloud that is closest to the source point.

2. Rejection: Point correspondences that are either located too far apart or rotated too much are treated as outliers and are rejected.

3. Minimize and transform: The residual function E(R, T) is then minimized to find the rotation and translation that best align the source and reference point clouds, and the resulting translation and rotation are applied to the source point cloud.

4. Iterate: The process is iterated until convergence, when the residuals fall under a certain threshold, or until a maximum number of iterations is reached.
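Below is a minimal point-to-point ICP sketch following the four steps above, using an SVD-based (Kabsch) solution for the minimization step; the rejection distance and tolerances are illustrative assumptions rather than the authors' tuned values.

```python
import numpy as np
from scipy.spatial import cKDTree

def icp(source, reference, max_iters=50, max_dist=1.0, tol=1e-6):
    """Estimate the rotation R and translation T aligning `source` to `reference`.

    source, reference: Nx3 numpy arrays of 3D points.
    """
    R, T = np.eye(3), np.zeros(3)
    src = source.copy()
    tree = cKDTree(reference)
    prev_err = np.inf
    for _ in range(max_iters):
        # 1. Point correspondence: nearest reference point for each source point.
        dist, idx = tree.query(src)
        # 2. Rejection: drop correspondences that are located too far apart.
        keep = dist < max_dist
        p, q = src[keep], reference[idx[keep]]
        # 3. Minimize E(R, T) via the SVD-based Kabsch solution, then transform.
        p_c, q_c = p - p.mean(axis=0), q - q.mean(axis=0)
        U, _, Vt = np.linalg.svd(p_c.T @ q_c)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R_step = Vt.T @ D @ U.T
        T_step = q.mean(axis=0) - R_step @ p.mean(axis=0)
        src = src @ R_step.T + T_step
        R, T = R_step @ R, R_step @ T + T_step
        # 4. Iterate until the mean residual stops improving.
        err = np.mean(dist[keep] ** 2)
        if abs(prev_err - err) < tol:
            break
        prev_err = err
    return R, T
```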

Figure 1 shows an example of two consecutive point clouds prior to the application of the ICP algorithm. Here, we use the translation vector, T, to define the relative change in position between two consecutive UAV states, k − 1 and k. The LiDAR odometry measurement vector, $l_{k,k-1}$, is defined as follows:

$$l_{k,k-1} = \begin{bmatrix} l_{x_{k,k-1}} & l_{y_{k,k-1}} & l_{z_{k,k-1}} \end{bmatrix}^T, \tag{7}$$

where each component of $l_{k,k-1}$ is situated in the local LiDAR frame. The relation between consecutive states,

$$\begin{bmatrix} x_k \\ y_k \\ z_k \end{bmatrix} = \begin{bmatrix} x_{k-1} \\ y_{k-1} \\ z_{k-1} \end{bmatrix} + \begin{bmatrix} l_{k,k-1_x} \\ l_{k,k-1_y} \\ l_{k,k-1_z} \end{bmatrix}, \tag{8}$$

is used to construct the LiDAR odometry measurement function:

$$h_{LiDAR}(x_k, x_{k-1}) = x_{k_{pos}} - x_{k-1_{pos}} + \epsilon_{ICP_{k,k-1}}, \tag{9}$$

where $\epsilon_{ICP_{k,k-1}}$ is the Gaussian error attributed to the ICP odometry.

FIGURE 1 Consecutive point clouds for iterative closest point (ICP) comparison


3.2.1 Compass measurements

To integrate the LiDAR odometry with GPS, compass measurements provide the heading information used to rotate the LiDAR measurements from the local LiDAR frame to ECEF. Assuming a level UAV with a right-handed coordinate frame (y-axis oriented forward-pointing and z-axis oriented upward-pointing), the LiDAR measurements are rotated into the East, North, and Up (ENU) coordinate frame with the following rotation matrix:

$$x_{ENU} = \begin{bmatrix} \cos\psi & \sin\psi & 0 \\ -\sin\psi & \cos\psi & 0 \\ 0 & 0 & 1 \end{bmatrix} x_{Local}, \tag{10}$$

where $x_{ENU}$ is the position vector in the ENU frame, $\psi$ is the rotation between the LiDAR frame and north, and $x_{Local}$ is the position in the local LiDAR frame. With the latitude, $\phi$, and longitude, $\lambda$, of the UAV, the measurements are then rotated to ECEF with the following rotation matrix:

$$x_{ECEF} = \begin{bmatrix} -\sin\lambda & -\sin\phi\cos\lambda & \cos\phi\cos\lambda \\ \cos\lambda & -\sin\phi\sin\lambda & \cos\phi\sin\lambda \\ 0 & \cos\phi & \sin\phi \end{bmatrix} x_{ENU}. \tag{11}$$

Although estimation of the orientation of the UAV is beyond the scope of this paper, and a fairly accurate rotation between the LiDAR and ENU frames is assumed, degradation of the compass measurements by environmental effects may be detected by comparing the compass measurements with the rotations derived from the ICP algorithm. The incremental rotations are then used to estimate the orientation of the UAV and align the LiDAR frame with ENU. Furthermore, LiDAR urban map matching can be used to obtain reference locations matching the orientation of the UAV with the referenced structures in the surrounding environment. Any errors in frame alignment are absorbed as errors in the estimation step.
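A minimal sketch of the two frame rotations in Equations 10 and 11; the heading, latitude, and longitude values in the usage lines are illustrative only.

```python
import numpy as np

def local_to_enu(x_local, psi):
    """Rotate a local LiDAR-frame vector into ENU by heading psi (Equation 10)."""
    c, s = np.cos(psi), np.sin(psi)
    return np.array([[c,   s,   0.0],
                     [-s,  c,   0.0],
                     [0.0, 0.0, 1.0]]) @ x_local

def enu_to_ecef(x_enu, phi, lam):
    """Rotate an ENU vector into ECEF at latitude phi, longitude lam (Equation 11)."""
    sp, cp = np.sin(phi), np.cos(phi)
    sl, cl = np.sin(lam), np.cos(lam)
    return np.array([[-sl, -sp * cl, cp * cl],
                     [ cl, -sp * sl, cp * sl],
                     [0.0,       cp,      sp]]) @ x_enu

# Chain the two rotations to express a LiDAR odometry increment in ECEF
# (heading, latitude, and longitude here are illustrative values).
l_local = np.array([1.0, 0.0, 0.0])
l_ecef = enu_to_ecef(local_to_enu(l_local, psi=0.3),
                     phi=np.radians(40.11), lam=np.radians(-88.24))
```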

3.3 LiDAR-based map matching

To reduce the effect of drift and bias from the LiDAR odometry measurements, the UAV incorporates LiDAR-based map matching. As the UAV navigates, the collected point cloud measurements are used to construct two types of maps to represent its environment: a point cloud map and an urban building map.

3.3.1 Point cloud map matching and loop closures

As the UAV travels, it concatenates all LiDAR point clouds collected throughout its trajectory and anchors them at their collecting positions in what is known as the point cloud map. When the UAV revisits these previously scanned areas, the point cloud map is then leveraged for loop closures. Through proximity, the feasibility of a loop closure is detected and initiated. The ICP algorithm is used to find the relative transformation between the current state and the closest prior state associated with the region in the point cloud map. To ensure accuracy of the loop closure, the association of the two states is only enacted when the residual function in the ICP algorithm is below a certain threshold, verifying that the two point clouds are matching scans of the same region and above a level of similarity. Once the association is verified, we form the ICP-derived translation vector relating the two states, $l_{k,j} = \begin{bmatrix} l_{k,j_x} & l_{k,j_y} & l_{k,j_z} \end{bmatrix}^T$:

$$\begin{bmatrix} x_k \\ y_k \\ z_k \end{bmatrix} = \begin{bmatrix} x_j \\ y_j \\ z_j \end{bmatrix} + \begin{bmatrix} l_{k,j_x} \\ l_{k,j_y} \\ l_{k,j_z} \end{bmatrix}. \tag{12}$$

The corresponding measurement function, $h_{Loop}(x_k, x_j)$, is defined as follows:

$$h_{Loop}(x_k, x_j) = x_{k_{pos}} - x_{j_{pos}} + \epsilon_{ICP_{k,j}}, \tag{13}$$

where $\epsilon_{ICP_{k,j}}$ is the Gaussian error attributed to the ICP loop closure.
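A sketch of the proximity-gated loop closure logic described above, reusing the `icp` routine sketched in Section 3.2; the proximity radius and residual threshold are illustrative assumptions.

```python
import numpy as np
from scipy.spatial import cKDTree

def try_loop_closure(k, positions, clouds, prox_radius=5.0, resid_thresh=0.05):
    """Return (j, T) if state k loop-closes with an earlier state j, else None.

    positions: estimated UAV positions per epoch; clouds: LiDAR point clouds
    per epoch; icp() is the routine sketched in Section 3.2.
    """
    for j in range(k):
        # Proximity gate: only consider prior states near the current estimate.
        if np.linalg.norm(positions[k] - positions[j]) > prox_radius:
            continue
        R, T = icp(clouds[k], clouds[j])
        # Verify the match: accept only if the aligned clouds agree closely.
        aligned = clouds[k] @ R.T + T
        dist, _ = cKDTree(clouds[j]).query(aligned)
        if np.mean(dist ** 2) < resid_thresh:
            return j, T  # translation relating the two states (Equations 12-13)
    return None
```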

3.3.2 Urban map matching

While navigating in areas where 3D building models are available, the models are loaded and pre-processed into the urban building map to provide an alternative way to anchor the UAV positions to the global frame. Prior to urban navigation, the UAV first initializes the urban building map for the intended navigation area by pre-loading existing 3D building model data. In our experiments, we used the GPS-only navigation solution and pre-loaded building model data in a 1 × 1 km square from the Champaign building footprint dataset.43 The building footprint dataset consists of three-dimensional polygons, referenced in WGS84, of buildings in Champaign, IL. Using the vertices of the polygons in the data, 3D models were generated consisting of 2D building shapes paired with corresponding building heights, shown in Figure 2. Navigating at low altitudes, the UAV extracts the outline of each building, shown in Figure 3, to be compared with the observations seen in each LiDAR point cloud. The set of points is defined as $M_{ubm}$.

To identify comparable features between the urban building map and the LiDAR point clouds, LiDAR plane fitting is used to extract walls and large recognizable features from the LiDAR point cloud. This is accomplished with the M-estimator sample consensus (MSAC) algorithm,44 a variant of the random sample consensus (RANSAC) algorithm.45 For each MSAC-generated plane, a set of points from the original point cloud is sampled and rotated to the corresponding frame of the urban building map.


FIGURE 2 Urban building map of the Green Street area in Champaign, IL, loaded from the Champaign building footprint dataset. Each building consists of a building shape. Building height (not shown) is recorded in the system

FIGURE 3 Walls detected from the urban building map


This process is shown in Figure 4, where the original point cloud is shown with the detected planes and the points sampled from each plane. For epoch k, the set of sampled point measurements collected from all planes is defined as $Z_{s_k}$.

Particle-based map matching

Because the varying geometry of the map makes it difficult to optimize as a mathematical model, a Monte Carlo particle approach46 is taken towards map matching. Particles are generated to represent the potential positions of the UAV, shown in Figure 5. We define the set of generated particles as set M. Any generated particles that intersect with buildings are regenerated.

For each UAV particle, $m_{k_i}$, the urban building map is used to generate a virtual range measurement to the closest structure in the direction of each measurement in $Z_{s_k}$. This process is defined with the measurement function $h_{ubm}(m_{k_i}, M_{ubm}, \phi_j)$, where $\phi_j$ is the angular direction of the jth measurement.

FIGURE 4 Sampled points from planes detected in a point cloud

FIGURE 5 Monte Carlo generated particles for potential states of the unmanned air vehicle (UAV), initialized by the UAV's Global Positioning System (GPS)-derived position. Invalid particles due to building intersection are highlighted and regenerated


The set of virtual range measurements via the urban building map is defined as set $Z_{ubm_i}$ for UAV particle i. These virtual measurements are then compared with the observed measurement vector to generate a probability determining the likelihood of a particle based on the observed measurements. For UAV particle i modeling the urban map matching at epoch k, $m_{k_i} \in M$, the likelihood relation is formalized as follows:

$$P\left(m_{k_i} \mid Z_{s_k}\right) \propto \prod_j e^{-\frac{1}{2}\left\lVert z_{s_{k,j}} - h_{ubm}\left(m_{k_i}, M_{ubm}, \phi_j\right)\right\rVert^2}, \tag{14}$$

where $z_{s_{k,j}}$ is the jth measurement in the set of sampled plane measurements collected at epoch k.

The particle with the highest likelihood that satisfies an empirical threshold is deemed the most likely position of the UAV. If no particle satisfies the threshold, then no urban map matching measurement is made for that epoch. Figure 6 shows the result of the map matching between the initial estimate, the map-matched estimate, and the true position, along with the map-matching measurements. Once the most likely particle is identified, it is designated as an urban building map matching measurement defined as follows:

$$m_k = \begin{bmatrix} m_{k_x} & m_{k_y} & m_{k_z} \end{bmatrix}^T, \tag{15}$$

and its associated measurement function:

$$h_{Map}(x_k) = x_{k_{pos}} + \epsilon_{Map}, \tag{16}$$

where

$x_{k_{pos}}$ : position components of $x_k$
$\epsilon_{Map}$ : map-matching error term.
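The sketch below scores candidate particles per Equation 14 and applies the empirical acceptance threshold; `virtual_range` stands in for the measurement function h_ubm (ray casting into the urban building map), and the threshold value is an illustrative assumption.

```python
import numpy as np

def score_particles(particles, z_samples, bearings, virtual_range, threshold=1e-3):
    """Select the most likely particle per Equation 14, or None if below threshold.

    particles: candidate UAV positions; z_samples: observed ranges to sampled
    plane points; bearings: angular directions phi_j; virtual_range: callable
    (particle, phi) -> range to the closest mapped structure in that direction.
    """
    best, best_like = None, 0.0
    for m in particles:
        residuals = np.array([z - virtual_range(m, phi)
                              for z, phi in zip(z_samples, bearings)])
        # Product of per-sample Gaussian terms == exp of the summed exponents.
        likelihood = np.exp(-0.5 * np.sum(residuals ** 2))
        if likelihood > best_like:
            best, best_like = m, likelihood
    # No urban map matching measurement is made if the threshold is not met.
    return best if best_like > threshold else None
```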

3.4 GPS measurements

To further anchor the UAV position in previously unmapped areas, or where a good mapping fit could not be made, GPS pseudorange measurements are used. A GPS pseudorange measures the approximate distance between a GPS receiver's antenna and the broadcasting GPS satellite. With at least four GPS pseudorange measurements, the UAV is able to obtain a position and time navigation solution. It is noted that although only GPS measurements are used in this paper, a similar approach could be applied to incorporate range measurements from other Global Navigation Satellite Systems (GNSS); the receiver clock bias and drift for each GNSS constellation would need to be included in the UAV state vector. The pseudorange, $\rho_k^{(s)}$, is modeled by the following measurement function:

$$h_{GPS}(x_k) = \left\lVert x_k^{(s)} - x_{k_{pos}} \right\rVert + b_k + \epsilon_{\rho_k^{(s)}}, \tag{17}$$

where

$x_k^{(s)}$ : position of GPS satellite s at epoch k
$x_{k_{pos}}$ : position components of UAV state $x_k$
$b_k$ : clock bias of UAV state $x_k$
$\epsilon_{\rho^{(s)}}$ : pseudorange error term.

FIGURE 6 Map matching for unmanned air vehicle (UAV) positioning. The true position is compared with the initial position estimate and the best state estimate obtained from the map-matching process. The sampled points are the points measured from the plane fitting process, centered around the initial state estimate. The best-fit matched points are then correlated with the walls in the urban building map (outlined in blue)


Errors in the pseudorange measurements consist of ionospheric, tropospheric, receiver noise, and environment multipath errors. When present, environment multipath is the most dominant error source in severe cases.

In this work, the reflected GPS signals that cause these errors are categorized as either NLOS signals or traditional multipath signals. NLOS signals occur when a GPS receiver receives a reflected signal from a satellite not in view, while multipath occurs when the GPS receiver receives a direct LOS signal from the satellite along with reflected signals from the environment.

For NLOS signals, an approach similar to previous work35,47,48 is taken. Line-of-sight (LOS) vectors are formed between the GPS position estimate and each visible satellite. If the vectors intersect with buildings in the urban building map, then the measurement is characterized as affected by NLOS errors. We expand upon the previous approaches by using the LiDAR measurements to continuously update the building maps used for NLOS error detection. For multipath signals, because of the unpredictable nature of the angles of reflection at which multipath occurs, it is difficult to pinpoint their reflecting locations. As such, a de-weighting approach is applied for multipath mitigation. Using the LiDAR-updated urban building map, the proximity of buildings surrounding the navigation solution at each time step is incorporated to de-weight the GPS components of the sensor fusion solution in the presence of numerous buildings. This allows the UAV to increase the weighting of the LiDAR odometry solution for navigation in multipath-affected regions.
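As a simplified illustration of the NLOS check, the sketch below approximates each building as a 2D footprint polygon paired with a height, as in the urban building map, and tests whether the LOS ray clears every wall edge it crosses; the geometry helpers and parameterization are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def cross2(a, b):
    """Scalar 2D cross product."""
    return a[0] * b[1] - a[1] * b[0]

def is_nlos(rx_enu, sat_dir_enu, buildings):
    """Check whether the LOS ray from receiver to satellite hits a building.

    rx_enu: 3D receiver position; sat_dir_enu: 3D unit LOS vector toward the
    satellite; buildings: list of (footprint, height), footprint an Nx2 array.
    """
    o, r = rx_enu[:2], sat_dir_enu[:2]
    for footprint, height in buildings:
        for i in range(len(footprint)):
            a, b = footprint[i], footprint[(i + 1) % len(footprint)]
            denom = cross2(r, b - a)
            if abs(denom) < 1e-12:
                continue  # ray parallel to this wall segment
            t = cross2(a - o, b - a) / denom  # distance along the ray
            u = cross2(a - o, r) / denom      # position along the wall edge
            if t <= 0.0 or not (0.0 <= u <= 1.0):
                continue  # crossing behind the receiver or off the segment
            # Height of the LOS ray where it crosses this wall's footprint edge.
            ray_height = rx_enu[2] + t * sat_dir_enu[2]
            if ray_height < height:
                return True  # signal is blocked by the building: flag as NLOS
    return False
```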

4 GRAPHICAL SENSOR FUSION

To estimate the UAV trajectory, we take a probabilistic approach to sensor fusion. We represent the UAV trajectory as a posterior probability conditional upon a UAV state prior and the set of all GPS, LiDAR odometry, and LiDAR map matching measurements:

$$P\left(X \mid x_0, Z_{GPS}, Z_{LiDAR}, Z_{Map}, Z_{Loop}\right), \tag{18}$$

where

$X$ : UAV trajectory
$x_0$ : an initial prior on the UAV state
$Z_{GPS}$ : the set of all GPS pseudorange measurements
$Z_{LiDAR}$ : the set of all LiDAR odometry measurements
$Z_{Map}$ : the set of all urban map matching measurements
$Z_{Loop}$ : the set of all loop closure measurements.

The initial prior of the UAV state is initialized with a GPS-derived navigation solution at the initial state and will be reanchored once more reliable measurements are available.

4.1 Graphical modeling

To model the posterior probability defined in Equation 18, a Bayesian network,49 a type of directed acyclic probabilistic graph, is applied. A Bayesian graph formulation has the following advantages:

1. Conditional dependencies between related states are applied as constraints in the estimation process, allowing for batch estimation. For example, this allows the presence of a GPS measurement at one node to anchor the rest of the graph to the global frame.

2. Batch estimation across the graph allows for updates of past trajectory estimates, enabling generation of an optimized map.

3. Mapping and estimation are complementary. Incorporating map-based measurements improves the trajectory estimate, while formulating an optimized trajectory estimate generates and updates an optimized map.

The graph is structured around two types of nodes: UAV nodes and anchor nodes. UAV nodes are the backbone of the Bayesian network and represent the states of the UAV as random vectors throughout the trajectory. Anchor nodes represent visible GPS satellites and urban map matching points as landmarks in the environment and have an associated position vector. Directed edges then connect related nodes, representing relations formulated by a dynamic process or various types of sensor measurements. The backbone of the UAV trajectory graph is structured by the state transition and LiDAR-derived odometry edges that connect consecutive UAV state nodes. GPS measurement edges connect UAV nodes with GPS anchor nodes, subsequently anchoring the trajectory graph to the global frame. Urban map matching edges yield a similar effect but tie the respective UAV nodes to urban map matching anchors. Finally, loop closure edges relate nonconsecutive UAV nodes through the LiDAR loop closure measurements.

Figure 7 shows an example of a UAV trajectory modeled by the described Bayesian network. For conciseness, GPS satellite nodes are grouped by each satellite's Pseudo Random Noise (PRN) number, yielding one node per satellite as opposed to one node per satellite per GPS measurement epoch.


FIGURE 7 Bayesian network model for an unmanned air vehicle (UAV) trajectory based on collected sensor measurements

The posterior probability of each UAV node is formulated from the probabilities of all directed edges connected to that node:

$$P(x_k \mid x_{k-1}, z_k) \propto P(x_k \mid x_{k-1}) P\left(x_k \mid z_{k_{GPS}}\right) P(x_k \mid x_{k-1}, z_{k_{LiDAR}}) P\left(x_k \mid z_{k_{Map}}\right) P\left(x_k \mid z_{k_{Loop}}, x_j\right), \tag{19}$$

where

$x_k$ : UAV state at epoch k
$z_{k_{GPS}}$ : GPS measurements at epoch k
$z_{k_{LiDAR}}$ : LiDAR measurements at epoch k
$z_{k_{Map}}$ : map measurements at epoch k
$z_{k_{Loop}}$ : loop closure measurements at epoch k.

The posterior probability of the UAV trajectory, over all UAV states from epoch 1 to K, is then formulated from the joint probability of all individual posterior probabilities:

$$P(X \mid x_0, Z) = P(x_0) P(x_1 \mid z_1, x_0) P(x_2 \mid z_2, x_1) \cdots P(x_K \mid z_K, x_{K-1}), \tag{20}$$

where $P(x_0)$ is the probability of the initial prior, $Z$ is the set of all sensor measurements, and $P(x_k \mid z_k, x_{k-1}, x_j)$ is the posterior probability of state k given measurements $z_k$, the previous state $x_{k-1}$, and any loop-closure-connected states $x_j$.

4.2 Graphical inference

To estimate the UAV trajectory from the posterior probabilities, graphical inference is applied across the graph to find the most likely configuration of UAV nodes that satisfies the constraints implied by the edges.49 Taking a Gaussian assumption for the measurement and process models of the system, each probability is converted to a likelihood function from the corresponding measurement functions:

$$P\left(x_k \mid \rho_k^{(s)}\right) \propto e^{-\frac{1}{2}\left\lVert h_{GPS}\left(x_k, x_k^{(s)}\right) - \rho_k^{(s)}\right\rVert^2 / \sigma^2_{\rho_k^{(s)}}}, \tag{21}$$

$$P(x_k \mid x_{k-1}, l_{k,k-1}) \propto e^{-\frac{1}{2}\lVert h_{LiDAR}(x_k, x_{k-1}) - l_{k,k-1}\rVert^2 / \sigma^2_{ICP_{k,k-1}}}, \tag{22}$$

$$P(x_k \mid m_k) \propto e^{-\frac{1}{2}\lVert h_{Map}(x_k) - m_k\rVert^2 / \sigma^2_{Map_k}}, \tag{23}$$

$$P(x_k \mid x_j, l_{k,j}) \propto e^{-\frac{1}{2}\lVert h_{Loop}(x_k, x_j) - l_{k,j}\rVert^2 / \sigma^2_{ICP_{k,j}}}, \tag{24}$$

$$P(x_k \mid x_{k-1}) \propto e^{-\frac{1}{2}\lVert f(x_{k-1}) - x_k\rVert^2 / \sigma^2_{Process}}, \tag{25}$$

where $f(x_{k-1})$ is the state transition function, $\sigma^2_{\rho_k^{(s)}}$ is the variance of the pseudorange measurements, $\sigma^2_{ICP_{k,k-1}}$ is the variance of the LiDAR ICP measurements, $\sigma^2_{ICP_{k,j}}$ is the variance of the loop closure ICP measurements, $\sigma^2_{Map_k}$ is the variance of the map matching measurements, and $\sigma^2_{Process}$ is the process noise.

Estimation is accomplished through Maximum a Posteriori (MAP) estimation. The negative log-likelihood of the exponential likelihood functions is taken to formulate a nonlinear least squares optimization problem as follows:

$$\hat{X} = \arg\max_X P(X \mid x_0, Z) \tag{26}$$

$$= \arg\min_X -\log P(X \mid x_0, Z) \tag{27}$$

$$\begin{aligned}
= \arg\min_X \; & \sum_{n_{GPS}} \sum_s \left\lVert h_{GPS}\left(x_k, x_k^{(s)}\right) - \rho_k^{(s)} \right\rVert^2_{\sigma^2_{\rho_k^{(s)}}} + \sum_{n_{LiDAR}} \lVert h_{LiDAR}(x_k, x_{k-1}) - l_{k,k-1}\rVert^2_{\sigma^2_{ICP_{k,k-1}}} \\
& + \sum_{n_{Map}} \lVert h_{Map}(x_k) - m_k\rVert^2_{\sigma^2_{Map_k}} + \sum_{n_{Loop}} \lVert h_{Loop}(x_k, x_j) - l_{k,j}\rVert^2_{\sigma^2_{ICP_{k,j}}} \\
& + \sum_{K} \lVert f(x_{k-1}) - x_k\rVert^2_{\sigma^2_{Process}},
\end{aligned} \tag{28}$$

where

$\hat{X}$ : estimated UAV parameters
$\sum_n$ : sum across all measurements n
$\sum_K$ : sum across all epochs k


$\lVert \cdot \rVert^2_{\sigma^2}$ : squared Mahalanobis distance with covariance $\sigma^2$.

To solve for these parameters, standard optimization techniques for iteratively reweighted least squares (IRLS)50 are applied. The components for each measurement are stacked for batch estimation. A series of incremental steps is taken from an initial state estimate until convergence, minimizing the square of the error residual, defined as the difference between the predicted measurement from the measurement model and the actual measurement. The incremental step from the current state estimate, $\hat{x}_i$, in the direction of the Jacobian, is defined as follows:

$$\Delta x_i(\hat{x}_i, z_i) = -\mathrm{pinv}\left(J_i^T W_i J_i\right) J_i^T W_i e_i(\hat{x}_i, z_i), \tag{29}$$

where pinv is the pseudoinverse operation, and the Jacobian, weighting matrix, and error function at time step i are denoted $J_i$, $W_i$, and $e_i$, respectively.
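A sketch of the incremental IRLS step of Equation 29; `jacobian_fn`, `error_fn`, and `weight_fn` stand in for the stacked trajectory Jacobian, cost vector, and weighting matrix described in the remainder of this section, and the stopping criteria are illustrative.

```python
import numpy as np

def irls_solve(x0, jacobian_fn, error_fn, weight_fn, max_iters=20, tol=1e-8):
    """Iteratively reweighted least squares following Equation 29.

    jacobian_fn(x) -> stacked M x 8K Jacobian; error_fn(x) -> M residual vector;
    weight_fn(x) -> M x M weighting matrix (may be re-derived each iteration).
    """
    x = x0.copy()
    for _ in range(max_iters):
        J, e, W = jacobian_fn(x), error_fn(x), weight_fn(x)
        # Incremental step: dx = -pinv(J^T W J) J^T W e
        dx = -np.linalg.pinv(J.T @ W @ J) @ (J.T @ W @ e)
        x += dx
        if np.linalg.norm(dx) < tol:  # converged
            break
    return x
```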

The Jacobian for the state transition process model is based on the state transition matrix H defined in Equation 4:

$$J_{Process_{k,k-1}} = \begin{bmatrix}
1 & 0 & 0 & 0 & 0 & 0 & 0 & 0 \\
0 & 1 & 0 & 0 & 0 & 0 & 0 & 0 \\
0 & 0 & 1 & 0 & 0 & 0 & 0 & 0 \\
0 & 0 & 0 & 1 & 0 & 0 & 0 & 0 \\
0 & 0 & 0 & 0 & 1 & 0 & 0 & 0 \\
0 & 0 & 0 & 0 & 0 & 1 & 0 & 0 \\
0 & 0 & 0 & 0 & 0 & 0 & 1 & 0 \\
0 & 0 & 0 & 0 & 0 & 0 & 0 & 1
\end{bmatrix}, \tag{30}$$

$$J_{Process_{k-1,k}} = \begin{bmatrix}
-1 & 0 & 0 & -dt & 0 & 0 & 0 & 0 \\
0 & -1 & 0 & 0 & -dt & 0 & 0 & 0 \\
0 & 0 & -1 & 0 & 0 & -dt & 0 & 0 \\
0 & 0 & 0 & -1 & 0 & 0 & 0 & 0 \\
0 & 0 & 0 & 0 & -1 & 0 & 0 & 0 \\
0 & 0 & 0 & 0 & 0 & -1 & 0 & 0 \\
0 & 0 & 0 & 0 & 0 & 0 & -1 & -dt \\
0 & 0 & 0 & 0 & 0 & 0 & 0 & -1
\end{bmatrix}. \tag{31}$$

The Jacobian for each sensor relates each sensor measurement to the state parameters that it influences. For the GPS measurements, defined in Equation 17, the Jacobian is defined by the unit LOS vector from the state estimate to each satellite:

$$J_{GPS_k} = \begin{bmatrix}
-\frac{x^{(1)} - x_k}{\lVert x^{(1)} - x_{k_{pos}}\rVert} & -\frac{y^{(1)} - y_k}{\lVert x^{(1)} - x_{k_{pos}}\rVert} & -\frac{z^{(1)} - z_k}{\lVert x^{(1)} - x_{k_{pos}}\rVert} & 0_{1\times3} & 1 & 0 \\
-\frac{x^{(2)} - x_k}{\lVert x^{(2)} - x_{k_{pos}}\rVert} & -\frac{y^{(2)} - y_k}{\lVert x^{(2)} - x_{k_{pos}}\rVert} & -\frac{z^{(2)} - z_k}{\lVert x^{(2)} - x_{k_{pos}}\rVert} & 0_{1\times3} & 1 & 0 \\
\vdots & \vdots & \vdots & \vdots & \vdots & \vdots \\
-\frac{x^{(s)} - x_k}{\lVert x^{(s)} - x_{k_{pos}}\rVert} & -\frac{y^{(s)} - y_k}{\lVert x^{(s)} - x_{k_{pos}}\rVert} & -\frac{z^{(s)} - z_k}{\lVert x^{(s)} - x_{k_{pos}}\rVert} & 0_{1\times3} & 1 & 0
\end{bmatrix}. \tag{32}$$

The Jacobian for the ICP-derived LiDAR odometry, described in Equation 9, relates the two consecutive states connected by the LiDAR odometry edge:

$$J_{LiDAR_{k,k-1}} = \begin{bmatrix} I_3 & 0_{3\times5} \end{bmatrix}, \tag{33}$$

$$J_{LiDAR_{k-1,k}} = \begin{bmatrix} -I_3 & 0_{3\times5} \end{bmatrix}. \tag{34}$$

Similarly, the ICP-derived LiDAR loop closure, described in Equation 13, relates two nonconsecutive states:

$$J_{Loop_{k,j}} = \begin{bmatrix} I_3 & 0_{3\times5} \end{bmatrix}, \tag{35}$$

$$J_{Loop_{j,k}} = \begin{bmatrix} -I_3 & 0_{3\times5} \end{bmatrix}. \tag{36}$$

The map matching Jacobian, for the measurement function in Equation 16, is then denoted as follows:

$$J_{Map_k} = \begin{bmatrix} I_3 & 0_{3\times5} \end{bmatrix}. \tag{37}$$

For M sensor measurements, the trajectory Jacobian, $J_{traj}$, is an M × 8K matrix formulated by stacking the Jacobian from each sensor measurement into a large but sparse matrix. An 8 × 8 identity matrix is included for the initial prior of the UAV state. For conciseness, $J_P = J_{Process}$, $J_G = J_{GPS}$, $J_L = J_{LiDAR}$, $J_M = J_{Map}$, and $J_O = J_{Loop}$:

$$J_{traj} = \begin{bmatrix}
I_{8\times8} & & & & \\
J_{P_{0,1}} & J_{P_{1,0}} & & & \\
& J_{P_{1,2}} & J_{P_{2,1}} & & \\
& & \ddots & \ddots & \\
& & & J_{P_{K-1,K}} & J_{P_{K,K-1}} \\
J_{G_1} & & & & \\
& J_{G_2} & & & \\
& & J_{G_3} & & \\
& & & \ddots & \\
& & & & J_{G_K} \\
J_{L_{1,2}} & J_{L_{2,1}} & & & \\
& J_{L_{2,3}} & J_{L_{3,2}} & & \\
& & \ddots & \ddots & \\
& & & J_{L_{K-1,K}} & J_{L_{K,K-1}} \\
J_{M_1} & & & & \\
& J_{M_2} & & & \\
& & J_{M_3} & & \\
& & & \ddots & \\
& & & & J_{M_K} \\
\cdots & J_{O_{i,j}} & \cdots & J_{O_{j,i}} & \cdots
\end{bmatrix} \tag{38}$$

The trajectory cost vector, $e_{traj}$, is then an M × 1 vector formulated by stacking the corresponding cost vectors of each sensor measurement.

To weight the influence of the prior, a weighting matrix, $W_{Prior}$, is formulated by diagonalizing the initial uncertainties of the initial state. Weighting matrices are also formed from the covariance of each sensor measurement type. For the GPS measurements, the covariance for each measurement varies per satellite per measurement epoch:


$$W_{GPS_k} = \begin{bmatrix}
1/\sigma^2_{\rho^1_k} & & & \\
& 1/\sigma^2_{\rho^2_k} & & \\
& & \ddots & \\
& & & 1/\sigma^2_{\rho^{(s)}_k}
\end{bmatrix}. \tag{39}$$

Because both the LiDAR odometry and the loop closure steps use the ICP algorithm, they have the same weighting matrix, characterized by the covariance of the ICP matching:

$$W_{LiDAR_k} = W_{Loop_k} = \begin{bmatrix}
1/\sigma^2_{ICP_x} & 0 & 0 \\
0 & 1/\sigma^2_{ICP_y} & 0 \\
0 & 0 & 1/\sigma^2_{ICP_z}
\end{bmatrix}. \tag{40}$$

The weighting matrix of map matching is characterized by the map matching covariance:

$$W_{Map_k} = \begin{bmatrix}
1/\sigma^2_{Map_x} & 0 & 0 \\
0 & 1/\sigma^2_{Map_y} & 0 \\
0 & 0 & 1/\sigma^2_{Map_z}
\end{bmatrix}. \tag{41}$$

The weighting matrix of the trajectory, $W_{Trajectory}$, is defined as a block diagonal matrix:

$$W_{Trajectory} = \begin{bmatrix}
W_{Prior} & & & & \\
& W_{GPS} & & & \\
& & W_{LiDAR} & & \\
& & & W_{Map} & \\
& & & & W_{Loop}
\end{bmatrix}, \tag{42}$$

where $W_{Prior}$, $W_{GPS}$, $W_{LiDAR}$, $W_{Map}$, and $W_{Loop}$ are the block diagonalized weight matrices for the prior and all GPS, LiDAR, map matching, and loop closure measurements, respectively.
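A sketch assembling the block diagonal trajectory weighting of Equation 42 with SciPy; all variance values here are illustrative placeholders.

```python
import numpy as np
from scipy.linalg import block_diag

# Illustrative per-sensor weight blocks (inverse variances).
W_prior = np.diag(1.0 / np.full(8, 0.5) ** 2)
W_gps   = np.diag(1.0 / np.array([3.0, 3.5, 4.0]) ** 2)     # one entry per pseudorange (Eq. 39)
W_lidar = np.diag(1.0 / np.array([0.05, 0.05, 0.08]) ** 2)  # ICP covariance (Eq. 40)
W_map   = np.diag(1.0 / np.array([0.5, 0.5, 1.0]) ** 2)     # map matching covariance (Eq. 41)
W_loop  = W_lidar  # loop closures share the ICP weighting

# Block diagonal trajectory weighting matrix (Equation 42).
W_traj = block_diag(W_prior, W_gps, W_lidar, W_map, W_loop)
```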

Iterative least squares is applied to the Jacobian and error vector to estimate the trajectory parameters until convergence or until a maximum number of iterations is reached in the batch estimation. Although computation speed is not addressed in this paper, efficient optimization methodologies for solving Bayesian graphs51,52 can be applied to solve for the UAV trajectory parameters in near real time. Only when significant changes to the graph occur does a large batch estimation need to be computed. As such, computationally manageable incremental updates are made at each time step, while the large-scale batch processing segments can be offloaded to cloud-based computing. After computation, the large-scale batch estimate is then incorporated to update the on-board graph.

4.3 Map update

After trajectory estimation, the UAV's environment maps are updated. To update the point cloud map, the point clouds are reanchored at the updated UAV node positions. For the urban building map, the points on the urban building map can be updated using globally referenced building data points, such as those from the Champaign building footprint dataset,43 and generated through plane fitting of the LiDAR point clouds. Since the building points are globally referenced, those points remain a constant source of positioning. The sampled plane points are added to map new structures and correct existing building positions. To reduce the possibility of adding dynamic objects, such as vehicles and pedestrians, to the map, the LiDAR data are pre-processed to remove points that are either too distant or too close to the LiDAR sensor. Any remaining objects mapped are either filtered out during the plane fitting process or are minimal enough in the total point cloud to have a negligible effect on the map-matching solutions. For future work, object recognition techniques such as those described in other works53,54 may be applied to identify and remove nonstructural objects.

FIGURE 8 System flow diagram for graphical sensor fusion


4.4 Graphical sensor fusion summary

Figure 8 shows a summary of the graphical sensor fusion process. First, the graph is initialized with an initial guess of its location from available GPS and map-matching measurements. As the UAV navigates, it collects measurements from available sensors and updates the graph with new nodes and constraining edges. Incremental graphical inference steps are then performed to estimate its most recent states, and the respective components of the map are updated. Batches of measurements are also collected by an offloading processor, which periodically performs inference on larger segments of the trajectory, returning an optimized trajectory, updated urban building map, and point cloud map.

5 RESULTS AND DISCUSSION

5.1 Experimental setup

To validate the proposed algorithm, data were collected and flight tested on a UAV test bed.

FIGURE 9 iBQR data collection platform

FIGURE 10 The data collection environment highlighted with Global Positioning System (GPS) conditions of each region as GPS-friendly, GPS-challenged, and GPS-denied


The flight test was conducted on the iBQR, shown in Figure 9, a custom-built quadrotor developed in the research group.55 The iBQR was designed with configurable arms and sensor mounts so that GPS signals can be received without interference from the UAV's flight electronics.

FIGURE 11 Number of visible Global Positioning System (GPS) satellites, non-line-of-sight (NLOS) satellites detected by the urban building map, and occluded satellites

FIGURE 12 Experimental results from Light Detection and Ranging (LiDAR) odometry + Global Positioning System (GPS) graphical sensor fusion in an urban environment. Pairing LiDAR odometry measurements with GPS pseudoranges provides an improved navigation solution over Kalman filtered GPS alone; however, subsections of multipath degrade the navigation solution

For both test beds, the same data collection platform was used. Data were collected on the Asctec Mastermind, an Intel i7 computer operating in Ubuntu Linux and running the Robot Operating System (ROS). The Asctec Mastermind connects to the compass, GPS receiver, and LiDAR. For our heading measurements, we use the magnetometer of the Xsens MTi IMU, an integrated inertial measurement unit. GPS pseudoranges are collected from a commercial off-the-shelf (COTS) u-blox LEA-6T GPS receiver paired with a Maxtena helical antenna. Finally, we mount a Velodyne Puck Lite LiDAR for point cloud collection. In both cases, navigation data are collected for post-processing.

5.2 Results and analysis

Dynamic flight data were collected on a tethered iBQR flight where the UAV travelled from a GPS-friendly environment, to a GPS-challenged environment, and finally to a GPS-denied scenario. The GPS conditions for each region of the selected trajectory are highlighted in Figure 10. In the GPS-friendly environment, an average of nine satellites were visible, leading to an accurate navigation solution. As the UAV travels into the GPS-challenged region, although a similar number of satellites are visible, multipath and NLOS effects degrade the quality of the GPS measurements and the GPS navigation solution.


Finally, in the GPS-denied region, the UAV loses GPS satellite visibility because of the tall structures in the surrounding environment. Figure 11 shows the number of visible GPS satellites, NLOS satellites detected by the urban building map, and occluded satellites throughout the trajectory. Since it is challenging to obtain truth for outdoor UAV positioning, the pilot-observed trajectory, aided by a geo-referenced map of the area, was used to estimate the true trajectory for reference.

FIGURE 13 Experimental results from the full probabilistic graphical algorithm in an urban environment. With map-aiding in the form of Light Detection and Ranging (LiDAR) map matching and multipath and non-line-of-sight (NLOS) mitigation, we obtain an improved trajectory estimate over a Global Positioning System (GPS)-only Kalman filtered approach

FIGURE 14 Point cloud map (in the Light Detection and Ranging (LiDAR) frame) generated from the Global Positioning System (GPS)-only navigation solution. In the GPS-challenged and GPS-denied regions, the point cloud map becomes noisy and mis-positioned


FIGURE 15 Point cloud map (in the Light Detection and Ranging (LiDAR) frame) generated from the graphical approach. The point cloud map is consistent with the Google Map of the region throughout the UAV trajectory

FIGURE 16 A comparison of trajectory estimates in a severely Global Positioning System (GPS)-challenged environment. The GPS-only solution is degraded by multipath. Navigating without loop closure is affected by drift since the iterative closest point (ICP) algorithm is susceptible to errors in an environment where features are similar. Only with the loop closure step entering and exiting the alley does the trajectory estimate match the true path taken

In Figure 12, we first show the navigation solution of the graphical sensor fusion with only the GPS pseudoranges and LiDAR odometry. Although a navigation solution is available throughout the UAV's trajectory, severe multipath degrades the navigation solution, and loss of GPS results in drift. Through the incorporation of map matching and multipath mitigation with the urban building map, Figure 13 shows that the proposed augmented graphical sensor fusion approach is able to provide a robust and accurate trajectory estimate for the UAV in GPS-challenged and GPS-denied urban environments. The importance of an accurate trajectory estimate for mapping also becomes apparent after observing the point cloud map.


Figures 14 and 15 show the point cloud map generated by anchoring the point clouds to the UAV trajectory positions. With a GPS-only Kalman filtered solution, once the UAV entered the GPS-challenged area, mapping of the region became noisy, as seen in Figure 14. With the graphical approach, where the trajectory is batch estimated and the noisy pseudorange measurements are mitigated, Figure 15 shows a more accurate mapping of the environment. Finally, we examine the effects of the loop closure measurement on the trajectory estimate. Data were collected as the UAV navigated into and out of an urban canyon-forming alleyway. The results of our navigation algorithm are compared with the COTS receiver output and the trajectory estimate without loop closure in Figure 16. Since the alleyway creates a difficult GPS-challenged environment, the Kalman filtered GPS navigation solution is degraded because of low satellite visibility and multipath. With the uniformity of the LiDAR-observable features (ie, straight walls) in the alleyway, the ICP algorithm fails to generate an accurate odometry estimate. Without loop closure, the trajectory estimates degrade with drift, and both the GPS-only and GPS-with-LiDAR positions deviate from the true path. Application of loop closure allows the UAV to constrain its motion at the start and end of the trajectory, leading to a more accurate trajectory estimate in the alleyway.

6 CONCLUSION

In this paper, we presented a graphical sensor fusion approach for urban UAV navigation. In our approach, we applied a probabilistic graph to fuse GPS, LiDAR, and compass measurements with urban building footprint data. We used the LiDAR point cloud measurements in three ways. Using the ICP algorithm on consecutive LiDAR scans, we generated odometry measurements to determine the relative motion of the UAV. Then, we applied plane fitting on the LiDAR point clouds to detect building walls for map matching with our urban building map. To improve our GPS measurements, we then incorporated techniques for multipath mitigation. Using the environment map generated by the LiDAR, and aided by the urban building footprints, we identified the presence of multipath and NLOS errors from the environment surrounding a UAV and mitigated their effects on the GPS pseudorange measurements. Combining these elements, we then presented an algorithm for UAV trajectory estimation and environment mapping. Through a series of experiments in GPS-challenged environments, we have shown that our graph-based approach is capable of combining the advantages of the LiDAR and GPS sensors. This enables our UAV to obtain an accurate trajectory estimate in situations that are normally challenging when navigating with each of the sensors individually.

ACKNOWLEDGMENTS

The authors would like to thank Akshay Shetty, Shubhendra Chauhan, Craig Babiarz, Arthur Chu, Sri Ramya Bhamidipati, and Anneliese Fang for their assistance during numerous experiments and flight tests, and Enyu Luo for his hardware expertise in developing and supporting the operation of the iBQR.

ORCID

Grace Xingxin Gao https://orcid.org/0000-0002-1807-8637

REFERENCES

1. Commercial UAV Expo. How are professionals in high tech industries using drones in 2017? 2017. Available at: http://www.expouav.com/free-reports-2017/

2. Canis B. Unmanned aircraft systems (UAS): commercial outlook for a new industry. Cong Res Serv. 2015.
3. Gurtner A, Greer DG, Glassock R, Mejias L, Walker RA, Boles WW. Investigation of fish-eye lenses for small-UAV aerial photography. IEEE Trans Geosci Remote Sens. 2009.
4. Asctec. UAV for power line and infrastructure inspection. 2016. Available at: http://www.asctec.de/en/drone-uav/uav-uas-drone-powerlineinfrastructure-inspection/
5. Edison Electric Institute. Unmanned aircraft systems (UAS). 2015. Available at: http://www.eei.org/
6. Ambulance Drone. TU Delft. Available at: http://www.io.tudelft.nl/onderzoek/delft-design-labs/applied-labs/ambulance-drone/
7. de Voorde PV, Gautama S, Momont A, Ionescu CM, Paepe PD, Fraeyman N. The drone ambulance [A-UAS]: golden bullet or just a blank? Resuscitation. 2017.
8. Chen D. Graphical SLAM for urban UAV navigation. Master's thesis: University of Illinois at Urbana-Champaign; 2017.
9. Groves PD. Principles of GNSS, Inertial, and Multisensor Integrated Navigation Systems. 2nd ed. (GNSS Technology and Applications): Artech House; 2013.
10. Kaplan ED, Hegarty C. Understanding GPS: Principles and Applications. 2nd ed.: Artech House; 2005.
11. Li T, Petovello M, Lachapelle G, Basnayake C. Ultra-tightly coupled GPS/vehicle sensor integration for land vehicle navigation. NAVIGATION. 2010;57:263-274.
12. Klein I, Filin S, Toledo T. Vehicle constraints enhancement for supporting INS navigation in urban environments. NAVIGATION. 2011;58:7-15.
13. Hirokawa R, Ebinuma T. A low-cost tightly coupled GPS/INS for small UAVs augmented with multiple GPS antennas. NAVIGATION. 2009;56:35-44.
14. Khaghani M, Skaloud J. Autonomous vehicle dynamic model-based navigation for small UAVs. NAVIGATION. 2016;63:345-358.
15. Shetty A, Gao GX. Covariance estimation for GPS-LiDAR sensor fusion for UAVs. Proceedings of the 30th International Technical Meeting of The Satellite Division of the Institute of Navigation (ION GNSS+ 2017); September 2017; Portland, OR:2919-2923.


16. Soloviev A. Tight coupling of GPS, laser scanner, and inertial measurements for navigation in urban environments. IEEE/ION PLANS; May 2008; Monterey, CA:511-525.
17. Gao Y, Liu S, Atia MM, Noureldin A. INS/GPS/LiDAR integrated navigation system for urban and indoor environments using hybrid scan matching algorithm. Sensors. 2015.
18. Dill ET, de Haag MU. 3D multi-copter navigation and mapping using GPS, inertial, and LiDAR. NAVIGATION. 2016;63:205-220.
19. Thrun S, Burgard W, Fox D. Probabilistic Robotics; 2005.
20. Durrant-Whyte H, Bailey T. Simultaneous localisation and mapping (SLAM): Part I: The essential algorithms. IEEE Rob Autom Mag. June 2006;13(2).
21. Im J-H, Kim K-W, Jee G-I. 3D LiDAR based vehicle localization using vertical structure landmark in urban road. Proceedings of the 30th International Technical Meeting of the Satellite Division of the Institute of Navigation (ION GNSS+ 2017); September 2017; Portland, OR:387-401.
22. Afia AB, Escher A-C, Macabiau C. A low-cost GNSS/IMU/visual monoSLAM/WSS integration based on federated Kalman filtering for navigation in urban environments. Proceedings of the 28th International Technical Meeting of the Satellite Division of The Institute of Navigation (ION GNSS+ 2015); September 2015; Tampa, FL:618-628.
23. Watson RM, Gross JN. Robust navigation in GNSS degraded environment using graph optimization. Proceedings of the 30th International Technical Meeting of the Satellite Division of The Institute of Navigation (ION GNSS+ 2017); September 2017; Portland, OR:2906-2918.
24. Chen D, Gao GX. Robust MAV state estimation using M-estimator augmented sensor fusion graphs. Proceedings of the 28th International Technical Meeting of the Satellite Division of The Institute of Navigation (ION GNSS+ 2015); September 2015; Tampa, FL:841-848.
25. Chen D, Gao GX. Simultaneous state estimation of UAV trajectory using probabilistic graph models. Proceedings of the 2015 International Technical Meeting of The Institute of Navigation; January 2015; Dana Point, CA:804-810.
26. Shepard DP, Humphreys TE. High-precision globally-referenced position and attitude via a fusion of visual SLAM, carrier-phase-based GPS, and inertial measurements. Proceedings of IEEE/ION PLANS 2014; May 2014; Monterey, CA:1309-1328.
27. Meguro J, Murata T, Takiguchi J, Amano Y, Hashizume T. GPS multipath mitigation for urban area using omnidirectional infrared camera. IEEE Trans Intell Transp Syst. 2009;10(1).
28. Meguro J, Murata T, Takiguchi J, Amano Y, Hashizume T. GPS accuracy improvement by satellite selection using omnidirectional infrared camera. IEEE/RSJ International Conference on Intelligent Robots and Systems; 2008.
29. Zribi M, Touil K, Benjelloun M. Vehicle localization via sensor fusion using evidence theory. NAVIGATION. 2009;56:23-33.
30. Wang L, Groves PD, Ziebart MK. Multi-constellation GNSS performance evaluation for urban canyons using large virtual reality city models. J Navig. 2012;65:459-476.
31. Groves PD. Shadow matching: a new GNSS positioning technique for urban canyons. J Navig. July 2011;64(3):417-430.
32. Groves PD, Jiang Z, Wang L, Ziebart MK. Intelligent urban positioning using multi-constellation GNSS with 3D mapping and NLOS signal detection. Proceedings of the 25th International Technical Meeting of the Satellite Division of the Institute of Navigation (ION GNSS 2012); Nashville, TN:458-472.
33. Wang L, Groves PD, Ziebart MK. GNSS shadow matching: improving urban positioning accuracy using a 3D city model with optimized visibility scoring scheme. NAVIGATION. 2013;60:195-207.
34. Yozevitch R, Moshe BB. A robust shadow matching algorithm for GNSS positioning. NAVIGATION. 2015;62:95-109.
35. Gu Y, Wada Y, Hsu L, Kamijo S. SLAM with 3-dimensional-GNSS. Proceedings of IEEE/ION PLANS 2016; April 2016; Savannah, GA:190-197.
36. Miura S, Hsu L-T, Chen F, Kamijo S. GPS error correction with pseudorange evaluation using three-dimensional maps. IEEE Trans Intell Transp Syst. December 2015;16(6).
37. Ng Y, Gao GX. Direct position estimation utilizing non-line-of-sight (NLOS) GPS signals. Proceedings of the 29th International Technical Meeting of the Satellite Division of The Institute of Navigation (ION GNSS+ 2016); September 2016; Portland, OR:1279-1284.
38. Gentner C, Ma B, Pöhlmann R, Ulmschneider M, Jost T, Dammann A. Simultaneous localization and mapping in multipath environments: mapping and reusing of virtual transmitters. Proceedings of the 29th International Technical Meeting of the Satellite Division of The Institute of Navigation (ION GNSS+ 2016); September 2016; Portland, OR:1170-1177.
39. Gentner C, Pöhlmann R, Ulmschneider M, Jost T, Dammann A. Multipath assisted positioning for pedestrians. Proceedings of the 28th International Technical Meeting of the Satellite Division of The Institute of Navigation (ION GNSS+ 2015); September 2015; Tampa, FL:2079-2086.
40. Gentner C, Ma B, Ulmschneider M, Jost T, Dammann A. Simultaneous localization and mapping in multipath environments. Proceedings of IEEE/ION PLANS 2016; April 2016; Savannah, GA:807-815.
41. Chen Y, Medioni G. Object modelling by registration of multiple range images. Proceedings of the 1991 IEEE International Conference on Robotics and Automation; April 1991; Sacramento, CA.
42. Besl PJ, McKay ND. A method for registration of 3-D shapes. IEEE Trans Pattern Anal Mach Intell. 1992.
43. State of Illinois Data Portal. Champaign building footprints. https://data.illinois.gov/Municipality/Building-Footprints/; 2017.
44. Torr PHS, Zisserman A. MLESAC: a new robust estimator with application to estimating image geometry. Comput Vis Image Underst. 2000.
45. Strutz T. Data Fitting and Uncertainty: A Practical Introduction to Weighted Least Squares and Beyond; 2016.
46. Hastings WK. Monte Carlo sampling methods using Markov chains and their applications; 1970.
47. Peyraud S, Bétaille D, Renault S, et al. About non-line-of-sight satellite detection and exclusion in a 3D map-aided localization algorithm. Sensors. 2013.
48. Suzuki T, Nakano Y, Amano Y. NLOS multipath detection by using machine learning in urban environments. Proceedings of the 30th International Technical Meeting of the Satellite Division of the Institute of Navigation (ION GNSS+ 2017); September 2017; Portland, OR:3958-3967.
49. Koller D, Friedman N. Probabilistic Graphical Models: Principles and Techniques: The MIT Press; 2009.
50. Aster RC, Borchers B, Thurber CH. Parameter Estimation and Inverse Problems. 2nd ed.: Academic Press; 2011.


51. Dellaert F, Kaess M. Square Root SAM: simultaneous localization and mapping via square root information smoothing. International Journal of Robotics Research. 2006.
52. Kümmerle R, Grisetti G, Strasdat H, Konolige K, Burgard W. g2o: a general framework for graph optimization. IEEE International Conference on Robotics and Automation (ICRA); 2011.
53. Zhou Y, Tuzel O. VoxelNet: end-to-end learning for point cloud based 3D object detection. Apple, Inc.; 2017.
54. Minemura K, Liau H, Monrroy A, Kato S. LMNet: real-time multiclass object detection on CPU using 3D LiDAR. Intel; May 2018.
55. Federal Aviation Administration. FAA REGISTRY: N720AR.

How to cite this article: Chen D, Gao GX. Probabilistic graphical fusion of LiDAR, GPS, and 3D building maps for urban UAV navigation. NAVIGATION. 2019;66:151–168. https://doi.org/10.1002/navi.298