
Page 1:

Challenges for Fast Autonomous Navigation with Lightweight Drones

Vijay Kumar, Professor and Nemirovsky Family Dean, University of Pennsylvania

2019 IEEE International Conference on Robotics and Automation, May 20-24, 2019, Montreal, Canada

Page 2:

Aerial Robotics Research (and Commercialization)

Small, Safe, Speed, Smart ("We are here!"), Swarms

Page 3:

Search and Rescue

Page 4:

Global Health Challenges

The Power of Local: Building aid, health, development, and environmental solutions by sustainably localizing robotics technologies

Page 5:


Technology Drivers

• Computation
• Sensing
• Energy

Page 6:

Computation

Table 1: Lightweight versus low-power computers for autonomous flight.

Computer                         Cores  Clock (GHz)  MFLOPS (single-core)  MFLOPS (multi-core)  Mass (g)  MFLOPS/g (single-core)  MFLOPS/g (multi-core)
Intel NUC (i3-5010U)             2      1.7          1900                  4250                 210       9.05                    20.24
Intel NUC (i5-5250U)             2      2.1          2730                  5400                 210       13.00                   25.71
Intel NUC (i7-5557U)             2      3.1          3440                  7480                 210       16.38                   35.62
Odroid XU3 (Exynos 5422)         4      2.0          1030                  4270                 38        27.11                   112.37
Qualcomm Eagle (Snapdragon 801)  4      2.5          1200                  4350                 27        44.44                   161.11
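The last two columns are just throughput divided by mass, which is the figure of merit that favors the small boards. A quick sanity check of those numbers (values copied from the table; an illustrative snippet, not from the talk):

```python
# MFLOPS per gram for the boards in Table 1: (single-core, multi-core, mass).
boards = {
    "Intel NUC (i3-5010U)":            (1900, 4250, 210),
    "Intel NUC (i5-5250U)":            (2730, 5400, 210),
    "Intel NUC (i7-5557U)":            (3440, 7480, 210),
    "Odroid XU3 (Exynos 5422)":        (1030, 4270, 38),
    "Qualcomm Eagle (Snapdragon 801)": (1200, 4350, 27),
}
for name, (single, multi, mass_g) in boards.items():
    # e.g., Snapdragon 801: 1200/27 = 44.44 and 4350/27 = 161.11 MFLOPS/g
    print(f"{name}: {single/mass_g:.2f} / {multi/mass_g:.2f} MFLOPS/g")
```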


250 gram, 20 W Open Vision Computer: TX2 + FPGA. IMU synchronization and feature extraction on the FPGA with DMA access to NVIDIA TX2 memory; VIO at 20 FPS and real-time stereo on the TX2.

Morgan Quigley (OSRF)
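What the OVC's FPGA buys a VIO pipeline is hardware timestamping of IMU samples and camera frames against one clock. A minimal sketch of the downstream consequence, bucketing IMU samples between consecutive frames for preintegration (the rates and data layout here are assumptions for illustration, not the OVC driver interface):

```python
import numpy as np

# With hardware-synchronized timestamps, each inter-frame interval gets
# exactly the IMU samples that belong to it (the input to preintegration).
imu_t = np.arange(0.0, 1.0, 1 / 200)    # assumed 200 Hz IMU clock (s)
frame_t = np.arange(0.0, 1.0, 1 / 20)   # 20 FPS camera on the same clock (s)

def imu_between(t0, t1):
    """Indices of IMU samples with timestamps in (t0, t1]."""
    return np.where((imu_t > t0) & (imu_t <= t1))[0]

for k in range(1, 4):
    idx = imu_between(frame_t[k - 1], frame_t[k])
    print(f"frame {k}: {len(idx)} IMU samples to preintegrate")
```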

Page 7:

Model          Range     Vertical FOV     Resolution                                     Weight*  Power  Cost
HDL-64E        120 m     26.8°            <2 cm; 0.08° (azimuth), 0.4° (elevation)       13.2 kg  60 W   $75K
HDL-32E        100 m     41°              ±2 cm; 0.1°-0.4° (azimuth), 1.33° (elevation)  1 kg     12 W   $30K
VLP-16 (Puck)  100 m     30°              ±3 cm; 0.1°-0.4° (azimuth), 2° (elevation)     830 g    8 W    $7,999
VLP-16 Lite    100 m     30°              ±3 cm; 0.1°-0.4° (azimuth), 2° (elevation)     590 g    8 W    $9,399
Ouster OS-1    40-120 m  45° (128 beams)  ±1.5-10 cm; 0.13°-0.18°                        380 g           $10,000

Page 8:

Power

[Plot: Specific Power (W/kg) vs. Specific Energy (Whr/kg), marking lithium polymer batteries, quadrotors with 10-20 min endurance, Bolt (10 secs), Armstrong (20 mins), and a 10,000 Whr/kg reference.] [B. Morgan, ARL and Y. Mulgaonkar, Penn]

Lose weight!
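The axes translate directly into endurance: hover time is battery energy (specific energy times battery mass) divided by hover power, which is why shedding mass pays twice. A back-of-the-envelope sketch; all three input values below are illustrative assumptions, not measurements from the talk:

```python
# Hover endurance ~= battery energy / hover power.
specific_energy_wh_per_kg = 180   # assumed LiPo specific energy
battery_mass_kg = 0.30            # assumed battery mass
hover_power_w = 250               # assumed hover power draw

energy_wh = specific_energy_wh_per_kg * battery_mass_kg   # 54 Wh
endurance_min = 60 * energy_wh / hover_power_w            # ~13 min
print(f"Estimated hover endurance: {endurance_min:.1f} min")
```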

Page 9:

The Falcon: Fast Aggressive Lightweight flight in CONstrained Environments

DARPA FLA

Page 10:

Autonomy vs. Size

[Chart: autonomy versus size, 2011-2016. Data points include 20 g (2011, relying on external infrastructure), 1750 g (2012), 1850 g (2013), 650 g (2014), 740 g (2015), and 250 g (2016), plus 75 g, 3000 g, and 3500 g platforms; onboard sensing progresses from a 2-D laser scanner to 3-D laser and vision + IMU.]

Page 11:

A. Weinstein, A. Cho, G. Loianno, and V. Kumar, "VIO-Swarm: A Swarm of 250 gram autonomous quadrotors," ICRA 2018.

Light Weight Autonomy

S. S. Shivakumar, K. Mohta, B. Pfrommer, V. Kumar, and C. J. Taylor, "Guided Semi Global Optimization for Real Time Dense Depth Estimation," ICRA 2019.

1 kg quadrotor

Stereo camera synced with VectorNav IMU; NVIDIA Jetson TX2 + FPGA (low-level pixel-wise operations) – OSRF

ToF 3-D camera, 6 m range, 100×65 deg FOV, 60 Hz – PMD Technologies
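For background, semi-global optimization methods of this family aggregate a per-pixel matching cost C(p, d) along several scanline directions r. The classic SGM recurrence is shown below (Hirschmüller's formulation; the guided variant in the cited paper modifies this, so treat it as context rather than the paper's exact algorithm):

```latex
L_r(\mathbf{p}, d) = C(\mathbf{p}, d)
  + \min\!\Big( L_r(\mathbf{p}-\mathbf{r}, d),\;
                L_r(\mathbf{p}-\mathbf{r}, d \pm 1) + P_1,\;
                \min_k L_r(\mathbf{p}-\mathbf{r}, k) + P_2 \Big)
  - \min_k L_r(\mathbf{p}-\mathbf{r}, k),
\qquad
S(\mathbf{p}, d) = \sum_r L_r(\mathbf{p}, d)
```

The disparity is chosen as d*(p) = argmin_d S(p, d); P_1 lightly penalizes one-pixel disparity changes (slanted surfaces) while P_2 > P_1 penalizes larger jumps (depth discontinuities), and the trailing subtraction only bounds L_r without changing the minimizer.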

Shreyas Shivakumar

CJ Taylor

Sikang Liu

Mike Watterson

Ke Sun

Kartik Mohta

Page 12:

A. Weinstein, A. Cho, G. Loianno, and V. Kumar, "VIO-Swarm: A Swarm of 250 gram autonomous quadrotors," ICRA 2018.

Light Weight Autonomy

250 gram quadrotor: Qualcomm® Snapdragon Flight™ development board running the Snapdragon Navigator™ flight controller and Machine Vision (MV) SDK

Giuseppe Loianno

Shreyas Shivakumar

Ke Sun

Kartik Mohta

CJ Taylor


Page 13:

Autonomous Flight in Fukushima Daiichi Reactor Unit 1

Monica Garcia (SWRI), Richard Garcia (SWRI), Wataru Sato (TEPCO)

Giuseppe Loianno

Dinesh Thakur

Laura Jarin-Lipschitz

Wenxin Liu

Elijah Lee

Page 14:


Coping with environmental conditions

Fig. 4: Onboard LEDs and drive board mounted on the MAV (left). The lighting payload is equipped with (A) a LEDiL C16029 STRADA-SQ-C lens and (B) a LEDiL C12727 STRADA-SQ-VSM lens. The Cree XP-G3 illumination distribution is used for the lenses. The concentric rings on the test target indicate fields of view of 100° and 120° captured with the bottom camera.

Fig. 5: Picture of the mock-up PCV viewed from a camera located at X100B (left); point cloud of the test fixture captured via Kaarta Stencil (center); OctoMap and automatically generated JPS 3D navigation path (right).

…widest and most uniform central illumination spot, while the LEDiL C12727 STRADA-SQ-VSM [15] generated the most uniform illumination spot over the entire FOV of the camera. The Cree LEDs have a 125° directional viewing angle. The forward cameras have a much narrower FOV of 90°, hence no lenses were added.

We have designed and fabricated a custom LED drive controller board for use with the MAV. Constant current LED drivers are used in the lighting payload. A constant current driver generates consistent illumination throughout the flight and protects the LED from thermal runaway if the ambient temperature or the temperature of the LED increases during the flight. The board is equipped with trim pot resistors to control the lux output of the LEDs. Fig. 4 shows the LED drive board with two different lenses and the corresponding illumination distribution, as well as a picture of the lighting payload installed on the MAV platform. The lighting payload adds only 8 g to the MAV weight and draws a maximum current of 1.5 A at maximum light output of all three LEDs.

B. Waterproof System

Waterproofing the vehicle requires careful consideration of each component. A multi-step conformal coating process is performed on the Snapdragon Flight board, Electronic Speed Control (ESC), and the LED driver board. Specifically, the pins, connectors, and cameras are removed and a first conformal coating is applied with Arathane 5750-A/B (LV), which is a soft translucent urethane composite designed for insulating printed circuit boards and electronic components. Once the initial coating process is completed, the connectors are reattached, and the second coating process is performed using RTV silicone. To protect the LEDs from the adverse effects of dripping water, the front PCB surface of the LEDs was coated with RTV silicone. The downward-facing camera is also coated with a hydrophobic coating. Glass coverslips with a hydrophobic coating were installed on the forward-facing stereo cameras. A waterproof coating is added to the battery cell, balancing cable, and circuit.

III. AUTONOMOUS NAVIGATION

In this section, we describe the main algorithms enabling the autonomous capabilities of the aerial vehicle in unknown and unstructured environments.

A. Control and Estimation

The system combines the Inertial Measurement Unit (IMU) data and the VGA downward camera image data in an Extended Kalman Filter (EKF) framework in order to localize the vehicle in the environment [12]. The prediction step is based on integration of the IMU data. The measurement update step is given by the standard 3D landmark perspective…
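As a rough illustration of that predict/update loop, here is a toy EKF with a position-velocity state and a direct 3D position measurement standing in for the landmark perspective update (gains, rates, and state layout are assumptions; this is not the Snapdragon Flight implementation):

```python
import numpy as np

# Toy EKF: IMU integration drives prediction; a vision-derived 3D position
# fix drives the measurement update. State x = [position(3), velocity(3)].
dt = 0.004                     # assumed 250 Hz IMU rate
F = np.eye(6)
F[:3, 3:] = dt * np.eye(3)     # position integrates velocity
Q = 1e-4 * np.eye(6)           # process noise (tuning assumption)
H = np.hstack([np.eye(3), np.zeros((3, 3))])  # camera measures position
R_meas = 1e-2 * np.eye(3)      # measurement noise (tuning assumption)

def predict(x, P, accel_world):
    """Propagate with gravity-compensated world-frame IMU acceleration."""
    x = F @ x
    x[3:] += dt * accel_world  # velocity integrates acceleration
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, z_pos):
    """Correct with a 3D position measurement derived from vision."""
    y = z_pos - H @ x                       # innovation
    S = H @ P @ H.T + R_meas
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    return x + K @ y, (np.eye(6) - K @ H) @ P
```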

LED lighting with lenses, custom LED drivers

Fig. 8: MAV flight inside the dripping water setup (left); MAV flight seen from the X100B camera with IR illuminator (center); another view from a camera without an IR illuminator at the same timestamp (right). The MAV is indicated with a yellow circle, while red stars indicate the IR illuminators of the various cameras throughout the setup.

IV. EXPERIMENTAL RESULTS

An important step prior to deployment in nuclear reactor environments is the evaluation of developed technologies in mock-up facilities. In this section, we report on the experiments carried out in a mock-up of the PCV (10 m diameter) and CRD rail as shown in Fig. 8. Three different tests were conducted to validate the proposed setup and strategy. During all the tests, the nominal velocity of the vehicle was limited to 0.5 m/s.

A. Lighting payload tests

We conducted a basic flight time test to determine the approximate impact of the lighting payload on the flight time of the system. Flight times were recorded manually: a timer was started when the motors started spinning and stopped when the vehicle landed due to insufficient power. The data was averaged over multiple runs. The vehicle was set to hover at a constant height during the tests. Batteries from two different manufacturers were used during the tests. Battery information obtained via the Snapdragon Flight board indicated that the vehicle was consuming almost 9 A when hovering with the lighting payload running, and approximately 7.5 A when hovering with the lighting payload turned off. The vehicle consumed approximately 0.5 A when turned on but before the motors were activated. The results of the test can be found in Table I. All times are recorded in seconds (s). Adding the payload leads to only about half a minute of lost flight time.

Fig. 9: Stereo camera images with water directly splashing on the lens.

TABLE I: Flight time test results with lighting payload

Battery                 Flight Time, No Payload (s)  Flight Time, Payload (s)  Impact (s)
Tenergy (950 mAh)       386.9                        351.2                     35.7
Thunderpower (910 mAh)  382.2                        350.4                     31.8

B. Dripping water tests

A localized dripping water test feature was installed inside of the pedestal (Fig. 7). The dripping water test feature consists of a porous garden hose and a water capture basin. The hose was arranged in a coil pattern along a portion of the pedestal ceiling to increase the surface area. The water capture basin consists of a plastic sheet and elevated edges that contain water dispensed from the porous water hose. The porous garden hose emits droplets of water from its entire porous surface. This produces semi-random, gravity-propelled droplets along the entire length of the hose. The rate of dripping water during a test was measured with a Stratus precision rain gauge at various locations in the water feature. Measured precipitation rates varied significantly during testing (100 mm/hr to 375 mm/hr) throughout the dripping water test feature. We conducted a series of dripping water tests on the waterproofed MAV. Fig. 9 shows example images of water splash captured with the downward-facing camera and forward-facing stereo camera pair. The MAV waterproofing process was verified by flying the MAV system under the dripping water test feature inside of the pedestal. The vehicle was able to successfully fly in the presence of dripping water. In Fig. 10, global trajectory tracking from two runs is shown, one without dripping water and one with. Each run consisted of a circular trajectory repeated four times. Multiple such runs were conducted.

C. Flight from X100B to CRD Rail

The flight from the X-100B to the pedestal area leverages a general waypoint navigation algorithm coupled with obstacle avoidance. The initial global trajectory was generated using JPS search. The X-100B entry is roughly 5 m above the ground level. Fig. 11 shows the global trajectory tracking of one of the runs and a corresponding OctoMap generated along the path.

Stereo camera images with water dripping
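A sketch of that global planning step follows. JPS (Jump Point Search) is an optimization of A* for uniform-cost grids that prunes symmetric paths; for brevity this sketch runs plain A* on a 6-connected occupancy grid, which returns paths of the same cost (the grid representation and connectivity are assumptions):

```python
import heapq

def astar(occupied, start, goal):
    """Shortest path on a 6-connected 3D grid; occupied is a set of cells."""
    def h(c):  # Manhattan distance, admissible for unit-cost moves
        return sum(abs(a - b) for a, b in zip(c, goal))
    open_set = [(h(start), 0, start, None)]   # (f, g, cell, parent)
    came_from, g_best = {}, {start: 0}
    while open_set:
        _, g, cur, parent = heapq.heappop(open_set)
        if cur in came_from:                  # already expanded
            continue
        came_from[cur] = parent
        if cur == goal:                       # walk parents back to start
            path = [cur]
            while came_from[path[-1]] is not None:
                path.append(came_from[path[-1]])
            return path[::-1]
        x, y, z = cur
        for nxt in ((x+1,y,z), (x-1,y,z), (x,y+1,z),
                    (x,y-1,z), (x,y,z+1), (x,y,z-1)):
            if nxt in occupied:
                continue
            if g + 1 < g_best.get(nxt, float("inf")):
                g_best[nxt] = g + 1
                heapq.heappush(open_set, (g + 1 + h(nxt), g + 1, nxt, cur))
    return None  # no path found
```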

(waterproof coating of battery, cables, boards)

Dinesh Thakur, Laura Jarin-Lipschitz, Alex Zhou, Wataru Sato, Richard Garcia, Monica Garcia, Giuseppe Loianno, and Vijay Kumar, Autonomous Inspection of Primary Containment Vessel Test Fixture using Micro Aerial Vehicles, 2019.

Page 15:


Scaling Down

Yash Mulgaonkar

Y. Mulgaonkar, G. Cross, and V. Kumar, "Design of small, safe and robust quadrotor swarms," in IEEE International Conference on Robotics and Automation (ICRA), Seattle, WA, May 2015.

Page 16:


State Estimation with Collisions

Wenxin Liu, Yash Mulgaonkar, Giuseppe Loianno

Page 17:

Global Health Challenges

The Power of Local: Building aid, health, development, and environmental solutions by sustainably localizing robotics technologies

Page 18:

Last Mile

Credit: Daniel Ronen 2016

Global Health: Drone Delivery

Page 19:

Credit: Matternet 2014

Page 20:
Page 21:
Page 22:
Page 23:

People who live more than 6 km from a health center face 2 to 4 times the risk of adverse outcomes

Akello et al. (2008), Rutherford et al. (2009), Schellenberg et al. (2008), Byass et al. (2008), Okwaraji et al. (2012); Cargo Drones in Humanitarian Contexts (https://drones.fsd.ch/en/3754/, 2016)

Distance increases health risks and costs

Around 50% of the transportation cost of medical supplies is the cost incurred in the last mile of delivery

Last Mile Costs of Public Health Supply Chains in Developing Countries, USAID 2013.

Around 40% of vaccines supplied to Sub-Saharan Africa expire before final-leg transportation and administration

Page 24:

Use Cases

Page 25:
Page 26:

World Mosquito Program

Page 27:
Page 28:

Humanitarian Relief Efforts

Page 29:

Planning/Control with Suspended Payloads

Max. payload velocity  Max. payload angle  Error reduction
1.04 m/s               9.94 deg            67%
2.14 m/s               32.88 deg           61%
3.03 m/s               53.77 deg           54%

Sarah Tang, Valentin Wüest, and Vijay Kumar, "Aggressive flight with suspended payloads using vision-based control," IEEE Robotics and Automation Letters (RA-L), vol. 3, no. 2, pp. 1152-1159, Apr. 2018.
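For context, planners and controllers of this kind typically build on the standard point-mass model of a quadrotor with a cable-suspended load (taut, massless cable); the equations below are the common textbook formulation, stated as background rather than the cited paper's exact model. With quadrotor mass m_Q at position x_Q, load mass m_L at x_L, thrust f along the body axis Re_3, cable length ℓ, unit cable direction p from quadrotor to load, and tension T:

```latex
m_Q \ddot{x}_Q = f R e_3 - m_Q g e_3 + T p, \qquad
m_L \ddot{x}_L = -T p - m_L g e_3, \qquad
x_L = x_Q + \ell\, p, \quad \lVert p \rVert = 1
```

At hover (p = -e_3, zero accelerations) these reduce to T = m_L g and f = (m_Q + m_L) g; the payload angle in the table above is the angle of p from the vertical.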

Sarah Tang

Valentin Wüest

Page 30:

Novel sensors

[Block diagram of the conventional autonomy stack. An Intel NUC or some other conventional processor runs the Planner (Mapping, Planning, Trajectory Generation), which sends Pos, Vel, Acc, Jerk, Yaw, and Yaw rate references to the Position Controller, supported by Sensor Fusion (e.g., UKF) and Processing of Sensor Data. A Pixhawk runs the Attitude Controller (receiving Thrust, R_des, Ω_des), driving the Actuators (Motor Speeds) and, through the Rigid Body Dynamics, the Sensors (IMU) and an Onboard Filter feeding back R and Ω. A "Neuromorphic Fabric" block is overlaid on each stage.]
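A minimal sketch of the position-to-attitude hand-off in this diagram, in the style of the geometric controllers commonly flown on these platforms (the mass, gains, and simplified yaw handling are illustrative assumptions, not the actual flight code):

```python
import numpy as np

m, g = 1.0, 9.81                            # assumed mass (kg) and gravity
Kp, Kv = 6.0 * np.eye(3), 4.0 * np.eye(3)   # assumed PD gains
e3 = np.array([0.0, 0.0, 1.0])

def position_controller(p, v, p_des, v_des, a_des, yaw_des, R):
    """Map position/velocity errors to (thrust, R_des) for the attitude loop."""
    # Desired total force: PD feedback plus gravity and feedforward accel.
    F_des = -Kp @ (p - p_des) - Kv @ (v - v_des) + m * g * e3 + m * a_des
    thrust = F_des @ (R @ e3)               # project onto current body z-axis
    b3 = F_des / np.linalg.norm(F_des)      # desired body z-axis
    c_yaw = np.array([np.cos(yaw_des), np.sin(yaw_des), 0.0])
    b2 = np.cross(b3, c_yaw)
    b2 /= np.linalg.norm(b2)
    b1 = np.cross(b2, b3)
    R_des = np.column_stack([b1, b2, b3])   # desired attitude
    return thrust, R_des
```

The attitude controller on the Pixhawk side then computes body moments from the error between R and R_des and the angular velocity error, closing the inner loop at a much higher rate than the position loop.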

Deep Learning?

• Perception: deep learning for object detection, classification, and semantic perception
• Low-level sensing and control: deep learning for flight control, visual odometry, and collision avoidance
• Memory: memory architectures for learning navigation policies
• Learning autonomous behaviors: deep reinforcement learning for autonomous flight