TRANSCRIPT
Collaboration Between Unmanned Aerial and Ground Vehicles
Dr. Daisy Tang
Key Components
• Autonomous control of individual agents
• Collaborative system
• Mission planning
• Task allocation
• Communication
• Data fusion
• Operator control unit (OCU)
• Situation awareness
• Sliding autonomy
Paper Presentations
• “Deploying air-ground multi-robot teams in urban environments”, by Chaimowicz et al., Multi-Robot Systems, From Swarms to Intelligent Automata, 2005.
  – Presented by Nada Alghofaili
• “Integrated long-range UAV/UGV collaborative target tracking”, by Moseley et al., SPIE Unmanned Systems and Technology XI Conference, 2009.
  – Presented by Wie Li
Some Videos
• https://www.youtube.com/watch?v=fzAlGxrjg1g&spfreload=10
• https://www.youtube.com/watch?v=RPCUB6xjQTI
• https://www.youtube.com/watch?v=eLcctvwXdDg
Motivation of Collaboration
• Challenges of urban environments
  – Buildings pose 3-D constraints on visibility, communication, and GPS
• A network of aerial and ground vehicles working in cooperation is more effective than either platform operating alone
• We need to:
  – Keep the network tightly integrated so that vehicles can support each other
  – Provide ways for a human operator to command the whole network
Research Goal
• Establish the overall paradigm, modeling framework, and software architecture that enable a minimum number of human operators to manage a heterogeneous robotic team with varying degrees of autonomy
Hardware
• Team: five unmanned ground vehicles (UGVs), two fixed-wing aircraft, and a blimp
UGVs
• Chassis of a 48 cm long, 35 cm high scale-model truck
• Pentium III laptop
• Odometry and steering servos
• GPS
• IMU
• A forward-looking stereo camera pair
• A small embedded computer with 802.11 wireless Ethernet
  – A Jbox handles multi-hop routing in the ad-hoc wireless network
UAVs
• Fixed-wing aircraft:
  – Equipped with a Piccolo autopilot, which provides inner-loop attitude and velocity stabilization control
  – A high-resolution camera
  – IMU
  – GPS receiver
  – A radio modem handles communication between the air vehicles and the operator base station
• Blimp:
  – 9 m long, 3 kg payload
  – GPS, IMU, video camera, onboard computation and communication
Software
• ROCI (Remote Object Control Interface) runs on both UAVs and UGVs
  – A high-level OS for programming and managing networks of robots and sensors
  – Each robot is a node that contains several processing and sensing modules and may export different types of services and data to other nodes
  – Complex tasks are built by connecting the inputs and outputs of specific modules; the connections are defined in XML
Localization and Navigation
• A Kalman filter estimates the robot's pose based on
  – wheel-encoder odometry, IMU, GPS, and robot observations from external vision sensors and landmarks
• Navigation follows a list of waypoints, which are either
  – specified manually through a user interface, or
  – generated automatically: a Voronoi diagram of the environment serves as a roadmap for planning intermediate waypoints; the diagram can be built beforehand from overhead imagery obtained by the air vehicles
• Mission scripts
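The fusion step can be illustrated with a minimal one-dimensional Kalman filter that blends odometry predictions with GPS fixes. The motion model, noise values, and measurements below are illustrative assumptions, not the authors' implementation:

```python
def kf_predict(x, P, u, Q):
    """Predict step: advance the pose estimate x by an odometry increment u."""
    x = x + u            # motion model: new pose = old pose + odometry delta
    P = P + Q            # uncertainty grows by the process noise Q
    return x, P

def kf_update(x, P, z, R):
    """Update step: correct the estimate with a GPS position fix z."""
    K = P / (P + R)      # Kalman gain (scalar, 1-D case)
    x = x + K * (z - x)  # blend prediction and measurement
    P = (1 - K) * P      # uncertainty shrinks after the update
    return x, P

# Hypothetical run: robot moves roughly 1 m per step, noisy GPS (R = 2 m^2)
x, P = 0.0, 1.0
for odo, gps in [(1.0, 1.1), (0.9, 2.2), (1.1, 3.0)]:
    x, P = kf_predict(x, P, odo, Q=0.1)
    x, P = kf_update(x, P, gps, R=2.0)
```

The real system fuses several sensors and a multidimensional state, but the prediction/update alternation is the same.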
Trajectory Controller
• A trajectory controller generates the linear and angular velocity commands
• Local obstacle avoidance uses the forward-looking stereo camera pair
• Trajectories can be compared to detect potential collisions
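A common way to generate such velocity commands is a proportional go-to-waypoint law; the sketch below, with made-up gains `k_v` and `k_w`, shows the general idea rather than the paper's controller:

```python
import math

def trajectory_controller(pose, waypoint, k_v=0.5, k_w=1.5):
    """Proportional controller: linear velocity from the distance to the
    waypoint, angular velocity from the heading error (illustrative gains)."""
    x, y, theta = pose
    wx, wy = waypoint
    dist = math.hypot(wx - x, wy - y)
    heading = math.atan2(wy - y, wx - x)
    # Wrap the heading error into [-pi, pi) so the robot takes the short turn
    err = (heading - theta + math.pi) % (2 * math.pi) - math.pi
    v = k_v * dist   # drive faster when far from the waypoint
    w = k_w * err    # turn toward the waypoint
    return v, w

# Robot at the origin facing +x, waypoint 2 m ahead: positive v, near-zero w
v, w = trajectory_controller((0.0, 0.0, 0.0), (2.0, 0.0))
```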
Situation Awareness
• Main interface: the ROCI Browser
  – Displays the network hierarchically
  – The human operator can browse nodes, the tasks running on each node, and the modules that make up each task
• The Browser's main job is to give the user command and control over the network, plus the ability to retrieve and visualize information from any distributed node
Mission Scripts
• The user can remotely start and stop the execution of tasks on the robots and change task parameters or control modes
• More elaborate missions are constructed with scripts, which define a sequence of actions to be performed
  – e.g., capturing panoramic images at different waypoints, or navigating through multiple intermediate waypoints before reaching a target site
A Snapshot
Air-Ground Cooperation
• Challenge: cluttered urban environments
• UAVs can help UGVs by providing localization data and acting as communication relays
• Example: localize ground vehicles using a sequence of images taken from the blimp
  – Relate the robot's position in a global coordinate frame to its pixel coordinates in the image
  – Use a set of known landmarks in the image
  – Rely on measurements from the blimp's onboard GPS/IMU and the camera parameters
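The geometry can be sketched for the simplest case: a nadir-pointing pinhole camera over flat ground. The paper's actual method uses landmarks and the full GPS/IMU pose, so the function below is a simplified illustration with assumed calibration values:

```python
def pixel_to_ground(u, v, cam_pos, f, cx, cy):
    """Back-project pixel (u, v) from a downward-looking camera onto the
    ground plane z = 0. Assumes a nadir-pointing pinhole camera at
    cam_pos = (X, Y, Z), focal length f in pixels, and principal point
    (cx, cy). Illustrative geometry only."""
    X, Y, Z = cam_pos
    # Similar triangles: a pixel offset of (u - cx) maps to a ground
    # offset of (u - cx) / f times the camera height Z
    gx = X + (u - cx) / f * Z
    gy = Y + (v - cy) / f * Z
    return gx, gy

# Blimp camera 30 m above (0, 0); a robot imaged 100 px right of centre
gx, gy = pixel_to_ground(420, 240, (0.0, 0.0, 30.0), f=600.0, cx=320.0, cy=240.0)
# gx = (420 - 320) / 600 * 30 = 5.0 m east of the camera's ground point
```

With an oblique camera the same idea holds, but the ray must first be rotated by the IMU attitude before intersecting the ground plane.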
Comparing Two Methods
• Neither approach alone yields a localization system that is applicable, reliable, and accurate
• Motivation:
  – Find more sophisticated methods for cooperative localization
  – Fuse information from different sources in a systematic way
The Combined Approach
• Based on prior work on decentralized data fusion (DDF) and decentralized active sensor networks
• A collaborative feature search and localization scheme that exploits the complementary characteristics of the UAV and UGVs
  – The UAV rapidly covers the designated search area
  – The UGVs then deploy to refine the feature locations
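The core DDF operation can be shown in one dimension: independent Gaussian estimates of a feature's position are combined in information (inverse-covariance) form, where information simply adds. This is a generic sketch of the fusion rule, not the paper's full decentralized architecture:

```python
def fuse_information(estimates):
    """Fuse independent Gaussian estimates (mean, variance) of a feature
    position in information form: information contributions add, so the
    fused mean is an information-weighted average (1-D sketch)."""
    total_info = sum(1.0 / var for _, var in estimates)   # summed precisions
    total_vec = sum(mean / var for mean, var in estimates)  # information vector
    return total_vec / total_info, 1.0 / total_info

# Hypothetical numbers: the UAV gives a coarse fix (variance 25 m^2),
# a UGV then refines it with a close-range observation (variance 1 m^2)
mean, var = fuse_information([(10.0, 25.0), (12.0, 1.0)])
# The fused estimate sits near the UGV's value, with lower variance than either
```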
Example
Cooperative Radio-Mapping
• Communication is essential for coordination, but radio propagation characteristics are difficult to predict
  – They depend on transmission power, terrain, the 3-D geometry of the environment, and interference
• Goal:
  – Acquire information for radio connectivity maps of urban terrain to help plan multi-robot tasks
• Approach:
  – Build a radio connectivity map, which returns the signal strength between any two positions in the environment
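One plausible data structure for such a map is shown below: measured signal strengths are recorded per pair of discretized positions and averaged on lookup. The grid-cell keying and dBm units are assumptions for illustration, not the paper's representation:

```python
class ConnectivityMap:
    """Sketch of a radio connectivity map: record measured signal strength
    (dBm) between pairs of discretized positions and query it later."""

    def __init__(self, cell=5.0):
        self.cell = cell   # discretization of positions, in metres
        self.samples = {}  # (cell_a, cell_b) -> list of dBm readings

    def _key(self, p):
        return (round(p[0] / self.cell), round(p[1] / self.cell))

    def record(self, a, b, dbm):
        # Sort the cell pair so the link is stored symmetrically
        key = tuple(sorted((self._key(a), self._key(b))))
        self.samples.setdefault(key, []).append(dbm)

    def strength(self, a, b):
        key = tuple(sorted((self._key(a), self._key(b))))
        vals = self.samples.get(key)
        return sum(vals) / len(vals) if vals else None

m = ConnectivityMap()
m.record((0, 0), (20, 0), -60.0)
m.record((1, 1), (21, -1), -64.0)  # falls in the same cells, so it averages
print(m.strength((0, 0), (20, 0)))  # -62.0
```

Unvisited position pairs return `None`, which a planner could treat as unknown connectivity to be measured next.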
Waypoints Navigation
• An overhead surveillance picture is used to generate roadmaps for motion planning and waypoint generation
  – Minimize the probability of losing connectivity under line-of-sight conditions
• Radio signal strength measurements are obtained as team members simultaneously traverse their respective waypoints
  – Each robot broadcasts a message on arrival at a waypoint
  – After the signal measurement, each robot broadcasts a message when it is ready to move on
  – This repeats until all waypoints have been traversed
  – Recovery behavior: return to the last position
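The arrive/measure/ready cycle above amounts to a barrier-synchronized traversal. The sketch below shows that message flow; `StubRobot`, `broadcast`, and `measure` are hypothetical stand-ins for the real platforms and radios:

```python
def traverse_and_measure(robots, waypoint_lists, measure, broadcast):
    """All robots move to their next waypoint, announce arrival, take the
    pairwise signal measurements, announce readiness, then advance together."""
    for step, goals in enumerate(zip(*waypoint_lists)):
        for robot, goal in zip(robots, goals):
            robot.go_to(goal)
            broadcast(robot, "arrived", step)  # arrival message
        measure(robots)                        # pairwise signal readings
        for robot in robots:
            broadcast(robot, "ready", step)    # ready to move on

# Minimal stand-ins for real robots and radios, to show the message flow
log = []

class StubRobot:
    def __init__(self, name):
        self.name = name
    def go_to(self, goal):
        log.append(("go", self.name, goal))

def broadcast(robot, msg, step):
    log.append((msg, robot.name, step))

def measure(robots):
    log.append(("measure", len(robots)))

traverse_and_measure(
    [StubRobot("ugv1"), StubRobot("ugv2")],
    [[(0, 0), (10, 0)], [(0, 5), (10, 5)]],
    measure, broadcast,
)
```

The recovery behavior would hook in here as well: a robot that misses the "ready" round could fall back to its previous waypoint, where connectivity was last confirmed.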
Preliminary Results