Telesupervised Adaptive Ocean Sensor Fleet Year 3 Midterm Review
March 5, 2009
Carnegie Mellon University (CMU)
NASA Goddard Space Flight Center (GSFC)
NASA Wallops Flight Facility (WFF)
Jet Propulsion Laboratory (JPL)
PI: John M. Dolan, CMU
Co-I’s: Gregg Podnar, CMU
Jeff Hosler, GSFC
John & Tiffany Moisan, WFF
Alberto Elfes, JPL
2
Outline
• Project and system overview (slides 2-6)
• Response to Year 2 External Review (slide 7)
• Technical status (slides 8-34)
• Schedule, milestones, and work planned (slides 35-40)
• Critical issues (slide 41)
• Financial status (slides 42-47)
• Educational outreach and presentations (slides 48-51)
• Acronyms/glossary (slide 52)
3
Team Members and Roles
• John M. Dolan (CMU), PI
• Gregg Podnar (CMU), Co-I and Program Manager
• Jeff Hosler (GSFC), Co-I and ASF Software
• John Moisan (WFF), Co-I and OASIS Platform Lead
• Tiffany Moisan (WFF), Co-I and HAB Science Lead
• Alberto Elfes (JPL), Co-I and HAB Data Analysis and Prediction
• John Higinbotham (EST), OASIS Platform Software and Operations
• Alan Guisewite (CMU), Field Observation System and Infrastructure
• Steve Stancliff (CMU), Software Architecture and MSB Platform Development
• Christopher Baker (CMU), Software Architecture and Testing
• Bryan Low (CMU), Adaptive Sampling Techniques
• David Schlesinger (CMU), Data Storage and Visualization
• David Asikin (CMU), MSB/RSB Platform Control and Navigation
• Vinay Gunasekaran (CMU), MSB/RSB Platform Control and Navigation
• Yunchan Paik (CMU), Path Planning and Software Architecture
• Rob Waaser (CMU), RSB Platform Development
4
Telesupervised Adaptive Ocean Sensor Fleet (TAOSF)
Objective
Key Milestones
TRL at start = 4
• Improved in-situ study of Harmful Algal Blooms (HAB), coastal pollutants, oil spills, and hurricane factors
• Expanded data-gathering effectiveness and science return of existing NOAA OASIS (Ocean Atmosphere Sensor Integration System) surface vehicles
• Establishment of sensor web capability combining ocean-deployed and space sensors
• Manageable demands on scientists for tasking, control, and monitoring
Approach
• Telesupervision of a networked fleet of NOAA surface autonomous vehicles (OASIS)
• Adaptive repositioning of sensor assets based on environmental sensor inputs (e.g., concentration gradients)
• Integration of complementary established and emergent technologies: System Supervision Architecture (SSA), Inference Grids, Adaptive Sensor Fleet (ASF), Instrument Remote Control (IRC), and OASIS
• Thorough, realistic, step-by-step testing in relevant environments
PI: John Dolan, CMU
Co-I’s/Partners
• Gregg Podnar / CMU
• Jeffrey Hosler, John Moisan, Tiffany Moisan / GSFC
• Alberto Elfes / JPL
• Interface Definition Document Feb 2007
• Test components on one platform in water May 2007
• Autonomous multi-platform mapping of dye Jul 2007
• Science requirements for Inference Grid Feb 2008
• Multi-platform concentration search simulation May 2008
• HAB search in estuary for high concentration Jul 2008
• Moving-water test plan & identify location Feb 2009
• Simulate test using in-situ and MODIS data May 2009
• Use MODIS data to target and reassign fleet Jul 2009
TRL at end of Year 1 = 5
TRL at end of Year 2 = 6
Telesupervised Adaptive Ocean Sensor Fleet investigating
a Harmful Algal Bloom (simulated with water-tracing dye).
5
TAOSF Program Synergy
[Program synergy diagram: the ESTO Office (inputs; collaborative partner), 14 PhD, MS, BS, and HS students, and GSFC feed the Telesupervised Adaptive Ocean Sensor Fleet Project, which delivers AIST value-added outputs to tools and technology users: OASIS Platforms, Adaptive Sensor Fleet, Multi-Robot Telesupervision Architecture, Planetary Exploration, HAB Detection, Inference Grids, Sensor Web Infrastructure.]
6
TAOSF System Overview
High-level planning and monitoring
• System Components
  – System Supervision Architecture (SSA)
  – Adaptive Sensor Fleet (ASF)
  – Instrument Remote Control (IRC)
  – Inference Grids (IG)
  – Marine platforms (OASIS)
High-bandwidth, single-platform telepresence
Low-bandwidth, multi-platform telemetry
7
TAOSF System Overview
[Architecture diagram: the System Supervision Architecture (CMU/JPL) connects through an SSA Platform Interface (CMU) to the Adaptive Sensor Fleet (GSFC), the Multi-Platform Simulation Environment (GSFC), a Platform Simulator (CMU), the OASIS ASV System (EST/WFF), River Platforms (CMU), Buoys and Aerostat (CMU/JPL), and an Underwater Platform (UCSD).]
CMU: Carnegie Mellon University; GSFC: Goddard Space Flight Center; WFF: Wallops Flight Facility; EST: Emergent Space Technologies; JPL: Jet Propulsion Laboratory; UCSD: University of California at San Diego
8
Response to Year 2 External Review
• Limited OASIS platform availability
  – Legitimate concern; our solution is to develop additional platforms at CMU
  – These platforms are not HAB-chasers, but will allow us to demonstrate the multi-platform telesupervision capability which is the project’s principal goal
• Loss of aerostat, possible alternative kite system
  – The aerostat system was valuable in Year 1 and 2 tests; however, in Year 3 we are concentrating on chlorophyll-a mapping in the absence of HAB, and it is not visible from the air except possibly with highly expensive instruments
• OASIS power system limitations
  – Severity of the problem depends on the extent to which: 1) dye/HAB float with the current; 2) OASIS floats with the current vs. a combination of current and wind
  – Our tests to date, including our July 2008 test, have not shown a strong wind effect, and OASIS was able to use water-coordinates-based navigation to remain with the dye. We will soon test the same with chlorophyll-a.
• Dye gap measurement inexact
  – Main factor is probably target-drift-compensation error; can be improved with real-time current measurement
• Knowledge of HAB occurrence predictors
  – Many data gathered, but no scientific consensus among ocean biologists
  – OASIS boats’ data will aid analysis of this problem
9
Technical Status (progress since last review)
• Operational platforms (slides 9-13)
– 2 OASIS platforms with chl-a sensor packages
– Created Miniature Sensor Boat (MSB) and Robot Sensor Boat (RSB)
• Software (slides 15-18)
– TAOSF architecture design and software extensions
• System Tests (slides 20-26)
– ASF – SSA data throughput tests
– MSB and RSB pool tests
• Data analysis (slides 28-34)
– Analysis of in-water dye-mapping tests
10
OASIS Platforms (NOAA Platform Team)
Year 1
• Brought OASIS-1 to the level of OASIS-2; third platform (OASIS-3) near completion
Year 2
• Completion of OASIS-3 (a.k.a. ASV3)
• OASIS-2 and OASIS-3 operational; upgraded OASIS-1 still in testing
Year 3 to date
• OASIS-1 decommissioned
[Photos: OASIS-3 platform; OASIS-2 about to launch]
11
OASIS Platform Status
The ASV1 platform has completed its lifecycle as an early prototype system and has been decommissioned to maximize use of diminishing funding resources. Functional components are being used to support maintenance or serve as spares for the operational ASV2 and ASV3 platforms.
12
OASIS Platform Sensors & Telemetry
Chlorophyll sensor added to ASV3 since Y2 end-of-year review.
13
Miniature Sensor Boat
• Miniature Sensor Boat (MSB)
  – Boat model retrofitted for autonomy
  – Prototype control/sensing system transferable to other platforms
[Photos: control and sensing hardware (Nomad GPS, compass, USB hub, cellular modem comms, air temp sensor, …); autonomy hardware mounted in MSB; MSB side view]
14
Robot Sensor Boat
• Robot Sensor Boat (RSB)
  – Ported MSB control system to RSB
  – Plan to build two more for year-end system test
[Photos: diver propulsion system (2); kayak base platform; water sonde]
15
Technical Status
• Operational platforms
• Software
• System Tests
• Data analysis
16
Software Progress Overview
• Year 1 (Sep 2006 – Aug 2007)
  – Integration of CMU (SSA – System Supervision Architecture), U.S. Navy OCU (MOCU¹), Goddard (ASF – Adaptive Sensor Fleet), and Wallops (OASIS) software into a single system for OASIS platform control, monitoring, and data collection
• Year 2 (Sep 2007 – Aug 2008)
  – Improved interface (OCU support for heterogeneous fleet; interactive data access and display)
  – Improved simulation capability (turning, wind, current)
  – Initial cooperative and adaptive search algorithms
• Year 3 to date
  – Improved ASF-SSA data throughput
  – Control and telemetry software for new MSB/RSB platforms
  – Multiple improvements to existing software
¹MOCU (Multi-Robot Operator Control Unit) is developed by SPAWAR Systems Center San Diego (SSC-SD)
17
Software Subsystems
[Software subsystem diagram: the System Supervision Architecture (CMU) comprises the Platform Communicator, Adaptive Sampling / Inference Grids (JPL), Task Planning, Data Storage and Retrieval, Web-based Data Displays, and Operator Control Interfaces; through a Platform Interface it connects to the Adaptive Sensor Fleet (GSFC), the OASIS ASV System (EST, WFF), the Multi-Platform Simulation Environment, a Platform Simulator, and CMU Remote Sensor Platforms.]
CMU: Carnegie Mellon University; GSFC: Goddard Space Flight Center; WFF: Wallops Flight Facility; EST: Emergent Space Technologies; JPL: Jet Propulsion Laboratory
18
Software Progress
• New software components
  – “Platform” module (based on the simulator module) to run MSB / RSB and other entities (drift buoys, etc.)
    • Low-level interfaces to sensors and actuators
    • High-level interfaces to SSA telemetry and command data, waypoint planner, etc.
  – Simple GUI interface (simple = no maps, etc.) which runs on a ruggedized Linux PDA for controlling platforms in the field
[Photos: PDA GUI close-up showing engineering telemetry; ruggedized PDA & GUI]
19
Software Progress
• Improvements to existing software
  – Improved data throughput in ASF-SSA interface
  – New control modes added to the SSA system to control the MSB / RSB
– Modularity in engineering data similar to previous modularity in science data (deal easily with platforms having different engineering data)
– Incremental improvements in many modules in order to improve reliability and throughput
– Improved software logging capabilities / granularity
– Archiving of command files in database (not fully integrated)
20
Technical Status
• Operational platforms
• Software
• System Tests
• Data analysis
21
System Tests
Year 1
• Multiple dry system software tests
• Three in-water single-platform tests, the final one (21 Aug 07) with dye-mapping and sensor validation system
Year 2
• Multiple system component tests (communications, navigation, drift compensation, telemetry)
• Two in-water multi-platform tests, the second (2 July 08) with drift-compensated dye-mapping
Year 3 to date
• UCSD UUV telemetry dry test with upload to SSA database
• ASF – SSA data throughput tests
• February 2009 pool tests of MSB and RSB
22
ASF – SSA Interface Tests
• Component purpose: translate commands and telemetry between SSA software components and the OASIS platforms
• Goals of recent work
– Increase telemetry throughput by porting telemetry output to new sockets-based SSA interfaces
• Insufficient throughput causes data to back up on the ASF side of the interface, causing large delays
• Previous work had already substantially enhanced throughput, but was still only marginally sufficient (0.8-1.1Hz) relative to expected platform data rates (~1Hz)
– Incorporate updated ASF capabilities
• Allow capture radius and speed to be specified per commanded waypoint
• Report current commanded path and tracking status back to operator, e.g. “Working on waypoint 3 out of 5”
• Results
– Telemetry output now possible at roughly 2Hz per platform (leaves acceptable overhead to recover from transient problems)
– Current command and tracking status correctly reported from ASF into SSA (allows more effective command verification and monitoring)
– Long-term stability
• Successfully ran 3 simulated boats for roughly one hour
• Negligible steady-state delay, no appreciable accumulating delay
– Ready for dry test with real OASIS platforms
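The throughput criterion above (sustained ~2 Hz per platform against a ~1 Hz platform data rate) can be checked with a small harness. This is an illustrative sketch only, not the actual ASF-SSA interface code; the newline-delimited record format, payload size, and function names are assumptions:

```python
import socket
import threading
import time

def run_sink(sock, counts):
    """Count newline-delimited telemetry records arriving on the socket."""
    buf = b""
    while True:
        data = sock.recv(4096)
        if not data:           # peer closed: all records sent
            break
        buf += data
        while b"\n" in buf:
            _, buf = buf.split(b"\n", 1)
            counts[0] += 1

def measure_rate(records, payload=b"x" * 200):
    """Push `records` telemetry messages through a local socket pair and
    return the sustained throughput in records per second."""
    sender, receiver = socket.socketpair()
    counts = [0]
    sink = threading.Thread(target=run_sink, args=(receiver, counts))
    sink.start()
    start = time.monotonic()
    for _ in range(records):
        sender.sendall(payload + b"\n")
    sender.close()             # signals EOF to the sink thread
    sink.join()
    elapsed = time.monotonic() - start
    receiver.close()
    return counts[0] / elapsed
```

A rate comfortably above the required 2 Hz per platform leaves the overhead margin the slide describes for recovering from transient delays.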
23
System Tests
• Miniature Sensor Boat (MSB) dry tests (October 2008 through January 2009)
– Determine bandwidth overhead for various transport methods in order to reduce bandwidth usage over cellular modems
– Cellular and 802.11 antenna performance tests
– Testing of sensor/actuator integration with SSA platform code (GPS, compass, air temperature sensor)
– Testing of transmission of engineering and science telemetry over cellular modems to dataserver
24
System Tests
• Miniature Sensor Boat (MSB) in-water tests (Feb 2009)
  – Test platform stability, performance, and remote control capability
[Photos: 20 Feb pool test; platform balancing]
http://water.tsar.ri.cmu.edu/~gwp/RSB/MSB_initial_testing.mp4
25
System Tests
• 27 Feb Robot Sensor Boat (RSB) test
  – Port MSB control methodology to RSB
  – Test platform stability, performance, and remote control capability
http://water.tsar.ri.cmu.edu/~gwp/RSB/RSB_initial_testing.mp4
26
System Test Issues
• WFF was unable to establish an operations contract with EST until early February 2009
• As a result, OASIS testing has been on hold
• Awaiting warmer weather for use of manually driven skiff to gather chlorophyll-a data for analysis prior to OASIS chl-a mapping tests
• MSB/RSB platforms handle well and are ready for field and autonomy testing in warmer weather
27
System Test Plans
Test 0 (System shakeout – dry test)
• March 2009
• Communications and telemetry throughput
Test 1 (Chlorophyll-a mapping – April 2009)
• 2 OASIS platforms
• Raster scanning of Chesapeake Bay area to map chlorophyll-a variations
• Not algae, but a real-world biological process
Test 2 (Adaptive chlorophyll-a mapping – April/May 2009)
• 2 OASIS platforms
• Adaptive “front” mapping or concentration gradient search of Chesapeake Bay area
• Show chlorophyll-a in recent MODIS image
Test 3 (Multi-platform system proving test – June/July 2009)
• 2 OASIS platforms, 1 MSB (Miniature Sensor Boat), 3 RSB (Robot Sensor Boats), multiple simulated platforms
• All systems under telesupervisory control
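Test 2's concentration-gradient search can be illustrated with a minimal greedy gradient climb over a gridded chlorophyll-a field. This is a sketch of the general idea only, not the project's adaptive sampling algorithm; the function name and 8-neighbor move set are assumptions:

```python
def gradient_search(field, start, steps=50):
    """Greedy concentration-gradient search: from `start`, repeatedly move
    to the neighboring cell with the highest reading, stopping at a local
    maximum. `field` is a 2-D grid of (simulated) concentrations."""
    rows, cols = len(field), len(field[0])
    r, c = start
    path = [(r, c)]
    for _ in range(steps):
        br, bc = r, c
        # Examine the 8 neighbors (and the current cell) for a higher reading
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols and field[nr][nc] > field[br][bc]:
                    br, bc = nr, nc
        if (br, bc) == (r, c):   # local maximum reached
            break
        r, c = br, bc
        path.append((r, c))
    return path
```

A real adaptive mapper would also trade off exploration against exploitation and coordinate multiple boats, but the gradient-following core is the same.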
28
Technical Status
• Operational platforms
• Software
• System Tests
• Data analysis
– JPL activities
– 2 Jul 08 test recap
– IG and Log-Gaussian process inference
29
Summary of Activities
1. Definition of comprehensive data format
  • A standard format has been defined in joint work with CMU to store data from all sensor sources
  • JPL is working on the definition of an Engineering Data Record (EDR) for the TAOSF data
2. Development of visualization/science interface (in progress)
  • Easy visualization of MODIS, EO-1 imagery overlaid on ocean test area
3. Implementation of stand-alone Inference Grid software for easy interface to TAOSF architecture (in progress)
  • Implementation of Random Field updating models using sensor data
  • Implementation of multi-property Random Field data structure
  • Interpolation/extrapolation Random Field methods to provide overall structure of the field from sparse data
  • Bloom edge finding and bloom motion estimation methods
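Item 3's Random Field updating can be sketched in the style of a log-odds occupancy-grid update, a common formulation for grids of this kind. The project's actual Inference Grid models may differ, and the sensor likelihoods (`p_hit`, `p_false`) here are assumptions:

```python
import math

class InferenceGrid:
    """Minimal single-property Inference Grid: each cell holds the log-odds
    that dye/bloom is present, updated from noisy binary sensor hits."""

    def __init__(self, rows, cols, p_hit=0.8, p_false=0.2):
        # Log-odds of 0.0 corresponds to the uninformed prior P = 0.5
        self.grid = [[0.0] * cols for _ in range(rows)]
        self.l_hit = math.log(p_hit / p_false)                # detection
        self.l_miss = math.log((1 - p_hit) / (1 - p_false))   # no detection

    def update(self, r, c, detected):
        """Bayes update of one cell from a binary fluorometer reading."""
        self.grid[r][c] += self.l_hit if detected else self.l_miss

    def probability(self, r, c):
        """Convert the cell's log-odds back to a probability."""
        return 1.0 / (1.0 + math.exp(-self.grid[r][c]))
```

A multi-property grid, as on the next slide, would simply keep one such layer per property (dye presence, humidity, air pressure, wind speed).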
30
Example: Multi-Property Inference Grid
[Four grid panels: dye presence probability, humidity, air pressure, wind speed]
31
2 July 2008 Multi-Platform Dye-Mapping Test
• Part A: Map dye patch with multiple platforms during slack tide (high, slow-moving) using multi-platform raster / rectangle scan patterns
• Details
  – Perpendicular current
  – Two distinct dye stripes are laid down
  – Boat A progresses along the dye stripes using a raster scan
  – Boat B performs a box pattern as the dye stripes pass through
  – Drift buoys serve as end markers for the dye stripes
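The two scan patterns can be sketched as waypoint generators. This is purely illustrative; the coordinates, spacing parameter, and function names are assumptions, not the ASF planner's interface:

```python
def raster_waypoints(x0, y0, width, height, spacing):
    """Boustrophedon (raster) scan over a rectangle, alternating sweep
    direction each row, as Boat A does along the dye stripes."""
    pts, y, left_to_right = [], y0, True
    while y <= y0 + height:
        row = [(x0, y), (x0 + width, y)]
        pts.extend(row if left_to_right else row[::-1])
        left_to_right = not left_to_right
        y += spacing
    return pts

def box_waypoints(x0, y0, width, height):
    """Closed box pattern, as Boat B runs while the stripes drift through."""
    return [(x0, y0), (x0 + width, y0), (x0 + width, y0 + height),
            (x0, y0 + height), (x0, y0)]
```

In the actual test these waypoints would additionally be drift-compensated so the pattern stays fixed relative to the moving dye.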
32
2 July 2008 Multi-Platform Dye-Mapping Test
• Combined plot of fluorometer “hits” (above threshold) of OASIS 2 and 3
• An inter-dye-stripe gap is intuitively noticeable, with errors probably due to a combination of drift divergence and fluorometer thresholding
[Plot, axes in meters (150-400): OASIS 2 command, fluorometer hits, and trajectory (raster pattern); OASIS 3 command, fluorometer hits, and trajectory (box pattern); a possible inter-dye-stripe gap is marked. Duration: 15 min 40 sec]
33
Multi-Boat Inference Grid
Dye presence Inference Grids computed from fluorometer data obtained during the July 2, 2008 Transit 2 test:
[Panels: OASIS 2; OASIS 3; OASIS 2+3]
34
Multi-Boat Inference Grid:
Dye presence Inference Grids computed from fluorometer data obtained during the July 2, 2008 Transit 2 test:
OASIS 2+3
35
Data Analysis (2 Jul 08 wet test)
• Log-Gaussian estimation of dye stripes
[Figures: OASIS-2 drift-corrected path; 3D rendering of fluorometer readings (fluorometer reading vs. row); inferred stripe pattern using thresholding]
36
Year 3 Schedule
Yr. 3 start date: Sept. 5, 2008 Yr. 3 end date: Sept. 4, 2009
37
Three-Year ScheduleYr. 1 start date: Sept. 5, 2006 Yr. 3 end date: Sept. 4, 2009
38
Year 2 Milestones (Modified)
• Establish requirements for Inference Grids Dec 2007
• Autonomous multi-platform mapping of dye Mar 2008– Performed 2 July 2008
• Validate target-drift compensation methods Apr 2008– Performed 12 June 2008
• Inference-Grid mapping and bloom localization May 2008– Performed July-October 2008
• Multi-platform HAB search in estuary Jul 2008– No HABs reported during the 2008 season
Note: blue milestones are new or modified
39
Year 3 Milestones (Modified)
• Moving-water test plan & identify location Feb 2009– Performed during Year 2
• Adaptive chlorophyll-a mapping Apr 2009
• Simulate test using in-situ and MODIS data May 2009
– MODIS data are too low-resolution; simulation will be based on chlorophyll-a
• Use MODIS data to target and reassign fleet Jul 2009
Note: blue milestones are new or modified
40
Key Project Milestones
• Interface Definition Document Feb 2007
• Autonomous multi-platform mapping of dye Jul 2007– Performed Jul 2008
• Multi-platform HAB search in estuary Jul 2008– No HABs have occurred, so we will sense chlorophyll-a instead
• Use MODIS data to target and reassign fleet Jul 2009
41
Work Planned
• Refine drift compensation– Include wind adjustments– Acquire drift from speed-in-water sensor, compare to drift buoy
• Perform three graduated system tests
– Raster chlorophyll-a mapping
– Adaptive chlorophyll-a mapping
– Multi-platform, multi-location telesupervised science sensing
• UCSD UUV wet test, showing incorporation of “foreign” platform into SSA
• Data collection and analysis and system telesupervision performing pursuit and observation of a naturally occurring HAB, if possible
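The drift-compensation refinement above can be sketched as advecting the target with the measured current plus a wind-leeway term. The whole function is an illustrative assumption, not the project's navigation code, and the 3% wind factor is a common surface-drift rule of thumb rather than a project parameter:

```python
def drift_compensated_target(target, current_v, elapsed_s,
                             wind_v=(0.0, 0.0), wind_factor=0.03):
    """Re-project a drifting target (e.g. a dye patch) into earth-fixed
    coordinates by advecting it with the water current plus a small
    fraction of the wind velocity (leeway)."""
    tx, ty = target
    cx, cy = current_v   # m/s, e.g. from a drift buoy or speed-in-water sensor
    wx, wy = wind_v      # m/s
    return (tx + (cx + wind_factor * wx) * elapsed_s,
            ty + (cy + wind_factor * wy) * elapsed_s)
```

Comparing the speed-in-water sensor's implied drift against the drift buoy, as planned above, is what would calibrate the current and wind terms.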
42
Critical Issues
• Unavailability of EST partner for first half of Year 3 owing to inability to establish operations contract
• Unavailability of OASIS platforms for testing
Mitigation
• Development of additional (MSB and RSB) platforms at CMU
43
Project Financial Status
Cost Status To Date, All Organizations
[Chart: cumulative cost in $K, monthly, Sep-06 through Feb-09]
Cum Plan:   35 73 111 148 186 224 261 299 337 368 399 431 471 511 551 591 631 671 712 751 791 825 859 892 933 973 1013 1053 1093 1133
Cum Actual: 51 68 87 110 123 153 209 242 298 356 373 383 421 465 515 535 558 580 600 630 679 702 714 785 804 850 902 960 1009 1039
Variance:   16 -5 -23 -38 -63 -71 -53 -57 -39 -12 -27 -48 -50 -46 -36 -57 -74 -91 -112 -122 -113 -123 -144 -108 -129 -123 -111 -93 -84 -94
Notes: 1. Underspent by $94K (1 Mar 09): $2K CMU, $30K GSFC, $8K WFF, $54K JPL
44
Project Financial Status
Cost Status To Date, CMU
[Chart: cumulative cost in $K, monthly, Sep-06 through Feb-09]
Cum Plan:   17 34 50 67 84 101 118 134 151 162 172 182 199 215 232 248 264 281 297 314 330 340 350 361 383 406 428 451 474 496
Cum Actual: 17 34 51 68 77 100 114 129 144 161 174 183 202 216 230 244 258 272 279 294 319 338 356 362 365 386 417 451 474 494
Variance:   0 1 1 1 -7 -1 -3 -6 -8 -1 2 1 3 0 -2 -4 -7 -9 -18 -20 -11 -2 5 2 -18 -20 -11 0 0 -2
45
Project Financial Status
Cost Status To Date, GSFC
[Chart: cumulative cost in $K, monthly, Sep-06 through Feb-09]
Cum Plan:   8 16 25 33 41 49 57 66 74 82 90 98 102 107 111 116 120 124 129 133 137 142 146 150 155 160 164 169 173 178
Cum Actual: 34 34 34 40 43 50 71 79 94 105 107 107 107 109 111 111 114 116 120 124 128 131 133 134 135 138 137 147 146 148
Variance:   26 18 9 7 2 1 14 14 21 23 17 9 4 2 0 -5 -6 -8 -9 -9 -10 -11 -13 -16 -20 -21 -27 -22 -27 -30
Notes: 1. As in Year 1, in Years 2 & 3, GSFC again subtracted overhead up front, but only reports spending on the remainder. The above is based on multiplying reported Year 2 & 3 spending by 1.414 = original budget / (budget - overhead) from Year 1.
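The 1.414 scale factor in the note follows directly from how overhead was subtracted. As a sketch with hypothetical budget numbers (the actual Year 1 figures are not shown here):

```python
def overhead_factor(original_budget, overhead):
    """Factor that maps post-overhead reported spending back to full-cost
    terms: original budget / (budget - overhead)."""
    return original_budget / (original_budget - overhead)

def full_cost(reported_spending, factor=1.414):
    """Reported Year 2/3 spending scaled to be comparable with the plan."""
    return reported_spending * factor
```

An overhead of roughly 29.3% of the original budget yields the 1.414 factor used above.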
46
Project Financial Status
Cost Status To Date, WFF
[Chart: cumulative cost in $K, monthly, Sep-06 through Feb-09]
Cum Plan:   10 20 30 40 51 61 71 81 91 101 112 122 132 143 153 163 174 184 195 205 215 225 236 246 252 258 264 270 275 281
Cum Actual: 0 0 0 0 0 0 20 30 56 86 86 87 92 114 142 142 147 151 152 158 161 161 151 193 206 219 227 239 266 273
Variance:   -10 -20 -30 -40 -51 -61 -51 -51 -35 -16 -25 -35 -40 -29 -11 -21 -27 -33 -43 -47 -54 -64 -85 -53 -46 -39 -37 -31 -9 -8
47
Cost Status To Date, JPL
[Chart: cumulative cost in $K, monthly, Sep-06 through Feb-09]
Cum Plan:   0 3 5 8 10 13 16 18 21 23 26 29 38 47 55 64 73 82 91 100 109 118 127 136 143 150 157 164 171 178
Cum Actual: 0 0 2 2 3 3 3 4 4 5 5 6 20 27 32 38 39 41 49 54 71 72 75 95 98 106 120 123 123 124
Variance:   0 -3 -3 -6 -7 -10 -13 -15 -17 -18 -21 -23 -18 -20 -23 -26 -34 -41 -42 -46 -38 -46 -52 -41 -45 -44 -37 -41 -48 -54
Project Financial Status
48
Project Financial Status
Full Project Cost Status, All Organizations
[Chart: cumulative cost in $K, quarterly, Nov-06 through Aug-09]
Cum Cost Plan:   111 224 337 431 551 671 791 892 1013 1133 1241 1357
Cum Cost Actual: 87 153 298 383 515 580 679 785 902 1039
Variance:        -24 -71 -39 -48 -36 -91 -112 -107 -111 -94
Quarters: Nov-06 Feb-07 May-07 Aug-07 Nov-07 Feb-08 May-08 Aug-08 Nov-08 Feb-09 May-09 Aug-09
49
Educational Outreach
Steve Stancliff
– Ph.D. student, Robotics
– Carnegie Mellon University
Ellie Lin Ratliff
– M.S. student, Robotics
– Carnegie Mellon University
– Graduated May 2008
– Univ. of Pittsburgh Secondary Science Education Master’s Program
Chris Baker
– Ph.D. student, Robotics
– Carnegie Mellon University
Bryan Low
– Ph.D. student, ECE
– Carnegie Mellon University
Sandra Mau
– M.S. student, Robotics
– Carnegie Mellon University
– Graduated May 2007
– Physical Sciences Commercialization Analyst, Univ. of Queensland, Brisbane, Australia
50
Educational Outreach
David Asikin
– M.S. student, Robotics
– Carnegie Mellon University
Cristian Guajardo
– M.S. student, ECE
– Carnegie Mellon University
– Graduated May 2008
– Boeing Corporation, Dallas, TX
Vinay Gunasekaran
– M.S. student, ECE
– Carnegie Mellon University
Yunchan Paik
– B.S. student, ECE
– Carnegie Mellon University
51
Educational Outreach
David Schlesinger
– B.S. student, ECE
– Carnegie Mellon University
– 2006-08 Summer Junior Programmer
Matt Felser
– B.S. student, Computer Science
– Penn State
– 2007-08 Summer Junior Programmer
Shari Eskenas
– B.S.E. student, UCSB
– Electrical Engineering
– 2008 Summer NASA Undergraduate Researcher
Rob Waaser
– B.S. student, ECE
– Carnegie Mellon University
Jeff Baker
– B.S. student, Computer Science
– Duquesne University
– Left TAOSF Sep 2007, working on another CMU project
52
Presentations
19-21 Nov 2008: “Networked Architecture for Robotic Environmental Ocean Science Sensors”, presented by Gregg Podnar at the Sensing a Changing World Workshop, Centre for Geo-information, Wageningen University and Research Centre, Wageningen, The Netherlands.
19-21 November 2008: "Inference Grids for Environmental Mapping and Mission Planning of Autonomous Mobile Environmental Robots", presented by Alberto Elfes at the Sensing a Changing World Workshop, Centre for Geo-information, Wageningen University and Research Centre, Wageningen, The Netherlands.
January 2009: “Telesupervised Adaptive Ocean Sensor Fleet”, published in the NASA Tech Briefs Magazine, January 2009, Vol. 33, No. 1, pp. 30-32, winner of NASA Tech Brief Award in December 2008.
January 2009: “Information-Theoretic Multi-Robot Adaptive Exploration and Mapping”, first author Bryan Low, submitted to the International Joint Conferences on Artificial Intelligence (IJCAI 2009) to be held 11-17 July 2009 in Pasadena, CA.
February 2009: “Robot Boats as a Mobile Exploration Aquatic Sensor Network”, first author Bryan Low, submitted to the Earth and Space Science Applications workshop of the ACM/IEEE International Conference on Information Processing in Sensor Networks (ISSA-ESPN) to be held 16 April, 2009 in San Francisco, CA.
53
Acronyms/Glossary
• ASF – Adaptive Sensor Fleet
• ASV – Autonomous Surface Vehicle
• CMU – Carnegie Mellon University
• EST – Emergent Space Technologies
• GSFC – Goddard Space Flight Center
• HAB – Harmful Algal Bloom
• IG – Inference Grids
• IRC – Instrument Remote Control
• JPL – Jet Propulsion Laboratory
• MOCU – Multi-Robot Operator Control Unit
• MODIS – Moderate Resolution Imaging Spectroradiometer
• MSB – Miniature Sensor Boat
• NOAA – National Oceanic and Atmospheric Administration
• OASIS – Ocean Atmosphere Sensor Integration System
• OCU – Operator Control Unit
• Rhodamine WT – A non-toxic liquid red dye commonly used in water-tracing studies
• RSB – Robot Sensor Boat
• SPAWAR – Space and Naval Warfare Systems Command
• SSA – System Supervision Architecture
• SSC-SD – SPAWAR Systems Center – San Diego
• TAOSF – Telesupervised Adaptive Ocean Sensor Fleet
• UCSB – University of California, Santa Barbara
• UCSD – University of California, San Diego
• UUV – Unmanned Underwater Vehicle
• WAAS – Wide-Area Augmentation System (increases GPS accuracy)
• WFF – Wallops Flight Facility