GEOSS Evaluation Task Analysis
John Adamec
GEOSS ET Meeting, Geneva, Switzerland
March 22-25, 2010


TRANSCRIPT

Page 1:

GEOSS Evaluation Task Analysis

John Adamec
GEOSS ET Meeting, Geneva, Switzerland
March 22-25, 2010

Page 2:

TASK/TARGET MATCHING

The high-level analysis of all Overarching Tasks

Page 3:

Task/Target Matching

• What we have
– Lars’ analysis of 10 of 14 strategic target areas
– MLQ’s analysis of 3 strategic target areas
– Coverage of 11 of 14 areas
– Two somewhat different approaches
– Previous discussion that this is not satisfactory
– No actionable suggestions for improvement

Page 4:

Task/Target Matching

• Investigated alternative method
– Automated Analysis
• Text comparison tool scores similarity of two documents
• Produces a quantified index of overlap
• Results comparable between Transverse and Strategic Target Areas
– Unsatisfactory
• Identified available tools only discover word-for-word matches, with limited accounting for synonyms
• No accounting for context or meaning
– Could be viable if a team member has access to and experience with natural language analysis software
(http://plagiarism.phys.virginia.edu/Wsoftware.html)
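The word-for-word limitation noted above can be illustrated with a minimal sketch of such a quantified overlap index. This is an assumed bag-of-words cosine similarity, not the specific tool the team evaluated; the function name and scoring scheme are hypothetical, for illustration only:

```python
from collections import Counter
import math

def overlap_index(doc_a: str, doc_b: str) -> float:
    """Illustrative similarity score: cosine of word-count vectors, 0.0 to 1.0.

    Hypothetical stand-in for the text comparison tools discussed;
    real tools may weight or normalize terms differently.
    """
    a, b = Counter(doc_a.lower().split()), Counter(doc_b.lower().split())
    dot = sum(count * b[word] for word, count in a.items())  # b[word] is 0 if absent
    norm = math.hypot(*a.values()) * math.hypot(*b.values())
    return dot / norm if norm else 0.0

# Word-for-word overlap is detected...
print(round(overlap_index("global carbon observation", "carbon observation network"), 3))  # 0.667

# ...but synonyms score zero: no accounting for context or meaning.
print(overlap_index("tsunami early warning", "seaquake alert notice"))  # 0.0
```

Because only exact token matches contribute to the score, semantically equivalent phrasings score zero, which is precisely the shortcoming that makes natural language analysis software attractive here.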

Page 5:

Task/Target Matching

• Proposal
– Lars completes his initial analysis.
– If desired, 1 or 2 more team members replicate the analysis.
– The replicators and Lars produce a set of consensus sheets.
– Summarize findings in written form (answer the Evaluation Framework Questions); this is incorporated into the report.

Page 6:

Task/Target Matching

Questions?

Discussion?

Suggestions?

Agreement?

Page 7:

IN-DEPTH TASK ANALYSIS

Proposal for selection and analysis of a subset of GEOSS implementation activities.

Page 8:

Task Selection

• Need: a list of tasks for in-depth analysis
– Identified before conducting secretariat interviews
– Covers all 14 SBAs and Transverse Areas
– Representative of the breadth of GEOSS implementation
– Yields a reasonable analysis workload for the team

• Based on: a variety of inputs
– Lars’ and MLQ’s partial review, progress reports, Cape Town, diversity of GEOSS activity

Page 9:

Task Selection

• Proposal
– Select 14 tasks (1 from each target area) based on partial analysis
– Narrow that list to 1 sub-task from each of the tasks
• Focus on the “primary” component of the task or an otherwise specifically identified sub-task

This maintains coverage across all of the target areas but cuts out the variation in size of Overarching Tasks and yields a more tractable workload.

Page 10:

Task Selection

Architecture: AR-09-04a GEONETCast
This task is selected for in-depth analysis because it is global in scope and ties to multiple points in the Cape Town priorities. The Geo Portal/GCI activity was not chosen because we are aware of ongoing evaluations of that activity by the implementers, and because that activity indirectly encompasses the entirety of GEOSS and is perhaps better suited to a targeted in-depth evaluation of the Architecture Transverse Area.

Data Management: DA-06-01 GEOSS Data Sharing Principles
This task was identified by Michel L as a task for in-depth evaluation. The data sharing principles are one of the few tasks specifically mentioned in the Cape Town priorities and therefore a definite inclusion in our analysis. The principles, being a concept rather than a digital/physical infrastructure, are more conducive to the scope and capabilities of this review.

Capacity Building: CB-09-05a Open Source Software
This is a sub-task of the infrastructure capacity building task. It is selected because it is directed toward improving end-user capacity to utilize data; it is also led outside of the US and Europe.

Page 11:

Task Selection

Science and Technology: ST-09-02 Promoting Awareness and Benefits of GEO in the Science and Technology Community
Recommended by Lars for in-depth evaluation, this task has no sub-tasks. Based on initial feedback from people we have talked to, the title of this task indicates it is an important issue to be addressed.

User Engagement: US-09-01a Identifying Synergies between Societal Benefit Areas
This sub-task is recommended because it is a specific effort to formulate the core Earth Observation needs for a working GEOSS.

Disasters: DI-09-03a Tsunami Early Warning System of Systems
This sub-task was identified in Lars’ analysis as lacking activity. Additionally, there was great interest in this at the time of the formalization of GEOSS in 2005, and it continues to be a current topic.

Page 12:

Task Selection

Health: HE-09-01 Information Systems for Health
This task has no sub-tasks and was identified in Lars’ analysis as a candidate for in-depth evaluation. It was chosen because the establishment of Earth Observation data needs and opportunities underlies progress in achieving health outcomes through GEOSS.

Energy: EN-07-01 Management of Energy Sources
There are no sub-tasks under Energy. All three Energy tasks were recommended for in-depth evaluation by Lars’ analysis; this particular task was selected because it has relevance to multiple outcomes.

Climate: CL-09-03a Integrated Global Carbon Observation (IGCO)
This task was identified by Michel L as a candidate for in-depth analysis. It is the primary component of the overarching task.

Page 13:

Task Selection

Water: WA-06-07c Asia (Capacity Building for Water Resource Management)

This specific sub-task is not overseen by a unifying “primary” component in the Water SBA; however, this sub-task (of the 3 regional sub-tasks) was identified by Lars as lacking in reported activity.

Weather: WE-06-03 TIGGE and the Development of a Global Interactive Forecast System for Weather

This task is chosen because both weather activities received equal ratings in Lars’ analysis, but this one has only a single component (no sub-tasks) and a more concrete objective than the other weather tasks.

Page 14:

Task Selection

Ecosystems: EC-09-02a Impact of Tourism on Environmental and Socio-Economic Activities
The sub-tasks under 09-02 are greatly varied, and Lars’ analysis showed a lack of connections to target outcomes. This sub-task is proposed simply because of its “a” status; if another team member has a reason (such as expertise) to prefer a different sub-task, we are open to that suggestion.

Agriculture: AG-07-03a Global Agricultural Monitoring System
This is chosen because it is the primary “a” component of AG-07-03, which Lars’ analysis indicated had high relevance to Cape Town.

Biodiversity: BI-07-01a Biodiversity Observation Network
There is a single Overarching Task for biodiversity. GEO-BON is the primary component of this task, and the other sub-tasks are clearly components of a comprehensive BON.

Page 15:

Task Selection

Questions?

Discussion?

Suggestions?

Agreement?

Page 16:

In-Depth Analysis

Proposal

Assignment: 2 analyses per member
• Develop a “highlight”: a short (less than 2 pages) write-up of the activity. Tell the “story” of the task: What happened? What has worked well? What hasn’t? What is needed to reach their 2015 goals?
– These will appear in an appendix or as “boxes” in the report.
• A set of conclusions or generalized findings based on each task, which will be synthesized in the report text.

Page 17:

In-Depth Analysis

Sources: you are encouraged to use any of the following to develop the analysis

• Data from secretariat interview questions about the sub-tasks.

• Task sheets and progress reports

• Communication with task leads and participants.

Page 18:

In-Depth Analysis

• Additional Notes
– We do not want to conduct multiple interviews with each secretariat member, so please ask about all selected sub-tasks in their areas, even if you are not personally conducting the in-depth analysis of that sub-task.
– Old task sheets may have a different task number associated with the sub-task.
– We are not judging the individual tasks; we are looking at them to help understand how success is achieved in the GEOSS framework.

Page 19:

In-Depth Analysis

Questions?

Discussion?

Suggestions?

Agreement?