Metrics Baseline - (Definitions with Examples and MPG Baseline Change Process) MPG Co-Chairs: H. K. (Rama) Ramapriyan, GSFC Clyde Brown, LaRC / SSAI November 2011 -



Page 2:

Metrics Baseline Updates

• New for November 2011:
– The ESDSWG MPAR-WG became the ESDIS Metrics Planning Group (MPG).
– The Impact Metric template was revised on June 28, 2011.
– Changes to comments terminology were made in November 2011.

• New for October 2010:
– A change to the baseline adding new Citations Metrics was recommended by the MPAR-WG and approved by NASA HQ in October 2010 for the FY2011 reporting year.
– Two new citations metrics will be entered by projects using the MCT:
• Metric 13 – a count of citations in peer reviewed publications,
• Metric 14 – a count of citations in other than peer reviewed publications.
• Entry of citations metrics is voluntary, and would be, at a minimum, on an annual basis.
– See charts 30, 31 and 32 for more information.
– NASA HQ also approved a new quad-chart format for Impact Metrics. The new impact metric quad charts would be entered via E-Books as the opportunity arises. See charts 33, 34 and 35 for more information and examples.

Page 3:

Metrics Baseline

• History:
– Original metrics baseline recommended by MPAR-WG and approved by NASA HQ for FY2004 for REASoN projects;

– Changes to the baseline were recommended by the MPAR-WG and approved by NASA HQ in 2006 for the FY2007 reporting year based on projects’ experience.

– Principal changes were addition of voluntary Service Metrics and Project Defined metrics to provide needed flexibility for REASoN and ACCESS projects, plus the ‘Metrics Mission Statement’ and clarification of some definitions.

– The process for adopting recommendations for changes to the metrics baseline follows on the next chart, updated to reflect the change from the ESDSWG MPAR-WG to the ESDIS MPG.

Page 4:

MPG Process for Changing the Metrics Baseline

• MPG Process to adopt a recommendation:

1. Majority vote of MPG members to adopt proposed recommendation as an MPG draft (which will include the rationale / justification for the recommendation);

2. One MPG member is appointed “shepherd”;

3. 30 day period of NASA ES activity review for MPG draft (not all ES activities are represented by MPG members) coordinated by shepherd;

4. The shepherd assembles comments, drafts revisions to the recommendation per ES activity feedback, presents summary of feedback and draft revisions to MPG along with impact analysis as needed;

5. Two-thirds vote of responding MPG members to adopt the final recommendation.

The recommendation is provided to the ESDIS Project for concurrence to send to the NASA HQ Program Executive for Data Systems for final approval.

ESDIS or NASA HQ may approve, approve with modifications, or disapprove the recommendation.

Page 5:

Metrics Mission Statement

To measure the success of each project in meeting its stated goals and objectives, to show the role and contribution of each project to the NASA science, application, and education programs, and to enable an overall assessment of the success of programs such as MEaSUREs / REASoN / ACCESS and their contribution to NASA’s goals.

Page 6:

Three Types of Metrics Included in Baseline

• Products and Services Metrics: Measure the number and types of products and/or services, and the data volumes, provided by a project.

– Common Metrics: Reported by most if not all projects, will be overall measures with sufficient cross-project commonality to allow assessment of the MEaSUREs (etc.) program as a whole, and will not be used as comparative measures of project performance.

– Project-Defined Metrics: Projects may add one to four metrics defined by each Project as best measures of its performance against its objectives.

– Reporting: Common metrics will be reported monthly unless otherwise agreed between a project and its study manager. Project-Defined metrics will be reported at an interval chosen by the project.

• Programmatic Metrics: Characterize the role of the project within the NASA science, applications, and/or education programs by indicating program areas the project supports. After initial report, update as needed.

• Outcome Metrics: Provide measures of the value of the projects’ contribution to research and applications.

– Impact Metrics: Specific success stories that provide examples of how the project’s products / services have directly benefited a user, organization, or activity it supports. Reported as opportunity allows.

– Citations Metrics: Counts of citations (and the actual citations themselves) of a project’s products in peer reviewed and other publications.

Page 7:

Metrics Reporting

• Metrics are reported using the on-line Metrics Collection Tool (MCT)

• For each metric, the following is reported:
– The Metric’s Value – a single value or multiple values, usually numeric.

– A Standing Comment - An explanation of how the metric is determined by the Project. Enter a standing comment once and then it will remain attached to the metric until updated. Projects may change standing comments each month or let the current standing comment stand as long as it applies. (Standing comments were known as “baseline” comments prior to November 2011.)

– A Comment for the Current Month - An explanation qualifying or otherwise explaining a Project's metric entry for the current month. Use of this comment box is optional every month and should be used to enter comments applicable only to that month's value of the metric. (Known as “supplemental comments” prior to November 2011.)

Note: In some cases, multiple values may be reported, e.g. a project may report products delivered broken down by Product Type by entering as a value a string of numbers separated by semi-colons, and in the standing comment explain which numbers correspond to which types.
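As an illustration only (the semicolon convention is described above, but the values and comment here are hypothetical), a multi-value entry could be handled like this:

```python
# Hypothetical multi-value metric entry using the semicolon-separated
# convention described in the note above: one number per product type,
# with the standing comment explaining the ordering.
value_string = "386;542"  # e.g. products delivered for two product types
standing_comment = "First number: Campaign A files; second: Campaign B files."

# Split the value string into per-type counts and total them.
counts = [int(v) for v in value_string.split(";")]
total_products_delivered = sum(counts)
print(counts, total_products_delivered)  # [386, 542] 928
```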

Page 8:

List of Metrics Comprising the Metrics Baseline

Products and Services Metrics -

Common Metrics:
– Metric 1 – Distinct Users
– Metric 2 – Distinct Users by Class
– Metric 3 – Products Distributed (instances of types available, see metric 4)
– Metric 4 – Product Types Available
– Metric 5 – Volume Distributed
– Metric 6 – Volume Available
– Metric 7 – no longer used
– Metric 11 – Services Provided (instances of service types available)
– Metric 12 – Services Available

Project-Defined Metrics (up to 4, defined by each project; metrics 100, 101, 102, 103)

Programmatic Metrics:
– Metric 8 – Science Focus Areas Supported
– Metric 9 – Applications Areas Supported
– Metric 10 – Education Categories Supported

Outcome Metrics:
– Metrics 13 and 14 – Citation Count Metrics
– Impact Metrics

Page 9:

Common Metric #1, Distinct Users

• Purpose: To measure the size of the activity’s user community, to be assessed in the context of its NASA program role.

• MCT Question: Please enter the count of individuals who, by any means, request and receive or in some other way use products, services and/or other information during the reporting period.

Metric: Number of Distinct Users

Definition: The number of distinct individual users who, by any means, request and receive or in some other way use products, services and/or other information during the reporting period.

Page 10:

Common Metric #1, Distinct Users, Cont.

• Examples:

• Value: 23; Standing Comment: The number of distinct users is the sum of those who downloaded data from the website, and those who either called or e-mailed for technical support.

• Value: 5,936; Standing Comment: Distinct users are determined based on unique IP address for data retrieved by FTP or HTTP. For requests by phone and email, the count is determined using the requestor’s name or email address.

• Value: 10,608; Standing Comment: The number of unique individual users is calculated from FTP and WWW logs at RSS, and DataPool activity at UAH. The combined total represents unique website visitors as well as users who download data via FTP or DataPool interface.
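The unique-IP approach in the second example above could be sketched as follows. This is a hedged illustration, not the baseline's prescribed method; the log lines and field layout (Common Log Format, client IP first) are assumptions.

```python
# Hedged sketch: one way a project might derive Metric 1 (Distinct Users)
# from web server access logs, counting unique client IP addresses.
# Assumes Common Log Format with the client IP as the first field.
def count_distinct_users(log_lines):
    """Count unique client IPs across a list of access-log lines."""
    ips = set()
    for line in log_lines:
        parts = line.split()
        if parts:            # first whitespace-separated field is the IP
            ips.add(parts[0])
    return len(ips)

sample_log = [
    '192.0.2.1 - - [01/Nov/2011] "GET /data/file1 HTTP/1.1" 200 1024',
    '192.0.2.1 - - [01/Nov/2011] "GET /data/file2 HTTP/1.1" 200 2048',
    '198.51.100.7 - - [02/Nov/2011] "GET /data/file1 HTTP/1.1" 200 1024',
]
print(count_distinct_users(sample_log))  # 2 (repeat visits count once)
```

A real project would also fold in phone and email requests, keyed on requestor name or address, as the example comment describes.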

Page 11:

Common Metric #2, Distinct Users by Class

• Purpose: To measure the types of users served by the activity, to be assessed in the context of its NASA program role.

• MCT Question: Please enter the number of users who obtain products and services from your project by the following classes.

Metric: Characterization of Distinct Users Requesting Products and Information

Definition: Classes of users who obtain products and services from the project. The metric will show the numbers of distinct users accessing data and services from classes identified by internet domains or other identification that can be related to domain classes: from a) first-tier domains: […], and b) second-tier domains, [...].

Page 12:

Common Metric #2, Distinct Users by Class, Cont.

• Examples:

• Values: counts by category as applicable; Standing Comment: Based on Metric 1, we determine the characterization of Distinct Users using log files created by the web server.

• Values: counts by category as applicable; Standing Comment: Characterization of users is determined by resolving the IP address to a domain name to determine if it is a non-US domain, commercial, etc. For non-Internet requests we determine this by using the address we send data/information to. K12 users are reported on line 2.13.

• Values: counts by category as applicable; Standing Comment: Characterization of unique individual users is calculated from FTP logs at RSS, and DataPool activity at UAH. The combined totals represent users who download data via FTP or DataPool interface. Web visitors are not included.

Page 13:

Common Metric #3, Products Delivered

• Purpose: To measure, in conjunction with Metrics 4, 5, and 6, the data and information produced and distributed by the activity, to be assessed in the context of its NASA program role. A particular set of values for these metrics might be much smaller for one activity than another activity, but in each case could represent excellent performance, given the particular role of each activity. The count of products delivered is a useful measure given the user-oriented definition of a ‘product’ that is independent of how the product is constituted or how large it is.

• MCT Question: Please enter the number of instances of products delivered to users during the reporting period.

• Note: Products Delivered may be broken down by Product Type.

Metric: Number of Products Delivered to Users

Definition: The number of separately cataloged and ordered, dynamically produced, or otherwise made available data and/or information products delivered, and/or tools provided to users during the reporting period by each project.

A ‘product’ (an instance of a Metric 4 Product Type) is an item (pre-prepared or produced on demand) that may be listed in a product catalog, inventory or menu, a publication, or a customized product produced by a project for a particular user. The intent is to capture the presentation to the user of the data products, information products and/or tools provided by the project.

Page 14:

Common Metric #3, Products Delivered, Cont.

• Examples:

• Value: 3,419; Standing Comment: We calculate the number of delivered products using the log files created by the website server.

• Value: 386; 542; Standing Comment: A product is a prepared or on-demand file, DVD, CD, image, etc. that represents part of a dataset associated with an individual field campaign.

• Value: 409;108; Standing Comment: Number of products delivered is calculated from FTP and WWW logs at RSS, and DataPool activity at UAH. The combined total represents number of files downloaded via FTP or DataPool interface, as well as page views of web visitors.

Page 15:

Common Metric #4, Product Types

• Purpose: The count of product types produced is a useful measure because of the effort by the activity required to develop and support each of its product types. A project’s values for this metric are to be assessed in the context of its NASA program role.

• MCT Question: Please enter the number of product types made available to users during the reporting period.

Metric: Number of Distinct Product Types Produced and Maintained by Project

Definition: A product type refers to a collection of ‘products’ of the same type such as “sea surface temperature datasets”, tools or other capabilities, and/or types of separately available information products that a project provides to its users. The intent is to capture the user view of the product types provided by the project, i.e. how the project’s product types are presented to the user.

Page 16:

Common Metric #4, Product Types, Cont.

• Examples:

• Value: 13; Standing Comment: We count the number of products available on the website:
1. Lagrangian Ice Motion product
2. Backscatter Histogram product
3. Ice Age and Thickness product
4. Deformation product
5. Ice Age
6. Ice Thickness
7. Backscatter Histogram
8. Divergence
9. Vorticity
10. Shear
11. Sheba Ice Motion Product
12. Melt Onset Product
13. Canadian Arctic Shelf Exchange Study (CASES 2003-2004) product

• Value: 19; Standing Comment: A product type is an atmospheric chemistry dataset associated with an individual field campaign.

• Value: 82; Standing Comment: At RSS, products include 5 geophysical parameters (sst, wind, vapor, cloud, rain) at 4 timeframes (daily, 3-day, weekly, monthly), as well as 1 SST OI and 1 SST swath data set. At UAH, 60 product types may be discovered by browsing the DataPool interface.

Page 17:

Common Metric #5, Volume of Data Distributed

• Purpose: The volume distributed is a useful output measure but one which depends heavily on the particular types of data an activity produces and distributes, and must be assessed in the context of the activity’s role and the data it works with. See note in metric 3.

• MCT Question: Please enter the volume of data distributed during the reporting period. (An explanation of how to calculate this volume is available on the MCT).

• Note: Volume Distributed may be broken down by Product Type.

Metric: Volume of Data Distributed

Definition: The volume of data and/or data products and/or information provided as web downloads or otherwise distributed to users during the reporting period (in MB, GB or TB as appropriate, to at least three significant digits of precision).

For projects that do not distribute digital data, the proper answer is ‘not applicable’.

Page 18:

Common Metric #5, Volume of Data Distributed, Cont.

• Examples:

• Value: 10,894 MB; Standing Comment: We calculate the volume of delivered products using the log files created by the website server.

• Value: 38,652 MB; Standing Comment: Volume distributed is the amount of data the project has distributed for the month via http/ftp data files, CDs, and/or DVDs.

• Value: 1,551 GB; Standing Comment: Volume of data distributed is calculated from FTP logs at RSS, and DataPool activity at UAH. The combined total represents the size of data downloaded via FTP or DataPool interface.
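The log-based approach in the examples above could be sketched as follows. This is a hedged illustration under assumed inputs (the byte counts are hypothetical); it shows one way to sum per-download transfer sizes and report the total in MB, GB, or TB to at least three significant digits, as the definition requires.

```python
# Hedged sketch: deriving Metric 5 (Volume of Data Distributed) by summing
# transfer sizes logged by the server, then choosing a reporting unit.
def format_volume(total_bytes):
    """Report volume in MB, GB, or TB to three significant digits."""
    for unit, scale in (("TB", 1024 ** 4), ("GB", 1024 ** 3)):
        if total_bytes >= scale:
            return f"{total_bytes / scale:.3g} {unit}"
    return f"{total_bytes / 1024 ** 2:.3g} MB"  # default to MB for small volumes

transfer_sizes = [5_368_709_120, 2_147_483_648, 10_737_418_240]  # bytes per download
print(format_volume(sum(transfer_sizes)))  # 17 GB
```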

Page 19:

Common Metric #6, Data Volume Available

• Purpose: The cumulative volume available for users provides a measure of the total resource for users that the activity creates. See note in Metric 4.

• MCT Question: Please enter the total cumulative volume of data available at the end of the reporting period. (An explanation of how to calculate this volume is available on the MCT).

Metric: Total Volume of Data Available for Research and Other Uses

Definition: The total cumulative volume, as of the end of the reporting period, of data and products held by the project and available to researchers and other users (in MB, GB or TB, to at least three significant digits). This number can include data that are not on-line but are available through other means.

For projects that do not hold digital data, the proper answer is ‘not applicable’.

Page 20:

Common Metric #6, Data Volume Available, Cont.

• Examples:

• Value: 20,516 MB; Standing Comment: We calculate the total volume of data available to the users by summing the product file sizes staged on the server disks.

• Value: 2,033,219 MB; Standing Comment: Data available includes original video available on DVD or CD and data files that can be accessed via ftp and/or http.

• Value: 2,009 GB; Standing Comment: Volume of data available represents the size of data products on-line and publicly available at RSS and UAH.

Page 21:

Common Metric #11, Services Provided

• Purpose: To measure, in conjunction with Metric 12, Service Types, the services performed by the activity, to be assessed in the context of its ESE role. A particular set of values for these metrics might be much smaller for one activity than another activity, but in each case could represent excellent performance, given the particular ESE role of each activity.

• MCT Question: Please enter the number of services provided to users during the reporting period.

• Note: Services Provided may be broken down by Service Type.

Metric: Number of Services Provided to Users

Definition: The number of services provided to users during the reporting period by a project. A service provided is a session with, or invocation of, an on-line service or capability selectable from a general menu of available services or capabilities, or an instance of a service otherwise provided (e.g. an exhibit, workshop or presentation), or a customized service (such as development of novel data, custom model runs, etc.).

The intent is to capture how the information services and capabilities provided by the project are presented to the user.

Page 22:

Common Metric #11, Services Provided, Cont.

Examples:

• Value: 1128; Standing Comment: Number of interactive maps served to outside users.

• Value: 342; Standing Comment: The number of plots created dynamically via our website.

• Value: 294; Standing Comment: Execution of trend analysis.

Page 23:

Common Metric #12, Service Types

• Purpose: The count of service types produced is a useful measure because of the effort by the activity required to develop and support each of its service types. A project’s values for this metric are to be assessed in the context of its NASA program role.

• MCT Question: Please enter the number of service types made available to users during the reporting period.

Metric: Number of Distinct Service Types Provided by Project

Definition: A service type refers to a separately-accessible web service or other capability, and/or any other separately-available service of any kind, that a project provides to its users. The intent is to capture the user view of the services and capabilities provided by the project, i.e. how the project’s services are presented to the user.

Page 24:

Common Metric #12, Service Types, Cont.

Examples:

• Value: 4; Standing Comment: The four basic service types are:
1) Searching for glacier data via interactive maps
2) Searching for glacier data via text fields (e.g. by name)
3) Searching for ASTER imagery via interactive maps
4) Submitting data to the GLIMS project via a web form

• Value: 29; Standing Comment: Number of distinct data sets currently available for online plotting tool.

• Value: 5; Standing Comment: Our Service types include customer support, merges of data sets, online plotting, statistical analysis, and reprojection and reformatting of data. The iteration of a service type generally results in products that we include in our products provided metrics.

• Value: 3; Standing Comment: Services provided include a Web Server, an OpenDap Server, and the WIPE server where products can be accessed.

• Value: 3; Standing Comment: FTP download service. OPeNDAP. High-Efficiency File Transfer (HEFT) through Aspera.

• Value: 1; Standing Comment: Trend analysis is now available.

Page 25:

Examples of Possible Service Types and Corresponding Services Provided

Service Types (Metric 12) and corresponding Services Provided (Metric 11):

– On-Demand Subsetting: Number of subsets created / provided in month
– On-Demand Reformatting: Number of reformatted products created in month
– On-Demand Reprojection: Number of remapped products created in month
– On-Demand Graphics / Visualizations / Images Generation: Number of graphics, e.g. plots or images, created / provided in month
– Custom Product Generation: Number of custom products generated in month
– Custom Statistical Analysis or Model Runs: Number of analyses or model runs provided in month
– User Support – Responses: Requests for user support responded to in month
– User Support – Research: Special research efforts made for users in month
– Special Search / Discovery Support, On-Line or Manual: Count of special searches / discoveries performed
– Custom Merges of Datasets: Number of merged datasets created in month
– Seminars / Workshops Conducted: Number of seminars / workshops, or of attendees
– Special Metadata / Documentation Preparation: Number of instances of this being done in month

Page 26:

Project-Defined Metrics Examples

To provide additional significant detail:
• Value: 83,114; Standing Comment: Total number of glacier snapshots in glacier_dynamic system.
• Value: 7,296,634; Standing Comment: Total number of valid glacier vertices in database.
• Value: 226,556; Standing Comment: Total number of ASTER footprints in database.

To measure an activity of importance to the project’s goals:
• Value: 6; Standing Comment: Partners using DIAL technology; will grow as the project advances.
• Value: 65; Standing Comment: Number of countries served by ipydis is an important statistic for this internationally focused project.

To highlight a significant ancillary activity:
• Value: [list of papers, etc.]; Standing Comment: Project publications and presentations.

Page 27:

Programmatic Metric #8, Support for Science Focus Areas

• Baseline Definition: The projects will identify the NASA Science Mission Directorate's Science Focus Areas that each project supports (may be multiple). The focus areas are: weather, climate change and variability, atmospheric composition, water and energy cycle, Earth surface and interior, and carbon cycle and ecosystems.

• Purpose: To enable the ESE program office to determine which NASA Science Mission Directorate's Science Focus Areas are supported by the activity, and to assess how the data products provided by the activity relate to that support.

• MCT Question: Please identify the NASA Science Mission Directorate's Science Focus Areas that your project supports (may be multiple).

– Categories from NASA Science Mission Directorate's Science Focus Areas.
– The focus areas are: weather, climate change and variability, atmospheric composition, water and energy cycle, Earth surface and interior, and carbon cycle and ecosystems.

Page 28:

Programmatic Metric #9, Support for Applications Areas

• Baseline Definition: The projects will identify the NASA Science Mission Directorate's Applications of National Importance that each project supports (may be multiple). The 12 applications areas are: agricultural efficiency, air quality, aviation safety, carbon management, coastal management, ecosystems, disaster preparedness, energy forecasting, homeland security, invasive species, public health, and water management.

• Purpose: To enable the ESE program office to determine which NASA Science Mission Directorate's Applications of National Importance are supported by the activity, and to assess how the data products provided by the activity relate to that support.

• MCT Question: Please identify the NASA Science Mission Directorate's Applications of National Importance that your project supports (may be multiple).

• Categories are from NASA Science Mission Directorate's Applications of National Importance.

• The 12 applications areas are: agricultural efficiency, air quality, aviation safety, carbon management, coastal management, ecosystems, disaster preparedness, energy forecasting, homeland security, invasive species, public health, and water management.

Page 29:

Programmatic Metric #10, Support for Education Initiatives

• Baseline Definition: The projects will identify the NASA education categories that each supports. These six categories are Elementary and Secondary Education, Higher Education, Underrepresented and Underserved, e-Education, Informal Education, and Other.

• Purpose: To enable the ESE program office to assess support provided by the activity to NASA Science Mission Directorate's education initiatives, by indicating use by education user groups of the activity’s products and services.

• Website Question: Please identify the NASA education categories that your project supports (may be multiple).

• The categories are Elementary and Secondary Education, Higher Education, Underrepresented and Underserved, e-Education, Informal Education, Other.

Page 30:

Outcome Metrics – Citation Count Metrics

• The count of the number of times a project’s data, products, services or publications is cited by researchers in peer-reviewed and other publications.

• Projects will report (using the MCT) two counts:
– 1) Metric 13: a count of citations in peer-reviewed publications, and
– 2) Metric 14: a count of citations in any other publications, conference or workshop proceedings, posters, online publications, abstracts, etc.
– Note: a ‘count’ is recorded for each publication that contains one or more references to a project’s work (e.g. a project’s publications, products or services).

• Reporting of the counts is voluntary, on a ‘best effort’ basis, at an interval to be chosen by the project, with semi-annual or annual reporting recommended.

• In addition to the counts, the project will provide the actual citations themselves.

• Each project will use, and provide a description of, its own methodology for collecting the citations counts and strive for consistency from one reporting period to the next.

• Reports on the citation metric will be provided to NASA Headquarters at least annually.

Page 31:

Outcome Metric #13, Citations Count – Peer Reviewed

• Purpose: The objective of the Citations metric is to obtain a better measure of user satisfaction with a project’s products and services and enable a better assessment of their contribution to the NASA science and applications programs and Earth science research in general.

• MCT Question: Please enter the number of citations of your project’s data, products, services or publications in peer-reviewed publications.

Metric: Count of the number of citations in peer-reviewed publications

Definition: The count of the number of times a project’s work (data, products, services or publications) is cited by researchers in peer-reviewed publications. A ‘count’ is recorded for each publication that contains one or more references to a project’s work.
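The counting rule above (one count per citing publication, however many references it contains) can be sketched as follows. This is a hedged illustration; the publication identifiers are hypothetical.

```python
# Hedged sketch of the Metric 13 counting rule: a 'count' is recorded once
# per publication that contains one or more references to the project's
# work, so several references inside one paper still count once.
references = [
    ("10.1000/paper-a", "cites project dataset"),
    ("10.1000/paper-a", "cites project tool"),     # same publication, counted once
    ("10.1000/paper-b", "cites project dataset"),
]

# Deduplicate by publication identifier before counting.
citing_publications = {pub_id for pub_id, _ in references}
metric_13_count = len(citing_publications)
print(metric_13_count)  # 2
```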

Page 32:

Outcome Metric #14, Citations Count - Other than Peer Reviewed

• Purpose: The objective of the Citations metric is to obtain a better measure of user satisfaction with a project’s products and services, and to enable a better assessment of their contribution to the NASA science and applications programs and to Earth science research in general.

• MCT Question: Please enter the number of citations of your project’s data, products, services or publications in other than peer-reviewed publications.

Metric Definition

Count of the number of citations in other than peer-reviewed publications.

The count of the number of times a project’s work (data, products, services or publications) is cited by researchers in other than peer-reviewed publications, e.g. conference or workshop proceedings, posters, online publications, abstracts, etc. A ‘count’ is recorded for each publication that contains one or more references to a project’s work.

Page 33

Outcome Metrics - Impact Metrics (Success Stories)

Impact Metrics use a one-page quad chart format that includes:

• The general impact or benefit expected from release of a new ESDR, or

• The specific impact or benefit where a project’s ESDR has directly benefited a particular user, organization, project, etc. that the project supports, i.e. the traditional ‘nugget’.

• The general benefit impact story might include:

– The name and description of the new or newly revised ESDR.

– An explanation of what is new and scientifically important / significant about it, how it will support NASA’s research goals, and what benefits it will provide. Include any pertinent references.

• The specific benefit impact story might include:

– The name of the product and/or service, in the title if possible.

– The direct benefit (impact) of using the product or service, stated prominently; in the title if possible.

– Any collaboration with local groups, NGO's, businesses, or federal agencies.

– How working with NASA helped make this happen.

• In either case, an image or graphic that provides a visual complement to the narrative description.

• The project should use the “Comments” field of the PowerPoint chart for a more detailed narrative and references that can serve as the basis for News Items to be reported on the Community Data Systems website.

• Projects should provide Impact Metrics as the opportunity arises.

• Impact Metric Quad Charts should be sent to E-Books in the same manner as E-Books quad chart reports.

• The next two charts provide a template for the quad chart format and an example of a quad chart Impact Metric. NOTE: Projects should use the template available on E-Books.

Page 34

MEaSUREs: Making Earth Science Data Records for Use in Research Environments

Headline / Highlight (optional text):

• This is the “what’s new and exciting” part.

• Description of a success story, i.e. how the project’s products / services have directly benefited a particular user, organization, or activity the project supports.

• Or, describe a new ESDR your project is releasing, or a significant update to an ESDR previously released, and what the general impact or benefit is expected to be.

Who / What / Where / When / Why:

• Project and PI: Name & Organization.

• Describe project and data involved in this impact.

• State project goals and the accomplished steps to reach project goals, including any beyond original objectives.

• Use “action” verbs & active descriptions: developed, released, achieved, demonstrated, modeled…

• Etc.

Graphic (with caption):

• Insert an image or graphic here that provides a visual complement to the narrative description of the impact or new ESDR described below.

Impact / Relevance to NASA:

• Describe how your project’s work is relevant to NASA goals

• Etc.

Further Information:

• Links to find out more

Science Application:

• Describe Earth science research area or application this work applies to

Page 35

CLIMODE Project Depends on DISCOVER SSTs
Helped cruise scientists forecast changes in the Gulf Stream path and thereby reduce ship fuel use; 6 CLIMODE cruises supported since November 2005

DISCOVER – P.I.: Frank Wentz, Remote Sensing Systems

Impacts of DISCOVER Data on CLIMODE Project:

• Daily SST data are distributed to cruise scientists to help determine ship course and instrument deployment

• 6 CLIMODE cruises supported since November 2005

• Helped scientists forecast changes in the Gulf Stream path and thereby reduce fuel use

• Helped scientists adapt measurement strategies in response to changes in the Gulf Stream path

Who / What / Where / When / Why:

• MEaSUREs – for passive microwave ocean products

• TMI and AMSR-E satellite data are blended to produce a diurnally-corrected microwave optimally-interpolated (MW OI) SST product

• CLIMODE (CLIvar MOde water Dynamic Experiment): an NSF-funded project to improve understanding of the air-sea exchange and cross-frontal mixing in the wintertime Gulf Stream.

Graphic caption: CLIMODE cruise track overlaid on DISCOVER Sea Surface Temps, Jan 24, 2006

Relevance to NASA Goals:

• Contributes to improved understanding of ocean physics, which will aid in improving climate models

• Increases our understanding of the role of oceans and atmosphere in the climate system and improves our predictive capability

Further Information:

• CLIMODE web site: http://www.climode.org/

• Example of plots made by the project: http://www.climode.org/Data/APL/apl_prod.htm

• DISCOVER web site: http://www.discover-earth.org/

Science Application:

• This work applies to weather and disaster management application areas