TRL Calculator Ver B1.1 Beta


Uploaded by david-urie on 01-Dec-2014



TRL Calculator Open Source Agreement

NASA OPEN SOURCE AGREEMENT VERSION 1.3

THIS OPEN SOURCE AGREEMENT (“AGREEMENT”) DEFINES THE RIGHTS OF USE, REPRODUCTION, DISTRIBUTION, MODIFICATION AND REDISTRIBUTION OF CERTAIN COMPUTER SOFTWARE ORIGINALLY RELEASED BY THE UNITED STATES GOVERNMENT AS REPRESENTED BY THE GOVERNMENT AGENCY LISTED BELOW ("GOVERNMENT AGENCY"). THE UNITED STATES GOVERNMENT, AS REPRESENTED BY GOVERNMENT AGENCY, IS AN INTENDED THIRD-PARTY BENEFICIARY OF ALL SUBSEQUENT DISTRIBUTIONS OR REDISTRIBUTIONS OF THE SUBJECT SOFTWARE. ANYONE WHO USES, REPRODUCES, DISTRIBUTES, MODIFIES OR REDISTRIBUTES THE SUBJECT SOFTWARE, AS DEFINED HEREIN, OR ANY PART THEREOF, IS, BY THAT ACTION, ACCEPTING IN FULL THE RESPONSIBILITIES AND OBLIGATIONS CONTAINED IN THIS AGREEMENT.

Government Agency: NASA Marshall Space Flight Center
Government Agency Original Software Designation: _________________________
Government Agency Original Software Title: TRL Calculator AFRL_NASA Version 3beta
User Registration Requested. Please Visit http://________________________
Government Agency Point of Contact for Original Software: ED01/James W. Bilbro

1. DEFINITIONS

A. “Contributor” means Government Agency, as the developer of the Original Software, and any entity that makes a Modification.

B. “Covered Patents” mean patent claims licensable by a Contributor that are necessarily infringed by the use or sale of its Modification alone or when combined with the Subject Software.

C. “Display” means the showing of a copy of the Subject Software, either directly or by means of an image, or any other device.

D. “Distribution” means conveyance or transfer of the Subject Software, regardless of means, to another.

E. “Larger Work” means computer software that combines Subject Software, or portions thereof, with software separate from the Subject Software that is not governed by the terms of this Agreement.

F. “Modification” means any alteration of, including addition to or deletion from, the substance or structure of either the Original Software or Subject Software, and includes derivative works, as that term is defined in the Copyright Statute, 17 USC 101. However, the act of including Subject Software as part of a Larger Work does not in and of itself constitute a Modification.

G. “Original Software” means the computer software first released under this Agreement by Government Agency with Government Agency designation Marshall Space Flight Center and entitled TRL Calculator AFRL_NASA Version 3beta, including source code, object code and accompanying documentation, if any.


TRL Version History

Date        Version                                    Description                              Developer
3/2/2007    TRL Calculator Version 2_2 NASA Variant    Modification of AFRL Calculator          W. Nolte: AFRL
6/6/2007    TRL Calculator AFRL_NASA Version 3beta     Calculator completed                     J.W. Bilbro: MSFC
1/21/2009   TRL Calculator AFRL_NASA Version 3beta     NASA Open Source Agreement Released      J.W. Bilbro: MSFC
1/28/2009   TRL Calculator Ver B1.beta                 Modified stand-alone version Released    J.W. Bilbro: JBCI
            (Modification allows for questions to be added at the discretion of the user and removes links to AD2)

Project TRL Summary worksheet

Date Saved:
Project:        Date: 4/9/23

Columns: WBS # | Sys/subsys/comp Name | Details? | TRL | TRL | SRL | SRL | MRL | MRL | Evaluator | Date | Delete?
Row indicators: check box values

TRL Index of Projects

Project Name: Example
Delete Project?


Key Points

- If heritage equipment is being used outside of the architecture and operational environment for which it was originally designed, it is at most a Level 5 until analysis and/or test warrants its increase.
- If it is within the experience base, it is engineering development; if it is outside of the experience base, it is technology development.
- It is extremely difficult to know a priori which is which. The only recourse available is to apply strong, up-front systems engineering.
- This tool is intended to be an aid to that process by providing a systematic assessment of component, subsystem and system maturity, as well as insight into the cost, schedule and risk associated with development.

Background Material


Introduction

The idea that technology immaturity can have a significant impact on the ability to deliver satisfactory products on time and within cost has existed for many years. But how do you know that the technology you need is immature? For that matter, how do you know that you need any technology at all? No manager begins a program/project with the idea that they will have to rely on “immature” technology in order to meet requirements, and in fact most managers will want to avoid the use of any “technology” at all. But what is technology? Technology development is typically associated with the application of scientific knowledge to do something completely new or in a completely new way, and as such is to be avoided at all cost in the development of any program/project that has deliverables at fixed cost and schedule. Perhaps a more useful definition is one that treats technology development as development that lies outside of our experience base – i.e., in the region of “unknown unknowns,” where you don’t know what you don’t know. It is in this area that programs/projects have all too often been caught unaware, resulting in cost overruns, schedule slips, and even cancellations or failures. This is in large part due to a lack of “up-front” investment of the time and effort needed to understand what the requirements are and what is required to meet them – in other words, a lack of “up-front” systems engineering directed toward understanding whether or not requirements can be met within the available resources (i.e., the availability of the technology needed to meet requirements at a level of maturity from which it can be developed to a point where it can be incorporated within the cost, schedule and risk constraints). This is particularly true when dealing with “systems.”
Systems engineering is often short-circuited when dealing with “heritage” systems because it is believed that such systems have already been “proven.” However, a “heritage” system incorporated into a different architecture and operating in a different environment from those for which it was designed may well require modifications that fall outside of the realm of experience; in that case, this too should be considered technology development. In fact, the ground rule for the technological assessment processes accompanying these tools is that the maturity of heritage systems is automatically dropped to TRL 5 until sufficient analysis is done to justify a higher level.
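The heritage ground rule above can be sketched in a few lines. This is an illustrative sketch only, not code from the calculator itself; the function name and the `justified_trl` parameter are assumptions introduced for the example.

```python
# Sketch of the heritage ground rule: equipment reused outside the
# architecture/environment it was designed for is capped at TRL 5
# until analysis and/or test justifies a specific higher level.

HERITAGE_CAP = 5

def assessed_trl(claimed_trl, reused_outside_design_envelope, justified_trl=None):
    """Return the TRL to carry forward for a (possibly heritage) item."""
    if not reused_outside_design_envelope:
        # Used as originally designed: the claimed maturity stands.
        return claimed_trl
    if justified_trl is not None:
        # Analysis/test has justified a level; never exceed the claim.
        return min(claimed_trl, justified_trl)
    # No justification yet: automatically drop to TRL 5 at most.
    return min(claimed_trl, HERITAGE_CAP)

print(assessed_trl(9, reused_outside_design_envelope=True))   # capped at 5
print(assessed_trl(9, reused_outside_design_envelope=False))  # remains 9
```

The cap never raises a level: an item honestly assessed at TRL 4 stays at 4 whether or not it is heritage.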

In order to understand whether or not technology development is required – and to subsequently quantify the associated cost, schedule and risk – it is necessary to systematically assess the maturity of each system, sub-system or component in terms of the architecture and operational environment. It is then necessary to assess what is required in the way of development to advance the maturity to a point where it can be successfully incorporated within cost, schedule and performance constraints. The TRL Calculator and the AD2 Calculator were developed to provide a simple, standardized, systematic method for assessing the maturity of systems and the cost, schedule and risk associated with development and infusion. They are not “magic bullets” and they can be gamed, but they also provide a means of comparing apples to apples, and permit everyone to be on the same page or at least have the wherewithal to get on the same page.
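The "systematic assessment" these calculators perform is essentially a checkbox roll-up: each TRL has a list of accomplishment criteria, a level counts as achieved once some fraction of its boxes are checked, and the reported TRL is the highest level reached without skipping one below it. The sketch below assumes that scheme; the data layout and the 80% threshold are illustrative assumptions, not the calculator's actual settings.

```python
# Hedged sketch of a TRL checkbox roll-up: `checked` maps each TRL
# level to the checkbox states for that level's criteria.

def rolled_up_trl(checked, threshold=0.8):
    """Return the highest consecutively-achieved TRL level."""
    trl = 0
    for level in sorted(checked):
        answers = checked[level]
        fraction = sum(answers) / len(answers) if answers else 0.0
        if fraction >= threshold:
            trl = level   # level achieved; keep climbing
        else:
            break         # a gap in the ladder stops the roll-up
    return trl

example = {1: [True, True], 2: [True, True, True], 3: [True, False, False]}
print(rolled_up_trl(example))  # -> 2
```

Requiring the levels to be achieved consecutively is the point: checking most of the TRL 6 boxes means nothing if the TRL 3 criteria are still open.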

The TRL Calculator can be used at the very early (concept) stage of the program/project to provide a baseline maturity assessment. The AD2 Calculator can be used at the same time to provide a quick look at the “tall tent pole” issues. Subsequent use of the AD2 Calculator can provide the basis for the Technology Development Plan required for delivery at SRR by NPR 7120.5d, and to measure progress in the development. The TRL Calculator can also be used to measure progress and to provide the basis for the Technology Readiness Assessment Report required for delivery at PDR by NPR 7120.5d.

Since my retirement from NASA I have integrated the two calculators into a single workbook and added a number of other features including the ability to save and recall projects in the TRL Calculator and to add additional questions and categories in the AD2 Calculator. The integrated calculator is available on request.

I am greatly indebted to Bill Nolte of AFRL for his willingness to modify his original TRL Calculator, and to Uwe Hueter, SAIC, and Vyga Kulpa, MSFC, for their part in developing the requirements for using the calculator in the assessment of the Ares project. I am also greatly indebted to John Cole for his initial coding of the AD2 Calculator. John Kelly of NASA Headquarters provided critical advice and support in the area of software assessment. Bob Duffe of Ames Research Center developed the software questions and provided the material on software assurance, mission class and safety criticality. Leann Thomas, Pat Benson and Cathy White of MSFC reviewed and commented on the software questions, and John Vickers, also of MSFC, reviewed and commented on the manufacturing questions. I cannot adequately express my appreciation for the support of all of these individuals.

Page 17: TRL Calculator Ver BI.1 Beta

Introduction

The idea that technology immaturity can have a significant impact on the ability to deliver satisfactory products on time andwithin cost has existed for many years. But how do you know that the technology you need is immature? For that matter, how do you know that you need any technology at all? No manager begins a program/project with the idea that they will have to rely on “immature” technology in order to meet requirements and in fact, most managers will want to avoid the use of any “technology” at all. But what is technology? Technology development is typically associated with the application of scientific knowledge to do something completely new or in a completely new way and as such is to be avoided at all cost in the development of any program/project that has deliverables at fixed cost and schedule. Perhaps a more useful definition would be one that defines technology development as development that lies outside of our experience base – i.e. in the region of “unknown unknowns” where you don’t know what you don’t know. It is in this area where all too often program/projects have been caught unaware, resulting in cost overruns, schedule slips and even cancellations or failures. This is in large part due to a lack of “up-front” investment in the time and effort to understand what the requirements are and what is required to meet them. In other words, a lackof “up-front” systems engineering directed toward understanding whether or not requirements can be met within the available resources (i.e. the availability of the technology needed to meet requirements at a level of maturity that can be developed to a point where it can be incorporated within the cost, schedule and risk constraints.) This is particularly true when dealing with “systems. 
Systems engineering is often short circuited when dealing with “heritage” systems because it is believed that such systems have already been “proven.” However, a “heritage” system incorporated into a different architecture and operating in a different environment from those for which they were designed may well require modifications that fall outside of the realm of experience. In which case, this too should be considered technology development. In fact, the ground rule for the technological assessment processes accompanying these tools is that the maturity of heritage systems is automatically dropped to TRL 5 until sufficient analysis is done to justify a higher level.

In order to understand whether or not technology development is required - and to subsequently quantify the associated cost, schedule and risk, it is necessary to systematically assess the maturity of each system, sub-system or component in terms of the architecture and operational environment. It is then necessary to assess what is required in the way of development to advance the maturity to a point where it can be successfully incorporated within cost, schedule and performance constraints. The TRL Calculator and the AD2 Calculator were developed to provide a simple, standardized, systematic method for assessing the maturity of systems and the cost, schedule and risk associated with development and infusion. They are not “magic bullets” and they can begamed, but they also provide a means of comparing apples to apples and permit everyone to be on the same page or to at least havethe wherewithal to get on the same page.

The TRL Calculator can be used at the very early (concept) stage of the program/project to provide a baseline maturity assessment. The AD2 Calculator can be used at the same time to provide a quick look at the “tall tent pole” issues. Subsequent useof the AD2 Calculator can provide the basis for the Technology Development Plan required for delivery at SRR by NPR 7120.5d and to measure progress in the development. The TRL Calculator can also be used to measure progress and to provide the basis for the Technology Readiness Assessment Report required for delivery at PDR by NPR 7120.5d.

Since my retirement from NASA I have integrated the two calculators into a single workbook and added a number of other features including the ability to save and recall projects in the TRL Calculator and to add additional questions and categories in the AD2 Calculator. The integrated calculator is available on request.

I am greatly indebted to Bill Nolte of AFRL for his willingness to modify his original TRL Calculator to Uwe Hueter, SAIC, and Vyga Kulpa, MSFC for their part in developing the requirements for using the calculator in the assessment of the Ares project. I am also greatly indebted to John Cole for his initial coding of the AD2 Calculator. John Kelly of NASA headquarters provided critical advice and support in the area of software assessment. Bob Duffe of Ames Research Center developed the

software questions and provided the material on software assurance, mission class and safety criticality. Leann Thomas, Pat Benson and Cathy White of MSFC reviewed and commented on the software questions and John Vickers also of MSFC reviewed and commented on the manufacturing questions. I cannot adequately express my appreciation for the support of all of these individuals.

Page 18: TRL Calculator Ver BI.1 Beta

Introduction

The idea that technology immaturity can have a significant impact on the ability to deliver satisfactory products on time andwithin cost has existed for many years. But how do you know that the technology you need is immature? For that matter, how do you know that you need any technology at all? No manager begins a program/project with the idea that they will have to rely on “immature” technology in order to meet requirements and in fact, most managers will want to avoid the use of any “technology” at all. But what is technology? Technology development is typically associated with the application of scientific knowledge to do something completely new or in a completely new way and as such is to be avoided at all cost in the development of any program/project that has deliverables at fixed cost and schedule. Perhaps a more useful definition would be one that defines technology development as development that lies outside of our experience base – i.e. in the region of “unknown unknowns” where you don’t know what you don’t know. It is in this area where all too often program/projects have been caught unaware, resulting in cost overruns, schedule slips and even cancellations or failures. This is in large part due to a lack of “up-front” investment in the time and effort to understand what the requirements are and what is required to meet them. In other words, a lackof “up-front” systems engineering directed toward understanding whether or not requirements can be met within the available resources (i.e. the availability of the technology needed to meet requirements at a level of maturity that can be developed to a point where it can be incorporated within the cost, schedule and risk constraints.) This is particularly true when dealing with “systems. 
Systems engineering is often short circuited when dealing with “heritage” systems because it is believed that such systems have already been “proven.” However, a “heritage” system incorporated into a different architecture and operating in a different environment from those for which they were designed may well require modifications that fall outside of the realm of experience. In which case, this too should be considered technology development. In fact, the ground rule for the technological assessment processes accompanying these tools is that the maturity of heritage systems is automatically dropped to TRL 5 until sufficient analysis is done to justify a higher level.

In order to understand whether or not technology development is required - and to subsequently quantify the associated cost, schedule and risk, it is necessary to systematically assess the maturity of each system, sub-system or component in terms of the architecture and operational environment. It is then necessary to assess what is required in the way of development to advance the maturity to a point where it can be successfully incorporated within cost, schedule and performance constraints. The TRL Calculator and the AD2 Calculator were developed to provide a simple, standardized, systematic method for assessing the maturity of systems and the cost, schedule and risk associated with development and infusion. They are not “magic bullets” and they can begamed, but they also provide a means of comparing apples to apples and permit everyone to be on the same page or to at least havethe wherewithal to get on the same page.

The TRL Calculator can be used at the very early (concept) stage of the program/project to provide a baseline maturity assessment. The AD2 Calculator can be used at the same time to provide a quick look at the “tall tent pole” issues. Subsequent useof the AD2 Calculator can provide the basis for the Technology Development Plan required for delivery at SRR by NPR 7120.5d and to measure progress in the development. The TRL Calculator can also be used to measure progress and to provide the basis for the Technology Readiness Assessment Report required for delivery at PDR by NPR 7120.5d.

Since my retirement from NASA I have integrated the two calculators into a single workbook and added a number of other features including the ability to save and recall projects in the TRL Calculator and to add additional questions and categories in the AD2 Calculator. The integrated calculator is available on request.

I am greatly indebted to Bill Nolte of AFRL for his willingness to modify his original TRL Calculator to Uwe Hueter, SAIC, and Vyga Kulpa, MSFC for their part in developing the requirements for using the calculator in the assessment of the Ares project. I am also greatly indebted to John Cole for his initial coding of the AD2 Calculator. John Kelly of NASA headquarters provided critical advice and support in the area of software assessment. Bob Duffe of Ames Research Center developed the

software questions and provided the material on software assurance, mission class and safety criticality. Leann Thomas, Pat Benson and Cathy White of MSFC reviewed and commented on the software questions and John Vickers also of MSFC reviewed and commented on the manufacturing questions. I cannot adequately express my appreciation for the support of all of these individuals.

Page 19: TRL Calculator Ver BI.1 Beta

Introduction

The idea that technology immaturity can have a significant impact on the ability to deliver satisfactory products on time andwithin cost has existed for many years. But how do you know that the technology you need is immature? For that matter, how do you know that you need any technology at all? No manager begins a program/project with the idea that they will have to rely on “immature” technology in order to meet requirements and in fact, most managers will want to avoid the use of any “technology” at all. But what is technology? Technology development is typically associated with the application of scientific knowledge to do something completely new or in a completely new way and as such is to be avoided at all cost in the development of any program/project that has deliverables at fixed cost and schedule. Perhaps a more useful definition would be one that defines technology development as development that lies outside of our experience base – i.e. in the region of “unknown unknowns” where you don’t know what you don’t know. It is in this area where all too often program/projects have been caught unaware, resulting in cost overruns, schedule slips and even cancellations or failures. This is in large part due to a lack of “up-front” investment in the time and effort to understand what the requirements are and what is required to meet them. In other words, a lackof “up-front” systems engineering directed toward understanding whether or not requirements can be met within the available resources (i.e. the availability of the technology needed to meet requirements at a level of maturity that can be developed to a point where it can be incorporated within the cost, schedule and risk constraints.) This is particularly true when dealing with “systems. 
Systems engineering is often short circuited when dealing with “heritage” systems because it is believed that such systems have already been “proven.” However, a “heritage” system incorporated into a different architecture and operating in a different environment from those for which they were designed may well require modifications that fall outside of the realm of experience. In which case, this too should be considered technology development. In fact, the ground rule for the technological assessment processes accompanying these tools is that the maturity of heritage systems is automatically dropped to TRL 5 until sufficient analysis is done to justify a higher level.

In order to understand whether or not technology development is required, and to subsequently quantify the associated cost, schedule and risk, it is necessary to systematically assess the maturity of each system, sub-system or component in terms of the architecture and operational environment. It is then necessary to assess what development is required to advance that maturity to a point where it can be successfully incorporated within cost, schedule and performance constraints. The TRL Calculator and the AD2 Calculator were developed to provide a simple, standardized, systematic method for assessing the maturity of systems and the cost, schedule and risk associated with development and infusion. They are not "magic bullets," and they can be gamed, but they provide a means of comparing apples to apples, and they permit everyone to be on the same page, or at least to have the wherewithal to get on the same page.

The TRL Calculator can be used at the very early (concept) stage of a program/project to provide a baseline maturity assessment. The AD2 Calculator can be used at the same time to provide a quick look at the "tall tent pole" issues. Subsequent use of the AD2 Calculator can provide the basis for the Technology Development Plan required for delivery at SRR by NPR 7120.5d, and can be used to measure progress in the development. The TRL Calculator can also be used to measure progress and to provide the basis for the Technology Readiness Assessment Report required for delivery at PDR by NPR 7120.5d.

Since my retirement from NASA, I have integrated the two calculators into a single workbook and added a number of other features, including the ability to save and recall projects in the TRL Calculator and to add additional questions and categories in the AD2 Calculator. The integrated calculator is available on request.

I am greatly indebted to Bill Nolte of AFRL for his willingness to modify his original TRL Calculator, and to Uwe Hueter, SAIC, and Vyga Kulpa, MSFC, for their part in developing the requirements for using the calculator in the assessment of the Ares project. I am also greatly indebted to John Cole for his initial coding of the AD2 Calculator. John Kelly of NASA Headquarters provided critical advice and support in the area of software assessment. Bob Duffe of Ames Research Center developed the software questions and provided the material on software assurance, mission class and safety criticality. Leann Thomas, Pat Benson and Cathy White of MSFC reviewed and commented on the software questions, and John Vickers, also of MSFC, reviewed and commented on the manufacturing questions. I cannot adequately express my appreciation for the support of all of these individuals.


Instructions

Instructions for the TRL Calculator Ver BI.1 Beta, February 4, 2009

Double click on the document to open for reading.

N.B. The tabs are hidden for a reason; please use the buttons to navigate throughout the workbook, and close the workbook using the "Close Calculator" buttons! Thanks - JB

Introduction:

Technology Assessment of a complex system requires the assessment of all of its systems, sub-systems and components, including those elements that are thought to be mature because of past operational use. It comprises two parts: first, determining the current maturity through the use of the Technology Readiness Level (TRL) Calculator, and then determining what is required to advance that maturity, in terms of cost, schedule and risk, through the use of the Advancement Degree of Difficulty (AD2) Calculator. This calculator deals with TRLs. The TRL Calculator includes questions for hardware and software and an edited version of the questions for Manufacturing Readiness Levels (MRLs); this is not the same set as in the full MRL tool. It includes the ability to capture and save project data as a function of the product Work Breakdown Structure.
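The roll-up logic behind such a calculator can be sketched roughly as follows. This is an illustrative reconstruction, not the workbook's actual formulas: each level's questions are answered as a percent complete, and the example worksheets later in this document show a configurable "yellow set point" (50% by default). The thresholds and function names here are assumptions:

```python
# Illustrative sketch of a TRL roll-up: each level has a list of questions
# answered 0-100% complete.  A level shows "green" when every question meets
# the green threshold, "yellow" when every question meets the yellow set
# point (50% in the example workbook), and "red" otherwise.
GREEN, YELLOW = 100, 50  # assumed thresholds

def level_status(answers):
    if all(a >= GREEN for a in answers):
        return "green"
    if all(a >= YELLOW for a in answers):
        return "yellow"
    return "red"

def trl_achieved(levels):
    """Highest consecutive level, starting at TRL 1, rated green."""
    achieved = 0
    for trl in sorted(levels):
        if level_status(levels[trl]) != "green":
            break
        achieved = trl
    return achieved

# Example: TRLs 1-2 fully complete, TRL 3 only partially complete.
levels = {1: [100, 100], 2: [100, 100, 100], 3: [100, 70, 0, 90]}
assert trl_achieved(levels) == 2
assert level_status(levels[3]) == "red"
```

The same pattern would apply to the SRL (software) and MRL (manufacturing) question sets, each with its own list of questions per level.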

The TRL Calculator saves copies of each evaluation, but those copies cannot be edited. If a new evaluation is required at a later date, all of the questions must be answered anew and the data saved with an annotation to the title indicating that it is the second evaluation (e.g., Widget and Widget II). If this annotation is not made, the new data will replace the old, and the earlier information will be lost.
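The overwrite hazard described above can be sketched as a simple naming convention. This is a hypothetical illustration of the idea, not the workbook's code; the store and function names are mine:

```python
# Sketch of the annotation convention: saving under an existing title would
# overwrite the earlier evaluation, so a re-evaluation gets an annotated
# title ("Widget" -> "Widget II") before it is stored.
saved = {}

def save_evaluation(title: str, data: dict) -> str:
    """Store under `title`, appending a suffix if the title is taken."""
    name, n = title, 2
    while name in saved:
        name = f"{title} {'I' * n}"  # Widget II, Widget III, ... (naive numerals)
        n += 1
    saved[name] = data
    return name

assert save_evaluation("Widget", {"trl": 2}) == "Widget"
assert save_evaluation("Widget", {"trl": 3}) == "Widget II"
assert saved["Widget"]["trl"] == 2  # the original evaluation is preserved
```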

The calculator saves evaluation data (answers to the questions) according to the name of what is being evaluated (system name, subsystem name, component name, etc.). All of the data is saved under the project name and stays in the active part of the calculator until the project is either deleted or saved. Once it is saved, the data is no longer in the active part of the calculator. Data is sorted according to the WBS number entered for each element. Elements do not have to be entered in the order of the WBS; the calculator will sort the data into a WBS hierarchy when that option is selected. When the project itself is saved, the project name is recorded in the project index. The projects can be recalled and …
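Sorting dotted WBS numbers into a hierarchy can be sketched as below. This is an illustrative assumption about how such a sort might work, not the workbook's implementation; comparing the segments as integer tuples keeps, e.g., "1.10" after "1.2":

```python
# Sketch of sorting elements into a WBS hierarchy: split each dotted WBS
# number into integer segments and sort on the resulting tuples.
def wbs_key(wbs: str):
    return tuple(int(part) for part in wbs.split("."))

# WBS numbers from the example project, entered out of order:
elements = ["1.3", "1.1", "1.2.1.1.1", "1.2.1.1", "1.10"]
ordered = sorted(elements, key=wbs_key)
assert ordered == ["1.1", "1.2.1.1", "1.2.1.1.1", "1.3", "1.10"]
```

A naive string sort would instead place "1.10" before "1.2", which is why the integer-tuple key matters.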


Date Saved: 1/16/2008
Project: EXAMPLE    Date: 4/9/23

WBS #       Sys/subsys/comp Name       TRL/SRL/MRL   Evaluator   Date
1.1         Spacecraft Bus             3  3  4       Bilbro      1/16/08
1.2.1.1     Primary Mirror             2             Bilbro      1/16/08
1.2.1.1.1   Primary Mirror Actuators   1  2          Bilbro      1/16/08
1.3         Sunshade                   1  2          Bilbro      1/16/08

Project TRL Summary


Summary of the Technology's Readiness

Sys/subsys/comp: Primary Mirror    Program:
WBS #: 1.2.1.1    Evaluator:
SW Mission Class: n/a    Safety Critical? n/a

Technology Readiness Level Achieved, Hardware: 2
Technology Readiness Level Achieved, Software:
Manufacturing Readiness Level Achieved, Hardware:

Technology Readiness Level Details
TRL 3 (percent complete for each question):

100  Critical functions/components of the concept/application identified?
 70  Subsystem or component analytical predictions made?
  0  Subsystem or component performance assessed by Modeling and Simulation?
 90  Preliminary key parameters performance metrics established?
 90  Laboratory tests and test environments established?
 10  Laboratory test support equipment and facilities completed for component/proof-of-concept testing?
 10  Component acquisition/fabrication completed?
  0  Component tests completed?
  0  Analysis of test results completed establishing key performance metrics for components/subsystems?
  0  Analytical verification of critical functions from proof-of-concept made?
  0  Analytical and experimental proof-of-concept documented?

If you wish to view the details of a given level(s) check the appropriate boxes at right and click paste. The entire page can then be saved to a new work sheet by clicking "Save All". The levels achieved and the appropriate project information is saved to the WBS summary at the same time.


Level 3 Comments:

Manufacturing Readiness Level Details
MRL 3 (percent complete for each question):

 60  Preliminary design of components/subsystems/systems to be manufactured exists?
 60  Basic manufacturing requirements identified?
 60  Current manufacturability concepts assessed?
  0  Modifications required to existing manufacturing concepts?
  0  New manufacturing concepts required?
  0  Have requirements for new materials, components, skills and facilities been identified?
 40  Preliminary process flow identified?
 40  Required manufacturing concepts identified?

Level 3 Comments:

Industry/government survey indicates that mirror fabrication is feasible in a number of different materials; however, subscale models need to be fabricated and tested at cryogenic temperatures in order to select the optimum material for fabrication at room temperature and operation at 30 K.



Summary of the Technology's Readiness
Program: NGST    Evaluator: Bilbro    Date: 1/16/08
Assessment Details (yellow set point: 50%)




Summary of the Technology's Readiness

Sys/subsys/comp: Primary Mirror Actuators    Program:
WBS #: 1.2.1.1.1    Evaluator:
SW Mission Class: n/a    Safety Critical? n/a

Technology Readiness Level Achieved, Hardware: 2  1
Technology Readiness Level Achieved, Software:
Manufacturing Readiness Level Achieved, Hardware:

Technology Readiness Level Details
TRL 2 (percent complete for each question):

100  A concept formulated?
100  Basic scientific principles underpinning concept identified?
 30  Preliminary analytical studies confirm basic concept?
100  Application identified?
 40  Preliminary design solution identified?
100  Preliminary system studies show application to be feasible?
100  Preliminary performance predictions made?
 40  Modeling & Simulation used to further refine performance predictions and confirm benefits?
100  Benefits formulated?
100  Research & development approach formulated?
100  Preliminary definition of laboratory tests and test environments established?
 40  Concept/application feasibility & benefits reported in scientific journals/conference proceedings/technical reports?


Level 2 Comments:

TRL 3 (percent complete for each question):

100  Critical functions/components of the concept/application identified?
 30  Subsystem or component analytical predictions made?
  0  Subsystem or component performance assessed by Modeling and Simulation?
 40  Preliminary key parameters performance metrics established?
 90  Laboratory tests and test environments established?
 10  Laboratory test support equipment and facilities completed for component/proof-of-concept testing?
 10  Component acquisition/fabrication completed?
  0  Component tests completed?
  0  Analysis of test results completed establishing key performance metrics for components/subsystems?
  0  Analytical verification of critical functions from proof-of-concept made?
  0  Analytical and experimental proof-of-concept documented?

Level 3 Comments:

Two design solutions have been identified and three more are in process. Progress will be documented in SPIE Astronomy meeting proceedings.


Manufacturing Readiness Level Details
MRL 3 (percent complete for each question):

 30  Preliminary design of components/subsystems/systems to be manufactured exists?
 30  Basic manufacturing requirements identified?
 40  Current manufacturability concepts assessed?
  0  Modifications required to existing manufacturing concepts?
  0  New manufacturing concepts required?
  0  Have requirements for new materials, components, skills and facilities been identified?
 20  Preliminary process flow identified?
 20  Required manufacturing concepts identified?

Level 3 Comments:

Industry/government survey indicates that actuator fabrication is feasible in a number of different materials; however, subscale models need to be fabricated and tested at cryogenic temperatures in order to select the optimum material for fabrication at room temperature and operation at 30 K.


Summary of the Technology's Readiness

Sys/subsys/comp: Sunshade    Program:
WBS #: 1.3    Evaluator:
SW Mission Class: n/a    Safety Critical? n/a

Technology Readiness Level Achieved, Hardware: 2  1
Technology Readiness Level Achieved, Software:
Manufacturing Readiness Level Achieved, Hardware:

Technology Readiness Level Details
TRL 2 (percent complete for each question):

100  A concept formulated?
100  Basic scientific principles underpinning concept identified?
 50  Preliminary analytical studies confirm basic concept?
100  Application identified?
 60  Preliminary design solution identified?
100  Preliminary system studies show application to be feasible?
100  Preliminary performance predictions made?
 60  Modeling & Simulation used to further refine performance predictions and confirm benefits?
100  Benefits formulated?
100  Research & development approach formulated?
100  Preliminary definition of laboratory tests and test environments established?
 50  Concept/application feasibility & benefits reported in scientific journals/conference proceedings/technical reports?


Level 2 Comments:

TRL 3 (percent complete for each question):

100  Critical functions/components of the concept/application identified?
 40  Subsystem or component analytical predictions made?
 20  Subsystem or component performance assessed by Modeling and Simulation?
 40  Preliminary key parameters performance metrics established?
 90  Laboratory tests and test environments established?
 10  Laboratory test support equipment and facilities completed for component/proof-of-concept testing?
 10  Component acquisition/fabrication completed?
  0  Component tests completed?
  0  Analysis of test results completed establishing key performance metrics for components/subsystems?
  0  Analytical verification of critical functions from proof-of-concept made?
  0  Analytical and experimental proof-of-concept documented?

Level 3 Comments:

Laboratory testing on potential materials expected to be completed in 2 weeks. Modeling & simulation expected to be completed in 6 weeks. Results to be presented at the SPIE Astronomy Conference in July.


Manufacturing Readiness Level Details
MRL 3 (percent complete for each question):

 30  Preliminary design of components/subsystems/systems to be manufactured exists?
 30  Basic manufacturing requirements identified?
 40  Current manufacturability concepts assessed?
  0  Modifications required to existing manufacturing concepts?
  0  New manufacturing concepts required?
  0  Have requirements for new materials, components, skills and facilities been identified?
 20  Preliminary process flow identified?
 20  Required manufacturing concepts identified?

Level 3 Comments:

Industry/government survey indicates that sunshade fabrication is feasible in a number of different materials; however, subscale models need to be fabricated and tested


Summary of the Technology's Readiness

Sys/subsys/comp: Spacecraft Bus Program:

WBS #: 1.1 Evaluator:

SW Mission Class: n / a Safety Critical? n / a

1 2 3 4 5 6 7 8 9

Technology Readiness Level Achieved Hardware 3

Technology Readiness Level Achieved Software

Manufacturing Readiness Level Achieved Hardware 4 3

Technology Readiness Level Details

TRL 4

80

80 Preliminary definition of operational environment completed?

70 Laboratory tests and test environments defined for breadboard testing?

70

100 Key parameter performance metrics established for breadboard laboratory tests?

100 Laboratory test support equipment and facilities completed for breadboard testing?

10 System/subsystem/component level breadboard fabrication completed?

0 Breadboard tests completed?

0 Analysis of test results completed verifying performance relative to predictions?

0 Preliminary system requirements for end user's application defined?

40

40 Relevant test environment defined?

If you wish to view the details of a given level(s) check the appropriate boxes at right and click paste. The entire page can then be saved to a new work sheet by clicking "Save All". The levels achieved and the appropriate project information is saved to the WBS summary at the same time.

Concept/application translated into detailed system/subsystem/component level breadboard design?

Pre-test predictions of breadboard performance in a laboratory environment assessed by Modeling and Simulation?

Critical test environments and performance predictions defined relative to the preliminary definition of the operating environment?


Level 4 Comments:

Manufacturing Readiness Level Details

MRL 4

40 Manufacturing requirements (including testing) finalized?

70 Machine/Tooling requirements identified?

30 Has training/certification been identified for all skills required (particularly new skills)?

70 Material requirements identified?

60 Producibility assessment initialized?

60 Machinery/tooling modifications breadboarded?

80 Key manufacturing processes identified?

80 New machinery/tooling breadboarded?

50 Metrology requirements identified?

80 Key metrology components breadboarded?

Breadboard performance results verifying analytical predictions and definition of relevant operational environment documented?

3 weeks to breadboard design complete. Preliminary operational environment due for completion in 1 week. All other activities to be complete within 2 months.


80 Key analytical tool requirements identified?

70 Key analytical tools breadboarded?

70 Key manufacturing processes assessed in laboratory?

70 Mitigation strategies identified to address manufacturability / producibility shortfalls?

50 All Manufacturing Processes Identified?

Level 4 Comments:

All manufacturing tasks will be completed in 3 weeks.


Summary of the Technology's Readiness

NGST

Bilbro Date: 1/16/08

Assessment Details: Yellow set point 50%. Level 1 2 3 4 5 6 7 8 9 for TRL / SRL / MRL.

Technology Readiness Level Details

TRL 4

Preliminary definition of operational environment completed?

Laboratory tests and test environments defined for breadboard testing?

Key parameter performance metrics established for breadboard laboratory tests?

Laboratory test support equipment and facilities completed for breadboard testing?

System/subsystem/component level breadboard fabrication completed?

Breadboard tests completed?

Analysis of test results completed verifying performance relative to predictions?

Preliminary system requirements for end user's application defined?

Relevant test environment defined?


Concept/application translated into detailed system/subsystem/component level breadboard design?

Pre-test predictions of breadboard performance in a laboratory environment assessed by Modeling and Simulation?

Critical test environments and performance predictions defined relative to the preliminary definition of the operating environment?


Manufacturing Readiness Level Details

MRL 4

Manufacturing requirements (including testing) finalized?

Machine/Tooling requirements identified?

Has training/certification been identified for all skills required (particularly new skills)?

Material requirements identified?

Producibility assessment initialized?

Machinery/tooling modifications breadboarded?

Key manufacturing processes identified?

New machinery/tooling breadboarded?

Metrology requirements identified?

Key metrology components breadboarded?

Breadboard performance results verifying analytical predictions and definition of relevant operational environment documented?


Key analytical tool requirements identified?

Key analytical tools breadboarded?

Key manufacturing processes assessed in laboratory?

Mitigation strategies identified to address manufacturability / producibility shortfalls?

All Manufacturing Processes Identified?

All manufacturing tasks will be completed in 3 weeks.


Date Saved:

Project: Date: 4/9/23


WBS # Sys/subsys/comp Name Details? TRL TRL SRL SRL MRL MRL Evaluator Date Delete?

Project TRL Summary (row indicators, check box values, details)


Summary of the Technology's Readiness

Sys/subsys/comp: 0    Program: 0

WBS #: 0    Evaluator: 0    Date: 4/9/23

SW Mission Class: n/a    Safety Critical? n/a

Level: 1 2 3 4 5 6 7 8 9

Technology Readiness Level Achieved, Hardware

Technology Readiness Level Achieved, Software

Manufacturing Readiness Level Achieved, Hardware

Assessment Details: Yellow set point 50%. Level 1 2 3 4 5 6 7 8 9 for TRL / SRL / MRL.



Technology Assessment Process

START: Assess systems, subsystems and components per the hierarchical product breakdown of the WBS.

Assign TRL to all components based on assessment of maturity.

Assign TRL to subsystems based on lowest TRL of components + TRL state of integration.

Assign TRL to systems based on lowest TRL of subsystems + TRL state of integration.

Identify all components, subsystems and systems that are at lower TRLs than required by the program (Baseline Technological Maturity Assessment for SRR).

Perform AD2 on all identified components, subsystems and systems that are below the requisite maturity level (Technology Development Plan, Cost Plan, Schedule Plan, Risk Assessment).

Technology Readiness Assessment Report for PDR.
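The roll-up rule in the process above (a parent element is no more mature than its least mature child, capped by the maturity of their integration) can be sketched in a few lines. The function and argument names are illustrative assumptions, not identifiers taken from the calculator workbook:

```python
def rolled_up_trl(child_trls, integration_trl=None):
    """Roll up a parent's TRL from its children, per the process above:
    a subsystem (or system) can be no more mature than its least mature
    component (or subsystem), nor more mature than the state of their
    integration. Names here are illustrative, not from the workbook."""
    parent = min(child_trls)                   # lowest TRL of the children
    if integration_trl is not None:
        parent = min(parent, integration_trl)  # cap by integration maturity
    return parent

# Example: components at TRL 4, 6 and 5, integration demonstrated to TRL 3.
subsystem_trl = rolled_up_trl([4, 6, 5], integration_trl=3)  # -> 3
```

Any element whose rolled-up TRL falls below what the program requires is then a candidate for the AD2 assessment described above.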


You have not documented any task accomplishment at this TRL or above.

You can justify a claim that your technology program has achieved this TRL.

Percent Complete for Levels

This is used to indicate that work has started in a given level.

100% Each level and all levels below it must complete 100% of the questions before turning green.
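The "each level and all levels below it must complete 100%" rule means the achieved level is the top of the unbroken run of green levels starting at level 1. A minimal sketch, with illustrative names (not the workbook's formulas):

```python
def achieved_level(green_by_level):
    """Highest level achieved: the top of the unbroken run of green
    levels starting at level 1. Isolated green levels higher up do not
    count until everything below them is also green."""
    achieved = 0
    for level, is_green in enumerate(green_by_level, start=1):
        if not is_green:
            break          # first non-green level ends the run
        achieved = level
    return achieved

# Example: levels 1-3 fully complete, level 4 not -> level 3 achieved,
# even if isolated work exists at higher levels.
```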


Other Color Code Conventions

Background, cell isn't used.

XXX Title of sheet or section. If blank, a decorative border.

Xxx Comments, explanatory material, and instructions.


Calculator Instructions and Color Code Descriptions

One or more tasks have been accomplished at this TRL or above, but there are enough tasks undone so that you cannot claim achievement of this TRL.

You have accomplished many of the tasks required for this TRL, and you may be able to justify achievement of this TRL depending on which tasks are still undone.

You can answer the questions by using sliders at the left margin to show what portion of a task has been accomplished. This feature lets you track progress to help manage your project. When the total % complete of the sliders and the checked boxes reaches or exceeds the value set by the "yellow set" spinner at the top of the sheet, the thermometer turns color at that level according to the criteria below. You can set the % complete for the "yellow set point" at any value from 50% to 85%. Any activity turns the level red.

You may have activity at higher levels indicated by red or yellow, but higher levels will never turn green until all of the levels below are green. Everything must be completed to turn a level green.
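A rough sketch of the thermometer color rule described above, for a single level. The function name, return values, and the exact boundary between red and yellow are my assumptions, not the workbook's hidden-cell formulas:

```python
def level_color(pct_complete, lower_levels_green, yellow_set_point=0.50):
    """Approximate color logic for one level, as read from this sheet's
    description (assumed, not the actual workbook formulas)."""
    if not 0.50 <= yellow_set_point <= 0.85:    # spinner range per the sheet
        raise ValueError("yellow set point must be between 50% and 85%")
    if pct_complete >= 1.0 and lower_levels_green:
        return "green"     # all questions done and every level below is green
    if pct_complete >= yellow_set_point:
        return "yellow"    # reached or exceeded the yellow set point
    if pct_complete > 0:
        return "red"       # some activity, but short of the set point
    return "blank"         # no documented accomplishment at this level
```

Note that a level that is 100% complete still shows only yellow until every level below it is green, matching the rule above.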

Default % complete: 5%. Yellow Set Point: 50% to 85%.

It could be OK to set the default % complete value to this number, but you still have a lot of each task undone. Consider raising the value.

Throughout this spreadsheet, we have used a consistent color scheme to identify individual TRLs. The overall "flow" of the colors goes from blue to green as the technology's readiness goes from pure research to pure development. Here are the specific colors with the indicated TRL for each:

We have tried to use a standard set of cell colors to indicate what the cell contents are, as well as whether or not the cell is available for data entry, etc. We used the spreadsheet's protection feature to preclude altering some cells. Much of the computation is done in hidden cells on the Calculator sheet. The standard color scheme is given below:

Calculator technical maturity (TRL) questions to be answered with check box, radio button, or slider.

Calculator manufacturing readiness (MRL) questions to be answered with check box or slider. For software, MRL refers to quality issues.

Calculator programmatic readiness (PRL) questions to be answered with check box or slider.


Data entry by check box or radio buttons. Comments are entered using the keyboard.

Computed or calculated value, no manual data entry permitted.


Version Tracking and Configuration Control Sheet

1.0, 11-Mar-02: Beta release. First numbered version. Configuration baseline. Version 1.0 introduces version numbering as a form of configuration management. In this version, we have "swatted" some bugs discovered in pre-release testing. This hardware calculator is the companion to Software TRL Calculator v.1.0.

Interim 1.01 (included in 1.1), N/A: Color codes for technical and program questions added. Color code sheet added. Added hidden table to compute numeric values for TRL and PRT. "Yellow" and "Green" levels achieved are displayed in the calculator's TRL and PRT title bars. Added DT&E and OT&E to AFRL Commentary sheet.

1.1, 15-Aug-02: The spreadsheet computes each section TRL/PRT separately. You can weight each section according to relative importance. The red, yellow, and green graphical display is still unweighted, but the overall TRL/PRT weighted geometric mean is also given. Updated TRL definitions and descriptions IAW 5000.2-R dated April 5, 2002.

1.11 (interim release), 14-Nov-02: Separate sheet for TRL and PRT (Cal Verity). Fix weighted TRL jump to 9 when 0 selected (Cal Verity). Standard format for hardware and software, TRL and PRT, using both % Complete and Weighted TRL/PRT features. Adjustable point where % complete adds to count. Alternate versions with and without PRT created.

1.12, 15-Jan-03: Separates documentation from calculators to eliminate repetition. Added some background material to documentation.

2.01, 26-Jan-04: Major revision. Beta test release of items included in version 2.1.

This calculator is a further modification of the calculator developed by Mr. William Nolte of the Air Force Research Laboratory and modified by Mr. James W. Bilbro when he was Assistant Director for Technology/Chief Technologist, George C. Marshall Space Flight Center (MSFC), AL. This calculator integrates the AD2 Calculator with the aforementioned TRL Calculator into a single entity. The original AD2 Calculator was developed by Mr. Bilbro while at MSFC and was coded by Mr. John Cole also of MSFC. Questions regarding this version should be directed to Mr. James W. Bilbro, JB Consulting International, 4017 Panorama Drive SE, Huntsville, AL 35801, Telephone (256) 655-6273, E-mail [email protected]. The original calculator was designed and developed by Mr. William L. Nolte, AFRL/SNOX, 2241 Avionics Circle, WPAFB OH 45433-7302, Telephone (937) 255-4202 Ext. 4040, E-mail [email protected].

Page 82: TRL Calculator Ver BI.1 Beta

2.1, 1-Apr-04: Major revision. MRL added. Aggregate overall TRL computed. Allows selection of Hardware, Software or Both. Allows selection of TRL, PRL, and/or MRL. Allows for adding or deleting questions from TRL computation. Allows user to hide unused questions and blank rows. Lets user assume completion of TRL 1 through 3. Top level view added. Questions organized by TRL, not category. Display of Green requires all questions at that level to be checked. Weighted TRL no longer computed. Summary sheet added to display results.

2.2, 17-May-04: Wrote MRL Definitions and MRL Background sheet. Approved for public release.

NASA Variant, 2-Mar-07: Rearranged questions and reorganized display to meet needs of NASA. See release notes for NASA Variant for details.

AFRL-NASA Ver 3beta, 7-Jul-07: Added capability to store data as a function of WBS. Rearranged and further modified questions and displays.

TRL Calculator Ver 4 beta, 25-Jan-08: Added capability to store and recall project data.

TRL/AD2 Integrated Ver I, 25-Jan-08: Integrated TRL and AD2 Calculators into a single workbook.

TRL/AD2 Integrated Ver I.1beta, 21-Mar-08: Added additional blank questions for the TRL calculator and added a TRL AD2 Project Status Calculator.

TRL & AD2 Calculators Ver BI.1beta, 21-Mar-08: Created stand-alone TRL and AD2 Calculators from the integrated version.


TRL Calculator

1 2 3 4 5 6 7 8 9

Technology Readiness Level Achieved, Hardware    (Hide blank rows; Use TRL?)

Technology Readiness Level Achieved, Software    (% Complete set point; Use SRL?)

Manufacturing Readiness Level Achieved, Hardware    (Yellow set point: 50%; Use MRL?)

Green set point: 100%    Default % Complete:

Sys/subsys/comp:    Project:

WBS #:    Evaluator:    Date: 4/9/23

SW Mission Class: n/a (Class A, B, C, or D)    Safety Critical?

HW/SW/Mfg    % Complete    LEVEL 1 (Check all that apply or use slider for % complete)

H 0 Research hypothesis formulated?
H 0 Basic scientific principles observed?
H 0 Physical laws and assumptions underpinning observations defined?
H 0 Physical laws and assumptions underpinning observations verified?
H 0 Basic elements of technology identified?
H 0 Scientific knowledge generated underpinning hypothesis?
H 0 Peer reviewed publication of studies confirming basic principles?
S 0 Research hypothesis formulated?
S 0 Basic scientific principles observed?
S 0 Algorithmic functions and assumptions underpinning observations defined?
S 0 Algorithmic functions and assumptions underpinning observations verified?
S 0 Basic mathematical formulations identified?
S 0 Scientific knowledge generated underpinning hypotheses?
S 0 Peer reviewed publication of studies confirming basic principles?

Level 1 Comments:    Max TRL1

HW/SW/Mfg    % Complete    LEVEL 2 (Check all that apply or use slider for % complete)

H 0 A concept formulated?
H 0 Basic scientific principles underpinning concept identified?
H 0 Preliminary analytical studies confirm basic concept?
H 0 Application identified?
H 0 Preliminary design solution identified?
H 0 Preliminary system studies show application to be feasible?
H 0 Preliminary performance predictions made?
H 0 Modeling & Simulation used to further refine performance predictions and confirm benefits?
H 0 Benefits formulated?
H 0 Research & development approach formulated?
H 0 Preliminary definition of Laboratory tests and test environments established?
H 0 Concept/application feasibility & benefits reported in scientific journals/conference proceedings/technical reports?
S 0 A concept formulated?
S 0 Basic scientific principles underpinning concept identified?
S 0 Basic properties of algorithms, representations & concepts defined?
S 0 Preliminary analytical studies confirm basic concept?
S 0 Application identified?
S 0 Preliminary design solution identified?
S 0 Preliminary system studies show application to be feasible?
S 0 Preliminary performance predictions made?
S 0 Basic principles coded?
S 0 Modeling & Simulation used to further refine performance predictions and confirm benefits?
S 0 Benefits formulated?
S 0 Research & development approach formulated?
S 0 Preliminary definition of Laboratory tests and test environments established?
S 0 Experiments performed with synthetic data?
S 0 Concept/application feasibility & benefits reported in scientific journals/conference proceedings/technical reports?

Level 2 Comments:    Max TRL2

Controls: Change yellow set point below. Do you want to assume completion of TRL 1? Do you want to assume completion of TRL 2? Do you want to include Hardware TRL? Do you want to include Software SRL? Do you want to include Manufacturing MRL?

HW/SW/Mfg    % Complete    LEVEL 3 (Check all that apply or use slider for % complete)

H 0 Critical functions/components of the concept/application identified?
H 0 Subsystem or component analytical predictions made?
H 0 Subsystem or component performance assessed by Modeling and Simulation?
H 0 Preliminary key parameters performance metrics established?
H 0 Laboratory tests and test environments established?
H 0 Laboratory test support equipment and facilities completed for component/proof-of-concept testing?
H 0 Component acquisition/fabrication completed?
H 0 Component tests completed?
H 0 Analysis of test results completed establishing key performance metrics for components/subsystems?
H 0 Analytical verification of critical functions from proof-of-concept made?
H 0 Analytical and experimental proof-of-concept documented?
S 0 Critical functions/components of the concept/application identified?
S 0 Subsystem or component analytical predictions made?
S 0 Subsystem or component performance assessed by Modeling and Simulation?
S 0 Preliminary performance metrics established for key parameters?
S 0 Laboratory tests and test environments established?
S 0 Laboratory test support equipment and computing environment completed for component/proof-of-concept testing?
S 0 Component acquisition/coding completed?
S 0 Component V&V completed?
S 0 Analysis of test results completed establishing key performance metrics for components/subsystems?
S 0 Analytical verification of critical functions from proof-of-concept made?
S 0 Analytical and experimental proof-of-concept documented?
M 0 Preliminary design of components/subsystem/systems to be manufactured exists?
M 0 Basic manufacturing requirements identified?
M 0 Current manufacturability concepts assessed?
M 0 Modifications required to existing manufacturing concepts?
M 0 New manufacturing concepts required?
M 0 Have requirements for new materials, components, skills and facilities been identified?
M 0 Preliminary process flow identified?
M 0 Required manufacturing concepts identified?

Level 3 Comments:    Max TRL3

Do you want to assume completion of TRL 3?

HW/SW/Mfg    % Complete    LEVEL 4 (Check all that apply or use slider for % complete)

H 0 Concept/application translated into detailed system/subsystem/component level breadboard design?
H 0 Preliminary definition of operational environment completed?
H 0 Laboratory tests and test environments defined for breadboard testing?
H 0 Pre-test predictions of breadboard performance in a laboratory environment assessed by Modeling and Simulation?
H 0 Key parameter performance metrics established for breadboard laboratory tests?
H 0 Laboratory test support equipment and facilities completed for breadboard testing?
H 0 System/subsystem/component level breadboard fabrication completed?
H 0 Breadboard tests completed?
H 0 Analysis of test results completed verifying performance relative to predictions?
H 0 Preliminary system requirements for end user's application defined?
H 0 Critical test environments and performance predictions defined relative to the preliminary definition of the operating environment?
H 0 Relevant test environment defined?
H 0 Breadboard performance results verifying analytical predictions and definition of relevant operational environment documented?
S 0 Concept/application translated into detailed system/subsystem/component level software architecture design?
S 0 Preliminary definition of operational environment completed?
S 0 Laboratory tests and test environments defined for integrated component testing?
S 0 Pre-test predictions of integrated component performance in a laboratory environment assessed by Modeling and Simulation?
S 0 Key parameter performance metrics established for integrated component laboratory tests?
S 0 Laboratory test support equipment and computing environment completed for integrated component testing?
S 0 System/subsystem/component level coding completed?
S 0 Integrated component tests completed?
S 0 Analysis of test results completed verifying performance relative to predictions?
S 0 Preliminary system requirements defined for end users' application?
S 0 Critical test environments and performance predictions defined relative to the preliminary definition of the operating environment?
S 0 Relevant test environment defined?
S 0 Integrated component performance results verifying analytical predictions and definition of relevant operational environment documented?
S 0 Integrated component tests completed for reused code?
M 0 Manufacturing requirements (including testing) finalized?
M 0 Machine/Tooling requirements identified?
M 0 Has training/certification been identified for all skills required (particularly new skills)?
M 0 Material requirements identified?
M 0 Producibility assessment initialized?
M 0 Machinery/tooling modifications breadboarded?
M 0 Key manufacturing processes identified?
M 0 New machinery/tooling breadboarded?
M 0 Metrology requirements identified?
M 0 Key metrology components breadboarded?
M 0 Key analytical tool requirements identified?
M 0 Key analytical tools breadboarded?
M 0 Key manufacturing processes assessed in laboratory?
M 0 Mitigation strategies identified to address manufacturability/producibility shortfalls?
M 0 All Manufacturing Processes Identified?

Level 4 Comments:    Max TRL4

HW/SW/Mfg    % Complete    LEVEL 5 (Check all that apply or use sliders)

H 0 Critical functions and associated subsystems and components identified?
H 0 Relevant environments finalized?
H 0 Scaling requirements defined & documented?
H 0 Critical subsystems and components breadboards/brassboards identified and designed?
H 0 Subsystem/component breadboards/brassboards built?
H 0 Facilities, GSE, STE available to support testing in a relevant environment?
H 0 M&S pre-test performance predictions completed?
H 0 System level performance predictions made for subsequent development phases?
H 0 Breadboards/brassboards successfully demonstrated in a relevant environment?
H 0 Successful demonstration documented along with scaling requirements?
S 0 Critical functions and associated subsystems and components identified?
S 0 Relevant environments finalized?
S 0 Scaling requirements defined & documented?
S 0 Critical subsystems and component implementations identified and designed?
S 0 Subsystem/component integrations and implementations completed?
S 0 Facilities, GSE, STE available to support testing in a relevant environment?
S 0 Modeling & Simulation pre-test performance predictions completed?
S 0 System level performance predictions made for subsequent development phases?
S 0 Subsystems/integrated components successfully demonstrated in a relevant environment?
S 0 Successful demonstration documented along with scaling requirements?
M 0 Design techniques defined to the point where largest problems defined?
M 0 System interface requirements known?
M 0 Trade studies and lab experiments define key manufacturing processes?
M 0 Tooling and machines demonstrated in lab?
M 0 Quality and reliability considered, but target levels not yet established?
M 0 Initial assessment of assembly needs conducted?
M 0 Production processes reviewed with manufacturing for producibility?
M 0 Limited pre-production hardware produced?
M 0 Some special purpose components combined with available laboratory components?
M 0 Process, tooling, inspection, and test equipment in development?
M 0 Have facility requirements been identified?
M 0 Has a manufacturing flow chart been developed?
M 0 Manufacturing process developed?

Level 5 Comments:    Max TRL5

HW/SW/Mfg% Complete LEVEL 6 (Check all that apply or use sliders) TRL

H 0 System requirements finalized? ###

H 0 Operating environment definition finalized? ###

H 0 Subset of relevant environments identified that address key aspects of the final operating environment? ###

H 0 M&S used to simulate system performance in an operational environment? ###

H 0 M&S used to simulate system/subsystem engineering model/prototype performance in the relevant environment? ###

H 0 External interfaces baselined? ###

H 0 Scaling requirements finalized? ###

H 0 Facilities, GSE, STE available to support engineering model testing in the relevant environment? ###

H 0 Engineering model or prototype that adequately addresses critical scaling issues fabricated? ###

H 0 Engineering model or prototype that adequately addresses critical scaling issues tested in the relevant environment? ###

H 0 Analysis of test results verifies performance predictions for relevant environment? ###

H 0 Test performance demonstrating agreement with performance predictions documented? ###

S 0 System requirements finalized? ###

S 0 Operating environment definition finalized? ###

S 0 Subset of relevant environments identified that address key aspects of the final operating environment? ###

S 0 Modeling and simulation used to simulate system performance in an operational environment? ###

S 0 Modeling and simulation used to simulate system/subsystem engineering model/prototype performance in the relevant environment? ###

S 0 Hardware/software interfaces baselined? ###

S 0 Scaling requirements finalized? ###

S 0 Facilities, computing environment available to support software model testing in the relevant environment? ###

S 0 Software model or prototype built that adequately addresses critical scaling issues? ###

S 0 Software model or prototype built that adequately addresses critical scaling issues tested in the relevant environment? ###

S 0 Prototype implementation of the software demonstrated on full-scale realistic application? ###

S 0 Analysis of test results verifies performance predictions for relevant environment? ###

S 0 Initial draft of required software documentation completed? ###

S 0 Test performance documented demonstrating agreement with performance predictions? ###

S 0 Engineering feasibility fully demonstrated and documented? ###

M 0 Quality and reliability levels established? ###

M 0 Materials, process, design, and integration methods employed? ###

M 0 Investment needs for process and tooling determined? ###

M 0 Production issues identified and major ones have been resolved? ###

M 0 Most pre-production hardware available? ###

M 0 Process and tooling mature? ###

M 0 Critical manufacturing processes prototyped and targets for improved yield established? ###

M 0 Components functionally compatible with operational system? ###

M 0 Integration demonstrations completed? ###

M 0 Initial manufacturing plan developed? ###

M 0 Critical schedule paths identified? ###

M 0 Storage and handling issues addressed? ###

M 0 Long-lead items identified? ###

M 0 Initial production cost estimation complete? ###

M 0 Production demonstrations complete? ###

Level 6 Comments: Max TRL6

###
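The sliders above record a completion percentage for each checklist question. The roll-up formula the calculator uses is not shown in this extract; as a sketch only, one common convention is that a level counts as achieved when its questions (and those of every lower level) reach a completion threshold, and the reported TRL is the highest such level. The function name and threshold below are illustrative assumptions, not the calculator's actual logic:

```python
def assess_trl(level_scores, threshold=100.0):
    """Return the highest TRL whose checklist, and every lower level's,
    reaches `threshold` percent average completion.

    level_scores: dict mapping TRL level -> list of per-question
    completion percentages (0-100), as read from the sliders/checkboxes.
    """
    achieved = 0
    for level in sorted(level_scores):
        answers = level_scores[level]
        avg = sum(answers) / len(answers) if answers else 0.0
        if avg >= threshold:
            achieved = level
        else:
            break  # levels must be satisfied in order
    return achieved

scores = {4: [100, 100, 100], 5: [100, 100, 100], 6: [100, 50, 0]}
print(assess_trl(scores))  # -> 5
```

Under this convention a single incomplete question caps the assessed TRL at the level below it, which matches the "check all that apply" framing of each level's question set.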

HW/SW/Mfg% Complete LEVEL 7 (Check all that apply or use sliders) TRL

H 0 The flight hardware design baselined? ###

H 0 Design addresses all critical scaling issues? ###

H 0 M&S used to predict performance in the operational environment? ###

H 0 Facilities, GSE, STE available to support prototype and qualification testing of flight demonstrator? ###

H 0 Fully integrated engineering model or scaled prototype unit that adequately addresses all critical scaling & external interfaces built? ###

H 0 Qualification tests for flight demonstrator completed? ###

H 0 All performance specifications verified by test or analysis? ###


Page 89: TRL Calculator Ver BI.1 Beta

H 0 Fully integrated prototype unit successfully demonstrated in operational environment? ###

H 0 All final acceptance testing plans/procedures/criteria have been baselined? ###

H 0 Successful flight demonstration documented? ###

S 0 Hardware interfaces baselined? ###

S 0 Design addresses all critical scaling issues? ###

S 0 Modeling and simulation used to predict performance in the operational environment? ###

S 0 Facilities, computing environment available to support prototype and qualification testing of operational software? ###

S 0 Fully integrated software model or scaled prototype system coded that adequately addresses all critical scaling issues and component and hardware interfaces? ###

S 0 All software testing/V&V specified in software development plan completed and results documented?###

S 0 All performance specifications verified by test or analysis? ###

S 0 Fully integrated prototype software successfully demonstrated in operational environment? ###

S 0 All final acceptance testing plans/procedures/criteria have been baselined? ###

S 0 Intermediate draft of required software documentation completed? ###

S 0 Successful operational demonstration documented? ###

M 0 Materials, processes, methods, and design techniques baselined? ###

M 0 Most maintainability, reliability, and supportability data available? ###

M 0 Manufacturing processes baselined? ###

M 0 Production planning complete? ###

M 0 Process tooling and inspection / test equipment demonstrated? ###

M 0 Machines and tooling demonstrated in pre-production environment? ###

M 0 Integration facilities ready & available? ###

M 0 Prototype system built on "soft" tooling? ###

M 0 Prototype improves to pre-production quality? ###

M 0 Ready for Low Rate Initial Production (LRIP)? ###

###

Level 7 Comments: Max TRL7

###

###

###

HW/SW/Mfg% Complete LEVEL 8 (Check all that apply or use sliders) TRL

H 0 All flight hardware subsystems design complete? ###

H 0 All flight hardware interfaces defined? ###

H 0 Flight system design complete? ###

H 0 Flight and ground operational plans baselined? ###

H 0 All flight system qualification tests completed? ###

H 0 All system performance specifications verified by either test or analysis? ###

H 0 All component/subsystem/system flight hardware system acceptance testing completed? ###

H 0 All flight hardware delivered for integration? ###

H 0 Final system integration complete? ###

H 0 System readiness for launch/operation documented? ###

S 0 All software components and hardware interfaces defined and design complete? ###

S 0 Software operations, maintenance and retirement baselined and draft documentation prepared? ###

S 0 All software testing/V&V completed as specified in project plans and results documented, reviewed and approved by appropriate authorities? ###

S 0 All defects and software issues, including impacts on requirements, either fixed or resolved, and results documented? ###

S 0 All component/subsystem/ hardware interface system acceptance testing completed? ###

S 0 Software readiness for launch/operation documented? ###

S 0 All required software user documentation, version description, and maintenance documentation completed? ###

M 0 Maintainability, reliability, and supportability data collection completed? ###

M 0 All materials are in production and readily available? ###

M 0 Supply chain established? ###

M 0 All manufacturing equipment, tooling, metrology and analysis systems in place? ###

M 0 Manufacturing facilities complete and ready for full rate production? ###

M 0 All manufacturing processes validated under operational conditions? ###

###

Level 8 Comments: Max TRL8

###

###

###

HW/SW/Mfg% Complete LEVEL 9 (Check all that apply or use sliders) TRL

H 0 Flight system inserted into operational environment? ###

H 0 Flight system operated in operational environment? ###


Page 90: TRL Calculator Ver BI.1 Beta

H 0 Flight system performance analyzed? ###

H 0 Flight system performance verified as meeting operational requirements? ###

H 0 Verification of flight system performance meeting operational requirements documented? ###

S 0 Software fully integrated and operated in the operational environment? ###

S 0 Software performance analyzed, verified and documented as meeting operational requirements?###

S 0 All required software documentation completed? (Minimum NASA software documentation requirements are described in NPR 7150.2. Center/project may elaborate, tailor or augment.) ###

S 0 Sustaining software engineering support in place? ###

S 0 Software operations, maintenance and retirement procedures finalized and documented? ###

M 0 Design stable? ###

M 0 Production operating at desired levels? ###

M 0 Planned product improvement program in place for future acquisitions? ###

M 0 All manufacturing processes controlled to appropriate quality level? ###

Level 9 Comments: Max TRL9


Page 91: TRL Calculator Ver BI.1 Beta

*Development Terminology

Proof of Concept: (TRL 3)
Analytical and experimental demonstration of hardware/software concepts that may or may not be incorporated into subsequent development and/or operational units.

Breadboard: (TRL 4)
A low fidelity unit that demonstrates function only, without respect to form or fit in the case of hardware, or platform in the case of software. It often uses commercial and/or ad hoc components and is not intended to provide definitive information regarding operational performance.

Developmental Model/Developmental Test Model: (TRL 4)
Any of a series of units built to evaluate various aspects of form, fit, function or any combination thereof. In general these units may have some high fidelity aspects but overall will be in the breadboard category.

Mass Model: (TRL 5)
Nonfunctional hardware that demonstrates form and/or fit for use in interface testing, handling, and modal anchoring.

Brassboard: (TRL 5 – TRL 6)
A mid-fidelity functional unit that typically tries to make use of as much operational hardware/software as possible and begins to address scaling issues associated with the operational system. It does not have the engineering pedigree in all aspects, but is structured to be able to operate in simulated operational environments in order to assess performance of critical functions.

Subscale Model: (TRL 5 – TRL 7)
Hardware demonstrated in subscale to reduce cost and address critical aspects of the final system. If done at a scale that is adequate to address final system performance issues, it may become the prototype.

Proof Model: (TRL 6)
Hardware built for functional validation up to the breaking point, usually associated with fluid system over-pressure, vibration, force loads, environmental extremes, and other mechanical stresses.


Prototype Unit: (TRL 6 – TRL 7)
The prototype unit demonstrates form (shape and interfaces), fit (must be at a scale to adequately address critical full size issues), and function (full performance capability) of the final hardware. It can be considered as the first Engineering Model. It does not have the engineering pedigree or data to support its use in environments outside of a controlled laboratory environment – except for instances where a specific environment is required to enable the functional operation, including in-space. It is to the maximum extent possible identical to flight hardware/software and is built to test the manufacturing and testing processes at a scale that is appropriate to address critical full scale issues.

Engineering Model: (TRL 6 – TRL 8)
A full scale high-fidelity unit that demonstrates critical aspects of the engineering processes involved in the development of the operational unit. It demonstrates function, form, fit or any combination thereof at a scale that is deemed to be representative of the final product operating in its operational environment. Engineering test units are intended to closely resemble the final product (hardware/software) to the maximum extent possible and are built and tested so as to establish confidence that the design will function in the expected environments. In some cases, the engineering unit will become the protoflight or final product, assuming proper traceability has been exercised over the components and hardware handling.

Flight Qualification Unit: (TRL 8)
Flight hardware that is tested to the levels that demonstrate the desired margins, particularly for exposing fatigue stress, typically 20-30%. Sometimes this means testing to failure. This unit is never flown. Key overtest levels are usually +6 dB above maximum expected for 3 minutes in all axes for shock, acoustic, and vibration; thermal vacuum 10 C beyond acceptance for 6 cycles; and 1.25 times static load for unmanned flight.

Flight Qualified Unit: (TRL 8 – TRL 9)
Actual flight hardware/software that has been through acceptance testing. Acceptance test levels are designed to demonstrate flight-worthiness and to screen for infant failures without degrading performance. The levels are typically less than anticipated levels.

Protoflight Unit: (TRL 8 – TRL 9)
Hardware built for the flight mission that includes the lessons learned from the Engineering Model but where no Qualification model was built, to reduce cost. It is however tested to enhanced environmental acceptance levels. It becomes the mission flight article. A higher risk tolerance is accepted as a tradeoff. Key protoflight overtest levels are usually +3 dB for shock, vibration, and acoustic; 5 C beyond acceptance levels for thermal vacuum tests.

Flight Proven: (TRL 9)
Hardware/software that is identical to hardware/software that has been successfully operated in a space mission.


Environmental Definitions:

Laboratory Environment:
An environment that does not address in any manner the environment to be encountered by the system, subsystem or component (hardware or software) during its intended operation. Tests in a laboratory environment are solely for the purpose of demonstrating the underlying principles of technical performance (functions) without respect to the impact of environment.

Relevant Environment:
Not all systems, subsystems and/or components need to be operated in the operational environment in order to satisfactorily address performance margin requirements. Consequently, the relevant environment is the specific subset of the operational environment that is required to demonstrate critical "at risk" aspects of the final product performance in an operational environment.

Operational Environment:
The environment in which the final product will be operated. In the case of spaceflight hardware/software, it is space. In the case of ground based or airborne systems that are not directed toward space flight, it will be the environments defined by the scope of operations. For software, the environment will be defined by the operational platform and software operating system.

Additional Definitions:

Mission Configuration:
The final architecture/system design of the product that will be used in the operational environment. If the product is a subsystem/component, then it is embedded in the actual system in the actual configuration used in operation.

Verification – Demonstration by test that a device meets its functional and environmental requirements. (i.e., was it built right?)

Validation – Determination that a device was built in accordance with the totality of its prescribed requirements by any appropriate method. Commonly uses a verification matrix of requirement and method of verification. (i.e., did I build the right thing?)
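The definition above notes that this determination "commonly uses a verification matrix of requirement and method of verification." A minimal sketch of such a matrix follows; the requirements, methods, and statuses are invented examples, not drawn from any actual project:

```python
# Hypothetical requirements verification matrix: each row pairs a
# requirement with its verification method and closure status.
matrix = [
    {"req": "Operating temperature -20C to +50C", "method": "Test", "closed": True},
    {"req": "Mass <= 12 kg", "method": "Analysis", "closed": True},
    {"req": "Command latency < 100 ms", "method": "Demonstration", "closed": False},
]

# Open items are the requirements whose verification is not yet closed.
open_items = [row["req"] for row in matrix if not row["closed"]]
print(f"{len(open_items)} requirement(s) not yet verified:", open_items)
```

Tracking the matrix this way makes the exit question "all performance specifications verified by test or analysis?" a simple check that the open-items list is empty.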

Part – Single piece or joined pieces impaired or destroyed if disassembled – e.g., a resistor.

Subassembly or component – Two or more parts capable of disassembly or replacement – e.g., a populated printed circuit board.


*from NASA NPR 7120.8

Assembly or Unit – A complete and separate lowest level functional item – e.g., a valve.

Subsystem – Assembly of functionally related and interconnected units – e.g., an electrical power subsystem.

System – The composite equipment, methods, and facilities to perform an operational role.

Segment – The constellation of systems, segments, software, ground support, and other attributes required for an integrated constellation of systems.


NASA TRL, SRL Scales & Exit Criteria & JDMTP* MRL Scale

Level 1 – Basic principles observed and reported.
Hardware: Scientific knowledge generated underpinning hardware technology concepts/applications.
Software: Scientific knowledge generated underpinning basic properties of software architecture and mathematical formulation.

Level 2 – Technology concept or application formulated.
Hardware: Invention begins; practical application is identified but is speculative; no experimental proof or detailed analysis is available to support the conjecture.
Software: Practical application is identified but is speculative; no experimental proof or detailed analysis is available to support the conjecture. Basic properties of algorithms, representations and concepts defined. Basic principles coded. Experiments performed with synthetic data.

Level 3 – Analytical and/or experimental critical function or characteristic proof-of-concept.
Hardware: Analytical studies place the technology in an appropriate context, and laboratory demonstrations, modeling and simulation validate analytical prediction.
Software: Development of limited functionality to validate critical properties and predictions using non-integrated software components.

Level 4 – Component or breadboard validation in laboratory.
Hardware: A low fidelity system/component breadboard is built and operated to demonstrate basic functionality, and critical test environments and associated performance predictions are defined relative to the final operating environment.
Software: Key, functionally critical software components are integrated and functionally validated to establish interoperability and begin architecture development. Relevant environments defined and performance in this environment predicted.

Level 5 – Component or breadboard validation in a relevant environment.
Hardware: A mid-level fidelity system/component brassboard is built and operated to demonstrate overall performance in a simulated operational environment with realistic support elements, demonstrating overall performance in critical areas. Performance predictions are made for subsequent development phases.
Software: End-to-end software elements implemented and interfaced with existing systems/simulations conforming to target environment. End-to-end software system tested in relevant environment, meeting predicted performance. Operational environment performance predicted. Prototype implementations developed.


Level 6 – System/subsystem model or prototype demonstration in a relevant environment.
Hardware: A high-fidelity system/component prototype that adequately addresses all critical scaling issues is built and operated in a relevant environment to demonstrate operations under critical environmental conditions.
Software: Prototype implementations of the software demonstrated on full-scale realistic problems. Partially integrated with existing hardware/software systems. Limited documentation available. Engineering feasibility fully demonstrated.

Level 7 – System prototype demonstration in space.
Hardware: A high fidelity engineering unit that adequately addresses all critical scaling issues is built and operated in a relevant environment to demonstrate performance in the actual operational environment and platform (ground, airborne or space).
Software: Prototype software exists having all key functionality available for demonstration and test. Well integrated with operational hardware/software systems demonstrating operational feasibility. Most software bugs removed. Limited documentation available.


Level 8 – Actual system completed and flight qualified through test and demonstration.
Hardware: The final product in its final configuration is successfully demonstrated through test and analysis for its intended operational environment and platform (ground, airborne or space).
Software: All software has been thoroughly debugged and fully integrated with all operational hardware and software systems. All user documentation, training documentation, and maintenance documentation completed. All functionality successfully demonstrated in simulated operational scenarios. V&V completed.

Level 9 – Actual system flight proven through successful mission operations.
Hardware: The final product is successfully operated in an actual mission.
Software: All software has been thoroughly debugged and fully integrated with all operational hardware/software systems. All documentation has been completed. Sustaining software engineering support is in place. System has been successfully operated in the operational environment.
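The nine level definitions tabulated above can be collected into a simple lookup, for example to annotate assessment reports. This is only a convenience sketch; the strings are taken verbatim from the scale, while the function name is illustrative:

```python
# The nine TRL definitions from the scale above, as a lookup table.
TRL_DEFINITIONS = {
    1: "Basic principles observed and reported.",
    2: "Technology concept or application formulated.",
    3: "Analytical and/or experimental critical function or characteristic proof-of-concept.",
    4: "Component or breadboard validation in laboratory.",
    5: "Component or breadboard validation in a relevant environment.",
    6: "System/subsystem model or prototype demonstration in a relevant environment.",
    7: "System prototype demonstration in space.",
    8: "Actual system completed and flight qualified through test and demonstration.",
    9: "Actual system flight proven through successful mission operations.",
}

def describe(level):
    """Return the one-line definition for a TRL level; reject levels outside 1-9."""
    if level not in TRL_DEFINITIONS:
        raise ValueError(f"TRL must be 1-9, got {level}")
    return f"TRL {level}: {TRL_DEFINITIONS[level]}"

print(describe(6))
```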


NASA TRL, SRL Scales & Exit Criteria & JDMTP* MRL Scale

MRL Description and Exit Criteria

Level 1 – MRL: No corresponding MRL. Exit Criteria: Peer reviewed publication of research underlying the proposed concept/application.

Level 2 – MRL: No corresponding MRL. Exit Criteria: Documented description of the application/concept that addresses feasibility and benefit.

Level 3 – MRL: Manufacturing Concepts Identified. Assessment of current manufacturability concepts or producibility needs for key breadboard components. Exit Criteria: Documented analytical/experimental results validating predictions of key parameters.

Level 4 – MRL: Laboratory Manufacturing Process Demonstration. Key processes identified and assessed in lab. Mitigation strategies identified to address manufacturing/producibility shortfalls. Cost as an independent variable (CAIV) targets set and initial cost drivers identified. Exit Criteria: Documented test performance demonstrating agreement with analytical predictions. Documented definition of relevant environment.

Level 5 – MRL: Manufacturing Process Development. Trade studies and lab experiments define key manufacturing processes and sigma levels needed to satisfy CAIV targets. Initial assessment of assembly needs conducted. Process, tooling, inspection, and test equipment in development. Significant engineering and design changes. Quality and reliability levels not yet established. Tooling and machines demonstrated in lab. Physical and functional interfaces have not been completely defined. Exit Criteria: Documented test performance demonstrating agreement with analytical predictions. Documented definition of scaling requirements.


Level 6 – MRL: Critical Manufacturing Processes Prototyped. Critical manufacturing processes prototyped; targets for improved yield established. Process and tooling mature. Frequent design changes still occur. Investments in machining and tooling identified. Quality and reliability levels identified. Design-to-cost goals identified. Exit Criteria: Documented test performance demonstrating agreement with analytical predictions.

Level 7 – MRL: Prototype Manufacturing System. Prototype system built on soft tooling; initial sigma levels established. Ready for low rate initial production (LRIP). Design changes decrease significantly. Process tooling and inspection and test equipment demonstrated in production environment. Manufacturing processes generally well understood. Machines and tooling proven. Materials initially demonstrated in production, and manufacturing process and procedures initially demonstrated. Design-to-cost goals validated. Exit Criteria: Documented test performance demonstrating agreement with analytical predictions.


Level 8 – MRL: Manufacturing Process Maturity Demonstration. Manufacturing processes demonstrate acceptable yield and producibility levels for pilot line, LRIP, or similar item production. All design requirements satisfied. Manufacturing process well understood and controlled to 4-sigma or appropriate quality level. Minimal investment in machine and tooling; machines and tooling should have completed demonstration in production environment. All materials are in production and readily available. Cost estimates <125% of cost goals (e.g., design-to-cost goals met for LRIP). Exit Criteria: Documented test performance verifying analytical predictions.

Level 9 – MRL: Manufacturing Processes Proven. Manufacturing line operating at desired initial sigma level. Stable production. Design stable; few or no design changes. All manufacturing processes controlled to six-sigma or appropriate quality level. Affordability issues built into initial production and evolutionary acquisition milestones. Cost estimates <110% of cost goals, or cost goals met (e.g., design-to-cost goals met). Exit Criteria: Documented mission operational results.
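The cost criteria quoted in the MRL descriptions above are simple ratio checks: estimates must come in under 125% of cost goals at level 8 and under 110% at level 9. A sketch of that check, with an illustrative function name:

```python
def meets_cost_exit(estimate, goal, mrl_level):
    """Check the cost thresholds quoted in the MRL scale above:
    estimates < 125% of cost goals at level 8, < 110% at level 9.
    Only levels 8 and 9 state a numeric threshold in the scale."""
    limit = {8: 1.25, 9: 1.10}[mrl_level]
    return estimate < limit * goal

print(meets_cost_exit(118.0, 100.0, 8))  # -> True  (118% < 125%)
print(meets_cost_exit(118.0, 100.0, 9))  # -> False (118% > 110%)
```

The same estimate can therefore satisfy the level 8 exit criterion while still blocking level 9, which is consistent with the tightening of cost goals as production matures.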


*Joint Defense Manufacturing Technology Panel


Systematic Assessment of the Program/Project Impacts of Technological Advancement and Insertion, Revision A


Technology Maturity Assessment Report Template

TBD Project
Technology Readiness Assessment Report


James W. Bilbro
JB Consulting International

Huntsville, Alabama

Introduction to Technology Assessment


Appendix A – NASA Software Classification Report

Project Name:      Date:      Organization:      Project Manager:      

NPR 7150.2 Software Class:

Class A – Human Rated Software Systems
Applies to all space flight software subsystems (ground and flight) developed and/or operated by or for NASA to support human activity in space and that interact with NASA human space flight systems. Space flight system design and associated risks to humans are evaluated over the program's life cycle, including design, development, fabrication, processing, maintenance, launch, recovery, and final disposal. Examples of Class A software for human rated space flight include but are not limited to: guidance; navigation and control; life support systems; crew escape; automated rendezvous and docking; failure detection, isolation and recovery; and mission operations.

Class B – Non-Human Space Rated Software Systems
Flight and ground software that must perform reliably in order to accomplish primary mission objectives. Examples of Class B software for non-human (robotic) spaceflight include, but are not limited to: propulsion systems; power systems; guidance, navigation and control; fault protection; thermal systems; command and control ground systems; planetary surface operations; hazard prevention; primary instruments; or other subsystems that could cause the loss of science return from multiple instruments.

Class C – Mission Support Software
Flight or ground software that is necessary for the science return from a single (non-critical) instrument, or is used to analyze or process mission data, or other software for which a defect could adversely impact attainment of some secondary mission objectives or cause operational problems for which potential work-arounds exist. Examples of Class C software include, but are not limited to: software that supports prelaunch integration and test, mission data processing and analysis, analysis software used in trend analysis and calibration of flight engineering parameters, primary/major science data collection and distribution systems, major Center facilities, data acquisition and control systems, aeronautic applications, or software employed by network operations and control (which is redundant with systems used at tracking complexes). Class C software must be developed carefully, but validation and verification effort is generally less intensive than for Class B.

Class D – Analysis and Distribution Software
Non-space flight software. Software developed to perform science data collection, storage, and distribution, or to perform engineering and hardware data analysis. A defect in Class D software may cause rework but has no direct impact on mission objectives or system safety. Examples of Class D software include, but are not limited to, software tools, analysis tools, and science data

A B C D E E1 F G H

Page 112: TRL Calculator Ver BI.1 Beta

Appendix A – NASA Software Classification Report

Project Name: __________   Date: __________   Organization: __________   Project Manager: __________
===============================================================
NPR 7150.2 Software Class:

===============================================================
Class A – Human Rated Software Systems
Applies to all space flight software subsystems (ground and flight) developed and/or operated by or for NASA to support human activity in space and that interact with NASA human space flight systems. Space flight system design and associated risks to humans are evaluated over the program's life cycle, including design, development, fabrication, processing, maintenance, launch, recovery, and final disposal. Examples of Class A software for human rated space flight include, but are not limited to: guidance, navigation and control; life support systems; crew escape; automated rendezvous and docking; failure detection, isolation and recovery; and mission operations.

Class B – Non-Human Space Rated Software Systems
Flight and ground software that must perform reliably in order to accomplish primary mission objectives. Examples of Class B software for non-human (robotic) space flight include, but are not limited to: propulsion systems; power systems; guidance, navigation and control; fault protection; thermal systems; command and control ground systems; planetary surface operations; hazard prevention; primary instruments; or other subsystems that could cause the loss of science return from multiple instruments.

Class C – Mission Support Software
Flight or ground software that is necessary for the science return from a single (non-critical) instrument, or that is used to analyze or process mission data, or other software for which a defect could adversely impact attainment of some secondary mission objectives or cause operational problems for which potential work-arounds exist.

Examples of Class C software include, but are not limited to: software that supports prelaunch integration and test; mission data processing and analysis; analysis software used in trend analysis and calibration of flight engineering parameters; primary/major science data collection and distribution systems; major Center facilities; data acquisition and control systems; aeronautic applications; or software employed by network operations and control (which is redundant with systems used at tracking complexes). Class C software must be developed carefully, but validation and verification effort is generally less intensive than for Class B.

Class D – Analysis and Distribution Software
Non-space flight software. Software developed to perform science data collection, storage, and distribution, or to perform engineering and hardware data analysis. A defect in Class D software may cause rework but has no direct impact on mission objectives or system safety. Examples of Class D software include, but are not limited to: software tools; analysis tools; and science data

Software Class (select one):   A    B    C    D    E    E1    F    G    H


Appendix C - NASA Software Assurance Level Report

Project Name: __________   Date: __________   Organization: __________   Project Manager: __________
===================================================================

Software Assurance Level of Effort:

===========================================================================

If the software to be developed or acquired meets the criteria identified on the left of the table below, then the corresponding software assurance level of effort on the right shall be assigned. If the software meets the criteria of more than one level, the highest level shall be assigned.

Software Assurance Level of Effort

NPR 7150.2 Software Class                                      Full   High   Medium   Low
  Class A                                                        X
  Class B                                                               X
  Class C                                                                      X
  Class D
  Class E
  Class E1
Software Safety Criticality                                      X

Potential for:
  Catastrophic Mission Failure: Loss of vehicle, or total
    inability to meet mission objectives                         X
  Partial Mission Failure: Inability to meet one or more
    mission objectives                                                  X

Potential for waste of resource investment:
  Greater than 200 work-years on software                        X
  Greater than 100 work-years on software                               X
  Greater than 20 work-years on software                                       X
  Greater than 4 work-years on software
  Less than 4 work-years on software

Potential for impact to equipment, facility, or environment:
  Greater than $100M                                             X
  Greater than $20M                                                     X
  Greater than $2M                                                             X

Assigned Software Assurance Level of Effort (select one):   Full   High   Medium   Low   Not Applicable
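The "highest level wins" rule stated above can be sketched as a small lookup. This is only an illustration of the assignment logic; the function and constant names (`LEVELS`, `CLASS_TO_LEVEL`, `assurance_level`) are hypothetical and not part of the TRL Calculator itself.

```python
# Sketch of the "highest level wins" assurance-level rule, assuming
# the class-to-level X marks in the table above (A->Full, B->High, C->Medium).

LEVELS = ["Low", "Medium", "High", "Full"]  # ascending order of effort

CLASS_TO_LEVEL = {"A": "Full", "B": "High", "C": "Medium"}

def assurance_level(software_class, criteria_levels):
    """Return the highest assurance level triggered by any criterion.

    criteria_levels: levels implied by the other table rows, e.g.
    ["High"] when a partial mission failure is possible.
    """
    triggered = list(criteria_levels)
    if software_class in CLASS_TO_LEVEL:
        triggered.append(CLASS_TO_LEVEL[software_class])
    if not triggered:
        return "Not Applicable"
    return max(triggered, key=LEVELS.index)

print(assurance_level("C", ["High"]))  # -> High (High outranks Medium)
```

A Class C project that could also cause a partial mission failure is assigned High, because the table requires the highest applicable level.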


Appendix B - NASA Software Safety-Criticality Report

Project Name: __________   Date: __________   Organization: __________   Project Manager: __________

===================================================================

Is Software Safety-Critical? Yes No

Brief Justification for Assessment:

===================================================================

Software shall be classified as safety-critical if it meets at least one of the following criteria:

1. Resides in a safety-critical system (as determined by a hazard analysis) AND at least one of the following apply:

a. Causes or contributes to a hazard.
b. Provides control or mitigation for hazards.
c. Controls safety-critical functions.
d. Processes safety-critical commands or data (see Note 1 below).
e. Detects and reports, or takes corrective action, if the system reaches a specific hazardous state.
f. Mitigates damage if a hazard occurs.
g. Resides on the same system (processor) as safety-critical software (see Note 2 below).

2. Processes data or analyzes trends that lead directly to safety decisions (e.g., determining when to turn power off to a wind tunnel to prevent system destruction).

3. Provides full or partial verification or validation of safety-critical systems, including hardware or software subsystems.

Note 1: If data is used to make safety decisions (either by a human or the system), then the data is safety-critical, as is all the software that acquires, processes, and transmits the data. However, data that may provide safety information but is not required for safety or hazard control (such as engineering telemetry) is not safety-critical.

Note 2: Non-safety-critical software residing with safety-critical software is a concern because it may fail in such a way as to disable or impair the functioning of the safety-critical software. Methods to separate the code, such as partitioning, can be used to limit the software defined as
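The decision criteria above can be summarized as a short boolean sketch. The assessment itself is manual; the function and parameter names below are illustrative only, and each parameter stands in for a judgment the assessor records on this form.

```python
# Sketch of the Appendix B safety-criticality decision logic.
# Parameter names are hypothetical stand-ins for the form's criteria.

def is_safety_critical(in_safety_critical_system,       # determined by hazard analysis
                       meets_any_sub_criterion,          # any of criteria 1a-1g
                       informs_safety_decisions,         # criterion 2
                       verifies_safety_critical_system): # criterion 3
    # Criterion 1 requires BOTH residence in a safety-critical system
    # AND at least one of the sub-criteria a-g.
    criterion_1 = in_safety_critical_system and meets_any_sub_criterion
    # Any one of the three top-level criteria makes the software safety-critical.
    return criterion_1 or informs_safety_decisions or verifies_safety_critical_system

print(is_safety_critical(True, False, False, False))  # -> False (no sub-criterion met)
```

Note the AND inside criterion 1: residing in a safety-critical system is not, by itself, sufficient.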


Contact Information

James W. Bilbro

JB Consulting International

4017 Panorama Drive SE, Huntsville, AL 35801

Telephone: 256-655-6273 Fax: 866-235-8953 E-mail: [email protected]


TRL AD2 Project Status Definition

Project Type                  Current TRL    AD2 Risk Level     Project Status
Basic Research                TRL 1 or 2     AD2L 1, 2, 3, 4
Applied Research              TRL 3 or 4     AD2L 5
Advanced Research             TRL 5          AD2L 6, 7, 8, 9
Advanced Tech Demonstrator    TRL 6 or 7
Acquisition Program           TRL 8 or 9
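The Project Type column of this table is a direct function of the current TRL, which can be expressed as a simple lookup. The function name (`project_type_for_trl`) is illustrative, not part of the calculator.

```python
# Illustrative lookup for the Project Type vs. Current TRL table above.

def project_type_for_trl(trl):
    if trl in (1, 2):
        return "Basic Research"
    if trl in (3, 4):
        return "Applied Research"
    if trl == 5:
        return "Advanced Research"
    if trl in (6, 7):
        return "Advanced Tech Demonstrator"
    if trl in (8, 9):
        return "Acquisition Program"
    raise ValueError("TRL must be an integer between 1 and 9")

print(project_type_for_trl(5))  # -> Advanced Research
```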

TRL Level vs. AD2 Risk

TRL increases in maturity from 1 to 9. AD2 risk increases from "Well Understood" through "Known Unknowns" and "Unknown Unknowns" to "Chaos".

Level 9
  TRL: Actual system flight proven through successful mission operations.
  AD2: 90+% risk (Chaos). Requires new development outside of any existing experience base. No viable approaches exist that can be pursued with any degree of confidence. Basic research in key areas needed before feasible approaches can be defined.

Level 8
  TRL: Actual system completed and flight qualified through test and demonstration.
  AD2: 80% risk (Unknown Unknowns). Requires new development where similarity to existing experience base can be defined only in the broadest sense. Multiple development routes must be pursued.

Level 7
  TRL: System prototype demonstration in an operational environment.
  AD2: 70% risk. Requires new development but similarity to existing experience is sufficient to warrant comparison in only a subset of critical areas. Multiple development routes must be pursued.

Level 6
  TRL: System/subsystem model or prototype demonstration in a relevant environment.
  AD2: 50% risk. Requires new development but similarity to existing experience is sufficient to warrant comparison on only a subset of critical areas. Dual development approaches should be pursued in order to achieve a moderate degree of confidence for success (desired performance can be achieved in subsequent block upgrades with a high degree of confidence).

Level 5
  TRL: Component or breadboard validation in a relevant environment.
  AD2: 40% risk (Known Unknowns). Requires new development but similarity to existing experience is sufficient to warrant comparison in all critical areas. Dual development approaches should be pursued to provide a high degree of confidence for success.

Level 4
  TRL: Component or breadboard validation in laboratory.
  AD2: 30% risk (Well Understood). Requires new development but similarity to existing experience is sufficient to warrant comparison across the board. A single development approach can be taken with a high degree of confidence for success.

Level 3
  TRL: Analytical and/or experimental critical function or characteristic proof-of-concept.
  AD2: 20% risk. Requires new development well within the experience base. A single development approach is adequate.

Level 2
  TRL: Technology concept or application formulated.
  AD2: 10% risk. Exists but requires major modifications. A single development approach is adequate.

Level 1
  TRL: Basic principles observed and reported.
  AD2: 0% risk (Well Understood). Exists with no or only minor modifications being required. A single development approach is adequate.
