Master’s Thesis / Diplomarbeit
Identification of Lean Enablers for Program Management
by Kristian Kinscher
© Kristian Kinscher 2011. All rights reserved.
Signature of the Author ______________________________________________________________
Kristian Kinscher
Mechanical Engineering, RWTH Aachen University
Certified by ________________________________________________________________________
Dr. Josef Oehmen
Thesis Supervisor
Research Scientist, Lean Advancement Initiative (LAI), MIT
Accepted by _______________________________________________________________________
Marcus Rauhut
Thesis Supervisor
Research Scientist, Laboratory for Machine Tools and Production Engineering WZL, RWTH Aachen University
I Acknowledgments
I would like to express my heartfelt gratitude to the individuals who supported me throughout
the research of this thesis.
First and foremost, I would like to thank my advisor Dr. Josef Oehmen from the Lean
Advancement Initiative (LAI) of the Massachusetts Institute of Technology (MIT) for his
excellent guidance and relentless support. Furthermore, I would like to thank Dr. Eric
Rebentisch, who contributed a lot of guidance and without whom contact with the Lean
Advancement Initiative could not have been established.
In Germany, my biggest thanks go to Prof. Dr. Günther Schuh and Dr. Bastian Franzkoch
who made it possible for me to write my diploma thesis at MIT. In addition, I would like to
thank Marcus Rauhut for supervising my thesis from the side of the Laboratory for Machine
Tools (WZL) in Germany.
My greatest thanks go to all researchers and staff of the Lean Advancement Initiative for their
contributions to my thesis. In particular, I would like to thank the program managers of the
Lean Advancement Initiative – Sean Dorey, Dan Marticello, Gregory McKnew, Dave Morgan
and Dustin Ziegler – for their time in many hours of interviews. I would also like to thank Sid
Rupani, Christoph Knoblinger, and Stephan Baur for being great lab partners and real
friends. I owe my biggest thanks to Denis Bassler, since he was not only a good squash and
lab partner, but also contributed many valuable ideas in countless discussions.
II Abstract
Companies and governments grapple with an environment of increasing complexity in large-
scale and highly innovative engineering development programs. Program management is
recognized as a key tool for managing the transition of project-based organizations to face
these difficulties. However, there are major challenges in program management: First, there
are immense differences between people’s concepts of what program management is.
Second, conjoined with this unclear definition, there is a lack of a framework specifically for
engineering development program management. Third, there is no comprehensive
compilation of common issues and pitfalls in engineering programs. Last, there is no
approach adapting the idea of “lean thinking” to program management.
These challenges led to the major research questions of this thesis: first, “What is program
management and how can it be defined?”; second, “Is there a framework for engineering
program management?”; third, “What are common pitfalls in program management?”; fourth
and last, “Are there ways to adapt ‘lean thinking’ to program management?”
The findings in this thesis are: First, program management in the context of engineering
development programs can be defined as “the management of multiple related projects and
the resulting enterprise over an extended period of time to deliver a complex value
proposition and utilize efficiency and effectiveness gains.” Second, there is a framework
provided by the Lean Advancement Initiative (LAI) which includes both aspects of program
management – cross-cutting program management capabilities and an engineering program
life cycle. The framework is verified in this thesis through interviews, a survey and insights
from an industry focus group. It is structured in six major parts: Program Execution;
Enterprise Management; Scoping, Planning & Contracting; Technology Integration; Product
Design; and Production, Use, Service & Decommissioning. Third, 99 pitfalls in engineering
programs, mapped to the LAI program management framework, were identified. Fourth, 310
Lean Enablers customized to program management are described in detail. In addition, a
mapping of the Lean Enablers to program management pitfalls is carried out to highlight
potential areas of application of the Lean Enablers.
This thesis is based on extensive literature reviews, interviews with Air Force program
managers, a survey and insights from an industry focus group consisting of seven industry
companies in the aerospace and defense sectors and two organizations.
III Table of contents
I Acknowledgments ........................................................................................................... i
II Abstract ........................................................................................................................... ii
III Table of contents ........................................................................................................... iii
1 Introduction ..................................................................................................................... 1
1.1 Program management can help reach performance goals ...................................... 1
1.2 Research Goals ........................................................................................................ 2
1.3 Thesis organization .................................................................................................. 3
2 Current state and research gaps in program management ....................................... 6
2.1 Existing program management frameworks ............................................................. 6
2.1.1 Defense Acquisition System of the United States Department of Defense .......... 7
2.1.2 Managing Successful Programmes (MSP) of the United Kingdom Government 10
2.1.3 The Standard for Program Management of the Project Management Institute (PMI) ...... 13
2.1.4 Lean Advancement Initiative (LAI) program management framework ................ 17
2.1.5 Comparison of existing frameworks .................................................................... 20
2.2 Issues in program management ............................................................................. 29
2.3 Introduction to lean management ........................................................................... 29
2.4 Identified gaps in current literature ......................................................................... 30
2.5 Research approach ................................................................................................ 31
2.5.1 Literature review ................................................................................................. 32
2.5.2 Interviews ............................................................................................................ 32
2.5.3 Surveys ............................................................................................................... 33
2.5.4 Industry focus group ........................................................................................... 33
2.6 Summary of current state of program management ............................................... 35
3 Definition of program management ............................................................................ 36
3.1 Definition of project management ........................................................................... 36
3.2 Definition of program management ........................................................................ 38
3.3 Summary ................................................................................................................ 42
4 Selection of a framework for engineering programs ................................................ 44
4.1 Discussion of current state of program management in industry ............................ 45
4.2 Validation of the LAI program management framework ......................................... 46
4.2.1 Insights from interviews with program managers ............................................... 47
4.2.2 Insights from the industry focus group ................................................................ 54
4.3 Summary ................................................................................................................ 55
5 Common issues and pitfalls in program management ............................................. 57
5.1 Identification of common program management pitfalls ......................................... 57
5.2 Verification of the pitfalls through interviews and the industry focus group ............ 67
5.3 Mapping pitfalls to program management framework ............................................. 68
5.4 Summary ................................................................................................................ 72
6 Lean Enablers in program management .................................................................... 74
6.1 Sources of Lean Enablers ...................................................................................... 74
6.1.1 Introduction to Lean Enablers in the context of program management .............. 75
6.1.2 Discussion and comparison ................................................................................ 76
6.2 Findings along the Lean Principles ......................................................................... 76
6.2.1 Principle 1: Specification of customer value ....................................................... 77
6.2.2 Principle 2: Value Stream definition .................................................................... 77
6.2.3 Principle 3: Creation of a continuous flow ........................................................... 78
6.2.4 Principle 4: Pull of the value ............................................................................... 79
6.2.5 Principle 5: Striving for perfection ....................................................................... 79
6.2.6 Principle 6: Respect for People .......................................................................... 79
6.3 Synthesis in one general framework ....................................................................... 80
6.4 Assessment through the industry focus group ...................................................... 102
6.5 Summary .............................................................................................................. 102
7 Mapping of Lean Enablers to program management pitfalls ................................. 103
7.1 Methodology for the mapping ............................................................................... 103
7.2 Results of the mapping process ........................................................................... 104
7.3 Verification through review by the industry focus group ....................................... 111
7.4 Summary .............................................................................................................. 112
8 Conclusion and research outlook ............................................................................. 113
IV Table of symbols and abbreviations ........................................................................... iv
V Index of literature .......................................................................................................... vi
VI Table of figures and spreadsheets ............................................................................ xxi
VII Appendix A .............................................................................................................. xxv
VIII Appendix B ............................................................................................................. xxxi
1 Introduction
Whereas project management is well understood and much research and literature exist for
it, there is little guidance for program management. The time since the 1990s has been
characterized by highly dynamic, highly uncertain and very quickly executed projects
(Williams 1999). Projects have often been perceived as happening in isolation, and parts of
current research still take this view (Fricke and Shenhar 2000).
Current requirements and user needs are often too complex to be achieved by a single
project. By executing large developments in multiple parallel projects, a program
management environment is created (Haughey 2001). Examples of the benefits of a program
organization are the ability to introduce process standardization or a coordinated use of a
common pool of (limited) resources (Platje 1993; Fricke and Shenhar 2000).
1.1 Program management can help reach performance goals
The concept of program management is used by both companies and governmental
organizations, but both sectors struggle with its execution. For example, companies in the
aircraft industry, with their highly complex products, often do not reach their cost and
schedule goals. The A380 program of Airbus (Michaels) as well as Boeing’s 787
(Dreamliner) (Tang and Zimmerman 2009) missed their originally stated cost and schedule
aims by several billion dollars and several years.
On the governmental side, the US Department of Defense, for example, runs an enormous
number of programs. Many of them, categorized as “major defense acquisitions,” are
equipped with vast budgets, totaling $1.6 trillion in the federal fiscal year 2009. In the same
year, the Department of Defense spent $384 billion on contracts for goods and services. As
of March 10, 2010, the Department of Defense was facing a cost overrun of $296 billion and
an average schedule overrun of 22 months in the major acquisition programs, with R&D
costs being an average of 45% over budget (GAO 2010).
Figure 1-1 shows the cost overrun of the entire Department of Defense acquisitions portfolio
for each of three recent fiscal years. It is clear that the deviation became larger rather than
smaller over the years.
[Figure 1-1 is a bar chart comparing the change in total cost and the change in RDT&E cost
of the DoD acquisition portfolio for fiscal years 2003, 2007, and 2008; the plotted values
range from 19% to 42%.]
Figure 1-1: Total Department of Defense acquisition portfolio: actual cost compared to
initial estimate (GAO 2006; GAO 2010)
Though the idea of program management is already widely used in larger companies and
governmental organizations, there is clearly vast potential for achieving further benefits
through its correct application.
1.2 Research Goals
Despite the rising importance of program management, there is a widespread lack of a clear
understanding of what it is. Therefore, the term “program management” must be defined to
establish a common base for further discussions. Since there is no doubt in the current
literature that the success of a program is highly dependent on project management, the first
step toward establishing a definition of program management should be to provide a clear
definition of project management.
An extensive literature review on program management reveals multiple facets in current
research. Including all important aspects in a definition of the term yields one that is not only
comprehensive but can hopefully also be widely accepted.
In addition to frameworks for general program management, there are some specialized
frameworks for a program management environment. Most of these approaches have
shortcomings in certain characteristics due to their specialization and are not used in
industry. Based on research undertaken by the LAI and on the generally accepted literature,
a new program management framework is proposed in this thesis and included in the
comparison of existing approaches.
This framework is intended to provide a knowledge base for every program manager –
commercial and governmental – and, since it is not restricted to a specialized context, to
avoid such shortcomings.
Further, this thesis gives an overview of existing lean management literature. Based on the
different recognized sources a set of promising lean enablers for a program management
environment can be identified.
The identification of the most common pitfalls in program management provides a base for
the application of these lean enablers. By connecting the lean enablers and the pitfalls,
telling conclusions about the applicability of “lean thinking” to program management can be
derived. One possible insight is the distribution of pitfalls across the different framework
parts and the corresponding lean enablers.
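The connection between pitfalls and lean enablers described above can be sketched as a simple data structure. This is an illustrative sketch only; the pitfall names, enabler names, and framework-part assignments below are hypothetical placeholders, not the actual items identified in this thesis (which comprise 99 pitfalls and 310 Lean Enablers):

```python
from collections import Counter

# Hypothetical mapping of program management pitfalls to Lean Enablers
# (placeholder names for illustration only).
pitfall_to_enablers = {
    "Unclear customer requirements": ["Specify value from the customer perspective"],
    "Overloaded program schedule": ["Level the workload", "Create continuous flow"],
    "Poor stakeholder communication": ["Respect for people", "Create continuous flow"],
}

# Each pitfall is also mapped to a part of the LAI program management framework.
pitfall_to_framework_part = {
    "Unclear customer requirements": "Scoping, Planning & Contracting",
    "Overloaded program schedule": "Program Execution",
    "Poor stakeholder communication": "Enterprise Management",
}

# One possible insight: the distribution of pitfalls across framework parts.
distribution = Counter(pitfall_to_framework_part.values())
print(distribution)
```

With real data, such a distribution would show which framework parts accumulate the most pitfalls and therefore where the lean enablers promise the greatest leverage.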
1.3 Thesis organization
Figure 1-2 shows the general thesis organization. The different research domains addressed
are shown on the left side. Program management embraces all chapters of this thesis,
whereas the idea of “lean thinking,” or lean management, only appears in the identification of
the lean enablers and their mapping to the program management pitfalls. In addition, the
figure shows the different research questions and where they are answered. All research
questions are posed during the initial literature review in chapter 2. Further, the different
research approaches and the parts of this thesis where they are utilized are displayed.
[Figure 1-2 is a diagram mapping the research questions to chapters and research
approaches: RQ1 (What is program management and how can it be defined?) is answered
in chapter 3, RQ2 (Is there a framework for engineering PM?) in chapter 4, RQ3 (What are
common pitfalls in PM?) in chapter 5, and RQ4 (Are there ways to adapt lean thinking to
PM?) in chapter 6. The research approaches – literature review, interviews, survey, and
industry focus group (IFG) – are marked per chapter, and the research domains program
management and lean management are shown on the left.]
Figure 1-2: Thesis organization
Chapter 2 of this thesis describes a literature review of current research on program
management and lean management. During this review, different gaps are identified.
These gaps lead to research questions, which are phrased in the respective sections.
Figure 1-3 shows the four research questions of this thesis and the section in which the gap
corresponding to each question is discovered.
[Figure 1-3 lists the four research questions (RQ1: What is program management and how
can it be defined?; RQ2: Is there a framework for engineering program management?; RQ3:
Are there common pitfalls in program management?; RQ4: Are there ways to adapt lean
thinking to program management?) and maps them to sections 2.1 through 2.3.]
Figure 1-3: Research questions and corresponding sections in chapter 2
Chapter 3 answers the first research question. This chapter fills the gap of a missing
common understanding of program management by suggesting a definition based on the
multiple facets used in the current literature.
The second gap is addressed in chapter 4. Different aspects of program management are
considered, and the necessary elements of a framework for technical development
programs are determined. Based on these findings, a framework is selected and necessary
improvements to it are further investigated.
The identification of common pitfalls in program management is explained in detail in
chapter 5. Based on an extensive literature review, a first collection of challenges is
presented and afterwards improved and validated through interviews and insights from an
industry focus group.
Chapter 6 explains the way to a comprehensive pool of lean enablers for program
management. Further, the different lean enablers are briefly described.
Together with chapter 6, chapter 7 answers the fourth and last research question about
ways to adapt “lean thinking” to program management. This chapter explains a method to
map the pitfalls to the lean enablers in order to highlight possible connections. These
connections could be used to effectively avoid pitfalls during program execution.
The last chapter of this thesis – chapter 8 – summarizes the findings and gives a succinct
research outlook. Further, possible shortcomings of this thesis are discussed and potential
improvements to the research approaches are presented.
2 Current state and research gaps in program
management
This chapter will provide an overview of the existing literature in the fields of program
management and lean management. Figure 2-1 shows the four research questions of this
thesis and how they correspond to the sections of this chapter.
The first part of this chapter will describe various existing frameworks for program
management and compare them in detail. The different frameworks have diverse focuses,
and their ways of understanding and defining program management differ. In addition,
common issues and pitfalls in program management will be discussed briefly. Furthermore,
the idea of lean management and its basic principles will be explained in this chapter. The
contents of this chapter provide the foundation of knowledge which is necessary in order to
proceed further through this thesis.
[Figure 2-1 repeats the four research questions and relates them to sections 2.1 through
2.3, with the summary in section 2.4 and the research approach in section 2.5.]
Figure 2-1: Correlation of the research questions and sections
2.1 Existing program management frameworks
There are some existing frameworks for program management in the current literature. This
chapter will briefly describe four different frameworks. Furthermore, the frameworks will be
compared and assessed regarding various aspects. These aspects will be developed within
this chapter.
Basically, the current literature describes two fundamental aspects of program management:
- Programs as a method for organizational change (this might include the use of small
change projects)
- Programs in an engineering environment in the context of large development efforts
and the management of multiple related projects
In this chapter, examples of frameworks will be presented which especially emphasize one or
both of the aspects.
2.1.1 Defense Acquisition System of the United States Department of
Defense
A widely accepted and used framework in the United States is the one utilized by the
Department of Defense (DoD) in the context of defense acquisitions. A defense acquisition
“includes design, engineering, test and evaluation, production, and operations and support of
defense systems” (University 2009). The Joint Capabilities Integration and Development
System (JCIDS) identifies capability gaps in the current equipment and organizations using a
capabilities-based assessment (CBA) (University 2011). Figure 2-2 shows how the JCIDS fits
in the general decision-making support systems (University 2009). While the JCIDS is
responsible for the identification of capability gaps, the Planning, Programming, Budgeting,
and Execution (PPBE) Process is the Department of Defense’s process for crafting plans
and programs to satisfy the demands of the National Security Strategy. The Defense
Acquisition System is the management process for acquiring new materiel (including, for
example, weapon systems and automated information systems) (University 2011).
Figure 2-2: DoD Decision Support Systems (University 2011)
Capability gaps can be transformed into needs and possible solutions based on the
DOTMLPF guideline (University 2009). The DOTMLPF guideline enables the DoD to find
solutions structured along different possibilities with different extents (e.g., necessary
organizational changes, budget):
- Doctrine: the way the military acts and fights; e.g., new maneuvers
- Organization: the organizational structures within the military; e.g., Army, Navy, Air
Force, etc.
- Training: the training which is necessary for fighting; e.g., different types of training
(basic vs. advanced)
- Materiel: all resources to equip the forces; e.g., weapons, spares, etc.
- Leadership and education: the education of the leaders of the forces; e.g., different
ranks (squad leader up to general)
- Personnel: availability of well-trained people for different purposes (fighting,
peacetime, etc.)
- Facilities: installations and facilities; e.g., government owned ammunition production
facilities (Defense 2010; University 2011)
In the context of this thesis, the acquisition of materiel is of special interest. New materiel is
acquired within defense acquisitions, which are part of the Defense Acquisition System (see
Figure 2-2). In general, the system is defined by the Department of Defense Directive
5000.01 (DoDD 5000.01) (DoD 2003) and the Department of Defense Instruction 5000.02
(DoDI 5000.02) (DoD 2008). While DoDD 5000.01 focuses on policies and principles to
govern defense acquisition, the DoDI 5000.02 provides a management framework to
implement these policies and principles (University 2009; University 2011).
Depending on the category of a defense acquisition, the processes differ in their details;
e.g., different persons are in charge of milestone decisions. The categories depend on the
budget of the acquisition. Table 2-1 gives an overview of the different categories and their
criteria.
Table 2-1: Categories of defense acquisitions (University 2009)

- ACAT I (major defense acquisition): budget >$365M for research, development, testing
and evaluation, or total procurement >$2.190B. The corresponding category for
automated information systems is ACAT IA.
- ACAT II: budget >$140M for research, development, testing and evaluation, or total
procurement >$660M. This category does not apply to automated information systems.
- ACAT III: budgets less than the other categories; applies to both weapon systems and
automated information systems.
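The budget thresholds of Table 2-1 amount to a simple classification rule. The function below is an illustrative sketch for the weapon-system criteria only, not an official DoD tool, and its name and signature are assumptions made for this example:

```python
def acat_category(rdte_millions: float, procurement_millions: float) -> str:
    """Classify a weapon-system acquisition into an ACAT category based on
    the budget thresholds of Table 2-1 (University 2009). Illustrative only."""
    # ACAT I (major defense acquisition): RDT&E > $365M or procurement > $2.190B
    if rdte_millions > 365 or procurement_millions > 2190:
        return "ACAT I"
    # ACAT II: RDT&E > $140M or procurement > $660M
    if rdte_millions > 140 or procurement_millions > 660:
        return "ACAT II"
    # ACAT III: everything below the ACAT II thresholds
    return "ACAT III"
```

For example, a program with a $400M RDT&E budget would fall into ACAT I regardless of its procurement total, because either criterion alone is sufficient.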
The Defense Acquisition Management System is an event-based process structured along
the life cycle of a program. Figure 2-3 shows the life cycle from the identification of user
needs, technology opportunities, and resources to operations and support and, finally,
decommissioning.
Figure 2-3: Life cycle for defense acquisitions (University 2011)
The pre-systems acquisition is the first phase of the life cycle and incorporates materiel
solution analysis and technology development. The purpose of this phase is to reduce
technology risk, determine appropriate technologies, and demonstrate the capabilities of
these technologies by building representative prototypes. In addition, a traceable
requirements flow to complete a preliminary design for the full requirement is often
established (University 2011). The technology development phase starts with the results of
the Analysis of Alternatives (AoA) on a high level of abstraction. The goal is to narrow the
scope for continued development. The technology development phase might include
different types of reviews (e.g., System Requirements Review (SRR), System Functional
Review (SFR), Preliminary Design Review (PDR), etc.) that ensure accurate documentation
(University 2011).
There is a first milestone during the pre-systems acquisition phase that is required by DoDI
5000.02 (DoD 2008). Milestone A includes multiple technology development demonstrations
to ensure that the chosen technology is affordable, militarily useful, and based on mature
technology (University 2011). This guarantees that no unsuitable technologies are used for
further development during the life cycle.
Milestone B at the end of the technology development phase is based on a Capability
Development Document (CDD); it approves the acquisition strategy, the program baseline,
and the type of contract for the next phase, and allows the program to enter that phase
(University 2009). The CDD provides more details on the materiel solutions and contains the
thresholds and objectives for the system attributes used to measure the delivered
capabilities (University 2009).
The first phase during systems acquisition is engineering and manufacturing development.
The purpose of this phase is to develop the technologies entering from Milestone B to a
level which is ready for production. This involves defining contracts (with suppliers),
completing full-system integration, developing a manufacturing process, ensuring
operational supportability, further documentation, etc. (University 2009; University 2011).
After engineering and manufacturing development, Milestone C authorizes entry into low-rate
initial production (LRIP), if required. The documents submitted for Milestone C should include
completion/approval dates for the Capability Production Document (CPD). The CPD plays a
similar role for Milestone C as the CDD does for Milestone B. In addition to the CPD, the
requirements of the Initial Production Baseline (IPB) have to be fulfilled, and the technical
reviews have to be completed (University 2011). If Milestone C approves low-rate initial
production, an additional subsequent review is necessary to authorize full-rate production
(University 2009).
During the production and deployment phase, the system is produced and delivered for
operational use. The focus during this phase is to ensure economical production and meet
the user requirements. It might be necessary to conduct further tests with repetition
parts (University 2009).
The last phase of a program life cycle is operations and support. Full operational capability is
achieved during this phase. Operational readiness is assessed and should be established.
Further, logistics support (e.g., supply, maintenance, training, etc.) is evaluated. This phase
also includes life cycle sustainment and at the end the disposal of the system (University
2009).
DoDI 5000.02 describes the requirements during the different phases of the program life
cycle. To provide guidance, the Defense Acquisition University (DAU) published the Defense
Acquisition Guidebook (University 2011), which describes the steps and tools necessary
during each program phase. The Defense Acquisition Guidebook references additional
literature for specific topics. This literature often focuses on a single topic and offers in-depth
knowledge (e.g., “Manager's Guide to Technology Transition in an Evolutionary Acquisition
Environment” (Defense 2005) or “Risk Management Guide for DoD Acquisitions” (Defense
2006)).
2.1.2 Managing Successful Programmes (MSP) of the United Kingdom
Government
A framework in widespread use in the United Kingdom is the one presented in “Managing
Successful Programmes” (MSP), published by the Office of Government Commerce (OGC)
(Commerce 2007). It provides a framework to manage programs – “a temporary, flexible
organization created to coordinate, direct and oversee the implementation of a set of related
projects and activities in order to deliver outcomes and benefits related to the organization’s
strategic objectives […] and is likely to have a life that spans several years” (Commerce
2007).
The OGC further defines program management as “the action of carrying out the coordinated
organization, direction and implementation of a dossier of projects and transformation
activities […] to achieve outcomes and realize benefits of strategic importance to the
business” (Commerce 2007). It states that there are three critical organizational elements for
program management:
- Corporate strategy
- Delivery mechanisms for change
- Business-as-usual environment (Commerce 2007)
Figure 2-4 shows the MSP framework, which is based on three concepts:
- Outer ring: MSP principles; the principles are based on lessons learned from both
positive and negative observations of programs
- Second ring: MSP governance themes; the governance themes allow the definition,
measurement, and control of the organization’s approach to program management.
They enable the organization to find the right leadership, delivery team,
organizational structures, and information control to achieve the planned outcomes
and realize the expected benefits
- Inner circle: MSP transformational flow; this part of the framework represents the
life cycle of the program from its conception, through delivery of benefits and desired
outcomes, to the implementation of the benefits and the close of the program
(Commerce 2007).
Figure 2-4: The “Managing Successful Programmes” framework of the Office of Government
Commerce (Commerce 2007)
The MSP framework is used to manage programs that deliver change in parts of an
organization, the entire organization, multiple organizations, or the environment of the
organization (Commerce 2007). The inner circle of the MSP framework shows the
transformational flow, which is an iterative process to achieve the change (Commerce 2007).
This flow represents the life cycle of the program and is structured in three major parts:
- Defining a program: This phase provides the baseline for the further development of
the program. It should include the creation of the program definition document,
plans/schedules, and the control framework (Commerce 2007).
- Managing the tranches: Based on the baseline from the definition phase, further tasks
are executed to deliver the capability and realize benefits. The projects and activities
during this phase are grouped into tranches. Each tranche delivers a different part
necessary for the change. Throughout this phase it is essential to monitor the
progress and observe the external environment to ensure that the program is still on
track (Commerce 2007).
- Closing the program: This phase is initiated when the necessary capabilities to
achieve the change are fully developed and implemented. It must be judged whether
the program was successful or not. In addition, it must be assessed whether the
implemented change will be beneficial in the business-as-usual environment. Besides
the planned closure of a program due to the completion of the work, there are
additional possible reasons for a program ending: changes in the external
environment, changed strategy, evidence that goals cannot be achieved by the
program, evidence that the remaining benefits to be realized are disproportionate to
the necessary effort, and the discovery of more beneficial methods for achieving the
outcome (Commerce 2007).
2.1.3 The Standard for Program Management of the Project Management
Institute (PMI)
The Project Management Institute (PMI) is an American non-profit professional organization
(Institute 2011). PMI is well known for its project management framework described in “A
Guide to the Project Management Body of Knowledge” (PMBOK Guide) (Institute 2004). Based
on the established PMBOK framework, PMI published a first edition of a similar approach for
program management: “The Standard for Program Management” (Institute 2006). Two years
later an updated, improved, and three-times-longer second edition was issued (Institute
2008).
The PMI understands program management as being “the centralized coordinated
management of a program to achieve the program’s strategic objectives and benefits. It
involves aligning multiple projects to achieve the program goals and allows for optimized or
integrated cost, schedule, and effort.” (Institute 2008). The definition implies that there is a
hierarchy between projects and programs (programs consist of projects). Figure 2-5 shows
different possible structures of projects, programs, and portfolios (one level above a
program). There are projects that can be part of a portfolio directly rather than through a
program. In addition, Figure 2-5 expresses that programs always consist of at least one
project in the understanding of the PMI.
Figure 2-5: Portfolio, programs and projects: high-level view (Institute 2008)
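This hierarchy can be sketched as a minimal data model. The class and field names below are purely illustrative and not part of the PMI standard; the sketch only encodes the structural rules stated above (a program always contains at least one project, and a portfolio may hold projects directly as well as through programs):

```python
from dataclasses import dataclass, field
from typing import List, Union

@dataclass
class Project:
    name: str

@dataclass
class Program:
    """In the PMI understanding, a program always consists of at least one project."""
    name: str
    projects: List[Project]

    def __post_init__(self):
        if not self.projects:
            raise ValueError("a program must contain at least one project")

@dataclass
class Portfolio:
    """A portfolio may hold programs as well as stand-alone projects."""
    name: str
    components: List[Union[Program, Project]] = field(default_factory=list)

portfolio = Portfolio("Example portfolio", [
    Program("Aircraft program", [Project("Airframe"), Project("Avionics")]),
    Project("Stand-alone upgrade"),  # a project held directly in the portfolio
])
print(len(portfolio.components))  # 2
```

The check in `Program.__post_init__` expresses the rule from Figure 2-5 that programs without projects do not exist in PMI's model.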
The primary intention of PMI’s framework for program management is to describe and
capture generally accepted best practices. In addition, a general language for program
management and an understanding of program management in the context of portfolio and
project management are established. The framework defines all processes that are
necessary to successfully execute a program (Institute 2008).
The framework primarily addresses the following groups of persons:
- Project managers: The framework provides knowledge enabling them to better
understand their own role in the context of a program organization. In addition, the
interfaces between project and program managers are described.
- Program managers: The framework ensures that they understand existing best
practices for effectively managing programs.
- Portfolio managers: The framework clarifies their role in the context of actively
managed portfolios of programs and ensures that they know their interfaces with
program managers and, if appropriate, project managers.
- Stakeholders: The framework improves their understanding of the role of a program
manager and how the program manager interacts with various types of stakeholders;
the types of stakeholders are described within the framework (Institute 2008).
Figure 2-6 shows the interaction between the levels of project and program management. It
displays the information flow between those two areas. During the life cycle of a program,
many projects are initiated and overseen by the program manager. He or she provides
direction and guidance to the project managers. In general, the interactions between the
program manager and the project managers tend to be cyclical and iterative (Institute 2008).
The direction and type of interaction between the program and project levels changes during
the life cycle of a project. In the early stages, the information flows from the program to the
project level. The program manager guides the project managers and frames the desired
goals and benefits. Later in the project life cycle, the project managers send reports on
various areas – e.g., project status, risks, changes, costs, and issues – to the program domain
(Institute 2008).
Figure 2-6: Interaction between project management and program management (Institute 2008). The figure shows the program level passing the program's desired goals, benefits, and management approach down to the projects, and the projects reporting project risks, change requests, changes in baselines (time, cost, and quality), and issues back up, across the initiating, planning, executing, monitoring and controlling, and closing process groups.
The upper part of Figure 2-7 shows a program life cycle. The lower part of Figure 2-7 shows
the basic steps of program benefits management. Both – the program life cycle and benefits
management – begin earlier, during the pre-program phases, with preparations for the program
life cycle and benefits identification (Institute 2008).
During “pre-program preparations” needs are identified and supported by a valid business
case. This ensures a stable groundwork for the initiation of the
program (Institute 2008). Throughout the first phase of the program life cycle valid plans are
created, and information is gathered about and definitions established for organizational
strategies, internal and external influences, program drivers, and, especially, the benefits the
program is supposed to deliver (Institute 2008).
In the phase of “program initiation” the ways in which the program can be structured and
managed to generate the desired benefits are detailed. During this phase, a program charter
is created to capture all conclusions of the pre-program work. The charter normally contains
several different aspects; e.g., justification, vision, outcomes, scope, etc. (the full list can be
found on p. 24 in (Institute 2008)). Later decisions and detailed plans within the program will
be based on the charter (Institute 2008).
The phase of “program setup” is reached after passing the phase-gate review G2, which
implies that the program has received “approval in principle.” The program manager, who
has already been identified by a committee, starts to progressively elaborate the program
charter and develops a detailed “roadmap” for the future activities and directions of the
program. Furthermore, the basic infrastructure of the program is established and a basic plan
about cost and schedule is created (Institute 2008).
In the phase of “delivery of program benefits” the program has passed another review (G3),
and the execution of the core work of the program takes place. This phase only comes to an
end when the expected benefits are achieved and are fully delivered to the customer. During
this phase, the program manager constantly checks and ensures correct program
alignment. The project managers perform similar tasks on the project level. One must consider
that different capabilities might be deliverable earlier than others. To ensure the delivery of
the full committed benefits of the program it might be necessary to integrate these
capabilities with capabilities delivered by another project before the associated benefit can
be achieved (Institute 2008).
The last phase in the program life cycle is the “program closure.” This phase is reached
when all work is completed and the benefits are accruing. It is necessary to initiate activities
to shut down the program organization and the corresponding infrastructure. Some programs
might be completely closed, while others are managed by normal operations to ensure
support during the operation phase of the delivered product. During this phase, it is essential
to establish processes to guarantee good documentation providing feedback to different
stakeholders and manage the transition to operations (if applicable) (Institute 2008).
Usually projects as well as programs deliver benefits to the organization. In general,
programs are created to deliver bigger benefits than a single project could. While projects
normally deliver their benefits or products at the end of their life cycle, programs can do the
same or deliver parts of the benefits incrementally during their execution. Therefore, it is vital
to emphasize benefits management within the program.
Benefits management requires processes and metrics to track and measure benefits
throughout the program life cycle. Good benefits management assesses the value of
individual program benefits, identifies interdependencies among benefits, and assigns
responsibilities for the creation of the benefits. At the end of a program, the delivered benefits
must be compared to the desired and planned benefits to benchmark program performance
(Institute 2008).
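As a sketch of what such benefits tracking could look like in practice, the following minimal benefits register compares delivered against planned benefits at program closure. All names, fields, and the single aggregate metric are hypothetical illustrations, not prescriptions of the PMI standard:

```python
from dataclasses import dataclass, field

@dataclass
class Benefit:
    """One entry in an illustrative benefits register."""
    name: str
    owner: str                    # assigned responsibility for creating the benefit
    planned_value: float          # target on a chosen benefit metric
    depends_on: list = field(default_factory=list)  # interdependencies among benefits
    delivered_value: float = 0.0  # updated throughout the program life cycle

def realization_rate(register):
    """Benchmark program performance: delivered vs. planned benefits."""
    planned = sum(b.planned_value for b in register)
    delivered = sum(b.delivered_value for b in register)
    return delivered / planned if planned else 0.0

register = [
    Benefit("Reduced cycle time", "PM A", planned_value=10.0, delivered_value=8.0),
    Benefit("Lower unit cost", "PM B", planned_value=5.0, delivered_value=5.0,
            depends_on=["Reduced cycle time"]),
]
print(realization_rate(register))  # ≈ 0.87
```

The point of the sketch is only that benefits management needs a metric per benefit, an owner, recorded interdependencies, and a final comparison of delivered against planned values.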
Figure 2-7: Program life cycle and program benefits management (Institute 2008). The program life cycle runs through pre-program preparations, program initiation, program setup, delivery of program benefits, and program closure, separated by the gates G1 to G4. The corresponding steps of program benefits management are:
- Benefits identification: identify and qualify business benefits
- Benefits analysis and planning: derive and prioritize components, derive benefits metrics, establish the benefits realization plan and monitoring, and map benefits into the program plan
- Benefits realization: monitor components, maintain the benefits register, and report benefits
- Benefits transition: consolidate coordinated benefits and transfer the ongoing responsibility
2.1.4 Lean Advancement Initiative (LAI) program management framework
The program management framework of the Lean Advancement Initiative (LAI) was
developed based on former work by the institute. The framework focuses on engineering
development projects in the aerospace and defense industry; the LAI refers to these as
“programs”. The generic term “project” is used for smaller-scale efforts; e.g., the
development of subsystems within the program. These programs lead to the formation of
“enterprises”. An enterprise is the interorganizational network that has the common purpose
of the development and delivery of a system. An enterprise comprises both public
organizations, such as the program office, and private organizations, such as the main
contractor and its suppliers (Stanke 2006).
Figure 2-8 shows the structure of a program and the corresponding stakeholders. The
bracketed parts represent what is understood as the program enterprise: not just the
program office alone, but the program office, the main contractor, and the 1st-tier
suppliers. A result of the interorganizational nature of (program)
enterprises is that they have distributed responsibility and leadership, and they have
stakeholders with both common and diverse interests.
Figure 2-8: Program structure and stakeholders. The figure shows the hierarchy from the government, Congress, agencies, services, and the general public, through the program office and the main contractor, down to the system suppliers and their 2nd- and 3rd-tier suppliers.
Program management can be perceived as a “black box.” In general, program management
coordinates the transformation of various input factors, such as stakeholder needs,
technology, competitive actions, process definitions, organization, skills, manpower,
allocated budget, regulations, etc., into the final product, an operational system. The
representation in Figure 2-9 is clustered around the core competencies of a program
enterprise to successfully deliver the operational system, given a certain set of input factors,
or resources. The core competencies can be structured into two major parts:
- Cross-cutting program management capabilities (program execution and enterprise
management)
- Engineering life cycle processes (scoping, planning, and contracting; technology
integration; product design; production, use, service, and decommissioning)
Figure 2-9 provides an overview of the LAI program management framework. The upper part
represents general program management activities. These are independent of the type of
program and must be executed within both engineering programs and change programs.
The lower part of Figure 2-9 represents the life cycle of an engineering program. This life
cycle covers an engineering program from the decision for material development to the start
of production. The structure of the life cycle does not represent a strict time line for the
execution; it is more a way to organize the different processes. Various activities from
diverse categories of the framework might be executed iteratively and repetitively during the
execution of the program. For example, programs can go through several spirals of planning,
technology integration, and design.
The framework focuses on the processes and organizational practices that are important for
successful program execution from both a government program office perspective and a
contractor’s perspective. Whereas some practices in the model explicitly focus on
interactions between the program office and the main contractor and on interactions within
those two entities, many recommendations are also valid for the extended program
enterprise. Furthermore, the recommendations are valid for interactions between the main
contractor and the system suppliers, between the system suppliers and their suppliers, and
so forth, and for interactions within those entities.
Scoping, Planning & Contracting
• Stakeholder needs & requirements definition
• Managing trade‐offs
• Cost estimation & life cycle costing
• Incentive alignment, contract negotiation & conclusion
• Defining lifecycle strategy
Technology Integration
• Technology maturation monitoring
• Technology transition management
Product Design
• PD team organization
• Integrated product and process development
• PD and SE best practices
• Monitoring PD progress
• Product architecting
• Value Stream Optimization
• Product design strategy
• Testing & prototyping
Production, Use, Service &
Decommissioning
Program Execution
• Integrating and leading the program organization
• Stakeholder management
• Progress monitoring and management
• Risk management
• Project management
• Multi‐project coordination
• Developing intellectual capital
• Program manager qualifications
Enterprise Management
• Building and transforming the program enterprise
• Understanding the program enterprise
• Learning and improving the program enterprise
• Knowledge management
• Secure IT information / document sharing in the enterprise
• Program portfolio and environment management
Figure 2-9: Lean Advancement Initiative (LAI) program management framework
The program management activities are structured in six main areas: Program Execution;
Enterprise Management; Scoping, Planning, and Contracting; Technology Integration;
Product Design; and Production, Use, Service, and Decommissioning. As the LAI research
focuses on the early phases of a system’s life cycle, the last phase of the engineering life
cycle, Production, Use, Service, and Decommissioning, is not covered in the model.
Program Execution focuses on the practices for efficient execution of a program by the
government program office, the main contractor, and the system suppliers. Enterprise
Management addresses more generally those aspects that are important in creating,
understanding, and improving large and complex enterprises. During Scoping, Planning, and
Contracting, the user needs are translated into a legal contract between the program office
and the main contractor. During Technology Integration, technologies for the design phase
are selected and integrated into the program. The readiness of low-maturity technologies
may be increased in parallel technology development tracks to a level that is suitable for their
introduction into the design process. In Product Design, system integration and the
components of the system are specified to a level that makes it fit for production. During
Production, Use, Service, and Decommissioning, the system to satisfy the user needs is
produced, delivered to the customer, serviced, and eventually dismantled. Since the last
phase is out of focus of the LAI, there are no subcategories listed in this main area.
2.1.5 Comparison of existing frameworks
Comparison of the different frameworks makes it clear that there are major differences in the
understanding of what the necessary aspects of program management are. On the one hand, all
the approaches claim to fully address all possible aspects of program management. On the
other hand, as the remainder of this chapter will show, not all frameworks cover the same
processes and facets of program management. In addition, many authors hold that there is a
lack of a clear definition for program management in current literature (Ferns) (Williams and
Parr) (Vereecke, Pandelaere et al. 2003) (Haughey 2001) (Görög 2011). Basically, as has
been mentioned, the literature describes two fundamental aspects in program management:
- Programs as a method for organizational change (this might include the use of small
change projects)
- Programs in an engineering environment in the context of large development efforts
and the management of multiple related projects
Vereecke (Vereecke, Pandelaere et al. 2003) discusses a survey undertaken by
e-Programme.com (E-Programme.com 2011), a website hosting the findings of an organization
dealing with program management, early in 2001. One focus of the survey was the
identification of the common understanding of program management. The two major answers
from the respondents were that program management is “the management of organizational
change through projects that bring about change” (ca. 50%) and that it is “the management
of multiple projects” (ca. 40%) (Vereecke, Pandelaere et al. 2003).
These findings lead to the first research question of this thesis:
“What is program management and how can it be defined?”
Due to the fact that no clear definition of program management has wide acceptance, it is
even harder to find a broadly accepted framework for program management. In addition,
most descriptions of frameworks do not provide any empirical verification but rather just state
that program management has to be executed in a specific way.
This leads to the second research question, which will be answered partly in this chapter and
partly in chapter 4:
“Is there a framework for engineering program management?”
As discussed earlier, there are two major perspectives in program management: cross-cutting
program capabilities, as they occur mainly in change programs, and a program life cycle.
Consequently, it is beneficial to assess the existing frameworks in terms of those aspects
and choose the most promising one for the further development of this thesis.
The frameworks are compared and assessed based on the literature sources named in
Table 2-2. Additional publications by the same author or organization are not taken into
account. This is especially important for the framework “The Standard for Program
Management” of the Project Management Institute (Institute 2008), since many chapters are
completely blank with the comment: “This section is not included in this standard, but is
provided so as to be aligned with the PMBOK® Guide – Fourth Edition” (e.g., chapters 7, 8,
and 9 (Institute 2008)). The PMI simply refers to its publication “A Guide to the Project
Management Body of Knowledge (PMBOK Guide)” (Institute 2004), which is a widely
accepted standard for project management. In the same way, the “Defense Acquisition
Guidebook” of the Defense Acquisition University (University 2011) often refers to other
publications such as “Integrated Product and Process Development Handbook” (Defense
1998), “Manager’s Guide to Technology Transition in an Evolutionary Acquisition
Environment” (Defense 2005), and “Risk Management Guide for DoD Acquisitions” (Defense
2006). Furthermore, the Office of Government Commerce also refers to its book “Managing
Successful Projects with PRINCE2” (OGC 2009) – also a standard for project management –
in the description of the “Managing Successful Programmes” framework.
Table 2-2: Literature sources for framework evaluation
- Defense Acquisition System of the United States Department of Defense: “Defense Acquisition Guidebook” by the Defense Acquisition University (University 2011)
- Managing Successful Programmes (MSP) by the Office of Government Commerce: “Managing Successful Programmes” by the Office of Government Commerce (Commerce 2007)
- The Standard for Program Management of the Project Management Institute (PMI): “The Standard for Program Management” by the Project Management Institute (Institute 2008)
- Lean Advancement Initiative (LAI) program management framework: internal unpublished whitepaper
Having described the sources used to compare the frameworks, the assessment criteria
need to be defined. The first set of criteria derives from the cross-cutting program
management activities. For each topic, the individual frameworks are assessed and rated on
the degree to which they address a topic on a scale with three levels:
- Not addressed (displayed in the tables as an empty circle)
- Somewhat addressed (displayed as a half-filled circle)
- Fully addressed (displayed as a filled circle)
Table 2-3 presents the first comparison of the frameworks. The framework topics are taken
from the various frameworks themselves. As one can see, every topic is fully addressed by
at least one framework. The exception is portfolio management, which was considered
important enough by the author to be included in the assessment but is not fully addressed
by any framework.
The major finding from the first comparison, which looks at cross-cutting program
management activities, is that all frameworks except the Defense Acquisition System address
each topic to some degree. This is because the DoD framework mainly focuses on a technical
program life cycle rather than general program activities.
Table 2-3: Comparison of frameworks in terms of cross-cutting program management activities

Frameworks compared (columns): Defense Acquisition System (see section 2.1.1); Managing Successful Programmes (see section 2.1.2); The Standard for Program Management (see section 2.1.3); LAI program management framework (see section 2.1.4)

Framework topics (rows): risk management; progress management; improvement management; managing organizational influences; forming and improving the program enterprise; information management; portfolio management
The second part of the assessment focuses on the life cycle of an engineering program. The
phases are very closely related to those of a process for new product development and
production. Cooper (Cooper 2001) defines the following steps for a Stage-Gate approach – a
broadly accepted standard for product development:
1. Discovery: Pre-work to discover and uncover opportunities and needs and create
ideas.
2. Scoping: The initial exploration of the project/product.
3. Build business case: A more detailed investigation involving primary market and
technical research. This leads to a business case with product and project definition
and justification, and the development of a master plan.
4. Development: The actual design and development phase of the product and its
production processes.
5. Testing and validation: Tests with different focuses: market acceptance, lab
evaluation of technical performance, etc.
6. Full production and market launch: Commercialization, i.e., full production of the
product, marketing, and selling (Cooper 2001).
These phases by Cooper can be transferred to a program environment. This transfer
produces the following stages:
1. Pre-Program and Program Establishment
2. Technology Development
3. Product Design
4. Production, Deployment, and Disposal
Figure 2-10 shows how the gates of the Stage-Gate process correlate with the phases of the
program life cycle. The Stage-Gate process has many stages, especially during the early
phases, but they can easily be grouped into broader categories.
Figure 2-10: Correlation of program life cycle phases and the stages of a Stage-Gate process. The program environment phases (1. Pre-Program and Program Establishment; 2. Technology Development; 3. Product Design; 4. Production, Deployment & Disposal) are correlated with the Stage-Gate stages (1. Discovery; 2. Scoping; 3. Build Business Case; 4. Development; 5. Testing & Validation; 6. Full Production & Market Launch).
Table 2-4 compares the frameworks under discussion in terms of the first phase of the
program life cycle. It is obvious that the frameworks that are focusing on change programs
rather than technical programs do not fully address all aspects of this life cycle phase. The
aspect of contracting, in particular, is not normally part of change programs, because they
are mostly executed internally in an organization or company and do not require any kind of
external support. In contrast, technical development programs typically involve several
different suppliers, which might also contribute to the development of the product. These
suppliers might even be structured hierarchically (i.e., 1st-tier suppliers as direct contractors
of the program enterprise, 2nd-tier suppliers as contractors of a 1st-tier supplier, and so on).
Table 2-4: Comparison of frameworks in terms of pre-program and program establishment activities

Frameworks compared (columns): Defense Acquisition System (see section 2.1.1); Managing Successful Programmes (see section 2.1.2); The Standard for Program Management (see section 2.1.3); LAI program management framework (see section 2.1.4)

Framework topics (rows): need identification; identification of alternatives; requirement definition; contracting; definition of life cycle strategy
Table 2-5 compares the different frameworks according to how fully they address activities in
the phase of technology development. Again, it is particularly the frameworks that focus on
change programs or try to provide a broader overview of program management that do not
contribute any information about the specific necessary actions and best practices. In
addition, the Defense Acquisition System of the DoD does not provide many insights into
technology development and technology maturation monitoring, even though the main focus
of the framework is the technical program life cycle.
Table 2-5: Comparison of frameworks in terms of technology development activities

Frameworks compared (columns): Defense Acquisition System (see section 2.1.1); Managing Successful Programmes (see section 2.1.2); The Standard for Program Management (see section 2.1.3); LAI program management framework (see section 2.1.4)

Framework topics (rows): technology maturation assessment; technology transition management
Table 2-6 assesses the frameworks in terms of the activities during the phase of product
design. Once more, the general program management and change program management
frameworks do not address this aspect very well. Both frameworks that include a technical
program life cycle provide information and best practices for nearly all activities during
product design; every aspect is addressed by at least one of the two technical
program frameworks.
Table 2-6: Comparison of frameworks in terms of product design activities

Frameworks compared (columns): Defense Acquisition System (see section 2.1.1); Managing Successful Programmes (see section 2.1.2); The Standard for Program Management (see section 2.1.3); LAI program management framework (see section 2.1.4)

Framework topics (rows): individualizing the design process; process monitoring and improvement; testing and evaluation
A comparison of the frameworks in terms of the life cycle phase of production, deployment,
and disposal is shown in Table 2-7. Since this phase is
outside the scope of this thesis, it is not broken down in the table into component activities.
The primary focus of this thesis is on development programs and not on the production of the
product itself. In addition, only one framework contributes any insights about this phase.
Since no individual activities are listed, the filled circle in the table only indicates that
some kind of information and/or best practices are provided in the Defense Acquisition
System framework.
Table 2-7: Comparison of frameworks in terms of production, deployment, and disposal

Frameworks compared (columns): Defense Acquisition System (see section 2.1.1); Managing Successful Programmes (see section 2.1.2); The Standard for Program Management (see section 2.1.3); LAI program management framework (see section 2.1.4)

Framework topic (row): production, deployment, and disposal
By aggregating each framework’s ratings for the different activities of an individual program life cycle phase, an overall rating can be produced for each framework for each life cycle phase. All activities are weighted equally in this aggregation. An entirely filled circle means that all activities of the life cycle phase are fully addressed, while an empty circle means that none of the activities in the phase is addressed by the framework. In total, eight different ratings (from empty to full circle) are possible. A quarter-filled circle, for example, means that one fourth of all activities in this life cycle phase are fully addressed (or half of the activities are partly addressed).
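The aggregation described above is an equally weighted average. A minimal sketch, assuming the three activity-level ratings (empty, half-filled, full circle) are encoded as 0, 0.5, and 1 (the function name and this encoding are illustrative assumptions, not part of the thesis), could look like this:

```python
# Sketch of the rating aggregation described above (illustrative only;
# the 0/0.5/1 encoding of the circle symbols is an assumption).
# Each activity is rated empty (0.0), half-filled (0.5), or full (1.0);
# the phase-level rating is the equally weighted mean of these values.

def phase_rating(activity_ratings):
    """Aggregate per-activity ratings into one phase-level rating."""
    if not activity_ratings:
        raise ValueError("phase has no rated activities")
    return sum(activity_ratings) / len(activity_ratings)

# A phase with four activities, only one of them fully addressed,
# yields 0.25, i.e., a quarter-filled circle.
print(phase_rating([1.0, 0.0, 0.0, 0.0]))  # 0.25
```

The same computation applies to every life cycle phase; only the number of activities entering the average changes.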
Table 2-8 shows the results of the rating process. The only two frameworks which contribute
information and/or best practices for the different life cycle phases are the Defense
Acquisition System by the Department of Defense and the LAI program management
framework.
Table 2-8: Comparison of frameworks in the phases of an engineering program life cycle
(Rows: Pre-Program and Program Establishment; Technology Development; Product Design; Production, Deployment, and Disposal. Columns: Defense Acquisition System, see section 2.1.1; Managing Successful Programmes, see section 2.1.2; The Standard of Program Management, see section 2.1.3; LAI program management framework, see section 2.1.4.)
To gain an even broader overview, only the two major categories – cross-cutting program activities and technical life cycle activities – are assessed in Table 2-9. The results show that there is only one framework that addresses both of these categories.
Table 2-9: Overall comparison of the different frameworks
(Rows: Cross-cutting program activities; Technical program life cycle activities, not including production. Columns: Defense Acquisition System, see section 2.1.1; Managing Successful Programmes, see section 2.1.2; The Standard of Program Management, see section 2.1.3; LAI program management framework, see section 2.1.4.)
2.2 Issues in program management
There are a number of sources that provide valuable insights into common issues in program
management. Most of these publications focus on a specific kind of program environment.
One very helpful source for the identification of challenges, especially in the environment of
defense acquisition programs (this is in the context of the Defense Acquisition System of the
United States Department of Defense; see section 2.1.1 for a detailed description), are the
annual reports of the Government Accountability Office (GAO). These reports assess all
major acquisition programs (ACAT I programs and programs designated major acquisition
programs for other reasons). The findings are summarized together with possible actions to
improve the performance of these programs. Unfortunately, these reports only emphasize the findings of the current assessment; previously identified issues and recommendations for improvement are not mentioned again. Consequently, one must examine all reports ever published by the GAO to find all the challenges this agency has identified in defense acquisition programs.
Another source for common challenges is general program management literature,
especially literature on frameworks, because they claim to be based on known best practices
for avoiding common pitfalls. In addition, there are many publications dealing with challenges
in program environments or parts of these environments (e.g., product development). By
compiling the diverse findings, one can produce a comprehensive collection of common
issues in program management.
This discussion leads to the third research question of this thesis:
“What are common pitfalls in program management?”
2.3 Introduction to lean management
The term “lean” as applied to management was first used by John Krafcik in the paper “Triumph of the Lean Production System” (Krafcik 1988). He differentiates between buffered and lean production: by buffered, Krafcik means the way Western companies traditionally produced, with many buffers between the different steps of the production process.
Later, the term was popularized by Womack et al. in their book “The Machine That Changed the World” (Womack, Jones et al. 1990). In their five-year study with a budget of $5 million, the authors investigated automobile companies around the world and identified fundamental differences between the production processes of Western and Japanese companies (especially Toyota). Toyota’s processes were found to outperform those of Western companies by factors of up to 10 in terms of defect rates, plant productivity, manufacturing lead times, and use of resources.
Over the years, a number of studies examined the Toyota Production System (TPS), and the
idea of “lean thinking” emerged. The term is often misunderstood as “doing the same work
with fewer employees” or “creating flatter hierarchies” (Hoppmann 2009). The main vision of
lean thinking is to create an uninterrupted, continuously flowing value stream that delivers in
the shortest time with the least waste (Womack and Jones 1996). To detail the idea of lean thinking, two of the three authors of “The Machine That Changed the World” wrote the book “Lean Thinking”, in which they structure their understanding of lean along five principles (Womack and Jones 1996):
1. Specification of customer value
2. Identification of the value stream
3. Creation of a continuous flow
4. Pull of the value
5. Striving for perfection
The principles derived by Womack and Jones originated in production. Today, however, the idea of lean thinking has spread throughout the entire enterprise. Morgan, for example, conducted a two-and-a-half-year, in-depth study of Toyota’s product development system to determine whether the same principles as in production are applicable to product development. Together with Liker, he describes his findings in “The Toyota Product Development System” (Morgan and Liker 2006).
As Toyota outperforms its competitors in all areas where it uses lean, the adaptation of lean thinking to program management – a discipline with major issues (see section 2.2 and chapter 5) – might be beneficial. This leads to the fourth research question of this thesis:
“Are there ways to adapt lean thinking to program management?”
2.4 Identified gaps in current literature
In this section, gaps in the current research and the corresponding literature are identified,
and for each of them the correlative research question of this thesis is given:
- Lack of definition: There is currently no clear definition of “program management,”
even though many authors are using the term. Consequently, a broadly accepted
definition must be established in order to provide a common understanding for further
discussions. The research question for this gap is:
“What is program management and how can it be defined?”
- Lack of a framework for engineering programs: In addition to there being no clear,
widely accepted understanding of the term “program management,” there is no widely
used framework for this discipline. Some existing approaches have been assessed in
this thesis in terms of the two major aspects of program management in the context of
engineering programs: cross-cutting program activities and technical program life
cycle activities. It became clear that there is only one framework which addresses
both facets with any approach to thoroughness: the Lean Advancement Initiative
(LAI) program management framework. Unfortunately, this framework is only
literature based, and there is no empirical validation of it available. This leads to the
research question:
“Is there a framework for engineering program management?”
- No existing collection of common pitfalls: Currently, one can only find literature
addressing possible issues in quite limited aspects of the field of program
management. There is no comprehensive collection of the issues and pitfalls in all
possible aspects of program management. Consequently, it is necessary to collect
current knowledge about issues and summarize it to enable research to better
grapple with the biggest problems in program management. The research question
for this gap is:
“What are common pitfalls in program management?”
- No application of “lean thinking” in program management: Research and experience
in various fields (e.g., production and product development) has shown that the
practice of “lean thinking” is beneficial. Up to now, there are no approaches for
adapting lean ideas to program management, even though there are major issues
where the application of lean might be advantageous. This leads to the fourth and last
research question of this thesis:
“Are there ways to adapt lean thinking to program management?”
2.5 Research approach
To address the research questions of this thesis scientifically, diverse research approaches
have been chosen. Figure 2-11 shows a list of the research approaches and where they are
used in this thesis. Not all methods fit all of the different research questions.
Figure 2-11: Usage of the different research approaches (literature review, interviews, surveys, industry focus group) in chapters 3 to 6
Since the method of reviewing literature provides basic knowledge about specific topics and
enables one to formulate hypotheses, it is used extensively throughout this thesis. It ensures
that the diverse aspects and perspectives captured in current literature are considered for
future research steps. The technique of conducting interviews was used for more focused
research to gain insights into specific areas. By undertaking a survey, one collects more data
from a broader audience than by conducting an interview, but the focus and level of detail
are limited since extremely long surveys are normally not well accepted. The last research
method – exploiting the knowledge of an industry focus group – is in between a survey and
an interview. Working with a large group of experienced people can provide valuable insights
from different companies in the form of “mini case studies.” The interviewer gets the chance
to specify his or her questions to gain exactly the desired information (this is only partly
possible in a survey).
2.5.1 Literature review
Many of the sections of this thesis are based on knowledge gained in a review of literature.
The main sources of the literature are:
- MIT libraries (engineering as well as business libraries).
- WorldCat Search Engine: A search engine that includes all university and public
libraries in the United States. An interlibrary loan service based on this service
provides access to books that are not available at the MIT libraries.
- Google: A large Internet search engine.
- Google Scholar: A search engine provided by Google for scholarly literature in
various formats and for various disciplines.
- Amazon.com: A large online shop and database for books.
- Recommendations from other researchers.
The literature review started with basic keyword searches. Following up on the most
promising findings, forward and backward searches based on identified citations were
conducted. These in-depth searches focused on specific fields with smaller scopes.
2.5.2 Interviews
In addition to the literature review, interviews were used as a source of information. Seven interviews were conducted with Air Force program managers. All interviews were carefully planned based on a review of the literature on case study research (Eisenhardt 1989; Eisenhardt 1991; Yin 2009) and were based on preplanned questionnaires to ensure consistent administration.
The main goals of the interviews were:
- Evaluation of the LAI program management framework (detailed in chapter 4)
- Ranking of the LAI program management framework parts (also detailed in chapter 4)
- Verification of the identified pitfalls (detailed in chapter 5)
- Refinement of a survey for further data collection on pitfalls and Lean Enablers
(detailed in chapter VIII (Appendix B))
Five of the seven interviews were conducted to verify the LAI program management
framework. A detailed description of the questionnaire that was used and the results can be
found in section 4.2. The primary goal of the other two interviews was to validate the findings
about pitfalls in program management from current literature. The secondary goal was to test
the effectiveness of a questionnaire to collect data about pitfalls and lean enablers from the
industry focus group. Details from the latter two interviews may be found in section 5.2 and chapter VIII (Appendix B).
2.5.3 Surveys
In general, there are several methods to acquire data (Fowler 1995; Aday and Cornelius
2006; Neuman 2006; Fowler 2009). As already discussed, interviews focus only on a small
sample but provide in-depth information. A survey normally covers a larger sample but has a
lower level of detail. Consequently, a survey is less suitable for exploratory research. On the
other hand, conducting a survey will provide meaningful insights for explanatory research
attempting to verify hypotheses.
For this thesis, a web-based survey was chosen for several reasons. First, the link to the web-based survey can be sent to the exact focus group (Aday and Cornelius 2006); in this case, the members of the industry focus group and the LAI consortium were the participants. Second, the cost per unit of a web-based survey is very low compared to sending a printed survey to various companies (Fowler 2009). Third, convenience for the respondent is higher, since there are no constraints on when and where to answer the survey (a printed version would have to be carried along to offer the same flexibility). Fourth, the throughput time of a web-based survey is dramatically lower than that of a paper-based version, as the process of printing and mailing does not apply. Finally, all data collected during a web-based survey is directly accessible electronically; most websites for web-based surveys offer the option to download the data in a spreadsheet that can be further processed by any spreadsheet software (e.g., Microsoft Excel).
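As a brief illustration of processing such an export without spreadsheet software (the file name and column label below are invented for the example, not taken from the thesis), a CSV download can be tallied with a few lines of standard Python:

```python
import csv
from collections import Counter

# Hypothetical survey export: one row per respondent, one column per
# question. The file name and the column name "Q1" are illustrative.
def answer_counts(path, question):
    """Count how often each answer option was chosen for one question."""
    with open(path, newline="") as f:
        return Counter(row[question] for row in csv.DictReader(f))
```

For example, `answer_counts("survey_export.csv", "Q1")` returns a mapping from each answer option to its frequency across all respondents.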
The questionnaire used in this thesis is explained in detail in section 4.2.2. It was pre-evaluated with a number of people, including research assistants, students, and program managers, to ensure clarity, comprehensiveness, and acceptability (Aday and Cornelius 2006). Based on the findings of this pretest, the survey was substantially shortened to reduce the time required and thereby increase the likelihood of participation by potential respondents.
2.5.4 Industry focus group
The idea of the industry focus group is closely related to case study research. The general
idea is to gain in-depth information about specific topics from experienced participants
(Eisenhardt 1989; Eisenhardt 1991; Yin 2003).
The industry focus group for this thesis had members from different companies as well as organizations. Table 2-10 gives an overview of the participating companies, Table 2-11 of the participating organizations. Due to the confidentiality of some of the information provided, the general information about the participants is only given in anonymous form. For each company, the information consists of the number of employees, the revenue in 2010 (if available, otherwise from 2009), and the company’s sector. Since the organizations are nonprofit organizations, no revenue data is available for them.
Table 2-10: List of companies participating in the industry focus group
Company Name # of Employees Revenue Sector
Company A 107,000 $31.60 B Aerospace, Defense
Company B 160,500 $64.31 B Aerospace, Defense
Company C 36,000 $12.94 B Aerospace, (Defense)
Company D 72,000 $25.18 B Aerospace, Defense
Company E 20,000 $4.70 B Aerospace, (Defense)
Company F 405,000 $104.53 B Technology
Company G 3,900 unknown Aerospace
Table 2-11: List of organizations participating in the industry focus group
Organization # of Members Focus
Organization A 6,000+ System Engineering
Organization B 318,400+ Project Management
The industry focus group was used for different methods of verification:
- A day-long group meeting with the key participants in the industry focus group took
place at the end of January 2011 in California. The companies present debated
general topics related to program management and had a guided discussion about
topics with high relevance for this thesis and further research (Haberfellner and
Daenzer 2002).
- Regularly occurring telephone conferences with durations of one and a half hours
were used to discuss certain topics.
- The members of the industry focus group participated in a survey undertaken to rank different topics in program management in terms of their relevance for further research.
- The members of the group were occasionally asked to provide feedback via email
about the current findings of the research for this thesis as it proceeded.
2.6 Summary of current state of program management
In this chapter, the current state of the literature on various topics related to program
management has been discussed. In addition, research questions have been formulated for
each of the four identified literature gaps.
First, different existing frameworks of program management were described. Second, these
frameworks were compared. This analysis and evaluation led to the identification of two gaps
in the current literature: There is currently no broadly accepted definition of program
management, even though there are a noticeable number of publications dealing with it.
Furthermore, none of the frameworks currently available fully covers all identified aspects of
program management. There is one framework, that of the Lean Advancement Initiative,
that addresses both basic facets, but it has not been verified yet.
Third, an additional insight from the literature review was that there is no inventory of the issues or pitfalls in program management. While there is research that tries to unveil issues in program management, these publications only cover individual aspects and do not attempt to deliver a comprehensive list of issues or pitfalls.
Last, the idea of “lean thinking” was identified as promising for producing performance
improvements in multiple environments. However, there are no approaches to adapt lean
ways and ideas to program management.
Figure 2-12 lists the research questions with the sections where the corresponding gaps
were identified.
Figure 2-12: Research questions and corresponding sections (Research Question 1: What is program management and how can it be defined?, section 2.1; Research Question 2: Is there a framework for engineering program management?, section 2.1; Research Question 3: What are common pitfalls in program management?, section 2.2; Research Question 4: Are there ways to adapt lean thinking to program management?, section 2.3)
3 Definition of program management
In the preceding chapter it became clear that there are various concepts of program management, none of which has gained widespread acceptance. This chapter compares
various existing definitions and tries to identify the most common themes within these
definitions. Based on these findings, a definition will be formulated covering all of the facets
of program management. This will establish an understanding of program management
before the thesis progresses further and describes details based on that understanding. In
addition, the first research question, related to the lack of a clear definition of program
management, is answered by providing a comprehensive definition.
3.1 Definition of project management
Most authors consider program management to include project management in some manner (Ferns 1991; Williams 1999; Fricke and Shenhar 2000; Haughey 2001;
Andersen and Jessen 2003; Archibald 2003; Institute 2004; Lycett, Rassau et al. 2004;
Maylor, Brady et al. 2006; Commerce 2007; Hut 2008; Mao, Cheng et al. 2008; Patanakul
and Milosevic 2008; Patanakul and Milosevic 2009). Therefore, it is necessary to briefly
discuss project management and establish a definition of it before developing a definition of
program management.
There are three major aspects of a project (Maylor, Brady et al. 2006):
1. Uniqueness: A project is a unique endeavor; the same project is normally never
executed in the same way more than once (Maylor, Brady et al. 2006; Williams and
Parr 2006; Patanakul and Milosevic 2009). However, the same set of capabilities can
be required for the execution of more than one project (Davies and Brady 2000).
2. Temporariness: A project is a finite task, temporally (Maylor, Brady et al. 2006). Temporariness does not necessarily imply brevity; there are projects with durations of even 40 years which still are not fully integrated into the normal organization of a company and are therefore temporally restricted (Maylor, Brady et al. 2006). A project normally comes to an end when the desired product or process is delivered (Williams and Parr 2006).
3. Level of pre-determinism: When a project starts, what is to be achieved, in what timescales, and with what resources is normally defined (Maylor, Brady et al. 2006; Williams and Parr 2006). In addition, the tool set necessary to execute the project successfully must be discussed, clarified, and provided to the members of the project team before the project starts (Commerce).
Table 3-1 summarizes existing often-quoted definitions of the terms “project” and “project
management.” The definition most often found in current literature is the one from the Project
Management Institute. “A Guide to the Project Management Body of Knowledge (PMBOK Guide)” (Institute 2004) provides a widely – especially in the United States – accepted framework for project management consisting of a collection of best practices. The other very frequently cited definition is provided by the Office of Government Commerce. Since the OGC is a public organization of the United Kingdom, the framework PRINCE2, described in
“Managing successful projects with PRINCE2“ (Commerce 2009), is more widely used in
Europe. However, the definition from the Office of Government Commerce emphasizes the
aspect of the level of pre-determinism more than the one from the Project Management
Institute.
Table 3-1: Broadly used definitions of “project” and if available “project management”
Author Definition
International Organization for
Standardization (ISO 1994)
(revised by (ISO 2005))
“A set of co-ordinated activities, with a specific start and
finish, pursuing a specific goal with constraints on time,
cost and resources.”
Sapsed (Sapsed and Salter) “‘Project’ is used for a system of work activities for which
there is a predefined outcome to deliver and an
associated timeline with an end date."
Office of Government
Commerce (Commerce)
“A project is also a temporary organization, usually
existing for a much shorter duration, which will deliver
one or more outputs in accordance with a specific
business case. A particular project may or may not be
part of a program.”
Office of Government
Commerce (Commerce)
“Project Management is the planning, delegating,
monitoring and control of all aspects of the project, and
the motivation of those involved, to achieve the project
objectives within the expected performance targets for
time, cost, quality, scope, benefits and risks.”
Project Management Institute
(PMI)
“A project is a temporary endeavor undertaken to create
a unique product, service, or result.”
Project Management Institute
(PMI)
“Project management is the application of knowledge,
skills, tools and techniques to project activities to meet
project requirements. Project management is
accomplished through the application and integration of
the project management processes of initiating, planning,
executing, monitoring and controlling, and closing.”
It became clear that the definitions for “project” and “project management” from the Office of
Government Commerce fit best with the definition components most often found in the other
definitions. Consequently, these two definitions will be used for further discussion about
project-management-related topics.
3.2 Definition of program management
After clarifying the definition of project management, the logical next step is to define the
terms “program” and “program management.” Table 3-2 gives the definitions of those terms
found in widely used literature. If available, both definitions – “program” and “program
management” – are provided.
Table 3-2: Definitions of “program” and “program management”
Author Definition
(Andersen and Jessen
2003)
“A program could be a new product development, an
organizational restructuring of the company or the
implementation of an advanced software package in different
departments of the company. Program management is the
effective management of all the projects under the umbrella of
the program."
(Management 2006) “Program management is the coordinated management of
related projects, which may include related business-as-usual
activities that together achieve a beneficial change of a
strategic nature for an organization”
(Archibald 2003) “Program: A long-term undertaking that includes two or more
projects that require close coordination.”
(Brown 2007) “Program Management is management of a group of projects
and/or operations to achieve business targets, goals, or
strategies, and may or may not have a defined end point.”
(University 2009) “The process whereby a single leader exercises centralized
authority and responsibility for planning, organizing, staffing,
controlling, and leading the combined efforts of participating/
assigned civilian and military personnel and organizations, for
the management of a specific defense acquisition program or
programs, through development, production, deployment,
operations, support, and disposal. (DAU Glossary)”
(Ferns 1991) “A program is a group of projects that are managed in a
coordinated way to gain benefits that would not be possible
were the projects to be managed independently.
“Program management is the coordinated support, planning,
prioritization and monitoring of projects to meet changing
business needs.”
(Haughey 2001) “Program management is a technique that allows
organizations to run multiple related projects concurrently and
obtain significant benefits from them as a collection. Program
management is a way to control project management, which
traditionally has focused on technical delivery. A group of
related projects not managed as a program are likely to run
off course and fail to achieve the desired outcome.”
(Hut 2008) “In the simplest of terms, program management is the
definition and integration of a number of projects to cause a
broader, strategic business outcome to be achieved. Program
management is not just the sum of all project management
activities but also includes management of the risks,
opportunities and activities that occur “in the white space”
between projects.”
(Lycett, Rassau et al. 2004) “Program Management – defined as the integration and
management of a group of related projects with the intent of
achieving benefits that would not be realized if they were
managed independently. Whilst connected, this is distinct
from portfolio management.”
(Mao, Cheng et al. 2008) “However, in general, it is concluded that a Program consists
of a series of interrelated projects while Program
Management is to manage all interrelated projects as a whole
in a harmonic manner which is beneficial to successfully fulfill
each single project and the entire program.”
(Commerce 2007) “In MSP [Managing Successful Programmes], a program is
defined as a temporary, flexible organization created to
coordinate, direct and oversee the implementation of a set of
related projects and activities in order to deliver outcomes and
benefits related to the organization’s strategic objectives. A
program is likely to have a life that spans several years.”
(Commerce 2007) “Program management is the coordinated organization,
direction and implementation of a dossier of projects and
transformation activities (i.e. the program) to achieve
outcomes and realize benefits of strategic importance.”
(Patanakul and Milosevic
2008)
“On the other hand, a program, led by a program manager, is
a family of projects that are strongly dependent, share
common goals, and lead to a single deliverable product or
service.”
(Patanakul and Milosevic
2009)
“Another case of MPM [multi-project management] is program
management, where the projects in the group are mutually
dependent, share a common goal, and lead to a single
deliverable product or service. In contrast with MGMP
[management of a group of multiple projects], program
management is the centralized coordinated management of a
group of goal-related projects to achieve the program’s
strategic objectives and benefits.”
(Institute 2008) “Program Management is the centralized coordinated
management of a program to achieve the program’s strategic
objectives and benefits. It involves aligning multiple projects
to achieve the program goals and allows for optimized or
integrated cost, schedule, and effort.”
By comparing the different existing definitions, five key aspects could be identified:
1. A program consists of multiple, related projects. The first aspect, on which all but one of the definitions of program management found in current literature agree, is that a program consists of multiple projects. A broadly accepted definition for the term “project” was established in the preceding section 3.1. Programs may also
evolve from single projects as they become more complex (Andersen and Jessen
2003). The organization of the projects within a program can vary depending on the
purpose of the program. Projects can be organized sequentially, in parallel, in hybrid sequential-parallel models, or as a network of interlinked projects
(Lycett, Rassau et al. 2004). However, perhaps the most common concept of the
organization of projects within a program is an environment of interlinked projects.
2. Long program duration. The second aspect focuses on the duration of a project or
program. Whereas projects are time constrained, programs in general do not need to
have a fixed end date. A program can only come to an end when the deliverable
(e.g., a product or service) is fully developed and handed over to the user/customer
(Lycett, Rassau et al. 2004; Maylor, Brady et al. 2006). In addition, some authors
state that a program is also responsible for the time after handing over the product or
service to the user. This may include, for example, deployment, operations, support,
and disposal (DoD 2008; University 2009; DAU 2010).
3. Management economy of scale. There are many tasks and objectives in the projects
within a program that are similar. By identifying best practices among different
projects and introducing organization-wide learning, huge benefits can be gained
when projects are managed in a program instead of being managed in isolation in a
project management context (Ferns 1991; Lycett, Rassau et al. 2004).
4. Complex value proposition. A program focuses on multiple stakeholders and has to
satisfy a complex value proposition. While a single project may only deliver an output
such as a product, a program has to satisfy the multifaceted needs of several
stakeholders. A program could, for example, provide a transportation solution for
urban environments, while the project would deliver a train or bus for this purpose
(Commerce 2007).
5. Programs define their own enterprise. Based on the research undertaken by the LAI,
the formation of a program enterprise is included in the definition of program
management used in this thesis. The organization of each program is unique and
established to satisfy the needs of all stakeholders in the best way. Since a program
has its own program office and has to defend decisions against undue influence from
single stakeholders, a program has a certain degree of autonomy.
Similarly to the comparison of the different frameworks in section 2.1.5, Table 3-3
assesses and compares the different definitions of program and program
management presented in Table 3-2. Again, there are three possible ratings for a
definition. When an aspect is not mentioned at all in the definition, an empty circle is given.
If the aspect is partly addressed (e.g., only implied), the rating is a half-filled
circle. An entirely filled circle is given if the aspect is explicitly mentioned in the definition.
Even though there might be additional aspects mentioned in the publication where the
definition appears, only the formal definition itself is considered in these ratings. The purpose
of this chapter is to find a short and precise definition of program management, not to
generally assess the current program management literature.
Table 3-3: Comparison of definitions of “program management”

Aspects rated: Aspect 1: consists of multiple projects; Aspect 2: long program duration;
Aspect 3: generates management economy of scale; Aspect 4: complex value proposition;
Aspect 5: defines own enterprise. Rating legend: filled circle: fully addressed; half-filled
circle: partly addressed; empty circle: not addressed.

Definitions rated: Andersen (Andersen and Jessen 2003); Association for Project
Management (Management 2006); Archibald (Archibald 2003); Brown (Brown 2007);
DAU (University 2009); Ferns (Ferns 1991); Haughey (Haughey 2001); Hut (Hut 2008);
Lycett (Lycett, Rassau et al. 2004); Mao (Mao, Cheng et al. 2008); OGC (Commerce 2007);
Patanakul (Patanakul and Milosevic 2008; Patanakul and Milosevic 2009); PMI (Institute 2004)
The table shows that there is no current definition that fully covers all aspects. In particular,
the aspects of long program duration and definition of a program enterprise are not widely
addressed. This leads to the necessity of formulating a definition for use in this thesis. The
term “program management” in the remainder of this thesis will be understood as follows:
“Program management is defined as the management of multiple related projects and the
resulting enterprise over an extended period of time to deliver a complex value proposition
and utilize efficiency and effectiveness gains.”
3.3 Summary
This chapter has answered the first research question, “What is program management
and how can it be defined?” The first step was to provide the basic knowledge necessary
to understand a definition of program management. Since many authors state that a program
consists of multiple (interrelated) projects, a clear definition of project management had to be
found. Various aspects of project management were presented, and a number of definitions
from the literature were compared and assessed.
Afterwards, a comprehensive compilation of definitions of program management from the
literature was presented. Again, different aspects could be identified within the spectrum of
definitions. It became clear that no current definition fully covers all facets of program
management. Consequently, a new definition had to be formulated. The definition and hence
the answer to the first research question is:
“Program management is defined as the management of multiple related projects and
the resulting enterprise over an extended period of time to deliver a complex value
proposition and utilize efficiency and effectiveness gains.”
4 Selection of a framework for engineering programs
This chapter answers the research question, “Is there a framework for engineering
program management?” The first step is to finalize the comparison of existing approaches
in section 2.1.5 and determine whether there is an existing approach that could be used.
Three approaches that are well known (but not widely used in industry) and one new
approach were chosen for the comparison:
- Defense Acquisition System – United States Department of Defense
- Managing Successful Programmes – Government of Great Britain
- The Standard for Program Management – Project Management Institute
- The Lean Advancement Initiative (LAI) program management framework
Table 4-1 summarizes the comparison made in section 2.1.5. The table shows how the
different frameworks address the two major aspects of program management – cross-cutting
program activities, and activities of a technical program life cycle.
The Defense Acquisition System describes necessary steps and phases throughout the life
cycle of a defense acquisition program, a technical program. Hence, its rating for the life
cycle activities of a technical program is the second-best, while its cross-cutting program
activities are rated worst. The Managing Successful Programmes framework provides
knowledge for organizational change programs but does not consider activities of a program
life cycle. Consequently, its rating for the technical program life cycle activities is the worst,
and the cross-cutting activities are addressed very well. The Standard for Program
Management focuses neither on change programs nor on technical programs. Therefore, its
rating for the technical program life cycle activities is also low, while the cross-cutting
program activities are addressed very well. The LAI program management framework’s
development was based on research about technical programs. Hence, its rating for the
technical life cycle activities is very good. In addition, the LAI program management
framework covers most of the cross-cutting program activities and receives a good rating for
that.
Consequently, the LAI program management framework will be the framework used for
further research in this thesis. As stated in section 2.1, the second research question
addresses the existence of a framework for the management of technical programs. The
findings so far are that there is a framework, but since this framework is only described in an
internal, unpublished whitepaper, there is no empirical evidence for the correctness of the
framework. Hence, it is necessary to empirically validate the framework in this thesis to
ensure that it covers the diverse aspects of program management. The validation will be
described in detail in the next sections.
Table 4-1: Overall comparison of existing program management frameworks (see section 2.1.5)

Frameworks compared: Defense Acquisition System (see section 2.1.1); Managing
Successful Programmes (see section 2.1.2); The Standard for Program Management
(see section 2.1.3); LAI program management framework (see section 2.1.4).
Comparison topics: cross-cutting program activities; technical program life cycle
activities (without production)
4.1 Discussion of current state of program management in industry
One major goal of the workshop at the end of January with the industry focus group was to
determine the current state of program management in industry. In particular, the aspect of
the application of a framework was discussed in detail.
The first finding was that no company uses a program management framework provided in
current literature. None of the frameworks described in this thesis are being used, and they
are mostly not known in industry. The only framework that is partly recognized is the Defense
Acquisition System, because many of the companies are defense contractors and therefore
involved in the acquisition process. However, most companies (even those dealing with
defense acquisition) are not familiar with the framework itself and know only the interfaces
and types of reports necessary to interact with the governmental organizations involved in
defense acquisitions.
The organizations that published the different frameworks are well recognized in industry.
However, the companies of the industry focus group are most familiar with the Project
Management Institute because of its geographical focus on the United States and the
famous project management framework, “A Guide to the Project Management Body of
Knowledge (PMBOK Guide)” (PMI 2004). Most companies know the project management
framework PRINCE2 (OGC 2009), and hence they are aware of the Office of Government
Commerce. But very few of the companies know that both organizations have also published
program management frameworks.
Only one of the collaborating companies in the industry focus group uses a formal program
management framework. The framework consists of a set of best practices drawn from
different fields of expertise (e.g., risk management, program organization, etc.). The best
practices are assessed on a yearly basis to improve their quality and ensure that they
continue to be considered best practices. Most of these assessments are done internally
rather than by external consultants in audits. The company stated during the discussion
that the best practices are more focused on the execution of the program than on the
implementation of new organizational structures for a program.
A second company indicated that it uses formal guidelines for program management but has
not implemented an actual framework. The company emphasizes that the main aspect of the
guidelines is a focus on the value stream life cycle. Again, the guidelines are based on
identified best practices, which are updated regularly. Similarly to the first company, this
company’s guidelines focus not on program organization but on the execution of a
program.
A third company described its program managers as “small business owners” capable of
executing a program on their own. This company stated that there are program management
models used in their programs. However, these models or frameworks are not standardized,
but rather are created individually by the program managers.
None of the other participating companies were using any formal model in their program
management. Some implemented best practices to improve program performance but only
sporadically, not systematically.
Finally, all of the companies agreed that a formal program management framework might
help them understand the necessary processes of program management. In addition, a
formal model might improve program performance because companies would know about
best practices (if they are included in the framework) in different disciplines of program
management.
In summary, it can be stated that there is a need for a formal program management
framework. Existing publications seem not to fit companies’ needs, since they are not widely
implemented or even recognized. Consequently, a new approach to the design of a program
management framework is necessary. This approach should reflect current needs and
include companies during the development process to ensure the practicability of the
framework.
4.2 Validation of the LAI program management framework
As the comparison in section 2.1.5 showed, the LAI program management framework is a
promising approach in the context of technical programs. In addition, the framework has not
been published in final form, and during the workshop with the industry focus group it
became clear that industry requires a better-aligned approach. The LAI program
management framework was in a preliminary state before the study documented in this
thesis improved and validated it; in its development, this study made extensive use of
information gathered from industry.
To recapitulate the general structure of the LAI program management framework, it is shown
in Figure 4-1. The program management activities are structured in six main areas: Program
Execution; Enterprise Management; Scoping, Planning, and Contracting; Technology
Integration; Product Design; and Production, Use, Service, and Decommissioning. As the LAI
research focuses on the early phases of a system’s life cycle, the last phase of the
engineering life cycle – Production, Use, Service, and Decommissioning – is not covered in
the model.
This framework covers both cross-cutting program activities (the upper two categories:
Program Execution and Enterprise Management) and technical program life cycle activities
(the lower categories: Scoping, Planning, and Contracting; Technology Integration; Product
Design; and Production, Use, Service, and Decommissioning).
Scoping, Planning & Contracting
• Stakeholder needs & requirements definition
• Managing trade‐offs
• Cost estimation & life cycle costing
• Incentive alignment, contract negotiation & conclusion
• Defining lifecycle strategy
Technology Integration
• Technology maturation monitoring
• Technology transition management
Product Design
• PD team organization
• Integrated product and process development
• PD and SE best practices
• Monitoring PD progress
• Product architecting
• Value Stream Optimization
• Product design strategy
• Testing & prototyping
Production, Use, Service &
Decommissioning
Program Execution
• Integrating and leading the program organization
• Stakeholder management
• Progress monitoring and management
• Risk management
• Project management
• Multi‐project coordination
• Developing intellectual capital
• Program manager qualifications
Enterprise Management
• Building and transforming the program enterprise
• Understanding the program enterprise
• Learning and improving the program enterprise
• Knowledge management
• Secure IT information / document sharing in the enterprise
• Program portfolio and environment management
Figure 4-1: The Lean Advancement Initiative (LAI) program management framework
In order to refine and validate the structure of the framework, interviews with Air Force
program managers were conducted and insights from those interviews and from the industry
focus group were incorporated. These two research methods are described in section 4.2.1
(interviews with program managers) and section 4.2.2 (insights from the industry focus
group).
4.2.1 Insights from interviews with program managers
In order to validate the framework, the author conducted five interviews with government
program managers. All of them had at least five years of experience and an aerospace
defense background. The duration of each interview was between one and two hours. All of
the interviews were one-on-one.
In the interviews, two main subjects were addressed:
1. Framework validation: At the beginning of each interview, the author introduced the
interviewee to the current approach of the LAI framework. The author asked
questions regarding the general structure of the framework to find out if any
restructuring is necessary. Furthermore, the interviewees were asked if any elements
are missing from the framework. By posing general questions about the daily work of
a program manager, the author tried to discover additional activities and elements the
framework should cover.
All insights about the structure gained during these interviews were used to modify
the basic structure of the framework. The general findings were that some elements
needed to be added and some activities had to be restructured or their description
supplemented.
2. Identification of “biggest issues”: In addition to assessing the framework, the author
tried to identify the “biggest issues” in order to use them to guide subsequent
research in this study. By posing four questions for each framework part, the
necessary insights were gained. The author asked the program manager to answer
all questions based on his or her experiences with different programs. The first
question addressed whether the specific framework part was important in the
interviewee’s opinion. The second question dealt with the difficulty of execution of the
specific framework part. The third question assessed the extent to which the
framework part is addressed by best practices in current literature (including general
program management literature as well as documents published by the Department
of Defense). The last question asked if the best practices, if any, are feasible and can
be used in the daily work of a program manager.
All questions were formulated as statements. The interviewees could state that they “fully
disagree,” “fully agree,” or fall somewhere in between, using a five-point Likert-type scale
(Likert 1932). The rating “1” in the figures below represents “fully disagree,” while a “5” is
“fully agree.” Consequently, an average rating of a framework part higher than 3 implies that
the program managers agree with the statement that a part is important, difficult to execute,
well addressed by existing best practices, or addressed by feasible best practices (if any).
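The aggregation just described can be sketched in a few lines; the following is a minimal illustration, with made-up responses rather than the actual interview data:

```python
from statistics import mean

def aggregate(values):
    """Average a set of 1-5 Likert responses; 'agreement' means the
    average lies above the scale midpoint of 3."""
    avg = mean(values)
    return avg, avg > 3

# Illustrative responses only (each framework part received 3-5 responses).
avg, agree = aggregate([5, 5, 4, 5])
print(f"importance: {avg:.2f} (agree: {agree})")  # importance: 4.75 (agree: True)
```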
Figure 4-2 summarizes the results of the findings from these interviews regarding the
importance and the difficulty of execution of the framework parts, each of which belongs to
one of two themes, Program Execution and Enterprise Management. Since some
refinements were made later to the LAI program management framework, there are aspects
of the current version that were not assessed during the interviews. These additional
refinements were identified based on the interviews or on comments from the members of
the industry focus group.
Considering the two ratings (importance and difficulty) individually, one arrives at different
conclusions about some of the framework parts. The importance rating indicates whether a
part should be kept in the framework: a low rating implies that the part is less important
and could be removed, while a high rating indicates that it should remain part of the
framework. The difficulty rating gives guidance for possible future research; framework
parts with a high difficulty rating might be the focus of future research efforts. Whether a
framework part requires more detailed research also depends on the availability of best
practices and literature. If these are lacking, the part should be considered important for
future research that can improve the existing knowledge base.
In the framework themes Program Execution and Enterprise Management, there are no parts
which were identified as unimportant. Consequently, all current parts should be kept in the
framework. Furthermore, most framework parts in the two themes are considered difficult by
the program managers.
Figure 4-2: Interview results for Program Execution and Enterprise Management (average
importance and difficulty ratings on the scale from 1 to 5, n = 3-5, for the framework parts:
Integrating and leading the program organization; Stakeholder management; Progress
monitoring and management; Risk management; Project management; Multi-project
coordination; Developing intellectual capital; Building and transforming the program
enterprise; Understanding the program enterprise; Learning and improving in the program
enterprise)
Figure 4-3 shows the ratings for the framework parts of the theme Scoping, Planning, and
Contracting. These framework parts are considered even more important by the program
managers. In addition, these parts are perceived as being more difficult in their execution
than the ones of the themes of Program Execution and Enterprise Management.
Figure 4-3: Interview results for Scoping, Planning & Contracting (average importance and
difficulty ratings on the scale from 1 to 5, n = 3-5, for the framework parts: Definition of
stakeholder needs and requirements; Managing trade-offs; Cost estimation & life cycle
costing; Incentive alignment, contract negotiation & conclusion)
Figure 4-4 displays the ratings for the two parts of the LAI program management framework
that pertain to the theme Technology Integration. Both are rated important and of greater
than average difficulty to execute.
Figure 4-4: Interview results for Technology Integration (average importance and difficulty
ratings on the scale from 1 to 5, n = 3-5, for the framework parts: Technology maturation
monitoring; Technology transition management)
The last figure, Figure 4-5, shows the ratings for the theme Product Design. Again, all of
these framework parts should be kept in the framework, because they are considered
important (for each, an average response of at least “agree” was given by the program
managers to the statement that the framework part is important). The average rating of
difficulty in this theme is not as high as the ones in Scoping, Planning, and Contracting and
Technology Integration. But all parts are to some extent considered difficult to execute since
the rating is higher than “3” for each of them.
Figure 4-5: Interview results for Product Design (average importance and difficulty ratings
on the scale from 1 to 5, n = 3-5, for the framework parts: Product architecting; Value
stream optimization; Product design strategy; Testing & prototyping; PD and SE best
practices; PD team organization; Integrated product and process development; Monitoring
PD progress)
Figure 4-6 presents the average ratings regarding difficulty and importance for all framework
parts combined. The average importance rating of 4.72 indicates that all of the framework
parts of the LAI program management framework are relevant and should be kept in the
framework. Furthermore, the average difficulty rating of 4.18 shows that the general
perception of program management is that it is very hard to execute properly.
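As a sketch of how these summary figures are obtained (per-part average ratings pooled into an overall mean and standard deviation), with illustrative numbers rather than the actual ratings:

```python
from statistics import mean, stdev

def overall(per_part_averages):
    """Pool per-part average ratings into an overall mean and standard
    deviation, as plotted in Figure 4-6."""
    return mean(per_part_averages), stdev(per_part_averages)

# Illustrative per-part importance averages, not the actual interview data.
importance = [5.0, 4.8, 4.6, 4.7, 4.5]
m, s = overall(importance)
print(f"importance: {m:.2f} (s.d. {s:.2f})")
```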
Figure 4-6: Average rating of all framework parts combined (including standard deviation)
In Figure 4-7, the interview results for the most relevant parts of program management are
visualized (the legend is shown in Figure 4-8 to improve the readability of the text in the
figures). Each circle represents one part of the framework. The two axes display the
importance and the difficulty of the specific part. The size of each circle corresponds to the
level at which the part is addressed by feasible best practices; because two different
conditions were assessed in relation to best practices (existence and feasibility), the circle
size reflects the arithmetic average of these two ratings. Only the 11 framework parts with
the highest combined ratings for difficulty and importance are displayed, since including all
parts would make the figure difficult to read.
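The construction of the figure’s inputs can be sketched as follows; the selection rule and circle-size averaging follow the description above, while the data and part names here are purely illustrative:

```python
def bubble(part):
    """Return (importance, difficulty, circle_size) for one framework part;
    the circle size averages the two best-practice ratings
    (existence and feasibility)."""
    size = (part["bp_exist"] + part["bp_feasible"]) / 2
    return part["importance"], part["difficulty"], size

def most_relevant(parts, top_n=11):
    """Keep the top_n parts by combined importance + difficulty rating."""
    ranked = sorted(parts,
                    key=lambda p: p["importance"] + p["difficulty"],
                    reverse=True)
    return ranked[:top_n]

# Illustrative data only, not the actual interview results.
parts = [
    {"name": "Risk management", "importance": 4.8, "difficulty": 4.6,
     "bp_exist": 3.0, "bp_feasible": 2.5},
    {"name": "Monitoring PD progress", "importance": 4.0,
     "difficulty": 3.3, "bp_exist": 4.0, "bp_feasible": 3.5},
]
for p in most_relevant(parts, top_n=1):
    print(p["name"], bubble(p))
```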
Figure 4-7: Difficulty and importance ratings of the most relevant elements of the LAI
program management framework (parts shown: Product architecting; Incentive alignment,
contract negotiation & conclusion; Cost estimation & life cycle costing; Definition of
stakeholder needs and requirements; Learning and improving in the program enterprise;
Project management; Risk management; Stakeholder management; Technology transition
management; Testing & prototyping; Product design strategy)
Figure 4-8: Legend for Figure 4-7 (circle size: level of being addressed by existing best
practices, from 1 to 5; framework themes: Program Execution, Enterprise Management,
Scoping, Planning & Contracting, Technology Integration, Product Design; n = 3-5)
In the interviews, in addition to being asked questions relevant to validation of the framework
and the rating of its parts, the program managers were asked for individual feedback
regarding the general structure of the LAI program management framework and the different
framework parts. Based on this feedback, some definitions were rephrased and the general
structure was adjusted.
4.2.2 Insights from the industry focus group
The work experience of the industry focus group was drawn upon in two ways:
- The members of the industry focus group were asked for individual feedback on the
LAI program management framework. The version current at that time was sent to
the members via email, and feedback was requested.
- The members of the industry focus group were asked to participate in a survey on the
LAI program management framework. The major aim was to gain a better
understanding of their assessment of the relative importance of the different
framework themes by having them rate each theme.
The feedback from individual members of the industry focus group regarding the LAI
program management framework mostly arrived in annotated Microsoft Word documents.
The members commented on the definitions of the framework parts and the general structure
of the framework. Due to their input, some framework parts received refinements in their
definitions, their names, or their places in the framework. In addition, the members were
asked to report whether any aspects of the framework were illogical and whether the
framework was missing any aspects.
Shortly before the meeting with the industry focus group at the end of January, a survey
was conducted in which the members rated the importance of the different framework
themes for future research. This enabled the author and the other members of the LAI to
be better prepared for the workshop and for questions about specific framework themes.
Figure 4-9 shows the
results of this survey. The members ranked the framework theme Program Execution as the
most important one. The themes Scoping, Planning, and Contracting and Enterprise
Management were ranked second and third in importance.
The finding that Program Execution was considered most important matches the findings of
the literature review and the current state of the resources available to the industry. The only
framework that currently is widely used or known – the Defense Acquisition System of the
United States Department of Defense (see section 2.1.1 for detailed description) – does not
address the theme of Program Execution very well. Consequently, there is a lack of
knowledge in this area. In addition, the members of the industry focus group stated during
the workshop that they struggle with requirement definition. Since dealing with requirements
is a major part of the framework theme Scoping, Planning, and Contracting, it is logical that
that theme was also rated as being very important. Finally, the theme Enterprise
Management is also inadequately addressed by the only widely used framework and also
ranked high in the survey.
Figure 4-9: Rating of the importance of framework themes by members of the industry focus
group (n = 19). Survey question: “Please rank the program management categories
regarding their importance for future research (5 – high to 1 – low).” Average ratings:
Program Execution 3.67; Scoping, Planning & Contracting 3.32; Enterprise Management
3.05; Product Design 2.61; Technology Integration 2.37.
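The survey aggregation can be sketched as follows: each respondent ranks the five themes from 5 (high) down to 1 (low), and the reported score per theme is the mean rank across respondents. The two sample responses below are invented; the real survey had n = 19.

```python
from statistics import mean

def theme_scores(rankings):
    """rankings: one dict per respondent, mapping theme -> rank (1-5).
    Returns (theme, mean rank) pairs sorted highest first."""
    themes = rankings[0].keys()
    scores = {t: mean(r[t] for r in rankings) for t in themes}
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Two invented responses for illustration.
rankings = [
    {"Program Execution": 5, "Scoping, Planning & Contracting": 4,
     "Enterprise Management": 3, "Product Design": 2,
     "Technology Integration": 1},
    {"Program Execution": 4, "Scoping, Planning & Contracting": 3,
     "Enterprise Management": 5, "Product Design": 2,
     "Technology Integration": 1},
]
for theme, score in theme_scores(rankings):
    print(f"{theme}: {score:.2f}")
```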
4.3 Summary
The primary objective of this chapter was to answer the research question, “Is there a
framework for engineering program management?” By comparing the existing
approaches, it became clear that the LAI program management framework is the most
promising approach for fully covering all aspects of the management of a technical program.
One disadvantage of this framework was that it had not been validated through empirical
analysis. Consequently, the author decided to validate the structure and the elements of the
framework before using it. By conducting interviews with experienced Air Force program
managers, meaningful insights could be gained about the degree of correctness of the
framework. These findings were used for refining and validating the framework. Additional
information about the importance of different framework parts and the difficulty of executing
them was gathered and may be used to determine the direction of future research in program
management.
The knowledge of the industry focus group was drawn upon in two additional ways besides the
interviews. First, the members of the group were asked for individual feedback on the LAI
program management framework. Second, the members participated in a survey to rate the
importance of the different framework themes; this survey was conducted in order to improve
the contents of the workshop and adjust the direction of future research.
In summary, all feedback was considered in creating the current version of the LAI program
management framework. In addition, all members of the industry focus group and the program
managers approved the structure and the definitions of the framework parts of the current
version of the framework. Therefore, the LAI program management framework can be
considered to have been validated as suitable for technical programs. Consequently, the
answer to the research question is that there is a framework for technical programs. The LAI
program management framework meets all requirements for fully addressing the
management of technical programs and has been refined and validated via the feedback of
a large group of experts in this discipline.
5 Common issues and pitfalls in program management
This chapter answers the research question, “What are common pitfalls in program
management?” The first step is the identification of pitfalls in program management via the
current literature (see section 5.1). The second step is the verification of the findings (see
section 5.2). The final step is the mapping of the pitfalls to the LAI program management
framework (see section 5.3). This step is necessary for later analyses in this thesis.
5.1 Identification of common program management pitfalls
There are a number of sources that are highly useful for identifying common issues in
program management. Most of these sources fall into one of three major groups:
1. Government reports: One very helpful source for identifying challenges, especially in
the management of defense acquisition programs (i.e., programs within the Defense
Acquisition System of the United States Department of Defense; see section 2.1.1
for a detailed description), is the set of reports published annually by the
Government Accountability Office (GAO). These reports assess all major acquisition
programs (ACAT I programs and programs designated major acquisition programs for
other reasons). The findings are summarized, together with possible actions to
improve the performance of these programs.
2. General program management literature: Another source for common challenges is
general program management literature and literature that focuses on topics within
program management. The LAI program management framework is based on
publications from the LAI, and the vast majority of these publications give useful
insights into topics in program management. The findings of many of these papers,
theses, etc., can be used to identify common challenges in program management. In
addition, the other, better-known frameworks claim to be based on known best
practices that help avoid common pitfalls. Consequently, the literature on these
frameworks can also be used to identify common pitfalls.
3. Discussions with experts: The workshop with the industry focus group at the end of
January was used for conducting a first effort to collect data on pitfalls experienced in
program management. The participating members were grouped in small teams
(between four and six persons per team) and asked to compile their insights into
challenges and issues in program management. In addition, two Air Force program
managers were interviewed to learn their opinions about possible reasons for failures
of programs. These interviews were divided into two parts. First, a general discussion
about their experiences with failures was conducted to gain uninfluenced insights.
Second, the list of pitfalls that had been identified by this study up to that point from
the current literature was used to help the program managers remember more
pitfalls and frame their experiences within the existing categories.
Table 5-1 shows the full set of pitfalls in program management derived from the three types
of sources. For easier identification, a unique ID number was assigned to each pitfall. The
table also indicates which sources state that the identified pitfall has a major impact on
program management or on one of the processes in program management.
The sources might be literature (different types of publications; e.g., theses, papers, books,
etc.), the workshop with the industry focus group, or the interviews with the Air Force
program managers.
Table 5-1: Compilation of program management pitfalls and corresponding sources
ID Pitfall Sources
1 Unstable funding (undermining program
stability)
(Mandelbaum, Kaplan et al.
2001; Kirtley 2002; Shroyer
2002; Ferdowsi 2003; Wirthlin
2009; McKnew 2010; Kinscher
2011; LAI 2011)
2 Long waiting time for external stakeholders
(e.g., Acquisition Panel) activities
(Ippolito 2000; Wirthlin 2009)
3 “Firefighting”
(Morgan 1999; CMU 2008;
Levine and Novak 2008;
McKnew 2010; Kinscher 2011)
4 Lack of leadership commitment to cycle
time reduction
(McNutt 1998)
5 No incentives for cycle time reduction (Cowap 1998; McNutt 1998;
Stanke 2006)
6 Overriding influence of funding-related
constraints
(McNutt 1998)
7 Wrong allocation of responsibilities and
decision-making rights
(Klein, Cutcher-Gershenfeld et
al. 1997; McKenna 2006;
Stanke 2006)
8 Lack of coordination of communication (Cohen 2005; McKenna 2006)
9 Lack of process standardization (Cohen 2005; Bador 2007)
10 Poor communication with stakeholders
(Lucas 1996; Rebentisch 1996;
Dare 2003; Roberts 2003;
Gordon, Musso et al. 2009;
McKnew 2010)
11 No realistic program schedule (Gordon, Musso et al. 2009;
Lamb 2009; LAI 2011)
12 Lack of metrics
(Cunningham 1998; Beckert
2000; Stagney 2003; Cohen
2005; McKenna 2006; Stanke
2006; Bador 2007; Boehm,
Valerdi et al. 2008; Rhodes,
Valerdi et al. 2009; Roedler,
Rhodes et al. 2010)
12.1 No metrics to reflect cross-functional
processes
(Ittner and Larcker 1998;
Hammer, Haney et al. 2007;
Blackburn 2009; Rhodes,
Valerdi et al. 2009; Roedler,
Rhodes et al. 2010)
12.2
No process implemented to measure
project performance or project
progress (e.g. EVM)
(Whitaker 2005)
13
Wrong metrics: Not using the right measure
(ignoring something important) or choosing
metrics that are wrong
(Schmenner and Vollmann
1994; Hauser and Katz 1998;
McGarry and Card 2002;
Blackburn 2009) (Klein,
Cutcher-Gershenfeld et al.
1997; Cunningham 1998;
Beckert 2000; Stagney 2003;
Boehm, Valerdi et al. 2008;
Rhodes, Valerdi et al. 2009;
Office 2010; Roedler, Rhodes
et al. 2010)
13.1
Implementing metrics that focus on
short-term results, or that do not give
thought to the consequences for
human behavior and enterprise
performance
(Schmenner and Vollmann
1994; Kerr 1995; Hauser and
Katz 1998; Ittner and Larcker
1998; McGarry and Card 2002;
Hammer, Haney et al. 2007;
Blackburn 2009)
13.2
Metrics that are not actionable or are
hard for a team/group to impact, or
collecting too much data
(Hauser and Katz 1998; Ittner
and Larcker 1998; McGarry
and Card 2002; Blackburn
2009)
14 No activity-based costing and management (Paduano 2001)
15 No defined risk management process
(Browning, Deyst et al. 2002;
Oehmen 2005; Bresnahan
2006; Stanke 2006; Valerdi,
Rieff et al. 2007; Wagner 2007;
Wirthlin, Seering et al. 2008;
Madachy and Valerdi 2010;
McKnew 2010; Oehmen and
Ben-Daya 2010; LAI 2011;
Oehmen and Seering 2011)
16 Lacking ability to understand
uncertainty/risk
(Ferns 1991; Cunningham
1998; Browning, Deyst et al.
2002; Oehmen 2005; Valerdi
2005; Bresnahan 2006;
McKenna 2006; Stanke 2006;
Wagner 2007; Wirthlin, Seering
et al. 2008; Ahern 2009;
America 2009; Rhodes, Valerdi
et al. 2009; Francis, Golden et
al. 2010; McKnew 2010;
Oehmen and Rebentisch 2010;
Roedler, Rhodes et al. 2010;
LAI 2011; Oehmen and
Seering 2011)
17 Lacking the staff/capacity to deal with
uncertainty/risk
(Ferns 1991; McKenna 2006;
Stanke 2006; Wirthlin, Seering
et al. 2008; Ahern 2009;
America 2009; (AT&L) 2010;
Francis, Golden et al. 2010;
McKnew 2010; Oehmen and
Rebentisch 2010; Roedler,
Rhodes et al. 2010; LAI 2011)
18 Not all staff is involved in risk
management
(Bresnahan 2006)
19 Insufficient management of sub-projects (PMI 2004; PMI 2008; OGC
2009)
20 Competing resource requirements (e.g.
allocation and choice of resources)
(McKenna 2006)
21 Too much overtime (Herweg and Pilon 2001)
22 Overly unstable project priorities (Herweg and Pilon 2001)
23 Too much outsourcing (Herweg and Pilon 2001)
24 Unrealistic plan or no plan for ramp-up and
ramp-down regarding staffing
(Herweg and Pilon 2001;
Kinscher 2011)
25 Troubled projects are not canceled early (Herweg and Pilon 2001)
26 No buffer scheduled between projects (Herweg and Pilon 2001)
27 Inadequate identification of individual skill
development
(Hernandez 1995; Cohen
2005; Davidz 2006; Kinscher
2011)
28 Unsupportive environment for experiential
and general learning
(Browning 1996; Klein,
Cutcher-Gershenfeld et al.
1997; Forseth 2002; Cohen
2005; Davidz 2006; Lamb
2009)
29 No or insufficient assessment of intellectual
capital base regarding program needs
(Siegel 2004)
30 No specialist career path
(Wheelwright and Clark 1992;
Forseth 2002; Cohen 2005;
Kinscher 2011)
31 Inadequate team experience
(Susman and Petrick 1995;
Klein, Cutcher-Gershenfeld et
al. 1997; Valerdi 2005; Stanke
2006; Valerdi, Rieff et al. 2007;
Lamb 2009; LAI 2011)
32 Insufficient program manager qualifications
(Ferns 1991; Wheelwright and
Clark 1992; Klein, Cutcher-
Gershenfeld et al. 1997; Lycett,
Rassau et al. 2004; Kinscher
2011)
33
Lack of enterprise-wide coordination of
optimization: only local processes and
organizational optimization
(LAI 2001; Nightingale and
Mize 2001; Jobo 2003; Davidz
2006; Stanke 2006; Stanke,
Nightingale et al. 2008; LAI
2011)
34 Lack of performance incentives for staff
(Susman and Petrick 1995;
Klein, Cutcher-Gershenfeld et
al. 1997; Cowap 1998; Cohen
2005; Stanke 2006)
35 Wrong performance incentives for staff
(Susman and Petrick 1995;
Klein, Cutcher-Gershenfeld et
al. 1997; Cowap 1998; Cohen
2005; Stanke 2006)
36 No utilization of a social network (Stanke 2006)
37 Too little customer and stakeholder
interaction
(Downen 2005; Glazner 2006;
Stanke 2006)
38 Too little integration of suppliers
(Lucas 1996; Rebentisch 1996;
Bozdogan, Deyst et al. 1998;
Ippolito 2000; Glazner 2006)
39 No fostering and maintaining of personal
accountability for plans and outcomes
(Klein, Cutcher-Gershenfeld et
al. 1997; Stanke 2006)
40 Understaffing
(Ahern 2009; (AT&L) 2010;
McKnew 2010; Office 2010;
Roedler, Rhodes et al. 2010;
LAI 2011)
41 Insufficient resource planning (no
identification of possible understaffing)
(Ahern 2009; (AT&L) 2010;
McKnew 2010; Office 2010;
Roedler, Rhodes et al. 2010;
LAI 2011)
42
Insufficient use of benchmarking and
assessment tools for evaluation of
enterprise structure
(LAI 2001; Nightingale and
Mize 2001)
43 No enterprise-wide integrated continuous
improvement process
(Imai 1986; Bessant, Caffyn et
al. 1994; Laraia, Moody et al.
1999; Maass and McNair 2009)
44
Insufficient use of benchmarking and
assessment tools to identify improvement
potentials
(Constantinescu and Iacob
2007; Nightingale, Stanke et al.
2008; Stanke, Nightingale et al.
2008)
45 No enterprise-wide organizational learning
and change management plan
(Kotter 1990; Klein, Cutcher-
Gershenfeld et al. 1997; Roth
2008)
46 No open information sharing
(Klein, Cutcher-Gershenfeld et
al. 1997; Bernstein 2000;
Stanke 2006)
47 No documentation of lessons learned (Klein, Cutcher-Gershenfeld et
al. 1997; Cunningham 1998)
48 Insufficient or non-standardized usage of
information technology
(Browning 1996; Rebentisch
1996)
49 Insufficient information flow (Bozdogan, Deyst et al. 1998;
Pomponi 1998; Ippolito 2000)
50 Unclear requirement definition
(Ferns 1991; Walton 1999;
Wirthlin 2000; Dare 2003;
Derleth 2003; Tondreault 2003;
Kadish, Abbott et al. 2006;
McKenna 2006; Cameron,
Crawley et al. 2008; Mahé
2008; Ahern 2009; Francis
2009; Andrews, Cooper et al.
2010; Dickerson and Valerdi
2010; Francis, Golden et al.
2010; McKnew 2010; Office
2010; LAI 2011)
51 No understanding of stakeholder needs
(Ferns 1991; Cunningham
1998; Ferdowsi 2003;
Tondreault 2003; Valerdi 2005;
Kadish, Abbott et al. 2006;
Loureiro, Crawley et al. 2006;
Valerdi, Rieff et al. 2007;
Cameron, Crawley et al. 2008;
Mahé 2008; Ahern 2009;
Francis 2009; Andrews,
Cooper et al. 2010; Francis,
Golden et al. 2010; McKnew
2010; LAI 2011)
52 No learning from previous need definitions (Downen 2005; Mahé 2008;
Office 2010)
53 Insufficient multi-attribute trade-
offs/tradespace exploration
(Derleth 2003; Roberts 2003;
Ross 2003; Spaulding 2003;
Tang 2006)
54 Lack of requirement understanding (Valerdi 2005; Valerdi, Rieff et
al. 2007; Valerdi 2010)
55 Lack of life cycle documentation (Valerdi 2005; Valerdi, Rieff et
al. 2007; Valerdi 2010)
56 Insufficient probabilistic cost estimates (Dorey 2011; LAI 2011)
57 Too little updating on estimated costs during
early phases
(Valerdi 2010)
58 Cost estimates do not reflect all aspects of
the life cycle
(Silva 2001; Tondreault 2003)
59 Imprecise or unclear contract terms
(Ferns 1991; Kadish, Abbott et
al. 2006; McKenna 2006;
Ahern 2009; Andrews, Cooper
et al. 2010)
60 Ill-designed contract scope (Ferns 1991; Kinscher 2011)
61 Unclear award criteria and process (Kinscher 2011)
62 Lack of incentives
(Cowap 1998; Mandelbaum,
Kaplan et al. 2001; Kirtley
2002; McVey 2002; Cohen
2005; McKenna 2006; Stanke
2006)
63 Lack of incentive transparency
(Cowap 1998; Mandelbaum,
Kaplan et al. 2001; Kirtley
2002; McKenna 2006; Ahern
2009; (AT&L) 2010)
64 Mismatch of incentive with desired outcome
(Cowap 1998; Mandelbaum,
Kaplan et al. 2001; Kirtley
2002; McVey 2002; McKenna
2006; Ahern 2009; (AT&L)
2010; McKnew 2010)
65 No standard structure for (sub-)contracts (Stanke 2006)
66 Mismatch of contract type with risk profile of
program
(Kinscher 2011)
67 Missing parts in the contract (e.g., include
adaptability)
(Dare 2003; Kinscher 2011)
68 Too much granting of waivers (Cowap 1998)
69 Insufficient probabilistic cost estimates in
contracting
(Dorey 2011)
70 No process implemented to assess
technology maturation
(Kenley and Creque 1999;
Francis 2009; Rhodes, Valerdi
et al. 2009; Tang and Otto
2009; Francis, Golden et al.
2010; McKnew 2010; Office
2010; Roedler, Rhodes et al.
2010)
71 Use of immature technology (Baccarini 1996; Office 2010;
Kinscher 2011)
72 No established technology insertion process
(Shroyer 2002; Francis 2009;
Francis, Golden et al. 2010;
McKnew 2010)
73 No person/team in charge to manage and
monitor technology transition
(Pomponi 1998; Shroyer 2002;
Francis 2009; Francis, Golden
et al. 2010; McKnew 2010)
74 No formal reviews and communication plans
for technology transition
(Shroyer 2002)
75 No diverse learning strategies (Bresman 2004)
76 Misalignment between team goals and
program goals
(Ferns 1991; Wheelwright and
Clark 1992; Susman and
Petrick 1995)
77 Lack of skill and functional diversity within
team
(Wheelwright and Clark 1992;
Hernandez 1995; Susman and
Petrick 1995; Browning 1996;
Bozdogan, Deyst et al. 1998;
Pomponi 1998; Oehmen 2005)
78 No established flow to and from IPT level (Pomponi 1998)
79
No balance between teams and functions
(only applies to programs with matrix
organizations)
(Ferns 1991; Wheelwright and
Clark 1992; Hernandez 1995)
80
System architecture does not support
product development process or IPTs
(complex organizations often give rise to
overcomplicated system designs)
(Lucas 1996; LAI 2011)
81 Ignoring the aspect of standardization
(Beckert 2000; Nuffort 2001;
Silva 2001; Bador 2007; Boas
2008; Mahé 2008; LAI 2011)
82 Ignoring the aspect of reusability (Nuffort 2001; Bador 2007;
Boas 2008; LAI 2011)
83 Ignoring the aspect of modularity (Nuffort 2001; Bador 2007; LAI
2011)
84 Ignoring the aspect of supportability (Kinscher 2011)
85 Ignoring the aspect of maintainability (Kinscher 2011)
86 Ignoring the aspect of usability (Kinscher 2011)
87 Lack of understanding of what waste is (Kato 2005; McManus 2005;
Pessôa 2008; Oehmen and
Rebentisch 2010)
88 Lack of understanding of how to deal with
different types of waste or waste in general
(Kato 2005; McManus 2005;
Pessôa 2008; Oehmen and
Rebentisch 2010)
89 No understanding of current vs. preferred
Value Stream
(Kato 2005; McManus 2005;
Oehmen and Rebentisch 2010)
90 No mechanism for Value Stream
improvements
(McManus 2005)
91 Mismatch between program characteristics
and chosen development process
(Bernstein 1998; Ferdowsi
2003; Roberts 2003; Stagney
2003; Tondreault 2003)
92 Finishing engineering and 3D drawings too
late
(Office 2010)
93 Insufficient exploration of alternative
solutions
(Bernstein 1998)
94 Wrong test approaches/frameworks
(Weigel 2000; Carreras 2002;
Deonandan, Valerdi et al.
2010; Hess, Agarwal et al.
2010; Hess and Valerdi 2010;
Office 2010)
95 No balance regarding amount of testing (too
much or too little)
(Office 2010; Kinscher 2011)
In total, 99 different pitfalls were identified using the three types of sources. The pitfalls differ
considerably in how many sources support them. Figure 5-1 shows the distribution of the
number of sources across all pitfalls. A large share of the pitfalls (28) are each based on only
one source, while seven pitfalls are supported by more than 10 sources each; overall, the
distribution follows a decreasing trend. It is reasonable to infer that the issues that are
widespread among different sources are important. The many pitfalls that are rarely
mentioned in the literature and were rarely raised in the discussions held as part of this
study, on the other hand, may be unimportant, or they may be important but relatively
unrecognized.
[Figure: bar chart of the number of pitfalls (y-axis) over the number of sources per pitfall (x-axis)]
Figure 5-1: Distribution of the number of sources for all pitfalls
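The tally behind Figure 5-1 is straightforward to reproduce; a minimal Python sketch, using a handful of entries transcribed from Table 5-1 (the full table lists 99 pitfalls):

```python
from collections import Counter

# A few entries from Table 5-1: pitfall ID -> number of supporting sources
# (e.g., pitfall 1 "Unstable funding" cites 8 sources, pitfall 4 only 1).
sources_per_pitfall = {1: 8, 2: 2, 3: 5, 4: 1, 10: 6, 16: 19, 25: 1, 28: 6}

# Distribution as in Figure 5-1: how many pitfalls have k sources each?
distribution = Counter(sources_per_pitfall.values())
for k in sorted(distribution):
    print(f"{distribution[k]} pitfall(s) supported by {k} source(s)")
```

Applied to the full table, the same tally reproduces the figure's decreasing trend, from 28 single-source pitfalls down to the seven pitfalls with more than 10 sources each.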
5.2 Verification of the pitfalls through interviews and the industry focus group
Two different approaches were used to verify the findings presented in Table 5-1:
1. Interviews with Air Force program managers
2. Insights from the members of the industry focus group (telephone interviews and
workshop)
In two separate interviews with Air Force program managers, the findings from the literature
were discussed. The interviews lasted between one and a half and two hours. The first part
of each interview was used to gain a general overview of the interviewee's opinions about
pitfalls in program management; this part was based on a free discussion guided by a few
starting questions. During the second part, the list of pitfalls was discussed and every single
pitfall was analyzed in detail together with the program managers. The author assisted
whenever a pitfall was hard to understand; such difficulties often indicated that the phrasing
of the pitfall was not precise enough. Consequently, some of the pitfalls were rephrased
together with the program managers.
During these discussions it often became clear that there might be additional pitfalls.
Whenever a program manager stated that a missing pitfall was essential for a comprehensive
collection, the pitfall was added to the list. It is outside the objectives of this thesis to
inventory every possible pitfall in program management. Several discussions about specific
pitfalls indicated that some pitfalls on the list could be subdivided into several pitfalls at a
higher level of detail. The author and the program managers tried to provide the level of
detail that was most useful for the purposes of this study.
The second approach to verify the pitfalls was based on the insights of the industry focus
group. First, the workshop with the industry focus group at the end of January was used to
collect data on program management pitfalls experienced by the members of the group. The
participants were grouped in small teams (four to six persons per group) and asked to
compile their insights into challenges and issues in program management. This data was
compared to the findings from the literature. All identified pitfalls from the workshop were
already in the list and no further insights could be added. This indicates that some of the
pitfalls from the literature can also be found in real programs. Consequently, some of the
pitfalls in the list can be considered to be verified by practical experience.
In addition, the list of pitfalls was further discussed with members of the industry focus group
in a telephone conference. All participants in the conference received the list via email in
advance and were asked to be prepared to provide feedback. During the telephone
conference, the participants discussed the organizational structure of the list of pitfalls and
their experiences with them.
In summary, the members of the industry focus group participating in the telephone
conference as well as the program managers who were interviewed stated that the final
version of the list of program management pitfalls is a complete inventory of common pitfalls
occurring in real programs.
5.3 Mapping pitfalls to program management framework
All identified program management pitfalls correspond with at least one of the framework
parts of the LAI program management framework (otherwise, aspects would be missing from
the framework). A pitfall might relate to multiple framework parts, but it is most practicable to
assign a single framework part to each pitfall. For each pitfall there is one framework part
that corresponds most closely with it; the discipline in which the pitfall originates (e.g., risk
management) gives a good indication of which framework part might fit. On this basis, the
author identified the framework part that applies best to each pitfall.
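This one-to-one assignment amounts to a lookup table from pitfall ID to framework part; a minimal Python sketch with a few entries taken from Table 5-2:

```python
# Pitfall ID -> best-fitting framework part (entries from Table 5-2).
pitfall_to_part = {
    10: "Stakeholder management",
    15: "Risk management",
    19: "Project Management",
    32: "Program manager characteristics",
}

def framework_part(pitfall_id: int) -> str:
    """Return the framework part assigned to a pitfall, if mapped."""
    return pitfall_to_part.get(pitfall_id, "unmapped")

print(framework_part(15))  # Risk management
```

The same structure, filled with all 99 entries from Tables 5-2 through 5-6, is what the later analyses in this thesis operate on.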
Table 5-2 shows the results for the framework parts belonging to the Program Execution
area of the framework. The ID numbers given to the pitfalls in Table 5-1 are used again to
show the results of the mapping. On the left side of the table the name of the framework part
is shown, while the right column shows which pitfalls fit best into the framework part. Some
parts of the framework do not have any assigned pitfalls. These are not shown in the
following tables. Furthermore, the number of assigned pitfalls per framework part varies. This
can be explained by the fact that there are different levels of research – and therefore
different numbers of pitfalls which can be identified – in the different framework parts.
Table 5-2: Mapping of pitfalls to the Program Execution parts of the LAI program management
framework
LAI program management framework
part within Program Execution Corresponding pitfall IDs
Integrating and leading the program
organization
1-9
Stakeholder management 10
Progress monitoring and management 11-14
Risk management 15-18
Project Management 19
Multi-project coordination 20-26
Developing Intellectual Capital 27-31
Program manager characteristics 32
Table 5-3 shows the results of the mapping of the framework parts belonging to Enterprise
Management. Not all framework parts in this area have pitfalls allocated. The part
“Understanding the program enterprise” seems to be underrepresented compared to the
number of pitfalls in the other parts in this area.
Table 5-3: Mapping of pitfalls to the Enterprise Management parts of the LAI program
management framework
LAI program management framework
part within Enterprise Management Corresponding pitfall IDs
Building and transforming the program
enterprise
33-41
Understanding the program enterprise 42
Learning and improvement of the program
enterprise
43-45
Knowledge management 46-49
Table 5-4 displays the results of the mapping of the framework parts belonging to Scoping,
Planning and Contracting in the LAI program management framework. It is clear that the
framework part “Incentive alignment, contract negotiation, and conclusion” is especially
difficult, since 11 pitfalls are allocated to it. On the other hand, the framework part “Managing
trade-offs” seems to be relatively easy—or there might not be enough research on that part
to identify further pitfalls.
Table 5-4: Mapping of program management pitfalls to the Scoping, Planning & Contracting
framework parts of the LAI program management framework
LAI program management framework
part with Scoping, Planning, and
Contracting
Corresponding pitfall IDs
Definition of stakeholder needs and
requirements
50-52
Managing trade-offs 53
Cost estimation and life cycle costing 54-58
Incentive alignment, contract negotiation,
and conclusion
59-69
Table 5-5 shows the pitfalls in the area of Technology Integration. Only five pitfalls are
allocated to the two parts of this area.
Table 5-5: Mapping of pitfalls to the Technology Integration parts of the LAI program
management framework
LAI program management framework
part within Technology Integration Corresponding pitfall IDs
Technology maturation monitoring 70-71
Technology transition management 72-74
The last table in this series, Table 5-6, presents the pitfalls in the area of Product Design.
The framework part “Monitor PD progress” has the same pitfalls as the framework part
“Progress monitoring and management” in the area of Program Execution; these two
framework parts address nearly the same activities and are closely related but have different
scopes.
Table 5-6: Mapping of pitfalls to the Product Design parts of the LAI program management
framework
LAI program management framework
part within Product Design Corresponding pitfall IDs
PD team organization 75-77
Integrated product and process development 78-80
Monitor PD progress 11-14
Product Architecting 81-86
Value Stream Optimization 87-90
Product Design Strategy 91-93
Test & Prototyping 94-95
Figure 5-2 shows how many pitfalls are assigned to each framework part. Some framework
parts cover only very few pitfalls, while others cover as many as 11.
[Figure: bar chart of the number of framework parts (y-axis) over the number of pitfalls assigned per framework part (x-axis)]
Figure 5-2: Distribution of number of pitfalls per framework part
Another meaningful insight can be gained by showing the distribution of the 99 pitfalls among
the five major areas within the LAI program management framework; see Figure 5-3. The
numbers of pitfalls correspond very well with the ratings of the members of the industry focus
group shown in Figure 4-9. The members were asked to rate the major framework areas in
terms of the need for future research. Except for the area Product Design, the numbers of
identified pitfalls within the framework areas correspond with the ratings from the survey.
With 36 pitfalls, Program Execution has the highest number, and it was also rated as most in
need of further research. The other areas also “rank” the same in number of pitfalls as they
do in the survey. The areas with high numbers of pitfalls and therefore high chances of
negatively affecting program performance are perceived as not well covered by current
research.
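The area totals in Figure 5-3 can be reproduced by summing the per-part pitfall counts from Tables 5-2 through 5-6. A minimal Python sketch (sub-IDs 12.1, 12.2, 13.1, and 13.2 count as separate pitfalls, and the range 11-14 shared with "Monitor PD progress" is counted once, under Program Execution, so that the totals sum to 99):

```python
# Pitfall counts per framework part, summed per major area (Tables 5-2 to 5-6).
area_counts = {
    "Program Execution": 9 + 1 + 8 + 4 + 1 + 7 + 5 + 1,  # Table 5-2 (IDs 1-32 incl. sub-IDs)
    "Enterprise Management": 9 + 1 + 3 + 4,              # Table 5-3 (IDs 33-49)
    "Scoping, Planning & Contracting": 3 + 1 + 5 + 11,   # Table 5-4 (IDs 50-69)
    "Technology Integration": 2 + 3,                     # Table 5-5 (IDs 70-74)
    "Product Design": 3 + 3 + 6 + 4 + 3 + 2,             # Table 5-6 (excl. shared 11-14)
}

for area, n in sorted(area_counts.items(), key=lambda kv: -kv[1]):
    print(f"{area}: {n} pitfalls")
print(f"Total: {sum(area_counts.values())} pitfalls")
```

Summed this way, the counts match both the prose above (36 for Program Execution) and the total of 99 identified pitfalls.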
[Figure: bar chart of pitfall counts per framework area. Program Execution: 36, Product Design: 21, Scoping, Planning & Contracting: 20, Enterprise Management: 17, Technology Integration: 5; Total: 99]
Figure 5-3: Number of pitfalls for the major areas within the LAI program management
framework
5.4 Summary
This chapter was intended to answer the research question, “Are there common pitfalls in
program management?”
The findings in this chapter show that there are indeed common issues and pitfalls in
program management. Table 5-1 is based on conclusions drawn from the current literature.
Two different types of literature are covered by the comprehensive list. First, publications
pertaining to the performance of Defense Acquisition System programs were reviewed.
These are performance assessments carried out by the Government Accountability Office
(GAO), an independent governmental organization that evaluates acquisition program
performance. These reports were used to identify pitfalls and failure patterns in program
management. Second, literature about program management in general, or aspects of it, was
reviewed in order to identify best practices, recommendations for improving performance,
and examples of “worst practices” (actions which lead to degraded performance). All these
results were summarized in the form of a list of pitfalls.
The content of the pitfall list was verified in two basic ways. The first was a pair of interviews
with Air Force program managers. These interviews were used to generate additional pitfalls
in order to cover all crucial aspects of program management. In addition, the pitfalls already
on the list before these interviews were discussed in detail; this led to refined definitions for
some pitfalls and to the removal of pitfalls that the program managers considered
insufficiently important.
The second verification was performed through work with the industry focus group. First,
some of the pitfalls found in literature were verified by program managers’ having
experienced them during their professional careers. Second, the collection of pitfalls was
discussed with key members of the industry focus group during a telephone conference. All
attendees received the list in advance and were asked to be prepared to provide feedback
during the conference.
In summary, the members of the industry focus group participating in the telephone
conference as well as the program managers who were interviewed stated that the final
version of the list of program management pitfalls includes all of the common pitfalls
occurring in real programs.
To better understand the aspects of program management that are affected by the pitfalls, a
mapping of the pitfalls to the different parts of the LAI program management framework was
conducted. This mapping led to the conclusion that the number of pitfalls related to each of
the five basic framework themes correlates closely with the ratings of the themes in terms of
the need for further research efforts. It was found that the themes rated as more in need of
research in a survey of the members of the industry focus group also had more pitfalls.
6 Lean Enablers in program management
This chapter intends to adapt the ideas of “lean thinking” to program management and will
partly answer the research question, “Are there ways to adapt lean thinking to program
management?”
Section 2.3 briefly introduced the idea of “lean thinking” and its roots. As was explained, the
idea derived from a manufacturing environment. Womack and Jones describe six case
studies in which the use of lean ideas in manufacturing reduced costs, lead times, and
inventory by up to 90% (Womack and Jones 1996). Especially notable were cost and lead time
reductions of up to 50% achieved by rearranging machines in a factory to establish a
continuous flow. These results were achieved in only a couple of days (LEI 2007).
Even though Toyota also applies “lean thinking” to design and engineering, supply chain
management, and all support activities, not just to production (Morgan and Liker 2006), many
companies, especially in the United States, understand lean as a phenomenon mostly used
in manufacturing (Womack and Jones 1996). This perception might lead to the false
impression that lean is only usable in manufacturing environments.
Another common misperception of “lean thinking” is that it only applies to high-volume/mass
production (Oppenheim, Murman et al. 2011). The fact is that “lean thinking” might also be
effective in one-off applications; e.g., engineering projects or systems engineering.
Oppenheim et al. state that in many environments that are one-off “the processes and
individual tasks should use repeatable logic based on best engineering practices”
(Oppenheim, Murman et al. 2011).
Since the development of “lean thinking,” which was concentrated in the period 1990–96, the
idea has been widely adapted in other environments, such as engineering (McManus,
Haggerty et al. 2007), product development (Morgan and Liker 2006; Ward 2007), education
(Emiliani 2004), supply chain management (Bozdogan 2005), health care (Graban 2008),
and enterprise management (LAI 2001; Jones 2006). The different approaches to adapting
the idea of “lean thinking” to diverse environments demonstrate that the concept of adapting
it to program management is far from unrealistic.
6.1 Sources of Lean Enablers
The work and ideas of Oppenheim et al. (Oppenheim, Murman et al. 2011) pertaining to
identifying Lean Enablers for a specific environment (systems engineering) are well known
and respected in industry as well as in academia.
Oppenheim defines lean and Lean Enablers in the following manner: “Any enterprise process
which follows the six Lean Principles will be regarded as Lean. As a corollary, any practice
that enables the Value and follows the Lean process will be defined as a Lean enabler.”
(Oppenheim, Murman et al. 2011)
The major advantage of the idea of Lean Enablers is that those best practices can be used in
any program management environment regardless of which management framework is used.
Lean Enablers can be understood as a supplement to currently implemented guidelines. A
comprehensive collection might be used as a road map for implementing “lean thinking” in
program management.
6.1.1 Introduction to Lean Enablers in the context of program management
The Lean Enablers identified in this thesis for application to program management are based
primarily on six sources in the literature, interviews with program managers, and insights
from the industry focus group.
The first source in the literature is “Lean thinking: Banish waste and create wealth in your
corporation” (Womack and Jones 1996) by Womack and Jones. The two authors posit and
describe in detail five lean principles:
1. Specification of customer value
2. Identification of the value stream
3. Creation of a continuous flow
4. Pull of the value
5. Striving for perfection
The second source in the literature is “Toyota's principles of set-based concurrent
engineering” (Sobek, Ward et al. 1999) by Sobek, Ward, and Liker. They describe the idea of
set-based concurrent engineering being used by Toyota. Rather than committing early to a
design solution during the product development process, “design participants reason about,
develop, and communicate sets of solutions in parallel and relatively independently. As the
design progresses, they gradually narrow their respective sets of solutions based on
additional information from development, testing, the customer, and other participants’ sets”
(Sobek, Ward et al. 1999).
Oppenheim’s paper “Lean Product Development Flow” (Oppenheim 2004) is the third source
in the literature used here in identifying Lean Enablers for program management. The author
proposes a framework, based on the five lean principles from Womack and Jones, for
product development. The framework adapts “lean thinking” to product development; its
elements include the ideas of value and waste in product development, best practices to
adapt the five lean principles, metrics to measure if the lean principles are successfully
implemented, etc. The paper also emphasizes the idea of a chief engineer, the person in
charge of the entire program.
The fourth, dual source in the literature is the book chapter “Lean Innovation – Auf dem Weg
zur Systematik” (Schuh, Adickes et al. 2008) and the corresponding paper “Lean Innovation:
Introducing Value Systems to Product Development” (Schuh, Lenders et al. 2008) by Schuh
et al. The authors adapt the idea of “lean thinking” to innovation management, the
management of product or process innovations and their development (Schuh, Lenders et al.
2008). The published framework focuses mainly on the ideas of value and waste in the
development processes.
The fifth literature source of Lean Enablers for program management is “The
Toyota product development system: integrating people, process, and technology” (Morgan
and Liker 2006) by Morgan and Liker. The authors conducted a two-and-a-half-year, in-depth
study of the product development system of Toyota. They define 13 new lean principles for
product development in three subsystems: processes, people, and tools.
The sixth and last source in the literature is the paper “Lean enablers for systems
engineering” (Oppenheim, Murman et al. 2011) by Oppenheim et al., published in 2011. The
authors identify 194 Lean Enablers for systems engineering. They structure these Lean
Enablers in accord with six lean principles: the five from Womack and Jones (Womack and
Jones 1996) and the principle “respect for people,” emphasized by Sugimori (Sugimori,
Kusunoki et al. 1977).
6.1.2 Discussion and comparison
The literature sources provide multiple insights into various applications of “lean thinking.”
Since many different processes across diverse disciplines are part of program management
(see chapter 4 and the framework description for a detailed overview of the
processes/framework parts), the different sources in the literature can be used to identify
lean ideas for program management. Because this thesis focuses on technical development
programs, the sources that describe ideas of “lean thinking” in this environment are
especially valuable for adaptation to program management.
The identification of Lean Enablers for systems engineering strongly influenced the approach
of the author of this thesis. The author’s first idea was to transfer lean methods directly to
program management, but it became clear that this approach would not produce easily
applicable methods. The use (by Oppenheim et al.) of lean best practices, also called Lean
Enablers, to structure findings is simple and brilliant. The term “best practices” is well known
in industry, and improving one’s work methods by identifying and implementing best
practices is common. Consequently, the idea of Lean Enablers will be used in this thesis, and
their application will be adapted from systems engineering to program management.
6.2 Findings along the Lean Principles
As described in section 6.1.2, the author uses an approach to structuring lean ideas in
program management that is similar to the approach Oppenheim et al. used for systems
engineering. Oppenheim et al. structure their Lean Enablers in accord with six lean
principles. These six principles are derived from the five from Womack and Jones and from
the principle “Respect for people” posited by Sugimori (Sugimori, Kusunoki et al. 1977).
By using the same structure for Lean Enablers in program management as in systems
engineering, one can easily merge the two approaches should this become necessary. In addition,
Oppenheim et al. used the six principles to structure their Lean Enablers so as to ensure
compatibility with the work of the organization INCOSE (International Council on Systems
Engineering) (INCOSE 2011). Consequently, any Lean Enablers identified for program
management will also be structured in accord with the publications and works of INCOSE.
The next section will describe key findings of the author about “lean thinking” in program
management, organized according to the six lean principles. These findings were used to
create new Lean Enablers for program management and to adapt existing Lean Enablers for
systems engineering to program management.
6.2.1 Principle 1: Specification of customer value
The basic idea of the first lean principle is to distinguish value for the customer from waste in
processes (Morgan and Liker 2006). Value can be defined as any activity that satisfies the
following three criteria:
- The external customer is willing to “pay” for it
- It transforms information or material or reduces uncertainty
- It provides specified performance right the first time (Oppenheim, Murman et al. 2011)
A key idea for ensuring the delivery of value is the implementation of a chief engineer.
Oppenheim as well as Morgan and Liker emphasize appointing an employee with broad
experience in different divisions of the company (Oppenheim 2004; Morgan and Liker 2006).
The chief engineer is often described as “the voice of the customer” (Morgan and Liker
2006). He or she has the power to make the final decisions about the directions of the
program, decisions intended to ensure that the final deliverable product or service will fit
customer needs.
The idea of set-based concurrent engineering partly relates to the first lean principle. All
three authors, Sobek, Oppenheim, and Schuh, assert that set-based concurrent engineering
enables a program to explore more solutions than would a traditional product development
approach. Consequently, using the newer approach increases the likelihood of finding a
solution that will satisfy all customer needs (Sobek, Ward et al. 1999; Oppenheim 2004;
Schuh, Adickes et al. 2008).
Another aspect of the first principle is a way to structure the goals of a program according to
the value they deliver to the customer. Schuh suggests a method called “Value System” to
establish a hierarchy in the different goals (Schuh, Adickes et al. 2008). Schuh emphasizes
that his method prevents programs from delivering capabilities that are not required by the
customer and from failing to satisfy some customer needs (Schuh, Adickes et al. 2008).
6.2.2 Principle 2: Value Stream definition
The basic idea of the second principle is the identification of all value-adding and non-value-
adding activities (Hoppmann 2009). The sequence of activities, throughout the program, that
eventually delivers value to the customer is called the value stream.
The idea of the chief engineer’s defining the value stream is explained by Morgan and Liker
and by Oppenheim (Oppenheim 2004; Morgan and Liker 2006). The authors state that the
chief engineer should be responsible for the final definition of the value stream since he or
she is “the voice of the customer.” The chief engineer is also accountable for finding a
consensus across functions regarding the value stream.
A widely used method to define a value stream in a company or a program is value stream
mapping (Oppenheim 2004; McManus 2005; Schuh, Adickes et al. 2008). A core team
(including the chief engineer) plans the activities necessary to deliver a product or service to
satisfy customer needs. The different activities span from the first contact with the customer
through product development and production to the final delivery.
Morgan and Liker describe the idea of frontloading a program (Morgan and Liker 2006). This
idea is especially applicable to development programs since its original application was in a
product development environment.
Another aspect of the idea of set-based concurrent engineering can be found in the second
lean principle. Sobek states that this engineering method is able to provide knowledge to
improve the definition of the value stream (Sobek, Ward et al. 1999). Consequently, all
aspects of set-based concurrent engineering influencing the value stream are found as Lean
Enablers associated with the second lean principle.
A further aspect of the second lean principle is the monitoring of the performance of the
value stream. Oppenheim et al. recommend the implementation of leading indicators and
metrics to manage the program (Oppenheim, Murman et al. 2011).
6.2.3 Principle 3: Creation of a continuous flow
The basic idea of the third lean principle is the implementation of a flow in the activities
defined in the value stream. Some authors state that this principle might be the superordinate
vision of “lean thinking” (Hoppmann 2009).
Having clear and stable requirements (stemming ultimately from customer/user needs) is
crucial to ensure a smooth flow in the program. Therefore, a continuous assessment and
updating of the requirements is essential during the execution of the program (Oppenheim,
Murman et al. 2011).
Since many programs are highly dependent on the information, products, and services of
their suppliers, Morgan and Liker suggest fully integrating the suppliers in the decision-
making of the program (Morgan and Liker 2006). They advocate long-term relationships and
growing trust between the program and the suppliers. In addition, they propose a way to
structure the suppliers hierarchically.
Oppenheim recommends the strategic separation of product development and long-term
research in a program (Oppenheim 2004). He states that research in general is not very
predictable and therefore difficult to synchronize with the rest of the program.
All of the authors – Schuh, Oppenheim, Womack, and Morgan and Liker – describe best
practices for promoting a smooth flow in the program. The use of integrative events on a
regular basis is promoted by Oppenheim (Oppenheim 2004). Morgan and Liker advocate
wrap-up meetings in addition to integrative events (Morgan and Liker 2006).
In addition to the implementation of a chief engineer, who is also responsible for promoting a
smooth flow in the program, the idea of a “war room” is described in the literature (Oppenheim
2004; Morgan and Liker 2006). The war room provides workspace for the chief engineer and
his/her team, and space for their meetings with staff leaders. There are a few best practices
for furnishing the war room; e.g., the use of “mobile walls” is recommended.
6.2.4 Principle 4: Pull of the value
The concept of pull, in the strictest sense, is that all processes and activities along the value
stream should be triggered by the customer (Womack and Jones 1996).
Apart from some mentions that the chief engineer might help to enforce the concept of pull in
the program (Morgan and Liker 2006), the current literature says little about applying this
principle. Only Oppenheim et al. describe some best practices for enforcing the idea of pull
in a program (Oppenheim, Murman et al. 2011).
6.2.5 Principle 5: Striving for perfection
The fifth lean principle describes the concept of the continuous striving for perfection. The
execution and implementation of this concept is never-ending, because a perfect state will
never be reached (Hoppmann 2009).
The ideas of the fifth principle are broadly explained in nearly all literature sources that
describe the five lean principles. Consequently, because the Lean Enablers for systems
engineering are based on these sources, implementing the fifth principle through them can
be accomplished by adjusting the context from systems engineering to program
management.
Some additional aspects, which are not part of the Lean Enablers for systems engineering,
must be added. The practice of benchmarking against the competition is described by Morgan
and Liker (Morgan and Liker 2006). This practice ensures that managers of future programs
and activities not only learn from past mistakes of their own company or program but also
utilize knowledge gained through the experiences of competitors in a similar environment
with comparable customers.
6.2.6 Principle 6: Respect for People
The concept of the sixth lean principle, “Respect for people,” is not widespread in
conventional lean literature. Oppenheim et al. made a great contribution by developing Lean
Enablers based on the sixth lean principle (Oppenheim, Murman et al. 2011). Since the Lean
Enablers for systems engineering can easily be adapted to program management, most of
the Lean Enablers for program management associated with the sixth lean principle are
based on the work of Oppenheim et al.
6.3 Synthesis in one general framework
The major findings described in section 6.2, together with feedback from various sources,
lead to the comprehensive list of Lean Enablers for program management presented in
Table 6-1. The table shows the Lean Enablers and the corresponding literature
which was used to create and phrase the Lean Enablers.
Table 6-1: Lean Enablers for program management
Lean Enabler Literature source
1. Specification of customer value
(Womack and Jones
2003; Oppenheim
2004)
1.1 Establish customer-defined value to distinguish added
value from waste
(Morgan and Liker
2006)
1.1.1 Define value as the outcome of an activity that satisfies
at least three conditions
(Oppenheim,
Murman et al. 2011)
1.1.1.a. The external customer is willing to pay for it (Oppenheim,
Murman et al. 2011)
1.1.1.b. It transforms information or material or
reduces uncertainty
(Oppenheim,
Murman et al. 2011)
1.1.1.c. It provides specified performance right the
first time
(Oppenheim,
Murman et al. 2011)
1.1.2. Define added value in terms of value to the customer
and for filling his needs
(Oppenheim,
Murman et al. 2011)
1.1.3. Develop a robust process to capture, develop, and
disseminate customer value with extreme clarity
(Oppenheim,
Murman et al. 2011)
1.1.4. Develop an agile process to anticipate, accommodate
and communicate changing customer requirements
(Oppenheim,
Murman et al. 2011)
1.1.5. Do not ignore potential conflicts with other stakeholder
values; seek consensus
(Oppenheim,
Murman et al. 2011)
1.1.6. Explain customer culture to program employees; i.e.,
the value system, approach, attitude, expectation, and issues
(Oppenheim,
Murman et al. 2011)
1.2. Establish a chief engineer (CE) who is ultimately
responsible for delivering value to customer
(Oppenheim 2004;
Morgan and Liker
2006)
1.2.1. CE must have education in systems engineering and
broad experience in all areas involved in programs (Oppenheim 2004)
1.2.2. CE must have technical superiority and exceptional
engineering skills
(Morgan and Liker
2006)
1.2.3. CE has strong leadership skills to mobilize the
organization
(Morgan and Liker
2006)
1.2.4. CE is system designer and overall architect of the
product
(Morgan and Liker
2006)
1.2.5. CE develops the product concept (Morgan and Liker
2006)
1.2.6. CE guides a process of developing consensus across
functions
(Morgan and Liker
2006)
1.2.7. CE is more responsible for decisions about system
integration than personal decisions and project administration
(Morgan and Liker
2006)
1.2.8. CE is responsible for product architecture, product
performance, and product characteristics
(Morgan and Liker
2006)
1.2.9. CE sets performance targets (Morgan and Liker
2006)
1.2.10. CE is responsible for product objectives (Morgan and Liker
2006)
1.2.11. CE is system integrator (bottom-up and technical
integration, rather than top-down approach and social
coordination)
(Morgan and Liker
2006)
1.2.12. CE provides data for parameters in early stages of
the program (Oppenheim 2004)
1.2.13. CE has a no-compromise attitude to achieving targets
(even against stakeholders)
(Morgan and Liker
2006)
1.2.14. CE decides about level of modularity fitting long term
program/company strategy (Oppenheim 2004)
1.3. Set-based concurrent engineering
(Oppenheim 2004;
Schuh, Lenders et al.
2010)
1.3.1. Define feasible regions (Sobek, Durward et
al. 1998)
1.3.2. Explore trade-offs by designing multiple alternatives (Sobek, Durward et
al. 1998; Schuh,
Lenders et al. 2010)
1.3.3. Communicate sets of possibilities (Sobek, Durward et
al. 1998)
1.3.4. Do not narrow down design space early (Schuh, Lenders et
al. 2010)
1.4 Value System – Target Hierarchy (Schuh, Lenders et
al. 2010)
1.4.1. Prioritize targets (Schuh, Lenders et
al. 2010)
1.4.2. Make targets visible to: (Schuh, Lenders et
al. 2010)
1.4.2.a. customers (Schuh, Lenders et
al. 2010)
1.4.2.b. stakeholders (Schuh, Lenders et
al. 2010)
1.4.2.c. suppliers (Schuh, Lenders et
al. 2010)
1.5. Fully integrate your suppliers in the process of the
customer value definition
(Morgan and Liker
2006)
1.6.1. Everyone involved in the program must have a
customer-first spirit
(Oppenheim,
Murman et al. 2011)
1.6.2. Establish frequent and effective interaction with
internal and external customers
(Oppenheim,
Murman et al. 2011)
1.6.3. Pursue an architecture that captures customer
requirements clearly and can be adaptive to changes
(Oppenheim,
Murman et al. 2011)
1.6.4. Establish a plan that delineates the artifacts and
interactions that provide the best means for drawing out
customer requirements
(Oppenheim,
Murman et al. 2011)
2. Value Stream Definition
(Womack and Jones
2003; Oppenheim
2004; Schuh,
Lenders et al. 2010)
2.1 Establish a CE who is ultimately responsible for defining
the value stream
(Oppenheim 2004;
Morgan and Liker
2006)
2.1.1. CE is system designer and overall architect of the
product (Morgan and Liker 2006)
2.1.1.a. CE defines value stream according the
product structure
(Morgan and Liker
2006)
2.1.1.b. CE develops product concept (Morgan and Liker
2006)
2.1.1.c. CE is responsible for product architecture,
product performance and product characteristics
(Morgan and Liker
2006)
2.1.2. CE guides a process of developing consensus across
functions
(Morgan and Liker
2006)
2.1.3. CE has a vision for all functional program teams (Morgan and Liker
2006)
2.1.4. CE is responsible for finding consensus across
functions regarding the ideal value stream (Oppenheim 2004)
2.2. Map the value stream over the whole program, including all
projects and sub-processes, and improve it
(Womack and Jones
2003ff.; Oppenheim
2004; Schuh,
Lenders et al. 2010)
2.2.1 Develop and execute clear communication plan that
covers entire value stream and stakeholders
(Oppenheim,
Murman et al. 2011)
2.2.2 Have cross-functional stakeholders work together to
build the agreed-upon value stream
(Oppenheim,
Murman et al. 2011)
2.2.3 Maximize co-location opportunities for different
planning departments in the program
(Oppenheim,
Murman et al. 2011)
2.2.4 Use formal value stream mapping methods to identify
and eliminate waste, and to tailor and scale tasks
(Oppenheim,
Murman et al. 2011)
2.2.5. Scrutinize every step to ensure it adds value, and plan
nothing because “it has always been done”
(Oppenheim,
Murman et al. 2011)
2.2.6. Carefully plan the precedence of both SE and PD
tasks (which task to feed what other task(s) with what data
and when), understanding task dependencies and parent-
child relationships
(Oppenheim,
Murman et al. 2011)
2.2.7. Maximize concurrency of tasks within the program (Oppenheim,
Murman et al. 2011)
2.2.8. Synchronize work flow activities using scheduling
across functions, and even more detailed scheduling within
functions
(Oppenheim,
Murman et al. 2011)
2.2.9. For every action, define who is responsible, approving,
supporting, and informing (“RASI”), using a standard and
effective tool, paying attention to precedence of tasks
(Oppenheim,
Murman et al. 2011)
2.2.10. Plan for level work flow and with precision to enable
schedule adherence and drive out arrival time variation
(Oppenheim,
Murman et al. 2011)
2.2.11. Plan below full capacity to enable work without
accumulation of variability, and permit scheduling flexibility in
work loading; i.e., have appropriate contingency and
schedule buffers
(Oppenheim,
Murman et al. 2011)
2.2.12. Plan to use visual methods wherever possible to
communicate schedules, workloads, changes in customer
requirements, etc.
(Oppenheim,
Murman et al. 2011)
2.3. Plan for Frontloading the Program (Morgan and Liker
2006)
2.3.1. Plan to utilize cross-functional team made up of the
most experienced and compatible people at the start of the
project to look at a broad range of solution sets
(Oppenheim,
Murman et al. 2011)
2.3.2. Anticipate and plan to resolve as many downstream
issues and risks as early as possible to prevent downstream
problems.
(Oppenheim,
Murman et al. 2011)
2.3.3. Plan early for consistent robustness and “first time
right” under “normal” circumstances instead of hero behavior
in later “crisis” situations
(Oppenheim,
Murman et al. 2011)
2.4. Set-based concurrent engineering
(Oppenheim 2004;
Schuh, Lenders et al.
2010)
2.4.1. Integrate knowledge to improve value stream (Sobek, Durward et
al. 1998)
2.4.2. Look for intersections of feasible sets (Sobek, Durward et
al. 1998)
2.4.3. Impose minimum constraint (Sobek, Durward et
al. 1998)
2.4.4. Seek conceptual robustness (Sobek, Durward et
al. 1998)
2.5. Use a CE concept paper (Morgan and Liker
2006)
2.5.1. Let the CE and his/her team create a concept paper (Morgan and Liker
2006)
2.5.2. Schedule meetings to enable including all data in the
concept paper (customer data, marketing department, etc.)
(Morgan and Liker
2006)
2.5.3. Declare the concept paper highly confidential (Morgan and Liker
2006)
2.5.4. Ensure that the length of the concept paper is between
15 and 25 pages
(Morgan and Liker
2006)
2.5.5. Use tables, graphs, and sketches to underpin the
information
(Morgan and Liker
2006)
2.5.6. Use the concept paper as a guideline for future
decisions
(Morgan and Liker
2006)
2.5.7. Distribute the final version of the concept paper to all
assistant CEs and functional staff leaders on the program
(Morgan and Liker
2006)
2.6. Plan to Develop Only What Needs Developing
(Morgan and Liker
2006; Oppenheim,
Murman et al. 2011)
2.6.1. CE is visionary and innovative yet aware of technology
maturation
(Morgan and Liker
2006)
2.6.2. The CE guides tradeoff between creativity and
standard/legacy knowledge (Oppenheim 2004)
2.6.3. The CE takes no unnecessary risks, but does not
avoid risk in general
(Morgan and Liker
2006)
2.6.4. Well-organized research output should be in the form
of technologies that are mature enough to be made robust
but are not yet so (Oppenheim 2004)
2.6.5. Promote reuse and sharing of program assets: Utilize
platforms, standards, busses, and modules of knowledge,
hardware, and software
(Oppenheim,
Murman et al. 2011)
2.6.6. Insist that module proposed for use is robust before
using it
(Oppenheim,
Murman et al. 2011)
2.6.7. Remove show-stopping research/unproven technology
from critical path, staff with experts, and include it in the Risk
Mitigation Plan
(Oppenheim,
Murman et al. 2011)
2.6.8. Defer unproven technology to future technology
development efforts, or future systems
(Oppenheim,
Murman et al. 2011)
2.6.9. Maximize opportunities for future upgrades (e.g.,
reserve some volume, mass, electric power, computer
power, and connector pins), even if the contract calls for only
one item
(Oppenheim,
Murman et al. 2011)
2.7. Plan Leading Indicators and Metrics to Manage the
Program
(Oppenheim,
Murman et al. 2011)
2.7.1. Use leading indicators to enable action before waste
occurs
(Oppenheim,
Murman et al. 2011)
2.7.2. Focus metrics around customer value, not profit (Oppenheim,
Murman et al. 2011)
2.7.3. Use only a few simple and easy-to-understand metrics
and share them frequently throughout the enterprise
(Oppenheim,
Murman et al. 2011)
2.7.4. Use metrics structured to motivate the right behavior (Oppenheim,
Murman et al. 2011)
2.7.5. Use only those metrics that meet a stated need or
objective
(Oppenheim,
Murman et al. 2011)
3. Creation of continuous flow
(Womack and Jones
2003; Oppenheim
2004; Morgan and
Liker 2006; Schuh,
Lenders et al. 2010)
3.1. Establish a CE to ensure a flow throughout the program
(Oppenheim 2004;
Morgan and Liker
2006)
3.1.1. CE has strong leadership skills to mobilize the
organization and establish a flow
(Morgan and Liker
2006)
3.1.2. CE is responsible for product architecture, product
performance, and product characteristics
(Morgan and Liker
2006)
3.1.3. CE sets performance targets (Morgan and Liker
2006)
3.2. Use derivative approaches and platforms
(Morgan and Liker
2006; Schuh,
Lenders et al. 2010)
3.2.1. Use a modular architecture (Schuh, Lenders et
al. 2010)
3.2.2. Plan a frequency for launching new product versions (Schuh, Lenders et
al. 2010)
3.2.3. Define a module or system life cycle (Schuh, Lenders et
al. 2010)
3.2.4. Use derivation management to synchronize the life
cycle with the launch of new products or derivations
(Schuh, Lenders et
al. 2010)
3.2.5. Group modules in a logical way to identify subsystems
to be used again, substituted, or further developed
(Schuh, Lenders et
al. 2010)
3.3. Clarify, Derive, Prioritize Requirements Early and Often
During Execution
(Oppenheim,
Murman et al. 2011)
3.3.1. Since formal written requirements are rarely enough,
provide for follow-up verbal clarification of the context and
need, without allowing requirement creep
(Oppenheim,
Murman et al. 2011)
3.3.2. Create effective channels for clarification of
requirements (possibly involve customer participation in
development IPTs)
(Oppenheim,
Murman et al. 2011)
3.3.3. Listen for and capture unspoken customer
requirements
(Oppenheim,
Murman et al. 2011)
3.3.4. Use architectural methods and modeling for system
representations (3D integrated CAE toolset, mockups,
prototypes, models, simulations, and software design tools)
that allow interactions with customers as the best means of
drawing out customer requirements
(Oppenheim,
Murman et al. 2011)
3.3.5. “Fail early, fail often” through rapid learning techniques
(prototyping, tests, digital reassembly, spiral development,
models, and simulation)
(Oppenheim,
Murman et al. 2011)
3.3.6. Identify a small number of goals and objectives that
articulate what the program is set up to do, how it will do it,
and what the success criteria will be to align stakeholders
and repeat these goals and objectives consistently and often
(Oppenheim,
Murman et al. 2011)
3.4. Fully integrate the supplier into the program (Morgan and Liker
2006)
3.4.1. Do not focus only on costs when choosing suppliers:
balance between quality and cost
(Morgan and Liker
2006)
3.4.2. Help out struggling suppliers if necessary (Morgan and Liker
2006)
3.4.3. Design contracts that are easy to understand (Morgan and Liker
2006)
3.4.4. Invite engineers from your suppliers into your program (Morgan and Liker
2006)
3.4.5. Honor the contracts – do not renege on them to save
costs
(Morgan and Liker
2006)
3.4.6. Treat suppliers with respect (Morgan and Liker
2006)
3.4.6.1. Do not change orders without reimbursement
for the supplier
(Morgan and Liker
2006)
3.4.6.2. Prevent any bureaucracy in the interaction
with your supplier
(Morgan and Liker
2006)
3.4.6.3. Never bully your supplier (Morgan and Liker
2006)
3.4.6.4. Respect the intellectual property of your
suppliers
(Morgan and Liker
2006)
3.4.7. Set aggressive performance targets, but work together
with suppliers to achieve them
(Morgan and Liker
2006)
3.4.8. Implement a process for supplier selection (Morgan and Liker
2006)
3.4.9. Use a tier structure for your suppliers: (Morgan and Liker
2006)
3.4.9.a. Partner: highest level of supplier; technically
autonomous; delivers products with highest technical
complexity
(Morgan and Liker
2006)
3.4.9.b. Mature: very strong engineering and
manufacturing capabilities, but a bit less autonomous;
relies on specification of program; delivers products
with high technical complexity
(Morgan and Liker
2006)
3.4.9.c. Consultative: influences the specifications by
own suggestions; program can tap into its expertise;
delivers products with low technical complexity
(Morgan and Liker
2006)
3.4.9.d. Contractual: no major relationship; delivers
products with low technical complexity; program
should monitor quality, cost and delivery
(Morgan and Liker
2006)
3.5 Front-load the product development process in the program (Morgan and Liker
2006)
3.5.1. Product Line Optimization – Feature Clusters (Schuh, Lenders et
al. 2010)
3.5.1.1. Use a technology model to structure the
product (Schuh, Lenders et al. 2010)
3.5.1.2. Use “musts” and “don’ts” to classify product
characteristics
(Schuh, Lenders et
al. 2010)
3.5.2. Use a clear architectural description of the agreed-
upon solution to plan coherent program, engineering, and
commercial structures
(Oppenheim,
Murman et al. 2011)
3.5.3. All other things being equal, select the simplest
solution
(Oppenheim,
Murman et al. 2011)
3.5.4. Invite suppliers to make a serious contribution to
various stages of the program as trusted program partners
(Oppenheim,
Murman et al. 2011)
3.6. Strategic separation of research, development and
deployment (Oppenheim 2004)
3.6.1. Research should be strategic, long-term, and ongoing
(must not even be connected to the program) (Oppenheim 2004)
3.6.2. Functional engineering department transforms and
translates the research output into a robust, mature
technology using modularization and ensuring maximum: (Oppenheim 2004)
3.6.2.a. Degree of usability (Oppenheim 2004)
3.6.2.b. Manufacturability (Oppenheim 2004)
3.6.2.c. Low cost (Oppenheim 2004)
3.6.3. Incorporate the robust, mature technology into the
program flow (Oppenheim 2004)
3.6.4. Ensure that there is an established flow between the
three functions (but independent from the internal program
flow) (Oppenheim 2004)
3.7. Use Efficient and Effective Communication and
Coordination
(Oppenheim,
Murman et al. 2011)
3.7.1. Capture and absorb lessons learned from almost all
programs: “never enough coordination and communication”
(Oppenheim,
Murman et al. 2011)
3.7.2. Maximize coordination of effort and flow (Oppenheim,
Murman et al. 2011)
3.7.3. Maintain counterparts with active working relationships
throughout the enterprise to facilitate efficient communication
and coordination among different parts of the enterprise, and
with suppliers
(Oppenheim,
Murman et al. 2011)
3.7.4. Use frequent, timely, open and honest communication (Oppenheim, Murman et al. 2011)
3.7.5. Promote immediate direct informal communications as
needed
(Oppenheim,
Murman et al. 2011)
3.7.6. Use concise, one-page electronic forms (e.g., Toyota’s
A3 form) rather than verbose unstructured memos to
communicate, and keep detailed working data as backup
(Oppenheim,
Murman et al. 2011)
3.7.7. Report cross-functional issues to be resolved on
concise standard one-page forms to Chief’s office in real time
for his/her prompt resolution
(Oppenheim,
Murman et al. 2011)
3.7.8. Communicate all expectations to suppliers with crystal
clarity, including the context and need, and all procedures
and expectations for acceptance test, and ensure that the
requirements are stable
(Oppenheim,
Murman et al. 2011)
3.7.9. Trust engineers to communicate with suppliers’
engineers directly for efficient clarification, within a framework
of rules (but watch for high-risk items, which must be
handled at top level)
(Oppenheim,
Murman et al. 2011)
3.8. Promote smooth flow in the entire program
(Womack and Jones
2003; Oppenheim
2004; Morgan and
Liker 2006; Schuh,
Lenders et al. 2010)
3.8.1. Use integrative events (Oppenheim 2004)
3.8.1.a. Question everything with multiple “whys” (Oppenheim,
Murman et al. 2011)
3.8.1.b. Align process flow to decision flow (Oppenheim,
Murman et al. 2011)
3.8.1.c. Resolve all issues as they occur in frequent
integrative events
(Oppenheim,
Murman et al. 2011)
3.8.1.d. Discuss trade-offs and options (Oppenheim,
Murman et al. 2011)
3.8.2. Establish daily wrap-up meetings (Morgan and Liker
2006)
3.8.3. Be willing to challenge the customer’s assumptions on
technical and meritocratic grounds, and to maximize program
stability, relying on technical expertise
(Oppenheim,
Murman et al. 2011)
3.8.4. Ensure the use of the same measurement standards and ensure database commonality (Oppenheim, Murman et al. 2011)
3.8.5. Ensure that both data deliverers and receivers
understand their mutual needs and expectations
(Oppenheim,
Murman et al. 2011)
3.9. Provide a workspace for CE and his team: the “war room”
(Oppenheim 2004;
Morgan and Liker
2006)
3.9.1. Staff leaders (leaders of functional groups) meet CE in
the “war room”
(Morgan and Liker
2006)
3.9.2. Staff leaders have desk in their functions and come to
the war room regularly
(Morgan and Liker
2006)
3.9.3. There are long, collaborative work sessions in the war
room (not traditional meetings)
(Morgan and Liker
2006)
3.9.4. Room is equipped with “mobile walls” that can be used
as display surfaces for visualization
(Morgan and Liker
2006)
3.9.5. Engineers plaster the room’s walls and “mobile walls”
with information
(Morgan and Liker
2006)
3.9.6. Information on the wall is organized in terms of the
projects of the program
(Morgan and Liker
2006)
3.9.7. Project managers are responsible for assignment of
person in charge of providing visualizations of information in
the war room
(Morgan and Liker
2006)
3.10. Align your organization through simple, visual
communication, and make program progress visible to all
(Morgan and Liker
2006)
3.10.1. Avoid too many and too long meetings (Morgan and Liker
2006)
3.10.2. Make work progress visible to and easy to
understand for all, including external customer
(Oppenheim,
Murman et al. 2011)
3.10.3. Utilize visual controls in public spaces for best
visibility (avoid computer screens)
(Oppenheim,
Murman et al. 2011)
3.10.4. Develop a system that makes imperfections and
delays visible to all
(Oppenheim,
Murman et al. 2011)
3.10.5. Use traffic light system (green, yellow, red) to report
task status visually (good, warning, critical) and make certain
problems are not concealed
(Oppenheim,
Murman et al. 2011)
3.11. Set-based concurrent engineering (Oppenheim 2004; Schuh, Lenders et al. 2010)
3.11.1. Establish feasibility before commitment to ensure
smooth flow
(Sobek, Durward et
al. 1998)
3.11.2. Narrow sets gradually while increasing level of detail (Sobek, Durward et
al. 1998)
3.11.3. Stay within sets once committed (Sobek, Durward et
al. 1998)
3.11.4. Control by managing uncertainty at process gates (Sobek, Durward et
al. 1998)
3.12. Utilize rigorous standardization to reduce variation and to
create flexibility and predictable outcomes
(Morgan and Liker
2006)
3.12.1. Adapt the technology to fit the people and process (Morgan and Liker
2006)
3.12.2. Use lean tools to promote the flow of information and
minimize handoffs: small batch size of information, small takt
times, wide communication bandwidth, standardization, work
cells, training
(Oppenheim,
Murman et al. 2011)
3.12.3. Use minimum number of tools and make common
wherever possible
(Oppenheim,
Murman et al. 2011)
3.12.4. Minimize the number of software revision updates
and centrally control the update releases to prevent
information churning
(Oppenheim,
Murman et al. 2011)
3.12.5. Adapt the technology to fit the people and process (Oppenheim,
Murman et al. 2011)
3.12.6. Avoid excessively complex “monument” tools (Oppenheim,
Murman et al. 2011)
3.13. Capacity planning/balancing Model (Schuh, Lenders et
al. 2010)
3.13.1. Implement a control loop to control capacity utilization (Schuh, Lenders et
al. 2010)
3.13.2. Identify necessary skills for program execution and
specify profiles for each job position based on the skills
(Schuh, Lenders et
al. 2010)
3.13.3. Calculate available capacity for each identified skill (Schuh, Lenders et
al. 2010)
3.13.4. Base capacity planning on the deviation between
available capacity and the targeted capacity utilization
(command variable)
(Schuh, Lenders et
al. 2010)
3.13.5. Evaluate and execute different actions (e.g., employ
more/fewer employees, develop your employees, etc.) to
minimize deviation
(Schuh, Lenders et
al. 2010)
4. Pull of the value (Womack and Jones
2003; Oppenheim
2004)
4.1. Establish a CE
(Oppenheim 2004;
Morgan and Liker
2006)
4.1.1. CE has strong leadership to mobilize the organization
and to coordinate the pull through the program
(Morgan and Liker
2006)
4.1.2. CE sets performance targets to pull information
through the program
(Morgan and Liker
2006)
4.2. Pull Tasks and Outputs Based on Need, and Reject Others
as Waste
(Oppenheim,
Murman et al. 2011)
4.2.1. Let information needs pull the necessary work
activities
(Oppenheim,
Murman et al. 2011)
4.2.2. Understand the Value Stream Flow (Oppenheim,
Murman et al. 2011)
4.2.3. Train the team to recognize, for every task, who the
internal customer (receiver) is and who the supplier (giver) is,
using a SIPOC (supplier, inputs, process, outputs, customer)
model to better understand the value stream
(Oppenheim,
Murman et al. 2011)
4.2.4. Stay connected to the internal customer during task
execution
(Oppenheim,
Murman et al. 2011)
4.2.5. Avoid rework by coordinating task requirements with
internal customer for every non-routine task
(Oppenheim,
Murman et al. 2011)
4.2.6. Promote effective, real-time direct communication
between each giver and receiver in the value flow
(Oppenheim,
Murman et al. 2011)
4.2.7. Develop giver-receiver relationships based on mutual
trust and respect
(Oppenheim,
Murman et al. 2011)
4.2.8. When pulling work, use customer value to separate
added value from waste
(Oppenheim,
Murman et al. 2011)
5. Striving for Perfection (Womack and Jones 2003; Oppenheim 2004; Morgan and Liker 2006; Schuh, Lenders et al. 2010)
5.1. Strive for Excellence of Program Management (Oppenheim,
Murman et al. 2011)
5.1.1. Build a culture to support excellence and relentless
improvement
(Morgan and Liker
2006)
5.1.2. Do not ignore the basics of Quality: (Oppenheim,
Murman et al. 2011)
5.1.2.a. Build-in robust quality at each step of the
process, and resolve to not pass along problems
(Oppenheim,
Murman et al. 2011)
5.1.2.b. Strive for perfection in each process step
without introducing waste
(Oppenheim,
Murman et al. 2011)
5.1.2.c. Do not rely on final inspection; error-proof
wherever possible
(Oppenheim,
Murman et al. 2011)
5.1.2.d. If final inspection is required by contract,
perfect upstream process pursuing 100% inspection
pass rate
(Oppenheim,
Murman et al. 2011)
5.1.2.e. Move final inspectors upstream to take the
role of quality mentors
(Oppenheim,
Murman et al. 2011)
5.1.2.f. Apply basic PDCA method (plan, do, check,
act) to problem solving
(Oppenheim,
Murman et al. 2011)
5.1.2.g. Adopt and promote a culture of stopping and
permanently fixing a problem as soon as it becomes
apparent
(Oppenheim,
Murman et al. 2011)
5.1.3. Promote excellence under normal circumstances
instead of hero behavior in “crisis” situations
(Oppenheim,
Murman et al. 2011)
5.1.4. Use and communicate failures as opportunities for
learning, emphasizing process and not people problems
(Oppenheim,
Murman et al. 2011)
5.1.5. Treat any imperfection as an opportunity for immediate
improvement and a lesson to be learned, and frequently
review lessons learned
(Oppenheim,
Murman et al. 2011)
5.1.6. Maintain a consistent, disciplined approach to
engineering
(Oppenheim,
Murman et al. 2011)
5.1.7. Promote the idea that the system should incorporate
continuous improvement in the organizational culture, but
also…
(Oppenheim,
Murman et al. 2011)
5.1.8. …balance the need for excellence with avoidance of
wasteful overprocessing (pursue refinement to the point of
ensuring value and “first time right”)
(Oppenheim,
Murman et al. 2011)
5.1.9. Use a balanced matrix/project organizational
approach, avoiding extremes: such as functionally territorial
organizations with isolated technical specialists and all-
powerful IPTs separated from functional expertise and
standardization
(Oppenheim,
Murman et al. 2011)
5.2. Build a culture that supports excellence and relentless
improvement
(Morgan and Liker
2006)
5.2.1. Use lessons learned from past programs in future
programs
(Oppenheim,
Murman et al. 2011)
5.2.1.1. Maximize opportunities to make each next
program better than the last
(Oppenheim,
Murman et al. 2011)
5.2.1.2. Create mechanism to capture, communicate,
and apply experience-generated learning and
checklists
(Oppenheim,
Murman et al. 2011)
5.2.1.3. Insist on workforce training regarding
identification of the root cause of a problem and
appropriate corrective action
(Oppenheim,
Murman et al. 2011)
5.2.1.4. Identify best practices through benchmarking
and professional literature
(Oppenheim,
Murman et al. 2011)
5.2.1.5. Share metrics of suppliers’ performance with
them so they can improve
(Oppenheim,
Murman et al. 2011)
5.2.2. Compare your results with your competitors’
results
(Morgan and Liker
2006)
5.2.2.1. Evaluate your products by competitor
teardowns regarding…
(Morgan and Liker
2006)
5.2.2.1.a. Quality (Morgan and Liker
2006)
5.2.2.1.b. Performance (Morgan and Liker
2006)
5.2.2.1.c. Ease of manufacturing (Morgan and Liker
2006)
5.3. Develop policy of perfect communication, coordination and
collaboration across people and processes
(Oppenheim,
Murman et al. 2011)
5.3.1. Develop a plan for – and train the entire program team
in – communications and coordination methods at the
beginning of the program
(Oppenheim,
Murman et al. 2011)
5.3.2. Include communication competence among the
desired skills during hiring
(Oppenheim,
Murman et al. 2011)
5.3.3. Promote good coordination and communications skills
with training and mentoring
(Oppenheim,
Murman et al. 2011)
5.3.4. Publish instructions for email distributions and
electronic communications
(Oppenheim,
Murman et al. 2011)
5.3.5. Publish instructions for artifact content and data
storage – central capture versus local storage, paper versus
electronic – balancing between excessive bureaucracy and
the need for traceability
(Oppenheim,
Murman et al. 2011)
5.3.6. Publish a directory of the entire program team and
provide training to new hires on how to locate nodes of
knowledge as needed
(Oppenheim,
Murman et al. 2011)
5.3.7. Ensure timely and efficient access to centralized data (Oppenheim,
Murman et al. 2011)
5.3.8. Develop an effective body of knowledge that is
historical, searchable, and shared by the team, and a
knowledge management strategy to facilitate the sharing of
data and information within the enterprise
(Oppenheim,
Murman et al. 2011)
5.4. For every program, develop a chief engineer system
(Oppenheim 2004;
Morgan and Liker
2006)
5.4.1. Provide small team to the CE (Morgan and Liker
2006)
5.4.2. CE guides a process of developing consensus across
functions
(Morgan and Liker
2006)
5.4.3. CE is more responsible for decisions about system
integration than personnel decisions and project
administration
(Morgan and Liker
2006)
5.4.4. CE has a vision for all functional program teams (Morgan and Liker
2006)
5.4.5. Implement a culture to support the CE (Morgan and Liker
2006)
5.4.6. CE has good interpersonal skills to motivate and lead his/her team, and is a patient listener (Oppenheim 2004; Morgan and Liker 2006)
5.4.7. CE is a good communicator, both internally and
externally
(Morgan and Liker
2006)
5.4.8. If Program Manager and Chief Engineer are two
separate individuals (required by contract or organizational
practice), co-locate them to facilitate constant close
coordination
(Oppenheim,
Murman et al. 2011)
5.5. Use powerful tools for standardization and organizational
learning
(Morgan and Liker
2006)
5.5.1. Use a model of product architecture, technology,
and function
(Schuh, Lenders et
al. 2010)
5.5.1.1. Promote design standardization through
engineering checklists, standard architecture,
modularization, busses, and platforms
(Oppenheim,
Murman et al. 2011)
5.5.2. Promote process standardization in development,
management, and manufacturing
(Oppenheim,
Murman et al. 2011)
5.5.3. Promote standardized skill sets with careful training,
mentoring, rotations, strategic assignments, and
assessments of competencies
(Oppenheim,
Murman et al. 2011)
5.5.4. Organize so as to balance functional expertise and
cross-functional integration
(Morgan and Liker
2006)
5.6. Motivate the employees of the program through product
identity
(Schuh, Lenders et
al. 2010)
5.6.1. Motivate all members of the program intrinsically (Schuh, Lenders et
al. 2010)
5.6.2. Establish an emotional tie to the product (Schuh, Lenders et
al. 2010)
5.6.2.1. Communicate uniqueness of product (Schuh, Lenders et
al. 2010)
5.6.2.2. Idealize and perfect the product (Schuh, Lenders et
al. 2010)
5.6.2.3. Make the product experienceable (Schuh, Lenders et
al. 2010)
5.6.2.4. Imprint identification with the product on
employees early and follow up by making use of this
imprinting
(Schuh, Lenders et
al. 2010)
6. Respect for People (Oppenheim,
Murman et al. 2011)
6.1. Build an Organization Based on Respect for People (Oppenheim,
Murman et al. 2011)
6.1.1. Create a vision which draws and inspires the best
people
(Oppenheim,
Murman et al. 2011)
6.1.2. Invest in personnel selection and development to
promote enterprise and program excellence
(Oppenheim,
Murman et al. 2011)
6.1.3. Promote excellent human relations: trust, respect,
empowerment, teamwork, stability, motivation, drive for
excellence
(Oppenheim,
Murman et al. 2011)
6.1.4. Ensure that there is professional trust through: (Morgan and Liker
2006)
6.1.4.1. Integrity: People must have the intent to do
what they say they will
(Morgan and Liker
2006)
6.1.4.2. Competence: They must be capable of doing
it
(Morgan and Liker
2006)
6.1.5. Read applicants’ resumes carefully for both technical
and non-technical skills, and do not rely upon computer
scanning for keywords
(Oppenheim,
Murman et al. 2011)
6.1.6. Promote direct person-to-person communication (Oppenheim,
Murman et al. 2011)
6.1.7. Promote and honor technical excellence – establish a
technical meritocracy
(Oppenheim,
Murman et al. 2011)
6.1.8. Reward people based upon team performance, and
include teaming ability among the criteria for hiring and
promotion
(Oppenheim,
Murman et al. 2011)
6.1.9. Use flow-down of responsibility, authority and
accountability (RAA) to make decisions at lowest appropriate
level
(Oppenheim,
Murman et al. 2011)
6.1.10. Eliminate fear and promote conflict resolution at the
lowest level
(Oppenheim,
Murman et al. 2011)
6.1.11. Keep management decisions crystal clear but also
promote and reward the bottom-up culture of continuous
improvement, human creativity, and entrepreneurship
(Oppenheim,
Murman et al. 2011)
6.1.12. Do not manage from a cubicle; go to the spot and see for yourself (Oppenheim, Murman et al. 2011)
6.1.13. Within program policy and within their area of work,
empower people to accept responsibility by promoting the
motto “Ask for forgiveness rather than ask for permission”
(Oppenheim,
Murman et al. 2011)
6.1.14. Build a culture of mutual support (there is no shame
in asking for help)
(Oppenheim,
Murman et al. 2011)
6.1.15. Prefer physical team co-location to virtual co-location (Oppenheim,
Murman et al. 2011)
6.2. Develop towering technical competence in all engineers
and strive for technical excellence
(Morgan and Liker
2006)
6.2.1. Establish training across functions (Morgan and Liker
2006)
6.2.2. Establish standardization for skill expectations (Morgan and Liker
2006)
6.2.3. Establish and support communities of practice (Oppenheim,
Murman et al. 2011)
6.2.4. Invest in Workforce Development (Oppenheim,
Murman et al. 2011)
6.2.5. Ensure tailored lean training for all employees (Oppenheim,
Murman et al. 2011)
6.2.6. Give leaders at all levels in-depth lean training (Oppenheim,
Murman et al. 2011)
6.3. Nurture a Learning Environment (Oppenheim,
Murman et al. 2011)
6.3.1. Perpetuate technical excellence through mentoring,
training, continuing education, and other means
(Oppenheim,
Murman et al. 2011)
6.3.2. Promote and reward continuous learning through
education and experiential learning
(Oppenheim,
Murman et al. 2011)
6.3.3. Provide knowledge experts as resources and for
mentoring
(Oppenheim,
Murman et al. 2011)
6.3.4. Cultivate the most powerful competitive weapon: the
ability to learn rapidly and continuously improve
(Oppenheim,
Murman et al. 2011)
6.3.5. Value people for the skills they contribute to the
program; foster mutual respect and appreciation
(Oppenheim,
Murman et al. 2011)
6.3.6. Capture learning to stabilize the program when people
change their position in the program or leave the program
(Oppenheim,
Murman et al. 2011)
6.3.7. When developing a standard, consider human factors,
including reading and perception abilities
(Oppenheim,
Murman et al. 2011)
6.3.8. Immediately organize quick training in any new
standard
(Oppenheim,
Murman et al. 2011)
6.4. Treat People as the program’s most valued assets, not as
commodities
(Oppenheim,
Murman et al. 2011)
To provide a better overview of the different sources used in compiling the Lean Enablers for
program management, Figure 6-1 shows the percentage of the enablers that were derived
from each source. The total adds up to 108.1% because some of the Lean Enablers are
based on more than one source.
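The effect of counting multi-source enablers once per source can be illustrated with a small counting sketch. The data below is a hypothetical four-enabler miniature, not the actual enabler list:

```python
from collections import Counter

# Hypothetical miniature example (not the actual enabler list): four
# enablers, one of which is based on two sources.
enabler_sources = [
    {"Oppenheim et al. 2011"},
    {"Oppenheim et al. 2011", "Morgan & Liker 2006"},
    {"Morgan & Liker 2006"},
    {"Schuh 2008"},
]

# Count each enabler once per source it is based on.
counts = Counter(src for sources in enabler_sources for src in sources)
total = len(enabler_sources)
shares = {src: 100 * n / total for src, n in counts.items()}

# Multi-source enablers contribute to several shares, so the shares
# sum to more than 100% (here: 125.0).
print(shares)
print(sum(shares.values()))  # 125.0
```

The same mechanism explains why the percentages in Figure 6-1 add up to 108.1% rather than 100%.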
Figure 6-1: Percentage of the compiled Lean Enablers for program management that were
derived from each literature source (Womack 2003; Sobek 1998; Oppenheim 2004; Morgan
& Liker 2006; Schuh 2008; Oppenheim et al. 2011; total: 108.1%)
Figure 6-2 shows the total number of Lean Enablers. In addition, the number of categories
into which these Lean Enablers are divided is shown (e.g., 1.1 is a category of lean principle
1.). Each category has an average of slightly under nine Lean Enablers.
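The stated average follows directly from the two totals in Figure 6-2; as a one-line check:

```python
ENABLERS = 310       # total Lean Enablers
SUBCATEGORIES = 35   # total subcategories

# "Slightly under nine" Lean Enablers per subcategory on average.
average = ENABLERS / SUBCATEGORIES
print(round(average, 2))  # 8.86
```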
Figure 6-2: Total number of Lean Enablers and categories (310 enablers, 35 subcategories)
Figure 6-3 shows how the different Lean Enablers are distributed over the six lean principles.
One can see that a higher number of Lean Enablers corresponds with a higher number of
subcategories. Consequently, the average number of Lean Enablers per subcategory derived
from Figure 6-2 also holds approximately within each lean principle.
The third lean principle has the highest number of Lean Enablers. This is in accord with the
finding in section 6.2.3 that many authors consider the third lean principle – the creation of a
continuous flow – as the one most fundamental to the implementation of lean thinking.
Figure 6-3: Distribution of the Lean Enablers and the subcategories among the six lean
principles (1. Specification of Customer Value: 42 enablers, 3 subcategories; 2. Value Stream
Definition: 55 enablers, 7 subcategories; 3. Creation of a Continuous Flow: 103 enablers,
13 subcategories; 4. Pull of the Value: 13 enablers, 2 subcategories; 5. Striving for Perfection:
61 enablers, 6 subcategories; 6. Respect for People: 36 enablers, 4 subcategories)
6.4 Assessment through the industry focus group
The major findings described in section 6.2, together with feedback from various sources,
led to the comprehensive list of Lean Enablers for program management presented in
Table 6-1. The table shows the Lean Enablers and the corresponding literature that was
used to create and phrase them. To verify the comprehensiveness of the list, a document
containing it was sent to the members of the industry focus group for review. The results of
the reviews were discussed in a telephone conference. All participants provided feedback to
further improve the list.
One member of the group that developed the Lean Enablers for Systems Engineering took
part in the telephone conference. She provided insights into the development process for the
Lean Enablers for Systems Engineering and could give precise and specific feedback on the
development process and structure of the Lean Enablers for program management.
All feedback from the members of the industry focus group, whether provided in writing or
during the telephone conference, was used to improve the list of enablers. In summary, all
members of the industry focus group who contributed final feedback found the list of Lean
Enablers for program management, as presented in Table 6-1, to be comprehensive and
found the descriptions of the enablers easy to comprehend.
6.5 Summary
This chapter described the first part of the adaptation of “lean thinking” to program
management. This adaptation corresponds to the fourth research question of this thesis: “Are
there ways to adapt lean thinking in program management?”
The author adapted the idea of Lean Enablers for Systems Engineering from Oppenheim et
al. (Oppenheim, Murman et al. 2011). Oppenheim et al. identified best practices
corresponding with lean and called them Lean Enablers. These Lean Enablers are structured
according to six lean principles: the five principles from Womack and Jones (Womack and
Jones 1996) and the principle “Respect for people” from Sugimori (Sugimori, Kusunoki et al.
1977).
The first part of this chapter gave an overview of the sources that were used to identify Lean
Enablers for program management. The sources were all briefly discussed. Afterwards, the
adaptability of Lean Enablers in general to program management was assessed.
Using the same structure as Oppenheim et al. for the Lean Enablers for program
management, six lean principles were discussed in terms of how they can be applied through
Lean Enablers for program management. The literature sources provide knowledge about
lean best practices in program environments.
Based on these findings, a list of 310 Lean Enablers for program management was created
and presented. The approach to having the industry focus group verify the
comprehensiveness and clarity of the list was then described.
7 Mapping of Lean Enablers to program management pitfalls
In this chapter, the central finding of chapter 6, the list of 310 Lean Enablers for program
management, is taken to the next step in the application of “lean thinking” to program
management. This step enables the persons involved in program management (e.g.,
program managers and members of the program office) to avoid the identified pitfalls in
program management. The approach used is to map the Lean Enablers of chapter 6 to the
program management pitfalls identified in chapter 5. The basic idea of the mapping of the
Lean Enablers to the program management pitfalls is the identification of the most beneficial
combinations. If the Lean Enablers are applied correctly, program managers can avoid the
most common pitfalls. If program managers know which pitfalls are especially likely to occur
in their programs, they can use the corresponding Lean Enablers to improve program
performance by avoiding these pitfalls.
With the mapping of the Lean Enablers to the common program management pitfalls,
program managers are fully equipped to use lean thinking in program management.
Consequently, the research question “Are there ways to adapt lean thinking in program
management?” will be answered at the end of the chapter by combining the findings of the
mapping in this chapter with the Lean Enablers and the program management pitfalls.
7.1 Methodology for the mapping
The mapping approach used in this thesis to map the Lean Enablers to the program
management pitfalls is a connection matrix between the enablers and the pitfalls. The first
decision regarding this approach was the scope of the mapping. In order to map all 310 Lean
Enablers to all 99 pitfalls, one would have to assess 30,690 possible connections. Assuming
that it takes about 10 seconds to judge whether there is a connection between a Lean
Enabler and a pitfall (this may even be an underestimate, since merely reading the pitfall
description and the Lean Enabler description takes several seconds), one would need
5,115 minutes, roughly 85 hours, to perform the mapping. In addition, it would not be very
useful if the program manager were confronted with a list of dozens of Lean Enablers for a
single pitfall. Consequently, the author decided to limit the scope of the mapping. The
approach used in this thesis maps the 35 subcategories of the Lean Enablers to the 99
pitfalls. This still leads to 3,465 possible connections, but this number is smaller by roughly a
factor of ten than the one that would result from a full mapping. This method required about
600 minutes of mapping (based on the same estimate of 10 seconds per possible
connection).
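These effort figures follow directly from the counts; as a quick check, assuming the stated 10 seconds per judgment:

```python
SECONDS_PER_JUDGMENT = 10  # stated estimate per possible connection

# Full mapping: every Lean Enabler against every pitfall.
full_connections = 310 * 99                                    # 30,690
full_minutes = full_connections * SECONDS_PER_JUDGMENT / 60    # 5,115 min

# Reduced scope: enabler subcategories against pitfalls.
reduced_connections = 35 * 99                                  # 3,465
reduced_minutes = reduced_connections * SECONDS_PER_JUDGMENT / 60

print(full_connections, full_minutes)        # 30690 5115.0
print(reduced_connections, reduced_minutes)  # 3465 577.5 (about 600)
```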
The assessment of whether a Lean Enabler subcategory would be beneficial in connection
with a pitfall was based on the author’s personal experience and opinion. The author based
his decisions on the knowledge gained during the literature research, the interviews with
program managers, and the interactions with the members of the industry focus group.
In the matrix (recorded using a Microsoft Excel spreadsheet), the pitfalls are arranged in
rows and the Lean Enabler subcategories in columns. The author judged whether a pitfall
and an enabler have a connection (indicated by an “x” in the appropriate cell) or not
(indicated by an empty cell). Figure 7-1 is a screenshot of the spreadsheet.
Figure 7-1: Screenshot from the mapping (in Microsoft Excel)
An example of a connection that was judged to be beneficial by the author is the link
between the pitfall “Lack of metrics” (ID #12) and the Lean Enabler subcategory “Plan
Leading Indicators and Metrics to Manage the Program” (2.7). An example of a lack of
connection is the same pitfall, “Lack of metrics” (ID #12), and the Lean Enablers subcategory
“Build an Organization Based on Respect for People” (6.1).
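Conceptually, the matrix is a lookup from each pitfall to the enabler subcategories marked with an “x”. A minimal sketch in Python (only the example connection from the text is included; a complete mapping would cover all 99 pitfalls and 35 subcategories):

```python
# Pitfall ID -> set of Lean Enabler subcategory numbers ("x" cells).
# Only the example from the text is shown here.
pitfall_to_enablers = {
    12: {"2.7"},  # "Lack of metrics" -> "Plan Leading Indicators and Metrics..."
}

def enablers_for(pitfall_id):
    """Return the enabler subcategories mapped to a pitfall (empty cells -> none)."""
    return pitfall_to_enablers.get(pitfall_id, set())

print(enablers_for(12))           # {'2.7'}
print("6.1" in enablers_for(12))  # False: this connection was not judged beneficial
```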
It is clear that all of the assessments made by the author regarding possible connections
were subjective. One might argue that some beneficial connections were omitted and that
some of the included connections should have been left out. But it is the author’s opinion that
a proliferation of connections would have confused rather than helped. Giving program
managers a usable tool to avoid common pitfalls in their program environment was the
purpose of the mapping. Therefore, focused, specific guidance has been provided rather
than recommending all possible Lean Enablers for each pitfall.
7.2 Results of the mapping process
This section describes various analyses of the results of the process. The main objective of
this section is to determine where the application of the Lean Enablers for program
management is most beneficial. A complete record of the mapping itself can be found – as a
list of pitfalls and corresponding Lean Enablers subcategories – in chapter VII, Appendix A.
The first results are shown in Figure 7-2. The figure displays the distribution of the number of
identified connections per program management pitfall. The average number of identified
connections to Lean Enabler subcategories for one program management pitfall is 9.17.
Most pitfalls are addressed by 4–15 Lean Enabler subcategories; very few pitfalls are
addressed by more or fewer than that.
Figure 7-2: Number of identified connections with a Lean Enabler per program management
pitfall (average: 9.17)
A second finding of the mapping is how well the various Lean Enabler categories address the
different framework themes of the LAI program management framework. Figure 7-3 displays
the numbers of identified connections between Lean Enabler categories and pitfalls that are
associated with each framework theme. The pitfalls of the framework theme “Program
Execution” are addressed by the largest number of Lean Enabler categories. On the other
hand, the number of connections between the Lean Enabler categories and the pitfalls
associated with the framework theme “Technology Integration” is very small.
Figure 7-3: Number of identified pitfall-enabler connections per framework theme (Program
Execution: 341; Product Design: 244; Scoping, Planning & Contracting: 180; Enterprise
Management: 93; Technology Integration: 41)
To accompany the detailed analysis of the findings for the different framework themes, the
following figures provide the number of identified connections between Lean Enabler
categories and the pitfalls assigned to each of the framework parts of each framework
theme. Figure 7-4 shows the number of connections for the framework theme Program
Execution. The most connections are identified between the Lean Enabler categories and the
framework parts “Integrating and leading the program organization” and “Progress monitoring
and management.” The pitfalls of the framework part “Project management” seem to be only
weakly addressed: they have only four connections to Lean Enabler categories.
[Chart: 341 connections in total across the eight framework parts; Integrating and leading the program organization (85) and Progress monitoring and management (82) have the most, Project management only 4]
Figure 7-4: Number of identified pitfall-enabler connections for the framework parts of the
theme "Program Execution"
Figure 7-5 shows the number of connections for the pitfalls of the framework parts of the
theme “Enterprise Management.” Again, one framework part, “Building and transforming the
program enterprise,” has a very large number of connections while another framework part,
“Understanding the program enterprise,” is nearly unaddressed by Lean Enabler
subcategories.
[Chart: 93 connections in total; Building and transforming the program enterprise has by far the most (55), Understanding the program enterprise only 3]
Figure 7-5: Number of identified pitfall-enabler connections for the framework parts of the
theme "Enterprise Management"
Figure 7-6 displays the results for the LAI program management framework theme “Scoping,
Planning, and Contracting.” Once more, one framework part, “Incentive alignment, contract
negotiation, and conclusion,” has many more connections than any other.
[Chart: 180 connections in total; Incentive alignment, contract negotiation & conclusion has by far the most (114)]
Figure 7-6: Number of identified pitfall-enabler connections for the framework parts of the
theme "Scoping, Planning and Contracting"
Figure 7-7 shows the results for the framework parts of the framework theme “Technology
Integration.” The numbers of identified connections between Lean Enabler subcategories and
the pitfalls corresponding to the two framework parts differ by a factor of three. However,
one must consider that the different framework parts might contain different numbers of
pitfalls. A detailed discussion of this factor will follow later in this section.
[Chart: 41 connections in total, split 30 and 11 between the two framework parts]
Figure 7-7: Number of identified pitfall-enabler connections for the framework parts of the
theme "Technology Integration"
Figure 7-8 displays the number of identified connections between the Lean Enabler
subcategories and the pitfalls of the framework parts of “Product Design.” Even though the
framework theme “Product Design” has the second-highest number of identified connections
(244), the distribution of these connections among the different framework parts is very even.
[Chart: 244 connections in total, distributed fairly evenly across the six framework parts (between 30 and 52 each)]
Figure 7-8: Number of identified pitfall-enabler connections for the framework parts of the
theme "Product Design"
As mentioned earlier, the total number of identified connections might not be the perfect
metric to assess how well the Lean Enablers help in different framework parts, since the
number of pitfalls in these parts might differ enormously. Figure 7-9 shows, for each of the
five framework themes, the number of that theme’s identified connections as a percentage of
the total number of identified connections. On average, roughly every fourth connection
(25.9%) was judged beneficial in helping to avoid pitfalls of some kind. The two
framework themes “Program Execution” and “Product Design” have the highest percentage
of identified connections. This might imply that the Lean Enablers are especially applicable to
these two themes. A possible reason for this is the fact that the Lean Enablers for program
management are primarily based on lean best practices from the literature describing the
execution of a program or product development. On the other hand, the framework theme
“Enterprise Management” seems poorly addressed by the current set of Lean Enablers for
program management. Once more, the sources of the Lean Enablers might explain why: the
lean literature used does not address the formation of a program enterprise but concentrates
on its processes and operation during program execution. Consequently, the current set of
Lean Enablers might not fit the early phases of a program well.
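The caveat about differing pitfall counts suggests normalizing each theme's connection count by its number of possible connections. A hedged sketch of that idea follows; the pitfall counts per theme are assumptions made up for illustration, while the connection totals (341 and 93) and the 35 enabler categories come from this thesis:

```python
# Normalize identified connections by possible connections per theme,
# taking possible = pitfalls_in_theme * number of Lean Enabler categories.
ENABLER_CATEGORIES = 35  # the thesis organizes its Lean Enablers in 35 categories

themes = {
    # theme: (identified_connections, pitfalls_in_theme [assumed value])
    "Program Execution":     (341, 30),
    "Enterprise Management": (93, 17),
}

shares = {
    theme: identified / (pitfalls * ENABLER_CATEGORIES)
    for theme, (identified, pitfalls) in themes.items()
}

for theme, share in shares.items():
    print(f"{theme}: {share:.1%} of possible connections identified")
```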
[Chart: average 25.9%; the five framework themes range from 15.6% to 33.2%, with Program Execution and Product Design highest and Enterprise Management lowest]
Figure 7-9: Percentage of total number of identified connections that is applicable to each
framework theme
Other insights that might be gained from the results of the mapping pertain to the level of
usefulness of the various Lean Enabler categories. In particular, the number of identified
connections from each lean principle’s categories and the percentage that number
represents of the total number of Lean Enablers within that principle’s categories are
interesting subjects for further investigation. These findings are shown in Figure 7-10. For
each principle, the red bar displays the number of identified connections, while the blue line
shows the percentage of possible connections that were established.
One can see that the highest number of connections derives from the third lean principle,
Creation of a Continuous Flow. This corresponds well with the literature’s assertion that the
third lean principle is essential to implementing the idea of lean thinking. However, the
percentage those connections make up of that principle’s total number of Lean Enablers is
not the highest. This might imply that the Lean Enablers of the third principle are important
overall, but the number of them might be reduced without a deleterious effect on outcomes.
The highest percentage of identified connections compared to the number of Lean Enablers
is found in the first lean principle, Specification of Customer Value. This might be explained
by the fact that a basic understanding of lean (especially value and waste) is found in the first
lean principle. Consequently, the first lean principle and the corresponding Lean Enablers
must be extensively utilized in order to implement the idea of lean thinking.
[Chart: bars show the number of identified connections per lean principle, highest for principle 3 (Creation of a Continuous Flow); the line shows the percentage of possible connections established, highest for principle 1 (Specification of Customer Value)]
Figure 7-10: Total number of identified connections per lean principle and percentage of possible connections identified
7.3 Verification through review by the industry focus group
To verify the results of the mapping, a document containing a list similar to that in chapter
VII, Appendix A, was sent to the members of the industry focus group for review. The results
of their reviews were discussed in a telephone conference. All participants provided feedback
that further improved the list.
All feedback about the identified connections between the Lean Enabler subcategories and
the program management pitfalls, whether provided in writing or during the telephone
conference, was used to enhance the list presented in this thesis.
In summary, all persons who contributed final feedback on the list containing the results of
the mapping between the Lean Enabler subcategories and the program management pitfalls
found the list comprehensive and the method used comprehensible.
7.4 Summary
This chapter presented an analysis that connected the program management pitfalls and
Lean Enablers identified by this study. After describing the approach used to map the
enablers to the pitfalls, the chapter presented some major findings and results in a series of
figures. It raised some questions about the quality of the Lean Enablers and the mapping.
Some could be answered by the present study, while some call for further research and
analysis.
The results of this study make it possible to give persons involved in program management
better guidance about the lean best practices to use in order to avoid common pitfalls. When
a program manager knows that his/her program is especially susceptible to specific pitfalls,
he/she can foster the use of applicable Lean Enablers and thus improve program
performance.
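Operationally, this guidance is a reverse lookup in the mapping: given the pitfalls a program is susceptible to, collect the applicable enablers. A small sketch, with pitfall names and enabler labels invented for illustration rather than taken from the thesis list:

```python
# Hypothetical pitfall-to-enabler mapping (illustrative labels only,
# not the mapping produced in this thesis).
mapping = {
    "unclear stakeholder needs": ["LE-1.1 Capture customer value",
                                  "LE-2.3 Map the value stream"],
    "late risk identification":  ["LE-5.2 Pursue perfection in risk reviews"],
}

def enablers_for(pitfalls):
    """Collect, without duplicates, the Lean Enablers applicable to the
    given pitfalls; unknown pitfalls are silently skipped."""
    result = set()
    for pitfall in pitfalls:
        result.update(mapping.get(pitfall, []))
    return sorted(result)

print(enablers_for(["unclear stakeholder needs", "late risk identification"]))
```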
The list of the Lean Enablers for program management together with the mapping explained
in this chapter can be used to implement lean thinking in program management. Hence,
chapters 6 and 7 combine to give a positive answer to the fourth and final research question
of this thesis, “Are there ways to adapt lean thinking in program management?”
8 Conclusion and research outlook
This final chapter summarizes the findings and contributions of this thesis. It also discusses
the follow-up research that is called for.
Four research questions have been raised and answered:
1. “What is program management and how can it be defined?”
2. “Is there a framework for engineering programs?”
3. “Are there common pitfalls in program management?”
4. “Are there ways to adapt lean thinking in program management?”
Figure 8-1 shows how the chapters of this thesis correspond with these questions.
[Diagram: Research Question 1 (What is program management and how can it be defined?) is answered in chapter 3; Research Question 2 (Is there a framework for engineering program management?) in chapter 4; Research Question 3 (Are there common pitfalls in program management?) in chapter 5; Research Question 4 (Are there ways to adapt lean thinking to program management?) in chapters 6 and 7]
Figure 8-1: Research questions and corresponding chapters
This thesis began by screening the current literature on program management. In particular,
existing program management frameworks were analyzed in detail. One finding of this
screening was that there is no widely accepted definition of program management. Many
authors use the term but few of them formally define it. Another finding was that there is no
widely accepted or widely used framework for technical development programs. The only
framework addressing the necessary, established aspects of program management in a
technical environment was the LAI program management framework. The source for this
framework was an unpublished, internal white paper of the LAI; the framework was based on
literature research and had not been assessed. The author refined and assessed this
framework through interviews with experienced program managers in order to ensure its
practicability and the appropriateness of using it in the further analysis in this thesis.
The next step after identifying a framework for program management was identifying
common challenges in program management. The initial collection of so-called program
management pitfalls was based on a comprehensive literature review including more than
110 literature sources. To verify the results, additional interviews with program managers
were conducted. In addition, the insights of members of an industry focus group representing
seven companies and two organizations were used for further refinement and verification of
the compilation of pitfalls.
The final idea presented in this thesis was the adaptation of lean thinking to program
management. In the existing literature, the work that appeared most promising as an
ingredient in pursuing this idea was that of Oppenheim et al. (Oppenheim, Murman et al.
2011). The authors used the idea of lean best practices, so-called Lean Enablers, to
implement lean ideas in systems engineering. Since the results are widely accepted and
praised and are compatible with the work of INCOSE (the International Council on Systems
Engineering), using a similar approach for program management made sense.
The first collection of Lean Enablers for program management, based on various sources of
“lean thinking,” was assessed in interviews with members of the industry focus group. This
assessment was used in producing a revised, comprehensive list of 310 Lean Enablers for
program management; these Lean Enablers were organized in 35 categories. Again, after
revision, the list was verified based on feedback from the members of the industry focus
group.
To help program managers improve program performance by avoiding common pitfalls, the
Lean Enablers for program management were mapped to the identified pitfalls. This idea
originated in the hypothesis that the use of specific Lean Enablers might be beneficial in
circumventing pitfalls. The analysis of connections between the Lean Enablers and the
pitfalls provided great insights regarding the applicability of the Lean Enablers in different
parts of program management.
Figure 8-2 summarizes the contributions of this thesis in two categories. The provision of a
clear definition of program management, the evaluation of existing frameworks, and the
refinement and verification of the LAI program management framework will be used in the
final, published version of the LAI program management white paper. The collections of
program management pitfalls and Lean Enablers, together with the mapping, can be directly
applied or can be used as bases for further intensive research in the industry focus group.
Relevant to the LAI program management whitepaper:
1. Provision of a clear definition of program management (PM)
2. Comparison and evaluation of existing PM frameworks
3. Refinement and verification of the LAI PM framework
Relevant to further research in the industry focus group:
4. Identification of common PM pitfalls
5. Identification of Lean Enablers for PM
6. Mapping of Lean Enablers to PM pitfalls
Figure 8-2: Contributions of this thesis
Research in the following areas is called for to follow up on the work of this thesis:
- Due to the small number of interviews (five), one might argue that further verification
of the LAI program management framework is necessary. Since all of the
interviewees stated that no further major refinements are necessary, a survey
approach might be used for a final verification of the framework.
- The current list of 99 pitfalls consists of root causes as well as secondary effects. It
might be beneficial or even necessary for further research to distinguish these two
categories. A discussion with experts (e.g., the program managers interviewed to
verify the LAI program management framework and members of the industry focus
group) could lead to a connection diagram or fishbone diagram of the 99 pitfalls
identified in this thesis. A brief discussion during one of the interviews indicated that
some root causes might be identifiable as such with little effort and that it also might
not be difficult to accurately trace many secondary-effect pitfalls back to their root
causes.
- The literature basis used for the development of the Lean Enablers for program
management might be too small. There are several other sources that describe lean
thinking in different environments. The main focus of the sources used in this thesis
was product development, and it might be necessary to examine adaptations of lean
thinking in additional environments. Section 0 discusses briefly the fact that the early
phases of program management (development of the program organization and the
immediately following steps) seem not to be optimally covered by the current set of
Lean Enablers for program management.
- In addition to drawing upon more sources for the Lean Enablers, further verification of
the Lean Enablers might be necessary. The current list of them was only reviewed by
a small number of people (mostly the members of the industry focus group). Again, a
survey might be applicable. Oppenheim et al. (Oppenheim, Murman et al. 2011) used
the same approach to verify their collection of Lean Enablers for systems
engineering.
IV Table of symbols and abbreviations
ACAT Acquisition category
AoA Analysis of Alternatives
CDD Capability Development Document
CE Chief Engineer
CPD Capability Production Document
DAU Defense Acquisition University
DoD Department of Defense
DOTMLPF Doctrine, Organization, Training, Materiel,
Leadership and Education, Personnel, and
Facilities
GAO Government Accountability Office
IFG Industry Focus Group
INCOSE International Council on Systems Engineering
IPB Initial Production Baseline
IPT Integrated Product Team
JCIDS Joint Capabilities Integration and Development
System
LAI Lean Advancement Initiative
LRIP Low-Rate Initial Production
MSP Managing Successful Programmes
OGC Office of Government Commerce
PD Product Development
PDR Preliminary Design Review
PM Program Management
PMBOK guide “A Guide to the Project Management Body of
Knowledge” by PMI
PMI Project Management Institute
PO Program Office
PPBE Planning, Programming, Budgeting, and
Execution
RDT&E Research, Development, Testing and Evaluation
SFR System Functional Review
SRR System Requirements Review
USD(AT&L) Under Secretary of Defense for Acquisition,
Technology, and Logistics
V Index of literature
(AT&L), U. S. o. D. (2010). Better Buying Power: Guidance for Obtaining Greater Efficiency
and Productivity in Defense Spending. Memorandum. Washington D.C.
Aday, L. A. and L. J. Cornelius (2006). Designing and conducting health surveys: a
comprehensive guide, Jossey-Bass Inc Pub.
Ahern, D. (2009). OSD Study of Program Manager Training and Experience, Defense
Acquisition University Press.
America, U. S. o. (2009). Public Law 111–23: Weapon System Acquisition Reform Act
(WSARA).
Andersen, E. S. and S. A. Jessen (2003). "Project maturity in organisations." International
Journal of Project Management 21(6): 457-461.
Andrews, R., J. Cooper, et al. (2010). House Armed Services Committee Panel on Defense
Acquisition Reform Findings and Recommendations.
Archibald, R. D. (2003). Managing high-technology programs and projects: A complete,
practical, and proven approach to managing large-scale projects with
emphasis on those involving advanced technology. Hoboken, NJ, John Wiley
& Sons Inc; Wiley.
Baccarini, D. (1996). "The concept of project complexity-a review." International Journal of
Project Management 14(4): 201-204.
Bador, D. (2007). Improving the Commonality Implementation in the Cockpit of Commercial
Aircraft. Cambridge, MA, Master's thesis, LAI and MIT.
Beckert, M. T. (2000). Organizational Characteristics for Successful Product Line
Engineering. Cambridge, MA, Master's thesis, LAI and MIT.
Bernstein, J. I. (1998). Design Methods in the Aerospace Industry: Looking for Evidence of
Set-Based Practices. Cambridge, MA, Master's thesis, LAI and MIT.
Bernstein, J. I. (2000). Multidisciplinary Design Problem Solving on Product Development
Teams. Cambridge, MA, PhD thesis, LAI and MIT.
Bessant, J., S. Caffyn, et al. (1994). "Rediscovering continuous improvement." Technovation
14(1): 17-29.
Blackburn, C. D. (2009). Metrics for Enterprise Transformation. Cambridge, MA, Master's
thesis, MIT and LAI.
Boas, R. C. (2008). Commonality in Complex Product Families: Implications of Divergence
and Lifecycle Offsets. Cambridge, MA, PhD thesis, LAI and MIT.
Boehm, B., R. Valerdi, et al. (2008). "The ROI of Systems Engineering: Some Quantitative
Results for Software-Intensive Systems." Systems Engineering 11(3): 221-
234.
Bozdogan, K. (2005). Supplier networks transformation toolset. Cambridge, MA, Lean
Advancement Initiative, Massachusetts Institute of Technology.
Bozdogan, K., J. Deyst, et al. (1998). "Architectural innovation in product development
through early supplier integration." R&D Management 28(3): 163-173.
Bresman, P. H. M. (2004). Learning Strategies and Performance in Organizational Teams.
Cambridge, MA, PhD thesis, MIT and LAI.
Bresnahan, S. M. (2006). "Understanding and Managing Uncertainty in Lean Aerospace
Product Development." Master's Thesis Retrieved 01/14, 2010.
Brown, J. T. (2007). The Handbook of Program Management. New York, McGraw-Hill
Professional.
Browning, T. R. (1996). Systematic IPT Integration in Lean Development Programs.
Cambridge, MA, Master's thesis, LAI and MIT.
Browning, T. R., J. J. Deyst, et al. (2002). "Adding Value in Product Development by Creating
Information and Reducing Risk." IEEE Transactions on Engineering
Management 49(4): 443-458.
Cameron, B. G., E. F. Crawley, et al. (2008). "Value flow mapping: Using networks to inform
stakeholder analysis." Acta Astronautica 62(4-5).
Carreras, C. E. (2002). Opportunities for Lean Thinking in Aircraft Flight Testing and
Evaluation, Master Thesis, LAI and Massachusetts Institute of Technology.
CMU (2008). Acquisition Archetypes: Firefighting. C. M. U.-S. E. Institute.
Cohen, J. L. (2005). United States Air Force Air Logistics Centers: Lean Enterprise
Transformation and Associated Capabilities. Cambridge, MA, Master's thesis,
MIT and LAI.
Commerce, O. o. G. (2007). Managing successful programmes. London, TSO.
Commerce, O. o. G. (2009). Managing successful projects with PRINCE2. London, TSO.
Constantinescu, R. and I. M. Iacob (2007). "Capability Maturity Model Integration." Journal of
Applied Quantitative Methods 2(1): 187.
Cooper, R. G. (2001). Winning at New Products: Accelerating the Process from Idea to
Launch, Basic Books.
Cowap, S. A. (1998). Economic Incentives in Aerospace Weapon Systems Procurement,
Master Thesis, MIT and LAI.
Cunningham, T. W. (1998). Chains of Function Delivery: A Role for Product Architecture in
Concept Design. Cambridge, MA, PhD thesis, LAI and MIT.
Dare, R. E. (2003). Stakeholder Collaboration in Air Force Acquisition: Adaptive Design
Using System Representations. Cambridge, MA, PhD thesis, LAI and MIT.
DAU (2010). Public Access Course Material of the Defense Acquisition University.
https://myclass.dau.mil/webapps/portal/frameset.jsp?tab_id=_54_1, Defense
Acquisition University.
Davidz, H. L. (2006). Enabling Systems Thinking to Accelerate the Development Senior
Systems Engineers. Cambridge, MA, PhD Thesis, LAI and MIT.
Davies, A. and T. Brady (2000). "Organisational capabilities and learning in complex product
systems: towards repeatable solutions." Research Policy 29(7-8): 931-953.
Defense, D. o. (1998). "Integrated Product and Process Development Handbook." from
https://acc.dau.mil/adl/en-
US/37477/file/9017/DoD%20IPPD%20Handbook%20Aug%2098.pdf.
Defense, D. o. (2005). Manager's Guide to Technology Transition in an Evolutionary
Acquisition Environment. Fort Belvoir, Defense Acquisition University Press.
Defense, D. o. (2006). Risk Management Guide for DoD Acquisitions.
http://www.acq.osd.mil/se/docs/2006RMGuide4Aug06finalversion.pdf,
OUSD(AT&L) Systems and Software Engineering.
Defense, D. o. (2010). "Dictionary of Military and Associated Terms." from
http://www.dtic.mil/doctrine/new_pubs/jp1_02.pdf.
Deonandan, I., R. Valerdi, et al. (2010). Cost and Risk Considerations for Test and
Evaluation of Unmanned and Autonomous Systems of Systems.
Loughborough, UK, 5th IEEE International Conference on Systems of
Systems Engineering, June 2010.
Derleth, J. E. (2003). Multi-Attribute Tradespace Exploration and its Application to
Evolutionary Acquisition. Cambridge, MA, Master's thesis, LAI and MIT.
Dickerson, C. and R. Valerdi (2010). Using Relational Model Transformations to Reduce
Complexity in SoS Requirements Traceability: Preliminary Investigation.
Loughborough, UK, 5th IEEE International Conference on Systems of
Systems Engineering, June 2010.
DoD (2003). Department of Defense Directive 5000.01.
DoD (2008). Department of Defense Instruction 5000.02.
DoD (2008). Operation of the Defense Acquisition System, Department of Defense
Instruction (DoDI) 5000.02, Department of Defense.
Dorey, S. P. (2011). Enhancing Cost Realism Through Risk-Driven Contracting: Designing
Incentive Fees Based on Probabilistic Cost Estimates, Master's Thesis, LAI
and Massachusetts Institute of Technology.
Downen, T. D. (2005). A Multi-Attribute Value Assessment Method for the Early Product
Development Phase With Application to the Business Airplane Industry.
Cambridge, MA, PhD Thesis, MIT and LAI.
E-Programme.com. (2011). "Homepage." from http://www.E-Programme.com.
Eisenhardt, K. (1989). "Building theories from case study research." Academy of
management review 14(4): 532-550.
Eisenhardt, K. (1991). "Better stories and better constructs: The case for rigor and
comparative logic." Academy of management review 16(3): 620-627.
Emiliani, M. L. (2004). "Improving business school courses by applying lean principles and
practices." Quality Assurance in Education 12(4): 175-187.
Ferdowsi, B. (2003). Product Development Strategies in Evolutionary Acquisition.
Cambridge, MA, Master's thesis, LAI and MIT.
Ferns, C. (1991). "Developments in programme management." International Journal of
Project Management 9(3): 148-156.
Forseth, C. E. (2002). The Pursuit of Acquisition Intrapreneurs. Cambridge, MA, Research
Report, LAI and MIT.
Fowler, F. (1995). Improving Survey Questions. Design and Evaluation. Thousand Oaks,
SAGE Publications.
Fowler, F. (2009). Survey Research Methods. Thousand Oaks, SAGE Publications.
Francis, P. (2009). Defense Acquisitions: Charting a Course for Lasting Reform:
Congressional Testimony. Washington D.C., DIANE Publishing.
Francis, P., M. Golden, et al. (2010). Defense acquisitions: Managing Risk to Achieve Better
Outcomes. Washington D.C.
Fricke, S. E. and A. J. Shenhar (2000). "Managing multiple engineering projects in a
manufacturing support environment." IEEE Transactions on Engineering
Management 47(2): 258-268.
GAO (2006). Defense Acquisitions - Major Weapon Systems Continue to Experience Cost
and Schedule Problems under DOD’s Revised Policy (GAO-06-368).
Washington, United States Government Accountability Office, Report to
Congressional Committees.
GAO (2010). Defense Acquisitions - Managing Risk to Achieve Better Outcomes (GAO-10-
374T). Washington, D.C., United States Government Accountability Office.
Glazner, C. G. (2006). Enterprise Integration Strategies Across Virtual Extended Enterprise
Networks: A Case Study of the F-35 Joint Strike Fighter Program Enterprise.
Cambridge, MA, Master's thesis, LAI and MIT.
Gordon, M., C. Musso, et al. (2009). "The Path to Developing Successful New Products."
The Wall Street Journal, November 30, 2009.
Görög, M. (2011). "Translating single project management knowledge to project programs."
Project Management Journal 42(2): 17-31.
Graban, M. (2008). Lean hospitals: Improving quality, patient safety, and employee
satisfaction. Florence, KY, Productivity Press.
Haberfellner, R. and W. F. Daenzer (2002). Systems Engineering - Methodik und Praxis.
Zürich, Verlag Industrielle Organisation.
Hammer, M., C. J. Haney, et al. (2007). "The 7 Deadly Sins of Performance Measurement."
MIT Sloan Management Review 48(3): 80-.
Haughey, D. (2001). "A perspective on programme management." Project Smart Website.
Hauser, J. and G. Katz (1998). "Metrics: you are what you measure!" European Management
Journal 16(5): 517-528.
Hernandez, C. M. (1995). Challenges and Benefits to the Implementation of Integrated
Product Teams on Large Military Procurements. Cambridge, MA, Master's
thesis, LAI and MIT.
Herweg, G. M. and K. E. Pilon (2001). System Dynamics Modeling for the Exploration of
Manpower Project Staffing Decisions in the Context of a Multi-Project
Enterprise, Master Thesis, LAI and Massachusetts Institute of Technology.
Hess, J., G. Agarwal, et al. (2010). Normative and Descriptive Models for Test & Evaluation
of Unmanned and Autonomous Systems of Systems. Chicago, IL, 20th
INCOSE Symposium, July 2010.
Hess, J. T. and R. Valerdi (2010). Test and Evaluation of a SoS using a Prescriptive and
Adaptive Testing Framework. Loughborough, UK, 5th IEEE International
Conference on Systems of Systems Engineering, June 2010.
Hoppmann, J. (2009). The Lean Innovation Roadmap - A Systematic Approach to
Introducing Lean in Product Development Processes and Establishing a
Learning Organization. Institute of Automotive Management and Industrial
Production. Braunschweig, Technische Universitaet Braunschweig.
Hoppmann, J. (2009). The Lean Innovation Roadmap: A Systematic Approach to Introducing
Lean in Product Development Processes and Establishing a Learning
Organization. Lean Advancement Initiative. Cambridge, Massachusetts
Institute of Technology (MIT).
Hut, P. M. (2008). "How Program Management Differs From Project Management." from
http://www.pmhut.com/how-program-management-differs-from-project-
management.
Imai, M. (1986). Kaizen: The key to Japan's competitive success. New York, McGraw-Hill.
INCOSE. (2011). "International Council on Systems Engineering (INCOSE) website."
Retrieved March 6, 2011.
Institute, P. M. (2004). A Guide to the Project Management Body of Knowledge: (PMBOK
guide); an American national standard, ANSI/PMI 99-001-2004. Newton
Square, Pa.
Institute, P. M. (2006). The Standard for Program Management. Newton Square,
Pennsylvania, Project Management Institute.
Institute, P. M. (2008). The Standard for Program Management. Newton Square,
Pennsylvania, Project Management Institute.
Institute, P. M. (2011). "Project Management Institute website." Retrieved 20.02.2011, 2011,
from http://www.pmi.org/.
Ippolito, B. J. (2000). Identifying Lean Practices for Deriving Software Requirements.
Cambridge, MA, Master's thesis, LAI and MIT.
ISO (1994). ISO 8402:1994(E) - Quality management and quality assurance -- Vocabulary.
Geneva, International Organization for Standardization.
ISO (2005). ISO 9000:2005(E) - Quality management systems -- Fundamentals and
vocabulary. Geneva, International Organization for Standardization.
Ittner, C. and D. Larcker (1998). "Innovations in performance measurement: Trends and
research implications." Journal of management accounting research 10: 205-
238.
Jobo, R. S. (2003). Applying the Lessons of “Lean Now!” To Transform the US Aerospace
Enterprise - A study guide for government lean transformation. Cambridge,
MA, LAI Report.
Jones, C. M. (2006). Leading Rockwell Collins Lean transformation. San Antonio, TX, April 9,
2006 LAI Keynote Talk: http://mitworld.mit.edu/video/378/.
Kadish, R. T., G. Abbott, et al. (2006). "Defense acquisition performance assessment report."
from https://acc.dau.mil/adl/en-
US/18554/file/733/Defense%20Acquisition%20Performance%20Assessment
%20Report%202006.pdf.
Kato, J. (2005). Development of a Process for Continuous Creation of Lean Value in Product
Development Organizations, Master Thesis. LAI and Massachusetts Institute
of Technology.
Kenley, C. R. and T. R. Creque (1999). "Predicting Technology Operational Availability Using
Technical Maturity Assessment." Systems Engineering 2(4): 198-211.
Kerr, S. (1995). "An academy classic: On the folly of rewarding A, while hoping for B."
Academy of Management Executive 9(1): 7-14.
Kinscher, K. (2011). Interviews with Air Force program managers. Cambridge, MA, United
States, Lean Advancement Initiative.
Kirtley, A. L. (2002). Fostering Innovation across Aerospace Supplier Networks, Master
Thesis, LAI and Massachusetts Institute of Technology.
Klein, J., J. Cutcher-Gershenfeld, et al. (1997). Implementation Workshop: High Performance
Work Organizations. Cambridge, MA, LAI Report RP97-02-34.
Kotter, J. P. (1990). A force for change: How leadership differs from management. New York, Free Press.
Krafcik, J. F. (1988). "Triumph of the lean production system." Sloan Management Review
30(1): 41-52.
LAI (2001). Lean Enterprise Self-Assessment Tool (LESAT) Version 1.0.
LAI (2011). Seminar "Lean in Program Management: Establishment of a Community of
Practice". Seal Beach, CA, Lean Advancement Initiative.
Lamb, C. M. T. (2009). Collaborative Systems Thinking: An exploration of the mechanisms
enabling team systems thinking. Cambridge, MA, PhD thesis, LAI and MIT.
Laraia, A. C., P. E. Moody, et al. (1999). The Kaizen Blitz: accelerating breakthroughs in
productivity and performance. New York, Wiley.
LEI (2007). Reflections on Lean. Boston, Lean Enterprise Institute.
Levine, L. and B. Novak (2008). Identifying Acquisition Patterns of Failure Using Systems
Archetypes. Systems Conference, 2008 2nd Annual IEEE, Montreal, QC,
Canada, IEEE.
Likert, R. (1932). "A technique for the measurement of attitudes." Archives of Psychology
140: 1-55.
Loureiro, G., E. Crawley, et al. (2006). "From Value to Architecture - Ranking the Objectives
of Space Exploration (IAC-06-D1.4.2)." Proceedings of the International
Astronautical Congress (IAC 2006), Valencia, Spain, 2 - 6 October 2006.
Lucas, M. V. (1996). Supplier Management Practices of the Joint Direct Attack Munition
Program. Cambridge, MA, Master's thesis, LAI and MIT.
Lycett, M., A. Rassau, et al. (2004). "Programme management: a critical review."
International Journal of Project Management 22(4): 289-299.
Maass, E. and P. D. McNair (2009). Applying Design for Six Sigma to Software and
Hardware Systems, Prentice Hall Press Upper Saddle River, NJ, USA.
Madachy, R. and R. Valerdi (2010). "Automating Systems Engineering Risk Assessment."
Proceedings of the 8th Conference on Systems Engineering Research,
Hoboken, NJ, March 17-19 2010.
Mahé, V. R. (2008). A Survey of Front End Modularity as an Automotive Architecture and its
Ability to Deliver Value. Cambridge, MA, Master's thesis, MIT.
Association for Project Management (2006). APM Body of Knowledge. High Wycombe, UK, APM.
Mandelbaum, J., W. S. Kaplan, et al. (2001). Incentive Strategies for Defense Acquisitions.
Washington, D.C., Department of Defense.
Mao, P., H. Cheng, et al. (2008). Innovation of Large-Scale Project Multi-Programme
Construction Management Mode, IEEE.
Maylor, H., T. Brady, et al. (2006). "From projectification to programmification." International
Journal of Project Management 24(8): 663-674.
McGarry, J. and D. Card (2002). Practical software measurement: objective information for
decision makers. Boston, Addison-Wesley Professional.
McKenna, N. (2006). The Micro-foundations of Alignment among Sponsors and Contractors
on Large Engineering Projects. Cambridge, MA, Master's thesis, MIT and LAI.
McKnew, G. (2010). An Examination of the Patterns of Failure in Defense Acquisition
Programs, Master Thesis. LAI and Massachusetts Institute of Technology.
McManus, H. (2005). Product Development Value Stream Mapping (PDVSM) Manual.
Cambridge, MA, Lean Advancement Initiative (LAI) at MIT.
McManus, H., A. Haggerty, et al. (2007). "Lean engineering: a framework for doing the right
thing right." Aeronautical Journal 111(1116): 105-114.
McNutt, R. T. (1998). Reducing DoD Product Development Time: The Role of the Schedule
Development Process, PhD Thesis, LAI and Massachusetts Institute of
Technology.
McVey, M. E. (2002). Valuation Techniques for Complex Space Systems: An Analysis of a
Potential Satellite Servicing Market. Cambridge, MA, Master's thesis, LAI and
MIT.
Michaels, D. Airbus Woes Darken Jet Delivery. Wall Street Journal.
Morgan, J. M. and J. K. Liker (2006). The Toyota product development system: integrating
people, process, and technology. New York, Productivity Press.
Morgan, S. (1999). The Cost and Cycle Time Implications of Selected Contractor and Air
Force System Program Office Management Policies during the Development
Phase of Major Aircraft Acquisition Programs, Master Thesis, LAI and
Massachusetts Institute of Technology.
Neuman, W. (2006). Social Research Methods. Qualitative and Quantitative Approaches.
Boston, Pearson.
Nightingale, D., A. Stanke, et al. (2008). Enterprise Strategic Analysis and Transformation
(ESAT) - Version 2.0. Cambridge, MA, LAI Guide.
Nightingale, D. J. and J. H. Mize (2001). "Development of a Lean Enterprise Transformation
Maturity Model." Information, Knowledge, Systems Management 3(1): 15-30.
Nuffort, M. R. (2001). Managing Subsystem Commonality. Cambridge, MA, Master's thesis,
LAI and MIT.
Oehmen, J. (2005). Approaches to Crisis Prevention in Lean Product Development by High
Performance Teams and through Risk Management. Munich and Cambridge,
Technical University of Munich and LAI. Master's Thesis: 161p.
Oehmen, J. and M. Ben-Daya (2010). A Reference Model for Risk Management in Product
Development Programs. Research Paper of the MIT-KFUPM Center of Clean
Water and Energy, Cambridge and Dhahran, MIT and KFUPM.
Oehmen, J. and E. Rebentisch (2010). Risk Management in Lean PD. LAI Paper Series
"Lean Product Development for Practitioners", Cambridge, MA, LAI and MIT.
Oehmen, J. and E. Rebentisch (2010). Waste in Lean Product Development. LAI Paper
Series "Lean Product Development for Practitioners", Cambridge, MA, LAI and
MIT.
Oehmen, J. and W. Seering (2011). Risk-Driven Design Processes – Balancing Efficiency
with Resilience in Product Design. The Future of Design Methodology. H.
Birkhofer. London, Springer.
GAO (2010). Defense Acquisition: Assessments of Selected Weapon Programs. GAO-10-
388SP. Washington, D.C., Government Accountability Office.
OGC (2009). Managing Successful Projects with PRINCE2. London, Office of Government
Commerce, The Stationery Office.
Oppenheim, B. W. (2004). "Lean product development flow." Systems Engineering 7(4):
352-378.
Oppenheim, B. W., E. M. Murman, et al. (2011). "Lean enablers for systems engineering."
Journal of Systems Engineering 14(1): 29-55.
Paduano, R. (2001). Employing Activity Based Costing and Management Practices Within
the Aerospace Industry: Sustaining the Drive for Lean. Cambridge, MA,
Master's thesis, MIT and LAI.
Patanakul, P. and D. Milosevic (2008). "A competency model for effectiveness in managing
multiple projects." The Journal of High Technology Management Research
18(2): 118-131.
Patanakul, P. and D. Milosevic (2009). "The effectiveness in managing a group of multiple
projects: Factors of influence and measurement criteria." International Journal
of Project Management 27(3): 216-233.
Pessôa, M. V. P. (2008). Weaving the waste net: a model to the product development
system low performance drivers and its causes. Cambridge, MA, LAI White
Paper 08-01.
Platje, H. (1993). "Breakthrough in multiproject management: how to escape the vicious
circle of planning and control." International Journal of Project Management
11(4): 209-213.
PMI (2004). A Guide to the Project Management Body of Knowledge (PMBOK Guide).
Newtown Square, PA, Project Management Institute.
PMI (2008). A guide to the project management body of knowledge (PMBOK guide). Drexel
Hill, PA, Project Management Institute.
Pomponi, R. A. (1998). Organizational structures for technology transition: rethinking
information flow in the integrated product team. Cambridge, MA, PhD thesis,
LAI and MIT.
Rebentisch, E. (1996). Preliminary Observations on Program Instability. Cambridge, MA, LAI
Whitepaper LEAN 96-03.
Rhodes, D. H., R. Valerdi, et al. (2009). "Systems Engineering Leading Indicators for
Assessing Program and Technical Effectiveness." Systems Engineering 12(1):
21-35.
Roberts, C. J. (2003). Architecting Evolutionary Strategies Using Spiral Development for
Space Based Radar. Cambridge, MA, Master's thesis, LAI and MIT.
Roedler, G., D. H. Rhodes, et al. (2010). Systems Engineering Leading Indicators Guide
(Version 2.0), INCOSE-TP-2005-001-03.
Ross, A. M. (2003). Multi-Attribute Tradespace Exploration with Concurrent Design as a
Value-Centric Framework for Space System Architecture and Design.
Cambridge, MA, Master's thesis, LAI and MIT.
Roth, G. (2008). The Order and Chaos of the Learning Organization. Handbook of
Organization Development. T. Cummings. Newbury, CA, Sage.
Sapsed, J. and A. Salter (2004). "Postcards from the edge: local communities, global
programs and boundary objects." Organization Studies 25(9): 1515-1534.
Schmenner, R. and T. Vollmann (1994). "Performance Measures: Gaps, False Alarms, and
the “Usual Suspects”." International Journal of Operations & Production
Management 14(12): 58-69.
Schuh, G., H. Adickes, et al. (2008). Lean Innovation - Auf dem Weg zur Systematik. AWK
Aachener Werkzeugmaschinen-Kolloquium: Wettbewerbsfaktor
Produktionstechnik - Aachener Perspektiven, Aachen, Apprimus Verlag.
Schuh, G., M. Lenders, et al. (2008). Lean innovation: introducing value systems to product
development: Management of Engineering & Technology, 2008. PICMET
2008. Portland International Conference on, IEEE.
Schuh, G., M. Lenders, et al. (2010). Lean Innovation – Introducing takt time to product
development processes. Proceedings of APMS2010 - International
Conference on Advances in Production Management Systems. Milan,
Poliscript - Politecnico di Milano.
Shroyer, E. (2002). Lean Transition of Emerging Industrial Capability (LeanTEC), Final
Report, Cooperative Agreement F33615-97-2-5153, U.S. Air Force and
Boeing Company.
Siegel, L. R. (2004). Measuring and Managing Intellectual Capital in the U.S. Aerospace
Industry. Cambridge, MA, Master Thesis, LAI and MIT.
Silva, L. M. (2001). A Partitioning Methodology for Helicopter Avionics System with a focus
on Life Cycle Cost. Cambridge, MA, Master's thesis, LAI and MIT.
Sobek, D. K., J. K. Liker, et al. (1998). "Another look at how Toyota integrates product
development." Harvard Business Review 76(4): 36-47.
Sobek, D. K., A. C. Ward, et al. (1999). "Toyota's principles of set-based concurrent
engineering." Sloan Management Review 40(2): 67-84.
Spaulding, T. J. (2003). Tools for Evolutionary Acquisition: A Study of Multi-Attribute
Tradespace Exploration (MATE) Applied to the Space Based Radar.
Cambridge, MA, Master's thesis, LAI and MIT.
Stagney, D. B. (2003). The Integrated Concurrent Enterprise. Cambridge, MA, Master's
thesis, LAI and MIT.
Stanke, A., D. Nightingale, et al. (2008). Enterprise Strategic Analysis and Transformation
(ESAT) - Facilitator’s Guide - Version 2.0. Cambridge, MA, LAI Guide.
Stanke, A. K. (2006). Creating High Performance Enterprises, PhD Thesis, LAI and
Massachusetts Institute of Technology.
Sugimori, Y., K. Kusunoki, et al. (1977). "Toyota production system and kanban system
materialization of just-in-time and respect-for-human system." International
Journal of Production Research 15(6): 553-564.
Susman, G. and I. Petrick (1995). Product Development Team Effectiveness. Cambridge,
MA, LAI Whitepaper 95-06.
Tang, C. S. and J. D. Zimmerman (2009). "Managing New Product Development and Supply
Chain Risks: The Boeing 787 Case." Supply Chain Forum: An International
Journal 10(2): 74-85.
Tang, V. (2006). Corporate Decision Analysis: An Engineering Approach. Cambridge, MA,
PhD thesis, LAI and MIT.
Tang, V. and K. N. Otto (2009). "Multifunctional Enterprise Readiness: Beyond the Policy of
Build-Test-Fix Cyclic Rework." Proceedings of the ASME 2009 International
Design Engineering Technical Conferences, Design Theory and Methodology
Conference (IDETC/DTM 2009), August 30 - September 2, 2009, San Diego,
California: 1-9.
Tondreault, J. P. (2003). Improving the Management of System Development to Produce
More Affordable Military Avionics Systems. Cambridge, MA, Master's thesis,
LAI and MIT.
Defense Acquisition University (2009). Introduction to Defense Acquisition Management.
Fort Belvoir, Defense Acquisition University Press.
Defense Acquisition University (2011). Defense Acquisition Guidebook. Fort Belvoir,
Defense Acquisition University Press: online accessible at https://dag.dau.mil.
Valerdi, R. (2005). The Constructive Systems Engineering Cost Model (COSYSMO). Los
Angeles, CA, PhD Thesis, USC.
Valerdi, R. (2010). "Heuristics for Systems Engineering Cost Estimation." IEEE Systems
Journal.
Valerdi, R., J. E. Rieff, et al. (2007). Lessons Learned From Industrial Validation of
COSYSMO. San Diego, CA, 17th INCOSE Symposium, June 2007.
Vereecke, A., E. Pandelaere, et al. (2003). "A classification of development programmes and
its consequences for programme management." International Journal of
Operations & Production Management 23(10): 1279-1290.
Wagner, C. (2007). Specification Risk Analysis: Avoiding Product Performance Deviations
through an FMEA-based Method. Munich and Cambridge, Technical
University of Munich and LAI. Master's Thesis: 149p.
Walton, M. A. (1999). Identifying the Impact of Modeling and Simulation in the Generation of
System Level Requirements. Cambridge, MA, Master's thesis, LAI and MIT.
Ward, A. C. (2007). Lean product and process development. Cambridge, United States,
Lean Enterprises Institute Inc.
Weigel, A. L. (2000). Spacecraft System-Level Integration and Test Discrepancies:
Characterizing Distribution and Costs. Cambridge, MA, Master's thesis, MIT
and LAI.
Wheelwright, S. C. and K. B. Clark (1992). Revolutionizing product development: Quantum
leaps in speed, efficiency, and quality. New York, Free Press; Maxwell
Macmillan Canada; Maxwell Macmillan International.
Whitaker, R. B. (2005). Value Stream Mapping and Earned Value Management: Two
Perspectives on Value in Product Development. Cambridge, MA, Master's
thesis, MIT and LAI.
Williams, D. and T. Parr (2006). Enterprise programme management: Delivering value.
Basingstoke, Palgrave MacMillan.
Williams, T. M. (1999). "The need for new paradigms for complex projects." International
Journal of Project Management 17(5): 269-273.
Wirthlin, J. R. (2000). Best Practices in User Needs / Requirements Generation. Cambridge,
MA, Master's thesis, LAI and MIT.
Wirthlin, J. R. (2009). Identifying enterprise leverage points in Defense Acquisition Program
performance. Engineering Systems Division. Cambridge, Massachusetts
Institute of Technology (MIT).
Wirthlin, J. R., W. Seering, et al. (2008). "Understanding Enterprise Risk Across an
Acquisition Portfolio: A Grounded Theory Approach." Seventh National
Symposium on Space Systems Engineering & Risk Management, Los
Angeles, CA, February 26-29, 2008.
Womack, J. P. and D. T. Jones (1996). Lean thinking: Banish waste and create wealth in
your corporation. New York, Simon & Schuster.
Womack, J. P. and D. T. Jones (2003). Lean thinking: Banish waste and create wealth in
your corporation. New York, Free Press.
Womack, J. P., D. T. Jones, et al. (1990). The machine that changed the world: Based on the
Massachusetts Institute of Technology 5-million-dollar 5-year study on the
future of the automobile. New York, Rawson Associates.
Yin, R. (2009). Case study research: Design and methods, Sage Publications, Inc.
Yin, R. K. (2003). Case study research - Design and methods. Thousand Oaks, Sage
Publications.
VI Table of figures and spreadsheets
Figure 1-1: Total Department of Defense acquisition portfolio: actual cost compared to initial
estimate (GAO 2006; GAO 2010) ..................................................................................... 2
Figure 1-2: Thesis organization ................................................................................................ 4
Figure 1-3: Research question and corresponding sections in chapter 2 ................................ 5
Figure 2-1: Correlation of the research questions and sections ............................................... 6
Figure 2-2: DoD Decision Support Systems (University 2011) ................................................ 7
Figure 2-3: Life cycle for defense acquisitions (University 2011) ............................................. 9
Figure 2-4: The “Managing Successful Programmes” framework of the Office of Government
Commerce (Commerce 2007) ........................................................................................ 12
Figure 2-5: Portfolio, programs and projects: high-level view (Institute 2008) ....................... 14
Figure 2-6: Interaction between project management and program management (Institute
2008) .............................................................................................................................. 15
Figure 2-7: Program life cycle and program benefits management (Institute 2008) .............. 17
Figure 2-8: Program structure and stakeholders .................................................................... 18
Figure 2-9: Lean Advancement Initiative (LAI) program management framework ................. 19
Figure 2-10: Correlation of program life cycle phases and the stages of a Stage-Gate
process ........................................................................................................................... 24
Figure 2-11: Usage of different research approaches in the thesis ........................................ 31
Figure 2-12: Research questions and corresponding sections .............................................. 35
Figure 4-1: The Lean Advancement Initiative (LAI) program management framework .......... 47
Figure 4-2: Interview results for Program Execution and Enterprise Management ................ 49
Figure 4-3: Interview results for Scoping, Planning & Contracting ......................................... 50
Figure 4-4: Interview results for Technology Integration ........................................................ 50
Figure 4-5: Interview results for Product Design .................................................................... 51
Figure 4-6: Average rating of all framework parts combined (including standard deviation) .. 52
Figure 4-7: Difficulty and importance rating of the most relevant elements of the LAI program
management framework ................................................................................................. 53
Figure 4-8: Legend for Figure 4-2 .......................................................................................... 53
Figure 4-9: Rating of the importance of framework themes by members of the industry focus
group .............................................................................................................................. 55
Figure 5-1: Distribution of the number of sources for all pitfalls ............................................. 67
Figure 5-2: Distribution of number of pitfalls per framework part ........................................... 71
Figure 5-3: Number of pitfalls for the major areas within the LAI program management
framework ....................................................................................................................... 72
Figure 6-1: Percentage of the compiled Lean Enablers for program management that were
derived from each literature source .............................................................................. 100
Figure 6-2: Total number of Lean Enablers and categories ................................................. 101
Figure 6-3: Distribution of the Lean Enablers and the subcategories among the six lean
principles ...................................................................................................................... 101
Figure 7-1: Screenshot from the mapping (in Microsoft Excel) ............................................ 104
Figure 7-2: Number of identified connections with a Lean Enabler per program management
pitfall ............................................................................................................................. 105
Figure 7-3: Number of identified pitfall-enabler connections per framework theme ............. 106
Figure 7-4: Number of identified pitfall-enabler connections for the framework parts of the
theme "Program Execution" ......................................................................................... 106
Figure 7-5: Number of identified pitfall-enabler connections for the framework parts of the
theme "Enterprise Management" .................................................................................. 107
Figure 7-6: Number of identified pitfall-enabler connections for the framework parts of the
theme "Scoping, Planning and Contracting" ................................................................. 108
Figure 7-7: Number of identified pitfall-enabler connections for the framework parts of the
theme "Technology Integration" ................................................................................... 109
Figure 7-8: Number of identified pitfall-enabler connections for the framework parts of the
theme "Product Design" ............................................................................................... 110
Figure 7-9: Percentage of total number of identified connections that is applicable to each
framework theme .......................................................................................................... 110
Figure 7-10: Total number of identified connections per lean principle and percentage of
possible connections identified ...................................................................... 111
Figure 8-1: Research questions and corresponding chapters .............................................. 113
Figure 8-2: Contributions of this thesis ................................................................................. 115
Table 2-1: Different Categories of defense acquisitions (University 2009) .............................. 8
Table 2-2: Literature sources for framework evaluation ......................................................... 21
Table 2-3: Comparison of frameworks in terms of cross-cutting program management
activities .......................................................................................................................... 22
Table 2-4: Comparison of frameworks in terms of pre-program and program establishment
activities .......................................................................................................................... 25
Table 2-5: Comparison of frameworks in terms of technology development activities ........... 26
Table 2-6: Comparison of frameworks in terms of product design activities .......................... 26
Table 2-7: Comparison of frameworks in terms of production, deployment, and disposal ..... 27
Table 2-8: Comparison of frameworks in the phases of an engineering program lifecycle .... 28
Table 2-9: Overall comparison of the different frameworks .................................................... 28
Table 2-10: List of companies participating in the industry focus group ................................. 34
Table 2-11: List of organizations participating in the industry focus group ............................. 34
Table 3-1: Broadly used definitions of “project” and if available “project management” ......... 37
Table 3-2: Definitions of “program” and “program management” ........................................... 38
Table 3-3: Comparison of definitions of “program management” ........................................... 41
Table 4-1: Overall comparison of existing program management frameworks (see section
2.1.5) .............................................................................................................................. 45
Table 5-1: Compilation of program management pitfalls and corresponding sources ........... 58
Table 5-2: Mapping of pitfalls to the Program Execution parts of the LAI program
management framework ................................................................................................. 69
Table 5-3: Mapping of pitfalls to the Enterprise Management parts of the LAI program
management framework ................................................................................................. 69
Table 5-4: Mapping of program management pitfalls to the Scoping, Planning & Contracting
framework parts of the LAI program management framework ....................................... 70
Table 5-5: Mapping of pitfalls to the Technology Integration parts of the LAI program
management framework ................................................................................................. 70
Table 5-6: Mapping of pitfalls to the Product Design parts of the LAI program management
framework ....................................................................................................................... 70
Table 6-1: Lean Enablers for program management ............................................................. 80
VII Appendix A
Figure VII-1: Common pitfalls in program management and corresponding Lean Enablers

ID | Pitfall | Corresponding Lean Enablers
1 | Unstable funding (undermining program stability) | 2.5.; 3.13.
2 | Long waiting time for external stakeholders (e.g., Acquisition Panel) activities | 1.5.; 2.7.
3 | "Firefighting" | 2.3.; 2.4.; 2.5.; 2.6.; 2.7.; 3.1.; 3.2.; 3.3.; 3.4.; 3.5.; 3.6.; 3.7.; 3.8.; 3.10.; 3.11.; 3.12.; 3.13.; 5.1.; 5.3.; 5.4.; 5.5.; 5.6.; 6.4.
4 | Lack of leadership commitment to cycle time reduction | 1.2.; 1.4.; 2.1.; 2.7.; 3.1.; 3.12.; 4.1.; 4.2.; 5.1.; 5.2.; 5.4.; 6.2.
5 | No incentives for cycle time reduction | 1.1.; 1.4.; 1.5.; 2.6.; 3.4.; 5.6.; 6.1.; 6.2.; 6.3.; 6.4.
6 | Overriding influence of funding-related constraints | 1.4.; 2.2.; 2.5.; 5.4.
7 | Wrong allocation of responsibilities and decision-making rights | 1.2.; 1.5.; 2.1.; 2.5.; 3.1.; 3.3.; 3.4.; 3.6.; 3.7.; 3.9.; 4.1.; 5.4.
8 | Lack of coordination of communication | 1.2.; 1.5.; 2.1.; 2.2.; 2.5.; 3.1.; 3.4.; 3.7.; 3.8.; 3.9.; 3.10.; 5.3.; 5.4.
9 | Lack of process standardization | 1.3.; 2.2.; 2.4.; 2.5.; 3.2.; 3.11.; 5.5.
10 | Poor communication with stakeholders | 1.1.; 1.2.; 1.3.; 1.4.; 1.5.; 2.1.; 2.5.; 2.7.; 3.3.; 3.4.; 3.7.; 3.10.; 3.11.; 5.3.; 5.4.
11 | No realistic program schedule | 1.1.; 2.5.; 2.7.; 3.5.; 3.6.; 3.7.; 3.10.
12 | Lack of metrics | 2.2.; 2.5.; 2.7.; 3.7.; 3.10.; 3.13.; 4.2.; 5.3.; 5.5.
12.1 | No metrics to reflect cross-functional processes | 2.2.; 2.5.; 2.7.; 3.7.; 3.9.; 3.10.; 3.13.; 4.2.; 5.3.; 5.5.
12.2 | No process implemented to measure project performance or project progress (e.g., EVM) | 1.1.; 1.2.; 1.3.; 1.4.; 1.5.; 2.1.; 2.2.; 2.5.; 2.7.; 3.3.; 3.4.; 3.7.; 3.10.; 3.13.; 4.2.
13 | Wrong metrics: not using the right measure (ignoring something important) or choosing metrics that are wrong | 2.2.; 2.5.; 2.7.; 3.7.; 3.10.; 3.13.; 4.2.; 5.3.; 5.5.
13.1 | Implementing metrics that focus on short-term results, or that do not give thought to consequences on human behavior and enterprise performance | 2.2.; 2.5.; 2.7.; 3.7.; 3.10.; 3.13.; 4.2.; 5.3.; 5.5.
13.2 | Metrics that are not actionable or are hard for a team/group to impact, or collecting too much data | 2.2.; 2.5.; 2.7.; 3.7.; 3.10.; 3.13.; 4.2.; 5.3.; 5.4.; 5.5.; 6.4.
14 | No activity-based costing and management | 1.1.; 1.2.; 1.4.; 1.5.; 2.1.; 2.2.; 2.5.; 2.7.; 3.4.; 3.7.; 3.10.; 3.13.
15 | No defined risk management process | 1.3.; 1.5.; 2.4.; 2.7.; 3.4.; 3.5.; 3.10.; 3.11.; 5.4.
16 | Lacking ability to understand uncertainty/risk | 1.1.; 1.2.; 1.3.; 1.4.; 1.5.; 2.1.; 2.2.; 2.3.; 2.4.; 2.5.; 2.6.; 2.7.; 3.1.; 3.3.; 3.11.; 5.5.; 6.2.; 6.3.
17 | Lacking the staff/capacity to deal with uncertainty/risk | 2.5.; 3.3.; 3.13.; 5.6.; 6.1.; 6.2.; 6.3.; 6.4.
18 | Not all staff are involved in risk management | 1.3.; 2.4.; 2.5.; 3.3.; 3.11.; 5.4.; 6.2.; 6.3.; 6.4.
19 | Insufficient management of sub-projects | 2.1.; 2.2.; 2.7.; 5.4.
20 | Competing resource requirements (e.g., allocation and choice of resources) | 1.5.; 2.1.; 2.2.; 2.5.; 3.4.; 3.9.; 3.13.; 5.4.
21 | Too much overtime | 2.6.; 2.7.; 3.1.; 3.2.; 3.3.; 3.10.; 3.13.; 6.1.; 6.2.; 6.4.
22 | Overly unstable project priorities | 1.1.; 1.2.; 1.3.; 1.4.; 1.5.; 2.1.; 2.2.; 2.5.; 2.7.; 3.3.; 3.4.; 3.5.
23 | Too much outsourcing | 1.4.; 1.5.; 2.6.; 3.2.; 3.4.
24 | Unrealistic plan or no plan for ramp-up and ramp-down regarding staffing | 2.1.; 2.2.; 2.3.; 2.5.; 2.7.; 3.1.; 3.8.; 3.13.
25 | Troubled projects are not canceled early | 2.5.; 2.7.; 3.3.; 3.10.
26 | No buffer scheduled between projects | 2.5.; 2.6.; 2.7.; 3.6.; 5.1.
27 | Inadequate identification of individual skill development | 5.1.; 5.2.; 5.3.; 5.5.; 5.6.; 6.1.; 6.2.; 6.3.; 6.4.
28 | Unsupportive environment for experiential and general learning | 5.1.; 5.2.; 5.3.; 5.5.; 5.6.; 6.1.; 6.2.; 6.3.; 6.4.
29 | No or insufficient assessment of intellectual capital base regarding program needs | 2.7.; 5.1.; 5.3.; 5.5.; 6.3.; 6.4.
30 | No specialist career path | 5.1.; 5.2.; 5.3.; 5.5.; 6.1.; 6.2.; 6.3.; 6.4.
31 | Inadequate team experience | 1.5.; 3.4.; 5.1.; 5.2.; 5.3.; 5.5.; 5.6.; 6.1.; 6.2.; 6.3.; 6.4.
32 | Insufficient program manager qualifications | 1.2.; 2.1.; 2.5.; 3.1.; 3.9.; 4.1.; 5.1.; 5.2.; 5.3.; 5.4.; 5.5.; 5.6.; 6.1.; 6.2.; 6.3.; 6.4.
33 | Lack of enterprise-wide coordination of optimization: only local processes and organizational optimization | 1.5.; 2.2.; 2.5.; 2.7.; 3.4.; 3.7.; 3.12.; 5.1.; 5.2.; 5.3.; 5.5.; 5.6.; 6.1.; 6.2.; 6.3.
34 | Lack of performance incentives for staff | 2.7.; 3.7.; 3.10.; 5.2.; 5.6.; 6.1.; 6.2.; 6.3.
35 | Wrong performance incentives for staff |
36 | No utilization of a social network | 3.10.; 5.3.; 5.5.
37 | Too little customer and stakeholder interaction | 1.1.; 1.2.; 1.3.; 1.4.; 1.5.; 2.7.; 3.3.; 3.4.; 3.7.; 3.10.; 5.4.
38 | Too little integration of suppliers | 1.5.; 2.7.; 3.4.; 3.7.; 3.10.; 5.4.
39 | No fostering and maintaining of personal accountability for plans and outcomes | 1.4.; 2.7.; 3.7.; 3.10.
40 | Understaffing | 2.5.; 2.7.; 3.10.; 3.13.
41 | Insufficient resource planning (no identification of possible understaffing) | 2.5.; 2.7.; 3.10.; 3.13.
42 | Insufficient use of benchmarking and assessment tools for evaluation of enterprise structure | 2.7.; 3.7.; 3.10.
43 | No enterprise-wide integrated continuous improvement process | 5.1.; 5.2.; 5.3.; 5.5.
44 | Insufficient use of benchmarking and assessment tools to identify improvement potentials | 2.7.; 3.7.; 3.10.
45 | No enterprise-wide organizational learning and change management plan | 5.1.; 5.2.; 5.3.; 5.4.; 5.5.; 5.6.; 6.1.; 6.2.; 6.3.; 6.4.
46 | No open information sharing | 2.7.; 3.7.; 3.10.; 5.3.
47 | No documentation of lessons learned | 5.1.; 5.2.; 5.3.; 5.5.; 6.3.
48 | Insufficient or non-standardized usage of information technology | 2.7.; 3.7.; 3.9.; 3.10.; 5.3.
49 | Insufficient information flow | 3.1.; 3.5.; 3.7.; 3.8.
50 | Unclear requirement definition | 1.1.; 1.2.; 1.3.; 1.4.; 1.5.; 3.3.
51 | No understanding of stakeholder needs | 1.1.; 1.2.; 1.3.; 1.4.; 1.5.; 2.5.; 2.6.; 2.7.; 3.3.; 3.4.
52 | No learning from previous need definitions | 3.3.; 5.1.; 5.2.; 5.3.; 5.5.; 6.3.
53 | Insufficient multi-attribute trade-offs/tradespace exploration | 1.3.; 1.4.; 1.5.; 2.3.; 2.4.; 3.11.
54 | Lack of requirement understanding | 1.1.; 1.2.; 1.3.; 1.4.; 1.5.; 2.7.; 3.3.; 3.4.; 3.7.; 3.10.
55 | Lack of life cycle documentation | 1.2.; 1.5.; 2.1.; 2.2.; 2.5.; 2.7.; 3.3.; 3.7.; 3.9.; 3.10.
56 | Insufficient probabilistic cost estimates | 1.1.; 1.2.; 1.3.; 1.4.; 1.5.; 2.4.
57 | Too little updating of estimated costs during early phases | 1.1.; 1.2.; 1.3.; 1.4.; 1.5.; 2.4.
58 | Cost estimates do not reflect all aspects of the life cycle | 1.1.; 1.2.; 1.3.; 1.4.; 1.5.; 2.5.
59 | Imprecise or unclear contract terms | 1.5.; 2.5.; 2.7.; 3.3.; 3.4.; 3.7.; 3.10.; 3.12.; 5.3.
60 | Ill-designed contract scope | 1.5.; 2.5.; 2.7.; 3.3.; 3.4.; 3.7.; 3.10.; 3.12.; 5.3.
61 | Unclear award criteria and process | 1.5.; 2.5.; 2.7.; 3.3.; 3.4.; 3.7.; 3.10.; 3.12.; 5.3.
62 | Lack of incentives | 1.1.; 1.2.; 1.3.; 1.5.; 2.4.; 2.5.; 2.7.; 3.3.; 3.4.; 3.7.; 3.8.; 3.10.; 3.12.; 5.6.
63 | Lack of incentive transparency | 1.1.; 1.2.; 1.3.; 1.5.; 2.4.; 2.5.; 2.7.; 3.3.; 3.4.; 3.7.; 3.8.; 3.10.; 3.12.; 5.6.
64 | Mismatch of incentive with desired outcome | 1.1.; 1.2.; 1.3.; 1.5.; 2.4.; 2.5.; 2.7.; 3.3.; 3.4.; 3.7.; 3.8.; 3.10.; 3.12.; 5.6.
65 | No standard structure for (sub-)contracts | 1.5.; 2.5.; 2.7.; 3.3.; 3.4.; 3.7.; 3.10.; 3.12.; 5.3.
66 | Mismatch of contract type with risk profile of program | 1.5.; 2.5.; 2.7.; 3.3.; 3.4.; 3.7.; 3.10.; 3.12.; 5.3.
67 | Missing parts in the contract (e.g., include adaptability) | 1.5.; 2.5.; 2.7.; 3.3.; 3.4.; 3.7.; 3.10.; 3.12.; 5.3.
68 | Too much granting of waivers | 2.6.; 3.1.; 3.2.; 3.5.; 3.6.; 3.7.; 4.1.; 5.1.; 5.2.; 5.4.; 5.5.; 6.2.
69 | Insufficient probabilistic cost estimates in contracting | 1.1.; 1.2.; 1.3.; 1.4.; 1.5.; 2.4.
70 | No process implemented to assess technology maturation | 2.3.; 2.5.; 2.6.; 3.6.; 3.11.
71 | Use of immature technology | 1.3.; 2.3.; 2.4.; 2.6.; 3.6.; 3.11.
72 | No established technology insertion process | 1.3.; 2.3.; 2.4.; 2.6.; 3.1.; 3.2.; 3.5.; 3.6.; 3.11.; 5.5.
73 | No person/team in charge to manage and monitor technology transition | 1.3.; 2.3.; 2.4.; 2.6.; 3.1.; 3.2.; 3.5.; 3.6.; 3.11.; 5.5.
74 | No formal reviews and communication plans for technology transition | 1.3.; 2.3.; 2.4.; 2.6.; 3.1.; 3.2.; 3.5.; 3.6.; 3.11.; 5.5.
75 | No diverse learning strategies | 5.1.; 5.2.; 5.3.; 5.5.; 5.6.; 6.1.; 6.2.; 6.3.; 6.4.
76 | Misalignment between team goals and program goals | 2.7.; 3.13.; 5.1.; 5.2.; 5.3.; 5.5.; 5.6.; 6.1.; 6.2.; 6.3.; 6.4.
77 | Lack of skill and functional diversity within team |
1.5.; 3.4.; 3.9.; 5.1.; 5.2.; 5.6; 6.1.;
6.2.; 6.3.; 6.4.
78 No established flow to and from IPT
level
3.1.; 3.2.; 3.3.; 3.4.; 3.5; 3.6.; 3.7.;
3.8.; 3.9.; 3.10.; 3.11.; 3.12.; 3.13.
79
No balance between teams and
functions (only applies to programs with
matrix organizations)
1.5.; 3.4.; 3.9.; 5.1.; 5.2.; 5.6; 6.1.;
6.2.; 6.3.; 6.4.
80
System architecture does not support
product development process or IPTs
(complex organizations often give rise to
overcomplicated system designs)
1.3.; 2.3.; 2.4.; 2.5.; 2.6.; 3.2.; 3.5;
3.11.; 3.12.
81 Ignoring the aspect of standardization 1.3.; 2.3.; 2.4.; 2.6.; 3.2.; 3.3.; 3.5;
VII Appendix A xxx
3.11.
82 Ignoring the aspect of reusability 1.3.; 2.3.; 2.4.; 2.6.; 3.2.; 3.3.; 3.5;
3.11.
83 Ignoring the aspect of modularity 1.3.; 2.3.; 2.4.; 2.6.; 3.2.; 3.3.; 3.5;
3.11.
84 Ignoring the aspect of supportability 1.3.; 2.3.; 2.4.; 2.6.; 3.2.; 3.3.; 3.5;
3.11.
85 Ignoring the aspect of maintainability 1.3.; 2.3.; 2.4.; 2.6.; 3.2.; 3.3.; 3.5;
3.11.
86 Ignoring the aspect of usability 1.3.; 2.3.; 2.4.; 2.6.; 3.2.; 3.3.; 3.5;
3.11.
87 Lack of understanding of what waste is 2.1; 2.2.; 2.3.; 2.6.; 2.7.; 4.2.; 5.1.;
5.2.; 5.3.; 5.5.
88
Lack of understanding of how to deal
with different types of waste or waste in
general
2.1; 2.2.; 2.3.; 2.6.; 2.7.; 4.2.; 5.1.;
5.2.; 5.3.; 5.5.
89 No understanding of current vs.
preferred Value Stream
2.1; 2.2.; 2.3.; 2.6.; 2.7.; 4.2.; 5.1.;
5.2.; 5.3.; 5.5.
90 No mechanism for Value Stream
improvements
2.7.; 3.7.; 3.8.; 3.10.; 3.12.; 5.1.;
5.2.; 5.3.; 5.4.; 5.5.; 5.6; 6.2.
91
Mismatch between program
characteristics and chosen development
process
1.3.; 2.3.; 2.5.; 2.6.; 2.7.; 3.2.; 3.3.;
3.4.; 3,5; 3.6.; 3.7.; 3.8.; 3.10.; 3.11.;
3.12.; 4.2.; 5.2.; 5.4.
92 Finishing engineering and 3D drawings
too late
2.3.; 2.4.; 2.6.; 2.7.; 3.1.; 3.2.; 3.3.;
3.4.; 3,5; 3.6.; 3.7.; 3.8.; 3.9.; 3.10.;
3.11.; 3.12.; 3.13.; 4.2.
93 Insufficient exploration of alternative
solutions
1.1.; 1.2.; 1.3.; 1,4; 1.5.; 2,1; 2.2.;
2.4.; 2.5.; 2.6.; 2.7.; 3.11.; 5.1.; 5.2.;
5.3.; 6.2.
94 Wrong test approaches/frameworks
2.5.; 2.6.; 2.7.; 3.1.; 3.3.; 3,5; 3.6.;
3.7.; 3.8.; 3.9.; 3.10.; 3.11.; 3.12.;
3.13.; 4.1.; 4.2.; 5.1.; 5.2.; 5.3.; 5.5.
95 No balance regarding amount of testing
(too much or too little)
2.5.; 2.6.; 2.7.; 3.1.; 3.3.; 3,5; 3.6.;
3.7.; 3.8.; 3.9.; 3.10.; 3.11.; 3.12.;
3.13.; 4.1.; 4.2.; 5.1.; 5.2.; 5.3.; 5.5.
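The pitfall-to-enabler cross-references above form a many-to-many mapping. As an illustrative sketch (not part of the thesis), the mapping can be held in a lookup structure and inverted to ask which pitfalls a given Lean Enabler mitigates; only a few rows of the table are reproduced, and all names are assumptions:

```python
# Sketch: a few rows of the pitfall-to-enabler cross-reference table
# as a lookup structure. Structure and names are illustrative.
PITFALL_ENABLERS = {
    36: {"3.10", "5.3", "5.5"},              # No utilization of a social network
    40: {"2.5", "2.7", "3.10", "3.13"},      # Understaffing
    47: {"5.1", "5.2", "5.3", "5.5", "6.3"}, # No documentation of lessons learned
}

def pitfalls_addressed_by(enabler: str) -> list[int]:
    """Invert the mapping: which pitfalls does a given Lean Enabler mitigate?"""
    return sorted(p for p, es in PITFALL_ENABLERS.items() if enabler in es)

print(pitfalls_addressed_by("3.10"))  # -> [36, 40]
```

Inverting the table this way is useful when prioritizing enablers: an enabler that addresses many high-impact pitfalls is a natural implementation candidate.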
VIII Appendix B
Data collection approach for pitfalls for program management:
The document begins with a brief introduction to the LAI program management framework to
enable the interviewee to categorize the pitfalls he/she wants to add or relocate.
General questions regarding the interviewee:
1. What type of organization do you work for?
   a. Government organization
   b. Company
   c. Non-profit organization
2. What is the yearly budget of your company or government organization?
   a. Less than $1 million
   b. $1 - $10 million
   c. $10 - $100 million
   d. $100 million - $1 billion
   e. $1 billion - $10 billion
   f. More than $10 billion
3. How many years of total experience do you have?
   a. Text field
4. How many years of experience in your present company/government organization do you have?
   a. Text field
5. In how many programs did you have a significant involvement during your career?
   a. Text field
6. What role did you have in these programs?
   a. Program manager
   b. Chief engineer
   c. Member of the PMO
   d. Contracting Officer
   e. Finance Officer
   f. Engineer
   g. Other (please specify)
7. What kind of academic background do you have?
   a. BS
   b. BA
   c. MS
   d. MA
   e. PhD
   f. Other (please specify)
8. Which of the following DAU certifications do you have? (0: no certificate; only applies if DoD background)
   a. Program Management (1-3)
      i. Text field
   b. Systems Planning, Research, Development and Engineering (1-3)
      i. Text field
   c. Test & Evaluation (1-3)
      i. Text field
   d. Other (please specify)
9. Please list the type of certifications you have corresponding with your current position (only applies if not DoD background):
   a. Please list
Questions asked for every pitfall:
1. Please rate the impact of the pitfall based on your experience:
   a. 1 – low (local effect, e.g. overtime of some employees)
   b. 3 – medium
   c. 5 – high (program-wide impact: major deviation from plan, e.g. in cost, quality or schedule)
2. Please indicate the area where the pitfall had a direct impact:
   a. Cost
   b. Quality/technical performance
   c. Schedule
3. How frequently does the pitfall occur in your experience (1: in one out of 10 programs; 10: in 10 out of 10 programs)?
   a. Text field (0-10)
4. In which size of programs have you experienced this pitfall?
   a. Small (e.g. ACAT III and below; less than $660M total procurement)
   b. Medium (e.g. ACAT II; between $660M and $2.190B total procurement)
   c. Large (e.g. ACAT I; more than $2.190B total procurement)
   d. Universally applicable
5. Please list any leading indicators/metrics/best practices that might be used to discover you are in this pitfall. (General example: How does one know they have unclear requirements?)
   a. Text box
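The survey itself does not prescribe how the impact and frequency ratings are to be aggregated. One plausible way to rank pitfalls, sketched below purely as an assumption rather than as the thesis methodology, is a risk-style score of impact (question 1) times frequency (question 3):

```python
def pitfall_priority(impact: int, frequency: int) -> int:
    """Illustrative risk-style score: impact rating (1, 3, or 5, per
    question 1) times occurrence frequency (0-10, the number of programs
    out of ten in which the pitfall occurred, per question 3)."""
    if impact not in (1, 3, 5):
        raise ValueError("impact must be 1, 3, or 5")
    if not 0 <= frequency <= 10:
        raise ValueError("frequency must be between 0 and 10")
    return impact * frequency

# A severe but rare pitfall can score the same as a mild but chronic one:
print(pitfall_priority(5, 2))   # -> 10
print(pitfall_priority(1, 10))  # -> 10
```

Such a product deliberately treats severity and prevalence symmetrically; a program office worried about catastrophic outliers might instead weight impact more heavily.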
The data collection for the assessment of the existing Lean Enablers and the capture of
additional ones is based on the list of Lean Enablers shown in Table 6-1: Lean Enablers for
program management. For each Lean Enabler, the following categories are to be filled out:
1. Title and description of Lean Enabler
2. Indicator to measure performance of Lean Enabler
3. Experience-based relevance of Lean Enabler (low / medium / high)
4. Application example (anonymized)
5. Literature reference
6. Reference to relevant PM Pitfall (number only)
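The six categories above amount to a fixed record structure per Lean Enabler. A hypothetical sketch of capturing one such assessment row follows; the class and field names are illustrative paraphrases of the categories, not identifiers from the thesis, and the example values are placeholders:

```python
from dataclasses import dataclass

@dataclass
class LeanEnablerEntry:
    """One row of the Lean Enabler assessment form; field names are
    illustrative paraphrases of the six categories listed above."""
    title: str
    description: str
    performance_indicator: str   # category 2
    relevance: str               # category 3: "low" | "medium" | "high"
    application_example: str     # category 4 (anonymized)
    literature_reference: str    # category 5
    pitfall_refs: list[int]      # category 6: PM Pitfall numbers only

# Placeholder example entry; pitfalls 37 and 51 are stakeholder-related
# pitfalls from the table in Appendix A.
entry = LeanEnablerEntry(
    title="Engage stakeholders early",
    description="(placeholder)",
    performance_indicator="stakeholder reviews held per program phase",
    relevance="high",
    application_example="(anonymized)",
    literature_reference="(placeholder)",
    pitfall_refs=[37, 51],
)
```

Structuring the form as a record with a controlled vocabulary for `relevance` makes the collected assessments straightforward to tabulate and cross-reference against the pitfall list.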