TRANSCRIPT
© 2010 Carnegie Mellon University
Benefits of CMMI Within the Defense Industry
Software Engineering Institute
Carnegie Mellon University
Pittsburgh, PA 15213
May 2010
Outline

• Introduction
• Benefits of CMMI Implementation
  – Quantitative
  – Qualitative
• Looking Ahead
• Summary
This report was created with the cooperation of the Systems Engineering Division (SED) of the National Defense Industrial Association (NDIA) and their member companies and DoD organizations.
Purpose of Presentation
Present new evidence about effective implementations of CMMI
• Examples are provided by the defense industrial base and DoD organizations.
• New examples are based upon the measures that practicing organizations use to track value to their businesses.
• Examples are provided by organizations that have tracked and measured performance improvements from using CMMI over many years.
• Many of the organizations emphasize high maturity results, showing that high maturity enabled superior performance.
• Their data indicate why CMMI is important to the DoD & its suppliers.
The new data presented in this report demonstrates that effective implementation of good practices aided by use of CMMI can improve cost, schedule, and quality performance.
CMMI: Major Benefits to DoD
“Does CMMI work?” We asked our nation’s defense contractors, as well as government agencies, to share results from their performance improvement efforts using CMMI. The results spoke for themselves: “Yes, CMMI works!”
The following slides include information from six defense organizations that responded.*
*Results reported in this presentation are not attributed to protect confidentiality.
Background on the Data for this Presentation

Organizational and project leaders decided which measures were most useful to them when tracking the results of CMMI-based improvements.

A common thread was their interest in measuring the effect CMMI had on schedule, effort and cost, and quality.

The summarized results demonstrate the wide scope of business values and goals of the participating organizations.

The source studies in this presentation used current data as follows:
• 2010: Organizations 1, 2A, 3, & 6
• 2009: Organizations 5 & 7
• 2008: Organization 2B
Quantitative Measures: Schedule Performance Results Summary

• On-time deliverables (Organization 2a): on-time delivery increased 4.9 percentage points, from 95% to 99.9% of projects delivered on time
• Earlier defect detection and repair (Organization 1): 6.35 times fewer defect discovery and repair hours after the start of system testing; a potential savings of 5 to 6.5 months of schedule delay after system tests begin, for an average-sized project
• Schedule performance index (Organization 7): increased from .78 to .93 over three years (a 19.2% improvement in schedule estimation and execution)
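As a sanity check on Organization 7's figure, the 19.2% improvement falls out of a one-line relative-change calculation. This is a sketch; the function name is ours, not something from the report:

```python
def index_improvement(before: float, after: float) -> float:
    """Relative improvement of an earned-value index (SPI or CPI), in percent."""
    return (after - before) / before * 100

# Organization 7's schedule performance index, as reported above
print(round(index_improvement(0.78, 0.93), 1))  # 19.2
```

The same helper reproduces Organization 4's CPI move from .88 to .96 reported later in this deck (about a 9.1% gain).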
Quantitative Measures: Effort (Rework) and Cost Performance Results Summary

• Total hours for defect repair (Organization 1): 58% fewer hours needed to repair defects at ML5 versus ML3; a potential cost savings of $1.9M to $2.3M per average-sized project (defined as 233 KESLOC, kilo equivalent source lines of code)
• Hours per KLOC to find and fix defects, CMMI ML5 relative to the SW-CMM ML3 baseline (Organization 6): defect find-and-fix cost down 22%
• Effort hours needed to repair high-severity defects in the integration and test phases (Organization 4): 24% reduction in effort hours per defect
• Cost performance index (Organization 4): increased from .88 to .96 over two years
• Overhead rates, CMMI ML5 relative to the SW-CMM ML3 baseline (Organization 6): reduced by 7.3%
• Software development cost, CMMI ML5 relative to the SW-CMM ML3 baseline (Organization 6): reduced by 28%
Selected Results: High Maturity Reduces Costs for Repair (Organization 1)

High maturity projects discover defects earlier, and early detection and repair lowers costs:
• 57.7% fewer hours overall expended to repair defects on ML5 projects versus ML3
• 105.3 fewer hours per defect, including 88.6 fewer hours during testing alone, when the largest risk to schedule occurs

Average hours per defect, by phase:

Phase           | ML3   | ML5
Req & Design    | 7.9   | 9.3
Code & UT       | 47.3  | 35.7
Sys & Acpt Test | 105.1 | 16.5
Post Delivery   | 22.1  | 15.7
Total Hours     | 182.5 | 77.2
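The 57.7% and 105.3-hour figures on this slide follow directly from the per-defect totals; a quick check (variable names are ours):

```python
# Average hours per defect reported by Organization 1
ml3_hours_per_defect = 182.5
ml5_hours_per_defect = 77.2

saved = ml3_hours_per_defect - ml5_hours_per_defect
pct_fewer = saved / ml3_hours_per_defect * 100

print(round(saved, 1))      # 105.3 fewer hours per defect
print(round(pct_fewer, 1))  # 57.7 (% fewer hours overall for ML5)
```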
Selected Results: Effort to Repair Defects by Phase (Organization 1)

Hours to repair defects, by phase (233 KESLOC average project):

Phase           | ML3    | ML5
Req & Design    | 1,846  | 2,177
Code & UT       | 11,022 | 8,309
Sys & Acpt Test | 24,496 | 3,855
Post Delivery   | 5,155  | 3,651
Total Hours     | 42,519 | 17,992

• 57.7% fewer hours (24,527) expended for ML5
• 6.35 times (20,641 hours) less risk of cost or schedule impact late in the program
• Potential cost savings of $1.9M to $2.3M per average-sized program
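The dollar range quoted for this slide implies a loaded labor rate of roughly $77 to $94 per hour; that rate is our back-calculation, not a figure given in the presentation:

```python
hours_saved = 24_527  # ML3 vs. ML5 repair hours for a 233 KESLOC program

# The slide quotes $1.9M to $2.3M in savings; dividing by the hours
# saved recovers the implied (assumed, not stated) labor rate
for dollars in (1.9e6, 2.3e6):
    implied_rate = dollars / hours_saved
    print(f"${dollars / 1e6:.1f}M saved implies about ${implied_rate:.0f}/hour")
```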
Quantitative Measures: Quality Performance Results Summary

• Defect density by severity, ML5 compared to ML3 (Organization 1): 62.5% fewer high-severity defects on ML5 projects
• Defect density in circuit board design (Organization 2a): 65% improvement
• Defect containment by phase (Organization 3): fixes of defects within the phase in which they were injected increased by 240%
• Defect containment by phase per KLOC (thousands of lines of code), ML5 compared to ML3 (Organization 2b): defect containment improved 13%
• User acceptance test defects per KLOC (Organization 7): fewer than 0.15 defects per KLOC
• Percentage of defects removed prior to system test (Organization 7): greater than 85%
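Defect density per KLOC, as used in Organization 7's last two measures, is a simple normalization; the project numbers below are hypothetical, chosen only to illustrate the calculation:

```python
def defects_per_kloc(defects: int, sloc: int) -> float:
    """Defect density normalized to thousands of source lines of code."""
    return defects / (sloc / 1000)

# Hypothetical project: 24 user-acceptance-test defects in 180,000 SLOC
density = defects_per_kloc(24, 180_000)
print(round(density, 3))  # 0.133
print(density < 0.15)     # True: within the "< 0.15 per KLOC" result above
```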
Selected Results: Quality Performance (Organization 3)
Quantitative Measures: Productivity Results Summary

• Productivity gain with ML5 (Organization 1): 42% gain with ML5 organizational practices over 9 years
• Organizational productivity vs. the Galorath SEER-SEM estimation model (Organization 1): production hours reduced by 33.0% at ML3 and 37.4% at ML5
• Productivity, CMMI ML5 relative to the SW-CMM ML3 baseline (Organization 6): productivity up 25.2%
Selected Results: Software Productivity (Organization 1)

[Chart: normalized productivity, on a 0.0 to 1.6 scale, for the 9-year ML5 organization versus other projects]

• 42% gain with ML5 organizational practices
• Average project size was 233 KESLOC (largest = 1,360 KESLOC; smallest = 29 KESLOC)
• Average customer project savings due to increased productivity: the equivalent of 406 work months per project (33.8 work years)
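The two derived numbers on this slide are easy to reproduce; the normalized productivity values (1.0 vs. 1.42) are our reading of the chart's scale, not exact data from the report:

```python
def gain_pct(baseline: float, improved: float) -> float:
    """Percentage productivity gain over a baseline."""
    return (improved - baseline) / baseline * 100

# Assumed normalized values: other projects ~1.0, 9-year ML5 org ~1.42
print(round(gain_pct(1.0, 1.42)))  # 42 (% gain quoted on the slide)
print(round(406 / 12, 1))          # 33.8 work-years from 406 work-months
```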
Quantitative Measures: Customer Satisfaction Results Summary

• Award fee (used as an indicator of customer satisfaction), CMMI ML5 relative to the SW-CMM ML2 baseline (Organization 6): 50% of the potential additional award fee achieved
• Cost savings to the customer in a cost-plus contract (Organization 1): rose from $5.7M to $7.1M (25%)
Selected Results: Award Fee (Organization 6)

[Chart: percent of the potential additional award fee achieved across the SW-CMM L2, L3, L4, and L5 baselines and two CMMI L5 periods]

Customer satisfaction continues to improve: 50% of the potential additional award fee was achieved.
Quantitative Result: Return on Investment (Organization 2a)

Organization 2a reported their quantified ROI from CMMI Maturity Level 5 activity to be 24 : 1.

These results are a consequence of meaningful process improvement aligned with the business and engineering objectives.

Using the data in Performance Results of CMMI®-Based Process Improvement (CMU/SEI-2006-TR-004), they were able to compare their ROI performance to others in industry:
• Median ROI: 4 : 1
• Lowest ROI: 1.7 : 1
• Organization 2a: 24 : 1
• Highest ROI: 27.7 : 1
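ROI here is a benefit-to-cost ratio; the dollar figures below are hypothetical, chosen only to reproduce a 24 : 1 result like Organization 2a's:

```python
def roi_ratio(total_benefit: float, total_cost: float) -> float:
    """Return on investment expressed as the N in 'N : 1'."""
    return total_benefit / total_cost

# Hypothetical: $2.4M of measured benefit against a $100K improvement spend
print(f"{roi_ratio(2_400_000, 100_000):.0f} : 1")  # 24 : 1
```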
CMMI Provides Many Qualitative Benefits as Well*

Organizations also gathered various qualitative measures to complement their quantitative measurements. They found qualitative benefits such as:
• Improved program insight, control, and tracking
• Reduced training: process documentation enables knowledge transfer to a new generation of workers
• Process transformation (via consistency, integration, coordination)
• Personnel retention and job satisfaction
• Reduced overtime and less intense pressure
• Clear roles and responsibilities for business execution
• A common language (i.e., defined processes and measures) across business units
• A decrease in replanning
• Products with lower levels of defects and lower risk; one organization offers a lifetime warranty on its products

*Based on published benefits from a wide variety of organizations.
The Bottom Line

Why improve processes? Because processes are the foundation for all other business improvements, and critical for
• lasting improvements
• successful technology insertion

If a performance management system is not in use, leadership is unaware of what is and is not working.

CMMI is a proven approach to performance management, with more than a decade of results showing that it works.

Organizations have provided data showing that CMMI
• enables the delivery of lower-defect products, with predictable cost, schedule, and quality
• improves business performance
• serves as a competitive discriminator
Results Depend on Implementation

Simply deciding to “do CMMI” is not enough to achieve benefits.

Defining good processes, using them, measuring the results, and making improvements based on what you learn are all key to reaping the benefits described in this presentation.

The CMMI models are a foundational part of a comprehensive approach to process improvement that helps organizations understand
• why they should improve
• what frameworks and tools would best fit their needs
• how to implement them
Recent Research on CMMI: Just the Tip of the Iceberg!
CMMI Research - References
Bibliographic information cited in this presentation:
• Gibson, Diane; Goldenson, Dennis R.; & Kost, Keith. Performance Results of CMMI-Based Process Improvement (CMU/SEI-2006-TR-004). Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University, August 2006.
• “Performance Results from Process Improvement.” SoftwareTech News, Vol. 10, No. 1, March 2007.
• Goldenson, Dennis R. & Gibson, Diane L. Demonstrating the Impact and Benefits of CMMI®: An Update and Preliminary Results (CMU/SEI-2003-SR-009). Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University, October 2003.
• “CMMI: Getting a Handle on Process.” CrossTalk, Vol. 23, No. 1, Jan/Feb 2010.
• Herbsleb, James D.; Carleton, Anita; Rozum, James A.; Siegel, Jane; & Zubrow, David. Benefits of CMM-Based Software Process Improvement: Initial Results (CMU/SEI-94-TR-013). Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University, August 1994. (Also see the SEI special report Benefits of CMM-Based Software Process Improvement: Executive Summary of Initial Results, CMU/SEI-94-SR-013.)
• Stoddard, Robert W., II & Goldenson, Dennis R. Approaches to Process Performance Modeling: A Summary from the SEI Series of Workshops on CMMI High Maturity Measurement and Analysis (CMU/SEI-2009-TR-021). Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University, January 2010.
• Jones, Capers. Assessment and Control of Software Risks. Upper Saddle River, NJ: Prentice-Hall, 1994 (ISBN 0-13-741406-4).
Website about CMMI at the Software Engineering Institute: <http://www.sei.cmu.edu/cmmi/index.cfm>
Looking Ahead

The road ahead for CMMI implementation:

• A continued focus on high maturity. More and more organizations are striving for and achieving high maturity, and are collecting data demonstrating the benefits. Once at ML4 or 5, organizations must maintain their focus on good implementation practices for continuous improvement.
• Implementation of CMMI for Services (CMMI-SVC). CMMI-SVC extends the benefits of CMMI to a new audience. Service providers can use the model concept that has proven useful in the development community to specifically address their interests and concerns.
• Implementation of CMMI for Acquisition (CMMI-ACQ). CMMI-ACQ helps organizations improve relationships with their suppliers and improve acquisition processes. The model can enable increased control of projects, better management of global sourcing of products and services, and more successful acquisition solutions.
• Integration with other improvement paradigms (e.g., TSP, ISO, Lean Six Sigma). Organizations are finding that integrated improvement initiatives can produce outstanding results. Choosing CMMI doesn’t mean discontinuing improvement efforts already in place or avoiding new ones that show promise.
Summary
Many stakeholders are involved in the development and maintenance of CMMI models, with participants from commercial industry, government, and the DoD. Broad adoption has occurred worldwide. Adopters range from small and midsize organizations (these are the majority) to large and very large organizations.
Organizations that provide products and services to the DoD use CMMI to improve programs, systems, product and service management, systems and software engineering, work processes, and training solutions.
Quantitative and qualitative results have been documented by defense contractors and others, as shown in this report. There is a great deal of additional data showing the benefits of CMMI from a broad range of industries, including banking and finance, manufacturing, medical, and others.
CMMI enables performance improvement focused on business objectives, but the level of success depends on the implementation.
Who Benefits from CMMI Today?
We all do!
Background Slides if Needed
Background: The Achievement of Excellence
CMMI leads the way to high performance through improved processes.
The management of the development and delivery of software systems must be guided by quantitatively managed processes.
Performance comes from processes that are predictable, repeatable, and continuously improving in terms of product quality, cost and schedule performance, process performance, and customer satisfaction.
[Chart: TSP fidelity versus project performance and organizational coverage, scored across schedule, cost and effort, quality, functional completion, and customer satisfaction]
Results Overview – Quantitative Measures
We received data/information showing performance improvements in the following categories:
• Schedule
• Effort/cost
• Quality
• Customer satisfaction
• Business growth
Background: Leadership, Stewardship, and Evolution of Maturity Models
Many stakeholders have been involved in the development and evolution of the maturity models published by the SEI, with hundreds of people contributing their time and expertise over the years.
Industry participants: AT&T, Automatic Data Processing, BAE Systems, Boeing, Comarco Systems, Computer Sciences Corporation, DNV IT Global Services, EER Systems, Ericsson, General Dynamics, General Motors, Harris Corporation, Hewlett-Packard Company, Honeywell Corporation, IBM, Integrated System Diagnostics, Inc., JP Morgan Chase, KPMG Consulting, Motorola, National Defense Industrial Association, Norimatsu Process Engineering Lab, Northrop Grumman Corporation, Pacific Bell, Q-Labs, Inc., Raytheon, Rockwell Collins, SAIC, Siemens, Software Productivity Consortium, Spirula, SRA International, Systems and Software Consortium, Tata Consultancy Services, TeraQuest, Inc., THALES, TRW, University of Pittsburgh Medical Center

Government participants: Defense Logistics Agency, Department of Homeland Security, Federal Aviation Administration, Institute for Defense Analyses, National Reconnaissance Office, National Security Agency, Office of the Secretary of Defense, RD&E Command, US Air Force, US Army, US Navy
CMMI Adoption Knows No Borders
There are 33 countries with more than ten appraisals as of March 2010:
USA 1582, China 1229, India 524, Japan 306, Spain 180, France 168, Korea (ROK) 165, Brazil 144, Taiwan 134, U.K. 113, Mexico 86, Argentina 77, Germany 76, Malaysia 71, Canada 59, Egypt 43, Italy 43, Thailand 38, Chile 37, Australia 36
Also: Colombia, Pakistan, Philippines, Singapore, Israel, Hong Kong, Vietnam, Turkey, Netherlands, Portugal, Sri Lanka, Ireland and Russia
An estimated 1.8 million people work in organizations that have had at least one SCAMPI A appraisal since April 2002.
CMMI Works for Organizations of All Sizes

Percentage of appraisals by organization size:
• 25 or fewer: 13.6%
• 26 to 50: 15.6%
• 51 to 75: 12.8%
• 76 to 100: 8.9%
• 101 to 200: 19.8%
• 201 to 300: 9.0%
• 301 to 500: 7.4%
• 501 to 1000: 6.8%
• 1001 to 2000: 3.7%
• 2000+: 2.5%

Aggregated: 1 to 100, 58.2%; 201 to 2000+, 23.8%.
Source for these statistical analyses: http://www.sei.cmu.edu/cmmi/casestudies/profiles/cmmi.cfm
CMMI Adoption Is Multi-Sector

Percentage of appraisals by sector:
• Business Services: 35.0%
• Engineering & Management Services: 24.2%
• Other Services: 10.7%
• Electronic & Other Electric Equipment: 10.4%
• Finance, Insurance and Real Estate: 5.5%
• Transportation, Communication, Electric, Gas and Sanitary Services: 3.6%
• Public Administration (Including Defense): 3.2%
• Transportation Equipment: 2.4%
• Health Services: 1.3%
• Other Manufacturing Industries: 1.2%
• Instruments and Related Products: 1.0%
• Industrial Machinery and Equipment: 0.7%
• Primary Metal Industries: 0.3%
• Retail Trade: 0.3%
• Fabricated Metal Products: 0.2%
• Wholesale Trade: 0.1%

Aggregated: Services 71.1%; Manufacturing 16.3%.
Source for these statistical analyses: http://www.sei.cmu.edu/cmmi/casestudies/profiles/cmmi.cfm
Why Care about Improving Software Engineering Performance?
• To improve software engineering cost, schedule, and quality performance
• To improve competitive economic and military advantage
CMMI: A Strong Partner for DoD and the Defense Industrial Base
Large or small, organizations that provide products and services to the DoD share common challenges, from meeting defense software specifications and requirements, to securing networks, to developing and retaining a talented workforce.
CMMI helps the defense industrial base create better systems management, improved software engineering, more efficient processes, and tailored training solutions.
CMMI’s worldwide growth even in tough economic times indicates the value of the framework.
What Happens When Effective Processes are Applied in an Organization?

With effective, CMMI-based processes in place, defects, cost, time, and risk go down, while quality, time to market, customer satisfaction, and performance go up.
The CMMI Mission & Vision at the SEI
Mission
• Improve the development and acquisition of software through research, and the transition to practice, of new, breakthrough, but proven engineering management methods. (By proven we mean having hard data and evidence.)

Vision
• Systems and software engineering management are guided by facts, models, and data that are shown to predictably improve performance and results well beyond the limits of current practice.
• The practice of managing engineering work is recognized to be not just the responsibility of management, but of professionals at all levels and in every related activity.
• Professionals who are developing or acquiring systems think and manage quantitatively.
Improving Performance Requires Knowledge and Expertise

The “What” – Quality Principles

The “How” – Appraisal Methods, Operational Practices, Improvement Techniques, Measurement and Analysis Tools
What processes characterize high maturity organizations?
• Quantitative Project Management
• Organizational Process Performance
• Organizational Innovation and Deployment
• Causal Analysis and Resolution
Quality and Process Performance Objectives

Quality and process-performance objectives are the engine that drives project performance, business performance, and high maturity. Representative measures include productivity, defect containment, CPI, SPI, requirements volatility, and defect density.
Organizational Process Performance (OPP)
The purpose of Organizational Process Performance (OPP) is to establish and maintain a quantitative understanding of the performance of selected processes within the organization’s set of standard processes in support of achieving quality and process-performance objectives, and to provide process-performance data, baselines, and models to quantitatively manage the organization’s projects.
Quantitative Project Management (QPM)
The purpose of Quantitative Project Management (QPM) is to quantitatively manage the project’s defined process to achieve the project’s established quality and process-performance objectives.
Causal Analysis and Resolution (CAR)
The purpose of Causal Analysis and Resolution (CAR) is to identify causes of selected outcomes and take action to improve process performance.
Organizational Innovation and Deployment (OID)

The purpose of Organizational Innovation and Deployment (OID) is to proactively seek, identify, select, and deploy incremental and innovative improvements that measurably improve the organization’s processes and technologies. The improvements support the organization’s quality and process-performance objectives as derived from the organization’s business objectives.
CMMI Transition Status
Reported to the SEI as of April 30, 2010

Training
Introduction to CMMI 115,371
Intermediate Concepts of CMMI 3,049
Understanding CMMI High Maturity Practices 595
Introduction to CMMI V1.2 Supplement for ACQ 1,172
Introduction to CMMI V1.2 Supplement for SVC 1,774
Introduction to CMMI for Services, V1.2 217
CMMI Level 2 for Practitioners 71
CMMI Level 3 for Practitioners 48
Certifications
Introduction to CMMI V1.2 Instructors 444
CMMI-ACQ V1.2 Supplement Instructors 63
CMMI-SVC V1.2 Supplement Instructors 124
CMMI Level 2 for Practitioners 18
CMMI Level 3 for Practitioners 18
SCAMPI V1.2 Lead Appraisers 497
SCAMPI V1.2 High Maturity Lead Appraisers 166
CMMI-ACQ V1.2 Lead Appraisers 72
CMMI-SVC V1.2 Lead Appraisers 134
SCAMPI v1.1/v1.2 Class A Appraisals Conducted by Quarter, Reported as of 4-30-10

[Chart: number of SCAMPI Class A appraisals conducted per quarter, Q1/04 through Q2/10; y-axis scale 0 to 500]
Countries where Appraisals have been Performed and Reported to the SEI
Argentina, Australia, Austria, Bahrain, Bangladesh, Belarus, Belgium, Brazil, Bulgaria, Canada, Chile, China, Colombia, Costa Rica, Czech Republic, Denmark, Dominican Republic, Egypt, Finland, France, Germany, Greece, Guatemala, Hong Kong, Hungary, India, Indonesia, Ireland, Israel, Italy, Japan, Korea (Republic Of), Latvia, Lithuania, Luxembourg, Malaysia, Mauritius, Mexico, Morocco, Nepal, Netherlands, New Zealand, Norway, Pakistan, Panama, Peru, Philippines, Poland, Portugal, Romania, Russia, Saudi Arabia, Singapore, Slovakia, South Africa, Spain, Sri Lanka, Sweden, Switzerland, Taiwan, Thailand, Tunisia, Turkey, Ukraine, United Arab Emirates, United Kingdom, United States, Uruguay, Viet Nam
Number of Appraisals and Maturity Levels Reported to the SEI by Country

Country: number of appraisals (reported maturity-level counts):

Argentina: 77 (50, 18, 2, 4)
Australia: 36 (1, 8, 7, 2, 4)
Austria: 10 or fewer
Bahrain: 10 or fewer
Bangladesh: 10 or fewer
Belarus: 10 or fewer
Belgium: 10 or fewer
Brazil: 144 (1, 71, 57, 1, 11)
Bulgaria: 10 or fewer
Canada: 59 (1, 16, 24, 5, 4)
Chile: 37 (22, 12, 2)
China: 1229 (1, 135, 987, 36, 48)
Colombia: 34 (12, 13, 3, 2)
Costa Rica: 10 or fewer
Czech Republic: 10 or fewer
Denmark: 10 or fewer
Dominican Republic: 10 or fewer
Egypt: 43 (1, 22, 12, 2, 3)
Finland: 10 or fewer
France: 168 (4, 98, 53, 1, 2)
Germany: 76 (9, 37, 15, 1, 1)
Greece: 10 or fewer
Guatemala: 10 or fewer
Hong Kong: 18 (2, 11, 5)
Hungary: 10 or fewer
India: 524 (17, 278, 24, 189)
Indonesia: 10 or fewer
Ireland: 11 (2, 4)
Israel: 19 (3, 11, 3)
Italy: 43 (19, 20)
Japan: 306 (18, 86, 140, 13, 17)
Korea, Republic Of: 165 (1, 55, 75, 15, 8)
Latvia: 10 or fewer
Lithuania: 10 or fewer
Malaysia: 71 (22, 43, 6)
Mauritius: 10 or fewer
Mexico: 86 (36, 39, 3, 6)
Morocco: 10 or fewer
Nepal: 10 or fewer
Netherlands: 14 (5, 7, 1)
New Zealand: 10 or fewer
Norway: 10 or fewer
Pakistan: 28 (1, 21, 4, 1)
Panama: 10 or fewer
Peru: 10 or fewer
Philippines: 23 (2, 11, 8)
Poland: 10 or fewer
Portugal: 14 (5, 7, 1)
Romania: 10 or fewer
Russia: 11 (3, 3, 4)
Saudi Arabia: 10 or fewer
Singapore: 21 (4, 11, 1, 4)
Slovakia: 10 or fewer
South Africa: 10 or fewer
Spain: 180 (1, 108, 54, 3, 4)
Sri Lanka: 14 (2, 12)
Sweden: 10 or fewer
Switzerland: 10 or fewer
Taiwan: 134 (1, 76, 51, 2, 2)
Thailand: 38 (12, 24, 1)
Tunisia: 10 or fewer
Turkey: 16 (14, 2)
Ukraine: 10 or fewer
United Arab Emirates: 10 or fewer
United Kingdom: 113 (3, 50, 35, 1, 4)
United States: 1582 (30, 564, 610, 23, 141)
Uruguay: 10 or fewer
Viet Nam: 17 (12, 2, 3)
Why do we need improved performance and better processes? Because there is STILL a management crisis in software! A recent Standish report confirms that the number of troubled projects rises each year.

The result?
• Losses in the millions for the government agencies and companies affected
• Leadership can be unaware of what is and is not working
• Without a robust performance management system, management is operating without the data needed to make quality decisions
Source: http://www.projectsmart.co.uk/docs/chaos-report.pdf
Measurement Challenges

There are challenges in measuring return on investment in a traditional sense when correlating the CMMI framework to predictable performance improvement in a given organization:
• Companies that wanted to adopt CMMI had little data on their conditions before starting, and generally collected no data recording their investments in applying the framework.
• They rarely had any data on their organization’s performance until they were well along the path to adherence.
• As a result, the published ROI data was generally notional and unsupported.
The Cost of Quality - Before and After CMMI Transition
Time History - Cost
Time History - Schedule
“Cost” to Implement CMM

[Chart: relative cost to implement CMM across Companies A, C, F, I, J, and M]
Recurring Costs for PI

[Chart: recurring process improvement costs across Companies B, C, D, H, I, M, and O]
CMM Cost Details
• Company A: $4.5M, 18 months, 350 people, Level 2; complete outsourcing of CMM support
• Company C: NTE 5% of budget, 1 year, 30 people, Level 3 (SW); extensive capture of cost to implement
• Company B: 2% of budget, 1 year, 560 people, Level 2, then 3; extensive outsourcing of CMM support
• Company O: 1.5-2.5%, 18 months, 180 people, Level 2 (SW)
• Company J: $900K, 2 years, 400 people, Level 3 (SW)
• Company D: 2% of annual budget, 150 people, no assessment; 5 years to best productivity; all costs not captured
• Company F: $1M, 150 people, Level 2, then 3
• Company M: staffing at 3-5%; up to 5% for metrics expense
• Company H: 2% of budget, 60% of company at Level 4
• Company I: implementation costs $2.5M, 2.5 years, Level 3 (SW/SE), not all costs included, 15% of workforce in initial pilot; sustainment costs of 15.25 work-years (government) plus 4 full- and part-time contractors; 3,600 employees

50% of costs devoted to training.
• This material is considered SEI-Proprietary and is distributed by the Software Engineering Institute (SEI) to SEI Staff ONLY.
• This material SHALL NOT be reproduced or used for any other purpose without requesting formal permission from the SEI at [email protected].
• THE MATERIAL IS PROVIDED ON AN “AS IS” BASIS, AND CARNEGIE MELLON DISCLAIMS ANY AND ALL WARRANTIES, IMPLIED OR OTHERWISE (INCLUDING, BUT NOT LIMITED TO, WARRANTY OF FITNESS FOR A PARTICULAR PURPOSE, RESULTS OBTAINED FROM USE OF THE MATERIAL, MERCHANTABILITY, AND/OR NON-INFRINGEMENT).