
Page 1

EARNED VALUE IN SOFTWARE
Using Software Metrics & Measures for Earned Value

DEVELOPED BY: NAVAL AIR SYSTEMS COMMAND, AIR-4.2 Cost Department, AIR-4.2.3 Earned Value Management Division & AIR-4.2.1.4 Advanced Concepts/Special Studies & Databases Branch, Patuxent River, MD, May 2004

PRESENTED BY:
Brenda Bizier, AIR-4.2.3 Integrated Project Management Division, IPM Process Owner, Earned Value Management
Amy Houle Caruso, E-6 Block I Modification Team Lead, Program Management/Software Management

NAVAIR Public Release 09-33
Distribution Statement A – "Approved for public release; distribution is unlimited"

Page 2

BRIEF DATE: 24-25 FEBRUARY 2009 | CONFIG. MGR: IPM ANALYSIS PROCESS GROUP, 4.2.3 | FILE NAME: SW AMD EARMED VALUE 01-09 REV1.PPT

Reality

For more and more DOD systems, software development will consume the majority of resources, schedule and cost while generating the bulk of program risk.

Page 3

Outline

• Current Issues
• WBS
• Software Measures
• Rework
• Conclusion: Final Thoughts

Page 4

Current Challenges in S/W Earned Value

• Excessive use of Level of Effort (LOE)
• Crediting full earned value for tasks and requirements even though all tasks and requirements have not been completed
• Basing earned value on metrics and measures that do not directly relate to implementation of the software requirement
• Basing earned value on metrics and measures that are obsolete or inaccurate
• Utilizing EVM in isolation vice in conjunction with other software measurements and metrics to evaluate program status
• Failure to consider rework in developing the Performance Measurement Baseline (PMB)
• Failure to correlate earned value with Technical Performance Measurement (TPM)

Page 5

Needing to Know What is Important (Example)

Basing Earned Value on metrics and measures that do not directly relate to implementation of the software requirement

Requirement: Build 100 miles of highway in 10 months for $10M
Contractor Estimate: 10,000 loads of fill, concrete, and other material required to complete the requirement
Status: After 5 months, 30 miles have been completed and 5,000 truckloads have been used.

How is the project doing? (Or: how do we know we are "x%" complete, and what are the "artifacts" to prove it?)

Page 6

Needing to Know What is Important (Example)

Basing Earned Value on metrics and measures that do not directly relate to implementation of the software requirement

Requirement: Build 100 miles of highway in 10 months for $10M
Contractor Estimate: 10,000 loads of fill, concrete, and other material required to complete the requirement
Status: After 5 months, 30 miles have been completed and 5,000 truckloads have been used.

How is the project doing? If the measure for EV was defined as follows:

Customer Measure: # of miles completed
Contractor Measure: # of truckloads used

Earned value would be reported as:
Measure: # of miles completed (BCWP): 67% over cost and 40% behind schedule
Measure: # of truckloads used (BCWP): on cost, and project is on schedule
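The contrast above can be computed directly from the standard EV formulas. A minimal sketch (Python; the 50% actual-cost figure follows the slide's implication that 5,000 of the 10,000 budgeted truckloads, and hence roughly half the budget, have been consumed):

```python
def ev_indices(bac, pct_complete, pct_planned, pct_spent):
    """Return (CPI, SPI) for a project with budget-at-completion `bac`."""
    bcwp = pct_complete * bac   # earned value (budgeted cost of work performed)
    bcws = pct_planned * bac    # planned value (budgeted cost of work scheduled)
    acwp = pct_spent * bac      # actual cost of work performed
    return bcwp / acwp, bcwp / bcws

BAC = 10_000_000  # $10M highway project

# Customer measure: miles completed (30 of 100 after 5 of 10 months)
cpi_miles, spi_miles = ev_indices(BAC, 0.30, 0.50, 0.50)
print(f"miles:      CPI={cpi_miles:.2f} SPI={spi_miles:.2f}")   # both 0.60

# Contractor measure: truckloads used (5,000 of 10,000)
cpi_loads, spi_loads = ev_indices(BAC, 0.50, 0.50, 0.50)
print(f"truckloads: CPI={cpi_loads:.2f} SPI={spi_loads:.2f}")   # both 1.00
```

A CPI of 0.60 means each $1.67 spent earns $1 of value (the 67% cost overrun), and an SPI of 0.60 is 40% behind schedule; the truckload measure hides both problems entirely.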

Page 7

Things You Want to Know Sooner rather than Later

• Requirements are the primary cost driver of software development efforts.
  – Software requirements tend to increase by 1–5% per month between the end of requirements analysis and the start of system and integration testing, with the national average being about 2%.
  – Sometimes changes in requirements continue after testing begins.
  – According to Capers Jones, approximately 20% of all defects in software are caused by poorly defined and contradictory requirements.

• Size is often underestimated, so EV results rest on overoptimistic cost and schedule baselines.

• Just because a developer's earned value system is compliant with EVMS does not mean that the base and derived measures driving it will provide adequate information on program status.
  – It is essential that the customer contractually establish a measurement program ensuring that measures capable of identifying deviations from the program's cost, schedule, and technical objectives are delivered to the customer.
  – The contract should be implemented so that the Measurement IPT is able to modify the program measures as the information needs of the program change over its life cycle.
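The monthly requirements-growth rates cited above compound over the development span. A quick sketch (Python; the 1,000-requirement baseline and 18-month window are illustrative assumptions, not figures from the brief):

```python
def grown_requirements(initial, monthly_rate, months):
    """Requirements count after compounding monthly growth."""
    return initial * (1 + monthly_rate) ** months

start = 1000   # requirements at end of requirements analysis (assumed)
months = 18    # months until start of system/integration test (assumed)
for rate in (0.01, 0.02, 0.05):
    final = grown_requirements(start, rate, months)
    print(f"{rate:.0%}/month -> {final:,.0f} requirements (+{final/start - 1:.0%})")
```

Even at the 2% national average, an 18-month gap between requirements analysis and integration test adds roughly 43% more requirements, growth that the PMB must anticipate.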

Page 8

Easier Said Than Done (a.k.a. DUH!)

• An EV system based on a robust, flexible measurement program that adapts to current program needs will be much more effective than one that is not.

• When selecting measures and setting up an earned value system, it is also essential to determine the benefit to the program of tracking each specific risk or information need.

• All of these measures are directly or indirectly required of organizations that have achieved SW-CMM® or CMMI® Level III certification. NAVAIR requires that developers working on ACAT I, II, III, and IV software-intensive systems achieve Level III certification. Thus, asking the developer to change the measures driving their earned value system should have minimal or no impact on the cost of implementing their earned value and measurement program.

• A software development program should contractually establish quality criteria defining the maximum number of defects of each priority for the effort.

• A reasonable amount of change should be built into the project plan based on the developer's and acquisition organization's prior history.

• Tasks being reviewed must be broken down so that they can be completed in less than a month, and preferably less than a week.

Page 9

EV is a part of Risk Management

• When effective risk management, measurement, and earned value programs are combined, they can identify the occurrence of risks earlier, when effective corrective action that minimizes perturbations to the program plan is still possible.

• Requirements deferral is always the result of a risk occurring.

• The more effective a program is at managing and tracking its risks, the less functionality and the fewer requirements will be deferred.

Page 10

Types of Code and EV Issues

• New Code
  1. Size is often underestimated. This results in overoptimistic costs and schedules, with resulting low Cost Performance Index (CPI) and Schedule Performance Index (SPI).
  2. Generally the most expensive type of code to produce. Every phase of software development must be implemented.

• Reused Code
  1. The amount of functionality that can be gained through software reuse is often overestimated. This results in a reduction in the amount of reuse and an increase in new and/or modified code, resulting in higher cost and longer schedules. CPI and SPI degrade.
  2. Cost and schedule overruns are often experienced in development efforts based on software reuse integration. These overruns can often be attributed to developer unfamiliarity with the code, poor documentation, and low code quality.

• Modified Code
  – The amount of modified code that can be used in a system is often overestimated, and/or the amount of modification the code will require is underestimated. This results in increased cost and schedule for implementing and integrating the modified code or additional new code, and subsequent degradation in CPI and SPI.
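One common way to roll the code types above into a single size input for estimating and EV planning is equivalent SLOC. A hedged sketch (Python; the adaptation weights are illustrative placeholders — real programs derive them from models such as COCOMO or from their own calibrated history):

```python
def esloc(new, modified, reused, auto_generated=0,
          w_modified=0.5, w_reused=0.2, w_auto=0.1):
    """Equivalent SLOC: weight non-new code by the fraction of full
    new-development effort it is expected to consume (assumed weights)."""
    return new + w_modified * modified + w_reused * reused + w_auto * auto_generated

# Example: 40K new, 20K modified, 100K reused lines (illustrative)
size = esloc(40_000, 20_000, 100_000)
print(f"{size:,.0f} ESLOC")  # 70,000 -- far less "free" reuse than 160K raw SLOC suggests
```

The point of the weighting is exactly the slide's warning: reused and modified code is not free, and treating raw SLOC counts as equivalent inflates the reuse benefit in the baseline.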

Page 11

Types of Code and EV Issues (Continued)

• Deleted Code
  – Similar to the problems with modified code. Often the amount of code to be deleted, and the amount of testing required after the deletions, is underestimated as part of the modification effort. This leads to higher cost and schedule and lower CPI and SPI.

• Automatically Generated Code
  – Automatic generation does not produce free code. Requirements analysis, design, and testing are still necessary for automatically generated code.

• Converted/Ported Code
  – Automatically converted/translated/ported code may require extensive manual corrections to run in the new environment. If it was automatically generated without comments, it is likely to be much more difficult, costly, and time consuming to understand, modify, or correct.

• Commercial Off-the-Shelf (COTS)
  1. Selection of a COTS product must consider not only the technical merit of the product but also the commercial viability of the vendor. If the vendor's long-term prospects are questionable, a mitigation plan for replacing the product must be developed and additional funding to cover the risk built into the project.
  2. The project plan must include plans for upgrading any COTS tools. Failure to do so is likely to result in unplanned costs associated with product upgrades, integration, and testing.

Page 12

Software WBS Challenges

• Obscuring the software development effort by burying it deep within the system WBS, often as a subcomponent of much cheaper, lower-risk hardware efforts, can only aggravate these problems.

• The deeper the software is buried in the effort, the deeper the contractor must report, and the greater the burden placed on the contractor's financial tracking and reporting system.

• It is important to keep these issues in mind when developing the Program and Contract WBS.

Page 13

WBS Challenges (WBS Example)

[Figure: two side-by-side Contract WBS trees, levels 1(3) through 5(7)]

Contract WBS - SW treated as subsystem:
  Fire Control
    Radar
      Receiver
        Applications Software: Build 1, Build 2, Integration Testing
        Systems Software: Build 1, Build 2, Integration Testing
      Transmitter
      Antenna
      Radar Integration
    Platform Integration

Contract WBS - SW treated as equal system:
  Fire Control
    Radar
      Receiver
      Receiver Applications Software: Build 1, Build 2, Integration Testing
      Receiver Systems Software: Build 1, Build 2, Integration Testing
      Transmitter
      Antenna
      Radar Integration
    Platform Integration

Software could be placed on a parallel level with the hardware on which it executes or possibly as a sub component of a higher component of the entire system as shown in the figure above.

Page 14

COMPARISON OF SOFTWARE LIFE CYCLE PHASES vs. DEVELOPMENT FUNCTIONS

[Matrix: development functions (rows) crossed with software life cycle phases (columns)]

Functions: Software Project Management; Software Development; Software Integration & Test; Software Development Facilities; Software QA/CM

Phases, with representative activities:
• System Definition: Formulate initial plan; Define system concept; Allocate requirements to software; Plan for facilities; Identify long-lead items; Plan for QA & CM
• Software Requirements Analysis: Execute plan; Update plan; Analyze & refine software requirements; Analyze testability; Execute QA & CM plans
• Design: Define how the software is structured and how it works; Plan for tests; Define tests
• Code & Unit Test: Write the code; Test individual units of code; Develop test procedures
• Integration & Test: Combine units & check out; Test integrated units
• Operation & Maintenance: Manage software changes; Change the software; Retest as required; Maintain facilities; Maintain quality & configuration control

Page 15

Additional Insight

Breaking the development effort/WBS into phases results in tasks of shorter duration, contributing to EVM accuracy and early warning of project problems.

Page 16

No single measure should ever be used as the sole measure of evaluating status or to make program management decisions.

Base vs Derived Measures

• Base measures can be thought of as raw data. On their own, they provide little information on the status of the project.

• A derived measure combines data from two or more base measures to provide insight into the actual status of the project and the basis upon which alternative corrective courses of action can be developed and program management decisions made.

• Earned value is a formally defined derived measure. It is one of many measures that can be used to evaluate the status of a project during the software lifecycle. Some measures are useful throughout the project lifecycle; others are applicable to only specific tasks within specific development phases. In the majority of cases earned value is determined based on other derived software measures.
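The base/derived distinction can be made concrete. A minimal sketch (Python; the specific measures and values are illustrative, not taken from a real program):

```python
# Base measures: raw data collected directly from the project
base = {
    "defects_found": 120,   # from the defect tracking system
    "ksloc": 48.0,          # thousands of source lines, from a code counter
    "bcwp": 3_000_000,      # budgeted cost of work performed ($)
    "acwp": 5_000_000,      # actual cost of work performed ($)
}

# Derived measures: combinations of base measures that support decisions
defect_density = base["defects_found"] / base["ksloc"]   # defects per KSLOC
cpi = base["bcwp"] / base["acwp"]                        # cost performance index

print(f"defect density: {defect_density:.1f} defects/KSLOC")
print(f"CPI: {cpi:.2f}")
```

No single number above tells the whole story: 120 defects is meaningless without size, and a CPI without quality data can reward shipping defective work.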

Page 17

Software Measures

• Level of Effort (LOE)
• Schedule Milestones
• Software Defects
• Test Procedures/Cases
• Modules
• Source Lines of Code (SLOC) / Equivalent SLOC (ESLOC)
• Function Points
• Technical Performance Measures (TPMs)
• Requirements

Page 18

Requirements: Measure Effectiveness - GOOD

• Since requirements are the ultimate driver of software cost and schedule, they are also an excellent choice for determining earned value.

• Requirements:
  – Are applicable to all phases of system and software development.
  – Are directly related to producing the functionality the Customer wants in a new system.
  – Other metrics/measures that are only indirectly related to implementing the desired functionality can inject errors into the earned value calculations.

If requirements are not considered when determining earned value, then earned value will not reflect actual progress in meeting the requirements.
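Taking earned value directly on requirements might be sketched as follows (Python; the equal-budget allocation and the 0/100 crediting rule, meaning no credit until a requirement passes verification, are common conventions assumed here, not something the brief mandates):

```python
def requirements_bcwp(bac, statuses):
    """BCWP when each requirement carries equal budget, credited 0/100:
    full credit only once the requirement is verified."""
    total = len(statuses)
    done = sum(1 for s in statuses if s == "verified")
    return bac * done / total

# 10 requirements, 4 verified, on a $2M software effort (illustrative)
statuses = ["verified"] * 4 + ["in_test"] * 3 + ["in_work"] * 3
print(f"BCWP = ${requirements_bcwp(2_000_000, statuses):,.0f}")
```

Because credit accrues only on verified requirements, this rule cannot report the "full credit for incomplete tasks" problem listed on the Current Challenges slide.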

Page 19

S/W Measures Summary

Page 20

Measures for Phases

Page 21

Rework

• It is unreasonable to assume that no defects will be detected in any of the requirements, design, or code.

• Time and effort for rework is usually based on the developer's estimate of the number of defects likely to occur and the average amount of time required to correct such defects.

• If rework phases are not planned for, attempting to decide on the spur of the moment how to implement rework can cause severe problems for the earned value system.

• Programmatically, any project plan that does not include time for rework is unexecutable and calls into question the maturity of the developing organization.

• The developer must take into consideration that some percentage of the requirements will not pass testing.

Page 22

Rework (Cont.)

• Rework must include not only time to correct the flaw in requirements, design, and/or code that caused the problem, but also time to retest the corrected software. In a multi-release/build development, this may mean that some or all of the failed requirements will be rolled into the next build/release.

• Rework should be planned and tracked in work packages separate from the initial development of requirements, design, and code. In planning incremental builds, every build must include budget and schedule for rework of requirements, design, and code to correct defects found in the current and previous builds.

• To ensure adequate budget and period of performance, the planning assumptions for rework should include the planned rate or number of defects expected and the budgeted resources to fix them. Failure to establish a baseline plan for rework and to objectively measure rework progress has caused many projects to get out of control.

• All of this must be taken into account in the project plan.
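The planning assumptions described above (expected defect counts and average fix effort) translate directly into a budgeted rework work package. A sketch (Python; the defect rate, hours per fix, and labor rate are illustrative placeholders for a developer's own historical data):

```python
def rework_budget(esloc_k, defects_per_ksloc, hours_per_fix, labor_rate):
    """Planned rework size from expected defect count and average fix effort.
    Returns (expected_defects, budgeted_cost)."""
    expected_defects = esloc_k * defects_per_ksloc
    hours = expected_defects * hours_per_fix   # includes retest time per fix
    return expected_defects, hours * labor_rate

defects, cost = rework_budget(
    esloc_k=50,            # 50 KESLOC build (assumed)
    defects_per_ksloc=6,   # historical defect discovery rate (assumed)
    hours_per_fix=8,       # average hours to fix and retest a defect (assumed)
    labor_rate=120,        # $/hour (assumed)
)
print(f"plan for {defects:.0f} defects, ${cost:,.0f} rework budget")
```

Putting this amount into baselined rework work packages, rather than hoping Management Reserve covers it, is exactly the practice the slide recommends.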

Page 23

Conclusion: Final thoughts

• Ability to manage software will be dictated by visibility into the effort.
  – Where is the software in the WBS?
  – Is it reported by phase and build to optimize the ability to track requirements & functionality?

• SW measures not ideal for taking EV still help answer "where are we?", "how much is it going to cost?", and "how long will it take?"
  – Defects (finding and burning down)
  – Headcounts (look for sharp drop-offs in support)

• Visibility into subcontracted & "black" SW efforts

• Rework is part of the plan (baseline), not managed through Management Reserve

Page 24

For More Information

• Brenda Bizier (301) [email protected]

• Amy Houle Caruso (301) [email protected]

• NAVAIR Using Software Metrics and Measurements for Earned Value Toolkit

https://acc.dau.mil/CommunityBrowser.aspx?id=19591&lang=en-US