Software Management Metrics, Herman P. Schultz



Page 1: Software Management Metrics Herman P. Schultz

Software Management Metrics

Herman P. Schultz, 1988

EEL6887: Software Engineering

Chi-Hwa J Marcos

3/29/2006

Page 2: Software Management Metrics Herman P. Schultz

04/11/23 Presenter's Name 2

Reference Sources

Herman P. Schultz, “Software Management Metrics”, Hanscom AFB, MA, 1988

David L. Hallowell, “Six Sigma Software Metrics”, iSixSigma LLC, http://software.isixsigma.com/library/content/c030910a.asp

Karl E. Wiegers, “A Software Metrics Primer”, Software Development, July 1999, http://www.processimpact.com/articles/metrics_primer.pdf

Page 3: Software Management Metrics Herman P. Schultz

Overview

Introduction
Coverage
Reporting
Analysis
  Correlation
  Extrapolation
Software Size Metric
Software Personnel Metric
Software Volatility Metric
Computer Resource Utilization Metric
Schedule Progress Metric
Metric Tool Examples
Summary
Conclusion

Page 4: Software Management Metrics Herman P. Schultz

Introduction

“metric -- A quantitative measure of the degree to which a system, component, or process possesses a given attribute.” [IEEE Std 610.12-1990]

This report was the result of approximately three years of government and industry experience in the use and analysis of metrics. Metrics data can detect potential problems in a software project while there is still time to find a resolution. These potential problems may impact cost and schedule.

Page 5: Software Management Metrics Herman P. Schultz

Coverage

Metrics should cover all phases of software development.

Metrics can cover some development phases more than once.

Multiple coverage provides better visibility into each development phase.

Multiple phases allow consistency checks of metrics.

Metrics address two aspects of software development:

Progress metrics track deviation between planned and actual progress.

Planning metrics affect software development progress.

Page 6: Software Management Metrics Herman P. Schultz

Reporting

Recommends pre-Program Management Review (PMR) screening.

For example:
Deliver metrics to the government at least one week prior to the PMR.
Discuss metrics during a Technical Interchange Meeting (TIM).
Discuss TIM results with the System Program Office (SPO) to separate issues between the PMR and TIMs.

Metrics presented at the PMR provide management with visibility into potential cost and schedule impact problems.

Page 7: Software Management Metrics Herman P. Schultz

Analysis

Metrics provide a means of evaluating the credibility of the software plan.

Metrics identify trends.

Two analysis methods:

Correlation
• Identify strong relationships between reported metrics.
• Look for inconsistencies within a group of related metrics.

Extrapolation
• Identify trends.
• Show potential impact on schedules.
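The extrapolation method above can be sketched as a simple linear trend fit projected forward to a future reporting period. The metric values and dates here are hypothetical, and a least-squares line is only one possible trend model.

```python
# Extrapolation sketch: fit a linear trend to monthly metric reports and
# project it forward. The data values below are hypothetical.

def fit_line(xs, ys):
    """Ordinary least-squares fit; returns (slope, intercept)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def extrapolate(xs, ys, future_x):
    """Project the fitted trend to a future reporting period."""
    slope, intercept = fit_line(xs, ys)
    return slope * future_x + intercept

# Example: open problem reports over months 1-6, projected to month 9.
months = [1, 2, 3, 4, 5, 6]
open_reports = [10, 14, 19, 25, 29, 34]
projected = extrapolate(months, open_reports, 9)
```

If the projected value crosses a planning threshold before a scheduled milestone, that is the early warning of schedule impact the slide describes.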

Page 8: Software Management Metrics Herman P. Schultz

Correlation Example

Page 9: Software Management Metrics Herman P. Schultz

Extrapolation Example

Page 10: Software Management Metrics Herman P. Schultz

Extrapolation Example

Page 11: Software Management Metrics Herman P. Schultz

Extrapolation Example

Page 12: Software Management Metrics Herman P. Schultz

Extrapolation Example

Page 13: Software Management Metrics Herman P. Schultz

Extrapolation Example

Page 14: Software Management Metrics Herman P. Schultz

Software Size Metric

Purpose
Track magnitude changes in the software development effort (SLOC).

Behavior
A lack of understanding and appreciation of requirements can cause an increase or decrease in SLOC.
Increased SLOC:
• Better understanding of requirements.
• Better understanding of design implications and complexity.
• Optimistic original estimate.
Decreased SLOC:
• Overestimate at the beginning of the program.

Data Inputs
Estimated new SLOC
Estimated reused SLOC
Estimated modified SLOC
Estimated total SLOC

Page 15: Software Management Metrics Herman P. Schultz

Software Size Metric Cont.

Tailoring Ideas
Delete SLOC types that are not applicable.
Report data separately for each coding language used.
Require separate reporting for each processor and/or CSCI.
Report object code size.

Interpretation Notes
SLOC should not vary from the previous reporting period by more than 5%.
If SLOC varies by more than 5% from the previous reporting period:
• The software developer provides a detailed explanation.
• Related discussion regarding cost and schedule improvements.
Total SLOC does not relate linearly to effort.
New SLOC requires different effort than reused or modified SLOC.
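The 5% interpretation rule above can be sketched as a small period-over-period check. The SLOC figures are hypothetical; the 5% threshold is the one the slide gives.

```python
# Sketch of the 5% SLOC variance rule: flag a report whose total SLOC
# moved more than 5% since the previous reporting period. Figures are
# hypothetical.

def sloc_variance(previous, current):
    """Fractional change in SLOC between two reporting periods."""
    return (current - previous) / previous

def needs_explanation(previous, current, threshold=0.05):
    """True when the change exceeds the threshold in either direction."""
    return abs(sloc_variance(previous, current)) > threshold

# Example: total SLOC grew from 120,000 to 131,000 (about +9.2%),
# so the developer owes a detailed explanation.
flagged = needs_explanation(120_000, 131_000)
```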

Page 16: Software Management Metrics Herman P. Schultz

Software Size Metric Example

Page 17: Software Management Metrics Herman P. Schultz

Software Personnel Metric

Purpose
Tracks staffing against plan to maintain the planned level and sufficient staffing to complete the task on schedule.

Behavior
Too few experienced personnel will cause difficulties.
Bringing in many personnel at later phases will cause difficulties.
Normal shape of the total staffing profile:
• Grows through the design phases.
• Peaks through the coding and testing phases.
• Gradually tapers off through the integration phases.
Normal shape of the experienced staff profile:
• High during the initial stages of the project.
• Dips slightly during CSU development.
• Grows during testing.

Page 18: Software Management Metrics Herman P. Schultz

Software Personnel Metric Cont.

Data Inputs
Initial:
• Planned total personnel level for each month of the contract.
• Planned experienced personnel level for each month of the contract.
• Expected attrition rate.
Each reporting period:
• Total personnel.
• Experienced personnel.
• Unplanned personnel losses.

Tailoring Ideas
Report staffing separately for each development task.
Report staffing separately for special development skills needed.
Report staffing separately for each development organization.

Page 19: Software Management Metrics Herman P. Schultz

Software Personnel Metric Cont.

Interpretation Notes
Understaffing results in schedule slippage.
Adding staff too late will seldom improve the schedule and often causes more delays.
A high turnover rate among experienced personnel will cause schedule delays.
The initial staffing level should be at least 25% of the average staffing level.
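The 25% guideline in the last note can be sketched as a check against the monthly staffing plan. The plan values are hypothetical.

```python
# Sketch of the 25% staffing guideline: the planned initial staffing level
# should be at least a quarter of the average planned level over the
# contract. The monthly plan below is hypothetical.

def initial_staffing_ok(monthly_plan, ratio=0.25):
    """True when month-one staffing meets the minimum fraction of average."""
    average = sum(monthly_plan) / len(monthly_plan)
    return monthly_plan[0] >= ratio * average

# Example: starting with 3 people against an average of about 14 fails
# the check, signaling likely early schedule slippage.
plan = [3, 8, 14, 18, 20, 20, 18, 12]
ok = initial_staffing_ok(plan)
```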

Page 20: Software Management Metrics Herman P. Schultz

Software Personnel Metric Example

Page 21: Software Management Metrics Herman P. Schultz

Software Volatility Metric

Purpose
Tracks changes in requirements.
Tracks the developers' understanding of the requirements:
• Software Action Items (SAIs).

Behavior
More requirements changes occur during the requirements analysis and preliminary design phases.
Changes after CDR may have a significant impact on schedule.
SAIs are expected to rise at each review and then taper off exponentially:
• Clear and complete specifications produce a smaller rise at each review.
• Good communication among developers and customers produces a higher rate of decay.

Page 22: Software Management Metrics Herman P. Schultz

Software Volatility Metric Cont.

Data Inputs
Current total number of requirements.
Cumulative number of requirements changes (additions, deletions, and modifications).
Number of new SAIs.
Cumulative number of open SAIs.

Tailoring Ideas
Track longevity of SAIs.
Track open SAIs by priority.

Interpretation Notes
Requirements volatility between CDR and TRR could cause significant schedule impact.
SAIs open for more than 60 days should be closely examined.
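The 60-day aging note above can be sketched as a report that lists overdue open items. The SAI identifiers and dates are hypothetical.

```python
# Sketch of the 60-day SAI aging check: list the open Software Action
# Items that have been open longer than 60 days as of a reporting date.
# The SAI records below are hypothetical.
from datetime import date

def stale_sais(open_sais, as_of, limit_days=60):
    """Return IDs of SAIs open longer than limit_days on the as_of date."""
    return [sai_id for sai_id, opened in open_sais
            if (as_of - opened).days > limit_days]

# Example reporting period: two of the three open items need a close look.
sais = [("SAI-101", date(2006, 1, 5)),
        ("SAI-107", date(2006, 1, 20)),
        ("SAI-112", date(2006, 3, 15))]
stale = stale_sais(sais, as_of=date(2006, 3, 29))
```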

Page 23: Software Management Metrics Herman P. Schultz

Software Volatility Metric Example

Page 24: Software Management Metrics Herman P. Schultz

Software Volatility Metric Example

Page 25: Software Management Metrics Herman P. Schultz

Computer Resource Utilization Metric

Purpose
Tracks changes in estimated/actual utilization of target machine resources:
• CPU, memory, and I/O.

Behavior
Most systems experience upward creep in resource utilization.
Large systems typically reserve 50% of resources for growth.
Dependencies among resources result in parallel movements of resources.

Page 26: Software Management Metrics Herman P. Schultz

Computer Resource Utilization Metric Cont.

Data Inputs
Initial:
• Planned spare for each resource.
Each reporting period:
• Estimated/actual percentage of CPU utilization.
• Estimated/actual percentage of memory utilization.
• Estimated/actual percentage of I/O channel utilization.

Page 27: Software Management Metrics Herman P. Schultz

Computer Resource Utilization Metric Cont.

Tailoring Ideas
Report combined utilization in a multi-resource architecture that uses a load-leveling operating system.
Report utilization separately in a multi-resource architecture that has dedicated functions.
Report average and worst-case utilization.
Report separately for development and target processors.
Consider the memory addressing limit of the architecture when establishing utilization limits.

Interpretation Notes
Performance deteriorates rapidly when utilization exceeds 70 percent for real-time applications.
Plan for resource expansion.
Optimization forced by resource usage approaching its limit will increase cost and schedule.
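The two thresholds the slide states (the 70% real-time limit and the 50% growth reserve) can be sketched as a per-resource check. The utilization figures are hypothetical.

```python
# Sketch of the utilization checks above: flag resources whose estimated
# utilization exceeds the 70% real-time threshold, and verify that the
# 50% growth reserve still holds. The figures below are hypothetical.

REALTIME_LIMIT = 0.70   # performance degrades rapidly past this point
GROWTH_RESERVE = 0.50   # large systems typically reserve this much

def over_limit(utilization, limit=REALTIME_LIMIT):
    """Names of resources whose current utilization exceeds the limit."""
    return [name for name, used in utilization.items() if used > limit]

utilization = {"cpu": 0.76, "memory": 0.55, "io": 0.41}
hot = over_limit(utilization)
reserve_ok = all(used <= 1.0 - GROWTH_RESERVE for used in utilization.values())
```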

Page 28: Software Management Metrics Herman P. Schultz

Computer Resource Utilization Metric Example

Page 29: Software Management Metrics Herman P. Schultz

Schedule Progress Metric

Purpose
Tracks delivery of software work packages defined in the Work Breakdown Structure (WBS) against scheduled delivery.

Estimated Schedule (months) = Program Schedule (months) / (BCWP / BCWS)

BCWP - budgeted cost of work performed
BCWS - budgeted cost of work scheduled

Behavior
Tends to initially fall behind due to insufficient time allocated to the design process.
Likely to fall behind during testing due to inadequate test planning and testing at the CSU and CSC levels.
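The estimated-schedule formula above stretches the baseline schedule by the inverse of earned-value progress. A minimal sketch, with hypothetical dollar figures:

```python
# Sketch of the schedule estimate: Program Schedule (months) divided by
# the earned-value ratio BCWP / BCWS. The inputs are hypothetical.

def estimated_schedule_months(program_months, bcwp, bcws):
    """Estimated Schedule (months) = Program Schedule / (BCWP / BCWS)."""
    return program_months / (bcwp / bcws)

# Example: a 24-month program that has earned only $800K of the $1M of
# work scheduled so far projects out to roughly 30 months.
months = estimated_schedule_months(24, 800_000, 1_000_000)
```

A ratio below 1 (behind plan) stretches the estimate; a ratio above 1 shortens it.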

Page 30: Software Management Metrics Herman P. Schultz

Schedule Progress Metric Cont.

Data Inputs
Initial:
• Number of months in the program schedule.
Each reporting period:
• BCWP for software.
• BCWS for software.
• Number of months in the program schedule, if revised.

Tailoring Ideas
Track progress separately for each CSCI.

Interpretation Notes
Can be extrapolated to identify trends.
If the trend is up, it indicates a worsening condition.
If the trend is down, productivity is under control and improving.

Page 31: Software Management Metrics Herman P. Schultz

Schedule Progress Metric Example

Page 32: Software Management Metrics Herman P. Schultz

Imagix Metric Tool

Page 33: Software Management Metrics Herman P. Schultz

Slim-Metric

Page 34: Software Management Metrics Herman P. Schultz

Slim-Metric Cont.

Page 35: Software Management Metrics Herman P. Schultz

Essential Metric

Page 36: Software Management Metrics Herman P. Schultz

SEER SEM

Page 37: Software Management Metrics Herman P. Schultz

Summary

Metrics are a valuable management tool, allowing management to exercise control during each phase of a software development process.

Metrics provide control by giving different views of, or visibility into, each phase of a development process.

Metric analysis can identify trends which may have an impact on cost and schedule. Early detection of trends allows for effective recovery planning.

Page 38: Software Management Metrics Herman P. Schultz

Conclusions

Metric tools are used by both project managers and software developers. Project managers are more interested in planning and progress metrics. Software developers mainly focus on software-specific metrics such as defects, cyclomatic complexity, SLOC, etc.

Metric Tools
Essential Metric - http://www.powersoftware.com/em/screenshot.html
Imagix - http://www.imagix.com/products/metrics.html
McCabeIQ - http://www.mccabe.com/iq_qa.htm
SEER-SEM - http://www.gaseer.com/tools_sem.html
Semantic Design - http://www.semdesigns.com/Products/Metrics/index.html?Home=SoftwareMetrics
Slim Metric - http://www.qsm.com/slim_metrics.html
(Site with various metrics tools) - http://measurement.fetcke.de/products.html

Page 39: Software Management Metrics Herman P. Schultz

Conclusions Cont.

At CMM Level 2, basic management control is installed and software costs, schedules, and functionality are tracked; therefore, a limited set of metric gathering occurs at CMM Level 2. But it is not until CMM Level 4, where processes are measured and quality is quantified, that a complete set of metrics is gathered and trends identified.

Page 40: Software Management Metrics Herman P. Schultz

Acronyms

SRR - System Requirement Review
SDR - System Design Review
SSR - System Specification Review
PDR - Preliminary Design Review
CDR - Critical Design Review
TRR - Test Readiness Review

http://sparc.airtime.co.uk/users/wysywig/semp39.htm (descriptions for each review)