
Page 1: KPA Ltd.  and Tel Aviv University

Software Trouble Assessment Matrix

*This presentation is extracted from SOFTWARE PROCESS QUALITY: Management and Control by Kenett and Baker, Marcel Dekker Inc., 1998. It was first published as "Assessing Software development and Inspection Errors", Quality Progress, pp. 109-112, October 1994 with corrections in the issue of February 1995.

KPA Ltd. and Tel Aviv University

Assessing Software Inspection Processes with STAM*

Ron S. Kenett

Page 2: KPA Ltd.  and Tel Aviv University


Presentation agenda

Software life cycles
Inspection processes
Measurement programs
Assessing software inspection processes

Page 3: KPA Ltd.  and Tel Aviv University


Presentation agenda

Software life cycles
Inspection processes
Measurement programs
Assessing software inspection processes

Page 4: KPA Ltd.  and Tel Aviv University


Informal software life cycle

Artifacts: Marketing Requirements Spec, System Requirements Spec, System Design Spec, Software Requirements Spec, Software Test Spec, System Integration Spec, System Test Spec, Acceptance Test Spec

Design and Construction

Page 5: KPA Ltd.  and Tel Aviv University


Web applications life cycle

Design a little ...
Implement a little ...
Test a little ...

Page 6: KPA Ltd.  and Tel Aviv University


Formal software life cycle (flowchart)

Requirements stage (Program Statement):
Analysis - gather initial requirements, clarify requirements for understanding (Draft Requirements Specification)
Definition - analyze requirements, categorize to expose incomplete areas, and prioritize by importance (Requirements Specification)

Proposal stage:
Proposal and Project Planning - develop Proposal and Project Plans to fulfill project requirements (Proposal, Project Plans)

Design stage (Functional Description, Design)

Code stage (Code and Unit Test, Documentation)

Verification stage (Technical Testing, System Testing), then END

Change Control: "Change to Requirements?" decision points appear throughout the cycle; when requirements change, update all related documents, code, and tests to reflect the change.

Page 7: KPA Ltd.  and Tel Aviv University


Software Life Cycle Phases

Requirements Analysis
Top Level Design
Detailed Design
Programming
Unit Tests
System Tests
Acceptance Tests

Page 8: KPA Ltd.  and Tel Aviv University


Presentation agenda

Software life cycles
Inspection processes
Measurement programs
Assessing software inspection processes

Page 9: KPA Ltd.  and Tel Aviv University


The software development matrix

Key Activities (rows) and Supporting Practices (columns):

WORK-PRODUCTS: specifications, descriptions, drawings, source code, ...
WORK-PRODUCT DEVELOPMENT PRACTICES: procedures, standards, techniques, tools
SQA PRACTICES: WP evaluations, monitoring, measurement, internal auditing, corrective action, quality records
WORK-PRODUCT CONTROL PRACTICES: sw development library, document control, source code control, version development control, version release control, change control, ... ((F) = Future)

GENERAL: "CRSD (Core Requirements for Software Development) - The Software Development Process", "Documenting Non-Deliverable Software", "CRSD - Software Development Process Planning", "CRSD - Software Quality Assurance", "CRSD - Work Product Control", "Software Development Process Monitoring"

SOFTWARE: Project Planning - MS-Project documents, MS-Project

SOFTWARE REQUIREMENTS - SRS (Software Requirements Specification) (for new product), iDDE/SRS

SOFTWARE: Software Design - SDS (Software Design Specification), iDDE/SDS

SOFTWARE: Software Construction (Detailed Design, Coding, Unit Testing) - source code, ISEF/Codewright/C, "File Naming Rules for ISEF/C", "Identifier Naming Rules for ISEF/C", "Layout Rules for ISEF/C", "Pseudo-Code Guidelines", DAC (F), iTTS/Meeting, iTTS/PR, "Code Inspection" [tbd], "Module Testing", MKS/SI, iTTS/CR

(In the diagram the columns are labeled: Key Activities, Work Products, Work Product Development Practices, Inspection Practices, Work Product Control Practices.)

Page 10: KPA Ltd.  and Tel Aviv University


SEI Capability Maturity Model

Maturity Level - Characteristics - Software Inspection Features

Initial - Depends entirely on individuals - None
Repeatable - Policies, procedures, experience base - Writing-Task Rules, QA Policies, Inspection Procedures
Defined - Defined processes, peer reviews - Defect removal, Entry, Exit
Managed - Quantitative goals for product & process - Optimum rates, quality level at exit & entry, data summary, d-base
Optimizing - Entire organization focused on continuous process improvement - Defect Prevention Process, Improvements logging, Owners, Proc. Change Mgt. Team

Based on Paulk et al, “Capability Maturity Model Version 1.1”, IEEE Software, July 1993.

Page 11: KPA Ltd.  and Tel Aviv University


Presentation agenda

Software life cycles
Inspection processes
Measurement programs
Assessing software inspection processes

Page 12: KPA Ltd.  and Tel Aviv University


Software Measurement Programs

Chapter 4

Software Measurement Programs: Strategies and implementation issues

4.1 Introduction

Measurement is an integral part of the process management strategies. Chapter 3 discussed how the SEI Capability Maturity Model and the ISO 9000 standards emphasize the role of process in determining the quality of the products. In this chapter we focus on measurement of software development and maintenance processes. The topic of process measurement will be introduced in the context of organization objectives and management strategies. One or more management strategies may be used in an organization, but for process measurement to be a relevant activity, the net result should be a focus on increasing the quality and responsiveness of the organization.

Measurement ties several disciplines together, creating an environment where inputs, processes, and outputs are controlled and improved. Measurement is a basic need of any management activity. This chapter focuses on software measurement programs in the context of the management and control of software quality. Specifically we will cover measurement of software development and maintenance processes.

During the planning phase of a new project, measurement information from prior projects may be used to estimate parameters of the new project. During the execution of the project, process and resource measures can be collected and analyzed against such estimates. As products are created, product measures are collected and analyzed, yielding product metrics. The number of open bugs, per subsystem, is an example of a product metric.
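As a minimal illustration of such a product metric, the sketch below counts open bugs per subsystem from a list of defect records; the field names and sample data are hypothetical, not taken from the book.

```python
from collections import Counter

# Hypothetical defect records as they might come from a defect-tracking export:
# (subsystem, status) pairs.
defects = [
    ("billing", "open"), ("billing", "closed"), ("ui", "open"),
    ("ui", "open"), ("reports", "closed"), ("billing", "open"),
]

# Product metric: number of open bugs per subsystem.
open_bugs = Counter(subsystem for subsystem, status in defects if status == "open")

for subsystem, count in sorted(open_bugs.items()):
    print(f"{subsystem}: {count} open bug(s)")
```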

The answer to the question, “Why Measure?”, involves several other questions such as:

Are the software processes more productive today than last year? Less productive? Why?

Are the software processes delivering higher quality products today than last year? Lower quality? Why?

Are our customers more satisfied with our software processes today than they were last year? Less satisfied? Why?

Are the software processes more reliable this year than last year? Less? Why?
Do the software processes cost more today than last year? Less? Why?

Note that if one removes the words software processes from each of the questions above and inserts any other business process, the questions would still apply to organizations in general. Software processes should be treated the same as other business processes. Authors such as W. Edwards Deming, J.M. Juran, and Phil Crosby have made process modeling, measurements, and the ...


Page 13: KPA Ltd.  and Tel Aviv University


Measurement Program Implementation

Section - Phase - Activities

4.5.1 - Plan/Evaluate - Definition of goals, objectives, and benefits; establishment of sponsorship; communication and promotion of measurement; identification of roles and responsibilities
4.5.2 - Analyze - Audience analysis & target metrics identification; definition of software measures; definition of the data collection, analysis, and data storage approach
4.5.3 - Implement/Measure - Education; reporting and publishing results
4.5.4 - Improve - Managing expectations; managing with metrics

Page 14: KPA Ltd.  and Tel Aviv University


4.5.1 Plan/Evaluate Phase
4.5.1.1 Reasons for implementation

Establish a baseline from which to determine trends
Quantify how much was delivered in terms the client understands
Help in estimating and planning projects
Compare the effectiveness and efficiency of current processes, tools, and techniques
Identify and proliferate best practices
Identify and implement changes that will result in productivity, quality, and cost improvements
Establish an ongoing program for continuous improvement
Quantitatively prove the success of improvement initiatives
Establish better communication with customers
Manage budgets for software development more effectively

Measurement Program Implementation: Plan/Evaluate Phase

Page 15: KPA Ltd.  and Tel Aviv University


4.5.1 Plan/Evaluate Phase
4.5.1.2 Questions to help identify goals

How fast can we deliver reliable software to our customers? Does it satisfy their requirements?
Can we efficiently estimate the development cost and schedule? Are the estimates accurate?
What can we do to improve our systems-development life cycle and shorten the cycle time?
What is the quality of the software we deliver? Has it improved with the introduction of new tools or techniques?
How much are we spending to support existing software? Why does one system cost more than another to support?
Which systems should be re-engineered or replaced? When?
Should we buy or build new software systems?
Are we becoming more effective and efficient at software development? Why? Why not?
How can we better leverage our information technology? Has our investment in a particular technology increased our productivity?

Measurement Program Implementation: Plan/Evaluate Phase

Page 16: KPA Ltd.  and Tel Aviv University


4.5.1 Plan/Evaluate Phase
4.5.1.3 Identification of sponsors
4.5.1.4 Identification of roles and responsibilities

Who will decide what, how, and when to collect the measurement information?
Who will be responsible for collecting the measurement information?
How will the data be collected? What standards (internal or external) will be used?
At which phases will the data be collected? Where will it be stored?
Who will ensure consistency of data reporting and collection?
Who will input and maintain the measurement information?
Who will report measurement results? When? What will be reported to each level of management?
Who will interpret and apply the measurement results?
Who is responsible for training?
Who will maintain an active interest in the measurement program to ensure full usage of the measurement information?
Who will evaluate measurement results and improve the measurement program?
Who will ensure adequate funding support?

Measurement Program Implementation: Plan/Evaluate Phase

Page 17: KPA Ltd.  and Tel Aviv University


4.5.2 Analysis Phase
4.5.2.1 Analysis of audience and identification of target metrics
4.5.2.2 Definition of software metrics

4.5.3 Implement/Measure Phase
4.5.3.1 Organizing for Just In Time training and education processes
4.5.3.2 Reporting and publishing results

4.5.4 Improve Phase
4.5.4.1 Managing expectations
4.5.4.2 Managing with metrics

Measurement Program Implementation: Analysis/Implementation/Improve Phases

Page 18: KPA Ltd.  and Tel Aviv University

Statistics from formal assessments: "the tip of the iceberg"

Source: SEI, 1994 (number of organizations: 261) and 1997 (number of organizations: 606)

Page 19: KPA Ltd.  and Tel Aviv University


Most organizations are moving towards level 2

INITIAL
REPEATABLE

• Requirements Management
• Project Planning
• Project Tracking & Oversight
• Subcontract Management
• Quality Assurance
• Configuration Management

Page 20: KPA Ltd.  and Tel Aviv University


CMM Level 2 Key Process Areas

Requirements Management
Software Project Planning
Software Configuration Management
Software Quality Assurance
Software Subcontract Management
Software Project Tracking and Oversight

Page 21: KPA Ltd.  and Tel Aviv University


Software Development Management Dashboard: "it works only for organizations above level 2"

Dashboard panels: RM (Requirements Management), PP and PTO (Project Planning, Project Tracking and Oversight), QA (Quality Assurance), CM (Configuration Management)

Page 22: KPA Ltd.  and Tel Aviv University


Presentation agenda

Software life cycles
Inspection processes
Measurement programs
Assessing software inspection processes

Page 23: KPA Ltd.  and Tel Aviv University


Software Trouble Assessment Matrix

When were errors detected?
Depends on the inspection process efficiency, i.e., how it performs.

When could errors have been detected?
Depends on the inspection process effectiveness, i.e., how it was designed.

When were errors created?
Depends on the overall performance of the software development process.

Page 24: KPA Ltd.  and Tel Aviv University


Software Life Cycle Phases

Requirements Analysis
Top Level Design
Detailed Design
Programming
Unit Tests
System Tests
Acceptance Tests

Page 25: KPA Ltd.  and Tel Aviv University


When were errors detected?

Requirements Analysis: 3
Top Level Design: 7
Detailed Design: 2
Programming: 25
Unit Tests: 31
System Tests: 29
Acceptance Tests: 13

Page 26: KPA Ltd.  and Tel Aviv University


When were errors detected?

Life Cycle Phase - Number of Errors
Requirements Analysis - 3
Top Level Design - 7
Detailed Design - 2
Programming - 25
Unit Tests - 31
System Tests - 29
Acceptance Tests - 13

Cumulative profile = S1

Page 27: KPA Ltd.  and Tel Aviv University


When could errors have been detected?

Life Cycle Phase - Number of Errors
Requirements Analysis - 8
Top Level Design - 14
Detailed Design - 10
Programming - 39
Unit Tests - 8
System Tests - 26
Acceptance Tests - 5

Cumulative profile = S2

Page 28: KPA Ltd.  and Tel Aviv University


When were errors created?

Life Cycle Phase - Number of Errors
Requirements Analysis - 34
Top Level Design - 22
Detailed Design - 17
Programming - 27
Unit Tests - 5
System Tests - 5
Acceptance Tests - 0

Cumulative profile = S3

Page 29: KPA Ltd.  and Tel Aviv University


S1, S2, S3 cumulative profiles

[Chart: cumulative errors (0-120) by development cycle phase (1-7)]
S1 (when detected): 3, 10, 12, 37, 68, 97, 110
S2 (when they could have been detected): 8, 22, 32, 71, 79, 105, 110
S3 (when created): 34, 56, 73, 100, 105, 110, 110
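These profiles and the areas under them can be reproduced directly from the per-phase counts in the three preceding tables. A minimal Python sketch follows; the variable and function names are mine, not part of the original material.

```python
from itertools import accumulate

# Per-phase error counts from the three preceding tables
# (Requirements Analysis ... Acceptance Tests).
detected = [3, 7, 2, 25, 31, 29, 13]      # when were errors detected (S1)
could_detect = [8, 14, 10, 39, 8, 26, 5]  # when could errors have been detected (S2)
created = [34, 22, 17, 27, 5, 5, 0]       # when were errors created (S3)

def profile_and_area(counts):
    """Return the cumulative profile and the area under it (sum of the cumulative values)."""
    profile = list(accumulate(counts))
    return profile, sum(profile)

for name, counts in [("S1", detected), ("S2", could_detect), ("S3", created)]:
    profile, area = profile_and_area(counts)
    print(name, profile, "area =", area)

# Prints:
# S1 [3, 10, 12, 37, 68, 97, 110] area = 337
# S2 [8, 22, 32, 71, 79, 105, 110] area = 427
# S3 [34, 56, 73, 100, 105, 110, 110] area = 588
```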

Page 30: KPA Ltd.  and Tel Aviv University


The Software Trouble Assessment Matrix

Rows: phase in which the errors could have been detected. The left block tabulates the same errors by the phase in which they were created; the right block by the phase in which they were detected. (Req = Requirements Analysis, TLD = Top Level Design, DD = Detailed Design, Prog = Programming, UT = Unit Tests, ST = System Tests, AT = Acceptance Tests; a dash means no errors.)

When were errors created?
                            Req  TLD   DD  Prog   UT   ST   AT
Requirements Analysis         8    -    -     -    -    -    -
Top Level Design             10    4    -     -    -    -    -
Detailed Design               5    4    1     -    -    -    -
Programming                   5    4   15    15    -    -    -
Unit Tests                    -    2    -     6    -    -    -
System Tests                  5    5    -     6    5    5    -
Acceptance Tests              1    3    1     -    -    -    -
Errors created by phase      34   22   17    27    5    5    0
Cumulative frequency         34   56   73   100  105  110  110
Area under curve             34   90  163   263  368  478  588   (S3 = 588)

When were errors detected?
                            Req  TLD   DD  Prog   UT   ST   AT
Requirements Analysis         3    2    -     2    -    -    1
Top Level Design              -    5    1     3    1    4    -
Detailed Design               -    -    1     1    7    -    1
Programming                   -    -    -    19   20    -    -
Unit Tests                    -    -    -     -    3    5    -
System Tests                  -    -    -     -    -   20    6
Acceptance Tests              -    -    -     -    -    -    5
Errors found by phase         3    7    2    25   31   29   13
Cumulative frequency          3   10   12    37   68   97  110
Area under curve              3   13   25    62  130  227  337   (S1 = 337)

Where could the errors have been detected? (row totals)
                            Req  TLD   DD  Prog   UT   ST   AT
Errors by phase               8   14   10    39    8   26    5
Cumulative frequency          8   22   32    71   79  105  110
Area under curve              8   30   62   133  212  317  427   (S2 = 427)
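Conceptually, the matrix is a cross-tabulation of each reported error by three phase tags: the phase in which it was created, the earliest phase in which it could have been detected, and the phase in which it was actually detected. The Python sketch below shows one way such a tabulation could be produced from per-error records; the record format and the sample data are illustrative assumptions, not the authors' tooling.

```python
from collections import Counter

PHASES = ["Requirements Analysis", "Top Level Design", "Detailed Design",
          "Programming", "Unit Tests", "System Tests", "Acceptance Tests"]

# Illustrative error records: phase indices into PHASES for where each error was
# created, where it could first have been detected, and where it was detected.
errors = [
    {"created": 0, "could_detect": 0, "detected": 3},
    {"created": 0, "could_detect": 1, "detected": 5},
    {"created": 3, "could_detect": 3, "detected": 4},
    # ... one record per reported error (110 in the example above)
]

# Left block of the matrix: could-have-been-detected phase (row) vs. created phase (column).
created_block = Counter((e["could_detect"], e["created"]) for e in errors)
# Right block: could-have-been-detected phase (row) vs. detected phase (column).
detected_block = Counter((e["could_detect"], e["detected"]) for e in errors)

for row in range(len(PHASES)):
    left = [created_block.get((row, col), 0) for col in range(len(PHASES))]
    right = [detected_block.get((row, col), 0) for col in range(len(PHASES))]
    print(f"{PHASES[row]:>22}  created: {left}  detected: {right}")
```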

Page 31: KPA Ltd.  and Tel Aviv University


The Software Trouble Assessment Matrix

[The STAM matrix from Page 30 shown together with the S1, S2, S3 cumulative profile chart from Page 29.]

Page 32: KPA Ltd.  and Tel Aviv University


Definition of STAM Metrics

Negligence ratio: indicates the number of errors that escaped through the inspection process filters.
INSPECTION EFFICIENCY

Evaluation ratio: measures the delay of the inspection process in identifying errors relative to the phase in which they occurred.
INSPECTION EFFECTIVENESS

Prevention ratio: an index of how early errors are detected in the development life cycle relative to the total number of reported errors.
DEVELOPMENT PROCESS EXECUTION
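Written out in symbols, the three ratios are simple functions of the areas under the cumulative profiles. The LaTeX sketch below states the formulas implied by the computation on the next slide; the per-phase count symbols d_k, c_k, o_k and the phase count P are my notation, not the deck's.

```latex
% Per-phase error counts over the P = 7 life cycle phases:
%   d_k = errors detected in phase k
%   c_k = errors that could have been detected in phase k
%   o_k = errors created in phase k
% Areas under the cumulative profiles (running sums of the cumulative counts):
S_1 = \sum_{p=1}^{P}\sum_{k=1}^{p} d_k, \qquad
S_2 = \sum_{p=1}^{P}\sum_{k=1}^{p} c_k, \qquad
S_3 = \sum_{p=1}^{P}\sum_{k=1}^{p} o_k .

% The three STAM ratios (N = total number of reported errors):
\text{Negligence ratio} = 100\,\frac{S_2 - S_1}{S_1}, \qquad
\text{Evaluation ratio} = 100\,\frac{S_3 - S_2}{S_2}, \qquad
\text{Prevention ratio} = 100\,\frac{S_1}{P \cdot N}.
```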

Page 33: KPA Ltd.  and Tel Aviv University


Areas under cumulative profiles:

S1 = 337
S2 = 427
S3 = 588

Negligence ratio: 100 x (S2 - S1)/S1 = 26.7%

Evaluation ratio: 100 x (S3 - S2)/S2 = 37.7%

Prevention ratio: 100 x S1/(7 x total) = 43.7%


Computation of STAM Metrics

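A short Python sketch of this arithmetic, using the areas listed above (variable names are illustrative):

```python
# Areas under the cumulative error profiles (from this slide).
S1, S2, S3 = 337, 427, 588
total_errors = 110   # total number of reported errors
phases = 7           # number of life cycle phases

negligence = 100 * (S2 - S1) / S1                 # errors escaping the inspection filters
evaluation = 100 * (S3 - S2) / S2                 # delay in identifying errors
prevention = 100 * S1 / (phases * total_errors)   # how early errors are detected

print(f"Negligence ratio: {negligence:.1f}%")   # 26.7%
print(f"Evaluation ratio: {evaluation:.1f}%")   # 37.7%
print(f"Prevention ratio: {prevention:.1f}%")   # 43.8% (the slide rounds this to 43.7%)
```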

Page 34: KPA Ltd.  and Tel Aviv University


1. Errors are detected 27% later than they should have been (i.e., than if the inspection processes worked perfectly).

2. The design of the inspection processes implies that errors are detected 38% into the phase following their creation.

3. Ideally, all errors are requirements errors and they are detected in the requirements phase. In this example only 44% of this ideal is materialized, implying significant opportunities for improvement.

Interpretation of STAM Metrics

Page 35: KPA Ltd.  and Tel Aviv University


Inspection processes need to be designed in the context of a software life cycle.
Inspection processes need to be evaluated using quantitative metrics.
STAM metrics provide such an evaluation.
STAM metrics should be integrated in an overall measurement program.

Conclusions