
Peter Rösler, www.reviewtechnik.de

Software Inspection, Version 2009b_en

1 Software Inspection

• Introduction to Reviews and Inspections

• 15 Critical Success Factors

• Benefits of Inspection

• Video Scenes

• Exercise: Conduct an actual Inspection

• Tools

• Work Aids


2 Capability Maturity Model (CMM)

Level / Focus / Key Process Areas

Level 1 (Initial). Focus: heroes. KPAs: none at this time.

Level 2 (Repeatable). Focus: project management. KPAs: Requirements Management, SW Project Planning, SW Project Tracking and Oversight, SW Subcontract Management, SW Quality Assurance, SW Configuration Management.

Level 3 (Defined). Focus: engineering process. KPAs: Organization Process Focus, Org. Process Definition, Peer Reviews, Training Program, Intergroup Coordination, SW Product Engineering, Integrated Software Management.

Level 4 (Managed). Focus: product and process quality. KPAs: Software Quality Management, Quantitative Process Management.

Level 5 (Optimizing). Focus: continuous improvement. KPAs: Process Change Management, Technology Change Management, Defect Prevention.

Source: www.software.org/quagmire/descriptions/swcmm.asp

3 Software Inspection

• Introduction to Reviews and Inspections

• 15 Critical Success Factors

• Benefits of Inspection

• Video Scenes

• Exercise: Conduct an actual Inspection

• Tools

• Working Aids


4 Introduction to Reviews and Inspections

• Introduction / Terms and Definitions

• Roles of the Participants

• Phases of an Inspection

5 Terms and Definitions (1)

• In the following, we use the terms "review" and "software review" synonymously with "Inspection" / "formal inspection" / "Fagan/Gilb-style inspection".

• Caution: in the literature, a "review" often denotes an analytic QA activity that is clearly separated from an "Inspection" (see next slide).

• In that sense, this seminar deals exclusively with "Inspections"!

6 Types of Review of Documents

Inspection: main activity: find defects

• formal individual and group checking, using sources and standards, according to detailed and specific rules

Review: activity: decision making

• group discusses the document and makes a decision about the content, e.g. how something should be done, go or no-go decision

Walkthrough: activity: understanding

• author guides the group through a document and his or her thought processes, so all understand the same thing, consensus on change

(Other document review types: [Presentation Review], [Management Review / Project Status Review].)

Source: Dorothy Graham

7 Terms and Definitions (2)

"Major defect" (in contrast to "minor defect")

• a defect that may cause substantially higher costs if it is found later in the development process

Other common definitions:

• a defect that can be found by tests

• a defect that can be found by the user

The programmer makes an error (mistake, slip); as a result, the program has a defect (fault, bug, flaw), and the program execution may run into a failure (malfunction, crash).

8 Terms and Definitions (3)

For programs:

NLOC

• non-commentary lines of code

For text documents:

"Page", often also called "net page" (vs. "print page")

• 300 logical ("non-commentary") words

"Inspection rate"

• Checking rate, indicated in NLOC/h for programs and in pages/h for text documents.
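The size metrics above can be illustrated in code. A minimal sketch (Python; the counting rules and helper names are my own simplification, and a real NLOC counter would also handle block comments and strings):

```python
def nloc(source: str) -> int:
    """Count non-commentary lines of code: blank lines and
    pure '//' comment lines are excluded (simplified rule)."""
    return sum(
        1
        for line in source.splitlines()
        if line.strip() and not line.strip().startswith("//")
    )

def net_pages(text: str, words_per_page: int = 300) -> float:
    """Document size in 'net pages' of 300 logical words each."""
    return len(text.split()) / words_per_page

snippet = "int add(int a, int b) {\n// sum of a and b\nreturn a + b;\n}"
print(nloc(snippet))               # 3 (the comment line is not counted)
print(net_pages("word " * 600))    # 2.0 net pages
```

An inspection rate in NLOC/h or pages/h then follows by dividing these sizes by the checking time.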

9 Experiences of other Companies

Source: Humphrey 1989, Managing the software process, p186/187

• An AT&T Bell Laboratories project with 200 professionals instituted several changes, including inspections. Productivity improved by 14% and quality by a factor of ten.

• Aetna Insurance Company: inspections found 82% of the errors in a COBOL program; productivity increased by 25%.

• Another COBOL example (Gilb, Software Metrics): 80% of the development errors were found by inspections; productivity increased by 30%.

For further examples see notes page of this slide

10 Hidden Cost of Defect Rework

[Chart: distribution of effort across the phases Requirements, Preliminary Design, Detailed Design, Code & Unit Test, and Integration & System Test, split into production (56%) and rework (44%).]

Source: Wheeler 1996, Software Inspection: An Industry Best Practice, p. 9

Approx. 2/3 of the rework effort can be avoided by Inspections!

11 Relative Costs to fix Defects

Source: Tom Gilb, Software Engineering Management; data of Standard Chartered Bank

»Some maladies, as doctors say, at their beginning are easy to cure but difficult to recognize ... but in the course of time when they have not at first been recognized and treated, become easy to recognize but difficult to cure.«  Niccolo Machiavelli (1469-1527)

[Chart: relative cost units to fix a defect: roughly 1-1.5 during design and before code, about 10 before test, about 60 during test, and about 100 in production.]

12 Time of Error Discovery (with Inspections)

[Two charts: number of errors found (0-50) per phase (During Design, Before Code, Before Test, During Test, In Production), comparing the distributions without and with Inspections.]

Source: Tom Gilb, Software Engineering Management; data of Standard Chartered Bank

13 People Resource with and without Inspections

[Schematic chart: people resource over the schedule (Planning/Requirements, Design, Code, Testing, Ship), without Inspections and using Inspections.]

Source: Fagan 1986 (schema)

14 Successful Projects are not the Normal Case!

[Chart: shares of projects that failed, were challenged, or succeeded.]

Source: The Standish Group (2004)

15 Two Magic Potions …

… to avoid failing projects:

• Software Inspection

• Incremental Software Development

16 Incremental Software Development

The costs of the worst Mj defects are bounded above:

• with weekly increments, the maximum loss is:

1 work week * number of coworkers

• with the waterfall model, possibly:

2 years * number of coworkers
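The bound above is plain arithmetic; a small sketch (Python; the 40-hour work week and the example team size are my own illustrative assumptions):

```python
def max_loss_hours(increment_weeks: float, team_size: int,
                   hours_per_week: float = 40.0) -> float:
    """Upper bound on the effort lost to a worst-case Mj defect:
    at most one full increment of the whole team's work."""
    return increment_weeks * team_size * hours_per_week

team = 5
print(max_loss_hours(1, team))       # weekly increments: 200.0 h at risk
print(max_loss_hours(2 * 52, team))  # 2-year waterfall: 20800.0 h at risk
```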

17 Software Inspection

The number of Mj defects passed on to the next phases is reduced:

• a single Inspection finds about 50% of the existing defects (some companies achieve about 90%)

• cumulated, the customer receives 10-100 times fewer defects

Source: Gilb 1988, Principles of Software Engineering Management, Table 12.2; see also SI p. 24
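The cumulative figure can be reproduced under a simplifying assumption (mine, not the slide's) that each inspected phase independently removes about 50% of the defects reaching it:

```python
def escaped_fraction(n_inspections: int, effectiveness: float = 0.5) -> float:
    """Fraction of Mj defects surviving a chain of inspections,
    each catching `effectiveness` of the defects it sees."""
    return (1.0 - effectiveness) ** n_inspections

# With ~50% per inspection, 4 to 7 inspected phases already give
# the 10-100 times reduction quoted above:
print(1 / escaped_fraction(4))  # 16.0 times fewer defects reach the customer
print(1 / escaped_fraction(7))  # 128.0 times fewer
```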

18 Learning Curve with Incremental Software Development

The number of Mj defects that the authors inject drops along the learning curve.

• The authors constantly receive feedback (with each increment) about which defects they produced.

19 Learning Curve with Software Inspection

The number of Mj defects that the authors inject drops along the learning curve.

• The authors constantly receive feedback (with each Inspection) about which defects they produced.

For numerical values of learning curves, see the slides "Marie's Personal Learning Curve", "Gary's Personal Learning Curve", "Learning Curve of an Entire Organization", and "Learning Curve and Positive Motivation".

20 Introduction to Reviews and Inspections

• Introduction / Terms and Definitions

• Roles of the Participants

• Phases of an Inspection

21 Roles of the Participants

• Moderator (or Inspection Leader)

• Author

• Scribe (or Recorder)

• Checker (or Reviewer, Inspector)

• Reader (only necessary if "double checking" is intended)

A participant can take on several roles. The only restriction: the author may additionally take on at most the checker role.

22 Introduction to Reviews and Inspections

• Introduction / Terms and Definitions

• Roles of the Participants

• Phases of an Inspection

23 Overall Process Map

[Diagram: the Inspection process: Entry, Planning, Kickoff, Checking, Logging, Process Brainstorming, Edit, Follow-up, Exit; with the work products: sources, product, checklists, master plan, Inspection issue log, process brainstorm log, data summary, change requests to project and process, and the exited product.]

Source: Tom Gilb, Team Leader Course

24 Entry

• Check that the Inspection makes sense, i.e.

• the product document (the document under test) and the source documents have to meet the entry criteria.

For examples, see slide 73 ("Release of Document Only in Case of Good Quality")

25 Planning

• The participants and their roles are specified

• The dates are fixed

• If necessary, the document is split into inspectable parts ("chunks")

The result is the "master plan".
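Chunking can be sketched as simple division. A hypothetical planning helper (Python; the function name is mine, and the 1 p/h optimum rate and two-hour meeting limit are taken from later slides):

```python
def chunk_pages(total_pages: float, rate_pages_per_hour: float = 1.0,
                session_hours: float = 2.0) -> list:
    """Split a document into chunks small enough to be checked
    at the optimum rate within one two-hour session."""
    chunk_size = rate_pages_per_hour * session_hours
    chunks, remaining = [], total_pages
    while remaining > 0:
        chunks.append(min(chunk_size, remaining))
        remaining -= chunks[-1]
    return chunks

print(chunk_pages(13))  # seven chunks of at most 2 pages each
```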

26 Team Selection Table

[Table 8.1 (SI p. 159): team selection. Rows: employee types (task author, requirements analyst, architectural designer, detailed designer, programmer, tester, maintainer, user, manager, marketing, legal department); columns: document types (contract, requirements, architectural design, detailed design, test plan, test design, source code, technical documentation, user manual); an "x" marks which employee types are candidate checkers for which document type.]

Source: SI p. 159, Table 8.1

27 Kickoff Meeting (optional)

• The author gives the checkers enough background information about the document under test that they can really find defects.

• The moderator explains the distribution of roles, checking strategies, and changes in the Inspection process.

The documents are distributed to the checkers in the kickoff meeting at the latest.

Practice hint: insert line numbers and change bars in the document!

28 Individual Checking

• Find potential “major defects”

• and record them

• Adhere to optimum checking rate

• Use checklists

Nearly all of the defects (approx. 80-99%) that the Inspection team can detect are detected in this phase!

29 Time of Error Discovery in the Phase "Individual Checking", Checker 1

When did Checker 1 find the defects? (A 2-page text document had to be checked.)

[Chart: the times at which Checker 1 found Major defects over ca. 120 minutes, annotated with the momentary checking rate in p/h (from ∞ down to about 1 p/h), where checklists were used, and when the checker finished checking.]

Source: Peter Rösler, www.reviewtechnik.de, 2005

30 Time of Error Discovery in the Phase "Individual Checking"

[Chart: defect discovery times for checkers 1-10 (checkers 11-28 not depicted) over ca. 120 minutes, for exercise example 1 (text document) and exercise example 2 (C program).]

Source: Peter Rösler, www.reviewtechnik.de, 2005

31 Defects Found vs. Checking Time

[Cumulative histogram for 28 checkers: percentage of defects found against percentage of individual checking time: a constant error discovery rate! The 80-20 rule is not valid here!]

Source: Peter Rösler, www.reviewtechnik.de, 2006

32 Logging Meeting (1)

3 goals:

• Record defects found

• Find further defects by "double checking" (the "Phantom Inspector")

• Suggest process improvements and ask questions

33 Logging Meeting (2)

• The document is checked, not the author!

• No discussions about defects and solutions

• High logging rate (> 1 defect per minute)

• If "double checking" is performed: the optimum Inspection rate has to be met

The output is the “Inspection issue log”.

34 Inspection Issue Log

[Form: Inspection Issue Log. Inspection identification: GD5; page 1 of 9. Columns: No.; document reference tag; doc. page; line no. / location; type of item (circle one of Major / Minor / PI / New / ?); checklist or rule tag; description (key offending words); number of occurrences; editor note.

Sample entries:

1. GRMOD, p. 1, line 4: CI4, 'interface spec'
2. GRMOD, p. 1, lines 15-17: CU2, new user guidelines not followed
3. GDT, p. 1: test plan format should correspond with GDC
4. STP, p. 2, lines 3.7-12: CHK-7, test missing
5. GRMOD, p. 2, line 13: SDC1, wrong format (12 occurrences)
6. GRMOD, p. 2, lines 44-46: meaning unclear

Subtotals: new issues found in the meeting: 1 (among items logged on this page); Major issues logged: 3; Minor issues logged: 12; process improvements logged: 1; questions to the author logged: 1.]

Form owner: TsG/DG. Version: 4th May 1998

35 Process Brainstorming Meeting

For some "major defects":

• the root causes are identified ("Why?")

• and improvement ideas are created

The output is the “process brainstorm log”, a log of improvement suggestions to the software development process.

36 Edit (= Rework)

• The author fixes the defects in the document

• and creates change requests if necessary

37 Follow-up

• The moderator checks whether the author dealt with all problems

• and enters the statistical data ("data summary") into the QA database

38 Exit

• Decision whether the document can be released or a re-inspection is necessary

• based on exit criteria

For examples, see slide 73 ("Release of Document Only in Case of Good Quality")

39 Software Inspection

• Introduction to Reviews and Inspections

• 15 Critical Success Factors

• Benefits of Inspection

• Video Scenes

• Exercise: Conduct an actual Inspection

• Tools

• Work Aids


40 Success Factors of an Inspection

• Most of the following 15 success factors are probably not explicitly required by the Inspection process of your company, but probably not "forbidden" either.

• As moderator, you simply should know these 15 success factors, so that you can decide on an individual basis which of them to use and which not.

Michael Fagan, inventor of formal software Inspections (IBM, 1976), already knew success factors 1-10.

41 Success Factors 1-5

1. Inspections are carried out everywhere in the development process.

2. Inspections detect all classes of defects, not only defects that can easily be found by tests.

3. Inspections should be psychologically low-stress. The direct boss of the author should not be part of the Inspection team.

4. Inspections are carried out in a prescribed series of steps ("phases").

5. Checklists help to find as many defects as possible.

42 Example of a Checklist

Example of a checklist for module specifications:

• CL-MODSP-1: Does the module export generation functions, initialization and restart functions?

• CL-MODSP-2: Are the function specifications described in the "pre/post-condition style”?

• CL-MODSP-3: Are the name conventions kept for the function names?

• ...

43 Success Factors 6-10

6. Inspection meetings are limited to two hours.

7. Checkers are assigned specific roles (or checking strategies).

Thus the effectiveness is increased.

8. Inspections are led by a trained moderator.

9. Statistics are kept. Thus the Inspection process can be optimized and the benefits of Inspections can be quantified.

10. Material is inspected at the optimum Inspection rate.

15 slides follow on the topics "optimum Inspection rate" (measured in pages/h for text documents and NLOC/h for programs) and "effectiveness of an Inspection".

44 Effectiveness against Inspection Rate

[Chart: average defects found (out of a maximum of 21 known) against Inspection rate: markedly more defects are found at 70-122 lines/hour than at 130-285 lines/hour.]

Source: Frank Buck, 1981, Technical Report 21.802, September, IBM

45 Defect Density against Inspection Rate

[Chart: defect density found (defects/page, up to ~15) against Inspection rate (20-100 pages/hour): the density drops sharply as the rate increases.]

Source: Tom Gilb, Denise Leigh, Software Inspection p. 334; 230 inspections of the Sema Group (GB)

46 Recommended Inspection Rates

Programs

• 100-150 NLOC / h

Text documents

• Gilb/Graham: ca. 1 page / h ("typically 0.3-1.0", "avoid rates above 2 p/h")

• Strauss/Ebenau: 3-5 pages / h

Further data: see the notes page of this slide
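The rates above translate directly into per-checker time budgets. A small sketch (Python; the document sizes in the example are my own):

```python
def checking_hours(size: float, rate_low: float, rate_high: float):
    """Per-checker time window implied by a recommended rate range
    (the faster rate gives the lower bound on time)."""
    return size / rate_high, size / rate_low

lo, hi = checking_hours(500, 100, 150)   # a 500-NLOC program
print(round(lo, 1), hi)                  # 3.3 to 5.0 hours per checker

lo, hi = checking_hours(10, 3, 5)        # a 10-page text, Strauss/Ebenau rates
print(lo, round(hi, 1))                  # 2.0 to ~3.3 hours per checker
```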

47 Reading and Checking Rates Compared

• Normal reading rate: ca. 50 pages / h ("reading a novel")

• Checking rate for finding "minor defects": ca. 10 pages / h

• Checking for "Major defects" takes approx. 10 times longer than checking for "minor defects"!

(Average estimates of the seminar participants; more exact numbers on the next slides.)

48 Reading Rates

[Histogram: reading rates of seminar participants in p/h. No. of data points: 193; mean: 47.7 p/h. Natural reading rate according to the speed-reading literature: 48 p/h (= 240 wpm).]

Source: Peter Rösler, www.reviewtechnik.de, 2005

49 Checking Rates for Searching for "Minor Defects"

[Histogram: checking rates in p/h. No. of data points: 183; mean: 9.1 p/h.]

Source: Peter Rösler, www.reviewtechnik.de, 2005

50 Estimated Additional Time for Searching for "Major Defects"

[Histogram: estimated factor of additional time. No. of data points: 180; mean: factor 9.0.]

Source: Peter Rösler, www.reviewtechnik.de, 2005

51 Example of an Optimum Inspection Rate

Automated Teller Machine (ATM) requirement:

»A valid user must be able to withdraw up to $200or the maximum amount in the account.«

Ref.: R. Craig, S.P. Jaskiel Systematic Software Testing

These 18 words can be read in about 4-5 sec. (equals ca. 50 p/h). For checking, about 3-4 min. are necessary (equals ca. 1 p/h).

see also notes page
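The rate arithmetic behind this example, using the 300-word net page defined earlier (Python; the exact second and minute values are picked from the ranges above):

```python
WORDS_PER_PAGE = 300  # the "net page" defined earlier in the deck

def rate_in_pages_per_hour(words: int, seconds: float) -> float:
    """Rate implied by handling `words` words in `seconds` seconds."""
    return (words / WORDS_PER_PAGE) / (seconds / 3600.0)

print(rate_in_pages_per_hour(18, 4.5))       # reading: 48.0 p/h (ca. 50)
print(rate_in_pages_per_hour(18, 3.5 * 60))  # checking: ~1.03 p/h (ca. 1)
```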

52 Optical Illusions

The optimum Inspection rate amounts to about 1 page per hour. If you don't believe it, you can measure it at any time! Here too you cannot trust your intuition!

Source: www.reviewtechnik.de

The diagonals are parallel to each other. If you don't believe it, you can measure it at any time. With this picture you cannot trust your intuition.

53 A Document Checked with 2 Methods

A system architecture document with 220 pages was

• checked by a traditional review (all 220 pages)

• and checked by an Inspection (only a 13-page chapter)

Traditional review: no. of checkers planned 23, actual 7; Inspection rate in the phase "individual checking": nothing planned, actual >= 50 p/h; Mj defects found in the 13-page chapter: 1; effectiveness: max. 2% (= 1/16 * 30%)

Inspection: no. of checkers planned 4, actual 4; Inspection rate planned 4.3 p/h, actual 4.1 p/h; Mj defects found in the 13-page chapter: 16; effectiveness: max. 30% (estimated)

Source: Diamantis Gikas, Siemens AG. More details: see the notes page of this slide.

54 A Document Checked with 2 Methods (2)

Traditional review: 11 Mj defects found (in 220 p), rate >= 50 p/h, effectiveness max. 2%, effort ≈ 22 h, efficiency ≈ 0.5 Mj/h

Inspection: 16 Mj defects found (in 13 p), rate 4.1 p/h, effectiveness max. 30%, effort ≈ 35 h, efficiency ≈ 0.46 Mj/h

The effectiveness collapses roughly by the factor by which the checking was too fast.

The efficiency does not depend on the inspection rate (but on the defect density in the document, see the notes page).

Source: Diamantis Gikas, Siemens AG. Annotations: Peter Rösler
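The rule of thumb stated above (effectiveness collapses roughly by the factor by which the optimum rate was exceeded) can be checked against these figures in a small Python sketch:

```python
def degraded_effectiveness(base_eff: float, optimum_rate: float,
                           actual_rate: float) -> float:
    """Apply the rule of thumb: effectiveness drops roughly by the
    factor by which the optimum checking rate is exceeded."""
    factor = max(actual_rate / optimum_rate, 1.0)
    return base_eff / factor

# The review ran at >= 50 p/h instead of the ~4.1 p/h of the Inspection:
print(degraded_effectiveness(0.30, 4.1, 50.0))  # ~0.025, near the observed 2%
```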

55 A Document Checked with 2 Methods (3)

Extrapolation, if all 220 pages had been checked with Inspections:

Traditional review: effort ≈ 22 h, 11 Mj defects found (in 220 p), ≈ 110 h saved* (ROI ≈ 5:1)

Inspections: effort ≈ 592 h, ≈ 271 Mj defects found (in 220 p), ≈ 2660 h saved* (ROI ≈ 4.5:1)

* Estimated time to find & fix a Major later: 12 h

Source: Diamantis Gikas, Siemens AG. More details: see the notes page of this slide.
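The ROI figures above follow from the 12 h estimate; a sketch reproducing them (Python; ROI is computed here as net hours saved per hour of effort, which matches the slide's numbers):

```python
def inspection_roi(majors_found: float, effort_h: float,
                   hours_per_major_later: float = 12.0):
    """Net hours saved and ROI, assuming each Major found now would
    otherwise cost ~12 h to find and fix later."""
    net_saved = majors_found * hours_per_major_later - effort_h
    return net_saved, net_saved / effort_h

print(inspection_roi(11, 22))    # review: (110.0, 5.0), i.e. ROI 5:1
print(inspection_roi(271, 592))  # Inspections: ~2660 h saved, ROI ~4.5:1
```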

56 Inspection Effectiveness

"Rule of thumb": ca. 50% of the defects existing in the document are discovered by the Inspection team if the optimum Inspection rate is met.

Data of Michael Fagan (inventor of formal software Inspections):

• inspections just implemented: 50-60% effectiveness

• refined processes: 75-85%

• refined processes, exceptional cases: >90%

Source: Fagan 2001. More exact references: see the notes page.

57 Inspection Effectiveness (2)

Immature inspection process: default estimated effectiveness (for all types of document): 30%

Mature inspection process, effectiveness by document type:

• source code: 60%

• pseudocode: 80%

• module and interface spec.: 88%

Source: SI p. 23, p. 383; Lindner 1992

Effectiveness by SW-CMM level (Source: Radice 2002):

• Level 1: <50%

• Level 2: 50-65%

• Level 3: 65-75%

• Level 4: 75-90%

• Level 5: 90-100%

More exact and further bibliographical references: see the notes page.

58 Inspection Effectiveness (3)

If the optimum Inspection rate is ignored:

Immature inspection process ("the normal case for inspections not optimized with optimal checking rates etc."): effectiveness about 3%

Source: Tom Gilb / Kai Gilb 2004

Sounds much too pessimistic, but it's true! (At least for text documents.) For a real example with 2% effectiveness, see the slide "A Document Checked with 2 Methods".

More exact references: see the notes page.

59 Success Factors of an Inspection (continued)

• Success factors 11-15 follow

60 Success Factors 11-13

11. The focus is on "Major defects". Checklists are formulated in such a way that "Major defects" are found; the benefit of Inspections is justified by the number of "Major defects" found, etc.

12. "Process brainstorming meeting": directly after the logging meeting, a "process brainstorming meeting" considers how the defects found can be avoided completely in the future.

13. After each Inspection, it is estimated how many hours were saved. (See the next slide, line "Development time probably saved by this Inspection".)

61 Inspection Data Summary

Date _____________ Inspection ID _____________ Inspection Leader ______________

Product Document Reference ___________________ Total Pages ________________

Date Inspection Requested _____________________ Date Entry Criteria passed ________

Planning-time _________ Entry-time _________ Kickoff time __________ (wk-hrs)

Individual Checking Results (to be reported during the entry process of the logging meeting)

[Table: per-inspector checking results with columns: inspector (1-6, totals); hours checking; pages studied; Major issues; Minor issues; improvements; questions noted; checking rate.]

Totals for all inspectors: Checking-time ____________ (wk-hrs) Average checking rate ____________

Logging

No. of people ______ Logging-duration (hours) _______ Logging-time _________ (wk-hrs)

[Fields: Major issues logged; Minor issues logged; improvement suggestions; questions of intent; new items found in the meeting.]

Logging-rate ______ (items/min) Detection-time _____ (wk-hrs) Logging-meeting-rate _____

Editing, Follow-up and Exit

No. major defects _______ No. minor defects _______ No. Change Requests _______

Edit-time ________ Follow-up-time ________ Exit-time ________ Exit date ___________

Control-time ______ Defect-removal-time ______ Estimated remaining defects/page ______

Estimated effectiveness (major defects found / total) _____ Efficiency (major/wk-hrs) _____

Development time probably saved by this Inspection _______ (based on 8 or ____ hrs/major)

62 Success Factors 14-15

14. Re-inspection based on exit criteria (e.g. “max. 0.5 remaining major defects / page”)

15. Sampling

4 slides follow on the topic of "sampling".

63 Sampling Permits Thorough Checking

An ordinary "review" finds some defects (one Major), fixes them, and considers the document now corrected and OK.

(from Dorothy Graham)

An Inspection can find deep-seated defects: all defects of that type can be corrected, but this needs the optimum checking rate.

64 Costs for a Sample

Text document: 1.5 pages. Assumptions: team size 3 (author + 2 checkers), Inspection rate 1.5 p/h (the author checks too), phase "checking" takes 30% of the total effort. Effort: 10 h. Expected effectiveness: ca. 50%.

Program: 125 NLOC. Assumptions: team size 3 (author + 2 checkers), Inspection rate 125 NLOC/h (the author checks too), phase "checking" takes 30% of the total effort. Effort: 10 h. Expected effectiveness: ca. 50%.

More data: see the notes page.
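The 10 h figures above can be reproduced from the stated assumptions (Python sketch; the 30% share of checking in the total effort is taken from the slide):

```python
def sample_cost_hours(size: float, rate: float, team_size: int = 3,
                      checking_share: float = 0.30) -> float:
    """Total Inspection effort for one sample: every team member
    (author included) checks at the optimum rate, and the checking
    phase makes up ~30% of the whole process."""
    checking_hours = (size / rate) * team_size
    return checking_hours / checking_share

print(sample_cost_hours(1.5, 1.5))  # 1.5-page text sample: ~10 h total
print(sample_cost_hours(125, 125))  # 125-NLOC program sample: ~10 h total
```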

65 Costs for 100% Inspection Coverage

Details see notes page

Gilb: ca. 10-15% of the development budget (Radice: ca. 8-20%)

Effort: e.g. 500 h distributed over 25 Inspections

Savings: e.g. 2000 h saved by 250 Mj defects found

ROI = 4:1

66 Nobody Asks You for 10-15% Effort en bloc!

Source: www.reviewtechnik.de

You first invest in only 1 Inspection (effort 10-20 h). The Inspection must detect approx. 10 Mj defects!

[Flow: 1 Inspection, effort e.g. 20 h. Enough Mj defects found? Yes: the next Inspection follows. No: do nothing, the quality is already good enough.]

67 Software Inspection

• Introduction to Reviews and Inspections

• 15 Critical Success Factors

• Benefits of Inspection

• Video Scenes

• Exercise: Conduct an actual Inspection

• Tools

• Work Aids


68 Benefits of Inspection

• Reduced development costs

(25-35%)

• Earlier delivery, control over schedule

(25-35%)

• Reduced maintenance costs (10 to 30 times less)

• Increased reliability (10 to 100 times fewer defects)

Source: Gilb 1988, Principles of Software Engineering Management, Table 12.2; compare also SI p. 24

69 Marie's Personal Learning Curve

[Chart: Marie Lambertsson's learnability curve (Ericsson, Stockholm, 1997): estimated remaining Major defects per page against the order of documents submitted to Inspection, falling steeply (from roughly 25-28 down to about 3-5) over seven documents.]

Source: Tom Gilb, www.gilb.com

70 Gary's Personal Learning Curve

[Chart: Major defects per page found over successive Inspections of "Gary's" designs ("Gary" at McDonnell-Douglas), February to April: 80 Majors found in the first Inspection (~160-240 existing!), then falling (40, 23, ...) toward zero.]

Source: Douglas Aircraft, 1988, private report to Tom Gilb

71 Learning Curve of an Entire Organization

[Chart: Major defects per page found at British Aerospace, Eurofighter Project, Wharton: about 20 before Tom Gilb's Inspection training, 1-1.5 about 18 months later (~1994).]

Another example, at Douglas, 1988: they ended up within the first year with sixty times better quality in terms of rejected and reworked drawings (0.5% versus earlier about 30% reworked).

For another example, at Raytheon, see the notes page of this slide.

Source: Tom Gilb, www.gilb.com

72 Learning Curve and Positive Motivation

• The individual engineer is generally capable of reducing the defect injection density by 50% per cycle of personal learning and feedback. (Tom Gilb's personal experience, 1988 and later.)

• We can expect this reduction only if the authors have positive motivation to produce good-quality documents. Quantified exit criteria demanded by management can be such a motivation.

• There is no indication that the authors need significantly more time to produce these documents! (Tom Gilb at the Review Symposium, 06/2005)

Niels Malotaux: »In my experience, the 'zero defects' attitude results in 50% fewer defects overnight.«

Source: Gilb's Review Symposium, 06/2005
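The 50%-per-cycle claim implies a geometric drop in injected defects; a sketch (Python; the starting density is taken from the British Aerospace example on the previous slide, the extrapolation is my own):

```python
def injected_density(initial_mj_per_page: float, cycles: int,
                     reduction_per_cycle: float = 0.5) -> float:
    """Mj defects injected per page after n learning/feedback cycles,
    assuming the ~50% reduction per cycle quoted above."""
    return initial_mj_per_page * (1.0 - reduction_per_cycle) ** cycles

# Starting at ~20 Mj/page, four feedback cycles reach the 1-1.5 range:
print(injected_density(20, 4))  # 1.25 Mj defects per page
```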

73 Release of Document Only in Case of Good Quality

If management insists on such an exit criterion, the learning curve can be achieved!

See SI p. 202 and the notes part of this slide.

74 Cost of a Major Defect in Test or in the Field

Mean time to find and correct a Major after Inspection was 9.3 hours.

Number of defects of the 1,000 sampled Majors(all types of document)

Estimated hours to find and correctin test or in field. Range 1 to 80 hours

For comparison:It cost about

1 hour to find and fix a Major using

Inspection

Source: MEL inspection data, June 1989 to January 1990; "Software Inspection", p. 315
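A back-of-the-envelope calculation using the two cost figures on this slide (the helper name is made up for illustration, not part of the course material):

```python
# Cost figures taken from the slide above (MEL inspection data).
COST_INSPECTION_H = 1.0   # hours to find and fix a Major via Inspection
COST_TEST_FIELD_H = 9.3   # mean hours if the Major escapes to test/field

def hours_saved(majors_caught_early: int) -> float:
    """Hours saved by finding Majors in inspection instead of test/field."""
    return majors_caught_early * (COST_TEST_FIELD_H - COST_INSPECTION_H)

print(round(hours_saved(100), 1))  # 830.0 hours saved per 100 Majors
```

The ratio of 9.3 to 1 is what makes inspection "pay for itself": each Major caught early saves about a full working day of test or field effort.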


75 Benefits of Inspection (2)

• "Quality is free"

• "Inspection is the most cost-effective manual quality technique"


76 Video Scenes

• Video scenes demonstrating appropriate and inappropriate conduct of Inspections

(For a detailed description of the video scenes see www.sei.cmu.edu/reports/91em005.pdf)


77 Exercise

• Conduct an actual Inspection (a C program is checked against its specification)


78 Further Exercises

• Game Rules as Text: document to be inspected

• Dr. Juran's F-Test

• Training Optimum Inspection Rates

• Reading and Checking Rates

!!! Please Do Not Read Any Further !!!


79 Dr. Juran's Test "The Game Rules"

• No questions! No discussion! Make your best interpretation of these rules.

• You will get 30 seconds to check.

• Count all defects for Rule F: no instances of "F" (any type) allowed on the screen.

• Advice: count even "remote cousins" (examples: "f" and "F").

• Write down your count of "defects" on paper.

• You may move to any position in the room to see better. Do not interfere with the view of others.

Source: Tom Gilb, Team Leader Course


80 Juran's "80%" Test

How many letter F's can you find on this page? Write the number down in this box.

"FEDERAL FUSES ARE THE RESULTS OF YEARS OF SCIENTIFIC STUDY COMBINED WITH THE EXPERIENCE OF YEARS."

Source: Tom Gilb, Team Leader Course


81 Checklist for "F" Searching (all questions support "Rule F")

F1. Do you find the word "of"?
F2. Did you look outside borders?
F3. Do you find large graphic patterns resembling F?
F4. Did you find all "F" shapes within other symbols, for example in letter "E"?
F5. Did you find all numbers and shapes pronounced "F", for example 55 and "frames"?
F6. Did you examine things under a microscope?
F7. Did you check the back of the screen?
F8. Did you look for lettering on the screen casing?
F9. Did you see the upside-down, backwards letter "t" (= "f")?

Source: Tom Gilb, Team Leader Course

Rule F: no instances of "F" (any type) allowed on the screen.


82 Training Optimum Inspection Rates

The slide Juran's "80%" Test is the document under test.

• According to Gilb/Graham, what is the adequate duration for the phase "individual checking"? 10 min. (at least 5 min.)

• Which further "documents" (source documents and others) should be handed out to the checkers? Slide 79 (Rules) and slide 81 (Checklist)

• Of how many "pages" does Juran's "80%" Test consist? 1/6 page

• Conduct the phase "individual checking":
  - carefully enough to find as many defects as possible
  - mark the defects in the document and note the number of defects (per checklist question and in total)
  - take the time and calculate your inspection rate (by entering the data in the trainer's PC)

Source: www.reviewtechnik.de


83 Tools

• Lotus Inspection Data System (LIDS)

• ReviewPro

• QA-C/C++

• PolySpace



84 Work Aids and Forms

• Literature

• Software Inspection Blank Forms (download possible under www.reviewtechnik.de/checklisten.html)

• Software Inspection Filled-in Forms


85 Miscellaneous / Solutions

!!! Please Do Not Read Any Further !!!

• Results Survey Typical Inspection Rates

• Solution True-False Quiz

• Results Dr. Juran's F-Test

• Results Training Optimum Inspection Rates

• Results Inspection of a Text Document

• Sample Solution Inspection of a Text Document

• Sample Solution Inspection of a C program


86 Typical Inspection Rates for Inspecting Text Documents

[Histogram: number of participants versus inspection rate in p/h (0–100). No. of data points: 177; mean: 13.3 p/h; median: 10.0 p/h.]

Source: www.reviewtechnik.de (see also notes page)


87 Juran's "80%" Test Results

[Histogram: number of participants versus number of F's found (1 to 9): 1, 2, 18, 71, 184, 252, 242, 158 and 73 participants respectively.]

Source: www.reviewtechnik.de

The 1001 participants found on average 6.4 F's, that is 71% of all 9 "obvious" F's.

Tom Gilb: The group average for "obvious" F's will be 55%±7% for software people, 65%±8% for engineers. The group average for "really obscure" F's will be less than 10%. (Source: slides from www.gilb.com, Download Center)

Peter Rösler's 68 courses and lectures between Feb. 2002 and Dec. 2005 all lay (with 8 exceptions) in the range 71%±9%.
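The summary statistics can be re-derived from the histogram data above (a sketch; the variable names are made up for illustration):

```python
# Participants by number of F's found (1..9), read off the histogram.
participants = [1, 2, 18, 71, 184, 252, 242, 158, 73]

total = sum(participants)                                   # 1001 people
fs_found = sum(k * n for k, n in zip(range(1, 10), participants))
average = fs_found / total                                  # mean F's found

print(total, round(average, 1), round(100 * average / 9))   # 1001 6.4 71
```

The weighted mean reproduces both figures quoted on the slide: 6.4 F's found on average, i.e. 71% of the 9 "obvious" F's.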


88 Training Optimum Inspection Rates: Solution

• The checkers should above all be handed slide 79 ("The Game Rules") and slide 81 ("Checklist").

• The slide "Juran's 80% Test" consists of 50–55 words, thus of ca. 1/6 page (see slide 8 "Terms and Definitions").

• 10 minutes (see slide 46 "Recommended Inspection Rates"), at least 5 minutes ("avoid rates above 2 p/h").

Results: for 90% of the participants "individual checking" took between 5 and 13 minutes (i.e. an inspection rate of 2.0 to 0.8 p/h). 90% of the participants detected between 32 and 56 defects.

Source: www.reviewtechnik.de
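The bookkeeping behind these numbers can be sketched as a small helper (hypothetical, not from the course material; it assumes the course's page definition of roughly 300 non-commentary words per page, consistent with 50–55 words being ca. 1/6 page):

```python
WORDS_PER_PAGE = 300  # assumed page definition (50-55 words ~ 1/6 page)

def inspection_rate_p_per_h(words: int, minutes: float) -> float:
    """Inspection rate in pages per hour for one checking session."""
    pages = words / WORDS_PER_PAGE
    return pages / (minutes / 60.0)

# Juran's 80% test: ~50 words checked in the recommended 10 minutes:
print(inspection_rate_p_per_h(50, 10))  # 1.0 p/h
```

Checking the 1/6-page document in the minimum 5 minutes gives 2.0 p/h, exactly the "avoid rates above 2 p/h" boundary quoted above.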


89 Training Optimum Inspection Rates: Results

[Scatter plot: defects found (0–100) versus inspection rate in p/h (0.00–5.00), with the recommended-rate markers of Strauss/Ebenau and Gilb/Graham and the 90% interval. No. of data points: 333.]

Source: www.reviewtechnik.de


90 Inspection of a Text Document: Results

[Scatter plot "Complete Practice Inspection of Text Documents": Majors per page found (0–15) versus inspection rate in p/h (0.00–5.00), for three documents: game rules (1.45 p), game rules (2.0 p), system architecture (1.83 p). No. of participants: 44; mean: 1.6 p/h.]

Source: www.reviewtechnik.de
