Testing Metrics
Software Reliability
CSSE 376, Software Quality Assurance
Rose-Hulman Institute of Technology
April 5, 2007


Page 1: Testing Metrics, Software Reliability (title slide)

Page 2: Outline

- Testing Metrics
- An Important Metric: Reliability

Page 3: Common Metrics

Product metrics:
- KLOC (thousands of lines of code; need to remove comment lines?)
- Function points
- Number of bugs

Process metrics:
- Staff hours
- Tests planned
- Tests passed

Page 4: Bug Density

- Measure #Bugs/KLOC
- Expect different densities at different stages of a project
- May categorize bugs by severity
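The bug-density measure above is a simple ratio. A minimal sketch, with illustrative severity categories and counts of my own (not from the slides):

```python
# Hypothetical example: bug density (#Bugs/KLOC), optionally per severity.

def bug_density(bug_count, lines_of_code):
    """Bugs per thousand lines of code (KLOC)."""
    kloc = lines_of_code / 1000
    return bug_count / kloc

# Illustrative numbers: 12 KLOC, bugs categorized by severity.
bugs_by_severity = {"catastrophic": 2, "major": 11, "minor": 30}
loc = 12_000

for severity, count in bugs_by_severity.items():
    print(f"{severity}: {bug_density(count, loc):.2f} bugs/KLOC")
```

Whether comment lines count toward the KLOC denominator is exactly the question the previous slide raises; the sketch above leaves that decision to whoever supplies `lines_of_code`.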

Page 5: Example Bug Density

[Chart: Bugs/KLOC (0 to 30) across the phases Reqts, HLD, LLD, Code, UnitTest, SysTest, broken down by severity: Catastrophic, Major, Minor]

Page 6: Cartoon of the Day (1/3)

Page 7: Using Bug Metrics

- Count bugs discovered during each phase of a project
- Compare to previous projects
  - Provides estimates of expected values at each phase; could be used to set milestones
  - Deviation of more than 20% from the expected value indicates a need for investigation
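The milestone check above is easy to automate. A sketch, with hypothetical phase names and counts (the 20% threshold is from the slide; everything else is illustrative):

```python
# Flag phases whose observed bug count deviates > 20% from the
# expectation derived from previous projects. All counts are made up.

EXPECTED = {"Reqts": 10, "HLD": 15, "LLD": 20, "Code": 25, "UnitTest": 18, "SysTest": 8}
OBSERVED = {"Reqts": 11, "HLD": 9, "LLD": 21, "Code": 33, "UnitTest": 17, "SysTest": 8}

def needs_investigation(expected, observed, threshold=0.20):
    """Return the phases whose relative deviation exceeds `threshold`."""
    flagged = []
    for phase, exp in expected.items():
        deviation = abs(observed[phase] - exp) / exp
        if deviation > threshold:
            flagged.append(phase)
    return flagged

print(needs_investigation(EXPECTED, OBSERVED))  # ['HLD', 'Code']
```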

Page 8: Analysis of Bug Data

- Root cause analysis
  - Search for explanations
  - Might look at other process data (effort, experience of team, etc.)
- Trend analysis
  - Make predictions from current data

Page 9: Reliability

Page 10: Failures vs. Faults

- Fault: developer-oriented (e.g., 6 faults/1000 source lines)
- Failure: customer-oriented (e.g., 3 failures/1000 CPU hours)

Page 11: Calculating Reliability

- Reliability: probability of failure-free operation for a specified time interval (e.g., 0.82 for 8 CPU hours)
- Failure intensity: number of observed failures within a specified time interval (e.g., 3 failures/1000 CPU hours)
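The slide does not say how the two quantities relate, but under the common simplifying assumption of a constant failure intensity lambda, reliability over an interval tau is R(tau) = exp(-lambda * tau). A sketch under that assumption:

```python
import math

def reliability(failure_intensity, interval):
    """R(tau) = exp(-lambda * tau), assuming constant failure intensity.

    failure_intensity: failures per unit time (e.g., per CPU-hour)
    interval: length of the failure-free period, in the same time unit
    """
    return math.exp(-failure_intensity * interval)

# The slide's failure-intensity example: 3 failures / 1000 CPU-hours,
# evaluated over an 8 CPU-hour interval.
lam = 3 / 1000
print(f"{reliability(lam, 8):.3f}")
```

Note that the slide's two examples are independent: a reliability of 0.82 over 8 CPU hours would correspond to a much higher failure intensity than 3/1000 under this model.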

Page 12: Factors Influencing Reliability

- Fault removal: by error correction (debugging)
- Fault introduction: by error correction (unintended) or by new feature development
- Operational profile

Page 13: Operational Profile

[Chart: probability of use (0 to 0.05) for each of 20 functions]

Page 14: Example

Function   Usage Probability   Distribution Interval
Change     32%                 0-31
Delete     14%                 32-45
Insert     46%                 46-91
Print       8%                 92-99

Page 15: Test Generation

Test   Random Numbers           Test Cases
1      29, 11, 47, 52, 26, 94   C, C, I, I, C, P
2      62, 98, 39, 78, 82, 65   I, P, D, I, I, I
3      83, 32, 58, 41, 36, 17   I, D, I, D, D, C
4      36, 49, 96, 82, 20, 77   D, I, P, I, C, I
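The generation scheme above can be sketched directly: draw random numbers in 0-99 and map each onto a function via the distribution intervals from the Example slide. Function and variable names here are my own:

```python
import random

# Distribution intervals from the Example slide (upper bound inclusive).
INTERVALS = [
    (31, "C"),  # Change: 0-31  (32%)
    (45, "D"),  # Delete: 32-45 (14%)
    (91, "I"),  # Insert: 46-91 (46%)
    (99, "P"),  # Print:  92-99 (8%)
]

def to_function(n):
    """Map a random number 0-99 to a function code."""
    for upper, code in INTERVALS:
        if n <= upper:
            return code
    raise ValueError(f"out of range: {n}")

def generate_test(length=6, rng=random):
    """Generate one test case of `length` operations."""
    return [to_function(rng.randrange(100)) for _ in range(length)]

# Reproducing Test 1 from the table:
print([to_function(n) for n in [29, 11, 47, 52, 26, 94]])  # ['C', 'C', 'I', 'I', 'C', 'P']
```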

Page 16: Test Compression

- Real use of a product involves repetitive operations
  - Different users invoke the same operations
  - The same user invokes the same operations on different days
- Redundant tests waste computer and personnel time
- Compression: when generating random tests, do not duplicate previous tests
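One simple way to realize the compression idea above is to remember every sequence already generated and skip repeats. A sketch with hypothetical names:

```python
import random

def generate_unique_tests(count, length=6, seed=0):
    """Generate `count` distinct random test cases (sequences of 0-99)."""
    rng = random.Random(seed)
    seen = set()
    tests = []
    while len(tests) < count:
        case = tuple(rng.randrange(100) for _ in range(length))
        if case in seen:
            continue  # duplicate would waste computer and personnel time
        seen.add(case)
        tests.append(case)
    return tests

tests = generate_unique_tests(4)
print(tests)
```

With short sequences over a large alphabet, duplicates are rare; compression matters more when the operational profile concentrates probability on a few operations.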

Page 17: Cartoon of the Day (2/3)

Page 18: Cartoon of the Day (3/3)

Page 19: Curve Fitting

Reliability models focus on failure removal

Use a random process to model the failure removal process

Page 20: Execution Time Model

[Chart: failure intensity decreasing with execution time toward a goal line]
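The slide names no specific model, but Musa's basic execution-time model is a standard example of the decaying curve sketched here: failure intensity falls exponentially with execution time, lambda(tau) = lambda0 * exp(-(lambda0/nu0) * tau), where lambda0 is the initial failure intensity and nu0 the total expected number of failures. The parameter values below are illustrative, not from the slides:

```python
import math

def failure_intensity(tau, lambda0, nu0):
    """Musa basic model: lambda(tau) = lambda0 * exp(-(lambda0/nu0) * tau)."""
    return lambda0 * math.exp(-(lambda0 / nu0) * tau)

# Illustrative: 10 failures/CPU-hr initially, 100 total expected failures.
lambda0, nu0 = 10.0, 100.0
for tau in (0, 10, 20, 40):
    print(tau, round(failure_intensity(tau, lambda0, nu0), 2))
```

Fitting lambda0 and nu0 to observed failure times is the curve-fitting step from the previous slide; the fitted curve then tells you how much more execution time is needed to reach the failure-intensity goal.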

Page 21: Resource Constraints

- Early phase of a project: constrained by availability of developers (debuggers)
- Middle phase: constrained by availability of testers
- Late phase: constrained by availability of machines
  - May run tests in parallel to increase the number of tests per CPU hour

Page 22: Adjusting for Calendar Time

- Estimate resource usage during each phase of the project
- The model calculates failure intensity in terms of execution time
- The model adjusts the fault removal rate according to resource constraints

Page 23: Calendar Time Component

[Chart: failure intensity vs. execution time approaching the goal, with successive segments constrained by debuggers, then testers, then machines]

Page 24: Calculating the Calendar-Time/Execution-Time Ratio

- 10 staff-hours to fix each failure
- 2 failures/CPU-hr
- That means it will take 10 * 2 = 20 staff-hrs per CPU-hr
- Suppose you have 5 developers
- Then you have 20 / 5 = 4 hrs per CPU-hr, so each CPU-hr will take 4 calendar hours
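The arithmetic above, as a function (the function and parameter names are my own):

```python
def calendar_hours_per_cpu_hour(fix_hours_per_failure, failures_per_cpu_hour, developers):
    """Calendar hours consumed per CPU-hour of testing, when debugging
    staff is the binding constraint."""
    staff_hours_per_cpu_hour = fix_hours_per_failure * failures_per_cpu_hour
    return staff_hours_per_cpu_hour / developers

# The slide's numbers: 10 staff-hrs/failure, 2 failures/CPU-hr, 5 developers.
print(calendar_hours_per_cpu_hour(10, 2, 5))  # 4.0
```

The same shape of calculation applies in the tester- and machine-constrained phases, with the relevant resource in the denominator.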

Page 25: Estimating Completion

1. Establish a failure intensity objective

2. Record execution times of failures

3. Run model to estimate reliability

4. Model reports estimated completion date

Values are not absolute; they fall within confidence bounds.

Page 26: Estimating Completion

[Chart: failure intensity vs. calendar time crossing the goal line at the projected ship date]

Page 27: Acceptance Charts

[Chart: bugs vs. time, with boundary lines dividing the plane into Reject, Continue Testing, and Accept regions]