SE300 SWE Practices
Lecture 12: Design and Prototype Testing Strategies (Design and Code Metrics)
April 2, 2015, Sam Siewert
mercury.pr.erau.edu/~siewerts/se300/documents/Lectures/Lecture-Week-12-B.pdf
31 slides

TRANSCRIPT

Page 1:

April 2, 2015 Sam Siewert

SE300 SWE Practices

Lecture 12 – Design and Prototype Testing Strategies
(Design and Code Metrics)

Page 2:

When are We Done Testing?

Based on a Metric
– Code Coverage – Path, Statement
– Hours of Continuous Testing – Soak Time
– Number of Test Cases
– Test Vectors (Variation of Inputs to Functions and Units)
– Other?

Path Coverage Tools and Metrics
– E.g., Enigma code: function and path coverage using gcov

See SE420, SQA (Software Quality Assurance)

Sam Siewert 2
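To make the statement-vs-path distinction concrete, here is a minimal sketch (a hypothetical toy function, not from the lecture) showing that test vectors achieving full statement coverage can still miss paths:

```python
def classify(a: bool, b: bool) -> str:
    # Two independent if-statements: every statement can be covered by
    # two vectors, but there are 2 x 2 = 4 distinct execution paths.
    out = ""
    if a:
        out += "A"
    if b:
        out += "B"
    return out

# Full statement coverage with only two vectors...
statement_vectors = [(True, True), (False, False)]
# ...but full path coverage needs all four input combinations.
path_vectors = [(True, True), (True, False), (False, True), (False, False)]

print([classify(a, b) for a, b in path_vectors])  # -> ['AB', 'A', 'B', '']
```

Tools like gcov report which lines and branches a test run actually exercised, which is how a team can tie "done testing" to a measured coverage number.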

Page 3:

Copyright (c) 2014 by The McGraw-Hill Companies, Inc. All rights reserved.

Test Driven Development

[Flowchart: the TDD cycle]
1. Prepare for TDD.
2. Write/edit tests for the features to be implemented next.
3. Run the new tests. If not all of the new tests fail, go back to step 2.
4. Write/edit production code.
5. Run all tests. If not all tests pass, go back to step 4.
6. If test coverage is not yet accomplished, add tests to increase coverage and rerun all tests.
7. Refactor the code.
8. If there are more features to implement, return to step 2; otherwise halt.
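The cycle above can be sketched in Python with the standard unittest module (the slugify feature is a hypothetical example, not from the slides):

```python
import unittest

# Red: write the test first; it fails until the feature exists and behaves.
class TestSlugify(unittest.TestCase):
    def test_lowercases_and_hyphenates(self):
        self.assertEqual(slugify("Hello World"), "hello-world")

# Green: write just enough production code to make the new test pass.
def slugify(title: str) -> str:
    return title.strip().lower().replace(" ", "-")

# Run all tests; refactor only while the suite stays green.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestSlugify)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # -> True
```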

Page 4:

Merits of Test Driven Development

• TDD helps the team understand and improve requirements.
• TDD produces high-quality code.
• TDD improves program structure and readability through refactoring.
• TDD makes it easier to debug.

Page 5:

What is Software Quality Assurance?

• SQA is a set of activities to ensure that the software product and/or process conforms to established requirements and standards.

• The activities include
– Verification: Are we building the product right?
– Validation: Are we building the right product?
– Technical reviews
– Testing: attempts to uncover program errors.

Page 6:

Cost of Quality

• Cost of quality consists of
– Costs of preventing software failures
   • Costs to implement an SQA framework (one time)
   • Costs to perform SQA activities (ongoing)
   • Costs to monitor and improve the framework (ongoing)
   • Costs of needed equipment (machines and tools)
– Costs of failure
   • Costs to analyze and remove failures
   • Costs of lost productivity (developer and customer)
   • Costs of liability, lawsuits, and increased insurance premiums
   • Costs associated with damage to the company's reputation

Page 7:

Cost of Quality

Software engineering considers the cost to accomplish something; quality and cost trade off.

[Chart: cost (y-axis) vs. quality (x-axis), both increasing to the right/up. As quality rises, failure cost falls while prevention cost rises.]

Page 8:

Costs of Quality

[Chart: relative error-correction cost by phase, log scale from 1 to 100+: requirements (~1), rising through design, coding, development test, and system test, to field operation (~100).]

Page 9:

Software Quality Attributes

• Reliability (adequacy: correctness, completeness, consistency; robustness)

• Testability and Maintainability (understandability: modularity, conciseness, preciseness, unambiguity, readability; measurability: assessability, quantifiability)

• Usability

• Efficiency

• Portability

• etc.

Page 10:

Quality Measurements and Metrics

• Software measurements are objective, quantitative assessments of software attributes.
• Metrics are standard measurements.
• Software metrics are standard measurements of software attributes.
• Software quality metrics are standard measurements of software quality.
• Class discussion: why do we need software metrics?

Page 11:

Software Quality Metrics

• Requirements metrics

• Design metrics

• Implementation metrics

• System metrics

• Object-oriented software quality metrics

Page 12:

Requirements Metrics

Requirements Unambiguity: Q1 = (# of uniquely interpreted requirements) / (# of requirements)

Requirements Completeness: Q2 = (# of unique functions) / (# of combinations of states and stimuli)

Page 13:

Requirements Metrics

Requirements Correctness: Q3 = (# of validated correct requirements) / (# of requirements)

Requirements Consistency: Q4 = (# of non-conflicting requirements) / (# of requirements)
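The four ratios are straightforward to compute once the counts are tallied; a sketch with made-up review numbers (all tallies hypothetical):

```python
# Hypothetical tallies from a requirements review.
n_requirements = 40
n_uniquely_interpreted = 36   # read the same way by every reviewer
n_unique_functions = 50
n_state_stimulus_combos = 64  # combinations of states and stimuli
n_validated_correct = 38
n_non_conflicting = 39

Q1 = n_uniquely_interpreted / n_requirements        # unambiguity
Q2 = n_unique_functions / n_state_stimulus_combos   # completeness
Q3 = n_validated_correct / n_requirements           # correctness
Q4 = n_non_conflicting / n_requirements             # consistency

print(Q1, Q3)  # -> 0.9 0.95
```

Each metric approaches 1.0 as the requirements document improves, which makes them usable as trend indicators across review cycles.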

Page 14:

Design Metric Fan-In

• Number of incoming messages or control flows into a module.

• Measures the dependencies on the module.

• High fan-in signifies a “god module” or “god class.”

• The module may have been assigned too many responsibilities.

• It may indicate low cohesion.

Page 15:

Design Quality Metrics: Fan-In

[Diagram: modules M1–M13 all send messages to DBMgr, an example of high fan-in on DBMgr.]

Page 16:

Design Quality Metrics: Fan-Out

• Number of outgoing messages of a module.
• Measures the dependencies of this module on other modules.
• High fan-out means it will be difficult to reuse the module, because the other modules must also be reused.
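Both metrics fall out of a call graph by counting edges; a minimal sketch (module names hypothetical, echoing the DBMgr and GUI examples on the adjacent slides):

```python
from collections import defaultdict

# Hypothetical (caller, callee) message edges.
calls = [("GUI", "M1"), ("GUI", "M2"), ("GUI", "M3"),
         ("M1", "DBMgr"), ("M2", "DBMgr"), ("M3", "DBMgr")]

fan_out = defaultdict(int)  # outgoing messages: this module's dependencies
fan_in = defaultdict(int)   # incoming messages: dependencies on this module

for caller, callee in calls:
    fan_out[caller] += 1
    fan_in[callee] += 1

print(fan_in["DBMgr"], fan_out["GUI"])  # -> 3 3
```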

Page 17:

Quality Metrics: Fan-Out

[Diagram: the GUI module sends messages to modules M1–M13, an example of high fan-out from GUI.]

Page 18:

Modularity

• Modularity is measured by cohesion and coupling metrics.
• Modularity = (a × Cohesion + b × Coupling) / (a + b), where a and b are weights on cohesion and coupling.

Page 19:

Cohesion

• Each module implements one and only one functionality.
• Ranges from the lowest, coincidental cohesion, to the highest, functional cohesion.
• High cohesion enhances understanding, reuse, and maintainability.

Page 20:

Coupling

• An interface mechanism used to relate modules.
• Measures the degree of dependency between modules.
• Ranges from low (data coupling) to high (content coupling).
• High coupling increases uncertainty about run-time effects and makes the modules difficult to test, reuse, maintain, and change.

Page 21:

Module Design Complexity (mdc)

• The number of integration tests required to integrate a module with its subordinate modules.
• It is the number of decisions to call a subordinate module (d) plus one: mdc = d + 1.

[Structure chart: M0 calls M1, M2, and M3; below them are M5, M4, M6, and M7. The annotation on the slide gives mdc = 4 for the example.]

Page 22:

Design Complexity S0

• Number of subtrees in the structure chart with module M as the root.
• S0(leaf) = 1
• S0(M) = mdc(M) + S0(child 1 of M) + ... + S0(child n of M)

[Structure chart: root M0 (mdc = 2, S0 = 11) with children M1 (mdc = 1, S0 = 3), M2 (mdc = 2, S0 = 3), and M3 (mdc = 2, S0 = 3); the leaves M4, M5, M6, and M7 each have S0 = 1.]

Page 23:

Integration Complexity

• Minimal number of integration test cases required to integrate the modules of a structure chart.
• Integration complexity S1 = S0 – n + 1, where n is the number of modules in the structure chart.
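S0 and S1 can be reproduced programmatically for the structure chart example on the Design Complexity slide. The mdc values below are the ones annotated there; the exact parent/child assignment of the leaves is inferred from the diagram layout:

```python
# Structure chart: M0 is the root; mdc is annotated for internal modules.
children = {"M0": ["M1", "M2", "M3"],
            "M1": ["M5", "M4"], "M2": ["M6"], "M3": ["M7"],
            "M4": [], "M5": [], "M6": [], "M7": []}
mdc = {"M0": 2, "M1": 1, "M2": 2, "M3": 2}

def S0(m: str) -> int:
    if not children[m]:
        return 1                          # S0(leaf) = 1
    return mdc[m] + sum(S0(c) for c in children[m])

n = len(children)                         # number of modules in the chart
S1 = S0("M0") - n + 1                     # integration complexity

print(S0("M0"), S1)  # -> 11 4
```

Note that S0(M0) depends only on the internal mdc values plus the leaf count (2 + 1 + 2 + 2 + 4 = 11), so the result holds regardless of which leaf hangs under which parent.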

Page 24:

Implementation Metrics

• Defects/KLOC
• Pages of documentation/KLOC
• Number of comment lines/LOC
• Cyclomatic complexity
– equals the number of binary predicates + 1
– measures the difficulty of comprehending a function/module
– measures the number of test cases needed to cover all independent paths (basis paths)
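As a sketch of the predicate-counting rule, Python's ast module can count binary predicates in a hypothetical function: each if/while/for is one predicate, and each extra operand of a boolean and/or adds one more:

```python
import ast

src = """
def grade(score):
    if score < 0 or score > 100:
        raise ValueError(score)
    if score >= 90:
        return "A"
    for _ in range(3):
        pass
    return "B"
"""

decisions = 0
for node in ast.walk(ast.parse(src)):
    if isinstance(node, (ast.If, ast.While, ast.For)):
        decisions += 1                      # one predicate per decision statement
    elif isinstance(node, ast.BoolOp):
        decisions += len(node.values) - 1   # compound 'or'/'and' conditions

cyclomatic = decisions + 1
print(cyclomatic)  # -> 5
```

Here two if-statements, one for-loop, and the extra operand of the `or` give 4 decisions, so the cyclomatic complexity is 5: five basis paths to cover in testing.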

Page 25:

What is the cyclomatic complexity of this module?

[The module's flowgraph is not reproduced in the transcript.]

Page 26:

Depth of Inheritance Tree (DIT)

• Distance from a derived class to the root class in the inheritance hierarchy.
• Measures:
– the degree of reuse through inheritance
– the difficulty of predicting the behavior of a class
– costs associated with regression testing, due to the change impact of a parent class on descendant classes

Page 27:

High DIT Means Hard to Predict Behavior

• All three classes include print().
• It is difficult to determine which print() is used:

public static void main (...) {
    Shape p; ….
    p.print(); // which print()?
}

[Class diagram: Shape, Box, and Square, each defining its own print().]
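The same run-time dispatch ambiguity can be sketched in Python (mirroring the slide's Shape hierarchy; the chain structure Shape → Box → Square is assumed from the diagram):

```python
class Shape:
    def print(self):
        return "Shape.print"

class Box(Shape):
    def print(self):
        return "Box.print"

class Square(Box):
    def print(self):
        return "Square.print"

def show(p: Shape) -> str:
    # Which print()? The object's run-time class decides, not the
    # declared/annotated type of p.
    return p.print()

print([show(s) for s in (Shape(), Box(), Square())])
```

The deeper the inheritance tree, the more override sites a reader must check to know which method actually runs, which is why high DIT makes behavior hard to predict.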

Page 28:

[Class diagram: inheritance chain ClassA, ClassB, ClassC, ClassD, ClassE; m() is overridden at several levels.]

A change to a parent class requires retesting of subclasses.

Page 29:

Negative Tests - HA Design Criteria

Five 9's: out of service no more than 5 minutes per year [up 99.999% of the time].

Availability = MTTF / [MTTF + MTTR]

MTTF ≈ MTBF

E.g., 0.99999 ≈ (365.25 × 60 × 24 minutes) / [(365.25 × 60 × 24) + 5 minutes]

MTBF = Mean Time Between Failures, MTTR = Mean Time to Recovery

See "Big iron lessons, Part 2: Reliability and availability: What's the difference? Apply RAS architecture lessons to the autonomic Self-CHOP roadmap."
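The five-nines arithmetic on this slide checks out numerically:

```python
minutes_per_year = 365.25 * 60 * 24   # 525,960 minutes, per the slide
mttr = 5.0                            # at most 5 minutes of downtime per year
mttf = minutes_per_year               # slide approximates MTTF by MTBF

availability = mttf / (mttf + mttr)
print(round(availability, 5))  # -> 0.99999
```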

Page 30:

Availability

• Time the system is available for use vs. time the system is unavailable for use.
• Availability is the ratio of time available over total time, both available and not available: the % of time the system is available for use each year (or another arbitrary period of time).

Some variation in definitions of MTTF, MTBF, MTTR:
– MTBF = MTTF + MTTR [in the book]
– MTTF is simply the time between failures (observed in accelerated lifetime / stress tests)
– MTBF is often stated; does it really include MTTR or not?
– Typically recovery time is small, so negligible; but if it is long, the difference is more important.
– Just be clear on the definitions [they can vary].

[Timeline: alternating available and recovery intervals. MTBF spans from Failure #1 to Failure #2; MTTF is the available interval and MTTR the recovery interval.]

Page 31:

Usefulness of Quality Metrics

1. Definition and use of indicators.
2. Directing valuable resources to critical areas.
3. Quantitative comparison of similar projects and systems.
4. Quantitative assessment of improvement, including process improvement.
5. Quantitative assessment of technology.