Systematic Test Design … All on One Page

Page 1:

W12 Concurrent Session Wednesday 05/07/2008 3:00 PM

Systematic Test Design … All on One Page

Presented by:

Peter Zimmerer Siemens AG

Presented at: STAREAST Software Testing Analysis & Review

May 5-9, 2008; Orlando, FL, USA

330 Corporate Way, Suite 300, Orange Park, FL 32043 888-268-8770 904-278-0524 [email protected] www.sqe.com

Page 2:

Peter Zimmerer

Peter Zimmerer is a Principal Engineer at Siemens AG, Corporate Technology, in Munich, Germany. He studied Computer Science at the University of Stuttgart, Germany, and received his M.Sc. degree (Diplominformatiker) in 1991. He is an ISTQB™ Certified Tester, Full Advanced Level. For more than 15 years he has been working in the field of software testing and quality engineering for object-oriented (C++, Java), distributed, component-based, and embedded software. He was also involved in the design and development of different Siemens in-house testing tools for component and integration testing. At Siemens he performs consulting on testing strategies, testing methods, testing processes, test automation, and testing tools in real-world projects and is responsible for the research activities in this area. He is co-author of several journal and conference contributions and a speaker at international conferences, e.g. the Conference on Quality Engineering in Software Technology (CONQUEST), SIGS-DATACOM OOP, Conference on Software Engineering (SE), GI-TAV, STEV Austria, SQS Software & Systems Quality Conferences (ICS Test), SQS Conference on Software QA and Testing on Embedded Systems (QA&Test), Dr. Dobb’s Software Development Best Practices, Dr. Dobb’s Software Development West, Conference on Testing Computer Software, Quality Week, Conference of the Association for Software Testing (CAST), Pacific Northwest Software Quality Conference (PNSQC), PSQT/PSTT, QAI’s Software Testing Conference, EuroSTAR, and STARWEST. He can be contacted at [email protected]. Internet: http://www.siemens.com/research-and-development/ http://www.siemens.com/corporate-technology/


Page 3:

Copyright © Siemens AG 2008. All rights reserved.

Corporate Technology

Systematic Test Design … All on One Page

STAREAST 2008, Orlando, FL, USA

Peter Zimmerer, Principal Engineer

Siemens AG, CT SE 1, Corporate Technology
Corporate Research and Technologies, Software & Engineering, Development Techniques
D-81739 Munich, Germany
[email protected]

http://www.siemens.com/research-and-development/
http://www.siemens.com/corporate-technology/

Page 4:

Contents

Introduction

Test design methods
Here: methods, paradigms, techniques, styles, and ideas to create, derive, select, generate a test case
Examples and references

Problem statement

Poster Test Design Methods on One Page

Guidelines and experiences

Summary

Page 5:

What is a Test Case?

A set of input values, execution preconditions, expected results and execution postconditions, developed for a particular objective or test condition, such as to exercise a particular program path or to verify compliance with a specific requirement. (ISTQB 2007, IEEE 610)

A test case should include:
unique identification – who am I?
test goal, test purpose – why?
test conditions – what?
preconditions – system state, environmental conditions
test data – inputs, data, actions
execution conditions – constraints, dependencies
expected results – oracles, arbiters, verdicts, traces
postconditions – system state, traces, environmental conditions, expected side effects, expected invariants
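To make this checklist concrete, such a test case can be captured as a plain record. Below is a minimal sketch in Python; the field names and the login example are illustrative assumptions, not part of the ISTQB definition.

    from dataclasses import dataclass, field

    @dataclass
    class TestCase:
        test_id: str                # unique identification -- who am I?
        purpose: str                # test goal, test purpose -- why?
        condition: str              # test conditions -- what?
        preconditions: list = field(default_factory=list)         # system state, environment
        inputs: dict = field(default_factory=dict)                # test data: inputs, actions
        execution_conditions: list = field(default_factory=list)  # constraints, dependencies
        expected_results: list = field(default_factory=list)      # oracles, verdicts
        postconditions: list = field(default_factory=list)        # state, side effects, invariants

    # Hypothetical example: a negative login test.
    tc = TestCase(
        test_id="TC-042",
        purpose="Verify that login is rejected for an expired password",
        condition="Authentication with invalid credentials",
        preconditions=["user account exists", "password has expired"],
        inputs={"user": "alice", "password": "old-secret"},
        expected_results=["login refused", "error message displayed"],
        postconditions=["no session created"],
    )
    print(tc.test_id, "-", tc.purpose)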

Page 6:

Introduction – Test design methods

Good test design, i.e. high-quality test cases, is very important.

There are many different test design methods and techniques:
Static, dynamic
Black-box, white-box, grey-box
Based on fault model, experience, exploratory
Statistical (user profiles), random (monkey)

The tester's challenge is to adequately combine these methods dependent on the given problem, domain, and requirements.

This is art as well!

Black-box test design methods are often based on models – model-based testing.

Page 7:

Some systematic methods for test design

Black-box (models, interfaces, data):
Requirements-based (traceability matrix)
Use case-based testing, scenario testing
Design by contract
Equivalence class partitioning
Classification-tree method
Boundary value analysis
State-based testing
Cause-effect graphing
Decision tables, decision trees
Combinatorial testing (n-wise)

White-box (internal structure, paths):
Control flow testing
Data flow testing

Selection, usage, and applicability depend on the:
specific domain (domain knowledge is required!)
used software technology
test requirements: required test intensity, quality criteria, risks
existing test basis: specifications, documents, models
project factors: constraints and opportunities

Page 8:

Example – There are always too many test cases ...

Page 9:

Examples – Demo

Microsoft PowerPoint

Microsoft Word 2002

Page 10:

Test effectiveness and formal (systematic) test design

There are studies showing advantages of systematic test design. There are also studies showing advantages of random testing.

But do you really want to design your test cases only randomly?

Formal test design was almost twice as effective in defect detection per test case as compared to expert (exploratory) type testing, and much more effective compared to checklist type testing. (Bob Bartlett, SQS UK, 2006)

Page 11:

Some references

Many testing books cover test design to some extent

Boris Beizer: Software Testing Techniques
Lee Copeland: A Practitioner's Guide to Software Test Design
Rick D. Craig, Stefan P. Jaskiel: Systematic Software Testing
Tim Koomen et al.: TMap Next: For Result-driven Testing
Glenford J. Myers: The Art of Software Testing
Torbjörn Ryber: Essential Software Test Design
James Whittaker: How to Break Software
James Whittaker, Herbert Thompson: How to Break Software Security
Standard for Software Component Testing by the British Computer Society Specialist Interest Group in Software Testing (BCS SIGIST) (see http://www.testingstandards.co.uk/)

There are many different training offerings by different providers

Page 12:

Problem statement

Starting from a risk-based testing strategy, an adequate test design is the key to effective and efficient testing. Automation of bad test cases is a waste of time and money!

Many different test design methods have been around for a long time (perhaps too many?), and a lot of books explain them in detail. There are different

categorizations, classifications, and dimensions
namings, interpretations, and understandings

of test design methods, which does not simplify their usage …

Looking into practice, we often see quite limited usage of these test design methods.

What are the reasons behind that? How can we overcome this and improve our testing approaches?

Page 13:

Possible reasons and consequences (1)

Possible reasons:
Engineers do not have / spend time to read a book on testing
Missing the big picture
Information is not available at a glance
Which test design methods are there at all?
Which test design method should I use in which context?

Consequences:
A specific test design method is not "available" when needed
A specific test design method is too detailed or too complicated to be used in practice
Often the focus is on depth and on perfectionism

Remark: But that can be especially required, e.g. in a safety-critical environment!

Page 14:

Possible reasons and consequences (2)

Test design tools are typically focused and implement only a few test design methods, e.g.:

ATD – Automated Test Designer (AtYourSide Consulting, http://www.atyoursideconsulting.com/): Cause-effect graphing
BenderRBT (Richard Bender, http://www.benderrbt.com/): Cause-effect graphing, quick design (orthogonal pairs)
CaseMaker (Diaz & Hilterscheid, http://www.casemaker.de/): Business rules, equivalence classes, boundaries, error guessing, pairwise combinations, and element dependencies
CTE (Razorcat, http://www.ats-software.de/): Classification tree editor
Reactis (Reactive Systems, http://www.reactive-systems.com/): Generation of test data from, and validation of, Simulink and Stateflow models
TestBench (Imbus, http://www.testbench.info/, http://www.imbus.de/): Equivalence classes, work-flow / use-case-based testing

Page 15:

Poster Test Design Methods on One Page (1)

Idea: a systematic, structured, and categorized overview of different test design methods on one page.

Focus more on using an adequate set of test design methods than on using only one single test design method in depth / perfection.

Focus more on concrete usage of test design methods than on defining a few perfect test design methods in detail which are then not used in the project.

Focus more on breadth than on depth. Do not miss breadth because of too much depth.

Do not miss the exploratory, investigative art of testing.

Page 16:

Poster Test Design Methods on One Page (2)

Effort / Difficulty / Resulting Test Intensity (5 Levels)

Key

Methods, Paradigms, Techniques, Styles, and Ideas to Create a Test Case

Categorization

[Poster: "Test Design Methods on One Page". Each method carries an effort / difficulty / resulting test intensity level from 1 to 5; levels are shown below only where they survived extraction.]

Black-box (models, interfaces, data):
Requirements-based with traceability matrix (requirements x test cases)
Use case-based testing (sequence diagrams, activity diagrams)
Flow testing, scenario testing, soap opera testing
Design by contract (built-in self test)
Equivalence class partitioning
Domain partitioning, category-partition method
Boundary value analysis
Special values
Classification-tree method
Test catalog / matrix for input values, input fields
State-based testing (finite state machines)
Cause-effect graphing
Decision tables, decision trees
Combinatorial testing (pair-wise, orthogonal / covering arrays, n-wise)
Syntax testing (grammar-based testing)
CRUD (Create, Read, Update, Delete) (data cycles, database operations)
Time cycles (frequency, recurring events, test dates)
Standards (e.g. ISO/IEC 9126, IEC 61508), norms, (formal) specifications, claims
Features, functions, interfaces
Normal, expected behavior – positive, valid cases (1)
Invalid, unexpected behavior, error handling, exceptions – negative, invalid cases (3)

Statistical (user profiles), random:
User / Operational profiles: frequency and priority / criticality (Software Reliability Engineering)
Statistical testing (Markov chains)
Random (monkey testing)
Evolutionary testing

Grey-box:
Protocol based (sequence diagrams, message sequence charts)
Communication behavior (dependency analysis)
Dependencies / Relations between classes, objects, methods, functions
Dependencies / Relations between components, services, applications, systems
Trace-based testing (passive testing)

White-box (internal structure, paths):
Control flow-based (specification-based, model-based, code-based) coverage:
Statements (C0), nodes (2)
Branches (C1), transitions, links, paths (3)
Conditions, decisions (C2, C3) (4)
Elementary comparison (MC/DC) (5)
Interfaces (S1, S2) (4)
Static metrics:
Cyclomatic complexity (McCabe) (4)
Metrics (e.g. Halstead) (4)
Data flow-based:
Read / Write access (3)
Def / Use criteria (5)

Fault-based, experience-based, exploratory:
Fault model dependent on used technology and nature of system under test
Error guessing
Error catalogs, bug taxonomies (e.g. by Boris Beizer, Cem Kaner)
Bug reports
Bug patterns: standard, well-known bug patterns or produced by a root cause analysis
Systematic failure analysis (Failure Mode and Effect Analysis, Fault Tree Analysis)
Risk-based
Attack patterns (e.g. by James A. Whittaker)
Fault injection
Mutation testing
Ad hoc, intuitive, based on experience, check lists
Exploratory testing, heuristics, mnemonics (e.g. by James Bach, Michael Bolton)
Test patterns (e.g. by Robert Binder), Questioning patterns (Q-patterns by Vipul Kocher)

Regression (selective retesting):
Retest all
Retest by risk, priority, severity, criticality
Retest changed parts
Retest parts that are influenced by the changes (impact analysis, dependency analysis)
Retest by profile, frequency of usage, parts which are often used

Page 17:

Poster Test Design Methods on One Page (3)

Categories of test design methods are orthogonal and independent in some way but should be combined appropriately. The selection of the used test design methods depends on many factors, for example:

Requirements of the system under test and the required quality
Requirements for the tests – quality of the tests, i.e. the required intensity and depth of the tests
Testing strategy: effort / quality of the tests, distribution of the testing in the development process
Existing test basis: specifications, documents, models
Problem to be tested (domain) or rather the underlying question (use case)
System under test or component under test
Test level / test step
Used technologies (software, hardware)
Suitable tool support: for some methods absolutely required

[Poster thumbnail repeated; see the full list under Poster Test Design Methods on One Page (2).]

Page 18:

Poster Test Design Methods on One Page (4)

The effort / difficulty for the test design methods, or rather the resulting test intensity, is subdivided into 5 levels:

1 – very low, simple
2 – low
3 – medium
4 – high
5 – very high, complex

This division into levels depends on the factors given above for the selection of the test design methods and can therefore only be used as a first hint and guideline. A test design method may also be used on a continuum from "intuitive use" up to "100% complete use", as required.

In addition, describe every test design method on one page to explain its basic message and intention.

[Poster thumbnail repeated; see the full list under Poster Test Design Methods on One Page (2).]

Page 19:

Example: Requirements-based with traceability matrix

Inventory tracking matrix

[Table: the inventory tracking matrix lists the test inventories – Requirements 1–3, Features 1–3, Use Cases 1–3, Objectives 1–3, and Risks 1–3 – as rows and Test Cases 1–8 as columns; an "x" marks each test case that covers an inventory item.]
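In its simplest form such a matrix is a mapping from inventory items to the test cases that cover them. A minimal sketch in Python, with made-up entries standing in for the x marks of the slide:

    # Inventory item -> set of test cases that cover it (an "x" in the matrix).
    coverage = {
        "Requirement 1": {"Test Case 1"},
        "Requirement 2": {"Test Case 2", "Test Case 4", "Test Case 7"},
        "Feature 1": {"Test Case 1", "Test Case 3"},
        "Use Case 1": {"Test Case 2", "Test Case 5"},
        "Risk 1": set(),  # not covered yet -- a gap the matrix makes visible
    }

    # Forward traceability: report inventory items without any covering test.
    uncovered = [item for item, tests in coverage.items() if not tests]
    print("Uncovered:", uncovered)

    # Backward traceability: which items does a given test case verify?
    def traces_of(test_case):
        return [item for item, tests in coverage.items() if test_case in tests]

    print("Test Case 2 traces to:", traces_of("Test Case 2"))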

Page 20:

Example: Use case-based testing – Scenario testing

A scenario is a hypothetical story used to help a person think through a complex problem or system.

Based e.g. on transaction flows, use cases, or sequence diagrams.

A specific, i.e. more extreme, kind of scenario testing is so-called soap opera testing.

Page 21:

Example: Soap opera testing

Ref.: Hans Buwalda: Soap Opera Testing, Better Software, February 2004

Page 22:

Example: Soap opera testing – Test objectives

Corresponds to the traceability matrix.

Ref.: Hans Buwalda: Soap Opera Testing, Better Software, February 2004

Page 23:

Example: Equivalence class partitioning and boundary values

Goal: create the minimum number of black-box tests needed while still providing adequate coverage.

Two tests belong to the same equivalence class if you expect the same result (pass / fail) from each. Testing multiple members of the same equivalence class is, by definition, redundant testing.

Boundaries mark the point or zone of transition from one equivalence class to another. The program is more likely to fail at a boundary, so these are the best members of (simple, numeric) equivalence classes to use.

More generally, you look to subdivide a space of possible tests into relatively few classes and to run a few cases of each. You'd like to pick the most powerful tests from each class.

Page 24:

Example: Equivalence class partitioning and boundary value analysis with 2 parameters

[Figure: three sketches of a two-parameter input domain (parameters a and b, each ranging from min to max), showing which points on and around the equivalence-class boundaries are selected as test cases.]

Number of test cases for n parameters and one valid equivalence class: 4n + 1

Number of test cases for n parameters and one valid and one invalid equivalence class: 6n + 1
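A sketch of where those counts come from, assuming integer-valued parameters and a one-factor-at-a-time design (the choice of min±1 / max±1 as boundary neighbours is an illustrative assumption): each parameter contributes 4 valid boundary values (min, min+1, max−1, max) and optionally 2 invalid ones (min−1, max+1) while the other parameters stay at a nominal value, plus one all-nominal test.

    def boundary_tests(params, include_invalid=False):
        """params: {name: (min, max)} with integer bounds.
        Returns 4n + 1 tests, or 6n + 1 with invalid boundary values."""
        nominal = {name: (lo + hi) // 2 for name, (lo, hi) in params.items()}
        tests = [dict(nominal)]                    # the "+ 1": all parameters nominal
        for name, (lo, hi) in params.items():
            values = [lo, lo + 1, hi - 1, hi]      # 4 valid boundary values
            if include_invalid:
                values += [lo - 1, hi + 1]         # 2 invalid values just outside
            for v in values:                       # vary one parameter at a time
                tests.append({**nominal, name: v})
        return tests

    params = {"a": (0, 100), "b": (10, 50)}        # n = 2 parameters
    print(len(boundary_tests(params)))                        # 4*2 + 1 = 9
    print(len(boundary_tests(params, include_invalid=True)))  # 6*2 + 1 = 13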

Page 25:

Example: Classification-tree method

Ref.: CTE XL, http://www.systematic-testing.com/

Page 26:

Example: State-based testing

State table

Table entries read Snew / X, where X is the action (or event) and Snew is the resulting new state; action N means "do nothing".

State transition diagram:

[Figure: state transition diagram with State 1, State 2, and State 3; transitions labeled Cond_1 / A (State 1 to State 2), Cond_2 / B (State 2 to State 3), Cond_3 / C (State 3 to State 1), and Cond_4 / D (State 3 to State 3).]

State   | Cond_1 | Cond_2 | Cond_3 | Cond_4
State 1 | 2 / A  | 1 / N  | 1 / N  | 1 / N
State 2 | 2 / N  | 3 / B  | 2 / N  | 2 / N
State 3 | 3 / N  | 3 / N  | 1 / C  | 3 / D
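From such a table, test steps can be derived mechanically. A minimal sketch in Python that encodes the table above and derives one check per entry, i.e. "all transitions" coverage (including the do-nothing entries):

    # (state, condition) -> (new state, action); from the state table above.
    table = {
        ("State 1", "Cond_1"): ("State 2", "A"),
        ("State 1", "Cond_2"): ("State 1", "N"),
        ("State 1", "Cond_3"): ("State 1", "N"),
        ("State 1", "Cond_4"): ("State 1", "N"),
        ("State 2", "Cond_1"): ("State 2", "N"),
        ("State 2", "Cond_2"): ("State 3", "B"),
        ("State 2", "Cond_3"): ("State 2", "N"),
        ("State 2", "Cond_4"): ("State 2", "N"),
        ("State 3", "Cond_1"): ("State 3", "N"),
        ("State 3", "Cond_2"): ("State 3", "N"),
        ("State 3", "Cond_3"): ("State 1", "C"),
        ("State 3", "Cond_4"): ("State 3", "D"),
    }

    # One test step per table entry gives "all transitions" coverage.
    for (state, cond), (new_state, action) in sorted(table.items()):
        print(f"in {state}, trigger {cond}: expect {new_state}, action {action}")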

Page 27:

Example: Cause-effect graphing

Requires dependencies between parameters and can get very complicated and difficult to implement.
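The core idea can be shown in a few lines: model causes as Boolean inputs, effects as Boolean expressions over them, and unfold the graph into a decision table with one test case per column. A minimal sketch; the causes and rules below are hypothetical:

    from itertools import product

    causes = ["valid_customer_id", "item_in_stock", "express_shipping"]
    effects = {
        "accept_order":  lambda c: c["valid_customer_id"] and c["item_in_stock"],
        "error_message": lambda c: not c["valid_customer_id"],
        "surcharge":     lambda c: (c["valid_customer_id"] and c["item_in_stock"]
                                    and c["express_shipping"]),
    }

    # Unfold into a decision table: every cause combination is one test case.
    for combo in product([False, True], repeat=len(causes)):
        c = dict(zip(causes, combo))
        expected = {name: rule(c) for name, rule in effects.items()}
        print(c, "->", expected)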

Page 28:

Example: Combinatorial Testing (pairwise testing)

Given a system under test with 4 parameters A, B, C, and D. Each parameter has 3 possible values:

Parameter A: a1, a2, a3
Parameter B: b1, b2, b3
Parameter C: c1, c2, c3
Parameter D: d1, d2, d3

A valid test input data set is e.g. {a2, b1, c2, d3}. Exhaustive testing would require 3^4 = 81 test cases.

Only 9 test cases are already sufficient to cover all pairwise interactions of parameters.
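For this 3^4 example the 9 test cases can be built from a standard orthogonal array: rows (i, j, (i+j) mod 3, (i+2j) mod 3) cover every value pair of every two parameters exactly once. A minimal sketch in Python that constructs and verifies such a set (parameter names as on the slide):

    from itertools import combinations

    values = {"A": ["a1", "a2", "a3"], "B": ["b1", "b2", "b3"],
              "C": ["c1", "c2", "c3"], "D": ["d1", "d2", "d3"]}
    names = list(values)

    # Orthogonal array L9(3^4): any two columns contain all 9 value pairs.
    rows = [(i, j, (i + j) % 3, (i + 2 * j) % 3)
            for i in range(3) for j in range(3)]
    tests = [{n: values[n][k] for n, k in zip(names, row)} for row in rows]
    for t in tests:
        print(t)

    # Verify pairwise coverage: 9 distinct pairs for each of the 6 column pairs.
    for p, q in combinations(names, 2):
        assert len({(t[p], t[q]) for t in tests}) == 9
    print(len(tests), "test cases cover all pairwise interactions")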

Page 29:

Example: Data flow-based

Defined / used paths:
defined (d) – for example, value assigned to a variable, initialized
used (u) – for example, variable used in a calculation or predicate
predicate-use (p-u)
computation-use (c-u)

Test du-paths.

Read / write access: "data source" and "data sink".

Use it on different levels of abstraction: model, unit, integration, system.
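In code, definitions and uses of a variable look like this; a minimal, hypothetical sketch with the def, p-use, and c-use of y marked, plus two inputs that together cover its du-paths:

    def classify(x):
        y = x * 2              # def of y: value assigned
        if y > 10:             # p-use of y: used in a predicate
            return y - 10      # c-use of y: used in a computation
        return 0

    # Data flow-based test design selects inputs so every def of y reaches
    # each of its uses (du-paths): both predicate outcomes and the c-use.
    assert classify(6) == 2    # def -> p-use (true branch) -> c-use
    assert classify(3) == 0    # def -> p-use (false branch)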

Page 30:

Example: Heuristics and mnemonics*

Boundary values

CRUD (data cycles, database operations)
Create, Read, Update, Delete

HICCUPPS (oracle)
History, Image, Comparable Products, Claims, User's Expectations, Product, Purpose, Statutes

SF DePOT (San Francisco Depot) (product element, coverage)
Structure, Function, Data, Platform, Operations, Time

CRUSSPIC STMPL (quality criteria)
Capability, Reliability, Usability, Security, Scalability, Performance, Installability, Compatibility, Supportability, Testability, Maintainability, Portability, Localizability

FDSFSCURA (testing techniques)
Function testing, Domain testing, Stress testing, Flow testing, Scenario testing, User testing, Risk testing, Claims testing, Automatic Testing

FCC CUTS VIDS (application touring)
Feature tour, Complexity tour, Claims tour, Configuration tour, User tour, Testability tour, Scenario tour, Variability tour, Interoperability tour, Data tour, Structure tour

*Ref.: James Bach, Michael Bolton, Mike Kelly, and many more

Page 31:

Guidelines and experiences (1)

For beginners:
Perhaps you are confused by the many test design methods.
Start simple, step by step.
Ask for help and advice from an experienced colleague, coach, or consultant.

For advanced, experienced testers (and developers!):
Check your current approach against this poster, think twice, and improve incrementally.

Use the poster as a checklist for existing test design methods.

The selection of test design methods is dependent on the context! So, you should adapt the poster to your specific needs.

[Poster thumbnail repeated; see the full list under Poster Test Design Methods on One Page (2).]

Page 32:

Guidelines and experiences (2)

Pick up this poster and:
give it to every developer and tester in your team, or
put it on the wall in your office, or
make it the standard screensaver or desktop background for all team members, or
even use the testing-on-the-toilet approach by Google (see http://googletesting.blogspot.com/) …

The poster increases the visibility and importance of test design methods, especially also for developers, to improve unit testing.

The poster facilitates a closer collaboration of testers and developers: you have something to talk about ...

[Poster thumbnail repeated; see the full list under Poster Test Design Methods on One Page (2).]

Page 33:

Summary

There exist many different methods for adequate test design. Looking into practice, these test design methods are often used only sporadically and in a non-systematic way.

The poster Test Design Methods on One Page, containing a systematic, structured, and categorized overview of test design methods, will help you really get them used in practice in your projects.

Do not miss breadth because of too much depth.

This will result in better and smarter testing.

[Poster thumbnail repeated; see the full list under Poster Test Design Methods on One Page (2).]
