Value-Inspired Testing: Renovating Risk-Based Testing, and Innovating with Emergence (2012)
Neil Thompson, Thompson information Systems Consulting Ltd
EuroSTAR Conferences – www.eurostarconferences.com
@esconfs #esconfs
Value-Inspired Testing v1.1a
Renovating Risk-Based Testing, and Innovating with Emergence
Neil Thompson ©
[email protected] | @neilttweet | neiltskype
Deming: survival is "not compulsory"
Are reports of testing's death "greatly exaggerated"?
• Tim Rosenblatt (Cloudspace blog, 22 Jun 2011): "Testing Is Dead – A Continuous Integration Story For Business People"
• James Whittaker (STARWest, 05 Oct 2011): "All That Testing Is Getting In The Way Of Quality"
• Alberto Savoia (Google Test Automation Conference, 26 Oct 2011): "Test Is Dead"
• (There *may* be others?)
But those definitions of testing seem too narrow – my agenda instead...
• To renovate the use of risk in testing:
– collate current variants, eg "Risk-Based", "Risk-Driven"
– use a context-driven mix of principles
– grade testing from high to low (not truncate)
– balance risk against benefits, giving net value
– use risk throughout the testing "process"
– integrate risk into the SDLC using Value Flow ScoreCards
• To innovate in testing:
– consider evolution in Nature: also a value flow?
– appreciate the concept of memes; evolving "memeplexes" in testing
– emergent path between "too much chaos" & "too much order"
– creativity: where good ideas come from (Johnson)
So, when holistic & evolving, testing will not die?
(based on http://www.needham.eu/wp-content/uploads/2011/01/ascent-of-man1.jpg)
Start renovation of "Risk" by collating current variants
[Timeline diagram, ~1970s to 2002. Strands: IMPLICIT RISK PRINCIPLES; "TESTING IS RISK-BASED"; HOW TO DO IT; RISK, SCHMISK! Leading to:
• RISK-BASED TEST DESIGN – risks as entities to test, driving techniques
• RISK-BASED TEST MANAGEMENT – risk as prioritisation of features etc
Dates marked: 1972-3, 1970s-1984, 1976, 1979, 1984-1988, 1990, 2002!]
Use a context-driven mix of available principles
(After: Heuristic Test Strategy Model v4.8, James Bach)
[Diagram linking RISK-BASED TEST MANAGEMENT and RISK-BASED TEST DESIGN:]
• Project environment → perceived quality
• Risk workshops: why, whether, who, where? when, what risks, how handle?
• Quality criteria → business risks & technical risks
• Risk factors to choose, eg usage, newness, complexity → test techniques & prioritisation
• Product elements → what to prioritise & focus on: test items? features? data items? test conditions?
Prioritisation: better than truncating "low-risk" tests, *grade* coverage
(After: Chris Comey, Testing Solutions Group. Four charts plot Test Coverage & Effort against Riskiness:)
• Even distribution – does this make sense? No!
• Random / spurious priorities – even less sense!
• Risk-truncated – better, but dangerous to omit some areas completely?
• Risk-graded – this is the most responsible way
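The risk-graded approach above can be sketched in code. This is a minimal illustration, not from the slides: the area names, scores and budget are hypothetical, and the floor-plus-proportional formula is just one simple way to grade rather than truncate.

```python
# Sketch: risk-graded test effort allocation (hypothetical data).
# Instead of truncating low-risk areas to zero effort, every area
# gets a minimum floor, and the remaining budget is spread in
# proportion to riskiness.

def grade_effort(riskiness: dict, budget: float, floor: float) -> dict:
    """Allocate `budget` units of test effort across areas.

    Each area receives at least `floor`; the remainder is split
    proportionally to its riskiness score.
    """
    remaining = budget - floor * len(riskiness)
    if remaining < 0:
        raise ValueError("budget too small for the chosen floor")
    total_risk = sum(riskiness.values())
    return {
        area: floor + remaining * (score / total_risk)
        for area, score in riskiness.items()
    }

areas = {"payments": 9.0, "reporting": 4.0, "help pages": 1.0}
effort = grade_effort(areas, budget=100.0, floor=5.0)
# Every area gets some coverage; high-risk areas get the most.
```

The key design point is the floor: it encodes the slide's warning that omitting some areas completely is dangerous.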
Consider not only risks – balance against benefits to give net value
(After: Paul Gerrard & Neil Thompson, book Risk-Based E-Business Testing)
[Diagram: product risks move from Open to Closed as tests, graded by priorities (+ features etc), are run; as risks close, project objectives, hence business benefits, become available for release now. Project... Business... Value]
Apply risk principles throughout software lifecycle
(After: Software Testing: A Craftsman's Approach, Paul Jorgensen)
[Diagram: the REAL WORLD is simplified into a DEVELOPMENT MODEL (Requirements → Functional Specification → Technical Design → Module Spec), refined with risk of distortion, then programmed with risk of bugs into SOFTWARE. A TEST MODEL (Acceptance / System / Integration / Component Test Analysis & Design) drives CT, IT, ST and AT Execution. Verification testing compares SOFTWARE (observed) against the DEV MODEL (expected); validation testing compares it against the TEST MODEL (ver'd / val'd) and the REAL WORLD (desired).]
So:
• remember overlapping models
• we need both verification & validation
• this is not "the" V-model!
Bear in mind causes and effects of risks
• Mistake: a human action that produces an incorrect result (eg in spec-writing, program-coding)
• Defect: incorrect information in specifications
• Fault: an incorrect step, process or data definition in a computer program (ie executable software)
• Anomaly: an unexpected result during testing
• Failure: an incorrect result
• Error: amount by which a result is incorrect
• ...plus knock-on effects
• Probability: of making mistakes, of defects causing faults, of faults causing failures, etc
• Consequence of the risk if it happens: on the TEST "process"; on the REAL WORLD after go-live
• Static verification and validation detect these at different points in the lifecycle
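The probability and consequence components above are often combined into a single exposure score for prioritisation. A minimal sketch under assumptions: the risk names, scales and the simple product formula are illustrative, not the only way to score risk.

```python
# Sketch: risk exposure as probability x consequence (illustrative).
# Probability reflects how likely mistakes/defects/faults are to
# lead to failure; consequence reflects the impact on the test
# "process" and on the real world after go-live.

from dataclasses import dataclass

@dataclass
class Risk:
    name: str
    probability: float   # 0.0 .. 1.0
    consequence: float   # e.g. 1 (minor) .. 10 (severe)

    @property
    def exposure(self) -> float:
        return self.probability * self.consequence

risks = [
    Risk("spec ambiguity in refunds", 0.7, 8.0),
    Risk("wrong build deployed", 0.2, 9.0),
    Risk("typo on help page", 0.5, 1.0),
]

# Address the highest-exposure risks first.
by_exposure = sorted(risks, key=lambda r: r.exposure, reverse=True)
```

Note how a likely-but-moderate risk can outrank a severe-but-unlikely one; the ranking, not the absolute numbers, is what drives test ordering.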
Risk principles apply throughout testing "process"
• Prevention: write / model better requirements; use "Peopleware" principles
• Static validation (against the specification & other oracles): detect omissions, distortions, rogue additions...
• Test analysis → test design → test execution → bug management (may be all or partially exploratory)
• Prioritise by both urgency & importance
• Consequences fall on the DEV & TEST "processes", and on the REAL WORLD after go-live
• Detect further bugs; adjust test coverage; fix, test fixes, regression-test
A framework for managing value through the lifecycle: "Value Flow ScoreCard"
• Seven viewpoints: Financial, Supplier, Customer, Infrastructure, Process, Product, Improvement – "the seven watchwords of highly effective software people!"
• They cover WHO, WHY, WHAT, WHEN, WHERE and HOW
• In action, the ScoreCard is a 7x4 table:
– uses include setting / balancing test policy, strategy, coverage, troubleshooting & improvement
– can start with repositionable paper notes, or use a spreadsheet
– NB the measures & targets need not be quantitative; they may be qualitative, eg rubrics
Risk can be integrated into the ScoreCard
• Columns: Objectives (WHY we do things); Threats to success / Risk (HOW they may fail); Measures & Targets (WHAT will constitute success, WHEN & WHERE); Initiatives (HOW to do things well)
• Rows: the SEVEN VIEWPOINTS of what stakeholders want – Financial, Supplier, Customer, Infrastructure, Process, Product, Improvement
• Now it's a 7x5 table
Types of risk
• Project risk, eg: supplier may deliver late; key staff may leave
• Process risk, eg: configuration management may install wrong version of product
• Product risk, eg: specifications may contain defects; software may contain faults
• Each type may cause the others
So: we've renovated "risk-based testing" into a whole-lifecycle structure
• ScoreCard columns: Objectives (WHY we do things); Threats to success; Measures & Targets (WHAT will constitute success, WHEN & WHERE); Initiatives (HOW to do things well)
• Threats to success broken down into project, process & product risks (process risks also feed the Improvement viewpoint)
• Rows: the SEVEN VIEWPOINTS of what stakeholders want – Financial, Supplier, Customer, Infrastructure, Process, Product, Improvement
Now to move on to innovation
• The double feedback loop of the ScoreCard:
– not only is our scorecard, and its cascading, converging on desired targets for the current project...
– but also: how we are planning to improve for next & future projects (process risks feed the Improvement viewpoint)
How does Nature innovate?
(Images from Wikipedia)
• Lamarck: acquired characteristics, usage, inheritance
• Darwin: mutation, fitness, reproduction
• Emergence... (various authors)
A scientific view of emergence
(Sources: Daniel Dennett, "Darwin's Dangerous Idea"; the "cosmic Ouroboros" – Sheldon Glashow, Primack & Abrams, Rees etc. Image from http://www.aaas.org/spp/dser/03_Areas/cosmos/perspectives/Essay_Primack_SNAKE.GIF)
• Levels: Physics (quantum end) → Chemistry: Inorganic → Chemistry: Organic → Biology → Social sciences → ... → Physics (gravity end)
• (Ouroboros: Greek Οὐροβόρος or οὐρηβόρος, from οὐροβόρος ὄφις, "tail-devouring snake")
Is like value flow? (and it looks better this way up!)
• Each level of progress generates possibilities, which are tested
• Then, each level is a platform which, when established, is easily built upon by "cranes" (without having to worry about the details below)
• After the science levels: humans made tools, talked and co-operated; printing gave us another level; now, software is following exponential growth
• So, software testing should surf the wave of evolution (not flounder in the shallows behind it)
• Kurzweil's epochs (The Singularity is Near, 2005): 0: Maths?!; 1: Chemistry & Physics; 2: Biology; 3: Brains; 4: Technology; 5: Bio methods integrated into technology?; 6: Intelligence into matter/energy patterns? → "SINGULARITY"
The Darwinian view of evolution – but does this explain all emergence?
(Image from www.qwickstep.com)
Biological evolution as sophistication rising with diversity
[Chart: sophistication vs diversity, increasing over time]
But evolution is not smooth?
("Punctuated equilibria" idea originated by Niles Eldredge & Stephen Jay Gould. Images from www.wikipedia.org)
[Charts of sophistication vs diversity (number of species): "gradual" Darwinism vs punctuated equilibria]
• "Explosion" in species, eg Cambrian
• Spread into new niche, eg mammals
• Mass extinction, eg dinosaurs
• (periods of equilibrium in between)
So... evolution of sciences overall?
• Arguably other sciences have not evolved smoothly either
• Sudden advances, akin to punctuated equilibria in biological evolution
(Per Bak, "How Nature Works", 1996. Chart of sophistication vs diversity: Physics → Chemistry (inorganic, organic) → Biology → Social sciences. Image: Tracey Saxby, Integration and Application Network, University of Maryland Center for Environmental Science, ian.umces.edu/imagelibrary/)
OK, what's all this got to do with software testing?
• Social sciences evolution continues: tools → language → books → computers
• Tipping Points (Malcolm Gladwell)
• We have an important and difficult job to do here!
Testing needs to evolve / emerge / innovate to keep up with complexity
• Computers: 1GL → 2GL → 3GL → 4GL → Object Orientation → Internet, mobile devices → Artificial Intelligence?!
• For example, are we ready to test AI??
How has testing evolved so far?
(Overall periods developed after Gelperin & Hetzel, "The Growth of Software Testing", 1988, CACM 31 (6), as quoted on Wikipedia)
[Table: PERIOD | EXEMPLAR | OBJECTIVES | SCOPE | "SCHOOL"? Period boundaries marked: pre-1957, 1957, 1976, 1983, 1984, 2000, 2011]
• DEBUGGING (Psychology): Weinberg (1961 & 71); test + debug; programs
• DEMONSTRATION (Method): Hetzel (1972); show meets requirements; programs
• DESTRUCTION (Art): Myers (1976 & 79); find bugs; programs, system, acceptance
• EVALUATION (Engineering?): Beizer (1984); find bugs, show meets requirements, + prevent bugs; + integration
• PREVENTION (Craft?): measure quality
• HUMANISATION? (Social Science?): Kaner et al (1988 & 99); find bugs, in service of improving quality, for customer needs
• AUTOMATION? (Technology?)
• UNIFICATION?? (Science?): experiment & evolve?
• "Schools": Analytic, Standard (Control), Quality, Factory, Context-Driven ("no schools, but..."), Agile (Test-Driven), Neo-Holistic?
Another way of thinking about evolution: genes...
(Images from www.qwickstep.com and schools.wikipedia.org)
• GENES: replication & selection, plus mutation
[Chart: sophistication vs diversity]
...and for humans, "memes", as an extension of the genes concept
(Theme developed from Daniel Dennett, "Darwin's Dangerous Idea". Image from www.salon.com; taxonomy from www.wikipedia.org)
• Biological evolution: GENES – replication & selection, mutation
• Mental, social & cultural evolution: MEMES – (Lamarckian??) replication & selection, mutation; platforms & cranes
• Meme taxonomy: ideas, beliefs, practices, symbols, gestures, speech, writing, rituals, "other imitable phenomena"
Considering memes in testing: here is an example "memeplex"
(Source: Neil Thompson, STAREast 2003 – not "best practices" but reference points for variation?)
• Always consider: effectiveness & efficiency; risk management & quality management; insurance & assurance
• V-model: what testing against; W-model: quality management
• Risks: list & evaluate
• Define & detect errors (UT, IT, ST); give confidence (AT)
• Prioritise tests based on risks
• Tailor risks & priorities etc to factors
• Refine test specifications progressively: plan based on priorities & constraints; design flexible tests to fit; allow appropriate script format(s); use synthetic + lifelike data
• Allow & assess for coverage changes; document execution & management procedures
• Distinguish problems from change requests
• Prioritise urgency & importance
• Distinguish retesting from regression testing
• Use handover & acceptance criteria
• Define & measure test coverage
• Measure progress & problem significance
• Be pragmatic over quality targets
• Quantify residual risks & confidence
• Decide process targets & improve over time
• Define & use metrics
• Assess where errors originally made
• Define & agree roles & responsibilities
• Use appropriate skills mix
• Use independent system & acceptance testers
• Use appropriate techniques & patterns
• Plan early, then rehearse-run, acceptance tests
• Use appropriate tools
• Optimise efficiency
Another example memeplex for testing
(Source: Neil Thompson, BCS SIGiST 2002 review of Lessons Learned in Software Testing – Kaner, Bach & Pettichord)
• Chapters: your career in software testing; the role of the tester; thinking like a tester; testing techniques; bug advocacy; automating testing; documenting testing; interacting with programmers; managing the testing project; managing the testing group; planning the testing strategy
• (Grouped here by chapter for illustration, and coloured by theme, eg Management, Thinking)
• 293 individual "lessons" selectable by testers according to context
So, do we have punctuated equilibria in the evolution of testing?
(Sources: Gelperin & Hetzel 1988 etc??)
• Stages: DEBUGGING (Psychology) → DEMONSTRATION (Method), eg V-model → DESTRUCTION (Art), eg test techniques → EVALUATION (Engineering?), eg metrics initiatives → PREVENTION (Craft?), eg reviews, root cause analysis → HUMANISATION? (Social Science?), eg Context-Driven school → AUTOMATION? (Technology?), eg test-driven development → UNIFICATION??
• Where were the platforms? Candidates include: software analysis, publication of ANSI/IEEE standards, establishment of textbooks, acknowledgement of testing as a distinct discipline, mass-market software, open-source tools, belief in cost-of-failure curves
• What were the CRANES? Tipping points?
• But... is there something wrong with this picture?...
One of the existing views of innovations in software testing
(After: Lines of Innovation in Software Testing, Stuart Reid, 2010/2011, testing-solutions.com)
• Layers: Testing & Quality; Testing (20th C); testing innovations in specific subjects
• Concepts: hierarchy; products / processes
• Factors: invention / application; individuals / organisations; bottom-up / top-down; synthesis of precursors; adjacent possibilities; role of testing!
• Aids: population size; diversity / interdiscipline; free time / free to fail; psychology & serendipity; recording media
Arguably, emergence is more than just Lamarckian / Darwinian
(Extrapolation from various sources, esp. Stuart Kauffman, "The Origins of Order", "Investigations")
• Emergences at coarser scales (Physics → Chemistry → Biology → Social sciences) are not explained by "reductionism" to finer scales
• For best innovation & progress, need neither too much order nor too much chaos (each level sits on a boundary between ORDER and CHAOS)
• Examples: galaxy development, phase transitions, Gaia, autocatalysis, amino acids → proteins, political swings, AI & IA?
• Might also apply to testing??
History of testing is intertwined in "ecosystems" with technology, software lifecycles, etc
[Chart: sophistication vs diversity, between ORDER and CHAOS]
• Testing & Quality: Psychology → Method → Art → Engineering → Craft → Technology → Social science
• Development: structured methodologies → CASE tools → immature Agile → mature Agile?
And within testing, different contexts have so far evolved in separate streams?
• Testing & Quality: TRADITIONAL "SCHOOLS" (Psychology, Method, Art, Engineering, Craft, Technology, Social science) vs CONTEXT-DRIVEN; exploitation vs exploration
• Recent changes regarding "school" & "approach"
• Limited dialogue, mutual mistrust, "language" differences
An "emergent" view of innovation
(Johnson's ideas overlaid here on Neil Thompson's graphic)
• Eight related ideas from the history of human innovation: "0"; 1. Adjacent possible; 2. Liquid networks (Reef, City, Web); 3. Slow hunch; 4. Serendipity; 5. Error; 6. Exaptation; 7. Platforms
Emergent view: (a) innovation framework
• 1. Adjacent possible: things happen wherever they can happen; "patterns of innovation are fractal"
• 2. Liquid networks (Reef, City, Web): ideas flowing without friction
– coral reefs are a surprisingly diverse habitat, because they are a crowded, wave-washed boundary zone
– cities concentrate minority interests where they can communicate
– tech innovations used to take 10 years; on the www, 1 is enough
• 7. Platforms: once a new level is established, we can build on it, almost without thinking
Emergent view: (b) innovation "techniques"
• 3. Slow hunch: many innovations are not eureka moments; they take time to evolve & establish
• 4. Serendipity: you may find something different, but it's important to be seeking something
• 5. Error: noise can make us focus more; OK to fail, but try to fail fast
• 6. Exaptation: modifications can be hi-jacked for unexpected things (and beneficially)
(Image: tattoos99.com)
A brief history of human innovation
(Source: Steven Johnson, "Where Good Ideas Come From: The Natural History of Innovation". Quadrants: individual(s) vs communities; amateur vs market)
• 1400-1600: most discoveries by "amateur individuals", eg supernovae (Brahe)
• 1600-1800: rise of amateur communities, eg Milky Way (Al-Biruni, Galileo, Herschel & his sister)
• 1800-current: rise of market communities, eg radio (Marconi, Tesla, Braun, Hertz etc)
So, what could software testing learn from the history of innovation?
HUMAN HISTORY → SOFTWARE TESTING:
• Slow hunch, exaptation: keep a notebook – you never know what may come in handy eventually (see also Jerry Weinberg's Fieldstone method)
• Communities (market & amateur): even competitors in this market seem to collaborate and mutually respect each other – keep it up! Attend conferences etc
• Adjacent possible: try modifying / combining / hybridising techniques; they're not set in stone (eg 2-D classification trees)
• Reef, City, Web: even if introvert, use LinkedIn, Twitter etc
• Serendipity: if a trail goes cold, turn your nostrils in some other direction
• Platforms: seek new uses of previous achievements, eg test automation in new ways (high-volume random)
An additional thought: renovated risk, & science, as UNIFICATION?
• Testing contexts will of course continue to differ, but...
• More mutual dialogue may increase innovation, on both sides...
• ...if we can all share understanding across varied contexts: traditional, risk-averse sectors; and market-chasing, product-oriented, risk-tolerant / risk-embracing sectors
A brief history of testing innovation?
(Quadrants: individual(s) vs communities; amateur vs market)
• 1950s-1999?: guru individuals? (Analytic, Quality, Factory schools)
• 2000-2012?: communities in relative isolation? (Context-Driven; Agile (Test-Driven))
• 2012 onwards?: communities interacting more?
Key references & acknowledgements (NB this is not a full bibliography)
• Use of risk in testing (yes, other sources are available!):
– Kaner, Bach & Pettichord: Lessons Learned in Software Testing
– Craig & Jaskiel: Systematic Software Testing
– Gerrard & Thompson: Risk-Based E-Business Testing
• Principles contributing to Value Flow ScoreCard:
– Kaplan & Norton: The Balanced Scorecard – Translating Strategy into Action
– Isabel Evans, Mike Smith, Software Testing Retreat
• History & innovations in testing:
– Gelperin & Hetzel: The Growth of Software Testing
– (Meerts: testingreferences.com incl. timeline – see Paper)
– Stuart Reid: Lines of Innovation in Software Testing
• Emergence:
– Dennett: Darwin's Dangerous Idea
– Eldredge & Gould: Punctuated Equilibria... (in Models in Palaeobiology)
– Kauffman: The Origins of Order; Investigations; etc
– Johnson: Where Good Ideas Come From
– (+ Kurzweil: The Singularity is Near?!)
Takeaway ideas
• All testing is risk-based / value-inspired, whether or not you recognise it yet (so, make a virtue of it)
• Embrace diversity; discuss! Don't dismiss, disrespect or just "agree to differ"
• Mix with lots of non-testers
• Seek out analogies & metaphors
• Depending on your personality:
– read lots of books (eg "things to read together" = adjacent possible)
– do lots of thinking, deliberate & unintended
– participate in blogs, discussion groups
• Remember: change is accelerating, and innovation is fractal!