Neil Thompson - Thinking tools: from top motors, through software process improvement, to context-driven

Description: Visit SoftTest Ireland (www.softtest.ie) and sign up for access to free Irish Software Testing events.

Transcript:
Slide 1:
Thinking tools: from top motors, through software process improvement, to context-driven
Neil Thompson, Thompson information Systems Consulting Ltd
SoftTest Ireland, with the support of the All Ireland Software Network
Belfast, 20 September 2007
23 Oast House Crescent, Farnham, Surrey, England, UK, GU9 0NP; www.TiSCL.com
Slide 2: Can software process improvement learn from these?
• TOYOTA CELICA GT4
• TOYOTA PRIUS
Slide 3: How Toyota progressed through quality to global dominance, and now innovation
• Quality (top reputation):
– has dominated the JD Power satisfaction survey for a decade
– Toyota Production System (TPS): 14 principles across Philosophy, Problem-Solving, Process and People & Partners
• Global dominance:
– market value > GM, Chrysler & Ford combined
– on track to become (2006) the world's largest-volume car manufacturer
• Innovation (fast):
– Lexus: invaded the "quality" market and won
– Prius: not evolutionary but revolutionary, yet launched 2 months early and sold above expectations
– Toyota Product Development System (TPDS): 13 (!) principles across Process, People and Tools & Technology
Slide 4: Agenda
• Contents: analogies between world-leading improvements in manufacturing and what we may do in the software development lifecycle (SDLC):
– 1. Things that flow through a process: inventory, value (EuroSP3 2004)
– 2. Constraints on process, and thinking tools to improve (EuroSTAR 2006)
– 3. From process improvement to process definition, eg context-driven (STAREast 2003)
• Acknowledgements:
– Jens Pas (EuroSTAR 1998): my introduction to Goldratt
– Greg Daich (STAREast 2002): I generalised his idea, then worked backwards to the roots
• Objectives for audience:
– entertainment? something a bit different
– appreciate some fundamental principles
– take away a set of simple diagrammatic thinking tools which are useful in many situations
– think about your particular SDLC: where are the constraints?
– go on to read some of the references
– benefit by then improving your own processes
– be more ready to learn from other disciplines & industries
Slide 5: The "new" paradigm in manufacturing: value flow, pull not push, problem-solving
[Diagram comparing the two schools side by side:]
• GOLDRATT: Drum-Buffer-Rope; maximise throughput; Critical Chain management; monitoring buffers; cause-effect trees; conflict resolution diagrams; identify constraint, "elevate" & iterate
• TOYOTA (TPS & TPDS): Takt (rhythm); low inventory ("lean"); Just-In-Time; minimise waste; Andon (stop and fix); Kanban cards; tagging slow movers; one-page metrics; chain of 5 "why"s; Plan-Do-Check-Act; customer-defined value (to separate value-added from waste); front-load product development to explore alternatives thoroughly while maximising design space; plus numbered principles: 2. Continuous process flow to surface problems; 3. Pull to avoid over-production; 4. Level workload; 7. Visual control to see problems; 12. See for yourself to thoroughly understand; 13. Decide slowly (all options) by consensus; 14. Learning organisation via reflection & improvement
• And now these principles have been successfully applied beyond actual manufacturing, into product development
• But what about development of software?...
• I prefer Goldratt for thinking tools...
Slide 6: Goldratt's Theory of Constraints: an analogy to explain
[Diagram of a column of marching soldiers: the Drum sets the pace, the Rope ties the front of the column to the slowest marcher, and a Buffer protects him; based on the diagrams in The Race, E.M. Goldratt & R. Fox, 1986]
• Goal: to win the war
• Objective: to maximise throughput (right soldiers doing right things)
• Constraint on throughput: the slowest marcher
• Critical chain: the weakest link is all we need fix, by means of...
• Five focussing steps: identify the constraint, exploit it, subordinate all else, elevate it (i.e. strengthen it so it is no longer the weakest), then... identify the next constraint (sketched below)
• But now it's no longer simple: so we need iterative tools for what to change, what to change to, and how
• Five thinking tools (based on sufficient causes & necessary conditions)
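To make the five focussing steps concrete, here is a minimal Python sketch (mine, not from the talk; the stage names and hourly capacities are invented) that repeatedly identifies the constraint in a pipeline and "elevates" it:

```python
# A minimal sketch of Goldratt's five focussing steps applied to a pipeline
# of stages with hourly capacities. Names and numbers are illustrative.

def find_constraint(capacities):
    """Step 1: identify the constraint (the slowest stage)."""
    return min(capacities, key=capacities.get)

def improve(capacities, rounds=3, boost=1.5):
    for _ in range(rounds):
        bottleneck = find_constraint(capacities)
        throughput = capacities[bottleneck]  # whole-pipeline throughput
        print(f"Constraint: {bottleneck} ({throughput}/hour)")
        # Steps 2-3 (exploit, subordinate) happen in the real process;
        # step 4 (elevate): strengthen the constraint, then iterate (step 5).
        capacities[bottleneck] *= boost

capacities = {"requirements": 4.0, "design": 6.0,
              "programming": 2.0, "testing": 3.0}
improve(capacities)
```

Each round the bottleneck moves somewhere else, which is why the steps must be iterated rather than applied once.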
Slide 7: Applicability of TOC beyond manufacturing
• Military logistics
• Marketing, sales & distribution
• Project management
• Measurements, human relationships, medicine etc
• Using technology, eg assessing benefits of functionality
• IT systems development:
– focussing of Goldratt's Critical Chain on hi-tech projects (Robert C. Newbold)
– methodology design (Alistair Cockburn)
– "Lean software development" (Mary & Tom Poppendieck)
– Agile management using Feature Driven Development (David J. Anderson)
Slide 8: But software development isn't like manufacturing?
• Software isn't like hardware
• Intellect adds value less predictably than machines
• The manufacturing part of software development is disk duplication: "development" is really a design activity
• People are more important than processes
• Software development doesn't repeat exactly; people are always tinkering with the processes
• Development involves discovery; production involves reducing variation
• But I say: does that make all the analogies worthless, or do they just need interpreting? I suggest the latter...
Slide 9: A code factory and a bug factory
[Diagram contrasting two flows. No-waste factory: stated requirements (a, b, c) flow through programming straight into demonstrations & acceptance tests. Bug factory: documented requirements (a, b, c, d) and implicit requirements diverge (b becomes b'), forcing a meeting / escalation to agree, with inventory (I) piling up in front of the acceptance tests.]
• No-waste factory
• Now here's some waste: meeting/escalation (and loaded personal memories), or inventory?
Slide 10: Specs & unfinished software are inventory
• Specifications are generally not needed after go-live (I will come to exceptions later), so they are not end-product, they are work-in-progress (especially intermediate levels like functional & non-functional specs)
• Untested software, and even finished software not yet paid for, is also inventory
• Iterative lifecycles help if "adaptive" (product-based) rather than "transformational" (where specifications multiply!)
[Diagram: requirements a and b flow through programming to a', b' and c; revised & new requirements feed back in, which may include redesign; inventory (I) accumulates between iterations.]
Slide 11: The full traditional W-model bulges with inventory!
[Diagram: the W-model. Down the left leg, each deliverable is made into the next: Business objectives → Requirements Statement → Functional Spec. → Technical Design → Module Specs. → Code. Alongside each runs a static-check activity (verify, and validate incl. "QA") which also specifies the corresponding test: Verify & Validate RS + spec. acceptance test; V&V FS + spec. system test; V&V TD + spec. integration test; V&V MS + spec. unit test. Up the right leg, each level tests against its baseline: Unit test → Integration test → System test → Acceptance test → Post-implementation review, each with its own retest, fix & regression test loop, retesting lower levels where necessary.]
Slide 12: In a factory, small batches reduce inventory
[Two inventory-vs-time charts, based on Goldratt, The Race (North River Press 1986). Single-batch ("waterfall"): one batch of 1000 units flows through five stages with rates (i) 1.3, (ii) 10.0, (iii) 1.0, (iv) 10.0 and (v) 2.0 units/hour, so inventory stays high for the whole 0-4 month timescale. Multi-batch ("iterative"): five batches of 200 units flow through the same stages, so inventory stays low and finished units emerge steadily over the same period.]
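The arithmetic behind these charts can be sketched in a few lines of Python. The stage rates are the slide's; the flow-shop assumptions (each stage works on one batch at a time, and a batch moves on only when the whole batch is finished) are mine:

```python
# Minimal flow-shop sketch: compare one 1000-unit batch with five 200-unit
# batches through the slide's five stages.

RATES = [1.3, 10.0, 1.0, 10.0, 2.0]  # units/hour per stage, as on the slide

def makespan(batch_sizes, rates=RATES):
    """Classic flow-shop recurrence: a batch starts a stage when both the
    stage is free and the batch has finished the previous stage."""
    done = [0.0] * len(rates)  # time each stage becomes free
    for size in batch_sizes:
        t = 0.0
        for j, rate in enumerate(rates):
            t = max(t, done[j]) + size / rate
            done[j] = t
    return done[-1]  # when the last batch leaves the last stage

print(f"Single batch of 1000: {makespan([1000]):7.0f} hours")
print(f"Five batches of 200:  {makespan([200] * 5):7.0f} hours")
```

Under these assumptions the single 1000-unit batch emerges after roughly 2,470 hours, while the last of the five 200-unit batches emerges after roughly 1,290, because the fast stages can work on one batch while the bottleneck (stage iii) works on another.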
Slide 13: Drum-buffer-rope approach to constraints
• Optimise throughput by:
– (1) a drumbeat based on the constraining stage (a) and orders (b)
– (2) a buffer to protect the constraining stage from upstream disruptions
– (3) a rope to prevent the leader extending its gap on the constraining stage
• For subassemblies feeding in, have additional buffers
[Diagram: "troops marching" materials flow from raw materials in, through the fast "leader" stage, into a buffer (2) before the constraining stage (a), then assembly and out to orders (b); the drum (1) paces the line and the rope (3) ties raw-material release to the constraint; a subassembly feeds in through its own subassembly buffer.]
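A minimal discrete-time sketch of drum-buffer-rope (all numbers invented): the rope releases raw material only while the constraint's buffer has room, so upstream stages cannot pile up inventory, yet the buffer keeps the constraint fed through disruptions:

```python
# Minimal discrete-time sketch of drum-buffer-rope (illustrative numbers).
import random
random.seed(1)

BUFFER_CAP = 5           # rope: release only while buffer holds < 5 items
buffer, done = 0, 0

for hour in range(200):
    # Fast upstream "leader" stage: works only when the rope allows release.
    if buffer < BUFFER_CAP:
        buffer += 1                                # leader feeds the buffer
    # Constraining stage (the drum): occasionally disrupted, else consumes one.
    if buffer > 0 and random.random() > 0.2:       # 20% disruption rate
        buffer -= 1
        done += 1

print(f"Throughput: {done} items in 200 hours (set by the constraint)")
print(f"Final buffer: {buffer} (bounded by the rope, not growing)")
```

Without the rope the fast "leader" would keep producing and work-in-progress would grow without bound; throughput would be no higher, because it is set by the constraint either way.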
Slide 14: In software development & testing, small batches = agile methods: consider inventory moving through the SDLC
[Cumulative flow diagram, based on Anderson, Agile Management for Software Engineering (Prentice Hall 2004): amount of functionality plotted against date, with one cumulative line per stage: Requirements, Specification, Design, Programming & unit testing, Integration testing, System testing, Acceptance testing, Live and paid-for. The vertical distance between the top and bottom lines is the inventory in the process overall; between adjacent lines, the inventory in that stage; the horizontal distance between adjacent lines is the lead time for that stage.]
• If the lines are not approximately parallel, inventory is growing
• Within each stage of testing, can subdivide by pass/fail, bug states etc
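Reading inventory off a cumulative flow diagram is simple subtraction, as this sketch shows (the stage names follow the slide, but the counts are invented):

```python
# Minimal sketch of reading inventory off a cumulative flow diagram.
# Rows are stages; columns are cumulative functionality (e.g. features)
# that has entered each stage by each week. All counts are invented.

STAGES = ["requirements", "programming", "system_testing", "live"]
cumulative = {
    "requirements":   [10, 20, 30, 40],
    "programming":    [ 4, 12, 22, 32],
    "system_testing": [ 0,  6, 14, 24],
    "live":           [ 0,  2,  8, 16],
}

week = 3  # look at the last week
for upstream, downstream in zip(STAGES, STAGES[1:]):
    wip = cumulative[upstream][week] - cumulative[downstream][week]
    print(f"Inventory in {downstream:>14}: {wip} features")

total_wip = cumulative[STAGES[0]][week] - cumulative[STAGES[-1]][week]
print(f"Inventory in process overall: {total_wip} features")
```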
Slide 15: Agile methods: pull value instead of pushing documentation
[Diagram contrasting two directions of flow:]
• LEVELS OF DOCUMENTATION, pushed by specifiers: Requirements + Functional Spec + Technical Design + Unit / Component specifications + Test Specifications
• FLOW OF FULLY-WORKING SOFTWARE, pulled by customer demand: Unit / Component-tested → Integrated → System-tested → Accepted → WORKING SOFTWARE
Slide 16: But even if our context is not suitable (or ready) for agile methods, we should understand flow
[Diagram: the SDLC drawn as a factory line: raw materials in → Requirements → Specification → Design → Programming → Unit testing → Integration testing → System testing → Acceptance testing → order(s), with sub-assemblies feeding into the main assembly.]
• Where is/are the constraining stage(s)?
• Where should buffers be / not be?
Slide 17: New paradigm problem-solving: the Goldratt-Dettmer* "Thinking Tools"
[Diagram of five complementary trees, read left to right:]
• (1) CURRENT REALITY (what to change): undesirable effects trace back through intermediate effects to a core problem + (other) root causes
• (2) CONFLICT RESOLUTION (what to change to): an objective rests on requirements + INJECTIONS, which rest on prerequisites + conflicts
• (3) FUTURE REALITY (what to change to): desired effects follow through intermediate effects from CURRENT REALITY + injections
• (4) PRE-REQUISITES (how to change): an objective is reached via intermediate objectives that overcome obstacles
• (5) TRANSITION (how to change): an objective follows through intermediate effects from needs + specific actions
* very slightly paraphrased here
Sources: Dettmer, W., Goldratt's Theory of Constraints (ASQ 1997); Thompson, N., "Best Practices" & Context-Driven: Building a Bridge (STAREast 2003)
Slide 18: The thinking tools are complementary diagrams
• Causes and effects
• Necessary and sufficient conditions
http://www.osaka-gu.ac.jp/php/nakagawa/TRIZ/eTRIZ/eforum/eETRIACon2003/Fig11TillmannB.jpg
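The two logics can be sketched as data. Here is a minimal forward-chaining reading of a sufficiency (cause-effect) tree, using entries borrowed from the SWOT example later in the talk; the representation is mine, not from the talk:

```python
# Minimal sketch of sufficiency logic in a cause-effect tree: a complete
# set of causes, taken together, is enough to produce the effect.

SUFFICIENT = {  # effect: list of cause-sets; any complete set produces it
    "large texty test specs": [{"texty system specs", "testers prefer text"}],
    "coverage gaps": [{"large texty test specs"}],
}

def derive(facts, rules=SUFFICIENT):
    """Forward-chain: keep adding effects whose causes are all present."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for effect, cause_sets in rules.items():
            if effect not in facts and any(cs <= facts for cs in cause_sets):
                facts.add(effect)
                changed = True
    return facts

print(derive({"texty system specs", "testers prefer text"}))
```

Necessity logic (prerequisite and transition trees) is the dual: every listed condition must hold before the objective can.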
Slide 19: Why better than "traditional" process improvement in software testing
[Diagram positioning three improvement models against two fixed dimensions, PREDEFINED SUBJECT AREAS and MATURITY LEVELS: Test Maturity Model (TMM℠), Test Process Improvement (TPI®), and Test Organisation Maturity (TOM™), the last using medical analogies and offering some flexibility.]
Sources: TMM℠: http://www.stsc.hill.af.mil/crosstalk/1996/09/developi.asp; TPI®: based on http://www.sogeti.nl/images/TPI_Scoring_Tool_v1_98_tcm6-30254.xls; TOM™: based on http://www.evolutif.co.uk/tom/tom200.pdf, as interpreted by Reid, S., Test Process Improvement: An Empirical Study (EuroSTAR 2003)
Slide 20: Extending the new paradigm to testing: by rearranging TPI's key areas...
...we can begin to see cause-effect trees...
1. Test strategy
2. Lifecycle model
3. Moment of involvement
4. Estimating & planning
5. Test specification techniques
6. Static test techniques
7. Metrics
8. Test automation
9. Test environment
10. Office environment
11. Commitment & motivation
12. Test functions & training
13. Scope of methodology
14. Communication
15. Reporting
16. Defect management
17. Testware management
18. Test process management
19. Evaluation
20. Low-level testing
Slide 21: Cause-effect trees: can start with TPI's inbuilt dependencies
...eg for getting to at least level A throughout
[Diagram (slightly simplified): the 20 key areas from the previous slide, linked by TPI's inbuilt dependencies between their maturity levels, eg: A: Informal techniques; A: Single hi-level test; A: Budget & time; A: Plan, spec, exec; A: Completion of test basis; A: Substantiated; A: Product for project; A: Defects; A: Internal; A: Planning & execution; A: Managed-controlled; A: Testers & Test Manager; A: Project-specific; B: Test integrated in project organisation; B: Progress, activities, prioritised defects; B: + Monitoring & adjustment; B: Formal techniques. See the dependency sketch after this slide.]
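TPI's inbuilt dependencies form a directed acyclic graph, so a topological sort gives a workable improvement order. A minimal sketch using Python's graphlib; the few edges shown are an illustrative subset of my own choosing, not the full TPI dependency matrix:

```python
# Minimal sketch: TPI level-A dependencies as a DAG, then an improvement
# order via topological sort. Edges are an illustrative subset only.
from graphlib import TopologicalSorter

DEPENDS_ON = {  # key area level: the levels it depends on
    "A: Plan, spec, exec (2. Lifecycle model)": [],
    "A: Informal techniques (5. Test spec techniques)":
        ["A: Plan, spec, exec (2. Lifecycle model)"],
    "A: Single hi-level test (1. Test strategy)":
        ["A: Informal techniques (5. Test spec techniques)"],
    "A: Budget & time (11. Commitment & motivation)":
        ["A: Plan, spec, exec (2. Lifecycle model)"],
}

# static_order() yields prerequisites before the areas that depend on them.
for step in TopologicalSorter(DEPENDS_ON).static_order():
    print(step)
```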
Slide 22: Can add extra "key areas", lifecycle inputs & outputs, general categories
...eg TPI / TMap's four "cornerstones"
[Diagram: the 20 key areas again, now grouped under TMap's four cornerstones taken "in general": LIFECYCLE, TECHNIQUES, ORGANISATION and INFRASTRUCTURE, with added nodes for INPUTS & INFLUENCES on testing, OUTPUTS from testing, + Test data, and + Risk-Based STAR.]
Slide 23: Can go beyond the fixed questions: SWOT each subject area
(small Post-it® notes are good for this; solid borders denote items as in TPI, dashed borders denote additions)
[Diagram: two example SWOT grids. For INPUTS & INFLUENCES on STAR:
• WEAKNESSES: System requirements are agreed too late; System specs & designs are defective, just timeboxed; System specs are heavy text documents
• OPPORTUNITIES: Some managers are considering agile methods; Business analysts may be motivated by UML training
• THREATS: The most experienced business analysts are leaving, more may follow; Release dates are fixed; Can't recruit more staff
For 4. Estimating & planning:
• STRENGTHS: Monitored, and adjustments made if needed
• WEAKNESSES: Not substantiated, just "we did it as in previous project"; Too busy for well-considered estimating & planning
• THREATS: The squeeze on testing is likely to worsen]
Slide 24: Applying the thinking tools to information from SWOT analysis
• The SWOT method can be "nested", eg aggregate up from individual subject areas to the whole lifecycle (see the sketch after this slide)
[Diagram, using extracts from both the 1st & 2nd examples, flowing SWOT notes through the five thinking tools:
• CURRENT REALITY: System specs are heavy text documents; Culture of our testers is to prefer large text documents to diagrams; SDLC method does not encourage diagrams → Test specs are large & "texty" → Test coverage omissions & overlaps → Too many failures in Live
• CONFLICT RESOLUTION: Can still improve coverage at macro level with informal techniques (80/20)
• FUTURE REALITY (use Strengths to help amplify opportunities): Some managers are considering agile methods; Business analysts may be motivated by UML training
• PRE-REQUISITES (use Threats to help identify obstacles): anticipating & overcoming obstacles
• TRANSITION (action planning): STRATEGIC: Improve SDLC method; TACTICAL: Address culture by worked examples of diagrams; TACTICAL: Include tables & diagrams in test specifications; ONGOING: Techniques training & coaching]
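Aggregating nested SWOTs up to the whole lifecycle can be sketched as a simple merge; the entries come from the previous slides, but the data structure is mine:

```python
# Minimal sketch of "nesting" SWOTs: aggregate per-key-area SWOTs up to a
# whole-lifecycle view, tagging each note with its originating area.
from collections import defaultdict

def merge(swots):
    whole = defaultdict(list)
    for area, swot in swots.items():
        for quadrant, notes in swot.items():
            whole[quadrant] += [f"{note} [{area}]" for note in notes]
    return dict(whole)

swots = {
    "Estimating & planning": {
        "weaknesses": ["Too busy for well-considered estimating & planning"],
        "threats": ["The squeeze on testing is likely to worsen"],
    },
    "Inputs & influences": {
        "weaknesses": ["System requirements are agreed too late"],
        "opportunities": ["Some managers are considering agile methods"],
    },
}
print(merge(swots))
```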
Slide 25: Difficulties in problem-solving: conflict resolution (eg for documentation)
Developed further to Daich, G.T., Software Documentation Superstitions (STAREast 2002); see also Rüping, A., Agile Documentation (Wiley 2003)
[Conflict resolution diagram. Objective: documentation is to help build and maintain a fit-for-purpose system by knowing and agreeing what was built and what was tested. The CONFLICT: "We need more documentation" versus "We need less documentation".
Arguments for less: "Signed-off requirements are counterproductive to systems meeting real user needs now"; "Documented test plans are counterproductive to the best testing"; "People never read any documentation"; Specifications are like inventory, no end value.
Arguments for more: "They will when their memories have faded or when there's a contract dispute"; "Reviews are powerful at finding defects early, but it's difficult to review just speech"; "If it's not written, it can't be signed off"; "Test reports need to be formal documents"; Documentation is still needed for maintenance after go-live.
Challenging the assumptions: Documentation varies: need to distinguish necessary from unnecessary; Need to distinguish quality of documentation, not just quantity; "Will the live system be maintained by its developers?" No! Our users cannot be on-site with the project throughout; "Are test analysts writing tests for others to run?" No! Can mix exploratory & scripted testing; "Sign-off can be by agreed meeting outcomes"; "Are there few enough people to make frequent widespread meetings practical?" No! "What documentation is needed for contractual reasons? Still time to negotiate?" Yes!
Injections: Agree in a workshop what documentation is needed; Documentation doesn't have to be paper: use wikis etc; Make maximum use of tables & diagrams. (See the sketch after this slide.)]
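The conflict-resolution ("evaporating cloud") diagram can be captured as plain data, which makes the injections easy to track as actions. A minimal sketch of this slide's documentation example; the field names are mine:

```python
# Minimal sketch of a conflict-resolution diagram as plain data,
# populated from the documentation example on this slide.

cloud = {
    "objective": ("Help build and maintain a fit-for-purpose system by "
                  "knowing and agreeing what was built and what was tested"),
    "requirements": [
        "Reviews find defects early, but it's difficult to review just speech",
        "Documentation is still needed for maintenance after go-live",
    ],
    "conflicting_prerequisites": ("We need more documentation",
                                  "We need less documentation"),
    "injections": [  # assumptions we attack to dissolve the conflict
        "Agree in a workshop what documentation is needed",
        "Documentation doesn't have to be paper: use wikis etc",
        "Make maximum use of tables & diagrams",
        "Sign-off can be by agreed meeting outcomes",
    ],
}

for injection in cloud["injections"]:
    print("Injection:", injection)
```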
Slide 26: Not only process improvement: we can apply the thinking tools to defining "appropriate" practices!
[Diagram: the five thinking tools relabelled for practice definition; the numbered parts (1, 2a, 2b, 3, 4, 5a, 5b) are expanded on the following slides:
• 1. CURRENT REALITY: context; root causes → intermediate effects → effects; starting points are "methodology unhappy with" (→ actions) and "unsure how best to test" (→ conditions)
• 2a/2b. CONFLICT RESOLUTION: always-good principles (upper level) and good practices in this context (lower level); objectives ← requirements ← sub-requirements (valid & invalid assumptions) + INJECTIONS; extremes; POSITIONING + justifications
• 3. FUTURE REALITY: what "appropriate" means in this context; REALITY + injections → intermediate effects → desired effects
• 4. PRE-REQUISITES: questions to consider; obstacles → intermediate sub-prerequisites → sub-prerequisite; "prerequisites"
• 5a/5b. TRANSITION: choice categories & actions; choice categories + NEEDS + INTERACTIONS; specific actions → intermediate effects → sub-objectives; actions & sub-objectives]
Slide 27: This is a structure I argued could build a bridge between "best practices" and context-driven
[Diagram: at one extreme, "Best Practice" (risk: "fossilised thinking"); at the other, Context-Driven (risk: "formalised sloppiness"). Unifying points: "always-good" principles; constraints, requirements, objectives etc; Goldratt's "thinking tools". The bridge: expert pragmatism with structure, spanning What and How.]
Slide 28: Context (CURRENT REALITY)
[Diagram (part 1 of the framework): the context factors, all of them SCOPE, COST, TIME, QUALITY / RISK FACTORS:
• Business/organisation sector; Nation (eg USA); Corporate culture; Technology; Application type
• Legal constraints: regulation, standards
• Moral constraints, eg: human safety; money, property; convenience
• Process constraints, eg: quality management, configuration management
• Job type & size: project/programme; bespoke/product; new/maintenance
• Resources: money (→ skills, environments); time
Starting points: methodology happy with; methodology unhappy with; unsure how best to test]
Slide 29: Always-good principles (CONFLICT RESOLUTION upper level)
[Diagram (part 2a): always-good principles spanning Effectiveness, Efficiency, Risk management, Quality management, Insurance and Assurance:
• V-model: what testing against; W-model: quality management
• Risks: list & evaluate; prioritise tests based on risks; tailor risks & priorities etc to factors
• Define & detect errors (UT, IT, ST); give confidence (AT)
• Refine test specifications progressively: plan based on priorities & constraints; design flexible tests to fit; allow appropriate script format(s); use synthetic + lifelike data
• Allow & assess for coverage changes; document execution & management procedures
• Distinguish problems from change requests; prioritise urgency & importance; distinguish retesting from regression testing
• Use handover & acceptance criteria; define & measure test coverage; measure progress & problem significance
• Be pragmatic over quality targets; quantify residual risks & confidence
• Decide process targets & improve over time; define & use metrics; assess where errors were originally made
• Define & agree roles & responsibilities; use appropriate skills mix; use independent system & acceptance testers
• Use appropriate techniques & patterns; plan early, then rehearse-run, acceptance tests; use appropriate tools; optimise efficiency]
Slide 30: Conflicting interpretations of these principles
The next diagram will take each box from the previous diagram and assess it on a formal-informal continuum. In preparation for this: what do we mean by "formality"?
• adherence to standards and/or proprietary methods
• detail
• amount of documentation
• scientific-ness
• degree of control
• repeatability
• consistency
• contracted-ness
• trained-ness and certification of staff
• "ceremony", eg degree to which tests need to be witnessed, results audited, progress reported
• any others?
Slide 31: Good practices in this context (CONFLICT RESOLUTION lower level)
[Diagram (part 2b): APPROPRIATE USE OF the V-model (what testing against), drawn as a conflict: "Use a waterfall V-model" versus "Don't use a V-model".
Reasons to use one: We want baselines to test against; We want to test the viewpoints of users, someone expert & independent, designers and programmers; Different levels mitigate different risks; Two heads are better than one; All systems are integrated from parts; MANY PEOPLE STAND BY THE V-MODEL.
Reasons not to: The V-model is discredited; We want to be trendy, anyway; We're too lazy to think; We're doing adaptive development (no specs); We're doing iterative development; We're object-oriented; Documentation must be minimised; We have little time.
Injections dissolving the conflict: NEED NOT BE 4 LEVELS; NEED NOT BE 1-1 CORRESPONDENCE OF SPECS TO LEVELS; MULTIPLE PARTIAL PASSES; THEY ARE LEVELS, NOT STAGES; SOME SPECS ARE OUT OF DATE / IMPERFECT, BUT WE COPE; CAN USE EXPLORATORY TESTING AGAINST A CONSENSUS BASIS; THE V-MODEL IS IMPLICIT IN BINDER'S BOOK Testing OO Systems: Models, Patterns & Tools.]
Slide 32: What "appropriate" means in this context (FUTURE REALITY)
[Diagram (part 3), continuing the example: a V-model with only 3 levels: acceptance (v. consensus), system (v. spec) and unit (informal); NEED NOT BE 4 LEVELS. Because: Our system is very simple, so we don't need a separate integration test level; The system has users, (potentially) expert & independent testers, designers (where significant) and programmers; Our user requirements are out of date and were vague when written; Our programmers hate documentation; but we do have a good functional spec and independent testers available, so we do need separate development & acceptance test levels.]
Slide 33: And so on... overall what we have done is deconstruct then reconstruct: the framework is a "meta-V-model"
[Diagram: a V shape. Going down (deconstruction): all possible contexts → your context (CURRENT REALITY), then each practice to examine (CONFLICT RESOLUTION upper & lower) → what "appropriate" means in your context (FUTURE REALITY). Coming back up (reconstruction): questions to consider (PRE-REQUISITES) → choices → choice categories & actions (TRANSITION upper & lower).]
Slide 34: Conclusions
• Summary:
– Toyota's success (and penetration of Just In Time)
– "The Goldratt Trilogy":
• 1. Things that flow through a process: inventory, value
• 2. Constraints on process, and thinking tools to improve
• 3. From process improvement to process definition, eg context-driven
• Lessons learned: three papers are enough?
• Take away: read the references; Dettmer is key
• Way forward: examples!
Slide 35: Key references
• Context-Driven: Kaner, Bach & Pettichord (2002), Lessons Learned in Software Testing, Wiley
• Best Practice: ISEB, ISTQB??
• My inspiration:
– Jens Pas (EuroSTAR 1998), Software Testing Metrics
– Gregory Daich (STAREast 2002), Software Documentation Superstitions
• Theory of Constraints understanding: Eliyahu M. Goldratt: The Goal (1984, then 1992, with Jeff Cox); The Race (1986, with R. Fox); Critical Chain (1997)
• TOC overview and the thinking tools: H. William Dettmer (1997), Goldratt's Theory of Constraints: A Systems Approach to Continuous Improvement, ASQ
• Related (but differently-specialised) thinking from the Agile community:
– Alistair Cockburn: A Methodology per Project, www.crystalmethodologies.org
– Mary Poppendieck: Lean Software Development: An Agile Toolkit, www.poppendieck.com