Hypothesis Based Testing: Power + Speed
DESCRIPTION
STAG Software presented a webinar on Feb 16, 2012, 2.00pm to 3.00pm EST. The objective of the webinar was to present a different approach to delivering “Clean Software” by enabling testers to leverage their intellect and thereby unleash the power to generate speed.
TRANSCRIPT
Webinar: 16 Feb 2012, 1400-1500 EST
T Ashok, Founder & CEO, STAG Software, Architect - HBT
in.linkedin.com/in/AshokSTAG | ash_thiru
Hypothesis Based Testing
Power + Speed
© 2012. STAG Software Private Limited. All rights reserved.
Our appetite for speed is yet to be satiated. So where do we go from here? Leveraging intellect is the key to more power and speed.
Hypothesis Based Testing (HBT) is a personal scientific test methodology that focuses on leveraging one’s intellect: it enables sharp goal focus and provides tools for scientific thinking to rapidly assess the cleanliness of software.
Tools and process have been significant drivers in speeding up testing. Adoption of lean principles is seen as enabling focus and continuous evolution.
The “Tsunami” effect...
1. Optimize process
2. Speed up execution
3. Leverage intellect
4. Asset reuse
Power + Speed
The “Tsunami” effect...
1. Optimize process
2. Speed up execution
3. Leverage intellect
4. Asset reuse
Power + Speed
Outputs: Cleanliness criteria, Potential Defect Types, Quality Levels, Complete test cases, Cleanliness Index, Validation suite
Clear Goal: “What to Test” & “Test for What” - Clarity of Purpose
The System (Requirements, Features, User story) has Cleanliness criteria to satisfy; it may have Potential Defect Types that impact those criteria.
“Properties of the system”: Expectations, Needs, Features, Environment, Behavior, Structure, Material
Expectations are delivered by Needs (Requirements) via Features that display Behavior, constructed from Materials in accordance with a Structure, in a given Environment.
Expectations = Cleanliness Criteria
Quality Growth - Nine staged filter (Level: Objective; Issues)
L1 Input cleanliness: that inputs are handled well (Input data handling)
L2 Input interface cleanliness: that the user interface is clean (UI issues)
L3 Structural integrity: that the internal structure is robust (Internal structural issues)
L4 Behavior correctness: that the functional behavior is correct (Functionality)
L5 Environment cleanliness: that it does not mess up the environment (Resource leaks, Compatibility...)
L6 Attributes met: that the stated attributes are met (Performance, security, volume, load...)
L7 Flow correctness: that end-to-end flows work correctly (Business flow conditions, Linkages)
L8 Clean Deployment: that it deploys well in the real environment (Compatibility, migration)
L9 End user value: that user expectations are met (User flows, experience)
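The nine-staged filter above can be sketched as an ordered sequence of level checks. The pass/fail results structure below is an illustrative assumption, not part of the published HBT material:

```python
# Sketch: HBT's nine-level staged filter as an ordered list of quality levels.
LEVELS = [
    ("L1", "Input cleanliness"),
    ("L2", "Input interface cleanliness"),
    ("L3", "Structural integrity"),
    ("L4", "Behavior correctness"),
    ("L5", "Environment cleanliness"),
    ("L6", "Attributes met"),
    ("L7", "Flow correctness"),
    ("L8", "Clean deployment"),
    ("L9", "End user value"),
]

def staged_filter(results):
    """results maps a level id (e.g. 'L3') to True/False (did its checks pass?).
    Returns the highest consecutive level reached before the first failure."""
    reached = None
    for level_id, _name in LEVELS:
        if not results.get(level_id, False):
            break
        reached = level_id
    return reached

# A build that passes L1-L4 but fails L5 (e.g. a resource leak):
print(staged_filter({f"L{i}": i <= 4 for i in range(1, 10)}))  # → L4
```

Treating the levels as an ordered filter mirrors the "quality growth" idea: later levels are only meaningful once earlier ones are clean.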
HBT - Personal scientific test methodology, powered by STEM™ - Defect detection technology
SIX stages of DOING (S1-S6), powered by EIGHT disciplines of THINKING (D1-D8)
S1: Understand expectations
S2: Understand context
S3: Formulate hypothesis
S4: Devise proof
S5: Tooling support
S6: Assess & Analyze
D1: Business value understanding
D2: Defect hypothesis
D3: Strategy & Planning
D4: Test design
D5: Tooling
D6: Visibility
D7: Execution & Reporting
D8: Analysis & Management
Power + Speed
32 core concepts
D1 Business value understanding: Landscaping, Viewpoints, Reductionist principle, Interaction matrix, Operational profiling, Attribute analysis, GQM
D2 Defect hypothesis: EFF model (Error-Fault-Failure), Defect centricity principle, Negative thinking, Orthogonality principle, Defect typing
D3 Test strategy & planning: Orthogonality principle, Tooling needs assessment, Defect centered AB, Quality growth principle, Techniques landscape, Process landscape
D4 Test design: Reductionist principle, Input granularity principle, Box model, Behavior-Stimuli approach, Techniques landscape, Complexity assessment, Operational profiling
D5 Tooling: Automation complexity assessment, Minimal babysitting principle, Separation of concerns, Tooling needs analysis
D6 Visibility: GQM, Quality quantification model
D7 Execution & Reporting: Contextual awareness, Defect rating principle
D8 Analysis & Management: Gating principle, Cycle scoping
Hypothesis Based Testing (HBT): Expectations (S1, S2) lead to Cleanliness criteria and Potential defect types (S3), which drive Complete test cases (S4), Sensible automation (S5), and Goal directed measures with Staged & purposeful detection (S6).
Clear Baseline: set a clear goal for quality.
Example: “Clean Water” implies
1. Colorless
2. No suspended particles
3. No bacteria
4. Odorless
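The “clean water” analogy can be sketched as executable cleanliness criteria. The sample fields and thresholds below are hypothetical, chosen only to make the idea concrete:

```python
# Hypothetical sketch: cleanliness criteria as named predicates over a sample.
water_criteria = {
    "colorless": lambda s: s["color"] == "none",
    "no suspended particles": lambda s: s["particles_ppm"] == 0,
    "no bacteria": lambda s: s["bacteria_cfu"] == 0,
    "odorless": lambda s: s["odor"] == "none",
}

def assess(sample, criteria):
    """Return the list of cleanliness criteria the sample fails."""
    return [name for name, check in criteria.items() if not check(sample)]

sample = {"color": "none", "particles_ppm": 0, "bacteria_cfu": 5, "odor": "none"}
print(assess(sample, water_criteria))  # → ['no bacteria']
```

The point of the baseline is exactly this: cleanliness is a checkable list of criteria, not a vague feeling of quality.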
S1, S2: What information (properties) can be used to identify this?
... Marketplace, Customers, End users
... Requirements (flows), Usage, Deployment
... Features, Attributes
... Stage of development, Interactions
... Environment, Architecture
... Behavior, Structure
A goal focused approach to cleanliness
A scientific approach to hypothesizing defects looks at FIVE aspects (Data, Logic, Structure, Environment & Usage) from THREE views (Error injection, Fault proneness & Failure).
Use STEM core concepts: Negative thinking (Aspect), EFF Model (View).
Identify potential defect types that can impede cleanliness.
Example: Data validation, Timeouts, Resource leakage, Calculation, Storage, Presentation, Transactional ...
S3
“A Holmes-ian way of looking at properties of elements”
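The five aspects crossed with the three views give a fifteen-cell grid for defect hypothesizing. A minimal sketch of that grid, with the cell contents left for the tester to fill in:

```python
from itertools import product

# Sketch: the 5x3 defect-hypothesis grid from STEM's Negative thinking
# (aspects) and EFF model (views). Cell contents are illustrative.
ASPECTS = ["Data", "Logic", "Structure", "Environment", "Usage"]
VIEWS = ["Error injection", "Fault proneness", "Failure"]

grid = {(a, v): [] for a, v in product(ASPECTS, VIEWS)}

# Example hypotheses, using defect types named on the slide:
grid[("Data", "Failure")].append("Data validation")
grid[("Environment", "Failure")].append("Resource leakage")

print(len(grid))  # → 15
```

Enumerating all fifteen combinations is what makes the hypothesizing systematic rather than ad hoc.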
Levels, Types & Techniques - STRATEGY
Quality Levels: the NINE levels to cleanliness (L1-L9).
Potential Defect Types (PDT1-PDT7) are mapped to the quality levels at which they should be caught; Test Types (TT1-TT5) are chosen to uncover those defect types; Test Techniques (T1-T4) power the test types.
S4
“Fractional distillation of bug mixture”
Countable test cases & Fault coverage
FAULT COVERAGE: test cases for a given requirement shall have the ability to detect specific types of defects.
Requirements & fault traceability (illustrative): requirements (R1, R2, R3) trace to potential defect types (PDT1, PDT2, PDT3) and on to test scenarios and cases, e.g. TS1 (TC1, 2, 3), TS2 (TC4, 5, 6, 7).
Use STEM core concepts (Box model, Behavior-Stimuli approach, Techniques landscape, Coverage evaluation) to model behavior, create behavior scenarios, and create stimuli (test cases).
Irrespective of who designs, the number of scenarios/cases shall be the same: COUNTABLE.
Guarantee test adequacy; “guarantee” implies that the means to the end is rational & provable.
S4
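A traceability record that links each scenario both to a requirement and to the defect type it is designed to detect can be sketched as below; the rows are illustrative, not from the case study:

```python
# Sketch: requirements-and-fault traceability rows of
# (requirement, potential defect type, scenario, test cases).
traceability = [
    ("R1", "PDT1", "TS1", ["TC1", "TC2", "TC3"]),
    ("R2", "PDT2", "TS2", ["TC4", "TC5", "TC6", "TC7"]),
]

def fault_coverage(traceability, all_pdts):
    """Fraction of hypothesized defect types covered by some scenario."""
    covered = {pdt for _req, pdt, _ts, _tcs in traceability}
    return len(covered) / len(all_pdts)

print(round(fault_coverage(traceability, ["PDT1", "PDT2", "PDT3"]), 2))  # → 0.67
```

Because the defect types are enumerated up front, fault coverage becomes a computable number rather than a matter of opinion, which is what makes the test-case count stable across designers.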
HBT Test Case Architecture
Test cases are organized by quality level, sub-ordered by item (features/modules, ...), segregated by type, ranked by importance/priority, sub-divided into conformance (+) and robustness (-), classified by early (smoke) or late-stage evaluation, tagged by evaluation frequency, linked by optimal execution order, and classified by execution mode (manual/automated).
A well-architected set of test cases is like an effective bait that can ‘attract’ defects in the system. It is equally important that they are well organized, to enable execution optimization, and carry the right information to make automation easy.
The nine attributes: Level, Item, Type, Priority, Focus, Stage, Frequency, Order, Mode
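The nine attributes lend themselves to a simple record type. The field values below are illustrative examples, not values prescribed by HBT:

```python
from dataclasses import dataclass

# Sketch: one possible record for HBT's nine test-case attributes.
@dataclass
class TestCase:
    level: int      # quality level L1-L9
    item: str       # feature/module
    type: str       # test type
    priority: int   # importance (1 = highest)
    focus: str      # "conformance" (+) or "robustness" (-)
    stage: str      # "smoke" (early) or "late"
    frequency: str  # evaluation frequency, e.g. "every-cycle"
    order: int      # optimal execution order
    mode: str       # "manual" or "automated"

suite = [
    TestCase(4, "login", "functional", 1, "conformance", "smoke", "every-cycle", 2, "automated"),
    TestCase(1, "login", "functional", 1, "robustness", "smoke", "every-cycle", 1, "automated"),
]

# Organize by level, then by stated execution order:
suite.sort(key=lambda tc: (tc.level, tc.order))
print([tc.level for tc in suite])  # → [1, 4]
```

Carrying all nine attributes on every case is what lets a tool, rather than a person, decide what to run, in what order, and when.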
Focused scenarios + Good Automation Architecture
Level-based test scenarios yield shorter scripts that are more flexible to change and easier to maintain.
S5
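One way to read "level-based scenarios" in automation terms is to tag each script with the quality level it serves and select by level; the script names and tags below are invented for illustration:

```python
# Sketch (names assumed): automation scripts tagged with the quality level
# they exercise, so a run can be scoped to one level at a time.
scripts = {
    "validate_inputs": "L1",
    "check_ui_fields": "L2",
    "verify_order_flow": "L7",
}

def scripts_for_levels(scripts, levels):
    """Select the scripts serving any of the given quality levels."""
    return sorted(name for name, lvl in scripts.items() if lvl in levels)

# Run only early-level checks on a fresh build:
print(scripts_for_levels(scripts, {"L1", "L2"}))  # → ['check_ui_fields', 'validate_inputs']
```

Scoping each script to a single level keeps it short, and a change at one level touches only that level's scripts.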
“Cleanliness Index” - Improved visibility
Quality report: each requirement (R1-R5) is assessed against the cleanliness criteria (CC1-CC4) and marked Met, Partially met, or Not met. Potential defect types (PDT1-PDT10) and test types (TT1-TT8), organized by quality level (L1-L4, ...) and stage, roll up into a single measure of Cleanliness.
S6
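A minimal sketch of rolling per-criterion statuses up into a single index; the status weights are assumptions for illustration, not part of the published HBT material:

```python
# Sketch: a simple cleanliness index. Status weights are assumed.
STATUS_SCORE = {"met": 1.0, "partially met": 0.5, "not met": 0.0}

def cleanliness_index(criteria_status):
    """criteria_status maps a cleanliness criterion to its status string.
    Returns a 0-100 index."""
    scores = [STATUS_SCORE[s] for s in criteria_status.values()]
    return round(100 * sum(scores) / len(scores), 1)

report = {"CC1": "met", "CC2": "met", "CC3": "partially met", "CC4": "not met"}
print(cleanliness_index(report))  # → 62.5
```

A real roll-up would likely weight criteria by level and stage, but even this flat version turns a page of Met/Not met marks into one number a stakeholder can track.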
HBT - A Case Study: two teams, one using HBT, the other a conventional approach.

Effort details (person-hours):
Stage                    HBT   Normal
Test analysis & Design    30       20
Front loading of effort resulted in lowering support cost.

Test case details:
Module    HBT   Normal   Increase
M1        100       28       257%
M2         85       52        63%
M3         95       66        44%
M4        132       72        83%
M5        127       28       354%
M6        855      116       637%
TOTAL    1394      362       285%
Nearly a 3x increase in the number of test cases, increasing the probability of a higher defect yield.

Test case details (HBT):
Module   Total   Positive   Negative
M1         100         59         41
M2          85         68         17
M3          95         67         28
M4         132        112         20
M5         127         85         42
M6         855        749        106
TOTAL     1394       1140        254
2x improvement in negative cases, increasing the probability of a better defect yield.

Defect details:
#Defects: HBT 32 (20 major, 12 minor) vs Normal 16, a 100% increase. Of these 32 defects, a few were residual defects, one being critical enough to corrupt the entire data.
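The "Increase" column above can be re-derived directly from the per-module counts:

```python
# Re-deriving the case study's "Increase" column from the test-case counts.
hbt = {"M1": 100, "M2": 85, "M3": 95, "M4": 132, "M5": 127, "M6": 855}
normal = {"M1": 28, "M2": 52, "M3": 66, "M4": 72, "M5": 28, "M6": 116}

def pct_increase(new, old):
    """Percentage increase of new over old, rounded to a whole percent."""
    return round(100 * (new - old) / old)

print(pct_increase(sum(hbt.values()), sum(normal.values())))  # → 285
print(pct_increase(hbt["M1"], normal["M1"]))                  # → 257
```

The totals (1394 vs 362) reproduce the reported 285% overall increase, i.e. the "nearly 3x" claim.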
HBT Results
Re-architecting test assets increases test coverage by 250%
50%-100% reduction in post-release defects
30% reduction in defect leakage from the early stages
‘Holes’ found & fixed at requirement stage
Test assessment accelerates integration, de-risks deployment
Smart automation - 3x reduction in time
Deskilling: less experienced staff do better, ramp up faster, and cost less
Summarizing...
1. Optimize process
2. Speed up execution
3. Leverage intellect
4. Asset reuse
Outputs: Cleanliness criteria, Potential Defect Types, Quality Levels, Complete test cases, Cleanliness Index, Validation suite
Power + Speed
Follow us @stagsoft
Check out our blog at www.stagsoftware.com/blog
Thank you!