An Introduction to Hypothesis Based Testing
Post on 05-Dec-2014
Copyright STAG Software Private Limited, 2010-11 www.stagsoftware.com
Introduction to HBT“Hypothesis Based testing”A scientific personal test methodology to delivering clean software
Test Methodologies in vogue
…focus on activities
…are driven by process
…powered by tools
…depend on experience
Test methodologies focus on activities, driven by process and powered by tools; yet successful outcomes still depend heavily on experience.
Testing - A Fishing Analogy
Assume that you are a fisherman (or fisherwoman). You fish in a lake and make Rs 100 per day. What do you have to do to get higher returns, say Rs 200 per day?
Constraints. You cannot: fish anywhere else; control the selling price; or increase the production of fish.
What would you do to extract higher business value (returns)?
HBT in a nutshell
What ‘fishes’ to catch → What defects are you looking for?
When and where to catch → When is each defect detectable earliest? Formulate a staged quality growth model
Big net, small holes → Create a ‘complete’ set of test cases
Cover more, move fast, course correct → Measure the right stuff & course correct; use appropriate tooling
Focus on the goal & then on the activities
Assess & Analyze
Hypothesis-based Testing
...is a scientific personal test methodology powered by a defect detection technology that enables an individual to rapidly & effectively deliver “Clean Software”
Goal: What is clean software? Identify ‘Cleanliness Criteria’
Hypothesize PDT → Devise Proof → Tooling Support
How is HBT different from other methodologies?
Typical: Activities, powered by experience, hopefully result in the Goal.
HBT: The Goal drives Activities, powered by defect detection technology (STEM).
COMPARING METHODOLOGIES
Each methodology is assessed on engineering attributes (Effective, Efficient, Consistent) and management attributes (Scalable, Visible, Agile).

Methodology       | Characteristic
Process driven    | Well laid process
Domain centered   | Experience based
Ad-hoc            | Individual creativity
Exploratory       | Individual analytical skills
Automation driven | Tool based
Agile             | Frequent evaluation
HBT               | Scientific & goal-focussed
Needs
Expectations
1. Be clear where you want to go! (Clear goal)
End users: Requirements, Features, Attributes, Usage
Marketplace: Environment, Business value
Cleanliness criteria: What do I want? How good?
Example: Clean Water implies (1) colourless, (2) no suspended particles, (3) no bacteria, (4) odourless
Accelerate understanding/ramp-up
2. Know the route clearly
3. Use learnings from others
Cleanliness criteria
What types of defects do I need to uncover?
Example: Data validation, Timeouts, Resource leakage, Calculation, Storage, Presentation, Transactional ...
Hypothesize potential defect types → Potential defect types
Accelerate goal clarity
4. Find a shorter route (optimal)
Potential defect types (PDT1 .. PDT7) are mapped to test types (TT1 .. TT5) and grouped into quality levels (QL1, QL2, QL3). (PDT: Potential Defect Types)
Staged & purposeful detection
Optimize testing
5. Drive carefully
6. Detour less
Staged & purposeful detection
Cover more ground
Test Scenarios/Cases: TS1 → TC1,2,3; TS2 → TC4,5,6,7
Traced to requirements (R1, R2, R3) and potential defect types (PDT1, PDT2, PDT3)
Requirements & fault traceability
Complete test cases
7. Buy a faster vehicle
8. Use a good vehicle & fuel (good technology)
Staged & purposeful detection
Move fast
Better ROI
Sensible automation
Tooling and scripts
9. Keep track of where you are
10. Negotiate trouble quickly
Course correct quickly
Complete test cases
Sensible automation
Quality Index (QL1, QL2, QL3)
Quality, Progress & Risk
Goal directed measures
Cover more ground
Optimize testing
Complete test cases, Sensible automation, Goal directed measures, Staged & purposeful detection, Potential defect types, Cleanliness criteria, Course correct quickly, Move fast, Accelerate understanding, Accelerate goal clarity
Expectations
HBT Pictorial
Cleanliness criteria (functional & non-functional aspects, derived from Needs & Expectations)
→ Potential Defect Types (PDT), grouped into quality levels (PDT1 .. PDT7 across QL1 .. QL3)
→ Test types (TT1 .. TT5) addressing the PDTs
→ Test Scenarios/Cases (TS1 → TC1,2,3; TS2 → TC4,5,6,7), traced to requirements (R1 .. R3) and PDTs (PDT1 .. PDT3)
→ Quality Index (QL1, QL2, QL3), supported by tooling and scripts
Hypothesis Based Testing (HBT): a goal-focused methodology for validation
Consists of SIX stages of “doing”:
S1. Understand EXPECTATIONS
S2. Understand CONTEXT
S3. Formulate HYPOTHESIS
S4. Devise PROOF
S5. Tooling SUPPORT
S6. Assess & ANALYZE
The central theme of HBT is “hypothesize potential defects that can cause loss of expectations, and prove that they will not exist”.
The focus is on the goal and how we shall achieve it, rather than on the various activities, i.e. goal-centric vs. activity-based.
HBT & STEM
The SIX stages of “doing” (S1 .. S6: Understand EXPECTATIONS, Understand CONTEXT, Formulate HYPOTHESIS, Devise PROOF, Tooling SUPPORT, Assess & ANALYZE) are powered by the EIGHT thinking disciplines of STEM (D1 .. D8).
STEM (the “method”): a particular way of doing something; the ‘defect detection technology from STAG’.
HBT (the “methodology”): a system of ways of doing; a ‘goal centered scientific approach to validation’.
GOAL: ‘deliver clean software quickly & cost-effectively’.
STEM 2.0: STAG Test Engineering Method
Consists of EIGHT disciplines and THIRTY-TWO scientific concepts (the STEM Core).
A discipline consists of steps, each of which is aided by scientific concept(s).
D1 Business value understanding
D2 Defect hypothesis
D3 Strategy & planning
D4 Test design
D5 Tooling
D6 Visibility
D7 Execution & reporting
D8 Analysis & management
STEM Core - Provides Scientific Basis Consists of 32 Core Concepts
D1 Business value understanding: Landscaping, Viewpoints, Reductionist principle, Interaction matrix, Operational profiling, Attribute analysis, GQM
D2 Defect hypothesis: EFF model (Error-Fault-Failure), Defect centricity principle, Negative thinking, Orthogonality principle, Defect typing
D3 Test strategy & planning: Orthogonality principle, Tooling needs assessment, Defect centered AB, Quality growth principle, Techniques landscape, Process landscape
D4 Test design: Reductionist principle, Input granularity principle, Box model, Behavior-Stimuli approach, Techniques landscape, Complexity assessment, Operational profiling, Test coverage evaluation
STEM Core - Provides Scientific Basis Consists of 32 Core Concepts
D5 Tooling: Automation complexity assessment, Minimal babysitting principle, Separation of concerns, Tooling needs analysis
D6 Visibility: GQM, Quality quantification model
D7 Execution & Reporting: Contextual awareness, Defect rating principle
D8 Analysis & Management: Gating principle, Cycle scoping
Needs & Expectations
End users have Needs and Expectations.
Needs: result in features in the software, implemented using technology(ies) by developers; construction oriented (user requirements / technical specifications).
Expectations: how well should the needs be met? This is the focus of the test staff, and is normally not as clear or purposeful.
HBT enables extraction of Cleanliness Criteria to set up a clear goal.
HBT Overview
Inputs: Needs & Expectations, User types, Requirements, Features, Attributes, Usage profile, Business logic, Data
Flow: Cleanliness Criteria → Potential Defect Types → Quality Levels → Test Types → Test Techniques → Test Scenarios/Cases (req. traceability, fault traceability) → Test scripts (tooling architecture) → Cycle scoping → Test outcome (metrics) → Quality index → Risk assessment
A pictorial on HBT
(Repeats the pictorial shown on the earlier “HBT Pictorial” slide.)
Understand Expectations (S1)
Understand the marketplace for the software
Identify end user types & #users for each type
Identify business requirements for each user type
Understand the technology(ies) used
Understand the deployment environment
STEM Core concepts: Landscapes, Viewpoints
Outcomes: Overview document, Requirement/feature map, User type list
Understand Context (S2)
Identify technical features and baseline them
Prioritize value of end user(s) and features
Identify critical success factors
Ensure attributes are testable
Understand dependencies
Understand the profile of usage
Set up cleanliness criteria
STEM Core concepts: Reductionist principle, Interaction matrix, Operational profiling, Attribute analysis, GQM (Goal-Question-Metric)
Outcomes: Feature list, Value prioritization matrix, Usage profile, Key attributes list, Cleanliness assessment criteria
Formulate Hypothesis (S3)
Identify potential defects (PD) due to data and logic
Identify PD due to structure and technology
Identify error injection opportunities and therefore PD
Identify potential failures and therefore PD
Identify potential faults based on usage
Group PDs to form PDTs; map PDTs to requirements/features
(PD = Potential defects, PDT = Potential defect types)
STEM Core concepts: EFF model (Error-Fault-Failure), Defect centricity principle, Negative thinking, Orthogonality principle, Defect typing
Outcomes: Potential defects catalog, Fault propagation chart, Fault traceability matrix
Devise Proof (S4), Part 1 of 3: Test strategy and plan
Understand scope; formulate quality levels
Identify the types of test to be performed
Identify test techniques
Identify the defect detection process
Identify tooling needs
Formulate cycles and their scope
Estimate effort using defect-based activity breakdown
Identify risks
STEM Core concepts: Orthogonality principle, Tooling needs assessment, Defect centered AB, Quality growth principle, Techniques landscape, Process landscape
Outcomes: Test strategy, Test plan
Devise Proof (S4), Part 2 of 3: Test design
Identify the test level to design for; consider & identify entities
Partition each entity & understand the business logic/data
Model the intended behavior semi-formally
Generate the test scenarios
For each scenario, generate test cases
Refine scenarios/cases using structural properties
Trace the scenarios to the PDTs (requirement tracing is built in)
Assess test adequacy by fault coverage analysis
STEM Core concepts: Reductionist principle, Input granularity principle, Box model, Behavior-Stimuli approach, Techniques landscape, Complexity assessment, Operational profiling, Test coverage evaluation
Outcomes: Test scenarios and cases (conforming to the HBT test case architecture), Fault traceability matrix, Requirements traceability matrix
Devise Proof (S4), Part 3 of 3: Metrics design
Identify progress aspects and adequacy (coverage) aspects
For each of these aspects, identify the intended goal to meet
For each of these goals, identify questions to ask
To answer these questions, identify metrics
Identify when you want to measure and how to measure
STEM Core concepts: GQM, Quality quantification model
Outcomes: Measurements chart
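The GQM chain described above (aspect → goal → questions → metrics) can be laid out directly as data. A minimal sketch in Python; the goals, questions, and metric names below are illustrative examples, not STEM's own content:

```python
# Goal-Question-Metric: each measurement aspect gets a goal, each goal gets
# questions, and each question is answered by one or more metrics.
# All content here is illustrative.
measurements_chart = {
    "adequacy (coverage)": {
        "goal": "ensure hypothesized defect types are covered",
        "questions": {
            "what fraction of PDTs have test cases?": ["PDT coverage %"],
        },
    },
    "progress": {
        "goal": "know how far execution has advanced",
        "questions": {
            "how many planned cases have run?": ["executed/planned ratio"],
            "is quality growing across cycles?": ["quality index per cycle"],
        },
    },
}

# Flatten the chart into the list of metrics to collect.
all_metrics = [m for aspect in measurements_chart.values()
               for ms in aspect["questions"].values() for m in ms]
```

Starting from the aspects rather than from the metrics keeps every measurement tied to a question someone actually needs answered, which is the point of the GQM discipline.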
Devise Proof: Quality levels and Test Types
Potential defect types (PDT1 .. PDT10) are grouped into quality levels (QL1 .. QL4); each group is addressed by test types (TT1 .. TT8). Cleanliness grows stage by stage (stage on one axis, cleanliness on the other) as each quality level is cleared.
Devise Proof: Requirements traceability
Requirements (R1, R2, R3, ..., Rm) are mapped to test cases (TC1, TC2, TC3, ..., TCi):
Every test case is mapped to a requirement, (OR) every requirement does indeed have a test case.
The intention is to ensure that each requirement can indeed be validated.
This is seen as an indicator of “test adequacy”.
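The two conditions above can be checked mechanically once the mapping is recorded. A minimal sketch in Python; the requirement and test-case data are illustrative:

```python
# Requirements traceability: map each test case to the requirement it validates.
# The data below is illustrative.
tc_to_req = {"TC1": "R1", "TC2": "R1", "TC3": "R2", "TC4": "R3"}
requirements = {"R1", "R2", "R3"}

# Condition 1: every test case is mapped to a known requirement.
assert all(r in requirements for r in tc_to_req.values())

# Condition 2: every requirement does indeed have a test case.
uncovered = requirements - set(tc_to_req.values())
assert not uncovered, f"Requirements without test cases: {uncovered}"
```

In practice this is exactly what a requirements traceability matrix (RTM) encodes; the asserts fail the moment a requirement is left without a validating test case.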
Devise Proof: Fault traceability
Map potential defects (PD1 .. PDn) to requirements (R1 .. Rm).
Map test cases (TC1 .. TCi) to the potential defects they can detect.
Tracing the potential defects to the requirements & test cases is Fault Traceability.
It allows us to confirm that the intended potential defects can indeed be uncovered.
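The fault-traceability check can be expressed the same way: record the two mappings and ask whether any hypothesized defect has no test case able to detect it. A sketch with illustrative data (names follow the slide's PD/R/TC notation):

```python
# Fault traceability: potential defects traced to requirements and test cases.
# The data below is illustrative.
pd_to_req = {"PD1": "R1", "PD2": "R1", "PD3": "R2"}
tc_to_pds = {"TC1": {"PD1"}, "TC2": {"PD2", "PD3"}}

# Which hypothesized defects can no test case detect?
detectable = set().union(*tc_to_pds.values())
undetectable = set(pd_to_req) - detectable
assert not undetectable, f"PDs no test case can uncover: {undetectable}"
```

An empty `undetectable` set is the sufficiency evidence the next slide argues for: every hypothesized defect type has at least one test case that can catch it.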
Requirement & fault traceability
Requirements traceability links requirements (R1 .. Rm) to test cases (TC1 .. TCi); fault traceability links potential defects (PD1 .. PDn) to both.
Requirements traceability is “necessary but not sufficient”.
Assume that each requirement had just one test case. This implies that we have a good RTM, i.e. each requirement has been covered. What we do not know is whether additional test cases are needed for some of the requirements. So the RTM is a necessary condition but NOT a sufficient condition.
So, what does it take to be sufficient? If we have a clear notion of the types of defects that could affect the customer experience, and then map these to test cases, we have a Fault Traceability Matrix (FTM, as proposed by HBT). This allows us to be sure that our test cases can indeed detect those defects that will impact the customer experience.
Key concepts
A test is a collection of evaluation scenarios (ES): it consists of ES1, ES2, ES3, ES4, ...
Each ES is focussed on uncovering certain PDTs: an ES will uncover specific potential defect types (e.g. ES1 .. ES4 → PDT1, PDT2).
Each PDT can be detected by a specific test technique: detection is enabled by technique.
Quality (cleanliness) grows over stages (time): at each stage, test types target their potential defect types (TT1 → PDT1; TT2 → PDT2, PDT3; TT3 → PDT4, PDT5; TT4 → PDT6, PDT7; TT5 → PDT8), giving focused defect identification.
STEM Test Case Architecture
Test cases are categorized first by quality level (QL1, QL2, QL3 test cases) and then by test type (Test Type #1 TC, Test Type #2 TC, Test Type #3 TC), each targeting potential defect types (PDT).
Results in: excellent clarity; purposefulness, i.e. defect oriented; clear insight into quality; higher coverage.
STEM Test Case Architecture
A well architected set of test cases is like an effective bait that can ‘attract defects’ in the system.
It is equally important to ensure that they are well organized to enable execution optimization, and carry the right set of information to enable easy automation.
Test cases are: organized by quality levels, sub-ordered by items (features/modules, ...); segregated by type; ranked by importance/priority; sub-divided into conformance (+) and robustness (-); classified by early (smoke) / late-stage evaluation; tagged by evaluation frequency; linked by optimal execution order; and classified by execution mode (manual/automated).
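The organization attributes listed above amount to metadata on each test case. A sketch in Python, assuming dataclasses; the field names and sample values are my own illustration, not STEM's schema:

```python
from dataclasses import dataclass

@dataclass
class TestCase:
    tc_id: str           # e.g. "TC1"
    quality_level: int   # QL the case belongs to (1, 2, 3, ...)
    item: str            # feature/module it targets
    test_type: str       # segregated by type
    priority: int        # ranked by importance/priority
    polarity: str        # "conformance" (+) or "robustness" (-)
    stage: str           # "smoke" (early) or "late"
    frequency: str       # evaluation frequency tag
    order: int           # optimal execution order link
    mode: str            # "manual" or "automated"

cases = [
    TestCase("TC1", 1, "M1", "data validation", 1, "conformance",
             "smoke", "every-build", 1, "automated"),
    TestCase("TC2", 2, "M1", "timeouts", 2, "robustness",
             "late", "weekly", 2, "manual"),
]

# Execution optimization: e.g. pick the automated QL1 smoke cases first.
smoke_ql1 = [c.tc_id for c in cases
             if c.quality_level == 1 and c.stage == "smoke" and c.mode == "automated"]
```

With the metadata in place, "well organized" becomes a query: any execution order or automation scope is just a filter and sort over these fields.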
Test Adequacy Analysis
Test adequacy has three dimensions:
Breadth: the types of tests
Depth: the quality levels (QL1 .. QL4)
Porosity: test case “fine-ness”, i.e. conformance vs. robustness
Clear assessment (Better visibility)
Quality report: each element under test (E1 .. E5) is assessed against each cleanliness criterion (CC1 .. CC4) as Met, Not met, or Partially met.
Clear assessment implies that we are able to objectively state that an element under test is able to meet the intended cleanliness criteria
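Such a report can be summarized numerically per element. A sketch in Python; note the deck does not define a scoring scheme, so the Met = 1.0 / Partially met = 0.5 / Not met = 0.0 weights below are an assumption for illustration:

```python
# Quality report: elements (E1, E2, ...) assessed against cleanliness
# criteria (CC1 .. CC4). Data and scoring weights are illustrative.
report = {
    "E1": {"CC1": "met", "CC2": "met", "CC3": "partial", "CC4": "not met"},
    "E2": {"CC1": "met", "CC2": "partial", "CC3": "met", "CC4": "met"},
}
score = {"met": 1.0, "partial": 0.5, "not met": 0.0}

def element_quality(assessments):
    """Fraction of cleanliness criteria met by one element under test."""
    return sum(score[v] for v in assessments.values()) / len(assessments)

q_e1 = element_quality(report["E1"])  # (1 + 1 + 0.5 + 0) / 4 = 0.625
```

The per-element scores make the assessment objective in the sense the slide asks for: a stated threshold, rather than a feeling, decides whether an element meets its cleanliness criteria.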
Tooling Support (S5)
Identify automation scope
Perform tooling benefit analysis
Assess automation complexity
Evaluate tools
Design the automation architecture
Identify the order in which scenarios need to be automated
Develop scripts
Debug and baseline scripts
STEM Core concepts: Automation complexity assessment, Minimal babysitting principle, Clear separation of concerns principle
Outcomes: Needs and benefits document, Complexity assessment report, Automation architecture, Tool requirements, Automation phasing and scope, Automation scripts
Assess and Analyze (S6)
Identify test cases/scripts to be executed
Execute test cases, record outcomes
Record defects
Record the status of execution
Analyze execution progress
Quantify quality and identify risk to delivery
Update strategy, plan, scenarios, cases/scripts
Record learnings from the activity and the context
STEM Core concepts: Contextual awareness, Defect rating principle, Gating principle
Outcomes: Execution status report, Defect report, Progress report, Cleanliness report, Updated test scenarios/cases/strategy/plan, Key observations/learnings
HBT Case Study
• Web based product – Version 2.x
• One month pilot
• 5-day orientation on HBT/STEM 2.0 to the team
• Two teams were involved – HBT team and a non-HBT team
57
HBT case study - Test case details
Test case counts by module:

Module | STEM method | Normal method | Increase
M1     | 100         | 28            | 257%
M2     | 85          | 52            | 63%
M3     | 95          | 66            | 44%
M4     | 132         | 72            | 83%
M5     | 127         | 28            | 354%
M6     | 855         | 116           | 637%
TOTAL  | 1394        | 362           | 285%

Test case split (STEM method):

Module | Total | Positive | Negative
M1     | 100   | 59       | 41
M2     | 85    | 68       | 17
M3     | 95    | 67       | 28
M4     | 132   | 112      | 20
M5     | 127   | 85       | 42
M6     | 855   | 749      | 106
TOTAL  | 1394  | 1140     | 254

Nearly 3x improvement in test cases, increasing the probability of higher defect yield; 2x improvement in negative cases, increasing the probability of defect yield.
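The Increase column follows from the two count columns: increase = (STEM / Normal - 1) × 100, rounded to a whole percent. A quick check in Python using the table's own figures:

```python
# Verify the reported increase per module from the STEM and Normal counts.
counts = {  # module: (STEM method, Normal method, reported increase %)
    "M1": (100, 28, 257), "M2": (85, 52, 63), "M3": (95, 66, 44),
    "M4": (132, 72, 83), "M5": (127, 28, 354), "M6": (855, 116, 637),
    "TOTAL": (1394, 362, 285),
}
for module, (stem, normal, reported) in counts.items():
    computed = round((stem / normal - 1) * 100)
    assert computed == reported, (module, computed, reported)
```

Every row checks out, including the total: 1394 / 362 is about 3.85x, i.e. a 285% increase.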
HBT case study - Defect, effort details
Defect details: the STEM method found 32 defects vs. 16 with the normal method, a 100% increase (2x defect yield): 20 major, 12 minor. Note that of the 32 defects found, a few were residual defects; one of them was critical, corrupting the entire data.

Effort details (Test analysis & design): STEM method 30 hours*, normal method 20 hours.
*Observations:
1. The STEM team found key defects, lowering the cost of support.
2. With the normal method, higher effort would have been spent post-release.

Benefits
• PDT propelled test design
• Quality of defects yielded during the pilot was high
• Robust test design: a good input for test automation
Results & Benefits
Results
• STEM found important residual defects
• The STEM team was confident of the “guarantee”
• Slight increase in initial effort, but well compensated by the quality of defects
• The customer was happy with the results and the approach
HBT Results
Re-architecting test assets increases test coverage by 250%
50%-5x reduction in post-release defects
30% defect leakage reduction from early stage
Terse requirements: holes found & fixed at Stage #1
Test assessment accelerates integration
Smart automation - 3x reduction in time
STAG Solutions & Services based on HBT & STEM 2.0

Focus areas: Product, Organization, People
Service areas: Diagnostics & control, Productivity enhancement, Quality injection, Quality enhancement, System enhancement, Skill enhancement, Test acceleration, Optimization
Solutions: Maintenance optimization, Design asset re-engineering, Coverage enhancement, LSPS solution, UT assessment, Tool adoption, DevQ system, IT assessment, QA system, Req validation, Archi. validation, COMPASS™
Test acceleration suites: Mobile app validation suite, E-Learning validation suite, ERP validation suite, Bluetooth validation suite
Test services: Outsourced testing, Managed QA, JumpStart QA, Assessment services, Custom tooling, Functional automation, LSPS validation
Training (Corp & Retail): HBT Series, Robust TD, Purposeful strategy, Successful automation, Finishing school
STEM™ is the trademark of STAG Software Private Limited
Thank you!