TAROT 2013 Testing School – Myra Cohen Presentation
DESCRIPTION
TAROT 2013, 9th International Summer School on Training And Research On Testing, Volterra, Italy, 9-13 July 2013. These slides summarize Myra Cohen's presentation "Sampling, Re-use and Incremental Testing in Software Product Lines".
TRANSCRIPT
Sampling, Re-use and Incremental Testing in Software Product Lines
Myra B. Cohen [email protected]
University of Nebraska-Lincoln
Agenda
Introduction to Software Product Lines
SPL System Testing: the combinatorics
SPL Integration Testing
Continuous Test Suite Augmentation
Summary

Introduction to Software Product Lines
Family-Based Approach to Development
Examples: Nokia mobile phones, Boeing Bold Stroke, Future Combat Systems, Intuit, Philips
Software Product Lines
• A software product line (SPL) is a set of programs that share a significant, common (managed) set of features satisfying a market segment or mission, developed from a common set of core assets in a prescribed way
[Diagram: Products 1..n are derived from core assets, each combining commonality and variability]
Feature Oriented Domain Analysis
• SEI FODA project in the late 1980s
• Identified features (variability) as the key to software product lines
• Identified the need for artifact-independent modeling of the features in an SPL
• Introduced the feature diagram, which led to what we know today as the feature model
Feature Models
• Representation of the features and dependencies/constraints in an SPL
• Defines what is (and is not) allowable in the SPL
• Can be written as a logic equation, so SAT solvers can be used to reason about the model
• Several different notations for feature models exist
Software Product Line Engineering
• Consists of two parts:
– Domain engineering
  • Determines the reusable components/architecture for the SPL
  • Defines the commonality and variability of the SPL
– Application engineering
  • Derives products from the platform created in the domain engineering step
  • Makes use of reuse and exploits variability/commonality
Domain Testing
• Validate and verify reusable components
• No running application exists at this time - only components
• This differs from system/application testing where the entire system is tested as a unit
What is Variability?
• Commonality: the features shared by a set of systems
• Variability: the features that differ between some pair of systems
Examples of Variability
• An electric bulb can be lit or unlit
• Software applications support different languages
• A triple-band mobile phone supports three different network standards
• Can load Choco vs. MiniSat to solve constraints
Mechanisms for Implementing Variability
– Compile flags
– Properties files
– Command-line arguments
– Inheritance
– Interface definition (and information hiding)
– Design patterns (e.g., strategy)
– Connectors (e.g., in architecture)
Mobile Phone SPL
[Feature model diagram: a Phone with variation points (VP) Display Type, Viewer Type, and Camera Type; variants (V) 16MC, 8MC, and BW for the display, GV and TV for the viewer, and 2MP and 1MP for the camera; optional features Email Viewer, Camera, Video Camera, and Video Ringtones; cross-tree "requires" constraints link variants to variation points]
Testing a Product Line
• SPL engineering decreases time to market
• With shorter life cycles, the percentage of time spent in testing has increased – testing is the new bottleneck
• Much of the current work on testing SPLs focuses on testing individual instances and reusing specific test cases
Validating at the Unit Level
[Diagram: each variant of the Display Type variation point – 16MC, 8MC, BW – is validated individually]
• Focus is on reuse of artifacts during testing
• Views testing from the feature level rather than the system level
System Level Testing
Test Case: Open email in HTML format
[Diagram: the test case exercises the Viewer Type and Display Type variation points of the feature model]
Feature Interactions
Failure: caused by an interaction of data passed between the 16MC display and the textual viewer
Real Interaction Failure: NASA Mars Rover Spirit
• NASA detected failed communications on two independent channels
• Determined that the rover was continually rebooting – it risked running out of energy and/or overheating
• The fault was caused by assumptions made between two different features (a NASA component expected a third-party file system to deallocate memory) – when one feature was removed, the reboots stopped
Possible Interactions
Fault Map of Configurations
[Heat maps for versions V4 and V6: faults F1–F14 (V4) and F1–F12 (V6) on the columns, configurations C0–C60 on the rows]
Testing the Interaction Space
The number of possible interactions grows exponentially with the number of variation points.
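To make the growth concrete, here is a short counting sketch (illustrative only; the values of k, v, and t are hypothetical, not taken from the slides):

```python
from math import comb

def n_products(k, v):
    """Number of complete products for k variation points, v variants each."""
    return v ** k

def n_interactions(k, v, t):
    """Number of t-way feature-value combinations that could interact."""
    return comb(k, t) * v ** t

# The full product space grows exponentially in k, while the 2-way
# interaction count grows only quadratically in k.
for k in (5, 10, 20):
    print(k, n_products(k, 3), n_interactions(k, 3, 2))
```

This gap between v^k products and C(k, t) * v^t interactions is exactly what makes t-way sampling attractive.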
Some Other Approaches to SPL Testing
• Extended use cases for variability [Bertolino et al., SPFE 2003]
• Integration testing a SPL based on UML activity diagrams [Reis et al., FASE 2007]
• Generate test cases incrementally for a SPL [Uzuncaova et al., ISSRE 2008]
• Applying test cases for a subset of products only [Kim et al., ASE 2010]
• Applying monitoring properties for a subset of products only [Kim et al., RV 2010]
Some SPL Resources
• http://splot-research.org/
• http://www.isa.us.es/fama/
• http://sir.unl.edu
Agenda
Introduction to Software Product Lines
SPL System Testing: the combinatorics
SPL Integration Testing
Continuous Test Suite Augmentation
Summary
A Combinatorial Problem

Display            Email Viewer  Camera        Video Camera  Video Ringtones
16 Million Colors  Graphical     2 Megapixels  Yes           Yes
8 Million Colors   Text          1 Megapixel   No            No
Black and White    None          None

3 x 3 x 3 x 2 x 2 = 108
• 3³ × 2² = 108 feasible SPL instances
A Combinatorial Problem
• 3³ × 2² = 108 feasible SPL instances
• 10 features, each with 5 values: 5¹⁰ = 9,765,625 possible instances
• At 4 hours to run the test suite, testing every instance takes approximately 4,459 years
• GCC optimizer: configuration space of roughly 10⁶¹
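The arithmetic behind the 4,459-year estimate can be checked directly:

```python
instances = 5 ** 10                 # 10 features, 5 values each
hours = instances * 4               # 4 hours per full test-suite run
years = hours / (24 * 365)
print(f"{instances:,} instances -> ~{years:,.0f} years of testing")
```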
Combinatorial Interaction Testing (CIT)
• Test all pairs (or, more generally, all t-way combinations) of feature values, where t is the chosen strength of testing
• Uses a mathematical structure called a covering array
2-way CIT

   Display            Email Viewer  Camera        Video Camera  Video Ringtones
1  16 Million Colors  None          1 Megapixel   Yes           Yes
2  8 Million Colors   Text          None          No            No
3  16 Million Colors  Text          2 Megapixels  No            Yes
4  Black and White    None          None          No            Yes
5  8 Million Colors   None          2 Megapixels  Yes           No
6  16 Million Colors  Graphical     None          Yes           No
7  Black and White    Text          1 Megapixel   Yes           No
8  8 Million Colors   Graphical     1 Megapixel   No            Yes
9  Black and White    Graphical     2 Megapixels  Yes           Yes
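That the 9-row sample really covers every pair can be verified mechanically. A small checker (the abbreviated value encoding below is mine, not from the slides) confirms each pair of feature values appears in at least one row:

```python
from itertools import combinations, product

# The 2-way sample, abbreviated: Display 16M/8M/BW, Viewer G/T/N,
# Camera 2MP/1MP/N, Video Camera Y/N, Video Ringtones Y/N
sample = [
    ("16M", "N", "1MP", "Y", "Y"),
    ("8M",  "T", "N",   "N", "N"),
    ("16M", "T", "2MP", "N", "Y"),
    ("BW",  "N", "N",   "N", "Y"),
    ("8M",  "N", "2MP", "Y", "N"),
    ("16M", "G", "N",   "Y", "N"),
    ("BW",  "T", "1MP", "Y", "N"),
    ("8M",  "G", "1MP", "N", "Y"),
    ("BW",  "G", "2MP", "Y", "Y"),
]
domains = [("16M", "8M", "BW"), ("G", "T", "N"),
           ("2MP", "1MP", "N"), ("Y", "N"), ("Y", "N")]

def is_covering(rows, domains, t=2):
    """True iff every t-way combination of values appears in some row."""
    k = len(domains)
    for cols in combinations(range(k), t):
        covered = {tuple(r[c] for c in cols) for r in rows}
        for combo in product(*(domains[c] for c in cols)):
            if combo not in covered:
                return False
    return True

print(is_covering(sample, domains))
```

Only 9 rows suffice to cover every one of the pairs that the 108 full products would cover.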
Combinatorial Interaction Testing (CIT)
• Based on statistical design of experiments (DOE)
– Manufacturing
– Drug test interactions
– Chemical interactions
• For software testing
– Mandl – compiler testing
– Brownlie, Prowse, Phadke – OATS system
– D. Cohen, Dalal, Fredman, Patton, Parelius – AETG
– Williams, Probert – network node interfaces
– Yilmaz, Cohen, Porter – ACE/TAO
Empirical Results on CIT for System Inputs
• CIT has been well studied on testing input parameters for programs
• [Brownlie, Prowse, Phadke 92], [Burr, Young 98], [Burroughs, Jain, Erickson 94], [Cohen, Dalal, Fredman, Patton 97], [Dalal, Jain, Karunanithi, Leaton, Lott 98], [Dunietz, Erlich, Szablak, Mallows, Iannino 97], [Mandl 85]
• Example: $ sort -m -c input.txt
CIT for Configurable Systems
• Some studies on configurable systems
• [Kuhn, Wallace, Gallo 04] – medical devices, web browsers, data management, HTTP server
• [Yilmaz, Cohen, Porter 04, 06] – ACE/TAO CORBA middleware
• [Qu, Cohen, Rothermel 08]
CIT vs. Default Configuration of vim in SIR
[Bar chart: fault detection (%) on the y-axis (0–100) for vim versions v2–v7, comparing CIT samples against the default configuration]
How do we Generate the Samples?
• Greedy approaches – AETG, IPOG
• Search-based approaches – simulated annealing, genetic algorithms
• Use of constraint solvers
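A minimal AETG-style greedy sketch (my own simplification, not the published algorithm): generate several random candidate rows, keep the one covering the most uncovered pairs, and repeat until every pair is covered.

```python
import random
from itertools import combinations, product

def all_pairs(domains):
    """Every 2-way (column-pair, value-pair) combination to be covered."""
    return {((i, j), (vi, vj))
            for i, j in combinations(range(len(domains)), 2)
            for vi, vj in product(domains[i], domains[j])}

def pairs_of(row):
    return {((i, j), (row[i], row[j]))
            for i, j in combinations(range(len(row)), 2)}

def greedy_ca(domains, candidates=50, seed=0):
    rng = random.Random(seed)
    todo = all_pairs(domains)
    rows = []
    while todo:
        # Best of several random candidate rows, by new pairs covered
        best = max((tuple(rng.choice(d) for d in domains)
                    for _ in range(candidates)),
                   key=lambda r: len(pairs_of(r) & todo))
        if not pairs_of(best) & todo:      # guarantee progress
            (i, j), (vi, vj) = next(iter(todo))
            row = [rng.choice(d) for d in domains]
            row[i], row[j] = vi, vj
            best = tuple(row)
        rows.append(best)
        todo -= pairs_of(best)
    return rows

# Phone model: three 3-valued and two 2-valued variation points
rows = greedy_ca([("16M", "8M", "BW"), ("G", "T", "N"),
                  ("2MP", "1MP", "N"), ("Y", "N"), ("Y", "N")])
print(len(rows))   # a small sample instead of all 108 products
```

Real tools such as AETG add horizon heuristics and constraint handling on top of this basic loop.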
Meta-Heuristic Search
• Σ – the set of feasible solutions
• Evaluate cost(S) while randomly moving through the space: S1, S2, S3, …, Si
[Illustration: two candidate solutions encoded as bit strings, differing in a few positions]
Problems with Approach
• May get stuck in local optima
Meta-heuristic Algorithms
• Provide mechanisms to escape local optima
• Sometimes the algorithm can accept a worse choice in the hope we will find a better route to a good solution
Examples
• Genetic Algorithms
• Simulated Annealing
• Tabu Search
Search Based Software Engineering
• Simulated annealing is meant to simulate the cooling of metal
• Physical annealing is a process of slowly cooling a metal in baths of decreasing temperatures
• At each stage the metal molecules are allowed to come to an equilibrium; then the bath is lowered to the next temperature
• If the initial temperature is not high enough, or if the cooling is done too quickly, the metal will be brittle
Simulated Annealing
• The basic strategy is the same as a hill climb, but the meta-heuristic creates a gentle decrease in the probability of making a bad move
• At the start of the search we often make a lot of bad moves
• By the time we are close to an optimal solution we are almost in a pure hill climb (the probability of accepting a bad move drops to almost zero)
Simulated Annealing (cont.)
• Start with a solution S
• Select a new solution S′ that is a neighbor of S
• If cost(S′) − cost(S) ≤ 0, then S′ is accepted
• Otherwise, accept S′ with probability e^((cost(S) − cost(S′)) / (k_B · T)), where T is the temperature and k_B is the Boltzmann constant (a distribution borrowed from physics that we won’t actually use for simulating the physical process)
Simulated Annealing
[Illustration: a single bit flip moves from one candidate solution to a neighboring one]
The probability of accepting a bad choice depends on the cooling schedule.
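The acceptance rule can be written down directly (a generic Metropolis-style sketch, not CASA's actual code):

```python
import math
import random

def accept(delta_cost, temperature, rng=random.random):
    """Always accept an improving move; accept a worsening move with
    probability e^(-delta/T), which shrinks as the temperature cools."""
    if delta_cost <= 0:
        return True
    return rng() < math.exp(-delta_cost / temperature)

# Early in the schedule bad moves are likely; late, almost never:
for temperature in (10.0, 1.0, 0.1):
    print(temperature, math.exp(-2.0 / temperature))
```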
Finding “N”
Binary search for the “best” solution size between a lower bound L and an upper bound U, probing (L+U)/2 at each step.
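The outer binary search over the sample size N can be sketched as follows. Here `can_build` stands in for one run of the inner (greedy or annealing) search; because that search is stochastic, a failure at a given size is treated as "too small" even though it is not a proof of infeasibility.

```python
def find_n(lower, upper, can_build):
    """Binary search for the smallest size N at which can_build(N) succeeds."""
    while lower < upper:
        mid = (lower + upper) // 2
        if can_build(mid):
            upper = mid          # a sample of this size exists; try smaller
        else:
            lower = mid + 1      # assume we need more rows
    return lower

# Hypothetical oracle: any size of at least 9 rows succeeds.
print(find_n(1, 108, lambda n: n >= 9))
```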
The Real System
Constraints on Valid Configurations:
(1) Graphical email viewer requires a color display
(2) 2 Megapixel camera requires a color display
(3) Graphical email viewer not supported with the 2 Megapixel camera
(4) 8 Million color display does not support a 2 Megapixel camera
(5) Video camera requires a camera and a color display
(6) Video ringtones cannot occur with no video camera
(7) The combination of 16 Million colors, Text and 2 Megapixel camera will not be supported
The 2-way sample above must now be checked against these constraints.
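The seven constraints can be encoded and the feasible products enumerated by brute force (this is my encoding of the slide's constraints, with feature names abbreviated):

```python
from itertools import product

displays = ["16M", "8M", "BW"]
viewers  = ["Graphical", "Text", "None"]
cameras  = ["2MP", "1MP", "None"]

def valid(d, v, c, video, ringtones):
    color = d != "BW"
    if v == "Graphical" and not color:            return False  # (1)
    if c == "2MP" and not color:                  return False  # (2)
    if v == "Graphical" and c == "2MP":           return False  # (3)
    if d == "8M" and c == "2MP":                  return False  # (4)
    if video and (c == "None" or not color):      return False  # (5)
    if ringtones and not video:                   return False  # (6)
    if d == "16M" and v == "Text" and c == "2MP": return False  # (7)
    return True

feasible = [p for p in product(displays, viewers, cameras,
                               [True, False], [True, False])
            if valid(*p)]
print(len(feasible), "of", 3 * 3 * 3 * 2 * 2, "products satisfy the constraints")
```

In this encoding, only 31 of the 108 products survive the constraints, which is why constraint handling changes the sampling problem so much.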
Constraints
“Video ringtones cannot occur without a video camera” – rows 3, 4, and 8 of the 2-way sample select video ringtones without a video camera, so they violate this constraint.
Decrease Sample Size
Under constraints (1) graphical email viewer requires a color display, (2) 2 Megapixel camera requires a color display, and (3) graphical email viewer not supported with the 2 Megapixel camera, some combinations no longer need to be covered, so the sample can shrink.
Increase Sample Size
Constraints can also increase the sample size. With constraint (1) graphical email viewer requires a color display, 10 rows are needed:

    Display            Email Viewer  Camera        Video Camera  Video Ringtones
 1  16 Million Colors  None          2 Megapixels  No            No
 2  16 Million Colors  Graphical     1 Megapixel   No            No
 3  8 Million Colors   None          1 Megapixel   Yes           Yes
 4  16 Million Colors  Graphical     None          Yes           Yes
 5  16 Million Colors  Text          1 Megapixel   Yes           No
 6  Black and White    None          1 Megapixel   Yes           Yes
 7  8 Million Colors   Graphical     2 Megapixels  Yes           Yes
 8  Black and White    Text          2 Megapixels  No            Yes
 9  8 Million Colors   Text          None          No            No
10  Black and White    None          None          No            No
Constrained CIT Approach
• Initialization: determine implicit forbidden constraints
• CIT algorithm: greedy or meta-heuristic search
• Constraint checking: check partial or full configurations
Greedy Worked Well
• Extended both greedy and meta-heuristic search algorithms to find constrained CIT samples
• Used an off-the-shelf SAT solver
• Mined information learned during Boolean propagation to improve the performance of the greedy algorithm

M.B. Cohen, M.B. Dwyer and J. Shi, Constructing interaction test suites for highly-configurable systems in the presence of constraints: a greedy approach, IEEE Transactions on Software Engineering, 34(5), 2008, pp. 633-650.
Limitations
• Greedy algorithms tend to:
  • Finish quickly
  • Handle constraints well
• Meta-heuristic algorithms tend to:
  • Find smaller samples
  • Handle constraints poorly
• Software product lines may have long test times per instance and many dependencies:
  • We want a meta-heuristic search algorithm that builds small samples quickly
Reformulated Simulated Annealing CIT Algorithm
• Modified:
  – Neighborhood of search (narrows as the search progresses)
  – Global strategy to find the minimal size of the array
  – A set of minor modifications

B.J. Garvin, M.B. Cohen, and M.B. Dwyer, Evaluating Improvements to a Meta-Heuristic Search for Constrained Interaction Testing, Empirical Software Engineering (EMSE), 16(1), 2011, pp. 61-102.
CASA tool: http://cse.unl.edu/~citportal/tools/casa/
Major Modifications
• Original implementation uses two phases: (1) finding the minimal size of the sample, and (2) finding a valid sample within that size
• Modifications:
  • New search space (neighborhood)
  • Improved method for finding the size (global)
Modified Search Space (CASA)¹
• The original search space may lead to many infeasible solutions
• The new search space provides a more direct path to feasible solutions
• Reduces infeasible search paths
1. Available at: http://www.cse.unl.edu/citportal/tools/casa/
Modified Global Strategy
Case Studies (t=2)

          Average Size               Average Time (s)             Break-Even Point
Subject   Greedy  Old SA  New SA    Greedy  Old SA     New SA    Greedy vs. New SA
SPINs     27.0    20.2    19.4      0.2     414.9      8.6       1.1
SPINv     42.5    33.2    36.8      11.3    4010.7     102.1     15.9
GCC       24.7    19.8    21.1      204.0   36,801.4   1902.0    471.7
Apache    42.6    32.6    32.3      76.4    23,589.1   109.1     3.2
Bugzilla  21.8    16.0    16.2      1.9     58.0       9.1       1.3
Going Beyond CIT
• CIT is a system testing technique:
  – Need to instantiate complete products
  – Many interactions are low order
• CIT is black box:
  – Tests may or may not actually trigger interactions between features
  – No consideration of data/control flow directions
Directed vs. Undirected Interactions
In the 2-way CIT sample, row 2 covers the undirected pair (8 Million Colors, Text), but there are two directions in which to test this interaction:
• 8 million colors → Text
• Text → 8 million colors
Agenda
Introduction to Software Product Lines
SPL System Testing: the combinatorics
SPL Integration Testing
Continuous Test Suite Augmentation
Summary
Our Idea
• Test at the integration level:
  • We can test partial products
  • Incrementally increase the strength of interactions
  • Re-use results as we test
• Reduce the interaction space:
  • Identify only feasible interactions
  • Integrate features using the analysis of feasible interactions
  • Generate tests to exercise these interactions (future)
Approach

    s() {
      ...
      v();
      if (c) {
        ...
      }
      w();
      ...
    }
    f1() { ... }   f2() { ... }   f3()   f4()

• Code: program dependency graph (PDG) and control flow graph (CFG)
• Feature model: defines the possible interaction space
• Feature dependency graph (FDG)
• Interaction trees
• Symbolic execution: report faults / generate tests

J. Shi, Myra B. Cohen and Matthew B. Dwyer, Integration Testing of Software Product Lines Using Compositional Symbolic Execution, International Conference on Fundamental Approaches to Software Engineering (FASE), March 2012, pp. 270-284.
Interaction Trees

FDG: features f1, f2, f3, f4 with dependency edges f1–f3, f3–f4, f2–f4

2-way interaction trees: (f1, f3), (f3, f4), (f2, f4)
3-way interaction trees: (f1, f3, f4), (f2, f3, f4)
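The interaction-tree construction above can be sketched: given the FDG's edges, the t-way interaction trees are the connected t-subsets of features (the edge set below is my reading of the slide's figure):

```python
from itertools import combinations

# FDG from the slides: edges between dependent features (assumed reading)
edges = {frozenset(e) for e in [("f1", "f3"), ("f3", "f4"), ("f2", "f4")]}
features = {f for e in edges for f in e}

def connected(nodes):
    """True iff the induced subgraph on `nodes` is connected."""
    nodes = set(nodes)
    seen, frontier = set(), {next(iter(nodes))}
    while frontier:
        n = frontier.pop()
        seen.add(n)
        frontier |= {m for m in nodes - seen
                     if frozenset((n, m)) in edges}
    return seen == nodes

def interaction_trees(t):
    return [s for s in combinations(sorted(features), t) if connected(s)]

print(interaction_trees(2))  # the FDG edges themselves
print(interaction_trees(3))
```

Only connected subsets need symbolic composition; disconnected feature sets (such as f1 with f2) cannot interact through the FDG and are pruned.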
Hierarchical Composition
Testing Method
• Use bounded symbolic execution
• Obtain summaries from individual features
• Use interaction trees to guide the composition of all t-feature summaries
Symbolic Summaries
• We have three outcomes:
  1. Complete execution of a path (passing behavior)
  2. Exception (failing behavior) – may or may not be feasible to reach in the full system
  3. Depth bound is reached (unknown behavior)
• In this work we only accumulate type 1.
Results of (Composed) Summaries
• Constraints can be collected for each region to perform test generation (in progress)
• Example composed summary: sum(f1, f2, f4) = {(f1.x > 0, f4.ret = f1.x + 1), …}
[Diagram: paths through Feature 1 and Feature 2 are classified as complete, exception, or timed out; compositional SE combines complete summaries such as (A > 0 ∧ B > 0, return = A)]
Case Study
We performed a case study on two software product lines to answer these questions:
• RQ1: What is the reduction from our dependency analysis on the number of interactions that should be tested in an SPL?
• RQ2: What is the difference in time between our compositional symbolic technique and a traditional directed technique?
Applications Used
• SCARI: Software Defined Radio Reference Model
• GPL: Graph Product Line
Method
• RQ1: Compared the number of (directed/undirected) interactions
• RQ2: Compared directed SE vs. incremental compositional SE
• Tools:
  – Adapted the IFA in Soot
  – Direct SE – use Java PathFinder
  – Compositional approach – built a composer
• Depth:
  – 20 × t
  – Explicit for direct SE
  – Implicit for compositional
Interaction Reduction

Subject  t  UI      DI        Feas UI  Feas DI  UI Red  DI Red
SCARI    2  188     376       85       85       54.8%   77.4%
         3  532     3192      92       92       82.7%   97.1%
         4  532     12768     162      162      69.5%   98.7%
         5  532     63840     144      144      72.9%   99.8%
GPL      2  288     576       21       27       92.7%   95.3%
         3  2024    12144     29       84       98.6%   99.3%
         4  9680    232320    31       260      99.7%   99.9%
         5  33264   3991680   20       525      99.9%   100.0%

(UI = undirected interactions, DI = directed interactions. For example, for SCARI at t=5, 532 UI and 63,840 DI reduce to 144 feasible interactions each; for GPL at t=5, to 20 UI and 525 DI. Reductions range from 55–99.9% for UI and 77–99.99% for DI.)
RQ1: Summary
• We see a reduction of over 50% in undirected and over 75% in directed interactions when we use the interaction trees
Compositional vs. Directed (seconds)

Subject   t   Direct SE   Comp SE
SCARI     1   6.75        6.75
SCARI     2   14.48       9.63
SCARI     3   17.67       10.06
SCARI     4   36.09       10.93
SCARI     5   35.87       11.70
GPL       1   41.77       41.77
GPL       2   67.25       56.28
GPL       3   184.76      82.00
GPL       4   727.34      216.63
GPL       5   3887.23     965.92

(At t = 1 the two techniques take the same time; at t = 5 compositional symbolic execution takes about 1/3 of the directed time for SCARI and about 1/4 for GPL.)
Compositional vs. Directed (seconds)

[Charts: seconds vs. t-way interaction strength (t = 1–5) for SCARI and GPL, comparing Direct SE against Compositional SE]
RQ2: Summary
• As we increase the interaction strength we see a greater reduction in time when using compositional symbolic execution
• We reduce the time by a factor of 4 in the best case
• We believe this will allow for greater scalability and testing of higher strengths
Summary
• We developed an incremental compositional symbolic execution technique for integration testing of SPLs
• Leverages dependency analysis to reduce interactions by more than 50%
• Compositional symbolic execution reduces time by as much as 75%
Future Directions
• Use summaries for test generation to execute specific interactions
• Characterize areas of behavior to drive further test generation
• Extend to other types of SPLs
Agenda
SPL System Testing: the combinatorics
SPL Integration Testing
Summary
Continuous Test Suite Augmentation
Introduction to Software Product Lines
COntinuous TEst Suite Augmentation
• CONTESA
Z. Xu, M. B. Cohen, W. Motycka, G. Rothermel, Continuous Test Suite Augmentation in Software Product Lines, International Software Product Line Conference (SPLC), August, 2013, (to appear).
Approaches to Testing/Analyzing SPLs¹
• Feature based – Focusing on individual features
• Product based – Focusing on products
• Family based – Testing from family perspective
1. T. Thüm, S. Apel, C. Kästner, M. Kuhlemann, I. Schaefer, and G. Saake. Analysis strategies for software product lines. Technical report, Fakultät für Informatik, Otto-von-Guericke-Universität, April 2012.
CONTESA
• Hybrid family/product based approach
• Aims to cover all code in the family by selecting products to test
• Leverages ideas/techniques from regression testing
Software Evolves…

[Diagram: as the software changes, test cases, code changes, and coverage evolve over time]

We need to add tests to the test suite
Code-based View

int foo (int x, int y)

[Control-flow graph of foo: entry E, exit X; predicates P1: if(x>0), P2: if(x>y), P3: if(x>3), each with a T and an F edge; statements S1: return x; S2: return x+2; S3: return x; S4: return x+y;]

Test Suite: t1=(x = 2, y = 2), t2=(x = 4, y = 4), t3=(x = 1, y = 0), t4=(x = 4, y = 3), t5=(x = −1, y = 0)
Implication

int foo′ (int x, int y)

[Control-flow graphs of foo and the modified foo′ side by side: the two are identical except that P2 changes from if(x>y) to if(x>=y)]

Test Suite (unchanged): t1=(x = 2, y = 2), t2=(x = 4, y = 4), t3=(x = 1, y = 0), t4=(x = 4, y = 3), t5=(x = −1, y = 0)
We need to add tests to the test suite
1. Identify affected elements (portions of the program or its specification for which new tests are needed)
2. Create or guide the creation of tests that exercise these elements
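Both steps can be illustrated on the slide's foo′, where P2 changes from x>y to x>=y (the surrounding edge structure, P1 F → S1, P2 T → S4, P3 T → S2, P3 F → S3, is an assumed reading of the slide's CFG):

```c
#include <assert.h>

/* foo' from the slide: identical to foo except that P2 now tests
   x >= y. The T/F edge structure is an assumed reading of the CFG. */
int foo_prime(int x, int y) {
    if (x > 0) {            /* P1 */
        if (x >= y)         /* P2, modified */
            return x + y;   /* S4 */
        if (x > 3)          /* P3 */
            return x + 2;   /* S2 */
        return x;           /* S3 */
    }
    return x;               /* S1 */
}
```

Step 1, affected elements: under the old suite, t1=(2,2) and t2=(4,4) now take P2's true branch, so S2 and S3 are no longer covered. Step 2, new tests: an input such as (x=4, y=5) reaches S2 and (x=2, y=3) reaches S3 (these two inputs are illustrative choices, not from the slide).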
Test Suite Augmentation
Test Suite Augmentation for SPLs
[Diagram: a non-SPL system evolves through versions V1…V6, whereas an SPL yields a set of products A, B, C, D, E, F derived from the common platform]

Non-SPL system evolves / Products of an SPL
[Feature model: a mobile phone SPL with root Phone; variation points Viewer Type and Camera Type; variant features Display, Email Viewer, Camera, Video Camera, Video, Ringtones; leaf variants 16MC, 8MC, BW, GV, TV, 2MP, 1MP; several cross-tree requires constraints (requires_v_vp)]
CONTESA Overview

[Diagram: from the set of products (p1…p5), identify targets, order the products, then select a product and generate tests, starting from an initial test suite — continuous test suite augmentation]
CONTESA
• Test a single product at a time
• Use regression impact analysis (DejaVu) to identify common vs. variable code between the current and next product
• Generate tests to cover variable code in the new product (using a genetic algorithm)
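CONTESA's test generation uses a genetic algorithm; as a toy stand-in, the goal — find an input satisfying an uncovered target branch's path condition — can be sketched with an exhaustive search (the target condition below is a made-up example, not from the slides):

```c
#include <assert.h>

/* Toy test generation: search a small input grid for a test input
   that satisfies a target branch's path condition. The condition
   (x > 0, x <= y, x > 3) is a hypothetical example target; CONTESA
   itself uses a genetic algorithm, not exhaustive search. */
static int covers_target(int x, int y) {
    return x > 0 && !(x > y) && x > 3;
}

int find_test(int *px, int *py) {
    for (int x = -10; x <= 10; x++)
        for (int y = -10; y <= 10; y++)
            if (covers_target(x, y)) { *px = x; *py = y; return 1; }
    return 0;  /* target not reachable within the search bounds */
}
```

A GA replaces the grid walk with a fitness function (e.g. branch distance to the target) guiding mutation and crossover of candidate inputs.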
Two Orders of Testing
Both orders select the product with the largest number of uncovered branches (variable code):
• Static: determine the order at the start
• Dynamic: re-calculate after each product is tested
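The dynamic order can be sketched as a greedy re-selection over bitmasks of each product's variable branches (the bitmask representation and any concrete masks are invented for illustration; the slides do not specify a data structure):

```c
#include <assert.h>

/* Dynamic product ordering: at each step, pick the untested product
   whose variable code contains the most still-uncovered branches,
   then mark those branches covered. Branch sets are bitmasks
   (bit i set = product contains branch i). */
static int popcount(unsigned v) {
    int c = 0;
    for (; v; v >>= 1) c += v & 1;
    return c;
}

void dynamic_order(const unsigned *masks, int n, int *order) {
    unsigned covered = 0;
    int used[32] = {0};   /* supports up to 32 products */
    for (int step = 0; step < n; step++) {
        int best = -1, gain = -1;
        for (int p = 0; p < n; p++) {
            if (used[p]) continue;
            int g = popcount(masks[p] & ~covered);
            if (g > gain) { gain = g; best = p; }
        }
        used[best] = 1;
        covered |= masks[best];
        order[step] = best;
    }
}
```

The static order is the same selection done once on the full masks before any testing; the dynamic variant re-computes the gain after every product. For example, with four products whose masks are {0xF0, 0x3C, 0x0F, 0xC3}, the dynamic order tests products 0 and 2 first and thereby covers all eight branches after only two products.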
Case Study
• RQ1: Is continuous test suite augmentation more effective and efficient than generating test cases independently for each product?
• RQ2: Does the order used in continuous test suite augmentation matter in terms of effectiveness and efficiency?
Results

[Charts: actual code coverage and cumulative coverage across the tested products]
Summary
• Continuous test suite augmentation uses ideas from regression testing to drive SPL test generation
• It is a hybrid product- and family-based approach
• Improves both effectiveness and efficiency over an individual product-based approach
Agenda
SPL System Testing: the combinatorics
SPL Integration Testing
Summary
Continuous Test Suite Augmentation
Introduction to Software Product Lines
Summary of Lecture
• We have seen how combinatorics impact SPL testing
• We have seen some sampling and re-use techniques – System and integration level
• We have seen an augmentation technique aimed at the entire code-base
Students and Collaborators on this work
Isis Cabral Matthew Dwyer Brady Garvin Wayne Motycka Gregg Rothermel Xiao Qu Jiangfan Shi Zhihong Xu
Sampling, Re-use and Incremental Testing in Software Product Lines
Myra B. Cohen [email protected]
University of Nebraska-Lincoln
This work is supported in part by the National Science Foundation through awards CCF-1161767, CCF-0747009 and by the Air Force Office of Scientific Research through award FA9550-10-1-0406.