1
Evolutionary Testing Metaheuristic search techniques applied to test problems
Stella Levin, Advanced Software Tools Seminar, Tel-Aviv University, 11.2004
2
Contents
1. Introduction
2. Metaheuristic Search Techniques
3. White-Box Testing
4. Black-Box Testing
5. Object-Oriented Testing
6. Non-Functional Testing
7. Search Based Software Engineering
3
Introduction - Why testing?
“The biggest part of software cost is the cost of bugs: the cost of detecting them, the cost of correcting them, the cost of designing tests, and the cost of running those tests” - Beizer
“In embedded systems, errors could result in high risk, endangering human life, and big cost” - Wegener
4
Successful EA applications
NASA evolvable antenna
Result comparable to 12 years of work by an experienced designer
There is no guarantee that an evolved design will be as good [8]
5
1. Metaheuristic Search
Problem characteristics:
- Large solution space
- No precise algorithm, no known “best” solution
- Ability to classify one solution as “better” than another
“Due to non-linearity of software (if, loops…) test problems are converted to complex, discontinuous, non-linear search spaces”- Baresel
6
Transforming a problem into an optimization problem
Representation of a candidate solution: an individual
Fitness function for an individual
A way to move from one individual to another
7
Hill Climbing - “local” search
1. Select a point in the search space
2. Investigate neighboring points
3. If a neighbor has a better fitness value, jump to it
4. Repeat steps 2-3 until the current position has no better neighbors
Requires a definition of neighboring points
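Steps 1-4 above can be sketched in Python; the bit-string representation and the single-bit-flip neighborhood used here are illustrative assumptions, not from the slides.

```python
import random

def hill_climb(fitness, start, neighbors, max_steps=10_000):
    """Steps 1-4: start somewhere, repeatedly jump to a better
    neighbor, stop when no neighbor improves the fitness."""
    current = start
    for _ in range(max_steps):
        best = max(neighbors(current), key=fitness, default=current)
        if fitness(best) <= fitness(current):
            return current          # local optimum: no better neighbor
        current = best
    return current

# Illustrative neighborhood: all single-bit flips of a bit string.
def bit_flip_neighbors(bits):
    return [bits[:i] + (1 - bits[i],) + bits[i + 1:] for i in range(len(bits))]

ones = lambda bits: sum(bits)       # toy fitness: count of 1-bits
start = tuple(random.choice((0, 1)) for _ in range(8))
print(hill_climb(ones, start, bit_flip_neighbors))  # → (1, 1, 1, 1, 1, 1, 1, 1)
```

With this toy fitness every 0-to-1 flip improves the score, so the climb always reaches the all-ones string; on a real test problem the search can stall in a local optimum, which is why the following slides move to simulated annealing and evolutionary algorithms.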
9
Simulated Annealing
Analogy to the chemical process of cooling a material in a heat bath
If F(X2) < F(X1), move to neighbor X2; otherwise move to X2 with probability P = e^(-ΔF/T)
Initially T is high; as T decreases, the behavior becomes more like hill climbing
Requires definitions of the neighborhood and of the cooling function
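A minimal sketch of the loop above, assuming a geometric cooling schedule (one common choice; the slide does not fix a cooling function):

```python
import math
import random

def simulated_annealing(f, start, neighbor, t0=10.0, cooling=0.95, steps=5000):
    """Minimize f. Always accept improvements; accept a worse neighbor
    with probability exp(-dF/T), where T shrinks every step."""
    x, t = start, t0
    best = x
    for _ in range(steps):
        y = neighbor(x)
        df = f(y) - f(x)
        if df < 0 or random.random() < math.exp(-df / t):
            x = y
        if f(x) < f(best):
            best = x
        t = max(t * cooling, 1e-9)  # as T -> 0 this behaves like hill climbing
    return best

# Toy use: minimize (x - 3)^2 over the integers, neighbor = x +/- 1.
sol = simulated_annealing(lambda x: (x - 3) ** 2, 50, lambda x: x + random.choice((-1, 1)))
print(sol)  # → 3
```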
11
Evolutionary Algorithms
• Genetic algorithms: developed by J. Holland in the 70s
• Evolution strategies: developed in Germany at about the same time
• Analogy with Darwin’s theory of evolution: survival of the fittest
12
[Flowchart: initialize a random generation → evaluate each individual with the fitness function → if a solution is found or the generation limit is exceeded, output the results; otherwise build the next generation via selection, crossover, and mutation, and evaluate again]
13
Evolutionary Algorithms
Selection: roulette wheel, weighted by fitness
Crossover (cross at a random point):
01101 × 11000 → 01000, 11101
Mutation (random bit change):
11101 → 10101
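The three operators can be sketched on bit strings; crossing the slide’s parents 01101 and 11000 at point 2 reproduces the offspring shown above. The mutation rate of 0.1 is an illustrative assumption.

```python
import random

def roulette_select(population, fitness):
    """Roulette-wheel selection: pick an individual with probability
    proportional to its fitness."""
    weights = [fitness(ind) for ind in population]
    return random.choices(population, weights=weights, k=1)[0]

def one_point_crossover(p1, p2, point=None):
    """Cross two bit strings at a random point, swapping tails."""
    if point is None:
        point = random.randrange(1, len(p1))
    return p1[:point] + p2[point:], p2[:point] + p1[point:]

def mutate(bits, rate=0.1):
    """Flip each bit independently with a small probability."""
    return "".join(b if random.random() > rate else str(1 - int(b)) for b in bits)

# The slide's example, with the crossover point fixed at 2:
print(one_point_crossover("01101", "11000", point=2))  # → ('01000', '11101')
```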
14
Genetic Programming
A program is the individual
Crossover and mutation operate on the program’s abstract syntax tree
A particular use: finding functions that describe data
16
2. White-Box Testing
Statement coverage
Branch coverage
Specific path (statement) selection
[Flow graph figure: nodes 1-9 with T/F-labeled branch edges]
17
White-Box Testing
Input variables: (x1, x2, …, xk)
Program domain: D1 × D2 × … × Dk
Individuals encode input data for the program
Goal: find input data that satisfies a coverage criterion (statement / branch / path)
18
Fitness Function = AL + D
AL: approximation level, per McMinn [3]
Critical branch: a branch at which execution diverges away from (misses) the target
AL = (number of critical branches between the target and the diverging point) - 1
[Figure: flow graph with T/F branches 1-5 leading toward the Target; diverging at successively closer branches gives AL = 2, AL = 1, AL = 0]
19
Fitness Function = AL + D
D: branch distance
If (x == y): if x != y then D = abs(x - y), else D = 0
If (x < y): if x >= y then D = x - y, else D = 0
If (flag): if flag == false then D = K, else D = 0
D is normalized to [0, 1]
Goal: minimize fitness (minimize D)
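The distance rules above translate directly into code. The normalization form 1 - 1.001^(-d) is one common choice from the search-based testing literature, not stated on the slide, and K = 1 is an assumed constant.

```python
K = 1.0  # penalty constant for flag conditions (assumed value)

def dist_eq(x, y):
    """Branch distance for the true branch of `if (x == y)`."""
    return abs(x - y) if x != y else 0

def dist_lt(x, y):
    """Branch distance for the true branch of `if (x < y)`."""
    return float(x - y) if x >= y else 0.0

def dist_flag(flag):
    """Branch distance for `if (flag)`: no gradient, just 0 or K."""
    return 0.0 if flag else K

def normalize(d):
    """Map a raw distance into [0, 1)."""
    return 1.0 - 1.001 ** (-d)

print(dist_eq(5, 9), dist_lt(7, 3), dist_flag(False))  # → 4 4.0 1.0
```

The flag case shows why flags are a problem for the search (see the summary slide): the distance is a flat 0-or-K landscape with no gradient to follow.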
20
Example: Triangle Classification
Input: int a, b, c from [0, 15]
1. Sort a, b, c so that a <= b <= c
2. If a + b <= c then NOT A TRIANGLE
3. If a == b and b == c then EQUILATERAL
4. If a == b or b == c then ISOSCELES
5. Else SCALENE
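The classification can be transcribed directly; this sketch uses the standard terminology (all sides equal is equilateral, exactly two equal is isosceles, none equal is scalene), and the all-equal case must be checked before the two-equal case, since a == b and b == c implies a == b or b == c.

```python
def classify_triangle(a, b, c):
    """Classify a candidate triangle with sides a, b, c in [0, 15]."""
    a, b, c = sorted((a, b, c))   # step 1: a <= b <= c
    if a + b <= c:                # step 2: triangle inequality fails
        return "NOT A TRIANGLE"
    if a == b and b == c:         # all three sides equal
        return "EQUILATERAL"
    if a == b or b == c:          # exactly two sides equal
        return "ISOSCELES"
    return "SCALENE"

print(classify_triangle(10, 4, 2))  # → NOT A TRIANGLE
print(classify_triangle(10, 7, 7))  # → ISOSCELES
print(classify_triangle(9, 13, 5))  # → SCALENE
```

Each return statement is one of the branch-coverage sub-goals on the next slide; the search must find inputs reaching each of them.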
21
Optimization problem
Program domain: I × I × I, where I = [0, 15]
Individual: a bit string
Goal: branch coverage
Sub-goals: NOT A TRIANGLE, EQUILATERAL, ISOSCELES, SCALENE
Example individual: 0101 0111 1001, encoding a = 5, b = 7, c = 9
22
Simulation by hand
Goal: ISOSCELES
Generation: <9,13,5> SCALENE, <10,4,2> NOT A TRIANGLE, <3,11,7> NOT A TRIANGLE
Crossover of <10,4,2> and <3,11,7> at the marked point:
1010 01|00 0010
0011 10|11 0111
→ 1010 0111 0111 = <10,7,7> ISOSCELES (goal reached)
→ 0011 1000 0010 = <3,8,2>
24
Real Example
Schatz [11]: autopilot system, 2046 LOC, 75 conditions
Criterion: branch coverage
Performance compared: GA vs. random testing
26
White Box Testing Summary
Variety of problem mappings and fitness functions
Remaining difficulties: the flag and state problems
27
3. Black Box Testing - Tracey [4]
Specification with pre/post-conditions
Search for test input data that satisfies: pre-condition && !post-condition
Good fitness for data that is close to satisfying the condition
Condition   Fitness
bool        if TRUE then 0, else K
a == b      if abs(a - b) == 0 then 0, else abs(a - b) + K
a < b       if a - b < 0 then 0, else (a - b) + K
a && b      fit(a) + fit(b)
a || b      min(fit(a), fit(b))
28
int wrap_counter(int n)
{ // pre: (n >= 0 && n <= 10)
  if (n >= 10) i = 0; else i = n + 1;
  return i;
  // post: (n < 10 -> i == n + 1)
  // post: (n == 10 -> i == 0)
}
Goal1: n >= 0 && n <= 10 && (n < 10 && i != n + 1)
Goal2: n >= 0 && n <= 10 && (n == 10 && i != 0)
29
Example
Inserted error: if (n > 10) i = 0;
Goal2: n >= 0 && n <= 10 && (n == 10 && i != 0)
n = 2, i = 3: 0 + 0 + (8 + K) + 0 = 8 + K
n = 7, i = 8: 0 + 0 + (3 + K) + 0 = 3 + K
n = 10, i = 11: 0 + 0 + 0 + 0 = 0  FOUND!
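The fitness table from the earlier slide and the Goal2 computation above can be sketched together in Python. `fit_le` and `fit_ne` are assumptions extrapolated by analogy from the table’s `a < b` and `a == b` rows, and K = 1 is an assumed constant.

```python
K = 1.0  # failure penalty constant from the fitness table (assumed value)

def fit_eq(a, b):   # a == b
    return 0.0 if a == b else abs(a - b) + K

def fit_le(a, b):   # a <= b (assumed by analogy with a < b)
    return 0.0 if a <= b else (a - b) + K

def fit_ne(a, b):   # a != b (assumed: 0 if satisfied, else K)
    return 0.0 if a != b else K

def fit_and(fa, fb):  # a && b
    return fa + fb

def fit_or(fa, fb):   # a || b
    return min(fa, fb)

def goal2(n, i):
    """Goal2: n >= 0 && n <= 10 && (n == 10 && i != 0)."""
    return fit_and(fit_and(fit_le(0, n), fit_le(n, 10)),
                   fit_and(fit_eq(n, 10), fit_ne(i, 0)))

print(goal2(2, 3))    # → 9.0  (the slide's 8 + K with K = 1)
print(goal2(7, 8))    # → 4.0  (the slide's 3 + K with K = 1)
print(goal2(10, 11))  # → 0.0  (goal reached: the error is exposed)
```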
30
Application
Applied to a safety-critical nuclear protection system
Used simulated annealing and a GA for the search
Used mutation testing to insert errors
About 2000 lines of executable code
733 different disjunctive goals
100% error detection, though the code was simple
31
3. Black-Box Testing Automated Parking System [10]
Individual: geometry data of the parking space and vehicle (6 parameters)
Fitness: minimum distance to the collision area
Results: 880 scenarios, 25 of them incorrect
34
4. Object-Oriented Testing
Tonella [5]: unit testing of classes
1. Create an object of the class under test
2. Put the object into the proper state
(repeat 1 and 2 for all required objects)
3. Invoke the method under test
4. Examine the final state
35
Individual String and Fitness
Individual: $a=A():$b=B():$a.m(int,$b)@3
That means:
A a = new A();
B b = new B();
a.m(3, b);
Goal: branch coverage
Fitness: proportion of the exercised decision nodes that lead to the target
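The chromosome-to-code mapping above can be illustrated with a minimal decoder. This parser is an illustrative assumption, not Tonella’s tool: it handles only this chromosome shape, with `int` parameter slots filled in order from the values after `@` and `$var` object references.

```python
def decode(chromosome):
    """Turn a chromosome like '$a=A():$b=B():$a.m(int,$b)@3'
    into Java-like statements."""
    actions, _, values = chromosome.partition("@")
    ints = iter(values.split(",")) if values else iter(())
    out = []
    for act in actions.split(":"):
        head, args = act[:-1].split("(", 1)   # drop the trailing ')'
        filled = ", ".join(next(ints) if a == "int" else a.lstrip("$")
                           for a in args.split(",") if a)
        if "=" in head:                        # $a=A(...)  -> constructor call
            var, cls = head.split("=")
            out.append(f"{cls} {var.lstrip('$')} = new {cls}({filled});")
        else:                                  # $a.m(...)  -> method call
            out.append(f"{head.lstrip('$')}({filled});")
    return out

for line in decode("$a=A():$b=B():$a.m(int,$b)@3"):
    print(line)
# → A a = new A();
#   B b = new B();
#   a.m(3, b);
```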
36
Crossover
Crossover at a random point, after the constructors and before the tested method; the offspring strings are then repaired.
Parents:
$a=A():$b=B(int):$a.m(int,$b)@1,5
$a=A(int,int):$b=B():$b.g():$a.m(int,$b)@0,3,4
Offspring:
$a=A():$b=B(int):$b.g():$a.m(int,$b)@1,4
$a=A(int,int):$b=B():$a.m(int,$b)@0,3,5
37
Mutations:
- mutation of an input value
- constructor change
- insertion of a method call
- removal of a method call
Example sequence:
$a=A():$b=B(int):$a.m(int,$b)@1,5
$a=A():$b=B(int):$a.m(int,$b)@1,7
$a=A():$c=C():$b=B($c):$a.m(int,$b)@1,7
$a=A():$c=C():$b=B($c):$b.f():$a.m(int,$b)@1,7
38
Class            LOC   Public methods  Branch coverage  Test cases  Execs  Time (sec)
StringTokenizer  313   6               11/11            3           820    2
BitSet           1046  26              172/177          28          38700  4930 (~1.5 h)
HashMap1         982   13              39/41            8           4380   1338
HashMap2         982   13              39/41             7           4310   697
LinkedList       704   23              56/57            9           25690  2034
Stack            118   5               10/10            2           230    1
TreeSet          482   15              20/21            4           2480   60
39
5. Non-Functional Testing: Execution Time Testing
Real-time systems: worst-case / best-case (WC/BC) execution times
Fitness: execution time for the specified input
Problem: insufficient guidance for the search
Wegener [6] on real systems: GA outperforms random and hand-made tests
Problem with low-probability branches
No guarantee of finding the true WC/BC execution time
40
6. SBSE: Search Based Software Engineering
Module clustering using simulated annealing and genetic algorithms
Cost/time estimation using genetic programming
Re-engineering using program transformation
Ryan [7]: automatic parallelization using genetic programming
41
References
1. Goldberg, “Genetic Algorithms”
2. McMinn, “Search-Based Software Test Data Generation: A Survey”
3. McMinn, “Hybridizing Evolutionary Testing with the Chaining Approach”
4. Tracey, “A Search Based Automated Test-Data Generation Framework for Safety-Critical Systems”
5. Tonella, “Evolutionary Testing of Classes”
6. Wegener, Pitschinetz, Sthamer, “Automated Testing of Real-Time Tasks”
7. Ryan, “Automatic Re-engineering of Software Using Genetic Programming”
8. NASA, “Intelligence Report”
9. Clarke, Jones, et al. (11 authors), “Reformulating Software Engineering as a Search Problem”
10. Buehler, Wegener, “Evolutionary Functional Testing of an Automated Parking System”
11. McGraw, Michael, Schatz, “Generating Software Test Data by Evolution”