Practical Program Analysis of Object-Oriented Software, Part 2
TRANSCRIPT
Intl School SE, Salerno, Sept’06, B G Ryder-2 1
Practical Program Analysis of Object-Oriented Software
Part 2
Dr. Barbara G. Ryder
Rutgers University
http://www.cs.rutgers.edu/~ryder
September 2006
Outline, Part 2
• New results on accommodating reflection in points-to analysis
• Applications of analysis in software tools to increase programmer productivity
  – Using infeasibility analysis to enable better test coverage for recovery code
    • Testing web server applications for robustness, Fu et al., ISSTA'04, TSE'05, DCS-TR-599
  – Combining static and dynamic analyses to report change impact of edits
    • Change impact analysis for Java codes, OOPSLA'04, FSE'06
Handling Dynamic Class Loading
• New algorithm incrementally accounts for classes loaded and performs analysis updates online at runtime
• Generates constraints at runtime using the JVM and propagates them when a client needs valid points-to results
• Uses deferred evaluation to handle unresolved references
• Andersen's analysis with field-sensitive object representation, objects represented by their creation sites, and static call graph (CHA)
M. Hirzel, A. Diwan, M. Hind, "Pointer Analysis in the Presence of Dynamic Class Loading", ECOOP 2004
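The Andersen-style core of such an analysis can be sketched as inclusion constraints over points-to sets, with objects named by their creation sites. This is a toy offline solver for illustration only; the Hirzel et al. system generates these constraints online inside the JVM and propagates them lazily, which this sketch does not model, and all names here are illustrative:

```java
import java.util.*;

// Toy Andersen-style analysis: objects are creation-site labels,
// assignments induce subset constraints between points-to sets.
class Andersen {
    final Map<String, Set<String>> pts = new HashMap<>();      // var -> creation sites
    final Map<String, Set<String>> flowsTo = new HashMap<>();  // "lhs = rhs" => pts(rhs) subset-of pts(lhs)

    void newObject(String var, String site) {                  // var = new T()  (site labels the allocation)
        pts.computeIfAbsent(var, k -> new HashSet<>()).add(site);
    }
    void assign(String lhs, String rhs) {                      // lhs = rhs
        flowsTo.computeIfAbsent(rhs, k -> new HashSet<>()).add(lhs);
    }
    // Propagate to a fixed point; an online analysis would run this
    // only when a client needs valid points-to results.
    void solve() {
        boolean changed = true;
        while (changed) {
            changed = false;
            for (Map.Entry<String, Set<String>> e : flowsTo.entrySet()) {
                Set<String> src = pts.getOrDefault(e.getKey(), Set.of());
                for (String lhs : e.getValue()) {
                    Set<String> dst = pts.computeIfAbsent(lhs, k -> new HashSet<>());
                    if (dst.addAll(src)) changed = true;
                }
            }
        }
    }
    Set<String> pointsTo(String var) { return pts.getOrDefault(var, Set.of()); }
}
```

For example, after `x = new T()` (site o1), `y = x`, and `z = y`, solving yields pts(z) = {o1}.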
Hirzel et al. Algorithm
• Showed efficacy through use in a new connectivity-based GC algorithm
• Used Jikes RVM 2.2.1 on SPECjvm98 benchmarks with good results; claimed long-running programs are needed for the incremental computation cost to be amortized
• Validation:
  – Need to make sure the points-to solution is updated before doing a GC
  – Then the GC verifies the points-to solution by making sure the dynamically observed points-to relations are in the solution
Prolangs Research Project I
Testing robustness of Java web server applications using static and dynamic analyses.
Refs:
C. Fu, B.G. Ryder, "Exception-chain Analysis: Revealing Exception Handling Architecture in Java Server Applications", DCS-TR-599, April 2006.
C. Fu, B.G. Ryder, "Navigating Error Recovery Code in Java Applications", Eclipse Technology Exchange Workshop, held at OOPSLA, October 2005.
C. Fu, A. Milanova, B.G. Ryder, D. Wonnacott, "Robustness Testing of Java Server Applications", IEEE Transactions on Software Engineering, Volume 31, Number 4, pp 292-312, April 2005.
C. Fu, B.G. Ryder, A. Milanova, D. Wonnacott, "Testing of Java Web Services for Robustness", in Proceedings of the International Symposium on Software Testing and Analysis (ISSTA), July 2004, pp 23-33.
C. Fu, R. Martin, K. Nagaraja, T. Nguyen, B.G. Ryder, D. Wonnacott, "Compiler-directed Program Fault Coverage for Highly Available Java Internet Services", in Proceedings of the International Conference on Dependable Systems and Networks, June 2003.
Testing for Robustness
• Test application reactions to operating system interaction faults
  – e.g., disk crashes, network congestion, OS resource depletion, OS bugs
• Provide realistic simulation of fault conditions
  – Cannot depend on naturally occurring faults
• Provide a measure of the robustness of the software response
• Introduce a software analysis and testing methodology to automate this process
Testing by Fault Injection
Current approach, a form of black box testing:
• Learn the fault distribution of the system by observation
• Inject faults according to the observed frequency
• Watch the application response: crash/continue
• Measure how often the system avoids a crash in response to an introduced fault
[Diagram: Application / Java Runtime / OS / Device stack; a fault injected below the application causes s.read() to raise a SocketException]
Our Approach
Our approach, a form of white box testing:
• Use knowledge of the application from the compiler to inject possibly-affecting faults at appropriate times during execution
• Observe the application response (i.e., exception handling)
• Measure coverage of exception handlers
[Diagram: same Application / Java Runtime / OS / Device stack; s.read() raises a SocketException]
Java Exceptions

void foo() throws Exception {
  ...
  try {
    bar();
  } catch (IOException ioe) { ... }
}

void bar() throws Exception {
  ...
  throw new SocketException();
  ...
  throw new OtherException();
  ...
}

[Diagram: an e-c link connects the SocketException thrown in bar to the catch of IOException in foo; the OtherException thrown in bar has no matching handler here]
Our Approach - 1. Analysis
1. Find vulnerable operations
2. Find exception handlers

try {
  ...
  s.read();   // vulnerable operation: calls a fault-sensitive operation below
  ...
} catch (IOException e) {
  // recovery code (handler)
}

[Diagram: Application / Java Runtime / OS / Device stack; a SocketException from s.read() reaches the handler]
Our Approach - 2. Instrumentation

try {
  ...
  inject_fault();
  s.read();
  ...
  cancel_fault();
} catch (IOException e) {
  record_current_fault();
  // recovery code
}

[Diagram: Mendosus, a fault injection engine, delivers the fault below the Application / Java Runtime / OS / Device stack so that s.read() raises a SocketException]
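The hooks in the instrumented code above might behave roughly as follows. This is a self-contained sketch: the real hooks talk to the Mendosus engine, whose interface is not shown on the slides, so a stub stands in for it here, and `vulnerableRead` is a hypothetical stand-in for an operation like s.read():

```java
import java.io.IOException;
import java.util.*;

// Sketch of the inserted hooks: arm a fault before the vulnerable
// operation, disarm after it, and record the fault in the handler.
class FaultHarness {
    static String pendingFault;                              // fault armed for the next vulnerable op
    static final Set<String> recordedFaults = new HashSet<>();

    static void injectFault(String fault) { pendingFault = fault; }   // inject_fault()
    static void cancelFault()             { pendingFault = null; }    // cancel_fault()
    static void recordCurrentFault() {                                // record_current_fault()
        if (pendingFault != null) recordedFaults.add(pendingFault);
    }

    // stand-in for a vulnerable operation such as s.read()
    static int vulnerableRead() throws IOException {
        if (pendingFault != null) throw new IOException(pendingFault);
        return 0;
    }

    // one instrumented test: returns true iff the handler was reached,
    // i.e., the e-c link was covered by the injected fault
    static boolean runOneTest() {
        try {
            injectFault("SocketException");
            vulnerableRead();
            cancelFault();
            return false;                  // fault did not surface
        } catch (IOException e) {
            recordCurrentFault();          // recovery code would run here
            return true;
        }
    }
}
```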
Framework
[Diagram of the framework:]
• Compile-time: the Java application is processed by the Exception-Catch Link Analysis, producing the instrumented Java program and the set of possible e-c links
• Runtime: a tester-provided fault set drives Mendosus, the fault injection engine, against the instrumented Java program; the observed e-c links are compared with the possible e-c links to compute the measured exception def-catch coverage
Contributions
• A new white-box testing methodology for recovery code in Java applications
• A practical, fairly precise exception def-use analysis
• A new technique for determining control path infeasibility using data reachability
• A new test coverage metric for fault injection that emphasizes coverage of the application recovery code
• Initial experimental evidence for the practical utility of this approach - achieved on average 84% coverage
Exception-catch Link Analysis
• Two-phase algorithm
  – Exception-flow analysis - initial estimate of e-c links using the call graph
    • Can vary algorithm precision by varying the precision of call graph construction
  – DataReach analysis - prune away infeasible links using object points-to information
  – Call graph and points-to graphs generated by FieldSens points-to analysis, Rountev et al., OOPSLA'01
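The phase-1 estimate can be sketched as a backward walk over the call graph: an exception thrown in one method propagates to its callers until a matching handler is found. The data structures below are toy inputs for illustration, not the SOOT-based implementation, and exception subtyping is reduced to string equality:

```java
import java.util.*;

// Phase-1 sketch: propagate a thrown exception type backwards over a
// precomputed call graph; each method where a matching catch exists
// yields one candidate e-c link.
class ExceptionFlow {
    final Map<String, Set<String>> callers = new HashMap<>();   // callee -> callers
    final Map<String, String> handlerFor = new HashMap<>();     // method -> exception type it catches

    void call(String caller, String callee) {
        callers.computeIfAbsent(callee, k -> new HashSet<>()).add(caller);
    }
    void handler(String method, String exnType) { handlerFor.put(method, exnType); }

    // methods whose handler may catch an exnType thrown in 'thrower'
    Set<String> ecLinks(String thrower, String exnType) {
        Set<String> links = new HashSet<>(), seen = new HashSet<>();
        Deque<String> work = new ArrayDeque<>(List.of(thrower));
        while (!work.isEmpty()) {
            String m = work.pop();
            if (!seen.add(m)) continue;
            if (exnType.equals(handlerFor.get(m))) { links.add(m); continue; } // caught here
            work.addAll(callers.getOrDefault(m, Set.of()));     // else propagates to callers
        }
        return links;
    }
}
```

On the foo/bar example from the earlier slide, an IOException-compatible exception thrown in bar links back to the catch in foo.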
DataReach Analysis - Example

void readFile(String s) {
  byte[] buffer = new byte[256];
  try {
    InputStream f = new FileInputStream(s);
    InputStream source = new BufferedInputStream(f);
    for (...)
      c = source.read(buffer);
  } catch (IOException e) { ... }
}

void readNet(Socket s) {
  byte[] buffer = new byte[256];
  try {
    InputStream n = s.getInputStream();
    InputStream source = new BufferedInputStream(n);
    for (...)
      c = source.read(buffer);
  } catch (IOException e) { ... }
}
Example Call Graph
[Call graph: readFile(String s) and readNet(Socket s) both call FilterInputStream.read(byte[]), which leads to BufferedInputStream.read(byte[],int,int), then BufferedInputStream.read1(byte[],int,int), then BufferedInputStream.fill(), which dispatches to both FileInputStream.read(...) and SocketInputStream.read(...)]
Example Call Graph - 2
Q: what objects are visible in BufferedInputStream.fill() on paths from readFile()?
Objects become visible through: parameters, globals, reference field loads, object creation.
[Same call graph, annotated with objects o_a through o_d; on paths from readFile(), the edge from fill() to SocketInputStream.read(...) is spurious, i.e., infeasible]
Example Call Graph - 3
Q: what objects are visible in BufferedInputStream.fill() on paths from readNet()?
[Same call graph, annotated with objects o_b, o_c, o_d, o_x, o_y; on paths from readNet(), the edge from fill() to FileInputStream.read(...) is spurious, i.e., infeasible]
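The pruning question on these two slides can be modeled roughly as follows. This toy version tracks which creation-site objects have become visible along the traversal so far and skips call edges whose receiver object is not among them; the single accumulated visible set (rather than per-path sets) is a deliberate overapproximation, and all names are illustrative, not the SOOT implementation:

```java
import java.util.*;

// DataReach sketch: a call-graph node is data-reachable from a root
// only if the receiver object dispatching to it is visible (via
// parameters, globals, field loads, or creation) along some path.
class DataReach {
    record Call(String callee, String receiverObj) {}           // edge labeled with receiver's creation site
    final Map<String, List<Call>> calls = new HashMap<>();
    final Map<String, Set<String>> visibleObjs = new HashMap<>();// objects a method itself makes visible

    void call(String caller, String callee, String receiverObj) {
        calls.computeIfAbsent(caller, k -> new ArrayList<>()).add(new Call(callee, receiverObj));
    }
    void makesVisible(String m, String obj) {
        visibleObjs.computeIfAbsent(m, k -> new HashSet<>()).add(obj);
    }

    // is 'target' reachable from 'root' using only calls whose receiver
    // object has become visible so far? (null receiver = static call)
    boolean reachable(String root, String target) {
        Deque<String> work = new ArrayDeque<>(List.of(root));
        Set<String> visited = new HashSet<>(), visible = new HashSet<>();
        while (!work.isEmpty()) {
            String m = work.pop();
            if (!visited.add(m)) continue;
            if (m.equals(target)) return true;
            visible.addAll(visibleObjs.getOrDefault(m, Set.of()));
            for (Call c : calls.getOrDefault(m, List.of()))
                if (c.receiverObj() == null || visible.contains(c.receiverObj()))
                    work.push(c.callee());
        }
        return false;
    }
}
```

Mirroring the slides: from readFile(), only the FileInputStream object is visible, so the SocketInputStream.read edge is pruned.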
Experiments
• Implemented the analysis in McGill's SOOT framework on a 2.8GHz Pentium 4 PC under Linux
• Used four moderate-sized benchmarks
• Experimented with different call graph construction algorithms (CHA, RTA, Points-to)
  – Added DataReach separately to see its effect
  – Handled dynamic loading and reflection by dynamic observation
• Focused on recovery code for IOExceptions
Fu et al, ISSTA'04
Benchmarks
Fu et al, ISSTA'04
Experiment Details
• Analysis combinations tried
  – CHA, Dean et al., ECOOP'95
  – RTA, Bacon et al., OOPSLA'96
  – FieldSens points-to, Rountev et al., OOPSLA'01
  – In-Points-to
  – Points-to/DataReach
  – In-Points-to/DataReach
• Measured coverage as the ratio of the number of executed e-c links to the number of possible e-c links
• Added context sensitivity by inlining constructors that assign to reference fields through the this parameter
Fu et al, ISSTA'04
Coverage Data
[Bar chart: e-c link coverage (0%-90%) for FTPD, JNFS, Haboob, and Muffin under CHA, RTA, PTA, InPTA, PTA-DR, and InPTA-DR]
Fu et al, ISSTA'04
Number of e-c links obtained

Program | Covered | In-Points-to/DataReach | CHA
FTPD    | 11      | 13                     | 34
JNFS    | 16      | 19                     | 104
Haboob  | 10      | 12                     | 96
Muffin  | 35      | 42                     | 480

Fu et al, ISSTA'04
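These numbers are what the "on average 84% coverage" claim rests on: coverage per benchmark is the covered e-c links divided by the links reported by In-Points-to/DataReach. A quick check of the arithmetic:

```java
// Coverage = covered e-c links / possible e-c links (In-Points-to/DataReach),
// averaged over the four benchmarks from the table above.
class Coverage {
    static double ratio(int covered, int possible) { return (double) covered / possible; }

    static double average() {
        double[] r = { ratio(11, 13),   // FTPD   ~85%
                       ratio(16, 19),   // JNFS   ~84%
                       ratio(10, 12),   // Haboob ~83%
                       ratio(35, 42) }; // Muffin ~83%
        double sum = 0;
        for (double x : r) sum += x;
        return sum / r.length;          // ~0.84
    }
}
```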
Timings
[Bar chart of analysis times in seconds for FTPD, Haboob, JNFS, and Muffin, with a 1-hour reference line; InPTA-DR times: 220.6, 221.6, 160.4, 171.9; PTA-DR times: 3,103.7, 3,961.0, 2,954.3, 4,907.7]
Fu et al, ISSTA'04
More Recent Results
• Extended the DataReach algorithm
  – M-DataReach: 1 reaching object set per method (implemented)
• Added 3 larger programs to the dataset and compared DataReach to M-DataReach on 7 programs
  – HttpClient 1334 (4741) methods
  – SpecJVM 2489 (4592) methods
  – Vmark 1565 (5029) methods
• Added a "reached" measure to show e-c links whose try blocks were reached during execution, to calculate effective coverage
  – Filter out possible e-c links whose try blocks were not reached
Fu et al, TSE'05
Recent Results

Program    | InPTA-DR | InPTA-MDR | Effective Coverage
HttpClient | 55%      | 61%       | 62%
SpecJVM    | 13%      | 15%       | 19%
Vmark      | 23%      | 28%       | 72%

Fu et al, TSE'05
Recent Results
Categorized uncovered e-c links:
1. Feasible, but uncovered because of insufficient tests or input
2. Infeasible and difficult to prune for any static analysis
3. Infeasible, but could be eliminated by a more precise static analysis

Program | 1   | 2   | 3   | Total
HTTP Cl | 25% | 60% | 15% | 40
Muffin  | 14% | 43% | 43% | 7
SpecJVM | 13% | 87% | -   | 30

Fu et al, TSE'05
Exception-chain Analysis
• Find exceptions related semantically
  – Rethrows of an exception within a catch that preserve the state of the entering exception object
• Enables testing and understanding of the full recovery code associated with a specific fault
  – Also categorized the handlers to show the complexity of recovery operations
  – Exposes the system recovery architecture to the user for greater understanding
C. Fu, B.G. Ryder, "Exception-chain Analysis: Revealing Exception Handling Architecture in Java Server Applications", DCS-TR-599
Handlers Categorized
Fu et al, DCS-TR-599
Exception-chains Found Starting from a Rethrow
Fu et al, DCS-TR-599
Exception-chain Graph of Tomcat
Fu et al, DCS-TR-599
Related Work - Project I
• Dataflow testing and coverage metrics
  – Rapps/Weyuker TSE 4/85; Frankl/Weyuker TSE 10/88; Sinha/Harrold ICSM'99
• Analysis of exception handling
  – Robillard/Murphy TOSEM 2003; Jo et al. J. of Systems and Software 2004; Sinha/Harrold TSE 9/00; Choi et al. PASTE'99
• Exceptions and compilation
  – IBM Tokyo JIT, LaTTe, MRL VM
• Points-to analysis
  – Ryder CC'03; Milanova et al. ISSTA'02; Grove et al. OOPSLA'97; Grove/Chambers TOPLAS 2001; O'Callahan 2000
• Infeasible paths
  – Bodik et al. FSE'97; Souter/Pollock Info & SW Tech 10/2002
• Fault injection techniques
  – Arlat et al. IEEE-TSE 8/93; Bouricius et al. Natl Conf ACM 3/69; Dugan/Trivedi IEEE-TC 6/89; Tsai et al. IEEE-TC 11/99; Bieman et al. ISSRE'96
Summary - Project I
• Designed and demonstrated an automated white-box testing methodology for recovery code
• Designed a fairly precise def-use analysis for exceptions, including a new technique to eliminate spurious def-uses
• Extended to show exception-handling paths for semantically related exceptions
  – Categorized quality of exception handlers
  – Showed inter-component exception structure
• Promising experimental results with the methodology on several Java web applications
Future Work - Project I
• Empirically test exception-chains in Tomcat and other large applications, gathering coverage metrics
  – Learn which chains are hard to cover and why
• Gather exception handler classification data on large, framework-based applications
• Expand the set of exceptions tracked by the analysis
• Generalize the technique to handle distributed Java applications using RMI
What analyses were used?
• FieldSens points-to analysis augmented with DataReach to show call path infeasibility
• Flow-sensitive def-use analysis for exception objects within a catch block
  – To find rethrows of semantically related exceptions
• Dynamic program instrumentation to record test coverage and record instances of reflection
Prolangs Research Project II
Practical Change Impact Analysis of Java Programs
Refs:
M. Stoerzer, B.G. Ryder, X. Ren, F. Tip, "Finding Failure-Inducing Changes in Java Programs using Change Classification", in the Proceedings of the 14th SIGSOFT Conference on the Foundations of Software Engineering, November 2006.
X. Ren, O. Chesley, B.G. Ryder, "Crisp, A Debugging Tool for Java Programs", IEEE Transactions on Software Engineering, in press, April 2006.
O. Chesley, X. Ren, B.G. Ryder, "Crisp: A Debugging Tool for Java Programs", in the Proceedings of the 21st International Conference on Software Maintenance (ICSM), Budapest, Hungary, September 2005.
X. Ren, F. Shah, F. Tip, B.G. Ryder, O. Chesley, "Chianti: A Tool for Practical Change Impact Analysis of Java Programs", in Proceedings of the ACM SIGPLAN Conference on Object Oriented Programming, Systems and Applications (OOPSLA), pp 432-448, October 2004.
Non-locality of Change Impact in OO Programs
• Small source code changes can have major and non-local effects in object-oriented systems
  – Due to subtyping and dynamic dispatch

  X x = new Z();
  ...
  x.f();

[Class hierarchy: X declares f(); Y and Z are subclasses of X. When Z gains an overriding f(), the unchanged call x.f() above changes its target]
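The non-local effect can be shown directly. Since one program cannot hold both the before and after versions of Z, ZBefore and ZAfter below model the class before and after the one-line edit (names illustrative); the client call site never changes, yet its behavior does:

```java
// Adding an override in a subclass silently changes the behavior of
// every existing call site whose receiver is an instance of it.
class Dispatch {
    static class X { String f() { return "X.f"; } }
    static class ZBefore extends X { }                                // inherits X.f
    static class ZAfter  extends X { String f() { return "Z.f"; } }   // the edit: one added method

    // unchanged client code: "X x = new Z(); x.f();"
    static String callSite(X x) { return x.f(); }
}
```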
Motivation
• Program analysis provides feedback on the semantic impact of changes
  – E.g., added/deleted method calls, fields and classes
• The object-oriented system is presumed to consist of a set of classes and a set of associated unit or regression tests
• Change impact is measured by the tests affected
  – Describes the application functionality modified
  – Discovers the need for new tests
Example

class A {
  public void foo() {}
}
class B extends A {
  public void foo() {}
}
class C extends A {}

public void test1() {
  A a = new A();
  a.foo(); // A's foo
}
public void test2() {
  A a = new B();
  a.foo(); // B's foo
}
public void test3() {
  A a = new C();
  a.foo(); // A's foo
}

[Class hierarchy: A defines foo(); B overrides foo(); C inherits A's foo()]
Example
Question: what effect did this edit have on these tests?

class A {
  public void foo() {}
  public int x;
}
class B extends A {
  public void foo() { B.bar(); }
  public static void bar() { y = 17; }
  public static int y;
}
class C extends A {
  public void foo() { x = 18; }
  public void baz() { z = 19; }
  public int z;
}

[Class hierarchy: A has foo(), x; B has foo(), bar(), y; C has foo(), baz(), z]
Example - Changes to foo()
1. Decompose the edit into atomic changes
2. Find affected tests
3. Find affecting changes

class A {
  public void foo() {}
  public int x;
}
class B extends A {
  public void foo() { B.bar(); }
  public static void bar() { y = 17; }
  public static int y;
}
class C extends A {
  public void foo() { x = 18; }
  public void baz() { z = 19; }
  public int z;
}

public void test1() {
  A a = new A();
  a.foo(); // A's foo
}
public void test2() {
  A a = new B();
  a.foo(); // B's foo
}
public void test3() {
  A a = new C();
  a.foo(); // A's foo
}
Assumptions
• Our change analysis tools run within an interactive programming environment - Eclipse
  – User makes a program edit
  – User requests change impact analysis of an edit
• Before and after the edit, the program compiles
• Tests execute different functionalities of the system
• Call graphs are obtainable for each test through static or dynamic analysis
  – Call graphs show the (possible) calling structure of a program; nodes ~ methods, edges ~ calls
• Change impact is measured in terms of changes to these call graphs, corresponding to a (growing) set of tests
Usage Scenario
[Workflow diagram:]
1. Edit code, build call graphs, and run every test Ti (JUnit/CIA)
2. If the output is okay, continue editing
3. Otherwise, derive the atomic changes S = {c1, c2, ..., cn} and find the potentially affected tests T1, T2, ..., Tk (Chianti)
4. For a failing test Ti, find S' ⊆ S that may impact Ti (Chianti)
5. Explore subsets of S' (Crisp)
Contributions - Tools
• Chianti, a prototype change impact analyzer for the full Java language (1.4)
  – Written as an Eclipse plug-in
  – Experimental results from analysis of year 2002 Daikon system CVS check-ins
• Crisp, a builder of intermediate program versions (between original and edited)
  – To find failure-inducing changes semi-automatically
• JUnit/CIA, an augmented version of JUnit in Eclipse to perform unit/regression testing and change classification
  – Finds the likelihood of a change being failure-inducing
Contributions - Ideas
• Definition of a notion of an atomic change and their interdependences
• Find the set of tests potentially affected by a set of atomic changes (Affected Tests)
• For an affected test, find the subset of atomic changes that may affect it (Affecting Changes)
• Determine the set of atomic changes which do not affect any test
  – Demonstrates the need for more tests
• Determine call graph coverage as feedback to aid the user in generating new tests
Atomic Changes

AC   Add an empty class
DC   Delete an empty class
AM   Add an empty method
DM   Delete an empty method
CM   Change body of a method
LC   Change virtual method lookup
AF   Add a field
DF   Delete a field
CFI  Change definition of an instance field initializer
CSFI Change definition of a static field initializer
AI   Add an empty instance initializer
DI   Delete an empty instance initializer
CI   Change definition of an instance initializer
ASI  Add an empty static initializer
DSI  Delete an empty static initializer
CSI  Change definition of a static initializer

Each edit corresponds to a unique set of (method-level) atomic changes
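Atomic changes come with prerequisite dependences: a change cannot be applied to the original program before the changes it depends on (e.g., CM(B.bar) requires AM(B.bar), which in turn requires AC(B) if B is a new class). A sketch of that structure, not Chianti's actual representation:

```java
import java.util.*;

// Toy model of atomic changes and their prerequisite dependences,
// with a transitive-closure query used when applying a change.
class AtomicChanges {
    enum Kind { AC, DC, AM, DM, CM, LC, AF, DF }        // subset of the 16 kinds above
    record Change(Kind kind, String element) {}

    static final Map<Change, Set<Change>> prereqs = new HashMap<>();

    static void requires(Change c, Change pre) {
        prereqs.computeIfAbsent(c, k -> new HashSet<>()).add(pre);
    }

    // everything that must be applied together with c (including c itself)
    static Set<Change> closure(Change c) {
        Set<Change> out = new LinkedHashSet<>();
        Deque<Change> work = new ArrayDeque<>(List.of(c));
        while (!work.isEmpty()) {
            Change x = work.pop();
            if (out.add(x)) work.addAll(prereqs.getOrDefault(x, Set.of()));
        }
        return out;
    }
}
```

For instance, applying CM(B.bar) after registering its prerequisite AM(B.bar) yields a closure containing both changes.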
Examples of Atomic Changes

class A {
  public void foo() {}
  public int x;                          // 1
}
class B extends A {
  public void foo() { B.bar(); }         // 6
  public static void bar() { y = 17; }   // 5, 8
  public static int y;                   // 7
}
class C extends A {
  public void foo() { x = 18; }          // 2, 3
  public void baz() { z = 19; }          // 10, 11
  public int z;                          // 9
}
Dynamic Dispatch Changes
An LC change is a set of triples defined for a class hierarchy:
(runtime receiver type, declared method signature, actual target method)
For example, the triple (C, A.m, B.m) means:
1. class A contains method m
2. class C does not contain method m
3. class B contains method m, and C inherits m from B
[Diagram: hierarchy A, with subclass B and sub-subclass C; deleting or adding an m() override in the hierarchy changes these dispatch triples, yielding the LC change]
Affected Tests
T, the set of all tests; A, the set of all atomic changes; P, the program before the edit; P', the program after the edit

AffectedTests(T, A) ≡
  { Ti | Ti ∈ T, (Nodes(P, Ti) ∩ (CM ∪ DM)) ≠ ∅ }
  ∪ { Ti | Ti ∈ T, n ∈ Nodes(P, Ti), n --(B, X.m)--> A.m ∈ Edges(P, Ti), for (B, X.m) ∈ LC }
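The definition reads as executable pseudocode: a test is affected if its call graph in P contains a changed or deleted method, or a call edge that dispatches through a changed lookup pair in LC. A sketch over toy call-graph structures (all names illustrative):

```java
import java.util.*;

// AffectedTests as code: nodes are method names; each edge records the
// runtime receiver type and declared signature it dispatches through.
class AffectedTests {
    record Edge(String caller, String runtimeType, String declaredSig) {}
    record TestGraph(String name, Set<String> nodes, Set<Edge> edges) {}
    record Lookup(String runtimeType, String declaredSig) {}     // a (B, X.m) pair in LC

    static Set<String> affected(List<TestGraph> tests, Set<String> cmOrDm, Set<Lookup> lc) {
        Set<String> out = new LinkedHashSet<>();
        for (TestGraph t : tests) {
            boolean hit = t.nodes().stream().anyMatch(cmOrDm::contains)        // first clause
                       || t.edges().stream().anyMatch(e ->                      // second clause
                              lc.contains(new Lookup(e.runtimeType(), e.declaredSig())));
            if (hit) out.add(t.name());
        }
        return out;
    }
}
```

On the running example, adding foo() to B creates an LC pair (B, A.foo), so only test2, whose call graph dispatches A.foo on a B receiver, is affected.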
Edited Program P’
Call GraphBuilder
Unit/Regression Tests
Atomic ChangeDecoder
Change ImpactAnalyzer
Atomic Changes
Chianti
Original Program P
Call Graphs of Tests in P
Affected Tests
Call Graphs of Affected Tests in P’
Affecting Changes
How to run Chianti?
• To simulate use in an IDE while editing evolving code, select 2 successive CVS versions of a project and their associated tests
• Call for change impact analysis of the edit
• Chianti displays the change analysis results
  – Affected tests with their affecting atomic changes and prerequisite changes
Chianti GUI
[Screenshot: Eclipse project files, Java code, and the affected tests with their affecting changes]
test2 calls B.foo(), affected by CM(B.foo()) with prerequisite AM(B.bar()), because B.foo() calls B.bar()
Chianti GUI
test2 calls B.bar(), affected by CM(B.bar()) with prerequisites AM(B.bar()) & AF(B.y), because B.bar() uses B.y
Chianti Experiments
• Data: Daikon project (cf. M. Ernst, MIT)
  – Obtained CVS repository from 2002 with version history - an active debugging period
  – Grouped code updates within the same week for 12 months
  – Obtained 39 intervals with changes
  – Growth from
    • 48K to 121K lines of code
    • 357 to 765 classes
    • 2409 to 6284 methods
    • 937 to 2885 fields
    • 40-62 unit tests per version
Ren et al, OOPSLA'04
Number of atomic changes
[Log-scale bar chart: number of atomic changes per version interval, from 0107-0114 through 1223-1230; values range from 1 to 11,698]
Ren et al, OOPSLA'04
Affected Tests
[Bar chart: percentage of affected tests for each version interval (0107-0114 through 1223-1230) plus the average; values range from 0% to 89%]
Ren et al, OOPSLA'04
Affecting Changes
[Log-scale bar chart: percentage of affecting changes per version interval, ranging from 0.10% to 73.53%, with the geometric mean as the final bar]
Ren et al, OOPSLA'04
Performance of Chianti
• Deriving atomic changes from 2 successive Daikon versions takes on average 87 seconds
• Calculating the set of affected tests takes on average 5 seconds
• Calculating the affecting changes for an affected test takes on average 1.2 seconds
• Results show the promise of our change impact framework
JUnit/CIA
• Integrated Chianti with JUnit in Eclipse
  – Executes change impact analysis as part of regular unit testing of code during software evolution
    • Allows the user to avoid rerunning unaffected unit tests
    • Allows the user to see atomic changes not tested by the current test suite, in order to develop new tests
• Uses test outcome history to estimate the likelihood that an affecting change is failure-inducing for a test
  – Classifies changes by whether they affect tests with worsening, improving, or the same outcomes on both versions
Stoerzer et al, FSE'06
JUnit/CIA - Intuition
• JUnit/CIA change classifications
  – untested: change affects no test; more tests should be added
  – green: all affected tests improve in outcome
  – yellow: some affected tests worsen, some improve in outcome; indeterminate
  – red: all affected tests worsen in outcome; good candidate for a bug!
• Performed in situ experiments with an SE class at the University of Passau
  – Overall results showed the best (coloring) classifier helped focus programmer attention on failure-inducing changes in ~50% of cases
Stoerzer et al, FSE'06
JUnit/CIA Screen Shot
CRISP
• Tool to find the changes introducing bugs
• IDEA: Atomic changes which do not affect a test cannot be responsible for its failure
  – Starts with the original program and allows the user to select atomic changes to add
  – The tool adds the selected changes and their prerequisites to the original version, allowing the tests to be run on this intermediate program version
• Goal: to perform this task automatically, making 'smart' choices of changes to add
Chesley et al, ICSM'05; Ren et al, TSE'06, in press
Crisp - Daikon Case Study
Nov 11th (original) and Nov 19th (new) versions
• Chianti derived 6093 atomic changes, 35 of which were affecting changes in a failing test
• Crisp loop: select changes, apply a group of changes (with their prerequisites), run the tests, un-apply changes; this narrowed the 35 affecting changes down to 2 failure-inducing changes
• To create a failing-test scenario for Crisp, the test suite for the Daikon version pair in version n was executed against the source code in version n+1
Related Work - Project II
Impact Analysis
• Static impact analysis techniques
  – Reachability: Bohner-Arnold '96; Kung et al. JSS'96; Tonella TSE 2003
  – Year 2000 analyses: Eidorff et al. POPL'99; Ramalingam et al. POPL'99
• Dynamic impact analysis techniques
  – Zeller FSE'99; Law-Rothermel ICSE'03
• Combined analysis techniques
  – Orso et al. FSE'03; Orso et al. ICSE'04
Related Work - Project II
Selective regression testing
  – TestTube: Rosenblum ICSE'94; DejaVu: Harrold-Rothermel TOSEM 1997
  – Slicing-based: Bates-Horwitz POPL'93; Binkley TSE'97
  – Test selection: Harrold et al. OOPSLA'01; Elbaum et al. JSTVR 2003
Techniques for avoiding recompilation
  – Tichy ACM TOPLAS'86; et al., ACM TOSEM'94; Hood et al. PSDE'87; Burke et al. TOPLAS'93; Chambers et al. ICSE'95; Dmitriev OOPSLA'02
Related Work - Project II
Fault localization
• Comparing dynamic data across executions
  – Reps et al. FSE'97; Harrold et al. PASTE'98; Jones et al. ICSE'02; Renieris and Reiss ASE'03; Dallmeier et al. ECOOP'05
  – Statistically-based techniques: Liblit et al. PLDI'03, PLDI'05; Liu et al. FSE'05
• Slicing-based techniques
  – Lyle et al. ISICCA'87; DeMillo et al. ISSTA'96; Bonus et al. ASE'03
Summary - Project II
Developed change impact analysis for object-oriented codes with emphasis on practicality, viability in an IDE, and scalability to industrial-strength systems
  – Defined atomic changes to subdivide edits
  – Defined impact using tests affected and their affecting changes
  – Tested ideas in Chianti on 12 months of data from the Daikon project with promising results
  – Developed new tools (JUnit/CIA, Crisp) that incorporate change impact analysis in practical ways for testing and debugging applications
Future Work - Project II
• Experiment in Crisp with heuristics to suggest an exploration order of the affecting changes to the user
  – Comparison of several possible heuristics for adding in affecting changes to build new intermediate program versions
  – Automation of the running of the tests on the new intermediate program version
• Try the technique on larger programs, e.g., the Eclipse compiler
• Use change impact analysis to examine collaborative changes to determine unexpected semantic dependences in the resulting code
What analyses were used?
• Combines dynamic analysis (e.g., test call graphs) with static analysis to find affected tests and their affecting changes
  – Also can be used with call graphs constructed using FieldSens points-to
• Uses dynamic analysis to run tests, obtain outcomes, and measure test coverage
References
• O. Agesen. The Cartesian product algorithm: Simple and precise type inference of parametric polymorphism. In European Conference on Object-Oriented Programming, pages 2-26, 1995.
• L. O. Andersen. Program analysis and specialization for the C programming language. PhD thesis, DIKU, University of Copenhagen, 1994. Also available as DIKU report 94/19.
• D. Bacon and P. Sweeney. Fast static analysis of C++ virtual function calls. OOPSLA'96.
• M. Berndl, O. Lhotak, F. Qian, L. Hendren, N. Umanee. Points-to analysis using BDDs. PLDI'03, pp 103-114.
• R. Chatterjee. Modular Data-flow Analysis of Statically Typed Object-oriented Programming Languages. PhD thesis, Department of Computer Science, Rutgers University, October 1999.
• R. Chatterjee, B. G. Ryder, and W. Landi. Relevant context inference. In Symposium on Principles of Programming Languages, pages 133-146, 1999.
• J. Dean, D. Grove, C. Chambers. Optimization of OO programs using static class hierarchy analysis. ECOOP'95.
References
• D. Grove and C. Chambers. A framework for call graph construction algorithms. ACM Transactions on Programming Languages and Systems (TOPLAS), 23(6), 2001.
• M. Hirzel, A. Diwan, M. Hind. Pointer analysis in the presence of dynamic class loading. ECOOP 2004, pp 96-122.
• O. Lhotak and L. Hendren. Scaling Java points-to analysis using Spark. In Proceedings of the Twelfth International Conference on Compiler Construction, Warsaw, Poland, April 2003.
• O. Lhotak and L. Hendren. Jedd: a BDD-based relational extension of Java. PLDI'04, pp 158-169.
• D. Liang, M. Pennings, and M. J. Harrold. Evaluating the precision of static reference analysis using profiling. In Proceedings of the International Symposium on Software Testing and Analysis (ISSTA), pages 22-32, 2002.
• D. Liang, M. Pennings, M. J. Harrold. Evaluating the impact of context-sensitivity on Andersen's algorithm for Java programs. Workshop on Program Analysis for Software Tools and Engineering (PASTE), pp 6-12, 2005.
References
• A. Milanova, A. Rountev, and B. G. Ryder. Parameterized object-sensitivity for points-to and side-effect analyses for Java. In International Symposium on Software Testing and Analysis (ISSTA), pages 1-11, 2002.
• A. Milanova, A. Rountev, B. G. Ryder. Parameterized object sensitivity for points-to analysis for Java. ACM Transactions on Software Engineering and Methodology, Volume 14, Number 1, pp 1-41, January 2005.
• M. Naik, A. Aiken, J. Whaley. Effective static race detection for Java. PLDI'06, pp 308-319.
• N. Oxhoj, J. Palsberg, and M. Schwartzbach. Making type inference practical. In European Conference on Object-Oriented Programming, pages 329-349, 1992.
• J. Palsberg and M. Schwartzbach. Object-oriented type inference. In Conference on Object-Oriented Programming Systems, Languages, and Applications, pages 146-161, 1991.
References
• J. Plevyak and A. Chien. Precise concrete type inference for object-oriented languages. In Proceedings of the Conference on Object-Oriented Programming Systems, Languages and Applications (OOPSLA '94), pages 324-340, October 1994.
• A. Rountev, A. Milanova, B. G. Ryder. Points-to analysis for Java using annotated constraints. OOPSLA'01.
• E. Ruf. Effective synchronization removal for Java. In Conference on Programming Language Design and Implementation, pages 208-218, 2000.
• B. G. Ryder. Dimensions of precision in reference analysis of object-oriented programming languages. Invited paper in the Proceedings of the Twelfth International Conference on Compiler Construction, Warsaw, Poland, April 2003, pp 126-137.
• B. Steensgaard. Points-to analysis in almost linear time. In Conference Record of the Twenty-third Annual ACM SIGACT/SIGPLAN Symposium on Principles of Programming Languages, pages 32-41, 1996.
References
• M. Sridharan and R. Bodik. Refinement-based context-sensitive points-to analysis for Java. PLDI'06, pp 387-400.
• V. Sundaresan, et al. Practical virtual method call resolution for Java. OOPSLA'00.
• F. Tip and J. Palsberg. Scalable propagation-based call graph construction algorithms. OOPSLA'00.
• J. Whaley and M. Lam. An efficient inclusion-based points-to analysis for strictly-typed languages. In Static Analysis Symposium, 2002.
• J. Whaley and M. Lam. Cloning-based context-sensitive pointer alias analysis using binary decision diagrams. PLDI'04, pp 131-144.
• J. Zhu and S. Calman. Symbolic pointer analysis revisited. PLDI'04, pp 145-157.