User-session based Testing of Web Applications
Two Papers
A Scalable Approach to User-session based Testing of Web Applications through Concept Analysis
– Uses concept analysis to reduce test suite size
An Empirical Comparison of Test Suite Reduction Techniques for User-session-based Testing of Web Applications
– Compares concept analysis to other test suite reduction techniques
Talk Outline
Introduction
Background
– User-session Testing
– Concept Analysis
Applying Concept Analysis
– Incremental Reduced Test Suite Update
– Empirical Evaluation (Incremental vs. Batch)
Empirical Comparison of Concept Analysis to other Test Suite Reduction Techniques
Conclusions
Characteristics of Web-based Applications
Short time to market
Integration of numerous technologies
Dynamic generation of content
May contain millions of LOC
Extensive use
– Need for high reliability, continuous availability
Significant interaction with users
Changing user profiles
Frequent small maintenance changes
User-session Testing
User session
– A collection of user requests in the form of URL and name-value pairs
User sessions are transformed into test cases
– Each logged request in a user session is changed into an HTTP request that can be sent to a web server
Previous studies of user-session testing
– Previous results showed fault detection capabilities and cost effectiveness
– Will not uncover faults associated with rarely entered data
– Effectiveness improves as the number of sessions increases (downside: cost increases as well)
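The transformation of a logged session into replayable requests can be sketched in a few lines. The log format (URL plus name-value pairs) and the example session below are hypothetical illustrations, not the actual tool's format:

```python
# Minimal sketch: turn a logged user session into request strings
# that could be sent back to the web server. The (url, params) log
# shape and the example data are invented for illustration.
from urllib.parse import urlencode

def session_to_requests(logged_requests):
    """Convert (url, name-value-pairs) entries into HTTP request URLs."""
    requests = []
    for url, params in logged_requests:
        query = urlencode(params)
        requests.append(f"{url}?{query}" if query else url)
    return requests

session = [
    ("http://store.example/Login", {"user": "alice"}),
    ("http://store.example/Search", {"title": "compilers"}),
]
print(session_to_requests(session))
```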
Contributions
View user sessions as use cases
Apply concept analysis for test suite reduction
Perform incremental test suite update
Automate testing framework
Evaluate cost effectiveness
– Test suite size
– Program coverage
– Fault detection
Concept Analysis
Technique for clustering objects that have common discrete attributes
Input:
– Set of objects O
– Set of attributes A
– Binary relation R
  • Relates objects to attributes
  • Implemented as a Boolean-valued table
    – A row for each object in O
    – A column for each attribute in A
  • Table entry [o, a] is true if object o has attribute a, otherwise false
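Such a relation table is easy to sketch; the session names and URLs below are invented for illustration:

```python
# A sketch of the Boolean-valued relation table: a row per object
# (user session), a column per attribute (URL). Data is invented.
objects = ["us1", "us2", "us3"]
attributes = ["Login", "Search", "Checkout"]
# R relates each object to the attributes it has.
R = {"us1": {"Login", "Search"},
     "us2": {"Login", "Checkout"},
     "us3": {"Login", "Search", "Checkout"}}
# Entry [o, a] is True iff object o has attribute a.
table = {(o, a): a in R[o] for o in objects for a in attributes}
print(table[("us1", "Search")])   # True
print(table[("us2", "Search")])   # False
```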
Concept Analysis (2)
Identifies concepts given (O, A, R)
Concept is a tuple (Oi, Aj)
– Concepts form a partial order
Output:
– Concept lattice represented by a DAG
  • Node represents concept
  • Edge denotes the partial ordering
– Top element T = most general concept
  • Contains attributes that are shared by all objects in O
– Bottom element ⊥ = most special concept
  • Contains objects that have all attributes in A
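The concepts of a small relation can be enumerated naively, as below. This brute-force sketch (exponential in |A|, with invented data) is for illustration only; it is not the algorithm used in the papers:

```python
# Naive concept enumeration for a small invented (O, A, R).
# A concept pairs a set of objects with exactly the attributes
# they all share, and no other object shares all those attributes.
from itertools import combinations

O = ["us1", "us2", "us3"]
A = ["Login", "Search", "Checkout"]
R = {"us1": {"Login", "Search"},
     "us2": {"Login", "Checkout"},
     "us3": {"Login", "Search", "Checkout"}}

def extent(attrs):
    """All objects that have every attribute in attrs."""
    return frozenset(o for o in O if attrs <= R[o])

def intent(objs):
    """All attributes shared by every object in objs."""
    shared = set(A)
    for o in objs:
        shared &= R[o]
    return frozenset(shared)

# Close every attribute subset to get the concept set.
concepts = {(extent(set(c)), intent(extent(set(c))))
            for r in range(len(A) + 1)
            for c in combinations(A, r)}
# Print most general (largest extent) first.
for objs, attrs in sorted(concepts, key=lambda c: -len(c[0])):
    print(sorted(objs), sorted(attrs))
```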
Concept Analysis for Web Testing
Binary relation table
– User session s = object
– URL u = attribute
– A pair (s, u) is in the relation table if s requests u
Concept Lattice Explained
Top node T
– Most general concept
– Contains URLs that are requested by all user sessions
Bottom node ⊥
– Most special concept
– Contains user sessions that request all URLs
Examples:
– Identification of common URLs requested by 2 user sessions
  • us3 and us4
– Identification of user sessions that jointly request 2 URLs
  • PL and GS
Concept Analysis for Test Suite Reduction
Exploit lattice’s hierarchical use-case clustering
Heuristic
– Identify smallest set of user sessions that will cover all URLs executed by original suite
Incremental Test Suite Update
Incremental Test Suite Update (2)
Incremental algorithm by Godin et al.
– Create new nodes/edges
– Modify existing nodes/edges
Next-to-bottom nodes may rise up in the lattice
Existing internal nodes never sink to the bottom
Test cases are not maintained for internal nodes
Set of next-to-bottom nodes (user sessions) form the test suite
Web Testing Framework
Empirical Evaluation
Test suite reduction
– Test suite size
– Replay time
– Oracle time
Cost-effectiveness of incremental vs. batch concept analysis
Program coverage
Fault detection capabilities
Experimental Setup
Bookstore Application
– 9748 LOC
– 385 methods
– 11 classes
JSP front-end, MySQL backend
123 user sessions
40 seeded faults
Test Suite Reduction
Metrics
– Test suite size
– Replay time
– Oracle time
Incremental vs. Batch Analysis
Metric
– Space costs
– Relative sizes of files required by incremental and batch techniques
Methodology
– Batch: 123 user sessions processed
– Incremental: 100 processed first, then 23 incrementally
Program Coverage
Metrics
– Statement coverage
– Method coverage
Methodology
– Instrumented Java classes using Clover
– Restored database state before replay
– Wget for replaying user sessions
Fault Detection Capability
Metric
– Number of faults detected
Methodology
– Manually seeded 40 faults into separate copies of the application
– Replayed user sessions through
  • Correct version to generate expected output
  • Faulty version to generate actual output
– Diff expected and actual outputs
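The diff-based oracle can be sketched as below; the page outputs are invented examples standing in for real server responses:

```python
# Sketch of the diff-based oracle: compare pages produced by the
# correct version against those produced by a fault-seeded version.
# The HTML snippets are invented examples.
import difflib

def detects_fault(expected_pages, actual_pages):
    """True if any replayed page differs between the two versions."""
    return any(e != a for e, a in zip(expected_pages, actual_pages))

expected = ["<html>Total: $30</html>"]   # from the correct version
actual = ["<html>Total: $0</html>"]      # from a seeded-fault version

print(detects_fault(expected, actual))   # True
for line in difflib.unified_diff(expected, actual,
                                 "correct", "faulty", lineterm=""):
    print(line)
```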
Empirical Comparison of Test Suite Reduction Techniques
Empirical Comparison of Test Suite Reduction Techniques
Compared 3 variations of Concept with 3 requirements-based reduction techniques
– Random
– Greedy
– Harrold, Gupta, and Soffa’s reduction (HGS)
Each requirements-based reduction technique satisfies program or URL coverage
– Statement, method, conditional, URL
Random and Greedy Reduction
Random
– Selection process continues until reduced test suite satisfies some coverage criterion
Greedy
– Each subsequent test case selected provides maximum coverage of some criterion
– Example:
  • Select us6 – maximum URL coverage
  • Then, select us2 – most marginal improvement for all-URL coverage criterion
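Greedy selection over the all-URLs criterion can be sketched as below. The session-to-URL mapping is invented, though chosen so the result mirrors the example on this slide (us6 first, then us2):

```python
# Greedy reduction sketch: repeatedly pick the session that covers
# the most not-yet-covered URLs. The session/URL data is invented.
def greedy_reduce(sessions):
    """sessions: dict mapping session name -> set of URLs it requests."""
    remaining = set().union(*sessions.values())
    suite = []
    while remaining:
        # session with maximum marginal coverage of remaining URLs
        best = max(sessions, key=lambda s: len(sessions[s] & remaining))
        if not sessions[best] & remaining:
            break  # nothing left can add coverage
        suite.append(best)
        remaining -= sessions[best]
    return suite

sessions = {
    "us2": {"GM", "PL"},
    "us3": {"PL", "GB"},
    "us6": {"PL", "GB", "GS"},
}
print(greedy_reduce(sessions))   # ['us6', 'us2']
```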
HGS Reduction
Selects a representative set from the original by approximating the optimal reduced set
Requirement cardinality = # of test cases covering that requirement
Select most frequently occurring test case with lowest requirement cardinality
Example:
– Consider requirement with cardinality 1 – GM
  • Select us2
– Consider requirements with cardinality 2 – PL and GB
– Select test case that occurs most frequently in the union
  • us6 occurs twice, us3 and us4 once
  • Select us6
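A simplified sketch of the HGS selection follows (real HGS breaks ties by examining requirements of the next cardinality; this version just takes the most frequent test). The requirement-to-test mapping is inferred from the slide's example, so the exact coverage sets are an assumption:

```python
# Simplified HGS sketch: requirements map to the tests covering them
# (cardinality = size of that set). Process requirements in order of
# increasing cardinality, picking the most frequent covering test.
from collections import Counter

requirements = {          # assumed data, mirroring the slide's example
    "GM": {"us2"},
    "PL": {"us3", "us6"},
    "GB": {"us4", "us6"},
}

def hgs_reduce(reqs):
    selected, uncovered = [], dict(reqs)
    def mark(test):
        selected.append(test)
        for r in [r for r, ts in uncovered.items() if test in ts]:
            del uncovered[r]  # requirement now covered
    for card in sorted({len(ts) for ts in reqs.values()}):
        while True:
            group = [ts for ts in uncovered.values() if len(ts) == card]
            if not group:
                break
            # test occurring most frequently among this cardinality group
            counts = Counter(t for ts in group for t in ts)
            mark(counts.most_common(1)[0][0])
    return selected

print(hgs_reduce(requirements))   # ['us2', 'us6']
```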
Empirical Evaluation
Test suite size
Program coverage
Fault detection effectiveness
Time cost
Space cost
Experimental Setup
Bookstore application
Course Project Manager (CPM)
– Create grader/group accounts
– Assign grades, create schedules for demo time
– Send notification emails about account creation, grade postings
Test Suite Size
Suite Size Hypothesis
– Larger suites than:
  • HGS and Greedy
– Smaller suites than:
  • Random
– More diverse in terms of use case representation
Results
– Bookstore application:
  • HGS-S, HGS-C, GRD-S, GRD-C created larger suites
– CPM:
  • Larger suites than HGS and Greedy
  • Smaller than Random
Test Suite Size (2)
Program Coverage
Coverage Hypothesis
– Similar coverage to:
  • Original suite
– Less coverage than:
  • Suites that satisfy program-based requirements
– Higher URL coverage than:
  • Greedy and HGS with URL criterion
Results
– Program coverage comparable to (within 2% of) PRG_REQ techniques
– Slightly less program coverage than original suite and Random
– More program coverage than URL_REQ techniques, Greedy and HGS
Program Coverage (2)
Fault Detection Effectiveness
Fault Detection Hypothesis
– Greater fault detection effectiveness than:
  • Requirements-based techniques with URL criterion
– Similar fault detection effectiveness to:
  • Original suite
  • Requirements-based techniques with program-based criteria
Results
– Best fault detection but low number of faults detected per test case: Random PRG_REQ
– Similar fault detection to the best PRG_REQ techniques
– Detected more faults than HGS-U
Fault Detection Effectiveness (2)
Time and Space Costs
Costs Hypothesis
– Less space and time than:
  • HGS, Greedy, Random
– Space for Concept Lattice vs. space for requirement mappings
Results
– Costs considerably less than PRG_REQ techniques
– Collecting coverage information for each session is the clear bottleneck of requirements-based approaches
Conclusions
Problems with Greedy and Random reduction
– Non-determinism
– Generated suites with wide range in size, coverage, fault detection effectiveness
Test suite reduction based on concept-analysis clustering of user sessions
– Achieves large reduction in test suite size
– Saves oracle and replay time
– Preserves program coverage
– Preserves fault detection effectiveness
  • Chooses test cases based on use case representation
Incremental test suite reduction/update
– Scalable approach to user-session-based testing of web applications
– Necessary for web applications that undergo constant maintenance, evolution, and usage changes
References
Sreedevi Sampath, Valentin Mihaylov, Amie Souter, Lori Pollock, "A Scalable Approach to User-session based Testing of Web Applications through Concept Analysis," Automated Software Engineering Conference (ASE), September 2004.
Sara Sprenkle, Sreedevi Sampath, Emily Gibson, Amie Souter, Lori Pollock, "An Empirical Comparison of Test Suite Reduction Techniques for User-session-based Testing of Web Applications," International Conference on Software Maintenance (ICSM), September 2005.