Barry Boehm, Supannika Koolmanojwong, CSCI 577. Ref: Qi Li’s lectures.
DESCRIPTION
Continuous Integration, Personas, CRUD Analysis, MoSCoW, Software Test Automation, TDD, Agile Testing, SW design patterns, Walkthrough, V&V standards, Defect Tracking Tools, Code Inspection, “Bug” is found, Software Reliability, Software Metrics, HCI. http://www.testingreferences.com/testingtimeline/testingtimeline.jpg
TRANSCRIPT
University of Southern California
Center for Systems and Software Engineering
Software Testing
Barry Boehm, Supannika Koolmanojwong
CSCI 577
Ref: Qi Li’s lectures
Testing timeline: http://www.testingreferences.com/testingtimeline/testingtimeline.jpg
Continuous Integration
Personas
CRUD Analysis
MoSCoW
Software Test Automation
TDD
Agile Testing
SW design pattern
Walkthrough
V&V standard
Defect Tracking Tool
Code Inspection
“Bug” is found
Software Reliability
Software Metrics
HCI
Positive and Negative Testing
• Positive Testing
– Do “normal” user actions
– Find cases where the program does not do what it is supposed to do
– Test with valid input
• Negative Testing
– Do “abnormal” user actions
– Find cases where the program does things it is not supposed to do
– Test with invalid input
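As a minimal sketch of the distinction (not from the slides), the unittest example below exercises a hypothetical parse_age() function with valid input (positive test) and with invalid input (negative tests); the function name and its rules are assumptions made only for illustration.

```python
import unittest

def parse_age(text):
    """Hypothetical function under test: convert a string to an age between 0 and 150."""
    value = int(text)                      # raises ValueError for non-numeric input
    if value < 0 or value > 150:
        raise ValueError("age out of range")
    return value

class AgeTests(unittest.TestCase):
    def test_valid_age(self):
        # Positive test: "normal" input, expect the documented behavior.
        self.assertEqual(parse_age("42"), 42)

    def test_non_numeric_age(self):
        # Negative test: invalid input must be rejected, not silently accepted.
        with self.assertRaises(ValueError):
            parse_age("forty-two")

    def test_out_of_range_age(self):
        # Negative test: boundary-violating input.
        with self.assertRaises(ValueError):
            parse_age("-1")

if __name__ == "__main__":
    unittest.main()
```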
Outline
• Software Test in General
• Value-based Software Test
Most Common Software Problems
• Incorrect calculations
• Incorrect data edits & ineffective data edits
• Incorrect matching and merging of data
• Data searches that yield incorrect results
• Incorrect processing of data relationships
• Incorrect coding / implementation of business rules
• Inadequate software performance
Most Common Software Problems (cont.)
• Confusing or misleading data
• Software usability by end users
• Obsolete software
• Inconsistent processing
• Unreliable results or performance
• Inadequate support of business needs
• Incorrect or inadequate interfaces with other systems
• Inadequate performance and security controls
• Incorrect file handling
Cost to fix faults (relative cost by phase in which the fault is found):
– Definition: 1×
– Development: 1.5× to 6×
– Post Release: 60× to 100×
Objectives of testing
• Executing a program with the intent of finding an error.
• To check if the system meets the requirements and can be executed successfully in the intended environment.
• To check if the system is “fit for purpose”.
• To check if the system does what it is expected to do.
A good test:
• A good test case is one that has a high probability of finding an as-yet-undiscovered error.
• A successful test is one that uncovers an as-yet-undiscovered error.
• A good test is not redundant.
• A good test should be “best of breed”.
• A good test should be neither too simple nor too complex.
Objective of a Software Tester
• Find bugs as early as possible and make sure they get fixed.
• Understand the application well.
• Study the functionality in detail to find where bugs are likely to occur.
• Study the code to ensure that each and every line of code is tested.
• Create test cases in such a way that testing uncovers the hidden bugs and also ensures that the software is usable and reliable.
Static and Dynamic Verification
• Software reviews, inspections and walkthroughs: concerned with analysis of the static system representation to discover problems (static verification).
• Software testing with test cases: concerned with exercising and observing product behaviour (dynamic verification).
– The system is executed with test data and its operational behaviour is observed.
Software reviews
• Peer review
• Code review
• Walkthrough
• Design review board
• Software control board
• Change control board
• Mission assurance
Inspection
• Fagan Inspection
– Process: Planning, Overview meeting, Preparation, Inspection meeting, Rework, Follow-up
– Inspection roles: Author, Moderator, Reader, Recorder, Inspector
– Inspection types: Code review, peer review
– Inspect: Requirements spec, System architecture, Programming, Test scripts
Inspections and testing
• Inspections and testing are complementary, not opposing, verification techniques.
• Both should be used during the V&V process.
• Inspections can check conformance with a specification, but not conformance with the customer’s real requirements.
• Inspections cannot check non-functional characteristics such as performance, usability, etc.
Test data and test cases
• Test data: inputs which have been devised to test the system.
• Test cases: inputs to test the system and the predicted outputs from these inputs if the system operates according to its specification.
Methods of testing
• Test to specification:
– Black box
– Data-driven
– Functional testing
– Code is ignored: only use the specification document to develop test cases.
• Test to code:
– Glass box / white box
– Logic-driven testing
– Ignore the specification and only examine the code.
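A small illustration of the two methods, using a hypothetical discount function (the function and its rules are assumed here, not taken from the slides): the black-box test is derived only from the stated specification, while the white-box test is written after reading the code to make sure every branch is exercised.

```python
import unittest

def discount(order_total):
    """Hypothetical spec: orders of 100 or more get 10% off; smaller orders get no discount."""
    if order_total >= 100:                  # branch A
        return round(order_total * 0.90, 2)
    return order_total                      # branch B

class BlackBoxTests(unittest.TestCase):
    # Derived from the specification document only; the code is never examined.
    def test_boundary_values_from_spec(self):
        self.assertEqual(discount(100), 90.0)
        self.assertEqual(discount(99.99), 99.99)

class WhiteBoxTests(unittest.TestCase):
    # Derived by examining the code: make sure both branches (A and B) execute.
    def test_both_branches_covered(self):
        self.assertEqual(discount(250), 225.0)   # branch A
        self.assertEqual(discount(10), 10)       # branch B

if __name__ == "__main__":
    unittest.main()
```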
Testing Levels
• Unit testing
• Integration testing
• System testing
• Acceptance testing
Embedded software testing
• Hardware / software integration
• HOOTL – Human out of the loop
• HITL – Human in the loop
• Run-for-record
• TLYF – Test like you fly
Unit testing
• The most ‘micro’ scale of testing.
• Tests done on particular functions or code modules.
• Requires knowledge of the internal program design and code.
• Done by programmers (not by testers).
• Unit testing frameworks: http://en.wikipedia.org/wiki/List_of_unit_testing_frameworks
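As a concrete sketch (not from the slides), the unit test below exercises a single hypothetical ShoppingCart module in isolation using Python’s built-in unittest framework; the class and its behavior are assumptions made for illustration.

```python
import unittest

class ShoppingCart:
    """Hypothetical module under test."""
    def __init__(self):
        self.items = {}

    def add(self, name, price, quantity=1):
        if price < 0 or quantity < 1:
            raise ValueError("invalid price or quantity")
        self.items[name] = self.items.get(name, 0) + quantity * price

    def total(self):
        return sum(self.items.values())

class ShoppingCartTest(unittest.TestCase):
    def setUp(self):
        # A fresh fixture for every test keeps cases independent and repeatable.
        self.cart = ShoppingCart()

    def test_total_of_added_items(self):
        self.cart.add("pen", 2.50, quantity=4)
        self.cart.add("notebook", 7.00)
        self.assertAlmostEqual(self.cart.total(), 17.00)

    def test_rejects_negative_price(self):
        with self.assertRaises(ValueError):
            self.cart.add("pen", -1.00)

if __name__ == "__main__":
    unittest.main()
```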
Don’t forget rainy day scenarios
• Corner cases / off-nominal cases / alternate courses / performance / robustness
• Generate sunny day scenarios based on use cases and/or requirements.
• Generate rainy day scenarios that correspond to the previously defined sunny day scenarios (when something goes wrong):
– Negative
– Boundary
Integration Testing
• Testing of combined parts of an application to determine their functional correctness.
• ‘Parts’ can be:
– code modules
– individual applications
– client/server applications on a network
• Types of integration testing (a stub-based sketch follows this list):
– Top-down
– Bottom-up
– Sandwich
– Big-bang
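A minimal sketch of top-down integration, with a not-yet-integrated payment module replaced by a stub so the higher-level checkout flow can already be exercised; all module and function names here are illustrative assumptions, not part of the slides.

```python
import unittest

# Higher-level module already integrated.
def checkout(order_total, payment_gateway):
    """Charge the customer and report whether the order can be confirmed."""
    receipt = payment_gateway.charge(order_total)
    return receipt["status"] == "approved"

# Stub standing in for the real payment module during top-down integration.
class PaymentGatewayStub:
    def charge(self, amount):
        # Canned response; the real gateway is integrated and tested later.
        return {"status": "approved", "amount": amount}

class CheckoutIntegrationTest(unittest.TestCase):
    def test_checkout_with_payment_stub(self):
        self.assertTrue(checkout(49.99, PaymentGatewayStub()))

if __name__ == "__main__":
    unittest.main()
```

In bottom-up integration the roles are reversed: the low-level modules are tested first, and a temporary driver calls them in place of the not-yet-written higher-level code.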
Systems Testing
• To test the co-existence of products and applications that are required to perform together in a production-like operational environment (hardware, software, network).
• To ensure that the system functions together with all the components of its environment as a total system.
• To ensure that system releases can be deployed in the current environment.
Acceptance Testing
– Objectives: To verify that the system meets the user requirements
– When: After system testing
– Input: Business needs & detailed requirements; Master test plan; User acceptance test plan
– Output: User acceptance test report
Load testing
– Testing an application under heavy loads.
– E.g., testing a web site under a range of loads to determine when the system response time degrades or fails.
Stress Testing
– Testing under unusually heavy loads, heavy repetition of certain actions or inputs, input of large numerical values, large complex queries to a database, etc.
– The term is often used interchangeably with ‘load’ and ‘performance’ testing.
Performance testing
– Testing how well an application complies with performance requirements.
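In practice these tests are usually run with a dedicated tool such as JMeter or LoadRunner (listed on the tools slide below), but the idea can be sketched in a few lines: fire increasingly large batches of concurrent requests at a URL and watch how response time degrades. The target URL and load steps here are placeholders, not from the slides.

```python
import time
from concurrent.futures import ThreadPoolExecutor
from urllib.request import urlopen

TARGET_URL = "http://localhost:8000/"   # placeholder; point at the system under test

def timed_request(_):
    """Issue one request and return its response time in seconds."""
    start = time.perf_counter()
    with urlopen(TARGET_URL, timeout=10) as response:
        response.read()
    return time.perf_counter() - start

def run_load_step(concurrent_users):
    """Simulate one load step with N simultaneous users; report average latency."""
    with ThreadPoolExecutor(max_workers=concurrent_users) as pool:
        latencies = list(pool.map(timed_request, range(concurrent_users)))
    return sum(latencies) / len(latencies)

if __name__ == "__main__":
    # Increase the load step by step to see where response time degrades or requests fail.
    for users in (1, 10, 50, 100):
        print(f"{users:>4} users -> avg response {run_load_step(users):.3f}s")
```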
Alpha testing
• Testing done when development is nearing completion; minor design changes may still be made as a result of such testing.
Beta testing
• Testing done when development and testing are essentially completed and final bugs and problems need to be found before release.
Test tools
• Crucible - collaborative peer code review
• Phabricator - peer code review tool
• Collaborator - code review tool
• Tarantula - test management system
• Capybara - web app testing tool
• IBM AppScan - security testing tool
• LoadRunner - performance testing tool
• LoadComplete - load testing
• Tsung - load testing tool
• Load Impact - performance testing service on the cloud
• HP QuickTest Professional - functional testing
• Hyperion - load testing tool
• AppPerfect Java Unit Test
• AppPerfect Load Test
• AppPerfect Web Test
• AppPerfect App Test
• AppPerfect Java Code Test
• JMeter
• Cucumber - BDD tool
• Lettuce - BDD tool
• Badboy - web automated testing tool
• Robot Framework - test automation framework
• Geb - browser automation solution
• Sikuli - GUI testing
• Selenium - Selenium IDE
• Selenium - Selenium WebDriver
Good Test Plans (1/2)
• Developed and reviewed early.
• Clear, complete, and specific.
• Specify tangible deliverables that can be inspected.
• Staff knows what to expect and when to expect it.
Good Test Plans (2/2)
• Realistic quality levels for goals
• Includes time for planning
• Can be monitored and updated
• Includes user responsibilities
• Based on past experience
• Recognizes learning curves
Test Cases: Contents
– Test plan reference id
– Test case
– Test condition
– Expected behavior
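One way to keep these fields consistent is to record each test case as a small structured entry; the sketch below carries the four fields from this slide, with made-up IDs and values for illustration.

```python
# Illustrative test case record with the four fields listed above.
test_case = {
    "test_plan_reference_id": "TP-577-01",
    "test_case": "TC-LOGIN-003",
    "test_condition": "Login with an expired password",
    "expected_behavior": "User is redirected to the password-reset page; no session is created",
}

# A pass/fail verdict compares the observed behavior against the expected behavior.
observed_behavior = "User is redirected to the password-reset page; no session is created"
print("PASS" if observed_behavior == test_case["expected_behavior"] else "FAIL")
```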
Good Test Cases: Find Defects
• Have a high probability of finding a new defect.
• Produce an unambiguous, tangible result that can be inspected.
• Are repeatable and predictable.
Good Test Cases (cont.)
• Traceable to requirements or design documents
• Push the system to its limits
• Execution and tracking can be automated
• Do not mislead
• Feasible
Bug prioritization
Outline
• Software Test in General
• Value-based Software Test
Tester’s Attitude and Mindset
“The job of tests, and the people that develop and run tests, is to prevent defects, not to find them.”
– Mary Poppendieck
Pareto 80-20 distribution of test case value [Bullock, 2000]
[Figure: two charts of “% of value for correct customer billing” (0–100) versus customer type (1–15). One shows the actual business value, which follows the Pareto 80-20 distribution; the other shows an automated test generation tool, under which all tests have equal value.*]
*Usual SwE assumption for all requirements, objects, defects, …
Business Case for Value-Based Testing
[Figure: return on investment (ROI, roughly -1 to 2) versus % of tests run (0–100), comparing Pareto testing with ATG (automated test generation) testing.]
• How can we compare the value of test cases?
• How do we prioritize test cases?
• How do we measure the value of test cases?
Value-based Software Testing Framework: Feature Prioritization
How much test is enough?
Li, Q., Yang, Y., Li, M., Wang, Q., Boehm, B. W., and Hu, C., “Improving software testing process: feature prioritization to make winners of success-critical stakeholders,” Journal of Software Maintenance and Evolution: Research and Practice. doi: 10.1002/smr.512
Value-based Test Case Prioritization
[State diagram: a test case is one of Not-Tested-Yet, Ready-to-Test, Passed, Failed, or NA. A test case becomes Ready-to-Test when it has no dependencies or all test cases in its Dependencies Set have passed; when a test case fails, the status of every test case that depends on it is changed to NA.]
Value-based Test Order Logic
• Value first: test the test case with the highest value.
• Dependency second: if the test case with the highest value is not “Ready-to-Test”, then at least one of the test cases in its Dependencies Set is “Not-Tested-Yet”. In that situation, prioritize the “Not-Tested-Yet” test cases in the Dependencies Set by “Value first” and test them until all test cases in the Dependencies Set are “Passed”; the test case with the highest value is then “Ready-to-Test”.
• Shrink the prioritization set ASAP: exclude the tested test case from the prioritization set.
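A compact sketch of this ordering rule, assuming each test case carries a numeric value and a set of dependencies; the three test cases at the bottom are made-up data, loosely modeled on the dependency-tree figure that follows.

```python
def value_based_order(test_cases, run):
    """
    test_cases: dict mapping name -> {"value": number, "deps": set of names}
    run:        callable(name) -> True if the test case passes, False if it fails
    Returns the order in which test cases were executed.
    """
    status = {name: "Not-Tested-Yet" for name in test_cases}
    executed = []

    def mark_na(failed):
        # Change the status to NA for every test case that depends on the failed one.
        for name, tc in test_cases.items():
            if failed in tc["deps"] and status[name] == "Not-Tested-Yet":
                status[name] = "NA"
                mark_na(name)

    def ready(name):
        return all(status[d] == "Passed" for d in test_cases[name]["deps"])

    while any(s == "Not-Tested-Yet" for s in status.values()):
        # Value first: pick the highest-value case still in the prioritization set.
        candidate = max((n for n, s in status.items() if s == "Not-Tested-Yet"),
                        key=lambda n: test_cases[n]["value"])
        # Dependency second: descend into unfinished dependencies, highest value first.
        while not ready(candidate):
            pending = [d for d in test_cases[candidate]["deps"]
                       if status[d] == "Not-Tested-Yet"]
            if not pending:              # a dependency already failed, so this case is NA
                status[candidate] = "NA"
                break
            candidate = max(pending, key=lambda n: test_cases[n]["value"])
        if status[candidate] == "NA":
            continue
        status[candidate] = "Passed" if run(candidate) else "Failed"
        executed.append(candidate)
        if status[candidate] == "Failed":
            mark_na(candidate)           # shrink the set: its dependents can no longer run
    return executed

# Made-up example: TC3.1.1 has the highest value but depends on TC1.1.1.
cases = {
    "TC1.1.1": {"value": 8,  "deps": set()},
    "TC1.1.2": {"value": 1,  "deps": {"TC1.1.1"}},
    "TC3.1.1": {"value": 16, "deps": {"TC1.1.1"}},
}
print(value_based_order(cases, run=lambda name: True))
# -> ['TC1.1.1', 'TC3.1.1', 'TC1.1.2']
```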
Value-based Test Order Logic (cont.)
[Flowchart: value-based prioritization for one regression testing round, repeated over multiple regression rounds until all test cases are “Passed”. In the whole set, pick the test case with the highest value. If it has dependencies that have not all passed, prioritize within its Dependencies Set first. Once a “Ready-to-Test” case is run, exclude the “Passed” one from prioritization; if it failed, exclude the “Failed” one and the other “NA” cases that depend on it.]
Test Case Dependency Tree
[Figure: an example dependency tree rooted at Start; each test case TCx.y.z is annotated with a pair of numbers, e.g., TC1.1.1 (8, 8), TC2.1.1 (12, 9.3), TC3.1.1 (16, 11), TC4.1.1 (15, 11.8), TC5.1.1 (2, 6).]
Accumulated Cost-Effectiveness (ACE) of Test
Test Plan
• What, when, where, how, by whom?
– Type of testing
– Timeline
– Developers’ machine / server
– Tools, HW, SW
– Responsible person
• Traceability Matrix (see the sketch below)
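A traceability matrix can be as simple as a mapping from each requirement to the test cases that cover it, which also makes uncovered requirements easy to spot; the requirement and test case IDs below are made up for illustration.

```python
# Illustrative requirement-to-test-case traceability matrix.
traceability = {
    "REQ-01 User can log in":           ["TC-LOGIN-001", "TC-LOGIN-002"],
    "REQ-02 Password reset email":      ["TC-RESET-001"],
    "REQ-03 Session expires after 30m": [],   # not covered yet
}

# Report requirements that no test case traces back to.
uncovered = [req for req, tcs in traceability.items() if not tcs]
print("Requirements without test coverage:", uncovered or "none")
```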
Test Cases
Examples
Developer: There is no I in TEAM
Tester: We cannot spell BUGS without U