Using QTP for Load and Performance Testing of Rich Client Applications
DESCRIPTION
An overview by Capgemini on using HP QTP for load and performance testing.
TRANSCRIPT
CEA v6.4 ©2010 Hewlett-Packard Development Company, L.P. The information contained herein is subject to change without notice
BTOT-WE-1145-8
Twitter hashtag #HPSWU
Stephan Wiesner
Barcelona, 2011
Using QTP for Load and Performance Testing of Rich Client Applications
CEA v6.4 © 2010 Capgemini - All rights reserved. HP_BARCELONA_WIESNER_06.PPT
AGENDA
• Introduction
• Challenges
• Solutions
• Lessons Learned
Performance testing is performed to determine how fast some aspect of a system performs under a particular workload.
Performance testing definition
• Endurance Testing: usually done to determine whether the application can sustain the continuous expected load.
• Load Testing: usually conducted to understand the behaviour of the application under a specific expected load.
• Stress Testing: used to understand the upper limits of capacity within the application landscape.
• Evaluation: evaluate the results, perform bottleneck analysis, perform optimisation.
The customer performed a major hardware upgrade for its critical trading system, which is based on standard software
Customer
• Large Swiss utility company
• Central trading department
• 24x7 business
• Approx. 40 active traders
• Performance crucial for business transactions

Changes
• Replace the host database with a modern Oracle-based database system
• Replace all server hardware
• Virtualisation of all Windows-based servers
• Implement high-availability measures

Test project
• Perform functional regression testing
• Define and test high-availability scenarios
• Plan, test and manage the system migration
• Plan, execute and evaluate extensive performance tests!
Overview of the situation
The (new) system under test is built on a typical 3-tier architecture
Simplified system view
[Diagram: Windows client systems A…N connect through a load balancer to two OAS instances on Solaris systems, backed by a central DB server.]
QTP 9.2 was already used for the automation of functional regression tests on the test systems
Overview of test systems and tools

Test systems
• New production system
• 5 test systems, one of them an exact duplicate of production

Test tools
• QC 9.2 for requirements, test cases and test scripts
• QTP 9.2 for automated regression testing

Reqs / test cases
• No formal requirements (standard software)
• Some test cases from old projects
AGENDA
• Introduction
• Challenges
• Solutions
• Lessons Learned
The performance testing faced three major challenges: unclear requirements, budget constraints, and old technologies
Major challenges and derived measures
Time & Budget
• 4 weeks of total time
• No money for new tools
• No in-house expertise

Requirements
• "Double the speed"
• Critical workflows
• Number of users

Technology
• Client: Oracle Forms 6.0
• No API for automation
• No test data generation

Derived measures for successful performance testing:
• Perform GUI-based test automation
• Reuse as much as possible
• Perform requirements gathering and prioritisation
The (performance) testing of standard software has some very specific challenges
Comparison of custom and standard software concerning testing
Functional requirements
• Custom software: the customer specifies the requirements; functional test cases are derived from the requirements
• Standard software: the customer has "expectations"; test cases are derived from the manual or from best practices

Nonfunctional requirements
• Custom software: nonfunctional requirements are part of the contract
• Standard software: the software is provided on an "as is" basis

Implementing changes
• Custom software: the contract specifies a change process; custom release cycle
• Standard software: changes can affect all customers; regular releases

Test automation
• Custom software: the customer can influence development, and test automation can be part of the contract (e.g. automatic smoke tests per release)
• Standard software: the customer has low influence; changes have a high impact on automation by the customer
Requirements and change management are crucial for performance testing of standard software
Reuse of existing automated workflow tests and the lack of API support can be a challenge
[Diagram: the system under test is a black box. The business logic, database, test data and the connected systems A and B sit behind APIs; the automated workflow tests, functions and test data can only reach the system through the GUI.]
QTP simulates real users – using mouse, keyboard and screen
Only ONE instance of QTP can run on any single computer!
AGENDA
• Introduction
• Challenges
• Solutions
• Benefits and Lessons Learned
A subset of already automated workflow test cases was selected for performance testing
Customer & Test Team
Test case transformation process: from automated workflow tests to performance tests
Select test cases that simulate a realistic workload:
1. Select test cases with the customer
2. Adapt for concurrent execution
3. Adapt for performance testing (data, logging, etc.)
Using 20 QTP licences and virtual systems enabled us to simulate 20 concurrent users
Overview of the test system and usage of QTP
[Diagram: 20 virtual Windows clients (VM Client 01–20), each running one load-test QTP instance, connect through a firewall to the load-balanced (NLB) OAS instances and the DB server on the Solaris systems.]
Using QTP for performance tests means a lot of manual work
List of manual tasks (x 20 clients):
1. Log into the virtual client
2. Start QTP
3. Load the script
4. Start execution
5. Stop execution
6. Aggregate log files
7. Perform analysis
8. Optional: reporting
Can we automate this?
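The fan-out idea behind those manual steps can be sketched as follows. This is an illustration only: the client names and the `run_on_client` stub are hypothetical, and in the project this role was filled by QC's distributed test execution (next slide), not by custom code.

```python
# Sketch: fan one test run out to 20 load clients in parallel.
from concurrent.futures import ThreadPoolExecutor

CLIENTS = [f"VM-Client-{i:02d}" for i in range(1, 21)]  # hypothetical names

def run_on_client(client, script):
    # Stub: in reality this step would log into the client, start QTP,
    # load the script and trigger the execution remotely.
    return (client, script, "PASSED")

def run_load_test(script):
    # Kick off all clients in parallel and collect one result each.
    with ThreadPoolExecutor(max_workers=len(CLIENTS)) as pool:
        futures = [pool.submit(run_on_client, c, script) for c in CLIENTS]
        return [f.result() for f in futures]

results = run_load_test("workflow_FU")
print(len(results))  # prints 20 (one result per client)
```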
QualityCenter (QC) can execute distributed test scripts
Overview of the test system and usage of QTP
[Diagram: as before, but QC 9.2 now controls the 20 VM clients (VM Client 01–20) running the load-test QTP instances behind the firewall, against the load-balanced (NLB) OAS instances and the DB server.]
Performance tests were performed on production data
Overview of the data replication and migration process
1. Export the production data
2. Import into the test systems (T03, T05, T06, T07) and isolate
3. DB migration
4. Optional: perform data validation
5. Optional: duplicate data
6. Optional: import into a reference database
The use of production data on test systems requires some kind of isolation to ensure safe testing
Isolation of production data
1. Export the productive database
2. Select the isolation script depending on the target system (internal systems, internal systems with external connections, partner systems)
3. Run the isolation
The scripts are reusable, under version control, and parameterised. The goal is to control the communication with internal and external systems.
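The isolation step can be illustrated with a minimal sketch. The table and column names below are entirely hypothetical; the real project used versioned, parameterised SQL scripts per target test system. The idea shown is the same: redirect every external interface to a stub so no test run can reach a real partner system.

```python
import sqlite3

def isolate(conn, target_system):
    """Point every external interface at a stub endpoint (hypothetical schema)."""
    cur = conn.cursor()
    cur.execute(
        "UPDATE interfaces SET endpoint = ? WHERE kind = 'external'",
        ("stub://" + target_system,),
    )
    conn.commit()
    return cur.rowcount  # number of redirected interfaces

# Demo on an in-memory stand-in for the imported production data
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE interfaces (name TEXT, kind TEXT, endpoint TEXT)")
conn.executemany(
    "INSERT INTO interfaces VALUES (?, ?, ?)",
    [("billing", "external", "https://partner.example"),
     ("audit", "internal", "local://audit")],
)
print(isolate(conn, "T03"))  # prints 1 (only the external interface is touched)
```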
All tests were executed on the old and the new system to compare the results
Test execution workflow: QC 9.2 drives the QTP 9.2 scripts on the VM clients against both the old and the new system, each loaded with the same data snapshot (1.10.200X).
1. Reset the data
2. Execute the scripts
3. Perform reporting
Although the requirements only demanded end-to-end performance, we introduced several additional points of measurement
Examples of points of measurement:
1. Start script
2. Log in
3. Screen 1
4. Screen 2 (start calculation, end calculation)
5. …
6. Screen N
7. Log out
Log in and log out should be measured separately. The intermediate points are a good entry point for bottleneck analysis. The customer demanded end-to-end execution time (2x faster).
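The measurement idea can be sketched as a small checkpoint recorder: mark a named point at each step and report the duration between consecutive points. In the project this lived in the central QTP (VBScript) library; the class and names here are illustrative only.

```python
import time

class Checkpoints:
    """Record named timestamps and report durations between them."""
    def __init__(self):
        self.points = []

    def mark(self, name):
        self.points.append((name, time.monotonic()))

    def durations(self):
        # Pair each checkpoint with its predecessor: (name, seconds since previous)
        return [(b[0], b[1] - a[1])
                for a, b in zip(self.points, self.points[1:])]

cp = Checkpoints()
cp.mark("start script")
cp.mark("log in")
cp.mark("screen 1")
for name, secs in cp.durations():
    print(f"{name}: {secs:.3f}s")
```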
Log files contained status, duration, workflow ID, etc.
Example extract from a log file:

Time              Instance  WF       Duration  Status
08.10.2008 16:51  QTP_01    COE      132.58    PASSED
08.10.2008 16:54  QTP_01    CM       130.47    PASSED
08.10.2008 16:56  QTP_01    RM        76.98    PASSED
08.10.2008 16:57  QTP_01    LI        34.48    PASSED
08.10.2008 17:40  QTP_01    FU        52.36    FAILED
08.10.2008 17:41  QTP_01    SF        53.73    PASSED
08.10.2008 17:42  QTP_01    BS Mess   67.67    PASSED
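Aggregating such log lines across 20 clients is straightforward to script. A minimal sketch, using the column layout from the slide (splitting on whitespace assumes workflow IDs without spaces; the "BS Mess" entry shows that real data would need a more robust split):

```python
# Aggregate per-workflow average duration and pass rate from log lines.
lines = [
    "08.10.2008 16:51 QTP_01 COE 132.58 PASSED",
    "08.10.2008 16:54 QTP_01 CM 130.47 PASSED",
    "08.10.2008 17:40 QTP_01 FU 52.36 FAILED",
    "08.10.2008 17:41 QTP_01 FU 53.73 PASSED",
]

stats = {}  # workflow ID -> durations, pass/total counters
for line in lines:
    date, clock, instance, wf, duration, status = line.split()
    entry = stats.setdefault(wf, {"durations": [], "passed": 0, "total": 0})
    entry["durations"].append(float(duration))
    entry["total"] += 1
    entry["passed"] += status == "PASSED"

for wf, e in sorted(stats.items()):
    avg = sum(e["durations"]) / len(e["durations"])
    print(f"{wf}: avg {avg:.2f}s, {e['passed']}/{e['total']} passed")
```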
We defined test scenarios for full load, base load and average load by varying the number of concurrent users (1/2)
Three test scenarios – don't change think times!
[Chart: concurrent users per scenario – Base Load: 1, Average Load: 10, Full Load: 20.]
We defined test scenarios for full load, base load and average load by varying the number of concurrent users (2/2)
[Chart: example report comparing the execution times of workflows WF A, WF B and WF C under new base/average/full load and old average/full load.]
The old system was highly sensitive to load differences; the new system was not. The new database had 32 CPUs (vs. 4). Running 20 QTP instances proved that the application supported concurrent execution!
AGENDA
• Introduction
• Challenges
• Solutions
• Lessons Learned
Lesson learned: capture/replay is not sufficient – use a central library and global configurations
Use of a central library and parameterisation

Central execution flow:

    ' A negative value never reaches 0, so the loop runs until stopped
    iterations = Environment("v_NumberOfIterations")
    InitLog
    While iterations <> 0
        RunWorkflow("FU")
        RunWorkflow("SF")
        [. . .]
        If (iterations > 0) Then
            iterations = iterations - 1
        End If
    Wend

Central configuration file:

    <Variable>
        <Name>v_NumberOfIterations</Name>
        <Value>2</Value>
    </Variable>
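Inside QTP such values are read via `Environment(...)`. Outside QTP, reading the same configuration file is a few lines; a sketch using Python's ElementTree, with the surrounding `<Environment>` root element assumed:

```python
import xml.etree.ElementTree as ET

# Sample config matching the slide; the <Environment> root is an assumption.
CONFIG = """
<Environment>
  <Variable>
    <Name>v_NumberOfIterations</Name>
    <Value>2</Value>
  </Variable>
</Environment>
"""

def read_env(xml_text):
    """Return a dict of variable names to values from the config XML."""
    root = ET.fromstring(xml_text)
    return {var.findtext("Name"): var.findtext("Value")
            for var in root.iter("Variable")}

env = read_env(CONFIG)
print(env["v_NumberOfIterations"])  # prints 2
```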
Lesson learned: find a way to stop test execution “smartly”
How to stop running tests using “stop files”
Modes of performance test execution:
• Number of runs
• Time frame
• Ramp-up (and down)
And in case of error? A hard stop of the QTP scripts via the stop button results in “probably” corrupt data.
Better: after each block of code:
• Check for the existence of a stop file
• If the file exists, perform a controlled stop
• Else go to the next block
Try to preserve consistent data.
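The stop-file pattern is language-agnostic; the QTP version does the same check in VBScript between workflow blocks. A minimal Python sketch, with the stop-file path and block names as illustrative assumptions:

```python
import os
import tempfile

# Hypothetical stop-file location; the real path is a project convention.
STOP_FILE = os.path.join(tempfile.gettempdir(), "qtp_stop_demo.flag")

def should_stop():
    return os.path.exists(STOP_FILE)

def run_blocks(blocks):
    """Run blocks in order; honour a stop request only between blocks."""
    done = []
    for block in blocks:
        if should_stop():
            break  # controlled stop between blocks keeps the data consistent
        block()
        done.append(block.__name__)
    return done

def wf_fu():  # stands in for a workflow block
    pass

def wf_sf():
    pass

if os.path.exists(STOP_FILE):
    os.remove(STOP_FILE)
ran_all = run_blocks([wf_fu, wf_sf])   # no stop file: both blocks run

open(STOP_FILE, "w").close()           # operator requests a stop
ran_none = run_blocks([wf_fu, wf_sf])  # stop honoured before the first block
os.remove(STOP_FILE)
print(ran_all, ran_none)
```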
Lessons learned

Test data
• Use production data (anonymised and/or isolated)
• Define a (fast) way to reset the data
• Use read-only tests (where possible)
• Clean up data after test runs wherever possible

Test systems
• Virtualisation cuts costs and eases system management

QTP
• Use QC test execution controls
• Define a central library (logging, bottleneck analysis, error handling, etc.)
Continue the conversation with your peers at the HP Software Community hp.com/go/swcommunity