TRANSCRIPT
© 2009 IBM Corporation
IBM(R) Rational(R) Quality Manager in a Globally Distributed World
IBM(R) Rational Team Concert(TM) System Performance Testing
Anu Ramamoorthy, Staff Software Engineer, IBM Rational Software
IBM(R) Rational Team Concert(TM) System Performance Testing CRM30
IBM Rational Software Conference 2009
Session Speaker
Anu Ramamoorthy, Staff Software Engineer, RTC System Verification Test Performance Leader
Anu Ramamoorthy is a Staff Software Engineer working on the Rational System and Integration Test Team. She has worked on a number of test automation and performance projects for IBM Rational Software including ClearCase and ClearCase Remote Client (CCRC). She has been a part of IBM Rational for the past 5 years. Currently, she is the leader for the RTC SVT Performance testing efforts.
Disclaimer
Each IBM Rational Team Concert installation and configuration is unique. The performance data reported in this document are specific to the product software, test configuration, workload, and operating environment that were used. The reported data should not be relied upon, as performance data obtained in other environments or under different conditions may vary significantly. Because our system testing is often based on specific information from customers, its results will not be revealed except through official sources such as developerWorks and release readiness reports.
Session Overview
Today, the need for globally managed operations and lower IT infrastructure costs is becoming a reality for many companies. Companies must be able to adapt their environment to changing demands without having to sacrifice good performance.
In this session, we will discuss our objectives, our use of customer-based workloads, and results from RTC 2.0 SVT performance testing. This includes:
Our observations on the scalability of RTC 2.0 in a typical customer deployment.
Our observations of performance impacts in a high availability network configuration.
Our findings on the impact to RTC server performance with large numbers of contributor licenses.
Performance data resulting from network conditions provided by customers at VoiCE 2008.
Due to the forward-looking nature of this presentation, slides will not be made available prior to the session.
Please check back here after the conference for updated materials.
Agenda
Background
Test Strategy
Establish Baseline for Workload
Develop Automation
Measure Server Load
Collect Metrics
Lab Infrastructure
Repository Details
Test Results
Scalability Testing Results
High Availability Testing Results
Large Contributors Testing Results
WAN Testing Results
Background
RTC 2.0 Product Goal
Deliver global enterprise readiness which includes enhanced scalability and high availability amongst several other features.
Performance Team Goal
Help validate scalability of the RTC Server and provide input into customer collaterals.
Rational Performance Engineering Team (RPE)
System Verification Team (SVT)
Test Strategy: Establish Baseline
Obtain realistic baseline for server workload
Jazz.net used by agile development team familiar with Team Concert features
Server reports web service counters of requests from all users
User actions on client translate to one or more service calls to server
Counters record total calls, response times, bytes sent and received per service call
Establish the simulation target (total calls) by scaling up the baseline service calls
Focus closely on most frequent and time consuming services
[Screenshot: RTC Web Service Counters]
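The baseline-scaling step described above can be sketched as follows. This is a hypothetical illustration: the service names are abbreviated and every number is invented, not actual Jazz.net counter data.

```python
# Hypothetical sketch: scale per-service baseline call rates (observed from a
# known user population) up to a simulation target. All numbers are invented.

baseline_users = 200                       # assumed size of the baseline team
baseline_calls_per_hour = {                # assumed hourly totals from counters
    "IScmService.createBaseline": 40,
    "IWorkItemRestService.getEditorPresentation": 1200,
}

def simulation_target(target_users):
    """Scale each service's baseline hourly call count to the target load."""
    factor = target_users / baseline_users
    return {svc: round(calls * factor)
            for svc, calls in baseline_calls_per_hour.items()}

# Simulating 2,000 users means driving 10x the baseline call volume:
targets = simulation_target(2000)
```

The harness is then calibrated until the observed hourly counters match these targets, with extra weight on the most frequent and most time-consuming services.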
Test Strategy: Develop Automation
Rational Performance Tester (RPT) v 8.0
Build test harness to simulate user transactions
Develop TeamConcert test harness covering key user transactions
HTTP record-and-playback for Work Items, Reports, Iteration Plans
SCM operations using IFileRestClient API
Builds using Ant script to load workspaces, publish status/results
Feed queries using API calls
Generate load across multiple client machines
Multiple clients at high speed to simulate many users
[Diagram: test harness components. Work Item: RPT record-and-playback over HTTP. Source Control: IFileRestClient API over HTTP. Feeds: Feeds API over HTTP. Build: Ant over HTTP.]
Test Strategy: Measure Server Load
Measure and calibrate the load the test harness generated.
Record counters hourly during test run
Compute simulated users by comparing to baseline rates
Ensure response times reasonable and consistent
Compare average response times to Jazz.net baseline
Track trends in response times for consistent performance
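The "compute simulated users" step reduces to a rate comparison against the baseline. A minimal sketch, with invented numbers:

```python
# Hypothetical sketch: infer the simulated user count from observed hourly
# web service counters, given a per-user baseline rate. Numbers are invented.

def estimated_users(observed_calls_per_hour, baseline_calls_per_hour,
                    baseline_users):
    """Observed rate divided by the baseline per-user rate."""
    calls_per_user_per_hour = baseline_calls_per_hour / baseline_users
    return observed_calls_per_hour / calls_per_user_per_hour

# Baseline: 1,200 calls/hour from 200 users, i.e. 6 calls/user/hour.
# Observing 4,200 calls/hour therefore corresponds to 700 simulated users.
users = estimated_users(4200, 1200, 200)
```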
[Figure: Web Service Count Trend Analysis Sample, showing hourly response-time samples (0 to 0.4 seconds over 8 hours) for scm.common.IScmService.createBaseline() and workitem.common.internal.rest.IWorkItemRestService.getEditorPresentation()]
Test Strategy: Collect Metrics
Client metrics
Transaction counts from the low-level framework
RPT Average Response times
Traffic metrics
Web Service Counters
Average response time
Bytes sent / bytes received
Resource Metrics
% CPU Utilization
Memory Used
Disk I/O
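A small sketch of how the periodic sampling of these metrics could be organized. The sampler callables here are stand-ins; in practice they would wrap platform tools such as Windows perfmon or nmon, which is not shown.

```python
import statistics

class MetricsCollector:
    """Record periodic resource samples (CPU %, memory, disk I/O) and
    summarize them. Samplers are injected zero-argument callables so the
    bookkeeping stays platform-neutral."""

    def __init__(self, samplers):
        self.samplers = samplers                       # name -> callable
        self.samples = {name: [] for name in samplers}

    def record(self):
        """Take one sample from every registered sampler."""
        for name, fn in self.samplers.items():
            self.samples[name].append(fn())

    def average(self, name):
        return statistics.mean(self.samples[name])

# Stand-in sampler returning canned CPU readings for illustration:
cpu_values = iter([48.0, 55.0, 53.0])
collector = MetricsCollector({"cpu_percent": lambda: next(cpu_values)})
for _ in range(3):
    collector.record()
avg = collector.average("cpu_percent")    # (48 + 55 + 53) / 3 = 52.0
```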
Test Strategy: Lab Infrastructure
[Diagram: a user-load generation data center (RPT Workbench plus Web, Build, Feed, and three SCM RPT agents) connects through a switch and a network gateway / WAN simulator to the server data center hosting the RTC server (2 CPU, 4 GB RAM), the web server (2 CPU, 4 GB RAM), and the DB server (2 CPU, 4 GB RAM)]
Test Strategy: Repository Details
Realistic repository based on a Jazz.net snapshot: a real database.
The repository has over 2 years of history.
Repository Details
Total Size on Disk: 22 GB
Total Size on Disk, Uncompressed: 56 GB
Total Size Available to DB: 60 GB

Component     # of Items    Size on Disk (MB)
SCM              887,566                2,717
Build            245,550                6,180
Filesystem       193,436               26,387
Work Item         62,812                3,917
Web Transaction Workload
Component: Test Name (% composition) and description

Iteration Plan: Query Current Iteration Plans (3%). Queries all current iteration plans for the Rational Team Concert project; total count = 55.
Work Item: Create and Save Query (10%). Opens, saves, and runs a query for work items submitted by members of the test team area in the SVT Test Project.
Work Item: Create and Save Work Items with Attachment (15%). Opens and saves a work item with a 125 KB attachment in the SVT Test Project.
Work Item: Create and Save Work Items without Attachment (25%). Opens and saves a work item with comments.
Work Item: Query Retrospectives (38%). Runs the predefined query for Retrospective work items in the Rational Team Concert project; total count = 85.
Work Item: Find Work Item via Search (7%). Searches for a previously created work item in the SVT Test Project.
Reports: Query Work Items Closed by Iteration (2%). Runs a report of all work items closed per iteration.
SCM Transaction Workload

Transaction (% composition) and description

Accept (20%): accept change sets.
Deliver (7%): deliver change sets.
Checkin (39%): check in up to 5 files (2 KB to 5 KB each).
Login/Logout (<1%): log in to and out of the repository.
Suspend/Discard (2%): perform the RTC suspend and discard operations.
RefreshPendingChanges (8%): refresh pending changes.
CreateWorkspaceFromStreamandLoad (<1%): create a workspace from an existing stream (5,000 files, 50 MB) and load it.
CompareStreamWorkspace (4%): compare the workspace against the current state of the stream, and the last baseline in the workspace against the stream.
CloneWorkspace (<1%): create a clone of the existing workspace.
CloseChangesets (3%): close existing change sets.
Baseline (3%): create a baseline.
Unload (<1%): unload the workspace.
History (4%): perform a history operation.
CompareWorkspaceBaseline (2%): compare the workspace against the last baseline in the workspace.
Load (<1%): load the workspace.
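A load generator can reproduce a mix like this by drawing each next transaction with probability proportional to its composition percentage. A sketch (the weights below are a subset of the table, with the "<1%" entries omitted):

```python
import random

# Subset of the SCM workload mix above; values are composition percentages.
SCM_MIX = {
    "Accept": 20, "Deliver": 7, "Checkin": 39, "RefreshPendingChanges": 8,
    "CompareStreamWorkspace": 4, "History": 4, "Baseline": 3,
    "CloseChangesets": 3, "Suspend/Discard": 2, "CompareWorkspaceBaseline": 2,
}

def next_transaction(rng=random):
    """Pick the next SCM transaction, weighted by composition percentage."""
    names = list(SCM_MIX)
    weights = [SCM_MIX[n] for n in names]
    return rng.choices(names, weights=weights, k=1)[0]

# Over a long run, roughly 39% of picks are Checkin, 20% Accept, and so on.
```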
Scalability Observations in RTC 2.0
Small Scale Enterprise Testing Results
100 – 700 users
Large Scale Enterprise Testing Results
700 – 2000+ users
Small Scale Enterprise Testing: Topology
Application Server
Websphere Application Server 6.1.0.23; IBM xSeries 336; Intel Xeon 5160, Hyper-Threaded EM64T, 2 Processor, 3.6GHz; 15,000 RPM SCSI Drives; 4GB RAM
DB Server
DB2 9.1 FP4; IBM xSeries 3550; Intel Xeon 5160, Dual Core, 2 Processor, 3.0 GHz; 10,000 RPM SAS Drives; 4GB RAM
Both servers running Windows 2003
Authentication: LDAP (Microsoft Active Directory)
Two-Server Configuration
[Diagram: users connect to an application server instance hosting the RTC application, backed by a separate database server instance holding the repository database]
Small Scale Enterprise Testing: Results
Average Response Time for Web Service Counters: 0.338 seconds
RTC Server Resource Utilization:
Average CPU: 52%
Memory used: 1.2 GB
[Figure: RTC Server % CPU utilization over a roughly 3-hour run (0:00 to 2:52), y-axis 0 to 100%]
*Pre Release Software Data
Large Scale Enterprise Testing: Topology
IBM System x3650 M2; Dual CPU, Intel Xeon 5500, 2.4 GHz or higher, 64-bit
Memory - 18GB or higher
Operating System – Red Hat Release 5.3
Web server - Tomcat 5.5
Database - DB2 9.5 FP 4
Single-Server Configuration
[Diagram: users connect to an application server instance hosting the RTC application; the database server instance with the repository database runs on the same machine]
Large Scale Enterprise Testing: Results
Average Response Time for Web Service Counters: 0.199 seconds
RTC Server Resource Utilization:
Average CPU: 47%
Memory used: 18 GB
[Figure: System Summary, CPU utilization (usr% + sys%, y-axis 0 to 100%) sampled every 12 minutes from 15:44 to 00:08]
*Pre Release Software Data
Key Scalability Results

Single-Tier Small Enterprise Configuration (100-700 users)
Machine: IBM System x3650 M2, single CPU, Intel Xeon 5500, 2.4 GHz or higher, 64-bit
Memory: 12 GB

Dual-Tier Small Enterprise Configuration (100-700 users)
Machine: IBM System x3550, dual CPU, Intel Xeon 5160, 2.4 GHz or higher, 64-bit
Memory: 4 GB / 8 GB

Single-Tier Large Enterprise Configuration (700-2,000 users)
Machine: IBM System x3650 M2, dual CPU, Intel Xeon 5500, 2.4 GHz or higher, 64-bit
Memory: 18 GB

Dual-Tier Large Enterprise Configuration (700-2,000+ users)
Machine: 2 x IBM System x3650 M2, dual CPU, Intel Xeon 5500, 2.4 GHz or higher, 64-bit
Memory: 12 GB

Common to all configurations:
OS: Windows 2003 / RHEL 5.3
Application Server: Tomcat 5.5, or WAS 6.1.0.23 or higher
Database Server: Oracle 10gR2, DB2 9.1, DB2 9.5 FP4, SQL Server 2005 and 2008
Basic High Availability Testing
Idle Standby is a failover strategy for basic high availability. The backup server only becomes active when the primary server fails.
The scenario consists of:
2 WebSphere Application Servers (Version 6.1.0.19)
1 IBM HTTP Server Web Server (Version 6.1)
1 DB2 Server (DB2 9.1, FP4) populated with complex 60 GB repository
Verify that the switchover is seamless.
Adding the web server causes minimal performance impact with 50 users at one transaction every 2 minutes.
[Diagram: users connect over HTTP/HTTPS through an IBM HTTP Server (plugin-cfg) to Primary Server A and Backup Server B, each an application server instance hosting the RTC application and sharing one repository database]
*Async Tasks disabled
Basic High Availability Topology
Idle Standby-Server Configuration – Backup Server enabled
[Diagram: users connect over HTTPS through the IBM HTTP Server (plugin-cfg) to Backup Server B's application server instance, which hosts the RTC application backed by the repository database]
Idle Standby-Server Configuration – Primary Server enabled
[Diagram: the same topology with traffic routed to Primary Server A instead]
Basic High Availability Performance: Results
Performance impact: most web transactions showed better performance with the web server.
SCM transactions were mostly similar with or without the web server.
[Figure: HA Configuration Web Server Impact, comparing response times (0 to 4 seconds) with and without the web server for SCM transactions (Deliver, Unload, Baseline, History) and web pages (web_login_RTC, web_wi_newWorkItem, web_ip_selectPlan, web_wi_runQuery)]
*Pre Release Software Data
Large numbers of Contributor Users Testing: Results
Simulated 1000 users @ 5 pages per hour on Intel Xeon 5160 (Medium scale user hardware)
Average Response Time for Web Service Counters: 0.227 seconds
RTC Server Resource Utilization:
Average CPU: 36%
Average Memory used: 998 MB
[Figure: RTC Server % CPU utilization over a roughly 3.5-hour run (0:00 to 3:21), ranging between about 0 and 80%]
*Pre Release Software Data
WAN Testing Performance: Results
Response time comparison against a server loaded with 250 users
LAN: < 5 ms latency
High-latency WAN: 150 ms latency
A simple file copy (Robocopy) of a 323 KB file took 14 seconds over the WAN.
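A back-of-envelope way to read that Robocopy number: on a high-latency link, a chatty protocol is bounded by serial round trips rather than bandwidth. A sketch of the arithmetic (the round-trip figure derived below is an inference, not a measured protocol trace):

```python
# Illustration: wall-clock lower bound for a latency-bound (round-trip
# dominated) operation, ignoring bandwidth entirely.

def latency_bound_time(round_trips, latency_s):
    """Lower bound on elapsed time if every round trip completes serially."""
    return round_trips * latency_s

# 14 s for a 323 KB copy at 150 ms latency is consistent with on the order
# of 14 / 0.15, i.e. roughly 93 serial round trips:
implied_round_trips = 14 / 0.15
```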
*Pre Release Software Data
[Figure: LAN vs. WAN comparison (SCM), response times 0 to 6 seconds for the History, Baseline, Checkin, Login, and RefreshChanges transactions]
Conclusion
Indicators of a highly loaded server
CPU above 95% over extended periods of time.
Timeout errors in the application server logs.
Average web service response times trending upward over time.
Key Results
A configuration like our Large Enterprise Configuration can support 2,000 users; a configuration like our Small Enterprise Configuration can support 700 users.
Contributors don’t add significant load to the server.
WAN performance is good for most transactions.
Idle Standby solution with additional Web server adds only minimal overhead.
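The overload indicators listed above can be turned into a simple automated check. A sketch with arbitrary thresholds (the 20% slowdown cutoff is an assumption for illustration, not a product recommendation):

```python
import statistics

def is_overloaded(cpu_samples, response_time_samples, cpu_threshold=95.0):
    """Flag a server as highly loaded if CPU stays above the threshold in
    every sample, or if average response times trend upward."""
    sustained_cpu = all(c > cpu_threshold for c in cpu_samples)
    # Crude trend test: the later half of the run is >20% slower on average.
    half = len(response_time_samples) // 2
    early = statistics.mean(response_time_samples[:half])
    late = statistics.mean(response_time_samples[half:])
    rising = late > early * 1.2
    return sustained_cpu or rising

healthy = is_overloaded([50, 48, 55, 52], [0.20, 0.21, 0.19, 0.20])
loaded = is_overloaded([97, 98, 96, 99], [0.20, 0.25, 0.40, 0.60])
```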
© Copyright IBM Corporation 2009. All rights reserved. The information contained in these materials is provided for informational purposes only, and is provided AS IS without warranty of any kind, express or implied. IBM shall not be responsible for any damages arising out of the use of, or otherwise related to, these materials. Nothing contained in these materials is intended to, nor shall have the effect of, creating any warranties or representations from IBM or its suppliers or licensors, or altering the terms and conditions of the applicable license agreement governing the use of IBM software. References in these materials to IBM products, programs, or services do not imply that they will be available in all countries in which IBM operates. Product release dates and/or capabilities referenced in these materials may change at any time at IBM’s sole discretion based on market opportunities or other factors, and are not intended to be a commitment to future product or feature availability in any way. IBM, the IBM logo, Rational, the Rational logo, Telelogic, the Telelogic logo, and other IBM products and services are trademarks of the International Business Machines Corporation, in the United States, other countries or both. Other company, product, or service names may be trademarks or service marks of others.