BlazeMeter - Effective Performance Reporting

Post on 09-May-2015


DESCRIPTION

This topic focuses on effective reporting and its associated challenges when using JMeter. It covers the importance of metrics and KPIs for effective performance reporting, followed by a brief overview of JMeter's built-in listeners (reporting elements) such as the Aggregate Report and Graph listeners. The third and final part covers the inadequacies of these listeners and the use of third-party/external reporting tools that provide enhanced reporting (ant + xslt). The new BlazeMeter reporting plugin is introduced as a quick, ready-to-use solution for JMeter reporting.

Sub-topics:

* Importance of effective performance test reporting
* Typical performance testing metrics
* JMeter reporting entities (listeners)
* Shortcomings of existing JMeter reporting elements
* Generating advanced JMeter reports using ant + xslt
* Building reporting tools and frameworks
* How the BlazeMeter reporting plugin can alleviate the challenges in JMeter reporting
* Details of the BlazeMeter reporting plugin

TRANSCRIPT

EFFECTIVE PERFORMANCE REPORTING USING

APACHE JMETER

JULY 31, 2012

THE LOAD TESTING CLOUD: A DEV-TEST CLOUD SERVICE 100% COMPATIBLE WITH THE OPEN-SOURCE APACHE JMETER

AGENDA

Performance Attributes

Creating Load Test Reports

Understanding Performance KPIs

JMeter Reporting Elements

BlazeMeter Reporting Plugin

Generating Advanced JMeter Reports

PERFORMANCE ATTRIBUTES

• Speed / Responsiveness

• How fast does the page load?

• How quickly can the system process a transaction?

• Scalability

• Can the application handle the expected end user load?

• Does the application throughput degrade as the user load increases?

PERFORMANCE ATTRIBUTES…

• Efficiency and Capacity Planning

• Are you using the right resources?

• Can your infrastructure carry the load?

• Reliability / Availability / Recoverability

• What is the mean time between failures (MTBF)?

• Does the application recover after a crash? Does it lose user data after a crash?

UNDERSTANDING PERFORMANCE KPIS

Internet

Application Metrics • Response Time • Throughput • Error Rate

Browser Rendering Metrics* • Total Rendering Time • Heavy Images/CSS/JS • DNS Lookup

System Metrics • CPU • Memory • Disk / IO • Network

Platform Metrics • DB • App-server • Application

[Charts: Response Time vs. User Load and Requests/sec vs. User Load, illustrating the KPIs from the end-user and server perspectives]

UNDERSTANDING PERFORMANCE KPIS…

Response Time

• Measured from the end-user perspective

• Time taken to completely respond to a request

• TTLB / TTFB

Total Response Time = Network Latency + Application Latency + Browser Rendering Time

[Diagram: end user connecting over the Internet to the Web Server, App Server, and DB Servers]

Throughput

• Transactions are specific to applications

• In its simplest form, it is requests / sec

Throughput = [TRANSACTIONS] / Second

Error

• Defined in terms of the success of the request

• Errors at the HTTP level (404, 501)

• Application-level errors
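As a minimal sketch (not code from the deck), the three KPIs defined above can be computed from raw request samples; the sample data and measurement window below are invented for illustration:

```python
# Illustrative sketch of the three KPIs: average response time,
# throughput (requests/sec), and error rate. The samples and the
# measurement window are made-up example values.

def compute_kpis(samples, window_seconds):
    """samples: list of (elapsed_ms, success) tuples for one window."""
    total = len(samples)
    avg_response_ms = sum(elapsed for elapsed, _ in samples) / total
    throughput_rps = total / window_seconds            # requests per second
    errors = sum(1 for _, ok in samples if not ok)     # HTTP or app-level failures
    error_rate_pct = 100.0 * errors / total
    return avg_response_ms, throughput_rps, error_rate_pct

samples = [(120, True), (340, True), (95, False), (210, True)]
avg_ms, rps, err_pct = compute_kpis(samples, window_seconds=2)
```

Note that this measures the server-side view only; browser rendering time (from the Total Response Time formula above) is not visible in such samples.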

CREATING LOAD TEST REPORTS

Capture Application Metrics • Response Time • Throughput • Errors

Capture Server Metrics • CPU / Memory / Disk / IO • Network • Application • Platform

Tables • Response Time (avg/min/max/%/stddev) • Throughput (average) • Errors (success % / types)

Graph / Charts • Scatter / Line • Overlay

Correlate Application Metrics • User Load - Response Time • User Load - Throughput • User Load - Errors

Correlate System Metrics • User Load - Server Metrics • User Load - Network • User Load - Platform

Summarize • Overall Performance • Important Trends • Threshold Violations

Trends / Thresholds • Response Time Trends • Throughput Trends • Threshold Violation • Utilization (Server Metrics) Trends

1. Capture

2. Correlate

3. Plot / Tabulate

4. Trends / Thresholds

5. Customize / Summarize

6. Compare
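Steps 4 and 6 above can be sketched in a few lines of Python; the labels, averages, and thresholds are illustrative assumptions, not real test data:

```python
# Sketch of steps 4 (thresholds) and 6 (compare): flag per-label
# threshold violations and compare a current run against a baseline.
# All numbers below are invented example values.

def threshold_violations(averages_ms, thresholds_ms):
    """Return labels whose average response time exceeds their threshold."""
    return [label for label, avg in averages_ms.items()
            if avg > thresholds_ms.get(label, float("inf"))]

def regression_pct(baseline_ms, current_ms):
    """Positive value means the current run is slower than the baseline."""
    return 100.0 * (current_ms - baseline_ms) / baseline_ms

run_current = {"login": 180.0, "search": 450.0}
thresholds = {"login": 200.0, "search": 400.0}
violations = threshold_violations(run_current, thresholds)
slowdown = regression_pct(400.0, 450.0)
```

The same comparison applied across many runs yields the response-time and throughput trends listed above.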

SAMPLE REPORT ELEMENTS (SNAPSHOTS)

Photo Credits:

• http://msdn.microsoft.com/en-us/library/bb924371.aspx

• Sanitized past projects

JMETER REPORTING ELEMENTS (LISTENERS)

• JMeter Listeners

• JMeter elements that display performance test metrics / output

• Various types of Listeners (Raw / Aggregated / Graphical)

• Listeners have no inherent capability to measure system metrics*

• Useful for basic analysis

JMeter Report using xslt stylesheet

• Style-sheet under ‘extras’ folder

• .jtl output must be in xml format

– jmeter.save.saveservice.output.format=xml

• Integrate using ant
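As a sketch of the ant integration, an Ant target can transform the XML .jtl into an HTML report with Ant's built-in xslt task; this is modeled on the build.xml shipped under JMeter's 'extras' folder, and the paths and stylesheet file name here are assumptions:

```xml
<!-- Illustrative Ant target; results.jtl must have been saved with
     jmeter.save.saveservice.output.format=xml. Paths and the stylesheet
     name are assumptions based on JMeter's 'extras' folder. -->
<target name="report">
  <xslt in="results.jtl"
        out="report.html"
        style="extras/jmeter-results-detail-report_21.xsl"/>
</target>
```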

GENERATING ADVANCED JMETER REPORTS

Other Reporting Options

• JMeter CSV results + Excel

• Process results programmatically

(perl / python etc.)

• BlazeMeter Reporting Plug-in
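The "process results programmatically" option above can be sketched with Python's standard library alone. The column names assume JMeter's default CSV header fields (timeStamp, elapsed, label, success); adjust them if your save-service configuration differs:

```python
# Hedged sketch: aggregate a JMeter CSV .jtl per request label.
# Assumes the default CSV header fields "label", "elapsed", "success".
import csv
from collections import defaultdict

def aggregate_jtl(path):
    """Return {label: {"count": n, "avg_ms": x, "errors": n}}."""
    acc = defaultdict(lambda: {"count": 0, "total_ms": 0, "errors": 0})
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            s = acc[row["label"]]
            s["count"] += 1
            s["total_ms"] += int(row["elapsed"])
            s["errors"] += 0 if row["success"] == "true" else 1
    return {label: {"count": s["count"],
                    "avg_ms": s["total_ms"] / s["count"],
                    "errors": s["errors"]}
            for label, s in acc.items()}
```

From here the per-label dictionaries can be exported to Excel or plotted, as the slide suggests.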

Photo Credits:

• http://www.programmerplanet.org/pages/projects/jmeter-ant-task.php

WHAT HAPPENED TO LABEL A AND KPI B AT TIME C?

BLAZEMETER REPORTING PLUGIN

BENEFITS

• Store a report per test run, including:

• Script that was used to run the test

• Logs & JTL file

• Compare results of two test runs

• See an improvement trend

• Compare current with previous in real time

• Share with co-workers

KPIS AVAILABLE IN A JMETER TEST

RESPONSE TIME - THE TIME IT TAKES A REQUEST TO FULLY LOAD

• Indicates the performance level of the entire system under test (web server + DB).

• Represents the average response time during a specific minute of the test.

BLAZEMETER REPORTING PLUGIN - COMPARE TWO REPORTS

‘BlazeMeter - Changing the Economics of Load Testing via the Cloud’

‘BlazeMeter - Code probing, not Angry Birds will define cloud’s success’

‘BlazeMeter - Startup Offers JMeter Cloud Load Testing at Scale’

THANK YOU!

HTTP://BLAZEMETER.COM/
