TRANSCRIPT
Mobile Performance Testing – Not the old wine in a new bottle
Vijay Chelliahdhas
Agenda
Introduction
Why is the mobile testing landscape complex?
Traditional Performance Testing Methods
What is the delta focus on Mobile PT?
Mobile PT Strategy
o Device Profiling
o Network Analysis
o Server side PT
o User Experience Testing – Mobile Web Performance
o User Experience Testing – Integrated Testing
o Mobile PT in Agile Development
Conclusion
My Introduction
14+ yrs in IT and Performance Engineering
DBA > Performance Engineer > NFT Consultant
Expertise in
Performance Engineering strategies for large scale complex distributed systems
Non-functional Risk Assessment
Setting up Performance Test Centers of Excellence
Current responsibilities @ HCL: Practice Lead – NFT and Mobile Testing
200+ clients served worldwide
10,000+ professional testers worldwide
$400+ Mn revenues earned
40+ million-dollar clients
20 global testing delivery centers
Why is Mobile Testing complex?
Traditional Performance Testing
Application layer, database layer
Load, stress, and endurance testing
Load simulators
What is the delta focus on Mobile PT? – Not the same old wine
Thick client
Unpredictable network conditions
Unpredictable user patterns
"User experience"
Varying device specifications
Mobile PT Strategy – 360-degree approach: Device, Network, Server
Memory Analysis
o Total size of memory allocation
o Object de-allocation and memory-leak analysis
Application crash analysis
Thread profiling
Method profiling – time-consuming methods
CPU utilization (%)
Battery and Network Analysis
Network connectivity for all pages
Network performance with different data networks and WiFi
# of HTTP redirects
# of DNS lookups
Parameters to Measure
o Response time (Performance)
o Transactions per second (Capacity)
o Memory management, configuration constraints (Reliability)
o Break points in the end-to-end application architecture (Capacity & Reliability)
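The capacity parameters above are linked: Little's Law ties sustained transactions per second, per-transaction response time, and the concurrency a load simulator must supply. A minimal sketch, with all numbers hypothetical:

```python
# Little's Law: concurrency = throughput (TPS) * time per transaction (s).
# Useful for sanity-checking a workload model before a capacity test.
# All numbers below are hypothetical, for illustration only.

def required_concurrency(tps: float, response_time_s: float, think_time_s: float = 0.0) -> float:
    """Virtual users needed to sustain `tps` with the given per-request times."""
    return tps * (response_time_s + think_time_s)

# e.g. 50 TPS at a 2 s response time with 8 s of user think time:
vusers = required_concurrency(tps=50, response_time_s=2, think_time_s=8)
print(vusers)  # 500.0
```

If measured response time degrades under load, the same identity shows how many more virtual users are needed to hold the target TPS.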
Device-side Profiling
CPU usage
Memory usage
Thread analysis – allocation, leaks, zombies
Heap analysis
Graphics performance
Battery usage
Export and trace data
Profile code
Client-side performance profiling should be a "proactive approach" rather than an afterthought
Xcode Instruments – Instruments User Guide:
https://developer.apple.com/library/prerelease/watchos/documentation/DeveloperTools/Conceptual/InstrumentsUserGuide/TheInstrumentsWorkflow.html
Android DDMS – Using DDMS:
http://developer.android.com/tools/debugging/ddms.html
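Alongside DDMS or Instruments, small scripts help automate metric collection across iterations. The sketch below parses an `adb shell dumpsys meminfo <package>` dump for the total PSS figure; the sample output is abbreviated and illustrative (the exact layout varies by Android version), and the helper name is my own:

```python
import re

# Extract total PSS (KB) from `adb shell dumpsys meminfo <package>` output.
# The layout differs across Android versions; the capture below is an
# abbreviated, hand-made sample, not real device output.
SAMPLE = """\
Applications Memory Usage (in Kilobytes):
Uptime: 5113919 Realtime: 5113919

** MEMINFO in pid 4376 [com.example.listorder] **
                   Pss  Private  Private
                 Total    Dirty    Clean
                ------   ------   ------
  Native Heap    10468    10404        0
  Dalvik Heap     7095     6912        0
         TOTAL    49398    41234      152
"""

def total_pss_kb(dumpsys_output: str) -> int:
    """Return the total PSS in KB from a dumpsys meminfo dump."""
    for line in dumpsys_output.splitlines():
        m = re.match(r"\s*TOTAL\s+(\d+)", line)
        if m:
            return int(m.group(1))
    raise ValueError("no TOTAL row found")

print(total_pss_kb(SAMPLE))  # 49398
```

Capturing this per test step makes allocation growth and leak suspects visible over time rather than only at crash.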
Network Analysis
More bandwidth “does not mean” better performance
HTTP redirects
DNS lookups
WiFi, 4G, 3G connection performance
Latency
Packet loss, jitter
Bandwidth utilization
Fiddler: http://www.telerik.com/fiddler
Charles Proxy: http://www.charlesproxy.com/
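Proxy tools such as Fiddler or Charles can export captures as HAR (HTTP Archive) files, which can then be mined for the metrics above. A minimal sketch over a hand-made sample HAR (not a real capture):

```python
import json

# Count redirects and DNS lookups, and sum network wait time, from a HAR
# capture (e.g. exported from Fiddler or Charles). Sample data is hand-made.
HAR = json.loads("""{
  "log": {"entries": [
    {"request": {"url": "http://m.example.com/"},
     "response": {"status": 301},
     "timings": {"dns": 40, "connect": 60, "wait": 120, "receive": 10}},
    {"request": {"url": "http://m.example.com/home"},
     "response": {"status": 200},
     "timings": {"dns": -1, "connect": -1, "wait": 240, "receive": 80}},
    {"request": {"url": "http://cdn.example.net/app.js"},
     "response": {"status": 200},
     "timings": {"dns": 35, "connect": 55, "wait": 90, "receive": 200}}
  ]}
}""")

entries = HAR["log"]["entries"]
redirects = sum(1 for e in entries if 300 <= e["response"]["status"] < 400)
dns_lookups = sum(1 for e in entries if e["timings"]["dns"] > 0)  # -1 = reused connection
total_wait_ms = sum(max(v, 0) for e in entries for v in e["timings"].values())

print(redirects, dns_lookups, total_wait_ms)  # 1 2 930
```

Re-running the same capture over WiFi, 4G, and 3G makes it concrete that more bandwidth does not automatically mean better performance: redirect and DNS counts stay constant while latency dominates.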
Server side Performance Testing
Performance tests "without diagnostics are only academic"
Test Scenario: ListOrder              Platform: Android
Transaction Complexity: Complex       OS Version: KitKat
Test Type: Device Profiling           Device: Samsung Galaxy S5
Tested Date: 9-Sep                    Network Connection: WiFi
Iteration #: 2                        Signal Strength: Good
Binary Release: R4
Test Status: Failed
Transaction Name                               Response Time (s)  % CPU Utilization  Heap Usage (MB)  % Heap Usage  Total Heap (MB)  Thread Wait Count at AsyncTask #1
CMO_S01_T01_ListOrder_TM_V01_Login             13                 71                 34.68            71.89         48.24            36
CMO_S01_T02_ListOrder_TM_V01_SelectCustomer    10                 74                 35.93            74.47         48.24            36
CMO_S01_T03_ListOrder_TM_V01_SelectDepartment  10                 68                 34.94            72.43         48.24            36
CMO_S01_T04_ListOrder_TM_V01_NewOrder          4                  72                 35.35            73.27         48.24            36
CMO_S01_T05_ListOrder_TM_V01_SelectList        17                 77                 44.41            91.32         48.64            36
CMO_S01_T06_ListOrder_TM_V01_ReviewOrder       12                 80                 40.37            80.10         50.40            36
CMO_S01_T07_ListOrder_TM_V01_Order details     8                  68                 34.49            68.43         50.40            61
CMO_S01_T08_ListOrder_TM_V01_UpdateOrder       9                  68                 34.18            67.82         50.40            61
CMO_S01_T09_ListOrder_TM_V01_Submit Order      9                  68                 34.43            68.31         50.40            61
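The report's aggregates can be reproduced mechanically from the table; a sketch using data transcribed from the rows above (the helper names and thresholds quoted are from the observations, the script itself is my own):

```python
# Recompute the report's aggregates from the table above:
# (transaction, response time s, % CPU, % heap usage).
ROWS = [
    ("Login", 13, 71, 71.89), ("SelectCustomer", 10, 74, 74.47),
    ("SelectDepartment", 10, 68, 72.43), ("NewOrder", 4, 72, 73.27),
    ("SelectList", 17, 77, 91.32), ("ReviewOrder", 12, 80, 80.10),
    ("OrderDetails", 8, 68, 68.43), ("UpdateOrder", 9, 68, 67.82),
    ("SubmitOrder", 9, 68, 68.31),
]
RT_THRESHOLD_S = 10   # per-screen threshold quoted in the observations
CPU_THRESHOLD = 30    # % CPU threshold quoted in the observations

avg_cpu = sum(r[2] for r in ROWS) / len(ROWS)
avg_heap = sum(r[3] for r in ROWS) / len(ROWS)
slow_screens = [r[0] for r in ROWS if r[1] > RT_THRESHOLD_S]

print(round(avg_cpu, 1))   # 71.8 - the report quotes ~71%, well above 30%
print(round(avg_heap, 1))  # 74.2 - the report quotes ~74%
print(slow_screens)        # the 3 screens over the 10 s threshold
```

Scripting the aggregation keeps repeated iterations comparable and avoids hand-calculation slips across test reports.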
Mobile Performance Test Report – Observations
Response time: response times for all screens exceed the SLA, and 3 screens exceed the threshold of 10 seconds as indicated above
CPU utilization: average of 71%, which is higher than the threshold of 30%
Heap usage: average heap usage is at 74%; exactly at the 'Select List' step there is a 10 MB increase in heap usage, but that heap is reclaimed after 'Review Order'
Thread analysis: constant increase in 'wait'-state threads for the Finalizer daemon; exactly at the 'Order Details' step, about 25 threads went into 'wait' state for the 'AsyncTask #1' thread (start count is 36 threads) and remained so through the rest of the steps
Memory leaks: no apparent memory leaks as of this iteration
User Experience Analysis – Mobile Web Performance
User-experienced response time monitoring and waterfall analysis
User-experienced network monitoring and relevant analysis
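One way to decompose user-experienced response time is from navigation-timing-style marks captured in the mobile browser. A sketch with hypothetical values shaped like the W3C Navigation Timing fields (milliseconds since navigation start):

```python
# Decompose page-load time from navigation-timing style marks.
# All values are hypothetical, shaped like the W3C Navigation Timing fields.
timing = {
    "navigationStart": 0, "domainLookupStart": 5, "domainLookupEnd": 45,
    "connectStart": 45, "connectEnd": 130, "requestStart": 131,
    "responseStart": 480, "responseEnd": 690, "domInteractive": 1400,
    "loadEventEnd": 2350,
}

breakdown = {
    "dns_ms": timing["domainLookupEnd"] - timing["domainLookupStart"],
    "connect_ms": timing["connectEnd"] - timing["connectStart"],
    "server_ms": timing["responseStart"] - timing["requestStart"],   # time to first byte
    "download_ms": timing["responseEnd"] - timing["responseStart"],
    "page_load_ms": timing["loadEventEnd"] - timing["navigationStart"],
}
print(breakdown)
```

A waterfall built from such a breakdown shows at a glance whether the user's wait is dominated by the network (DNS/connect), the server (time to first byte), or the client (render after response end).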
User Experience Analysis – Integrated Testing
[Architecture diagram: load generators, both local and cloud-based, coordinated by a load controller, driving the application under test through a load balancer over 3G/4G networks]
Client-side profiling for native apps: response time, network monitoring
Script-driven response time capture using cloud-based solutions
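Script-driven response-time capture can be as simple as wrapping each client action in a timer and reporting percentiles. A minimal sketch (the `action` stand-in and the nearest-rank percentile are my own choices, not a specific tool's API):

```python
import time

# Script-driven response-time capture: wrap any client action in a timer,
# collect the samples, and report percentiles.
def timed(action, *args):
    """Run `action` (a stand-in for a real request or UI step) and time it."""
    start = time.perf_counter()
    result = action(*args)
    elapsed_s = time.perf_counter() - start
    return result, elapsed_s

def percentile(samples, pct):
    """Nearest-rank percentile of a list of timings (seconds)."""
    ordered = sorted(samples)
    rank = max(1, round(pct / 100 * len(ordered)))
    return ordered[rank - 1]

# Hypothetical per-transaction timings gathered during a run:
samples = [0.8, 1.1, 0.9, 4.2, 1.0, 1.3, 0.7, 1.2, 1.1, 0.95]
print(percentile(samples, 90))  # 90th percentile response time
```

Reporting percentiles rather than averages matters on mobile, where an occasional 4-second outlier over 3G is exactly what the average would hide.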
Delivering Mobile PT in Agile Programs
[Timeline: Test Planning > Sprint 1 > Sprint 2 > Sprint 3 > ... > Sprint n > Hardening Sprint]
Early sprints: early performance testing (device profiling, network analysis); script development for device-side response time tests
Later sprints / hardening: workload performance tests (Load, Stress, Endurance); combined server-side + device-side testing
Real-device testing using a local Mobile Lab (USB / WiFi): iPhone, iPad, Android, Windows, Mac
Cloud-based testing using Perfecto Mobile (or similar)
Conclusion
1) Traditional performance testing methods need to be augmented with Mobile-world realities
2) Device-side performance profiling and network analysis are must-haves rather than nice-to-haves
3) User experience analysis is paramount under real-world conditions
4) Focus on integrated server- and client-side testing as opposed to server-side only
5) An optimized device strategy is vital to meet management's TCO objectives
A comprehensive and holistic Mobile Performance Testing approach
is the "foundation for the future-forward IoT world"