TRANSCRIPT
Digital Performance Testing
Main Author: Preeti Kambli, Manager
Co-Author: Venkata Goday, Director
Capgemini
Abstract

Digitization and customer experience have become key focus areas for IT organizations. In fact, Gartner predicts that 89% of companies will primarily compete on the basis of customer and omnichannel experience. Data from Gartner's 2017 survey shows that CIOs are shifting their investment patterns in response to digital business, on average already spending 18% of their budget on digitalization, a figure set to increase to 28% by the end of 2018.
The digital ecosystem involves enterprises, partners, customers and other stakeholders who experience information systems through multiple mediums and access points. Comparisons are quickly drawn based on these interactions, so it is of utmost importance for IT organizations to deliver the highest levels of user experience. The only reliable way to validate a system's responsiveness is through thorough performance testing.
This paper discusses customer expectations and our solution for digital performance testing.
Introduction

Digital disruption is accelerating in today's age, especially in the banking, retail, telecommunications, marketing & communications and e-commerce spaces. This raises the standards for customer expectations regarding service, convenience, speed and scalability, and creates the need for a seamless, personalized customer journey. This journey must be marked by intuitively grasping and fulfilling the customer's next demand.
More than 2 billion people have some form of social media account, and 61% are likely to express their frustration with a poor app experience on social media. In response, the business side focuses on the perceived customer experience through behavioural analysis, social sentiment analysis, heat maps and voice-of-customer data. On the other hand, the IT transformation group focuses on system health, load time, MTTR, TTFB, latency, the SDLC and release cycles to iron out any technical glitch in the application. There is now a need to break this silo and form a more connected view of customer demand. One aspect that ties the business and IT groups together is end user experience (UX).
In the digital era, application performance is all about end user experience, irrespective of the technology or complexity involved. With the evolution of mobiles, tablets and handheld devices, it has become imperative to stay ahead by understanding the user experience even before users actually experience it.
This paper discusses customer expectations and our solution for digital performance testing, which acts as an intermediary/integration point between traditional performance testing and functional mobile testing solutions. The solution focuses on capturing end user experience plus device-side performance on real mobile devices, consolidated in a common digital performance test report, thereby providing a reliable measure of user experience.
Challenges in digital performance testing

Traditional performance testing practice is server focused and does not consider end user responsiveness in workload modelling. Also, current performance tools have no provision for measuring UI response times.
- With real users, for both web and mobile, front-end/device-side statistics and external influencers such as a distributed user base, network/bandwidth, device mix and interfaces to external APIs are often not performance tested, leaving out key customer experience parameters
- A complex mix of devices, operating systems, ever-emerging technologies and, most importantly, customer expectations makes it difficult to fix issues as soon as customers experience them
- End to end performance tests covering the above aspects are often bypassed owing to shorter release cycles and quicker time to market, ultimately resulting in unknown performance issues and a bad user experience in production
- Infrastructure availability for load simulation remains a challenge given the large omnichannel spread
- Lack of standard approaches, frameworks and benchmarks for validation purposes
Factors affecting performance
Traditionally, digital performance testing is conveniently segregated into:
- Client-side application performance / device performance
- Server-side performance
- Network performance
Client side application performance
Digital applications are categorized into three types: native apps, web apps and hybrid apps. Testing each app type differs from the others, as their implementations are quite different from one another.
For a native mobile application, the user's perception of performance can be improved depending on how much of the application and its data resides on the local device versus the server application. The device's own hardware and software configuration also come into play. Many native applications reside on the mobile device and still communicate readily with a server application.
Mobile browser-based application performance is usually heavily dependent on network and server application performance. The performance is usually slower as a result and leads to a reduced user experience. In addition, some browsers may have higher performance than others as there is no standard. Your server application also needs to be able to recognize the device/browser combination in order to render properly.
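As a minimal illustration of the device/browser recognition mentioned above (the patterns and category names here are illustrative assumptions; production services usually rely on maintained User-Agent parsing libraries rather than hand-rolled checks), a server could classify incoming requests roughly like this:

```python
# Hypothetical sketch: classify a device/browser combination from the
# User-Agent header so the server can pick a rendering path.
def classify_user_agent(ua: str) -> str:
    ua = ua.lower()
    if "iphone" in ua or "ipad" in ua:
        return "ios-mobile-web"
    if "android" in ua:
        return "android-mobile-web"
    if any(b in ua for b in ("chrome", "firefox", "safari")):
        return "desktop-web"
    return "unknown"

print(classify_user_agent(
    "Mozilla/5.0 (iPhone; CPU iPhone OS 10_3 like Mac OS X) AppleWebKit"))
# → ios-mobile-web
```

Note that order matters: an Android Chrome User-Agent also contains "chrome", so the mobile checks must run before the desktop browser check.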
Single page apps (SPAs) are web applications built using a JavaScript MVC (Model-View-Controller) framework to deliver a rich, app-like experience. The page's HTML is mostly built in the client browser instead of on the web server. The primary aim of SPA performance testing is to measure how fast the user interface of your application responds to user input. Tools which help in capturing the response time include:
- Google Analytics and PageSpeed
- WebPagetest.org
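Such client-side response times are typically derived from the browser's W3C Navigation Timing marks, which the tools above report. A minimal sketch, with the metric names and sample timestamps assumed for illustration:

```python
# Illustrative sketch (sample numbers assumed): deriving client-side
# metrics such as TTFB and page load time from the millisecond
# timestamps exposed by window.performance.timing in the browser.
def derive_metrics(timing: dict) -> dict:
    return {
        "ttfb_ms": timing["responseStart"] - timing["requestStart"],
        "dom_ready_ms": timing["domContentLoadedEventEnd"] - timing["navigationStart"],
        "page_load_ms": timing["loadEventEnd"] - timing["navigationStart"],
    }

sample = {
    "navigationStart": 0, "requestStart": 120, "responseStart": 310,
    "domContentLoadedEventEnd": 900, "loadEventEnd": 1500,
}
print(derive_metrics(sample))
# {'ttfb_ms': 190, 'dom_ready_ms': 900, 'page_load_ms': 1500}
```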
Server side performance
Examining the server performance is similar to measuring website or web app performance where we need to decompose our analysis into the components of the server that are providing the services including the database, application server, and associated hardware.
Each of these components has many variables that can result in numerous permutations. In addition, each permutation involves interaction between its components which could significantly impact performance yet is sometimes unpredictable and dependent on the situation and context.
Network performance
The application may behave differently on different networks as network protocols impact throughput and delays. Often our clients want us to test on different networks and in different countries because carriers sometimes place overhead on data transmission and network latency can vary.
Latency also depends on how efficient the application is in its transmission methods and algorithms, and on the amount of data transmitted (payload). In a way, we have gone back to the client-server application paradigm: we want to store more data on the device to reduce network delay, while at the same time the local computing device has limited capacity.
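The latency/payload trade-off above can be sketched with a simple back-of-the-envelope model (the formula and network numbers are illustrative assumptions, not from the paper): transfer time ≈ round-trip latency + payload / bandwidth.

```python
# Back-of-the-envelope sketch: why payload trimming matters more on
# high-latency, low-bandwidth mobile links.
def transfer_time_ms(payload_kb: float, bandwidth_kbps: float,
                     latency_ms: float) -> float:
    # 8 bits per byte: convert payload KB to kilobits before dividing.
    return latency_ms + (payload_kb * 8 / bandwidth_kbps) * 1000

# Same 200 KB payload on a 4G-like link vs a congested 3G-like link
# (illustrative numbers only).
print(round(transfer_time_ms(200, 10_000, 50)))   # → 210 (ms)
print(round(transfer_time_ms(200, 1_000, 300)))   # → 1900 (ms)
```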
Tool & Techniques
There are many performance testing tools that try to account for these variables, but no single tool can solve all parts of the equation, and most tools are specialized towards a specific platform and testing aspect.
There are many commercial and open source performance testing tools available for desktop browser based apps. But when it comes to mobile applications, the options are very limited especially for native mobile apps.
End to end performance testing of omnichannel apps is very challenging, as it involves multiple devices and operating system versions, app versions, and different servers for native and web applications. All these factors are not easy to address with a single tool. APM tools monitor the apps and provide drill-down analysis and diagnostic reports for server-side statistics.
Similarly, functional mobile testing solutions simulate user actions by recording one test and running it on different devices. Test automation frameworks help validate the functional aspects of the application on the device.
A) APM tools for omnichannel performance monitoring help in diagnosing and optimizing key metrics such as user journeys, page breakdown metrics, mobile/PC/browser split, number of sessions, user response times, network providers, conversion/bounce rates, problem detection and alerting. They have mobile and browser modules with real user monitoring (RUM) capabilities, providing business insights, user behavior analytics, root cause analysis and error rates. Examples:
- Dynatrace
- New Relic
- AppDynamics
B) Performance tools for server-side metrics measure KPIs (Key Performance Indicators) like response times, throughput, transactions/second and hits/second, and provide a first glance at server behaviour under load. Examples:
- HPE LoadRunner
- SOASTA
- NeoLoad
- JMeter
- BlazeMeter
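As a tool-agnostic sketch of the server-side KPIs listed above (the "transaction" is a simulated stub with random delays, not a real server call), concurrent synthetic load and the resulting throughput and response-time statistics could be computed like this:

```python
# Minimal sketch, not tied to any listed tool: 10 concurrent "virtual
# users" execute 100 stubbed transactions; we then report the KPIs the
# text mentions: transactions/second and response-time statistics.
import time
import random
from concurrent.futures import ThreadPoolExecutor

def transaction() -> float:
    """Stand-in for one server call; returns its response time in seconds."""
    start = time.perf_counter()
    time.sleep(random.uniform(0.01, 0.05))  # simulated server work
    return time.perf_counter() - start

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=10) as pool:   # 10 virtual users
    times = sorted(pool.map(lambda _: transaction(), range(100)))
elapsed = time.perf_counter() - start

print(f"transactions/sec : {len(times) / elapsed:.1f}")
print(f"avg response (ms): {1000 * sum(times) / len(times):.1f}")
print(f"90th pct (ms)    : {1000 * times[int(0.9 * len(times))]:.1f}")
```

Real tools add pacing, think time, ramp-up and correlation on top of this basic loop, but the reported KPIs are computed in essentially this way.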
C) Device-side performance solutions/tools improve the quality of your iOS, Android and web applications by testing against browsers and real mobile devices hosted in a data center or on the cloud. They optimize performance by simulating real-world network conditions and monitoring device vitals consumption (CPU, memory and battery). Examples:
- HPE Mobile Center
- Perfecto Mobile
- Experitest SaaS digital assurance labs
- Mobile Labs deviceConnect
D) Network performance tools capture and emulate real-world network conditions, helping execute network performance tests that detect and remediate issues before app deployment. They analyze results and give insight into the root cause of network performance bottlenecks, ensuring that the rolled-out application is optimized for the target network. Examples:
- HPE Network Virtualization (HPE NV)
- OpManager
E) Web page diagnostics tools identify and diagnose web page and URL performance issues, especially the offending items, objects and elements that cause sites to hang and slow down. These issues give users the perception of SPA and website slowness and unresponsiveness. Examples:
- Google Developers Chrome DevTools
- Web Page Analyzer
- HPE Network Virtualization (NV) Analytics
Case Study - Digital Performance Testing for a Leading Media & Entertainment client
Business drivers

- Measure the response time experienced by end users of the recently launched digital delivery platform for marketing entertainment and news to a global audience
- Define a simulation strategy covering device type/configuration, operating system, user base across geographies, bandwidth/network and latency
- Track real user and synthetic test interactions to benchmark performance from different geographies
- Set up a test environment with a dedicated mobile lab, covering multiple device types and configurations, to simulate browser and native app scenarios in real time
- Business transaction management
Choice of Tools

HPE LoadRunner, HPE NV, HPE UFT, Mobile Labs deviceConnect, AppDynamics
Approach
In order to offer the best solution, the focus was on delivering an excellent digital experience with real-life scenarios.
Requirements were gathered for the application design, feasibility analysis and non-functional requirements at various levels:
- Application type (native, hybrid, mobile web)
- Device variations
- Network, bandwidth & latency variations
- Application layers
- Usage pattern & end user experience goals
- Real-world workload models were designed for effective performance tests
- Virtualization was used to closely mimic end user network conditions
- APM tools were leveraged for in-depth analysis and performance optimization, including time-consuming interactions, sessions, demographics, API errors and response times
- Device-level statistics were captured: battery, CPU & memory, interrupts
1. In order to measure real user experience, it was necessary to obtain response times from the UI. This digital traffic was simulated using HPE UFT functional automation scripts.
2. For concurrent user load simulation, synthetic LoadRunner performance scripts were designed.
3. During workload design, both the UFT and LoadRunner scripts were instrumented to run as part of a single setup.
4. This setup communicated with the cloud-based Mobile Labs deviceConnect installation, which hosted multiple devices across a variety of operating systems and platforms. The UFT scripts were run on the selected devices in the cloud.
5. Both the functional automation and performance tests ran in parallel during the load test. The tagged transactions from the UFT scripts made it possible to showcase UI-level response times under load.
6. AppDynamics APM was used for monitoring and diagnosing the entire infrastructure.
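The steps above can be sketched conceptually (all names here are hypothetical stand-ins; the client actually used UFT for the UI probe, LoadRunner for synthetic load and deviceConnect for real devices): background threads generate synthetic load while a single "UI probe" replays tagged user steps and records UI-level response times under that load.

```python
# Conceptual sketch of running a functional UI probe in parallel with
# synthetic load, so UI response times are measured under load.
import time
import random
import threading
from concurrent.futures import ThreadPoolExecutor

stop = threading.Event()

def synthetic_request() -> None:
    """Stand-in for one virtual-user iteration (LoadRunner's role)."""
    time.sleep(random.uniform(0.005, 0.02))

def load_worker() -> None:
    while not stop.is_set():
        synthetic_request()

def ui_probe(steps) -> dict:
    """Stand-in for a tagged UFT script run on a real device."""
    results = {}
    for step in steps:
        start = time.perf_counter()
        time.sleep(random.uniform(0.01, 0.03))  # simulated UI action
        results[step] = (time.perf_counter() - start) * 1000  # ms
    return results

with ThreadPoolExecutor(max_workers=20) as pool:
    for _ in range(20):                       # 20 virtual users in background
        pool.submit(load_worker)
    ui_times = ui_probe(["login", "search", "checkout"])  # tagged transactions
    stop.set()                                # wind the load down

for step, ms in ui_times.items():
    print(f"{step:9s}: {ms:6.1f} ms under load")
```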
Benefits/results
By implementing a common solution and consolidating a common report across the various integration points (server side, device side, APM and functional test), the following remediations were carried out:
- Performance tuning was carried out for the top time-consuming interactions
- An "OutOfMemory" exception caused by continuous usage of the large object heap memory was identified; object sizes were redefined
- Repetitive calls for the same elements/images on the same page were identified; the number of calls to the same elements/images was optimized on each page
Conclusion

With the various simulation tools in the market, understanding customer needs and emotions requires seeking out many individual analytics and software services to separately analyse and fine-tune the system. To simulate a real-world scenario for performance testing, choosing the right tool for your application landscape is essential, and building a cohesive digital ecosystem that blends server side, device side, network and APM with a pinch of functional testing touchpoints is equally important. Going forward, the adoption of functional and device-side scenarios will play a critical role in making business decisions and obtaining an accurate end user experience measurement from your digital performance testing.
References & Appendix

https://www.slideshare.net/RyanBateman4/what-is-digital-performance-management
http://www.methodsandtools.com/archive/mobiletest1.html
https://experitest.com/mobile-cloud-testing/seetestcloud-online/?am_force_theme_layout=desktop
https://engineering.linkedin.com/blog/2017/02/measuring-and-optimizing-performance-of-single-page-applications
https://en.wikipedia.org/wiki/Single-page_application
http://www.techrepublic.com/blog/web-designer/free-diagnostic-tools-for-website-response-and-performance-issues/
Abbreviation Full form
IT Information Technology
CIO Chief Information Officer
API Application Programming Interface
UI User Interface
SDLC Software Development Life Cycle
MTTR Mean Time To Repair
TTFB Time To First Byte
SPA Single Page Application
HTML Hypertext Markup Language
APM Application Performance Management
HPE Hewlett Packard Enterprise (software tools now part of Micro Focus)
UFT Unified Functional Testing
CPU Central Processing Unit
Author Biography

Preeti Kambli leads the Performance Testing & Engineering CoE for Capgemini Mumbai. She has around 15 years of experience in business development support, architecting solutions for clients, building competency in various performance tools and enabling delivery projects in crisis situations. She has ideated and managed the development of new accelerators for performance testing and has driven the asset industrialization campaign.
Venkata Goday heads the Performance Testing & Engineering CoE for Capgemini globally; he has been leading this practice for the last 10 years and is well known as a thought leader in the performance engineering space. He has rich experience in setting up Performance CoEs for customers and has partnered with several clients on strategic consulting and transformation initiatives. He has architected and developed many assets in the non-functional testing space.
THANK YOU!