Automated Analytics Testing with Open Source Tools
Posted on 11-May-2015
Session T18: Special Topics
5/8/2014, 1:30 PM
Automated Analytics Testing
with Open Source Tools
Presented by:
Marcus Merrell
RetailMeNot, Inc.
Brought to you by:
340 Corporate Way, Suite 300, Orange Park, FL 32073
888-268-8770 ∙ 904-278-0524 ∙ sqeinfo@sqe.com ∙ www.sqe.com
Marcus Merrell
RetailMeNot, Inc.
A test architect at RetailMeNot, Inc., Marcus Merrell has written UI and API test frameworks for several products since 2001. Marcus is obsessed with code design and holds sacred the philosophy that test frameworks should be approached, developed, and tested just as carefully as the production software they seek to examine. The successful demonstration of these principles has led to an intense focus on collaboration between testers and developers, and to a deep appreciation for code that is at once simple and sophisticated: an API for testers which is reusable, interface-agnostic, and universal to many problem sets.
4/26/2014
1
Web Analytics Testing
Marcus Merrell, RetailMeNot, Inc.
@mmerrell
Web Analytics – The Basics
• Traffic: Google, Woopra, Yahoo, Crazy Egg
• Optimization and performance: Optimizely, Google
• Competitive analysis: Compete.com, watchthatpage.com, faganfinder.com
• Social: Facebook Insights, Twitalyzer
Web Analytics – Custom
• WebTrends
• Adobe (formerly SiteCatalyst/Omniture)
• IBM (formerly Coremetrics)
• …This is what we’ll be talking about
Web Analytics – Advanced
• Cookies on steroids
• Custom key-value pairs
• Data Storm!!
• Where else can you learn, correlate, and tell stories from seemingly disparate points of data?

“If you are not paying for it… you are the product being sold”
--Andrew Lewis (blue_beetle)
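Those custom key-value pairs ride on the beacon’s query string. A minimal sketch of pulling them apart into a map (the class name and example parameters are mine, not from the talk):

```java
import java.net.URLDecoder;
import java.nio.charset.StandardCharsets;
import java.util.LinkedHashMap;
import java.util.Map;

public class BeaconParser {
    /** Parse the query string of an analytics beacon URL into key-value pairs. */
    public static Map<String, String> parse(String beaconUrl) {
        Map<String, String> pairs = new LinkedHashMap<>();
        int q = beaconUrl.indexOf('?');
        if (q < 0) return pairs;  // no query string, nothing to parse
        for (String pair : beaconUrl.substring(q + 1).split("&")) {
            int eq = pair.indexOf('=');
            String key = eq < 0 ? pair : pair.substring(0, eq);
            String val = eq < 0 ? "" : pair.substring(eq + 1);
            // Decode %xx escapes so values compare cleanly against expectations
            pairs.put(URLDecoder.decode(key, StandardCharsets.UTF_8),
                      URLDecoder.decode(val, StandardCharsets.UTF_8));
        }
        return pairs;
    }
}
```

Once the beacon is reduced to a map, assertions against individual keys become one-liners in whatever test framework you use.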
A Major Newspaper
• Click on a story on nytimes.com – what have they learned about you?
• How far down the page was the story?
• How long were you there before you clicked it?
• Did you “bounce” to another page before clicking it?
• Were you logged in?
• Have you looked at other stories from that category/keyword/tag cloud before?
• How long before you click another link?
• If we popped up a story suggestion, did you click on it?
Abusive Comments
• Once you’re on a story, are there comments?
• Did you add a comment?
• Did you click a “show more” link to expand a comment?
• Did you do that multiple times in a thread? Across multiple threads?
• Did you click an ad?
• Was it before or after you looked at comments?
Web Analytics – Telling a Story
• Now extrapolate for 100,000 users
• Are people consistently clicking on stories “below the scroll”?
  • If so, should that story be further up?
  • Is something wrong with the top stories?
• Are stories with pictures more popular?
• Are certain keywords more likely to get comments? Abusive comments?
• How does ad click-through change based on abusive comment %?
Making a Hypothesis
• Do abusive comments drive down the likelihood of users clicking on ads?
• Question for an A/B test: Will ad revenue increase if we disable comments on stories about politics?
  • A path: disable comments on stories with keyword “politics”, “election”, or “pundit”
  • B path: keep comments enabled
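The A/B split above has to be stable per visitor, or the data is noise. One common approach (this is my own sketch, not the talk’s implementation; the keyword set comes from the slide) is to bucket deterministically on a cookie id:

```java
import java.util.Set;

public class CommentExperiment {
    // Targeting keywords from the slide's hypothetical A path.
    private static final Set<String> TARGET_KEYWORDS =
            Set.of("politics", "election", "pundit");

    /** Deterministically bucket a visitor into path 'A' or 'B' by cookie id. */
    public static char assignPath(String cookieId) {
        // Stable split: even hash bit -> A (comments disabled), odd -> B (control).
        return (cookieId.hashCode() & 1) == 0 ? 'A' : 'B';
    }

    /** Comments stay visible unless the visitor is on the A path AND the story is targeted. */
    public static boolean showComments(String cookieId, Set<String> storyKeywords) {
        boolean targeted = storyKeywords.stream().anyMatch(TARGET_KEYWORDS::contains);
        return !(assignPath(cookieId) == 'A' && targeted);
    }
}
```

Because the same cookie id always lands in the same bucket, the analytics beacons for a visitor can be correlated with the experience that visitor actually saw.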
Drawing a Conclusion
• Test B, the “variation”, won
• Disable comments when the “abusive comment” rate rises above a threshold
• Now, refine the threshold with further A/B tests
Real-world Examples
• “People who bought this also bought…”
• Shopping cart: shipping & tax calculation
• Pairing account creation with checkout
• Suggesting products and content based on cookie, not login
Why You Should Care
• What if the beacons they’re sending contain the wrong information?
• But furthermore…
  • This is everywhere
  • It is only growing
  • Companies are becoming smart (really, really smart)
• You do not want to miss this opportunity to provide value
Why You Should Really Care
• As a tester:
  • There is a whole team of people working on this
  • It gets worked into features as they are developed
  • It is rarely called out separately in a scheduled task
  • It rarely receives QA outside of the PM and BI people who really care about it
Fortunately, It’s Easy
• All of this comes down to an extra HTTP request or two, made during some navigation event
• As long as you can intercept this request and read from it, you can verify the data within it
Examples
• Wells Fargo (s.gif)
• Amazon (a.gif)
• Netflix ([….]?trkId=xxx, beacon?s=xxx)
• The New York Times (pixel.gif, dcs.gif)
• OpenTable (/b/ss/otcom)
• (and RetailMeNot)
Classic Approach
• Marketing asks the BI team to figure out our ROI on TV ads during a period of time
• BI asks the PM to create a series of analytics
• The PM gives Dev the particulars
• Dev assigns the coding task to the newest person on the team
• If anyone tests it, it’s also the newest person on the team
Classic Approach
• Manual testing of web analytics is about as exciting as reconciling one large column of data with another large column of data
• …what if it’s wrong?
• …what if it changes?
• …why not let the software do it?
What We Do
• Testing the web component:
  • Launch the browser
  • Do the navigation
  • Launch the debugger window
  • Parse the source code for “key=value” pairs
  • Look for the one you want
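With Selenium, “parse the source code” can start from `driver.getPageSource()`. A crude regex scan for `key="value"` assignments, tuned for Adobe-style `s.prop="..."` tag snippets (the pattern and class name are my own sketch, not the talk’s code):

```java
import java.util.LinkedHashMap;
import java.util.Map;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

public class SourceScanner {
    // Matches simple key="value" assignments as they appear in analytics
    // tag snippets, e.g. s.pageName="home" or s.channel="news".
    private static final Pattern KV =
            Pattern.compile("([\\w.]+)\\s*=\\s*\"([^\"]*)\"");

    /** Scan raw page source for key="value" assignments. */
    public static Map<String, String> scan(String pageSource) {
        Map<String, String> pairs = new LinkedHashMap<>();
        Matcher m = KV.matcher(pageSource);
        while (m.find()) {
            pairs.put(m.group(1), m.group(2));  // last assignment wins
        }
        return pairs;
    }
}
```

This is the fragile end of the spectrum: it only sees variables set in markup, not values computed at runtime, which is one reason the proxy approach on the next slide is preferable.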
What We Also Do
• Proxy FTW
• Safer, more reliable, probably faster
• Examine ALL HTTP requests, looking for some token
• Parse out the request body for “key=value” pairs
• Look for the one you want
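With BrowserMob Proxy 2.x, the usual wiring (hedged here, as the talk doesn’t show code) is: start a `BrowserMobProxyServer`, hand it to the browser via `ClientUtil.createSeleniumProxy(proxy)`, call `proxy.newHar(...)` before navigating, then read the captured request URLs out of `proxy.getHar()`. Once you have those URLs, the verification itself is a small pure function like this sketch:

```java
import java.util.List;
import java.util.regex.Pattern;

public class ProxyBeaconCheck {
    /**
     * Scan the request URLs captured by the proxy (e.g. the entries of a
     * BrowserMob HAR) for analytics beacons matching `token`, and check
     * that one of them carries the expected key=value pair.
     * Note: a fuller implementation would also URL-decode values and
     * inspect POST bodies, not just query strings.
     */
    public static boolean beaconHasPair(List<String> capturedUrls, String token,
                                        String key, String expectedValue) {
        // Anchor on [?&] and a trailing delimiter so "pageName=home"
        // does not falsely match "pageName=homepage".
        Pattern pair = Pattern.compile("[?&]" + Pattern.quote(key) + "="
                + Pattern.quote(expectedValue) + "([&#]|$)");
        return capturedUrls.stream()
                .filter(url -> url.contains(token))
                .anyMatch(url -> pair.matcher(url).find());
    }
}
```

Because the check runs against intercepted traffic rather than page markup, it catches exactly what the analytics vendor will receive, which is the point of the proxy approach.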
Tech Stack
• Maven
• Spring
• TestNG
• Selenium
• BrowserMob Proxy
Features
• Scalability
  • Autoscaling Grid!
• Data-driven tests
• Database
  • Validation
  • Test data creation
Execution
• TeamCity kicks off…
• …a Maven job, which executes…
• …TestNG tests
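The CI-to-TestNG handoff above is standard Maven wiring: TeamCity runs `mvn test`, and the Surefire plugin delegates execution to a TestNG suite file. A minimal pom.xml fragment (the suite path is a placeholder, not from the talk):

```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-surefire-plugin</artifactId>
  <configuration>
    <suiteXmlFiles>
      <!-- TestNG suite listing the analytics test classes -->
      <suiteXmlFile>src/test/resources/testng.xml</suiteXmlFile>
    </suiteXmlFiles>
  </configuration>
</plugin>
```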
Reporting
• Report to a dashboard
• Indicates “PASS”, “FAIL”, and a “staleness factor”