TRANSCRIPT
© 2019 eMarketer Inc.
Email Optimization: The ABC’s of A/B Testing
Tech-Talk Webinar · November 21, 2019
Sponsored content presented by Marketo

PRESENTER: Tim Ozmina, Sr. Marketing Specialist, Commercial Demand Generation, Marketo
MODERATOR: Nancy Taffera-Santos, SVP Media Solutions & Strategy, eMarketer
Meet the Speaker
Tim Ozmina, Sr. Marketing Specialist, Commercial Demand Generation
A Brief History of Emails
1971 – First email was sent
1985 – Email is used by government workers, military, academics
1991 – The first email was sent from space
1996 – Microsoft and Hotmail release email tools
1997 – Microsoft buys Hotmail; almost 10 million email users
1998 – The term “Spam” enters the dictionary
2003 – Over 77 million users
2004 – U.S. Government starts regulating unsolicited emails
2009 – 94% of all emails were spam
2019 – 3.9 billion email users; 293 billion emails sent per day; 64% of email is spam
Sources: Macworld, 99firms, Radicati, eMarketer
Why Should You A/B Test?
What if you don’t test?
You don’t learn what is working and what isn’t
You miss out on meaningful engagements
Your graphs don’t trend up and to the right
You lose revenue
You don’t grow

What if you do test?
You can break through the noise of 293 billion emails sent per day
You can optimize EVERYTHING
You can let the data do the talking
You capture MORE meaningful engagements
You increase revenue
You grow
Now We Know Why… But What Is the Process?
Define Your Goal
Decide What to Test
Create a Hypothesis
Test
Evaluate Results
Make a Decision
It’s Time to Pick a Test!
What should you test?
Subject line: number of words, personalized vs. non-personalized, brackets
Call-to-action button: size, color, font, italicized vs. bold
Preheader
Tone of voice
HTML vs. Text email
Day of week
Time of day
Length of email
From name
And many, many more!
How Do You Actually Run the Test?
A few things to keep in mind:
Only test one element at a time
Keep the testing environment as clean as possible
Send to the same audience
Send at the same time
Gather as much data as possible
[Slide image: the control subject line vs. the test subject line with brackets, side by side]
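To keep the testing environment clean in practice, the split itself should be random, equal, and simultaneous. Here is a minimal sketch of such a 50/50 split in Python; the function name and sample addresses are illustrative, not from the webinar:

```python
import random

def split_audience(recipients, seed=42):
    """Randomly split a recipient list into two comparable halves."""
    random.seed(seed)            # fixed seed makes the split reproducible
    shuffled = list(recipients)  # copy so the original list is untouched
    random.shuffle(shuffled)
    midpoint = len(shuffled) // 2
    return shuffled[:midpoint], shuffled[midpoint:]

# Hypothetical usage: variant A (the control) goes to group_a, variant B
# to group_b, both in the same send window so time of day isn't a factor.
group_a, group_b = split_audience(["a@example.com", "b@example.com",
                                   "c@example.com", "d@example.com"])
```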
Before We Evaluate… Statistical Significance

Statistical significance: a number that expresses the probability that the result of a given experiment or study could have occurred purely by chance. This number can be a margin of error (“The results of this public opinion poll are accurate to within five percent”), or it can indicate a confidence level (“If this experiment were repeated, there is a probability of ninety-five percent that our conclusions would be substantiated”).
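For open or click rates, significance is typically checked with a two-proportion test. A minimal sketch using statsmodels; all counts here are made up for illustration, not results from the webinar:

```python
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical outcome: opens out of sends for each variant.
opens = [2_000, 2_200]      # variant A, variant B
sends = [10_000, 10_000]

z_stat, p_value = proportions_ztest(count=opens, nobs=sends)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")

# At a 95% confidence level, p < 0.05 means the observed difference
# is unlikely to be pure chance.
if p_value < 0.05:
    print("Statistically significant difference")
else:
    print("Cannot rule out chance")
```

This is also why “gather as much data as possible” matters: with small sends, even large-looking differences often fail to clear the significance bar.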
Now It’s Time to Evaluate
Results:
Open rates increased by 4% (100% statistical significance)
Click-through rates increased by 47% (100% statistical significance)
Click-to-open rates increased by 41% (100% statistical significance)
(Thanks Mike!)
What does this mean? Context is amazing… adding context not only increased opens, but also increased clicks.
Specifically for this test: early-stage audience, marketing titles, engaged in the past year
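Lifts like these are typically reported as relative changes in the rate, not percentage-point changes. A quick sketch of the arithmetic, with hypothetical rates chosen only to echo the reported lifts:

```python
def relative_lift(control, variant):
    """Percent change of the variant's rate relative to the control's."""
    return (variant - control) / control * 100

# Hypothetical rates, sized to illustrate the math behind the slide.
print(f"Opens:  {relative_lift(0.200, 0.208):+.0f}%")   # +4%
print(f"Clicks: {relative_lift(0.030, 0.0441):+.0f}%")  # +47%
```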
And the Winner Is…
What do I do now? DOCUMENT YOUR FINDINGS! (a sketch of a simple test log follows this list)
Let your team know
Include brackets in subject lines sent to that audience
Test this on your other audiences (if you weren’t testing it already)
Pick another test
Take it a step further
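Documenting findings can be as lightweight as one structured record per experiment. A minimal sketch; the field names are illustrative, with values echoing the bracket test above:

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class ABTestRecord:
    """One documented A/B test, so wins and losses stay searchable."""
    name: str
    hypothesis: str
    audience: str
    winner: str           # "A", "B", or "inconclusive"
    notes: str
    run_date: date = field(default_factory=date.today)

test_log = [ABTestRecord(
    name="subject-line-brackets",
    hypothesis="Brackets add context and lift opens and clicks",
    audience="early-stage, marketing titles, engaged in past year",
    winner="B",
    notes="Opens +4%, CTR +47%, CTO +41%, all statistically significant",
)]
```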
Let’s Take It a Step Further
I Know What You’re Thinking:
We just beat the control and now know that adding brackets to the subject line can help with not only opens, but also clicks… what else is there?
Always Be Testing
[Slide image: two email variants compared side by side]
The Results Look Conclusive
Results:
Open rates increased by over 15% (100% statistical significance)
Click-through rates increased by almost 30% (100% statistical significance)
Click-to-open rates increased by over 12% (100% statistical significance)
Looks conclusive, right?
The Full Story
Time to Break a Myth
Adding a Secondary Call-to-action
We’ve been told it’s a no-no: decreases clicks, increases unsubscribes, can hurt deliverability.
Have you ever tested it?
For this test: sent to all behavior-score audiences, marketing titles, over 450k emails, engaged in the past year
The Results
Results:
Open rates increased by 0% (no statistical significance)
Click-through rates decreased by 3% (no statistical significance)
Click-to-open rates decreased by 3% (no statistical significance)
What does this mean? We cannot confidently say that adding a second CTA directly increases or decreases the performance of an email.
Disclaimer: Try this on your audience, don’t just take my word for it!
5 Key Takeaways
Document your findings
Keep your tests clean
Dig deeper
Find statistical significance
Always be testing
Questions?
Tim Ozmina: https://www.linkedin.com/in/timothy-ozmina/
Email Optimization: The ABC’s of A/B Testing
Tech-Talk Webinar Q&A Session
Please submit any questions you have and we’ll do our best to address them! All registrants will receive a follow-up email with a link to view the on-demand materials.
➢You can register for upcoming Tech-Talk and Meet the Analyst Webinars at:
emarketer.com/webinars
➢Be sure to also check out eMarketer’s “Behind the Numbers” podcast for daily,
freewheeling conversations about the ways digital is transforming media,
marketing, business and even life. Tune in at: emarketer.com/podcast
➢NEW: eMarketer Daily Forecast videos that bring our forecasts to life! You can
find them every day in the eMarketer Daily Newsletter. Sign up at:
emarketer.com/newsletters