Slide 1: Rand Fishkin, Wizard of Moz | @randfish | rand@moz.com
Why Great Marketers Must Be Great Skeptics
Slide 2: This Presentation Is Online Here:
bit.ly/mozskeptics
Slide 4
I have some depressing news…
Slide 7
Does anyone in this room believe that the Earth doesn’t revolve around the Sun?
Slide 9: The Earth (and everything in the solar system, including the Sun) revolves around our system’s gravitational barycenter, which is only sometimes near the center of the Sun.
Slide 10
Let’s try a more marketing-centric example...
Slide 11: In 2009, Conversion Rate Experts built us a new landing page and increased our subscribers by nearly 25%. What did they do?
Via CRE’s Case Study
Slide 12: One of the most commonly cited facts about CRE’s work is the “long landing page.”
Slide 13
The Crap Skeptic: “Let’s change our landing page to be a long one right now!”
The Good Skeptic: “We should A/B test a long landing page in our conversion funnel.”
The Great Skeptic: “How do we know page length was responsible? What else changed?”
Slide 14
The Crap Skeptic: “I do believe sadly it’s going to take some diseases coming back to realize that we need to change and develop vaccines that are safe.”
The Good Skeptic: “Listen, all magic is scientific principles presented like ‘mystical hoodoo,’ which is fun, but it’s sort of irresponsible.”
The Great Skeptic: “The good thing about science is that it’s true whether or not you believe in it.”
Slide 15: In fact, we’ve changed our landing pages numerous times to shorter versions and seen equal success. Length, it would seem, was not the primary factor in this page’s success.
Slide 16
What separates the crap, good, & great?
Slide 17
Assumes one belief-reinforcing data point is evidence enough
Doesn’t question what’s truly causal vs. merely correlated
Doesn’t seek to validate
Slide 18
Doesn’t make assumptions about why a result occurred
Knows that correlation isn’t necessarily causal
Validates assumptions w/ data
Slide 19
Seeks to discover the reasons underlying the results
Knows that correlation doesn’t imply causality
Thoroughly validates, but doesn’t let imperfect knowledge stop progress
Slide 20: Will more conversion tests lead to better results?
Testing
Slide 21
Obviously the more tests we run, the better we can optimize our pages. We need to build a “culture of testing” around here.
Slide 22: Via Wordstream’s What is a Good Conversion Rate?
Slide 23: Via Wordstream’s What is a Good Conversion Rate?
Do Those Who Test More Really Perform Better?
Slide 24
Hmm… There’s no correlation between those who run more tests across more pages and those who have higher conversion rates. Maybe the number of tests isn’t the right goal.
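A minimal sketch of the kind of check behind that conclusion: compute the correlation between tests run and conversion rate. The numbers below are invented for illustration; they are not Wordstream’s data.

```python
# Sketch (invented numbers): does "more tests run" correlate with
# "higher conversion rate"?
from statistics import correlation  # Python 3.10+

tests_run_per_account = [2, 5, 1, 12, 7, 3, 20, 9, 4, 15]
conversion_rate_pct = [2.1, 3.4, 1.8, 2.0, 4.9, 2.7, 2.3, 3.1, 5.6, 2.2]

# Pearson's r near 0 means the number of tests explains little of the
# variation in conversion rate, which is the skeptic's point on this slide.
r = correlation(tests_run_per_account, conversion_rate_pct)
print(f"Pearson r between tests run and conversion rate: {r:.2f}")
```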
Slide 25: Via Factors That Drive How Quickly You Can Run New Online Tests
Slide 26
The conversion decision (it’s a complex process) sits at the center of a web of factors: trust, word of mouth, likability, design, associations, amount of pain, CTAs, UX, effort required, process, historical experiences, social proof, copywriting, timing, discovery, path, branding, and price.
Slide 27
How do we know where our conversion problems lie?
Slide 28: Ask Smart Questions to the Right People
Potential customers who didn’t buy: professional, demographic, & psychographic characteristics; What objections did you have to buying? What would have made you overcome them?
Those who tried/bought but didn’t love it: professional, demographic, & psychographic characteristics; What objections did you have, and how did you overcome them? What would have made you stay/love the product?
Customers who bought & loved it: professional, demographic, & psychographic characteristics; What objections did you overcome, and how? What do you love most? Can we share it?
Slide 29
We can start by targeting the right kinds of customers. Trying to please everyone is a recipe for disaster.
Slide 30
Our tests should be focused on overcoming the objections of the people who best match our customer profiles.
Slide 32: Testing headlines, copy, visuals, & form fields
Slide 33: Designing for how customers think about their problems & your solution
Slide 35: Does telling users we encrypt data scare them?
Security
Slide 36: Via Visual Website Optimizer
Could this actually HURT conversion?
Slide 38: Via Visual Website Optimizer
A/B Test Results
They found that the version without the secure icon had over a 400% improvement in conversions compared to the version with the image.
[Note: results ARE statistically significant]
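For readers who want to sanity-check a claim like that, here is a minimal sketch of a two-proportion z-test, a standard way to judge whether an A/B conversion difference is statistically significant. The traffic and conversion counts below are invented, not the actual VWO numbers.

```python
# Sketch: two-proportion z-test comparing control ("with icon") and
# variant ("without icon") conversion rates. Counts are hypothetical.
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Return (z, two-sided p-value) for the difference in conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical counts: 12/2000 conversions with the icon, 61/2000 without.
z, p = two_proportion_z_test(conv_a=12, n_a=2000, conv_b=61, n_b=2000)
print(f"z = {z:.2f}, p = {p:.4f}")   # p < 0.05 would support "significant"
```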
Slide 39
We need to remove the security messages on our site ASAP!
Slide 41
Is this the most meaningful test we can perform right now?
(I’m not saying it isn’t, just that we should prioritize intelligently.)
Slide 42: Via Kayak’s Most Interesting A/B Test
vs.
Slide 43: Via Kayak’s Most Interesting A/B Test
A/B Test Results
“So we decided to do our own experiment about this and we actually found the opposite, that when we removed the messaging, people tended to book less.”
- Vinayak Ranade, Director of Engineering for Mobile, KAYAK
Slide 44
Good thing we tested!
Good thing we tested!
Your evidence is no match for my ignorance!
Slide 45: What should we expect from sharing our content on social media?
Social CTR
Slide 46
Just find the average social CTRs and then try to match them or do better. No-brainer.
Slide 47: Via Signup.to’s Analysis of CTR on Twitter
Slide 48: Via Signup.to’s Analysis of CTR on Twitter
Slide 53: Phew! We’re not alone.
Via Chartbeat
Slide 54
Assuming social metrics and engagement correlate was a flawed assumption. We need to find a better way to measure and improve social sharing.
Slide 58
OK. We can create some benchmarks based on these numbers and their averages, then work to improve them over time.
Slide 59: That is an insane amount of variability!
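One hedged way to quantify that variability before trusting any average as a benchmark: look at the spread of per-post CTRs relative to their mean. The CTR values below are invented for illustration.

```python
# Sketch (made-up CTRs): why a single "average social CTR" benchmark can
# mislead -- the spread around the mean matters as much as the mean itself.
from statistics import mean, stdev

ctrs_pct = [0.5, 4.2, 1.1, 0.2, 7.8, 0.9, 2.6, 0.3, 5.1, 1.4]  # hypothetical per-post CTRs

avg = mean(ctrs_pct)
sd = stdev(ctrs_pct)
print(f"mean CTR = {avg:.2f}%  stdev = {sd:.2f}%  coefficient of variation = {sd / avg:.2f}")
# A coefficient of variation near or above 1 means individual posts routinely
# land far from the benchmark, i.e. an "insane amount of variability".
```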
Slide 60
There are other factors at work here. We need to understand them before we can create smart metrics or useful expectations.
Slide 61
Timing
Source
Audience Affinity
Formatting
Network-Created Limitations to Visibility
Brand
Reach
Traffic
Engagement
Slide 62
Let’s start by examining the data and impacts of timing.
Slide 66
There’s a lot of nuance, but we can certainly see how messages sent at certain times reach different sizes and populations of our audience.
Slide 67
Comparing a tweet or share sent at 9am Pacific against tweets and shares sent at 11pm Pacific will give us misleading data.
Slide 68
But we now know three things:
#1 – When our audience is online (see the sketch below)
#2 – Sharing just once is suboptimal
#3 – To be a great skeptic (and marketer), we should attempt to understand each of these inputs with similar rigor
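A minimal sketch, not from the deck, of how #1 might be estimated: bucket historical engagement timestamps by hour of day to see when the audience is actually active. The file name and column are hypothetical.

```python
# Sketch: bucket past engagement timestamps by hour-of-day to estimate when
# the audience is online. "engagements.csv" and its columns are hypothetical.
import csv
from collections import Counter
from datetime import datetime

hour_counts = Counter()
with open("engagements.csv", newline="") as f:
    for row in csv.DictReader(f):
        # Expecting an ISO-8601 timestamp column, e.g. "2014-03-02T17:45:00"
        ts = datetime.fromisoformat(row["timestamp"])
        hour_counts[ts.hour] += 1

# Print hours ranked by engagement so you can compare, say, 9am vs. 11pm sends.
for hour, count in hour_counts.most_common():
    print(f"{hour:02d}:00  {count} engagements")
```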
Slide 69: Do they work? Can we make them more effective?
Share Buttons
Slide 70
After relentless testing, OKTrends found that the following share buttons worked best:
Slide 73: OKTrends found that removing all but a single button (the “like” on Facebook) had the most positive effect.
Slide 74: And that waiting until the visitor had scrolled to the bottom of the article produced the highest number of actions.
Slide 75
We should remove all our social sharing buttons and replace them with a single slide-over social CTA for Facebook likes!
Slide 76: Buzzfeed has also done a tremendous amount of social button testing & optimization…
Slide 79
Is Buzzfeed still in testing mode?
Slide 80
Nope. They’ve found it’s best to show different buttons based on both the type of content and how you reached the site.
Slide 81
OK… Well, then let’s do that… Do it now!
Slide 82
Testing a small number of the most impactful social button changes should produce enough evidence to give us a direction to pursue.
Slide 83
Buzzfeed & OKTrends share several unique qualities:
They have huge amounts of social traffic
Social shares are integral to their business model
The content they create is optimized for social sharing
Slide 84
Unless we also fit a number of these criteria, I have to ask again: Is this the most meaningful test we can perform right now?
Slide 85
BTW – it is true that testing social buttons can coincide with a lot of other tests (since it’s on content vs. the funnel), but dev resources and marketing bandwidth probably are not infinite ☺
Slide 86: Does it still work better than standard link text?
Anchor Text
Slide 87
Psh. Anchor text links obviously work. Otherwise Google wouldn’t be penalizing all these sites for getting them.
Slide 88
It has been a while since we’ve seen a public test of anchor text. And there’s no way to know for sure how powerful it still is.
Slide 89
Testing in Google is very, very hard. There are so many confounding variables – we’d have to choose our criteria carefully and repeat the test multiple times to feel confident of any result.
Slide 90
Test Conditions:
1) Three-word, informational keyword phrase with relatively light competition and stable rankings
2) We selected two results (“A” and “B”), ranking #13 (“A”) and #20 (“B”) in logged-out, non-personalized results
3) We pointed links from 20 pages on 20 unique, high-DA, high-trust, off-topic sites at both “A” and “B”
Slide 91
A) We pointed 20 links from 20 domains at this result with anchor text exactly matching the query phrase
B) We pointed 20 links from the same 20 pages as “A” to this URL with anchor text that did not contain any words in the query
[SERP graphic showing positions #11–#20, with “A” at #13 and “B” at #20]
Slide 92
[SERP graphic showing positions #1–#20]
After 20 days, all of the links had been indexed by Google. “A” and “B” both moved up 4 positions. None of the other results moved more than 2 positions.
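To make comparisons like this repeatable, here is a minimal sketch (not from the deck; the log file and columns are hypothetical) of turning daily rank observations for “A”, “B”, and the surrounding results into position deltas like the ones reported above.

```python
# Sketch (hypothetical data file/columns): compute how far each tracked URL
# moved between the start and end of the test, so "A"/"B" movement can be
# compared against the normal fluctuation of the other results.
import csv
from collections import defaultdict

observations = defaultdict(dict)   # url -> {date: position}
with open("rank_log.csv", newline="") as f:
    for row in csv.DictReader(f):  # columns: date, url, position
        observations[row["url"]][row["date"]] = int(row["position"])

for url, by_date in sorted(observations.items()):
    dates = sorted(by_date)                     # ISO dates sort chronologically
    start, end = by_date[dates[0]], by_date[dates[-1]]
    delta = start - end                         # positive = moved up the rankings
    print(f"{url}: #{start} -> #{end} ({delta:+d} positions)")
```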
Slide 94
While both results moved up the same number of positions, it’s almost certainly the case that #13 to #9 was against more serious challengers, and thus anchor text would seem to make a difference. That said, I’d want to repeat this a few times.
Slide 95
Princess Bubblegum and I are in agreement. We should do the test at least 2-3 more times, keeping as many variables as possible the same.
Slide 96
Early Results from a Second Test:
1) Three-word, informational keyword phrase with relatively light competition and stable rankings
2) We selected two results (“A” and “B”), ranking #20 (“A”) and #14 (“B”) in logged-out, non-personalized results
3) We pointed links from 20 pages on 20 unique, high-DA, high-trust, off-topic sites at both “A” and “B”
Slide 97
B) We pointed 20 links from 20 domains to this URL with anchor text that did not contain any words in the query
A) We pointed 20 links from the same pages/domains at this result with anchor text exactly matching the query phrase
[SERP graphic showing positions #11–#20, with “A” at #20 and “B” at #14]
Slide 98
[SERP graphic showing positions #1–#20]
After 16 days, all of the links had been indexed by Google. “A” moved up 19 positions to #1! “B” moved up 5 positions to #9. None of the other results moved more than 2 positions.
Slide 99
Good thing we tested!
This is looking more conclusive, but we should run at least one more test.
Anchor text = rankings. Stick a fork in it!
Slide 100: Does it influence Google’s non-personalized search rankings?
Google+
Slide 101: Good discussion about Google+ correlations in this post
Google+ is just too damn high.
Slide 102: Good discussion about Google+ correlations in this post
From a comment Matt Cutts left on the blog post:
“Most of the initial discussion on this thread seemed to take from the blog post the idea that more Google +1s led to higher web ranking. I wanted to preemptively tackle that perception.”
Slide 103: Good discussion about Google+ correlations in this post
To me, that’s Google working really hard to NOT say “we don’t use any data from Google+ (directly or indirectly) at all in our ranking algorithms.” I would be very surprised if they said that.
Slide 104
Google explicitly SAID +1s don’t affect rankings. You think they’d lie so blatantly? As if.
Slide 105
The correlations are surprisingly high for something with no connection. There have been several tests showing no result, but if all it takes is a Google+ post, let’s do it!
Slide 106
First, remember how hard it is to prove causality with a public test like this. And second, don’t let anything but consistent, repeatable, provable results sway your opinion.
Slide 108
[SERP graphic showing positions #21–#26]
At 10:50am, the test URL ranked #26 in logged-out, non-personalized, non-geo-biased, Google US results.
Slide 109
[SERP graphic showing positions #21–#26]
42 minutes later, after ~30 shares, 40 +1s, and several other G+ accounts posting the link, the target moved up to position #23.
Slide 110
[SERP graphic showing positions #21–#26]
48 hours later, after 100 shares of the post, 95 +1s, and tons of additional posts, the result was back down to #25.
Slide 111: At least we proved one thing – the Google+ community is awesome. Nearly 50 people shared the URL in their own posts on G+!
Slide 112: Many G+ users’ personalized results, however, were clearly affected.
Slide 113
[SERP graphic showing positions #21–#30]
Something very strange is happening in relation to the test URL in my personalized results, though. It’s actually ranking LOWER than in non-personalized results.
Slide 114
Could Google be donking up the test?
Sadly, it’s impossible to know.
Slide 115
GASP!!! The posts did move the result up, then someone from Google must have seen it and is messing with you!!!
Slide 116
Sigh… It’s possible that Jenny’s right, but impossible to prove. We don’t know for sure what caused the initial movement, nor can we say what’s causing the weird personalized results.
Slide 117
More testing is needed, but how to do it without any potential monkey wrenches is going to be a big challenge.
That said, remember this:
Slide 119
If I were Google, I wouldn’t use Google+ activity by itself to rank anything, but I would connect G+ to my other data sources and potentially increase a page’s rankings if many pieces of data told a story of engagement & value for visitors.
Slide 121: Rand Fishkin, Wizard of Moz | @randfish | rand@moz.com
bit.ly/mozskeptics