David Brown - Crawl Efficiency & Fixing Common Crawl Issues
TRANSCRIPT
![Page 1: David Brown - Crawl Efficiency & Fixing Common Crawl Issues](https://reader031.vdocuments.us/reader031/viewer/2022012405/58823c4f1a28ab31228b63b3/html5/thumbnails/1.jpg)
Overcoming Common Crawl Optimisation Issues
![Page 2: David Brown - Crawl Efficiency & Fixing Common Crawl Issues](https://reader031.vdocuments.us/reader031/viewer/2022012405/58823c4f1a28ab31228b63b3/html5/thumbnails/2.jpg)
@DeepCrawl
Contents

• Importance of Technical SEO
• Crawl Space & Budget
• Benchmarking
• Authority, Speed & Efficiency
• The Future
• Summary & Questions
![Page 3: David Brown - Crawl Efficiency & Fixing Common Crawl Issues](https://reader031.vdocuments.us/reader031/viewer/2022012405/58823c4f1a28ab31228b63b3/html5/thumbnails/3.jpg)
How important is Technical SEO?
![Page 4: David Brown - Crawl Efficiency & Fixing Common Crawl Issues](https://reader031.vdocuments.us/reader031/viewer/2022012405/58823c4f1a28ab31228b63b3/html5/thumbnails/4.jpg)
“Even a basic understanding of what to look for in technical SEO can get you far. So many people today focus too heavily on off-page SEO, but if a site is technically flawed, it won’t matter how many links you have or how good your content is.”

Erin Everhart, SEO Manager, The Home Depot
![Page 5: David Brown - Crawl Efficiency & Fixing Common Crawl Issues](https://reader031.vdocuments.us/reader031/viewer/2022012405/58823c4f1a28ab31228b63b3/html5/thumbnails/5.jpg)
“Infinite Crawl Space” (Google, 2008)
![Page 6: David Brown - Crawl Efficiency & Fixing Common Crawl Issues](https://reader031.vdocuments.us/reader031/viewer/2022012405/58823c4f1a28ab31228b63b3/html5/thumbnails/6.jpg)
“Infinite Crawl Space”
![Page 7: David Brown - Crawl Efficiency & Fixing Common Crawl Issues](https://reader031.vdocuments.us/reader031/viewer/2022012405/58823c4f1a28ab31228b63b3/html5/thumbnails/7.jpg)
Number of websites vs internet users (bn), 1993-2014

Source: Internet Live Stats, W3C. *Estimation based on figures from June 2014
![Page 8: David Brown - Crawl Efficiency & Fixing Common Crawl Issues](https://reader031.vdocuments.us/reader031/viewer/2022012405/58823c4f1a28ab31228b63b3/html5/thumbnails/8.jpg)
Crawl Budget

“The amount of time Googlebot spends crawling your site and indexing pages”
![Page 9: David Brown - Crawl Efficiency & Fixing Common Crawl Issues](https://reader031.vdocuments.us/reader031/viewer/2022012405/58823c4f1a28ab31228b63b3/html5/thumbnails/9.jpg)
Common eCommerce Issues

• Unique: 39%
• Duplicate: 8%
• Paginated: 1%
• Non Indexable: 40%
• Non 200: 11%
• Failed: 1%

*Data taken from 157 sample crawls (10,000 URLs) of UK ecommerce websites
![Page 10: David Brown - Crawl Efficiency & Fixing Common Crawl Issues](https://reader031.vdocuments.us/reader031/viewer/2022012405/58823c4f1a28ab31228b63b3/html5/thumbnails/10.jpg)
What can you Do?
![Page 11: David Brown - Crawl Efficiency & Fixing Common Crawl Issues](https://reader031.vdocuments.us/reader031/viewer/2022012405/58823c4f1a28ab31228b63b3/html5/thumbnails/11.jpg)
Ease of Access & Valuable Pages
![Page 12: David Brown - Crawl Efficiency & Fixing Common Crawl Issues](https://reader031.vdocuments.us/reader031/viewer/2022012405/58823c4f1a28ab31228b63b3/html5/thumbnails/12.jpg)
Ensure that your pages are easy to reach

Deeply buried content is hard for users to find and unlikely to be crawled by Googlebot
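One practical way to measure reachability is click depth: how many links a crawler must follow from the homepage to reach a page. A minimal sketch in Python, assuming you already have an internal link graph from a crawl (the URLs and structure below are hypothetical):

```python
from collections import deque

def crawl_depths(links, start="/"):
    """Breadth-first search over an internal link graph.

    links: dict mapping each URL to the list of URLs it links to.
    Returns a dict of URL -> minimum click depth from the start page.
    """
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical site: /deep-product sits 3 clicks from the homepage,
# making it a candidate for better internal linking.
site = {
    "/": ["/category", "/about"],
    "/category": ["/subcategory"],
    "/subcategory": ["/deep-product"],
}
print(crawl_depths(site))
```

Pages that come back with a high depth (or never appear in the graph at all) are the ones both users and Googlebot are likely to miss.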
![Page 13: David Brown - Crawl Efficiency & Fixing Common Crawl Issues](https://reader031.vdocuments.us/reader031/viewer/2022012405/58823c4f1a28ab31228b63b3/html5/thumbnails/13.jpg)
Use visitor data to identify low value pages
![Page 14: David Brown - Crawl Efficiency & Fixing Common Crawl Issues](https://reader031.vdocuments.us/reader031/viewer/2022012405/58823c4f1a28ab31228b63b3/html5/thumbnails/14.jpg)
Use visitor data to identify high value pages
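As a sketch of the idea, pages can be split into low- and high-value sets by a visit threshold taken from an analytics export; the URLs, figures and threshold below are hypothetical and should be tuned per site:

```python
def rank_by_visits(analytics, threshold=10):
    """Split pages into low- and high-value sets by visit counts.

    analytics: dict of URL -> sessions over some period (hypothetical
    export, e.g. from your analytics tool). The threshold is arbitrary.
    """
    low = sorted(u for u, v in analytics.items() if v < threshold)
    high = sorted(u for u, v in analytics.items() if v >= threshold)
    return low, high

# Hypothetical analytics export: low-traffic pages are candidates
# for consolidation or removal; high-traffic pages deserve priority.
visits = {"/": 5000, "/old-promo": 2, "/blog/post": 340, "/tag/misc": 1}
print(rank_by_visits(visits))
```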
![Page 15: David Brown - Crawl Efficiency & Fixing Common Crawl Issues](https://reader031.vdocuments.us/reader031/viewer/2022012405/58823c4f1a28ab31228b63b3/html5/thumbnails/15.jpg)
Site Speed & Redirects
![Page 16: David Brown - Crawl Efficiency & Fixing Common Crawl Issues](https://reader031.vdocuments.us/reader031/viewer/2022012405/58823c4f1a28ab31228b63b3/html5/thumbnails/16.jpg)
Ensure your pages can be crawled quickly
https://www.youtube.com/watch?v=opUfIzuzJSw&feature=youtu.be&t=1010
![Page 17: David Brown - Crawl Efficiency & Fixing Common Crawl Issues](https://reader031.vdocuments.us/reader031/viewer/2022012405/58823c4f1a28ab31228b63b3/html5/thumbnails/17.jpg)
Keep redirects to a minimum
![Page 18: David Brown - Crawl Efficiency & Fixing Common Crawl Issues](https://reader031.vdocuments.us/reader031/viewer/2022012405/58823c4f1a28ab31228b63b3/html5/thumbnails/18.jpg)
Considerations for improving Time on Site & Site Speed

• Put your most important pages first
• Minimise the number of redirect loops/chains
• Cache content that changes infrequently and/or move static content to a Content Delivery Network (CDN)
• Enable compression (Gzip)
• Improve server response times
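The redirect advice above can be checked offline. A minimal sketch, assuming you can export a mapping of source URL to redirect target from your server config or a crawl (the URLs here are hypothetical):

```python
def redirect_chain(redirects, url, max_hops=10):
    """Follow a mapping of URL -> redirect target and return the chain.

    redirects: dict of source URL -> destination URL (hypothetical data).
    Returns (chain, status) where status is "ok" for a single hop,
    "chain" for multiple hops, or "loop" when a URL repeats.
    """
    chain = [url]
    while url in redirects and len(chain) <= max_hops:
        url = redirects[url]
        if url in chain:          # redirect loop detected
            chain.append(url)
            return chain, "loop"
        chain.append(url)
    status = "chain" if len(chain) > 2 else "ok"
    return chain, status

# Each extra hop costs Googlebot (and users) an extra round trip:
hops = {"/old": "/interim", "/interim": "/new"}
print(redirect_chain(hops, "/old"))
```

Chains flagged this way should usually be collapsed so that `/old` redirects straight to `/new`.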
![Page 19: David Brown - Crawl Efficiency & Fixing Common Crawl Issues](https://reader031.vdocuments.us/reader031/viewer/2022012405/58823c4f1a28ab31228b63b3/html5/thumbnails/19.jpg)
Making the most of Googlebot’s time
![Page 20: David Brown - Crawl Efficiency & Fixing Common Crawl Issues](https://reader031.vdocuments.us/reader031/viewer/2022012405/58823c4f1a28ab31228b63b3/html5/thumbnails/20.jpg)
Remove all Duplicate Pages

• Duplicates make Googlebot work twice as hard
• They weaken the authority of the primary page you want to be indexed
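One way to surface duplicates at scale is to hash normalised page content and group URLs whose digests match. A minimal sketch, assuming you have crawled page bodies (the URLs and content below are hypothetical):

```python
import hashlib
from collections import defaultdict

def find_duplicates(pages):
    """Group URLs whose normalised body content hashes identically.

    pages: dict of URL -> page body text (hypothetical crawl output).
    Normalisation here is crude (collapse whitespace, lowercase);
    real pipelines usually strip boilerplate first.
    """
    groups = defaultdict(list)
    for url, body in pages.items():
        normalised = " ".join(body.split()).lower()
        digest = hashlib.sha256(normalised.encode()).hexdigest()
        groups[digest].append(url)
    return [urls for urls in groups.values() if len(urls) > 1]

crawl = {
    "/shoes": "Red running shoes",
    "/shoes?sessionid=42": "Red  running shoes",  # same content, tracking parameter
    "/hats": "Blue hats",
}
print(find_duplicates(crawl))
```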
![Page 21: David Brown - Crawl Efficiency & Fixing Common Crawl Issues](https://reader031.vdocuments.us/reader031/viewer/2022012405/58823c4f1a28ab31228b63b3/html5/thumbnails/21.jpg)
Non-Indexable does not mean Non-Crawlable
![Page 22: David Brown - Crawl Efficiency & Fixing Common Crawl Issues](https://reader031.vdocuments.us/reader031/viewer/2022012405/58823c4f1a28ab31228b63b3/html5/thumbnails/22.jpg)
“Before implementing a rel=canonical tag, you have to ask yourself whether it actually addresses the underlying issue, or whether it’s a slap-dash fix that serves as a mere cosmetic cover-up for the problem.”

Barry Adams, SEO Consultant & State of Digital Editor
![Page 23: David Brown - Crawl Efficiency & Fixing Common Crawl Issues](https://reader031.vdocuments.us/reader031/viewer/2022012405/58823c4f1a28ab31228b63b3/html5/thumbnails/23.jpg)
Utilise robots.txt

• When used correctly, robots.txt is incredibly effective at freeing up crawl budget for the pages that matter
• Always test any changes, as it’s very easy to make mistakes
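As an illustration, a robots.txt along these lines keeps crawlers out of common crawl traps; the domain and paths are hypothetical and any rules should be tested before deployment:

```
User-agent: *
# Block internal search results and faceted/session parameters
# that generate near-infinite URL variations
Disallow: /search
Disallow: /*?sort=
Disallow: /*?sessionid=

Sitemap: https://www.example.com/sitemap.xml
```

Note that wildcard patterns like `/*?sort=` are honoured by Googlebot but are not part of the original robots.txt convention, so verify behaviour per crawler.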
![Page 24: David Brown - Crawl Efficiency & Fixing Common Crawl Issues](https://reader031.vdocuments.us/reader031/viewer/2022012405/58823c4f1a28ab31228b63b3/html5/thumbnails/24.jpg)
![Page 25: David Brown - Crawl Efficiency & Fixing Common Crawl Issues](https://reader031.vdocuments.us/reader031/viewer/2022012405/58823c4f1a28ab31228b63b3/html5/thumbnails/25.jpg)
Manage URL Parameters
![Page 26: David Brown - Crawl Efficiency & Fixing Common Crawl Issues](https://reader031.vdocuments.us/reader031/viewer/2022012405/58823c4f1a28ab31228b63b3/html5/thumbnails/26.jpg)
Considerations for improving Efficiency

• Instruct Google (in Google Search Console) to ignore parameters
• Utilise the robots.txt file where appropriate (TEST THIS)
• Remember that non-indexable doesn’t mean it’s not crawled
• Ensure there is no duplication
• Use self-referencing canonical tags
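The self-referencing canonical advice can look like this in practice; the domain and paths are hypothetical:

```html
<!-- Served on https://www.example.com/shoes AND on parameterised
     variants such as /shoes?sort=price, so every version points
     at the one clean URL you want indexed -->
<link rel="canonical" href="https://www.example.com/shoes" />
```

Because the clean page declares itself canonical, any parameterised copy that reuses the same template automatically consolidates signals back to it.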
![Page 27: David Brown - Crawl Efficiency & Fixing Common Crawl Issues](https://reader031.vdocuments.us/reader031/viewer/2022012405/58823c4f1a28ab31228b63b3/html5/thumbnails/27.jpg)
What can be Achieved?
![Page 28: David Brown - Crawl Efficiency & Fixing Common Crawl Issues](https://reader031.vdocuments.us/reader031/viewer/2022012405/58823c4f1a28ab31228b63b3/html5/thumbnails/28.jpg)
What can be achieved?
https://www.deepcrawl.com/case-studies/modanisa-sales-up-400-thanks-to-seo-audit/
![Page 29: David Brown - Crawl Efficiency & Fixing Common Crawl Issues](https://reader031.vdocuments.us/reader031/viewer/2022012405/58823c4f1a28ab31228b63b3/html5/thumbnails/29.jpg)
Where are we Heading?
![Page 30: David Brown - Crawl Efficiency & Fixing Common Crawl Issues](https://reader031.vdocuments.us/reader031/viewer/2022012405/58823c4f1a28ab31228b63b3/html5/thumbnails/30.jpg)
The Future of Technical SEO

• Organic searches have been made secure, making keyword research increasingly difficult, so it’s even more important to get Technical SEO right
• Google will allow you to purchase additional crawl budget if you feel you aren’t being crawled enough
• Competition to be indexed & rank continues to increase, with 40,000 Google search queries per second and more and more websites; ensure sitemaps are accurate to direct Google to the pages you prioritise
• Mobile will continue to have great significance in the world of search, with more than half of searches happening on mobile; the principles of getting Technical SEO right will still apply
![Page 31: David Brown - Crawl Efficiency & Fixing Common Crawl Issues](https://reader031.vdocuments.us/reader031/viewer/2022012405/58823c4f1a28ab31228b63b3/html5/thumbnails/31.jpg)
How to get ahead of the Game

• The first step to being found in Search results is being crawled and indexed by Googlebot
• Canonicalising pages removes the duplication issue but can create crawl efficiency issues
• Make life as easy as possible for Search Engines to crawl and index your site(s)
• Big doesn’t necessarily mean better; often, less is more
• Test, test and test changes again and again
• Stay up to date with the latest developments: watch Webmaster Hangouts (or read the summaries in our newsletter!)
• All these changes are of benefit to the user
![Page 32: David Brown - Crawl Efficiency & Fixing Common Crawl Issues](https://reader031.vdocuments.us/reader031/viewer/2022012405/58823c4f1a28ab31228b63b3/html5/thumbnails/32.jpg)
@David_BrownUK
Thank You!