
Page 1: How I’d help Shepherd Neame improve their SEO (Organic presence)

How I’d help Shepherd Neame improve their SEO (Organic presence) – An SEO Audit

www.shepherdneame.co.uk

Page 2: How I’d help Shepherd Neame improve their SEO (Organic presence)

Contents

1. Introduction
2. Recent projects/successes
3. Awards
4. Quick Stats
5. Drop in Visibility
6. Technical on-site issues
7. Competitive landscape and off-page
8. Strategy

Page 3: How I’d help Shepherd Neame improve their SEO (Organic presence)

Quick Stats

“The map is not the territory” – Alfred Korzybski

Caveat: If this were a typical audit, I’d run through their analytics and start the assessment from there, but for this audit I am relying exclusively on third-party data.

Page 4: How I’d help Shepherd Neame improve their SEO (Organic presence)

To investigate SN’s organic search performance, I observed the SEO Visibility reported by Searchmetrics over the past 2 years:

Page 5: How I’d help Shepherd Neame improve their SEO (Organic presence)

Loss of keywords: these are your local pub names and geographical searches that have been lost due to Google's Pigeon algorithm update. As you can see, the loss of these keywords matches the loss in traffic.

Page 6: How I’d help Shepherd Neame improve their SEO (Organic presence)

To find how many pages are actually out there, I crawled your entire site using Screaming Frog’s SEO Spider, IIS Manager and Xenu Link Sleuth.

Xenu found 20,326 pages (HTML/text), IIS 8,532 and Screaming Frog 16,711, while Google has indexed 5,210. This discrepancy may be due to the CDN subdomain.

The warning below suggests duplicate content issues.

Page 7: How I’d help Shepherd Neame improve their SEO (Organic presence)

A robots.txt file is used to restrict search engines from accessing specific sections of a site. SN’s site helpfully provides an explanation!

Although not technically wrong, the site is using the default Drupal robots.txt, which suggests it has not been given much consideration.

I’d remove the explanation and provide a link to the XML sitemap (of which more later).

I would also remove the Crawl-delay line. Unless you have a very large site or spidering problems, it's not needed.

The entries are technically accurate, but they are unnecessarily verbose, and many are no longer relevant to the site.

As a general rule, it’s better to noindex pages via the meta robots tag.
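To illustrate, a slimmed-down robots.txt might look something like the sketch below. This is only a sketch: the Disallow entries are assumptions based on typical Drupal installs, and the sitemap URL is assumed, so both would need confirming against the live site.

User-agent: *
# default Drupal explanatory comments and Crawl-delay removed
Disallow: /admin/
Disallow: /user/
Sitemap: http://www.shepherdneame.co.uk/sitemap.xml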

Page 8: How I’d help Shepherd Neame improve their SEO (Organic presence)

The XML sitemap complements the robots.txt file, with the former focusing on indexability/inclusion and the latter on exclusion.

The sitemap looks like a default setup and only has 140 pages, some of which should not be there.

For example:

http://www.shepherdneame.co.uk/pubs/search/51.2601450%2C0.8442802_300
http://www.shepherdneame.co.uk/pubs/search/51.2622513%2C-0.4672517_300

etc. These are search results pages and indicate duplication in the eyes of Google, as the page titles and meta descriptions are the same (there are 746 of these pages with matching titles and descriptions).
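In line with the earlier point about preferring meta tags for exclusion, one option (a sketch only; exactly where this goes depends on the Drupal template used for the pub search results) is to drop the /pubs/search/ URLs from the sitemap and add a robots meta tag to the search results template:

<meta name="robots" content="noindex, follow" />

This keeps the pub finder usable for visitors while telling Google not to index the 746 near-duplicate result pages.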

Page 9: How I’d help Shepherd Neame improve their SEO (Organic presence)

After the pub search result pages (746), the worst directory for duplicate titles is /blog.

There are 178 pages just listing the blog posts, the last being http://www.shepherdneame.co.uk/blog?keys=&page=177&tag=/Early%20Bird

The first is http://www.shepherdneame.co.uk/blog?keys=&tag=/Early%20Bird, which is exactly the same page as http://www.shepherdneame.co.uk/blog, but has a rel canonical back to the /blog page, as do all 178 pages.

This is poor practice and may lead Search Engines to ignore canonical directives on the site, or think that you only have one page.

A better implementation would be to remove the rel canonical, add sequential rel=next and rel=prev tags, and change the title tags and meta descriptions to show the sequence:

Page 10: How I’d help Shepherd Neame improve their SEO (Organic presence)

Top page Title: Brewing, beer and pub blog from Shepherd Neame
Top page Meta Description: Read the latest brewing, beer and pub news, from Britain's Oldest Brewer – Shepherd Neame

Page 4 Title: Page 4 of 178 of Shepherd Neame’s Blog
Page 4 Meta Description: Read listing page 4 of the latest brewing, beer and pub news, from Britain's Oldest Brewer – Shepherd Neame

Implement the following tags:

<link rel="prev" href="http://www.example.com/article?story=abc&page=1" />
<link rel="next" href="http://www.example.com/article?story=abc&page=3" />
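Applied to the blog listing (illustrative only, and assuming Drupal’s zero-based page parameter, so “page 4” of the sequence is ?page=3), the tags on that page would be something like:

<link rel="prev" href="http://www.shepherdneame.co.uk/blog?keys=&page=2&tag=/Early%20Bird" />
<link rel="next" href="http://www.shepherdneame.co.uk/blog?keys=&page=4&tag=/Early%20Bird" />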

Page 11: How I’d help Shepherd Neame improve their SEO (Organic presence)
Page 12: How I’d help Shepherd Neame improve their SEO (Organic presence)

Site speed is becoming increasingly important to Google as a ranking signal. Additionally, since search engine crawlers have a limited crawl budget, they crawl faster sites more thoroughly and more regularly than slower ones.

Walmart.com found that for every second of improvement in page load time, conversion rate can improve by 2%.

SN’s site speed is decent enough, but could still be improved with a few best practices (e.g., eliminating render-blocking JavaScript, optimizing images, leveraging browser caching, etc.).

Source: http://www.webperformancetoday.com/2014/04/09/web-page-speed-affect-conversions-infographic/
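On the render-blocking JavaScript point above, a sketch of the kind of change involved (the script path here is purely illustrative, and this assumes the theme’s JavaScript isn’t needed before the first render):

<script src="/sites/all/themes/example-theme/js/scripts.js" defer></script>

Browser caching for static assets (images, CSS, JS) is usually handled with Cache-Control/Expires response headers set at the web server or CDN level.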

In addition to suboptimal load times, many of the site’s pages have references to inaccessible objects (i.e., objects that return 4xx HTTP status codes). I recommend fixing these broken references because they unnecessarily waste both processing and network resources.
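These broken references can be pulled straight from the Screaming Frog crawl, or spot-checked individually; for example (the URL below is a placeholder, the real ones would come from the crawl output):

curl -s -o /dev/null -w "%{http_code}\n" http://www.shepherdneame.co.uk/path/to/referenced-object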

Page 13: How I’d help Shepherd Neame improve their SEO (Organic presence)
Page 14: How I’d help Shepherd Neame improve their SEO (Organic presence)

Not included

URLs
HTML Markup
Content
Internal linking structure
Internal keyword usage
Site Architecture

Backlink Analysis
Social Engagement
YouTube Video Analysis