BrightonSEO - The Search Universe - Links, Log Files, GSC and Everything in Between
TRANSCRIPT
Jon Myers, Chief Growth Officer - DeepCrawl
The Search Universe - Links, Log Files, GSC and everything in between. @DeepCrawl @JonDMyers
What we will cover…
@DeepCrawl @JonDMyers
Using Crawling Data with:
• Backlinks
• Log File Data
• Backlink and Log File Data
• Where does GSC fit in and what does it add?
• What about the Consumer, and where GA fits
Using backlinks data with crawl data
Things to consider:
• How is backlink authority distributed?
• Which pages are the backlinks landing on?
• Positive or negative backlinks?
• Backlinks to low-DeepRank / deep-level pages?
• Backlinks to orphaned pages?
• Backlinks to broken or non-indexable pages?
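The checks above can be sketched by joining a crawl export with a backlink export keyed on URL. This is a minimal illustration in plain Python; the field names (url, status, indexable, backlinks) are assumptions for the sketch, not any particular tool's export schema:

```python
# Flag URLs that attract backlinks but cannot pass their value on:
# broken pages, non-indexable pages, and pages missing from the crawl
# (orphaned). Field names are illustrative assumptions.

def backlink_issues(crawl_rows, backlink_rows):
    crawled = {row["url"]: row for row in crawl_rows}
    issues = []
    for row in backlink_rows:
        url, links = row["url"], row["backlinks"]
        page = crawled.get(url)
        if page is None:
            issues.append((url, links, "orphaned (not in crawl)"))
        elif page["status"] >= 400:
            issues.append((url, links, f"broken ({page['status']})"))
        elif not page["indexable"]:
            issues.append((url, links, "non-indexable"))
    return issues

crawl = [
    {"url": "myurl.com/my-page", "status": 200, "indexable": False},
    {"url": "myurl.com/ok", "status": 200, "indexable": True},
    {"url": "myurl.com/broken", "status": 404, "indexable": False},
]
backlinks = [
    {"url": "myurl.com/my-page", "backlinks": 876},
    {"url": "myurl.com/broken", "backlinks": 120},
    {"url": "myurl.com/orphan", "backlinks": 45},
]
for url, links, reason in backlink_issues(crawl, backlinks):
    print(url, links, reason)
```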
Disallowed myurl.com/my-page 1000s of Backlinks!
Simple URL Story with Backlinks
myurl.com/my-page 876 Backlinks
URL story with backlinks
myurl.com/my-page 456 Backlinks
Duplicates Canonical
456 Backlinks!
1,332 Backlinks!
Incorporating log file data into a crawl reveals where crawl budget is being wasted. Identify non-indexable pages receiving bot requests:
• Hits to redirect chains (reduce the number of redirects)
• Hits to 404/410 status codes (crawled occasionally)
• Hits to soft 404s or orphaned pages (redirect to another page?)
• Hits to 3xx status codes (investigate why they are being indexed)
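A minimal sketch of that log-file check, assuming combined-log-format access lines and filtering bot traffic on the literal "Googlebot" substring; the sample log lines are illustrative:

```python
import re
from collections import Counter

# Count Googlebot hits that land on redirects and broken URLs.
# Assumes combined log format; the regex pulls the request path and
# the numeric status code out of each line.
LOG_RE = re.compile(r'"(?:GET|POST|HEAD) (\S+) [^"]*" (\d{3})')

def wasted_hits(log_lines):
    buckets = Counter()
    for line in log_lines:
        if "Googlebot" not in line:
            continue  # only interested in search engine bot requests
        m = LOG_RE.search(line)
        if not m:
            continue
        status = int(m.group(2))
        if 300 <= status < 400:
            buckets["3xx redirect hits"] += 1
        elif status in (404, 410):
            buckets["404/410 broken hits"] += 1
    return buckets

logs = [
    '66.249.66.1 - - [10/Sep/2019:10:00:00 +0000] "GET /old-page HTTP/1.1" 301 0 "-" "Googlebot/2.1"',
    '66.249.66.1 - - [10/Sep/2019:10:00:01 +0000] "GET /gone HTTP/1.1" 404 0 "-" "Googlebot/2.1"',
    '10.0.0.1 - - [10/Sep/2019:10:00:02 +0000] "GET /gone HTTP/1.1" 404 0 "-" "Mozilla/5.0"',
]
print(wasted_hits(logs))
```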
Find out how frequently search engine bots hit your site. For pages receiving few bot requests you could consider:
• Adding more internal links to that page
• Adding a “Last Modified” date to the sitemap
• Making sure no non-indexable pages are in the sitemap
• Splitting the sitemap into smaller ones, including one sitemap for new pages
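Those sitemap checks could be sketched as a cross-reference of sitemap URLs against crawl and log data. All field names and the 5-hit threshold here are illustrative assumptions, not any specific tool's schema:

```python
# Flag sitemap entries that the crawl marks non-indexable, and pages
# receiving few bot requests that might need more internal links.
# Field names and the threshold are illustrative assumptions.

def sitemap_issues(sitemap_urls, crawl, bot_hits, low_hit_threshold=5):
    issues = []
    for url in sitemap_urls:
        if not crawl.get(url, {}).get("indexable", False):
            issues.append((url, "non-indexable page in sitemap"))
        if bot_hits.get(url, 0) < low_hit_threshold:
            issues.append((url, "few bot requests: add internal links?"))
    return issues

crawl = {
    "myurl.com/products": {"indexable": True},
    "myurl.com/non-indexable": {"indexable": False},
}
bot_hits = {"myurl.com/products": 232}
for url, issue in sitemap_issues(
    ["myurl.com/products", "myurl.com/non-indexable"], crawl, bot_hits
):
    print(url, "->", issue)
```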
Understand How Search Engine Bots Crawl Your Site With Their Different User Agents
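Splitting log hits by user-agent variant can be sketched as below. The substrings checked appear in Google's published crawler user-agent strings; the ordering of the checks is an assumption for this sketch:

```python
from collections import Counter

# Classify a log hit's user agent into a Googlebot variant.
# Googlebot-Image is checked first because its UA also starts with
# "Googlebot"; the smartphone UA contains "Mobile Safari".
def classify_bot(user_agent):
    if "Googlebot-Image" in user_agent:
        return "Googlebot Image"
    if "Googlebot" in user_agent and "Mobile" in user_agent:
        return "Googlebot Smartphone"
    if "Googlebot" in user_agent:
        return "Googlebot Desktop"
    return "Other"

user_agents = [
    "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
    "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.96 "
    "Mobile Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)",
    "Googlebot-Image/1.0",
]
print(Counter(classify_bot(ua) for ua in user_agents))
```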
I love log-files! xx
myurl.com/non-indexable 100 requests
Evaluate crawl budget wastage
myurl.com/broken 330 requests
myurl.com/page?sort=price 56 requests
myurl.com/products 232 requests
Determine most crawled site sections
myurl.com/categories 330 requests
myurl.com/contact 2,300 requests
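Counting requests per top-level path segment is one way to determine the most crawled site sections. A minimal sketch; the request paths are illustrative:

```python
from collections import Counter

# Reduce each requested path to its top-level site section
# (first path segment), then tally requests per section.
def section_of(path):
    first = path.lstrip("/").split("/")[0]
    return "/" + first if first else "/"

requested_paths = [
    "/categories/shoes", "/categories/bags",
    "/contact", "/products/red-dress",
]
by_section = Counter(section_of(p) for p in requested_paths)
print(by_section.most_common())
```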
URL story with GSC
myurl.com/my-page Traffic Disallowed
myurl.com/my-page Traffic
404
Pages with image search traffic but broken images
myurl.com/my-page
5,320 Clicks
Adding your traffic to the equation
1. It is one thing to have internal and external power in Google’s eyes
2. You need to add the consumer’s power
3. Where and on what do your customers see and convert?
4. Adding GA to the mix, and the traffic knowledge it brings
An advanced story with visits & backlinks
Disallowed
myurl.com/my-page
2,768 Visits!
301 redirection
myurl.com/redirect-page
301 redirection
myurl.com/other-page
Noindexed!
No Backlinks!
1000s of Backlinks!
A simple URL story with visits
“Google Drives Awareness but people drive ROI!”
Now it gets fun!
Canonicalises
myurl.com/dress?colour=red
myurl.com/dresses
2,768 Visits
1000s of Backlinks
50 Googlebots
2,423 Clicks
Maintaining value through migrations
myurl.com/page.aspx
myurl.com/c/page
301 redirect
myurl.com/page
301 redirect
1 Request
500 Requests
1 Request
2,768 Visits
2,768 Visits
2,768 Visits
500 Links
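The chain above can be checked programmatically by tracing redirects through a crawl export. Here the export is modelled as a simple {url: (status, target)} map mirroring the slide's URLs; this is a sketch under that assumption, not DeepCrawl's actual output format:

```python
# Follow 301 hops from a starting URL until a non-redirect status is
# reached, returning the chain of URLs visited and the final status.
# max_hops guards against redirect loops.
def trace(url, pages, max_hops=10):
    chain = [url]
    for _ in range(max_hops):
        status, target = pages.get(url, (None, None))
        if status != 301 or target is None:
            return chain, status
        url = target
        chain.append(url)
    return chain, None  # gave up: loop or chain too long

pages = {
    "myurl.com/page.aspx": (301, "myurl.com/c/page"),
    "myurl.com/c/page":    (301, "myurl.com/page"),
    "myurl.com/page":      (200, None),
}
chain, final = trace("myurl.com/page.aspx", pages)
print(chain, final)
```

A chain of more than two URLs ending in 200 suggests pointing the first redirect straight at the final URL, so the old page's links pass their value in one hop.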
Maintaining value through migrations
myurl.com/page.aspx
myurl.com/c/page
301 redirect
myurl.com/page
404
1 Request
1 Request
1 Request
0 Visits
0 Visits
0 Visits
500 Links
Your search universe from DeepCrawl
Correlation between:
• Backlink Authority
• Internal Link Authority “DeepRank”
• Logfile Googlebot Data
• Performance Data from GSC
• Consumer and Traffic Data from GA
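One way to build that correlation is to merge each per-URL export into a single "search universe" record. A minimal sketch; the source names and figures reuse the slides' examples, and the structure is otherwise an illustrative assumption:

```python
# Merge several per-URL exports (backlinks, log hits, GSC clicks,
# GA visits) into one record per URL; missing values show as None.
def merge_sources(**sources):
    urls = set().union(*(src.keys() for src in sources.values()))
    return {
        url: {name: src.get(url) for name, src in sources.items()}
        for url in urls
    }

universe = merge_sources(
    backlinks={"myurl.com/dresses": 1000},
    googlebot_hits={"myurl.com/dresses": 50, "myurl.com/contact": 2300},
    gsc_clicks={"myurl.com/dresses": 2423},
    ga_visits={"myurl.com/dresses": 2768},
)
print(universe["myurl.com/dresses"])
print(universe["myurl.com/contact"])  # gaps show as None
```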