A Guide to Technical SEO

You wouldn't build a million-dollar house on quicksand, so don't build a million-dollar website on poor foundations.

How does Google Work?

Even most experienced SEOs don't fully understand how Google works.

Crawl

Google has a spider, Googlebot, which crawls a list of URLs stored in its crawl queue.
URLs get added to that list from a number of places: links found on other pages, your sitemap, plus other sources.
But Googlebot's main job is simply to crawl that list of URLs and find more URLs to crawl.
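To make the idea concrete, here's a rough sketch of a crawl queue in Python. It's a simplified illustration of the loop, not Google's actual crawler, and the start URL is just an example.

    # A minimal sketch of the crawl-queue idea - NOT Google's actual crawler.
    # Assumes plain HTML pages on a single site; the start URL is an example.
    from collections import deque
    from html.parser import HTMLParser
    from urllib.parse import urljoin, urlparse
    from urllib.request import urlopen

    class LinkParser(HTMLParser):
        """Collects href values from <a> tags."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    def crawl(start_url, max_pages=50):
        queue = deque([start_url])          # the "crawl queue"
        seen = {start_url}
        host = urlparse(start_url).netloc

        while queue and len(seen) <= max_pages:
            url = queue.popleft()
            try:
                html = urlopen(url, timeout=10).read().decode("utf-8", "ignore")
            except Exception as exc:
                print(f"FAILED {url}: {exc}")
                continue

            parser = LinkParser()
            parser.feed(html)
            for href in parser.links:
                absolute = urljoin(url, href)
                # Only follow links on the same site, and only once.
                if urlparse(absolute).netloc == host and absolute not in seen:
                    seen.add(absolute)
                    queue.append(absolute)  # newly discovered URLs join the queue

        return seen

    print(crawl("https://www.example.com/"))

The loop is the thing to take away: crawl a URL, pull out its links, add the new ones to the queue, repeat.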

Indexing 

This is the stage most SEOs think Googlebot handles, but that's not the case.

Indexing is where Google analyses the pages it has crawled to determine what each page is about; this information is then added to its database.

It's also important to remember at this stage: if you make changes to a page and Google hasn't crawled it since, it won't go through the indexing phase again, so those changes won't be added to the index. In simple terms, if you add a section to your page about 'x', you won't rank for any terms to do with 'x' until Google recrawls the page.

Rankings

This part is more complex, and it's where Google is vaguer about what determines where you rank.

When a user performs a search, Google doesn't search the web; it searches its database (the index) to find relevant pages.

Factors such as authority, relevance, page speed and other technical signals then determine where each page will rank.

Let's dig into these in a bit more detail.


Crawling

Crawling is important because it's the first step: if Google can't crawl your website, it cannot do any of the other stages.

You want to check your server logs to see how often Google is crawling your site.

You also want to make sure your sitemap is up to date and accurate, as this is, after following links on the web, the most common way Google finds new URLs.

If you find Google has stopped crawling your site, check that you or a developer haven't accidentally added instructions on the site or in the robots.txt file telling Google not to crawl it.
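A quick way to sanity-check the robots.txt side is to test a few key URLs against it. Here's a minimal sketch using Python's standard library; the site and paths are just examples.

    # A minimal sketch: check whether robots.txt blocks Googlebot from key URLs.
    # The site and paths below are examples - swap in your own.
    from urllib.robotparser import RobotFileParser

    SITE = "https://www.example.com"
    PATHS_TO_CHECK = ["/", "/category/widgets/", "/blog/some-article/"]

    parser = RobotFileParser()
    parser.set_url(f"{SITE}/robots.txt")
    parser.read()

    for path in PATHS_TO_CHECK:
        url = f"{SITE}{path}"
        allowed = parser.can_fetch("Googlebot", url)
        print(f"{'ALLOWED' if allowed else 'BLOCKED'}  {url}")

Remember this only covers robots.txt; a noindex meta tag or X-Robots-Tag header added to your templates is a separate check.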

Indexing

So you've checked your logs and can see Google is crawling your site; next, you want to check whether it can actually read your content.

While it’s not as big of an issue as it once was but sites built-in Javascript may not be as readable by Google.

New technologies are great and have some really good long-term benefits; the issue is that Google isn't always as quick to understand the technology, which can hurt your ability to rank in the short term.
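A rough way to spot-check this is to fetch the raw HTML your server returns, i.e. before any JavaScript runs, and see whether your key content is actually in it. A simple sketch, with the URL and phrase as examples:

    # A rough spot-check: does key content appear in the server-rendered HTML,
    # i.e. before any JavaScript runs? The URL and phrase below are examples.
    from urllib.request import Request, urlopen

    URL = "https://www.example.com/some-page/"
    KEY_PHRASE = "free delivery on orders over"  # text you expect to see on the page

    req = Request(URL, headers={"User-Agent": "Mozilla/5.0 (compatible; content-check)"})
    html = urlopen(req, timeout=10).read().decode("utf-8", "ignore")

    if KEY_PHRASE.lower() in html.lower():
        print("Found in the raw HTML - the content doesn't depend on JavaScript rendering.")
    else:
        print("Not found in the raw HTML - the content may only exist after JavaScript runs.")

It's only a rough check, because Google can render JavaScript; but if important content only appears after rendering, it's worth confirming in Search Console's URL Inspection tool that Google actually sees it.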

Rankings

So Google is crawling your site and can read the content; that's a great start.

There are several factors which determine where you rank, but if we assume your content is good and you have authority, then it will be technical issues and errors that are holding you back.

Errors

Errors aren't great for either search engine bots or users, so finding and fixing them helps to improve your users' experience as well as Google's ability to crawl and index your site.

There are many types of errors you should be looking out for; however, some are more important than others and will deliver a better ROI when fixed.

 

To find errors, first you need some software. This can be done manually, but it's not worth it; invest in the right tool for the job. I will cover some of the tools I personally use later.

 

In an ideal world your website would have zero technical issues, but the bigger the site and the bigger the team, the more errors are likely to appear.

How to find the errors?

OK, there are five main ways to find errors. There are other methods too, but these five are the most useful.

1. Server Logs

One of the least used tools in an SEO toolkit. Server logs are the only true record of what Google is doing on your site.

The logs will highlight any issues that Googlebot is coming across. Around 60-70% of the errors will also show up in a crawl, but it's worth doing for the other 30% or so that won't.

If Googlebot is following an old link on a third-party site and landing on a 404 page, a crawl of the website won’t find this.

Another analysis you can do is looking at the frequency of Googlebot activity, which pages it is and isn't crawling, plus much more.
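To give a feel for the kind of analysis I mean, here's a rough sketch that pulls Googlebot errors out of a standard Apache/Nginx "combined" access log. The file name and format are assumptions, so adjust them for your server.

    # A rough sketch: find URLs where Googlebot hit 4xx/5xx responses in a server log.
    # Assumes the common Apache/Nginx "combined" log format and a file named access.log.
    import re
    from collections import Counter

    # e.g. 66.249.66.1 - - [10/Jan/2024:12:00:00 +0000] "GET /old-page/ HTTP/1.1" 404 ...
    LINE = re.compile(r'"(?P<method>[A-Z]+) (?P<path>\S+) [^"]*" (?P<status>\d{3})')

    errors = Counter()
    with open("access.log", encoding="utf-8", errors="ignore") as log:
        for line in log:
            if "Googlebot" not in line:  # crude filter; verify the IPs if accuracy matters
                continue
            match = LINE.search(line)
            if match and match.group("status").startswith(("4", "5")):
                errors[(match.group("status"), match.group("path"))] += 1

    for (status, path), hits in errors.most_common(20):
        print(f"{status}  {hits:>5}  {path}")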

2. Crawling

This is quite common in the industry; whereas logs don't seem to get the attention they deserve, crawling does, and rightly so.

Using a third-party tool you can quickly crawl your entire website and find any issues. Unlike the logs, which only show an error once Googlebot has hit the page, a crawl gives you a report on your entire site.

Log data can also be frustratingly limited, whereas, depending on which tool you use, a crawl can give you a lot more information to analyse.

Depending on the tool you use, you might just get the raw data to analyse yourself, or the tool might do the analysis and prioritise which errors need fixing first.

There are pros and cons to each, which is why I use a combination of the two.

3. Google Analytics

If you can't get access to your logs, or just want a quick check, analytics programmes like Google Analytics are a great way of spotting dead pages and slow-loading pages.

In fact, you can create a custom dashboard to highlight any issues you might have. More specifically, you can look at browser level and see whether you are having issues on Firefox, for example; this is something you would never get from the crawling tools.

There are other great analytics platforms out there, but I just prefer to use GA.

4. Google Search Console

Other great tools are Google Search Console and Bing Webmaster Tools. If you have specific technical issues, they will highlight the error, and once it's fixed you can request that Google comes and crawls the page again.

While this is a very good feature, it is limited in what it actually checks for and reports to you.
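If you want to check index status at scale rather than one URL at a time in the interface, Search Console also has an API. This is only a hedged sketch: it assumes the google-api-python-client library, a service account that has been added to your property, and example URLs.

    # A hedged sketch of checking a URL's index status via the Search Console
    # URL Inspection API. Assumes google-api-python-client is installed and a
    # service account (added to the property) with a local credentials file.
    from google.oauth2 import service_account
    from googleapiclient.discovery import build

    SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]
    creds = service_account.Credentials.from_service_account_file(
        "service-account.json", scopes=SCOPES)  # example credentials file

    service = build("searchconsole", "v1", credentials=creds)

    response = service.urlInspection().index().inspect(body={
        "siteUrl": "https://www.example.com/",              # your verified property
        "inspectionUrl": "https://www.example.com/some-page/",
    }).execute()

    # Print whatever index-status information comes back.
    index_status = response.get("inspectionResult", {}).get("indexStatusResult", {})
    for field in ("verdict", "coverageState", "robotsTxtState", "lastCrawlTime"):
        print(field, ":", index_status.get(field))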

5. Web Speed Tools

 

A section further along covers page speed specifically, but page speed tools will highlight the specific issues that are slowing down your website.

What Errors to look for?

This isn't a complete list of everything you should be looking for; in fact, it's the basics that must be checked. As every site is different, with different issues, there isn't a one-stop list of all the errors.

Tools you need:

Most of this can be done manually, but you would be wasting a lot of your own time and resources that could be better spent elsewhere.

There are a number of tools I use when auditing websites, most of which are available to download. The list isn't in any particular order.

 

  1. Screaming Frog (free, unless your site is over 500 pages, in which case it's £149)
  2. SEMRush (pricing depends on the size of your site and the number of pages you want to crawl).

 

I use Screaming Frog and SEMRush combined. Every Monday I do a manual deep-dive crawl of all the sites looking for errors, whereas SEMRush runs every day looking for top-line changes.

  1. Page Speed Tools – there's a whole article specifically on that
  2. Sitemap checker – until recently this was an internal tool, but I made it public so you can have the benefit as well (a minimal sketch of the idea follows this list)
  3. Convert Server Logs to Excel
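To give a flavour of what a basic sitemap check does (this is just a minimal illustration of the idea, not the tool above, and the sitemap URL is an example):

    # A rough sketch of a sitemap check: fetch sitemap.xml and report any listed
    # URL that no longer returns 200. The sitemap URL below is an example.
    import xml.etree.ElementTree as ET
    from urllib.error import HTTPError, URLError
    from urllib.request import Request, urlopen

    SITEMAP_URL = "https://www.example.com/sitemap.xml"

    def sitemap_urls(sitemap_url):
        xml = urlopen(sitemap_url, timeout=10).read()
        root = ET.fromstring(xml)
        # <loc> elements carry a namespace, so match on the tag suffix.
        return [el.text.strip() for el in root.iter() if el.tag.endswith("loc") and el.text]

    def check(urls):
        for url in urls:
            try:
                # Some servers dislike HEAD requests; switch to GET if needed.
                status = urlopen(Request(url, method="HEAD"), timeout=10).status
            except HTTPError as exc:
                status = exc.code
            except URLError as exc:
                status = f"unreachable ({exc.reason})"
            if status != 200:
                print(f"{status}  {url}")

    check(sitemap_urls(SITEMAP_URL))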

Page Speed Analysis

Page speed is super important both for UX and for SEO – in fact, it is believed to be the third biggest ranking factor for mobile rankings.

First you need to measure it before you can fix it; this helpful article shows the best tools for measuring page speed.

Once you have measured it, you can start addressing the issues slowing down your site.

Remember, it's unlikely to be just your home page that most people visit, so I always check key sections: maybe one category page, one product page (if ecommerce) and one blog article. While that's not every page (which can be done), these are usually templated pages, so it gives you a good idea.
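If you want to script that check across those representative template pages, the PageSpeed Insights API can be called directly. A sketch – the page URLs are examples, and anything beyond occasional use needs an API key:

    # A sketch of checking a few representative template pages against the
    # PageSpeed Insights API. The page URLs are examples; heavy use needs an API key.
    import json
    from urllib.parse import urlencode
    from urllib.request import urlopen

    PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
    PAGES = [
        "https://www.example.com/",                    # home page
        "https://www.example.com/category/widgets/",   # a category template
        "https://www.example.com/product/widget-1/",   # a product template
        "https://www.example.com/blog/some-article/",  # a blog template
    ]

    for page in PAGES:
        query = urlencode({"url": page, "strategy": "mobile"})
        with urlopen(f"{PSI_ENDPOINT}?{query}", timeout=120) as resp:
            data = json.load(resp)
        score = data["lighthouseResult"]["categories"]["performance"]["score"]
        print(f"{int(score * 100):>3}/100  {page}")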

Technical SEO is just one part of the jigsaw.

Content

Authority

User Experience
