Technical SEO Tips

So in the run-up to Christmas, I did 24 tips in 24 days – while it might not have tasted as good as a chocolate advent calendar, it should allow you to start 2020 with a perfectly built website. 

In this article, I will share all the tips.

They won’t be in the order they were originally posted, but I’ve grouped them into themes, expanded on some and added some videos as well. If you do want to join the group, you can do so here: Onpage Rocks Technical SEO Facebook Group.

So it’s broken down into the sections below – crawling, sitemaps, speed, server logs, monthly checks and automation – so you can jump to the one you want to learn more about.

Crawling:

Crawling is one of the key parts of doing technical SEO audits, and it makes auditing large-scale sites far more efficient.

When you crawl a website, you should be looking for errors and working towards the point of having zero of them.

The important thing is that it’s not crawl-and-forget – you should be doing it regularly.

Some sites I audit daily, some weekly or monthly, depending on the size and importance of the site, but every site gets audited on the first of the month.

The more important the site, the more frequent the audit. We have a couple of large clients on retainer that we audit daily because they have large teams, from buyers adding products to developers making changes.

There are new errors daily, which we report to the clients so changes can be made – the sooner the errors are fixed, the better.

Our own site onpage.rocks does get audited monthly, but if I am honest I am too busy to actually make any of the changes – maybe over the Xmas period when things quieten down for a week.

There are some great tools out there you can use:

There are others, but these are our favourite three depending on the needs and requirements of the client and the size of the site.

The first thing is to actually crawl your website looking for errors. Start from the home page and let it crawl. 

Top things to look for are:

  • Broken links
  • Redirecting links
  • Redirect chains
  • Large images
  • Slow-loading pages

These are just the basics – there is so much more you can be looking for.
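If you want to see what a crawler is doing under the hood, here is a minimal Python sketch of that kind of check – it flags broken links, redirects and slow pages as it goes. It’s illustrative only (a dedicated crawler does far more), and the start URL is a placeholder for your own home page.

```python
# Minimal single-site crawler sketch: flags broken links (4xx/5xx),
# redirects and slow-loading pages. Needs `requests` and `beautifulsoup4`.
import time
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START_URL = "https://example.com/"  # placeholder – use your own home page
seen, queue = set(), [START_URL]

while queue:
    url = queue.pop()
    if url in seen:
        continue
    seen.add(url)
    start = time.time()
    resp = requests.get(url, timeout=10, allow_redirects=False)
    elapsed = time.time() - start

    if resp.status_code >= 400:
        print(f"BROKEN   {resp.status_code} {url}")
    elif 300 <= resp.status_code < 400:
        print(f"REDIRECT {resp.status_code} {url} -> {resp.headers.get('Location')}")
    if elapsed > 2:  # example threshold – tune to taste
        print(f"SLOW     {elapsed:.1f}s {url}")

    if "text/html" in resp.headers.get("Content-Type", ""):
        soup = BeautifulSoup(resp.text, "html.parser")
        for a in soup.find_all("a", href=True):
            link = urljoin(url, a["href"])
            # Stay on the same host so we only crawl our own site
            if urlparse(link).netloc == urlparse(START_URL).netloc:
                queue.append(link)
```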

Following on from the previous tip, I wanted to highlight the importance of crawling daily.

I crawl a large e-commerce site daily, because you never know when something will go wrong – they have a huge team, and about 50-ish people have access to either the code or the front end, so there are plenty of opportunities for something to break.

Today ‘x’ number of random products stopped working and brought up a 500 error.

Without the daily audit, this wouldn’t have been picked up as quickly and reported to their development team to work out why and fix.

These products might have taken several days (weeks at this time of the year) for a member of staff to click on and realise they weren’t working, leading to lost sales.

A quick daily audit reduces the risk.

I use SEMRush to do the quick crawl daily to spot errors.

Unsure how much your site changes? Sign up for a free 14-day trial and crawl daily to see what changes.

When crawling your site, crawl as Googlebot (yes, this messes up your server logs, so don’t do it every time, but do it at least once a month).

Then check outbound links and make sure there are none you weren’t expecting.

The reason I say crawl as Googlebot is that hackers can be quite smart and only show the links to Googlebot – this is the extreme end, but I have seen it.

This is on the back of a Buzzfeed article talking about SEOs hacking into sites for backlinks.

So crawl your website as if you’re Googlebot and check outbound links to make sure no new ones appear.

Once you have done this once, save that list as a whitelist to speed things up in the future, then do a VLOOKUP against it the next time. Then you are only looking at newly added links.
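If you prefer a script to a spreadsheet, the same whitelist comparison is a few lines of Python. This sketch assumes you have exported the outbound links from each crawl as plain text files, one URL per line (both filenames are made up for the example):

```python
# Compare the latest crawl's outbound links against a saved whitelist
# and report only the newly added ones.

with open("outbound_whitelist.txt") as f:
    whitelist = {line.strip() for line in f if line.strip()}

with open("outbound_latest_crawl.txt") as f:
    latest = {line.strip() for line in f if line.strip()}

new_links = sorted(latest - whitelist)
if new_links:
    print("New outbound links to review:")
    for url in new_links:
        print(" ", url)
else:
    print("No new outbound links since the last crawl.")
```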

Sitemaps:

Create and Optimize Your XML Sitemap

The sitemap is a great way for Google and other search engines to quickly find new content on your site or, more importantly, to quickly find content you have updated.

With your sitemap, you should also download it regularly and compare it to your crawl report to make sure every page on your website is included in the file.

It can be quite easy, especially with large sites, for something to break in the sitemap build so that pages get missed off.

The two key fields in the sitemap are:

  • URL (the <loc> tag)
  • Last Modified (the <lastmod> tag)

There are a few other fields you can include, such as changefreq and priority, and while including them is best practice, it doesn’t make any difference from our testing.
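Here is a rough Python sketch of that sitemap audit: it pulls the two key fields out of the XML and diffs the URLs against a one-URL-per-line export from your crawler. The sitemap URL and export filename are placeholders:

```python
# Parse <loc> and <lastmod> from a sitemap, then diff against a crawl export.
import xml.etree.ElementTree as ET

import requests

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
xml = requests.get("https://example.com/sitemap.xml", timeout=10).text
root = ET.fromstring(xml)

sitemap_urls = {}
for url in root.findall("sm:url", NS):
    loc = url.find("sm:loc", NS).text.strip()
    lastmod = url.find("sm:lastmod", NS)
    sitemap_urls[loc] = lastmod.text if lastmod is not None else None

with open("crawl_report_urls.txt") as f:
    crawled = {line.strip() for line in f if line.strip()}

print("In crawl but missing from sitemap:", sorted(crawled - sitemap_urls.keys()))
print("In sitemap but never crawled:", sorted(sitemap_urls.keys() - crawled))
```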

Do all your websites have a sitemap?

Do you regularly audit your sitemap?

Let us know in the comments below.

Speed:

Let’s talk about speed baby,

Let’s talk about you and me

Let’s talk about all the good things

And the bad things that may be

Let’s talk about speed

Speed is super important – some Google employees have said it’s the third biggest ranking factor on mobile SERPs. Whether you believe that or not, speed is still important.

It affects everything and all channels.

  • PPC – speed affects Quality Score, which means you spend more or less per click
  • Conversions – stats have shown the slower the site, the fewer the conversions

My favourite speed checking tool is web.dev but we cover most of the tools below.

https://onpage.rocks/category/speed-kills/

When measuring speed, you need both types of data. Synthetic data – and by that I mean “fake” lab tests you run from the same machine every time, so you can measure progress.

Real data – from places like GA – to see how your users are actually doing.
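One convenient way to collect both kinds at once, if you are scripting your checks, is Google’s free PageSpeed Insights v5 API – it returns a synthetic Lighthouse run alongside real-user field data from the Chrome UX Report (where Google has enough traffic data for your page). A sketch, with the URL as a placeholder:

```python
# Fetch lab (Lighthouse) and field (CrUX) data in one PSI API call.
import requests

API = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
resp = requests.get(
    API, params={"url": "https://example.com/", "strategy": "mobile"}, timeout=60
).json()

# Synthetic (lab) data – repeatable, good for measuring progress (score is 0-1)
lab_score = resp["lighthouseResult"]["categories"]["performance"]["score"]
print("Lab performance score:", lab_score)

# Real (field) data – how actual Chrome users experienced the page
field = resp.get("loadingExperience", {}).get("metrics", {})
fcp = field.get("FIRST_CONTENTFUL_PAINT_MS", {}).get("percentile")
print("Field first contentful paint (ms):", fcp)
```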

Speed also needs to be looked at in context – it’s not just against your peer group, it’s across all the sites your customers use.

So if they are big John Lewis visitors and their site loads in 1 second and yours loads in 2, customers will notice. It doesn’t matter if your site loads quicker than all your direct competitors’ – users will notice a slow site.

I will spend the next few days on speed – as I think it’s super important – but today’s main tip, after measuring, is to upgrade to HTTP/2. It’s free, and most hosting companies will do ALL the work for you – there are no downsides to upgrading.

Do it today and let me know when you have done it – even better, measure your speeds before and after and share them below.
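A quick way to verify the upgrade actually took effect is to request a page over HTTP/2 and see what the server negotiates – for example with the httpx client in Python (installed with the http2 extra, `pip install "httpx[http2]"`). The URL is a placeholder:

```python
# Check which HTTP version your server negotiates after the upgrade.
import httpx

with httpx.Client(http2=True) as client:
    resp = client.get("https://example.com/")
    print(resp.http_version)  # prints "HTTP/2" once the upgrade is live
```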

Hosting:

There is a lot you can do to improve your actual website, but hosting is one of the biggest factors.

The earlier in the page load you can make an improvement, the more it magnifies towards the end, so your total load time decreases – and it works the same way in the opposite direction.

If, for example, your TTFB (time to first byte) is slow, the delay multiplies and your total load time massively increases.

So look at things that happen first – and this is all down to your hosting and set up.

All our sites are on Cloudways – shorturl.at/gkqO1

I will take the time over the Xmas break to do a step-by-step guide showing you all the impacts and how to improve it all.

https://onpage.rocks/which-hosting-company-is-the-best-for-page-speed/

If you are using Screaming Frog, you can now connect it to Google PageSpeed Insights and see at a granular level how well each page is performing.

There is one downside at the moment in that it triggers Google Analytics hits – but you can potentially block this.

Connect the two, find your slowest pages and fix them.
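Once you have exported the crawl, sorting it by response time takes a couple of lines of pandas. Note that the filename and column names below are hypothetical – check the headers of your own Screaming Frog export and adjust to match:

```python
# Triage a crawl export by speed: list the 20 slowest pages.
import pandas as pd

df = pd.read_csv("internal_html.csv")  # hypothetical export filename
slowest = df.sort_values("Response Time", ascending=False).head(20)
print(slowest[["Address", "Response Time"]])
```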

Images:

Images are usually among the largest files on your site and can easily be compressed to make them smaller and therefore make the site load more quickly.

There are a few ways of doing it – there are a couple of plugins you can download and install (for WordPress users).

But if budget is an issue and you have the time, then use Compress Jpeg

All you do is upload your images, press compress and re-download them.

You will be surprised at what you can save.

https://compressjpeg.com/

Advanced tip, and it only works about 60% of the time – take your compressed image, re-upload it to the site and try compressing it again.

Sometimes the image quality is affected so it’s worth checking the image before uploading back to your site.

Make sure images are the right size – something else I see often is that the image’s actual dimensions are too big. Make the image the right size for its location and don’t rely on fancy software to make it fit.

Even if you don’t want to do it for Google, do it for the user – the larger the image the more of their data allowance you are using.

This is more important in less developed countries where internet speeds aren’t as good, so if you are targeting these markets make sure your site is efficient and loads quickly.
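Both of the image jobs above – compressing and resizing – can also be scripted if you have a lot of images. A sketch using the Pillow library, where the folder names, 1200px cap and quality setting are all example values (and, as above, check the output quality before re-uploading anything):

```python
# Batch resize and compress JPEGs with Pillow (pip install Pillow).
from pathlib import Path

from PIL import Image

MAX_WIDTH = 1200  # example cap – match the size the location actually needs
Path("compressed").mkdir(exist_ok=True)

for path in Path("images").glob("*.jpg"):
    img = Image.open(path)
    if img.width > MAX_WIDTH:
        ratio = MAX_WIDTH / img.width
        img = img.resize((MAX_WIDTH, round(img.height * ratio)))
    # quality=80 is a common starting point; inspect results before uploading
    img.save(Path("compressed") / path.name, "JPEG", quality=80, optimize=True)
```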

Remove anything which isn’t needed.

This applies to both plugins and tracking.

I audited a site about 6 months ago that had been around since 2002.

The legacy tracking they had on the site that wasn’t being used was unbelievable – just removing the unused tracking knocked 4 seconds off the load time.

This can also apply to unused plugins, which should be removed anyway as they are a potential security issue.

So while things are quiet, why not have a quick check to see what tags are being fired and whether they are still used, and then check all your plugins.

Use a CDN to improve the speed of your website.

There are a few to choose from, but a CDN will mean your site can load a lot quicker for both Googlebot and visitors.

It’s not as simple as switching it on and forgetting about it, but it’s not super complicated either.

Some hosts even offer their own CDN platforms, which means you don’t even need to pay.

Cloudflare – probably the world’s most popular and well-known CDN – offers a free tier.

Download web.dev as an app. I am on a Mac, so I’m not sure if it’s available for Windows, but you can now have the website as a stand-alone app.

Server Logs:

There are several tips coming over the next few days, as it’s one area that is rarely covered when dealing with technical SEO.

The biggest point is you can see what sections of your site Google is/is not visiting.

This could be because there are no links to these sections, or they don’t have enough crawl budget.

Check your logs for 404s.

You might have got an external link to a page historically, then deleted the page and thought you had redirected it.

Google could be following these links and landing on dead pages.

A quick check in the logs will quickly spot this.

This is unlikely to be picked up when crawling your site, as you should have removed all internal links to the page.
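If you want to script this one, a minimal sketch of the 404 check might look like the following – it assumes a fairly standard combined-format access log, and the filename is a placeholder for wherever your logs live:

```python
# Count 404s hit by Googlebot in an access log.
import re
from collections import Counter

# Matches the request and status fields of a combined-format log line
LINE = re.compile(r'"[A-Z]+ (?P<path>\S+) HTTP/[^"]*" (?P<status>\d{3})')

hits = Counter()
with open("access.log") as f:
    for line in f:
        # Note: anyone can fake this user agent; verifying real Googlebot
        # needs a reverse DNS lookup, which is skipped in this sketch.
        if "Googlebot" not in line:
            continue
        m = LINE.search(line)
        if m and m.group("status") == "404":
            hits[m.group("path")] += 1

for path, count in hits.most_common(20):
    print(count, path)
```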

Analyse your server logs.

You want to do a check against your sitemap to confirm:

  1. Google is crawling every page – a quick check: it should be hitting all your key pages regularly. I used to say every page at least once a month, but if you have a blog that is 10 years old then it might not crawl really old posts that often.
  2. Google isn’t crawling any pages which aren’t in your sitemap – there will be some URLs it crawls that aren’t in the sitemap (CSS files etc.), but every “page” you have on your site that is being crawled should be in the sitemap.

If you have a really old site – this can be fascinating information the first time you do it.

You might see them crawling really old pages you thought you had deleted.
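Once you have the Googlebot URLs out of the logs (the parsing is the same as in the 404 sketch earlier), the comparison itself is just set arithmetic. The two small sets below are example stand-ins for your real data:

```python
# Sitemap coverage check: diff Googlebot-crawled paths against sitemap paths.
googlebot_paths = {"/", "/about", "/old-page", "/style.css"}  # example data
sitemap_paths = {"/", "/about", "/new-page"}                  # example data

never_crawled = sitemap_paths - googlebot_paths
not_in_sitemap = {p for p in googlebot_paths - sitemap_paths
                  if not p.endswith((".css", ".js"))}  # ignore asset files

print("In sitemap but never crawled:", sorted(never_crawled))      # /new-page
print("Crawled but missing from sitemap:", sorted(not_in_sitemap))  # /old-page
```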

Monthly Checks:

The first check is your robots.txt. This is a very important file – it tells bots what they can and cannot do on your website – and it must live at example.com/robots.txt.

There are some very strict rules: it has to be a .txt file and the file needs to be called robots, all in lower case.

Always include your sitemap file, as well as the sections you don’t want crawled.

Also, make sure each bot is given specific instructions if it’s important.
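As a reference point, a minimal robots.txt following those rules might look like this – the disallowed paths and the sitemap URL are placeholders for your own:

```
User-agent: *
Disallow: /checkout/
Disallow: /internal-search/

User-agent: Googlebot
Disallow: /checkout/

Sitemap: https://example.com/sitemap.xml
```

Note that once you give a bot its own group, it ignores the wildcard group entirely, so repeat any rules that should still apply to it.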

I don’t know the exact numbers, but every so often we audit a site and they either don’t have the file at all or it’s at an incorrect URL.

If you ever do check your server logs, you will see this is probably the most visited page by Google – they usually hit it several times a day to make sure nothing has changed.

Check your robots file and site and make sure someone hasn’t accidentally added noindex tags to the site.

It’s very basic, but you hear horror stories all the time.

This is a very important check if you have been building a new site on temporary hosting and didn’t want Google crawling the new build till it was ready.

Yes, I know it’s basic, but sometimes the basics can have the biggest impacts.
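A scripted version of the noindex check for your key pages might look like this – it looks at both the meta robots tag and the X-Robots-Tag header, with placeholder URLs:

```python
# Flag any key pages carrying a noindex directive.
import requests
from bs4 import BeautifulSoup

KEY_PAGES = ["https://example.com/", "https://example.com/key-page"]  # placeholders

for url in KEY_PAGES:
    resp = requests.get(url, timeout=10)
    header = resp.headers.get("X-Robots-Tag", "")
    meta = BeautifulSoup(resp.text, "html.parser").find(
        "meta", attrs={"name": "robots"})
    content = meta.get("content", "") if meta else ""
    if "noindex" in header.lower() or "noindex" in content.lower():
        print("NOINDEX:", url)
```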

Check the indexing of your URLs

If you have a large site this isn’t possible for every URL, but you should be checking the key pages.

I often run an audit using Screaming Frog and connect to GSC.

Then I look at pages which have zero impressions to see whether they aren’t indexed or there are other issues.

After all, if a page isn’t indexed there is no way for it to get organic traffic.

You should either have
https://domain or
https://www.domain

Everything else should redirect to your main chosen option.

I was helping someone in this group recently, and they had

https://domain

https://www.domain

http://domain

http://www.domain

So all 4 versions were live and, more worryingly, there were internal links to all of them, causing Screaming Frog some issues and in effect making the site 4x bigger than it is.

This will cause duplicate content issues, crawl budget waste among other things.

So just check you have it set up right, and then maybe once a month check nothing has broken and it’s still all working – it’s a 1-minute check.
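That 1-minute check is easy to script too. This sketch requests all four variants without following redirects, so you can see at a glance that three of them 301 to your chosen canonical version (example.com is a placeholder):

```python
# Check how the four protocol/hostname variants respond.
import requests

for variant in ["http://example.com", "http://www.example.com",
                "https://example.com", "https://www.example.com"]:
    resp = requests.get(variant, timeout=10, allow_redirects=False)
    print(variant, resp.status_code, resp.headers.get("Location", ""))
```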

Most of you can ignore this one, but around 5% of sites I audit still fail it: making sure the site is mobile-friendly.

You can quickly check here: https://search.google.com/test/mobile-friendly

I recently had one client make their internal staff visit the site daily on a mobile – they spend the entire day on a desktop, but 65% of their traffic is on mobile.

Not until you use your site like your visitors do will you find the issues and errors.

Standards and minimum requirements do change over time, so it is worth checking to make sure your site is still mobile-friendly in Google’s eyes.

Automate:

Technical SEO isn’t sexy and it isn’t fun, so you want to automate as much as possible so you can spend your time doing more enjoyable tasks.

Just because it isn’t sexy doesn’t mean it’s not important. Going to sleep isn’t sexy or fun, but it’s important so we can function as humans. Technical SEO is similar.

Automate as much as possible.

With Google Data Studio – you can link in Google Analytics, BigQuery etc to create detailed custom dashboards so that you can just quickly analyse errors and differences at a glance.

If nothing has changed quickly move on to your other tasks.

Data Studio is a great tool and even better it’s free.

What are some of the reports you have built using Data Studio? Let us know in the comments below.
