Top Issues when Auditing Websites

Over the years I have become quite an expert at analysing websites to find out what is technically wrong with them.

 

Historically, while this has always got good results, SEOs have tended to just build more links or improve content to boost rankings. This is changing, though: over the last few years, more and more people have started looking at technical SEO for a ranking boost.

Whether it's a large site or a very small one, it's always similar issues which pop up as easy fixes.

 

If you want to do a more in-depth audit, check out our guide to DeepCrawl vs Screaming Frog to work out which one is best for your site.

 

On-site SEO is important – "you wouldn't build a million-dollar website on quicksand, so don't pour a million-dollar website onto poor foundations". Sorting out the technical issues lets you get the most out of all your other campaigns.

 

I recently audited a very large, well-known retailer in the UK; they have hundreds of stores across the country and a very decent backlink profile. However, their on-page SEO was shocking – in fact, if it wasn't for their high-quality backlink profile, they wouldn't have got the rankings they did.

 

Their third and fourth place rankings for some very large terms could quite easily be number one with a few small changes.

 

In this article, we are going to look at the most common mistakes.

 

HTTP/2

 

I can't think of a single reason not to migrate to HTTPS, and once you have, that means you can also upgrade to HTTP/2. While HTTP/2 has been around for quite a few years now, it's only in the last 12-18 months that it has really become popular.

 

You can check whether you are set up for HTTP/2 here: https://tools.keycdn.com/http2-test

 

If you are not, it's very easy to set up – speaking with your developers is the quickest and easiest way to get this sorted.
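If you prefer checking from your own machine rather than a web tool, here is a minimal Python sketch of the same test. It assumes the third-party httpx library is installed with HTTP/2 support (pip install "httpx[http2]"), and the URL is a placeholder for your own domain.

```python
# Minimal HTTP/2 check - assumes httpx installed via `pip install "httpx[http2]"`.
import httpx

url = "https://www.example.com/"  # placeholder: swap in your own domain

with httpx.Client(http2=True) as client:
    response = client.get(url)
    # http_version reports "HTTP/2" when the server negotiated it,
    # otherwise it falls back to "HTTP/1.1".
    print(url, "->", response.http_version)
```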

 

Sitemaps

 

There are three issues within this, not just one, but I wanted to group them together.

Pages you want indexing aren’t included in the sitemap

 

This is one of the most common things I see on the majority of sites we audit: pages not being added to the sitemap. Yes, in theory, Google will find these pages as it crawls your site, but you are then waiting for Google to crawl those sections, which might be a few weeks away.

 

One of the most crawled sections of a site is the sitemap – think about it from Google's point of view: this document should, in theory, list every page, so why waste resources crawling the site for new pages when it can just check one central point?
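To illustrate the check itself, here is a rough Python sketch that pulls the <loc> entries out of a sitemap and flags pages you want indexed that aren't listed. The sitemap URL and page list are placeholders – in practice, the page list would come from a crawl export (Screaming Frog, DeepCrawl, etc.).

```python
# Sketch: flag pages you want indexed that are missing from the XML sitemap.
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder

# Placeholder list - in practice this comes from a crawl export.
pages_you_want_indexed = {
    "https://www.example.com/",
    "https://www.example.com/services/",
    "https://www.example.com/blog/new-post/",
}

with urllib.request.urlopen(SITEMAP_URL) as response:
    tree = ET.fromstring(response.read())

ns = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
sitemap_urls = {loc.text.strip() for loc in tree.findall(".//sm:loc", ns)}

for url in sorted(pages_you_want_indexed - sitemap_urls):
    print("Not in sitemap:", url)
```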

Pages you don’t want indexing are in the sitemap

 

This baffles me just as much as the above: the number of times we see pages listed in the sitemap where the webmaster has also added noindex, nofollow tags to those same pages.

If you feel a page isn't worth being indexed in Google and other search engines and you have added the noindex, nofollow tag to it, then there is no point in having it listed in your sitemap – in fact, it's wasting crawl budget, as Google will still go and hit these pages only to find they are excluded.
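A quick way to catch this is to fetch each URL from the sitemap and look for a robots meta tag containing noindex. The sketch below is one way to do it with only the standard library; the URLs are placeholders and the regex is deliberately crude – a proper HTML parser would be more robust.

```python
# Sketch: find sitemap entries that carry a noindex robots meta tag.
import re
import urllib.request

# Placeholder list - in practice, reuse the <loc> values parsed from your sitemap.
sitemap_urls = [
    "https://www.example.com/thank-you/",
    "https://www.example.com/blog/",
]

# Crude pattern: assumes name="robots" appears before the content attribute.
noindex_pattern = re.compile(
    r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex',
    re.IGNORECASE,
)

for url in sitemap_urls:
    with urllib.request.urlopen(url) as response:
        html = response.read().decode("utf-8", errors="ignore")
    if noindex_pattern.search(html):
        print("Listed in sitemap but marked noindex:", url)
```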

Not having image and video sitemaps

 

While these aren't as important as a traditional sitemap and potentially won't help a page rank higher, anything you can do to make Googlebot's job easier is a good thing. A video sitemap has the potential to help your videos rank higher in YouTube as well as Google, and an image sitemap can help your images rank higher.

 

In some niches, images can drive a lot of traffic to a website, so anything you can do to help is a boost.

 

This tip doesn't apply to every audit we do – some websites just don't have any video, so they wouldn't need a video sitemap – but even for a small site like this one, which has around three videos, we still created a video sitemap.
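For reference, this is roughly what generating a basic image sitemap looks like. The sketch below uses only Python's standard library; the page and image URLs are made up, and the tag set follows Google's image sitemap extension namespace.

```python
# Sketch: write a basic image sitemap (the URLs below are placeholders).
import xml.etree.ElementTree as ET

SM = "http://www.sitemaps.org/schemas/sitemap/0.9"
IMG = "http://www.google.com/schemas/sitemap-image/1.1"
ET.register_namespace("", SM)
ET.register_namespace("image", IMG)

# Map each page to the images that appear on it.
pages = {
    "https://www.example.com/blue-widgets/": [
        "https://www.example.com/images/blue-widget-front.jpg",
        "https://www.example.com/images/blue-widget-side.jpg",
    ],
}

urlset = ET.Element(f"{{{SM}}}urlset")
for page_url, image_urls in pages.items():
    url_el = ET.SubElement(urlset, f"{{{SM}}}url")
    ET.SubElement(url_el, f"{{{SM}}}loc").text = page_url
    for image_url in image_urls:
        image_el = ET.SubElement(url_el, f"{{{IMG}}}image")
        ET.SubElement(image_el, f"{{{IMG}}}loc").text = image_url

ET.ElementTree(urlset).write("image-sitemap.xml", encoding="utf-8", xml_declaration=True)
```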

 

Broken pages

 

Well over 95% of the sites I have audited have this issue – they are internally linking to dead pages.

 

In fact, usually the bigger the site, the more errors. We are all guilty of this one: we add internal links because we know it's good for SEO, but then over time we delete old posts, or – on an ecommerce store – brands and categories become obsolete and are removed, while the links to them are still live on other pages.

 

These are so easy to fix. Tools like DeepCrawl or Screaming Frog highlight the links, the pages they are on and even the anchor text, making them easy to find. Once found, you either delete the links or swap them to a more relevant page.
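If you want to script the status check yourself, the sketch below shows the idea. It assumes the requests library is installed, and the (source page, link target) pairs are placeholders that would normally come from a crawler export.

```python
# Sketch: report internal link targets that return an error status.
# Assumes `pip install requests`; the link list is a placeholder.
import requests

internal_links = [
    ("https://www.example.com/blog/old-post/", "https://www.example.com/category/discontinued/"),
    ("https://www.example.com/about/", "https://www.example.com/team/old-member/"),
]  # (page the link sits on, link target)

for source_page, target in internal_links:
    try:
        status = requests.head(target, allow_redirects=True, timeout=10).status_code
    except requests.RequestException:
        status = None  # connection failed, DNS error, timeout, etc.
    if status is None or status >= 400:
        print(f"{source_page} links to {target} -> {status or 'request failed'}")
```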

 

301/302 redirects

 

Similar to broken pages, this is one that really, really bugs me – it just screams lazy. There is no reason at all to have internal links pointing at 301/302 redirects on your own website. You control it, and tools like Screaming Frog or DeepCrawl will locate these issues – telling you where the links are, what the anchor text is and where they are pointing to.

 

It’s just then up to you to find and fix.
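The underlying check is the same as for broken pages, just looking for 3xx responses instead of 4xx. Here is a small sketch, again assuming the requests library and using placeholder URLs that would come from a crawl export.

```python
# Sketch: flag internal link targets that answer with a 301/302 instead of a 200.
# Assumes `pip install requests`; the URLs are placeholders.
import requests

link_targets = [
    "https://www.example.com/old-category/",
    "https://www.example.com/summer-sale/",
]

for target in link_targets:
    response = requests.head(target, allow_redirects=False, timeout=10)
    if response.status_code in (301, 302):
        destination = response.headers.get("Location")
        # Update the internal link to point straight at the destination.
        print(f"{target} redirects ({response.status_code}) to {destination}")
```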

 

Side note – there are a few times I have come across cases where the content team or SEO person doesn't have a decent CMS, so these fixes have to go through the development team, at which point they hit their queue and get pushed down for priority reasons.

 

If this is your case, make sure you're regularly crawling the site and keep adding any new redirects you find to the original ticket. Then keep updating and reminding the developers about the redirects; explain the importance, and that it's only a small task and, once done, you will stop bugging them about it. If that fails – they usually like chocolate or beer, so take them a present. While I would never promote bribery, I wouldn't call this bribery – you are just breaking down the barriers and building relationships.

 

Duplicate titles/meta descriptions

 

Every page on your website should have a purpose and be unique. Conflicting pages have been shown to hurt your rankings – in fact, Ahrefs released a very good article on this.

 

While most people are now aware of the benefits of sorting out duplicate pages, quite a few still seem to have duplicate titles and descriptions. Google has confirmed that the content of the meta description isn't a ranking factor, and I believe them – it's not a direct ranking factor.

 

That said, I do believe it's an indirect ranking factor. Click-through rate is a ranking factor (insert link to article), so this is your one opportunity to sell the page to the user and encourage them to click. Don't lie and set unrealistic expectations – pogo-sticking is also a ranking factor, so you don't want people to bounce – but there are so many tactics you can use to encourage people to click through to your site ahead of the competition. If you're an unknown brand, this is potentially the first impression a user will have of your site, so make it good enough that they want to click through.

 

While big brands like Apple, Pepsi, McDonald's etc. don't really need to worry about this issue – they are household names and people know before clicking what they are likely to get – it's still good practice to have decent, unique meta titles and descriptions. After all, they still want the traffic.
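Crawlers report duplicate titles and descriptions out of the box, but if you want to see how simple the check is, here is a rough Python sketch that groups a placeholder list of pages by their <title> and meta description. The regex extraction is deliberately crude and only for illustration.

```python
# Sketch: group pages by <title> and meta description to spot duplicates.
import re
import urllib.request
from collections import defaultdict

urls = [
    "https://www.example.com/red-widgets/",
    "https://www.example.com/blue-widgets/",
]  # placeholder list of pages to compare

title_re = re.compile(r"<title[^>]*>(.*?)</title>", re.IGNORECASE | re.DOTALL)
desc_re = re.compile(
    r'<meta[^>]+name=["\']description["\'][^>]+content=["\']([^"\']*)["\']',
    re.IGNORECASE,
)

titles, descriptions = defaultdict(list), defaultdict(list)
for url in urls:
    with urllib.request.urlopen(url) as response:
        html = response.read().decode("utf-8", errors="ignore")
    title_match = title_re.search(html)
    desc_match = desc_re.search(html)
    titles[title_match.group(1).strip() if title_match else ""].append(url)
    descriptions[desc_match.group(1).strip() if desc_match else ""].append(url)

for label, groups in (("title", titles), ("meta description", descriptions)):
    for text, pages in groups.items():
        if len(pages) > 1:
            print(f"Duplicate {label} on {len(pages)} pages: {text!r} -> {pages}")
```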

 

Www to non-www redirect

 

This is quite a common problem. In fact, when I was doing some analysis on wedding bloggers for a client, 56% of the sites had this issue – and these are some of the biggest wedding bloggers, not small sites.

 

The issue is simple: they have both the www and non-www versions of the site live. As well as causing duplication problems and diluting link power, this also wastes crawl budget.

 

The issue was exacerbated further when looking at HTTPS. Not all sites had an SSL certificate, but the ones which did had usually messed up the SSL installation, meaning they had, in theory, four versions of their website live.

 

There are a few ways to check this:

 

  • Manually: type in the four versions of the domain and make sure they 301 to the preferred version (see the sketch after this list)
  • When doing a crawl with programs like DeepCrawl, they will do the check for you
  • Using a third-party option like Little Warden, which does daily checks to make sure the redirects are still in place.
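Here is a small Python sketch of the manual check, assuming the requests library; example.com is a placeholder for your own domain, and https://www. is assumed to be the preferred version. As per the tip further down, it's worth adding a handful of deeper page paths to the list as well, not just the home page.

```python
# Sketch: check that all four domain variants 301 to the preferred version.
# Assumes `pip install requests`; example.com is a placeholder.
import requests

PREFERRED = "https://www.example.com/"
variants = [
    "http://example.com/",
    "http://www.example.com/",
    "https://example.com/",
    "https://www.example.com/",
]

for url in variants:
    response = requests.get(url, allow_redirects=False, timeout=10)
    if url == PREFERRED:
        ok = response.status_code == 200
    else:
        location = response.headers.get("Location", "")
        ok = response.status_code == 301 and location.startswith(PREFERRED)
    print(f"{url} -> {response.status_code} {'OK' if ok else 'CHECK THIS'}")
```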

 

My preferred option is Little Warden – it means I don't have to worry about things breaking; it does the check daily and lets you know if there are any issues. For the price (£14.99 a month) it's well worth it.

 

Useful tip: don't just check the home page – check half a dozen random pages across the site. It's here that a developer forgets something or something breaks.

 

Reducing image size

 

Ten years ago this was never really an issue. Most people were migrating from dial-up (I don't think I will ever forget that sound) to broadband, so web developers got a bit lazy – speed wasn't an issue. Heck, people were happy anyway, as websites were loading faster than ever before.

 

However, with the rise of mobile, Google now says that in a number of countries it sees more mobile searches than desktop searches, and with mobile, speed has become an issue again for two reasons.

 

First, page load time – no one wants to stand around waiting for a page to load over a 3G connection. But more importantly, data usage – people don't want to visit a website and use up more of their internet allowance than they need to.

 

Compressing images and making them smaller in file size is a great way to boost page speed. In the recent article we did on Screaming Frog, we showed you how to find the largest images on your website, and once you have the images there are free, easy-to-use websites which can do most of the work for you. I use compresspng.com and compressjpeg.com – even for the less technically savvy, these are very easy to use.

 

Tip: after you have compressed your image, take the same compressed image and put it back through the same website. Sometimes there is no difference, other times the newly compressed file isn't usable, but sometimes you can get an even better improvement in size with no loss in quality.
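If you would rather batch this locally than upload images one by one, here is a sketch using the Pillow library (pip install Pillow). The folder names and the quality setting of 80 are assumptions you would tune for your own site – always check the output visually before uploading.

```python
# Sketch: batch-recompress JPEGs locally with Pillow.
from pathlib import Path

from PIL import Image

source_dir = Path("images")              # placeholder folder of originals
output_dir = Path("images_compressed")   # compressed copies land here
output_dir.mkdir(exist_ok=True)

for path in source_dir.glob("*.jpg"):
    with Image.open(path) as img:
        img.save(output_dir / path.name, "JPEG", quality=80, optimize=True)
    before_kb = path.stat().st_size // 1024
    after_kb = (output_dir / path.name).stat().st_size // 1024
    print(f"{path.name}: {before_kb} KB -> {after_kb} KB")
```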

 

Additional pages

 

These are most common with images – by default, WordPress creates an additional attachment page for every image. While this isn't purely a WordPress issue, it's where I come across it most often.

 

There are a few occasions where you do want additional pages for your images, but most of the time people aren't even aware these pages exist.

 

As well as creating additional pages which Google then has to crawl, you end up with links pointing at these additional pages instead of at the blog articles where the images actually sit.

 

Fixing them is very simple – in fact, there is a setting in Yoast which means they are automatically redirected back to the source.
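To confirm the redirects are actually in place, you can spot-check a few attachment URLs. The sketch below assumes the requests library and uses hypothetical attachment URLs as examples – anything other than a redirect suggests the attachment page is still being served.

```python
# Sketch: confirm WordPress attachment URLs redirect rather than serve a thin page.
# Assumes `pip install requests`; the attachment URLs below are hypothetical.
import requests

attachment_urls = [
    "https://www.example.com/blog/my-post/screenshot-1/",
    "https://www.example.com/blog/my-post/hero-image/",
]

for url in attachment_urls:
    response = requests.get(url, allow_redirects=False, timeout=10)
    if response.status_code in (301, 302):
        print(f"{url} redirects to {response.headers.get('Location')} - good")
    else:
        print(f"{url} returned {response.status_code} - attachment page may still be live")
```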

 

Overall

 

These are just the most common issues I have found when crawling a wide range of websites. There are plenty more, and no doubt your site will have its own, but the most important thing to do is crawl your website, find the issues and fix them.

 

Some issues are relatively easy to fix, others require other departments and areas of expertise – but knowing what’s wrong with your site is the first step.

 

Crawling and server log analysis, while different, overlap by around 80%, so they will uncover many of the same issues. However, combining both allows you to get a much greater understanding of the issues on your website.

 

The two tools that I would recommend are DeepCrawl and Screaming Frog.

 
