Basic elements you should be looking out for in server logs

Some of the topics I cover on here are about advanced server log analysis, but I often get asked about the basic / initial things to look for in server logs.

The Transcript

Hi, I am the server log kid. Today’s question is: what are some of the basic things to look for in a server log audit?

Great question. So what are some of the basic things to look out for? In this short video, I’m going to try and cover some of the basics.

So yeah, basically this video came about because we posted a video yesterday on the difference between server logs and platforms like Google Analytics, and one of the first questions I got back was: what type of things should you look for in an audit?

And it was a great question to be fair, and something I assumed most people know, but you can never make assumptions I guess. So in this video I’m going to explain some of the basic things you should be looking for when doing server log audits. The first thing is that you really only care about bots.

So you can filter out all of the human traffic, because when you’re doing technical audits you really just want to look at the bots, and the main one is Googlebot. You want to be looking at mobile versus desktop: what percentage of the crawl is with the mobile bot and what percentage with the desktop bot? Are they crawling the entire site, or only part of the site, with both bots?
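As a rough illustration of that first filtering step, here’s a minimal Python sketch. It assumes the common combined log format, where the user agent is the last quoted field; the sample lines are made up, and a real audit should also verify Googlebot via reverse DNS, since the user-agent string can be spoofed.

```python
from collections import Counter

# Two made-up access-log lines in combined format (adjust the parsing
# to match your own server's log layout).
sample_lines = [
    '66.249.66.1 - - [10/May/2021:10:00:01 +0000] "GET /category HTTP/1.1" 200 5120 "-" '
    '"Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 '
    '(KHTML, like Gecko) Chrome/41.0.2272.96 Mobile Safari/537.36 '
    '(compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.2 - - [10/May/2021:10:00:05 +0000] "GET /blog HTTP/1.1" 200 4096 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
]

def googlebot_split(lines):
    """Count Googlebot hits, split into mobile vs desktop, by user agent."""
    counts = Counter()
    for line in lines:
        ua = line.rsplit('"', 2)[-2]  # last quoted field is the user agent
        if "Googlebot" in ua:
            counts["mobile" if "Mobile" in ua else "desktop"] += 1
    return counts

print(googlebot_split(sample_lines))  # e.g. Counter({'mobile': 1, 'desktop': 1})
```

From there you can run the same split per URL or per site section to see whether both bots reach the whole site.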

How frequently are they crawling the site, how often are they crawling it, and when are they crawling it? So if Googlebot only crawls your category page every two weeks, and you put a new link in there the day after they’ve crawled it, you could be waiting 13 days before it gets recrawled, so knowing when and how frequently they crawl is quite important.
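To make the frequency point concrete, here’s a small sketch that measures the average gap between successive Googlebot visits to each URL. The `(url, timestamp)` pairs are hypothetical values you would extract from your own Googlebot log lines.

```python
from datetime import datetime
from collections import defaultdict

# Hypothetical (url, timestamp) pairs pulled from Googlebot log lines.
hits = [
    ("/category", "10/Apr/2021:09:00:00"),
    ("/category", "24/Apr/2021:09:30:00"),
    ("/category", "08/May/2021:10:00:00"),
]

def crawl_gaps_days(hits):
    """Average gap in days between successive crawls of each URL."""
    by_url = defaultdict(list)
    for url, ts in hits:
        by_url[url].append(datetime.strptime(ts, "%d/%b/%Y:%H:%M:%S"))
    gaps = {}
    for url, times in by_url.items():
        times.sort()
        deltas = [(b - a).days for a, b in zip(times, times[1:])]
        gaps[url] = sum(deltas) / len(deltas) if deltas else None
    return gaps

print(crawl_gaps_days(hits))  # {'/category': 14.0}
```

A 14-day average gap on a category page is exactly the situation described above: a new link added the day after a crawl can sit unseen for nearly two weeks.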

The number of pages they’re crawling. Like I said earlier, are they crawling all the pages, or just certain sections?

Is it because it can’t get to a certain section?

Or is something blocking it? Are there no internal links to it?

Googlebot doesn’t guess URLs (it will try some of the more general ones like /blog etc., but 999 times out of 1,000 it does not guess URLs).

You should also be looking for errors: things like 404 pages, 5xx errors and 3xx errors. One thing you are looking out for in particular is whether they are getting a lot of 5xx errors, because that can mean they’re hitting the server too fast. And if they are hitting your server too fast, Googlebot is going to slow down how often it crawls you. You really want to reduce the 5xx errors down to zero, because then it will allow them to come and crawl the site more frequently, more often.

As soon as you start seeing 5xx errors in there, you need to be asking: is the site too slow to load? Do you need to work on speed optimisation? Are you at maximum capacity? Do you need to increase your server specs? But yeah, you really want to get rid of all the 5xx errors. Other things you want to look at are page size, and things like that.
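A quick way to get that error overview is to bucket Googlebot’s responses by status class. This is a minimal sketch using hypothetical `(status, user_agent)` pairs pulled from a log:

```python
from collections import Counter

# Hypothetical (status_code, user_agent) pairs extracted from an access log.
hits = [
    (200, "Googlebot"), (404, "Googlebot"), (503, "Googlebot"),
    (301, "Googlebot"), (200, "bingbot"), (500, "Googlebot"),
]

def googlebot_status_classes(hits):
    """Bucket Googlebot responses into 2xx/3xx/4xx/5xx classes."""
    counts = Counter()
    for status, ua in hits:
        if "Googlebot" in ua:
            counts[f"{status // 100}xx"] += 1
    return counts

codes = googlebot_status_classes(hits)
print(codes)  # e.g. Counter({'5xx': 2, '2xx': 1, '4xx': 1, '3xx': 1})
```

Any non-zero 5xx count in this summary is the trigger to start the speed and capacity questions above.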

It’s something I should have made clear at the beginning.

So there is a lot of crossover between doing server log audits and crawling your site.

Probably 70% is crossover; you’ll get the same information. So if you’re linking from one page to another and it returns a 404 error, you’re going to see that both in the crawl and in the log audit. 3xx errors are the same thing. The 30% difference is what makes the difference between a good audit and a great audit. You will be able to see what Googlebot is doing on your site: where they’re crawling, where they are not crawling, why they are having issues, and be able to fix it. And you’ll be able to get stuff from crawling that you’ll never get from your logs, because log-based information is quite basic.

The other thing you should be looking out for is: OK, we’ve covered Googlebot, but what other bots are crawling your site, what percentage of the crawl is each bot, and which pages are they hitting? So yeah, the majority of the time, probably 70% of your time, maybe even up to 80%, is spent looking at what Googlebot is doing on the site. The other 20% is looking at what the other bots are doing, which SEO tools are crawling your site, etc., which can be quite informative.

I hope you found this video useful. Any questions, please do ask, and please subscribe to our YouTube channel and hit the bell icon so you get the latest videos. Do follow us on Facebook and please do join our technical SEO group, Onpage Rocks – Technical SEO Made Easy. Thank you, bye.

