
Enterprise Level Technical Auditing

Technical Auditing on a whole different level.

Crawling your site and finding technical issues is important – in fact, it's a must in 2020 – but there is a huge difference between crawling your 500-page affiliate site and crawling an enterprise-level site.
If you are looking to crawl a smallish site, check out our Screaming Frog vs SEMRush article; those two pieces of software will be a much better option for you.
If your site is enterprise-level – 100,000+ pages – then you're going to want to use enterprise-level software.

I tested three tools:

  • DeepCrawl
  • Botify
  • OnCrawl

These are all web-based products, and if you're crawling sites of this size, they are probably the level you need. I have Screaming Frog set up on a Google server – which is kind of the same, but it's not the easiest to set up, plus you still need to do the auditing manually.

The Purpose:

The purpose of these tools is pretty simple: crawl your website, alert you to any issues, make it easier for you to let the relevant team or individuals know, and then check on the next crawl that those issues have been fixed.
Ideally at a reasonable price – whether you're in-house or run an agency, you have budgets to keep to.
DeepCrawl platform
What I liked
  • Third-party integrations
  • Stealth Crawler
  • Phone support
Pros
  • Priority List
  • Analytics Dashboard
  • Internal Dashboards for in-house teams

Preferred Tool

OnCrawl platform
What I liked
  • Price
  • Dashboard
  • Connecting Datapoints
  • Project sharing

Full Review

The Features:

There are several things I looked at when deciding which tool to recommend and use.

Integrations:

While the main aim of these platforms is to crawl your websites and look for errors, connecting to third parties lets them pull in even more data for analysis – for example, connecting to analytics platforms to see which pages are getting sessions, or to Google Search Console for impression data.
The only integration they all seemed to be missing was a connection to the Google PageSpeed API, which I think would make any of these platforms stand out.
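To show what that missing integration could look like, here is a minimal sketch of calling the PageSpeed Insights v5 API for each crawled page. The endpoint and response shape follow Google's public API; the helper names and API key are my own placeholders, not anything from these platforms.

```python
from urllib.parse import urlencode

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def build_psi_request(page_url: str, api_key: str, strategy: str = "mobile") -> str:
    """Build a PageSpeed Insights v5 request URL for a single crawled page."""
    params = urlencode({"url": page_url, "key": api_key, "strategy": strategy})
    return f"{PSI_ENDPOINT}?{params}"

def performance_score(psi_response: dict) -> int:
    """Extract the Lighthouse performance score (0-100) from a PSI JSON response."""
    raw = psi_response["lighthouseResult"]["categories"]["performance"]["score"]
    return round(raw * 100)  # the API reports the score as a 0.0-1.0 fraction
```

A crawler could fire one of these requests per URL in the crawl queue and store the score alongside the other page-level metrics it already reports.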

Winner: All – they all basically offer the same integrations.

Simultaneous Crawls:

This isn’t that big of an issue if you’re working in-house and managing one website, but if you’re an agency doing weekly crawls of your clients, then being able to crawl multiple sites at once is important. You don’t want to wait until one audit is complete before starting the next; it’s inefficient and time-wasting.

Winner: OnCrawl

Reporting:

This is key: it’s one thing to find the issues, it’s another for them to be reported in a way that lets you action them.
All of their reporting dashboards are different, yet much the same – they show the same information, just laid out slightly differently.

Winner: All – they are all great

Project Management:

This might seem a small thing, but for efficiency’s sake it’s quite important. It’s unlikely that the same person will be responsible for fixing all the issues – in fact, if you’re an enterprise client, I’d say it’s very unlikely.
Botify is great for in-house teams, where you can get everyone to log in and see all the data; it’s not as good for clients. It can push issues into certain platforms, like Jira.
DeepCrawl has some nice, simple integrations, similar to Botify, plus additional ones like Slack and Trello.

Winner: DeepCrawl

 

The Price:

When it came to pricing, I looked at the top-tier end: what it would cost to crawl around 2 million URLs a month – after all, we are talking about enterprise level here.
This meant speaking with their customer service teams and getting bespoke quotes.
I don’t want to reveal pricing, as the quotes are custom – depending on the number of projects and URLs – so I don’t think it would be fair.
Botify and DeepCrawl were roughly the same price.
OnCrawl was cheaper, and over the course of a year that would make a nice saving. Price isn’t everything, but it matters to many in-house and agency staff, as most companies don’t have unlimited budgets.

Winner: OnCrawl

The Crawl Test

Ok, so that’s all nice – but here is the real test. We are going to audit one website (enterprise-level) and see which tool finds the most errors. In theory, they should all spot and report back the same errors, but theory and reality can sometimes be different things.
This article will be updated shortly with the audit results – I just need to speak with the site and make sure they make no changes over the crawling period (otherwise that could skew the results).

Winner: TBC 

The downsides:

This applies across all three platforms: you’re limited in the number of “projects” you can crawl, i.e. the number of sites you can audit. While most of these tools are generous with the number of URLs you can crawl, you’re usually capped by the number of projects you can audit in a given month.
This is why they are enterprise-level solutions – if you have 100 clients with an average site size of 10,000 pages, these tools would get extremely expensive very quickly, and a combination of Screaming Frog and Google Data Studio would be a much better solution.
If you work in-house, you will likely only have the one website, so this won’t be an issue; if you run an agency, it might be worth reserving the projects for your bigger clients.
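To make the project-versus-URL trade-off concrete, here is a toy check using made-up plan limits (20 projects and 2 million URLs a month – purely illustrative numbers, not any vendor's actual quota):

```python
def fits_plan(num_sites: int, avg_pages: int,
              project_limit: int, url_limit: int) -> bool:
    """Return True if a portfolio fits within a plan's monthly limits.

    project_limit: max distinct sites (projects) crawlable per month.
    url_limit: max total URLs crawlable per month.
    """
    return num_sites <= project_limit and num_sites * avg_pages <= url_limit

# 100 clients x 10,000 pages = 1,000,000 URLs: only half the URL budget,
# but 100 projects blows straight past the 20-project cap.
agency = fits_plan(100, 10_000, project_limit=20, url_limit=2_000_000)

# One enterprise site of 1.5 million pages fits comfortably.
in_house = fits_plan(1, 1_500_000, project_limit=20, url_limit=2_000_000)
```

With those hypothetical limits, the agency portfolio fails on the project cap despite using only half the URL budget – exactly the scenario where Screaming Frog and Data Studio come out ahead.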

Overall: 

Until I have the results of the crawl test, it’s quite hard to pick an overall winner – they all do pretty much the same thing.
I have been doing SEO for over 10 years now, and I have liked all three of these tools. Most of the time I just use Screaming Frog, but when a large enterprise client lands, we use one of these platforms.
Check out the platforms for yourself.
