There are several things I looked at when deciding which tool to recommend and use.
While the main aim of these platforms is to crawl your websites and look for errors, connecting to third parties lets them pull in even more data for analysis – for example, connecting to analytics platforms to see which pages are getting sessions, or to Google Search Console for impression data.
The only integration they all seemed to be missing was a connection to the Google PageSpeed API – which I think would make these platforms stand out.
Winner: All – they all basically offer the same integrations.
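For context, the PageSpeed integration these tools lack wouldn't be hard to build on top of – Google exposes page speed data through the public PageSpeed Insights v5 API. Below is a minimal sketch of what pulling a Lighthouse performance score for a crawled URL could look like; the helper names (`build_psi_url`, `performance_score`, `fetch_score`) are my own, not part of any of these platforms:

```python
import json
import urllib.parse
import urllib.request

# Public PageSpeed Insights v5 endpoint
PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def build_psi_url(page_url, strategy="mobile", api_key=None):
    """Build a PageSpeed Insights request URL for a given page."""
    params = {"url": page_url, "strategy": strategy}
    if api_key:
        params["key"] = api_key  # optional, raises the rate limit
    return PSI_ENDPOINT + "?" + urllib.parse.urlencode(params)

def performance_score(psi_response):
    """Extract the 0-100 Lighthouse performance score from a PSI response."""
    raw = psi_response["lighthouseResult"]["categories"]["performance"]["score"]
    return round(raw * 100)  # API returns the score as a 0-1 float

def fetch_score(page_url, api_key=None):
    """Call the PSI API and return the performance score (network required)."""
    with urllib.request.urlopen(build_psi_url(page_url, api_key=api_key)) as resp:
        return performance_score(json.load(resp))
```

Run across every URL found in a crawl, something like this would let a platform flag slow templates alongside its usual error reports.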
This isn’t that big of an issue if you’re working in house and managing one website, but if you’re an agency running weekly crawls for your clients, then being able to crawl multiple sites at once will be important. You don’t want to wait until one audit is complete before starting the next; it’s inefficient and time-wasting.
This is key: it’s one thing to find the issues, but another for them to be reported to you so that they can be actioned.
All of their reporting dashboards are different, but also the same. They show the same information, just laid out slightly differently.
Winner: All – they are all great.
This might seem a small issue, but for efficiency’s sake, it’s quite important. It’s unlikely to be the same person who is responsible for fixing all the issues; in fact, I would go as far as saying that if you’re an enterprise client, it’s very unlikely.
Botify is great if it’s in-house teams and you can get them to log in and see all the data. It’s not as good when it’s for clients, although it can automate reporting into certain platforms, like Jira.
DeepCrawl has some nice, simple integrations – similar to Botify’s, but with additional ones like Slack and Trello.
When it came to pricing, I looked at the top-tier end: what it would cost to crawl around 2 million URLs a month – after all, we are talking about enterprise level here.
This required me to speak with their customer service teams and get bespoke quotes.
I don’t want to reveal pricing, as the quotes are custom and depend on the number of projects and URLs, so I don’t think it would be fair.
Botify and DeepCrawl were roughly the same price.
OnCrawl was cheaper and, over the course of a year, would make for a nice saving. Price isn’t everything, but it can be important to in-house and agency staff, as most companies don’t have unlimited budgets.
The Crawl Test
Ok, so that’s all nice – but here is the real test. We are going to audit one website (enterprise-level) and see which tool finds the most errors. In theory, they should all spot and report back the same errors, but theory and reality can sometimes be different things.
This article will be updated shortly with the audit results – I just need to speak with the site and make sure they make no changes over the crawling period (otherwise that could skew the results).
This applies across all three platforms: you’re limited in the number of “projects” you can crawl, i.e. the number of sites you can audit. While most of these tools are generous with the number of URLs you can audit, you’re usually limited by the number of projects you can crawl in that month.
This is why they are enterprise-level solutions – if you have 100 clients with an average site size of 10,000 pages, the tools would quickly become extremely expensive, and a combination of Screaming Frog and Google Data Studio would be a much better fit.
If you work in house, you will likely only have the one website, so that won’t be an issue; if you do run an agency, it might be worth keeping the projects for your bigger clients.
Until I have the audit results, it’s quite hard to pick an overall winner; they pretty much all do the same thing.
I have been doing SEO for over 10 years now, and in the three years I’ve used these tools, I have liked them all. Most of the time I just use Screaming Frog, but when a large enterprise client lands, we use one of these platforms.
Check out the platforms for yourself.