Crawling your website is one of the most important parts of a technical audit, but finding the right tool for the job can be difficult. Don't worry – we have analysed the two most popular tools on the market to see which one is best.
This is a big debate I see often, both online in groups and at conferences: which tool is best for crawling a website to find technical errors?
The answer is simple – it depends.
It really does, and that seems to be the answer most SEOs give to any question (personally I think SEOs would make great politicians, as they never give a straight answer), but I am going to show you the strengths and weaknesses of each to give you a good understanding of which works best for you and your needs.
A side note – I have been using Screaming Frog for about five years and DeepCrawl for about three, so I have seen both platforms evolve over time.
The size of your site:
Knowing how big you think your site is and how big it is in reality are two different things, but at least roughly knowing the size should help. We recently took on a new client who thought they had fewer than 1,000 pages. As a default when using DeepCrawl (DC) we always set a 10,000 crawl limit, and we were shocked to see the site had over 10,000 pages (case study coming soon, but it's worth noting). We set the limit at 10,000 because we once didn't, and wasted an entire crawl budget on a new client's site, not expecting there to be as many issues as there were.
Screaming Frog – it all depends on how powerful your machine is as to whether you can crawl your entire site. SF is a desktop application and uses your machine's resources. Historically I have had SF installed on dedicated servers to crawl large sites.
For sites of 1,000 pages or fewer, SF will be sufficient; even on my 2013 MacBook Air I can manage to crawl a 10,000-page site – but it takes a while and means I can't use my machine for anything else.
Dan from Screaming Frog did say “The SEO Spider now saves to a database. So you can crawl 10m URLs if you want to crawl large sites using Screaming Frog. It’s only just been released. It does require recommended hardware with an SSD, though (or the crawl is slower).”
I haven't personally tested this, but it seems like a good addition if you want to crawl large sites using only SF.
DeepCrawl – in theory there is no limit, as you pay for the number of URLs you want to crawl. It's a cloud-based solution, so it doesn't matter what device you use. I would really like to push its limits at some point. Over the years I have had some fairly big clients; the largest site was about 250,000 pages, and DC handled it fine.
Winner – it depends on the size of your website. I am happy to use both, but I probably prefer DC, as it means I don't lose the ability to work on my Mac. I have also started to use my iPad Pro for more and more work, meaning I can still crawl websites when out and about – and you can't run Screaming Frog on an iPad. If I know the website is small, and because of the limited number of "active projects" I am allowed a month with DC, I sometimes just use SF.
Scheduling of Crawls:
Having the crawl data is important, but being able to schedule a crawl to run at a time convenient for you is perfect. Manually setting a crawl to run is a waste of time and resources, so the ability to schedule is hugely important.
Screaming Frog – a common misunderstanding is that you can't schedule crawls with Screaming Frog, but it can be done. It's just more complicated to set up, and it helps if you have a dedicated server.
If you have a server, you can create scheduled tasks to run crawls at the times you want.
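As a rough illustration of what that scheduled task might look like on a Linux server: recent versions of Screaming Frog ship a command-line mode, and a crontab entry can launch it headlessly. The exact path and flags below are assumptions based on the SF command-line documentation – check your version before relying on them.

```shell
# Hypothetical crontab entry: run a headless Screaming Frog crawl of
# example.com every Sunday at 02:00 and save the crawl file to /crawls.
0 2 * * 0 screamingfrogseospider --crawl https://www.example.com --headless --save-crawl --output-folder /crawls
```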
DeepCrawl – when setting up the project, you can set how often you want the crawl to run and on what day. You can also change this at any point.
Most of my crawls are set to run on a Sunday, so when I get into the office on Monday they are waiting for me to analyse.
Winner – simply because of how easy it is to set up, DeepCrawl has to be the winner here.
Comparing against previous crawls:
Comparing against previous results can be important; it allows you to see any new errors which might have appeared.
DeepCrawl – this is one of DeepCrawl's key selling points, in my opinion: after the second crawl, it compares the results to the previous crawl.
For most accounts I monitor regularly, I have a crawl running once a week or month, and I can quickly see what has changed – and what the development team has broken or fixed.
Screaming Frog – most people assume you can't do this with Screaming Frog (SF), but they are wrong. It can be done; it's just quite manual and time-consuming.
You need to export all the data into Excel for each crawl and put each export into its own tab.
You then need to create a master sheet which summarises the differences between the two – lots of SUMIF- and VLOOKUP-type formulas. It's much easier for very small websites (sub-100 pages), but it can be done on large sites too.
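The spreadsheet comparison above can also be sketched in a few lines of Python with pandas, if you prefer that to Excel. This is a minimal sketch assuming you have exported a URL list with status codes from each crawl; the column names and data here are hypothetical stand-ins for the real exports.

```python
# Sketch: diff two crawl exports to surface new, removed, or changed URLs.
import pandas as pd

def compare_crawls(previous: pd.DataFrame, latest: pd.DataFrame) -> pd.DataFrame:
    """Return rows whose URL appeared, disappeared, or changed status code."""
    merged = previous.merge(latest, on="Address", how="outer",
                            suffixes=("_prev", "_latest"), indicator=True)
    changed = merged[(merged["_merge"] != "both") |
                     (merged["Status Code_prev"] != merged["Status Code_latest"])]
    return changed

# Mocked exports standing in for two crawl CSVs (hypothetical data).
prev = pd.DataFrame({"Address": ["/", "/about", "/old-page"],
                     "Status Code": [200, 200, 200]})
latest = pd.DataFrame({"Address": ["/", "/about", "/new-page"],
                       "Status Code": [200, 404, 200]})

diff = compare_crawls(prev, latest)
print(diff[["Address", "_merge"]])
```

With real exports you would replace the mocked frames with `pd.read_csv(...)` on each crawl file; the `indicator` column then tells you whether a URL is new, gone, or present in both.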
Winner: DeepCrawl. It's one of the main reasons I love DeepCrawl; yes, you can do it with SF, but it's a lot more manual.
Crawling a list of URLs:
Sometimes you don't want to crawl a website but a list of URLs. There are many reasons why: checking the current status of links, checking whether 404 errors have been resolved, and so on. It's a very useful exercise which I do at least once a day.
Screaming Frog – pretty easy to do: just change the Mode from "Spider" to "List" and paste in the list of URLs you want to crawl.
This is a great feature I use pretty frequently when doing server log analysis. You could be looking at several months' worth of data, so it's handy to check the current status of pages which may have been fixed since the errors were logged.
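The "List" mode workflow above boils down to taking a list of URLs and recording each one's current HTTP status. A minimal Python sketch of that idea is below; the fetcher is injectable so the example runs without network access, but in practice you would pass a real HTTP call (e.g. one built on `urllib.request`). All URLs and statuses here are hypothetical.

```python
# Sketch: group a list of URLs by their current HTTP status code.
from typing import Callable, Dict, List

def check_statuses(urls: List[str], fetch: Callable[[str], int]) -> Dict[int, List[str]]:
    """Call fetch(url) for each URL and bucket the URLs by status code."""
    grouped: Dict[int, List[str]] = {}
    for url in urls:
        grouped.setdefault(fetch(url), []).append(url)
    return grouped

# Stubbed statuses standing in for live responses (hypothetical data).
fake_statuses = {"https://example.com/a": 200,
                 "https://example.com/b": 404,
                 "https://example.com/c": 200}

report = check_statuses(list(fake_statuses), fake_statuses.get)
print(report)  # buckets the three URLs under 200 and 404
```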
DeepCrawl – this feature isn't available in DeepCrawl; however, even if it were, I would still use SF for it. Sometimes you have to wait for a crawl with DC, which isn't normally an issue, but when I am doing server log analysis I want the answer now.
Winner – Screaming Frog, easily. DC doesn't offer this feature, but even if it did, SF would still win on speed.
Number of different Sites:
This will depend on your business. If you only own one website this isn't an issue, but even in-house you might have subsites, and if you work for an agency or manage multiple affiliate sites, then the number of sites you can crawl is important.
Screaming Frog – unlimited. The only drawbacks are the power of your machine and that you can only run one crawl at a time. You get two keys per licence, so I tend to have one machine running SF with new crawls and another which I use for analysing the crawl data, so I can be crawling constantly if needed.
DeepCrawl – this depends on your plan: the basic plan allows 5 "active projects" each month, and the number increases with each plan. Each month you can crawl 5 different sites, but if your 5 sites are only 10,000 pages each, you will crawl 50,000 URLs in the month and won't be able to use the rest of your URL allowance. The advantage, though, is that you can crawl multiple sites at once.
Winner – Screaming Frog, because there is no theoretical limit – just your machine and the time to set the crawls running. This is especially important if, say, you have 20 small 250-page affiliate sites: paying DC fees per "active project" would be expensive, whereas with SF this wouldn't be an issue.
Finding errors:
After all, this is probably the main reason to use either tool: to find errors and fix them. Both find all the main issues, such as 404s and 301s.
Screaming Frog – great at finding errors; you just then have to find them within the report. For a geek like me this is fine – sorting, filtering and queries in Excel aren't an issue – but it is a bit labour-intensive. The one feature I do like in Screaming Frog that DC doesn't seem to have is the ability to filter by images and see the largest files – a key item in optimising page speed.
DeepCrawl – it groups the errors for you and reports them in a clean dashboard. If you have run more than one crawl, it compares the results to previous crawls so you can see what has changed, which is very useful.
DeepCrawl can also integrate with Google Analytics and Google Search Console, which adds to the data in its reports and allows for more analysis, surfacing errors which can't be found by a traditional crawl.
Winner – both are very good, and I do like SF's extra feature for finding large images to reduce, but I love the dashboard in DC, which allows me to quickly see the errors and get them fixed.
Type of site:
This won't affect many people: if your website is built in HTML or PHP, both SF and DC can crawl it perfectly well.
Screaming Frog – we had a client recently upgrade their website, and they wanted us to crawl the new site in dev mode prior to launch. The site was built in AngularJS, so we had no choice but to use SF.
DeepCrawl – if your website, like the majority, is built in HTML or PHP, then DeepCrawl is fine and can handle it.
Winner: because of the extra ability to crawl and render JS-based sites, SF has to be the winner.
Pricing:
Screaming Frog – there are two options: a free version and a paid licence of £149 a year. I personally use the paid version; it's a fairly low amount, and the free version is restricted to 500 URLs per crawl.
DeepCrawl – there are quite a few different options; prices start from £55 a month for the minimum plan.
Winner – both; in my opinion, the prices reflect what each offers.
If you have the budget, I would highly recommend getting both – while they are very similar, there are slight differences which mean both can be useful. However, if budget is tight, look at the points above to determine which is best for you and your needs.
To find out more about and download Screaming Frog, click here
To find out more about and sign up to DeepCrawl, click here