In this video I am going to cover the basics of how to connect the two data sources together, then a few of the advanced changes you can make to get the data that matters to you, and finally I will explain why this is a must and something you should be doing.
So yeah, let's get into it. But before I do, I want to make you aware that adding in additional data sources can really slow down the crawl rate of the site and in some cases cause the crawl to crash, so it's probably worth testing this on a small site first to get used to the settings.
I run a weekly audit for a site which has about 75,000 pages, and even though my MacBook Pro was brand new last August and one of the higher specs, adding in the additional data sources for Google Analytics and Google Search Console (that is another video you should watch) causes the crawl to crash, so for that site I am unable to do this additional analysis.
If you are like me, lazy and not keen on reading, I have put most of it in a video. The video was done off the cuff, so I might have missed some of the details.
Sorry I had to blur sections; I always try to record in one take and didn't realise I had sensitive client info open in tabs, along with the list of clients' GA accounts I have access to.
Connecting the two:
OK, let's get into the actual connecting of the two sources, and in fact it's pretty straightforward:
- API Access
- Google Analytics
- Add New Account
- Sign in
- Pick the Account and View of the site you are about to crawl.
- Hit Crawl
Yeah, it's that simple. You will notice a new box appears, and that is just showing that the API is being used.
Then you just have to sit back and wait.
OK, that is quite easy, but not every website is the same and, more importantly, not every audit is the same. What matters for a news publication website is far different to that of, say, an ecommerce site, and that is very much different again to an affiliate site.
The good thing is, if you take a few additional steps, you can start analysing which pages are not getting any Google impressions and which pages are getting impressions but no CTR, and because the data is all in one place you can see the meta titles and descriptions alongside them.
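To give a feel for the kind of analysis this enables, here is a minimal sketch of joining a crawl export with a search performance export to find pages with impressions but zero clicks. The URLs, column names ("Address", "Title 1", "Impressions" and so on) and figures are all made up for illustration; check the headers in your own exports.

```python
import csv
from io import StringIO

# Hypothetical crawl export: page URLs with their titles and descriptions.
crawl_csv = """Address,Title 1,Meta Description 1
https://example.com/a,Widget A,Buy widget A today
https://example.com/b,Widget B,Buy widget B today
"""

# Hypothetical search performance export for the same pages.
search_csv = """Address,Impressions,Clicks
https://example.com/a,1200,0
https://example.com/b,900,45
"""

# Index the crawl data by URL so the two sources can be joined.
crawl = {row["Address"]: row for row in csv.DictReader(StringIO(crawl_csv))}

# Collect pages that are being shown in search but never clicked,
# together with the title and description that searchers are seeing.
no_clicks = []
for row in csv.DictReader(StringIO(search_csv)):
    if int(row["Impressions"]) > 0 and int(row["Clicks"]) == 0:
        page = crawl.get(row["Address"], {})
        no_clicks.append((row["Address"],
                          page.get("Title 1"),
                          page.get("Meta Description 1")))

for url, title, desc in no_clicks:
    print(url, "|", title, "|", desc)
```

Pages surfaced this way are usually the first candidates for a title and meta description rewrite, since the ranking is there but the snippet isn't earning the click.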
But that's just the start. If you really want to get advanced you can start customising the data you're importing: if you're an ecommerce site you might want to bring in organic revenue per page, and if you're an affiliate you might want to bring back the number of goal completions per page.
Maybe your days to convert is longer than the standard 30 days that is automatically pulled in. At the moment you can't change that, but it would be a good feature; I've asked Dan, so let's see what he can do.
As mentioned above, you can customise the metrics you pull back to meet your needs, which is super useful.
There are a few other options you can choose from, but the main one I would have checked is the option to "Crawl New URLs Discovered in Google Analytics". In theory there shouldn't be any such URLs, as it would mean they are not discoverable via a crawl, but it's always good to check.
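Conceptually, that setting is an orphan-page check: URLs that show up in your analytics but that the crawler never reached by following links. A toy sketch of the idea, with made-up URLs:

```python
# URLs the crawler reached by following internal links (illustrative).
crawled = {
    "https://example.com/",
    "https://example.com/shop",
}

# URLs that recorded sessions in analytics (illustrative).
in_analytics = {
    "https://example.com/",
    "https://example.com/shop",
    "https://example.com/old-landing-page",
}

# Pages getting traffic but not linked internally -- orphan pages.
orphans = in_analytics - crawled
print(sorted(orphans))
```

Orphan pages like this often turn out to be old landing pages or campaign URLs that still rank or get traffic but have been dropped from the site's navigation.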
Why is it important?
Firstly, crawling your website with Screaming Frog is a great start, and for most website owners it's enough: it allows them to find the majority of technical issues and fix them. If that's you, that's great.
However, there are some proper technical geeks out there (I would call myself one of them) who like to create perfectly technical websites. This is super useful if a site is struggling to rank or has lost rankings and you want to find the root causes.
So the normal crawl gives you a great level of detail about every page, but by adding in Google Analytics data you can start to analyse which pages get zero organic clicks and which pages have a high bounce rate from Google.
All this data is available in the Google Analytics website; however, seeing all the data points next to each other can allow you to spot issues more easily (well, it does for me anyway).
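Once the crawl and analytics columns sit side by side, spotting a pattern can be as simple as sorting. A minimal sketch, assuming a merged export with hypothetical "GA Bounce Rate" and "Size (bytes)" columns (your own export's headers may differ):

```python
import csv
from io import StringIO

# Hypothetical merged crawl + analytics export (all figures made up).
data = """Address,GA Bounce Rate,Size (bytes)
https://example.com/gallery,92.5,4800000
https://example.com/about,35.0,310000
https://example.com/blog,61.2,950000
"""

rows = list(csv.DictReader(StringIO(data)))

# Sort by bounce rate, worst first, and eyeball the page sizes alongside.
rows.sort(key=lambda r: float(r["GA Bounce Rate"]), reverse=True)
for r in rows:
    print(r["Address"], r["GA Bounce Rate"], r["Size (bytes)"])
```

Seeing bounce rate and page size in the same sorted view is exactly how the pattern in the next example jumped out.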
A prime example of this: I was looking at some of the top pages people bounce back to Google from. Whether you believe pogo-sticking is real or not, it's not a great user experience if the user has to bounce back.
Anyway, when I sorted the data by highest bounce rate, one of the things that jumped out at me was the size of the pages. Upon further analysis it turned out that these pages were image heavy. While they did look great and all the images were optimised (they didn't get picked up in the large image analysis), the site was custom built and the company had an in-house team of developers (great team, if you're watching/reading this). They were able to work their magic and reduce the page size considerably, which also massively decreased load time.
When I ran the analysis about a month or so after the changes, the bounce rate had dropped considerably for about 90% of the pages. For the other 10%, I believe the pages were ranking for non-relevant terms, so people were clicking through, the page that loaded wasn't what they were expecting, and they bounced back.
There are many more examples I could share with you, but I would really recommend you just go ahead and connect the data and see what interesting information you can find out.
You can also add in other data sources, including Search Console, which I will show you in a future video; it's basically the same process. Other sources include SEMrush, Moz and Ahrefs, all of which require paid subscriptions to use.
Please do subscribe to our YouTube channel for more great videos like this, and leave us a comment below if you have any specific questions, either about this video or a technical SEO topic you would like us to cover.