Google Search Console: a gold mine for all SEOs
Search Console is a real Swiss army knife for SEOs. It provides performance data (which pages generated clicks or impressions, and on which keywords), indexing data (which pages are, or are not, visible in Google, and why), optimization data (which pages offer a good user experience or are optimized for mobile), and so on.
In short, it is an essential tool for anyone who works in SEO, and it is by no means reserved for specialists.
There is also a great series of videos offered by Google and we also give you some tips on Search Console in an article.
Until now, most users have exploited Search Console mainly through its interface, despite the presence of an API that allows more in-depth use with fewer limitations. It must be admitted, however, that unless you know how to code or use a third-party tool, APIs are not always the easiest to handle. That may be about to change.
Indeed, Google has just introduced, on the performance side, a direct link with BigQuery that offers many advantages. And BigQuery is not reserved for the biggest companies or digital agencies: anyone interested in their SEO performance can benefit from it.
What is BigQuery?
BigQuery is a serverless data warehouse, entirely managed by Google. This means Google manages the infrastructure and you need “only” worry about data. This is the whole point of the tool, because it allows you to manage and query enormous datasets (think of files of several million rows and columns that would make Excel unusable).
Another advantage of BigQuery is that it is, unsurprisingly, well integrated with the other tools in the Google suite. It is therefore quite easy, for example, to link a data table to Looker Studio (Data Studio, for those who have not yet digested the rebranding). A fairly large table connected this way will be much faster to use than through other connectors (for example, the “native” Search Console connector for Looker Studio). Finally, facilities for ingesting data, in particular from Search Console, have been in place for some time now.
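To give an idea of what querying the export looks like, here is a minimal sketch in Python. The table and column names (`searchconsole.searchdata_url_impression`, `data_date`, `url`, `clicks`, `impressions`) follow Google's documented bulk-export schema, but the project id is a placeholder you would replace with your own:

```python
# Sketch: querying the Search Console bulk export in BigQuery.
# Table/column names follow Google's documented export schema
# (searchconsole.searchdata_url_impression); "my-gcp-project" is a placeholder.

def top_pages_query(project: str, days: int = 28, limit: int = 100) -> str:
    """Build a query for the top pages by clicks over the last `days` days."""
    return f"""
        SELECT
          url,
          SUM(clicks) AS clicks,
          SUM(impressions) AS impressions
        FROM `{project}.searchconsole.searchdata_url_impression`
        WHERE data_date >= DATE_SUB(CURRENT_DATE(), INTERVAL {days} DAY)
        GROUP BY url
        ORDER BY clicks DESC
        LIMIT {limit}
    """

sql = top_pages_query("my-gcp-project")

# To actually run it (requires google-cloud-bigquery and credentials):
# from google.cloud import bigquery
# client = bigquery.Client(project="my-gcp-project")
# for row in client.query(sql):
#     print(row.url, row.clicks, row.impressions)
```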
The advantages of integrating Search Console data into BigQuery
If you are not already in the habit of using Search Console data outside its interface, you might wonder whether this option is really worth enabling. There are several reasons why it is:
The ability to keep data for more than 16 months
In its online version, Search Console keeps “only” 16 months’ worth of data. It is therefore only possible to compare the last 3 months with the same 3 months of the previous year. That is better than nothing, but still very limiting. For most sectors, especially around particular periods, being able to compare a given year against the previous one (or even several years together) yields genuinely useful insights. With BigQuery, the problem disappears: you can keep your data indefinitely!
The only drawback is that data retention is not retroactive: the export only collects data from the day you enable it. Fortunately, at Universem, we can backfill this data (up to 16 months, obviously) thanks to the Search Console API.
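Such a backfill rests on the Search Console API's `searchanalytics.query` endpoint, which returns results in pages of up to 25,000 rows. The sketch below builds the request body; the authenticated `service` object and the site URL are assumptions, not shown here:

```python
# Sketch: backfilling up to 16 months of history with the Search Console API
# (the bulk export is not retroactive). The request body follows the
# searchanalytics.query shape; credentials/service setup is omitted.

def search_analytics_body(start_date: str, end_date: str, start_row: int = 0) -> dict:
    """Request body for one page of Search Analytics results (max 25,000 rows)."""
    return {
        "startDate": start_date,
        "endDate": end_date,
        "dimensions": ["date", "page", "query", "country", "device"],
        "rowLimit": 25000,
        "startRow": start_row,
    }

# With an authenticated service, e.g.
#   service = googleapiclient.discovery.build("searchconsole", "v1", credentials=creds)
# you would page through results like this:
# rows, start_row = [], 0
# while True:
#     resp = service.searchanalytics().query(
#         siteUrl="https://www.example.com/",
#         body=search_analytics_body("2023-01-01", "2023-12-31", start_row),
#     ).execute()
#     batch = resp.get("rows", [])
#     if not batch:
#         break
#     rows.extend(batch)
#     start_row += len(batch)
```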
For larger sites, you must also pay attention to the size of the datasets, which could inflate the bill. BigQuery nevertheless remains good value for money. We already use it frequently at Universem (for Google Ads, Analytics, Search Console, log analysis, internal links, etc.), and the monthly bill remains quite reasonable even though some of our customers’ sites exceed millions of pages. However, prevention is better than cure: before linking a dataset of several hundred gigabytes to a Looker Studio report used daily, it is better to set a budget limit and configure alerts. If you have a site of a few thousand pages, there is no need to worry too much (alerts are still good practice anyway).
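A simple way to keep costs under control is to estimate the cost of a scan before running it and to hard-cap what a query may bill. The price per tebibyte below is an assumption (check current BigQuery on-demand pricing), and the `maximum_bytes_billed` option is part of the google-cloud-bigquery client:

```python
# Sketch: estimating query cost before running it, and capping bytes billed.
# The $/TiB rate is an assumption -- check current BigQuery on-demand pricing.

TIB = 2 ** 40  # bytes in one tebibyte

def estimated_cost_usd(bytes_scanned: int, price_per_tib: float = 6.25) -> float:
    """Rough on-demand cost for a query scanning `bytes_scanned` bytes."""
    return round(bytes_scanned / TIB * price_per_tib, 4)

# A 500 GiB scan at the assumed rate:
cost = estimated_cost_usd(500 * 2 ** 30)

# With google-cloud-bigquery you can also hard-cap a query's spend:
# from google.cloud import bigquery
# job_config = bigquery.QueryJobConfig(maximum_bytes_billed=50 * 2**30)  # fail past 50 GiB
# client.query(sql, job_config=job_config)
```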
Complete (or almost complete) data, whatever the size of your site!
If you use Search Console exports, you know that they are limited to 1,000 keywords or 1,000 pages. The API already partially avoids this limitation, but the direct link to Google’s data warehouse eliminates the problem altogether. You get the clicks of all pages, for all countries and on all devices.
So you have a 300,000-page site that generates clicks on hundreds of thousands of keywords? No worries, everything is fully extracted and analysable!
The one remaining limitation, attributable to Google rather than to BigQuery, is that some “keywords” are anonymized: 30% to 50% of queries can be anonymized in Search Console. This is nothing new, but not everyone is aware of it (and it must be said that this is the first time Google has displayed it so clearly).
Endless possibilities for junctions with other data sources
If you find the last two points rather interesting, embarking on this extraction brings another big plus: joining datasets. Have you always dreamed of comparing your SEO and SEA performance on the same chart? Nothing could be simpler with a little SQL. Linking Analytics (and GA4) with Search Console? You can! Linking data from third-party tools (ranking tools, crawl tools, etc.)? A trifle! (Well, to be fair, we admit that our consultants’ brains can get fried from staring at formulas for too long, but what would we not do for useful data?)
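As an illustration, here is what an SEO/SEA join could look like. The Search Console side uses the documented export table; the Google Ads table and its columns are hypothetical placeholders standing in for whatever Ads export you maintain:

```python
# Sketch: joining Search Console and Google Ads data on landing page.
# The `ads.landing_page_stats` table and its columns are hypothetical
# placeholders for your own Google Ads export.

SEO_SEA_JOIN = """
    SELECT
      seo.url,
      SUM(seo.clicks)            AS organic_clicks,
      SUM(sea.clicks)            AS paid_clicks,
      SUM(sea.cost_micros) / 1e6 AS paid_cost
    FROM `my-gcp-project.searchconsole.searchdata_url_impression` AS seo
    JOIN `my-gcp-project.ads.landing_page_stats` AS sea   -- hypothetical table
      ON seo.url = sea.landing_page
     AND seo.data_date = sea.segments_date
    GROUP BY seo.url
    ORDER BY organic_clicks DESC
"""
```

Once both tables live in the same BigQuery project, a join like this can feed a single Looker Studio chart mixing organic and paid performance.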
Are you interested? Do you want to test the API? A tutorial is available here. However, if creating an account on Google Cloud Platform and setting up the link seems too complicated, do not hesitate to contact us. We can help you create this link and make the data understandable, to improve your online visibility!
Did you find this article interesting? Please share it!