This past January 31, Google finally opened up API access to its real-time URL indexing data.
That is, the same information provided by the URL inspection tool, added to the Search Console interface in June 2018. But instead of being forced to consult this data manually, one URL at a time, it is now possible to retrieve it automatically and in bulk, within the usage limits stipulated by Google.
URL Inspection API Limits
The API is limited to 2,000 requests per day per property (one URL, one request), at a maximum rate of 600 requests per minute.
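As a practical illustration, here is a minimal Python sketch of pacing calls to stay within both limits; the `inspect_fn` callable is a hypothetical placeholder for whatever function actually performs the inspection request:

```python
import time

MAX_PER_DAY = 2000     # daily quota per Search Console property
MAX_PER_MINUTE = 600   # rate limit per property

def paced_inspections(urls, inspect_fn):
    """Run inspect_fn over at most MAX_PER_DAY URLs, never faster than MAX_PER_MINUTE."""
    min_interval = 60.0 / MAX_PER_MINUTE  # i.e. at least 0.1 s between calls
    for url in urls[:MAX_PER_DAY]:
        started = time.monotonic()
        yield url, inspect_fn(url)  # inspect_fn is a placeholder, not a real API name
        elapsed = time.monotonic() - started
        if elapsed < min_interval:
            time.sleep(min_interval - elapsed)
```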
Data provided
The information the API returns for each URL is the same that we see when using the URL inspection tool in Search Console.
As with that tool, the data is not limited to the indexing status: it also includes whether the URL passes the mobile usability test.
How to use the URL Inspection API
We have two options for extracting the information we want from the API: calling it directly from a script or the command line, or taking advantage of the integrations that several SEO tools have already built, which make the API calls from their own interfaces and even within their workflows.
You can access the API through tools like Screaming Frog and Sitebulb, or you can use one of the alternative methods below.
Using Bulk Page Checker template for Google Sheets
You can also use the Bulk Page Checker template by Mike Richardson, who deserves credit for being the first (as far as I know) to develop an integration for the new API.
The only problem is that you still need to go through the whole service activation and authorization process in a Google Cloud Platform account, almost as if you were connecting to the API via Python or some other script (what you are actually doing is connecting directly through Apps Script).
Besides, Google Sheets doesn’t usually handle hundreds or thousands of API calls within the same sheet very well, and you will probably run into slowness, timeouts, and the like. I recommend fetching the data with another tool or method and then bringing it into Sheets to work with it.
With your own script
If you are comfortable programming, prefer building your own tools for specific use cases, or simply like it better that way, there are already several tutorials that can help you.
Simplifying a bit, the process is always the same:
- Create a project on Google Cloud Platform, or reuse an existing one
- Create a service account with its own credentials
- Create the API keys needed to authorize that service to access the data
- Add the service account as Owner of the Search Console property you are going to inspect
- Make the API call
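As a minimal sketch of that last step in Python, assuming the service account key from the previous steps is saved as `service_account.json` (the file path and URLs below are placeholders) and using the official google-api-python-client library:

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Placeholders: your own key file, property URL and page URL go here.
KEY_FILE = "service_account.json"
SITE_URL = "https://www.example.com/"  # must match the Search Console property exactly
PAGE_URL = "https://www.example.com/some-page/"

SCOPES = ["https://www.googleapis.com/auth/webmasters.readonly"]

credentials = service_account.Credentials.from_service_account_file(
    KEY_FILE, scopes=SCOPES
)
service = build("searchconsole", "v1", credentials=credentials)

# One URL, one request: inspect a single page within the property.
response = service.urlInspection().index().inspect(
    body={"inspectionUrl": PAGE_URL, "siteUrl": SITE_URL}
).execute()

index_status = response["inspectionResult"]["indexStatusResult"]
print(index_status.get("coverageState"))  # e.g. "Submitted and indexed"
print(index_status.get("lastCrawlTime"))  # when Googlebot last fetched the URL
```

Remember that the call will only succeed once the service account has been added to the property, as described in the fourth step.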
What is this data useful for?
Now that Google has given us access to this data via API, albeit with limits (logical, since it is a free API), the possibilities are almost endless, and I am sure that in the coming weeks and months we will see many different applications for it.
For now, I can think of three very simple applications:
- Automate the inspection of a site’s latest published URLs (see the sketch after this list)
- Crawl a site and query the API page by page, grouped by page type or category
- Build a kind of “mini-log” of Google’s crawls
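For the first of these, here is a rough sketch of what the automation could look like, reusing the `service` and `SITE_URL` objects from the earlier example and assuming the site publishes a standard XML sitemap with `<lastmod>` dates (the sitemap address below is a placeholder):

```python
import requests
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder for your own sitemap
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def latest_urls(sitemap_url, limit=50):
    """Return the most recently modified URLs listed in a standard XML sitemap."""
    root = ET.fromstring(requests.get(sitemap_url, timeout=30).content)
    entries = [
        (url.findtext("sm:lastmod", default="", namespaces=NS),
         url.findtext("sm:loc", namespaces=NS))
        for url in root.findall("sm:url", NS)
    ]
    entries.sort(reverse=True)  # newest <lastmod> first
    return [loc for _, loc in entries[:limit]]

# Inspect each recent URL and print its indexing state.
for url in latest_urls(SITEMAP_URL):
    response = service.urlInspection().index().inspect(
        body={"inspectionUrl": url, "siteUrl": SITE_URL}
    ).execute()
    state = response["inspectionResult"]["indexStatusResult"].get("coverageState")
    print(url, "->", state)
```

Scheduled to run daily, something like this could flag newly published pages that Google has not yet indexed.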