Best SEO Tools Using the Google Search Console URL Inspection API

Google delighted webmasters and technical SEOs recently with the announcement of the Google Search Console URL Inspection API. 

Okay, so what was there to be delighted about? We hear you ask!

The good news is: Now you don’t have to spend a lot of time inspecting individual URLs with Google’s URL Inspection Tool. This new API allows third-party SEO tools to:

  • Interact with your Search Console properties 
  • Inspect URLs in bulk
  • Get the URL data from Search Console

In this guide, we’ll discuss what exactly the Google Search Console URL Inspection API is and the best SEO tools you can use to make the most of this new API.

Let’s start. 

What is the Google Search Console URL Inspection API?

The Google Search Console URL Inspection API is a new Search Console API that Google launched on 31st January 2022. It allows external apps and tools to interact with Google Search Console (GSC) and inspect URL data in bulk. 

This new API helps you debug and optimise your webpages. You can also request the indexing data that GSC holds for your properties: the same data you’d otherwise get, one URL at a time, from the URL Inspection Tool. 
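Under the hood, each inspection is a single HTTP POST to the API’s `index:inspect` endpoint. Here’s a minimal Python sketch, assuming you already have an OAuth 2.0 access token with the Search Console read scope; the `build_inspect_request` and `inspect_url` helper names are our own, but the endpoint and the `siteUrl`/`inspectionUrl` fields come from Google’s published API reference:

```python
# Minimal sketch of a single URL Inspection API call.
# Assumes an OAuth 2.0 access token with the
# https://www.googleapis.com/auth/webmasters.readonly scope.

ENDPOINT = "https://searchconsole.googleapis.com/v1/urlInspection/index:inspect"

def build_inspect_request(site_url: str, page_url: str) -> dict:
    """Build the JSON body for one urlInspection.index.inspect call."""
    return {"siteUrl": site_url, "inspectionUrl": page_url}

def inspect_url(token: str, site_url: str, page_url: str) -> dict:
    """Perform the request with the `requests` library (network call)."""
    import requests  # third-party: pip install requests
    resp = requests.post(
        ENDPOINT,
        headers={"Authorization": f"Bearer {token}"},
        json=build_inspect_request(site_url, page_url),
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()  # top-level key: "inspectionResult"
```

The `site_url` must be the property exactly as it appears in GSC (e.g. `sc-domain:example.com` for a Domain property).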

Here’s an example of what the bulk URL inspection report looks like.



Thus, you can inspect multiple URLs at once and get insights into them, unlike Google’s URL Inspection Tool, which handles only one page at a time. 

However, note that the GSC URL Inspection API has the following usage limits:

  • 2000 queries per day per property
  • 600 queries per minute
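If you script against the API, you’ll want to stay inside those quotas. Here’s one simple, illustrative way to do that in Python (the helper names and the pacing strategy are our own, not part of the API):

```python
import time
from typing import Iterable, Iterator

DAILY_LIMIT = 2000   # queries per day per property
MINUTE_LIMIT = 600   # queries per minute

def daily_batches(urls: list, limit: int = DAILY_LIMIT) -> list:
    """Split a URL list into batches that each fit one day's quota."""
    return [urls[i:i + limit] for i in range(0, len(urls), limit)]

def paced(urls: Iterable, per_minute: int = MINUTE_LIMIT) -> Iterator:
    """Yield URLs no faster than the per-minute quota allows."""
    delay = 60.0 / per_minute  # 0.1 s between requests at 600/min
    for url in urls:
        yield url
        time.sleep(delay)
```

With 4,100 URLs, for example, you’d end up with three daily batches of 2,000, 2,000, and 100.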

That said, let’s learn more about the insights you can get and their usefulness.

7 Types of Data You Can Retrieve from the Google Search Console URL Inspection API

While the GSC URL Inspection API allows you to extract a wide range of data, here are some key details to look at. 

User-declared and Google-selected Canonicals

Sometimes Google may select a different canonical URL than the one you’ve specified. By running a bulk inspection of your URLs with the GSC API, you can see whether Google’s selected canonicals match the ones you declared.

Thus, you can easily compare them side by side at scale. 

Also, for the URLs where you haven’t declared canonicals, the user-declared field will be blank. And if you find that Google has selected a different version as canonical, you can add a canonical tag pointing to your preferred URL on the Google-selected page. 

This will inform Google about your preferred canonical URL. 
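As an illustration, once you have each URL’s `indexStatusResult` (the response object documented in the API reference, with `userCanonical` and `googleCanonical` fields), a small helper can surface the mismatches at scale; the function itself is a hypothetical sketch:

```python
def canonical_mismatches(results: dict) -> list:
    """Given {url: indexStatusResult}, return (url, user, google) triples
    where Google picked a different canonical than the one you declared."""
    mismatches = []
    for url, r in results.items():
        user = r.get("userCanonical", "")    # blank if no canonical declared
        google = r.get("googleCanonical", "")
        if user and google and user != google:
            mismatches.append((url, user, google))
    return mismatches
```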

Latest Crawl Time

Now you can easily get crawl information such as the last time Google crawled a URL. This helps you understand how frequently Googlebot crawls your site. 

Earlier, you could get these details through log file analysis or by checking one URL at a time using GSC. But if you’re running an eCommerce site or you’ve been publishing for years, then you can end up with thousands of URLs. 

That’s where the GSC URL Inspection API comes in, letting you inspect URLs quickly in bulk.  
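For example, assuming you’ve collected each URL’s `indexStatusResult` (which carries an RFC 3339 `lastCrawlTime` field, per the API reference), a short helper can turn that timestamp into a crawl age in days; the function name is our own:

```python
from datetime import datetime, timezone
from typing import Optional

def crawl_age_days(last_crawl_time: str, now: Optional[datetime] = None) -> float:
    """Days since Googlebot last crawled, from the RFC 3339 `lastCrawlTime`."""
    crawled = datetime.fromisoformat(last_crawl_time.replace("Z", "+00:00"))
    now = now or datetime.now(timezone.utc)
    return (now - crawled).total_seconds() / 86400
```

Sorting your URLs by crawl age quickly shows which sections of the site Googlebot visits least often.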

Index Status

It informs you of the current indexing status of the URLs submitted for inspection via the GSC API.  

You’ll see one of the following results:

  • INDEXING_STATE_UNSPECIFIED: The indexing status is unknown. 
  • INDEXING_ALLOWED: Indexing is allowed.
  • BLOCKED_BY_META_TAG: Indexing was not allowed because ‘noindex’ was detected in the ‘robots’ meta tag.
  • BLOCKED_BY_HTTP_HEADER: Indexing was not allowed because ‘noindex’ was detected in the ‘X-Robots-Tag’ HTTP header.
  • BLOCKED_BY_ROBOTS_TXT: Crawling and indexing are blocked by a robots.txt file. 

This helps you find indexing issues for multiple URLs at a time and take corrective action if important URLs are blocked by meta robots tags or robots.txt. 
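As a sketch, you could flag the blocked URLs from a batch of results like this (the `indexingState` values are from the API reference; the helper is hypothetical):

```python
# indexingState values that mean a URL cannot be indexed.
BLOCKED_STATES = {
    "BLOCKED_BY_META_TAG",
    "BLOCKED_BY_HTTP_HEADER",
    "BLOCKED_BY_ROBOTS_TXT",
}

def blocked_urls(results: dict) -> dict:
    """Return {url: indexingState} for every URL whose indexing is blocked."""
    return {
        url: r["indexingState"]
        for url, r in results.items()
        if r.get("indexingState") in BLOCKED_STATES
    }
```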

Coverage Status

It tells you whether your URLs are submitted in a sitemap and whether they’re indexed. 

Robots.txt Status

With the GSC API, you also get specific data on the robots.txt status of your URLs. In other words, it lets you know if robots.txt rules block Googlebot from crawling your URLs. 


Page Fetch Status

Page Fetch Status lets you know whether Google could retrieve a page from your server. Its main purpose is to help you diagnose fetch problems like redirect errors, soft 404s, forbidden access, and server errors. 

Here’s the list of possible responses you can get after URL inspection.



Once you identify the page fetch errors, you can take the required measures to resolve them. 
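As an illustrative sketch, assuming you have the `pageFetchState` field from each API response, you could pull out every URL that wasn’t fetched cleanly (treating anything other than `SUCCESSFUL` as an error to review; the helper name is our own):

```python
def fetch_errors(results: dict) -> dict:
    """Return {url: pageFetchState} for pages Google could not fetch cleanly."""
    return {
        url: r["pageFetchState"]
        for url, r in results.items()
        if r.get("pageFetchState") not in (None, "SUCCESSFUL")
    }
```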

Crawled As

It tells you which type of user agent was used for the crawl. Here are the expected responses:

  • CRAWLING_USER_AGENT_UNSPECIFIED: The user agent couldn’t be identified. 
  • DESKTOP: Desktop user agent.
  • MOBILE: Mobile user agent.
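For instance, a quick tally of the `crawledAs` values across your results shows whether your site is being crawled mobile-first (the field name is from the API reference; the helper is hypothetical):

```python
from collections import Counter

def crawler_breakdown(results: dict) -> Counter:
    """Count how often each user agent (DESKTOP / MOBILE) crawled your URLs."""
    return Counter(
        r.get("crawledAs", "CRAWLING_USER_AGENT_UNSPECIFIED")
        for r in results.values()
    )
```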

All of the above data from bulk URL inspection helps your SEOs identify technical issues on your webpages, resolve them, and carry out technical optimisation. 

If you don’t have in-house technical SEOs, you can hire an SEO agency that offers expert technical SEO services. 

6 SEO Tools That use GSC URL Inspection API

Having understood the type of data you can get from the GSC URL Inspection API and its usefulness, let’s take a look at the SEO tools that can interact with the API and generate URL reports for your site. 

Here they are. 

1. Google Bulk Inspect URLs

Google Bulk Inspect URLs is a free-to-use tool developed by Valentin Pletzer. It has the simplest interface to perform the bulk inspection of URLs and get the data from GSC API. 

It doesn’t require any registration or intricate configuration. All you have to do is open the app and follow steps 1 to 4 as shown in the screenshot below. 


Step 1. Authorise access to your GSC account. 

Step 2. Select the website property linked with your GSC account.

Step 3. Enter the list of URLs in the box as shown in the image.

Step 4. Hit the Inspect URLs button. 

And you’ll get the URL inspection diagnosis report organised in different columns for each of the data categories we discussed earlier. Then you can export the data into a spreadsheet or a CSV file for further analysis and reporting. 

2. URL Inspection API in Sheets

Mike Richardson has created a Google Sheets Template that allows you to inspect and retrieve URL data from GSC API directly into the sheets. 

So you don’t have to use any external tools; you can simply make a free copy of the sheet and feed in inputs like: 

  • Private key
  • Client email
  • Client ID
  • GSC property
  • List of URLs, etc. 

Once you fill in the required fields, click Run, as shown in the GIF below.


Next, you’ll get the results of the URL inspection right there in your copy of the Google Sheets. 

3. MyDomain.Dev

Lino Uruñuela has developed a tool called MyDomain.Dev. It gives you access to URL data from the GSC API once you register for this free tool. 

Here’s how you can use it. 

First, you need to register and log in with your Google account which is linked to GSC. 

Then select your domain and enter the URLs into the space provided as shown in the screenshot below.


Then click Check URLs, and the tool will start inspecting your URLs and grabbing the URL data from the GSC API. Your data will be sorted into various result categories, which makes it easier to analyse. 

Further, you can export the inspection reports to a spreadsheet or CSV file. 

Also, consider hiring professional SEO services if you don’t have in-house expertise for technical SEO audits and analysis. 

Now let’s talk about some paid SEO crawlers that have this feature integrated into their existing tool. 

4. Screaming Frog Spider — 16.6 Update

Screaming Frog Spider was one of the earliest tools to integrate the URL Inspection API into its crawler. 

They announced it with the release of Screaming Frog Spider — Version 16.6 and codenamed it ‘Romeo’. So if you’re already using this SEO tool, all you need to do is configure the GSC API access and connect to your account. 

The configuration is simple if you’re familiar with the tool, but you can also check the instructions on their version release page and follow them to start the bulk inspection of URLs. 


In addition, you can also automate the URL inspection API with this tool. 

5. FandangoSEO

FandangoSEO, a cloud-based crawler, was also quick to follow suit and announced its integration with the GSC URL Inspection API. 

They provide most of the URL inspection diagnosis data types like index status, last crawl time, robots.txt status, etc. 

Besides, they also monitor the subsequent changes in these statuses and notify you accordingly.   

6. Sitebulb — Version 5.7

Sitebulb also released Version 5.7 with the new GSC URL Inspection API feature within a few days of Google’s announcement. 

So if you’re using the paid Sitebulb SEO tool, you already have access to this new feature. 

Apart from all the URL diagnostics reports, it has also included the new URL Inspection reports in the form of clickable tables, charts, and visuals like this:


Similarly, you can visualise the data for other segments too. This helps you analyse data and get further insights by leveraging data visualisation. 

So these are some of the best SEO tools for accessing the URL inspection data from GSC API.

Besides, if you have the technical and coding know-how, there are also free community scripts you can run from the terminal to query the API directly.
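As one illustration of what such a script can look like, here’s a hedged sketch using Google’s official `google-api-python-client` library. The service name `searchconsole` and the `urlInspection().index().inspect()` method are part of the published client; the file paths and helper names are placeholders:

```python
# Sketch of a terminal script that bulk-inspects URLs and prints a CSV.
# Requires: pip install google-api-python-client google-auth
# Assumes a service-account JSON key that has been granted access
# to the Search Console property (the key path below is a placeholder).
import csv
import sys

def flatten(url: str, response: dict) -> dict:
    """Pull the key fields out of one inspection response for CSV export."""
    idx = response.get("inspectionResult", {}).get("indexStatusResult", {})
    return {
        "url": url,
        "coverageState": idx.get("coverageState", ""),
        "indexingState": idx.get("indexingState", ""),
        "lastCrawlTime": idx.get("lastCrawlTime", ""),
        "googleCanonical": idx.get("googleCanonical", ""),
    }

def main(site_url: str, urls_file: str) -> None:
    # Network-dependent part: authenticate and call the API per URL.
    from google.oauth2 import service_account
    from googleapiclient.discovery import build

    creds = service_account.Credentials.from_service_account_file(
        "service-account.json",  # placeholder path
        scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
    )
    service = build("searchconsole", "v1", credentials=creds)
    writer = csv.DictWriter(
        sys.stdout,
        fieldnames=["url", "coverageState", "indexingState",
                    "lastCrawlTime", "googleCanonical"],
    )
    writer.writeheader()
    for line in open(urls_file):
        url = line.strip()
        body = {"siteUrl": site_url, "inspectionUrl": url}
        resp = service.urlInspection().index().inspect(body=body).execute()
        writer.writerow(flatten(url, resp))

if __name__ == "__main__" and len(sys.argv) == 3:
    main(sys.argv[1], sys.argv[2])  # e.g. sc-domain:example.com urls.txt
```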

Final Thoughts

The Google Search Console URL Inspection API makes it possible for webmasters and SEOs to retrieve bulk data and insights fairly quickly. Given the quick adoption of this new API, many more SEO tools may also offer integration with it in the near future. 

At the same time, the daily usage limits have been criticised as not enough for large websites. But it’s still much better than analysing individual URLs with Google’s inspection tool. 

That said, if you need help with your technical SEO auditing and analysis, get in touch with us. We’ll be happy to help 🙂 

