
How to use the URL Inspection Tool to find out which of your Web pages Google has indexed and which it hasn't, and why

08 July 2018 (Edited 09 July 2018)

Any Web page that Google hasn't indexed is at a big disadvantage, because Google won't send it any organic search traffic - no matter how well your content matches the phrases your potential customers use to search for products and services like yours.

So, how can you find out which of your pages Google has indexed, which it hasn't and why - and how you can fix that?

The new (June 2018) URL Inspection Tool, part of the new and improved Google Search Console (beta), will do just that.
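Before reaching for the tool, you can also do a quick local sanity check for one of the most common self-inflicted causes of non-indexing: a noindex directive on the page. The sketch below is a deliberately simplistic illustration of that check, not how Googlebot actually parses pages:

```python
import re

def has_noindex(html: str, x_robots_tag: str = "") -> bool:
    """Rough check for a noindex directive in a page's HTML or headers.

    A simplistic illustration only - not Google's actual parsing logic.
    """
    # Check the X-Robots-Tag HTTP header, if you captured it.
    if "noindex" in x_robots_tag.lower():
        return True
    # Look for a robots meta tag containing "noindex" in the HTML.
    meta_tags = re.findall(
        r'<meta[^>]+name=["\']robots["\'][^>]*>', html, flags=re.I)
    return any("noindex" in tag.lower() for tag in meta_tags)

print(has_noindex('<meta name="robots" content="noindex, nofollow">'))  # True
print(has_noindex('<meta name="robots" content="index, follow">'))      # False
```

If this returns True for a page you want indexed, you've found your problem before even opening Search Console.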

To get to the tool, first go to Google Search Console, where you should see something like this:

Note: If your Search Console doesn't show the site you want to check, you need to add it first.

Assuming you can see the default URL of your site of interest in Search Console, click on it.

In this example I'm going to click on the URL for my own site. Doing that takes me here:

To get to the URL Inspection Tool, in the live Search Console page, click the Try the new search console link at upper left, which should take you here:

Then at the top right of the Index coverage chart, click on OPEN REPORT to get a view like this:

In this example, the report shows that everything is OK except for 4 Excluded pages that Google hasn't indexed. To find out which pages these are and why they weren't indexed, click the Excluded box to get to the next screen:

The chart shows a day-by-day count of Excluded pages, and the Status table below breaks them down by reason for exclusion: in this example, two pages excluded as Crawl anomalies, one with a 404 (not found) error, and one as Page with redirect.

Clicking on Crawl anomalies, I can see that these are pages that have since been removed, as is the page with the 404 error. The Page with redirect entry is a URL that redirects to another page - a normal condition.

So: now I know that the best corrective action here is to run Xenu Link Sleuth on the site to make sure there are no bad links, then create, upload and submit a new XML sitemap containing no non-existent pages - which you can do right there within Search Console:
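Once you have the list of URLs that survived your link check, generating that sitemap is straightforward. A minimal sketch, assuming `live_urls` is a list of pages you've already verified to exist:

```python
from xml.sax.saxutils import escape

def build_sitemap(live_urls):
    """Build a minimal XML sitemap from URLs already verified to exist.

    `live_urls` is assumed to be the list that survived your link check,
    e.g. with anything Xenu flagged as broken already removed.
    """
    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc></url>" for u in live_urls)
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>\n")

print(build_sitemap(["https://example.com/", "https://example.com/about"]))
```

Save the output as sitemap.xml, upload it to your site's root, and submit it from the Sitemaps section of Search Console.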

There's a lot more you can find out about individual URLs using the URL Inspection Tool. More on that in a future post.

If you found this article helpful and would like to see more like it, please share it via the Share This Article link, below.

And if you have questions or comments, you can easily send them to me with the Quick Reply form, below, or send me an e-mail.

- David Boggs