Any web page that Google hasn't indexed is at a serious disadvantage: Google won't send it any organic search traffic, no matter how well your content matches the phrases your potential customers use when searching for products or services like yours.
So, how can you find out which of your pages Google has indexed, which it hasn't (and why), and what you can do to fix that?
The new (June 2018) URL Inspection Tool, part of the new and improved Google Search Console (beta), will do exactly that.
To get to the tool, first go to Google Search Console here: https://www.google.com/webmasters/tools/ where you should see something like this:
Note: If your Search Console doesn't show the site you want to check, you need to add it as described here: https://support.google.com/webmasters/answer/4559176?hl=en.
Assuming you can see the default URL of your site of interest in Search Console, click on it.
In this example I'm going to click on http://www.acroglobal.com . Doing that takes me here:
To get to the URL Inspection Tool, on the live Search Console page, click the Try the new search console link at upper left, which should take you here:
Then at the top right of the Index coverage chart, click on OPEN REPORT to get a view like this:
In the example, this report is showing me that everything is OK except for 4 Excluded pages that Google hasn't indexed. To find out which pages these are and why they weren't indexed, click the Excluded box to get to the next screen:
The chart shows a day-by-day plot of the number of Excluded pages, and the Status table below breaks these down by reason for exclusion: in the example, two pages excluded for Crawl anomalies, one for a 404 error (not found), and one as Page with redirect.
Clicking on Crawl anomalies, I find that these are pages that have since been removed, as is the page with the 404 error. The page with redirect is http://acroglobal.com/ which redirects to http://www.acroglobal.com/ - a normal condition.
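If you'd like to double-check these exclusion reasons outside Search Console, you can fetch each URL yourself and look at its HTTP status code. Here's a minimal sketch using only Python's standard library; the status-to-reason mapping in classify() is my own rough shorthand for this example, not anything Google publishes:

```python
import urllib.request
import urllib.error

def classify(status):
    """Map an HTTP status code to the exclusion category it most
    likely corresponds to (rough mapping, not Google's own)."""
    if 300 <= status < 400:
        return "Page with redirect"
    if status == 404:
        return "Not found (404)"
    if status >= 400:
        return "Crawl anomaly (client/server error)"
    return "OK"

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following redirects, so we see the 3xx itself."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def check(url):
    opener = urllib.request.build_opener(NoRedirect)
    try:
        status = opener.open(url, timeout=10).status
    except urllib.error.HTTPError as e:
        status = e.code  # 3xx and 4xx responses both arrive here
    print(f"{url} -> {status}: {classify(status)}")

# Example usage (requires network access):
#   check("http://acroglobal.com/")   # the bare domain from this example
```

Running check() against the bare domain and the www version would confirm the redirect the report flagged, and a 404 from a removed page would show up the same way.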
So now I know that the best corrective action I can take here is to run Xenu Link Sleuth http://home.snafu.de/tilman/xenulink.html on the site to ensure there are no bad links, then create, upload and submit a new XML sitemap that contains no non-existent pages. You can submit it from right there within Search Console:
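Building that replacement sitemap doesn't require special tooling: the sitemaps.org XML format is very simple. Here's a minimal sketch in Python that writes a sitemap from a list of known-good URLs; the page list is illustrative, not taken from the actual site:

```python
from xml.sax.saxutils import escape

def build_sitemap(urls):
    """Return a minimal XML sitemap (sitemaps.org protocol) listing
    the given URLs. Only pages that actually exist should be listed."""
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url in urls:
        lines.append(f"  <url><loc>{escape(url)}</loc></url>")
    lines.append("</urlset>")
    return "\n".join(lines)

if __name__ == "__main__":
    # Hypothetical list of live pages; replace with your site's real URLs.
    pages = [
        "http://www.acroglobal.com/",
        "http://www.acroglobal.com/about.html",
    ]
    with open("sitemap.xml", "w", encoding="utf-8") as f:
        f.write(build_sitemap(pages))
```

Upload the resulting sitemap.xml to your site's root, then submit its URL in the Sitemaps section of Search Console.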
There's a lot more you can find out about individual URLs using the URL Inspection Tool. More on that in a future post.