How to check your website's compliance with Google Webmaster Guidelines using the free Varvy SEO Tool
By David Boggs
14 August 2018 (Edited 22 August 2018)


The Google Search Console will sooner or later warn you about things on your website that don't comply with Google's SEO guidelines. But the free Varvy SEO Tool at https://varvy.com/ enables you to check compliance proactively, and also provides interpretations and explanations of its findings and the corrective actions you need to take.

For purposes of example, I'm going to run the Varvy tool on what Skift says is one of the best US tourism websites: VisitDetroit.com.

The starting point of the Varvy tool looks like the screencap below. Just enter a URL and click the TEST button.

Then after 15 seconds or so, the tool will produce a page with its results for your page's compliance with 14 Google Webmaster SEO guidelines.

1. Googlebot Access checks that your robots.txt file isn't preventing Googlebot from crawling your page. Our example page passes.
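
You can reproduce this check locally with Python's standard-library urllib.robotparser. The robots.txt contents below are a made-up example, not VisitDetroit.com's real file:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt contents (not VisitDetroit.com's actual file).
robots_txt = """\
User-agent: *
Disallow: /wp-admin/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot may crawl the home page but not the blocked directory.
print(parser.can_fetch("Googlebot", "https://example.com/"))           # True
print(parser.can_fetch("Googlebot", "https://example.com/wp-admin/"))  # False
```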

2. Mobile Devices checks how well the page displays on smartphones. The tool gives the example page a pass, but warns that "Tap targets are small". To find out what that means, click the "Learn about mobile SEO" link on the live Varvy report.

3. Security checks that the page is using an encrypted HTTPS connection. The VisitDetroit.com home page passes this. And the tool shows you the details of the SSL certificate.

4. Accessibility checks page features that make it usable by visitors using screen readers. The tool notes "No skip to main content link" - because screen-reader users have to navigate through the page using their TAB key, which means they must tab through every link on the navbar and whatever else is near the top of the page in order to reach the main content. Also some images have no ALT attribute - more on this later.
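
Adding a "skip to main content" link takes only a couple of lines of markup. A minimal sketch (the id and link text are placeholders, not VisitDetroit.com's actual code):

```html
<body>
  <a href="#main-content">Skip to main content</a>
  <nav>
    <!-- the navbar links a screen-reader user would otherwise tab through -->
  </nav>
  <main id="main-content">
    <!-- page content starts here -->
  </main>
</body>
```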

5. Page Speed (self-explanatory): the tool finds several issues here; explanations are behind the link.

6. Robots.txt checks that a robots.txt file in fact exists on your server. This file can speed up crawling of the site by telling Googlebot and other crawlers to skip directories and files with no useful content - like the WordPress admin directory in the example. The example page passes.
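
A minimal robots.txt for a WordPress site might look like this (the paths and the Sitemap URL are illustrative, not VisitDetroit.com's actual file):

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://example.com/sitemap.xml
```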

7. Image alt text checks to see if each image tag has an ALT attribute that describes the image - useful to the visually impaired and in cases where the image fails to display. The tool lists every image on the page and shows the ALT text where present.
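
You can run a similar audit yourself with Python's built-in html.parser; the HTML below is a made-up fragment for illustration:

```python
from html.parser import HTMLParser

class ImgAltChecker(HTMLParser):
    """Collects <img> tags that lack a non-empty alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.missing_alt.append(attrs.get("src", "(no src)"))

# Illustrative HTML, not taken from the example site.
html = """
<img src="/logo.png" alt="VisitDetroit logo">
<img src="/banner.jpg">
"""
checker = ImgAltChecker()
checker.feed(html)
print(checker.missing_alt)  # ['/banner.jpg']
```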

8. Sitemaps checks for the presence of XML (for bots) and HTML (for humans) sitemaps, which make it easier for bots and humans to navigate around, and for bots to index all pages of the site. The example page passes.
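
An XML sitemap is just a list of URLs with optional metadata. A minimal example (the domain and date are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2018-08-14</lastmod>
  </url>
</urlset>
```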

9. If Modified Since checks whether the server honors the If-Modified-Since request header. When a bot sends this header, a supporting server can answer with a 304 Not Modified response for files that haven't changed, letting the bot reuse its cached copy instead of re-downloading. This speeds up crawling. The example page gets a fail on this. But this is the one test for which we have some doubts about the accuracy of the tool. If your page fails, we recommend discussing it with your hosting company's tech support.
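
To see the mechanism, here is a sketch in Python of the HTTP-date format involved and the decision the server makes; the timestamps are arbitrary examples:

```python
from email.utils import formatdate, parsedate_to_datetime

# Build an HTTP-date for the If-Modified-Since request header.
# usegmt=True gives the format servers expect.
since = formatdate(timeval=1534204800, usegmt=True)  # 2018-08-14 00:00:00 UTC
print(since)  # 'Tue, 14 Aug 2018 00:00:00 GMT'

# A server compares that date with the resource's last-modified time:
last_modified = parsedate_to_datetime("Mon, 01 Jan 2018 00:00:00 GMT")
if last_modified <= parsedate_to_datetime(since):
    status = 304  # Not Modified: the bot can keep its cached copy
else:
    status = 200  # Changed: send the full file again
print(status)  # 304
```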

10. Paid links checks that any links you're selling, and any ads from individual companies or websites, carry rel="nofollow" attributes or are excluded by robots.txt, as Google requires. The example site passes.
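
Marking a paid link is a one-attribute change (the URL and anchor text here are invented):

```html
<!-- A paid or sponsored link that should not pass PageRank: -->
<a href="https://advertiser.example.com/" rel="nofollow">Visit our sponsor</a>
```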

11. Valid HTML reports results of running the W3C HTML Validator on the code underlying the page. The example page passes.

12. Amount of links checks for excessive numbers of links from the page. As a rule of thumb, anything up to 100 links is OK with Google. The example page has 242 links, and passes, which probably means the tool has a failure threshold considerably higher than 100. Google will be looking at the quality of the links as well as the number, and there's no way the tool can do that in 15 seconds. If you have many, many links, make sure they're good ones, useful to the user.
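
Counting the links on a page is easy to do yourself with Python's html.parser; the fragment below is illustrative, not taken from the example site:

```python
from html.parser import HTMLParser

class LinkCounter(HTMLParser):
    """Counts <a> tags that carry an href attribute."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        if tag == "a" and any(name == "href" for name, _ in attrs):
            self.count += 1

html = """
<a href="/">Home</a>
<a href="/hotels">Hotels</a>
<a name="anchor-only">Not a link</a>
"""
counter = LinkCounter()
counter.feed(html)
print(counter.count)  # 2
```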

13. Findable links checks that the links on the page follow a logical top-down hierarchy, and that text links are descriptive of where they go: like "Contact Us" vs. "click here". The example site passes.

14. HTTP headers checks that page headers are correct to facilitate communication between the server and users' browsers. A 200 HTTP response from the server tells the browser that the page is OK; a 404 response means "not found". When the tool accessed the example page, the server returned a 200 response, permitting the page to load. Example page passes. HTTP headers are a very technical issue, and anything going wrong here is happening at the server and is likely out of your control. Consult hosting tech support.
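
Python's standard library ships the standard reason phrases for these codes, which makes them easy to look up:

```python
from http.client import responses

def describe(status: int) -> str:
    """Map an HTTP status code to its standard reason phrase."""
    return f"{status} {responses.get(status, 'Unknown')}"

print(describe(200))  # '200 OK'
print(describe(404))  # '404 Not Found'
print(describe(301))  # '301 Moved Permanently'
```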

That's a lot of useful information you can get in 15 seconds for free. Take advantage of the findings and recommendations of the Varvy tool to make your site more Google-friendly, and you should be rewarded with better rankings in organic search.


If you found this article helpful and would like to see more like it, please share it via the Share This Article link at the top of the page.

And if you have questions or comments, you can easily send them to me with the Quick Reply form, below.

