Software review: Using Xenu's Link Sleuth for SEO
By David Boggs
13 February 2018



Xenu's Link Sleuth by developer Tilman Hausherr is probably most often used simply as a link-checker: send it to the home page of a website, and it will produce a report of broken links sorted by page. Quick and simple.

But Link Sleuth has many more features that make it a great tool for analyzing a number of SEO aspects of a website, including:

  • Page filenames
  • Page weights
  • Image file sizes
  • Page titles
  • Description META tags
  • Local link counts, incoming and outgoing, for each page
  • For any page, a link from within Xenu directly to the W3C HTML validator
  • It's free

The only drawback for some potential users is that it's only guaranteed to run on Windows (and has been found to work on some versions of Linux), but it doesn't run on macOS, Android, etc.

Downloading and installing Link Sleuth is a one-minute job: just go to the developer site (linked from Read External Article, below), click the download link, un-archive XENU.ZIP using WinRAR or WinZip, and double-click Setup.exe.

When you open the program, you'll see an empty gray window with a menu at the top. Click File | Check URL, and you should get a dialog box like this:

You can see a few settings in that view, and there are a few more behind the More Options link. You can experiment with these settings freely without breaking anything.

For purposes of example, I've sent Xenu to crawl and report on MattJBarker.com - a very simple site with just 5 pages, 5 images and a stylesheet.

Here's a screencap of the most interesting parts of the results Xenu produced in just a few seconds:

A quick look at that tells us:

  • All the page, image, CSS, and other file URLs within the site and linked from any of its pages
  • The response from each URL as it was crawled; all local URLs show as OK. (For simplicity, I set Link Sleuth not to pursue external links.)
  • The "Type" of each URL: text, image, etc.
  • Size of each file, which gives an indication of loading time. Using these data, you could immediately spot any enormous images; happily this site has none.
  • The <TITLE> tag of each page (ALT tags for images), all of which should be unique, relevant to the content, and optimized for keywords. The site gets good marks for that; TITLEs are unique. (Note that Link Sleuth sees the home page as both [/] and [/index.htm]; it's just one page with one TITLE - not a problem.)
  • Farther to the right, outside the screencap, Link Sleuth also shows the <META Description> tag for each page. These should also be unique and relevant.

But that's not all: right-click on any URL in the live Link Sleuth report and you'll get a menu that, among other things, lets you view the Properties of the URL, including all incoming and outgoing links.

Also from that same right-click menu you can choose Validate URL to go directly to the W3C online HTML validator and see its report on how well the page code complies with HTML standards. (But be warned: this validator is very fussy, and can produce pages of incomprehensible analyses of the weird code that often exists in links to Google, social media buttons, and other external things that are beyond your control.)

To dig deeper into the crawl data, open the File menu and click Export to TAB separated file. Link Sleuth will produce a tab-delimited .TXT file of the data which you can save and open in Excel (or similar).

In the example, with external data removed, the first 6 columns of the .TXT file look like this:

The advantage of having these data in your spreadsheet program is that you can sort, filter, and otherwise slice and dice the numbers to look at anything you want. Which <TITLE> tags are longer than 60 characters and so get truncated in Google's search results? Which <META Description> tags are too long or too short?
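If you'd rather script these checks than build spreadsheet formulas, the same export can be read with a few lines of Python. This is only a minimal sketch under assumptions: the file name (xenu-export.txt) is made up, and the column headers ("Address", "Title", "Description") should be checked against the header row of your own export, since they can vary between Link Sleuth versions.

```python
import csv

# Minimal sketch: read Xenu's tab-separated export and flag SEO issues.
# The file name "xenu-export.txt" and the column names "Address", "Title"
# and "Description" are assumptions -- check the header row of your own export.

MAX_TITLE_LEN = 60        # titles longer than this tend to be truncated in search results
DESC_RANGE = (50, 160)    # rough lower/upper bounds for META description length

with open("xenu-export.txt", encoding="utf-8", errors="replace") as f:
    rows = list(csv.DictReader(f, delimiter="\t"))

for row in rows:
    url = row.get("Address", "")
    title = (row.get("Title") or "").strip()
    desc = (row.get("Description") or "").strip()

    if len(title) > MAX_TITLE_LEN:
        print(f"Long TITLE ({len(title)} chars): {url}")
    if desc and not (DESC_RANGE[0] <= len(desc) <= DESC_RANGE[1]):
        print(f"Description length {len(desc)} outside {DESC_RANGE}: {url}")
```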

Run Link Sleuth on your website, and I guarantee you'll find some - maybe a LOT of - broken links, suboptimal <TITLE> and/or <META Description> tags, too-large images, images with missing or useless ALT attributes, multiple pages with the same <TITLE> (which look like duplicate content to Google), etc., etc.
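Two of those checks - duplicate <TITLE> tags and oversized images - are also easy to automate against the same export. Again, just a sketch: the "Type" (a MIME type) and "Size" (bytes) columns are assumptions to verify against your own export, and the 200 KB threshold is an arbitrary example.

```python
import csv
from collections import defaultdict

# Sketch continuing from the export above: group HTML pages by TITLE to find
# duplicates, and flag image files over a size threshold. The "Type" (MIME
# type) and "Size" (bytes) columns are assumptions -- verify them in your export.

MAX_IMAGE_BYTES = 200_000  # arbitrary example threshold (~200 KB)

with open("xenu-export.txt", encoding="utf-8", errors="replace") as f:
    rows = list(csv.DictReader(f, delimiter="\t"))

pages_by_title = defaultdict(list)
for row in rows:
    if (row.get("Type") or "").startswith("text/html"):
        title = (row.get("Title") or "").strip()
        pages_by_title[title].append(row.get("Address", ""))

for title, urls in pages_by_title.items():
    if title and len(urls) > 1:
        print(f"Duplicate TITLE {title!r} on {len(urls)} pages: {urls}")

for row in rows:
    if (row.get("Type") or "").startswith("image/"):
        try:
            size = int(row.get("Size") or 0)
        except ValueError:
            continue
        if size > MAX_IMAGE_BYTES:
            print(f"Large image ({size:,} bytes): {row.get('Address', '')}")
```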

Not bad for a free tool!


And if you have questions or comments, you can easily send them to me with the Quick Reply form, below, or send me an e-mail.


David Boggs MS - David
David@DavidHBoggs.com

External Article: 

