First of all, Netpeak Spider is a tool you can download here that audits the technical SEO side of your projects. It is very useful, especially when you are starting a project or want to analyse its current state.
I won’t dwell on the introduction any longer, because this tool, like similar ones such as Screaming Frog, has a lot to it.
In the following lines, I will show you some outstanding features of this software.
Netpeak Spider: fast and easy in automatic mode
What I liked most about Netpeak Spider is the ease with which it shows you the data. I have to admit that at first I hated this tool, not because it was difficult to use, but because I’m too used to Screaming Frog.
Let’s start with everything this tool tracks and what data it shows you:
- Robots: It checks whether there is a robots.txt file and which pages it blocks, and covers advanced cases such as canonical tags pointing to pages blocked by robots.txt.
- Meta robots: Indicates which URLs have meta robots tags and which don’t, and whether they carry index, noindex or other directives.
- Redirects: It not only tells you about redirects, but also about redirect chains, broken ones, and ones that end in pages that are not indexable or are blocked by robots.txt. Here is an article where I talk about redirects.
- Canonicals: As in the previous point, it flags canonical chains and canonicals that end in pages that cannot be indexed or are blocked by robots.txt.
- Pagination: Here it tells you whether paginations are implemented correctly, whether you are using rel="next"/"prev", whether you are using canonicals, whether there are errors, etc.
- Heading hierarchy
- Content length
- Text / HTML ratio
- Number of words
- Internal links
- Internal outbound links
- External links (outgoing)
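Several of these metrics are easy to compute yourself. As a rough sketch in Python (the exact formula Netpeak Spider uses isn’t documented here, so this is just one reasonable definition), the text/HTML ratio and word count could be estimated like this:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, skipping the contents of script/style tags."""
    def __init__(self):
        super().__init__()
        self.parts = []
        self._skip = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.parts.append(data)

def text_html_ratio(html: str):
    """Return (visible-text chars / total HTML chars, word count)."""
    parser = TextExtractor()
    parser.feed(html)
    text = " ".join(" ".join(parser.parts).split())
    ratio = len(text) / len(html) if html else 0.0
    return ratio, len(text.split())

# Made-up sample page
sample = "<html><head><style>p{color:red}</style></head><body><p>Hello SEO world</p></body></html>"
ratio, words = text_html_ratio(sample)
print(round(ratio, 2), words)  # → 0.17 3
```

Crawlers may count characters or strip markup slightly differently, so treat the number as a relative signal between your own pages rather than an absolute one.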
ISSUES: Errors, Warnings and Notifications – THIS IS COOL!
One of the parts I like most about this tool is the Issues section (the “important” problems) that Netpeak Spider defines by default, which helps you prioritise the different actions to take on a project.
By default, it defines 3 types of issues:
- Errors (high priority): These are usually the most serious problems, and the first to take into account when prioritising improvement actions on a website.
- Warnings (medium priority): These are usually lighter issues, mostly related to redirects, canonicals pointing to pages that return a code other than 200, etc.
- Notifications (low priority): Although I label these low priority, they can often be high priority, especially when it comes to canonicals: if canonicals are poorly implemented, we may have a serious problem on the website.
As I mentioned above, this tool is very similar to Screaming Frog, so I’m not going to review the whole thing, because then it wouldn’t be an article, it would be a scroll xD. That’s why I decided to highlight 7 cool things this tool does that are at least worth trying, so you can decide whether it works for your projects.
LET’S GET DOWN TO IT!
1. Redirect chains
This is a great option. What it does, exactly, is report every redirect chain in your project, i.e. a URL with a 3xx code pointing to another URL that also has a 3xx code, and so on for as many hops as you may have. In the video at the bottom of this article, you can see where this option is located.
It also has another important option you can configure: the maximum number of consecutive redirects it should flag as an error (4 by default, although I like to set it to 2).
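Conceptually, the check is straightforward: given a map of source URL → 3xx target built from a crawl, follow the hops and flag any chain longer than your threshold. A minimal sketch (the URLs, the map, and the threshold of 2 are illustrative, not Netpeak Spider’s internals):

```python
def redirect_chain(url, redirects, limit=20):
    """Follow 3xx hops starting at `url`; return the chain of URLs visited.
    Stops early if a loop is detected."""
    chain = [url]
    seen = {url}
    while url in redirects and len(chain) <= limit:
        url = redirects[url]
        chain.append(url)
        if url in seen:  # loop: a URL redirects back into the chain
            break
        seen.add(url)
    return chain

# Hypothetical crawl data: source URL -> 3xx target
redirects = {
    "/old": "/older",
    "/older": "/oldest",
    "/oldest": "/final",
}

chain = redirect_chain("/old", redirects)
hops = len(chain) - 1
print(chain, hops, hops > 2)  # 3 hops: too long for a threshold of 2
```

The same traversal also surfaces loops (section 3 below covers those), since any URL seen twice means the chain never resolves.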
2. Redirects ending in blocked URLs
Another interesting option! It notifies you of every redirect or redirect chain that ends in a URL blocked by robots.txt.
This can also be checked with Screaming Frog by exporting the redirects report and filtering for all URLs with 3xx codes that point to pages blocked by robots.txt or that respond with a code other than 200. But with this tool, you have it at the click of a button 😉.
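If you want to reproduce this check outside either tool, Python’s standard library ships a robots.txt parser. A small sketch (the rules and the redirect-target URLs are made up):

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
# Hypothetical robots.txt content; in practice you would load the real file
# with rp.set_url("https://example.com/robots.txt") followed by rp.read()
rp.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# Hypothetical final destinations of your redirect chains, taken from a crawl
targets = ["/public/page", "/private/landing"]

# Any redirect target the crawler cannot fetch is a "redirect to a blocked URL"
blocked = [t for t in targets if not rp.can_fetch("*", t)]
print(blocked)  # → ['/private/landing']
```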
3. Infinite redirect loops
It’s simple: these are URLs that redirect to themselves (more common than you might think). The problem is not just the authority you lose; you also have Googlebot going around in circles because those redirects never end (it doesn’t really, but you know what I mean xD).
4. Calculation of internal PR
This option is one of the ones I like the most. Based on all the internal links that this tool detects and taking as a reference the original Google Page Rank formula, it will give you the internal PR data of each of your URLs.
This will help you get a feel for the strongest URLs in your project and see whether you are passing more strength to the pages that interest you most, or the least.
*** Here is an article from Search Engine Land where they describe another method to perform this task based on the Page Rank formula: http://searchengineland.com/improve-internal-linking-calculate-internal-pagerank-r-246883
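For intuition, the core of such a calculation can be sketched in a few lines of Python. This is the textbook PageRank power iteration, not Netpeak Spider’s actual implementation, and the internal link graph below is invented:

```python
def internal_pagerank(links, damping=0.85, iterations=50):
    """Power iteration over an internal link graph.
    `links` maps each page to the list of pages it links to."""
    pages = set(links) | {t for targets in links.values() for t in targets}
    n = len(pages)
    pr = {p: 1.0 / n for p in pages}  # start with a uniform distribution
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for page, targets in links.items():
            if targets:
                share = pr[page] / len(targets)  # split PR across outlinks
                for t in targets:
                    new[t] += damping * share
            else:  # dangling page: redistribute its PR evenly
                for t in pages:
                    new[t] += damping * pr[page] / n
        pr = new
    return pr

# Hypothetical internal link graph: page -> pages it links to
links = {
    "/": ["/blog", "/about"],
    "/blog": ["/"],
    "/about": ["/"],
}
pr = internal_pagerank(links)
strongest = max(pr, key=pr.get)
print(strongest)  # → / (every page links back to the home page)
```

Real crawlers work with far bigger graphs and may weight links differently, but the relative ordering this produces is what “internal PR” is about.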
5. Internal PR loss
Another interesting option. This tool notifies you of the areas of your website where you are wasting PageRank. This usually happens when we link to images hosted at other URLs, when we link to files, etc., since we are passing strength to URLs that give nothing back (careful, this can also affect crawl budget).
6. Internal links to a single URL
While this data is just as easy to obtain in Screaming Frog, I wanted to highlight it because many people ask me how it’s done. All you need to do is click on the URL you are interested in and then on the inbound internal links option. With that, you get all the internal links pointing to that URL, along with information about the anchors, etc.
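Under the hood this is just a reverse index of your crawled links: group every (source, anchor) pair by its target. A tiny sketch in Python (the crawl data is made up):

```python
from collections import defaultdict

# Hypothetical crawl output: (source URL, target URL, anchor text)
crawled_links = [
    ("/", "/blog", "Blog"),
    ("/about", "/blog", "Read our blog"),
    ("/blog", "/", "Home"),
]

# Reverse index: target URL -> list of (source, anchor) pairs
inlinks = defaultdict(list)
for source, target, anchor in crawled_links:
    inlinks[target].append((source, anchor))

# All internal links (and their anchors) pointing to /blog
print(inlinks["/blog"])  # → [('/', 'Blog'), ('/about', 'Read our blog')]
```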
7. Canonical chains and canonicals ending in blocked URLs
Finally, I wanted to highlight this option because I find it very, very interesting. It identifies URLs whose canonical points to another URL that in turn has a canonical pointing to yet another URL… and so on, with no limit on the number of hops.
Additionally, as with the 3xx codes I talked about earlier, it also identifies URLs that have a canonical that points to another URL that is blocked by robots.txt.
More information on Netpeak Spider – [VIDEO]
My conclusions on the Netpeak Spider tool
I want to be totally honest with you. This tool has many features that complement Screaming Frog and can give you very valuable information for your projects, but beware: I would never use it to replace Screaming Frog (maybe I just have too much love for the little frog).
So, if you have big projects, if you run an agency, if you are a freelancer, or if you simply want to experiment, I recommend you use it alongside Screaming Frog; you will have a lot of power, and two tools always surface more information than one. In this case, there are also things this one identifies faster.
*** PS: I wrote this article at lightning speed, so if you find a mistake, don’t worry, I will correct it when I reread the article. I rushed into editing it before 7pm and had to 😀.