
The 3 Work Areas Every SEO Consultant Should Know and Work on NOW


Being visible on Google is the goal of many, but to achieve it we need to keep up to date with what is happening in the world of SEO. That is why we must ask ourselves: how does SEO work today? What are the REAL pillars we need to take into account in order to work effectively?

The techniques the expert SEO consultant referred to are:

  1. Create content and keep it up to date and fresh.
  2. Make it evergreen by removing the date.
  3. Create content that answers the questions users ask in the search engine.
  4. Take care of internal links and anchors to distribute strength between different URLs and give those URLs consistent semantics.
  5. Optimize the authority of the content as the site grows.

In this way, Luis answered the question he himself had asked at the beginning of his presentation: none of these techniques has changed. All of them are still effective.

And he insisted that SEO changes little by little, driven both by search engine users and by SEOs themselves. That does not mean we should abandon the techniques that have worked for us until now; we have to complement them with new ones.

Google: a key aspect for positioning

After this first introduction, Luis referred to Gary Illyes and John Müller, two Google representatives who give advice on improving SEO. Some of the ideas that can help us position better are: think about mobile devices, increase loading speed, and give due importance to links and content.

We must keep in mind that Google learns from everything and tries to respond better and better to user searches. Therefore, we too must take this path.

However, in order not to be pulled back and forth by every recommendation we come across, it is important to work with our own methodology and test it in different contexts and over time to verify its effectiveness.

The 4-piece SEO puzzle: crawling, indexing, content and popularity

From this point, Luis began presenting his methodology, which he developed based on the process Google follows. We can say that there are 4 fundamental pieces for working on SEO in a comprehensive way: crawling, indexing, content and popularity.

But we have to keep in mind that, for all the pieces of the puzzle to fit together, they need to be translated correctly onto the website. And that is the job of the web architecture.

Web architecture is made up of, on the one hand, the information architecture, which includes the logical dependencies and semantic groupings we create, and, on the other, the technical architecture, which transfers all this information to the web.

The 3 most important work areas in SEO

Next, the SEO expert explained how these 4 elements can help us create a methodology that works for us: distributing the strength of the website according to our priorities and the goals we set ourselves.

When working on them, these 4 pieces are grouped into 3 areas: technical, quality and authority. The objectives and workings of each can be summarized briefly.

How is the technical area related to crawling and indexing?

The technical area aims for Google to see the URLs we want it to see, as soon as possible, thanks to an efficient web architecture based on correct crawling and indexing.

For this very reason, it is important to look for the most efficient way to develop our SEO strategy and to avoid positioning errors that could undermine our goals.

Why is the quality area focused on content?

On the other hand, the quality area seeks to cover the user's current and future search intentions and to give Google as much information as possible.

To cover this area, it is important to create appropriate content that responds to the needs of our users.

How does the authority area use our brand image?

Finally, we have the authority area, which is related to our brand image. If we take care of this point as well, we can turn the project into a reference in search.

But, in order to improve our brand, it is important to build an adequate link building strategy, earning links from other web pages with a similar theme.

How to work on the technical area of SEO

After presenting the first concepts, Luis moved on to the practical part of the presentation, in which he gave various recommendations applicable to our projects for each of the areas, starting with the technical one.

1. Differentiate between crawlable and indexable URLs

But before getting into the more practical aspects, it is important that we see the difference between certain terms. This happens, for example, with crawlable and indexable URLs.

Google does not crawl URLs in the order it finds them, or as soon as it finds them. Rather, it prioritizes according to the authority of each URL.

Luis commented that understanding this distinction first requires clarifying what discovery, crawling and indexing consist of: three closely related terms.

► Discovery: this occurs when Google discovers that a URL exists. When this happens, it queues the URL for crawling, after a preliminary scan to see how many links it contains.

► Crawling: whatever Google sees in the HTML, it will crawl and queue using an intelligent system that prioritizes processes. Crawling is therefore based on these priorities and their level of relevance.

► Indexing: if the crawled URL is indexable and Google can index it, it will. If it does not index it, it may be because it does not consider it the canonical URL, or because it considers it contains an element that should not appear in its search engine.

How does Google decide which URLs to index and which not?

At this point, Luis pointed out that declaring a canonical URL does not mean that Google will consider it canonical. Google will decide for itself.

This point is extremely important because, if we declare a canonical URL and Google does not accept our suggestion, it will not index that URL, even though it knows of its existence and has crawled it.

To handle all this well, we need to check and know how to differentiate a few more important terms.

    • Discovered / detected URL: Google detected it in a previous crawl and queued it, setting its priority.
    • Crawlable URL: it can be crawled without problems.
    • Crawled URL: it has been crawled by Google.
    • Indexable URL: it can be indexed without problems.
    • Indexed URL: it appears in the Google search engine when a user performs a query.

When can Google index a URL?

At this point, Luis started talking about the usual indexing cases, taking the opportunity to demystify a widespread misconception: that an uncrawled URL cannot be indexed.

Furthermore, we must also know that even if a URL is blocked by robots.txt, Google can still index it. So we need to know some of the usual situations:

    • A discovered, crawlable URL may go a while without being crawled. This happens, for example, when we launch a new website that has no external authority and has many URLs at a depth of more than three or four levels. In this case, the URLs are discovered, but Google does not crawl them.
    • A crawlable URL may be indexed.
    • Even a non-crawlable URL may be indexed.
    • A non-crawlable, non-indexable URL (one we have told Google not to index) may also be indexed. Why? Because, as Luis pointed out, if we declare a noindex directive but block the URL in robots.txt, Google can never see that directive. Google even indexes URLs that return a 404 error, for the simple reason that it cannot see the error.
    • The canonical is a suggestion, and the declaration made by the webmaster may be incorrect.
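The crawlable / non-crawlable distinction above can be checked programmatically. A minimal sketch using Python's standard-library robots.txt parser, with hypothetical rules and URLs:

```python
# Checking whether robots.txt allows crawling a URL.
# Remember: robots.txt only controls crawling; a blocked URL can still end
# up indexed if Google discovers it through links, as described above.
# The rules and URLs below are hypothetical examples.
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
Allow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

def is_crawlable(url: str, user_agent: str = "Googlebot") -> bool:
    """Return True if robots.txt permits fetching this URL."""
    return parser.can_fetch(user_agent, url)

print(is_crawlable("https://example.com/products/shoes"))   # True
print(is_crawlable("https://example.com/private/report"))   # False
```

A blocked URL here is only non-crawlable, not non-indexable; to keep it out of the index, Google has to be able to crawl it and see a noindex directive.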

After pointing out the difference between discovery, crawling and indexing and Google's considerations in this regard, Luis went on to explain the next recommendation in the technical area: the depth levels of a URL.

2. Consider the depth levels of a website

The depth levels constitute the hierarchy of each website. For example, Luis illustrated in his presentation the hierarchy of an e-commerce site: its home page, a category, a sub-category and a product.

In this way, according to Google, all other pages are structured in click levels starting from the web's hotspot, the strongest page of the domain. This main page is usually the home page, because it is the most linked and most popular page, through both external and internal links.

The more depth levels a URL sits under, the more strength it loses. But what do you do when you have a website with many depth levels and you want to attract more traffic? Luis explained that the ideal is to clean up useless URLs and move the useful ones up to lower depth levels, always maintaining consistency.
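Click depth is simply the shortest path from the home page through internal links, so it can be measured with a breadth-first search. A minimal sketch over a hypothetical internal link graph (a real audit would build the graph from a crawl):

```python
# Measuring click depth: breadth-first search from the home page over the
# site's internal link graph. The graph below is a hypothetical example.
from collections import deque

links = {
    "/": ["/shoes", "/bags"],
    "/shoes": ["/shoes/running"],
    "/shoes/running": ["/shoes/running/model-x"],
    "/bags": [],
    "/shoes/running/model-x": [],
}

def click_depths(graph, home="/"):
    """Return the minimum number of clicks from the home page to each URL."""
    depth = {home: 0}
    queue = deque([home])
    while queue:
        url = queue.popleft()
        for target in graph.get(url, []):
            if target not in depth:          # first visit = shortest path
                depth[target] = depth[url] + 1
                queue.append(target)
    return depth

print(click_depths(links))
```

Here "/shoes/running/model-x" sits 3 clicks deep; adding an internal link to it from "/shoes" would lift it to depth 2, which is exactly the kind of restructuring Luis recommends.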

3. Experiment: the case of pagination

Finally, the last step is to experiment with certain aspects and see what results we get. To explain this, Luis alluded to Google's announcement in March 2019, when it revealed that it had not taken rel=next/prev into account for paginated pages in years. And yet, whenever John Müller or Gary Illyes, representatives of Google itself, were asked about rel=next/prev, they advised using it.

With this "funny" case, Luis indicated that the key that always works in SEO is experimenting. In other words, we shouldn't blindly trust what we read, since opinions can differ.

Luis returned to the subject of pagination, stating that paginated pages are necessary, even if not desirable, above all for crawling. And he added that it is important for Google to crawl all the URLs we have, and that we shouldn't put 4,325 products into a category without paginating them. What we need to do is arrange this layout so that all products are crawled efficiently at low depth levels.

How to work in the SEO quality area

After having dealt with the more technical part of SEO, Luis moved on to the quality part, which is also very important in SEO, and stressed the key objectives of this area:

  1. Cover the user's search intentions when they run a query in the search engine.
  2. Provide Google with as much information as possible about each URL, via microdata, microformats, etc.
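The second objective is commonly implemented with structured data. A minimal sketch generating JSON-LD markup for a product page; all names, prices and values below are hypothetical examples:

```python
# Generating JSON-LD structured data for a product page, one common way of
# giving the search engine extra machine-readable information about a URL.
# Every value here is a hypothetical example.
import json

product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Running Shoe Model X",
    "description": "Lightweight running shoe.",
    "offers": {
        "@type": "Offer",
        "price": "59.90",
        "priceCurrency": "EUR",
        "availability": "https://schema.org/InStock",
    },
}

# This string would be embedded in the page inside a
# <script type="application/ld+json"> tag.
json_ld = json.dumps(product, indent=2)
print(json_ld)
```

The same pattern applies to articles, events, FAQs and other schema.org types: the richer and more accurate the markup, the more information Google has about the URL.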

And he introduced a very hot topic about content with a question to the audience: does duplicate content penalize? Those present raised their hands with conviction, but Luis indicated that, as always, it depends, and that it is necessary to distinguish between penalization and filtering.

What does Google do about duplicate content?

According to Luis, Google filters the duplicate versions of a URL, trying to identify the correct one. So, if we have multiple URLs with the same content, the signals we give to the search engine are spread thin, on top of not providing a correct canonical declaration.

There is no penalty for duplicate content. Google chooses one URL among them all and declares it canonical; the others it simply filters out.

We can therefore say that Google filters URLs with similar content and keeps the best one as its reference. What actually harms our positioning are the diluted signals that we, as webmasters, send out.

He also pointed out that it has always been this way, but now we can see it more clearly thanks to Google Search Console, which confirms it in the "Excluded" section.

Tips for the authority area

Once two of the main areas of the strategy had been addressed, the third point remained: the one related to the authority area.

To explain it, he asked the following question: "What is the final goal we are pursuing with this whole process?" The answer was clear: we want to turn the project into a reference, so that everyone mentions (or links to) it and searches for it by its brand.

An analogy that illustrates this very well is that of a company giving press interviews to gain greater relevance in its sector. Luis made it clear that authority is the same thing, but on the internet, and went on to explain how to measure it through PageRank.

What is PageRank?

When we speak of PageRank, we are referring to a formula that expresses the relative weight of a page. Luis indicated that this is the best way to achieve a consistent and efficient distribution of that aforementioned strength, since 15% of it is lost at each depth level a URL sits under.

The formula to calculate PageRank is as follows:

PR(A) = (1 - d) / N + d * (PR(B) / L(B) + PR(C) / L(C) + …)

Where:

N: the total number of pages included in the calculation.

d: the damping factor (generally set to 0.85).

L: the number of outgoing links on each page.

PR(B), PR(C), …: the PageRank of the pages that link to A.
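The formula above is applied iteratively until the scores stabilize. A minimal sketch over a tiny hypothetical link graph (who links to whom):

```python
# Iterative PageRank over a hypothetical internal link graph.
links = {
    "A": ["B", "C"],   # A links out to B and C
    "B": ["A"],
    "C": ["A"],
}

def pagerank(graph, d=0.85, iterations=50):
    """Repeatedly apply PR(p) = (1 - d)/N + d * sum(PR(q)/L(q)) over
    the pages q that link to p, until the values settle."""
    n = len(graph)
    pr = {page: 1.0 / n for page in graph}   # start with a uniform split
    for _ in range(iterations):
        new_pr = {}
        for page in graph:
            inbound = sum(pr[q] / len(graph[q])
                          for q in graph if page in graph[q])
            new_pr[page] = (1 - d) / n + d * inbound
        pr = new_pr
    return pr

ranks = pagerank(links)
print(ranks)
```

Page "A" receives links from both B and C, so it ends up with the highest score, which is exactly the intuition behind concentrating internal links on priority URLs.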

He also added that we should not create internal links at will. We must rely on 2 basic premises:

  • Do not lose internal popularity: it is better not to link to URLs that do not return a 200 code, that are not canonical or indexable, or that return a 404 error or a 301 redirect. Better to link only to URLs that will appear on Google.
  • Do not waste popularity: it is better not to pass internal popularity to URLs that are not so important. Popularity must be distributed fairly, with priorities, consistently and efficiently. Not all URLs need the same strength, nor are they all equally important or competitive.
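The first premise translates directly into a link-audit rule. A minimal sketch that filters candidate internal-link targets by status code, indexability and canonical status; the page data below is a hypothetical example (in practice it would come from a site crawl):

```python
# Filtering internal-link targets: only link to URLs that return 200,
# are indexable and are canonical. The status data is hypothetical.
pages = {
    "/shoes":          {"status": 200, "indexable": True,  "canonical": True},
    "/old-promo":      {"status": 404, "indexable": False, "canonical": False},
    "/shoes?sort=asc": {"status": 200, "indexable": True,  "canonical": False},
    "/moved":          {"status": 301, "indexable": False, "canonical": False},
}

def linkable(url: str) -> bool:
    """A URL is worth linking internally only if it can appear on Google."""
    page = pages[url]
    return page["status"] == 200 and page["indexable"] and page["canonical"]

link_targets = [url for url in pages if linkable(url)]
print(link_targets)  # only "/shoes" passes all three checks
```

Everything that fails the check leaks internal popularity: 404s and 301s lose it outright, and non-canonical or non-indexable URLs receive strength that will never show up in search results.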

To conclude the presentation, Luis told the audience that the goal is to focus where our actions generate the greatest "ROI", and that this methodology helps identify the work area on which we need to focus our efforts.

And he concluded his splendid presentation with one last piece of advice.

Conclusion

After a presentation full of content and applicable advice, the conclusions we were able to draw are:

  • SEO has not changed. It evolves little by little as Google improves its search engine to satisfy the user.
  • The SEO puzzle of any URL is based on crawling, indexing, content and popularity.
  • The 3 areas that comprise these pieces are the technical area (crawling and indexing), the quality area (content) and the authority area (brand popularity).
  • The first and biggest goal of any SEO web project is for the world to know it, find it and search for it by its brand.
  • The important thing when undertaking an SEO project is to detect in which area it is failing or can improve, and focus on it.
