Everything about the Shopify robots.txt file

Do you have an online business and want to work on your SEO positioning in Shopify? Then you will have to take the Shopify robots.txt file into account.

But what is robots.txt? How do we exclude a page from crawling in our Shopify robots.txt file? What if it's several pages, or the search template?

You will find all the answers in this post. Keep reading!

What is robots.txt and what is it used for?

The Shopify robots.txt file is used to tell Google's crawlers (robots / spiders) which pages we want them to follow and which we do not, since it is the first document they fetch when they visit your site.

For example, we might want to block users' cart pages from the search engines.

The robots.txt file is used to hide certain sections of a site, prevent some pages from appearing in the SERPs, indicate where to find the sitemap, and keep image files, PDFs, layouts and templates out of the index.

To view your Shopify robots.txt file, you just need to add /robots.txt at the end of your domain. For example, if your domain is yourdomain.com, you will find the file at yourdomain.com/robots.txt.
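
As an illustration, and assuming a fairly typical Shopify store, the file usually contains directives along these lines (the exact contents of your file may differ):

User-agent: *
Disallow: /admin
Disallow: /cart
Disallow: /checkout
Disallow: /orders
Sitemap: https://yourdomain.com/sitemap.xml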

Can I edit the robots.txt file?

No, and that is the problem: the content of the Shopify robots.txt file cannot be edited, unlike in other CMSs such as WordPress or Prestashop. But don't worry, you can easily add a noindex tag in your Shopify store by editing the theme file.
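
The tag in question is the standard robots meta tag: placed in the <head> of the theme, it tells search engines not to index the page on which it is rendered:

<meta name="robots" content="noindex">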

How to exclude a page from the Shopify robots.txt?

To deny search engines access to any of your Shopify pages, do the following:

  1. Go to your Shopify admin panel.
  2. In the admin, go to "Online store" > "Themes".
  3. Find the theme you want to modify and select "Actions" > "Edit code".
  4. Click on "theme.liquid" (layout file).
  5. Find the <head> section and add the following code:

{% if handle contains 'page-handle-you-want-to-exclude' %}
<meta name="robots" content="noindex">
{% endif %}

To exclude a specific page, you must replace 'page-handle-you-want-to-exclude' with the handle of your page. What is the handle of a page? For example, if your page URL is ejemplo.com/pagina/archivo-shopify-robots-txt, the page handle will be "archivo-shopify-robots-txt".
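
With that example handle, the finished snippet would look like this:

{% if handle contains 'archivo-shopify-robots-txt' %}
<meta name="robots" content="noindex">
{% endif %}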

How to exclude more than one page?

If you think that to exclude more than one page in your Shopify robots.txt file you need to copy the code above as many times as there are pages you want to hide, you are wrong: you can also do it as follows, which is the most correct and optimal way.

When more than one page needs to be excluded, paste the following fragment instead:

{% if handle contains 'page-handle-you-want-to-exclude' %}
<meta name="robots" content="noindex">
{% elsif handle contains 'your-page-handle-2' %}
<meta name="robots" content="noindex">
{% elsif handle contains 'your-page-handle-3' %}
<meta name="robots" content="noindex">
{% endif %}

In that case, replace "your-page-handle-2", "your-page-handle-3" and so on with the handle of each URL that you want to hide.
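
For instance, with two hypothetical handles such as 'thank-you-page' and 'internal-promo' (placeholder names, not real pages of your store), the block would read:

{% if handle contains 'thank-you-page' %}
<meta name="robots" content="noindex">
{% elsif handle contains 'internal-promo' %}
<meta name="robots" content="noindex">
{% endif %}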

How to hide the search template?

To exclude the search template, you must enter the following code:

{% if template contains 'search' %}
<meta name="robots" content="noindex">
{% endif %}

Now you only need to save the changes and that's it. Be very careful when performing these actions and try not to modify the other sections! If you have followed this guide step by step you shouldn't have any problems, but if you have any questions, ask an e-commerce agency.
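
As a recap, here is a minimal sketch of how the snippets sit inside the <head> of theme.liquid (the handle is a placeholder; the rest of your <head> stays untouched):

<head>
  <!-- existing head content -->
  {% if handle contains 'page-handle-you-want-to-exclude' %}
  <meta name="robots" content="noindex">
  {% endif %}
  {% if template contains 'search' %}
  <meta name="robots" content="noindex">
  {% endif %}
  <!-- rest of the existing head content -->
</head>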

Fast and easy, right? Now you are an expert on the robots.txt file and how it can benefit the positioning of your e-commerce.


