Creating a robots.txt file

Your online store is indexed by search engines, which makes it possible for customers to find it. You can use a robots.txt file to prevent certain pages from being indexed and displayed in search results.

Lightspeed does not provide technical support for creating your robots.txt file. We recommend that you edit the robots.txt file only if you have the required expertise.

Creating a robots.txt file

  1. On the left menu of the Back Office, click Settings, then under Website Settings choose Web Extras.
  2. In the Robots.txt section, activate its Status.
  3. In the Robot field, enter your robots.txt rules.

Entering code

User-agent: Indicates which robots the rules that follow apply to. For example, you can choose to have a page indexed by search engines such as Yahoo! or Bing, but not by Google. You can even target a specific crawler within one search engine. If you want the rule set to apply to all search engines, use an asterisk (*) in the User-agent field.

Disallow: Enter the URL you want to block. You do not have to specify the entire URL, just the path. For example, to exclude http://domeinnaam.com/category1/ from being indexed, you only have to enter /category1/. Note, however, that omitting the slash sign (/) at the end of a rule changes its meaning:

Disallow: /category1 - applies to every URL that begins with /category1, such as /category1.html and /category1-sale/.
Disallow: /category1/ - applies only to the /category1/ folder, including all files and sub-folders in it.

The best way to check this is by looking at the URL of the page: if it has a slash sign at the end, include it in your Disallow rule.

Check your rules thoroughly for proper use of slash signs and for stray empty lines in between.
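If you have Python available, you can check how a crawler interprets your Disallow rules locally with the standard urllib.robotparser module. This is only a verification sketch; example.com and the paths below are placeholders:

```python
from urllib.robotparser import RobotFileParser

# Two rule sets that differ only in the trailing slash.
with_slash = ["User-agent: *", "Disallow: /category1/"]
without_slash = ["User-agent: *", "Disallow: /category1"]

rp = RobotFileParser()
rp.parse(with_slash)
# Blocked: the URL is inside the /category1/ folder.
print(rp.can_fetch("*", "http://example.com/category1/page.html"))  # False
# Allowed: it only shares the /category1 prefix, it is not in the folder.
print(rp.can_fetch("*", "http://example.com/category1-sale.html"))  # True

rp = RobotFileParser()
rp.parse(without_slash)
# Without the trailing slash, any URL starting with /category1 is blocked.
print(rp.can_fetch("*", "http://example.com/category1-sale.html"))  # False
```

Paste your own rules into the list and test the exact URLs of your store's pages before publishing the file.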

Sitemap: You can indicate the location of your sitemap in the robots.txt file. In this case you have to enter the full URL.

Code example:

User-agent: *
Disallow: /categorie1/subcategorie1/

User-agent: Googlebot
Disallow: /product1.html

Sitemap: http://www.domeinnaam.nl/sitemap.xml

For more information on the robots.txt file, see the Google support page.

A "Crawl-delay: 2" notification is not serious and does not affect the operation of your online store. Crawl-delay limits how often search engine bots visit your online store, which prevents the server overload that could otherwise slow your store down.
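For reference, a Crawl-delay directive is written like any other rule under a User-agent line; the two-second value below is only an example, and not every crawler honors this directive:

```
User-agent: *
Crawl-delay: 2
```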