Your online store is indexed by search engines, which makes it possible for customers to find it. You can use a robots.txt file to prevent certain pages from being indexed and displayed in search results.
Creating a robots.txt file
- In the left menu of the Back Office, click Settings, then under Website Settings choose Web Extras.
- In the Robots.txt section, set Status to active.
- In the Robot field, enter the contents of your robots.txt file, using the directives described below.
User-agent: Indicates which robots the rules following it apply to. For example, you can choose to have a page indexed by search engines such as Yahoo! or Bing, but not by Google. You can even target a specific crawler of a single search engine. If you want the rules to apply to all search engines, use an asterisk (*) in the User-agent field.
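To see how a User-agent group works in practice, here is a small sketch using Python's standard urllib.robotparser module (the domain example.com and the path /category1/ are placeholders, not part of your store):

```python
from urllib import robotparser

# A robots.txt that blocks only Google's crawler from /category1/,
# while the * group allows all other robots to crawl everything.
rules = [
    "User-agent: Googlebot",
    "Disallow: /category1/",
    "",
    "User-agent: *",
    "Disallow:",  # an empty Disallow means "allow everything"
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# Googlebot is blocked, any other robot is not:
print(rp.can_fetch("Googlebot", "http://example.com/category1/page.html"))  # False
print(rp.can_fetch("Bingbot", "http://example.com/category1/page.html"))    # True
```

This illustrates that a crawler follows the most specific User-agent group that matches its name, and falls back to the * group otherwise.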
Disallow: Enter the URL you want to block. You do not have to specify the entire URL, just the path. For example, if you want to exclude http://domeinnaam.com/category1/ from being indexed, you only have to enter /category1/. Note, however, the difference a trailing slash (/) makes at the end of a rule. Disallow: /category1 blocks every URL that starts with /category1, including pages such as /category1-sale.html. Disallow: /category1/ blocks only the folder /category1/, including all files and sub-folders in it. The best way to check which form you need is to look at the URL of the page: if it ends with a slash, include the slash in your Disallow rule.
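The trailing-slash difference can be checked with Python's standard urllib.robotparser module; the following sketch uses the placeholder domain example.com and hypothetical paths:

```python
from urllib import robotparser

def blocked(disallow_rule, path):
    """Return True if `path` is blocked for all robots by one Disallow rule."""
    rp = robotparser.RobotFileParser()
    rp.parse(["User-agent: *", f"Disallow: {disallow_rule}"])
    return not rp.can_fetch("*", f"http://example.com{path}")

# Without the trailing slash, the rule matches every URL that
# merely starts with /category1 ...
print(blocked("/category1", "/category1-sale.html"))   # True
# ... while with the trailing slash, only the folder itself matches:
print(blocked("/category1/", "/category1-sale.html"))  # False
print(blocked("/category1/", "/category1/item.html"))  # True
```

If in doubt, test both variants of your rule against a few real URLs from your store before saving the file.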
Sitemap: You can indicate the location of your sitemap in the robots.txt file. In this case you must enter the full URL.
For example, to block a single product page for all robots and point crawlers to your sitemap:

User-agent: *
Disallow: /product1.html
Sitemap: http://www.domeinnaam.nl/sitemap.xml
To apply the same rule to Googlebot only:

User-Agent: Googlebot
Disallow: /product1.html
Sitemap: http://www.domainname.nl/sitemap.xml
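As a sanity check, the Googlebot example above can be parsed with Python's standard urllib.robotparser (the domain comes from the example; `site_maps()` requires Python 3.8 or later):

```python
from urllib import robotparser

# The Googlebot-only example from above.
rules = [
    "User-Agent: Googlebot",
    "Disallow: /product1.html",
    "Sitemap: http://www.domainname.nl/sitemap.xml",
]

rp = robotparser.RobotFileParser()
rp.parse(rules)

# The rule applies to Googlebot but not to other crawlers:
print(rp.can_fetch("Googlebot", "http://www.domainname.nl/product1.html"))  # False
print(rp.can_fetch("Bingbot", "http://www.domainname.nl/product1.html"))    # True
# The sitemap URL is picked up regardless of the User-agent group:
print(rp.site_maps())  # ['http://www.domainname.nl/sitemap.xml']
```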
For more information on the robots.txt file, see the Google support page.