Search engines index your online store, which is how your customers are able to find it. The Robots.txt file is used to prevent certain pages of your online store from being indexed and displayed in search results.
Creating a Robots.txt file
Navigate to GENERAL > Settings in the left menu of the back office. You will find the Robots.txt option below the Other caption. To create the Robots.txt file, first activate its status. An input field will then open, where you can enter its rules.
User-agent: Specifies which robots (crawlers) the rules that follow apply to. For example, you can allow the page in question to be indexed by search engines such as Yahoo! or Bing, but not by Google. You can even target one specific crawler of a single search engine. If you want a block of rules to apply to all search engines, use an asterisk (*) in the User-agent field.
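For instance, a minimal sketch of per-crawler rules (the /private/ path is a hypothetical example, not a real page of your store):

```
# Rules for Google's crawler only: keep /private/ out of Google's index
User-agent: Googlebot
Disallow: /private/

# Rules for all other crawlers: an empty Disallow blocks nothing
User-agent: *
Disallow:
```

Crawlers read the group whose User-agent line matches them best, so Googlebot follows the first group while all other robots follow the second.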
Disallow: Enter the URL you wish to block here. You do not have to specify the entire URL; the path after the domain is enough. For example, if you wish to exclude http://domeinnaam.com/category1/ from being indexed, you only have to enter /category1/. There is a difference, however, between including and omitting a slash (/) at the end of a rule:
Disallow: /category1 - This blocks every URL that starts with /category1: the folder, all files and sub-folders in it, but also similar paths such as /category1.html.
Disallow: /category1/ - This blocks only the URLs inside the /category1/ folder.
The best way to check this is to look at the URL of the page in question: if it ends in a slash, be sure to include the slash in your Disallow rule.
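A short sketch illustrating the difference (the /category1 paths are hypothetical examples):

```
User-agent: *
# Prefix match: blocks /category1, everything in /category1/,
# and also /category1.html and /category10/
Disallow: /category1

# Folder match: blocks only URLs inside the /category1/ folder
# Disallow: /category1/
```

The second rule is shown commented out because the first, broader rule already covers everything it would match; use one or the other depending on how much you want to block.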
Sitemap - If desired, you can also specify the location of your sitemap in the Robots.txt file. In this case, however, you must enter its URL in full.
Disallow: /product1.html
Sitemap: http://www.domeinnaam.nl/sitemap.xml
User-Agent: Googlebot
Disallow: /product1.html
Sitemap: http://www.domainname.nl/sitemap.xml
For more information on the Robots.txt file, please check its Google support page.