Robots.txt: What is it and how to create it?

If you are here, you have probably heard about the “robots.txt” file as part of your SEO work and are looking for explanations. Do not worry, it’s not that complicated!

What Is Robots.txt?

This is simply a plain text (“.txt”) file named “robots.txt”. It is used in search engine optimization, specifically to tell crawlers not to visit certain parts of your site.
In short, search engine robots crawl your pages regularly to add your content to their index, and with this file you can tell them directly which areas to skip. Well-behaved crawlers always consult the “robots.txt” file before acting.
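To make this concrete, here is a minimal sketch of how a well-behaved crawler checks the file before fetching a page, using Python’s standard-library robots.txt parser. The rules, bot name, and URLs below are made-up examples, not from a real site:

```python
# Sketch: how a polite crawler consults robots.txt before fetching a page.
import urllib.robotparser

parser = urllib.robotparser.RobotFileParser()
# parse() takes the file's lines; a real crawler would first download
# https://example.com/robots.txt with set_url() and read().
parser.parse([
    "User-agent: *",
    "Disallow: /private/",
])

# Any path under /private/ is off-limits; everything else is allowed.
print(parser.can_fetch("MyBot", "https://example.com/private/data.html"))  # False
print(parser.can_fetch("MyBot", "https://example.com/index.html"))         # True
```

A real crawler runs this check for every URL it is about to request and simply skips the disallowed ones.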
Let’s now see how to use it on your website or blog.


How to create a robots.txt file?

Nothing complicated this time around: create a plain text document and name it “robots.txt” (all lowercase). Once it is ready, upload the file to the root of your site.
If you want to allow all robots to access the whole site, either do not add the file at all, or type only this:

User-agent: *
Disallow:

If you want to deny access to all robots:

User-agent: *
Disallow: /

If you want to deny all robots access to the forum, or to another URL … while only allowing Google:

User-agent: *
Disallow: /forum

User-agent: Googlebot
Disallow:

Etc., etc. As you can see, this file gives you great power and flexibility in choosing which content gets crawled or not. Be careful, however: it is not a security barrier against robots, since some simply ignore the file.
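If you want to sanity-check rules like the forum example above before uploading them, you can feed them to Python’s standard-library parser and see how different crawlers are treated. The rules here spell out the example fully, with an explicit empty Disallow for Googlebot; the bot names and URLs are illustrative:

```python
# Sanity-check robots.txt rules locally before uploading them.
import urllib.robotparser

rules = [
    "User-agent: *",
    "Disallow: /forum",
    "",
    "User-agent: Googlebot",
    "Disallow:",
]

parser = urllib.robotparser.RobotFileParser()
parser.parse(rules)

# Googlebot matches its own group, so the forum stays open to it;
# every other crawler falls under the * group and is blocked.
print(parser.can_fetch("Googlebot", "https://example.com/forum/topic1"))     # True
print(parser.can_fetch("SomeOtherBot", "https://example.com/forum/topic1"))  # False
```

This kind of quick local test catches typos and ordering mistakes that are otherwise easy to miss in a live robots.txt file.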


