Generate robots.txt

The robots.txt file tells search engines which pages you do or do not want them to index. It lets you specify that a particular URL, folder or file should not appear in search results.
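To see how a crawler interprets these rules, you can use Python's standard-library robots.txt parser. This is a minimal sketch; the rules and the example.com URLs are illustrative, not taken from any real site.

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt content: block everyone from /private/.
rules = """
User-Agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "https://example.com/private/page.html"))  # False
print(rp.can_fetch("*", "https://example.com/index.html"))         # True
```

A well-behaved crawler performs the same check before fetching each URL.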


Examples of robots.txt

Allow all pages for all user-agents

User-Agent: *
Disallow:

Deny all pages for all user-agents

User-Agent: *
Disallow: /

Deny a specific page /test.html for all user-agents

User-Agent: *
Disallow: /test.html

Deny three specific folders /css/, /img/ and /js/ for all user-agents

User-Agent: *
Disallow: /css/
Disallow: /img/
Disallow: /js/

Deny a specific page /test.html for GoogleBot only, and allow all pages for all other user-agents

User-Agent: GoogleBot
Disallow: /test.html
User-Agent: *
Disallow:

Be careful that every line has content; otherwise it may cause unwanted issues with some search engine bots.
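Per-bot groups like the one above can also be checked with the standard-library parser. The rules below are illustrative (hypothetical site and bot names): GoogleBot is blocked from /test.html while every other user-agent is allowed everywhere.

```python
from urllib.robotparser import RobotFileParser

# Illustrative rules: block only GoogleBot from /test.html,
# allow everything for everyone else.
rules = """
User-Agent: GoogleBot
Disallow: /test.html

User-Agent: *
Disallow:
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("GoogleBot", "https://example.com/test.html"))  # False
print(rp.can_fetch("OtherBot", "https://example.com/test.html"))   # True
print(rp.can_fetch("GoogleBot", "https://example.com/other.html")) # True
```

Note that each crawler uses only the most specific group that matches its name; GoogleBot ignores the `User-Agent: *` group entirely.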

