How to configure Robots.txt for WordPress

Creating a robots.txt file is something many bloggers overlook, yet it is important: this file tells search engine crawlers which pages to index and which to skip.

In this tutorial you will learn how to create an optimal robots.txt configuration for a WordPress blog.

Why should I disallow indexing some pages?

Search engines are one of the easiest ways to find vulnerable sites. If all of your WordPress files are indexed, attackers can use *dorks to search for vulnerable sites, and they can also see which plugins you have installed and their versions. An out-of-date plugin may have known vulnerabilities, which makes your site an easy target. Disallowing search engines from indexing your private folders makes it harder for attackers to find weaknesses in your website. Of course, this is just one reason; there are others.

How to configure Robots.txt

Step 1: Create a plain text file called robots.txt (the name must be lowercase).

Step 2: Now open it and add the following content:


User-agent: *
Disallow: /wp-admin
Disallow: /wp-includes
Disallow: /wp-content
Disallow: /feed/
Disallow: /trackback/
Disallow: /wp-
Disallow: /*.css$
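Before uploading, you can sanity-check rules like these locally with Python's built-in urllib.robotparser. A minimal sketch (note that robotparser matches Disallow values as simple path prefixes and does not understand Google-style wildcards such as /*.css$, so that line is left out here; a User-agent: * line is included because rules only apply to crawlers listed under a User-agent):

```python
# Minimal sketch: check robots.txt rules locally before uploading,
# using Python's built-in parser.
from urllib.robotparser import RobotFileParser

rules = [
    "User-agent: *",
    "Disallow: /wp-admin",
    "Disallow: /wp-includes",
    "Disallow: /wp-content",
    "Disallow: /feed/",
    "Disallow: /trackback/",
    "Disallow: /wp-",
]

parser = RobotFileParser()
parser.parse(rules)

# A crawler obeying these rules may fetch posts but not WordPress files.
print(parser.can_fetch("*", "/wp-admin/plugins.php"))  # False
print(parser.can_fetch("*", "/my-first-post/"))        # True
```

If a path you expected to be blocked comes back as fetchable, fix the rule before uploading the file.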


Whether to add the following lines is up to you; they are optional:

Disallow: /archives/
Disallow: /category/
Disallow: /date/
Disallow: /comments/

Why add these? Because category and archive pages can show up in search results and send the visitor to a listing instead of the post itself. Those pages contain only part of each post, not the full text, so the visitor has to click through to the post again.
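The effect of the optional lines can be checked the same way: with them in place, a rule-abiding crawler skips listing pages while individual posts stay crawlable. A quick sketch using Python's built-in urllib.robotparser (the /category/tutorials/ and post paths are hypothetical examples):

```python
# Sketch: with the optional lines added, archive/category pages are
# skipped while individual posts remain crawlable.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.parse([
    "User-agent: *",
    "Disallow: /archives/",
    "Disallow: /category/",
    "Disallow: /date/",
    "Disallow: /comments/",
])

print(parser.can_fetch("*", "/category/tutorials/"))  # False
print(parser.can_fetch("*", "/my-first-post/"))       # True
```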

Step 3: Save the file and upload it to the root directory of your website.

The root directory is usually the directory where you have installed WordPress. Note that crawlers only look for robots.txt at the root of your domain (for example, example.com/robots.txt), so if WordPress lives in a subdirectory, the file still belongs at the domain root.
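The "domain root only" rule can be sketched with Python's standard urllib.parse: whatever page a crawler is on, it derives the robots.txt location from the scheme and host alone (example.com and the post path below are placeholders):

```python
# Sketch: crawlers look for robots.txt at the host root, regardless of
# which subdirectory WordPress is installed in.
from urllib.parse import urlsplit, urlunsplit

def robots_url(site_url: str) -> str:
    """Return the URL where crawlers will look for robots.txt."""
    parts = urlsplit(site_url)
    # Keep only scheme + host, force the path to /robots.txt.
    return urlunsplit((parts.scheme, parts.netloc, "/robots.txt", "", ""))

print(robots_url("https://example.com/blog/my-post/"))
# https://example.com/robots.txt
```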


You’ve just created a robots.txt file for search engines. You could also generate robots.txt automatically with a WordPress plugin, but there is no need to complicate things with a plugin for such a simple file.

*dorks: dorks are search terms that help you find exactly the type of content you want. For example, to find only WordPress pages on Yahoo you could search for: site:yahoo.com wordpress
