Optimizing Search Engine Coverage of a WordPress Blog

To optimize search engine coverage, a robots.txt file placed in the document root of the web server can exclude parts of the website from being indexed by crawlers such as Googlebot.

WordPress can generate this file automatically. When Nginx is used as the web server, a rule has to be added to the Nginx configuration so that requests for robots.txt are passed on to WordPress's index.php. Note that this only works if no physical robots.txt file exists in the document root; otherwise try_files serves that file directly.

# generate robots.txt with wordpress
location = /robots.txt {
    try_files     $uri $uri/ /index.php?$args;
    access_log    off;
    log_not_found off;
}

Now WordPress generates the robots.txt file dynamically, and its contents can be modified by plugins and themes.
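For example, a plugin or a theme's functions.php can hook into WordPress's `robots_txt` filter to append rules to the dynamically generated output. A minimal sketch, assuming you want to block a hypothetical `/private/` path (the path is a placeholder, not part of the original setup):

```php
<?php
// Append a rule to the robots.txt output that WordPress generates.
// The 'robots_txt' filter receives the current output and the site's
// "public" visibility setting (Settings > Reading).
add_filter( 'robots_txt', function ( $output, $public ) {
    if ( $public ) {
        // Hypothetical path; replace with whatever you want to exclude.
        $output .= "Disallow: /private/\n";
    }
    return $output;
}, 10, 2 );
```

Requesting `/robots.txt` from the site afterwards should show the extra `Disallow` line appended to the default rules.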