Optimizing Search Engine Coverage of a WordPress Blog

To optimize search engine coverage, a robots.txt file placed in the document root of the web server can exclude selected paths of the website from being crawled by search engines such as Google.
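For example, a minimal robots.txt might contain rules like the following (close to what WordPress emits by default; the paths are illustrative):

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php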

WordPress can generate this file dynamically. When the site is served by Nginx, a rule has to be added to the Nginx configuration so that requests for /robots.txt are rewritten internally to the WordPress index.php front controller.

# Generate robots.txt with WordPress: serve a static file if one
# exists, otherwise pass the request to index.php
location = /robots.txt {
    try_files     $uri $uri/ /index.php?$args;
    access_log    off;   # do not log requests for robots.txt
    log_not_found off;   # suppress "not found" log entries
}
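With this rule in place, and assuming the server block already contains the usual location that passes .php requests to PHP-FPM, the result can be checked with a request such as the following (example.com is a placeholder):

curl -i https://example.com/robots.txt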

WordPress now answers requests for robots.txt and builds its content dynamically, so plugins and themes can edit the output on the fly.
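As a minimal sketch of such an edit, a plugin or theme can hook into the robots_txt filter that WordPress applies before sending the file; the excluded path /internal-reports/ is a hypothetical example:

<?php
// Append a rule to the dynamically generated robots.txt.
add_filter( 'robots_txt', function ( $output, $is_public ) {
    // Only add crawler rules when the site is publicly visible.
    if ( $is_public ) {
        $output .= "Disallow: /internal-reports/\n";
    }
    return $output;
}, 10, 2 );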