
How to add items to robots.txt with WordPress

One thing that I love about my job is that most days I learn something new. Today I learned that WordPress auto-generates its own robots.txt file, and that there is a filter you can hook into to add your own rules to the file.

Working with one of our partners (we often provide business-to-business (B2B) support at Make Do), I asked for server access to add a quick rule to the robots.txt file to prevent PDFs from being indexed on a site we were working on. The client informed me that WordPress generates its own robots.txt, and that there is a hook that lets you add your own rules.

I quickly took to Google and found the robots_txt filter. Armed with this newfound information, I hand-crafted this handy little snippet:

/**
 * Filter function used to disallow PDFs via robots.txt.
 *
 * @param    string  $output     The contents of the robots.txt file.
 * @param    string  $public     Whether the site is public ('1' when public).
 * @return   string              The filtered robots.txt contents.
 */
function disallow_pdf_index( $output, $public ) {

    // Only append the rule when the site is set to be visible to search engines.
    if ( '1' === $public ) {
        $output .= "Disallow: *.pdf\n";
    }
    return $output;
}
add_filter( 'robots_txt', 'disallow_pdf_index', 0, 2 );
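
For a public site, the generated robots.txt should then end with the new rule. The exact default rules above it depend on your WordPress version, so take this as a rough illustration rather than exact output:

User-agent: *
Disallow: /wp-admin/
Disallow: *.pdf

Note that WordPress only serves this generated file when no physical robots.txt exists in the site root; if one is present, the server will usually serve that file directly and the filter will have no effect.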

So there you go: a short but handy little blog post detailing how you too can add entries to your robots.txt file using WordPress.
