robots.txt SEO for your WordPress Blog


Did you know that with robots.txt you can control which posts or pages of your site search engines crawl? And if you're worried about duplicate content on your site, that's not a problem either, folks: robots.txt also helps you avoid duplicate content by keeping crawlers out of directories like tag, archive and category, which can all produce duplicate versions of your posts.
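
For example, a few directives like these keep the tag, category and date archives out of the crawl. This is only a minimal sketch, assuming the default WordPress permalink bases; adjust the paths to match your own permalink settings.

User-agent: *
# tag and category archives
Disallow: /tag/
Disallow: /category/
# date-based archives, one line per year
Disallow: /2009/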

A well-tuned robots.txt can help your site rank better in most search engines and can even improve earnings from more relevant ads.

robots.txt

Copy and paste the snippet below into the robots.txt file in the root of your site.


# Most search engines now support auto-discovery of your sitemap.xml via robots.txt
Sitemap: http://www.example.com/sitemap.xml

# Crawls pages for the image index
# Google Image
User-agent: Googlebot-Image
Disallow:
Allow: /*

# Crawls website pages to determine AdSense content
# Google AdSense
User-agent: Mediapartners-Google
Disallow:

# digg mirror
User-agent: duggmirror
Disallow: /

# global
User-agent: *
Disallow: /cgi-bin/
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-content/cache/
Disallow: /wp-content/themes/
Disallow: /trackback/
Disallow: /feed/
Disallow: /comments/
Disallow: /category/*/*
Disallow: */trackback/
Disallow: */feed/
Disallow: */comments/
Disallow: /*?
Allow: /wp-content/uploads/
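
If you would rather not maintain a physical robots.txt file, newer WordPress versions let you filter the virtual robots.txt that WordPress serves itself. Here is a minimal sketch for your theme's functions.php, assuming no physical robots.txt exists in your web root (WordPress only serves the virtual file in that case); the function name my_robots_txt is just an example.

<?php
// Append extra rules to the virtual robots.txt generated by WordPress.
// Only used when no physical robots.txt file exists in the web root.
function my_robots_txt( $output, $public ) {
  if ( '1' == $public ) { // only when the blog is visible to search engines
    $output .= "Disallow: /trackback/\n";
    $output .= "Disallow: /feed/\n";
    $output .= "Sitemap: http://www.example.com/sitemap.xml\n";
  }
  return $output;
}
add_filter( 'robots_txt', 'my_robots_txt', 10, 2 );
?>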

header.php trick

Add this snippet to the header.php file of your current theme. It uses simple conditional statements to decide which robots meta tag to output on each type of page. The conditions are chained with elseif so that a category page, which is also an archive, only outputs one robots meta tag.


<?php // for home, single posts, pages and category archives ?>
<?php if ( is_single() || is_page() || is_category() || is_home() ) { ?>
  <meta name="robots" content="all,noodp" />

<?php // for other archive pages (date, tag, author) ?>
<?php } elseif ( is_archive() ) { ?>
  <meta name="robots" content="noarchive,noodp" />

<?php // for search results and 404 pages ?>
<?php } elseif ( is_search() || is_404() ) { ?>
  <meta name="robots" content="noindex,noarchive" />
<?php } ?>
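
Alternatively, if you prefer to keep header.php untouched, roughly the same logic can be hooked into wp_head from your theme's functions.php. This is only a sketch; the function name my_robots_meta is just an example.

<?php
// Print the appropriate robots meta tag in the <head> section
// without editing header.php directly.
function my_robots_meta() {
  if ( is_single() || is_page() || is_category() || is_home() ) {
    echo '<meta name="robots" content="all,noodp" />' . "\n";
  } elseif ( is_archive() ) {
    echo '<meta name="robots" content="noarchive,noodp" />' . "\n";
  } elseif ( is_search() || is_404() ) {
    echo '<meta name="robots" content="noindex,noarchive" />' . "\n";
  }
}
add_action( 'wp_head', 'my_robots_meta' );
?>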

Don’t want to get your hands dirty?

Don’t worry, folks. As always, I’ll point you to a plugin from another developer that handles it for you 🙂
KB Robots.txt
