WordPress robots.txt tips against duplicate content

by Jeremy Schoemaker on March 3, 2008 · 106 comments

I've been getting some questions about my robots.txt file and what certain things in it do.

Thankfully, a few regular-expression-style wildcards are supported in robots.txt (but not many).

In regex terms, $ means the end of the string – here, the end of the URL. So if you put .php$ in your robots.txt, it will match anything that ends in .php.

This is really handy when you want to block all .exe, .php, or other file types. For example:

Disallow: /*.pdf$
Disallow: /*.jpeg$
Disallow: /*.exe$
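
If you want to sanity-check how these wildcard patterns behave before relying on them, here is a minimal Python sketch (my own illustration, not something from the actual robots.txt) that mimics Googlebot-style matching, where * matches any run of characters and $ anchors the end of the URL. It ignores Allow rules and rule precedence, so treat it as a rough checker only.

import re

def robots_pattern_to_regex(pattern):
    # Escape the pattern, then translate the two supported wildcards:
    # '*' matches any sequence of characters, '$' anchors the end of the URL.
    escaped = re.escape(pattern)
    return re.compile("^" + escaped.replace(r"\*", ".*").replace(r"\$", "$"))

def is_blocked(path, disallow_patterns):
    # A path is blocked if any Disallow pattern matches from its beginning.
    return any(robots_pattern_to_regex(p).match(path) for p in disallow_patterns)

rules = ["/*.pdf$", "/*.jpeg$", "/*.exe$"]
print(is_blocked("/files/report.pdf", rules))      # True: ends in .pdf
print(is_blocked("/files/report.pdf?x=1", rules))  # False: $ forbids anything after .pdf
print(is_blocked("/about.html", rules))            # False: no rule matches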

Specifically, here are some of the things I use in my robots.txt:

Disallow: /*? – this blocks all URLs with a ? in them. A good way to avoid duplicate content issues with WordPress blogs. Obviously you only want to use this if you have changed your URL structure so it isn't still 100% query-string based (the default ?p= style).

Disallow: /*.php$ – this blocks all .php files. Another good way to avoid duplicate content with a WordPress blog.

Disallow: /*.inc$ – you should not be exposing .inc or other include files to bots (Google Code Search will eat you alive).

Disallow: /*.css$ – why would you want CSS files indexed? Seems silly. The wildcard is used here in case your CSS files live in more than one place.

Disallow: */feed/ – having feeds indexed dilutes your site equity. The leading wildcard * is used in case there are preceding characters (feed URLs hang off posts and categories too).

Disallow: */trackback/ – no reason a trackback URL should be indexed. Again, the leading wildcard * covers any preceding characters.

Disallow: /page/ – assloads of duplicate content in WordPress's paged archives.

Disallow: /tag/ – more duplicate content.

Disallow: /category/ – even more duplicate content.

So what if you want to ALLOW a page? For instance, my SERPs tool is serps.php, and under the above rules it would be blocked.

Allow: /serps.php – this does the trick! (Google treats the more specific Allow as overriding the broader Disallow.)
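
Put together, the rules above would look something like this as a complete robots.txt (the User-agent: * line is my assumption here, since it isn't shown above; it simply applies the rules to all crawlers):

User-agent: *
Disallow: /*?
Disallow: /*.php$
Disallow: /*.inc$
Disallow: /*.css$
Disallow: */feed/
Disallow: */trackback/
Disallow: /page/
Disallow: /tag/
Disallow: /category/
Allow: /serps.php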

Keep in mind I am not an SEO, but I have picked up a few tricks along the way.


About the author...

Jeremy "ShoeMoney" Schoemaker – who has written 2895 posts on ShoeMoney.com – is the founder & CEO of the ShoeMoney Blog, the Elite Retreat Internet Conference, & the PAR Program. In 2013 Jeremy released his #1 Amazon best-selling autobiography, "Nothing's Changed But My Change" – The ShoeMoney Story. Jeremy currently lives in Lincoln, Nebraska, with his wife and two daughters.



{ 83 comments }

1 bob

I never mess around with this stuff, but does duplicate content reduce how well your site ranks overall?

2 Kamal Hasa

This is an old article but the same rules still apply even now in 2010. The fact is that over time Google has gotten better at dealing with duplicate content.

So having a good robots.txt is still worthwhile if you want to block something from your blog or site.

3 Keith Cash

Some really good info, if you are not an SEO you are pretty darn close

4 bob c

I’m about to implement this (how does it look?):
User-agent: *
Disallow: /cgi-bin
Disallow: /wp-admin
Disallow: /wp-includes
Disallow: /wp-content/plugins
Disallow: /wp-content/cache
Disallow: /wp-content/themes
Disallow: /trackback
Disallow: /comments
Disallow: /category/*/*
Disallow: */trackback
Disallow: */comments
Disallow: /*?*
Disallow: /*?
Allow: /wp-content/uploads

5 ShoeMoney

Bob, not sure why you need the extra /*/* after category. Just /category/ should get that and all subdirectories of category.

6 Ian

Thanks for the tips shoe. A lot of people don’t realize how much duplicate content on your site can really hurt you.

7 bob c

Thanks, I copied that from another blog so I’ll fix that.

8 RacerX

Big Help! Thanks. This should help a bunch.

9 RacerX

I am not an SEO, but I play one on the internet…

If Shoe isn’t an expert…he is the closest thing that will talk to us!

10 Arejay

Very nice post! We all know how many sites leave this simple step out (like the ebook sales people – you do a simple site: search and you find the members download area). You put it out plain and simple!!! Don’t you find it funny how non-SEO people like you make more money than the SEO people? LOL. Have a fantastic week Shoe and everyone else! Make that $$$$$

11 brad

thx for the great tips for the robots.txt and wordpress blogs

12 Michelle

Thanks for the excellent tips Shoe. One of my blogs had been performing amazingly until Google decided to hate it last week. These tips are just what I need to try and work out if it’s a duplicate content issue..

13 TheMadHat

I disagree with this assessment on some level. Sure, you don’t want duplicate content and it will negatively impact your site, but using the robots.txt file to fix the problem wouldn’t be my way to go.

The robots file tells Google not to even crawl the page. A better scenario would be to use the meta noindex and follow. This tells Google not to index the page, but it can and will still accumulate link juice to pass it on (unless this page is a dead end, then it’s pointless).

See this interview with Matt from a few months ago for a little more in-depth conversation.

14 Unpublished Guy

I use the robots.txt file because I use a CMS with URL rewriting. I can’t (I don’t think) use meta tags because I have the appearance of duplicate content – not actual duplicate content. For example, the same page might appear under the URL ./Default.aspx?tabid=1 or ./tabid/1/Default.aspx, depending on how the page is accessed. If I add meta tags, then none of the pages will get indexed.

15 Solo Programmer

I have the all-in-one seo pack which applies noindex, nofollow meta tags on the actual archive/category/tag pages. I wonder if this is still worth doing but I guess it can’t hurt.

16 Hustle Strategy

it can.

17 Mayank Rocks

Thanks a lot for the tips Jeremy

18 Mayank Rocks

I agree there totally with the above person.

19 Paid Surveys Reviewed

Thanks for that, really need to get to grips with this robots stuff, I am sure it helps with SEO although don’t quite understand how. :-)

20 Money Blog

thanks, very helpful

21 Exposed SEO

lol at all the spammy comments. “I totally agree with everyone” lol

22 eMarketing Chat

This is very helpful! Thanks for sharing.

23 Guy

Disallow /category/ is a good one to add. Just make *extra* sure your Permalink structure isn’t set up to include “category” – otherwise nothing will be indexed.

To help reduce DC, I also recommend blocking the archives (just add a new line for each year your blog has been online)

# Block Duplicate Content from Archives
Disallow: /2006/
Disallow: /2007/
Disallow: /2008/

I also have this
Disallow: /*?*

instead of this;
Disallow: /*?

24 TheOfficeCubicle

Thanks Shoe! I appreciate all you have done.

:)

25 Guy

Blocking /category/ is a good one. Just need to be careful that your Permalink structure isn’t setup to include “category” — otherwise nothing will get indexed.

I also use the following to block the archives. Just add a new line for each year your blog has been online.

# Block Duplicate Content From Archives
Disallow: /2006/
Disallow: /2007/
Disallow: /2008/

One more is that I use;
Disallow: /*?*

instead of;
Disallow: /*?

26 Homefinding Book

Great tutorial – more of this please! No matter what you say, its pretty good SEO stuff.

27 Paul

Thank you for the tips.

28 Terry Tay

Excellent post Jeremy! Every single day I’m learning something new from you, it seems. Just the other day it was the link rel= and now today it’s the robots.txt file.

I’ve just read the basics about the robots.txt file and never really thought much more into it. It’s good we have people like you helping us out along the way.

Thanks!
~Terry

29 jtGraphic

Thanks for the tip. I guess I have the same question as someone above. How does duplicate content hurt your ranking? Is it a consequence of PR being spread across multiple pages – or is it just a case of being penalized for duplication? I’ll have to do more research. Thanks again.

30 Deibson Albernas

Yes, OK, thanks. I’ll use this on my 28 blogs made in Brazil.

31 anty

Interesting that the question mark doesn’t have to be escaped. Normally a question mark would be a regex metacharacter, but I just looked it up in the Google guidelines: a question mark is treated as a regular character.
An important note: not every crawler understands these regex-style patterns in robots.txt. So you are “protecting” your site against the major search engines, but not from run-of-the-mill bots. This is OK for avoiding duplicate content, I guess.

32 anty

I wonder if Google isn’t already good at detecting a WordPress installation and can therefore handle the duplicate content accordingly (like ignoring parts of the site, or indexing according to the schema a normal WP blog follows)… Just a thought :)

33 oakling

OMG. Will this keep spammers from doing that obnoxious thing where they copy a whole journal entry (or the majority of one) into their fake blogs, making it look like they are quoting it (“Someone said something great over at blahblahblah dot com, ‘entire post here,’”) with no other content? Just to get on google and steal my links? I’m sure they’re using robots at some stage….

34 ShoeMoney

well.. just had a conversation with mr cutts about this and many other things 3 days ago.

You are getting the Disallow and noindex tags confused in the robots.txt. Disallow will still let the bots visit and index them but not take in the content.

35 ShoeMoney

well it’s not really true regex… it’s just a loose adaptation

36 ShoeMoney

I doubt it’s going to keep spammers out ;)

37 TheMadHat

Agreed that disallow will allow the bots to visit but not take the content. Maybe I said this wrong.

Say for example you’ve got links coming into a page you’ve disallowed in robots.txt. This wastes any link juice that (linking) page is giving you. Using “meta noindex” will allow the bots to follow the links on the “meta noindexed” page and pass on the link juice, and also alleviate any dup issues.

So has he changed his stance on whether a “meta noindexed” page accumulates and passes PageRank? On a robots-disallowed page the bots won’t take the content, so there is nowhere to pass PageRank to.

The way I understand it is this:

meta noindex – don’t index but follow and pass pr
meta nofollow – index but don’t follow links or pass any pr on entire page
href nofollow – don’t pass pr on that link
robots disallow – don’t index or follow or pass pr (they can reference the url still, just without content there is nowhere to pass any link juice).
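
In concrete terms, those four options look roughly like this (the /private/ path and example.com link are just made-up placeholders):

<meta name="robots" content="noindex,follow">  – don’t index this page, but follow its links and pass PR
<meta name="robots" content="nofollow">  – the page can be indexed, but none of its links are followed
<a href="http://example.com/" rel="nofollow">example</a>  – don’t pass PR through this single link
Disallow: /private/  – robots.txt: don’t crawl anything under /private/ at all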

38 Syed Balkhi

Great list of tips shoe … I can bet this helps a lot.

39 Syed Balkhi

yeah nothing keeps them out

40 Gary R. Hess

Matt Cutts says it does.

41 Gary R. Hess

For smaller blogs this might not be the best thing to do when it comes to SEO. If implementing everything this way, you are relying on Google to find older posts (if they don’t have links to them) by going directly through the homepage. Requiring Google to go back 20 pages to find an article is a good way to end up in the supplemental index (which, of course they claim doesn’t exist anymore, but IMO it does).

42 Affiliate Confession

Thanks for the list and explaining it. I need to add a robots.txt file to my blog.

43 Douglas Karr

Thanks for these tips – I hadn’t even thought of leveraging the robots file against duplicate content (much easier than disabling those features!). Thanks!

44 Tom Beaton

I shall have to take another look at my robots.txt!

45 David Harrison

Typo in the title? Or am I seeing things?

46 Squeaky

Thank you for posting these tips for WordPress on the robots.txt file.

47 Charlie

Bob,
Why do you need to block /comments/? I thought having comments indexed would be a good thing. This is new to me so any pointers would be great.

Thanks.

48 Uzair

That’s great. But don’t you think you are getting off topic?

49 Uzair

It does. Duplicate content ruins your site.

50 Uzair

You can also use
Disallow: /wp
instead of all those others like
Disallow: /wp-admin
Disallow: /wp-includes
Disallow: /wp-content/plugins
Disallow: /wp-content/cache
Disallow: /wp-content/themes

51 Affiliate Confession

I was wondering that as well. They can find and react to all sorts of things, I think they would know about WP installs and the issues it has.

52 Dexter | Techathand.net

This is only applicable if your permalink structure doesn’t include the year in it. Otherwise this will result in a mess.

53 Dexter | Techathand.net

Hello to all I just want to share my post regarding Robots.txt that really helps my site

54 Reynder (SEO)

Thanks! Very useful again. Avoiding duplicate content really helped me ranking well.

55 Too Much Vodka

Well, this seems to be a better option than all the noindex plugins going around. Def. will give it a try!

56 John

Very helpful post for me as I have been looking how to use the robots.txt file in this way for some time.

57 Nullamatix

I didn’t initially include a robots.txt on my blog and never had any issues with dupe content. It wasn’t until just recently that I decided to add one, more for experimental purposes. So far, search engine traffic hasn’t improved or declined either way. WordPress out of the box isn’t great for SEO purposes, but with minor tweaks, I find that a robots.txt isn’t really necessary.

-Guy
http://www.nullamatix.com

58 Nullamatix

Uzair,

How is this off topic? If Shoe thinks a robots.txt will help in SERPs, your site will get more traffic, and ultimately earn more cash. Isn’t that one of the focuses of this blog? “Skills to Pay the Bills” right?

-Guy
http://www.nullamatix.com

59 Nullamatix

Um, no. The only way to prevent those types of attacks would involve IP based content delivery.

60 RacerX

Do you have some before /after stats you can share? I understand the penalty, but just want to understand how it improves.

61 Yiwu

Ya,I dont use Disfollows..

62 Yiwu

Why can’t my post be displayed?

63 Too Much Vodka

I agree, disallowing category and page is not the smartest move if you want Google to find old content.

64 Andy Beard

Shoe is making an “SEO Linking Gotcha”

All the pages blocked with robots.txt will still gather juice and can still rank

Simple proof is that my Wordpress SEO Masterclass page is still ranking after being blocked by robots.txt for a couple of weeks (it was written as a paid post) – actually it is ranking higher than Joost’s similar page.

This article explains why so many people have got this wrong for years
http://andybeard.eu/2007/11/seo-linking-gotchas-even-the-pros-make.html

It gets worse when people start mixing this kind of advice with their “All in one SEO” because the noindex statements added don’t get seen by googlebot.

65 Downloading...

Thanks for this Jeremy. I have been looking for a good robots.txt file. I have no idea what to put in, so this will help.

66 Secrets Of Cash Gifting

That’s good that they added it, duplication is bad.

67 Erica DeWolf

Great post with some great descriptions of what these certain words will “do.” Thanks for the post!

68 HardGeek

wow!!! Never knew that..??

69 Chip

Great tips, I’ll enhance my robots.txt file ASAP

70 SEO hosting

Shoe, I just checked your actual robots.txt. Why do you have:

Disallow: /sitemap.xml

That seems like trouble?

71 No Regrets Cash Gifting

Very nice article, thanks!

72 Erken Rezervasyon

I think this article has been translated into other languages in the converter to use a car very enjoyable

73 Tatil

thank u this article and all comments

74 bursa

Ya,I dont use Disfollows..

75 bursa emlak

Some really good info, if you are not an SEO you are pretty darn close

76 Bogan Marketing

I also think the author, date, comment, and in some cases RSS archives should be noindexed if you want silo SEO. Even adding the .html extension is a good idea, and stripping out the top-level category.

There is a post about it at Wordpress Robots.txt for Silo SEO.

77 olay

Very helpful post for me as I have been looking how to use the robots.txt file in this way for some time.

78 istanbul otelleri

Hi, I hope it’s good for my writing assignments, if I get a note if I do not repeat here

79 kültür turları

I visited some of these blogs. Some are good but some were not interesting to me. But a good list.

80 Bursa Emlak

Nice trick, thanks for that, I’ll be using that in my sites to lop off spurious content from the feeds…

81 bursa

Some really good info, if you are not an SEO you are pretty darn close

82 Bursa Devlet Hastanesi

Thanks for the tips shoe. A lot of people don’t realize how much duplicate content on your site can really hurt you.

83 oldbaby

great great post
