Welcome February – I Love New Months

by Jeremy Schoemaker on February 1, 2007 · 63 comments

I love the start of a new month. It's like you get to start over in some ways. For me, most of my stat programs run monthly, so it's cool to see new data.

For instance – not too long ago I made a post about how I knew I was in supplemental hell. Well, Aaron Wall made a list of stuff for me to do to fix it (mostly in my robots.txt). I was skeptical, but I am happy to report that not only am I out of supplemental hell, but my Google traffic has increased 1400% in just one month after implementing his list.

Anyway, welcome February!


{ 45 comments }

1 Charles

ahem…any chance of sharing that list :)

2 lawrence

Yeah, share the list with us. We won't tell. lol

3 lawrenceq

Yeah, share that list. We won't tell.

4 Damien

I have a few supplemental probs…would love to see the list :)

5 Scandic

1400% increase and no supplementals sounds like another good reason to visit the Elite Retreat! Maybe you can offer a budget alternative with slightly less info – a "Search Bootcamp" $13.99 e-book!

Excellent information.. thx Jeremy!

6 TheBen

Really well done, Jeremy.

I love new months, too. Each one is a new chance to earn more than before.

7 mad4

If you guys want to know, just look at the robots.txt contents:
User-Agent: Googlebot
Disallow: /link.php
Disallow: /gallery
Disallow: /gallery2
Disallow: /gallery2/
Disallow: /gallery/
Disallow: /category/
Disallow: /page/
Disallow: /pages/
Disallow: /feed/
Disallow: /feed

8 John Heard

Good to hear you got it fixed. Oh and thanks for including me in your list Shoe…

9 John Loch

I don't get it.. it's one thing to kill off duplicates, another entirely to get an extra 1400% in traffic???

The bots file *reduces* the amount of content you have in the index. I dunno.. sounds like something's missing here..
:D

10 GeorgeB

Removing duplicate content and pages with little to no written content brings a lot of pages out of supplemental. The more pages your site has in supplemental, the less ranking power your domain has. If you decrease your supplemental pages, your domain's PageRank and ranking power increase.

11 Dugdale

Why does SEOBook's robots.txt not have this?

12 Jonathan Kemp

Well, duh. Why didn’t we think of that? Although, the increase in traffic is probably just a result of getting out of supplementals.

13 Gemme

Love to see the list as well, but part of it can already be found when you append robots.txt to the URL.

14 SonicReducer

LOL, I bet your robots.txt is going to be your most popular page today Shoemoney.

15 Eddie Mace

Anyone willing to explain this a little more?

16 suttree

I wonder, can you cloak a robots.txt? Show one to users, and one to the bots?

Come on Shoe, can it be that easy:

User-Agent: Googlebot
Disallow: /link.php
Disallow: /gallery
Disallow: /gallery2
Disallow: /gallery2/
Disallow: /gallery/
Disallow: /category/
Disallow: /page/
Disallow: /pages/
Disallow: /feed/
Disallow: /feed

Maybe a combination of robots.txt and age of domain – just time to get out? Or maybe you slipped a few ShoeMoney shirts to Google's top SE nerds?

17 CPA Affiliates

*L* Well, I think one of the big keys Aaron Wall suggested at Elite Retreat is not letting bots crawl through useless pages like link pages and galleries; with blogs and other sites we have pages that are more for fun and not really a gain in rankings.

18 POOP

That list would be amazing to read!!!

19 ShoeMoney

That's about 50% of it – the rest was adding dynamic descriptions/titles and doing a little on-page internal linking.

20 webprofessor

You wonder? Give me a break, of course you can.

If you want to see if Shoe's a big fat liar, then why don't you look at his results in the SERPs? See if they match what he has in the robots.txt. Make sure to check cache dates too.

…Investigate before you speculate…

21 webprofessor

Yeah, I added dynamic titles and descriptions last month as well, and it's gotten a lot of the blog out of supplemental. I'll add the robots.txt changes and see how that turns out next month.

thanks for the tip Shoe

22 Andy Beal

Aaron probably doesn't have the same issues, as his site is older and more established – Google is more lenient if your site is established.

23 Tob

Shoe, are you going to have a “febrazy” month like me? Probably not :P

24 doolally

“Yeah I added dynamic titles and descriptions last month”

Is this an addon you can get for WordPress?

25 webprofessor

Not that I know of.. you just edit the templates.

26 ShoeMoney

Heh… have you ever met any Google "SE nerds"? If so, you would know how ridiculous that statement is.

27 ShoeMoney

John, check out how PageRank distribution works. I used to have almost 40,000 pages indexed by Google but nothing ranked in the top 10 for anything. Now that I have disallowed tons of content that was duplicate, I have good rankings again… for instance, look at the SERPs for "script".

28 suttree

Speaking of nerds, guess I better start running my comments past my lawyers first.

29 suttree

Anyone who gets a PhD in CS – that's a nerd.

Course, now that they have money, like those Enron CEO crooked nerds, they can all go on those dirt-biking, I'm-a-tough-man safaris in Africa to prove they are NOT nerds. lol

30 doolally

Thanks

31 Wealthy Webmaster

Hmm, he didn't say this site specifically is the one he's talking about, did he? Matt Cutts said specifically that PageRank is what decides what is supplemental, so it makes sense that removing worthless pages would improve the ones that are left.

32 Greg

Aaron explains this on his blog today here:
http://www.seobook.com/archives/002021.shtml
If folks are still interested. In short, there's only so much link authority to go around, and Shoe had that authority spread too thin, so he disallowed low-$$$ pages to increase the authority associated with high-$$$ pages.

IMHO the 1400% makes this seem like a silver bullet… it is, like Aaron says, a 20% improvement. These gains likely came from getting a couple of authoritative, topical links, which pulled him out of the supplemental index. I don't believe that these changes to the robots file are in any way related to being pulled out of the supplemental index.

Hope this helps and congrats Jeremy

33 Shortshire

That is a smart move. I wish I knew which pages I could take out of the sites I am taking care of to get traffic that would help me out.

34 Pogung177

How to avoid supplemental hell, Shoe..? need tips plz..

35 Matt L

Would the Head Meta Description do the trick for those dynamic description tags?

36 Biz Card Man

SM, share the secrets!

37 Andrew Kuo

Basically, you allow or disallow robots to index certain pages. Allow pages that are important (your posts), and disallow the ones that are not (i.e. archive pages, advertising page, search page, etc.). By having good, original, unduplicated content indexed, you'll have a better ranking. Check Wikipedia for instructions on how to format robots.txt or how to use the robots meta tag: http://en.wikipedia.org/wiki/Robots.txt

38 Riotz

Thanks for posting about the robots.txt tip, it got me thinking and realizing just how much useless garbage I have indexed on Google now that’s probably killing me. I’ve blocked some stuff across a few of my sites, hopefully that will bring the same results you saw.

39 Stuart

You can actually just use this in your .htaccess file:

Options -Indexes

That'll stop anyone, including spiders, from browsing your file structure. Pretty good for securing folders too.

Stuart

40 Brian Utley

Does robots.txt override sitemap.xml that I submitted to Google? For instance, if in my sitemap.xml I have /page/2/ but in robots.txt I use disallow /page/, will it get skipped?

41 krillz

Especially this one – I have my birthday during February!

42 RunStream

Think I'm going to be making some changes to my robots.txt file tonight…

43 Vacation Rentals

Just curious if you know of any other case studies of implementing a robots.txt file for Google and getting large traffic increases on WordPress blogs?

I'm playing with a couple of the many WordPress blogs I have. These particular 2 are both 99% supplemental according to this tool:

http://www.mapelli.info/tools/supplemental-index-ratio-calculator/

It’s been about a month and stuff is SLOWLY coming out of supplemental. One is at 75%, the other 95% now.

Still no changes in traffic or rankings…

44 Beauty World

Thanks for the comments….

45 Portland window cleaning

Great job.

It’s always good to make improvements and see the hard work pay off.
