I love the start of a new month. It's like you get to start over in some ways. For me, my stats programs mostly run monthly, so it's cool to see new data.

For instance: not too long ago I made a post about how I knew I was in supplemental hell. Well, Aaron Wall made a list of things for me to do to fix it (mostly in my robots.txt). I was skeptical, but I am happy to report that not only am I out of supplemental hell, my Google traffic has increased 1400% in the one month since implementing his list.

Anyway, welcome February!

By Jeremy Schoemaker

Jeremy "ShoeMoney" Schoemaker is the founder & CEO of ShoeMoney Media Group, and to date has sold 6 companies and done over $10 million in affiliate revenue. In 2013 Jeremy released his #1 international best-selling autobiography, "Nothing's Changed But My Change" - The ShoeMoney Story. You can read more about Jeremy on his Wikipedia page here.

63 thoughts on “Welcome February – I Love New Months”
  1. 1400% increase and no supplementals sounds like another good reason to visit the Elite Retreat! Maybe you can offer a budget alternative with slightly less info – a "Search Bootcamp" $13.99 e-book!

    Excellent information… thanks, Jeremy!

  2. Really well done, Jeremy.

    I love new months, too. Every time it's a new chance to earn more than before.

  3. If you guys want to know, just look at the robots.txt contents:
    User-Agent: Googlebot
    Disallow: /link.php
    Disallow: /gallery
    Disallow: /gallery2
    Disallow: /gallery2/
    Disallow: /gallery/
    Disallow: /category/
    Disallow: /page/
    Disallow: /pages/
    Disallow: /feed/
    Disallow: /feed

  4. I don't get it… it's one thing to kill off duplicates, another entirely to get an extra 1400% in traffic???

    The robots file *reduces* the amount of content you have in the index. I dunno… sounds like something's missing here…

    😀

  5. Removing duplicate content and pages with little to no written content brings a lot of pages out of supplemental. The more pages your site has in supplemental, the less ranking power your domain has. If you decrease your supplemental pages, your domain's PageRank and ranking power increase.

  6. Well, duh. Why didn’t we think of that? Although, the increase in traffic is probably just a result of getting out of supplementals.

  7. I'd love to see the list as well, but part of it can already be found by adding /robots.txt to the end of the URL.

  8. I wonder, can you cloak a robots.txt? Show one to users, and one to the bots?

    Come on, Shoe, can it be that easy:

    User-Agent: Googlebot
    Disallow: /link.php
    Disallow: /gallery
    Disallow: /gallery2
    Disallow: /gallery2/
    Disallow: /gallery/
    Disallow: /category/
    Disallow: /page/
    Disallow: /pages/
    Disallow: /feed/
    Disallow: /feed

    Maybe it's a combination of robots.txt and age of domain – it was just time to get out? Or maybe you slipped a few ShoeMoney shirts to Google's top SE nerds?

  9. *L* Well, I think one of the big keys Aaron Wall suggested at Elite Retreat is not letting bots crawl through useless pages like link pages and galleries; with blogs and other sites we have pages that are more for fun and not really a gain in rankings.

  10. That's about 50% of it; the rest was adding dynamic descriptions/titles and doing a little on-page internal linking.
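
    In template terms that just means the page <head> stops being one static string for the whole site and gets filled in per post – roughly like this, with made-up placeholder names:

    <title>{post_title} - {blog_name}</title>
    <meta name="description" content="{post_excerpt}">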

  11. You wonder? Give me a break, of course you can.

    If you want to see if Shoe's a big fat liar, then why don't you look at his results in the SERPs? See if they match what he has in the robots.txt. Make sure to check cache dates too.

    …Investigate before you speculate…
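
    As for the how: a rough sketch, assuming Apache with mod_rewrite and a hypothetical robots-googlebot.txt sitting next to the real file (purely illustrative – serving bots a different file than users is textbook cloaking):

    # .htaccess – hand Googlebot its own robots file
    RewriteEngine On
    RewriteCond %{HTTP_USER_AGENT} Googlebot [NC]
    RewriteRule ^robots\.txt$ robots-googlebot.txt [L]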

  12. Yeah, I added dynamic titles and descriptions last month as well, and it's gotten a lot of the blog out of supplemental. I'll look at adding the robots.txt changes and see how that turns out next month.

    Thanks for the tip, Shoe.

  13. Aaron probably doesn’t have the same issues as his site is older and more established – Google is more lenient if your site is established.

    "Yeah, I added dynamic titles and descriptions last month"

    Is this an add-on you can get for WordPress?

  15. Heh… have you ever met any Google "SE nerds"? If so, you would know how ridiculous that statement is.

  16. John, check out how PageRank distribution works. I used to have almost 40,000 pages indexed by Google but nothing ranked in the top 10 for anything. Now that I have disallowed tons of duplicate content, I have good rankings again… for instance, look up "serps script".
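
    To see the idea in miniature, here's a toy power-iteration PageRank over a made-up five-page site (plain Python, nothing to do with Google's real system – just an illustration that pruning thin pages concentrates score on what's left):

    # Toy PageRank via power iteration – illustrative only.
    def pagerank(links, damping=0.85, iters=50):
        pages = list(links)
        rank = {p: 1.0 / len(pages) for p in pages}
        for _ in range(iters):
            new = {p: (1 - damping) / len(pages) for p in pages}
            for page, outs in links.items():
                for out in outs:
                    new[out] += damping * rank[page] / len(outs)
            rank = new
        return rank

    # Full site: the homepage spends links on gallery/feed pages.
    full = {
        "home": ["post1", "post2", "gallery", "feed"],
        "post1": ["home"], "post2": ["home"],
        "gallery": ["home"], "feed": ["home"],
    }
    # Trimmed site: gallery/feed disallowed, so they drop out of the graph.
    trimmed = {"home": ["post1", "post2"], "post1": ["home"], "post2": ["home"]}

    print(pagerank(full)["post1"])     # each post gets a smaller share
    print(pagerank(trimmed)["post1"])  # each post gets a bigger share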

  17. Anyone who gets a PhD in CS – that's a nerd.

    Course now that they have money, like those crooked Enron CEO nerds, they can all go on those dirt-biking, I'm-a-tough-man safaris in Africa to prove they are NOT nerds. lol

  18. […] Shoemoney posted yesterday about how Aaron Wall gave him advice on how to get his blog out of the supplemental results in Google. According to Shoemoney, after one month, his blog is out of supplemental results, and his traffic from Google increased 1400%. He also said that most of it was done in the robots.txt file. […]

  19. […] Shoemoney recently had a post about how he experienced these SEO woes, and how he has corrected them. Since I have now assumed my true identity on this blog, I think I owe it to myself to take a look at the current SEO state. […]

  20. Hmm, he didn't say this site specifically is the one he's talking about, did he? Matt Cutts said specifically that PageRank is what decides what is supplemental, so it makes sense that removing worthless pages would improve the ones that are left.

  21. Aaron explains this on his blog today, if folks are still interested:
    http://www.seobook.com/archives/002021.shtml
    In short, there's only so much link authority to go around, and Shoe had that authority spread too thin, so he disallowed low-$$$ pages to increase the authority associated with high-$$$ pages.

    IMHO the 1400% makes this seem like a silver bullet… it is, like Aaron says, more like a 20% improvement. These gains likely came from getting a couple of authoritative, topical links, which pulled him out of the supplemental index. I don't believe these changes to the robots file are in any way related to being pulled out of the supplemental index.

    Hope this helps, and congrats, Jeremy.

  22. That is a smart move. I wish I knew which pages I could take out of the index on the sites I'm taking care of to get traffic like that.

  23. […] In about a month we’ll review this and look at how my content is indexed, how my posts rank relative to each other, etc. It will be fun to see how it all works out. I’d be thrilled to get the results that Jeremy got! […]

  24. Basically, you allow or disallow robots to index certain pages. Allow pages that are important (your posts), and disallow the ones that are not (i.e. archive pages, advertising page, search page, etc.). By having good, original, unduplicated content indexed, you'll have a better ranking. Check Wikipedia for instructions on how to format robots.txt or use the robots meta tag: http://en.wikipedia.org/wiki/Robots.txt
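
    The per-page version of the same idea is a robots meta tag in the page's <head>; a line like this tells crawlers not to index the page while still following its links:

    <meta name="robots" content="noindex,follow">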

  25. Thanks for posting about the robots.txt tip; it got me thinking, and I realized just how much useless garbage I have indexed in Google that's probably killing me. I've blocked some stuff across a few of my sites; hopefully that will bring the same results you saw.

  26. […] Shoemoney recently revealed that Aaron helped him get a bunch of his blog posts out of Google’s supplemental index by […]

  27. […] extent of penalising the blog for duplicate content, as SEOBook documents. See how Shoemoney had managed to increase his Google search traffic by 1400% this past month by preventing some of his pages from […]

  28. […] more about this at Earners Blog, and at Shoemoney. Both of these guys got a huge increase in traffic after they instituted simple exclusion rules. […]

  29. You can actually just use this in your .htaccess file:

    Options -Indexes

    That stops anyone, including spiders, from browsing your directory listings – it turns off the auto-generated index for folders that have no index file. Pretty good for securing folders too, though it won't stop pages that are linked from being crawled.

    Stuart

  30. Does robots.txt override the sitemap.xml that I submitted to Google? For instance, if my sitemap.xml has /page/2/ but robots.txt has Disallow: /page/, will it get skipped?

  31. […] the blog of Jeremy Schoemaker ("Shoemoney"), who has grown wealthy, increased its visitor count by 1400% after a measure like this (he also strengthened the site's internal linking and made dynamically generated title and […]

  32. […] In February Shoemoney reported on how a few simple changes suggested by Aaron Wall increased his Google traffic tenfold. As much as I enjoyed both Aaron and Shoemoney’s posts on escaping the supplemental index I never thought about implementing any of the advice myself. That was until today! As you can see from the graph below Google searches represent only 16.35% of my overall traffic, with most of my other blogs this figure is well over 60%. […]

  33. […] recently I read via ProWeblogs that Shoemoney had managed to increase visits coming from Google by up to 1400%. It really […]

  34. […] the recent weeks there has been a lot of talk about supplemental results. Supplemental results occur when Google thinks that the pages in your […]

  35. […] content sections, among other things. Yeah, all that from 5 lines!!! Here is a link to a post by Shoe Money, a famous Internet marketer and one of the top 100 bloggers according to Technorati; he recounts a similar […]

  36. […] webmasters into a dilemma of avoiding those filters. The solution was found. The renowned Shoemoney increased his Google search traffic 1400% by not allowing spiders to index some of his […]

  37. […] someone who knows what they are doing and has been vouched for by respected earners who have seen results. Listening to the wrong person when it comes to SEO can screw you up really badly and take forever to […]

  38. Linkrain Articles » Blog Archive » Remove your pages from Google’s Supplemental Result Index to Increase Search Engine Traffic says:

    […] was written by Chris Garrett managed to increase its search engine traffic by 20%. Recently Shoemoney.com was able to increase his search engine traffic by 1400% by getting most of its pages out […]

  39. […] your Google traffic. According to Nathan Metzger he saw a 20% increase in his Google traffic. Also Shoemoney saw an increase of 1400% in his Google traffic by getting out of the supplemental […]

  40. Just curious if you know of any other case studies of implementing a robots.txt file for Google and getting large traffic increases on WordPress blogs?

    I'm playing with a couple of the many WordPress blogs I have. These particular two are both 99% supplemental according to this tool:

    http://www.mapelli.info/tools/supplemental-index-ratio-calculator/

    It’s been about a month and stuff is SLOWLY coming out of supplemental. One is at 75%, the other 95% now.

    Still no changes in traffic or rankings…
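
    (For anyone wondering, the ratio such a calculator reports is presumably just supplemental results divided by total indexed results for a site: query – e.g. 950 supplemental out of 1,000 indexed pages = 95%.)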

  41. […] about how you improved your traffic by 1400% by adding nofollows to unimportant links, as Shoemoney did back in the day. The title he chose, translated, comes out to something like: Welcome February – I Love […]

  42. Great job.

    It’s always good to make improvements and see the hard work pay off.

Comments are closed.