Welcome February – I Love New Months

Posted by

I love the start of a new month. It’s like you get to start over in some ways. My stat programs mostly run monthly, so it’s cool to see new data.

For instance – not too long ago I made a post about how I knew I was in supplemental hell. Well, Aaron Wall made a list of things for me to do to fix it (mostly in my robots.txt). I was skeptical, but I am happy to report that not only am I out of supplemental hell, but my Google traffic has increased 1400% in only one month after implementing his list.

Anyway, welcome February!

63 thoughts on “Welcome February – I Love New Months”

  1. Scandic

    1400% increase and no supplementals sounds like another good reason to visit the elite retreat! Maybe you can offer a budget alternative with slightly less info – “Search Bootcamp” $13.99 E-book!

    Excellent information.. thx Jeremy!

  2. TheBen

    Really well done, Jeremy.

    I love new months, too. Every time it’s a new chance to earn more than before.

  3. mad4

    If you guys want to know, just look at the robots.txt contents:
    User-Agent: Googlebot
    Disallow: /link.php
    Disallow: /gallery
    Disallow: /gallery2
    Disallow: /gallery2/
    Disallow: /gallery/
    Disallow: /category/
    Disallow: /page/
    Disallow: /pages/
    Disallow: /feed/
    Disallow: /feed
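A quick way to sanity-check rules like these is Python’s standard-library robots.txt parser (a sketch – the shoemoney.com URLs below are just illustrative guesses at the site’s structure, and only a subset of the rules is reproduced):

```python
from urllib import robotparser

# A subset of the rules quoted above, as served to Googlebot.
RULES = """\
User-Agent: Googlebot
Disallow: /link.php
Disallow: /gallery
Disallow: /gallery2
Disallow: /category/
Disallow: /page/
Disallow: /feed
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(RULES)

# Thin archive/navigation URLs are blocked for Googlebot...
print(rp.can_fetch("Googlebot", "http://www.shoemoney.com/page/2/"))        # False
print(rp.can_fetch("Googlebot", "http://www.shoemoney.com/category/seo/"))  # False
# ...while individual posts stay crawlable.
print(rp.can_fetch("Googlebot", "http://www.shoemoney.com/2007/02/01/welcome-february/"))  # True
# With no "User-Agent: *" section, other bots are not restricted at all.
print(rp.can_fetch("SomeOtherBot", "http://www.shoemoney.com/page/2/"))     # True
```

Note the last line: because the file only names Googlebot, every other crawler is left unrestricted.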

  4. John Loch

    I don’t get it… it’s one thing to kill off duplicates, another entirely to get an extra 1400% in traffic?

    The robots file *reduces* the amount of content you have in the index. I dunno… sounds like something’s missing here.


  5. GeorgeB

    Removing duplicate content and pages with little to no written content brings a lot of pages out of supplemental. The more pages your site has in supplemental, the less ranking power your domain has. If you decrease your supplemental pages, your domain’s PageRank and ranking power increase.

  6. Gemme

    I’d love to see the list as well, but part of it can already be found by adding /robots.txt after the URL.

  7. suttree

    I wonder, can you cloak a robots.txt? Show one to users, and one to the bots?

    Come on Shoe, it can’t be that easy:

    User-Agent: Googlebot
    Disallow: /link.php
    Disallow: /gallery
    Disallow: /gallery2
    Disallow: /gallery2/
    Disallow: /gallery/
    Disallow: /category/
    Disallow: /page/
    Disallow: /pages/
    Disallow: /feed/
    Disallow: /feed

    Maybe it’s a combination of robots.txt and the age of the domain – just time to get out? Or maybe you slipped a few ShoeMoney shirts to Google’s top SE nerds?

  8. CPA Affiliates

    *L* Well, I think one of the big keys Aaron Wall suggested at Elite Retreat is not letting bots crawl through useless pages like link pages and galleries, since with blogs and other sites we have pages that are more for fun and not really a gain in rankings.

  9. webprofessor

    You wonder? Give me a break, of course you can.

    If you want to see if Shoe’s a big fat liar, then why don’t you look at his results in the SERPs? See if they match what he has in the robots.txt. Make sure to check cache dates too.

    …Investigate before you speculate…

  10. webprofessor

    Yeah, I added dynamic titles and descriptions last month as well, and it’s gotten a lot of the blog out of supplemental. I’ll add the robots.txt and see how that turns out next month.

    thanks for the tip Shoe

  11. Andy Beal

    Aaron probably doesn’t have the same issues as his site is older and more established – Google is more lenient if your site is established.

  12. ShoeMoney

    John, check out how PageRank distribution works. I used to have almost 40,000 pages indexed by Google but nothing ranked in the top 10 for anything. Now that I have disallowed tons of duplicate content, I have good rankings again… for instance, look at “serps script”

  13. suttree

    Anyone who gets a PhD in CS – that’s a nerd.

    Of course, now that they have money, like those Enron CEO crooked nerds, they can all go on those dirt-biking, I’m-a-tough-man safaris in Africa to prove they are NOT nerds. lol

  14. Pingback: Getting Out of Supplemental Results : SEO Profile

  15. Pingback: Mymotech » SEO: ohwa tahna siam

  16. Wealthy Webmaster

    Hmm, he didn’t say this site specifically is the one he’s talking about, did he? Matt Cutts said specifically that PageRank is what decides what is supplemental, so it makes sense that removing worthless pages would improve the ones that are left.

  17. Greg

    Aaron explains this on his blog today, if folks are still interested. In short, there’s only so much link authority to go around, and Shoe had that authority spread too thin, so he disallowed low-$$$ pages to increase the authority associated with high-$$$ pages.

    IMHO the 1400% makes this seem like a silver bullet… it is, like Aaron says, a 20% improvement. These gains likely came from getting a couple of authoritative, topical links, which pulled him out of the supplemental index. I don’t believe these robots.txt changes are in any way related to being pulled out of the supplemental index.

    Hope this helps and congrats Jeremy
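The “spread too thin” idea above can be sketched with a toy model. The numbers are hypothetical (loosely based on the roughly 40,000 indexed pages mentioned in comment 12), and real PageRank flow is far more complex than an even split:

```python
def authority_per_page(total_authority: float, indexed_pages: int) -> float:
    """Naive model: a fixed pool of link authority split evenly across
    every page Google indexes. Disallowing thin pages shrinks the divisor."""
    return total_authority / indexed_pages

TOTAL = 1000.0  # hypothetical domain-wide link authority

before = authority_per_page(TOTAL, 40_000)  # everything indexed
after = authority_per_page(TOTAL, 4_000)    # thin/duplicate pages disallowed (hypothetical count)

print(f"authority per page went from {before:.3f} to {after:.3f} "
      f"({after / before:.0f}x)")  # a 10x concentration on the pages that remain
```

The point is only directional: blocking pages doesn’t create authority, it concentrates the existing pool on fewer URLs.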

  18. Pingback: Advanced Search Engine Optimization (SEO) for Wordpress » Alister Cameron, Blog Consultant

  19. Andrew Kuo

    Basically, you allow or disallow robots to index certain pages. Allow the pages that are important (your posts), and disallow the ones that are not (e.g. archive pages, advertising page, search page, etc.). By having good, original, unduplicated content indexed, you’ll have a better ranking. Check Wikipedia for instructions on how to format robots.txt or use the robots meta tag: http://en.wikipedia.org/wiki/Robots.txt

  20. Riotz

    Thanks for posting about the robots.txt tip, it got me thinking and realizing just how much useless garbage I have indexed on Google now that’s probably killing me. I’ve blocked some stuff across a few of my sites, hopefully that will bring the same results you saw.

  21. Pingback: Writing A Good Robots.txt

  22. Pingback: Cutting down on blog categories at Emporium Blog

  23. Pingback: Get Better At… » Blog Archive » Using Robots.txt to Protect Wordpress from Google

  24. Stuart

    You can actually just use this in your .htaccess file:

    Options -Indexes

    That’ll stop anyone, including spiders, from browsing your directory listings (note it only disables directory indexes – files that are linked from elsewhere can still be crawled). Pretty good for securing folders too.


  25. Brian Utley

    Does robots.txt override the sitemap.xml that I submitted to Google? For instance, if my sitemap.xml has /page/2/ but robots.txt has Disallow: /page/, will it get skipped?
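For what it’s worth, the prefix-matching half of that question can be checked with Python’s standard-library parser (a sketch with a made-up example.com URL): Disallow rules are path prefixes, so /page/ covers /page/2/, and well-behaved crawlers honor robots.txt even for URLs submitted in a sitemap.

```python
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.parse(["User-Agent: *", "Disallow: /page/"])

# Disallow rules are prefix matches, so /page/2/ falls under /page/.
print(rp.can_fetch("Googlebot", "http://example.com/page/2/"))  # False
print(rp.can_fetch("Googlebot", "http://example.com/about/"))   # True
```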

  26. Pingback: WordPress-blogin hakukoneoptimointi | Nettibisnes.Info

  27. Pingback: Take More Risks » Help My Blog’s Went Supplemental In Google!

  28. Pingback: Robots.txt - SEO, Posicionamiento y AdSense

  29. Pingback: The Supplemental Study » Brandon Hopkins

  30. Pingback: Famous Marketer Seth Godin Gives Bad SEO advice

  31. Pingback: Robots txt, Link Authority, and Internal Linking Know-how

  32. Pingback: SEO Philippines - Internet Marketing Guide » Another SEO Challenge, This Time for Godin

  33. Pingback: Search engine optimization for forums - AdminFusion

  34. Pingback: Linkrain Articles » Blog Archive » Remove your pages from Google’s Supplemental Result Index to Increase Search Engine Traffic

  35. Pingback: robots.txt » xzBlogs

  36. Pingback: Make Money Online » Blog Archive » Increase your Google Traffic by 20%

  37. Vacation Rentals

    Just curious – do you know of any other case studies of implementing a robots.txt file for Google and getting large traffic increases on WordPress blogs?

    I’m playing with a couple of the many WordPress blogs I have. These particular two are both 99% supplemental according to this tool:


    It’s been about a month, and stuff is SLOWLY coming out of supplemental. One is at 75%, the other 95% now.

    Still no changes in traffic or rankings…

  38. Pingback: Seología » Aumentá tu tráfico de Google un 1400%

Comments are closed.