I love the start of a new month. It's like you get to start over in some ways. For me, my stat programs run monthly for the most part, so it's cool to see new data.
For instance – not too long ago I made a post about how I knew I was in supplemental hell. Well, Aaron Wall made a list of stuff for me to do to fix it (mostly in my robots.txt). I was skeptical, but I am happy to report that not only am I out of supplemental hell, my Google traffic has also increased 1400% in only one month after implementing his list.
Anyway, welcome February!
ahem…any chance of sharing that list 🙂
Yeah, share the list with us. We won't tell. lol
I have a few supplemental probs…would love to see the list 🙂
A 1400% increase and no supplementals sounds like another good reason to visit the Elite Retreat! Maybe you can offer a budget alternative with slightly less info – “Search Bootcamp”, the $13.99 e-book!
Excellent information.. thx Jeremy!
rly well done jeremy.
I love a new month, too. Every time it's a new chance to earn more than before.
If you guys want to know, just look at the robots.txt contents:
User-Agent: Googlebot
Disallow: /link.php
Disallow: /gallery
Disallow: /gallery2
Disallow: /gallery2/
Disallow: /gallery/
Disallow: /category/
Disallow: /page/
Disallow: /pages/
Disallow: /feed/
Disallow: /feed
Good to hear you got it fixed. Oh and thanks for including me in your list Shoe…
I don't get it… it's one thing to kill off duplicates, another entirely to get an extra 1400% in traffic???
The robots file *reduces* the amount of content you have in the index. I dunno… sounds like something's missing here…
😀
Removing duplicate content and pages with little to no written content brings a lot of pages out of supplemental. The more pages your site has in supplemental, the less ranking power your domain has. If you decrease your supplemental pages, your domain's PageRank and ranking power increase.
Why does SEOBook's robots.txt not have this?
Well, duh. Why didn’t we think of that? Although, the increase in traffic is probably just a result of getting out of supplementals.
I'd love to see the list as well, but part of it can already be found when you add /robots.txt after the URL.
LOL, I bet your robots.txt is going to be your most popular page today Shoemoney.
Anyone willing to explain this a little more?
I wonder, can you cloak a robots.txt? Show one to users and one to the bots?
Come on Shoe, can it be that easy:
User-Agent: Googlebot
Disallow: /link.php
Disallow: /gallery
Disallow: /gallery2
Disallow: /gallery2/
Disallow: /gallery/
Disallow: /category/
Disallow: /page/
Disallow: /pages/
Disallow: /feed/
Disallow: /feed
Maybe a combination of the robots.txt and age of domain – just time to get out? Or maybe you slipped a few ShoeMoney shirts to Google's top SE nerds?
*L* Well, I think one of the big keys Aaron Wall suggested at Elite Retreat is not letting bots crawl through useless pages like link pages and galleries, since with blogs or other sites we have pages that are more for fun and not really a gain in rankings.
That list would be amazing to read!!!
That's about 50%; the rest was adding dynamic descriptions/titles and doing a little on-page internal linking.
You wonder? Give me a break, of course you can.
If you want to see if Shoe's a big fat liar, then why don't you look at his results in the SERPs? See if they match what he has in the robots.txt. Make sure to check cache dates too.
…Investigate before you speculate…
Yeah, I added dynamic titles and descriptions last month as well and it's gotten a lot of the blog out of supplemental. I'll look to add the robots.txt and see how that turns out next month.
thanks for the tip Shoe
Aaron probably doesn’t have the same issues as his site is older and more established – Google is more lenient if your site is established.
Shoe, are you going to have a “febrazy” month like me? Probably not 😛
“Yeah I added dynamic titles and descriptions last month”
Is this an addon you can get for WordPress?
Not that I know of.. you just edit the templates.
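For anyone wondering what the "dynamic titles and descriptions" edit looks like in practice, here is a rough sketch of the kind of change you'd make to a theme's header.php – just standard WordPress template tags, not Shoe's actual template, and the excerpt handling shown is only one way to do it:
<?php
// Rough sketch: build a per-post <title> and meta description in header.php.
// These are standard WordPress template tags; exact logic will vary by theme.
$meta_description = get_bloginfo('description');        // fallback: the blog tagline
if ((is_single() || is_page()) && have_posts()) {
    the_post();                                          // load the current post
    $meta_description = strip_tags(get_the_excerpt());   // use its excerpt instead
    rewind_posts();                                      // reset the loop for the template
}
?>
<title><?php wp_title(''); ?> <?php bloginfo('name'); ?></title>
<meta name="description" content="<?php echo htmlspecialchars($meta_description); ?>" />
The point is simply that every post ends up with its own title and description instead of the same boilerplate on every URL, which is what helps pages look less like duplicates.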
Heh… have you ever met any Google “SE nerds”? If so, you would know how ridiculous that statement is.
John, check out how PageRank distribution works. I used to have almost 40,000 pages indexed by Google but nothing ranked in the top 10 for anything. Now that I have disallowed tons of content that was duplicate, I have good rankings again… for instance, look at serps script.
Speaking of nerds, guess I better start running my comments past my lawyers first.
Anyone who gets a PhD in CS – that's a nerd.
'Course now that they have money, like those crooked Enron CEO nerds, they can all go on those dirt-biking, I'm-a-tough-man safaris in Africa to prove they are NOT nerds. lol
Thanks
[…] Shoemoney posted yesterday about how Aaron Wall gave him advice on how to get his blog out of the supplemental results in Google. According to Shoemoney, after one month, his blog is out of supplemental results, and his traffic from Google increased 1400%. He also said that most of it was done in the robots.txt file. […]
[…] Shoemoney recently had a post about how he experienced these SEO woes, and how he has corrected them. Since I have now assumed my true identity on this blog, I think I owe it to myself to take a look at the current SEO state. […]
Hmm, he didn't say this site specifically is the one he's talking about, did he? Matt Cutts said specifically that PageRank is what decides what is supplemental, so it makes sense that removing worthless pages would improve the ones that are left.
Aaron explains this on his blog today here:
http://www.seobook.com/archives/002021.shtml
If folks are still interested: in short, there's only so much link authority to go around, and Shoe had that authority spread too thin, so he disallowed low-$$$ pages to increase the authority associated with high-$$$ pages.
IMHO the 1400% makes this seem like a silver bullet… it is, like Aaron says, a 20% improvement. These gains likely came from getting a couple of authoritative, topical links, which pulled him out of the supplemental index. I don't believe these changes to the robots file are in any way related to being pulled out of the supplemental index.
Hope this helps and congrats Jeremy
That is a smart move. I wish I knew which pages I could take out of the sites I am taking care of to get traffic that would help me out.
How do you avoid supplemental hell, Shoe? Need tips, plz…
Would the Head Meta Description do the trick for those dynamic description tags?
[…] In about a month we’ll review this and look at how my content is indexed, how my posts rank relative to each other, etc. It will be fun to see how it all works out. I’d be thrilled to get the results that Jeremy got! […]
SM, share the secrets!
Basically, you allow or disallow robots to index certain pages. Allow pages that are important (your posts), and disallow the ones that are not (i.e. archive pages, advertising page, search page, etc.). By having good, original, unduplicated content indexed, you'll have a better ranking. Check Wikipedia for instructions on how to format robots.txt or how to use the robots meta tag: http://en.wikipedia.org/wiki/Robots.txt
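If editing robots.txt feels risky, here is a rough sketch of the meta-tag route mentioned above – a conditional noindex dropped into the theme's header.php using standard WordPress conditional tags. Which page types to exclude is your call; this isn't what Shoe did, just one way to do it:
<?php
// Rough sketch: noindex the low-value pages (archives, search results,
// paginated listings) with the robots meta tag instead of robots.txt.
// is_archive(), is_search() and is_paged() are standard WordPress conditionals.
if (is_archive() || is_search() || is_paged()) {
    // Ask crawlers not to index this page, but still follow its links
    echo '<meta name="robots" content="noindex,follow" />' . "\n";
}
?>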
Thanks for posting about the robots.txt tip, it got me thinking and realizing just how much useless garbage I have indexed on Google now that’s probably killing me. I’ve blocked some stuff across a few of my sites, hopefully that will bring the same results you saw.
[…] Shoemoney recently revealed that Aaron helped him get a bunch of his blog posts out of Google’s supplemental index by […]
[…] extent of penalising the blog for duplicate content, as SEOBook documents. See how Shoemoney had managed to increase his Google search traffic by 1400% this past month by preventing some of his pages from […]
[…] more about this at Earners Blog, and at Shoemoney. Both of these guys got a huge increase in traffic after they instituted a simple exclusion rules. […]
You can actually just use this in your .htaccess file:
Options -Indexes
That'll stop anyone, including spiders, from browsing your file structure through directory listings. Pretty good for securing folders too.
Stuart
Does robots.txt override sitemap.xml that I submitted to Google? For instance, if in my sitemap.xml I have /page/2/ but in robots.txt I use disallow /page/, will it get skipped?
[…] the blog of the well-off Jeremy Schoemaker (”Shoemoney”) increased its visitor numbers by 1400% after this kind of measure (in addition, he strengthened the site's internal linking and made dynamically generated title and […]
[…] In February Shoemoney reported on how a few simple changes suggested by Aaron Wall increased his Google traffic tenfold. As much as I enjoyed both Aaron and Shoemoney’s posts on escaping the supplemental index I never thought about implementing any of the advice myself. That was until today! As you can see from the graph below Google searches represent only 16.35% of my overall traffic, with most of my other blogs this figure is well over 60%. […]
[…] a little while ago I read via ProWeblogs that Shoemoney had managed to increase visits coming from Google by up to 1400%. It really […]
[…] the recent weeks there has been a lot of talk about supplemental results. Supplemental results occur when Google thinks that the pages in your […]
[…] content sections, among other things. Yeah, all that from 5 lines!!! Here is a link to a post by Shoe Money, a famous Internet Marketer and one of the top 100 bloggers according to Technorati; he recounts a similar […]
[…] webmasters into a dilemma of avoiding those filters. The solution was found. The renowned Shoemoney increased his Google search traffic 1400% by not allowing spiders to index some of his […]
Especially this one, I have my birthday in February!
[…] someone who knows what they are doing and has been vouched for by respected earners who have seen results Listening to the wrong person when it comes to SEO can screw you up really bad and take forever to […]
[…] was written by Chris Garret managed to increase its search engine traffic by 20%. Recently Shoemoney.com was able to increase his search engine traffic by 1400% by getting most of its pages out […]
Think I’m going to be making some changes to my robots.txt file tonite…
[…] your google traffic. According to Nathan Metzger he saw a 20% increase in his google traffic. Also Shoemoney saw an increase of 1400% in his google traffic by getting out of the supplemental […]
Just curious if you know of any other case studies of implementing a robots.txt file for Google and getting large traffic increases on WordPress blogs?
I'm playing with a couple of the many WordPress blogs I have. These particular 2 are both 99% supplemental according to this tool:
http://www.mapelli.info/tools/supplemental-index-ratio-calculator/
It’s been about a month and stuff is SLOWLY coming out of supplemental. One is at 75%, the other 95% now.
Still no changes in traffic or rankings…
[…] about how you improved your traffic by 1400% by putting nofollows on unimportant links, as Shoemoney did back in the day. The title he chose, translated, comes out to something like: Welcome February – I Love the […]
Thanks for the comments….
Great job.
It’s always good to make improvements and see the hard work pay off.