WSJ, WTF?! Google Panda & Subdomains

Ian Lurie
Panda is coming

Take a deep breath, everyone.

The Wall Street Journal published an article on July 13 all about HubPages, and how the site saved itself from Google Panda’s evil clutches by spreading its content across multiple subdomains. In the article, Amir Efrati implies in a very-hard-to-pin-down-but-easy-to-assume way that subdomains may be a magical solution if you’ve been nuked by the Panda update.

That pisses me off.

A lot.

But, before I rant about unqualified people writing on specialized topics, I’ll clear up the subdomains hysteria.

Subdomains ain’t all dat

Aaron Wall wrote that the WSJ article means Google’s completely reversed their previous stance on subdomains versus subfolders. Google’s always said subfolders are better, as have I.

So, when I read Aaron’s article I nearly coughed up my skull. If Google really reversed this policy, then I’d led clients and readers alike off the Rankings Cliff of Doom. Developers would hunt me down (even more than they do now). Branding teams would burn me in effigy (even more than they do now). IT staff who ‘know SEO’ would see me as an annoying, clueless pest (even more… you get it). People would come up with a new expression:

“Doing a heck of a job there, Lurie.”

But with all respect to Aaron—and understand, he knows more about SEO than 99.9% of the industry, including me—I think he’s got it wrong. This isn’t a reversal, or even a change.

Panda considers the quality of all content on a subdomain when making ranking decisions. If you’re, say, HubPages, and 50% of the content on your site is basically brain snot, that hurts the ability of every page on your site to rank. So the other 50% of content—the arguably decent stuff—gets zapped out of the rankings. The bad content becomes an anchor, dragging everything down.

That’s why subdomains helped HubPages. They used subdomains to separate the crappy stuff from the good stuff: one subdomain for the decent content, another for the junk. (Those examples are mine, by the way, but you get the idea.) With subdomains, HubPages was able to move the bad content ‘anchor’ to a whole other site. That helped the good stuff move back up, because Google doesn’t let subdomains directly pass ranking factors back and forth.
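For anyone who actually attempts a split like this, the mechanical part is mostly URL rewriting: decide which paths are the low-quality ones, then 301-redirect each of them to the same path on the new subdomain while leaving everything else alone. A minimal sketch in Python, where the subdomain name and path prefixes are made up for illustration, not taken from HubPages:

```python
from urllib.parse import urlsplit, urlunsplit

# Paths judged low-quality; everything else stays on the main domain.
# (Hypothetical list -- in practice this comes from a content audit.)
LOW_QUALITY_PREFIXES = ("/hub/celebrity-gossip", "/hub/get-rich-quick")

def redirect_target(url, junk_host="junk.example.com"):
    """Return the 301 redirect target on the junk subdomain,
    or None if the URL stays on the main domain."""
    parts = urlsplit(url)
    if parts.path.startswith(LOW_QUALITY_PREFIXES):
        # Same scheme, same path and query -- only the host changes.
        return urlunsplit((parts.scheme, junk_host,
                           parts.path, parts.query, parts.fragment))
    return None  # good content: no redirect
```

A mapping like this would feed your server’s redirect rules; the point is that only the host changes, so the bad-content ‘anchor’ lands on a separate subdomain while the good pages keep their URLs.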

This is not a change

What HubPages describes is exactly how Google has always treated subdomains. It’s not a change in their algorithm. It’s why I’ve always said putting your blog on a subdomain is a bad idea: subdomain authority and relevance don’t directly transfer to other subdomains.

Apparently, the same holds true for quality.

If you’re running an SEO campaign based on content, subfolders are still the right strategy. Use subdomains to remove lousy stuff from your primary subdomain, if you want. Personally, I’d prefer you just removed the lousy stuff altogether.

So climb out of your shelters. The Google Meteor isn’t going to hit today. I’m not saying it won’t. Aaron’s distrust of huge, monopolistic search engines is, I think, dead-on. But subdomains’ status in SEO hasn’t changed.

By the way, this is another version of the hard lesson learned from Google and Twitter’s breakup: In SEO, don’t chase little shiny things. You’ll get eaten.

Tomorrow, I’ll write about journalists and their utter inability to discuss SEO in anything resembling a sane, coherent manner. Which may lead to me talking about the general fall of civilization, and why I think communication is the most-neglected, most-important skill we’ve got.




  1. I don’t know how you can cough up a skull, but yeah. All this means is that one of many non-real-time systems Google uses doesn’t grok subdomain spam. Yet.
    And Aaron Wall, who benefits from his multiple subdomains every day, knows that what Google reps have been saying doesn’t exactly match what their machine does.
    Other than that, let’s all panic, and split our sites up into one subdomain per page.

  2. @Dan there are a lot of problems with the approach you describe. I can’t get into all of them (for free) but the basics are:
    1. If you’re using RSS to publish to the home page, you’re not keeping the blog content on the primary domain. Posts are there, then gone. And the primary domain gets no lasting value.
    2. Links to the subdomain blog accord zero authority to the primary domain.
    3. Quality signals that Google applies to the subdomain blog will have zero impact on the primary domain.
    You’re always better off with it all on the main domain. Then show a snippet of the latest posts on the home page.

  3. Great post. I linked to it from a WebProNews article on the story that was asking if HubPages had found the magic bullet against Panda.

  4. Ian, great post and spot on IMO.
    This whole thing that cropped up yesterday could cause a lot of problems. I have already been caught up in conversations about whether we should have chosen subdomains over subfolders. But after sitting on it a little and reading a post like this I completely agree. The whole subdomain recommendation in relation to Panda sounds rather dumb and I can only imagine Matt recommending it to Hubpages if they had been insistent on wanting to keep the crappy stuff.
However, like anything in this game there are lessons to be learned. Up until Panda you could use subfolders for a diverse range of topics even if you knew some pieces were crappy and others quality. Only the crap ones might get hit. Now that those crappy articles can affect the whole site, subdomains get thrown deeper into the decision process of what should use subfolders versus subdomains.
    No idea how much I just wrote here as I am on my iPhone and it’s hard to tell. Sorry for rambling..

Nice analysis, Ian. I’m looking forward to tomorrow’s post and hearing your thoughts on why this sort of writing passes for journalism: is it laziness on the part of the writer, or is sensationalism now what’s considered “good journalism”?

