Google’s war on spam has left quite an impact on the SEO industry of late, with last week’s Penguin update leaving swathes of webmasters taken totally by surprise and running to the forums to make their complaints. While “white hat” SEOs can afford to look smug, innocent bystanders have been caught in the crossfire.
Let’s imagine you’re one of those people. What are you supposed to do? Back at the GadgetPlex we discussed a series of options: some based on experience, some based on observation of specific penalised sites, and some still in need of experimentation. The conclusion we keep coming to on the *experimental* ideas is to run the test and see what works. The thing is, from what we’ve observed, the Penguin update doesn’t just raise the link quality threshold.
Here, in no particular order, are the next steps an impacted site could take.
Review Your Site for the Other Problems
In a situation like Penguin, it’s very easy for SEOs to panic and head straight to their inbound link data. Don’t be too narrow in your focus, as Rishi pointed out. A few weeks before Penguin, Google sent out “manually generated” unnatural link warnings via Google Webmaster Tools and took out a few link networks, including buildmyrank.com. I think that round of changes left a lot of SEOs in the mindset that every update since then (the Panda 3.5 update on 19th April excepted) must have been about links.
The cryptic warning on “over-optimisation” came from Matt Cutts at SXSW back in March, where he talked about keywords on a page, link exchanges and “what a normal person would expect in a particular area”. Read into that what you like; today I read that statement as “site optimisation levels that are way beyond the norm in a given range of results”. Quite how this might be measured is an interesting conversation to have, but one I’ll save for the bar.
Google’s update announcement mentions “keyword stuffing” more than once, as well as the more obvious mentions of link schemes. When was the last time you saw a site ranking well because of keyword stuffing? I think that there were refinements to signals used to detect more recent interpretations of on-page spam. I’m basing that opinion on recent experience.
This brings me on to a domain that came onto my radar as a casualty of the update. Most of the links were nothing to get excited about – reasonably natural stuff – though there were links that, judging by the metrics, you might want removed. However, there were some interesting additional characteristics to this site, namely:
– Almost all of the content was duplicated elsewhere
– There were groups of pages where the variations in copy were very difficult to see, aside from the titles
– The body copy used inline CSS to set the text the same colour as the background (with a slow loading image to appear later between the text and the background)
– Google preview showed the home page to have no actual content above the fold, just a blank space
– The site was painfully slow
Some of those characteristics should have been called out by Panda long ago, but weren’t. The page layout update should have caught the blank space, but it didn’t. Setting your text the same colour as the background using inline styles? Uhm.
What combination of on-site issues might have triggered this penalty is guesswork. What is very interesting is the fact that there is a feasible combination of items that could be considered spammy. Why these features weren’t picked off by previous updates is equally mysterious.
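The hidden-text trick above is mechanical enough to check for yourself before Google does. Here’s a minimal sketch in Python, assuming you know the page’s effective background colour; the regexes are illustrative, not a full CSS parser, and the sample markup is made up:

```python
import re

# Rough heuristic (not Google's actual logic): flag elements whose inline
# "color" matches the page background colour -- the hidden-text pattern
# described above. Colours are compared naively as literal strings.
STYLE_RE = re.compile(r'style="([^"]*)"', re.IGNORECASE)
# Lookbehind stops this matching the "color" inside "background-color".
COLOR_RE = re.compile(r'(?<![-\w])color\s*:\s*([#\w]+)', re.IGNORECASE)

def hidden_text_styles(html, background="#ffffff"):
    """Return inline style strings whose text colour equals `background`."""
    flagged = []
    for style in STYLE_RE.findall(html):
        m = COLOR_RE.search(style)
        if m and m.group(1).lower() == background.lower():
            flagged.append(style)
    return flagged

page = '<body><p style="color:#ffffff;font-size:12px">keyword keyword</p></body>'
print(hidden_text_styles(page))  # the paragraph's style is flagged
```

It won’t catch colours inherited from stylesheets or text hidden behind images, but as a quick smoke test on templated pages it’s better than eyeballing the source.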
My advice: don’t just focus on your backlinks. If it’s time for a spring clean of your technical SEO strategy, go ahead and do it. If you’ve had development changes for SEO queued up since the Panda update, now would be a really good time to get them done. If you know that content is duplicated elsewhere or just plain in need of a rewrite, get it done.
Remove All the Bad Links
Here’s the SEO nirvana: a domain with a nice clean link profile, sitewides kept to a minimum, and lots of brand-centric anchor text variation from genuine, editorially awarded links. How often do you see that?
Let’s assume for a moment that link metrics exist that can definitively tell us what makes a link “toxic”. You’ve identified the offenders, categorised them and used lots of spiffy tools to get your data sorted. How much work is it going to take to remove them? And how? We use ODeskers, a lot of data collection and a lot of outreach to get the job done, but it’s tough going.
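If you do go down the triage route, the bucketing itself can be automated once you’ve pulled your link data. A hedged sketch follows, assuming each link comes as a record with hypothetical `trust`, `anchor` and `sitewide` fields; the thresholds and signals are illustrative assumptions, not a definitive toxicity formula:

```python
# Illustrative link triage: the field names, trust floor and money-term
# list are assumptions for the sketch, not published Google criteria.
def triage(links, trust_floor=20, exact_anchors=("payday loans",)):
    """Bucket links into keep / review / remove on simple heuristics."""
    buckets = {"keep": [], "review": [], "remove": []}
    for link in links:
        toxic_signals = 0
        if link["trust"] < trust_floor:              # weak domain metric
            toxic_signals += 1
        if link["anchor"].lower() in exact_anchors:  # exact-match money term
            toxic_signals += 1
        if link["sitewide"]:                         # repeated on every page
            toxic_signals += 1
        if toxic_signals >= 2:
            buckets["remove"].append(link["domain"])
        elif toxic_signals == 1:
            buckets["review"].append(link["domain"])
        else:
            buckets["keep"].append(link["domain"])
    return buckets

links = [
    {"domain": "a.example", "trust": 5, "anchor": "payday loans", "sitewide": True},
    {"domain": "b.example", "trust": 50, "anchor": "Acme Ltd", "sitewide": False},
    {"domain": "c.example", "trust": 10, "anchor": "Acme Ltd", "sitewide": False},
]
print(triage(links))  # a.example lands in remove, c.example in review
```

The point isn’t the particular weights – it’s that the categorisation step scales, leaving your human time for the hard part: actually getting the links taken down.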
Ultimately, link unbuilding is time lost to proper inbound marketing, and there are links that are pretty much impossible to remove. Occasionally you strike lucky by finding a group of inbound links owned by the same network (I managed to get ~2,000 directory links removed after discovering the contact details of the site’s network owner via ARIN).
Realign and Mix Up Your Inbound Anchor Text
Have you been overcooking your anchor text? Perhaps you’ve got lots of sitewide links using highly targeted terms. You’ll be able to see the sitewide linking domains easily in Webmaster Tools, and I’d suggest using a tool like Open Site Explorer or our anchor text analysis tool to work out whether there might be an “unnatural” linking pattern. If you have any ability to contact the sites linking to you with a view to realigning the links to something more branded, I’d say that’s a very worthy use of your time. We recently removed a lot of sitewides for a new client – it turns out they’d been paying for them unnecessarily. Guess what? No ranking change after they were removed. That saved them thousands of pounds a month.
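If you’d rather sanity-check the distribution yourself before reaching for a tool, the arithmetic is simple. A rough sketch follows, where the `brand_terms` set and the 40% threshold are my own assumptions for illustration, not anything Google has published:

```python
from collections import Counter

# Illustrative only: the threshold and "brand" matching are assumptions.
def anchor_report(anchors, brand_terms, threshold=0.4):
    """Return each anchor's share of all links, plus any single non-brand
    anchor whose share exceeds `threshold` -- a possible over-optimisation
    flag in the spirit of the "unnatural pattern" check above."""
    counts = Counter(a.lower() for a in anchors)
    total = sum(counts.values())
    shares = {a: n / total for a, n in counts.items()}
    flagged = [a for a, s in shares.items()
               if s > threshold and not any(b in a for b in brand_terms)]
    return shares, flagged

anchors = ["cheap widgets"] * 5 + ["Acme"] * 3 + ["acme.com"] * 2
shares, flagged = anchor_report(anchors, {"acme"})
print(flagged)  # "cheap widgets" holds half the profile and isn't branded
```

Feed it the anchor column from an Open Site Explorer or Webmaster Tools export and you get a quick read on whether one money term is dominating your profile.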
Review Where You’re Linking to
Are you linking to low quality, low trust sites, or linking out to sites that may have been penalised for selling links, or to domains buried deep in spammy blog networks? Then by association, your own trust levels may be quite low. We’ve known for a long time that links to obvious affiliate networks don’t do you any good, but since Penguin it might be wise to really review how and where your site links out. You can use IIS SEO Toolkit, Screaming Frog or Xenu to grab all of the external links on your site. From there it’s relatively easy to analyse the data in SEO Tools for Excel – low trust, spammy outbound links should be considered for removal.
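The extraction step those crawlers perform can also be approximated in a few lines of standard-library Python, which is handy for a quick one-page spot check. The domain names below are made up for the example:

```python
from html.parser import HTMLParser
from urllib.parse import urlparse

# A small stand-in for the crawler step above: collect off-site link
# targets from one page's HTML so the destinations can be reviewed.
class ExternalLinkCollector(HTMLParser):
    def __init__(self, own_domain):
        super().__init__()
        self.own_domain = own_domain
        self.external = []

    def handle_starttag(self, tag, attrs):
        if tag != "a":
            return
        href = dict(attrs).get("href", "")
        host = urlparse(href).netloc
        # Relative links have no host; same-host links aren't "external".
        if host and host != self.own_domain:
            self.external.append(href)

collector = ExternalLinkCollector("mysite.example")
collector.feed('<a href="/about">About</a>'
               '<a href="http://spammy-network.example/page">win</a>')
print(collector.external)  # only the off-site link is kept
```

For a whole site you’d still want Screaming Frog or similar, but for auditing a single template this gets you the outbound list in seconds.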
Submit a Reconsideration Request
The last response I saw to a reconsideration request submitted after the link quality warnings were sent out made it clear the quality analyst felt not enough had been done to deserve reconsideration. Whoops.
The advice on filing a reconsideration request is pretty clear: be transparent, report all of your activity, explain what you’ve done to rectify the situation and supply a comprehensive list of links for review. There’s no point trying to hide details if you’re at this stage; Google will probably catch you out and ignore your future requests.
Penguin-affected sites don’t receive a warning in Google Webmaster Tools, but if you’re certain you’ve acted in a complete and thorough manner to rectify the problems causing the penalty, this is a reasonable course of action.
Start with a New Domain and Realign the Links You Built
For huge sites this obviously isn’t an option. But what if you’re, say, a smaller affiliate site with 500 or so linking root domains? If you’ve been using a link building CRM like Buzzstream and you have all of your linking contacts in a database, you could consider realigning your good links and starting again. This is a last-resort approach, and I suspect you’d want to take into account a few caveats:
– Launch the new site entirely independently of the old one (host IP, WHOIS, new content)
– Don’t link the two sites together
– Don’t 301, we don’t know how the penalty is carried across at this stage
– Don’t cross domain canonical unless you have a safe testing ground to validate that the penalty is not transferred (again, don’t know at this stage)
– Don’t associate the sites via GWMT and Analytics
– Assume you’ll need to submit “updates” to any linking text in old guest posts
I’ve worked with sites that have pulled epic linkbuilding tasks out of nowhere and ranked (or re-ranked) well after 6 months. I’d be interested to hear what the community thinks of this idea (*ducks*).
Experiment With a New Sub-domain
We took on a client (an affiliate site) after they’d been hit badly by Panda. As a test, we created a set of new pages on a subdomain targeting a group of competing keywords, and they ranked perfectly well for the terms the main domain had been hit on. The handy thing about subdomains is that you can put test pages up to see what happens. In this particular case, the original pages were 301 redirected to the new subdomain pages, which now rank in similar positions to where the originals sat before the Panda update that affected them.
On that note, you could experiment with new pages “cleaned” via 301 or rel=”canonical” (and you should experiment!), but I really don’t think that’s the right way to recover your rankings.
Don’t Panic, Consider Your Options Carefully
Every time I see a site owner suffer temporary ranking problems, they have a tendency to panic and start changing everything. That’s not a good way to start. Right now my best advice is to gather a full and complete view of the situation. Identify the issues that you know will be causing problems, be they link-based or technical. Distribute the tasks and effect change. Going to Google as your first port of call is probably not the wise choice, unless you absolutely feel you’ve been wronged by Penguin. If that’s the case, here’s the link to the form. Good luck with that.