The Most Contentious Issues in the SEO Industry


This year’s “Search Engine Ranking Factors 2009” from SEOmoz introduced a new metric alongside each ranking factor: the standard deviation of the contributors’ answers, expressed as a percentage that shows how much the contributors agreed or disagreed on each factor.
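To make the idea concrete, here is a hypothetical illustration (not SEOmoz’s actual methodology): given each contributor’s rating for a factor on a 1–5 scale, the standard deviation captures how much they disagree, and dividing by the maximum possible spread turns it into a rough “contention” percentage. The function name and scale are assumptions for the sketch.

```python
from statistics import pstdev

def contention_pct(ratings, scale_min=1, scale_max=5):
    """Disagreement as a percentage of the maximum possible spread.

    The maximum spread occurs when contributors split 50/50 between the
    two extremes of the scale, giving a population std dev of
    (scale_max - scale_min) / 2.
    """
    max_spread = (scale_max - scale_min) / 2
    return round(100 * pstdev(ratings) / max_spread, 1)

print(contention_pct([3, 3, 3, 3]))  # total agreement -> 0.0
print(contention_pct([1, 5, 1, 5]))  # maximal disagreement -> 100.0
```

A factor everyone rates the same scores 0%, while a factor that splits the panel between the extremes scores 100%.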

Interestingly, the most contended factors were all negative ranking factors, activities likely to negatively impact your ranking:

[Chart: the most contentious ranking factors, all negative]

Why were those negative factors more contentious than positive factors?

If you ask 72 known SEOs in the community, most will be working on “ethical” campaigns for their clients. They are highly unlikely to be recommending or engaging in activities involving hidden text or user-agent detection. Fewer people on the contributor list have used or tested those “black hat” techniques than have employed (and tested) the positive factors, so the negative-factor data is likely to be less reliable than the positive data.

The exceptions

In some cases, though, you may choose to employ similar techniques for legitimate reasons, a/b testing for example. The contributor comments did take this into account, and Tom Critchlow summarised it nicely:

A lot of these factors depend on intent. For example, cloaking by user agent can be fine so long as the intent is pure and many large sites get away with it and have done for years. Also, a fair number of the link factors (such as manipulative bait and switch campaigns) are more likely to have 0 value than negative value. We’ve seen Google preferring to de-value spammy techniques/links rather than apply penalties for them where possible.

I agree: some of the negative factors in the list above are sometimes applied without any thought that they could be interpreted as search engine spam. We’ve also seen Google apply page-level penalties for certain activities, and I can think of a few examples of sites using CSS-hidden H1s / H2s that flew under the radar for ages. In my own tests of some of those items, there were plenty of occasions when neither the page nor the site was banned, but the technique did not necessarily enhance the rankings either.
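For what it’s worth, user-agent detection of the kind Tom describes is trivial to implement, which is partly why the question comes down to intent rather than mechanism. A minimal, illustrative sketch follows; the crawler token list is an assumption and deliberately not exhaustive:

```python
# Illustrative only: serving different content by user agent.
# These substrings are common bot tokens, not a complete list.
CRAWLER_TOKENS = ("googlebot", "bingbot", "slurp")

def is_crawler(user_agent: str) -> bool:
    """Crude check for a search engine crawler via its user-agent string."""
    ua = user_agent.lower()
    return any(token in ua for token in CRAWLER_TOKENS)

def pick_variant(user_agent: str) -> str:
    # Crawlers get the canonical page; humans may see an a/b test variant.
    # Whether this is benign testing or cloaking hinges on intent: the
    # crawler should see substantially the same content a user would.
    return "canonical" if is_crawler(user_agent) else "experiment"

print(pick_variant("Mozilla/5.0 (compatible; Googlebot/2.1)"))  # canonical
print(pick_variant("Mozilla/4.0 (compatible; MSIE 8.0)"))       # experiment
```

The code is the easy part; the same ten lines power both a harmless split test and a spammy cloak, which is exactly why the engines judge the intent and the content delta rather than the detection itself.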

At the end of the day, it’s just easier to do the job right without risking your domains. Be mindful of the negative factors and the damage they are (highly) likely to do to your site. If you are tempted by this stuff, do your testing first :-)
