Survival SEO Strategies – The May Day Update

Several very talented SEOs have put a lot of effort into helping the industry understand what Google’s most recent algorithm update did to search quality and search engine traffic. It seems that gone are the days when domain authority ruled the roost: if you’re pushing out a lot of pages, and those pages are a touch thin content-wise, you’d better have a great strategy for getting links to those pages, or for creating a richer user and content experience.

Flat growth or a sudden reduction in long tail traffic

Has your traffic been hit by the May Day update? Perhaps it’s been flat, with below-expected growth of late, or it might have dropped outright, specifically in the long tail. If that’s the case, you might want to check how hard your deepest content types or subfolders have been working for you.

With all of that in mind, what sort of process should you follow with your site analytics and what types of changes can you make to claw some of that traffic back? Most importantly, how can you adjust your SEO strategy to help avoid updates like May Day hurting you again?

Get a good sense of depth, authority and employment

Experimenting with methods to measure link distribution, indexation and employment by content type (subfolder) in your site architecture might be a smart idea. By “employment” I mean “how hard is this content group working for me?”. Looking at the ratio of total pages in a subfolder to the number of pages in that folder receiving at least one entry from Google, what did your employed pages look like before and after May Day? How many keywords were driving traffic to your site daily before and after the update? Can you see weaker areas of your site working less well?

How hard is this content working for me?

Some of the experiments I’ve been working on involve collecting data that makes it possible to answer questions like the ones above, but they rely on a little patience and problem solving along the way, particularly when collecting the data with automated tools.

If you’re interested in getting into the “nuts and bolts” of indexation depth, authority and employment metrics, you’re going to need some data:

  • Top linked-to pages on your domain
  • Total root domain links by content type (subfolder)
  • Total pages that receive at least one entry from Google over a period of time
  • % Indexation by subfolder (The percentage of total pages in a subfolder that receive at least one entry over a set period of time)
  • PageRank by page
  • mozRank by page
  • Depth in architecture by page from Xenu
  • Internal links by page from Google Webmaster Tools

These are the kinds of values you can collate and compare to get a better understanding of potential weak areas in your site architecture – creating data tools, if you will, a topic you’ll notice I’m covering a lot in the coming weeks. Here’s an example of how links by subfolder, total pages and the count of total pages receiving at least one entry can work together to show you how well employed sections of your site may be.

Charting Indexation
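As a rough sketch of how this kind of “employment” report might be assembled – the field names (`url`, `entries`, `root_domains`) and the data shape here are my own assumptions for illustration, not the output of any particular tool:

```python
from collections import defaultdict
from urllib.parse import urlparse

def employment_by_subfolder(pages):
    """Summarise how hard each subfolder is working.

    `pages` is assumed to be an iterable of dicts with:
      url          - full page URL
      entries      - Google entries to the page over the period
      root_domains - count of linking root domains
    """
    stats = defaultdict(lambda: {"pages": 0, "employed": 0, "root_domains": 0})
    for page in pages:
        path = urlparse(page["url"]).path
        # Treat the first path segment as the content type / subfolder
        segments = [s for s in path.split("/") if s]
        folder = "/" + (segments[0] if segments else "")
        s = stats[folder]
        s["pages"] += 1
        s["root_domains"] += page.get("root_domains", 0)
        if page.get("entries", 0) > 0:
            s["employed"] += 1  # page received at least one entry
    # % indexation: share of the folder's pages receiving at least one entry
    for folder, s in stats.items():
        s["pct_indexation"] = round(100.0 * s["employed"] / s["pages"], 1)
    return dict(stats)
```

Run against an export of your analytics and link data, a folder with thousands of pages but a low `pct_indexation` is exactly the kind of “unemployed” section worth investigating first.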

I know there’s a problem, what should I be working on right now?

The biggest impact (I sense, from talking to friends in the SEO sphere) has been felt on page types that are a little too “boilerplate”: category pages, listings, product listing pages, white label sites “customised” with an advertiser logo, or any type of site that might be using excessive near-duplication – pages that tend to vary H1s and some meta tags for long tail traffic. This tactic seemingly works less well today if you don’t have direct links to those pages.

Uniqueness and authority by page appears to be the key formula. Dave Davis made an important point on this, essentially stating that it’s far from the easy approach, but that focusing on value-add pages that attract links naturally is the way to go. If you’ve already been following this strategy, you’ll probably be one of the folks who hasn’t had a problem with long tail traffic.

Modify your existing pages to add unique value

Remember the blank review pages issue raised by Matt Cutts over a year ago? Obviously, even back then, Google were losing patience with pages that added little value to their vision of the web. Regardless of the role I believe external links (authority from external sources) can play in the rankings of even the crappiest of webpages, there’s an important quick win: make your pages more unique.

  • Index your own tweets and play the text back on pages that are most relevant
  • If you’re advertising information on locations, venues or events, pull through their Twitter stream
  • Make your product, design, UI and user experience awesome
  • Add reviews to the pages, make it fiendishly easy for users to comment – bin your lengthy registration process to allow comments
  • Create video and rich media – make it really easy to embed any data / content you offer on each page on a 3rd party website (Salaries, Price comparison, search boxes, latest odds, celebrity news, whatever – think widgets and always make it easy to share)
  • Get quick and easy links to each page using plugins like Tweetmeme
  • Mashups can work, but it’s best to use syndicated snippets of your own content (think guide snippets, related blog post articles)
  • Use services to create snippets of genuinely custom content (Mechanical Turk for example)
  • Reward your audience for participation, create value add programs that make people want to talk about you
  • Make it quick and easy for your SEOs to add unique content to even the most templated of pages – even a simple “You might be interested in…” box, where a few lines of text and a link can be added, makes a world of difference to those boilerplate pages
  • Think a little outside the box – recently we added a custom text snippet at the end of each breadcrumb on a page template, adding just a little extra length to the long tail
  • Try to apply dynamic variation to otherwise boilerplate navigational section headings
  • If you’re working with data feeds as an affiliate, think about your automated content / data feed strategy and read this presentation by Tom
  • Use your deepest, archived content wisely
  • Take a look at the site you worked on a few years ago – if some old content has been ditched (rewritten guides, for example), is there any of this stuff you can turn into snippets and resurrect?

Think about your site architecture

There are heaps of ways to skin this cat – but the bottom line is: how well are you distributing authority down and across your site architecture? It’s perfectly natural for content pages lower down your site architecture to earn fewer links, but leverage the ones that do to spread the love into places that are link-love starved.

  • Keep your site architecture flat
  • Use DHTML / CSS to create more links on a page without harming the user experience
  • Cross link between content silos
  • Use your most linked to pages
  • If you’re brave, create a heat map of your most linked to areas on the site and work from that
  • Use similar, related, most frequently visited, top pages and most commented suggestions to improve your internal linking
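The donor/target idea in the list above can be sketched in code. This is a hedged illustration only: the field names (`url`, `inbound_links`, `depth`) and the thresholds are my own assumptions, and a real version would also check topical relevance before suggesting a link.

```python
def suggest_internal_links(pages, donor_min_links=10, max_suggestions=5):
    """Pair link-rich donor pages with link-starved deep pages.

    `pages` is assumed to be a list of dicts with:
      url           - page URL
      inbound_links - count of external linking root domains
      depth         - clicks from the homepage (e.g. from a Xenu crawl)
    """
    # Donors: your most linked-to pages, strongest first
    donors = sorted(
        (p for p in pages if p["inbound_links"] >= donor_min_links),
        key=lambda p: p["inbound_links"],
        reverse=True,
    )
    # Targets: pages with no external links, deepest (most starved) first
    targets = sorted(
        (p for p in pages if p["inbound_links"] == 0),
        key=lambda p: p["depth"],
        reverse=True,
    )
    suggestions = []
    for donor, target in zip(donors, targets):
        suggestions.append((donor["url"], target["url"]))
        if len(suggestions) >= max_suggestions:
            break
    return suggestions
```

Each suggested pair is a candidate for a contextual link (or a “related pages” module) from the donor page down to the starved page.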

I don’t think May Day introduces anything new – it simply reinforces what SEOs should have been doing all along: working to add value. Dave made the point well: “Google is now seeing individual documents as their own entities a lot more” – so it’s time for us to start doing the same.

Categories: Research, Technical

8 thoughts on “Survival SEO Strategies – The May Day Update”

  1. Long overdue, don’t you think? It stands to reason that a page that had original thought go into it will be more likely to have valuable content than a cookie-cutter page generated by a script.

  2. Great article/post. I think SEO has almost come in a circle which is why a lot of stuff in this article is so relevant. Originally people just built pages, and whatever had the best content ranked best. Then people started pulling content tricks. Google then originally started valuing backlinks, and over the years people have tried all sorts of backlinking tricks. As Google develops more and more ways to ignore some of these, quality content and quality backlinks have again come to the fore.


  3. Evan says:

    Thank you for the point about blog entries. I find it quite hard, in my line of work on my site that gets a couple of hundred hits a day, to write things that engage really well and quickly. It’s mainly due to the nature of the piano business: there aren’t millions of people looking for them at any one time, and if there were, perhaps we would get millions of hits. I’m finding it hard to write something that will rise quickly and engage quickly due to visitor volumes. Any tips would be really appreciated. Evan

  4. Cebu blogger says:

    A lot of people say that onsite optimization doesn’t matter; they go for the link building strategy. But no matter how small the percentage for onsite optimization is, it still counts. Thanks for sharing bro

  5. ganardinero says:

    Your insights are great but it raises a somewhat odd thing… Since Google is the search engine of choice for everybody, then following their guidelines is a very important issue in the building of our sites.

    About this “thin content” thing, there are pages with categories, links to products and even ads that give the user what he/she is looking for. Considering these pages thin content, and making us modify them even if they already serve their purpose, is a little, say… imposing.

    But well… what are you going to do?

  6. Wool Overs says:

    Great article – it would be good to collate all the Google updates into a timeline, so we can see what has changed at each update!

  7. yes says:

    Nice article

  8. Very insightful article. Google is such a pain to study, updating the way it works all the time. But that’s what makes it Google – the best, and hard to crack.
