I’ve written a few articles on Google’s Hummingbird update before, and in them I noted that one of the primary effects of Hummingbird is that Google is now using ‘microdata schema’ to parse your content. The upshot is that Google seems to really want you to spend a boatload of time and energy filling your content with microdata that explains exactly what each piece of data is.
For example, if you write a book review, Google claims to want you to put metadata in your content that says, in non-technical terms, “This is a book review,” “this is the title of the book,” “this is the author of the book,” and so on. This is great for Google, because it gives their algorithms lots of fun things to hook on to — but it’s a pain in the butt for us, because seriously, who wants to put in all that work?
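To make that concrete, here's a rough sketch of what that markup looks like for a book review, using the schema.org Review and Book vocabularies (the titles and names below are placeholders, not from any real page):

```html
<!-- A book review marked up with schema.org microdata (illustrative sketch) -->
<div itemscope itemtype="https://schema.org/Review">
  <!-- "This is a book review" -->
  <span itemprop="name">My Review of an Example Novel</span>
  <div itemprop="itemReviewed" itemscope itemtype="https://schema.org/Book">
    <!-- "This is the title of the book" -->
    <span itemprop="name">Example Novel</span>
    <!-- "This is the author of the book" -->
    <span itemprop="author">Jane Doe</span>
  </div>
  <div itemprop="reviewBody">The actual review text goes here...</div>
</div>
```

Every visible piece of content gets an extra attribute or two spelling out what role it plays, which is exactly where that boatload of time and energy goes.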
Well, Hummingbird has been out for almost 10 months now, and the adoption rate of microdata markup is really quite low. The highest estimates I’ve seen put it around 20%; conservative estimates are half that. More importantly, Google continues to return almost entirely microdata-free results even for web searches that seem custom-made for microdata, like the aforementioned book reviews.
Hummingbird, then, seems more like a failed experiment than a Google update we all need to be paying attention to. Sure, it probably helps those pages that use it relative to those that don’t — but a simple glance at the HTML of the top X search results of any given query will tell you Google doesn’t give it that much weight.
SEO Success In a Hummingbird-brained World
The things that do get weight are the things we’ve always been told get weight: authority, relevance, domain age, user experience, and so on. There’s no shocking news there. What Hummingbird did tell us, however, is that Google is interested in determining the relationship between words.
We’ve known this for years, too, really: the term Latent Semantic Indexing is essentially a technical way of saying “figuring out how words relate.” But let’s expand a bit on that and talk about the one thing we know we can definitely affect about our sites: the keywords.
Keywords have a lot of attributes, and SEO people pay attention to most of them. Competition, PPC value, searches per period, and other numbers consume a lot of the SEO researcher’s attention. But once the keyword gets handed to an SEO writer, the assumption is basically that it doesn’t matter how it gets used, as long as it’s in the text. That couldn’t be further from the truth.
That’s because Google’s algorithms examine every phrase of every sentence that they parse, and when they do, they look for specific relationships between those phrases (and words). The number of ways in which words can relate is infinite, but there are a few very common ones we can talk about. For example, the “is a” relationship — “A bicycle is a vehicle” is a relationship that Google understands, so talking about bicycles on a page about vehicles (or the other way around) comes across as natural to Google. The “subject-verb-object” relationship is another that Google groks; they even have a patent on an algorithm that processes that relationship.
In other words, when SEO writers go to great lengths to make a keyword fit the context it’s couched in, they’re doing more than just making it ‘human-friendly.’ They’re also making it search-engine friendly, because they’re doing informally what Hummingbird tried to force on everyone: they’re making it easy for the algorithm to understand what’s going on.