If you didn't catch the news last week, Google announced that they have made the single largest change to their search algorithm since 2009. They also didn't bother telling anyone for the first month, so if you're a Google user, you've been using Hummingbird for a while now. If you're anything like me, your first thought may have been "what will this mean for SEO?" Clearly, for anyone operating or building websites, what Google thinks of your site is one of the most important opinions out there.
So what does this mean? How does Google Hummingbird work? Well, they haven't really told us. What we do know is that Hummingbird is primarily a change to how search queries are understood, rather than to how pages are indexed. The buzz around the SEO world indicates that there hasn't been much change to how websites are ranked, so there's no need to worry that you'll have to relearn everything you know about search engine optimization.
The major theme of Google's announcement, though, is that Google is working hard to understand the meaning of words. Rather than just matching keywords in your search, Google will try to understand the context of all the words in a question, and even take into account previous questions you've asked. The first big part of this is that they can better understand natural-language questions, such as those you might speak into a mobile phone. The other change is an even greater emphasis on semantic data. This is not a new theme for Google: they have long been a proponent of semantic web technologies, and have been building up their Knowledge Graph since 2012. The Knowledge Graph allows them to link structured data and better understand meaning.
The latest release even adds a comparison tool, which can show the difference between butter and olive oil, as Google demonstrated in their announcement. We even have a simple way to compare apples with oranges now (which, by the way, if you're counting carbs, means you should eat the orange).
With Hummingbird, it seems they are even better positioned to make use of deep, structural, semantic data.
Drupal is structured data
The exciting thing about the future of search, and the future of the web, for those of us working with Drupal is that Drupal is literally built for structured data. While many content management systems focus on creating pages to be displayed on the web, Drupal is based on individual pieces of content, with fields attached to that content. Because of this, we don't have huge blobs of unstructured content; instead, we can create clean, well-defined chunks of information that make sense both to content creators and to search engines.
It's this architecture of clean, structured data underlying Drupal which has allowed the community to embrace the semantic web from early on. Drupal 7 was released with RDF support, which is one form of semantic data. Contributed modules have gone further as standards have evolved, supporting microdata and Schema.org. Schema.org was created by Google and other search engines and has become the de facto standard for semantic markup on the web. The Drupal community quickly embraced this standard and released the Schema.org module, and Schema.org support has now been rolled into Drupal 8 core.
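To make this concrete, here is a rough sketch of what Schema.org microdata can look like on a rendered piece of content. The element structure, property names, and values below are illustrative, not the exact markup any particular Drupal module emits; the real output depends on how you map your content type's fields to Schema.org properties:

```html
<!-- Illustrative Schema.org microdata for an article.
     The itemtype/itemprop attributes tell search engines what each
     field *means*, not just what it says. Values are made up. -->
<article itemscope itemtype="http://schema.org/Article">
  <h1 itemprop="headline">Google Hummingbird and Structured Data</h1>
  <span itemprop="author" itemscope itemtype="http://schema.org/Person">
    By <span itemprop="name">Jane Author</span>
  </span>
  <time itemprop="datePublished" datetime="2013-10-01">October 1, 2013</time>
  <div itemprop="articleBody">
    <p>Each Drupal field can map onto a Schema.org property like this.</p>
  </div>
</article>
```

Because each Drupal field is a discrete piece of data, mapping a field to an `itemprop` like this is a natural fit: the search engine no longer has to guess which string is the headline, the author, or the publication date.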
The future is bright
All of these changes are good for the web. When search engines can better understand your content, they can better answer people's questions. If you're building your websites with Drupal, you're on a solid foundation to create structured, well-formed data that Google and others can understand. If you're on Drupal 7, go ahead and check out the Schema.org module so Google can understand your content better today. If you're evaluating content management systems, take Google's latest announcement as yet another reminder that structured data is the way of the future.
Finally, keep an eye on your Google searches to see better identification of "things not strings", and do everything you can on your sites to help the search engines understand the difference.