Say Hello to Google’s New Best Friend, BERT

sesame street character Bert at a children's birthday party

Did you know that Google’s newest search algorithm, BERT (Bidirectional Encoder Representations from Transformers), will handle one in ten of all English search queries? Heads up, website content writers! This is the most fundamental change to Google’s search platform since the release of RankBrain back in 2015. That’s because Google’s new algorithm understands natural language.

BERT officially began its tour of duty this week, handling English-language queries. Although a global live date hasn’t yet been set, it’s expected to come online progressively. As each country and language meets internal beta-testing metrics, the collection of algorithms and processes known as BERT will grow to encompass more and more queries. Does the name Bidirectional Encoder Representations from Transformers hurt your brain? Web designers, bloggers, and content writers don’t speak machine language, either, but we are expected to understand SEO, so let’s take a deeper dive into what exactly all this means.

Who is BERT? What’s He Really Like?

BERT is, at its core, a powerful neural network that processes natural language. A neural network is a system of interconnected equations that attempts to capture the complex relationships between variables in a way loosely inspired by the human brain. Although the technical details run deep, networks of this sort can be described using a couple of talking points.
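To make that “system of equations” a little more concrete, here is a toy sketch of a single artificial neuron, the basic building block these networks are made of. The inputs, weights, and bias below are made-up numbers chosen purely for illustration; a real model like BERT chains millions of units like this together.

```python
import math

# Toy sketch of one artificial neuron (illustrative numbers only).
def neuron(inputs, weights, bias):
    # Take a weighted sum of the inputs, plus a bias term...
    total = sum(i * w for i, w in zip(inputs, weights)) + bias
    # ...then squash it with a sigmoid so the output lands between 0 and 1.
    return 1 / (1 + math.exp(-total))

activation = neuron([0.5, 0.8], [0.9, -0.3], 0.1)
print(0.0 < activation < 1.0)  # True: the neuron emits a bounded signal
```

Adjust the weights and the same inputs produce a different signal, which is exactly how a network “learns”: training nudges those weights until the outputs line up with reality.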

Neural networks can readily adapt to input criteria that are constantly changing. Here’s a perfect example of why this flexibility is revolutionary and so important. Usually, you call your Labrador retriever in for lunch by blowing a whistle, but for some reason, you just can’t find it today. Instead, you yell, “Daisy, it’s time for lunch. Come and get it!” It turns out that your dog, smart as a whip, knows when it’s lunchtime, so you could call her with the whistle, your voice, a clap of your hands, or even just the noise of pouring the kibble. (Truthfully, she’s already drooling at the back door.) A lot of us content creators and social media folks work from virtual offices at home, so I know you know what I’m talking about.

The second thing to understand about BERT, and neural networks generally, is that the output stays consistent despite major changes to the input. It doesn’t matter whether we use the whistle or our voice to call the dog; the result is the same: your pup comes inside from the backyard, enjoys her lunch, and then takes a nap on the sofa as she does every day. Essentially, BERT (and algorithmic processes like it) helps our machine counterparts understand human language a little more like, well, humans do.

BERT’s Artificial Intelligence: Quick Look ‘Under the Hood’

What does this look like in practice, though? BERT helps the Google search query engine better understand the overall context and subtle nuances of words in search strings. The artificial intelligence under the hood is then better equipped to match those queries with results that are — surprise — actually relevant. And from a content writing perspective, this couldn’t be more important.

At a recent press event, Google used this excellent example to demonstrate a real-world use case. In the query “2019 brazil traveler to usa need a visa,” the word “to” and its relationship to the other words are essential to parsing the meaning of the phrase. Before BERT arrived on the scene, Google could not weigh the importance of that connection. Unfortunately for the user, the search giant would have returned results about U.S. citizens traveling to Brazil.

A Google representative explained, “With BERT, Search is able to grasp this nuance and know that the very common word ‘to’ actually matters a lot here and we can provide a much more relevant result for this query.”
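BERT itself relies on far more sophisticated transformer attention, but a toy sketch can show why order-blind keyword matching fails on exactly this kind of query. The comparison below is illustrative only: it contrasts a bag-of-keywords view with one that keeps even minimal word order.

```python
from collections import Counter

# Two queries built from identical words, with opposite meanings.
query_a = "2019 brazil traveler to usa need a visa"
query_b = "2019 usa traveler to brazil need a visa"

# An order-blind keyword view (pre-BERT style matching) sees no difference:
print(Counter(query_a.split()) == Counter(query_b.split()))  # True

# Keeping even minimal context (adjacent word pairs) tells them apart,
# because "brazil traveler" and "to usa" only appear in the first query.
def word_pairs(text):
    words = text.split()
    return list(zip(words, words[1:]))

print(word_pairs(query_a) == word_pairs(query_b))  # False
```

The same words, a different meaning, and order is what reveals it: that relationship between “to” and its neighbors is precisely what BERT is built to capture.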

What Does BERT Have to Do With My Business?

Well, that’s an excellent question. Any change, especially one as fundamental and far-reaching as BERT, will have major impacts in several areas for many years to come. For one, keywords no longer have to be as specific as they once were. Their exact order also appears to matter less. This is a good thing for your business and for us technophiles in the website content world, if you optimize for the update! Because these algorithms are designed to understand language as people naturally use it, comprehensive SEO strategies will need to include non-native speakers in their outreach. There’s a whole world of people out there speaking other languages alongside English, and they are potential clients!

While the SEO-eggheads are left scratching their heads, why not take advantage of our unique knowledge base? Not only does iwebcontent stay on top of the latest trends and algorithm updates, we understand them and can put them to immediate good use. Book a call today and let’s fine-tune your keyword research strategy for maximum performance in today’s BERT-centric online world. We practically wrote the book on the best practices for SEO optimization. Oh wait, we actually did write a book!

thumbnail image of iwebcontent's eBook on SEO website ranking

Click the above image to download your free copy of 8 Keys To Make Your Website Rate (and Rank) SEO eBook!
