...

October 28

What Does Google’s BERT Mean for Marketers?


Google is back at it again, last week releasing a new update to its search engine algorithm that it claims will help it better understand the nuance and context of words in searches and match those queries with more relevant results. And it’s a big one: Google expects the new algorithm to affect roughly one in 10 English-language searches in the U.S., with more languages and locales to follow.

It’s all based on a neural network technique called “Bidirectional Encoder Representations from Transformers,” or BERT for short, which also marks the first time Google is using its latest Tensor Processing Unit (TPU) chips to serve search results. By weighing the keywords in a query against the content of the web pages in Google’s index, the idea is to build a better understanding of what specific words actually mean in the context they’re used.

Google first introduced BERT last year, open-sourcing both the code for its implementation and a set of pre-trained models. Transformers are one of the more recent developments in machine learning. They work especially well for data where the sequence of elements matters, which makes them well suited to natural language and, hence, search queries.
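If you want to poke at BERT yourself, those pre-trained models are easy to load. Below is a minimal sketch that turns a search query into per-word contextual vectors; it assumes the third-party Hugging Face transformers library rather than Google’s original TensorFlow release, but the checkpoints are the published ones.

```python
# A minimal sketch of loading BERT's published pre-trained weights.
# Assumes the third-party Hugging Face `transformers` library
# (pip install transformers torch); Google's own open-source release
# is TensorFlow code, but the checkpoints are the same.
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

# Encode a search query; BERT returns one context-aware vector per token.
inputs = tokenizer("2019 brazil traveler to usa need a visa", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # e.g. torch.Size([1, 10, 768])
```

Each token comes back as a vector shaped by the words around it, and that per-word “context” is what the rest of this post is about.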

How BERT Works

Google’s explanation…

“Last year, we introduced and open-sourced a neural network-based technique for natural language processing (NLP) pre-training called Bidirectional Encoder Representations from Transformers, or as we call it–BERT, for short. This technology enables anyone to train their own state-of-the-art question answering system. 

“This breakthrough was the result of Google research on transformers: models that process words in relation to all the other words in a sentence, rather than one-by-one in order. BERT models can therefore consider the full context of a word by looking at the words that come before and after it—particularly useful for understanding the intent behind search queries.

“But it’s not just advancements in software that can make this possible: we needed new hardware too. Some of the models we can build with BERT are so complex that they push the limits of what we can do using traditional hardware, so for the first time we’re using the latest Cloud TPUs to serve search results and get you more relevant information quickly.”
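You can see that bidirectionality for yourself through BERT’s core training task, masked-word prediction: the model fills in a blank using the words on both sides of it. A quick sketch, again assuming the Hugging Face transformers library:

```python
# Sketch: BERT's masked-word objective in action. The model predicts
# the blank using words on BOTH sides of it; that is what
# "bidirectional" means in practice. The library choice (Hugging Face
# `transformers`) is an assumption, not part of Google's announcement.
from transformers import pipeline

fill = pipeline("fill-mask", model="bert-base-uncased")

# The words AFTER the blank steer the prediction as much as those before.
for guess in fill("She drove [MASK] the pharmacy to pick up a prescription."):
    print(guess["token_str"], round(guess["score"], 3))
```

Change the words after the blank and the top guesses change with them; a model that only read left to right would have nothing but “She drove” to go on.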

Examples of BERT in Action

OK, that’s a lot of detail. What does it all mean in the real world?

The short story is that the Google search engine will now, by applying BERT models to both ranking and featured snippets, be able to do a much better job connecting users with truly relevant information. Says the company: “In fact, when it comes to ranking results, BERT will help Search better understand one in 10 searches in the U.S. in English, and we’ll bring this to more languages and locales over time.”

In practice, this means Search pays attention to every word in a query, including the small connecting words that older keyword matching tended to ignore.

For instance, a search for the phrase: “2019 brazil traveler to usa need a visa” hinges on the word “to” and its relationship to the other words in the query. It’s about a Brazilian traveling to the U.S., and not the other way around.

Previous result: Pages about U.S. citizens traveling to Brazil, because the algorithm couldn’t account for what the word “to” contributes to the meaning.

BERT result: By catching that nuance, Search can return far more relevant pages about Brazilians traveling to the U.S.

Let’s look at another search example…

Consider the phrase: “can you get medicine for someone pharmacy.”

Previous result: A link to a 2017 MedlinePlus article about getting a prescription filled.

BERT result: Google’s search engine now shows a 2002 article from the Department of Health and Human Services about how to have a friend or family member pick up the medicine on your behalf. It gets the nuance.
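Under the hood, what makes both examples work is that BERT assigns the same word a different representation depending on its neighbors. Here’s a hypothetical sketch comparing the vector for “to” in the two directions of the Brazil query (the vector_for helper is my own naming, and the library choice is again an assumption):

```python
# Hypothetical sketch: the same preposition gets a different contextual
# vector depending on the rest of the query. Assumes the Hugging Face
# `transformers` library (pip install transformers torch); `vector_for`
# is an illustrative helper, not part of any Google API.
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

def vector_for(query, word):
    """Return BERT's contextual embedding for `word` inside `query`."""
    inputs = tokenizer(query, return_tensors="pt")
    token_ids = inputs["input_ids"][0].tolist()
    position = token_ids.index(tokenizer.convert_tokens_to_ids(word))
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]
    return hidden[position]

a = vector_for("2019 brazil traveler to usa need a visa", "to")
b = vector_for("2019 usa traveler to brazil need a visa", "to")
print(torch.cosine_similarity(a, b, dim=0))  # < 1.0: context changed "to"
```

The similarity should come back noticeably below 1.0: swap the two countries and “to” no longer means the same thing to the model, which is exactly the nuance the old keyword matching missed.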

How to Use BERT in Marketing

As with every Google search algorithm update, BERT’s true impact on the industry remains to be seen. But there’s reason to hope this technology will do a better job of bringing interested users into contact with content that genuinely helps them.

And that’s a very good thing.

Particularly for longer, more conversational queries, or searches where prepositions like “for” and “to” matter a lot to the meaning, Search will be able to understand the context of the words in your query. You can search in a way that feels natural for you.

That means search results that feel more conversational and mirror the way people actually talk. Conversational and voice-driven queries have made up a growing share of search traffic in recent years, so this is a big step in the right direction, one that should clear up the confusion those nuances used to cause.

Questions? Thoughts? Hit me up at tim@wearelayup.com and let’s figure out what to do about BERT as it relates to your own outreach efforts.


Tags

bert, google, google bert, layup content, search engine, search engine optimization, seo

