How We're Handling Google's BERT Update

Posted by Emery Pearson on Dec 16, 2019 4:40:32 PM


In October, Google announced one of the more exciting (or terrifying, depending on who you ask) updates in recent search engine history: BERT.

BERT is not the name of a yellow puppet who bunks with his pal Ernie (at least, not in this context), but is instead an advanced natural language processing model that uses machine learning to provide more relevant search results for internet users.

BERT, or Bidirectional Encoder Representations from Transformers, is also an open-source research project that computational linguists all over the world have been working on and will keep refining. This means BERT isn't a one-off update that only affects searches today. It is an ongoing project that will continue to evolve and learn in order to give searchers what they want, when they want it. After all, that's been Google's mission all along.

What We Know About BERT

There's plenty of useful information out there about the specifics of BERT (check out this recap of Dawn Anderson's Search Engine Journal webinar for an exceptionally smart overview), so we won't go into too much detail here, but we will cover a few things you should know about it.

First, BERT is deeply bidirectional, meaning it reads a whole string of words at once rather than one word at a time, and it was pre-trained on the text of Wikipedia. In practical terms, this means BERT can understand the context of a query based on a word and the words surrounding it. That may not sound like much, but it's actually a pretty big deal: historically, machines have had a hard time learning context or distinguishing the subtleties of language.

English can be very confusing, with so many words that have similar meanings, words that are spelled the same but mean very different things, and words whose implications change dramatically based on context. BERT is a way to solve some of those tricky word problems.
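If you'd like to see that context-sensitivity in action, here's a minimal sketch using the open-source Hugging Face transformers library (our illustration, not anything from Google's search stack). It feeds two sentences containing the word "bank" into a pre-trained BERT model and compares the vectors BERT assigns to that word in each sentence:

```python
# Minimal sketch: the same word gets different BERT vectors in
# different contexts. Assumes the "transformers" and "torch"
# packages are installed; the model and sentences are our choices.
import torch
from transformers import BertModel, BertTokenizer

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()

sentences = [
    "I deposited cash at the bank.",       # financial institution
    "We had a picnic on the river bank.",  # edge of a river
]

vectors = []
for text in sentences:
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    # Locate the token "bank" and grab its contextual hidden state.
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
    idx = tokens.index("bank")
    vectors.append(outputs.last_hidden_state[0, idx])

# Identical vectors would score 1.0; these land well below that,
# because each "bank" is shaped by the words around it.
score = torch.cosine_similarity(vectors[0], vectors[1], dim=0).item()
print(f"Cosine similarity between the two 'bank' vectors: {score:.2f}")
```

An older keyword-matching approach would treat those two "bank"s as identical. This kind of distinction is what lets BERT match a query to genuinely relevant pages.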

What does that mean for searchers? BERT can help find the best results for a given query based on semantic context. If you're curious about what that looks like, Google's blog post gives a few examples of specific searches that BERT improves. 

Here are a few other key takeaways:

  • BERT initially affected about 1 in 10 English-language searches in the U.S., according to Google
  • It rolled out globally in early December 2019
  • It does not replace RankBrain, Google's machine-learning AI algorithm

What We're Doing About BERT

So then, what are we doing about this new algorithm? Well, nothing, actually. 

There's no way to optimize for it, because it's designed to better understand queries. More specifically, BERT is for the people, not for the webmasters. The websites that will benefit from BERT are those with content that makes sense to the human being seeking information. Keyword stuffing, nonsensical or poorly written content, and other old means of "optimizing" are exactly the things Google's algorithms will filter out.

What does that mean for your website? It means that if you're already focused on what your target audience wants (rather than what you believe search engines want), you're already in good shape. 

We're not doing anything about BERT (except maybe a little I-told-you-so dance) because we're already people-focused. An SEO strategy that's more concerned about the people using the search engine than the search engine itself is already prepared for BERT. So, we're feeling pretty good about what we've already been doing, for ourselves and for our clients, in the context of Google's latest and greatest. 

What Won't Work for BERT

It's likely that the sites negatively impacted by BERT were trying too hard to optimize and not trying hard enough to produce valuable, reader-focused content. Based on early reports and anecdotal evidence, sites that were ranking for irrelevant keywords saw a drop in traffic.

Again, there's no way to optimize for BERT, because the algorithm is built around natural language. Your attempts to game the machine will fail, but that doesn't mean you should give up hope.

Instead of trying to win at BERT, focus on your audience. Focus on natural language, on creating good content, on proven SEO tactics, and go about your day. If you've been forgetting about the people using the search engine and focusing on the search engine itself, BERT's introduction is a great reminder to shift gears and remember who (not WHAT) you're creating content for. 

If you're wondering how your current marketing strategy will work with BERT, click below to schedule your web marketing checkup!



Written by Emery Pearson

Emery is the content strategist at Tribute Media. She has an MA in rhetoric and composition from Boise State University, and she is currently an MFA candidate in creative writing at Antioch University. She lives in southern California with a bunch of creatures and many plants.