Google’s latest update: BERT
Google recently released a new algorithm named BERT, which stands for Bidirectional Encoder Representations from Transformers, a Natural Language Processing (NLP) technique. BERT is a neural network model that helps Google recognise language the way that humans do, with all its nuances and unusual spelling.
Google states that ‘BERT has helped us grasp the subtle nuances of language that computers don’t quite understand the way humans do.’
It has been dubbed the biggest change since RankBrain was introduced and will affect roughly one in ten search queries, as well as featured snippets. At the moment, it has only been rolled out for English.
Tech traditionally isn’t great at understanding language the way humans do (that is part of what makes us human), but BERT brings a new way to understand how we write, talk and communicate in the real world.
In particular, question answering is set to become a more prominent consideration for BERT and for SEO going forward.
This technology is significant in that it builds representations of words from their surrounding context, rather than the context-free representations that earlier models relied on. As a result, search queries will be interpreted in a more nuanced and relevant way. BERT has achieved a human-level score of 91.2 per cent. Whilst that sounds incredible (no need to worry about those sci-fi movies just yet), NLP researcher Allyson Ettinger says that it’s ‘…far away from understanding language and context in the same way that we humans can understand it.’
Essentially, Google’s application of BERT is at a relatively early stage, and there’s vast scope for how it will be applied in the future. For now, BERT has limitations, notably with the contextual effects of negation.
BERT is great for users
It has been suggested that BERT will help searchers gain simple and clear answers out of technical and complex results or text. Google purports that people can ‘search in a way that feels natural for you.’
Which is terrific for users. One such example, explained by Moz:
‘… let’s say someone asked the question “can you make and receive calls in airplane mode?” The block of text in which Google’s natural language translation layer is trying to understand all this text. It’s a ton of words. It’s kind of very technical, hard to understand. With these layers, leveraging things like BERT, they were able to just answer “no” out of all of this very complex, long, confusing language.’
Google has had trouble understanding context and how it is applied. With BERT, these barriers will lessen. Previously, its models predominantly read search queries using left-to-right context only. With this new model, Google can apply multidirectional (or deeply bidirectional) reading to better understand context and nuance.
Here are some examples that Google has provided of BERT in action.
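One example Google highlighted in its announcement was the query ‘2019 brazil traveler to usa need a visa’, where the little word ‘to’ determines the direction of travel; before BERT, Google tended to surface results about US citizens travelling to Brazil. The difference between left-to-right and bidirectional reading can be sketched in a toy snippet (this is purely illustrative, not how BERT is actually implemented):

```python
# Toy sketch only -- NOT BERT itself, just an illustration of how much
# context each style of model "sees" when interpreting the word "to"
# in one of Google's example queries.
query = "2019 brazil traveler to usa need a visa".split()
target = query.index("to")

# A left-to-right model interprets "to" using only the words before it.
left_to_right_context = query[:target]

# A bidirectional model sees the words on both sides, including "usa",
# which reveals the direction of travel.
bidirectional_context = query[:target] + query[target + 1:]

print(left_to_right_context)   # ['2019', 'brazil', 'traveler']
print(bidirectional_context)   # ['2019', 'brazil', 'traveler', 'usa', 'need', 'a', 'visa']
```

The point of the sketch: with only the left-hand words, nothing tells the model whether the traveller is going to or from the USA; with both sides visible, the meaning becomes clear.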
Optimising for BERT
As always, Google doesn’t reveal how you can specifically “win” at SEO. They do, however, release details about their algorithms and updates, from which we can surmise how best to appease them. Thankfully, there are some great minds at work who understand SEO really well and can advise us on tested best practices.
You may be sick of reading this, but if you want to improve your SEO and write for BERT, you just have to write great content and keep it up. The quality and value of your content matter above all else: more than your word count, how many keywords you’ve included or what kind of words you use. Those are all important factors, but if the quality of your content is crap, they aren’t going to help a heap.
Neil Patel suggests that now is a great opportunity to create ‘… highly specific content around a topic.
It’s not necessarily about creating a really long page that talks about 50 different things that’s 10,000 words long. It’s more about answering a searcher’s question as quick as possible and providing as much value compared to the competition.’
‘If you want to do well when it comes to ranking for informational keywords, go very specific and answer the question better than your competitors. From videos and images to audio, do whatever needs to be done to create a better experience.’
How to write for BERT
You could try writing quite naturally. By this I mean adopt a more casual tone of voice, use colloquialisms and a cadence that you would use if you were speaking. I tend to be more refined, formal and structurally pedantic in my professional writing and blogging, which could be my undoing with this algorithm and my SEO results. However, time will tell and I’d be disinclined to sacrifice great writing to see better ranking. I inherently believe that good writing will always win out.
In a nutshell:
- write specifically
- provide good quality information
- don’t waffle on
- write like you are talking to the people you want to reach
- think about what questions your target demographic are asking
- answer questions and solve problems with your content.
‘Optimizing now means that you can focus more on good, clear writing, instead of compromising between creating content for your audience and the linear phrasing construction for machines.’
Neil Patel also suggests that conversion rates from organic traffic will improve as results match queries more specifically, meaning people searching for exactly what you provide will find you more easily. That makes it easier to sell to the people who actually want what you have to offer!
This is a very simplified overview of BERT, and if you want to know more, it’s well worth researching the technicalities behind the technology.