BERT is Google’s acronym for its new natural language processing (NLP) model. It stands for Bidirectional Encoder Representations from Transformers and, in short, it helps Google better understand search terms and the content on web pages, in relation to their keywords. Ultimately it means Google can index your website content more accurately and serve up more relevant search results.
So it’s something that we should all be aware of, whether we’re building websites, writing content or using their search engine.
As has always been the rule, your website content should be high quality – natural, well written, in short sentences, using relevant keywords and phrases, targeted to its audience and so on.
How does it work?
Some time ago we predicted that AI would play an increasingly important role in the world wide web and, as Google’s latest machine-learning algorithm, BERT does just that. How? It takes note of how each word in a sentence relates to the others, thereby better understanding its meaning, rather than simply indexing a list of keywords.
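To give a flavour of what “each word relates to the others” means, here is a toy sketch of the dot-product attention idea at the heart of BERT’s Transformer encoder. This is an illustration only, not Google’s actual implementation: the word vectors are made up, and a real model learns thousands of dimensions rather than two.

```python
import math

def softmax(scores):
    """Turn raw scores into weights that sum to 1."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical 2-D "embeddings" for the query "find designer in Chichester"
embeddings = {
    "find": [0.9, 0.1],
    "designer": [0.8, 0.3],
    "in": [0.1, 0.9],
    "Chichester": [0.2, 0.8],
}

def attention_weights(word, sentence):
    """How strongly `word` attends to every word in `sentence`,
    using simple dot-product similarity scores."""
    q = embeddings[word]
    scores = [sum(a * b for a, b in zip(q, embeddings[w])) for w in sentence]
    return dict(zip(sentence, softmax(scores)))

sentence = ["find", "designer", "in", "Chichester"]
weights = attention_weights("Chichester", sentence)
# With these made-up vectors, "Chichester" attends most strongly to "in" -
# the preposition that signals it is a location, not just another keyword.
```

The point of the sketch: the weight each word gives to every other word is computed, so small function words like “in” can shape what “Chichester” means in context.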
This could be the final nail in the coffin for websites that use strings of keywords (“keyword stuffing”), since Google realizes these are generally written to fool the search engine’s indexing bot rather than being natural, human-readable English.
It also means that Google no longer ignores prepositions like “to,” “with” and “from,” as it needs them to make full sense of the words’ context and the true meaning of the sentence or search term. So in the past we’ve searched for things online using phrasing we assume search engines can understand, but now our natural human queries will produce better results.
An example: instead of the shortened search term “web designer Chichester” you can use “find a website designer in Chichester” and high-quality results will follow (well, for websites that have well-written, natural content). BERT also looks at the next sentence in your content to help it understand the first. As well as filling in missing words, as Google has always done, it will also better predict the words that might come next.
And, going into more detail, it splits words into their component parts, for example identifying and using the root of a word (“designing” becomes “design” + “ing”). One use of this is identifying the context when a word has multiple meanings as a noun and a verb, like “run” (as a verb: to move quickly, to manage, or to offer a service; as a noun: a period of time, a journey or a jog).
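The splitting described above can be sketched as a greedy longest-match-first lookup, which is the general idea behind the WordPiece-style subword splitting BERT uses. The tiny vocabulary below is invented for illustration; the real model works with a vocabulary of roughly 30,000 pieces.

```python
# Made-up mini vocabulary: whole-word roots plus "##"-prefixed suffixes
VOCAB = {"design", "run", "##ing", "##er", "##s"}

def wordpiece(word):
    """Split `word` into known pieces, longest match first.
    Returns ["[UNK]"] if the word can't be built from the vocabulary."""
    pieces, start = [], 0
    while start < len(word):
        end = len(word)
        while end > start:
            # Pieces after the first are marked with "##" (word continuation)
            piece = word[start:end] if start == 0 else "##" + word[start:end]
            if piece in VOCAB:
                pieces.append(piece)
                break
            end -= 1
        if end == start:  # no piece matched: the whole word is unknown
            return ["[UNK]"]
        start = end
    return pieces

print(wordpiece("designing"))  # ['design', '##ing']
print(wordpiece("designer"))   # ['design', '##er']
```

Splitting like this means the engine can relate “designing”, “designer” and “design” through their shared root instead of treating them as three unrelated keywords.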
Keep writing high quality content
This move is perhaps driven by the fine-tuning of voice-activated services like Siri and Cortana. It seems like good news for content writers and internet users alike, as it makes searching easier and search results more relevant. Expect this to be a gradual and ongoing improvement as the Google bots get cleverer and more human-like by the day.
Finally, to reiterate, keep your website content as high quality and natural as you can. Second-guessing search engines or trying to beat the Google system won’t do you any favours in the long run. You don’t need to rewrite content as a result of the BERT update, unless it was badly written in the first place!
If you would like a website performance and usability audit or need an SEO review, talk to us, we can help.