The marketing sector can be a complicated place as new marketing tools and techniques are launched, almost on a weekly basis. Powered by The Drum Network, this regular column invites The Drum Network's members to demystify the marketing trade and offer expert insight and opinion on what is happening in the marketing industry today that can help your business tomorrow.
Why BERT shouldn’t make you feel like a muppet
You might be asking right now: “Do we really need another take on Bert?” Of course, you may not be saying that because you’ve been in the Celebrity Jungle for a month and have no idea who Bert is, in which case this is definitely the article for you.
Or you might not be aware that Bert isn't a bolt out of the blue and is merely the next predictable step in Google’s evolution from a revolutionary 1990s way of looking for websites into the best Cyber Butler in the world or, alternatively, Skynet. In that case, this article is aimed at you too.
Bert stands for Bi-directional Encoder Representations from Transformers. I suspect that they chose that acronym because it fit into the name 'Bert' and they wanted it to be memorable, but we’ll know for sure if the next algorithm change is called ERNIE. Electronic Random Number Indicating Equipment? Sorry, Google, that’s taken.
Google began as an exercise in examining content to show the purpose of a website.
And it was a very crude beginning, with factors like keyword density – which brought us keyword stuffing – used to check that the right phrases were present. But content written to hit a keyword density (or, even worse, stuffed with keywords) reads terribly and serves nobody.
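To see just how crude that signal was, here’s a minimal sketch – my own illustration, not any search engine’s actual code – of a keyword-density check:

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Fraction of the words in `text` taken up by occurrences of `phrase`."""
    words = re.findall(r"[a-z']+", text.lower())
    target = phrase.lower().split()
    if not words or not target:
        return 0.0
    # Count every window of the text that exactly matches the phrase.
    hits = sum(
        1
        for i in range(len(words) - len(target) + 1)
        if words[i:i + len(target)] == target
    )
    return hits * len(target) / len(words)

# Classic keyword stuffing: 80% of the copy is the target phrase.
spammy = "cheap flights cheap flights book cheap flights now cheap flights"
print(keyword_density(spammy, "cheap flights"))  # 0.8
```

A page “optimised” to a number like that reads like gibberish to a human, which is exactly why Google had to move beyond it.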
To preserve their market share – yes Google had real competition back then – and keep the punters coming back, they needed to weed the spammy websites out of the SERPs and replace them with quality counterparts.
So began a long story of refining the logic of the algorithm, adding factors that were difficult to forge, and asking how exactly people searched for things.
The simple answer to that question is, very badly. We’re all rubbish at searching for what we want, because we know what we’re looking for, we just can’t put it into words.
So, faced with a search query like: “2019 Brazil traveler to USA need a visa”, Google would interpret it as a US citizen needing advice about traveling to Brazil. The retrieved results included mainly news stories.
Google, like Pinocchio, wanted to be a real boy – and think like a human – so they set about spending tons of cash on achieving this, including renting a US military supercomputer for one month every year to do the maths.
The first clear result of this research was 2013’s Hummingbird update, which introduced the principle of “semantic search” – being able to find what you wanted even if you used synonyms. It was effectively the first nail in the coffin of exact-match keywords for organic search.
RankBrain, confirmed in October 2015, introduced machine learning to the algorithm, with an interpretational model that took in factors like current location, personalisation, search history and advanced semantics to determine the searcher’s intent.
It’s this learning aspect which gives RankBrain its true power. Having been fed seed data, the algorithm calculates and teaches itself over time to match a variety of signals to a variety of results and to re-order the SERPs accordingly.
Bert is the latest iteration of this process: an algorithm that constantly updates itself based on feedback from current and past searches using bounce rate, repeat frequency and countless other metrics.
But how does it really work? Who knows? Not Google, who have admitted that AI’s ability to learn 24/7 makes any human grasp impossible.
They seem comfortable with this – just as they were comfortable with compartmentalising the original algorithm itself in the mid-noughties, with each of the more than 200 factors nursed and nurtured by a distinct team, so that no one person could get a grip on the whole.
Who are the losers?
Bert’s big casualties will be the spammers, who have been trying to game the algorithm since 1998.
That means websites with content that isn’t written with real people in mind. If you’re intent on writing only for Googlebot, then Google is trying to weed you out in favour of something a little more meaningful.
Anecdotal evidence shows that traffic to spam sites from Google is already in steep decline thanks to RankBrain and Hummingbird, and Bert will only accelerate that. However, there are plenty of spammers to go around, with an inexhaustible supply of new cunning stunts, so I don’t think we need to shed a tear over their passing any time soon.
And if Google is just trying to be a real human, or even superhuman – well, humans make mistakes and fall foul of fraudsters every now and again.
What it all means in practice is that, instead of examining a query word by word, Googlebot looks at the phrase in its entirety. So now little words like “to”, “for” and “no” – which can radically change the meaning of a query – will factor into the equation.
As a result, our previously highlighted query “2019 brazil traveler to usa need a visa” now brings up a much more useful result than a newspaper article: the US Embassy in Brasilia.
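A toy sketch of the old word-by-word approach – my own illustration, not Google’s code – shows why that query used to go wrong: once the “little” words are discarded, the direction of travel disappears entirely.

```python
# Words a naive keyword-matcher would throw away as noise.
STOP_WORDS = {"to", "for", "no", "a", "the", "i", "do"}

def keyword_only_view(query: str) -> list[str]:
    """Old-style parsing: keep the 'important' words, discard the rest."""
    return [w for w in query.lower().split() if w not in STOP_WORDS]

q1 = "2019 brazil traveler to usa need a visa"
q2 = "2019 usa traveler to brazil need a visa"
# Both queries collapse to the same bag of keywords, even though the
# direction of travel – and therefore the right answer – is reversed.
print(sorted(keyword_only_view(q1)) == sorted(keyword_only_view(q2)))  # True
```

Bert’s trick is precisely that it keeps those small words and reads them in context, so the two queries above no longer look identical.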
If, however, you’ve always used great, informative, long-form content written by humans, for humans, you have little to fear.
Seeing your point
As well as the ordinary search results, Bert is going to affect “featured snippets” – including Position Zero – the blocks of content within a page that help the user more readily identify pertinent information.
For example, for the query “parking on a hill with no curb”, pre-Bert Google placed too much weight on the word ‘curb’ and not enough on ‘no’. The results should now display featured snippets that teach you how to safely and effectively park your car on the side of Ben Nevis.
In practice, Bert has so far not set off a bomb in the SERPs, at least as observed by the third-party trackers – SEMrush, Ahrefs, Searchmetrics and the like. They tend to track the “fat head” of shorter search terms, while Bert affects longer-tail queries. Google itself estimates that Bert will affect only around one in ten searches.
In general, people use three types of queries on search engines:
- Informational: Seeking an answer to a question, such as “Should I buy an electric car?” or “What types of electric cars are there?”
- Navigational: Trying to reach a particular site, such as “Where is the Tesla website?”
- Transactional: An intent to purchase, such as “Buy a Tesla Model 3”.
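As a purely hypothetical illustration – not how any search engine actually labels intent – a rule-of-thumb classifier for these buckets might look like this, and its failure mode is exactly the problem Bert is built to solve:

```python
def query_type(query: str) -> str:
    """Naive keyword-based intent labelling (illustrative only)."""
    q = query.lower()
    if any(w in q for w in ("buy", "price", "order", "deal")):
        return "transactional"
    if any(w in q for w in ("website", "homepage", "login")):
        return "navigational"
    return "informational"

print(query_type("tesla website"))                  # navigational
print(query_type("best price on an electric car"))  # transactional
# The naive rule misfires on our informational example: "buy" triggers
# "transactional" even though the searcher is only mulling it over.
# Reading the whole phrase in context – Bert's job – is what fixes this.
print(query_type("Should I buy an electric car?"))  # transactional (wrong)
```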
Bert targets top-of-the-funnel keywords – the informational queries – hinting that content needs to be more specific and less generalised, reducing the one-size-fits-all approach. The age of generics may be coming to an end.
Google says Bert will allow users to “search in a way that feels more natural”. You don’t need to jump through hoops anymore; now Google will do that for you.
Seeing into the future
And why are they so keen on doing this? Well, for starters, if you, the searcher, get a great result from your next Google search, you’re far more likely to return – and hopefully click on a paid placement, especially if you’re in a transactional frame of mind.
Ultimately, Google wants to be the only thing you turn to for everything you need in your life, so it has to be intuitive: to know what you want almost before you do.
So, look out for Bert’s clairvoyant children. Coming to a search engine near you... or Skynet.
Max Brockbank, head of SEO at The Media Image.
Have your say
Do you have a strong opinion on a topical industry issue? To submit a comment piece, please send a short summary of your idea to email@example.com. Views of writers are not necessarily those of The Drum.