What is BERT?
BERT is the AI language model that Google uses to understand the intent of a search query. It is an acronym for Bidirectional Encoder Representations from Transformers.
Google released BERT as an open-source project in 2018. Before BERT, Google determined search intent largely by matching the individual keywords in a query. With BERT, Google now applies Natural Language Processing (NLP) technology that goes beyond keyword matching.
Instead of weighing keywords in isolation, BERT reads the entire sentence to understand the context in which each word is used. This, in turn, allows Google to grasp the search intent and deliver more relevant results.
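To make "context" a little more concrete, here is a minimal sketch using the open-source Hugging Face transformers library and the public bert-base-uncased model. This is an illustration of the general idea, not Google's production search setup: the example sentences, the helper function, and the choice of model are all assumptions made for the demo. It shows that BERT gives the same word (here, "bank") a different vector depending on the sentence around it, which is what "considering the entire sentence" means in practice.

```python
# A minimal sketch, assuming the Hugging Face "transformers" library and the
# public bert-base-uncased checkpoint. Not Google's production system.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")


def embedding_for(sentence: str, word: str) -> torch.Tensor:
    """Return BERT's contextual vector for `word` inside `sentence`."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    # Locate the token for `word` and return its hidden state from the last layer.
    tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
    idx = tokens.index(word)
    return outputs.last_hidden_state[0, idx]


# The word "bank" gets different vectors because the surrounding words differ.
river = embedding_for("he sat on the bank of the river", "bank")
money = embedding_for("she deposited cash at the bank", "bank")
similarity = torch.cosine_similarity(river, money, dim=0)
print(f"cosine similarity between the two 'bank' vectors: {similarity:.2f}")
```

A keyword-only system would treat "bank" identically in both queries; a bidirectional model like BERT separates the river bank from the financial one because it looks at the words on both sides before deciding what "bank" means.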