Google Search, Artificial Intelligence & Natural Language
When you go online and search for things, natural language processing is a core function working behind the screen that allows you to receive relevant results.
Natural language processing uses artificial intelligence to extract meaning from complex searches. The world of search marketing has been built around AI, and marketers need to know how to write copy that gives AI systems the best options to choose from. Doing so will enhance your marketing efforts.
In late 2018 Google released a new neural network technique for natural language processing called BERT, which stands for Bidirectional Encoder Representations from Transformers. BERT is used to understand the nuances and context of the words in the questions we enter into the magic search box. It is now used on every search and affects whether customers engage with your brand online.
Natural Language Processing & Search Engines
Natural language processing allows computer systems to learn, analyze, and understand human language. The technology has come a long way, powering chatbots and AI callers. However, it all started long ago with a natural language technique that may still matter today.
In 2015 Google introduced RankBrain, which used machine learning to determine the most relevant results.
Another patent, released in 2015, covers Ranking Search Results Based on Entity Metrics. Ranking by entities treats your Google Business Profile as your entity.
In 2018 Google launched Neural Matching and added it to Search products in 2019. Neural Matching helps Google understand how queries relate to pages and to each page's context. It has been integrated into Search and is part of the ranking algorithm.
In 2019 Google announced BERT – Bidirectional Transformers for Language Understanding – for Search. BERT is a neural network for natural language processing that provides a better understanding of how language is used in search, including how short prepositions can change the meaning of a query. It works well, and we can see that content optimization is now being replaced with good writing and useful content. BERT has been integrated into Search and is part of the ranking algorithm.
In 2021 Google launched MUM – Multitask Unified Model. MUM helps Google not just understand language but also generate it, so it can be used to understand variations in new terms and across languages.
LSI was implemented in the Google algorithm back in 2004 to help the search engine deliver more relevant results to users. The patents are listed as a user-context-based search engine.
The patent clearly indicates that they are creating a hierarchy of vocabulary based on a variety of topical terms, their frequency, and their relationship to other related words in order to distinguish context.
What exactly does Latent Semantic Indexing do?
Latent Semantic Indexing (LSI) is a natural language processing (NLP) technique for computer searching developed in the 1980s. In fact, the patent for LSI was filed on September 15, 1988.
Understanding what a word means when someone uses it is very important for search engines to provide the most relevant information when you ask. Sometimes our questions are very straightforward; other times they can be extremely confusing. Many nouns, verbs, and adjectives have more than one meaning and can even be pronounced differently.
- Homonymy: words that look and sound the same but have different, unrelated meanings.
- Polysemy: words that can mean different things depending on the context in which they are used and the tone in which they are spoken.
For example, you can “fall” down, or “fall” could be a season; in natural language processing that sense of “fall” would generally be classified as “autumn.” Another example is “bat”: did you mean the flying animal or the implement used in baseball? The word “fine” has four meanings: pleasing to the eye, big, well executed, and terrible (used ironically). LSI makes sure you get the correct answers to your questions.
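To make the idea concrete, here is a minimal, illustrative sketch (not how any search engine actually implements it) of disambiguating a word like “fall” by looking at the words that surround it. The sense labels and context-word profiles are hypothetical, hand-picked examples.

```python
# Hypothetical sense "profiles": context words we might expect to
# co-occur with each meaning of the ambiguous word "fall".
SENSE_PROFILES = {
    "fall (autumn)": {"season", "leaves", "october", "harvest", "weather"},
    "fall (tumble)": {"trip", "down", "injury", "stairs", "slip"},
}

def guess_sense(query: str) -> str:
    """Pick the sense whose profile shares the most words with the query."""
    words = set(query.lower().split())
    scores = {sense: len(words & profile)
              for sense, profile in SENSE_PROFILES.items()}
    return max(scores, key=scores.get)

print(guess_sense("leaves turning red in fall weather"))   # fall (autumn)
print(guess_sense("how to avoid a fall down the stairs"))  # fall (tumble)
```

Real systems learn these context associations statistically from huge document collections rather than from hand-written word lists, but the underlying intuition is the same: the surrounding words reveal the intended meaning.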
Here are some more common words with multiple meanings.
Examples of LSI
If someone searches for “apple computer,” a primitive search might deliver results about types of apples and types of computers, not the company. Even in 2022, computers are not smart; they are in fact dumb. They still don't come close to the understanding of word relationships that most grade-school children have.
Latent Semantic Indexing is the basic process that Google and other search engines use to study and compare relationships between terms and concepts. Using complex mathematical formulas, it helps search engines figure out the primary topic of your content. LSI can statistically predict which meaning of a word is intended by analyzing the words that co-occur with it in a document.
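The mathematics behind LSI can be sketched in a few lines. The classic technique builds a term-document matrix and factors it with singular value decomposition (SVD), so terms that appear in similar documents end up close together in a reduced “concept” space, even if they never appear side by side. The toy documents and counts below are invented for illustration; this is the textbook LSI recipe, not Google's system.

```python
import numpy as np

terms = ["apple", "computer", "software", "fruit", "pie", "orchard"]
# Rows = terms, columns = 4 toy documents (first two tech, last two cooking).
counts = np.array([
    [2, 1, 1, 1],  # "apple" appears in both kinds of documents
    [3, 2, 0, 0],  # "computer"
    [1, 2, 0, 0],  # "software"
    [0, 0, 2, 1],  # "fruit"
    [0, 0, 1, 2],  # "pie"
    [0, 0, 1, 1],  # "orchard"
], dtype=float)

# Factor the matrix and keep the top 2 singular dimensions: the latent "concepts".
U, s, Vt = np.linalg.svd(counts, full_matrices=False)
term_vecs = U[:, :2] * s[:2]  # each term's coordinates in concept space

def similarity(a: str, b: str) -> float:
    """Cosine similarity between two terms in the reduced concept space."""
    va, vb = term_vecs[terms.index(a)], term_vecs[terms.index(b)]
    return float(va @ vb / (np.linalg.norm(va) * np.linalg.norm(vb)))

# In this toy corpus "apple" leans toward the tech concept, so it sits
# much closer to "computer" than "computer" does to "pie".
print(similarity("apple", "computer"))
print(similarity("computer", "pie"))
```

This is how co-occurrence statistics let a program infer that “apple” in a tech-heavy context relates to the company, not the fruit.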
You would be hard pressed to find a person who didn't know that big and large mean the same thing, that rich and wealthy can mean the same thing, or that a compact holds makeup while compact also means small. Many words have more than one meaning, some with subtle differences in pronunciation or different meanings in different contexts. Without LSI, a computer or software application would not know.
Let’s say your website is an auto repair site; then LSI keywords may be [automobile], [engine], [tires], [brakes], and [transmission]. You can focus on [auto repair] all day long, but without the support of the other LSI keywords, or content with LSI keywords in the meta data, you won’t be relevant enough to break through the SEO barriers.
- Focus on natural writing over keyword placement.
- New technologies read long-tail keywords – make sure your content answers questions.
- Keep page focus tight and concise.
- Write for your audience – make sure your content matches how they ask questions.
- Subheads (H2s) should answer your title and, in turn, the keyword you’ve targeted.
- Keep it to one idea per paragraph.
- Double-check your grammar and spelling.
- Fact-check before you publish.
Homonymy and polysemy occur in all natural languages, but artificial languages avoid both on the principle that, ideally, a word should have a single unambiguous meaning.