About a year ago, ChatGPT was launched. The power of large language models suddenly came into the hands of anyone who took the trouble to find out how this new technology works and what you can do with it. After that, things developed fast: in February 2023, Microsoft announced Bing Chat, a combination of the Bing search engine and ChatGPT, and since May 2023 this service has also been available in Europe. Microsoft seemed to take a big lead on Google, but Google responded by announcing its own chatbot Bard (rebuilt and rebranded as Gemini in February 2024), and in May 2023 it released its own combination of chat & search: Search Generative Experience (SGE, not yet available in the Netherlands).
Big tech is on top of this. Google, Microsoft, Meta and others are in fierce competition, with cool new applications and improvements being released every month. Search will change permanently as a result. But how exactly, and how should your company respond? That’s what I describe in this article.
How will search change due to AI?
What happened in 2023 is an acceleration of a long-running development: the increasing influence of artificial intelligence (AI) in search. Google has been working on this for years: think, for example, of the introduction of RankBrain in 2015 and BERT in 2019. These systems represented a breakthrough in the way Google could analyze the intent behind a search query. And with that, Google was able to provide better answers.
Although this has been going on for a long time, the interaction with search engines remained roughly the same: you put in a query, the search engine might give a direct answer (via the so-called featured snippets or other forms of rich results), but usually you’ll see a list of results to scroll through. If you are shown at the top of the results, you can expect to attract quite a few visitors.
And that is going to change. We will move from ‘classic search‘ to ‘conversational search‘, where the search engine or chatbot always comes up with a direct answer and where we can immediately follow up on the previous question. You get a conversation, and that is different from typing in ‘keywords‘. A real-life example (answers from the chatbot shortened for the sake of clarity):
User: Are there government subsidies for starting entrepreneurs?
Chatbot: Yes, there are various schemes and subsidies available for starting entrepreneurs in the Netherlands. Here are some of the main ones: [the bot provides an overview of 8 schemes]
User: Do you have some more info about number 4?
Chatbot: The starters deduction is an increase in the deductions for self-employed workers. […]
User: OK, thanks, and does it matter where you live?
Chatbot: The starters deduction is a national scheme and does not depend on where you live […]
Two things are important when it comes to these changes in search:
1. Less data for your search strategy
Due to the transition to conversational search, we’ll see much more variation in search queries. Queries will become much more long-tail and will often no longer be traceable to a specific subject or underlying intent. In addition, it isn’t clear yet which data about user engagement with chatbots will become available (if any).
A keyword should therefore not necessarily be the starting point of your content strategy. It is more important to understand:
- Who is your customer
- What are your customer’s needs
- Where is that customer located
- Which platforms (including your own website) are most suitable for meeting those needs as well as possible; that can also be Pinterest or YouTube, etc.

One obvious way to do this research without looking at search queries is to analyze your own customer data. Check the questions that come in at customer service.
In short: do the right things for your customers, look at what you want to communicate on your owned channels, and don’t pay as much attention to pure search KPIs such as traffic and positions.
2. Less traffic from organic search
If Google’s SGE panel is implemented as currently tested, it will become a kind of ‘featured snippets on steroids‘: a direct answer is almost always given, so there is little need to click through to another website, even though links to sources are included.
Also, it is easy to imagine that the chatbot Gemini (the former Bard) will become a personal assistant, answering many questions for which we currently still go to the regular Google. Gemini will be fully integrated into the Google ecosystem.
Both developments will have consequences for the number of clicks and the revenue from organic search. It is vital to take this into account and manage the expectations of your most important stakeholders.
Rise of AI in search: ‘brand is what AI thinks of you’
And yet: actual demand will not decrease just because of chatbots and language models. Consumers will still look for products and services, and will still need information. As an organization, you must continue to work on developing a strong brand:
- When users know you, and know what your brand stands for, this will increase the chance they will come directly to your website or app, without relying on a chatbot.
- A chatbot can only use your products, services or information in its answer when the underlying model understands your brand and knows which solutions you have to offer.
- It is also important to demonstrate hands-on experience and show that you are a real expert by providing high-quality, up-to-date information.
- Google’s advice is: work on your EEAT (Experience, Expertise, Authoritativeness and Trustworthiness) and create ‘helpful content‘.
A strong brand is present in the knowledge graph
It is often said that language models like ChatGPT and Gemini (the former Bard) are unreliable. That is certainly true at the moment: there are countless examples of ‘hallucinating’ chatbots that completely make up answers, or that make small, sneaky mistakes that are hardly noticeable and pollute the information ecosystem in a perhaps even more dangerous way.
But language models are getting better with every new version. Google is currently working on two initiatives to prevent hallucinations:
- Combining LLMs with ‘fresh’ information
1. A new research report from Google, ‘Refreshing Large Language Models with Search Engine Augmentation‘ (PDF), shows how the accuracy of LLM chatbots such as Bard can be improved by adding recent information, retrieved via a Google search, to the prompt (a sketch of this pattern follows after this list). This analysis of the paper is worth reading as well.
2. By integrating Gemini with other Google services, Gemini can retrieve ‘live’ information from, for example, Google Search, Google Flights, YouTube, etc.
- Combining LLMs with the knowledge graph
The scientific paper ‘LaMDA: Language Models for Dialog Applications‘ (PDF) describes how the Knowledge Graph is used to ‘check’ the answers of the language model. Good to know: LaMDA is the language model on which Bard was based, and Google’s Knowledge Graph is a mega-database of entities and the relations between them.
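To make the idea of search-engine augmentation a bit more concrete, here is a minimal sketch of the general pattern: retrieve fresh results for a question and prepend them to the prompt before asking a language model. This is not Google’s implementation; the fetch_search_results helper is hypothetical, and the OpenAI client and model name are just example choices for the chat step.

```python
# Minimal sketch of search-engine augmentation: prepend fresh search snippets
# to the prompt before asking the language model to answer.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment


def fetch_search_results(query: str) -> list[str]:
    """Hypothetical helper: replace with a call to a real search API."""
    # Static placeholder snippets so the sketch runs end to end.
    return [f"(fresh search snippet about: {query})"]


def answer_with_fresh_context(question: str) -> str:
    snippets = fetch_search_results(question)
    context = "\n".join(f"- {s}" for s in snippets)
    prompt = (
        "Answer the question using the search results below. "
        "If they do not contain the answer, say so.\n\n"
        f"Search results:\n{context}\n\nQuestion: {question}"
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # example model; any chat-capable model works here
        messages=[{"role": "user", "content": prompt}],
    )
    return response.choices[0].message.content
```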
That second initiative, the knowledge graph, is the link between a strong brand and being visible in conversational search: you must be an ‘entity’ and be present in the knowledge graph to have a chance of appearing in the chatbot’s answer. In addition, it must be clear which other entities your brand relates to. In other words: what is your brand about? Building this kind of authority (EEAT) requires a broader long-term vision and a keen eye for the needs of your customers.
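If you want a rough, do-it-yourself indication of whether your brand is already recognized as an entity, you can query Google’s public Knowledge Graph Search API. The sketch below assumes you have an API key from the Google Cloud console; the brand name in the usage example is hypothetical.

```python
# Quick check of whether a brand appears as an entity in Google's Knowledge Graph,
# using the public Knowledge Graph Search API.
import requests


def knowledge_graph_entities(brand: str, api_key: str, limit: int = 5) -> list[dict]:
    response = requests.get(
        "https://kgsearch.googleapis.com/v1/entities:search",
        params={"query": brand, "key": api_key, "limit": limit},
        timeout=10,
    )
    response.raise_for_status()
    results = []
    for element in response.json().get("itemListElement", []):
        entity = element.get("result", {})
        results.append({
            "name": entity.get("name"),
            "types": entity.get("@type", []),
            "description": entity.get("description"),
            "score": element.get("resultScore"),
        })
    return results


# Example usage (hypothetical brand name and key):
# for entity in knowledge_graph_entities("Example Brand", api_key="YOUR_API_KEY"):
#     print(entity)
```

If your brand does not show up at all, that is a signal to work on structured data, consistent naming and authoritative mentions across the web.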
How to start – with the help of AI
Let’s say you are not an established brand yet, one with a years-long track record and thousands of customers. How can you start to ‘work on your brand’ and build up EEAT for search?
First, and this seems to me a no-brainer, don’t publish texts generated exclusively by ChatGPT. Always make sure that content is checked manually.
Second, use AI tools in a smart way to facilitate more efficient and scalable publishing. Google is not fundamentally against using LLMs in content creation. But they will become increasingly strict in assessing quality. This is necessary given the expected flood of AI-generated content. Hence the focus on ‘helpful content’ and real experiences with services and products.
Here’s an example of using AI in content production, thanks to SEO & Organic Marketing specialist Chantal Smink who describes the method in her podcasts:
- Create an overview of the areas of expertise available in your organization
- Ask these experts what is happening in their industry / field
- Record this conversation with the voice recorder on your smartphone
- Have this transcribed by, for example, Whisper (created by OpenAI)
- Summarize the transcription with the help of ChatGPT or Claude (a sketch of these two steps follows after this list)
- Put this on your website including the expert’s profile
- Use the same content for social media posts: convert longer texts into a short, catchy text specifically for social media using ChatGPT.
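As a rough illustration of the transcription and summarization steps above, here is a minimal sketch using the OpenAI Python SDK. The file name and model choices are assumptions, and, as noted earlier, the output should still be checked and edited manually before publishing.

```python
# Sketch of the transcription and summarization steps: transcribe the recorded
# expert interview with Whisper, then summarize the transcript with a chat model.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Transcribe the voice recording (example file name).
with open("expert_interview.m4a", "rb") as audio_file:
    transcript = client.audio.transcriptions.create(
        model="whisper-1",
        file=audio_file,
    )

# Summarize the transcript into a draft article.
summary = client.chat.completions.create(
    model="gpt-4o-mini",  # example model choice
    messages=[
        {
            "role": "user",
            "content": (
                "Summarize this interview with one of our experts into a clear, "
                "well-structured draft article. Keep the expert's own examples.\n\n"
                + transcript.text
            ),
        }
    ],
)

print(summary.choices[0].message.content)  # review and edit manually before publishing
```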
The main thing here is that you try new things and experiment, even if it takes a long time the first time because you still have to figure everything out!
And this also applies at home: we do not really know yet how search behavior will change because of chatbots. Some questions will work better in classic search mode, while other questions will be answered much better by the new language models. As a marketer, it’s vital to try out these tools and understand how they work. So grab your smartphone and install ChatGPT, Gemini, Bing and Claude today, if you haven’t done so already.
———————-
This article was first published in Dutch on Emerce.nl, November 1st, 2023: AI in Search: voorbij de hype en in de praktijk.
Update February 2024: references to Google’s chatbot Bard have been changed to ‘Gemini‘.