Google has been the best thing that ever happened to the internet.
You could say that Google is the world’s busiest librarian.
Think of the internet as a gigantic library filled with people writing books. Without Google organizing all that information, books would pile up at random on shelves and desks, and finding anything would be impossible.
Of all the companies that tried to order the internet, Google was by far the most successful. This is because they focused primarily on their users’ need: finding relevant information.
From speed to relevance to simplicity of design, Google has mastered searching for information across words, maps, images and videos.
Knowledge workers who have to search for information locked away in their companies will often lament how much better Google is than their enterprise search tools.
Google really has been the best thing that ever happened to the internet.
And in spite of that, finding answers on the internet sucks.
Nobody wants to search
It’s not Google’s fault. There is a very valid reason why they are one of the most valuable companies on the planet.
The challenge is that no one wakes up wanting to search for information. Nobody wants to search. Searching is a necessary evil to get some other job done.
Google is doing an incredible job organizing the world’s information.
When people search for information, they are really looking for help to get some other job done. Ideally they would already know the answer and get on with their lives. Searching is a thing they have to do.
Within an organization, the challenge is even greater. Businesses focussed on delivering value to their customers have very limited resources to make their proprietary knowledge available to their employees. Since their information is not publicly available on the internet, employees don’t benefit from Google’s exceptional organizational skills. They have to make do with whatever search function their employer offers them.
In the gigantic library analogy, it’s as if there were smaller rooms on the side of the main library with restricted access. But the librarians curating the books in those private rooms are not as well-resourced as the staff in the main library, so getting information gets harder.
What is making all of this worse is that the amount of information being generated is constantly growing.
From drowning in information to thriving at work
The advent of Large Language Models (LLMs) offers a huge leap forward. Rather than spending time searching for and consuming the knowledge that we need to get our job done, we can finally get the answers to leapfrog straight into action.
Companies like Google and OpenAI are finally consuming all the knowledge on the internet and delivering answers – almost like skipping the librarian and going straight to the expert for advice tailored to a specific question.
There are many dimensions along which LLMs are likely to help. Below are some of the areas where we see LLMs making a difference, particularly for knowledge workers who need access to information Google can’t make accessible.
Understanding User Intent
In a traditional search, employees have to tailor their question to the search engine they are using. They have to rephrase and reinterpret the search query to optimize the results. Then they have to review the various results to find the information they need.
The challenge here is that a search query is very hard to interpret. When given only a few keywords, a search engine has to do a lot of work to figure out what job the user is actually trying to get done. Nobody wants to search, but nobody tells the search engine what they want to do either; they just ask for the resources they need.
According to a McKinsey report, knowledge workers spend as much as 20% of their week on searching for information. That’s a lot of wasted time.
The power of LLMs is that they let knowledge workers get the information they need much more quickly. That’s because an LLM is far better at comprehending user intent than a search engine. LLMs get the job done much faster by focussing on answers: finding sources and summarizing them based on what the user cares about.
No more TMI
Too much information – in a world where so much information is easily accessible, most knowledge workers find themselves navigating too many resources. To make things worse, resources are so hard to find that content makers often end up creating new resources about information that is already covered elsewhere (in a resource nobody can find).
By some estimates knowledge makers spend 10 hours per week creating resources about information that already exists.
The enormous capacity of LLMs to consume and summarize information means that knowledge workers no longer have to sift through tens of thousands of words to find information that is valuable to their customers. They can jump straight to the key information that is pertinent to them.
Getting personal
A search engine only ever provides a source of information. It is limited in its ability to judge whether that particular resource is relevant to a particular user’s needs. Suppose a salesperson wants to pitch their company’s value proposition to a company in the healthcare industry. A deck about financial services will not have the same relevance as one covering the specific needs of the healthcare sector.
LLMs are better (though not great on their own) at personalizing content to a user’s particular needs. They can pull from vast amounts of information and contextualize an answer to the specific needs of a user.
What’s your flavour?
Finally, a search engine can only deliver a resource as it is. If a user has a preference for video or images over text, a search engine is limited to supplying the assets as they exist.
LLMs are able to move between modes with increasing precision. They can move from text to image, from image to video. They can effectively transform content in one format into new formats based on the preferences of the end user.
This is game-changing for content makers, who only need to supply one medium and can trust AI to repurpose the knowledge in whatever medium makes most sense for a particular user.
So is search dead?
Not quite yet. Google won the battle to be the world’s best search engine by focussing on high relevance of results, delivering these at speed within an experience built around the user’s needs.
LLMs still need to live up to these same expectations: a knowledge worker lives and dies by the precision of the information they work with. Customer-facing knowledge workers need accurate knowledge in a timely fashion and as always the knowledge they receive has to be specific to their context.
While the task of searching through resources may eventually be fully delegated to LLMs, the accuracy of their results is not yet at a standard where they can replace human judgement. And while LLMs may get better and faster, there are some datasets (like proprietary company information) that they won’t have access to and so won’t be able to deliver good results for.
This is where a clear understanding of user intent, a clear understanding of the knowledge contained within proprietary assets, and the ability to match the former with the latter will be at the heart of delivering high value to users.