In the engine room that powers its dominant search service, Google recently flipped the switch on a powerful new tool.
According to the search giant, the new technology — a large-scale AI model known as MUM — could one day turn internet search into a far more sophisticated service, acting like a virtual research assistant as it sifts the web for solutions to complex questions.
But the company's critics warn that this comes with a clear risk: that it will accelerate a shift that has already seen Google serve up more direct answers to user queries, stepping in front of other websites to "internalise" search traffic and keep internet users locked in a Google universe.
MUM — short for multitask unified model — is the latest in a series of behind-the-scenes upgrades to the Google search engine which the company claims have brought step-changes in the quality of its results.
These include the introduction, a decade ago, of a "knowledge graph" that defined the relationship between different concepts, bringing a degree of semantic understanding to search. More recently, Google sought to apply the latest deep learning technology to improve search relevance with a tool called RankBrain.
"We think we're at the next such major milestone," said Pandu Nayak, the Google researcher in charge of MUM.
Google gave the first glimpse of the new technology at its annual developer conference in May, though it said little about how the system might be used. In a recent interview, Nayak said MUM could one day handle many of the "fuzzy information needs" that people have in their daily lives, but which they have not yet formulated into specific questions that they can research.
The examples he gives include parents wondering how to find a school that suits their child, or people first feeling the need to start a new fitness regime. "They're trying to figure out, what's a good fitness routine — one that's at my level?" he said.
Using today's search engines, "you have to actually convert that into a series of questions that you ask Google to get to the information you want," Nayak said. In future, he suggests, that cognitive load will be borne by the machine, which will take on what he calls "much more complex and maybe more realistic user needs".
He added that eventually, the applications of MUM are likely to stretch well beyond search. "We think of it as a sort of a platform," he said.
MUM is the latest example of an idea that has been sweeping the field of natural language AI. It uses a technique called a transformer, which enables a machine to look at words in context rather than as isolated objects to be matched through massive statistical analysis — a breakthrough that has brought a leap in machine "understanding".
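The core operation behind this contextual "understanding" can be illustrated in a few lines of code. The sketch below is not Google's implementation — the function, variable names, and toy vectors are invented for illustration — but it shows the attention mechanism that transformers use to re-weight each word's representation by its relevance to every other word in the sentence, rather than treating words in isolation.

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Illustrative transformer building block: each word vector is
    recombined with the others according to pairwise relevance, so
    its meaning is computed in context rather than in isolation."""
    scores = q @ k.T / np.sqrt(k.shape[-1])          # pairwise relevance scores
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over the context
    return weights @ v                               # context-mixed word vectors

# Three toy "word" vectors; the numbers are arbitrary, for illustration only.
words = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
out = scaled_dot_product_attention(words, words, words)
print(out.shape)  # each of the 3 words now carries information from the others
```

In a real transformer this operation is stacked in many layers with learned projections, which is what lets models such as BERT and MUM resolve a word like "bank" differently in "river bank" and "bank account".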
The technique was first developed at Google in 2017, but its most dramatic demonstration came with last year's GPT-3, a system developed by OpenAI that shocked many in the AI world with its ability to generate large blocks of coherent-sounding text.
Jordi Ribas, head of engineering and product at Microsoft's Bing search engine, said this has triggered a "race across all the high tech companies to come out with bigger models that represent language better".
When Microsoft unveiled its Turing language-generation model early last year, it claimed it was the largest system of its kind ever built. But GPT-3, unveiled months later, was ten times larger. Google has not released technical details for MUM, but said it is "1,000 times more powerful" than BERT, its first experimental model using transformers.
Even with this huge leap, however, Google faces a daunting challenge. Search companies have dreamt of answering complex questions for the past 15 years but found it a far harder problem than they expected, said Sridhar Ramaswamy, a former head of Google's advertising business and now CEO of search start-up Neeva.
"There is so much variation in everything complicated that we do," said Ramaswamy. "Trying to have software understand these variations, and guide us, has turned out to be incredibly hard in practice."
The first uses of MUM involve behind-the-scenes search tasks like ranking results, classifying information, or extracting answers from text.
The difficulty of measuring the quality of search results objectively makes it hard to judge the impact of efforts like this, and many experts question whether other new search technologies have lived up to the hype. Greg Sterling, a veteran search analyst, said many search users will have failed to notice much improvement, and that product searches in particular remain highly frustrating.
The search companies, for their part, say internal tests show users prefer results from the more advanced technologies. The ability to extract answers from text has already enabled Bing to offer direct answers to 20 per cent of the queries it gets, according to Ribas.
For most people, the impact of transformers is only likely to be felt if the technology results in more visible changes. For instance, Google says that MUM's ability to understand both text and images — with video and audio to be added later — could lead to new ways of searching across different types of media.
Handling the more "fuzzy" queries Nayak has in mind would effectively result in Google gleaning information from a number of different locations around the web to present a far more precise response to each highly specific query.
"This consolidates all the activities on Google properties," said Sara Watson, a senior analyst at market research firm Insider Intelligence. "Everything that shows on that first page [of search results] may be everything you want." Such a system could bring a backlash from publishers around the web, Watson added.
Google, already under scrutiny by regulators around the world, denies that it plans to use MUM to keep more web traffic to itself. "It's not going to become a question answering [system]," insisted Nayak. "The content that's out there on the web is rich enough that giving short answers doesn't make sense."
He also denied that distilling the results from multiple searches into a single result will reduce the amount of traffic Google sends to other websites.
"The better you do in understanding user intent and presenting information to users that they actually want, the more people come back to search," he said. The effect will be to "grow the pie" for everyone.
Search advertising, the lifeblood of Google's business, could face similar questions. Reducing the number of searches it takes to answer a user's question might reduce the advertising inventory Google can sell. But, said Watson, "if the query can be more complex and targeted, so can the ad. That makes [ads] much higher value, and potentially changes the pricing model."
Google's major search advances over the years:
Universal search - 2007: Google moves beyond showing "ten blue links" to return images and other results.
Featured snippets - 2009: Short text results start to appear in a box at the top of the results page, angering some publishers.
Voice search - 2011: Users are able to speak to Google for the first time.
Knowledge graph - 2012: Google builds a web of connections between different ideas, producing direct factual answers to queries.
RankBrain - 2015: Applies the latest AI advances from neural networks to make search results more relevant.
MUM - 2021: Brings a deeper level of understanding to many search tasks, promising useful responses to complex queries.
Written by: Richard Waters
© Financial Times