It characterises how this emergent sector, dubbed the "intention economy", will profile users' attention and communicative styles and connect them to the patterns of behaviour and choices they make.
“AI tools are already being developed to elicit, infer, collect, record, understand, forecast and ultimately manipulate and commodify human plans and purposes,” co-author Yaqub Chaudhary said.
The new AI will rely on so-called Large Language Models – or LLMs – to target a user’s cadence, politics, vocabulary, age, gender, online history, and even preferences for flattery and ingratiation, according to the research.
That would be linked with other emerging AI tech that bids to achieve a given aim, such as selling a cinema trip, or to steer conversations towards particular platforms, advertisers, businesses and even political organisations.
Co-author Jonnie Penn warned: “Unless regulated, the intention economy will treat your motivations as the new currency.”
“It will be a gold rush for those who target, steer and sell human intentions,” he added.
“We should start to consider the likely impact such a marketplace would have on human aspirations, including free and fair elections, a free press, and fair market competition, before we become victims of its unintended consequences.”
Penn noted that public awareness of the issue is "the key to ensuring we don't go down the wrong path".