ChatGPT’s beauty tips range from bangs to Botox. Photo / 123RF
Friends want to protect your feelings. AI isn’t pulling any punches.
Ania Rucinski was feeling down on herself.
She’s fine-looking, she says, but friends are quick to imply that she doesn’t measure up to her boyfriend – a “godlike” hottie. Those same people would never tell her what she could do to look more attractive, she adds. So Rucinski, 32, turned to an unconventional source for the cold, hard truth: ChatGPT.
She typed in the bot’s prompt field, telling it she’s tired of feeling like the less desirable one and asking what she could do to look better. It said her face would benefit from curtain bangs.
“People filter things through their biases and bring their own subjectivity into these sorts of loaded questions,” said Rucinski, who lives in Sydney. “ChatGPT brings a level of objectivity you can’t get in real life.”
Since its launch in late 2022, OpenAI’s ChatGPT has been used by hundreds of millions of people around the world to draft emails, do research and brainstorm ideas. But in a novel use case, people are uploading their own photos, asking it for unsparing assessments of their looks and sharing the results on social media. Many also ask the bot to formulate a plan for them to “glow up,” or improve their appearance. Users say the bot, in turn, has recommended specific products from hair dye to Botox. Some people say they have spent thousands of dollars following the artificial intelligence’s suggestions.
The trend highlights people’s willingness to rely on chatbots not just for information and facts, but for opinions on highly subjective topics such as beauty. Some users view AI’s responses as more impartial, but experts say these tools come with hidden biases that reflect their training data or their maker’s financial incentives. When a chatbot talks, it’s pulling from vast troves of internet content ranging from peer-reviewed research to misogynistic web forums. Tech and beauty critics say it’s risky to turn to AI tools for feedback on our looks.
As AI companies begin to offer shopping and product recommendations, chatbots might also push consumers to spend more, according to analysts.
AI “just echoes what it’s seen online, and much of that has been designed to make people feel bad about themselves and buy more products,” Forrester commerce analyst Emily Pfeiffer said.
Still, many consumers say they value critiques from the chatbot, which offers a different perspective than their friends and family.
Kayla Drew, 32, said she turns to ChatGPT for advice on “everything,” from how to decorate her home to what to buy at the grocery store. Recently, she asked the bot for honest feedback on how she could look more attractive. It came back with suggestions for her skin, hair, brows, lashes, makeup and clothes – all of which Drew followed, she said. So far, she’s spent around US$200 ($330).
People are turning to ChatGPT for unfiltered advice on how to improve their looks. Photo / 123RF
If the real people in her life gave her point-blank feedback on her appearance, it would probably hurt her feelings, Drew said. But coming from ChatGPT, which Drew refers to with “she” pronouns, the whole thing feels more palatable.
“Today I asked about whitening my teeth, and she was like, ‘Make sure your dental hygiene is good,’ and I was like, ‘Damn, girl,’” Drew said. “Nobody else would come up to me and say that. It was pretty cool because I guess I needed to hear it.”
Users see ChatGPT as a more objective measure of beauty because, unlike friends and family, it doesn’t factor in qualities like kindness or humour, said Jessica DeFino, a beauty critic who writes the Review of Beauty newsletter. Internet-era beauty standards turn the self into an object, she said, and what better way to evaluate an object than by asking another (AI-powered) object?
“If we’re trying to optimise ourselves as beautiful objects, we can’t consider the input of a human who is, say, in love with us,” DeFino said.
When bots give advice, who’s really talking?
OpenAI said this month that it’s updating ChatGPT to show products – including images, details and links – when users appear to be shopping. Some tech and beauty experts caution that the bot’s suggestions serve its maker’s goals, not the user’s.
AI companies need new streams of revenue – some are spending billions to build and host AI tools. Having chatbots surface sponsored products and ads is one potential path forward: Already, Perplexity AI has incorporated a shopping feature inside its chatbot’s interface, and beauty is the third-most-searched category, a spokesman said.
As shopping features roll out, consumers might start seeing product recommendations without knowing why the bot is choosing those products, said Forrester’s Pfeiffer. The bot could, for example, pull ideas from a knowledgeable YouTube makeup influencer or a mean-spirited Reddit thread. It could invent a fake product or make false claims about a real one, she said. Because its training data is so vast and opaque, the bot is vulnerable to bias and mistakes.
But that same training data could also give chatbots an edge as shopping and beauty assistants compared with traditional search engines, said Perplexity spokesman Jesse Dwyer. Rather than sifting through dozens of Reddit threads or YouTube videos for the perfect anti-ageing product, a shopper could explain to the bot that she’s tired of the dark circles under her eyes and wants something that helps. Ideally, it would understand her meaning and save her time shopping, Dwyer said.
One TikTok video asking ChatGPT for glow-up recommendations drew more than 220,000 views and a slew of positive comments. A commenter said the bot rated their attractiveness on a 10-point scale.
“It told me I am mid and could go from a five to a seven with the help of makeup and fillers,” they said.
While ChatGPT maker OpenAI doesn’t publicly share what data its AI systems are trained on, the training data probably includes online forums where people rank other people’s attractiveness (largely men rating women), such as the subreddit r/RateMe or the website Hot or Not, said Alex Hanna, director of research at the Distributed AI Research Institute.
While the training data contains diverse ideas, chatbots tend to veer toward the most common threads – such as the conviction that women need to constantly improve their looks, Hanna said.
“We’re automating the male gaze,” added Emily Bender, a computational linguist who specialises in generative AI and is co-author, with Hanna, of the book The AI Con.
OpenAI spokeswoman Leah Seay Anise said the company has teams working to reduce bias in its models. She declined to say whether ChatGPT’s technology was trained on content that ranks attractiveness. Shopping features are new and still being refined, she said.
Still, the potential for bias hasn’t stopped many people from turning to bots rather than humans for sensitive conversations. Already, some users rely on chatbots for companionship, including discussions in the style of mental health therapy. Sometimes, all people want is a sounding board that doesn’t come with its own share of human messiness.
Users who asked ChatGPT for feedback on their looks said they were happy with the results, even when the bot pointed out perceived imperfections. Michaela Lassig, a 39-year-old in Washington state, asked ChatGPT to help her glow up before her wedding. In her prompt, she told the bot her goals (flawless, youthful skin), her budget (US$2500) and her timeline.
“Ideally, you will give me a list of procedures or services and tell me when to do them for the best skin and face I can have by July 16,” she prompted the bot.
It spat out a detailed list of the signs of ageing on her face. But in the end, she welcomed the recommendations – it even correctly estimated the units of Botox her injector would suggest. Lassig said she was careful in wording her prompt to focus on her personal skin goals rather than some universal standard.
Haley Andrews, 31, wanted the unfiltered truth about her looks and “wasn’t looking for nice”. She always appreciated how her older sister gave criticism without sugarcoating, so she went to the bot with a special request.
“I told it, ‘Please speak like an older sister who tells the truth because she loves you and wants the absolute best for you, even though it’s a little harsh,’” Andrews said.
It told her that her eyebrows were thinning and her complexion fell flat without blush.