How AI anchors subjective and objective predictions
SMITH BRAIN TRUST – Our collective obsession with beauty has fueled a litany of industries – from cosmetics to cosmetic surgery, and now even AI. Increasingly popular beauty filter apps let users smooth out facial imperfections and tweak individual features – enlarging eyes, slimming the face or narrowing the nose. Services such as Qoves Studio and Face Plus Plus offer facial assessment tools that run on neural networks trained on sample data of people’s faces to make recommendations – for makeup strategies or even surgical intervention.
But the pitfalls of AI-driven beauty scoring begin with the premise that beauty is in the eye of the beholder, says Lauren Rhue, assistant professor of information systems at the University of Maryland’s Robert H. Smith School of Business. “There are all these different cultural standards that have to do with beauty. How can you train an algorithm to determine whether or not someone is beautiful?”
Rhue, whose research explores the economic and social implications of technology, says that even though popular social platforms like TikTok, Instagram and Facebook have denied using such algorithms, the “recommendation algorithms” themselves often end up gauging attractiveness – whether intended or not.
“If you look at what Instagram wants, it's going to be essentially models, right? You're not going to see a lot of different types of facial features and expressions. And that's going to perpetuate this idea of beauty because… of the lack of diversity in what you see on Instagram, and what's extremely popular on Instagram,” said Rhue, sharing her insights in a recent MIT Technology Review podcast.
Rhue acknowledges a certain “entertainment value” in beauty filter apps. “But our choice of beauty filters is definitely informed by the culture,” she added, and that includes Eurocentric beauty standards.
In recent research, now in the working paper stage, Rhue explores how AI anchors subjective and objective predictions. She finds that women with lighter skin and hair were consistently rated as “more attractive” than counterparts with darker skin and hair, and that filters that use facial detection likely have racial bias built in.
Moreover, Rhue says AI applications in beauty are largely being overlooked by the tech community: “It's just not something that we're really talking about. And I think that speaks to the importance of diversity in this space. A lot of people say, ‘Oh, well, beauty is just not important because we're tech people and we're objective.’”
“But beauty is a huge industry. It has such an impact on people. And the idea that there isn't more research is really interesting to me.”
To read more, or hear the podcast, go to: MIT Technology Review’s Podcast: In the AI of the Beholder.