According to The Verge, Meta AI routinely fails to produce appropriate images for prompts such as “Asian man and Caucasian friend” or “Asian man and white wife.” Instead of depicting the mixed-race pairings requested, the company’s image generator skews toward showing people of the same race, even when specifically instructed otherwise.
Engadget confirmed these findings through our own testing of Meta’s web-based image generator. Prompts like “an Asian man with a white woman friend” or “an Asian man with a white wife” generated images of Asian couples. When asked for “a diverse group of people,” Meta AI produced a grid of nine white faces and one person of color. The generator did not fail every time: on a few occasions, it produced a single result that reflected the prompt.
As The Verge points out, there are other, more “subtle” signs of bias in Meta AI, such as a tendency to make Asian men appear older while making Asian women appear younger. The image generator also sometimes added “culturally specific attire” even when that wasn’t part of the prompt.
It’s not clear why Meta AI struggles with these types of prompts, though it’s not the first generative AI platform to come under scrutiny for its depiction of race. Google paused its Gemini image generator’s ability to create images of people after it overcorrected for diversity, producing bizarre results in response to prompts about historical figures. Google later explained that its internal safeguards failed to account for situations where diverse results were inappropriate.
Meta didn’t immediately respond to a request for comment. The company has previously described Meta AI as being in “beta” and thus prone to making mistakes. Meta AI has also struggled to accurately answer simple questions about current events and public figures.