Why Is AI So Bad at Generating Images of Kamala Harris?

When Elon Musk shared an image showing Kamala Harris dressed as a “communist dictator” on X last week, it was quite obviously a fake, seeing as Harris is neither a communist nor, to the best of our knowledge, a Soviet cosplayer. And, as many observers noted, the woman in the photo, presumably generated by X’s Grok tool, bore only a passing resemblance to the vice president.

“AI still is unable to accurately depict Kamala Harris,” one X user wrote. “Looks like they’re posting some random Latina woman.”

“Grok put old Eva Longoria in a snazzy outfit and called it a day,” another quipped, noting the similarity of the “dictator” pictured to the Desperate Housewives star.

“AI just CANNOT replicate Kamala Harris,” a third posted. “It’s uncanny how failed the algorithm is at an AMERICAN (of South Indian and Jamaican heritage).”

Many AI images of Harris are similarly bad. A tweet featuring an AI-generated video showing Harris and Donald Trump in a romantic relationship—it culminates in her holding their love child, which looks like Trump—has nearly 28 million views on X. Throughout the montage, Harris morphs into what look like different people, while the notably better Trump imagery remains fairly consistent.

When we tried using Grok to create a photo of Harris and Trump putting their differences aside to read a copy of WIRED, the results repeatedly depicted the ex-president accurately while getting Harris wrong. The vice president appeared with varying features, hairstyles, and skin tones. On a few occasions, she looked more like former First Lady Michelle Obama.

That Harris is a Black woman, of Jamaican and Indian descent, may also be a factor. Irene Solaiman, head of global policy at AI company Hugging Face, says that “poorer facial recognition for darker skin tones and femme features” may affect the sorting of images of Harris for automated labeling. The issue of facial recognition failing to identify female and darker-skinned faces was first highlighted by the 2018 Gender Shades study published by Joy Buolamwini, an MIT researcher, and Timnit Gebru, now the founder and executive director of the Distributed Artificial Intelligence Research Institute.

There may be yet another reason why AI portrayals of Harris are not especially good. “The images are not being created to be photorealistic but rather are being created to push a narrative,” says Hany Farid, an expert on deepfake detection and cofounder of GetReal Labs, a startup offering software to catch fake media.

In other words, those sharing AI-generated images of Harris may often be more interested in producing meme-worthy scenarios than refining the realism of her likeness. The “communist dictator” image shared by Musk and the video in which Harris holds her Trumpy baby both serve to ridicule and denigrate the Democratic candidate rather than spread disinformation.

Ari Lightman, professor of digital media and marketing at Carnegie Mellon University’s Heinz College, says some people may even be selecting bad Harris images on purpose in an effort to emphasize the idea that she is a fraud. “This is an AI-generated communications era,” Lightman says. “If it’s done crudely, it’s designed to send a message.”

Source: Wired