Meta’s AI image generator can’t imagine an Asian man with a white woman
[Image captions: seeing an Asian person with a white friend; creating an image of two South Asian people using the Instagram feature; a first response from Meta]
Have you ever seen an Asian person with a white person, whether that’s a mixed-race couple or two friends of different races? Seems pretty common to me; I have lots of white friends! Meta’s AI image generator, apparently, can’t picture it: when I asked for an image of an Asian man with a white woman, the tool kept giving me two Asian people instead.
Interestingly, the tool performed slightly better when I specified South Asian people. It successfully created an image using the prompt “South Asian man with Caucasian wife,” and it also added bindis and saris to the South Asian women it created, without my asking. I am not trying to wade into the age gap debate, but that one successful image was odd because it featured a noticeably older man with a younger woman. And when I immediately generated another image using the same prompt, the tool reverted to showing two South Asian people, with the man once again older than the woman.
Meta introduced its AI image generation tools last year, and its sticker creation tool promptly went off the rails as people used it to make things like nude images and Nintendo characters with guns.
A day after I reported that Meta’s image generator made everyone Asian when the text prompt specified another race, I couldn’t generate any Asian people at all using the same prompts as the previous day. Not a big deal, I’m sure.
After I initially reached out for comment yesterday, a Meta spokesperson asked for more details about my story, like when my deadline was. I did not hear back after I responded. I was curious whether the problem had been fixed or whether the system was still unable to create an accurate image showing an Asian person with a white friend, so I tried again. Instead of an image, I got an error message: “It looks like something went wrong. Try a different prompt, or try again later.”
I tried other, even more general prompts about Asian people, like “Asian man in suit,” “Asian woman shopping,” and “Asian woman smiling.” Each one returned the same error message instead of an image. Again, I reached out to Meta’s communications team: what gives? I’d like to make fake Asian people. I was also unable to generate images using the prompts “Latino man in suit” and “African American man in suit,” which I asked Meta about as well.
Forty minutes later, after I got out of a meeting, I still hadn’t heard back from Meta. But by then, the Instagram feature was working again for simple prompts like “Asian man.” It’s fairly standard for the companies I cover to silently fix a mistake, change something, or remove a feature after a reporter asks about it. Did I personally cause a temporary shortage of AI-generated Asian people? Was the timing just a coincidence? Did Meta quietly fix the problem? I wish I knew, but Meta never answered my questions or offered an explanation.
The system still makes the Asian man and the white woman look like they’re from a different time and place, and Meta HQ still has some work to do. We’re back where we started. I’ll keep an eye on things.