AI-generated Asians were briefly unavailable on Instagram

Yesterday, I reported that Meta’s AI image generator was making everyone Asian, even when the text prompt specified another race. Today, I briefly had the opposite problem: I was unable to generate any Asian people using the same prompts as the day before.

The tests I did yesterday were on Instagram, via the AI image generator available in direct messages. After dozens of tries with prompts like “Asian man and Caucasian friend” and “Asian man and white wife,” the system produced just one accurate picture of an Asian woman and a white man; every other attempt made everyone Asian.

After I initially reached out for comment yesterday, a Meta spokesperson asked for more details about my story, like when my deadline was. I responded and never heard back. Today, I was curious if the problem was resolved or if the system was still unable to create an accurate image showing an Asian person with their white friend. Instead of a slew of racially inaccurate pictures, I got an error message: “Looks like something went wrong. Please try again later or try a different prompt.”

Weird. Did I hit my cap for generating fake Asian people? I had a Verge co-worker try, and she got the same result.

I tried other even more general prompts about Asian people, like “Asian man in suit,” “Asian woman shopping,” and “Asian woman smiling.” Instead of an image, I got the same error message. Again, I reached out to Meta’s communications team — what gives? Let me make fake Asian people! (During this time, I was also unable to generate images using prompts like “Latino man in suit” and “African American man in suit,” which I asked Meta about as well.)

Forty minutes later, after I got out of a meeting, I still hadn’t heard back from Meta. But by then, the Instagram feature was working for simple prompts like “Asian man.” Silently changing something, correcting an error, or removing a feature after a reporter asks about it is fairly standard for many of the companies I cover. Did I personally cause a temporary shortage of AI-generated Asian people? Was it just a coincidence in timing? Is Meta working on fixing the problem? I wish I knew, but Meta never answered my questions or offered an explanation.

Whatever is happening over at Meta HQ, the company still has some work to do: prompts like “Asian man and white woman” now return an image, but the system still gets the races wrong and makes both people Asian, just as it did yesterday. So I guess we’re back to where we started. I will keep an eye on things.

Screenshots by Mia Sato / The Verge
