When the multicultural agency Lerma prompted the generative artificial intelligence tool Midjourney to generate an image of a Hispanic or Latinx man at work, the results were troubling. The team consistently received stereotypical images featuring a male figure with a prominent mustache and wearing a sombrero.
“And no matter how we regenerated prompts hoping to get different results, we ended up getting the same result,” said Pedro Lerma, the agency’s founder and CEO.
Similarly, agency Alma used Midjourney to build a proof of concept for its client Rockstar Energy Drink, previewing what a campaign targeting a multicultural audience would look like. That, too, led to problems.
“The tool would occasionally produce images featuring individuals with blonde hair when we requested images of women,” said Mike Sotelo, vp of digital at Alma. “Multicultural audiences typically don’t have blonde hair.”
And when seeking images of a Latinx man, Alma too would get stereotypical pictures, such as a Latinx individual wearing a mariachi hat while eating a taco.
“That’s what we don’t want and that is what happens quite a bit,” added Sotelo.
These are just a few examples of the racial biases stemming from generative AI, in both image- and text-based tools, which are becoming a bigger concern for Hispanic and broader multicultural agencies. The problem is particularly prominent in client-facing projects, where creating accurate visual references for storyboards is a crucial part of the work. Even when agencies refine their prompts and image requirements, the results are either distorted or lack diverse visual output altogether.
The use of AI tools is only expected to grow. Major corporations are expected to generate approximately 30% of their marketing content with generative AI tools by 2025, according to Gartner. This struggle to accurately represent Latinx culture undermines the progress made in achieving inclusive representation within marketing.
The problem with changing prompts
To counteract inaccurate image generation, agencies like Alma use highly specific prompts, including specifying hair color. However, refining prompts doesn’t always work.
Earlier this year, while working on an internal project creating an AI-generated music video, Lerma encountered several cultural stereotypes in the representation of human models. The problem occurred across several AI tools, including Midjourney, Stable Diffusion and Adobe’s Firefly.