Marketers are increasingly figuring out that artificial intelligence can be biased.
In recent weeks, testing by The Brandtech Group on its gen AI-as-a-service platform, Pencil, revealed troubling results when generating images of a CEO. Two models exclusively produced male images in 100 generations, while a third model showed 98% male results. A fourth and fifth model displayed male CEOs 86% and 85% of the time, respectively.
The male-heavy results highlight a disconnect from reality: McKinsey’s 2023 report found that women hold 28% of C-suite roles, and 10.4% of Fortune 500 CEOs are women. Pencil aggregates major AI models trained on both open web data and copyright-cleared libraries. The company wouldn’t share specifics on which AI models showed bias.
To tackle AI biases like age and race stereotyping, The Brandtech Group is launching a technology product aptly named Bias Breaker. The tool has been in development and beta testing for more than a year, and The Brandtech Group said it is now being offered to all clients.
The Brandtech Group is pitching Bias Breaker as a way to correct diversity factors like race, gender, and age. It is incorporated into Pencil, which has generated over 1 million ads for more than 5,000 brands and processed $1 billion in media spend since 2018. The Brandtech Group credited Bias Breaker for new business wins but declined to name clients.
“Ethics in [marketing] is about using this technology in a responsible way,” said The Brandtech Group partner and head of emerging technology Rebecca Sykes. “That means that it does no harm and is net positive in terms of the capability marketers have to reach more people in more relevant and applicable ways.”
Marketers are embracing the efficiency and scalability of generative AI tools for creating targeted ad images. However, if those AI-generated images lean on stereotypes, the potential ROI could be compromised, according to The Brandtech Group.
Some of the big brands working with The Brandtech Group have sounded the alarm on stereotypical defaults in ads generated with Pencil, according to Sykes. For example, AI might automatically generate an image of a woman cleaning a house for a cleaning product brand.
“[Brands] don’t always want that stereotype,” said Sykes. “We’re making sure that we don’t quietly slip back into accepting a default position.”


