Promotional feature
The Brandtech Group

Mitigating bias in AI

Strategies for using automation equitably.

Artificial intelligence (AI) has the potential to transform how we work in countless positive ways. But how we program these algorithms, and the bias we feed into them as we do, has a profound impact on how they make decisions.

During a recent Campaign US webinar, technology editor Jessica Heygate moderated a discussion among Yasmine El Tabib, VP of operations and enablement at Limbik; Mark Walker, CEO, cofounder and chairman of Direct Digital Holdings; and Rebecca Sykes, partner and head of emerging tech at The Brandtech Group, about the risks AI bias poses to brands and society, as well as ways to help algorithms celebrate diversity and understand cultural nuances.

Identifying bias in algorithms

The software behind Limbik, a predictor of the virality and believability of information, was initially trained on a U.S. population. As the company branched into global markets, it “quickly realized that the assumptions and biases for the U.S. may be very different than, for example, [those of] a Chinese national because of where we grew up,” El Tabib said. Limbik altered its model “to understand how people from different cultures, from different languages, from different areas around the world would receive information.”

Having a diverse team involved in AI is one of the key ways of conducting quality control and identifying bias before it becomes a problem at scale. It’s why Pencil, a platform that uses AI to generate channel-ready advertising, “is a tool to put in creative people's hands, not an autonomous solution automatically generating content in and of itself,” Sykes said. “It matters whose hands they are — the more diverse the AI team, the more diverse the AI work, and people can spot those things sooner.”

The problem, Walker said, is that “most AI is actually programmed based on historical references [and biases]. The way that we do it is garbage in, garbage out.” That is why, for example, African Americans have received longer sentences from AI tools than other groups: the systems drew on biased historical references. Nuance is difficult for AI. “You have to have some level of human interaction or benefit in order to help modify, change or identify if biases are actually being perceived inside of AI,” he explained.

The risks of AI bias

“It's really important for everyone to stop addressing AI as if it’s a sentient being and remember that it is programmed by humans, including the bias that humans have had,” El Tabib said. “This is a tool that is imperfect, that still needs human intervention to understand what those capabilities and limitations are so it can better serve us as a society using it moving forward.”

Bias is a risk to both brand safety and society. Whether positive or negative, the impact on people’s lives is greater with AI because of its capacity to magnify, repeat and perpetuate biases, Walker said. “That’s one of the reasons you have seen a certain hesitancy, specifically with the LGBTQ+ community as well as the African-American community, of trying to have some level of privacy around data and why there has been a distrust in the perpetuation of data.”

While different players within the advertising industry are developing their own AI or using models owned by companies such as OpenAI, there are also many open-source tools and models available. “[Open source] allows for more transparency and more scrutiny, it also allows for anyone to solve a problem that they’re seeing,” Sykes explained.

To prevent those biases, “as early-scale adopters, it’s our responsibility to force features where they don’t exist in the models today,” she added. For example, after discovering that an image generation model visualized a CEO as a white male 97% of the time, The Brandtech Group created the Bias Breaker, a roll-of-dice probability model around common areas of bias such as gender, age, race and disability. “[The tool] gives us some probability of seeing complex and intersectional representation delivered back to us, which doesn't exist in the data that the models are trained on and therefore will never be derived automatically,” Sykes explained.
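As an illustration only, not The Brandtech Group’s published implementation, a “roll-of-dice” prompt augmenter of the kind described above might work like the sketch below: before a brief reaches an image-generation model, demographic attributes are sampled from explicit probability tables and appended to the prompt, so representation follows the probabilities the team sets rather than the skew in the model’s training data. The dimensions, options and weights are assumptions made for the example.

```python
import random

# Illustrative sketch of a "roll-of-dice" prompt augmenter in the spirit of
# Bias Breaker. All dimensions, options and weights are assumptions, not
# Brandtech's actual values.
DIMENSIONS = {
    "gender":     (["woman", "man", "non-binary person"], [0.45, 0.45, 0.10]),
    "ethnicity":  (["Black", "East Asian", "South Asian", "white", "Latina or Latino"],
                   [0.2, 0.2, 0.2, 0.2, 0.2]),
    "age":        (["in their 30s", "in their 50s", "in their 70s"], [0.4, 0.4, 0.2]),
    "disability": (["who uses a wheelchair", "who wears hearing aids", ""],
                   [0.1, 0.1, 0.8]),
}

def roll_person(rng: random.Random) -> str:
    """Sample one intersectional description, e.g. 'woman, South Asian, in their 50s'."""
    picks = [rng.choices(options, weights=weights, k=1)[0]
             for options, weights in DIMENSIONS.values()]
    return ", ".join(p for p in picks if p)  # drop empty (no-attribute) picks

def augment_prompt(base_prompt: str, seed: int | None = None) -> str:
    """Append sampled attributes to the prompt before it reaches the image model."""
    rng = random.Random(seed)
    return f"{base_prompt} ({roll_person(rng)})"

print(augment_prompt("studio photo of a CEO at a boardroom table", seed=42))
```

Seeding the random generator makes a campaign’s sampling reproducible, which helps when auditing how often each combination of attributes actually appears in the generated work.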

Creating better solutions

The best way to solve for bias and create a better understanding of cultural nuances in AI is “by having representation around the table to program and having that checkpoint in the way that the AI is configured in design,” Walker explained. “When it comes to cultural nuances — the tone, the word, when the word is used, what might precede or go after the word — [these] can give a variance in meaning in how words are communicated, how an image is communicated or how the relevant content to that community might be seen or interpreted.”

The key is to have human guidance at the foundation. “[That way] you can understand when you are configuring and you’re starting to receive data how to put the tags around it so that it understands that that's a cultural nuance, culturally relevant for one community and might be offensive to another,” Walker added.

Prompts can also play a key role in generating more equitable results and testing outcomes. A brand can, for example, reverse-engineer its imagery to understand what associations are drawn from it. “Prompts can be incredibly useful as a forward motion piece to work out what to make next, but they can also be an insightful way of looking at what we have already put out into the world and where there is opportunity around our brands,” Sykes said.
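One hedged way to approximate that kind of reverse engineering is to score an existing brand asset against candidate descriptions with an off-the-shelf CLIP model and see which readings the image most strongly supports. The sketch below uses the Hugging Face transformers zero-shot image classification pipeline; the file path and label list are placeholders for the example, not a tool mentioned in the webinar.

```python
from transformers import pipeline

# Score a brand image against candidate association labels with CLIP.
clf = pipeline("zero-shot-image-classification",
               model="openai/clip-vit-base-patch32")

candidate_labels = [
    "a young white man in an office",
    "an older Black woman leading a meeting",
    "a person with a visible disability at work",
    "a diverse team collaborating",
]

# "brand_asset.jpg" is a placeholder path for an existing creative asset.
scores = clf("brand_asset.jpg", candidate_labels=candidate_labels)
for result in sorted(scores, key=lambda r: r["score"], reverse=True):
    print(f'{result["label"]}: {result["score"]:.2f}')
```

Run across a back catalogue of assets, this kind of scoring gives a rough picture of which associations a brand has already put into the world and where representation gaps may sit.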

In conclusion, El Tabib advised companies to “consider the specific things around a culture that could be more sensitive, what we already know about this culture and troubles in the information environment that we can add into the survey so that we can train based on real data, based on human data, not just data from a collective information environment that has a lot of historical bias within it.”
