How Targeted Ads Reinforce Bias
By Kiara Caridine, National SAVI Buyer
Recently, some advertisers have shifted their media strategy to become increasingly digital-focused. Digital ads allow advertisers to individualize their messaging in hopes of gaining new customers while keeping existing ones. Pundits have argued that these targeted ad strategies negatively impact the consumer. As targeting parameters are built from data segments, advertisers “…are allowing the ability of automated systems to perpetuate harmful biases.” Consequently, these strategies take on an oppressive role toward women and Black and Brown communities.
Companies like Facebook have been slammed for playing an “oppressive role” by failing to audit technology that keeps marginalized groups from being exposed to certain ads, thus maintaining the unjust systems that have created barriers for individuals and communities since our nation’s inception.
Racial and gender bias is not a new concept in American history. This is evident when we look at President Roosevelt’s Home Owners’ Loan Act of 1933. The goal of the policy was to help individuals refinance their mortgages and avoid foreclosure. In reality, it allowed banking institutions to use a color-coded system to evaluate which areas were worth investing in. “Areas colored green were considered the safest places to invest, followed by blue, then yellow. Areas in red were considered the riskiest…” according to Marketplace.org.
Areas considered “risky” did not receive loans because they were populated by Black people, a practice that became known as “redlining.” The effects of redlining continue to permeate our society: it has left predominantly Black and Brown neighborhoods locked in concentrated poverty, increased racial segregation across communities, and reinforced the implicit racial biases that inform our daily decisions.
The more things change, the more they stay the same.
Fast forward more than 80 years from FDR’s legislation to 2019, when Facebook came under major scrutiny from the Department of Housing and Urban Development, which charged the company with violating the Fair Housing Act. Researchers at Northeastern University demonstrated that “Facebook was differently showing ads for housing and jobs by gender and race.”
Redlining and its disastrous consequences didn’t disappear; the practice has simply evolved over time and seeped into the digital space.
Digital redlining perpetuates racial and gender bias through targeted ads.
A recent study covered by Harvard Business Review explored the use of dynamic pricing and targeted discounts. The researchers asked whether, and how, biases might arise when the prices consumers pay are decided by an algorithm. They examined a series of e-commerce pricing experiments to investigate how people across the United States responded to different price promotions.
The study confirmed that “people in wealthy areas responded more strongly to e-commerce discounts than those in poorer ones and, since dynamic pricing algorithms are designed to offer deals to users most likely to respond [to] them,” such campaigns will most likely offer lower prices to those with higher household incomes.
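To make that mechanism concrete, here is a minimal, hypothetical sketch in Python (not the researchers’ actual model; the segment names and response rates below are invented for illustration). The allocator never looks at income directly; it simply sends a discount to the users it predicts are most likely to respond. Because the simulated response rates are higher for wealthier segments, the “optimized” campaign still ends up concentrating its discounts on higher-income users.

```python
# Illustrative sketch only (not the study's actual model): a discount allocator
# that targets users by predicted response rate. If responsiveness correlates
# with household income, a purely "optimized" campaign ends up giving most of
# its discounts to wealthier users. All numbers below are made up.

import random

random.seed(42)

def simulate_users(n=10_000):
    """Generate hypothetical users with an income segment and a response rate.

    Assumption for illustration: wealthier segments respond to e-commerce
    discounts at a higher rate, as the experiments reported by HBR observed.
    """
    base_rates = {"low_income": 0.02, "middle_income": 0.04, "high_income": 0.08}
    users = []
    for i in range(n):
        segment = random.choice(list(base_rates))
        # Noisy "predicted response rate" that a targeting model would score with.
        predicted = max(0.0, random.gauss(base_rates[segment], 0.01))
        users.append({"id": i, "segment": segment, "predicted_response": predicted})
    return users

def allocate_discounts(users, budget=2_000):
    """Greedy allocator: send the discount to the users most likely to respond."""
    ranked = sorted(users, key=lambda u: u["predicted_response"], reverse=True)
    return ranked[:budget]

if __name__ == "__main__":
    users = simulate_users()
    winners = allocate_discounts(users)
    # Share of discounts received by each income segment.
    for segment in ["low_income", "middle_income", "high_income"]:
        share = sum(1 for u in winners if u["segment"] == segment) / len(winners)
        print(f"{segment}: {share:.1%} of discounts")
    # Typical output: the high-income segment captures nearly all of the
    # discounts even though each segment is roughly one third of the population.
```

The point of the sketch is that no one has to target income deliberately; optimizing purely for response is enough to reproduce the disparity.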
Another article recounted the types of ads that women in STEM (science, technology, engineering, and mathematics) professions were shown on their social media platforms. Most of the women reported that they saw few ads related to their professional disciplines.
Many of them also noted that their male significant others (also in STEM professions) saw more ads in the science and tech fields, ranging from “certificate programs to products to upcoming events, including career fairs.” While the men were being equipped with tools to help advance their careers, the women reported that the targeted ads they typically received were related to weddings and starting a family.
What are some solutions to overcome these issues?
Companies need to take a step back and think about how they can help dismantle the racial and gender inequities perpetuated through their automated systems. Some ways that marketers and advertisers can work toward that include:
- Reviewing how their ads are perceived, as well as who is viewing them. Consider hiring an external agency with racial equity expertise to review the algorithms’ work and identify biases.
- Prioritizing employee education that promotes inclusion and identifies biases – this will benefit ad companies both internally and externally. All teams should receive more training on anti-discrimination laws and implicit bias, and these sessions need to emphasize the harm to protected classes and the real human cost of getting this wrong.
- Auditing automated decision-making and optimization processes. Businesses need human eyes providing formal oversight of their internal systems; a simple sketch of what such a check might look like follows this list. If we turn a blind eye and allow automated systems to echo the prejudices of the past by calling it “optimization,” without any checks and balances in place, then we are doing our brands and industry a disservice.
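As one illustration of what that human oversight could check, here is a small, hypothetical audit sketch in Python (the group names, numbers, and threshold are invented for the example, not drawn from any platform’s tooling). It compares how often an ad was delivered to each demographic group and flags groups that fall well below the best-served group, borrowing the “80% rule” from employment law only as a rough yardstick.

```python
# Hypothetical audit sketch (names, numbers, and threshold are illustrative,
# not from any specific ad platform): compare how often an ad was delivered
# to each demographic group and flag large gaps for human review.

from collections import defaultdict

def delivery_rates(impressions):
    """impressions: list of dicts like {"group": "women", "delivered": True}."""
    shown = defaultdict(int)
    total = defaultdict(int)
    for imp in impressions:
        total[imp["group"]] += 1
        shown[imp["group"]] += int(imp["delivered"])
    return {group: shown[group] / total[group] for group in total}

def audit(impressions, threshold=0.8):
    """Flag groups whose delivery rate is below `threshold` of the best-served
    group's rate (the '80% rule' heuristic, used here only as a yardstick)."""
    rates = delivery_rates(impressions)
    best = max(rates.values())
    flags = {g: r for g, r in rates.items() if best > 0 and r / best < threshold}
    return rates, flags

if __name__ == "__main__":
    # Made-up sample: a job ad delivered far less often to women.
    sample = (
        [{"group": "men", "delivered": True}] * 70
        + [{"group": "men", "delivered": False}] * 30
        + [{"group": "women", "delivered": True}] * 35
        + [{"group": "women", "delivered": False}] * 65
    )
    rates, flags = audit(sample)
    print("Delivery rates:", rates)
    print("Groups needing human review:", flags)
```

A flagged result is not proof of discrimination on its own, but it tells a human reviewer exactly where to look.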
Digital media is powerful and has the ability to inform every aspect of our daily lives. Therefore, we have a responsibility to check our work for implicit bias, dismantle oppressive systems, foster equality and increase access to opportunities for all.
Sources:
https://www.admonsters.com/ad-targeting-bias/
https://hbr.org/2019/11/how-targeted-ads-and-dynamic-pricing-can-perpetuate-bias
https://www.wired.com/story/are-facebook-ads-discriminatory-its-complicated/
https://techcrunch.com/2020/06/24/biased-ai-perpetuates-racial-injustice/
https://hbr.org/2019/10/what-do-we-do-about-the-biases-in-ai