How the Facebook ad algorithm & ad-delivery system skews outcomes

Facebook's enormous financial success comes largely from the online advertising and precise targeting features it offers on its platform. While researchers around the world have shown, through various in-depth studies, that business owners and advertisers can target, or exclude, specific groups of users when showing their advertisements, relatively little attention has been paid to the ramifications of the ad-delivery process that decides which user sees which ad. There is growing evidence that the Facebook ad algorithm can skew ad delivery even when the advertiser does not intend it to, making some Facebook users more likely than others to see certain ads solely because of their demographic traits.

In 2019, the United States Department of Housing and Urban Development sued the tech giant over the way it allows advertisers to target ads on its platform on the basis of gender, race and religion, all of which are protected classes under US law.

Housing ads are covered by the Fair Housing Act, under which an advertiser cannot discriminate on a protected demographic attribute: doing so excludes someone from a life opportunity, which is far more problematic and potentially a legal violation.

Facebook said it would look into the matter and stop allowing such targeting. However, a new study [1] has produced evidence that the Facebook ad algorithm, which automatically decides who is or isn't shown a particular ad, perpetuates demographic discrimination anyway.

The study with evidence of Facebook ad algorithm skewing ad delivery


If an ad shows up in your Facebook or Facebook-owned Instagram feed, two parties decided you should see it. First, the advertiser, which included you in its target audience either by uploading a curated list of contact numbers, email addresses and so on, or by choosing from the hundreds of thousands of attributes the tech giant offers, for example Australians, working professionals or people under 30. Second, the Facebook ad algorithm, which took the final call on who in that pool would actually see the ad, based on automated predictions of who is most likely to view, like and interact with it, drawn from all the attributes Facebook knows about you. The study focused on this second step and tried to understand the potential bias in Facebook's ad-delivery process.

The study, led by Piotr Sapiezynski and Muhammad Ali along with their team from Northeastern University, ran multiple otherwise identical ads on the platform with minute variations in headline, text, image and available budget. They uploaded a list of random American phone numbers as a custom audience and switched off every targeting option other than restricting the audience to adults in the United States.

What they discovered was that those slight variations had a considerable impact on each ad's audience. The differences were most noticeable when the advertisements were for real estate or jobs. Ads for jobs involving heavy lifting, and for janitor and taxi-driver positions, were shown to a higher proportion of minorities.

Additionally, the job ads they posted for cleaning work were shown to an audience that was 88% women. Postings for secretaries and preschool teachers were also shown to a significantly higher fraction of women. Postings about houses for sale were shown to a higher proportion of white users, while ads for rental properties were shown to more non-white users. The team also set up an ad for bodybuilding products and found that, despite no specific targeting, it was shown to an audience that was 80-85% men. An ad for makeup kits was shown to 97% women.

Facebook, your data & how it decides what ads to show you

The researchers concluded that Facebook analyses the content of the ad and compares it to the interests of the user. How does Facebook know this? The tech giant builds an extensive profile based on things you and your friends have done and are interested in. And that is not all: Facebook also has data about the websites you've visited, things you've purchased, your locations, the apps on your phone and much more.

All of this information helps the Facebook ad algorithm make automated predictions about whether or not you're likely to interact and engage with any particular ad, which ultimately decides whether the ad shows up in your feed.
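A minimal, hypothetical sketch of how such engagement-based delivery could play out; the scoring function, user profiles and interest keywords below are invented for illustration and are not Facebook's actual model:

```python
# Hypothetical sketch: rank users in the advertiser's targeted pool by
# predicted engagement, then show the ad only to the top scorers.
# All names, profiles and the scoring rule are illustrative assumptions.

def predict_engagement(user_interests, ad_keywords):
    """Score a user for an ad by the overlap between inferred profile
    interests (likes, browsing, purchases) and the ad's content."""
    overlap = len(set(user_interests) & set(ad_keywords))
    return overlap / max(len(ad_keywords), 1)

def deliver(ad_keywords, eligible_users, slots):
    """Deliver the ad to the highest-scoring users in the pool."""
    ranked = sorted(eligible_users,
                    key=lambda u: predict_engagement(u["interests"], ad_keywords),
                    reverse=True)
    return [u["id"] for u in ranked[:slots]]

users = [
    {"id": "a", "interests": ["country", "trucks"]},
    {"id": "b", "interests": ["hip hop", "sneakers"]},
    {"id": "c", "interests": ["country", "baking"]},
]

# A "country album" ad reaches only users whose inferred interests match,
# even though the advertiser targeted all three users equally.
print(deliver(["country"], users, 2))  # → ['a', 'c']
```

If those inferred interests correlate with age, gender or race, the delivery skew described in the study follows even from a neutral targeting choice.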

If you want an idea of what Facebook thinks your interests are, check your Ad Preferences page or your Ad Interests on Instagram. You might notice that some of your interests show up because they're directly linked with your age, gender, income level and race.

The Facebook ad algorithm & racial profiling

Facebook does not expose data on race, so testing whether the platform engages in racial profiling was slightly more difficult. Muhammad Ali and his team used an entirely different custom audience: instead of randomly generated phone numbers, they took voter records from North Carolina, which are public and record each individual's race. They ran ads for Rolling Stone articles about hip-hop albums, country albums or general top-30 albums, and targeted an equal number of black and white users. The results surprised the team. The neutral ads were shown to a relatively balanced audience, 45% of it white. However, the Facebook ad algorithm chose to show the country ads to 80% white users and the hip-hop ads to only 13% white users.
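The measurement itself is simple to sketch: target a balanced audience, then compute the demographic split of the users the system actually delivered each ad to. The figures below are synthetic stand-ins echoing the reported percentages, not the study's raw data:

```python
# Sketch of the study's measurement method: start from a demographically
# balanced target audience, then check who each ad was actually shown to.
# Impression lists here are fabricated to mirror the reported splits.

def delivered_fraction(impressions, group):
    """Fraction of an ad's impressions that went to one demographic group."""
    hits = sum(1 for race in impressions if race == group)
    return hits / len(impressions)

# 100 synthetic impressions per ad, echoing the article's numbers.
neutral = ["white"] * 45 + ["black"] * 55
country = ["white"] * 80 + ["black"] * 20
hip_hop = ["white"] * 13 + ["black"] * 87

for name, imps in [("neutral", neutral), ("country", country), ("hip hop", hip_hop)]:
    print(f"{name}: {delivered_fraction(imps, 'white'):.0%} white")
```

Because the targeted pool was balanced by construction, any drift away from roughly 50% white in the delivered impressions is attributable to the delivery algorithm, not the advertiser.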

Facebook’s skewed ad delivery system & political ads

Despite targeting the same audiences and setting the same goals, budget and bidding strategy, with the only differences being the content and the destination link, the ad pointing to Bernie Sanders' website went mostly to Democrats and the ad for Donald Trump went mostly to Republicans. Additionally, Ali and his team found that it cost them 1.5 times more for the ad pointing to Sanders' site to reach the same number of conservatives as Trump's ad did. This is because the social media giant effectively subsidizes what it believes to be a relevant ad.

Resistance towards Facebook & its discriminatory ad delivery system


In the past few years, Facebook has faced several lawsuits claiming the same (and further) discriminatory behaviour by its ad algorithm. After settling those lawsuits with civil rights groups, Mark Zuckerberg's social platform is revamping its targeted ad-serving algorithm.

The complaint filed by the US Department of Housing and Urban Development also stated that the delivery process inevitably recreates and reinforces groupings defined by protected class. It further stated that the tech giant's ad-delivery system restricts or prevents advertisers who want to reach a broader audience from doing so.

Major civil rights groups, like the Anti-Defamation League and the NAACP, announced an advertising boycott urging advertisers to pull their Facebook ad spending for the month of July. The "Stop Hate for Profit" campaign emphasized how the platform profits from misinformation and racism and how its attempts to curb them have failed.

Ahead of this boycott, Instagram acknowledged the issue and pledged to address algorithmic bias on its platforms (Facebook and Instagram) more directly.

Facebook’s measures to curb racial bias on its ad delivery system

As part of its response, the tech giant has been removing certain targeting attributes that it believes advertisers might use to discriminate between racial groups. The platform will also pay special attention to advertisements related to housing, employment and credit. Though this is a step in the right direction, the role of the Facebook ad algorithm and ad-delivery system is still in question.

In another step towards eliminating the racial bias the ad-delivery system may exhibit, Facebook has recently set up an 'equity team' to look into the matter. This team will be dedicated to examining the potential racial bias baked into the ad-serving algorithm.

The move is itself an acknowledgment that the algorithm of Facebook, and subsequently Instagram, could be discriminatory. The Equity Team will study the Facebook ad algorithm for racial bias and analyze the enforcement of the platform's harassment policies. Stephanie Otway, the tech giant's spokesperson, said Facebook's Responsible AI team will work with the newly formed team on this research.

Vishal Shah, Vice President of Product at Instagram, said in a statement that the racial justice movement is a significant moment for the company, and that any bias in its systems and policies runs counter to offering a platform where everyone can express themselves.

Adam Mosseri, Head of Instagram, says that while they do a lot of work to prevent subconscious bias in their products, they need to take a closer and harder look at the underlying systems they've built, and at where they need to do more to keep demographic and racial bias out of these decisions.

Though it won't be easy, advancing technology may offer ways to address these issues. Many artificial intelligence researchers are pursuing technical fixes for bias in machine learning in order to create fairer models of online advertising. A paper from Yale University and the Indian Institute of Technology [2] suggests there may be a way to constrain the algorithm to minimize discriminatory behaviour, at a small cost in ad revenue.
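One simple way to picture such a constraint (a deliberate simplification for illustration, not the auction mechanism from the paper [2]) is to cap any group's share of delivered impressions, even when that means skipping a higher-scoring impression:

```python
# Toy sketch of fairness-constrained delivery: cap how far any group's
# share of impressions may drift from parity, even if that means passing
# over a higher-scoring (more "relevant" or higher-revenue) impression.
# This is an invented simplification, not the paper's actual auction.

def constrained_deliver(candidates, max_share, total_slots):
    """candidates: list of (score, group) pairs, higher score preferred.
    max_share: cap on any one group's fraction of delivered impressions."""
    shown, counts = [], {}
    cap = max(1, int(max_share * total_slots))
    for score, group in sorted(candidates, reverse=True):
        if len(shown) == total_slots:
            break
        if counts.get(group, 0) < cap:  # enforce the per-group cap
            shown.append((score, group))
            counts[group] = counts.get(group, 0) + 1
    return shown

cands = [(0.9, "m"), (0.8, "m"), (0.7, "m"), (0.6, "f"), (0.5, "f")]
# Unconstrained top-4 delivery would be 3 men and 1 woman; a 50% cap
# forces a 2/2 split, at the cost of skipping the 0.7-scoring impression.
print(constrained_deliver(cands, 0.5, 4))
```

The skipped 0.7-scoring impression is exactly the "small cost to ad revenue" trade-off the paper describes: the platform forgoes some predicted engagement to keep delivery balanced.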

The issue of discrimination goes back to how the Facebook ad algorithm fundamentally works: it is based primarily on machine learning, whose job is to find patterns in great amounts of data and reapply them to make ad-showing decisions. There are various ways bias can creep into that process, but the two most apparent in Facebook's case relate to problem framing and data collection.
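The data-collection problem is easy to demonstrate with a toy model: a system that estimates click rates from historical logs will faithfully reproduce whatever skew those logs contain. The log below is fabricated for the example:

```python
# Illustration of bias entering at data collection: a naive model that
# learns click rates from historical impression logs reproduces the skew
# in those logs. All data here is fabricated for the example.
from collections import defaultdict

def fit_click_rates(log):
    """Estimate P(click | group, ad) from past (group, ad, clicked) records."""
    shown, clicked = defaultdict(int), defaultdict(int)
    for group, ad, click in log:
        shown[(group, ad)] += 1
        clicked[(group, ad)] += click
    return {k: clicked[k] / shown[k] for k in shown}

# Historical log in which a job ad was mostly shown to men, and the few
# impressions to women happened to convert less often.
log = ([("m", "job", 1)] * 8 + [("m", "job", 0)] * 12 +
       [("f", "job", 1)] * 1 + [("f", "job", 0)] * 4)

rates = fit_click_rates(log)
# m: 0.40 vs f: 0.20 -- the model now "predicts" men engage more, so the
# delivery system favors men, producing even more male-dominated data
# for the next round of training: a self-reinforcing feedback loop.
print(rates[("m", "job")], rates[("f", "job")])  # → 0.4 0.2
```

Nothing in this model mentions gender as a targeting criterion; the skew comes entirely from the data the system learned from, which is why removing targeting attributes alone does not fix delivery bias.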

Though the tech giant still seems far from a solid solution to racial discrimination and gender stereotyping in its ad-delivery platform, the steps Facebook has taken appear to be in the right direction.


[1] Cornell University (2019) “Discrimination through optimization: How Facebook’s ad delivery can lead to skewed outcomes” [Online] Available from: [Accessed September 2020]

[2] Yale University & Indian Institute of Technology, Kanpur (2019) “Toward Controlling Discrimination in Online Ad Auctions” [Online] Available from: [Accessed September 2020]
