Facebook has reportedly been showing different kinds of ads to men and women in ways that may violate anti-discrimination laws, according to a new study.

Almost everything online or digital today runs on algorithms. Whether it is content curation or ad targeting, the algorithm plays a significant role in matching the needs and wants of users with those of providers. It is quite ironic to talk about gender discrimination in today's algorithms when the world's first computer algorithm was written by a woman, Ada Lovelace. Strange times, aren't they?

Multiple cases and studies now point to sexism and gender-based discrimination in the technical underpinnings of various digital platforms. The most recent is a study reportedly showing that Facebook displays different kinds of ads to men and women in ways that may violate anti-discrimination laws.

Interestingly, in September 2018, the American Civil Liberties Union (ACLU), along with the Communications Workers of America and the employment law firm Outten & Golden LLP, filed charges with the US Equal Employment Opportunity Commission against Facebook. The ACLU acted on behalf of three women job seekers who accused the social media giant of running job ads that were shown only to men. Well, how much has changed since then?


According to the study by University of Southern California researchers, Facebook's ad-delivery algorithms skewed job ads by gender beyond what could be legally justified by differences in job qualifications and targeting. The same group of researchers also tested LinkedIn's ad-delivery algorithms but spotted no such bias in its job advertisements.
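To make the idea of skew "beyond legally justified levels" concrete, here is a minimal, hypothetical sketch of how one might check whether an ad's delivery differs by gender more than chance would allow, using a standard two-proportion z-test. The counts below are made up for illustration; the USC study's actual methodology and data differ.

```python
from math import sqrt

def two_proportion_z(men_shown, men_total, women_shown, women_total):
    """z-statistic for the difference between two delivery rates."""
    p1 = men_shown / men_total          # delivery rate among men
    p2 = women_shown / women_total      # delivery rate among women
    p = (men_shown + women_shown) / (men_total + women_total)  # pooled rate
    se = sqrt(p * (1 - p) * (1 / men_total + 1 / women_total))
    return (p1 - p2) / se

# Hypothetical audit numbers: the ad was shown to 800 of 1,000 men
# but only 500 of 1,000 women in otherwise comparable audiences.
z = two_proportion_z(800, 1000, 500, 1000)
# |z| > 1.96 suggests the skew is statistically significant at the 5% level
```

Auditors would still have to rule out legitimate explanations (qualifications, user interest) before attributing such a skew to the delivery algorithm itself, which is precisely the harder part of the study.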


According to the recent study, men were more likely to see Domino’s pizza delivery driver job ads on Facebook, while women were more likely to see Instacart shopper ads. The trend also held in higher-paying engineering jobs at tech firms like Netflix and chipmaker Nvidia. 

Study author Aleksandra Korolova, an assistant professor of computer science at USC, said, “It’s not that the user is saying, ‘Oh, I’m interested in this.’ Facebook has decided on behalf of the user whether they are likely to engage.”

She added, “And just because historically a certain group wasn’t interested in engaging in something, doesn’t mean they shouldn’t have an opportunity to pursue it, especially in the job category.” 


Facebook said in a statement, “Our system takes into account many signals to try and serve people ads they will be most interested in, but we understand the concerns raised in the report.”

Customisable advertising is one of Facebook's key features and revenue sources, and repeated instances like these are bound to hurt its revenue-generating system. Facebook has acknowledged the issue and said it will take meaningful steps to curb discrimination in its advertisements.

Is the problem only limited to advertisements?

Words create worlds!

Machines learn from and pick up on society's asymmetries. Agreed. However, failing to rectify them would be a greater folly. Google Translate has been accused of sexism in translation since 2017, and yes, the matter is still a problem in 2021.

Recently, a tweet by Dora Vargha calling out sexist translations by Google became popular on Twitter. She pointed out that Hungarian is a gender-neutral language, yet Google Translate failed to account for this and assigned gendered roles and compliments in its translations.

A lot of people then went on a spree to check this in their native languages. Activities like cooking were translated with a female subject, whereas something like teaching was given a male subject.

Can you list any such instances? If yes, write to us!


By Anukriti Khemka

Anukriti Khemka is the Digital Ninja of The Wonk. She handles all the digital needs of The Wonk. She also writes for her column "Talking Trends". She loves to analyse digital trends and make sense out of them.
