
Women in AI: bias, breakthroughs and closing the pay gap

In 2023, the European Network of National Human Rights Institutions (ENNHRI) asked the following question: “The progress of AI relies on using massive amounts of data to train algorithms. But what if the data used reflects existing inequalities, such as gender inequality, rooted in society?” 

Though AI only entered widespread popular use around 2022 with the launch of ChatGPT, discussions of gender bias in AI algorithms have existed for much longer. For example, in 2015, Amazon’s team of machine-learning specialists realised that their recruiting engine was biased against the CVs of female applicants.

Amazon’s models were trained by observing patterns in CVs submitted to the company over a 10-year period. Unfortunately, just a quarter of the US tech industry was female at the time, meaning the majority of analysed CVs came from men. As a result, the AI system learned that male candidates were preferable and penalised submissions which mentioned the word “women.”
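The mechanism behind this kind of bias is simple to illustrate. Below is a hedged toy sketch (the data and scoring rule are invented for illustration, not Amazon’s actual system): when a word-level model is trained on historical hiring outcomes from a skewed applicant pool, words that co-occur mainly with the minority group’s CVs can end up with a negative score.

```python
from collections import Counter

# Hypothetical toy data: each CV is a set of words plus a historical
# hiring outcome (1 = hired, 0 = rejected). The pool is mostly male,
# so the word "women" appears only in a rejected CV.
training_cvs = [
    ({"engineering", "captain", "chess"}, 1),
    ({"engineering", "football"}, 1),
    ({"software", "chess"}, 1),
    ({"software", "women", "chess"}, 0),  # e.g. "women's chess club captain"
]

hired = Counter()
rejected = Counter()
for words, outcome in training_cvs:
    for w in words:
        (hired if outcome == 1 else rejected)[w] += 1

def word_score(word):
    """Crude score: positive if the word historically co-occurred with hires."""
    return hired[word] - rejected[word]

print(word_score("engineering"))  # 2  -> rewarded
print(word_score("women"))        # -1 -> penalised, purely from skewed history
```

The model has learned nothing about merit; it has simply memorised the demographic skew of its training history.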

In this article, we explore the current climate for women in AI, how AI developers and everyday users alike can help to shape algorithms for a more inclusive future – and how AI can potentially help to close the gender pay gap.

Gender disparity in AI careers

In 2021, data from the World Economic Forum reported that just 22% of AI professionals globally were female. In the UK, the percentage of women in AI was lower, at 20%.

When researchers Erin Young, Judy Wajcman and Laila Sprejer published a 2021 report for The Alan Turing Institute, titled “Where are the Women? Mapping the Gender Job Gap in AI,” they revealed “extensive disparities between women and men in skills, status, pay, seniority, industry, job, attrition and educational background” within the AI and data sector.


Key findings of their report showed:

  • Women in AI were more likely than men to occupy a job associated with less status and pay
  • Women in AI and data had higher levels of formal education than men, across all industries
  • This education gap rose even higher for women in senior roles such as C-suite, with ‘over-qualification’ of female workers being most marked in the IT and technology sector
  • Contrary to this, research indicated that women “are severely under-represented in the C-suite in the technology industry, and that they self-report having fewer data and AI skills”
  • Women in AI and data science jobs have a higher rate of turnover (changing their roles) and attrition (leaving the industry) than men.

How pre-existing stereotypes can be reinforced through algorithmic bias

In 2023, Dr Kanta Dihal explained how media stereotypes can discourage women from pursuing an interest in AI. She stated that “mainstream films are an enormously influential source and amplifier of the cultural stereotypes that help dictate who is suited to a career in AI.”

In her co-authored study for Cambridge University’s Leverhulme Centre for the Future of Intelligence (LCFI), Dr Dihal analysed 142 of the most influential films featuring AI (released between 1920 and 2020). The results showed that just 8% of the AI scientists and engineers on screen were women; the remaining 92% were men.

When a machine-learning system analyses contemporary media, it can internalise stereotypes like these and reinforce them through the results it produces – much like the algorithmic bias found in Amazon’s recruitment engine. As a result, a user who enters a prompt for an ‘image of a scientist’ will be more likely to receive an image of a male scientist than a female one.

In her study, Dr Dihal advised that “we need to be careful that these cultural stereotypes do not become a self-fulfilling prophecy as we enter the age of artificial intelligence.”


The importance of women in AI development and their breakthroughs 

As pinpointed by the University of Cambridge, “without enough women building AI, there is a high risk of gender bias seeping into the algorithms set to define the future.”

Much as a child learns from the people around them, AI systems will only learn to return unbiased results if they are exposed to data from a diverse spectrum of users, ensuring proper representation of all groups.

In her ground-breaking 2016 TED talk, MIT researcher and bestselling author Dr. Joy Buolamwini demonstrated racial and gendered algorithmic bias in facial recognition software. Face-detection software running on a webcam detected the face of her white colleague, but failed to recognise Buolamwini’s own face when she sat in the same chair. When Buolamwini put on a white plastic mask, however, the software immediately recognised it as a face.

“So how this works is, you create a training set with examples of faces (…) However, if the training sets aren’t really that diverse, any face that deviates too much from the established norm will be harder to detect, which is what was happening to me.”
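Buolamwini’s point about deviation from “the established norm” can be sketched in a few lines. The following is a deliberately simplified, hypothetical illustration (a single invented feature standing in for a real face-recognition pipeline): a detector fitted to a narrow, homogeneous training sample will reject any valid input that falls outside the range it has seen.

```python
import statistics

# Hypothetical, homogeneous training sample: one skin-tone-like feature
# per training face, clustered tightly around a single value.
training_features = [0.78, 0.82, 0.80, 0.76, 0.84]
mean = statistics.mean(training_features)
stdev = statistics.stdev(training_features)

def detects_face(feature, k=3.0):
    """Detect only inputs within k standard deviations of the training norm."""
    return abs(feature - mean) <= k * stdev

print(detects_face(0.79))  # True: close to the training distribution
print(detects_face(0.35))  # False: a valid face, but outside the learned norm
```

The second input is rejected not because it is not a face, but because nothing like it appeared in the training set – exactly the failure mode Buolamwini describes.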

To fight algorithmic bias, Buolamwini has formed her own ‘Algorithmic Justice League’ to promote ethical code, for example by creating AI image databases that “reflect a richer portrait of humanity” through better representation of all demographics.

Influential and highly skilled developers like Dr. Joy Buolamwini are fundamental to the inclusive future of AI. However, through day-to-day use of AI tools, even hobbyists and workers in non-STEM (science, technology, engineering and maths) fields can influence algorithms for better equality.


Are women less likely to use AI tools?

In 2023, 54% of men in the UK reported using AI tools such as ChatGPT in their day-to-day professional or personal lives, in comparison to just 35% of women. 

Psychologist Lee Chambers told the BBC that he believed female employees may fear having their professional ability questioned if they use AI: “Women are more likely to be accused of not being competent, so they have to emphasise their credentials more (…) There could be this feeling that if people know you use AI, it’s suggesting that you might not be as qualified.”

When asked why they prefer not to use AI, female respondents working in various sectors cited doubts about its accuracy, concern about losing the creativity and personality of their work, or feeling like using AI was ‘cheating.’

Jodie Cook, founder of the AI application Coachvox.ai, believes that a lack of female representation in the wider STEM fields has also hindered many women’s confidence in trying day-to-day AI tools.

These sectors “have traditionally been dominated by males,” with just 26% of the UK’s STEM workforce being female, according to labour market data for 2022/23.

In an interview with the BBC, Cook theorised that “even though many tools don’t require technical proficiency, if more women don’t view themselves as technically skilled, they might not experiment with them.”


Using everyday AI tools to close the gender pay gap

Though women in non-scientific fields may feel that AI tools are not suitable or designed for them, AI has the potential to become a helpful tool in their everyday work lives.

By gaining a basic understanding of everyday AI tools such as ChatGPT and how they can aid day-to-day tasks, women could also significantly bolster efforts to close the UK’s gender pay gap.

In combination with ingrained traditional gender roles, the UK’s underrepresentation of women in leadership means that female workers are disproportionately assigned, or expected to take on, ‘lower-level’ or ‘low-promotability’ tasks.

This type of work has been pinpointed as a persistent contributor towards the gender pay gap, as it does not support women to advance their careers at the same pace as their male colleagues. 

These tasks, though often fundamental, tend to be repetitive, time-consuming and under-recognised because they are not seen as ‘revenue-generating’. They can include administration, data collection, scheduling meetings, report writing and minute-taking.

As highlighted by Women In Tech: “AI and automation solutions enable women to offload these repetitive, arduous and time-consuming tasks and focus their energy on high-value priorities.”

By freeing up their time through automation, working women may be able to prioritise more engaging, creative and complex projects that will advance their careers and garner higher salaries.


AI and gender equality in recruitment

Since the days of Amazon’s CV-parsing system (thankfully discontinued in 2017), modern AI technology has made leaps and bounds towards equality in the hiring process.

A 2023 study conducted by Monash University and the University of Gothenburg found that using AI in the recruitment process “almost doubled the number of women considered to be in the top 10% of candidates”.
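Why might an attribute-blind AI screen surface more women than a biased human one? A minimal sketch, using entirely invented candidate data (not the Monash/Gothenburg study’s method): if a biased scorer applies even a flat penalty to one group, that group can only lose places in the top decile compared with a scorer that ranks on ability alone.

```python
import random

random.seed(0)

# Hypothetical candidate pool: 100 people with equal underlying ability
# distributions, half coded "M" and half coded "F" at random.
candidates = [{"gender": random.choice("MF"), "ability": random.random()}
              for _ in range(100)]

def top_decile(score):
    """Return the 10 highest-scoring candidates under a given scorer."""
    return sorted(candidates, key=score, reverse=True)[:10]

# Biased scorer: subtracts a flat penalty from women.
biased = top_decile(lambda c: c["ability"] - (0.3 if c["gender"] == "F" else 0))
# Attribute-blind scorer: ranks on ability alone.
blind = top_decile(lambda c: c["ability"])

def count_women(pool):
    return sum(c["gender"] == "F" for c in pool)

# A uniform penalty can never *increase* the number of women in the top 10.
print(count_women(biased), count_women(blind))
```

The exact counts depend on the random draw, but the direction is guaranteed: removing the penalty can only keep or raise the number of women ranked in the top decile.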

While further research is needed before AI can be deemed free of algorithmic bias and ready for widespread use in recruitment, its continued development by independent developers, organisations and everyday users seems promising.


Work with Distinct

At Distinct, we strive to educate ourselves on equality, diversity and emerging topics such as artificial intelligence.

Whether you work with us as a client or candidate, we’d be happy to discuss equality and diversity in the workplace with you. As leaders in the field, we take the time to understand organisations and candidates alike to ensure the right fit for all. Contact us today.
