#BreakTheBias: Gender & AI

Posted by Scarlet Perera on 2nd March 2022

This year, the theme of the International Women’s Day campaign is #BreakTheBias, which centres around the ongoing fight for a gender equal world, free of bias, stereotypes and discrimination.

Gender bias is something that I’ve studied closely in literature, observed in the world around me and, like most women, have experienced first-hand; it is a glaring reality.

Yet, upon looking at our world through a new tech lens, I find my eyes opened to the existence of gender bias in a place I didn’t quite expect – artificial intelligence (AI).

Granted, since AI has no gender, it is hard to imagine how it could perpetuate such bias, but on closer inspection the answer is simple. AI is a human creation, so it is bound to internalise the same prejudices that circulate in the world of its creators.

Given the growing influence AI has in shaping our world, preventing gender-biased AI must therefore be made a priority if we are to stand a chance of achieving an equitable and inclusive world.

AI: a reflection of our society

Considering women make up only 22% of professionals in AI and data science fields, it is hardly surprising that gender prejudices have managed to infiltrate AI.

Gender bias typically takes hold during machine learning, the process of using data and algorithms to imitate the way humans learn. As a result, an AI's perception of the world is shaped largely by the dataset it is given to learn from.

For example, a lack of female representation in the data will inevitably leave gender gaps in the AI's knowledge that can easily lead to biased outcomes. Similarly, sexism embedded within the data will lead the AI to reproduce the same sexist behaviour in its output.
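To make this concrete, here is a minimal, purely illustrative sketch in Python; the "hiring" scenario, feature names and numbers are invented for this example and do not describe any real system. It shows how a simple model trained on historically skewed data learns to penalise one gender even when qualifications are identical.

```python
# Hypothetical sketch: a model trained on skewed historical data
# reproduces that skew in its output.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000

# Synthetic data: one feature encodes gender (0 = male, 1 = female),
# another a qualification score that is identical across genders.
gender = rng.integers(0, 2, size=n)
qualification = rng.normal(0, 1, size=n)

# Historical labels were biased: equally qualified women were hired less often.
hired = ((qualification + 0.5 * (1 - gender)
          + rng.normal(0, 0.5, size=n)) > 0.5).astype(int)

X = np.column_stack([gender, qualification])
model = LogisticRegression().fit(X, hired)

# The model assigns a negative weight to the "female" feature:
# it has simply learned the prejudice present in its training data.
print("weight on gender feature:", model.coef_[0][0])
print("P(hired) for an equally qualified man vs woman:",
      model.predict_proba([[0, 1.0], [1, 1.0]])[:, 1])
```

Nothing in the algorithm itself is "sexist"; it faithfully optimises against the data it was handed, which is exactly why the composition of that data matters so much.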

To further demonstrate how gender-biased AI can contribute to setbacks in gender equality and women’s empowerment, here are a couple of well-recognised examples.

Father is to doctor, as mother is to nurse

Word embedding is a popular machine learning technique that represents words as numerical vectors; it can be thought of as a game of word association, only played in a mathematical, data-driven way.

However, this method has been identified as entrenching sexist gender norms, most famously on translation services such as Google Translate. For example, when translating from Turkish, which uses the gender-neutral pronoun o, into English, a phrase like o bir mühendis becomes he is an engineer, while o bir hemşire becomes she is a nurse.

Essentially, the gender bias exhibited by translation services reflects an outdated perception of women and exemplifies the sorts of gender stereotypes used to confine women to traditional roles.
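To make the word-association analogy concrete, here is a minimal sketch using the gensim library with publicly available GloVe vectors; this is an assumption for illustration only, as the translation services above use their own, different models, and the exact neighbours returned depend on which pretrained vectors are loaded.

```python
# Sketch of the "word association" arithmetic behind word embeddings,
# using gensim and publicly available GloVe vectors.
import gensim.downloader as api

vectors = api.load("glove-wiki-gigaword-100")  # downloads pretrained vectors

# The classic analogy: man is to king as woman is to ...?
print(vectors.most_similar(positive=["woman", "king"], negative=["man"], topn=1))

# The same arithmetic can surface stereotypes learned from the training text,
# e.g. completing "father is to doctor as mother is to ...?"
print(vectors.most_similar(positive=["mother", "doctor"], negative=["father"], topn=3))
```

The embedding has no opinion of its own; it simply echoes the associations that dominate the text it was trained on, which is how stereotyped pairings end up encoded in the vectors.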

The “genderless” voice assistant is female

In a similar way, AI in the form of voice assistants perpetuates the sexist notion of the "ideal" woman as subservient and compliant. Think of any voice assistant and, be assured, it was most likely launched with a female voice by default.

While some would argue that most voice assistants now offer a male voice, it is worth noting that Amazon, one of the biggest players in the tech industry, only launched Ziggy, the male counterpart to Alexa, in July 2021, less than a year ago.

This gender portrayal is deeply problematic, especially when it’s been predicted that by 2024 there will be more voice assistants than people in the world.

So, what can be done to prevent gender bias in AI?

The algorithm for equality

Ultimately, embedding diversity at every possible level will be the key to eradicating gender bias within AI. Stronger, more ethical governance from AI companies is also needed to ensure everyone receives equal and fair representation.

From diversifying the building blocks of the AI's learning process, that is, the datasets and training samples, to diversifying the industry as a whole by encouraging more women into the field, diversity is the way forward.
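As one small, hypothetical illustration of what diversifying the datasets can mean in practice (the column names and figures below are invented), a basic representation audit can be run before any training happens:

```python
# Hypothetical sketch: check how well different groups are represented
# in a training set before using it to train a model.
import pandas as pd

training_data = pd.DataFrame({
    "speaker_gender": ["female", "male", "male", "male", "female", "male"],
    "transcript": ["...", "...", "...", "...", "...", "..."],
})

# If one group dominates the samples, rebalance (collect more data
# or resample) before training.
counts = training_data["speaker_gender"].value_counts(normalize=True)
print(counts)

if counts.min() < 0.4:
    print("Warning: training samples are not balanced across genders.")
```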

While bias may currently be an inescapable part of life, by no means should we sit back and let it corrupt our new technologies and our future ways of thinking.

Scarlet Perera