Part of AI4DA’s mission is empowering youth, specifically focusing on women and girls, to lead the change and prepare for the future.

This aligns with the UN’s Sustainable Development Goal 5, “Achieve gender equality and empower all women and girls” [1]. But what is the role of Artificial Intelligence (AI) in this matter?

WHAT IS GENDER BIAS?

“Bias” in the context of AI is often synonymous with algorithmic bias, that is, when “a machine-learning model produces a systematically wrong result” [2]. When AI is biased, it often leads to discriminatory results, which can have several causes.

One cause is the training data. Bias can enter through biased human decisions, but it can also reflect historical or social inequities, even when sensitive variables such as gender, race, or sexual orientation are removed from the algorithm. Another cause is flawed data sampling, in which some groups are over- or underrepresented in the training data. [3]
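The effect of skewed training data can be sketched with a toy example. The numbers and the hiring scenario below are entirely hypothetical, for illustration only:

```python
# Hypothetical, illustrative numbers only: a toy hiring "history" in which one
# group is underrepresented (10 past candidates vs. 100) and was hired less often.
history = ([("m", True)] * 80 + [("m", False)] * 20 +
           [("f", True)] * 3 + [("f", False)] * 7)

def hire_rate(group):
    """Fraction of past candidates from `group` who were hired."""
    outcomes = [hired for g, hired in history if g == group]
    return sum(outcomes) / len(outcomes)

# A model trained to reproduce historical outcomes inherits the skew:
print(hire_rate("m"))  # 0.8
print(hire_rate("f"))  # 0.3
```

Note that even if the gender column were dropped from such data, correlated proxies (word choice, hobbies, schools attended) could leak the same signal, which is why removing sensitive variables alone does not remove the bias.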

Gender bias is a “behavior that shows favoritism toward one gender over another” [4]; generally speaking, the term refers to biases that discriminate against women and girls. Gender itself is currently understood as the behaviours and the cultural and/or psychological traits typically associated with one sex. [5]

AI AS A DOUBLE-EDGED SWORD 

Algorithms and automation can serve as tools to reduce implicit human bias, which is otherwise hard to challenge: they have the potential to make decision-making processes open and amenable to examination and interrogation. For instance, they can eliminate or reduce a person’s gender bias in a decision, or make that bias visible and hence easier to challenge.

On the other hand, they can also produce the opposite outcome: AI can exacerbate bias [6], reproducing and amplifying gender discrimination.

HOW DO GENDER BIAS AND AI AFFECT US?

One example of how AI can hinder progress toward gender equality is Amazon’s experimental hiring tool, which turned out to discriminate against women. [7] Many other illustrations of gender bias are subtler. For instance, voice assistants usually have feminine voices [8], and language-processing algorithms have been found to make sexist connections between terms, such as guessing that the occupation “nurse” is female while “professor emeritus” is male. [9]
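The “nurse”/“professor” finding can be illustrated with a toy word-embedding sketch. The two-dimensional vectors below are made up for illustration; real systems learn high-dimensional vectors from large text corpora (e.g. word2vec or GloVe), where similar gendered associations have been measured:

```python
import math

# Made-up 2-D "embeddings" for illustration; real embeddings are learned from text.
vec = {
    "he":        [1.0, 0.1],
    "she":       [-1.0, 0.1],
    "nurse":     [-0.7, 0.6],
    "professor": [0.8, 0.5],
}

def gender_lean(word):
    """Project `word` onto the "he" - "she" direction; positive = male-leaning."""
    direction = [h - s for h, s in zip(vec["he"], vec["she"])]
    norm = math.sqrt(sum(d * d for d in direction))
    return sum(w * d for w, d in zip(vec[word], direction)) / norm

print(gender_lean("nurse") < 0)      # True: leans toward "she"
print(gender_lean("professor") > 0)  # True: leans toward "he"
```

Because the embeddings mirror associations in the training text, any downstream system that consumes them (a translator, a résumé screener) can silently inherit the same lean.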

However, AI can also bring us closer to gender equality, as innovative AI-based solutions can make a significant impact on eradicating gender discrimination.

AI PROMOTING GENDER EQUALITY IN EMPLOYMENT

One example of good practice in the area of labor and employment is the consumer-goods company Unilever, which has partnered with the digital HR service providers Pymetrics and HireVue to digitize the first part of its recruitment process [10] in order to diminish gender inequality.

In the first stage, candidates play a selection of games that test their aptitude, logic and reasoning, and appetite for risk. Machine learning algorithms then assess their suitability for their preferred role by comparing the results with those of previously successful employees. The second stage involves recording a video interview, which is analyzed by a machine learning algorithm.

The results of this process so far are very encouraging and, if applied more broadly, it could make a significant contribution to eliminating gender discrimination in the first phase of the hiring process.

AI REDUCING GENDER-BASED VIOLENCE

Gender-based violence is one of the most serious, if not the most serious, and most widespread violations of human rights today.

According to World Health Organization (WHO) data, “35% of women experienced some form of physical and/or sexual intimate partner violence, or sexual violence by a non-partner, not including sexual harassment”. [11]

In these cases, AI has proven to be of much assistance. For example, HiRainbow, a chatbot that uses AI and storytelling, helps victims of domestic violence by providing support, information and access to resources. [12] The platform was developed by AI for Good, the Sage Foundation and the Soul City Institute for Social Justice, and was launched in South Africa in 2018. Since then, more than 300,000 conversations have been initiated on the HiRainbow platform, which is a promising result. [13]

A similar solution is being used in Thailand under the name Sis bot, which shows that this approach is perceived as a useful tool in combating gender-based and domestic violence. [14]

AI PROMOTING CIVIC PARTICIPATION AND MAKING WOMEN’S VOICES HEARD

Finally, one of many interesting examples of how AI can advance gender equality lies in the area of civic participation and, more generally, participation in decision-making. [15]

In 2016, VOTO Mobile developed an AI-based tool for amplifying civic participation through interactive mobile surveys conducted in local languages in West Africa. Their mobile phone notification and survey platform removes barriers between citizens and the organizations that serve them, thereby achieving better participation of hard-to-reach and underrepresented groups such as women in rural areas.

LOOKING FORWARD

There is still much to be done in order to achieve gender equality and fulfill SDG 5, but AI can be a promising avenue for the advancement of equality and the empowerment of women and girls. We also have to bear in mind that algorithms are nothing more than “opinions embedded in code”. [16]

The only way to successfully approach this issue is to remember that gender inequality is deeply rooted in our societies and cultures and that AI is not an exception to this fact. [17] Therefore, AI itself is not the solution to the problem, but it can be a powerful tool if used in an ethical manner, taking into account the vast amount of research on gender [18], race, disability and class during the development process.

Lastly, on a more practical note, Jake Silberg and James Manyika [19] propose six potential ways of reducing bias in AI:

  1. Be aware of the contexts in which AI can help correct for bias as well as where there is a high risk that AI could exacerbate bias.
  2. Establish processes and practices to test for and mitigate bias in AI systems.
  3. Engage in fact-based conversations about potential biases in human decisions.
  4. Fully explore how humans and machines can work best together.
  5. Invest more in bias research, make more data available for research (while respecting privacy), and adopt a multidisciplinary approach.
  6. Invest more in diversifying the AI field itself.

RESOURCES

[1] United Nations SDG Goal 5. https://sdgs.un.org/goals/goal5
[2] Nelson, Gregory S. “Bias in artificial intelligence.” North Carolina medical journal 80, no. 4 (2019): 220-222. https://www.ncmedicaljournal.com/content/80/4/220?utm_source=TrendMD&utm_medium=cpc&utm_campaign=North_Carolina_Medical_Journal_TrendMD_1
[3] Manyika, James, Jake Silberg and Brittany Presten. “What Do We Do About the Biases in AI?” Harvard Business Review, October 2019. https://hbr.org/2019/10/what-do-we-do-about-the-biases-in-ai
[4] http://sociology.iresearchnet.com/sociology-of-gender/gender-bias/
[5] Merriam-Webster Dictionary, usage note on “gender.” https://www.merriam-webster.com/dictionary/gender#usage-1
[6] Silberg, Jake and James Manyika. “Tackling Bias in Artificial Intelligence (and in Humans).” McKinsey, June 2020. https://www.mckinsey.com/featured-insights/artificial-intelligence/tackling-bias-in-artificial-intelligence-and-in-humans
[7] Dastin, Jeffrey. “Amazon Scraps Secret AI Recruiting Tool That Showed Bias against Women.” Reuters. Thomson Reuters, October 10, 2018. https://www.reuters.com/article/us-amazon-com-jobs-automation-insight/amazon-scraps-secret-ai-recruiting-tool-that-showed-bias-against-women-idUSKCN1MK08G.
[8] Chin, Caitlin and Mishaela Robison. “How AI Bots and Voice Assistants Reinforce Gender Bias.” Brookings, AI Governance series, November 23, 2020. https://www.brookings.edu/research/how-ai-bots-and-voice-assistants-reinforce-gender-bias/
[9] Lavie, Alon and Christine Maroti. “We Can Reduce Gender Bias in Natural-Language AI, but It Will Take a Lot More Work.” VentureBeat, The Machine, December 6, 2020. https://venturebeat.com/2020/12/06/we-can-reduce-gender-bias-in-natural-language-ai-but-it-will-take-a-lot-more-work/
[10] Marr, Bernard. “The Amazing Ways How Unilever Uses Artificial Intelligence To Recruit & Train Thousands Of Employees.” Forbes. Forbes Magazine, March 7, 2019. https://www.forbes.com/sites/bernardmarr/2018/12/14/the-amazing-ways-how-unilever-uses-artificial-intelligence-to-recruit-train-thousands-of-employees/?sh=23f103d46274.
[11] “Global and Regional Estimates of Violence against Women.” World Health Organization. World Health Organization. Accessed January 15, 2021. https://www.who.int/publications/i/item/9789241564625.
[12] Hi Rainbow. Accessed January 15, 2021. https://www.hirainbow.org/.
[13] Zisengwe, Melissa Tsungai. “Innovator Q&A: Kriti Sharma, Founder of RAInbow.” Medium. Civic Tech Innovation Network, July 10, 2019. https://medium.com/civictech/innovation-q-a-kriti-sharma-founder-of-rainbow-bcf84e5d321e.
[14] “Using AI in Accessing Justice for Survivors of Violence.” UN Women. Accessed January 15, 2021. https://www.unwomen.org/en/news/stories/2019/5/feature-using-ai-in-accessing-justice-for-survivors-of-violence.
[15] “Giving a Voice to Rural Women Through Mobile Surveys.” DataKind. Accessed January 15, 2021. https://www.datakind.org/projects/giving-a-voice-to-rural-women-through-mobile-surveys.
[16] O’Neil, Cathy. “Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy.” New York: Crown, 2016, p. 54.
[17] Powles, Julia and Helen Nissenbaum. “The seductive diversion of ‘solving’ bias in artificial intelligence.” (2018). https://onezero.medium.com/the-seductive-diversion-of-solving-bias-in-artificial-intelligence-890df5e5ef53
[18] Leavy, Susan. “Gender bias in artificial intelligence: The need for diversity and gender theory in machine learning.” In Proceedings of the 1st international workshop on gender equality in software engineering, pp. 14-16. 2018. https://dl.acm.org/doi/abs/10.1145/3195570.3195580
[19] Silberg and Manyika, Tackling bias.

By Ariadna Carrascosa Hidalgo and Bogdan Banjac, Artificial Intelligence 4 Development Agency