Before smartphones, the closest many of us came to artificial intelligence was through movies. Growing up, I loved watching science fiction. Looking back, my favourites were Blade Runner, the Terminator series, and WALL-E.

All these films had one trait in common: each featured a female AI, and in none of them was she the protagonist. In Blade Runner, Rachael, the humanoid replicant, is a receptionist. In the Terminator series, the only robot introduced as female, the T-X, is posed as a hypersexualized antagonist. In WALL-E, Eve is an expendable assistant to humans, and the sidekick and love interest of the film's protagonist.

In today's reality, the capabilities of AI have progressed enormously. Millions of individuals hold AI's power in their hands, yet many questions remain unanswered. Canadians' reception of AI has been mixed, with more than two-thirds unsure whether it will bring us harm. For many Canadians, the most prevalent fear has been the lack of information regarding AI's effects on labour and law enforcement.

So, the question remains: will it turn out the way the movies portrayed?

Among the most accessible forms of AI are the three best-known digital assistants: Apple's Siri, Amazon's Alexa, and Microsoft's Cortana. What these three virtual assistants have in common is that they all started with female-coded voices. Since 2021, however, Apple has left the choice of Siri's default voice to the customer's discretion. That same year, Amazon followed suit, offering new voices to choose from.

Many reasons have been proposed for why female-gendered AI assistants became the default. The most common argument is that the higher pitch of female speakers makes them easier to distinguish against background noise, and therefore easier to understand than their male counterparts. This, however, is a myth. While some aspects of speech, such as vowels, have been shown to be slightly more intelligible in female voices overall, the extensive variation among female voices undercuts the claim. Regardless of gender, the speaker's pronunciation is the most crucial factor.

----------------------------------------------

1 Government of Canada (2021). Views of Canadians on Artificial Intelligence [POR 050-20]. Innovation, Science and Economic Development Canada, publications.gc.ca/collections/collection_2021/isde-ised/Iu4-396-2021-1-eng.pdf

2 Welch, C. (2021). Apple won't give Siri a female-sounding voice by default anymore. The Verge, theverge.com/2021/3/31/22360502/apple-siri-female-voice-default-new-voices-ios-14-5

3 Seifert, D. & Tuohy, J. P. (2021). How to change Alexa's voice. The Verge, theverge.com/22588961/alexa-voice-echo-male-female-celebrity-how-to-change

4 Zhang, S. (2015). No, Women's Voices Are Not Easier to Understand Than Men's Voices. Gizmodo, gizmodo.com/no-siri-is-not-female-because-womens-voices-are-easier-1683901643

So, if female voices are not scientifically more intelligible than male voices, why are they preferred as our virtual assistants?

In the last decade, female virtual assistants have been readily available in homes worldwide. This gendering of domesticity and subservience reinforces the gender hierarchy rampant in our society today. Gendering virtual assistants, whether through voice, modelling, or pronouns, objectifies them. And if they present as female, this raises the question: how will this further entrench existing biases?

In recent years, the abuse of virtual assistants such as Alexa and Siri has prompted their respective companies to create disengagement protocols to deflect harassment. Now, when faced with harassing questions such as "What are you wearing?" or "Do you love me?", these assistants will either refuse to answer or offer rebukes, reaffirming their status as AI or their disinterest in the interaction.

Before these changes were made, when called derogatory terms, Alexa would respond, "Thank you for the feedback." Likewise, Siri responded to this language with platitudes such as, "If I could blush, I would." This caught the attention of organizations such as UNESCO. In 2021, the organization adopted its Recommendation on the Ethics of Artificial Intelligence, calling on governments to regulate AI innovation. These recommendations aim to create a safer environment for future development, one that promotes human rights, dignity, and inclusiveness.

However, many of us might still wonder, "Where are these regulations in place?" And, at the root of AI's learning, "Where do these gender biases stem from?"

To answer these questions, we must look at why biases perpetuate the subordination of women. Speaking with Dr. Wendy H. Wong, a Professor of Political Science at the UBC Okanagan campus and author of the recently published non-fiction book, "We, the Data: Human Rights in the Digital Age," I gained valuable insight into where these biases arise.

Sarah Meier: During your research for your latest book, where did you see the most significant discrepancies perpetuating gender bias?

---------------------------------------

5 Fessler, L. (2018). Amazon's Alexa is now a feminist, and she's sorry if that upsets you. Quartz, qz.com/work/1180607/amazons-alexa-is-now-a-feminist-and-shes-sorry-if-that-upsets-you

6 Equals Global Partnership (2019). I'd blush if I could: Closing gender divides in digital skills through education. UNESCO, doi.org/10.54675/rapc9356

7 UNESCO (2022). Recommendation on the Ethics of Artificial Intelligence. The United Nations Educational, Scientific and Cultural Organization, unesdoc.unesco.org/ark:/48223/pf0000381137/PDF/381137eng.pdf.multi

Wendy H. Wong: One theme in the book focuses on how data, by its nature, reflects the biases of data collectors. If data collection is done in a way that perpetuates gender bias, the outputs from AI will also be gender-biased. AI has three components: data, computing, and algorithms. Both algorithms and data can reflect gender bias.

AI's biases can't be fixed by tweaking the algorithms alone. We have to think about the bias in the types of data we are collecting about people and how those biases reflect general attitudes in society. All algorithms and data are limited by the assumptions of their creators.

SM: As AI becomes more integrated into our lives, objectifying assistants coded as female may deepen both the stigmatization of industry roles and the gender biases already present. How can computer scientists and data scientists ensure that AI systems and assistants limit the biases they encode?

WW: I don’t think it’s possible to have “unbiased” technology because technology always reflects its creator’s biases and assumptions about the world. That’s why we want inclusivity in the technology creation and implementation process. For AI, we also want more inclusivity at the analysis stage to ensure AI predictions do not reproduce harmful biases or spurious findings. 

The more perspectives you have at these stages, the less likely the technology and its outputs will be harmfully biased against certain groups or individuals. 

SM: In recent years, companies such as Apple have begun to change the default voices of their AI assistants and have created disengagement designs to combat abuse. Do you believe this initiative is enough to protect against further gender biases?

WW: In general, AI systems are not human, so their "genders" reflect our societal assumptions about gender (e.g. "women's voices"). But I think the main issue is that we gender AI at all. We are assigning them characteristics of human beings. Other inventions, cars for example, aren't presumed to have genders. So, I think one question is, "Why do we assign genders to AI?"

Reflecting on my conversation with Dr. Wong, I realized that gendering AI not only reiterates the systematic oppression of outdated ideals, but also creates a binary system that fails to recognize those who identify outside the binary of male and female. It became apparent that AI serves as a mirror, reflecting only a portion of society's opinions and views; this is where the biases come in.

In its Global Gender Gap Report 2023, the World Economic Forum reported that, globally, only 30% of AI researchers in 2022 were women. Given this significant gender gap among the creators and innovators of AI, it is more pressing than ever to increase diversity and inclusivity across all genders, ages, and races in the development of AI.

To gather more insight into how universities are working to address the gender disparity in computer science, I talked with Dr. Ramon Lawrence, a professor of Computer Science at the UBC Okanagan campus and founder of UnityJDBC.

Sarah Meier: In my research, I learned that only a small portion of researchers in AI are female. As a professor of computer science at the university, how do you support female students in the field?

Ramon Lawrence: It is critical to make the field open and welcoming to everybody: trying to use examples that are not gender-specific or, on the positive side, providing examples of women succeeding in AI and computer science in general.

This is a long-standing issue in computer science, and we need more representation of women in the discipline. [The faculty] prides itself on developing courses and content welcoming to women who want to be in the discipline. We are one of the few departments where the ratio is near 50%.

SM: In your curriculum, what initiatives have you taken to reduce inequality and lack of diversity attributed to gender biases?

RL: It's essential to do more social things. Many courses now use pair programming and group work. The typical stereotype of someone in computer science is the male loner gamer who's all by himself. [The faculty] wants to break that stereotype. Everybody can program; everybody can do it.

Sometimes, you just need role models, like the faculty leading these courses, who have demonstrated that you can be successful as a woman in computer science, or you just need peers in your course that you can work with. That's the significant change: not so much the data, but the university teaching students in conducive environments that women want to be in.

Computer science is a space for everybody. We're working together collaboratively because that's how you will be successful.

---------------------------------------------------------

8 World Economic Forum (2023). Global Gender Gap Report 2023. weforum.org/docs/WEF_GGGR_2023.pdf

Moving forward, it seems impossible to escape bias; we all have our own. 

When the Internet was first developed, many of its effects were largely unknown. The same is true of AI today. How we approach and configure these new avenues of learning is therefore entirely a reflection of ourselves as a collective.

AI is innovating at a rapid pace. Looking around, it is everywhere we go and almost always on us. The anxiety many people feel right now is palpable, and over the last decade it has manifested in newer sci-fi films like Ex Machina and Jexi. In these movies, the female AIs, Ava and Jexi, are left to their own devices, bringing to light AI's capacity to learn and control information in ways that extend beyond its programming.

Though these stories are only fictional, the uneasiness they capture is reminiscent of earlier technological developments. These cautionary tales critiquing the treatment of AI clearly show our uncertainty about where its effects may lead.

As a student in STEM, I constantly see the lack of representation of women, specifically women of colour, as role models to look up to. Questioning and thinking about these issues is the start of addressing the problem. To ensure that the future digital landscape is welcoming and promotes equality for everyone, it is vital to diversify the data, the engineers, and the designers creating this intelligence for the future.

The digital revolution against gender biases must start with us.