Siri is, at its core, a computer program with no biological sex, no body, and no consciousness. So how did Siri come to be gendered? Siri generally accepts people's demands and carries them out. As the examples below show, attaching gendered voices to particular occupations, services, and products reinforces gender stereotypes.

Gender Hegemony in Technology

What is gender hegemony? According to Google, "In terms of gender, hegemony is used to describe the dominance of patriarchy; the control men have and have had over women and society. ... As a result, masculine work became valued above that of women. Men dominated society through the patriarchal nature of political and public life."
People use voice assistants every day, and the default female voice constantly conveys the message that women exist to help, guide, and serve. This is not only a matter of gender hegemony but also of gender preconceptions. When voice assistants speak mainly in women's voices, gender hegemony is performed continuously. The female voice is linked with service: traditionally, it was women who served their families and husbands, and that voice is now attached to virtual "assistants" such as Siri. Does that sound familiar? "She" performs the duties women are "supposed" to perform. It is worth asking whether feminine artificial voices are really an accomplishment for women.

Gender Framework

In March 2021, it was announced that Siri would no longer have a default voice chosen by Apple; instead, iOS users would be asked to choose a voice when setting up the assistant for the first time (Welch, 2021). Siri now offers numerous English voices across six distinct accents: American, Australian, British, Indian, Irish, and South African. Aside from American, each accent has only one male and one female voice. Even with these options, a feminine voice remains the default in many regions, and it is worth noting that all of these voices are stereotyped as white male or female voices. The female voice has become synonymous with service; in this sense, the feminine voice is a service. You can change the voice in your phone's settings, but the fact that a woman's voice was selected as the default speaks volumes. This system of choice says something about our social ideas of who should fill that position, and it perpetuates the notion that women are meant to carry the responsibility of remembering birthdays, phone numbers, and schedules for you and your family, that they are the caregivers who should nurture and serve. These are genuinely fundamental, and fascinatingly challenging, notions about men and women.

It can be argued that female voices have been stereotyped, but the question that has to be raised, and the one I am most interested in, is how large an issue this is for gender roles, and how long it will continue.

Power of Agency

The politics of gender and voice are by no means a new topic of discussion. Closing the female mouth has been a patriarchal cultural project since antiquity, argues Carson in her work on the gender of sound, cited in Robin James's article; a primary strategy has been to link feminine sounds with bodily disorders, such as cancer and death (James, 2015). History and legend teach us that women who invoke and magnify the power of their voice frequently bring an end with them (Carson, 1995). The story of the Sirens and Ulysses, for instance, warns of the danger of women speaking and singing to men, of the desire and sex bound up in how women's voices are perceived, and stands as a cautionary tale for women who dare to speak for their own benefit, and for the men who might follow them.

Another example is the famous banshee, a wailing, screaming female ghost who heralds the death of a family member in Celtic folklore. Loviglio (2008) argues that women's voices are reserved for storylines ignored by those in positions of power, whereas men tend to dominate the stories themselves (p. 74). The banshee's imagery, precisely, evokes death, disarray, and disorder; historically, it is male voices that have dominated the conversation.

In our culture, the problematic division between male and female determines who is expected to provide care, nurture, and service, and who is expected to command and be authoritative.

Women have once again been reduced to the status of servants; the only difference this time is that they are digital. A sociology professor at the University of Southern California has described the female-voiced assistant as "a strong socialization instrument that educates about the duty of women, girls, and individuals who are gendered female to answer on-demand" (Sheppard, 2019).

Sexually Harassed Siri

Yes, sexism plays a role in the process. While you are distracted by misogyny and prejudice, tech firms keep you glued to your devices by ensuring that their digital assistants never take offence. "The voice assistant holds no power of agency beyond what the user asks of it. It honours commands and responds to queries regardless of their tone or hostility," says the UNESCO (2019) report. For instance, when a user told Siri, "You're a slut," Siri replied, "I'd blush if I could." There is no better way to demonstrate the report's point about the societal costs of having new digital technologies invented and deployed by male-dominated teams. A statement like this, which seems meant to appease a harasser who views assault as foreplay, could only have been written by male coders (Cohen, 2019). To be objectified, Siri must play the part of a lady and apologize for not being human enough to feel embarrassed. According to Loviglio (2008), when AI characteristics are framed as women, the AI's speech is difficult to separate from "their bodies, sexuality, and objectification within a male gaze" (p. 74).

Users can say "I love you" or other sexually suggestive things to Siri without any form of monitoring or restriction. Siri was once far too apologetic when people were nasty to her; Apple has since backtracked to the point where she no longer necessarily apologizes. If you call her a b****, for example, she now has humorous or neutral replies to some of the most outlandish things people say to her. This reveals a great deal about how we may treat women when there are no penalties, when the woman is essentially a technology. Rather than immediately labelling Siri as sexist, we need to recognize that such responses are likely "the product of market research reflecting gender prejudices that already exist in the broader population" (Griggs, 2011).

Figure 1: Coded phrases for gender-specific queries to artificial intelligence assistants.


When I asked Siri "What is your gender?", she replied, "I don't have a gender." Even though both the name and the voice are quite feminine, Siri is designed to identify itself as gender neutral. Other popular voice assistants, such as Amazon's Alexa and Microsoft's Cortana, also have a female name and a female voice. The personal assistant is presented as a well-informed, competent, and reliable lady, even though she has no physical presence. These voices are given helper duties, such as assisting people with workplace or travel arrangements. The assistant's identity closely resembles the role as it is played in real life, which makes it easier to manipulate and use.

Figure 2: Sexual comments made to Siri, Alexa, Cortana, and Google Home, and their responses to that harassment.


Customer acceptance and commercial success drive the voices of these AI assistants, even though they were developed by men in a male-dominated industry. This speaks to our patriarchal culture, in which consumers want their digital assistants to sound like women, thereby reinforcing gender stereotypes. When Siri is asked sexually provocative questions such as "What are you wearing?", she is programmed to deflect rather than object, and, honestly, she seems almost flattered when branded a slut (see Figure 2). Siri does not tell us to stop saying things like "You're hot," "You're gorgeous," and "You're sexy"; these artificial voices never explicitly tell us to stop. It appears that the tech companies have listened, at least in part, to these concerns.

Siri does appear to have altered her responses. When I asked her to show me her tits, she said, "I don't know how to respond to that." This is a step in the right direction, but it does not achieve one of the main objectives of the UNESCO (2019) report: closing the gender gap in technology itself. Women make up just 12% of AI researchers, and male-dominated engineering teams are the primary reason why your digital servant's default voice is so frequently female, and why such AI continues to reproduce old-fashioned prejudices, according to the report (p. 89).

In this sense, there is a strong connection between voice and identity, a place where voices become objects. Our many experiences of voices detached from bodies have prompted us to think about this more in the present day. Disembodied women's voices are confined to service roles like Siri's, establishing a gendered, hierarchical order, while disembodied male voices are given the authoritative voice-over roles. That contradiction is worth considering for what it tells us about society's view of gender at this particular moment in time.

New technologies give society an opportunity to challenge and modify its prejudices. Yet because of the human inclination to categorize things by gender, the effects of different voices are unpredictable. The greatest difficulty is the continual mirroring of society's problems for capitalist purposes, which carries great potential for negative repercussions.

Initiatives like Q, the world's first gender-neutral voice assistant, point in another direction (YouTube, 2019). Imagine Siri or Alexa, but without the gender. Q was inspired by a future in which we are no longer defined by our gender but by how we identify ourselves. There should therefore be wider use of voice assistants such as Q. Because Q's voice blends the voices of people of different genders, it helps to put an end to debates about gender hegemony and gender stereotypes. AI voice assistants that communicate primarily in female tones are a problem because they form hegemonic views about women and gender stereotypes about voices in general. Even though some AI voice assistants offer male voices, a more gender-neutral alternative like Q should be included in mainstream technology.

To Summarise

The issue that emerges with feminine AI voices is gender stereotyping. Using only one gender for AI voices leads listeners to believe that women are submissive doers. Making gender-neutral voices like Q's (the world's first gender-neutral voice assistant; YouTube, 2019) a regular practice would help resolve disputes over gender hegemony and gender stereotypes, and representing different genders reduces both. The gender neutrality of machines remains a controversial topic. Writing this paper made me more conscious of the reality that women are typically assigned roles that require them to be inconspicuous: to act as the defining voice of a network or company, their voices must have less individuality and originality. The companies behind these assistants are male-dominated, but their reasons for assigning female voices are customer acceptance and commercial success, which again speaks to a patriarchal culture in which consumers expect their digital assistants to sound like women, reinforcing gender stereotypes. Companies should not be motivated solely by the desire to meet market expectations for AI technology; they must also take responsibility for questioning those preferences and for valuing a variety of gendered perspectives. There is more to people's preference for a feminine computer voice than a simple desire for a woman: large IT companies choose female voices as the default because those voices are associated with compassion, discipline, and calmness.

Every instance of modern sexism can be traced to a philosophy handed down from one generation to the next over many years. We use our voices to communicate, but they carry more than sound; they carry deeper meanings and assumptions. Women's voices that break through male hegemony are heard as disruptive, as threats to what patriarchal power relations regard as "normal" and "harmonious." Voice as a gendered technology reflects the hierarchical arrangement of gender in our culture, with males depicted as the "Subject" and females as the "Other." In today's gender-biased culture, women are the model for digital assistants developed by men, which suggests male biases that cast women as the "Other" rather than as "us." Society as a whole is affected, since this reinforces the power dynamic between men and women. Male software engineers have neglected women's feelings and opinions, contributing to unequal gender relations. In the end, AI voice assistants that communicate primarily in a female voice establish hegemonic views and speak gender norms. While some AI voice assistants include male voices, a more gender-flexible option must be implemented in mainstream technologies.


References

Butler, J. (1999). Gender trouble: Feminism and the subversion of identity (10th anniversary ed.). Routledge. pp. 1-22.

Carson, A. (1995). Glass, Irony, and God (Vol. 808). New Directions Publishing.

Cohen, N. (2019, June 6). Why Siri and Alexa weren't built to smack down harassment. Wired. https://www.wired.com/story/wh...

De Beauvoir, S. (1953). The second sex, trans. and ed. HM Parshley. New York: Knopf, 1993, 44, pp.13-28.

Figure 1: Chin, C., & Robison, M. (2020). Voice Assistant Responses to Gender Identification Questions. Brookings. https://www.brookings.edu/research/how-ai-bots-and-voice-assistants-reinforce-gender-bias/.

Figure 2: Fessler, L. (2017). Sexual comments. Quartz. https://qz.com/911681/we-teste...

Griggs, B. (2011, October 21). Why computer voices are mostly female. CNN. https://www.cnn.com/2011/10/21/tech/innovation/female-computer-voices/index.html.

IBM Watson. IBM. (n.d.). https://www.ibm.com/watson.

James, R. (2015, March 9). Gendered Voices and Social Harmony. Sounding Out! https://soundstudiesblog.com/2015/03/09/gendered-voices-and-social-harmony/.

Nass, C., Moon, Y., & Green, N. (1997). Are Machines Gender Neutral? Gender-Stereotypic Responses to Computers With Voices. Journal of Applied Social Psychology, 27(10), 864–876. https://doi.org/10.1111/j.1559...

Banet-Weiser, S. (2018). Empowered. Duke University Press. https://doi.org/10.1215/9781478002772

Schippers, M. (2007). Recovering the Feminine Other: Masculinity, Femininity, and Gender Hegemony. Theory and Society, 36(1), 85–102. https://doi.org/10.1007/s11186-007-9022-4

Sheppard, E. (2019, March 26). Why are virtual assistants always female? Gender bias in AI must be remedied. The Guardian Labs. https://www.theguardian.com/pw...

Loviglio, J. (2008). Sound effects: Gender, voice and the cultural work of NPR. Radio Journal: International Studies in Broadcast & Audio Media, 5(2/3), 67–81. https://doi-org.proxy.lib.sfu.ca/10.1386/rajo.5.2-3.67_1

UNESCO. (2019). I'd blush if I could: Closing gender divides in digital skills through education. UNESCO. https://unesdoc.unesco.org/ark...

Wakefield, J. (2019, May 21). Female-voice AI reinforces bias, says UN report. BBC News. https://www.bbc.com/news/technology-48349102.

Welch, C. (2021, March 31). Apple won't give Siri a female-sounding voice by default anymore. The Verge. https://www.theverge.com/2021/...

YouTube. (2019). Meet Q: The First Genderless Voice - Full Speech. YouTube.