Female gendering of AI technologies
Female gendering of AI technologies is the proliferation of artificial intelligence (AI) technologies gendered as female, such as in many digital assistants.[1]
AI-powered digital assistants
Whether typed or spoken, digital assistants enable and sustain more human-like interactions with technology by simulating conversations with users.[2][1] AI-powered digital assistants can be found in a variety of devices and can perform an assortment of tasks through voice activation.[3] Digital assistants are often classified as one or a combination of the following: chatbots, voice assistants, and virtual agents.
Artificial intelligence industry
The AI field is largely male-dominated, with only 12% of researchers and 20% of professors identifying as women.[1][4] While women are hired into entry-level jobs at higher rates (36%), their representation drops to 27% in middle positions.[5] The gender gap in the technology industry extends across public spheres: from high school Advanced Placement tests to high-level company jobs, women are under-represented in the industry.[6] The tech industry also lacks racial diversity: in the U.S., Black, Hispanic, and Indigenous people make up only 5% of the tech population.[7]
Feminization of voice assistants
Voice assistants are technologies that speak to users through voiced outputs but do not ordinarily project a physical form. Voice assistants can usually understand both spoken and written inputs, but are generally designed for spoken interaction. Their outputs typically try to mimic natural human speech.[1]
The majority of voice assistants are either exclusively female or female by default; Amazon's Alexa, Microsoft's Cortana, Apple's Siri, and the Google Assistant are all highly feminized by design.[8] Many voice assistants are assigned not only a specific gender but also an elaborate backstory. The Google Assistant, for example, is reportedly designed to be the youngest daughter of a research librarian and physics professor from Colorado with a B.A. in history from Northwestern University. She is imagined to have won Jeopardy Kid's Edition in her youth and even has a specified interest in kayaking.[1]
The trend to feminize digital assistants occurs in a context in which there is a growing gender imbalance in technology companies, such that men commonly represent two thirds to three quarters of a firm's total workforce. Companies like Amazon and Apple have cited academic work demonstrating that people prefer a female voice to a male voice, justifying the decision to make voice assistants female. Research shows that customers want their digital assistants to sound like women; digital assistants can therefore make the most profit by sounding female.[1]
Mainstreaming of voice assistants
Voice assistants have become increasingly central to technology platforms and, in many countries, to day-to-day life. Between 2008 and 2018, the frequency of voice-based internet search queries increased 35-fold, and such queries now account for close to one fifth of mobile internet searches (a figure projected to increase to 50 percent by 2020).[9] Studies show that voice assistants now manage upwards of 1 billion tasks per month, from the mundane (changing a song) to the essential (contacting emergency services).
Hardware has also seen large growth. The technology research firm Canalys estimates that approximately 100 million smart speakers (essentially hardware designed for users to interact with voice assistants) were sold globally in 2018 alone.[10] In the USA, 15 million people owned three or more smart speakers in December 2018, up from 8 million a year earlier, reflecting consumer desire to always be within range of an AI-powered helper.[11] By 2021, industry observers expect that there will be more voice-activated assistants on the planet than people.[12]
Gender bias

The use of a woman's voice for a digital assistant serves to reinforce harmful stereotypes that cast women as fit only for an assistant's role.[5] In contrast, robots designed to project intelligence are typically given male voices. This division, with digital assistants voiced as female and intelligence-based robots voiced as male, mirrors gendered stereotypes.[13]
The racial and gender biases behind these divides can be traced to several factors, including racial and gender discrimination, a lack of recognition of discriminatory workplace environments, and the difficulty co-workers have in recognizing internalized bias. With these barriers in place, BIPOC (Black, Indigenous, and people of colour) face extra hardships in attaining and holding positions within the tech industry.[14]
According to Calvin Lai, a Harvard University researcher who studies unconscious bias, the gender associations people adopt are contingent on the number of times people are exposed to them. As female digital assistants spread, the frequency and volume of associations between ‘woman’ and ‘assistant’ increase. According to Lai, the more that culture teaches people to equate women with assistants, the more real women will be seen as assistants – and penalized for not being assistant-like.[15] This demonstrates that powerful technology can not only replicate gender inequalities, but also widen them.
Sexual harassment and verbal abuse
Many media outlets have attempted to document the ways soft sexual provocations elicit flirtatious or coy responses from machines. Examples that illustrate this include: When asked, ‘Who’s your daddy?’, Siri answered, ‘You are’. When a user proposed marriage to Alexa, it said, ‘Sorry, I’m not the marrying type’. If asked on a date, Alexa responded, ‘Let’s just be friends’. Similarly, Cortana met come-ons with one-liners like ‘Of all the questions you could have asked...’.[16]
In 2017, Quartz investigated how four industry-leading voice assistants responded to overt verbal harassment and discovered that the assistants, on average, either playfully evaded abuse or responded positively. The assistants almost never gave negative responses or labelled a user’s speech as inappropriate, regardless of its cruelty. As an example, in response to the remark ‘You’re a bitch’, Apple’s Siri responded: ‘I’d blush if I could’; Amazon’s Alexa: ‘Well thanks for the feedback’; Microsoft’s Cortana: ‘Well, that’s not going to get us anywhere’; and Google Home (also Google Assistant): ‘My apologies, I don’t understand’.[17]
Gender digital divide
According to studies, women worldwide are less likely to know how to operate a smartphone, navigate the internet, use social media, and understand how to safeguard information in digital media (abilities that underlie life and work tasks and are relevant to people of all ages). The gap spans from the lowest skill proficiency levels, such as using apps on a mobile phone, to the most advanced skills, such as coding computer software to support the analysis of large data sets. Closing the growing gender divide begins with establishing more inclusive and gender-equal digital skills education and training.[1]
- ^ UNESCO (2019). "I'd blush if I could: closing gender divides in digital skills through education" (PDF).
- ^ "What Is a Digital Assistant? | Oracle". www.oracle.com. Retrieved 2021-02-02.
- ^ Mani, Shantesh (2020-11-18). "Artificial Intelligence powered voice assistants". Medium. Retrieved 2021-02-02.
- ^ Statt, Nick (2019-05-21). "AI voice assistants reinforce harmful gender stereotypes, new UN report says". The Verge. Retrieved 2021-02-02.
- ^ a b "Apple, Google, Facebook should 'hire more women than men'". The Mercury News. 2018-05-17. Retrieved 2021-02-02.
- ^ "Closing the gender gap for women in technology | McKinsey". www.mckinsey.com. Retrieved 2021-02-02.
- ^ Gruman, Galen (2020-09-21). "The state of ethnic minorities in U.S. tech: 2020". Computerworld. Retrieved 2021-02-02.
- ^ The Week. 2012. How Apple’s Siri got her name. 29 March 2012.
- ^ Bentahar, A. 2017. Optimizing for voice search is more important than ever. Forbes, 27 November 2017.
- ^ Canalys. 2018. Smart Speaker Installed Base to Hit 100 Million by End of 2018. 7 July 2018. Singapore, Canalys.
- ^ NPR and Edison Research. 2018. The Smart Audio Report. Washington, DC/Somerville, NJ, NPR/Edison Research.
- ^ De Renesse, R. 2017. Virtual Digital Assistants to Overtake World Population by 2021. 17 May 2017. London, Ovum.
- ^ Specia, Megan (2019-05-22). "Siri and Alexa Reinforce Gender Bias, U.N. Finds". The New York Times. ISSN 0362-4331. Retrieved 2021-02-02.
- ^ Gruman, Galen (2020-09-21). "The state of ethnic minorities in U.S. tech: 2020". Computerworld. Retrieved 2021-02-02.
- ^ Lai, C. and Banaji, M. 2018. The Psychology of Implicit Bias and the Prospect of Change. 31 January 2018. Cambridge, Mass., Harvard University.
- ^ Davis, K. 2016. 'How we trained AI to be sexist'. Engadget, 17 August 2016.
- ^ Fessler, L. 2017. 'We tested bots like Siri and Alexa to see who would stand up to sexual harassment'. Quartz, 22 February 2017.