<h1>Women in AI: Reinforcing Sexism and Stereotypes with Tech</h1>
<p><em>Published March 17, 2022; last modified February 13, 2024. Source: https://www.techopedia.com/women-in-ai-reinforcing-sexism-and-stereotypes-with-tech/</em></p>
<p>Digital transformation has changed the way we work and play.</p>

<p>But technological progress does not always translate into social progress with respect to gender parity. Some aspects of tech have reinforced, rather than countered, gender stereotypes. (Also read: <strong>What Do Women in Tech Want?</strong>)</p>

<p>Here are some key areas reinforcing sexism and gendered stereotypes within artificial intelligence (AI):</p>

<h2>Bro Culture at Work</h2>

<p>While tech and AI continue to advance, the same cannot be said for women's positions in these male-dominated fields. Women still hold just over a quarter (26%) of data and AI positions, according to the World Economic Forum.</p>

<p>That lack of representation means that the majority of women working in tech (72%, according to TrustRadius) still have to contend with bro culture. That can translate into a toxic, and even dangerous, environment for women.</p>

<p>At the video game company Activision Blizzard, workplace bro culture led to unequal pay, sexual harassment and even assault without any real consequences for the perpetrators. The Wall Street Journal reported that the CEO, who knew about the misconduct, intervened to make sure employees found guilty in internal investigations were not fired as the investigators had recommended.</p>

<h2>Bro Culture at Play</h2>

<p>The sexism pervading the companies that produce games also creates a hostile environment for female players.</p>

<p>In 2021, Reach3 Insights surveyed 900 women and found 59% of them opted for gender-neutral or even masculine names when playing to avoid sexual harassment.</p>

<p>Over three-quarters of the women surveyed (77%) reported having to deal with some kind of unpleasantness as a female player. Judgment of their skills was reported by 70% and gatekeeping by 65%. Half reported patronizing comments, and 44% said they &#8220;received unsolicited relationship asks while gaming.&#8221;</p>

<h2>New Platforms, Same Old Problem</h2>

<p>Some women have an even worse experience in virtual reality (VR). Jordan Belamire wrote &#8220;My First Virtual Reality Groping,&#8221; describing how an avatar named BigBro442 persisted in groping her avatar despite her requests and orders to stop. Belamire noted:</p>

<p>&#8220;As VR becomes increasingly real, how do we decide what crosses the line from an annoyance to an actual assault? Eventually we&#8217;re going to need rules to tame the wild, wild west of VR multi-player.&#8221;</p>

<p>Another question is: when is &#8220;eventually&#8221; going to arrive?</p>

<p>Belamire wrote &#8220;My First Virtual Reality Groping&#8221; in 2016, and over five years later a similar incident was reported in The Verge. A beta tester for Meta&#8217;s &#8220;Horizon Worlds&#8221; described how her avatar was groped on the platform and how upsetting she found the incident.</p>

<p>&#8220;Sexual harassment is no joke on the regular internet, but being in VR adds another layer that makes the event more intense,&#8221; she wrote. &#8220;Not only was I groped last night, but there were other people there who supported this behavior which made me feel isolated in the Plaza.&#8221;</p>

<p>Meta&#8217;s platform does offer a blocking feature, which can give some control to those entering a space where anyone can approach your avatar. But that kind of solution still doesn&#8217;t measure up to what Belamire suggested: a code of conduct that players would have to adhere to.</p>

<p>That sexual harassment remains a serious problem, one that carries over from the real world to the virtual one, reflects the fact that society is still mired in certain gendered assumptions. And these assumptions also express themselves in subtler forms.</p>

<h2>What Siri Says About Us</h2>

<p>Gender equality was supposed to have advanced since the middle of the last century, but the gendered assumptions that remain in place in everyday tech remind us we still have a long way to go.</p>

<p>That was the focus of a recent UNESCO study entitled &#8220;I&#8217;d Blush If I Could.&#8221;</p>

<p>The study&#8217;s title is a sentence Apple&#8217;s female-gendered voice assistant, Siri, was originally programmed to say in response to users calling her a sexist name. Apple updated Siri&#8217;s programming in early 2019 to offer a more machine-appropriate &#8220;I don&#8217;t know how to respond to that&#8221; when someone makes such a statement to the AI agent.</p>

<p>Still, one has to wonder why it took the company that long. Siri was released in 2011; it shouldn&#8217;t have taken nearly eight years to acknowledge and address a problem of sexist assumptions.</p>

<p>As the report points out, &#8220;Siri&#8217;s &#8216;female&#8217; obsequiousness&#8212;and the servility expressed by so many other digital assistants projected as young women&#8212;provides a powerful illustration of gender biases coded into technology products, pervasive in the technology sector and apparent in digital skills education.&#8221;</p>

<h2>What&#8217;s in a Name?</h2>

<p>Ironically enough, Amazon, whose name refers to a fierce warrior race of women, upheld sexist assumptions about women when it launched its AI agent. Alexa&#8217;s name is derived from Alexandria, a city whose claim to fame in the ancient world was its library, according to Daniel Rausch, the head of Amazon&#8217;s &#8220;Smart Home&#8221; division.</p>

<p>Rausch told Business Insider that the idea behind referencing Alexandria was to evoke the ancient library&#8217;s original collection of volumes, which housed &#8220;all the collective knowledge of the world at that time.&#8221; As that ancient city was named for Alexander the Great, Amazon could just as well have called its agent &#8220;Alex,&#8221; a name used by both men and women.</p>

<p>But the company decided on the distinctly feminine version of the name, just as Apple opted for the feminine &#8220;Siri&#8221; and Microsoft created Cortana. Likely, the companies all did the same kind of market research Amazon said it did. (Also read: <strong>How Will AI Change the Market Research Scenario?</strong>)</p>

<h2>Why AI Uses Women&#8217;s Voices and Avatars</h2>

<p>In the Business Insider interview, Rausch said Amazon &#8220;found that a woman&#8217;s voice is more &#8216;sympathetic&#8217; and better received.&#8221; The article went on to say this preference for female voices predates AI assistants.</p>

<p>Indeed, even the computer on board the Enterprise spoke in a female voice. The voice was in fact that of Majel Barrett-Roddenberry, wife of the creator of the &#8220;Star Trek&#8221; series and most recognized by fans for her recurring role as the perfectly coiffed blonde nurse, Christine Chapel, who dutifully took orders from Dr. McCoy.</p>

<p>True, there are AI agents linked to male identities, as PC Mag&#8217;s Chandra Steele observed in a Medium blog in 2018. But they are typically linked to more serious tasks than those relegated to the virtual assistant on your desktop or phone. Accordingly, IBM&#8217;s Watson, which is associated with things like medical research, was given the &#8220;masculine-sounding voice&#8221; that people associate with confidence and leadership. (Also read: <strong>Top 20 AI Use Cases: Artificial Intelligence in Healthcare</strong>.)</p>

<p>In contrast, female voices are associated with cordiality and complaisance. &#8220;Though they lack bodies,&#8221; Steele explained, &#8220;they embody what we think of when we picture a personal assistant: a competent, efficient, and reliable woman.&#8221;</p>

<p>Sometimes virtual assistants are even granted a feminine virtual body, at least one that appears on-screen. That is the case with IPsoft&#8217;s cognitive agent Amelia, who is depicted as a blonde woman who could be in her twenties. She embodies the dependable female who supports the one in charge: in the background, but also conventionally attractive.</p>

<h2>Tackling the Root of the Problem</h2>

<p>&#8220;There&#8217;s nothing artificial about AI,&#8221; declared Fei-Fei Li, an expert in the field. &#8220;It&#8217;s inspired by people, it&#8217;s created by people, and&#8212;most importantly&#8212;it impacts people.&#8221; Just as &#8220;garbage in, garbage out&#8221; applies to all data, the same holds for what she terms &#8220;bias in, bias out&#8221; for AI systems.</p>

<p>The upside is that it is possible to reshape the path that has been set. But we must make a conscious effort to balance the perspectives fed into AI. Failure to do so, Li said, would &#8220;reinforce biases we&#8217;ve spent generations trying to overcome.&#8221; (Also read: <strong>Minding the Gender Gap: 10 Facts about Women in Tech</strong>.)</p>

<p>What we need to do going forward is consciously combat bro culture, whether it manifests in overtly harmful ways, as in sexual harassment and assault, or more subtly, as in the gender stereotyping of AI-powered entities.</p>