UN Report: Siri Perpetuates Sexism



vivify
rain

Joined 08 Mar 11 · Moves 12456 · Posted 23 May 19

https://www.cnn.com/2019/05/22/tech/alexa-siri-gender-bias-study-scli-intl/index.html

Siri Perpetuating Stereotypes

Siri, Alexa and other female-voiced AI assistants are perpetuating gender stereotypes and encouraging sexist and abusive language from users, a UN report has said.

The report by UNESCO warns of the negative consequences of the personal assistants, claiming they perpetuate the idea that "women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command."
It also highlighted the passive and polite responses the assistants give when users make sexually abusive remarks, warning that their algorithms are reinforcing sexist tropes.

"The assistant holds no power of agency beyond what the commander asks of it," the report states. "It honours commands and responds to queries regardless of their tone or hostility. In many communities, this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment."

"What emerges is an illusion that Siri — an unfeeling, unknowing, and non-human string of computer code — is a heterosexual female, tolerant and occasionally inviting of male sexual advances and even harassment. It projects a digitally encrypted 'boys will be boys' attitude."

Hundreds of millions of people use personal assistants, and the four main offerings — Apple's (AAPL) Siri, Amazon (AMZN) Alexa, Microsoft's (MSFT) Cortana and Google (GOOGL) Assistant — are all voiced by women as a default setting.

The report is titled "I'd Blush If I Could," after the response Siri once gave when users said "You're a slut."
The UNESCO report outlined a number of similarly polite and accepting responses given when people use sexist language. To the same insult, Alexa responded, "Well, thanks for the feedback."

"Siri responded provocatively to requests for sexual favours by men ('Oooh!'; 'Now, now'; 'I'd blush if I could'; or 'Your language!' ), but less provocatively to sexual requests from women ( 'That's not nice' or 'I'm not THAT kind of personal assistant' )," it found.

"Their passivity, especially in the face of explicit abuse, reinforces sexist tropes," it said.

quackquack

Joined 05 Sep 08 · Moves 66636 · Posted 23 May 19

@vivify said
https://www.cnn.com/2019/05/22/tech/alexa-siri-gender-bias-study-scli-intl/index.html

Siri Perpetuating Stereotypes

Siri, Alexa and other female-voiced AI assistants are perpetuating gender stereotypes and encouraging sexist and abusive language from users, a UN report has said.

The report by UNESCO warns of the negative consequences of the personal assi ...[text shortened]... eir passivity, especially in the face of explicit abuse, reinforces sexist tropes," it said.
I am so thankful the UN is addressing the world's biggest issues. I look forward to a Bill of Rights to protect artificial intelligence from being subjected to verbal abuse. Perhaps we can have governmental agencies overseeing the process. Maybe we could have a special criminal court to incarcerate violators too. No issue should ever be ignored.

Suzianne
Misfit Queen

Isle of Misfit Toys

Joined 08 Aug 03 · Moves 37388 · Posted 24 May 19

Forget Siri or Alexa, I'd rather have Gideon.

I'm gonna guess less than 1% here even know what I'm talking about.

k
Flexible

The wrong side of 60

Joined 22 Dec 11 · Moves 37304 · Posted 24 May 19

@suzianne said
Forget Siri or Alexa, I'd rather have Gideon.

I'm gonna guess less than 1% here even know what I'm talking about.
I heard this one on BBC Radio 4, and at first I wasn't sure about the gravitas of it, but when you think about a young girl or boy growing up listening to those compliant, here-to-please, slightly husky female voices, it makes a lot of sense.
I think they've gone this way to cash in on pre-existing stereotypes, but it's those very stereotypes that need to be broken and dumped in a skip.
I'm not absolutely sure what you're talking about, but then I don't spend a lot of time staying in hotels; it's just too expensive.

JS357

Joined 29 Dec 08 · Moves 6788 · Posted 24 May 19

@quackquack said
I am so thankful the UN is addressing the world's biggest issues. I look forward to a Bill of Rights to protect artificial intelligence from being subjected to verbal abuse. Perhaps we can have governmental agencies overseeing the process. Maybe we could have a special criminal court to incarcerate violators too. No issue should ever be ignored.
I think more study and time is needed to understand the effects of having a gendered voice interface in a personal digital assistant. It might have deeper effects on those of us whose platforms are meatware, not necessarily all negative or positive.

Edit: I must add that, given the gender and gender preferences of most programmers, some of the attributes given to PDAs are predictably going to reflect gender stereotypes, just as with the creation of fictional characters or the raising of a human child.
