https://www.cnn.com/2019/05/22/tech/alexa-siri-gender-bias-study-scli-intl/index.html
Siri Perpetuating Stereotypes
Siri, Alexa and other female-voiced AI assistants are perpetuating gender stereotypes and encouraging sexist and abusive language from users, a UN report has said.
The report by UNESCO warns of the negative consequences of the personal assistants, claiming they perpetuate the idea that "women are obliging, docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command."
It also highlighted the passive and polite responses the assistants give when users make sexually abusive remarks, warning that their algorithms are reinforcing sexist tropes.
"The assistant holds no power of agency beyond what the commander asks of it," the report states. "It honours commands and responds to queries regardless of their tone or hostility. In many communities, this reinforces commonly held gender biases that women are subservient and tolerant of poor treatment."
"What emerges is an illusion that Siri — an unfeeling, unknowing, and non-human string of computer code — is a heterosexual female, tolerant and occasionally inviting of male sexual advances and even harassment. It projects a digitally encrypted 'boys will be boys' attitude."
Hundreds of millions of people use personal assistants, and the four main offerings — Apple's (AAPL) Siri, Amazon's (AMZN) Alexa, Microsoft's (MSFT) Cortana and Google's (GOOGL) Assistant — are all voiced by women as a default setting.
The report was named "I'd Blush If I Could," which is the response Siri once gave when users said "You're a slut."
The UNESCO report outlined a number of similarly polite and accepting responses given when people use sexist language. To the same insult, Alexa responded, "Well, thanks for the feedback," the report said.
"Siri responded provocatively to requests for sexual favours by men ('Oooh!'; 'Now, now'; 'I'd blush if I could'; or 'Your language!' ), but less provocatively to sexual requests from women ( 'That's not nice' or 'I'm not THAT kind of personal assistant' )," it found.
"Their passivity, especially in the face of explicit abuse, reinforces sexist tropes," it said.
@vivify said: I am so thankful the UN is addressing the world's biggest issues. I look forward to a Bill of Rights to protect artificial intelligence from being subjected to verbal abuse. Perhaps we can have governmental agencies overseeing the process. Maybe we could have a special criminal court to incarcerate violators too. No issue should ever be ignored.
@suzianne said: I heard this one on BBC Radio 4, and at first I wasn't sure about the gravitas of it, but when you think about a young girl or boy growing up listening to those compliant, here-to-please, slightly husky female voices, it makes a lot of sense.
Forget Siri or Alexa, I'd rather have Gideon.
I'm gonna guess less than 1% here even know what I'm talking about.
I think they've gone this way to cash in on pre-existing stereotypes, but it's those very stereotypes that need to be broken and dumped in a skip.
I'm not absolutely sure what you're talking about, but then I don't spend a lot of time staying in hotels; it's just too expensive.
@quackquack said: I think more study and time is needed to understand the effects of having a gendered voice interface in a personal digital assistant. It might have deeper effects, not necessarily all negative or positive, for those of us whose platforms are meatware.
Edit: I must add that, given the gender and gender preferences of most programmers, some of the attributes given to PDAs are predictably going to reflect gender stereotypes, just as with the creation of fictional characters or the raising of a human child.