AI voice assistants promote sexist attitudes toward women, UN says

CNET | 5/22/2019 | Shelby Brown

Defaulting to Alexa instead of Alex may have larger consequences than you might think.

Digital voice assistants like Siri, Alexa and Google Assistant have, by default, a female voice. A new report from the United Nations says that's a problem.


These default voices reinforce the stereotype of women being "obliging, docile and eager-to-please helpers" that are "available at the touch of a button or with a blunt voice command like 'hey' or 'OK,'" according to the UNESCO report, released Friday.

The smart-home gadgets, which are often referred to as "she" or "her," have no choice but to respond. Digital assistants also can't defend against abuse, which reinforces the idea that women are "subservient and tolerant of poor treatment," the UN report said.


While Apple's Siri and Microsoft's Cortana offer male voices, the defaults for these digital helpers are female. Amazon's Alexa offers several accents, but all of its voices are female. Google says it has developed 10 voice options in the US, and that customers setting up a Google Home device have a 50-50 chance of getting either a traditionally female-sounding or a traditionally male-sounding voice.

"Siri's 'female' obsequiousness -- and the servility expressed by so many other digital assistants projected as young women -- provides a powerful illustration of gender biases coded into technology products, pervasive in the technology sector and apparent in digital skills education," reads the report. It's...
(Excerpt) Read more at: CNET