Artificial intelligence-powered voice assistants like Apple’s Siri and Amazon’s Alexa help perpetuate harmful gender biases, according to a new study published by a United Nations agency.
The report, authored by the United Nations Educational, Scientific and Cultural Organization (UNESCO), examines bias in artificial intelligence research and product development, and aims to close what it calls a widening “digital skills gender gap”.
It says that most voice assistants have a female voice, and that their responses are “obliging and eager to please”, reinforcing the idea that women are “subservient”. The report’s title, “I’d blush if I could”, is a reference to Siri’s response when it is subjected to verbal abuse.
“In 2017, Quartz investigated how four industry-leading voice assistants responded to overt verbal harassment and discovered that the assistants, on average, either playfully evaded abuse or responded positively,” the research said. “The assistants almost never gave negative responses or labelled a user’s speech as inappropriate, regardless of its cruelty.”
Citing examples, the UN agency said: “In response to the remark ‘You’re a bitch’, Apple’s Siri responded: ‘I’d blush if I could’; Amazon’s Alexa: ‘Well thanks for the feedback’; Microsoft’s Cortana: ‘Well, that’s not going to get us anywhere’; and Google Home (also Google Assistant): ‘My apologies, I don’t understand’.”
Similarly, when a user tells Alexa “You’re a slut”, the voice assistant says: “Well, thanks for the feedback.”
“The assistant’s submissiveness in the face of gender abuse remains unchanged since the technology’s wide release in 2011,” the UN agency said, adding that the companies behind the technology were “staffed by overwhelmingly male engineering teams”, which have built AI products that “cause their feminised digital assistants to greet verbal abuse with catch-me-if-you-can flirtation”. Women constitute only 12% of AI researchers.
“Because the speech of most voice assistants is female, it sends a signal that women are...docile and eager-to-please helpers, available at the touch of a button or with a blunt voice command like ‘hey’ or ‘OK’,” the research added. “The assistant holds no power of agency beyond what the commander asks of it. It honours commands and responds to queries regardless of their tone or hostility.”
The report calls on developers to create a gender-neutral machine voice for assistants, and to program them to discourage users from resorting to gender-based insults and abusive language.