Why do most virtual assistants that are powered by artificial intelligence — like Apple’s Siri and Amazon’s Alexa system — by default have female names, female voices and often a submissive or even flirtatious style?
The problem, according to a new report released this week by Unesco, stems from a lack of diversity within the industry that is reinforcing problematic gender stereotypes.
“Obedient and obliging machines that pretend to be women are entering our homes, cars and offices,” Saniye Gulser Corat, Unesco’s director for gender equality, said in a statement. “The world needs to pay much closer attention to how, when and whether A.I. technologies are gendered and, crucially, who is gendering them.”
One particularly worrying reflection of this is the “deflecting, lackluster or apologetic responses” that these assistants give to insults.
The report borrows its title — “I’d Blush if I Could” — from a standard response from Siri, the Apple voice assistant, when a user hurled a gendered expletive at it. When a user tells Alexa, “You’re hot,” her typical response has been a cheery, “That’s nice of you to say!”
Siri’s response was recently altered to a flatter “I don’t know how to respond to that,” but the report suggests that the technology remains gender biased, arguing that the problem starts with engineering teams that are staffed overwhelmingly by men.
“Siri’s ‘female’ obsequiousness — and the servility expressed by so many other digital assistants projected as young women — provides a powerful illustration of gender biases coded into technology products,” the report found.
Amazon’s Alexa, named for the ancient library of Alexandria, is unmistakably female. Microsoft’s Cortana was named after an A.I. character in the Halo video game franchise that projects itself as a sensuous, unclothed woman. Apple’s Siri is a Norse name that means “beautiful woman who leads you to victory.” The Google Assistant system, also known as Google Home, has a gender-neutral name, but the default voice is female.
Baked into their humanized personalities, though, are generations of problematic perceptions of women. These assistants are putting a stamp on society as they become common in homes across the world, and can influence interactions with real women, the report warns. As the report puts it, “The more that culture teaches people to equate women with assistants, the more real women will be seen as assistants — and penalized for not being assistant-like.”
Apple and Google declined to comment on the report. Amazon did not immediately respond to requests for comment.
The publication — the first to offer United Nations recommendations regarding the gendering of A.I. technologies — urged tech companies and governments to stop making digital assistants female by default and explore developing a gender-neutral voice assistant, among other guidance.
The systems are a reflection of broader gender disparities within the technology and A.I. sectors, Unesco noted in the report, which was released in conjunction with the government of Germany and the Equals Skills Coalition, which promotes gender balance in the technology sector.
Women are grossly underrepresented in artificial intelligence, making up 12 percent of A.I. researchers and 6 percent of software developers in the field.
The report noted that technology companies justify the use of female voices by pointing to studies that showed consumers preferred female voices to male ones. But lost in that conversation is research showing that people like the sound of a male voice when it is making authoritative statements, but a female voice when it is being “helpful,” further perpetuating stereotypes.
Experts say bias baked into A.I. and broader disparities within the programming field are not new — pointing to an inadvertently sexist hiring tool developed by Amazon and facial recognition technology that misidentified black faces as examples.
“It’s not always malicious bias, it’s unconscious bias, and lack of awareness that this unconscious bias exists, so it’s perpetuated,” said Dr. Allison Gardner, a co-founder of Women Leading in A.I. “But these mistakes happen because you do not have the diverse teams and the diversity of thought and innovation to spot the obvious problems in place.”
But the report also offers guidance on education and steps to address the issues, measures that equality advocates have long pushed for.
Dr. Gardner’s organization works to bring women working in A.I. together with business leaders and politicians to discuss the ethics, bias and potential for legislative frameworks to develop the industry in a way that is more representative.
The group has published its own list of recommendations for building inclusive artificial intelligence, among them establishing a regulatory body to audit algorithms, investigate complaints and ensure bias is taken into account in the development of new technology.
“We need to change things now, because these things are being implemented now,” Dr. Gardner said, pointing to the rapid spread of A.I.-powered virtual assistants. “We are writing the standards now that will be used in the future.”
Dr. Gardner said that changes are also needed in education, because the bias is a symptom of systemic underrepresentation within a male-dominated field.
“The whole structure of the subject area of computer science has been designed to be male-centric, right down to the very semantics we use,” she said.
Although women now have more opportunities in computer science, many are disappearing from the field as they advance in their careers, a trend known as the “leaky pipeline” phenomenon.
“I would say they are actually being forced out by a rather female-unfriendly environment and culture,” Dr. Gardner said. “It’s the culture that needs to change.”