Siri and Alexa ‘reinforce gender bias’, says UNESCO report

Why do most voice assistants have female names, and why do they have submissive personalities? The answer, according to a new report released by UNESCO, the United Nations Educational, Scientific and Cultural Organization, is that there are hardly any women working in the technical teams that develop these services and other cutting-edge digital tools.

The report is called I’d Blush If I Could, a reference to the standard answer given by the default female voice of Apple’s digital assistant, Siri, in response to insults from users. Beyond Siri, other ‘female’ voice assistants also display submissive traits, an expression of the gender bias built into artificial intelligence (AI) products as a result of what UNESCO calls the “stark gender-imbalances in skills, education and the technology sector.”

Amazon’s Alexa, a reference to the ancient library of Alexandria, is unmistakably female. Microsoft’s Cortana was named after an AI character in the Halo video game franchise that projects itself as a sensuous, unclothed woman. Apple’s Siri is a Norse name that means “beautiful woman who leads you to victory.” Google Assistant, the service behind the Google Home speaker, has a gender-neutral name, but its default voice is female.

The report makes several recommendations: stop making digital assistants female by default, program them to discourage gender-based insults and abusive language, and develop the advanced technical skills of women and girls so they can steer the creation of new technologies alongside men.

Given the explosive growth of voice assistants, the report says, there is an urgent need to help more women and girls cultivate digital skills. Women are significantly under-represented in the teams developing AI tools: they make up only 12% of AI researchers and 6% of software developers, and are 13 times less likely than men to file information and communication technology patents.

“Obedient and obliging machines that pretend to be women are entering our homes, cars and offices,” said Saniye Gülser Corat, director of gender equality at UNESCO.

“Their hardwired subservience influences how people speak to female voices and models how women respond to requests and express themselves. To change course, we need to pay much closer attention to how, when and whether AI technologies are gendered and, crucially, who is gendering them.”