But Sutton points out that people are now deeply familiar with today's voice technologies.
"We are so conditioned to hearing certain types of voices in these devices that moving away from that can be difficult for users," she says.
Obviously, part of the problem here is that gender bias and prejudice exist throughout society, and synthetic voices – like any cultural artefact – risk reflecting them. While it's still a worthwhile exercise, you can't eliminate gender bias simply by redesigning Siri's voice. It won't reverse people's misogynistic attitudes overnight, or suddenly equalize the numbers of women and men working in the AI industry.
There is another point to be made here: virtual assistants, by definition, will always remain subservient entities. They are more or less digital servants, after all, so how can we ever talk to them as equals? Yet that is precisely what would be needed to break away from the awkward power dynamics and problematic authoritarian behavior we currently see.
"If it's primarily designed to help people search and shop, how far can you go with meaningful representations and relationships?" asks Charlotte Webb, co-founder of Feminist Internet.
In the near future, we are likely to encounter more technologies that speak to us. Webb says she is concerned about how voice assistants could continue to perpetuate gender stereotypes once they are embodied as avatars in "metaverse" virtual-reality spaces. People have already been accused of sexually harassing others in the metaverse. Will virtual assistants, inadvertently or otherwise, enable and encourage such behavior?
The history of synthetic voices, and our attitudes toward them, may have perpetuated and even deepened gender biases – like a feedback loop that amplifies some of our worst impulses. However, awareness of such issues has risen in recent years, with investigations into the outputs of artificial-intelligence systems and changes in social attitudes thanks to the #MeToo movement and similar campaigns.
You could argue that is the gist of all of this, after all. A more informed approach to one another, and to the wonderful array of human identities out in the world, begins with us, not a database. To overcome gender-based biases, we can't simply update the software, or send it back to the manufacturer.
“I definitely don’t see a technical solution to it,” Webb says. “I think it’s a human problem.”