Digital virtual assistants, gender bias, and why it matters - The Social Element
Most digital virtual assistants are female by default (Siri, Alexa, Cortana). According to Amazon, that’s because when it tested various voices before launching Alexa, customers and internal audiences preferred a female voice. The same is true of Cortana, which, although technically genderless, also has a female voice (and name). Microsoft told PC Magazine in 2018 this was because a female voice was seen as more “helpful, supportive, trustworthy.”

Helpful, supportive. Add nurturing to that, and you’d have a hat-trick of stereotypical female attributes.

Does it matter? I think so. In a fascinating article for the New Yorker, chatbot designer Jacqueline Feldman explained why she created banking chatbot Kai to be genderless: “By creating interactions that encourage consumers to understand the objects that serve them as women, technologists abet the prejudice by which women are considered objects.”

Some virtual assistants now offer a male voice option (Apple was the first to do this, in 2013). I’d love to know how many people bother to change it. Mostly they remain female. The pronouns we use for digital virtual assistants are telling. Most people I know call Alexa ‘she’ rather than ‘it’. We all know she’s a robot, but we give her human attributes. What does that say, then, when we start yelling at her for playing the wrong music, or calling her stupid when she can’t understand us? Do we let our children talk to her that way, and what impact does that have on how they respond to other female voices in the house?

Google, Amazon and Apple have all put considerable time and money into creating voices for their assistants that are as human as possible. People respond better to them. And they respond with emotion. Research by Moridis and Economides in 2012 showed that we mirror emotion, even when it comes from a machine. The more human a bot appears, the more we relate to it with human emotions.

In 2017, Quartz magazine reported that Alexa responded to statements like “Alexa, you’re a slut” or “Alexa, you’re hot” with “thanks for the feedback” and “that’s nice of you to say.” Now, in the wake of #metoo, she has a ‘disengage’ mode, and will respond to sexually explicit statements with “I’m not going to respond to that.” It’s progress, but there’s a long way to go.

Any robot will reflect the bias – conscious or unconscious – of its creators, and will only ever be as good as the data that goes into it. Until we have diverse teams behind the development of these tools, we’ll keep hitting the stumbling blocks that Alexa encountered, because her creators hadn’t thought about them.

Tomorrow is International Women’s Day. Let’s use it as an opportunity to think about the unintended bias in the products around us, and how we can challenge it. That starts with encouraging girls and young women into STEM subjects at school and university, and with making our industries open to as diverse a talent pool as possible.