Is Siri Sexist? UN Report Finds Gender Bias in Virtual Assistants

Will your smart home device listen to you, but not your girlfriend? She's not the only one being ignored, according to a recent UN report.

According to a UN report, the almost exclusive use of female voices in virtual assistants — along with other problematic behavior — is reinforcing negative stereotypes about women.

The report, titled I’d Blush If I Could, was produced by UNESCO to study the effects of the growing trend of gendered artificial intelligence. It takes its name from what was, until recently, Siri’s response to comments that would amount to street harassment in the real world. And let’s just say, the researchers didn’t find a lot of good news.

How Are Virtual Assistants Sexist?

Image: Microsoft’s Cortana virtual assistant.

The UN report is nothing if not extensive, delving thoroughly into exactly which aspects of virtual assistants amount to a sexist depiction of the women their names and voices are so clearly modeled on. Here are a few specific examples found within the report.

Humanized Backstories

It’s difficult not to notice that almost all leading virtual assistants are modeled on women, in name and voice. But it’s not only a name and voice that’s attributed to the technology. Creative teams are paid to develop more in-depth backstories for these “women,” in order to humanize them and help the AI express itself in a more satisfying and familiar way. And some clear themes emerge in these character developments: sex appeal and submissiveness.

The name Siri means “beautiful woman who leads you to victory” in Norse. Cortana is based on a “sensuous unclothed woman” from the video game Halo. Even Ava, a customer-help virtual agent developed by Autodesk, is described as a “twenty-something with smooth skin, full lips, long hair, and piercing brownish-blue eyes…. servile, obedient, and unfailingly polite.”

Arguably, these character traits are not problematic on their own; you might well expect a virtual assistant to be obedient in character. But when such ideas are so consistently and exclusively applied to depictions of women, a problem emerges. What’s more, if the purpose of humanizing the technology is to fit into society more naturally, it would do better to represent our diverse reality than to reawaken the rigid, stereotypical expectations of women from the 1950s.

The creators of these feminine characters are quick to excuse themselves from any blame or negative associations, however, reminding us that these assistants are technically genderless.

“Widely used voice assistants often claim to be neither male nor female in response to direct lines of questioning. When a user asks Siri if it is a woman, the technology responds: ‘I am genderless like cacti and certain species of fish’ or ‘I don’t have a gender.’ Google Assistant: ‘I’m all-inclusive.’ And Cortana: ‘Well, technically I’m a cloud of infinitesimal data computation.’ Only Alexa answers: ‘I’m female in character.'”

But claiming these assistants are technically genderless is not a “get out of jail free” card for their creators, at least not until the other aspects of their “personalities” are made genderless too.

“Nearly all of these assistants have been feminized – in name, in voice, in patterns of speech and in personality. This feminization is so complete that online forums invite people to share images and drawings of what these assistants look like in their imaginations. Nearly all of the depictions are of young, attractive women.”

Response to Abuse

Image: Chart from the UN report showing virtual assistants’ responses to verbal harassment.

The problem is all the more evident in virtual assistants’ wildly inappropriate responses to abuse and harassment. As you can see from the chart, the typical voice response to these unsolicited statements – which would be considered creepy or predatory if said aloud in real life – is kind, playful, and at times flirtatious. While such responses may seem harmless on the surface, there’s no telling what kind of long-term psychological effects they could be having on male users.

To make matters worse, the report found that virtual assistants’ responses to women engaging in the same cruel behavior were notably less encouraging (“That’s not nice” or “I’m not that kind of personal assistant”). This further fosters a problematic “boys will be boys” attitude that users may take to heart, and there has been almost no progress in improving these responses over the technology’s eight years of existence.

“The assistant’s [Siri’s] submissiveness in the face of gender abuse remains unchanged since the technology’s wide release in 2011.”

For an industry that is constantly evolving, this stagnant response to outdated gender stereotypes speaks volumes, particularly when you consider what is fueling the sexism in the first place.

(Not) Listening to Women

Because virtual assistants are mostly voice-activated, listening is just as important as responding. The ability to hear and understand exactly what a user wants is integral to the primary function of the device. Unfortunately, virtual assistants exhibit sexist tendencies here too, as they’re far less likely to hear female users on a consistent basis.

“Google’s widely used speech-recognition software is 70 percent more likely to accurately recognize male speech than female speech, according to research conducted at the University of Washington.”

Something as simple as recognizing higher-pitched voices as reliably as lower-pitched ones might seem like a minor engineering detail, but it reflects decisions most likely made by teams consisting predominantly of men. And that’s a big part of the problem.
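To make the cited accuracy gap concrete, here is a minimal, purely illustrative Python sketch (not from the report, and not Google’s code) of how such a gap could be measured: run the same recognizer over recordings from different groups of speakers and compare the word error rate. The toy transcripts and group labels below are hypothetical.

```python
# Hypothetical sketch: measuring a speech-recognition accuracy gap between
# speaker groups by comparing average word error rate (WER).

def word_error_rate(reference: str, hypothesis: str) -> float:
    """Standard WER: word-level edit distance divided by reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming edit distance over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[len(ref)][len(hyp)] / max(len(ref), 1)

def average_wer(samples: list[tuple[str, str]]) -> float:
    """Mean WER over (ground-truth transcript, recognizer output) pairs."""
    return sum(word_error_rate(ref, hyp) for ref, hyp in samples) / len(samples)

# Toy data, purely for illustration.
female_samples = [("turn on the kitchen lights", "turn on the kitten lights")]
male_samples = [("turn on the kitchen lights", "turn on the kitchen lights")]

print("WER, female speakers:", average_wer(female_samples))
print("WER, male speakers:  ", average_wer(male_samples))
```

A consistently higher error rate for one group is the kind of disparity the University of Washington research describes, and it often traces back, at least in part, to which voices dominate the system’s training data.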

What Is Fueling This Sexism?

Image: Google diversity report pie chart showing roughly 70 percent men and 30 percent women in tech roles.

The gender gap is a well-established and widely researched fact, particularly in tech. The under-representation of women is notable across the entire industry, from computing jobs, where women hold only one in four positions, to Silicon Valley startup founders, of whom women make up only 12 percent.

As you’d expect, artificial intelligence follows suit with its tech industry compatriots. According to the UN report, only 12% of AI researchers and 6% of AI software developers are women, which is a driving factor in virtual assistants’ sexist behavior.

“Bias can . . . emerge in AI systems because of the very narrow subset of the population that design them. AI developers are mostly male, generally highly paid, and similarly technically educated. Their interests, needs, and life experiences will necessarily be reflected in the AI they create. Bias, whether conscious or unconscious, reflects problems of inclusion and representation.”

It’s a painfully obvious and yet hardly addressed problem in tech, and the repercussions are finally beginning to come to light. Until the demographics of the tech community begin to reflect the demographics of the real world, problems like gendered AI are going to continue to pop up.

Why It Matters

As the UN report demonstrates, gendered AI is far from a random, harmless decision about what kind of voice these virtual assistants should use. The long-term ramifications are far-reaching and significant, particularly when you consider what regular use could be doing to the mentality of the men and women who rely on these assistants.

“The more that culture teaches people to equate women with assistants, the more real women will be seen as assistants — and penalized for not being assistant-like.”

It’s a simple concept: treat virtual “women” a certain way, and that behavior will manifest in the real world with real women. This is why considering the ramifications of small tech decisions is so important: it rarely stops there. With more and more advanced technology being created and programmed every day, considering the future is more important than ever.

“The gender issues addressed here foreshadow debates that will become more and more prominent as AI technologies assume greater human-like communication capabilities.”

Understanding the problem and why it matters is a good first step toward making sure issues like this don’t become ever harder to solve. However, tech companies will need to take action for there to be any meaningful impact.

How Can Tech Companies Fix This?

Image: Cartoon about gendered voice assistants. Flirtation with virtual assistants, under the assumption of a female gender, has become so commonplace that it is often the subject of humor. (Source: Dilbert Comics, April 2019.)

As the sole controllers of their respective virtual assistants, companies like Google, Apple, and Amazon are the ones responsible for fixing the gendered AI problem this UN report has brought to light. Unfortunately, the trouble with expecting tech companies to correct it is that they have the perfect excuse for maintaining the status quo: capitalism.

“To justify the decision to make voice assistants female, companies like Amazon and Apple have cited academic work demonstrating that people prefer a female voice to a male voice. This rationale brushes away questions of gender bias: companies make a profit by attracting and pleasing customers; customers want their digital assistants to sound like women; therefore digital assistants can make the most profit by sounding female.”

As the report points out, though, blindly following the preferences of consumers is perhaps the surest way of corrupting your product for the sake of a few extra dollars, particularly when the research you’re conducting is designed to make you sound right.

“Lost in this narrative, however, are studies that refute or complicate the idea that humans have a blanket preference for female voices. Research has suggested that most people prefer low-pitch masculine speech (think Sean Connery); that people like the sound of a male voice when it is making authoritative statements, but a female voice when it is being helpful.”

The first step in fixing the problem, then, is for tech companies to acknowledge that there is one. Hopefully, this UN report will make it easier for AI engineers, developers, and CEOs to take action and study exactly how gender bias factors into virtual assistant programming.

If, however, a UN report doesn’t do the trick — which is fairly likely given the amount of money that is to be made from virtual assistants — there are other organizations pushing them to do the right thing whether they want to or not.

“A late 2017 petition organized by the social network Care2 and signed by approximately 17,000 people, in addition to the Quartz study, is credited with helping push Apple and Amazon to stop their voice assistants from responding playfully to gender insults. The petition called on technology companies to ‘reprogram their bots to push back against sexual harassment,’ noting that ‘in the #MeToo movement we have a unique opportunity to develop AI in a way that creates a kinder world.’”
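The kind of reprogramming the petition calls for can be pictured as a small change in response policy: meet harassing remarks with one firm, consistent refusal instead of a playful deflection, applied identically no matter who is speaking. Below is a minimal, purely hypothetical Python sketch of that idea; the phrase list, the respond() function, and the downstream handler are illustrative assumptions, not any vendor’s actual code.

```python
# Purely hypothetical sketch of a "push back, don't flirt" response policy.
# This is not Apple's, Amazon's, or Google's actual implementation.

ABUSIVE_PHRASES = {       # illustrative examples only
    "you're hot",
    "you're a bitch",
}

PUSHBACK = "That language isn't appropriate. I'm not going to respond to that."

def handle_normal_request(utterance: str) -> str:
    # Placeholder for the assistant's ordinary intent handling.
    return f"Okay, working on: {utterance}"

def respond(utterance: str) -> str:
    # Apply the same firm response to harassment, with no playful variants
    # and no difference based on the perceived gender of the speaker.
    if utterance.lower().strip() in ABUSIVE_PHRASES:
        return PUSHBACK
    return handle_normal_request(utterance)

print(respond("You're hot"))                   # -> the pushback message
print(respond("Set a timer for ten minutes"))  # -> normal handling
```

A real assistant would classify abusive language with far more sophistication than a phrase list, but the policy decision – refuse rather than flirt, for every user – is the part the petition is asking for.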

There are more concrete approaches to solving the problem, from genderless voices to politeness checks, but the real solution is closing the gender gap in tech. Regardless of how many quick fixes and one-minute solutions the industry can muster for its many gender-based problems, the true fix will come in the form of a diversified workforce working together to address the needs of everyone, rather than just well-educated, straight, white men. Until the gender gap is addressed, there will be more tone-deaf tech innovations on the horizon.
