Why are Artificial Intelligence (AI) robots and voices almost all women? And what does that say about our culture?

With the proliferation of female robots such as Sophia and the popularity of female virtual assistants such as Siri (Apple), Alexa (Amazon) and Cortana (Microsoft), artificial intelligence seems to have a gender issue.

This gender imbalance in AI is a pervasive trend that has drawn sharp criticism in the media (even UNESCO has warned against the dangers of the practice) because it could reinforce stereotypes of women as objects.

But why is femininity injected into artificially intelligent objects? If we want to curb the widespread use of female gendering in AI, we need to better understand the deep roots of this phenomenon.

Making the inhuman more human

In an article published in the journal Psychology & Marketing, we argue that research on what makes people human can provide a new perspective on why feminization is systematically used in AI. We suggest that if women tend to be more objectified in AI than men, it is not just because they are perceived as the perfect assistant, but also because people attribute more humanness to women (versus men) in the first place.

Trailer for Ex Machina, a 2015 film starring Domhnall Gleeson and Oscar Isaac.

Why? Because women are perceived as warmer and more likely to experience emotions than men, gendering AI objects as female helps to humanize them. Warmth and the capacity for experience (but not competence) are indeed seen as fundamental qualities of a full human being, and they are precisely what machines are thought to lack.

Drawing on theories from dehumanization and objectification, we show across five studies with a total sample of more than 3,000 participants that:

  • Women are perceived as more human than men, overall and compared to non-human entities (animals and machines).
  • Female bots are endowed with more positive human qualities than male bots, and they are perceived as more human than male bots, compared to both animals and machines.
  • In a health context, the inferred humanness of female bots increases the feeling that they will treat users as unique individuals, leading to more favorable attitudes toward AI solutions.

We used several different measures of perceived humanness, compared to both animals and machines. For example, to measure blatant humanness of female and male bots compared to animals, we used the ascent humanization scale based on the classic “march of progress” illustration. We explicitly asked online respondents to indicate how “evolved” they perceived female or male bots to be, using a continuous progression from ancient apes to modern humans.

To measure the blatant perceived humanness of female and male bots compared to machines, we created a scale that measures blatant mechanistic (de)humanization by picturing a progression from robot to human (instead of from ape to human). Of course, we created both a female and a male version of each of these scales.

Other measures captured more subtle and implicit perceptions of humanness by asking respondents how strongly they attributed certain emotions and traits to male and female bots. Some of these are said to distinguish humans from machines (for example, “friendly”, “fun-loving”), and others to distinguish humans from animals (for example, “organized”, “polite”). Finally, we also used an implicit association test to investigate whether female bots are more likely than male bots to be associated with the concept of “human” rather than “machine”.
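For readers unfamiliar with how such an implicit association test is typically scored, here is a minimal sketch of a simplified D-score computation (after Greenwald, Nosek and Banaji, 2003); the function name, thresholds and data are illustrative only and do not reproduce the exact scoring procedure used in the studies reported here.

```python
# A simplified sketch of an IAT D-score (after Greenwald, Nosek & Banaji, 2003).
# A positive score indicates a stronger implicit "female bot = human" association.
# Names, thresholds and the example latencies are illustrative only.
from statistics import mean, stdev

def iat_d_score(compatible_rts, incompatible_rts):
    """compatible_rts:   response times (ms) when 'female bot' and 'human'
                         share a response key
    incompatible_rts:    response times (ms) when 'female bot' and 'machine'
                         share a response key"""
    # Drop implausibly slow trials, a common preprocessing step.
    compatible = [rt for rt in compatible_rts if rt < 10_000]
    incompatible = [rt for rt in incompatible_rts if rt < 10_000]
    # Mean latency difference, scaled by the pooled variability of all trials.
    pooled_sd = stdev(compatible + incompatible)
    return (mean(incompatible) - mean(compatible)) / pooled_sd

# Made-up example: faster responses in the compatible block yield a positive score.
print(round(iat_d_score([620, 580, 650, 700, 610, 590],
                        [780, 820, 760, 900, 810, 770]), 2))
```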

The ghost in the machine

We found that women and female robots are perceived as more human on most of the subtle measures and on all of the blatant and implicit measures of humanness, while men and male robots are perceived as more human on the negative dimensions of the subtle measures. Taken together, these results indicate that female robots are not only endowed with more positive human qualities than male robots (benevolent sexism), but that they are also perceived as more human and are expected to be more likely to consider our unique needs in a service context.

These findings may point to a new possible explanation of why female bots are favored over their male counterparts, with people preferring female intelligent machines because such machines are more strongly associated with humanness.

Trailer for Her, a 2013 film starring Joaquin Phoenix and Scarlett Johansson.

If femininity is used to humanize non-human entities, this research suggests that the reason women are treated like objects in AI may lie precisely in the recognition that they are not objects. The popular assumption, frequently referred to as the dehumanization hypothesis, is that it is necessary to view outgroup members as animals or instruments before objectifying them. In other words, dehumanization would be a prerequisite for objectification, with targets of objectification typically being denied their humanness. Contrary to this dominant view, the transformation of women into objects in AI might occur not because women are perceived as subhuman, but because they are perceived as superhuman in the first place.

This is in line with Martha C. Nussbaum’s assertion: “Objectification entails making into a thing… something that is really not a thing” (Nussbaum, 1995, p. 256–7). It also fits with Kate Manne’s view on misogyny and dehumanization: “Often, it’s not a sense of women’s humanity that is lacking. Her humanity is precisely the problem” (Manne, 2018, p. 33). Therefore, the widespread use of female identity in AI artifacts may be rooted in the implicit recognition that women are perceived to be human, and more so than men.

Objectification of women in the real world?

This research builds on what makes people human compared to machines to better understand the deep roots of the widespread female gendering of AI. Because feelings are the very substance of our humanness, and because women are perceived as more likely to experience feelings, we argue that female gendering of AI objects makes them look more human and more likely to consider our unique needs. However, this process of transforming women into objects could backfire, conveying the idea that women are simple tools designed to fulfill their owners’ needs and potentially fueling further objectification and dehumanization of women in the non-digital world.


This research thus highlights the ethical quandary faced by AI designers and policymakers: women are said to be transformed into objects in AI, yet injecting women’s humanity into AI objects makes these objects seem more human and more acceptable.

These results are not particularly encouraging for the future of gender parity in AI, nor for ending the objectification of women in AI. Developing gender-neutral voices could be one way to move away from the female gendering of AI and to stop perpetuating this benevolent sexism. Another solution, similar to Google’s recent experimentation, would be to assign the default voice at random, giving each user either a male or a female intelligent bot with equal probability.
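As a purely illustrative sketch of what such an equal-probability default could look like in practice, the snippet below hashes a user identifier to pick a voice; the function and identifiers are hypothetical and do not reflect Google’s actual implementation.

```python
# Minimal sketch: assign a default assistant voice with a ~50/50 split across
# users, kept stable for any given user. All names here are hypothetical.
import hashlib

def default_voice(user_id: str) -> str:
    """Return "female" or "male" deterministically for a given user ID."""
    digest = hashlib.sha256(user_id.encode("utf-8")).digest()
    return "female" if digest[0] % 2 == 0 else "male"

# Each user always receives the same default, while the population splits
# roughly evenly between the two voices.
for uid in ["user-001", "user-002", "user-003", "user-004"]:
    print(uid, default_voice(uid))
```

Hashing the identifier rather than flipping a coin at each session keeps the assignment stable for a given user while still producing an even split across the user base.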

Sylvie Borau, Ph.D. is an associate professor at TBS Business School, where she teaches ethical marketing. She is also an associate researcher at the Institute for Advanced Study in Toulouse. Find Sylvie on Twitter @sylvieborau

A version of this article was originally posted at the Conversation and has been reposted here with permission. The Conversation can be found on Twitter @ConversationUS
