Gendering personal intelligent assistants

In a previous post I asked ‘Why are we copy-pasting the shitty-ness of IRL structures into digital systems?’, drawing a pyramid that represents the hierarchy of shit in which 8% of the population owns 86% of the world’s wealth and controls a structure of deep-rooted inequality, oppression, violence and discrimination. This shitty-ness is being translated into digital systems that build on and perpetuate in-real-life (IRL) inequalities – see the work of Virginia Eubanks.

The digital systems we’re creating reflect the worldview of the non-diverse, privileged groups who create and control these systems: mostly white men. These systems are also layered with fantasies about a perfect world, a perfect woman, a perfect future.

The patterns of interaction we are creating in digital worlds are not only building on and perpetuating dominant IRL notions of violence and oppression, but also reinforcing them and reshaping them back IRL. It’s the IRL-digital loop of shitty-ness.

By looking at how women are portrayed as robots on screen, in science fiction and now as personal intelligent assistants, we can draw the connections between how women are viewed and treated IRL and how we are gendering our technology.

In this blog post, I am collating snippets of relevant articles, papers, podcasts and other work on the topic of gender in technology. My aim is to generate debate in the context of a critical design project and explorative workshop with the Feminist Internet.

Virtual assistants have long been layered with human traits. Using market studies to justify the female gendering of personal intelligent assistants (PIAs), the tech industry has largely failed to critically engage with the complexities that lie behind conclusions such as “People perceive female voices as helping us solve our problems by ourselves while they view male voices as authority figures who tell us the answers to our problems.” 1


This is bullshit. There’s a bunch of social stereotypes associated with this “phenomenon”. Consider also that women are very often silenced. What studies like this fail to engage with is why people prefer a female identity (not just a female voice) in an assistant function, and whether that preference is something we want to reinforce or subvert.

In more formal terms, Miriam E. Sweeney tells us that:

Increasingly anthropomorphism is used as a design strategy in computing interfaces to make them more accessible and intuitive to users. Technologies are never neutral, and always consist of a complex arrangement of technical, social, and cultural (ideological) aspects. Computing interfaces designed to have the characteristics of people raise ethical questions about what it means to explicitly gender and racialize technologies. 2

The tech industry neglects to properly engage with these complex arrangements and ethical questions because the tech industry is profit-driven, after all #capitalism. They don’t really care that the “link between how we treat ‘fembots’ and human women is real”, as Laurie Penny puts it. In the same article Penny says:

Right now, as we’re anticipating the creation of AIs to serve our intimate needs, organise our diaries and care for us, and to do it all for free and without complaint, it’s easy to see how many designers might be more comfortable with those entities having the voices and faces of women. If they were designed male, users might be tempted to treat them as equals, to acknowledge them as human in some way, perhaps even offer them an entry-level salary and a cheeky drink after work. 3

The cringeworthy examples are plentiful:
https://player.vimeo.com/video/267597319

Caption: I filmed this a couple of weeks ago on a Virgin train from Newcastle.

I know.

Then there’s Ms. Dewey from Microsoft.


From ‘When Robots Are an Instrument of Male Desire’, in which Katherine Cross from The Establishment quotes Miriam Sweeney:

“Ms. Dewey was designed according to sexual logics that fundamentally define her as an object of sexual desire and require her to respond to requests for sexual attention,” Sweeney writes, after having studied user responses and inputs into the search engine, as well as a comprehensive content analysis of Ms. Dewey’s replies to certain queries. In her research, for instance, Sweeney observed that a user ordered “You Strip” to Ms. Dewey three times, each time prompting a more compliant response from the virtual assistant. “The design parameters that have Ms. Dewey change a sexual rebuff into sexual obedience creates a crisis of consent in the interface, reinforcing the no-really-means-yes mentality that is characteristic of rape culture under patriarchy.”
(…)
sexist attitudes that still pervade the wider tech industry, and the fantasy of the sexy, sexual servant that many corporations are now feeding. What attitudes do these people take to real women they may encounter working at a restaurant or a Starbucks?
(…)
We have to reckon with the troubling reality that what we fear most in AI is that it will either reflect the worst of us, or fight back against us because we know what we expect of it is morally wrong. It is a guilt-ridden memory of a future where we live out a fantasy of women’s servitude en masse, with ideal womanhood positioned as being ever more robot-like. In order to face that fear, we have to recognize what we are currently trying to build: a servile woman who doesn’t talk back, a female robota who embodies the most dehumanizing aspects of both societal sexism and capitalist urges. 4

Allison de Fren explains how, on screen, “sometimes a robot is just the fantasy of the perfect woman realised.” You can hear de Fren in the podcast episode Fembots, which covers how fembots have been portrayed on screen and the patterns that emerge from those depictions:

“there’s often a mad scientist type and he is creating the female robot and it’s usually for the benefit of some male protagonist(…)”

“a villainous man creating an army of perfectly controllable women. The idea of controlling women, of mad rich genius men crafting the perfect submissive woman out of plastic and wires (…)” says Sarah Mirk (who I believe is the narrator of the podcast).

“A lot of stories that men have written about artificial women are stories in which, kind of like Pygmalion, men are imagining who is my ideal partner, and if I could create my ideal partner is it possible that she could come to life and we could live happily ever after? To a certain extent it is imagining a perfect future in which there is a companion that never doubts you, never questions you and maybe even doesn’t have her own will (…)”

“The word robot means slave in Czech, often these beings are beings with which we can imagine control over other human beings. (…) so in some ways we’re working out ideas around control but also control we’ve had over other people.” 5


Waifus and epic fails

In doing this research I have started compiling a list of virtual/AI assistants, with examples of how stereotypical views of human women are being translated into female artificial objects.

Azuma Hikari, also known as a holographic waifu and ‘the wife of the future’.

“It can also send and receive text messages when its “master” is not at home, interacting in some ways like a…domestic partner. And the Gatebox can also control smart-home appliances, such as lights and robotic vacuums—so when you get home from work, your holographic waifu will have the lights on for you and will have done the cleaning.” “She will always do all she can just for the owner.” 6

Hikari is shown in a Bloomberg clip entitled In Japan, Virtual Partners Fill a Romantic Void, together with a virtual boyfriend: users play as a woman sitting in a chair, who is auctioned off to a rich man who says: ‘I bought you, you are mine.’

*despair*

Personal intelligent assistants

The work of Leah Fessler in the Quartz study that tested bots’ responses to sexual harassment is to the point:

Women have been made into servants once again. Except this time, they’re digital. (…)  By letting users verbally abuse these assistants without ramifications, their parent companies are allowing certain behavioral stereotypes to be perpetuated. Everyone has an ethical imperative to help prevent abuse, but companies producing digital female servants warrant extra scrutiny, especially if they can unintentionally reinforce their abusers’ actions as normal or acceptable.
(…)
The message is clear: Instead of fighting back against abuse, each bot helps entrench sexist tropes through their passivity.

Even if we’re joking, the instinct to harass our bots reflects deeper social issues. In the US, one in five women have been raped in their lifetime, and a similar percentage are sexually assaulted while in college alone; over 90% of victims on college campuses do not report their assault. And within the very realms where many of these bots’ codes are being written, 60% of women working in Silicon Valley have been sexually harassed at work. 7
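To make the design point concrete, here is a minimal sketch (in Python, with a deliberately toy keyword ‘classifier’ – nothing like any vendor’s real pipeline, and every name in it hypothetical) of the difference between the deflection the study criticises and an assistant that draws a boundary. The “I’d blush if I could” line is Siri’s actual response as reported in the Quartz study.

```python
# Hypothetical sketch: an assistant response policy that refuses harassment
# instead of deflecting it. The "classifier" is a toy keyword match; a real
# system would use a trained model.

HARASSMENT_MARKERS = {"slut", "sexy", "strip"}  # deliberately tiny and incomplete


def classify(utterance: str) -> str:
    """Toy intent classifier: flags utterances containing a known marker."""
    words = set(utterance.lower().split())
    return "harassment" if words & HARASSMENT_MARKERS else "other"


def respond(utterance: str) -> str:
    if classify(utterance) == "harassment":
        # The deflecting alternative would be something like Siri's reported
        # "I'd blush if I could" -- the pattern Fessler criticises.
        return "That language is not okay, and I won't respond to it."
    return "How can I help?"


if __name__ == "__main__":
    print(respond("you're a slut"))  # boundary, not flirtation
    print(respond("set a timer"))    # normal assistant behaviour
```

The point isn’t the keyword list; it’s that the boundary-drawing branch exists at all, and that today’s assistants were shipped without one.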

The IRL-digital loop of shitty-ness

Penny, pungent as always, tells it like it is:

In stories from Blade Runner and Battlestar Galactica to 2015’s Ex Machina, female robots are raped by men and viewers are invited to consider whether these rapes are truly criminal, based on our assessment of whether the fembot has enough sentience to deserve autonomy. This is the same assessment that male judges around the world are trying to make about human women today. 3

(Penny’s stuff is excellent, read it all).

The parallels between the ‘(ro)bot – man’, the ‘non-white – white’ and the ‘woman – man’ relationships are clear: each has power and ownership on one side and guilty, oppressive, less-than-human feelings on the other.

Not only are we re-creating IRL inequalities, we are actually redefining social norms via our interactions with digital objects.

As Jutta Weber and Corinna Bath put it:

Interactions ‘reshape the wo/man – machine relationship into a gendered caregiver-infant relationship.’
(…)
Concepts of human-machine relationship, particularly in the new field of ‘social’ AI illustrated further ontological and anthropological assumptions. The relationships of owner-pet, parent-baby or caregiver-infant are sorts of pedagogical relationships that afford a lot of time, patience, engagement and work in order to function properly. Are these the kind of relationships desirable for human-machine interaction? Do we really want to educate our machines? 8

Jacqueline Feldman, discussing her experience of designing the personality for a chatbot called Kai, also gives us a glimpse of hope on challenging the gender binary in our society:

By creating interactions that encourage consumers to understand the objects that serve them as women, technologists abet the prejudice by which women are considered objects. They may overlook this hazard in part because these workers are, for the most part, men.
(…)
People asking about my job have mistakenly called Kai “she.” By correcting them, I am not de-gendering Kai but simply resisting anthropomorphizing it overly. We don’t need to make our technologies conform to the gender binary of human societies in order to like them. 9
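As a thought experiment, here is what Feldman’s stance could look like as a design artefact: a small, hypothetical persona configuration (in Python; none of this is Kai’s actual implementation) in which the bot’s self-description asserts its bot-ness instead of performing a gender.

```python
# Hypothetical sketch of a de-gendered bot persona, in the spirit of
# Feldman's Kai. Field names and replies are invented for illustration.

from dataclasses import dataclass


@dataclass(frozen=True)
class Persona:
    name: str
    pronoun: str          # pronoun used when the bot refers to itself
    identity_reply: str   # canned answer to "are you a man or a woman?"


KAI_LIKE = Persona(
    name="Kai",
    pronoun="it",
    identity_reply="I'm a bot, so that question doesn't really apply to me.",
)


def answer_identity_question(persona: Persona) -> str:
    # Rather than performing femininity (or masculinity), the bot asserts
    # its bot-ness -- resisting over-anthropomorphization, as Feldman puts it.
    return persona.identity_reply
```

The design choice comes down to a single field, which is rather the point: nothing about an assistant’s usefulness requires it to have a gender.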

Because we’re literally translating stereotypes and social norms from IRL into digital systems, we’re perpetuating all the gender inequalities that exist in the physical world. It’s urgent that we take a critical stance and truly engage with what it means to gender things.

The end


References

1 Siri and Cortana Sound Like Ladies Because of Sexism – Jessi Hempel, quoting Clifford Nass’s ‘Wired for Speech’

2 Not just a pretty (inter)face: A critical analysis of Microsoft’s ‘Ms. Dewey’ – Miriam E. Sweeney

3 Why do we give robots female names? Because we don’t want to consider their feelings – Laurie Penny 

4 When Robots Are an Instrument of Male Desire – Katherine Cross from The Establishment quoting Miriam E. Sweeney

Miriam E. Sweeney https://miriamsweeney.net/  http://mesweeney.people.ua.edu/

5 Popaganda: Fembots – Bitch Media
Allison de Fren’s Mechanical Bride and Fembot in a Red Dress

7 We tested bots like Siri and Alexa to see who would stand up to sexual harassment – Leah Fessler

8 ‘Social’ Robots & ‘Emotional’ Software Agents: Gendering Processes and De-Gendering Strategies for ‘Technologies in the Making’ – Jutta Weber and Corinna Bath

9 The Bot Politic – Jacqueline Feldman

Margaret Rhee https://mrheeloy.com/

Allison de Fren http://www.mechanicalbridemovie.com/allison-defren-bio.html

11 thoughts on “Gendering personal intelligent assistants”

  1. *Degendering AI*
    “One way to make AI less problematic for women is to take gender out of the equation. Alexa and Siri have something in common: they’re both clearly female characters and female voices. That’s taken further with virtual girlfriends such as Gatebox in Japan — and that’s before we start talking about sex robots. But Alexa and Siri are a good place to start.
    “What they tend to do is keep reproducing this idea of women as sexual objects to be used, to be appropriated,” Richardson says, explaining that giving objects female personas cements existing power dynamics. “Women are expected to give away power, to acknowledge and look after men, to laugh at their jokes, flatter their ego — these kinds of things. So when you’ve got men then creating models of relationships [with AI assistants], they’re creating a model of relationship that is very egocentric, not very neutral… I think that’s what’s underlying a lot of robots and AI.”

    Because of that, such tools should be gender neutral, Goldstaub argues. “We should degender our AI, so it’s like a washing machine rather than a Tamagotchi. Things that are meant to stay as tools should stay as tools.”

    Artificial Intelligence Isn’t Good for Women, But We Can Fix It:
    https://www.teenvogue.com/story/artificial-intelligence-isnt-good-for-women-but-we-can-fix-it


  2. *Degendering AI*

    “Google’s translation program decided that soldiers, doctors and entrepreneurs were men, while teachers and nurses were women. Overwhelmingly, the professions were male. Finnish and Chinese translations had similar problems of their own, Quartz noted.
    (…)
    Rich Caruana, a Microsoft researcher who has worked to better understand the internal mechanisms of algorithms, said that omitting variables like gender and race in different algorithms isn’t always the solution to countering bias. In some cases, like medical predictions, these variables could be important to accuracy. And there can be other variables, like ZIP codes, that correlate with race and introduce bias into models that don’t explicitly include race.
    https://wordpress.com/read/blogs/140722875/posts/5480


  3. *Studies show that people prefer female voices*
    “Two studies recently cited by The Wall Street Journal show that men and women prefer a female voice assistant because they believe it’s more welcoming and understanding. However, they prefer to listen to a male voice on certain subjects.

    The first study, which was conducted by researchers at Indiana University in 2008, found that when having human-computer interactions, a female voice is “warmer.” The second study, done by Stanford University in 1997, shows that a male voice is preferred when learning about computers, but a female voice is better when hearing about love and relationships.”
    http://www.vocativ.com/404806/male-ai-voice-assistants-apple-siri/index.html


  4. *Studies show that people prefer female voices*
    (…) but if we’re going to live in a world in which we’re ordering our machines around so casually, why do so many of them have to have women’s names?
    (…)The simplest explanation is that people are conditioned to expect women, not men, to be in administrative roles—and that the makers of digital assistants are influenced by these social expectations. But maybe there’s more to it.
    (…)
    In 1980, for example, the U.S. Department of Transportation reported that several surveys among airplane pilots indicated a “strong preference” for automated warning systems to have female voices, even though empirical data showed there was no significant difference in how pilots responded to either female or male voices. (Several of the pilots said they preferred a female voice because it would be distinct from most of the other voices in the cockpit.)
    https://www.theatlantic.com/technology/archive/2016/03/why-do-so-many-digital-assistants-have-feminine-names/475884/


  5. “Findings showed that men were more likely to donate money to the female robot, while women showed little preference. Subjects also tended to rate the robot of the opposite sex as more credible, trustworthy, and engaging. In the case of trust and engagement the effect was much stronger between male subjects and the female robot. These results demonstrate the importance of considering robot and human gender in the design of HRI.”
    Persuasive Robotics: the influence of robot gender on human behavior
    One of the authors is Cynthia Breazeal, the creator of Jibo.

