In a previous post I asked ‘Why are we copy-pasting the shitty-ness of IRL structures in digital systems?’, drawing a pyramid that represents the hierarchy of shit that sees 8% of the population owning 86% of the world’s wealth and controlling a structure of deep-rooted inequality, oppression, violence and discrimination. This shitty-ness is being translated into digital systems that build on and perpetuate in real life (IRL) inequalities – see the work of Virginia Eubanks.
The digital systems we’re creating reflect the view of the world of the non-diverse privileged groups of people that create and control these systems: mostly white men. These digital systems are also layered with fantasies about a perfect world, a perfect woman, a perfect future.
The patterns of interaction we are creating in digital worlds are not only building on and perpetuating IRL dominant notions of violence and oppression, but are also reinforcing them and reshaping them back IRL. It’s the IRL-Digital loop of shitty-ness.
By looking at how women are portrayed as robots on screen, in science fiction and now as personal intelligent assistants we can make the connections between how women are viewed and treated IRL and how we are gendering our technology.
In this blog post, I am collating snippets of relevant articles, papers, podcasts, etc. that focus on the topic of gender in technology.
Virtual assistants have long been layered with human traits. Using market studies to justify the female gendering of personal intelligent assistants (PIAs), the tech industry has largely failed to critically engage with what complexities lie behind conclusions such as “People perceive female voices as helping us solve our problems by ourselves while they view male voices as authority figures who tell us the answers to our problems.” 1
This is bullshit. There’s a bunch of social stereotypes associated with this “phenomenon”. Consider also that women are very often silenced. What studies like this fail to engage with is why people feel a preference for a female identity (not voice) in an assistant function, and whether that is something we want to reinforce or subvert.
In more formal terms, Miriam E. Sweeney tells us that:
Increasingly anthropomorphism is used as a design strategy in computing interfaces to make them more accessible and intuitive to users. Technologies are never neutral, and always consist of a complex arrangement of technical, social, and cultural (ideological) aspects. Computing interfaces designed to have the characteristics of people raise ethical questions about what it means to explicitly gender and racialize technologies. 2
The tech industry neglects to properly engage with these complex arrangements and ethical questions because the tech industry is profit driven, after all #capitalism. They don’t really care that the “link between how we treat ‘fembots’ and human women is real”, as Laurie Penny puts it. In the same article Penny says:
Right now, as we’re anticipating the creation of AIs to serve our intimate needs, organise our diaries and care for us, and to do it all for free and without complaint, it’s easy to see how many designers might be more comfortable with those entities having the voices and faces of women. If they were designed male, users might be tempted to treat them as equals, to acknowledge them as human in some way, perhaps even offer them an entry-level salary and a cheeky drink after work. 3
The cringeworthy examples are plentiful:
Caption: I filmed this a couple of weeks ago on a Virgin train from Newcastle.
Then there’s Ms. Dewey from Microsoft:
From ‘When Robots Are an Instrument of Male Desire’ – Katherine Cross at The Establishment, quoting Miriam Sweeney:
“Ms. Dewey was designed according to sexual logics that fundamentally define her as an object of sexual desire and require her to respond to requests for sexual attention,” Sweeney writes, after having studied user responses and inputs into the search engine, as well as a comprehensive content analysis of Ms. Dewey’s replies to certain queries. In her research, for instance, Sweeney observed that a user ordered “You Strip” to Ms. Dewey three times, each time prompting a more compliant response from the virtual assistant. “The design parameters that have Ms. Dewey change a sexual rebuff into sexual obedience creates a crisis of consent in the interface, reinforcing the no-really-means-yes mentality that is characteristic of rape culture under patriarchy.”
“(…) sexist attitudes that still pervade the wider tech industry, and the fantasy of the sexy, sexual servant that many corporations are now feeding. What attitudes do these people take to real women they may encounter working at a restaurant or a Starbucks?”
We have to reckon with the troubling reality that what we fear most in AI is that it will either reflect the worst of us, or fight back against us because we know what we expect of it is morally wrong. It is a guilt-ridden memory of a future where we live out a fantasy of women’s servitude en masse, with ideal womanhood positioned as being ever more robot-like. In order to face that fear, we have to recognize what we are currently trying to build: a servile woman who doesn’t talk back, a female robota who embodies the most dehumanizing aspects of both societal sexism and capitalist urges. 4
Allison de Fren explains how on screen “Sometimes a robot is just the fantasy of the perfect woman realised.” You can hear Allison de Fren in the podcast Fembots, which covers how fembots have been portrayed on screen and the patterns that come from those depictions:
“there’s often a mad scientist type and he is creating the female robot and it’s usually for the benefit of some male protagonist(…)”
“a villainous man creating an army of perfectly controllable women. The idea of controlling women, of mad rich genius men crafting the perfect submissive woman out of plastic and wires(…)” says Sarah Mirk (who I believe is the narrator of the podcast).
“A lot of stories that men have written about artificial women are stories in which, kind of like Pygmalion, men are imagining who is my ideal partner and if I could create my ideal partner is it possible that she could come to life and we could live happily ever after? To a certain extent it’s imagining a perfect future in which there is a companion that never doubts you, never questions you and maybe even doesn’t have her own will (…)”
“The word robot means slave in Czech, often these beings are beings with which we can imagine control over other human beings. (…) so in some ways we’re working out ideas around control but also control we’ve had over other people.” 5
Waifus and epic fails
In doing this research I have started compiling a list of Virtual / AI assistants with examples of how stereotypical views of human women are being translated into female artificial objects.
Azuma Hikari, also known as a holographic waifu and the wife of the future.
“It can also send and receive text messages when its “master” is not at home, interacting in some ways like a…domestic partner. And the Gatebox can also control smart-home appliances, such as lights and robotic vacuums—so when you get home from work, your holographic waifu will have the lights on for you and will have done the cleaning.” “She will always do all she can just for the owner.” 6
Hikari is shown in a Bloomberg clip entitled In Japan, Virtual Partners Fill a Romantic Void, alongside a virtual boyfriend: users play a woman sitting in a chair, who is auctioned off to a rich man who says: ‘I bought you, you are mine.’
Personal Intelligent assistants
The work of Leah Fessler in the Quartz study that tested bots’ responses to sexual harassment is to the point:
Women have been made into servants once again. Except this time, they’re digital. (…) By letting users verbally abuse these assistants without ramifications, their parent companies are allowing certain behavioral stereotypes to be perpetuated. Everyone has an ethical imperative to help prevent abuse, but companies producing digital female servants warrant extra scrutiny, especially if they can unintentionally reinforce their abusers’ actions as normal or acceptable.
The message is clear: Instead of fighting back against abuse, each bot helps entrench sexist tropes through their passivity.
Even if we’re joking, the instinct to harass our bots reflects deeper social issues. In the US, one in five women have been raped in their lifetime, and a similar percentage are sexually assaulted while in college alone; over 90% of victims on college campuses do not report their assault. And within the very realms where many of these bots’ codes are being written, 60% of women working in Silicon Valley have been sexually harassed at work. 7
The IRL-digital loop of shitty-ness
Penny, pungent as always, tells it how it is:
In stories from Blade Runner and Battlestar Galactica to 2015’s Ex Machina, female robots are raped by men and viewers are invited to consider whether these rapes are truly criminal, based on our assessment of whether the fembot has enough sentience to deserve autonomy. This is the same assessment that male judges around the world are trying to make about human women today. 3
(Penny’s stuff is excellent, read it all).
The parallel between the ‘(ro)bot-man’, the ‘non-white-white’ and the ‘woman-man’ relationships is one of power and ownership on one side, and guilty, oppressive, less-than-human feelings on the other.
Not only are we re-creating IRL inequalities but we are actually redefining social norms via the interactions with digital objects.
As Jutta Weber and Corinna Bath put it, these interactions ‘reshape the wo/man-machine relationship into a gendered caregiver-infant relationship’:
Concepts of the human-machine relationship, particularly in the new field of ‘social’ AI, illustrate further ontological and anthropological assumptions. The relationships of owner-pet, parent-baby or caregiver-infant are sorts of pedagogical relationships that afford a lot of time, patience, engagement and work in order to function properly. Are these the kind of relationships desirable for human-machine interaction? Do we really want to educate our machines? 8
Jacqueline Feldman, discussing her experience of designing the personality for a chatbot called Kai, also gives us a glimpse of hope for challenging the gender binary in our society:
By creating interactions that encourage consumers to understand the objects that serve them as women, technologists abet the prejudice by which women are considered objects. They may overlook this hazard in part because these workers are, for the most part, men.
People asking about my job have mistakenly called Kai “she.” By correcting them, I am not de-gendering Kai but simply resisting anthropomorphizing it overly. We don’t need to make our technologies conform to the gender binary of human societies in order to like them. 9
Because we’re literally translating stereotypes and social norms from IRL to digital we’re perpetuating all the gender inequalities that exist in the physical world. It’s urgent to take a critical stance and truly engage with what it means to gender things.
1 Siri and Cortana Sound Like Ladies Because of Sexism – Jessi Hempel quoting Clifford Nass’s ‘Wired for Speech’
8 ‘Social’ Robots & ‘Emotional’ Software Agents: Gendering Processes and De-Gendering Strategies for ‘Technologies in the Making’ – Jutta Weber and Corinna Bath
9 The Bot Politic – Jacqueline Feldman
Margaret Rhee https://mrheeloy.com/
Allison de Fren http://www.mechanicalbridemovie.com/allison-defren-bio.html