Over the past three years I've been creating a body of work that focuses on the ways in which the structures of oppression that exist in real life (IRL) are mirrored, reinforced and amplified by the digital worlds we create — on how digital bias is a continuation of IRL bias.
Within this research, I’ve taken voice technologies as an object to explore the ways in which power structures are embodied in the everyday consumer devices that mediate our lives:
- In 2017 I created ‘Alexa Diaries’, a fictional project in which Alexa tells the story of the first 11 days of becoming part of my life.
- I followed with the ‘Feminist Alexa’ project, a series of participatory design workshops exploring the ways in which gender is used in technology and its connections with gender-based discrimination IRL.
- In ‘Alexa, you’re a pussy’ I consolidated my thinking on the topics of gender, technology and inequality.
Three key points have served as a common thread throughout the research:
1- The way gender is used in tech is a direct reflection of how women are viewed and treated IRL
The use of human traits in technology is never neutral. By looking at how women have been portrayed as robots in science fiction, and are now portrayed as intelligent personal assistants, we can understand the use of gender in technology as part of a historical continuum: the construction of the artificial woman.
2- The way gender is used in technology has a real impact on women IRL
“By creating interactions that encourage consumers to understand the objects that serve them as women, technologists abet the prejudice by which women are considered objects. They may overlook this hazard in part because these workers are, for the most part, men.” (Jacqueline Feldman)
Cringe: Virgin train toilet
3- The way gender is used in technology is a direct reflection of heteropatriarchy and intersects with other systems of oppression
Overly reductionist alert: we can use Patricia Hill Collins’ ‘Matrix of Domination’ to structure an intersectional approach, considering the ways in which numerous systems of oppression (heteropatriarchy, white supremacy, settler colonialism and capitalism) interlock and sit at the core of the products and services that mediate our day-to-day interactions.
Now, the industry is motivated by selling, by making a profit. There is no real intention to address the impact of using gender in technology, much as there is no desire to address sustainability and human rights in the production line.
Tech companies do not give a fig about the ways in which their products and services are actively reproducing inequality and causing ecological breakdown — they just want to keep on selling.
“And while the sheer visibility of the problem is now prompting some governments towards banning some single use products, the systemic nature of the problem — an economy and culture of over-production and over-consumption, particularly of packaged, ready-to-eat food and beverages, and FMCGs (fast-moving consumer goods) — is not sufficiently confronted by decision-makers.” (Anne-Marie Willis, ‘Ontological Design and Criticality’)
In which ways, by addressing the symptoms (e.g. algorithmic bias), are we engaging with the systemic nature of the problem? Or are we just enabling tech companies to sell more?
“What does it mean when the tools of a racist patriarchy are used to examine the fruits of that same patriarchy? It means that only the most narrow parameters of change are possible and allowable.” (Audre Lorde, ‘The Master’s Tools Will Never Dismantle the Master’s House’)
Systemic problems will not be solved by the same structures that created them. We need to come up with alternatives that counter, challenge and change the present towards more equitable futures.