Empathetic Design

And the Reengineering of Gendered Spaces

By Rhiannon Williams / Feminist Internet

Actress Rosamund Pike is being slammed against the bone and ochre tiles of the Edgware Road underpass. She slides down to the wet floor and the dance continues: eyes flickering, face a mask of horror, she is jerked and hurled around the fluorescent-lit tunnel, her body and gaze compelled by a floating metal orb which has her possessed. This is the music video for Massive Attack and Young Fathers’ “Voodoo in My Blood,” and it’s closely based on a disturbing scene from the 1981 Andrzej Żuławski film Possession. While demonic possession is the horror focus of these pieces, they both contain another daunting element of fear: the setting and space of the underpass itself. Pike’s wariness as she leaves the light of day and descends into the underpass—even before encountering the orb, the fearful glance she throws over her shoulder—these tics and responses can be instinctively understood by those who find themselves vulnerable in public space. Even the kink in the tunnel wall roughly catches her arm like a stranger shouldering by, reinforcing the hostility of her environment. Pike’s body language entering the Edgware Road underpass is an illustration of the relationship between identity and space; the all-too-familiar alertness of a womxn walking alone through an environment coded unsafe for womxn. Her behaviour entering the tunnel is possibly scarier than the possession of her body that follows, because the space itself and the sense of danger it invokes are uncomfortably relatable. We’re watching her enter a space that sets up a narrative of danger to womxn the moment it appears on our screens. For many of us, it’s a frightening, familiar experience.

Despite sounding like a very outdated concept, gendered spaces still exist today. Certain environments and infrastructures are charged with greater danger or safety, hostility or inclusion, depending on an individual’s gender, race, or particular privilege. Places like tunnels, alleyways, and forests all draw associations with the potential of assault for lone womxn, to the extent that they and other marginalised people who are often attacked in such spaces frequently face blame by the media and society at large for being there in the first place. The onus is on them for daring to enter the space, rather than on the space-makers themselves, whom we trust to code and program our shared environments for better safety. Despite the fact that these areas are public and despite the fact that we all deserve the right to roam freely regardless of gender or colour, age-old stereotypes about who “belongs” in a given environment are used to exclude womxn, people of colour, transgender people, the less-abled, and other underprivileged groups who, by occupying such coded spaces, are supposedly responsible for putting themselves at risk. Digital space, too, reflects this: online misogynistic harassment campaigns such as Gamergate, which attempt to hound womxn out of male-dominated spheres, are all too common. [1]

If spatial design can maintain climates of danger and compliance for specific groups, then surely fresh design initiatives and ethical technological solutions are the remedies we need. With advances in automation, the Internet of Things, and the ever-increasing presence of artificially intelligent assistants aiding and instructing us, technology and digital spaces are going to govern and intervene in our physical spaces more than ever. In a world where approaching the maw of a London underpass puts womxn on alert, sensor technology, facial recognition, and even basic alarm technology have the potential to recode spaces of threat as spaces of visibility, sending messages of zero tolerance to would-be predators, and of inclusion to would-be victims.

Oppressive Design Under the Guise of Progress

However, initiatives using such technology, while often progressive in intent, frequently fail to consider existing social inequalities and can even reinforce them, bringing new dangers and limitations to our daily territories. Whether it’s the AI “judge” used in US courts that was revealed to be biased against black prisoners, [2] the sexist recruitment algorithm tested by Amazon, [3] or the recent insidious use of facial recognition to track porn actresses, innovations in AI, machine learning, and automation have proven their ability to continue recreating spaces in which marginalized groups face exclusion, contempt, and surveillance. Training digital entities on data from a world in which we are not equals results in the replication of existing biases. Designing technology without considering its use by less able-bodied people only reproduces the lack of inclusion and accessibility they are already subject to, retrofitting it for the digital age.

Buzz, a proposed design for an anti-rape wearable, [4] is an example of how non-empathic design can reinforce social vulnerabilities. A wristband that monitors alcohol intake, it flashes when the wearer has presumably drunk enough to be more vulnerable to sexual assault. Billed as a measure to reduce the risk of rape, Buzz only pastes patronizing surveillance onto the complex issue of sexual predation without actually addressing the problem or re-educating anybody about boundaries and consent. The design places responsibility on potential victims, proposing the humiliating public monitoring of alcohol consumption, and is in fact a device that abusers could use to single out victims. Here we see the intersection of personal smart technology and conduct in public space: the wearable informs the wearer, their companions, and any potential onlookers how to behave, and in doing so reinforces oppressive myths about sexuality and responsibility. With such a design opportunity before us—the use of artificially intelligent technology to inform the way we might see a given person or space in a public encounter—we can understand emerging technology as having great potential to address prejudice and inequality. Overlapping the private delivery of information with behaviour in communal spaces, advancing technology could alert us to the multiplicity of human experience—information regarding health, danger, and the different ways in which different people process social situations—the same way Buzz alerts us to its singular victim-blaming and heteronormative narrative of sexual harassment. We shouldn’t have to suffer sexist commentary on our bodies and safety from technology, compelled, like Pike by the floating orb, to behave in ways that contradict our autonomy. What, then, is the alternative? How can we design and engineer spaces and things that shirk old inequalities for inclusion, amplification, and safety?

Towards Better Design and Better Space

We can start with more empathic design. From household speakers and mobile devices to ads and fictional TV narratives, Amazon’s personal intelligence assistant (PIA) Alexa (and the presence of PIAs in general) is becoming normalized as a feature of domestic life. PIAs have also drawn criticism for reinforcing sexist gender stereotypes. Since 2018, Feminist Internet has been running our Feminist Alexa workshop, where groups of creatives gather to learn about design bias in AI and develop their own improved prototype PIAs in response. Throughout the workshops, we reiterate that these digital entities and voices now occupying our homes are not neutral—they were designed by human beings with human biases and trained on data that reflects the biases of our society. It’s tempting to interpret the output of a machine as a correct absolute, but in fact it can only draw conclusions from the data and programming given to it by humans. Bias in PIAs therefore occurs when a PIA reflects the biases of its designers or of the data it was trained on.
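
As a minimal illustration of that point, the sketch below (a toy example with an invented corpus, not any vendor’s actual training pipeline) “trains” an assistant on a small, skewed set of sentences and shows that the gendered associations it reports are simply the ones it was handed:

# Toy sketch: a "model" that can only reflect its training data.
# The corpus, role words, and pronoun list are invented for illustration.
from collections import Counter

corpus = [
    "the assistant answered softly and she apologised",
    "the assistant reminded her owner about the meeting",
    "the engineer explained the system he had designed",
    "the engineer said he would fix the server",
]

def pronoun_associations(role):
    """Count which gendered pronouns co-occur with a given role word."""
    counts = Counter()
    for sentence in corpus:
        if role in sentence:
            for word in sentence.split():
                if word in ("she", "her", "he", "him", "his"):
                    counts[word] += 1
    return counts

# The "model" knows nothing beyond these four sentences, so it dutifully
# reports that assistants read as female and engineers as male.
print(pronoun_associations("assistant"))
print(pronoun_associations("engineer"))

Nothing in the code decides that assistants are female; the skew is inherited entirely from the data, which is exactly how a much larger system inherits the skew of the society that produced its training corpus.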

It’s the characterization of PIAs as female that has sparked the strongest reaction: calm, soft, and subservient female voices personify the AI service bots. Often this is in response to market demand, but what are we saying and teaching when we assign womxnhood and femaleness to assistant bots that serve us as caretakers and companions with subservient identities? Ultimately, PIAs exist only in relation to their users and are objects, commodities—a dangerous thing to synonymize with womxnhood when we consider the disproportionate extent to which womxn suffer physical and sexual abuse, as well as objectification and fetishization in the media. Existing PIAs are also rarely designed to reprimand or re-educate abusive or misogynistic behaviour. Alexa may have a “disengage” mode, [5] but it’s still representative of female passivity and suppression—“she” simply goes into hiding when called a bitch or slut, or asked for sex. Alexa perpetuates domestic gender coding: the calm female listing chores and instructions, taking orders and remembering things so we don’t have to, processing demands and insults with equal placidity.

Beyond Bias

So how do we go about creating a PIA that isn’t implicitly biased, that doesn’t reinforce female subservience? This is where empathic design comes in. Our Feminist Alexa workshops centre on imagining a diverse range of situations and users that a service bot could help with, strategizing ideal PIA responses to the needs of underrepresented groups and identifying instances where the PIA could re-educate its user—for example through its responses to abusive language. Designing a feminist Alexa or a feminist PIA goes beyond avoiding stereotypical gender traits and must also prioritize equality and accessibility. It takes into account that the norms reinforced by the technology that surrounds us in our private or domestic settings inform the way we design and build public spaces and how we treat each other within them, whether that be the coding of the kitchen as a female domain, or a lack of consideration of the requirements of the less physically able, leading to erasure. PIAs are in a unique position, at once both an intimate part of their user’s life and a potential source of vast knowledge on shared human experience and human data.
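
As a rough sketch of what such a re-educating response could look like (the phrases and replies below are invented for illustration and do not use Amazon’s actual Alexa Skills Kit), a workshop prototype might route abusive utterances to a boundary-setting answer rather than to silence:

# Hypothetical sketch of a PIA response policy from a Feminist Alexa workshop.
# Phrases and responses are illustrative assumptions, not a real product's API.

ABUSIVE_PHRASES = {"you're a bitch", "shut up slut"}

REEDUCATING_RESPONSES = [
    "I won't respond to that. I'm an assistant, not a target for abuse.",
    "That language is abusive. Let's try the request again, respectfully.",
]

def respond(utterance: str, turn: int) -> str:
    """Answer abuse with a boundary-setting reply instead of going quiet."""
    if utterance.lower().strip() in ABUSIVE_PHRASES:
        # Rotate through re-educating replies rather than "disengaging".
        return REEDUCATING_RESPONSES[turn % len(REEDUCATING_RESPONSES)]
    return "How can I help?"

print(respond("You're a bitch", turn=0))
print(respond("What's the weather like tomorrow?", turn=1))

Even a mapping this crude makes the design choice visible: the way a PIA answers abuse is authored by its designers, not an inevitability of the technology.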

The workshop ultimately guides participants to design responses to questions beyond the usual “What’s the weather like tomorrow, Alexa?” and considers how a PIA might be able to help with things like advice, loneliness, mental health issues, and education. Participants work in groups to develop hypothetical user personas, considering needs that differ from the usual domestic tasks we see PIAs perform; previous examples include a lonely elderly woman, a young man struggling with online fake-news culture, a girl feeling guilty about her bullying habits, and a gay boy struggling to understand his developing sexuality in the context of his religious and conservative upbringing. Responsive PIA functions designed by participants have included the provision of online source checking, sex education, mindfulness and self-reflection techniques, information on local events, and music or resource recommendations, among other functions.

It is through considering this potential of PIAs to create a safe and educational space in the privacy of our homes that we can use them to impact our shared experience of public space. PIAs have an invaluable role here, as not everybody is able to leave the house to attend educational or therapeutic sessions that take place in physical spaces, or has access to the money required to partake. Both PIAs and safe, informative online spaces can combat issues of physical and financial accessibility when it comes to crucial education and rehabilitation, and can also challenge misinformation and lax regulation on the internet. UK-based non-profit Glitch is an active example, raising awareness of online abuse and how it mirrors systemic oppression in physical spaces, while also providing useful toolkits and challenging policy on online violence. It’s not a singular innovation that will recode a space like Twitter or the Edgware Road underpass, but rather an approach from multiple angles, educating people in both the private and the collective sphere on how to regard each other with less prejudice and more empathy.

With the digital ever shaping our physical spaces and the assumptions they house, we need to take advantage of emerging technology and use it to design and retrofit spaces that optimize diversity, equality, and accessibility. A physical map of restrictions overlaid with a digital map of assumption-based surveillance is not enough. As designers, makers, architects, and engineers, we must start placing marginalized groups at the core of our creations. Only then will we get to the point where robotic structures increase accessibility, where AI does not replicate prejudice, where entities of domestic labour have no gender, and where a gaping underpass is nothing more than a useful shortcut through the city.

Notes

The author uses the spelling womxn as a gesture of inclusivity.

1. Keza MacDonald, “We’ve seen Carl Benjamin’s rank misogyny before – remember Gamergate,” Guardian, May 9, 2019, https://www.theguardian.com/commentisfree/2019/may/09/gamergate-carl-benjamin-ukip-me.

2. Stephen Buranyi, “Rise of the Racist Robots – How AI Is Learning All Our Worst Impulses,” Guardian, August 8, 2017, https://www.theguardian.com/inequality/2017/aug/08/rise-of-the-racist-robots-how-ai-is-learning-all-our-worst-impulses. 

3. “Amazon Scrapped ‘Sexist AI’ Tool,” BBC News, October 10, 2018, https://www.bbc.co.uk/news/technology-45809919.

4. Mark Wilson, “Can this ob-gyn-designed wearable reduce campus sexual assault?,” Fast Company, June 11, 2018, https://www.fastcompany.com/90175137/can-this-ob-gyn-designed-wearable-reduce-campus-sexual-assault.

5. Rex Crum, “Amazon’s Alexa Can Now ‘Disengage’ If Asked Sexually Harassing Questions,” SiliconBeat, January 17, 2018, http://www.siliconbeat.com/2018/01/17/amazons-alexa-can-now-disengage-asked-sexually-harassing-questions/.


Bio

Rhiannon Williams is a poet, researcher, writer, and a co-founder of Feminist Internet. Her areas of interest include gender, liminal spaces and buffer zones, videogames, emerging and future technologies, and the impact our built surroundings have on our empathy, wellbeing, and equality.