Kava—NO

Or, what British feminists can learn from Brett Kavanaugh's high school yearbook and its implications for our digital future.

 

by Rachel Huber

 

For many women following politics in Washington this autumn, the story of Brett Kavanaugh’s disputed nomination for a seat on the US Supreme Court proved a sadly familiar tale with a sadly familiar outcome: woman gives credible testimony of sexual violence; woman shakes the foundations of white male entitlement and privilege upon which much political power is built; woman is ignored.


Among those perplexed by the showdown that engulfed US politics were millennials. The hearings publicly dissected Kavanaugh’s high school yearbook and a 1982 calendar, yet those used to instant access to digital archives were left puzzled by the paucity of evidence that could cast light on his teenage behaviours, and heartened that contemporary online archiving has the potential to hold future Kavanaughs to account. Would he have been able to characterise misogynist descriptions of female classmates in his yearbook as innocent teenage banter if other social media posts had suggested otherwise? And would he have been able to deny he was at a party when a geo-tagged social media post could have placed him there?

But should we rush towards cyber-optimism?

The reality of our online future is complex, and the power structures which shape many of our online connections are not so different from those of Washington DC. Why? Because the technologies we use to track, trace and archive our lives online are interwoven with the same biases and discrimination threaded through the rest of society. Algorithms - automated decision-making software - aren’t just the reason your Instagram reach has tanked or Facebook shows you certain videos.

The truth is that many of us are sufficiently privileged to ignore the ways algorithms are gatekeepers to women’s identities.

Sounds like techno-alarmism? US academic Safiya Umoja Noble writes extensively on the flipside of big data, offering an essential intersectional outlook on what we often treat as a trusted public service: search. She shows how algorithms reinforce existing oppressive social relationships, deepen inequalities and embed discrimination in computer code. Google’s profit system, for example, depends on keyword search terms which marginalise black women via reductive, racist search results. Likewise, search engines perpetuate racialised depictions of men and boys.


In Algorithms of Oppression: How Search Engines Reinforce Racism, Noble serves up a powerful takedown of the way Yelp search terms ‘other’ people of colour, rendering black culture invisible by forcing users to mould themselves to fit white-centric language and search terms. Her discussion of digital sex crimes exposes how online archives are turned against women.

She describes women being fired from jobs after ex-husbands and ex-boyfriends shared revenge porn or online evidence of past work as exotic dancers. This shows how online archives can be weaponised to deny women jobs, rights and power.

EU citizens have the right to petition Google to take down this kind of content. Those in the US do not, and must pay to have the same damaging content removed. Hardly a neutral, democratic online landscape forging meritocracy, and all the more worrying as Brexit looms and Britons face losing the digital protections offered under EU law.

So if we are hopeful for a digital future where online archives can hold individuals to account, we have to balance that hope with concern about who controls the man-made, fallible algorithms that organise our online lives. A $90 million Google exit package for an executive accused of sexual coercion is far from reassuring.

The ethical boundary between censoring history online and the right to privacy is rich ground for debate. But these are debates we must have, and they should be on the radar of all feminists. It’s time to unveil the workings of digital indexing so that all men and all women can live online and offline fairly.

 
 

To stay informed, look to:
Non-profit UK digital privacy organisation, Privacy International

Or read:
Algorithms of Oppression: How Search Engines Reinforce Racism, Safiya Umoja Noble, New York University Press, 2018.