Kudos to the Guardian for Conducting a Self-Analysis of Article Commentary
The Guardian stated the goal of the analysis in its own words: “As part of a series on the rising global phenomenon of online harassment, the Guardian commissioned research into the 70m comments left on its site since 2006 and discovered that of the 10 most abused writers eight are women, and the two men are black. Hear from three of those writers, explore the data and help us host better conversations online” by Becky Gardiner, Mahana Mansfield, Ian Anderson, Josh Holder, Daan Louter and Monica Ulmanu. The methodology used is presented here.
These findings come as no surprise to anyone aware of the impact of implicit gender and racial bias in our society. They are likely representative of a broader phenomenon across online journalism and digital news commentary. We fully support the stand the Guardian is taking on this issue:
But in addition to the psychological and professional harm online abuse and harassment can cause to individuals, there are social harms, too. Recent research by the Pew Centre found that not only had 40% of adults experienced harassment online but 73% had witnessed others being harassed. This must surely have a chilling effect, silencing people who might otherwise contribute to public debates – particularly women, LGBT people and people from racial or religious minorities, who see others like themselves being racially and sexually abused.
Is that the kind of culture we want to live in?
Is that the web we want?
The findings we find most pertinent are:
- Articles written by women received more blocked (i.e. abusive or disruptive) comments across almost all sections. (The Guardian defines blocked comments as those that violate its published community standards.)
- Some sections attracted more blocked comments than others. World news, Opinion and Environment had more than the average number of abusive or disruptive comments, and so did Fashion.
- Some subjects attracted more abusive or disruptive comments than others. Conversations about crosswords, cricket, horse racing and jazz were respectful; discussions about the Israel/Palestine conflict were not. Articles about feminism attracted very high levels of blocked comments, as did articles about rape.
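The core measure behind these findings — the share of comments blocked, broken down by section — is simple to sketch. The snippet below is an illustration only: the records are invented, and the Guardian's actual analysis (over its 70m-comment archive, with its own moderation metadata) is described in the methodology the article links to.

```python
from collections import defaultdict

# Hypothetical comment records as (section, was_blocked) pairs.
# These values are invented for illustration, not Guardian data.
comments = [
    ("Opinion", True), ("Opinion", False), ("Opinion", True),
    ("Crosswords", False), ("Crosswords", False),
    ("World news", True), ("World news", False),
]

def blocked_rate_by_section(records):
    """Return {section: fraction of comments that were blocked}."""
    totals = defaultdict(int)
    blocked = defaultdict(int)
    for section, was_blocked in records:
        totals[section] += 1
        if was_blocked:
            blocked[section] += 1
    return {s: blocked[s] / totals[s] for s in totals}

rates = blocked_rate_by_section(comments)
```

The same grouping, keyed on author gender instead of section, would yield the gender comparison reported in the findings above.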
They also discuss the broader impacts of negative comments across social media outlets and liken them to snowflakes that can quickly become avalanches for targeted individuals:
Anonymity disinhibits people, making some of them more likely to be abusive. Mobs can form quickly: once one abusive comment is posted, others will often pile in, competing to see who can be the most cruel. This abuse can move across platforms at great speed – from Twitter, to Facebook, to blogposts – and it can be viewed on multiple devices – the desktop at work, the mobile phone at home. To the person targeted, it can feel like the perpetrator is everywhere: at home, in the office, on the bus, in the street.
Jessica Valenti, a Guardian writer, is quoted describing what the abuse feels like: “Imagine going to work every day and walking through a gauntlet of 100 people saying ‘You’re stupid,’ ‘You’re terrible,’ ‘You suck,’ ‘I can’t believe you get paid for this.’ It’s a terrible way to go to work.”
The article includes an enlightening interactive quiz that lets readers compare which comments they would block against those the Guardian's moderators actually blocked. It is well worth a read.