Can we apply restorative justice to online content moderation?
Lindsay Blackwell, a Ph.D. student and researcher at Facebook, ran an experiment to test whether restorative moderation practices could be applied online to reduce the internet's toxicity.
Web communities apply strict rules to safeguard users, but members are often banned without any exploration of the background to their rule-breaking behaviour. Reddit, a social network, was chosen for this experiment. Reddit relies on volunteer moderators to keep its online communities harassment-free, to ensure members avoid offensive language, and to enforce the platform's communication rules.
Three banned users were selected from a Christian community. Chat forums were set up for mediation sessions facilitated by the researchers. As a restorative process, the community moderator and each rule-breaking user discussed how they had been affected by the ‘injustice’ and whether the user understood why they had been barred.
The experiment had mixed results: one session failed because of the mediator's inexperience, one user was too angry to discuss things, but the third was a success.
The successful session led to the user regaining access to the community after he acknowledged that others had viewed his behaviour as violent, and he apologized for it. He said, ‘I thought someone else was the instigator and I felt ganged-up on or something. But…looks like I was the instigator’. His strong feelings had come across to others as aggression, and helping him understand this made him more conscious of his interactions.
Mediating online is hard. Repairing online harassment, abuse and violence demands a huge investment of time and effort, but healthy forums need a human touch. ‘We will never effectively reduce online harassment unless we address the underlying motivations for participating in abusive behaviour, and having reformed violators go on to model prosocial norms is an incredible bonus’, said Ms. Blackwell.