Amber Robinson* says getting rid of anonymous comments isn’t the solution to the spread of hateful and abusive online posts.
At the end of last month, WPP’s Australian boss, John Steedman, called for an end to anonymous online comments.
And a ruling in the Dylan Voller case set a precedent that publishers can be held responsible for comments left on third-party platforms like Facebook.
This is a problem.
Publishers — and their followers — generate a relentless tide of Facebook comments, yet have only minimal moderation options to manage it.
One broadcaster we work with receives more than 138,000 comments each week on just one of its pages.
The only way to stop comments appearing on published posts is to add word-based filters, which automatically hide matching comments until moderators can review and publish them.
These filters were designed to hide profanity, not to be “hacked” into blocking every single comment from appearing.
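To make that workaround concrete: one common reading of the “hack” is loading the filter with everyday words so that almost any comment matches and gets held for review. Below is a minimal Python sketch of that behaviour, assuming a hypothetical word list and `should_hide` function; it illustrates the mechanism only, and is not Facebook’s actual moderation API.

```python
# A minimal sketch of the "hold everything" workaround described above.
# The word list and function are illustrative assumptions, not Facebook's
# API: page owners can only supply words whose presence hides a comment
# pending review, so stuffing the list with common words catches almost
# every comment.

COMMON_WORDS = {"the", "a", "i", "to", "and", "is", "you", "it", "of", "in"}

def should_hide(comment: str) -> bool:
    """Hide the comment for review if it contains any filtered word."""
    words = {w.strip(".,!?\"'").lower() for w in comment.split()}
    return bool(words & COMMON_WORDS)

if __name__ == "__main__":
    for comment in ["You are a disgrace", "Great piece!", "Rubbish"]:
        status = "held for review" if should_hide(comment) else "published"
        print(f"{comment!r}: {status}")
```

Even a stuffed word list lets short or unusual comments slip straight through; the tool was never meant for blanket pre-moderation.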
That’s because Facebook is built in the United States, where the right to freedom of speech is constitutionally enshrined and libel is hardly ever pursued in court.
It’s obvious that this is not a workable solution.
Either Australia’s defamation laws need to change to account for social media, as the UK’s have, or Facebook needs to provide better moderation services to publishers, whose content drives people to use and engage with the platform.
(Publishers also spend a lot of money just to have their posts seen in the newsfeed.)
Publishers can, and should, push for these changes.
But recent events also push us to ask: Are Australian online communities safe and respectful?
Do they inform and support their members?
Do they align with publishers’ organisational goals?
I’d argue no, no, and no.
If social media platforms can be compared with a public square, we have allowed preachers of hate to enter.
We’ve allowed bots disguised as humans to infiltrate our places of discussion and manipulate conversations.
We’ve given equal space to people just dropping by and those who’ve contributed to the community for years.
If we reframe our online communities as intentional, rather than accidental, spaces, there is a huge opportunity to create value and, yes, drive website traffic.
But the way to do that isn’t banning anonymous comments.
While anonymity certainly gives some trolls the guts to say nasty things, anyone who has moderated Facebook comments knows that people are also happy to say absolutely vile things with their full name on display.
(Look at some of the replies Change.org director and LGBTQI activist Sally Rugg received after appearing on Q&A last month.)
And anonymity provides a valuable opportunity to discuss sensitive topics such as mental health and abuse without the fear of having your name attached.
There is another way.
Websites like Reddit and The Guardian have tried to improve the quality of their online communities by implementing tools that let readers “upvote” some comments and “downvote” others.
Reddit also rewards pro-social behaviour with its “karma” points system, which elevates comments made by regular members whose opinions are respected by the wider group.
The Guardian features comments it believes are particularly worthwhile as a “Guardian Pick”, acting as both an incentive and an exemplar for others.
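To make those mechanics concrete, here is a toy Python sketch of how upvotes, downvotes, accumulated karma, and an editors’ pick might combine into a comment ranking; the class names, weights, and scoring formula are assumptions for illustration, not the actual algorithms of Reddit or The Guardian.

```python
# A toy sketch of the voting and karma mechanics described above.
# The scoring formula and weights are illustrative assumptions, not
# Reddit's or The Guardian's real ranking code.

from dataclasses import dataclass, field

@dataclass
class Comment:
    author: str
    text: str
    upvotes: int = 0
    downvotes: int = 0
    editors_pick: bool = False  # e.g. flagged as a "Guardian Pick"

@dataclass
class Community:
    karma: dict[str, int] = field(default_factory=dict)  # author -> karma

    def vote(self, comment: Comment, up: bool) -> None:
        """Record a vote and adjust the author's karma accordingly."""
        delta = 1 if up else -1
        if up:
            comment.upvotes += 1
        else:
            comment.downvotes += 1
        self.karma[comment.author] = self.karma.get(comment.author, 0) + delta

    def score(self, comment: Comment) -> float:
        """Rank by net votes, author reputation, and editorial curation."""
        net = comment.upvotes - comment.downvotes
        reputation = self.karma.get(comment.author, 0) * 0.1
        pick_bonus = 100 if comment.editors_pick else 0
        return net + reputation + pick_bonus

if __name__ == "__main__":
    c = Community()
    a = Comment("regular_member", "Thoughtful take on moderation.")
    b = Comment("drive_by", "lol no")
    for _ in range(5):
        c.vote(a, up=True)
    c.vote(b, up=False)
    ranked = sorted([a, b], key=c.score, reverse=True)
    print([x.author for x in ranked])  # ['regular_member', 'drive_by']
```

The design point is simply that sustained contribution and editorial curation both shift what the community sees first, rather than giving every drive-by comment equal prominence.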
Creating intentional communities isn’t easy.
Facebook needs to take the moderation concerns of publishers seriously and provide more tools and control for page owners.
Publishers need to invest not just in moderation services, but in community management.
In a sea of online news served up via algorithm, great communities make the internet a place worth visiting — and staying for.
And those communities can include anonymous comments.
* Amber Robinson is a social media strategist at Quiip.
This article first appeared at mumbrella.com.au.