Shannon Liao* says Twitter has enlisted university experts to investigate how toxic speech is created on the social media platform.
Twitter has been working to combat spam and abuse on its platform for a while now, but there are still plenty of instances of toxicity on the network.
In its next step to clean up the service, Twitter has enlisted university experts to conduct an audit of its platform and figure out where echo chambers and “uncivil discourse” originate.
Back in March, Twitter put out a call for experts to measure how toxic its platform was and suggest ways to improve it.
It said finalists would be chosen in July.
Twitter now says it received more than 230 proposals, and the winners include two professors from Syracuse University in New York, one from Bocconi University in Italy, a professor from Delft University of Technology in the Netherlands, and others.
The team of researchers will be led by Dr Rebekah Tromble, an Assistant Professor at Leiden University in the Netherlands who focuses on politics in social media.
They will investigate how toxic speech is created on Twitter.
The researchers are building on previous Leiden research, which found that when a group of like-minded people gathers to discuss shared perspectives, its members are encouraged to hate those not part of the discussion, creating an echo chamber.
The researchers will see how many users exist in these echo chambers and how many users are actually talking to others with diverse perspectives.
The team will also create algorithms to track whether conversations on Twitter are merely “uncivil” or whether they veer into “intolerant” territory that could amount to hate speech.
Uncivil conversations can sometimes be problematic, but they’re also good for political dialogue, while hate speech is “inherently threatening to democracy,” according to Twitter.
The implication is that once the researchers can reliably distinguish between these two kinds of conversations, Twitter will be better equipped to target hate speech while leaving room for uncivil but legitimate political discourse.
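To make the distinction concrete, here is a loose Python sketch of what a three-way “civil / uncivil / intolerant” text classifier could look like. The labels, example posts, and modelling choices below are purely illustrative assumptions; Twitter and the research teams have not published their actual methodology.

```python
# Illustrative sketch only: a toy classifier separating "civil", "uncivil",
# and "intolerant" posts. The example texts and labels are hypothetical and
# do NOT reflect the researchers' actual data or methods.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny hand-labelled sample set (hypothetical). A real study would need
# thousands of carefully annotated tweets and precise label definitions.
texts = [
    "I respectfully disagree with this policy proposal.",         # civil
    "That take is complete garbage and you know it.",             # uncivil
    "Your argument is lazy and you should be embarrassed.",       # uncivil
    "People like you don't deserve a voice in this country.",     # intolerant
    "That entire group should be silenced for good.",             # intolerant
    "Thanks for sharing, here is some evidence to the contrary.", # civil
]
labels = ["civil", "uncivil", "uncivil", "intolerant", "intolerant", "civil"]

# TF-IDF features plus logistic regression: a deliberately simple baseline,
# not a claim about what the Leiden, Syracuse, Bocconi or Delft teams will use.
model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(max_iter=1000),
)
model.fit(texts, labels)

# Classify a new post and print the predicted category.
new_post = "Nobody from that community should be allowed to post here."
print(model.predict([new_post])[0])
```

With so few training examples the predictions are unreliable; the point is only to show the shape of the task: score a conversation and decide whether it is heated but tolerable, or crosses into speech the platform wants to remove.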
A second, smaller team, formed by professors at the University of Oxford and the University of Amsterdam, will work from a related premise: that echo chambers don’t form when people are exposed to a variety of ideas and perspectives, and that diversity breeds open-mindedness.
They will study whether the effects of positive online interaction carry over to the offline world.
There’s no timeline for when the research will come to fruition, and Twitter has said the teams are undertaking “a very ambitious task.”
* Shannon Liao is tech and culture reporter at The Verge. She tweets at @Shannon_Liao.
This article first appeared at www.theverge.com.