26 September 2023

eSafety calls time for algorithms to open up

Australia’s eSafety Commissioner has called on social media platforms and online services to open up their transparency and risk management practices to show how algorithms are affecting users, particularly children.

The Commissioner, Julie Inman Grant said her position paper Recommender systems and algorithms examined how recommender algorithms exposed children to heightened risks, “such as online sexual exploitation through friend or follower suggestions connecting them with adults, dangerous viral challenges, and harmful content loops that may contribute to negative mental health impacts”.

“Systems designed to maximise engagement may also try to draw people in through shocking and extreme content, which could normalise prejudice and hate by amplifying content and views that are misogynistic, homophobic, racist and extreme,” Ms Inman Grant said.

“Recommender algorithms can have positive benefits, like introducing us to new entertainment, experiences, friendships and ideas, based on factors such as our previous searches and online preferences,” she said.

“But we also need to consider the risks, particularly to the most vulnerable among us – our children.”

Ms Inman Grant said almost two-thirds of young people aged 14 to 17 were exposed to seriously harmful content related to drug taking, suicide, self-harm or violence.

She said one in 10 children had been the target of hate speech online, and one in four had been in online contact with an adult they didn’t know.

“Greater transparency about the data inputs, what a particular algorithm has been designed to achieve, and the outcomes for users are all critical to helping both the public and online safety regulators understand how these algorithms affect what we see and do online,” the Commissioner said.

“We also need to make tech companies accountable for the impacts, particularly on children,” she said.

“If a child is particularly vulnerable in the real world, being served up more and more content relating to self-harm and suicide, dangerous challenges, or body image and eating disorders could not only have negative mental health impacts, but also potentially place them in real physical danger.”

Ms Inman Grant recommended companies take a more proactive ‘Safety By Design’ approach by considering the risks algorithms may pose at the outset and designing in appropriate guardrails.

The eSafety Commissioner’s 19-page Position Statement can be accessed at this PS News link.
