27 September 2023

Human Rights concern at non-human AI


The Australian Human Rights Commission (AHRC) has found that 46 per cent of people in Australia were not aware that the Government makes important decisions about them using artificial intelligence (AI).

The Commission said that older people, those not in paid employment and lower-income earners were least aware that Government used AI to make decisions.

“However, people in these groups are more likely to be affected by such decision making,” the Commission said.

It said research on attitudes towards AI also showed strong public demand for the technology to be used accountably and transparently, so that human rights were protected.

Human Rights Commissioner Edward Santow said Australia should not beta test new technology on vulnerable groups in society.

“Where AI is used in decision making, the decisions can be harder to understand,” Mr Santow said.

“It can also be more difficult to prove when such decisions are unlawful or unfair,” he said.

“The research we commissioned shows that people want to see new technology being used in ways that are transparent and understandable, and they want to be informed about it.”

“And that is clearly not happening because almost half of the people we polled didn’t even know it was happening.”

Mr Santow said Australians were not opposed to new technology, but they recognised that AI could make mistakes.

“Australians want laws that promote human rights and accountability to be applied rigorously to AI and other new tech,” he said.

“The overwhelming majority of people polled by the Commission (88 per cent) want to be able to understand how AI is used on them, by being given reasons or an explanation for AI-informed decisions that affect them.”

He said that where AI was used to make a decision that might be unlawful or otherwise wrong, 87 per cent of those polled said it was ‘very important’ or ‘quite important’ to be able to appeal the decision.

“Between 41 per cent and 48 per cent of participants said they would have ‘a lot’ more trust in automated decisions by Government if there were oversight measures put in place including human checks, limitations on personal information sharing, and stronger laws to protect people’s human rights,” Mr Santow said.
