A technical paper by the Australian Human Rights Commission, produced in collaboration with several partner organisations, demonstrates how businesses can identify algorithmic bias in artificial intelligence (AI) and proposes steps they can take to address the problem.
Human Rights Commissioner Edward Santow said that, with companies increasingly using AI for decision-making in everything from pricing to recruitment, the paper, Addressing the Problem of Algorithmic Bias, explored how these decision-making systems could produce unfair outcomes.
“The technical paper also offers practical guidance for companies to ensure that when they use AI systems, their decisions are fair, accurate and comply with human rights,” Mr Santow said.
“Human rights should be considered whenever a company uses new technology, like AI, to make important decisions.”
He said AI promised better, smarter decision-making, but it could also cause real harm.
“Unless we fully address the risk of algorithmic bias, the great promise of AI will be hollow,” Mr Santow said.
“Algorithmic bias can arise in many ways. Sometimes the problem is with the design of the AI-powered decision-making tool itself. Sometimes the problem lies with the data-set that was used to train the AI tool. It often results in customers and others being unfairly treated.”
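To make the second failure mode concrete, here is a minimal sketch, not drawn from the paper itself, of how a model trained on historically skewed data can reproduce that skew, and how a simple disparate-impact check can surface it. All names, distributions and thresholds are illustrative assumptions.

```python
# Illustrative sketch (not from the Commission's paper): a skewed training
# history teaches a model to disadvantage one group, and a disparate-impact
# ratio flags the resulting unfairness.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Synthetic applicants: a protected attribute and one legitimate feature.
group = rng.integers(0, 2, n)   # two demographic groups, labelled 0 and 1
skill = rng.normal(0, 1, n)     # the attribute that *should* drive decisions

# Historically biased labels: group 1 was approved less often at equal skill.
historic_approval = (skill + np.where(group == 1, -0.8, 0.0)
                     + rng.normal(0, 0.5, n)) > 0

# A model fitted to that history learns and reproduces the bias.
X = np.column_stack([skill, group])
model = LogisticRegression().fit(X, historic_approval)
pred = model.predict(X)

# Disparate-impact check: compare approval rates across the two groups.
rates = [pred[group == g].mean() for g in (0, 1)]
print(f"approval rates: group0={rates[0]:.2f}, group1={rates[1]:.2f}, "
      f"ratio={rates[1] / rates[0]:.2f}")  # a ratio well below 1 flags bias
```

A ratio near 1 would indicate comparable treatment; the deliberately skewed history above drives it well below that, which is the kind of signal the paper argues businesses should look for.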
However, Bill Simpson-Young, Chief Executive of the Gradient Institute, one of the partners in the report, said the good news was that algorithmic biases in AI systems could be identified and steps taken to address them.
“Responsible use of AI must start while a system is under development and certainly before it is used in a live scenario. We hope this paper will provide the technical insight developers and businesses need to help their algorithms operate more ethically,” Mr Simpson-Young said.
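One way to act on that advice before a system goes live is a pre-deployment fairness gate. The sketch below is a hypothetical example, not from the paper: it assumes you already hold validation-set predictions and the matching group labels, and the 0.8 cutoff echoes the common "four-fifths" rule of thumb rather than any figure the partners endorse.

```python
# Hypothetical pre-release fairness gate (assumed design, not from the paper).
import numpy as np

def disparate_impact_ratio(pred: np.ndarray, group: np.ndarray) -> float:
    """Ratio of the lowest to the highest positive-outcome rate per group."""
    rates = [pred[group == g].mean() for g in np.unique(group)]
    return min(rates) / max(rates)

def release_gate(pred: np.ndarray, group: np.ndarray,
                 threshold: float = 0.8) -> None:
    """Block deployment when the disparate-impact ratio falls below threshold."""
    ratio = disparate_impact_ratio(pred, group)
    if ratio < threshold:
        raise RuntimeError(f"Fairness gate failed: disparate-impact ratio "
                           f"{ratio:.2f} is below {threshold}")

# Example validation set where group 1 is approved far less often (0.2 vs 0.8).
pred = np.array([1, 1, 1, 0, 1, 0, 0, 0, 1, 0])
group = np.array([0, 0, 0, 0, 0, 1, 1, 1, 1, 1])
release_gate(pred, group)  # raises: ratio 0.25 is below the 0.8 threshold
```

Running such a check as part of development, rather than after launch, is in the spirit of Mr Simpson-Young's point that responsible use must begin before a system reaches a live scenario.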
The paper, the first of its kind in Australia, is the result of collaboration between the Australian Human Rights Commission and the Gradient Institute, the Consumer Policy Research Centre, CHOICE and the Commonwealth Scientific and Industrial Research Organisation’s Data61.