Unless you’ve been overseas or disconnected from all forms of electronic communication, it won’t have escaped your notice that Australia’s new data breach notification legislation, the Notifiable Data Breaches (NDB) scheme, came into effect on February 22nd.
Parallels have already been drawn with the European Union’s General Data Protection Regulation (GDPR), due to come into effect in May. However, there is one important aspect of the GDPR that isn’t reflected in the Australian legislation yet is still highly relevant to Australian businesses. As we’ve mentioned before, being outside the EU does not exempt Australian organisations from the GDPR’s requirements.
A closer look at AI decision-making
As Norton Rose Fulbright lawyers Sven Jacobs and Christoph Ritzer noted in a November 2017 blog post, “[The GDPR] is not just one of the biggest changes to the regulatory framework of data protection … it also specifically addresses automated individual decision-making, including profiling.”
They explained: “Article 29 Data Protection Working Party (WP 29) has recently adopted Guidelines on Automated Decision-making, which are likely to profoundly impact AI-based business models.
“Automated decision-making is often defined as the ability to make decisions by technological means without human involvement. This definition also includes so-called artificial general intelligence.”
They continued: “The WP 29 acknowledges that ‘the growth and complexity of machine-learning can make it challenging to understand how an automated decision-making process works’, but nevertheless it expects data controllers using such processes to ‘find simple ways to tell the data subject about the rationale behind, or the criteria relied on in reaching the decision without necessarily always attempting a complex explanation of the algorithms used or disclosure of the full algorithm’ and that ‘complexity is no excuse for failing to provide information to the data subject’.”
And for Australian organisations?
What this boils down to, as two local AI experts have recently highlighted, is that organisations must be able to explain the decisions arrived at by their AI tools.
For example, many organisations now use automated tools to screen job applications and produce a shortlist of candidates. If challenged by someone who felt they had been discriminated against, the organisation would have to demonstrate there were no inbuilt biases in its algorithms.
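As a rough illustration of what such a demonstration might involve, the Python sketch below compares shortlisting rates across applicant groups using the common “four-fifths” rule of thumb. The data, column names and threshold are hypothetical, not a prescribed compliance test.

```python
# Hypothetical check of an automated shortlisting tool for disparate impact.
# The DataFrame, column names and 80% threshold are illustrative only.
import pandas as pd

applications = pd.DataFrame({
    "gender":      ["F", "F", "F", "F", "M", "M", "M", "M"],
    "shortlisted": [1,    0,   0,   1,   1,   1,   0,   1],
})

# Selection rate per group: the proportion of applicants the tool shortlisted.
rates = applications.groupby("gender")["shortlisted"].mean()

# "Four-fifths" rule of thumb: flag any group whose selection rate falls
# below 80% of the most favoured group's rate.
impact_ratio = rates / rates.max()
flagged = impact_ratio[impact_ratio < 0.8]

print(rates)
print(flagged if not flagged.empty else "No group falls below the 80% threshold.")
```

A check like this is only a starting point; a fuller review would also examine the features the tool relies on and how they correlate with protected attributes.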
And just because Australia is not in the EU does not mean Australian organisations can ignore the requirements of the GDPR.
Habib Baluwala, newly appointed data scientist at Auckland-based consultancy Soltius, said the GDPR would require all organisations making decisions using AI or data science to explain those decisions. Organisations dealing with clients based in the EU, he added, would need to comply with the GDPR and provide transparency around the algorithms used to make those decisions.
His views were echoed by Richard Kimber, speaking at the launch of his new Australian company Daisee, which uses AI to help businesses identify peaks and troughs in supply and demand, amongst other things.
Kimber described AI as “part art, part science”, saying: “Regulators are getting more and more interested in this because they don’t want black box solutions where you can’t say why a customer was rejected for a loan or why someone was turned down for the job. The GDPR will require AI systems to be explainable.”
He said Daisee was working to “minimise the art and maximise the science so the answers are replicable, particularly when you need to be able to explain the result.”
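To make the loan example concrete, one simple way to surface the rationale behind an individual decision is to report each feature’s contribution to a linear model’s score. The Python sketch below does this with scikit-learn’s logistic regression for an invented loan application; the features, training data and applicant are placeholders, not Daisee’s approach.

```python
# Illustrative only: explaining a single automated loan decision with a
# linear model, where each feature's contribution = coefficient * value.
# Features, training data and the applicant are invented for the example.
import numpy as np
from sklearn.linear_model import LogisticRegression

feature_names = ["income_k", "credit_score", "existing_debt_k"]

# Tiny synthetic training set: [income ($k), credit score, existing debt ($k)]
X = np.array([
    [30, 550, 40], [45, 600, 35], [60, 680, 20], [80, 720, 10],
    [95, 750,  5], [40, 580, 45], [70, 700, 15], [55, 640, 25],
])
y = np.array([0, 0, 1, 1, 1, 0, 1, 0])  # 1 = loan approved

model = LogisticRegression(max_iter=1000).fit(X, y)

applicant = np.array([[50, 610, 30]])
decision = model.predict(applicant)[0]

# Each feature's contribution to the decision score (log-odds).
contributions = model.coef_[0] * applicant[0]
for name, value in sorted(zip(feature_names, contributions), key=lambda p: p[1]):
    print(f"{name}: {value:+.3f}")
print("Decision:", "approved" if decision == 1 else "rejected")
```

More complex models need dedicated explanation techniques, but the principle is the same: pair each automated decision with the main criteria that drove it, in terms a data subject can understand.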
Creating advanced, secure technology solutions in Australia since 1986
Our expert team has developed and deployed secure cloud solutions across Australia. We are also building on our recent deployment of Amazon Lex as the underlying technology for chatbots that hold automated AI text conversations. Contact us today to find out more.
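For readers curious what driving a chatbot with Amazon Lex can look like in code, the snippet below sketches a minimal text exchange using the AWS SDK for Python (boto3). The bot name, alias, region and user ID are placeholders rather than details of any real deployment.

```python
# Minimal sketch of sending a user message to an Amazon Lex bot via boto3.
# The bot name, alias, region and user ID below are placeholders.
import boto3

lex = boto3.client("lex-runtime", region_name="us-east-1")

response = lex.post_text(
    botName="ExampleSupportBot",   # placeholder bot name
    botAlias="prod",               # placeholder alias
    userId="demo-user-001",        # any stable ID for the conversation session
    inputText="What are your opening hours?",
)

# Lex returns the matched intent, any filled slots and the bot's reply text.
print(response.get("intentName"))
print(response.get("slots"))
print(response.get("message"))
```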