ICA warns AI reforms ‘disjointed, hazardous’
A disjointed regulatory approach to the use of artificial intelligence and automated decision-making could create conflicting requirements, increased compliance burdens and further uncertainty, the Insurance Council of Australia has warned.
ICA says the sequencing of the reforms could mean organisations have to start including information about automated decision-making use in privacy policies before the finalisation of other regulatory processes that could affect how it is used.
“Many ICA members view this as potentially putting the cart before the horse,” it says in a submission to a Senate committee inquiry into the privacy and other legislation amendment bill.
The government is also consulting on proposed guardrails for AI and will develop further proposals as part of Privacy Act second-tranche reforms that will address automated decision-making.
“The industry considers the current disjointed approach somewhat hazardous,” ICA says. “Organisations may risk implementing changes based on current requirements, only to find that subsequent reforms require significant alterations, leading to duplication of effort and wasted resources, which would have cost implications for consumers.”
The submission also warns that if regulatory guidance is delayed or lacks specificity, organisations will operate in a vacuum, increasing the risk of non-compliance and requiring costly adjustments once guidance is finally provided.
ICA seeks transition periods for the introduction of the proposed privacy changes, including a minimum 12 months for new provisions on personal data security, retention and destruction.
In an age of rapid digital transformation, the reforms in the bill are both timely and necessary, it says.
“Safeguarding the privacy and information security of Australians is essential, and ICA recognises that the general insurance industry, which handles sensitive data for millions of Australians, plays a pivotal role in enabling these protections.”