Steadfast CEO, Human Rights Commissioner feature at Dive In 

This year’s Dive In festival for diversity and inclusion in insurance ran over three days last week, featuring industry leaders and experts in equality. 

Steadfast MD and CEO Robert Kelly was among a panel of industry leaders who discussed supporting women in leadership positions, how workplaces can support parents equitably, and the importance of “progress over perfection”. 

At the “Driving gender equity and breaking barriers for women in the workplace” session, Mr Kelly encouraged women to apply for positions even if they don’t feel they meet all the criteria, assuring them that the right organisation will support their learning. 

Accenture MD Technology Bridget Tracy also said women should feel empowered to “just say yes” to opportunities that arise, even if they experience self-doubt, and Experian MD ANZ Andrew Black spoke about the power of normalising men taking parental leave and taking on more of the parental duties in a household. 
 
On Thursday, Australian Human Rights Commissioner Lorraine Finlay spoke at a session entitled “Artificial Intelligence and Discrimination in Insurance Pricing and Underwriting”. Emerging technologies can pose significant risks to human rights and cause “really serious harms to individuals,” she said.  

“The starting point … is the acknowledgement that technology is absolutely essential but that it has to be fair.  

“Human rights need to not just be an afterthought, not just be a compliance box … but actually need to be embedded at every single step along the way. There are particular risks of unfair or discriminatory decision-making that can arise from the use of artificial intelligence … and it can arise in many forms of decision-making.”  

Australia has legislated protections against race, gender and disability discrimination that apply to the use of AI technology. At the same Dive In session, IAG Executive Manager Data and Algorithmic Ethics Chris Dolman explained that bias can be embedded in the data used for underwriting because that data reflects historical practice.  

"Perhaps the most famous example is in hiring graduates. There have been various hiring algorithms developed around the world and some of them have been criticised for gender bias because if you've hired mostly men in the past, that will be embedded in the data you've collected, then the algorithm built off that might also be biased in favour of men.  

“If your data-generating process is biased it's going to generate biased data that's going to feed into your models. So you can perhaps counteract that directly by de-biasing your data before it goes into the model.” 
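Mr Dolman’s point about de-biasing data before modelling can be made concrete with a small sketch. The reweighing approach below is one common pre-processing technique, offered purely as an illustration rather than anything attributed to IAG; all data, variable names and numbers are invented.

```python
# Illustrative sketch (not IAG's method) of "de-biasing your data before
# it goes into the model": reweighing, which weights each (group, outcome)
# cell so the protected attribute and the historical label look
# statistically independent to the learner. All data here is synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical historical hiring data: group 0 = men, group 1 = women.
group = rng.integers(0, 2, size=n)
skill = rng.normal(size=n)
# Biased history: equally skilled women were hired less often.
hired = (skill + 0.8 * (group == 0) +
         rng.normal(scale=0.5, size=n) > 0.8).astype(int)
X = np.column_stack([skill, group])

# Reweighing: weight(g, y) = P(g) * P(y) / P(g, y), so each cell carries
# the total weight it would have if group and outcome were independent.
weights = np.ones(n)
for g in (0, 1):
    for y in (0, 1):
        cell = (group == g) & (hired == y)
        if cell.any():
            weights[cell] = (group == g).mean() * (hired == y).mean() / cell.mean()

naive = LogisticRegression().fit(X, hired)
reweighed = LogisticRegression().fit(X, hired, sample_weight=weights)

# Compare predicted selection rates by group for the two models.
for name, model in [("naive", naive), ("reweighed", reweighed)]:
    pred = model.predict(X)
    print(name,
          "selection rate men:", pred[group == 0].mean().round(3),
          "women:", pred[group == 1].mean().round(3))
```

In the weighted data the protected attribute and the historical label look statistically independent, so the learner has less incentive to use group membership, or its proxies, as a shortcut for the outcome.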

Insurers can obtain more representative data, he said, and “go out and find more data on the groups less represented”.  

“You can also process your data to try and remove some of those biases,” Mr Dolman said.

Access recordings of 2023 Dive In sessions here.