Patchwork adoption of AI leading to ‘unknown risks’

Most businesses are only “scratching the surface” when using artificial intelligence, as procurement, risk and compliance, and health and safety departments lag in adoption, a survey by broker Gallagher has found. 

AI use is largely limited to the IT and sales and marketing functions, and it mostly centres on handling customer enquiries (36% of respondents), summarising documents (35%) and writing emails (32%).

Despite the uneven adoption, the survey found “vast swathes” of the workforce are using AI tools, and employers are adapting to this shift. 

Generative AI is now a “boardroom reality, with adoption taking place at a ferocious pace across business functions”, Gallagher says. 

However, there are “gaps in AI literacy and adoption across organisations”.

The survey found business leaders are more aware of AI risks now than a year ago and are making “bold and swift moves” to mitigate them. However, the sheer variety of mitigation strategies being implemented is itself creating “unknown and potentially unpredictable risks”, Gallagher warns.

Globally, 68% of business leaders see AI as an opportunity, although 11% now see it as a risk, up from 5% a year ago.

Almost two-thirds say their business has tested AI. A significant skills shortage is emerging as the top obstacle, alongside ethical considerations, while compliance issues are another major barrier. 

Gallagher global chief information officer Mark Bloom says a successful AI integration framework should bring IT, data privacy, legal, compliance, risk management and business leadership together to ensure systems are safe, ethical and compliant.  

“For a period of time, it is also recommended that a human validate the results and outputs to avoid unintended consequences,” he said.

Gallagher says GenAI can produce many useful things, but rarely with any depth of feeling or connection, and “certain roles will always require empathy and emotional intelligence”.
