Doctors call for AI rules to prevent medical mistakes

Australian doctors are using AI tools, including ChatGPT, in patient care every day despite a lack of guidelines or oversight from the nation’s medical regulator, an inquiry has heard.

Healthcare experts revealed the issue at the Senate’s Adopting Artificial Intelligence inquiry on Wednesday, while calling for restrictions on the use of the technology in healthcare to give doctors, nurses and patients greater confidence.

However, technology firms warned the inquiry to carefully consider strict rules on the use of AI to avoid putting Australian innovation at a disadvantage.

Australian Alliance for Artificial Intelligence in Healthcare director Enrico Coiera told the inquiry AI technology was being used throughout medical practices despite the absence of guidelines.

“AI is already in routine use in the healthcare system,” he said.

“Digital scribes are used daily in general practice to listen in on patient conversations and summarise records for them automatically.”

But Professor Farah Magrabi, from Macquarie University’s Australian Institute of Health Innovation, said the technology was being used without professional oversight as it did not qualify for scrutiny from the Therapeutic Goods Administration (TGA).

“They fall through a gap at the moment because software that is just there for record-keeping is not subject to the TGA’s medical device regulations,” she said.

SA Health Excellence and Innovation in Health commissioner Keith McNeill said some doctors were taking the use of generative AI tools further.

“The younger generations… they’re actually already using ChatGPT to generate their discharge summaries, they’re just not telling us,” he said.

“What we need now are the guardrails around it so that people can use these tools safely and effectively.”

Prof Coiera submitted 16 recommendations for AI rules to the inquiry, including the establishment of a national AI healthcare authority.

If used with oversight, he said, AI technology had the potential to improve medical treatments.

“We’re looking at machine-learning to identify your biomarker patterns and work out which drug is going to be the right drug for you,” he said.

“Imagine that replicated across every major disease class – we’re talking about a revolution over the next decade or two in the way we target treatment to patients.”

Earlier, representatives from tech companies warned senators that tight AI rules could make it difficult for Australian innovators to compete with their US rivals.

Trellis Data chief executive Michael Gately said laws that forced AI developers to pay content creators for their work or reveal data sources could hamper local firms.

“My preference would have always been to ensure that people are paid for their work under the Copyright Act … but I think that would be difficult to implement and would probably impact Australian companies unfairly against global competition,” Mr Gately said.

Nuvento chief executive David Hohnke agreed, telling the inquiry AI rules in Australia should work alongside regulations in Europe and the US.

“If we do this in isolation, we could harm ourselves and people will go, ‘so what, I’ll just use ChatGPT and throw my documents up there and breach our company requirements’,” he said.

But Atlassian global public policy head David Masters said Australia did have scope to set standards for AI use and introduce legal reforms.

The Senate inquiry is expected to issue its findings on the impact of AI in September.

 

Jennifer Dudley-Nicholson
(Australian Associated Press)
