Patients and Participants are Pasting Clinical Records into AI

It’s becoming increasingly common to ask AI questions and rely on it as an expert. We ask about our health, disability, and schemes such as the NDIS, often treating the answers as definitive advice.

This article was prompted by a recent situation where a participant’s carer raised some questions after uploading a report into AI for feedback. While the example relates to the NDIS, the same issues apply to workers compensation, CTP, personal injury, and other areas where IOH Health provides expert clinical support.

Privacy

The first consideration is privacy. In this case, the carer had uploaded the participant’s report verbatim into a consumer AI tool, apparently without recognising that the report contained highly sensitive personal and health information.

An NDIS clinical report may include diagnoses, functional limitations, daily living needs, behavioural observations, family circumstances, support arrangements, risks, and recommendations. When that information is copied into an online AI system, it may be stored, processed, reviewed, or used in ways the participant did not understand or consent to. Depending on the tool, the information may also be handled outside Australia.

This is especially important where the person uploading the report is not the participant themselves. A carer, family member, support coordinator, or advocate may have access to the report to assist the participant, but that does not automatically mean they have authority to share the report with external online platforms.

Even when the intention is helpful, uploading the full report can create an avoidable privacy risk. The safer approach is to avoid entering identifiable information into consumer AI tools, use de-identified excerpts where possible, and seek the participant’s informed consent before sharing their information with any third-party system.

Beyond Privacy

Privacy is often the first concern when using AI tools—and for good reason. But the conversation shouldn’t stop there. Once information from a report has been uploaded—whether in full or as excerpts—the question shifts from “Is this safe to share?” to:
“What happens when this information is interpreted without full context?”
Missing puzzle pieces can make a world of difference
Even when privacy risks are managed, there are still important limitations in how AI understands and analyses clinical information. These risks are less visible, but often more impactful—particularly when conclusions are drawn about recommendations, eligibility, or what supports someone “should” receive. Your questions may seem relatively simple, but the answers depend on context the AI does not have.

It can feel helpful—instant answers, clear explanations, and confident responses.

But there’s an important question:

Is the AI actually in a position to be your expert?
The appeal: fast, confident answers

AI tools are designed to:
  • provide quick responses
  • simplify complex information
  • give clear, structured explanations
For general understanding, this can be useful. But there’s a key limitation:
AI gives answers based on patterns—not on you.

AI sees Patterns
Clinicians see People

The limitation: no real understanding of your situation

AI does not:
  • meet you
  • assess your functional capacity
  • observe how you manage daily tasks
  • understand your goals, risks, or environment
Instead, it:
  • recognises patterns across many people
  • predicts what is likely to be correct
  • fills in gaps based on probability
This means its answers can sound right—but may not actually reflect your situation.

What is a Clinical Report?

A clinical report is a document prepared by a qualified health professional to support decisions and recommendations about your needs. In the NDIS context, it helps explain your functional situation, the impact of your disability, and why particular supports may be reasonable and necessary.

The report includes key information about your assessment, clinical reasoning, and the basis for recommendations. However, it cannot capture every nuance that informed the clinician’s view. Some of the reasoning comes from observations, discussions, professional experience, and contextual factors that are difficult to fully reproduce in writing.

Third parties reading the report, including the NDIA, also place some inferred weight on the assumed expertise, qualifications, and professional judgement of the report author. In other words, the report is not assessed only by the words on the page, but also in light of the clinician’s role as an expert who has assessed your situation and formed a professional opinion.

This matters when a report is reviewed by an AI tool. If those additional details are not in the written report, and if the tool does not properly account for the author’s expertise and professional judgement, they are also missing from the AI’s “probability equation”. The AI may then place more weight on what is common or typical, while overlooking the individual and professional factors that make a recommendation appropriate for you.

A clinical report should therefore be understood as expert evidence supporting decisions and recommendations — not as a complete record of every factor considered, and not as a document that contains the full clinical context.

What AI is useful for

AI can still be helpful when used carefully:
  • understanding terminology
  • getting a high-level summary
  • preparing questions to ask your clinician
Used this way, it can support—not replace—understanding.

What to do if something doesn’t make sense

If you read your report (or an AI summary) and something feels unclear or incorrect, the best next step is simple:
Ask your Clinician.
They can:
  • explain the reasoning behind recommendations
  • clarify how conclusions were reached
  • connect the report to your specific situation

The bottom line

AI tools are powerful, but they don’t have:
  • your lived experience
  • your full assessment context
  • your clinician’s professional judgement
Your report is based on all three. So while AI can help you understand parts of it, it shouldn’t be relied on to judge or reinterpret it in full.

Thought leader – Dr Tyler Amell

It’s a pleasure for IOH to be able to support ARPA in bringing world-renowned thought leader Dr Tyler Amell to our shores to share his insights on the science of resilience and well-being.

2023 Winner of Leadership and Management Excellence at the Allied Health Awards

A big congratulations to our OT Manager, Teresa Ferreira, who won the 2023 Leadership and Management Excellence award at the national Allied Health Awards in Darwin. Teresa is a passionate leader who is motivated to see her team grow and be supported in their OT careers. She has integrated a high standard of clinical evidence, procedures and operations into IOH services, and this has been recognised at the highest level. Teresa is passionate about our MDT approach and has a love of learning that has created a growth mindset in her team, which is constantly striving to enhance the services delivered to our clients.

2023 One Door Illawarra Mental Health Luncheon

It is an honour to once again support our community by sponsoring the One Door Mental Health in the Workplace Luncheon on 6 October 2023. As always, the One Door Committee for the Illawarra has attracted a top-notch speaker to continue to shine a light on the important role of workplaces in fostering positive mental health.

REMEMBRANCE BIKE RIDE: 8-10 SEPTEMBER 2023

IOH Health is excited to be the Gold Sponsor for the 10th Anniversary Remembrance Ride supporting NSW Police Legacy. Two of our core values are Connection and Care, and NSW Police Legacy is a valuable charity that aligns with these values in the service it provides.

NSW Police Legacy was established in 1987 to provide support to police families who have suffered the loss of a loved one. Today, NSW Police Legacy continues to enhance the lives of Police Legatees by providing meaningful benefits, services, and advocacy, and prides itself on being an inclusive organisation. Regardless of whether a police officer was serving or retired, or the circumstances of their death, they support those left behind.

The Police Legacy Board and staff strive to ensure that no Legatee will ever feel forgotten or in need, and that they continue to feel connected to the Police Family.

IOH Staff are riding with the BAM (Bring A Mate) Team across a number of pelotons for this 3-Day Sydney-Canberra Ride. 

Pick a rider to donate to and show your support.

Riding from IOH:

  • Graeme Shepherd (Rehabilitation Services Manager) – DONATE
  • Scott Morton (Senior Employment Consultant) – DONATE
  • James Hogg (Managing Director) – DONATE

Local Riders

  • Register today to confirm your place in a Local Ride.
  • Registration is FREE.
  • You can form your own Local Ride, or check back on this page to join in one of the rides happening around the state (keep an eye on our social media for updates).
  • You can register for your whole group/family, if you want to. No need to register one at a time.
  • You can buy official Remembrance Bike Ride 2023 Kit and event merchandise – visit the online merch store today! Order before 9 August 2023 to ensure it arrives in time for the event.
