Patients and Participants are Pasting Clinical Records into AI

It’s becoming increasingly common to ask AI questions and rely on it as an expert. We ask about our health, disability, and schemes such as the NDIS, often treating the answers as definitive advice.

This article was prompted by a recent situation where a participant’s carer raised some questions after uploading a report into AI for feedback. While the example relates to the NDIS, the same issues apply to workers compensation, CTP, personal injury, and other areas where IOH Health provides expert clinical support.

Privacy

The first consideration is privacy. In this case, the carer had uploaded the participant’s report verbatim into a consumer AI tool, apparently without recognising that the report contained highly sensitive personal and health information.

An NDIS clinical report may include diagnoses, functional limitations, daily living needs, behavioural observations, family circumstances, support arrangements, risks, and recommendations. When that information is copied into an online AI system, it may be stored, processed, reviewed, or used in ways the participant did not understand or consent to. Depending on the tool, the information may also be handled outside Australia.

This is especially important where the person uploading the report is not the participant themselves. A carer, family member, support coordinator, or advocate may have access to the report to assist the participant, but that does not automatically mean they have authority to share the report with external online platforms.

Even when the intention is helpful, uploading the full report can create an avoidable privacy risk. The safer approach is to avoid entering identifiable information into consumer AI tools, use de-identified excerpts where possible, and seek the participant’s informed consent before sharing their information with any third-party system.

Beyond Privacy

Privacy is often the first concern when using AI tools, and for good reason. But the conversation shouldn't stop there. Once information from a report has been uploaded, whether in full or as excerpts, the question shifts from "Is this safe to share?" to:
"What happens when this information is interpreted without full context?"

Missing puzzle pieces can make a world of difference

Even when privacy risks are managed, there are important limitations in how AI understands and analyses clinical information. These risks are less visible, but often more impactful, particularly when conclusions are drawn about recommendations, eligibility, or what supports someone "should" receive.

Your questions may seem relatively simple, and the responses can feel helpful: instant answers, clear explanations, and confident conclusions.

But there’s an important question:

Is the AI actually in a position to be your expert?
The appeal: fast, confident answers

AI tools are designed to:
  • provide quick responses
  • simplify complex information
  • give clear, structured explanations
For general understanding, this can be useful. But there’s a key limitation:
AI gives answers based on patterns—not on you.

AI sees Patterns
Clinicians see People

The limitation: no real understanding of your situation

AI does not:
  • meet you
  • assess your functional capacity
  • observe how you manage daily tasks
  • understand your goals, risks, or environment
Instead, it:
  • recognises patterns across many people
  • predicts what is likely to be correct
  • fills in gaps based on probability
This means its answers can sound right—but may not actually reflect your situation.

What is a Clinical Report?

A clinical report is a document prepared by a qualified health professional to support decisions and recommendations about your needs. In the NDIS context, it helps explain your functional situation, the impact of your disability, and why particular supports may be reasonable and necessary.

The report includes key information about your assessment, clinical reasoning, and the basis for recommendations. However, it cannot capture every nuance that informed the clinician's view. Some of the reasoning comes from observations, discussions, professional experience, and contextual factors that are difficult to fully reproduce in writing.

Third parties reading the report, including the NDIA, also place some inferred weight on the assumed expertise, qualifications, and professional judgement of the report author. In other words, the report is not assessed only by the words on the page, but also in light of the clinician's role as an expert who has assessed your situation and formed a professional opinion.

This matters when a report is reviewed by an AI tool. If those additional details are not in the written report, and if the tool does not properly account for the author's expertise and professional judgement, they are also missing from the AI's "probability equation". The AI may then place more weight on what is common or typical, while overlooking the individual and professional factors that make a recommendation appropriate for you.

A clinical report should therefore be understood as expert evidence supporting decisions and recommendations, not as a complete record of every factor considered, and not as a document that contains the full clinical context.

What AI is useful for

AI can still be helpful when used carefully:
  • understanding terminology
  • getting a high-level summary
  • preparing questions to ask your clinician
Used this way, it can support—not replace—understanding.

What to do if something doesn’t make sense

If you read your report (or an AI summary) and something feels unclear or incorrect, the best next step is simple:
Ask your Clinician.
They can:
  • explain the reasoning behind recommendations
  • clarify how conclusions were reached
  • connect the report to your specific situation

The bottom line

AI tools are powerful, but they don’t have:
  • your lived experience
  • your full assessment context
  • your clinician’s professional judgement
Your report is based on all three. So while AI can help you understand parts of it, it shouldn’t be relied on to judge or reinterpret it in full.
