
Artificial Intelligence in Allied Health Newsletter - March 2025
Mar 11, 2025
Hi fellow AHPs!
I hope you’re settled into the new year by now. I’m well into the swing of things, with upcoming conferences, webinars, team workshops and course writing keeping me busy. In February I also started the Hugging Face AI Agents course, which has so far taught me a lot about how AI Agents work, and I’ve even had the opportunity to build one (using coding, which I haven’t done in about 20 years!). AOTA Inspire 2025 in Philadelphia is rapidly approaching and will happen between now and when I send out my April newsletter. I’m very excited to be presenting a short course on implementing AI into practice, as well as meeting with my US and Canadian colleagues. If you’re attending, I’d love to meet in person and discuss all things innovation, technology and OT. Please hit reply and let me know you’ll be there! Enjoy this month’s newsletter, it’s packed to the brim!
Industry News
Updated ChatGPT Model
A week or so before writing this newsletter, OpenAI introduced its newest ChatGPT model – GPT-4.5. It’s currently a research preview, so it is only available to Pro users and developers at the moment.
The new model has an improved ability to recognise patterns, draw connections, and generate creative insights, and is the “most human-like chatbot yet.” It can engage in more natural conversations with a better understanding of nuance. GPT-4.5 has a better grasp of subtle cues, implicit expectations and “what humans mean.” This means it excels in creative tasks, including writing and design. OpenAI have indicated that it has the lowest hallucination rate of all the OpenAI models (it still sits at 37.1%, so hallucinations remain a significant feature of AI use).
When it is available to you (depending on your membership tier), you’ll be able to select it in the model selector dropdown on your device.
You can read more here: https://openai.com/index/introducing-gpt-4-5/
Legal Uses of Generative AI
On the 3rd of February, the Chief Justice of NSW, Australia released a “Supreme Court Practice Note” on the Use of Generative Artificial Intelligence (Gen AI). It aims to establish guidelines for the use of Generative AI in legal proceedings within the state of NSW. It specifies acceptable and prohibited uses of AI tools due to concerns around accuracy, confidentiality, and ethical obligations in legal practice. It states that using Gen AI to generate things such as chronologies, indexes, briefs and document summaries is permitted, while drafting witness statements and affidavits is prohibited.
Much of this I can definitely agree with, but where it gets interesting for us allied health folk is where it talks about Expert Reports, which our reports would typically fall under. It states that Gen AI must not be used to draft or prepare the content of an expert report without prior approval from the court. Its rationale is that an Expert Report must reflect the expert’s opinion and reasoning, not AI-generated content. Approval for AI use can be sought from the court, and a disclosure statement must be included specifying what the AI tool was used for, which tool was used (including its privacy settings), and the prompts utilised.
Implications for Practice
There are many nuances to this sort of broad statement, which in my opinion has taken a very conservative approach to Gen AI use for Expert Reports. One thing is clear: if you are working in NSW, Australia and submitting a medico-legal report, an expert witness report or a report specifically to be used in court, you must seek pre-approval for any AI use and disclose the usage. Outside of this, it’s nuanced and still unclear to me. There’s been discussion amongst our local disability community about whether this means a disclaimer should be added to all reports where AI has been used, just in case one ever ends up in court. In Australia, AHPRA (our regulatory body) states you don’t have to disclose AI use unless the data is stored, but that transparency is best practice. I agree with this stance, and most jurisdictions around the world are saying transparency is best practice. The question remains, though: what level of transparency?
AI and Clinician Wellbeing: Friend or Foe?
I wrote this LinkedIn article recently, unpacking how AI can be judged as friend or foe of clinicians. It’s a 5-minute read; if you check it out, please let me know your thoughts. Read it here: https://www.linkedin.com/pulse/ai-clinician-wellbeing-friend-foe-jessica-francis-5fnic/?trackingId=ZajsoFtXSAi9MSwjq%2FTGog%3D%3D
Research
A piece of research published in February out of the University of Westminster, UK, looked at the use of conversational artificial intelligence chatbots to deliver personalised health education and then refer users for medical consultations. The intention was to see if this type of tool (chatbot-assisted self-assessment) could be used with minority groups to overcome barriers to healthcare.
They found that the chatbots were generally well-accepted for self-assessment, particularly among ethnic minority groups. Participants reported that they found them useful for private, anonymous, and non-judgemental discussions about sensitive health topics. Most participants were comfortable disclosing sensitive information provided anonymity was assured; however, there was hesitancy in sharing personally identifiable details. Participants found the chatbots accessible, stigma-free, and convenient for obtaining health advice. In particular, their value was observed in reducing barriers to discussing sexual health and facilitating access to screening and preventative healthcare.
Users expressed concerns about the lack of human empathy and trust in the chatbot-generated advice, and some participants experienced digital literacy barriers.
Relevance to Practice
To me, this shows that AI has a part to play alongside allied health professionals, and has the capacity to increase the accessibility of healthcare. Simultaneously, this research reinforces the inability of AI to replace the human element of healthcare delivery, and the imperative for human oversight, ethical AI design, and culturally sensitive and inclusive co-design of AI tools.
Find it here: https://journals.plos.org/digitalhealth/article?id=10.1371/journal.pdig.0000724#abstract0
Prompt of the Month
Each month, I’ll be sharing a useful AI prompt designed to enhance your practice. This month, let’s use AI to craft an elevator pitch explaining your allied health profession to someone else.
Prompt:
“Outline a 3-minute elevator pitch explaining my role as an [allied health profession] to someone unfamiliar with the field.”
Tried this prompt? Send me an email and let me know how you found it!
AI Tool Spotlight
Napkin.ai
“Get visuals from your text” – A shoutout to Liam Fagan (fellow Aussie OT) for showing me this tool. Napkin.ai simply creates visuals from your text, including infographics, diagrams, and flowcharts. You write or paste your text into the tool, then click, and it generates visuals for you in an instant that you can then customise and edit. Once you finish, you can export in your preferred format, making them easy to integrate into presentations, websites, social media or documents.
A quick note about privacy: Napkin.ai is not a healthcare-specific tool, and makes no mention of HIPAA, GDPR or APP compliance, so be cautious about what data you put into the tool.
Upcoming Events
Live Webinar: AI Scribes for Allied Health
Struggling with documentation overload? AI scribes can save you time, reduce admin work, and let you focus more on client care, but how do they actually work? Join me for a 1-hour live webinar, where I’ll break down what AI scribes are, how to choose the right one, and the key ethical considerations.
When: Monday 24th March 2025. 12-1pm AEDT (Australian Eastern Daylight Time – Sydney/Melbourne time).
Find your timezone here: https://www.timeanddate.com/worldclock/converter.html
Where: Online via Zoom
Cost: $49
The session will be recorded and available to those unable to attend live.
More Information & Register here: https://www.thetrainingclub.com.au/ai-scribes
Free: The Group Chat – group supervision/ community of practice
When: Thursday, 28th March 2025. 12-1pm AEDT (Australian Eastern Daylight Time – Sydney/Melbourne time).
Find your timezone here: https://www.timeanddate.com/worldclock/converter.html
Where: Online via Zoom
Cost: Free
Join me for a free, interactive "group chat" to share all things AI in allied health. It's a group supervision, community of practice, or interest group: whatever you want to call it, I'm calling it the 'group chat.' This session will be an open, informal, and mostly unstructured discussion for brainstorming, sharing ideas and challenges, and peer learning!
The session will be recorded and available to those unable to attend live.
Sign up here: https://www.thetrainingclub.com.au/offers/GomvYPKz
Get the AI in Allied Health Newsletter
A free monthly round up of what's happening in AI, tips, research and more, straight to your inbox!
Unsubscribe at any time.