
- Posted: October 11, 2023
- Audrey Mora
By now, you’ve probably read or watched a bunch of articles and videos telling you how to use ChatGPT for work. You might even have played with it yourself to see what the fuss is all about. With AI being added to every facet of our lives, we decided to take a look at whether AI could actually be used for your mental and behavioral health practice.
When we say “AI” in this post, we’re referring to generative artificial intelligence with machine learning capabilities. While ChatGPT is the primary example, it’s by no means the only one.
The Good
Automation Saves Time
AI can generate pages of text in a few seconds. It doesn’t ever stare at a blank page wondering where to start, and it can quickly and easily find information about anything. It’s tempting to have session notes done after typing a few sentences! Providers can also ask the AI to generate templates they can then tweak. Some therapists also use AI to create marketing emails to attract new clients or social media posts to save time while maintaining an online presence.
If you’re someone who battles with Excel regularly, good news: ChatGPT can help you quickly and easily clean, input, or analyze data. For therapists who never quite got the hang of formulas, this can turn a long, tedious task into a few clicks.
Formatting is Easy
Sometimes, clinicians write notes or documentation only to realize they need to reformat it to be compliant with insurance rules, state regulations, or even their own practice’s guidelines. You already did the hard work, so you can simply ask the AI to rewrite notes in a SOAP format, for example.
It Can Be Cheap
Earlier versions of ChatGPT and other models are available for free. Can’t beat that discount! And while queues made AI difficult to use when it first came out, you should no longer have an issue getting in.
The Bad
AI Isn’t Compliant
You’re no doubt well aware of how complicated it can be to stay compliant in the field of mental and behavioral health. Even seasoned professionals using specialized software can struggle with audits. ChatGPT and other AI tools are not designed for medical compliance. They’re definitely not HIPAA-compliant and cannot be trusted with client information.
You can tell AI to format documents a certain way, but that doesn’t mean it’ll do it well. If your insurance payer, for example, notices you wrote parts of your treatment plan with AI, they may very well reject your claim.
AI also uses the text it’s fed to learn. That means all your information, progress notes, and templates could be used by others. It also means that if you’re not careful, sensitive information like a client’s progress or medication could be added to the AI’s data set – and accessed by other people.
Another sticking point? Liability. We’re not quite sure who’s liable when AI messes up, but it could be you.
Ethical Concerns
Mental and behavioral health providers are continually told to do more with less. Cram more clients into your schedule, and try to use technology to minimize the paperwork burden associated with client care. But there are strong ethical concerns at play, especially with AI. Sure, your progress notes took 2 minutes, but is the AI-written version actually effective and evidence-based? Can it stand up to professional scrutiny, and does it have the client’s long-term recovery and best interests in mind? Shortcuts are tempting, but they’re not always what’s best for clients.
AI has access to a vast library of information, but at the end of the day, it doesn’t have your expertise or your empathy for the client – no matter how well it can mimic them. You could have twenty clients with anxiety, and while AI might recommend the same general treatment for all of them, you know each client might need a different approach.
Then, there’s AI bias. AI is built by people, and in addition to our own biases (that we may or may not know about), AI is trained on data that also includes systemic and statistical bias. We’re already well aware of health inequity problems, and AI could threaten to make them worse.
Hallucinations
“Hallucinations” is the name given to the mistakes AI makes when it follows a prompt to the detriment of facts or logic. It’s not actively misleading you; rather, the program is told you want something and does everything it can to give you the results you want, even if it has to make things up along the way. Sometimes, it’s partly the user’s fault. If you ask AI to write a paragraph about chocolate ice cream being the best flavor, it might ignore evidence that the most popular flavor worldwide is vanilla. Other times, it may give you a proper response to your query but provide “sources” that are made up (for example, the links may not work or send you to something unrelated). Hallucinations can also be due to an incomplete data set.
If it’s something harmless like ice cream, a hallucination isn’t too bad, but what if the AI hallucinates when asked to craft a treatment plan or progress note? AI is very good at making things look good at first glance, so it’s possible you may not catch the problem.
Changes in AI
AI is constantly learning. Beyond that, the language models often undergo changes as the technology improves, regulations emerge, or the company finds a way to make them more profitable. Free AI models, for example, are rarely the latest and greatest. While this is to be expected, it also means the technology is not always reliable.
For example, many people have taken to using AI as a therapist. As the risks of that practice came to light, AI companies tweaked their models to make it harder for users to ask for therapy. The models can still be tricked (say, by being told to act out a movie scene as a therapist), but users were frustrated by the change. So one day you may use AI to create a template for a consent form, and the next, it might tell you it’s not allowed to do so because of new restrictions.
Alternatives to AI
So, what’s a therapist to do to save time? Don’t worry, you have stable, compliant alternatives. Nowadays, EHRs often include editable templates for everything from progress notes to treatment plans. Some EHRs, like TheraNest, offer dynamic forms that reuse information you’ve already entered, minimizing redundant data entry and reducing mistakes.
Another option is to use your EHR’s Wiley Treatment Planners or Progress Notes Planners integration (if available). Wiley offers more than a thousand DSM-5-ready prewritten treatment goals, objectives, and progress notes to make documentation faster and easier. Wiley is compliant, evidence-based, and works directly within your EHR, so you don’t have to worry about hallucinations or HIPAA violations.
AI is a tool with a lot of potential, but there’s also plenty of hype pushing us to incorporate it into every aspect of our work and lives – even when we’re not sure it fits there. While it’s important to become familiar with technology that’s quickly becoming ubiquitous rather than to be scared of it, we should also be cognizant of AI’s limitations. Taking shortcuts is tempting, but mental and behavioral health may not be the place to do it.
TheraNest offers EHR solutions designed for and by therapists. We want to help you save time so you can care for more clients – without taking harmful shortcuts. Experience how we can help practices thrive with our free 21-day trial, no credit card required.