

May 17, 2023

'This Is Not a Fad': Why Behavioral Health Can't Run From Generative AI

DALL-E prompted by Chris Larson

Generative AI has come for behavioral health. Now the question is this: What will the industry do about it?

Forward-thinking approaches to technology may help the behavioral health industry tackle some of its most pressing challenges. Further, early adoption may allow the industry to play a meaningful role in how generative AI is developed and implemented. Embracing cutting-edge technology may even help behavioral health providers overcome their tendency to be slow in taking up new technology.

"I’ve found that, while [new technologies] don't all work the way they say they’re going to, if you embrace it early enough, you have more of a say in how the products look and feel and act," Dale Klatzker, CEO of Gaudenzia, told BHB. "You can tailor those things with a willing partner in more ways than when something is more fully baked; there's less willingness to do."

Founded in 1968, Norristown, Pennsylvania-based Gaudenzia is the largest nonprofit addiction treatment provider in Pennsylvania and Maryland. It also operates in Delaware and has an affiliate entity in Washington, D.C. In the fiscal year that ended June 30, 2022, it generated $114 million in revenue.

While the organization was an early mover in addressing social determinants of health — it developed its own low-income housing — it has been slow to adopt technology, according to Klatzker.

Gaudenzia rolled out an electronic health record (EHR) five years ago. That is notable, given poor EHR uptake in the behavioral health space.

For example, 55% of behavioral health professionals in one study of EHR usage have never used one. Other research finds that 47% of psychiatric hospitals use EHRs and that large facilities affiliated with a health system are likelier to adopt a certified EHR.

EHRs, which are table stakes in other health care segments, see even lower adoption in some behavioral health segments.

About 37% of private mental health organizations that take Medicaid use EHRs. Additionally, 32% of private substance use disorder treatment providers accepting Medicaid use EHRs, according to a report by the Medicaid and CHIP Payment and Access Commission (MACPAC).

About 2.5 years ago, Gaudenzia engaged an AI tool provider called Eleos. Eleos uses natural language processing, a technology that can understand and encode written and spoken language, and generative AI to pull insights from therapy sessions. It automates documentation and tracks the presence and quality of specific therapy interventions.

The former feature came at a vital time in the organization's technology journey. Gaudenzia had yet to standardize its EHR documentation practices. Adopting and rolling out an Eleos tool that handled documentation for clinicians quickly and consistently presented an "incredibly powerful" solution.

"Our staff has become very dependent upon it," Klatzker said. "It saves 30% to 40% of their charting time."

A number of innovation-minded behavioral health organizations use generative AI tools today. The question is no longer whether generative AI will have a place in the behavioral health sector. Rather, it's how it will be used and how deeply it will percolate into clinical practice.

And while it's still early days for generative AI, its sophistication will swiftly improve and aid the acceleration of other technologies, creating recursive growth in the power and availability of these types of services.

"This is not a fad," David Mou, CEO of digital telehealth startup Cerebral told BHB. "This is very, very real."

While the fragmented nature of the industry — replete with mom-and-pop operations and underfunded nonprofits — presents a barrier to the proliferation of generative AI, the technology could also help cash-strapped organizations develop their own homegrown tools.

Matthew Serel, CEO and co-founder of You Are Accountable, developed a client note generation tool using OpenAI's GPT-4 model and the speech-to-text tools in Microsoft's Azure Speech Service. He also created AskAJ, a generative AI model trained on his company's documentation, policies and procedures, named after his business partner and You Are Accountable chief member officer A.J. Diaz.
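For illustration only, and not You Are Accountable's actual implementation, here is a minimal Python sketch of the kind of pipeline Serel describes: Azure speech-to-text output fed into a GPT-4 prompt that drafts a note for human review. The function names, prompt wording and single-shot transcription call are all assumptions.

```python
# Hypothetical sketch of a note-generation pipeline: transcribe a recorded
# session with Azure Speech-to-Text, then ask GPT-4 to draft a progress note
# for a clinician to review. Not You Are Accountable's actual code.
import azure.cognitiveservices.speech as speechsdk
from openai import OpenAI


def transcribe_session(audio_path: str, speech_key: str, region: str) -> str:
    """Convert a short session recording to text with Azure Speech-to-Text."""
    speech_config = speechsdk.SpeechConfig(subscription=speech_key, region=region)
    audio_config = speechsdk.audio.AudioConfig(filename=audio_path)
    recognizer = speechsdk.SpeechRecognizer(speech_config=speech_config,
                                            audio_config=audio_config)
    # recognize_once() handles only brief audio; a real tool would use
    # continuous recognition for a full-length session.
    result = recognizer.recognize_once()
    return result.text


def draft_progress_note(transcript: str) -> str:
    """Ask GPT-4 to turn the transcript into a draft note for human review."""
    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system",
             "content": "You draft concise peer-support progress notes. "
                        "A human reviewer edits and approves every note."},
            {"role": "user", "content": f"Session transcript:\n{transcript}"},
        ],
    )
    return response.choices[0].message.content


if __name__ == "__main__":
    text = transcribe_session("session.wav", speech_key="...", region="eastus")
    print(draft_progress_note(text))
```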

"Documentation is super important, but it takes time and, frankly, it's a non-preferred activity," Serel told BHB. "Nobody wants to write notes. I would much prefer my team to have an hour of downtime a day than to write notes."

You Are Accountable provides peer support services, care coordination and social reinforcement to encourage recovery for patients in addiction treatment.

Serel himself developed an employee engagement tool that helps sidestep human bias in assessing whether You Are Accountable employees are being affected by working with people in dire circumstances.

"In behavioral health, you’re working with people with their own issues and when you’re doing that you end up taking some of that on," Serel said.

Serel developed these tools himself over the course of a couple of days. The company has rolled them out and is assessing further tweaks. All the while, Serel has continued in his role as the day-to-day manager of a bootstrapped startup.

He used GitHub Copilot, an AI tool that accelerates coding, to aid in the development and implementation of the various generative AI tools behind the note system, AskAJ and the engagement scores.

Serel has a leg up on other providers because of his background. He started his career in software development as a programmer. He founded, programmed for and eventually sold AccuPoint, an EHR company that grew quickly in applied behavior analysis, to Therapy Brands.

Even as it grows, You Are Accountable enjoys a startup's inherent speed and agility.

"If I have an idea, I can push it out. I do it responsibly because I built an EHR before, and I know how to do all this stuff," Serel said. "There's lots of room for motivated people to make a big impact with little investment."

The direct impacts and practical applications of generative AI could present an attractive opportunity for behavioral health providers, Serel said, adding that other AI tools can help them become more tech-forward.

Mou sees the development of generative AI tools today moving quickly. Each week produces new and more sophisticated technology. He also expects that innovation rate to accelerate.

"This is an exponential technology," Mou said. "I think it's going to end up becoming a commodity."

Eventually, generative AI will become so commoditized for health care and mental health that the next-order question will be about implementation: 90% of working with generative AI will be about ensuring that — for example — it writes to and pulls from EHRs correctly, measuring its impact on productivity and assessing enhancements to patient success, Mou added.

Cerebral has an advantage in this effort, Mou said.

The company is well-funded, has developed an internal electronic health record and made machine learning (technology that adjusts to or makes inferences from data) a priority early on.

"Going forward with generative AI, we’re going to accelerate this, and it's going to be a key part of our strategy," Mou said. "But the field needs to have a lot of humility about what we’re trying to do here."

In brief, generative AI's central promise for behavioral health is reducing administrative burdens for organizations and employees. Seventy percent of providers say such tasks take time away from patient care, while 40% say fewer administrative tasks would improve client care. Administrative burdens are a key driver of burnout and a contributing factor to the behavioral health clinician shortage.

Klatzker said the "workforce-challenged environment" has Gaudenzia looking for "anything that an agency like Gaudenzia can do to give itself some advantage, some differentiator [where] people can focus on what they were trained to do, and not focused on the minutia or administrative burdens."

Mou also sees generative AI tools as a means to prepare patients for their first appointment from a clinical success standpoint and to curb no-shows. AI tools can take in data about a specific person and make data-backed decisions about the outreach best tailored to that patient's circumstances.

Further, generative AI tools can personalize communications about treatment plans, guidance about medication usage and other efforts to improve patient engagement and retention, Mou said.

"We all know that good patient engagement is a prerequisite of good clinical outcomes," Mou said. "Generative AI can get you so efficient at this that you can run 30 to 40 of these projects.

"All of a sudden, you look at the average patient that comes to Cerebral and, depending on their background, everyone has a personalized set of communications and treatment plans — all to optimize on patient engagement which leads to good clinical outcomes."

Jonathan Ciampi, CEO of the comprehensive virtual behavioral health company Bright Heart Health, also sees generative AI and other AI tools as a means to gather data more efficiently. Care outcomes and other process measures are increasingly, but not universally, tracked in behavioral health as interest grows in value-based care and in making behavioral health measurement more objective.

To that end, Bright Heart Health partners with Lyssn.io — a company that uses AI to assess the presence and quality of behavioral health interventions. Lyssn also measures the performance of individual clinicians by rating aspects such as empathy and the use of evidence-based treatments. Clients can also tap into an automatic note-generation feature and an AI system — trained on thousands of clinician-verified transcripts of care sessions — as part of an interactive training platform.

"Outcomes are very subjective," Ciampi said. "I think that's really a big disservice to the field. Part of that has been the historical roots of psychotherapy and things like that. But I also think some of it has been a lack of access to technology and the ability to draw out those outcomes. That's where I think AI and machine learning help."

Bright Heart Health uses Lyssn's tools to monitor the quality of its cognitive behavioral therapy (CBT) and motivational interviewing. On top of the other features, Lyssn helps the company's medical scribes code sessions for submission to health plans for reimbursement.
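Lyssn trains its own models on clinician-verified transcripts, so the following is only a generic illustration of the underlying idea: prompting a general-purpose model to score one therapist turn for a single skill. The rubric, the 0-to-4 scale and the function name are hypothetical and are not Lyssn's method.

```python
# Generic illustration only: ask a general-purpose LLM to rate one therapist
# turn for a single motivational-interviewing skill. This is NOT how Lyssn's
# system works; the rubric and 0-4 scale below are hypothetical.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

RUBRIC = (
    "Rate the therapist's reply for reflective listening on a 0-4 scale: "
    "0 = no reflection, 2 = simple reflection, 4 = complex reflection that "
    "adds meaning. Answer with the number only."
)


def rate_reflection(client_turn: str, therapist_turn: str) -> int:
    """Return a hypothetical 0-4 reflective-listening score for one exchange."""
    response = client.chat.completions.create(
        model="gpt-4",
        messages=[
            {"role": "system", "content": RUBRIC},
            {"role": "user",
             "content": f"Client: {client_turn}\nTherapist: {therapist_turn}"},
        ],
    )
    # The prompt asks for a bare number; a production system would validate.
    return int(response.choices[0].message.content.strip())


if __name__ == "__main__":
    score = rate_reflection(
        "I keep slipping up on weekends.",
        "Weekends feel like the hardest time to stay on track for you.",
    )
    print(score)
```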

Similar to Gaudenzia, Bright Heart Health seeks to differentiate itself from its competitors in the eyes of clinicians by not requiring them to chart.

Further down the line, the ease and speed of data gathering and assessment could open new avenues for the science of mental health care.

Michael Tanana, co-founder and chief technology officer of Lyssn, sees a role for AI generally — including generative AI — in assessing which therapy techniques work, when they work and for whom.

"We can categorize billions of words and utterances in context, with outcomes," Tanana said. "And that's going to lead to all sorts of things where we understand better which things we should emphasize in the training of psychology."

Mou expressed a similar sentiment. He theorized that the use of AI could eventually lead to a new, more particular type of depression diagnosis with corresponding best-in-class treatments specific to that diagnosis.

Generative AI tools may also play a key role in patient intake, triage and patient-provider matching — ensuring that patients get to the right level of care and with the right provider at the start of care.

There are still a lot of questions about the ethics of AI when it comes to treating and interacting with patients. Historically, clinicians have been skeptical of, and even resistant to, AI because of safety concerns.

For Mou and Cerebral, all generative AI programs have a human involved in the service loop. Clinicians review communications before they go out to patients.

"Health care is always going to be about human-to-human relationships," Mou said. "In my own opinion here, companies that think they can replace that are misguided, especially in mental health.

"I’m worried that some of these cowboys that are just throwing things out there and saying here's a therapist in a box are going to end up doing someone real harm and harming the name of generative AI in healthcare."

Generative AI, in Mou's assessment, can revolutionize between-session engagement. Mou, a psychiatrist, has conducted therapy sessions with patients in the past and found that patients frequently hadn't completed the exercises he'd assigned, forcing them to use session time to do so.

Customized content and engagement tools could drive better care outcomes by helping patients implement what they learn in therapy between sessions.

But he also said it's vital to ensure that communications generated by AI are transparently labeled as such. Cerebral recently released a charter of "guiding principles" for using generative AI and similar tools. Transparency is included alongside six other principles.

Mou also maintains that Cerebral does not allow patients to directly interface with AI.

"We’re not there yet; I don't think we as a field are ready for that," Mou said, adding that it always needs to be clear that people are interfacing with an AI, not a human.

But given the recursive nature of technology development and growing patient familiarity and comfort with such tools, it may be inevitable that patient-facing generative AI tools will be developed with the appropriate safeguards and subject matter expertise.

Patient safety is a huge consideration, Ciampi said. But these types of tools are inevitable if providers seek to optimize between-session training with generative AI.

"To say that you would never use or deploy Ai for a patient-facing capability, I think, is naive," Ciampi said. "We’re going to see it start at the least risky scenarios and it's going to start to evolve up and the guardrails will be put in place."

An example could be a generative AI that trains patients on the dialectical behavior therapy (DBT) coping strategy called the STOP technique. That tool could assess the completion and success of the technique and inform further therapeutic decisions. Another example could be emotional dysregulation scenario training, Ciampi said.


Chris Larson is a reporter for Behavioral Health Business. He holds a bachelor's degree in communications from Brigham Young University and has been covering the health care sector since December 2016. He is based in the Louisville metro area. When not at work, he enjoys spending time with his wife and two kids, cooking/baking and reading sci-fi and fantasy novels.
