Abstract

The authors describe a quality improvement approach in which a crisis center and a payer collaborate to improve care. Each crisis visit is treated as a potential missed opportunity for community stabilization. Daily data on crisis visits are sent to the payer, allowing a more up-to-date analysis of trends than is possible with financial claims data, which may lag behind services provided by up to 90 days. Using these trend data, the two organizations collaborate to identify patterns that point to opportunities for improvement and develop multiple rapid-cycle projects for better management of services, resulting in significant decreases in readmissions and in the number of high utilizers.

Those accountable for a system of care have an interest in managing services to maximize the potential for wellness in community settings. Promoting care in the least restrictive environment possible is more recovery oriented than providing avoidable inpatient care, and it has been enshrined into law by the U.S. Supreme Court’s Olmstead decision. Community-based care is also less costly than higher levels of care. Thus, payers and clinicians have aligned incentives to reduce potentially excessive use of acute care services, such as emergency and inpatient care.

Payers have developed many sophisticated data-driven methods for monitoring network performance; however, these analyses are often based on financial claims data, which may lag behind the services provided by as much as 90 days. This delay makes it difficult to address problems early. Furthermore, such efforts may not fully realize the potential for providers at ground level to contribute additional context and ideas.

Crisis utilization, on the other hand, can be a source of useful data to guide system-level improvement. If each crisis visit is viewed as a missed opportunity for a more effective crisis prevention response in the community, the story behind each crisis visit can generate a “mini root cause analysis” to identify possible system issues (for example, the patient couldn’t get an appointment at the clinic, or the crisis plan isn’t working). By tracking trends in crisis utilization in a way that captures these missed opportunities, one can identify targets for improvement both for individuals receiving services and for the system as a whole. Brown (1) eloquently captured this concept with her observation that “maybe stories are just data with a soul.”

In this column, we describe an approach in which providers and payers routinely collaborate to share data to implement quality improvement (QI) initiatives targeted at improving care and cost by reducing the need to access crisis and emergency services.

A Provider-Payer Partnership

In Arizona, a regional behavioral health authority (RBHA) is contracted by the state to serve as the point of accountability for behavioral health service design and delivery for a given region. The RBHA is financed via Medicaid and other funds, and it contracts with multiple providers to deliver a complete continuum of care. Cenpatico Integrated Care, a partnership between Centene Corporation and Banner–University Family Care, has been the RBHA for southern Arizona since October 2015.

The crisis response center (CRC), located in Tucson on the campus of Banner–University of Arizona Medical Center South, was built in 2011 with Pima County bond funds to reduce the numbers of persons with behavioral health needs in jails and emergency rooms. The CRC has been managed by Connections Health Solutions since 2014 and provides 24/7 urgent care, 23-hour observation, and brief inpatient care to approximately 12,000 adults and 2,400 youths annually.

In January 2016, CRC and RBHA leadership began a series of regular conference calls to discuss and problem-solve system issues that led to crisis center utilization. By spring, the group had developed a process for working as a collaborative QI team, using data sharing and analysis to generate rapid-change cycles for the purpose of improvement (Figure 1).

FIGURE 1. Systemwide quality improvement partnership between a crisis response center (CRC) and a regional behavioral health authority (RBHA)

Data sharing.

The CRC provides individual-level crisis utilization data to the RBHA in a daily data feed listing every RBHA member who had been discharged the previous day, key demographic data, the responsible clinic, secondary payers, and disposition. It also provides a monthly report listing high utilizers and chart summaries of members who made return visits to the observation unit within 72 hours of discharge and members who were readmitted to the inpatient unit within 30 days. Because these reports contain information about RBHA members only, there is no need for additional consent or business agreements.
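As a concrete illustration, the daily feed can be thought of as a simple list of discharge records. The minimal Python sketch below shows one way such a record might be structured; the field names and values are assumptions for illustration only and do not reflect the actual CRC or RBHA data specification.

```python
# Minimal sketch of a daily discharge-feed record. Field names are illustrative
# assumptions, not the actual CRC/RBHA data specification.
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class DischargeRecord:
    member_id: str                  # RBHA member identifier
    discharge_date: date            # date discharged from the CRC
    age: int                        # key demographic field
    responsible_clinic: str         # outpatient clinic accountable for the member
    secondary_payer: Optional[str]  # other coverage, if any
    disposition: str                # e.g., "home", "admitted to inpatient"

# The daily feed is then simply the list of members discharged the previous day.
daily_feed = [
    DischargeRecord("A123", date(2016, 5, 1), 34, "Clinic East", None, "home"),
]
```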

Analysis of trends.

RBHA staff analyze the CRC data and present the findings during twice-monthly conference calls. Analysis based on daily feeds instead of claims makes it possible to identify trends within weeks instead of months. The group discussions function as root cause analyses of significant trends, providing the opportunity to identify various possible contributors to the findings and generate ideas for improvement projects. For example, a trend might spark CRC staff to discuss their on-the-ground experience with a particular system barrier of which the RBHA was previously unaware.
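A minimal sketch of this kind of trend analysis follows, assuming the daily feed has been loaded into a table with discharge_date and responsible_clinic columns (both hypothetical names). It aggregates daily records into weekly counts and flags clinics running above their recent trend, the sort of signal that could be brought to the twice-monthly call.

```python
# Illustrative trend analysis over the daily feed. Column and file names are assumptions.
import pandas as pd

feed = pd.read_csv("daily_feed.csv", parse_dates=["discharge_date"])  # hypothetical file

# Weekly visit counts per responsible clinic
weekly = (
    feed.set_index("discharge_date")
        .groupby("responsible_clinic")
        .resample("W")
        .size()
        .rename("visits")
        .reset_index()
)

# Flag weeks that run well above the clinic's trailing four-week average;
# flagged clinics become candidates for discussion on the QI call.
weekly["trailing_mean"] = (
    weekly.groupby("responsible_clinic")["visits"]
          .transform(lambda s: s.rolling(4, min_periods=1).mean().shift(1))
)
weekly["above_trend"] = weekly["visits"] > 1.5 * weekly["trailing_mean"]
```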

Rapid-cycle quality improvement.

Like many QI projects, the partnership uses the plan-do-check-act (PDCA) cycle as a framework for carrying out change (2). This simple four-step process begins with identifying a problem and designing an improved process to address it (plan), followed by testing the new process (do), analyzing results to determine how well it worked (check), and deciding whether to adopt the new process or to adjust it and repeat the cycle (act).

Using the PDCA framework, ideas generated from the discussions are quickly put into action via multiple QI efforts of varying complexity and scope. Because the QI team consists of key leaders of both organizations, decisions about new initiatives can be made during team discussions rather than being delayed while awaiting approval. The team also includes clinical staff from the CRC, who experience system problems firsthand and thus can add context to the shared data and proposed solutions. RBHA staff similarly provide their perspective regarding administrative functions not visible to CRC staff.

Results

Comparing fiscal years 2016 and 2017, there was a 30% decrease in 72-hour return visits to the observation units (defined as the percentage of patients discharged from the observation unit who return to the observation unit within 72 hours, as described previously [3]), which dropped from 1.6% to 1.1% among youths (p<.03) and from 3.3% to 2.2% among adults (p<.02). There was a 50% decrease in the percentage of adults discharged from the inpatient unit who were readmitted to the inpatient unit within 30 days, from 5.5% to 2.8% (p<.01).
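For readers interested in how such a rate change can be checked, the sketch below applies a standard chi-square test to a 2×2 table of returns versus non-returns across the two fiscal years. The counts shown are placeholders chosen only to illustrate the reported percentages; they are not the actual denominators, and the analysis shown is not necessarily the one used in this project.

```python
# Hedged sketch: testing a change in the 72-hour return rate between two fiscal
# years with a chi-square test. Counts are placeholders, not the actual data.
from scipy.stats import chi2_contingency

# Rows: fiscal year; columns: [returned within 72 hours, did not return]
adult_returns = [
    [330, 9670],  # FY2016: ~3.3% of adult discharges returned (placeholder)
    [220, 9780],  # FY2017: ~2.2% (placeholder)
]
chi2, p_value, dof, expected = chi2_contingency(adult_returns)
print(f"Change in adult 72-hour return rate: p = {p_value:.4f}")
```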

In-home supports for youths.

Analysis of 72-hour return visits indicated that a subset of youths was unable to remain as placed without additional supports. A new program was created to provide more intensive in-home services for youths at risk of losing their placement after a crisis episode.

School-based crises.

In response to 911 calls from schools, a subset of youths arrived at the CRC via law enforcement. In order to provide a less restrictive alternative, these 911 calls were routed to the crisis hotline so that a mobile crisis team could be dispatched to the school and attempt to resolve the crisis without the need for CRC transport or law enforcement involvement.

Improved care coordination for youths.

Some readmissions of youths were linked to failures in care coordination with the outpatient clinics. A conference line was set up for clinic liaisons to call into daily rounds on the observation unit when a patient at their clinic was being discussed. Time slots were set aside to make it easier and faster to schedule in-person interagency team meetings at the CRC, if necessary.

Adult high utilizers.

Because of the high volume of adult encounters at the CRC (approximately 1,000 per month), the QI project involving adults focused on high utilizers rather than overall trends. A high utilizer was defined as having four or more visits in the preceding four-month period. The first high-utilizer report in May 2016 identified a cohort of 64 individuals, and a new report was sent each month thereafter in order to identify new high utilizers as they emerged. CRC and RBHA staff organized multiagency conference calls in order to discuss each patient’s needs and develop improved care plans in collaboration with all involved providers. The charts were flagged in the CRC electronic health record with information about the new plan and whom to contact.
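As an illustration, the high-utilizer definition can be expressed as a simple query over visit records. The sketch below assumes a list of (member, visit date) pairs; the data structures and example values are hypothetical.

```python
# Minimal sketch of the high-utilizer definition: four or more CRC visits in the
# preceding four months. Data structures are hypothetical and for illustration only.
from collections import defaultdict
from datetime import date, timedelta

FOUR_MONTHS = timedelta(days=122)  # approximate four-month lookback

def high_utilizers(visits, as_of):
    """Return members with four or more visits in the four months before `as_of`.

    visits: iterable of (member_id, visit_date) tuples.
    """
    counts = defaultdict(int)
    for member_id, visit_date in visits:
        if as_of - FOUR_MONTHS <= visit_date <= as_of:
            counts[member_id] += 1
    return {member for member, n in counts.items() if n >= 4}

# Example: only member "B7" meets the definition as of the May 2016 report.
visits = [
    ("B7", date(2016, 2, 3)), ("B7", date(2016, 3, 1)),
    ("B7", date(2016, 4, 10)), ("B7", date(2016, 4, 28)),
    ("C2", date(2016, 1, 20)),
]
print(high_utilizers(visits, as_of=date(2016, 5, 1)))  # {'B7'}
```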

Among the 64 individuals identified as high utilizers in May 2016, CRC visits decreased from 1,196 to 675 between the six months before and the six months after they were identified, a decline of 44% (p<.05). One year later, in May 2017, only seven of the original cohort of 64 remained high utilizers, and a significantly smaller number (N=37) met the high-utilizer definition (p<.01). [A graph showing the decrease in high utilizers and a case example are available in an online supplement to this column.]

Discussion

This project demonstrates how crisis providers and payers can collaborate to improve care at both the individual and system levels. Crisis utilization data provided a mechanism for more up-to-date analysis of trends, and regular collaborative QI meetings facilitated the rapid development of multiple QI cycles. Outcomes improved for individuals in the high-utilizer group, as indicated by decreased crisis visits. In addition, the decreases in readmissions and in the total number of high utilizers suggest improvements in the system as a whole, as people were better able to get their needs met without accessing crisis services. Furthermore, these outcomes were achieved through more efficient use of existing resources rather than through increases in staffing or cost.

The project also underscores the importance of the composition of the team in QI initiatives (4). It is critical to include both leaders who can act as decision makers and individuals who have first-hand experience working in the environment that is the target of improvement efforts.

Using a formal data-driven QI process—instead of a traditional research approach—has been essential for our success. To appreciate why, it is important to understand the differing methods and indications of each approach (5).

Both research and QI have value in improving services and systems, but they must be understood as distinct processes. Research attempts to add to the general body of knowledge regarding evidence-based care, and it is by necessity slow and methodical. Research methods isolate a particular intervention, control all other variables if possible, and test whether that intervention achieved the desired result. To do so, researchers test hypotheses one at a time in controlled settings with well-defined subsets of patients in order to minimize confounding effects.

In contrast, QI is rapid and iterative. It focuses on applying what is already known to solve or improve complex real-world problems rather than on expanding the general body of knowledge. Many interventions may be tried in rapid succession or at the same time, in uncontrolled natural settings. Results are measured quickly to see whether improvement has occurred, and modifications are immediately applied and reanalyzed. Furthermore, there is no control group; interventions are applied to everyone, and recognition of confounding variables is welcomed rather than “screened out.” However, like research, QI is thoroughly data driven and follows formal rules and processes.

In this project, we employed QI rather than a research approach in order to deploy multiple interventions rapidly and simultaneously. For example, in the school-based crisis intervention, there was no comparison group, many other improvements were made at the same time, and we did not control for differences in why certain schools call 911 more than others. Rather, the value of the QI approach was that it provided a process for studying the data so that one could discover that 911 calls from schools were a problem in the first place. Then the QI partnership team worked together to rapidly implement and test potential solutions. The details of the solution that ultimately worked may not be generalizable to other communities, where, for example, increasing school-based counseling services may be a better solution. The point is to have a process to discover problems, act on them quickly, test the results, modify as needed, and maintain what works.

Although this project involves a single crisis provider and one payer, we believe the most important element is the collaborative QI process itself. Based on the success of this project, similar protocols are now being developed between the RBHA and local emergency departments and between the CRC and other payers. Working with others to apply QI methods to achieve a common goal provides value to any organization in any community.

Dr. Balfour, Dr. Fox, Ms. Morales, and Dr. Berdeja are with Connections Health Solutions, Tucson, Arizona. Dr. Balfour and Dr. Berdeja are also with the Department of Psychiatry, University of Arizona, Tucson. The remaining authors are with Cenpatico Integrated Care, Tucson. Marcela Horvitz-Lennon, M.D., and Kenneth Minkoff, M.D., are editors of this column.
Send correspondence to Dr. Balfour.

The authors report no financial relationships with commercial interests.

References

1 Brown B: The Power of Vulnerability. New York, TED: Ideas Worth Spreading, 2010. https://www.ted.com/talks/brene_brown_on_vulnerability

2 Plan-Do-Check-Act (PDCA) Cycle. Milwaukee, American Society for Quality, 2004. http://asq.org/learn-about-quality/project-planning-tools/overview/pdca-cycle.html

3 Balfour ME, Tanner K, Jurica JS, et al.: Crisis Reliability Indicators Supporting Emergency Services (CRISES): a framework for developing performance measures for behavioral health crisis and psychiatric emergency programs. Community Mental Health Journal 52:1–9, 2016

4 Science of Improvement: Forming the Team. Boston, Institute for Healthcare Improvement, 2018. http://www.ihi.org/resources/Pages/HowtoImprove/ScienceofImprovementFormingtheTeam.aspx

5 Cook PF, Lowe NK: Differentiating the scientific endeavors of research, program evaluation, and quality improvement studies. Journal of Obstetric, Gynecologic, and Neonatal Nursing 41:1–3, 2012