
Abstract

Objective:

The authors examined consumer outcomes before and after implementing CommonGround, a computer-based shared decision-making program.

Methods:

Consumers with severe mental illness (N=167) were interviewed prior to implementation and 12 and 18 months later to assess changes in active treatment involvement, symptoms, and recovery-related attitudes. Providers also rated consumers on level of treatment involvement.

Results:

Most consumers used CommonGround at least once (67%), but few used the program regularly. Mixed-effects regression analyses showed improvement in self-reported symptoms and recovery attitudes. Self-reported treatment involvement did not change; however, for a subset of consumers with the same providers over time (N=37), the providers rated consumers as more active in treatment.

Conclusions:

This study adds to the growing literature on tools to support shared decision making, showing the potential benefits of CommonGround for improving recovery outcomes. More work is needed to better engage consumers in CommonGround and to test the approach with more rigorous methods.

Medication management for people with severe mental illness has historically been conceptualized as the use of strategies to increase compliance. In contrast, current approaches seek to integrate person-centered, recovery-oriented care with effective medication management in support of consumer goals, a process that involves complex decision making and requires a partnership between two experts: the consumer and the provider (1). This concept of shared decision making (SDM) is now widely recognized as an indicator of high-quality health care, with increasing calls for SDM in mental health settings (1). However, SDM is still relatively rare in mental health care, and few studies have examined approaches specifically designed to increase SDM in these settings.

Consumers with severe mental illness desire a role in treatment decisions (2), but several barriers impede widespread use of SDM, including provider concerns about time constraints, questions of applicability for some consumers or clinical situations, and confusion regarding roles and responsibilities (3). Given these barriers, decision support tools may facilitate more effective and efficient clinical consultation while promoting the reciprocal exchange of information and preferences needed to improve consumer outcomes.

One promising decision support system is CommonGround, which integrates computer technology, decision support tools, peer support, and provider and consumer training (4). Initial pilot work with CommonGround among people with severe mental illness suggested that it improved consumer-provider communication, increased shared treatment decisions, and strengthened the focus on recovery-oriented goals (4–6). Two other CommonGround evaluations reported varied findings. The first showed significantly improved symptoms and functioning and fewer consumer concerns about side effects with use of CommonGround (7); however, the second did not show improvements in medication adherence over a six-month follow-up (8). More research is needed to investigate the impact of CommonGround on consumer outcomes.

In the CommonGround program, computer kiosks use technologically advanced, self-guided discovery modules designed to assist individuals to learn about recovery, identify strategies to reach recovery goals, and monitor and share progress. “Personal medicine” (self-identified strategies that provide meaning and help consumers stay well) and a “power statement” (goals for psychiatric medication use in the recovery context) are developed in CommonGround (4). Prior to a psychiatric visit, consumers complete a one-page health report with assistance from peer providers that integrates a power statement and personal medicine with current symptoms and concerns to facilitate more efficient communication with providers. The health report highlights one or more areas that the consumer most wants to discuss during limited appointment times and assists in clarifying consumer and provider roles in the decision-making process. CommonGround was designed to overcome common obstacles for people with severe mental illness, such as low literacy, limited computer skills, and potentially elevated symptoms, by providing peer-guided computer-based tools in accessible language (4).
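
To make the structure of the health report concrete, the sketch below shows one hypothetical way its elements could be represented in code; the class, field names, and example content are illustrative assumptions and are not drawn from the actual CommonGround software.

```python
# Illustrative sketch only: a hypothetical data model for the kind of
# information a CommonGround health report aggregates before a visit.
# Class and field names are assumptions, not the actual implementation.
from dataclasses import dataclass, field
from typing import List


@dataclass
class HealthReport:
    power_statement: str                 # goals for psychiatric medication use in the recovery context
    personal_medicine: List[str]         # self-identified strategies that provide meaning and help the consumer stay well
    current_concerns: List[str]          # symptoms or side effects the consumer is currently experiencing
    top_discussion_items: List[str] = field(default_factory=list)  # what the consumer most wants to discuss this visit

    def summary(self) -> str:
        """Render a brief, one-page-style summary for the provider."""
        return "\n".join([
            f"Power statement: {self.power_statement}",
            "Personal medicine: " + "; ".join(self.personal_medicine),
            "Current concerns: " + "; ".join(self.current_concerns),
            "Wants to discuss: " + "; ".join(self.top_discussion_items),
        ])


# Example usage with made-up content
report = HealthReport(
    power_statement="I want medication that quiets the voices without making me too sleepy to work.",
    personal_medicine=["walking my dog every morning", "volunteering at the food pantry"],
    current_concerns=["trouble sleeping", "worried about weight gain"],
    top_discussion_items=["adjusting the evening dose"],
)
print(report.summary())
```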

Our objective was to implement CommonGround in a new service setting—an urban community mental health center (CMHC)—and examine outcomes of consumers with severe mental illness engaged in assertive community treatment (ACT) or outpatient services who had access to the program. Because CommonGround prompts consumers to take a greater role in treatment decisions, we expected them to report an increased desire for autonomy in treatment decisions and to show greater activation in treatment. Furthermore, CommonGround provides concrete tools to identify and address medication concerns and to integrate personal medicine and consumer preference about medication in decision making, which should contribute to reduced symptoms. Finally, because of CommonGround’s emphasis on recovery, particularly with peer providers who model recovery (1), we hypothesized that consumers would report greater levels of recovery and hope. This study extends prior work by implementing the approach in a new setting and assesses a broader range of recovery-related consumer outcomes.

Methods

We implemented CommonGround in two outpatient clinics and two ACT teams serving adults with severe mental illness in an urban CMHC. Because of staff turnover, there were eight different psychiatric providers over the study period (March 2012 to February 2015). Visits with providers generally entailed check-ins, medication management, and discussion of consumer concerns. The CommonGround program was offered at decision support centers (DSCs) staffed by peer providers.

Research assistants approached potential participants upon arrival for a psychiatric visit. The assistants described the study, screened interested participants, and completed an informed consent process. Eligibility criteria included receipt of psychiatric services from the CMHC, English fluency, ability to provide informed consent, and willingness to be interviewed three times and to have three psychiatric provider visits audiotaped (at baseline, 12 months, and 18 months). Consumers were not eligible if they were planning to leave the CMHC or change providers during the study period. Assistants audiotaped the psychiatric visit and conducted an interview. After each visit, providers were asked to complete a brief measure assessing consumer involvement. The 12- and 18-month interviews were scheduled to coincide with psychiatric appointments. Consumers were paid $20 for each interview. All procedures were approved by the Indiana University Institutional Review Board.

We gathered demographic data and obtained psychiatric diagnoses through agency records. The Patient Activation Measure (9) assessed activation in mental health treatment. The Autonomy Preference Index (10) assessed preferences related to autonomy in medical decision making with two subscales: information-seeking autonomy and decision-making autonomy. We assessed symptoms by using a subscale of the How I Am Doing scale from the CommonGround program (7). We used the 24-item Recovery Assessment Scale (RAS) to measure perceived level of recovery from psychiatric illness (11), and hope was assessed with the State Hope Scale (12). Providers rated consumer involvement in visits by using a six-item questionnaire developed for this study, rating the extent to which the consumer and provider worked together in the session on a 4-point response scale. All measures had been used previously with this population and demonstrated good reliability.

Mixed-effects regression analyses examined changes in consumer outcomes over time, controlling for age, race-ethnicity, gender, and clinic type. Frequency of CommonGround health report completion, an indicator of intervention exposure intensity and the most critical marker of program engagement (13), was also included as a covariate. In addition, for consumers who had the same psychiatric provider over the 18 months (N=37), we examined whether providers perceived changes in consumer involvement over time. Because most consumers had different providers over time as a result of turnover, provider effects were not controlled. Multiple imputation was used for missing data.
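
As an illustration of the general form of this analysis, the sketch below fits a mixed-effects model with a random intercept per consumer using the statsmodels library; the data file, column names, and model specification are assumptions for illustration and do not reproduce the authors' analysis code or the multiple-imputation step.

```python
# Minimal sketch of the analytic approach (not the authors' code):
# regress an outcome on time, controlling for demographics, clinic type,
# and health report completion, with a random intercept per consumer.
# File and column names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical long-format file: one row per consumer per assessment wave
df = pd.read_csv("outcomes_long.csv")

model = smf.mixedlm(
    "outcome ~ time + age + C(race) + C(gender) + C(clinic) + n_reports",
    data=df,
    groups=df["consumer_id"],   # random intercept for each consumer
)
result = model.fit()
print(result.summary())         # the fixed effect for 'time' indexes change across assessments
# Note: the study also used multiple imputation for missing data,
# which is not shown in this sketch.
```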

Results

Over half of the 167 participating consumers were male (N=95, 57%) and African American (N=91, 55%) and had completed high school or some college (N=97, 58%). Most participants had a diagnosis of schizophrenia (N=94, 68% of 139 with diagnostic data available). There were 167 participants at baseline, 105 at 12 months, and 83 at 18 months (50% dropout). Dropout was not significantly related to consumer demographic factors or baseline outcomes.

Regarding intervention exposure, 60 participants (36%) never completed a health report, 34 (20%) completed one, 24 (14%) completed two, 13 (8%) completed three, and 36 (22%) completed more than three reports during the study period. Among the 83 participants who were in the study for 18 months, those who had the same providers (N=37) completed the health report about twice as often (mean±SD=6.2±4.9 reports) as those with different providers (2.5±1.6 reports) (t=–4.42, df=43, p<.001).
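
As a worked check of this comparison, the sketch below recomputes an unequal-variances (Welch) t-test from the reported means, standard deviations, and group sizes; the authors' exact test specification is not stated, so this is an illustration rather than a reproduction of their analysis.

```python
# Recomputing a Welch (unequal-variances) t-test from the summary statistics
# reported in the text: same-provider group (n=37) vs. different-provider
# group (n=46) on number of health reports completed. Illustrative only.
import math

m1, sd1, n1 = 6.2, 4.9, 37   # same provider over 18 months
m2, sd2, n2 = 2.5, 1.6, 46   # different providers (83 - 37 = 46)

se = math.sqrt(sd1**2 / n1 + sd2**2 / n2)
t = (m2 - m1) / se           # ordered to match the reported negative t value

# Welch-Satterthwaite degrees of freedom
num = (sd1**2 / n1 + sd2**2 / n2) ** 2
den = (sd1**2 / n1) ** 2 / (n1 - 1) + (sd2**2 / n2) ** 2 / (n2 - 1)
df = num / den

print(f"t = {t:.2f}, df = {df:.0f}")
# Prints approximately t = -4.41, df = 42, close to the reported
# t=-4.42, df=43; small differences reflect rounding of the means and SDs.
```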

Consumer outcomes over time are shown in Table 1. Self-reported patient activation and autonomy preferences did not change over time. However, consumers who had the same provider over 18 months showed significant improvement in provider perceptions of consumer involvement over time (β=.13). For the entire sample, self-reported symptoms also improved (β=.08). Recovery attitudes showed significant improvement as measured by RAS overall mean scores (β=.06) and by the RAS subscale “no domination by symptoms” (β=.15). Improvement in two other RAS subscales was marginally significant: “personal confidence and hope” (β=.06) and “reliance on others” (β=.07). Hope did not change over time.

TABLE 1. Mixed-effects regression analysis of consumer outcomes after implementation of CommonGround

Variable | Baseline (N=167), M (SD) | 12 months (N=105), M (SD) | 18 months (N=83), M (SD) | p
Involvement
  Patient Activation Measure (a) | 55.37 (13.34) | 54.21 (12.46) | 54.68 (14.79) | .57
  Autonomy Preference Index (b)
    Decision making | 2.42 (.83) | 2.51 (.91) | 2.45 (1.04) | .78
    Information seeking | 4.39 (.49) | 4.20 (.54) | 4.27 (.57) | .59
  Providers' (N=37) perception of consumer involvement (c) | 3.54 (.42) | 3.65 (.44) | 3.80 (.36) | .01
How I Am Doing symptom subscale (d) | 3.53 (.94) | 3.61 (.95) | 3.69 (.91) | .02
Recovery attitudes
  Recovery Assessment Scale (e)
    Total | 3.84 (.53) | 3.91 (.64) | 3.97 (.63) | .02
    Personal confidence and hope | 3.83 (.65) | 3.88 (.79) | 3.95 (.74) | .06
    Willingness to ask for help | 4.17 (.68) | 4.22 (.71) | 4.26 (.68) | .24
    Goal and success orientation | 4.11 (.62) | 4.09 (.75) | 4.13 (.67) | .74
    Reliance on others | 3.83 (.75) | 3.78 (.93) | 3.97 (.78) | .05
    No domination by symptoms | 3.13 (.91) | 3.53 (.94) | 3.44 (1.01) | <.01
  State Hope Scale (f) | 2.91 (.64) | 2.90 (.66) | 2.95 (.72) | .55

(a) Possible scores range from 1 to 100, with higher scores indicating greater activation.

(b) Possible scores range from 1 to 5, with higher scores indicating greater preferences for autonomy.

(c) Rated on a scale from 1 to 4, with higher scores indicating greater involvement in the visit.

(d) Possible scores range from 1 to 5, with higher scores indicating less severe symptoms.

(e) Possible subscale scores range from 1 to 5, with higher scores indicating greater self-perceived recovery.

(f) Possible scores range from 1 to 5, with higher scores indicating greater hope.


Discussion and Conclusions

In this uncontrolled study, consumers reported improvements in symptoms and recovery attitudes over time after CommonGround was implemented in the context of ongoing mental health services. However, most measures of treatment involvement did not change. In addition, use of CommonGround was variable, and a large proportion of participants (36%) never completed a health report.

In terms of positive changes in consumer outcomes, our findings are consistent with previous work showing improved symptoms and functioning among participants who use CommonGround (7). This study extends these findings by showing improvement on a recovery-related measure independent of the administrative data tracked in the CommonGround system. Another important contribution is that we examined the use of CommonGround in the context of both ACT and outpatient treatment teams. Given some of the uncertainty that has surrounded the feasibility of SDM with people with severe mental illness (14), the findings reported here suggest that decision aid technology with a support system that includes peer providers is promising for those involved in the most intensive community mental health services.

One unexpected finding in our study was the low rate of CommonGround health report completion. High levels of provider turnover during the study period and changes in treatment team infrastructure created several implementation barriers that likely limited CommonGround use (15). Indeed, consumers who had the same providers completed the health report about twice as often as those with different providers, and these individuals showed positive changes in treatment involvement. Provider consistency may therefore be an important mechanism that promotes CommonGround use and improves consumer outcomes. Future work should seek strategies to support use of CommonGround, even in the face of turnover.

The study had several limitations. First, because the study lacked a control group and an experimental design, the causal influence of CommonGround on improved outcomes is not clear. Second, the study had a high rate of provider turnover, which affected use of CommonGround and limited our ability to control for provider effects over time. Third, we had a relatively high rate of consumer dropout and a low rate of health report completion. Future work is needed to investigate systemic and participant-level barriers to engaging in CommonGround, including identifying subgroups for whom the intervention may be most effective. Finally, the relatively small effect sizes limit our interpretation of clinical significance and require further investigation.

Overall, our study found additional positive recovery outcomes after CommonGround implementation for consumers who received ACT and outpatient services in a CMHC, indicating potential benefits of the program for those receiving the most intensive outpatient services. More attention to facilitating consistent use of CommonGround and a more rigorous design to evaluate its causal influence are warranted.

Dr. Salyers, Ms. Bonfils, Ms. Firmin, and Ms. Luther are with the Department of Psychology, Indiana University–Purdue University Indianapolis, Indianapolis. Dr. Fukui, Dr. Goscha, and Dr. Rapp are with the School of Social Welfare, University of Kansas, Lawrence, where Dr. Holter was affiliated when this work was done.

Preliminary data were presented at the annual conference of the Society for Social Work and Research, Washington, D.C., January 13–17, 2016.

This research was supported by grant R34MH093563 from the National Institute of Mental Health.

The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health.

The authors report no financial relationships with commercial interests.

References

1 Deegan PE, Drake RE: Shared decision making and medication management in the recovery process. Psychiatric Services 57:1636–1639, 2006

2 Hamann J, Cohen R, Leucht S, et al.: Do patients with schizophrenia wish to be involved in decisions about their medical treatment? American Journal of Psychiatry 162:2382–2384, 2005

3 Légaré F, Ratté S, Gravel K, et al.: Barriers and facilitators to implementing shared decision-making in clinical practice: update of a systematic review of health professionals’ perceptions. Patient Education and Counseling 73:526–535, 2008

4 Deegan PE, Rapp C, Holter M, et al.: A program to support shared decision making in an outpatient psychiatric medication clinic. Psychiatric Services 59:603–605, 2008

5 Campbell SR, Holter MC, Manthey TJ, et al.: The effect of CommonGround software and decision support center. American Journal of Psychiatric Rehabilitation 17:166–180, 2014

6 Goscha R, Rapp C: Exploring the experiences of client involvement in medication decisions using a shared decision making model: results of a qualitative study. Community Mental Health Journal 51:267–274, 2015

7 MacDonald-Wilson KL, Deegan PE, Hutchison SL, et al.: Integrating personal medicine into service delivery: empowering people in recovery. Psychiatric Rehabilitation Journal 36:258–263, 2013

8 Stein BD, Kogan JN, Mihalyo MJ, et al.: Use of a computerized medication shared decision making tool in community mental health settings: impact on psychotropic medication adherence. Community Mental Health Journal 49:185–192, 2013

9 Green CA, Perrin NA, Polen MR, et al.: Development of the Patient Activation Measure for mental health. Administration and Policy in Mental Health and Mental Health Services Research 37:327–333, 2010

10 Ende J, Kazis L, Ash A, et al.: Measuring patients’ desire for autonomy: decision-making and information-seeking preferences among medical patients. Journal of General Internal Medicine 4:23–30, 1989

11 Salzer MS, Brusilovskiy E: Advancing recovery science: reliability and validity properties of the Recovery Assessment Scale. Psychiatric Services 65:442–453, 2014

12 Snyder CR, Sympson SC, Ybasco FC, et al.: Development and validation of the State Hope Scale. Journal of Personality and Social Psychology 70:321–335, 1996

13 Fukui S, Salyers MP, Rapp C, et al.: Supporting shared decision-making beyond consumer-prescriber interactions: initial development of the CommonGround fidelity scale. American Journal of Psychiatric Rehabilitation 19:252–267, 2016

14 Kaminskiy E: The elephant in the room: a theoretical examination of power for shared decision making in psychiatric medication management. Intersectionalities 4:19–38, 2015

15 Bonfils KA, Dreison KC, Luther L, et al.: Implementing CommonGround in a community mental health center: lessons in a computerized decision support system. Psychiatric Rehabilitation Journal, in press