Published Online: https://doi.org/10.1176/appi.ps.201900181

Abstract

Objective:

Few existing instruments measure recovery-oriented organizational climate and culture. This study developed, psychometrically assessed, and validated an instrument to measure recovery climate and culture.

Methods:

Organizational theory and an evidence-based conceptualization of mental health recovery guided instrument development. Items from existing instruments were reviewed and adapted, and new items were developed as needed. All items were rated by recovery experts. A 35-item instrument was pilot-tested and administered to a national sample of mental health staff in U.S. Department of Veterans Affairs Psychosocial Rehabilitation and Recovery Centers (PRRCs). Analysis entailed an exploratory factor analysis (EFA) and inter-item reliability and scale correlation assessment. Blinded site visits to four PRRCs were performed to validate the instrument.

Results:

The EFA determined a seven-factor solution for the data. The factors identified were staff expectations, values, leadership, rewards, policies, education and training, and quality improvement. Seven items did not meet retention criteria and were dropped from the final instrument. The instrument exhibited good internal consistency (Cronbach’s α=0.81; subscales, α=0.84–0.88). Scale correlations ranged from 0.16 to 0.61, well below the 0.9 threshold that indicates overlapping constructs. Site visitors validated the instrument by correctly identifying high-scoring and low-scoring centers.

Conclusions:

These findings provide a psychometrically tested and validated instrument for measuring recovery climate and culture in mental health programs. This instrument can be used in evaluation of mental health services to determine the extent to which programs possess the organizational precursors that drive recovery-oriented service delivery.

HIGHLIGHTS

  • Few existing instruments measure recovery-oriented organizational climate and culture.

  • This study developed, psychometrically assessed, and validated an instrument to measure recovery climate and culture.

  • The resulting psychometrically sound instrument comprises 28 items and seven subscales and can be incorporated into evaluations of the recovery orientation of mental health programs.

Mental health systems have shifted away from a medical model and toward a recovery model of care over the past 40 years, and recovery-oriented care has emerged as the prominent paradigm for helping those with mental health conditions (1–5). This is demonstrated by the integration of recovery concepts and language into vision statements, mission statements, and service philosophies (6) as well as into policies that govern service delivery and practice (7) in many large mental health systems.

Few psychometrically tested and validated instruments are available to measure the extent to which programs encompass the necessary organizational precursors to drive recovery-oriented service delivery and, ultimately, outcomes. This is a persistent limitation in evaluating whether a mental health program has successfully transitioned to a recovery model. Some available instruments are helpful for measuring personal recovery (8–16), whereas others are useful for measuring the recovery orientation of services or providers (17–20). There are two main drawbacks in the existing menu of recovery instruments, however. First, existing instruments largely lack an evidence-based conceptualization of the multiple dimensions of recovery, such as those set forth by the Substance Abuse and Mental Health Services Administration (SAMHSA) (21). Second, existing instruments for assessing recovery orientation, to our knowledge, are not rooted in organizational theory, even though a positive culture of healing and organizational commitment to supporting recovery (i.e., organizational climate and culture) are promoted as critical aspects of a recovery-oriented mental health system (22–24). Therefore, they are not helpful for understanding and measuring whether organizational climate and culture promote recovery-oriented service delivery.

According to organizational theory, climate and culture form the foundation for staff actions and the way services are provided. As key elements of the social psychological work context, they directly influence staff behaviors in the workplace (25). Organizational climate is defined as shared meanings among people in the organization about events, policies, procedures, and rewarded behaviors (25). These shared meanings are distinct from the actual policies, practices, and procedures themselves. Organizational culture is more enduring than climate. It connotes the pattern of assumptions that are shared within an organization and taught as the right way to think and perceive (25). Several existing recovery instruments—for example, the Recovery Culture Progress Report (26) and the Recovery-Enhancing Environment Scale (27)—do, in fact, include items pertaining to organizational climate and culture. However, these touch only superficially on the constructs, are not explicitly connected to organizational theories, and to date have not been psychometrically assessed. In this study, we addressed this gap by developing a validated instrument to assess recovery-promoting climate and culture within mental health programs.

Methods

Conceptual Framework

Organizational theory on climate and culture.

The model of organizational climate and culture by Ehrhart et al. (25) guided the selection of organizational domains for instrument development (Figure 1). It proposes that leadership affects both organizational culture (shared values) and climate (shared meanings and expectations held by staff). Organizational policies, practices, and procedures, in addition to leadership and staff expectations, are posited to affect organizational climate. We operationalized practices and procedures as staff rewards, staff education and training, and quality improvement processes. Therefore, the organizational domains used in instrument development were leadership, values, expectations, policies, rewards, education and training, and quality improvement.

FIGURE 1.

FIGURE 1. Conceptual framework of organizational domains supporting recovery climate and culturea

aThe domains used in developing the instrument for assessing recovery climate and culture appear in bold. The model is based on work by Ehrhart et al. (25).

Our approach was similar to Moos and Houts’s (28) work in developing social climate scales in mental health settings. As is the usual practice in assessing climate and culture (29–32), we developed an instrument for staff to complete, and items elicited their perceptions of expectations, leadership, rewards, etc., in their organization. Climate and culture were measured as the aggregate (mean) values of staff perceptions within each organization.
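As a small illustration of this aggregation step (not the authors' code; site labels and scores are hypothetical), a site-level climate score is simply the mean of the individual staff responses within that site:

```python
# Illustrative sketch: climate and culture are scored as the mean of staff
# perceptions within each organization (site labels/scores are made up).
from collections import defaultdict

def site_means(responses):
    """responses: list of (site_id, score) pairs -> {site_id: mean score}."""
    by_site = defaultdict(list)
    for site, score in responses:
        by_site[site].append(score)
    return {site: sum(vals) / len(vals) for site, vals in by_site.items()}

print(site_means([("A", 4.0), ("A", 5.0), ("B", 2.0), ("B", 3.0), ("B", 4.0)]))
# {'A': 4.5, 'B': 3.0}
```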

Evidence-based dimensions of recovery.

A systematic review of recovery dimensions identified in the scientific literature (33) further guided instrument development. The recovery dimensions used were individualized/person-centered, empowerment, hope, self-direction, relational, nonlinear/many pathways, strengths based, respect, responsibility, peer support, holistic, culturally sensitive, and trauma informed.

Item Development

Identification, adaptation, and development of items.

Using two compendiums of recovery instruments (34, 35), we identified 15 existing instruments as potentially relevant to our study (see list in online supplement) and reviewed each one. We then mapped relevant items to organizational domains and recovery dimensions in our conceptual framework and edited them for clarity. We drafted additional items as needed to ensure that each organizational domain and recovery dimension was represented in the item bank. That is, at least one item captured every recovery dimension within each organizational domain. Some items touched on more than one recovery dimension or organizational domain. (Details of the mapping exercise are available in the online supplement.)

Item review by recovery experts.

Three external recovery experts reviewed and independently rated each of the 51 items in the item bank. Experts had decades of recovery experience in mental health service provision, advocacy, and research. Each item was rated on 4-point scales for relevance to recovery (1, not relevant; 2, somewhat relevant; 3, quite relevant; and 4, highly relevant) and clarity (1, not clear; 2, somewhat clear; 3, quite clear; and 4, extremely clear). Any item with an average score below 3 on relevance or clarity was eliminated, and the remaining items were further refined on the basis of expert feedback.
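The screening rule above can be expressed directly. A minimal sketch (illustrative only; item identifiers and ratings are hypothetical): an item survives only if its mean expert rating is at least 3 on both relevance and clarity.

```python
# Illustrative sketch (not the authors' code): drop any item whose mean
# expert rating falls below 3 on either relevance or clarity, each rated
# on a 4-point scale by three external experts.

def screen_items(ratings):
    """ratings: {item_id: {"relevance": [...], "clarity": [...]}}.
    Returns item ids whose mean rating is >= 3 on both scales."""
    kept = []
    for item_id, scores in ratings.items():
        mean_rel = sum(scores["relevance"]) / len(scores["relevance"])
        mean_cla = sum(scores["clarity"]) / len(scores["clarity"])
        if mean_rel >= 3 and mean_cla >= 3:
            kept.append(item_id)
    return kept

example = {
    "item_1": {"relevance": [4, 4, 3], "clarity": [3, 4, 4]},  # kept
    "item_2": {"relevance": [2, 3, 3], "clarity": [4, 4, 4]},  # mean relevance 2.67 -> dropped
}
print(screen_items(example))  # ['item_1']
```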

Survey development and cognitive testing.

The resulting 35 items were then organized into a survey. Response options for 31 items were provided on a 5-point Likert scale, with 1 indicating that the element was not at all present and 5 indicating that the element was extremely present in the program, and the response options for the remaining four items were yes, no, and don’t know. We added several questions that captured respondent and program characteristics (e.g., tenure, role, and discipline). Next, we pilot-tested the survey with six mental health staff—a peer support specialist, a psychologist, a social worker, a nurse, and two program managers—using semistructured interviews to elicit feedback on survey length, flow, organization, and item and response comprehension. Items were refined once more before survey administration.

Study Setting, Recruitment, and Survey Administration

We administered the survey in U.S. Department of Veterans Affairs (VA) Psychosocial Rehabilitation and Recovery Centers (PRRCs). In 2008, the VA adopted SAMHSA’s definition of recovery as part of the transformation to a recovery-oriented system of mental health care (36). All existing mental health day treatment and hospital programs were redesigned as PRRCs, in accordance with changes in VA policy and procedures. VA PRRC national policy enumerated the mission, vision, and values expected in these programs and prescribed program structure, services, and characteristics to ensure alignment with recovery principles (37). PRRCs were the ideal setting to develop and test our recovery climate and culture instrument within a national mental health system because of the centers’ recovery-informed policies and expectations. There were 104 PRRCs at the time this study was conducted, each containing between two and 14 multidisciplinary staff.

The survey was programmed into a Web-based platform (Enterprise Feedback Management, Verint Systems, Inc., Melville, New York), and approvals were obtained from the VA Boston Healthcare System and Edith Nourse Rogers Memorial Veterans Hospital Institutional Review Board, unions, and survey compliance offices. The elements of informed consent were explained on the first screen of the electronic survey, and respondents were required to indicate their consent to participate.

PRRC staff were invited to complete the survey by e-mail. We obtained the e-mail addresses of PRRC staff from the VA Office of Mental Health and Suicide Prevention, which was a collaborator on the study. We used a three-step recruitment approach comprising a series of e-mail invitations and reminders (38). Responses were collected over a 3-week period in May 2017. (Details about the recruitment process are available in an online supplement.)

Data Analyses

We examined descriptive statistics for each item and then conducted exploratory factor analysis (EFA) to examine the factor structure of the data. We calculated the eigenvalues of the sample’s polychoric correlation matrix and conducted the EFA followed by geomin rotation. We determined the number of factors by identifying the number of eigenvalues greater than 1 and assessed factor loading patterns for each item. To ensure a clear factor loading structure, we kept items with a factor loading greater than 0.40 on one of the factors (indicating that the factor explains 16% of the item’s variance, a strong item-factor association) and a factor loading of less than 0.30 on the other factors. In addition to examining factor loading patterns, we considered the meaning of the factors, the number of items, and each factor’s content coverage to determine the final factors. (Details on the factor loadings are available in the online supplement.)
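The two retention rules, eigenvalues greater than 1 (the Kaiser criterion) and the 0.40/0.30 loading pattern, can be sketched as follows. This is an illustration rather than the authors' analysis code: it operates on an ordinary correlation matrix and unrotated loadings, whereas the study used polychoric correlations and geomin rotation.

```python
# Illustrative sketch of the two EFA retention rules described above
# (assumptions: plain correlation matrix, no rotation -- not the
# authors' Mplus/weighted-least-squares analysis).
import numpy as np

def n_factors_kaiser(R):
    """Number of eigenvalues of the correlation matrix R greater than 1."""
    return int((np.linalg.eigvalsh(R) > 1.0).sum())

def retain_items(loadings, primary=0.40, cross=0.30):
    """loadings: items x factors matrix. Keep an item only if it loads
    above `primary` on exactly one factor and below `cross` on the rest."""
    L = np.abs(np.asarray(loadings, dtype=float))
    kept = []
    for i, row in enumerate(L):
        strong = row > primary
        if strong.sum() == 1 and (row[~strong] < cross).all():
            kept.append(i)
    return kept
```

For example, three items that all intercorrelate at 0.5 yield eigenvalues 2.0, 0.5, and 0.5, so only one factor is retained.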

For each item within each factor, we calculated the corrected item-total correlation, using the threshold that greater than 0.4 is acceptable. We also calculated Cronbach’s alpha with each item deleted; a value lower than the alpha computed with all items retained was considered acceptable. We examined internal consistency for each factor by calculating Cronbach’s alpha and used the following criteria for determining acceptability: α>0.90, excellent; α=0.81‒0.90, good; α=0.71‒0.80, acceptable; α=0.61‒0.70, questionable; α=0.50‒0.60, poor; and α<0.50, unacceptable (39). We calculated the mean and standard deviation of each factor and correlations among the factors.
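A minimal numpy sketch of these two reliability statistics (illustrative only, not the authors' code):

```python
# Cronbach's alpha and the corrected item-total correlation (each item
# correlated with the sum of the remaining items). Illustrative sketch.
import numpy as np

def cronbach_alpha(X):
    """X: respondents x items matrix.
    alpha = k/(k-1) * (1 - sum(item variances)/variance(total score))."""
    X = np.asarray(X, dtype=float)
    k = X.shape[1]
    item_vars = X.var(axis=0, ddof=1)
    total_var = X.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

def corrected_item_total(X):
    """Correlation of each item with the total score of the other items."""
    X = np.asarray(X, dtype=float)
    total = X.sum(axis=1)
    return np.array([np.corrcoef(X[:, j], total - X[:, j])[0, 1]
                     for j in range(X.shape[1])])
```

With two perfectly parallel items, for instance, alpha equals 1 and each corrected item-total correlation equals 1, which is the degenerate upper bound of both statistics.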

Additionally, we explored differences in overall score by discipline and supervisory status, accounting for site-level clustering and unbalanced groups.

Instrument Validation With Site Visits

Site selection.

We assessed the extent of recovery-promoting climate and culture for programs with four or more survey respondents. We first checked the intraclass correlation coefficient (ICC1) to ensure that aggregating staff responses to the site level was appropriate. The ICC1 was 0.35, well above the acceptable threshold for aggregation (ICC1>0.10 [40]). Next, we ranked these programs by average score across all factors (average overall score). We invited the two PRRC programs ranked highest and the two ranked lowest to participate in site visits.
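For the balanced case, the one-way ANOVA form of ICC(1) can be sketched as follows (illustrative only; the study's sites had unequal numbers of respondents, which requires an adjusted group-size term):

```python
# Illustrative ICC(1) for balanced groups of size k:
# ICC(1) = (MSB - MSW) / (MSB + (k - 1) * MSW),
# where MSB/MSW are the between- and within-group mean squares.
import numpy as np

def icc1(groups):
    """groups: list of equal-length arrays of staff scores, one per site."""
    groups = [np.asarray(g, dtype=float) for g in groups]
    k = len(groups[0])                      # respondents per site (balanced)
    n = len(groups)                         # number of sites
    grand = np.concatenate(groups).mean()
    msb = k * sum((g.mean() - grand) ** 2 for g in groups) / (n - 1)
    msw = sum(((g - g.mean()) ** 2).sum() for g in groups) / (n * (k - 1))
    return (msb - msw) / (msb + (k - 1) * msw)
```

When sites differ sharply while staff within a site agree, ICC(1) approaches 1, supporting aggregation; when site means are identical, it is at or below 0.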

Site visits.

Two-person teams blinded to the PRRC rankings conducted 1.5-day site visits that included semistructured interviews with staff and program participants, group and meeting observations, and a facility tour. Site visitors independently rated each PRRC on the extent to which recovery principles were present in the organizational domains captured in the instrument and then discussed their ratings and reached consensus. Site visit consensus ratings were compared with average overall survey score rankings to determine whether the survey accurately discriminated between the sites rated highest and lowest by the site visitors (41).

Results

Sample Characteristics

We sent survey links to 785 potential respondents and received 280 survey responses (36% response rate). Thirty-two responses did not include data for any instrument items and were excluded; the analytical sample contained 248 responses. Table 1 reports sample characteristics. The majority of respondents were female (65%) and nonveterans (70%); 80% were not supervisors. The most frequent disciplines were social work (33%), peer specialist (17%), and psychology (17%). The percentage of staff in each discipline differed by no more than 3 percentage points from the distribution of disciplines for all PRRC staff, as reported in the staff directory.

TABLE 1. Characteristics of 248 staff at VA Psychosocial Rehabilitation and Recovery Centers (PRRCs) who completed the Recovery Climate and Culture Scale

Characteristic | N | %
Gendera
 Female | 156 | 65
 Male | 84 | 35
Veteran statusb
 Nonveteran | 173 | 70
 Veteran | 74 | 30
Tenure in a mental health setting (years)
 <1 | 3 | 1
 1–5 | 58 | 23
 6–10 | 66 | 27
 >10 | 121 | 49
Tenure in the VA (years)
 <1 | 11 | 4
 1–5 | 77 | 31
 6–10 | 80 | 32
 >10 | 80 | 32
Tenure in PRRC (years)
 <1 | 30 | 12
 1–5 | 120 | 49
 6–10 | 73 | 29
 >10 | 25 | 10
Supervisory status
 None | 198 | 80
 First-line supervisor | 50 | 20
Discipline/profession
 Social work | 81 | 33
 Peer specialist | 43 | 17
 Psychology | 43 | 17
 Nursing | 27 | 11
 Administrative | 17 | 7
 Occupational therapy | 12 | 5
 Recreational therapy | 7 | 3
 Other | 18 | 7

aPercentages are based on responses from 240 staff.

bPercentages are based on responses from 247 staff.


Distribution of Item Responses

There were 228 to 244 responses for each item. Item means ranged from 2.62 to 4.58 for the Likert-scaled items and from 0.57 to 0.63 for the dichotomous items, with standard deviations between 0.49 and 1.48 (see online supplement). The percentage of missing responses for each item ranged from 2% to 8%. We did not find any clear pattern of missing or skipped items.

Exploratory Factor Analysis

Eigenvalues extracted from the polychoric correlation matrix of the 35 items revealed that seven eigenvalues were greater than 1, and the seven-factor solution explained 76% of the variance. The seven factors stemming from the EFA aligned well with the seven organizational domains in our conceptual framework. (The factor loadings are available in the online supplement.) On the basis of this evidence, we concluded that the seven-factor model was the final model.

Seven of the 35 items were not retained in the final instrument because they did not meet the criteria for retention. The resulting instrument contained 28 items (Table 2).

TABLE 2. Factor loadings for the seven-factor solution of the final items on the Recovery Climate and Culture Scale, by organizational domain

Domain and item # | Item | F1 | F2 | F3 | F4 | F5 | F6 | F7 | Item-total correlation
Expectations
 1 | To what extent do your coworkers in the [Psychosocial Rehabilitation and Recovery Center] PRRC expect each other to educate PRRC participants about their rights as citizens in the larger community? | .68b* | .14 | .08 | .12 | .04 | .07 | −.01 | .65
 2 | To what extent do your coworkers in the PRRC expect each other to facilitate relationship-building among PRRC participants so that those who are more advanced in their own recovery process will serve as role models or mentors for their peers? | .82b* | −.05 | .05 | .01 | .02 | −.06 | .03 | .54
 3 | To what extent do your coworkers in the PRRC expect each other to deliver services in a way that is sensitive to each PRRC participant’s ethnic background, race, sexual orientation, religious beliefs, and gender? | .71b* | .20* | −.01 | −.04 | −.07 | .01 | .06 | .52
 4 | To what extent do your coworkers in the PRRC expect each other to facilitate PRRC participants’ involvement in community activities and social networks outside the mental health system? | .79b* | .01 | .03 | −.04 | .10 | .13 | .08 | .57
 5 | To what extent do your coworkers in the PRRC expect each other to build on PRRC participants’ strengths and capabilities as the foundation of their participation in the PRRC program? | .88b* | .01 | −.01 | .15* | −.06 | −.06 | −.02 | .60
Values
 6 | How important is it to you personally that the PRRC continues to work with PRRC participants even when they refuse certain other treatments (e.g., medication, inpatient hospitalization)? | −.05 | .15 | .00 | .64b* | −.05 | −.05 | .08 | .27
 7 | How important is it to you personally that PRRC staff support the decisions and choices of PRRC participants even when staff have concerns about possible negative consequences? | .09 | −.05 | .01 | .97b* | .02 | −.01 | −.03 | .35
 8 | How important is it to you personally that the PRRC includes staff members who themselves are people in mental health recovery? | −.02 | .06 | .03 | .4b* | .15 | .09 | .11 | .23
Leadership
 9 | Thinking about the person who is your current PRRC team lead, how strongly does he or she emphasize including PRRC participants in decisions about the PRRC? | .01 | .79b* | .02 | −.03 | .02 | −.05 | .14* | .71
 10 | Thinking about the person who is your current PRRC team lead, how strongly does he or she emphasize including in the care planning and recovery process family members and others who are important to PRRC participants? | −.08 | .89b* | .04 | −.01 | −.01 | −.13 | −.01 | .67
 11 | Thinking about the person who is your current PRRC team lead, how strongly does he or she emphasize giving attention to all life areas of PRRC participants, including health, home, purpose, and community? | .07 | .92b* | .01 | .01 | −.08 | −.07 | .03 | .72
 12 | Thinking about the person who is your current PRRC team lead, to what extent does he or she actively advocate for the use of recovery-promoting practices throughout the whole VA facility? | .01 | .82b* | .02 | .08 | .11 | .20* | .04 | .69
 13 | Thinking about the person who is your current PRRC team lead, to what extent does he or she demonstrate recovery promotion in practice? | .22* | .77b* | −.01 | .01 | .14 | .22* | −.03 | .67
Education and training
 14 | To what extent do you receive (or have you received) education/training that specifically focuses on how to develop individualized PRRC care plans that are client driven? | .10 | .05 | .80b* | −.07 | .06 | −.11 | −.23* | .59
 15 | To what extent do you receive (or have you received) education/training that specifically focuses on how to work supportively with PRRC participants when they are not adhering to treatments they have agreed to use? | .05 | .10 | .87b* | .01 | −.02 | −.05 | −.21* | .65
 16 | To what extent do you receive (or have you received) education/training that specifically focuses on how to teach PRRC participants to advocate for their own wellness? | .03 | −.12* | 1.00b* | .03 | .00 | .04 | .05 | .70
 17 | To what extent do you receive (or have you received) education/training that specifically focuses on how to make PRRC participants feel comfortable and safe in the program? | −.05 | −.06 | .99b* | .03 | −.03 | .00 | .05 | .65
 18 | To what extent do you receive (or have you received) education/training that specifically focuses on how to connect PRRC participants with natural supports in the community? | −.05 | .04 | .91b* | .02 | .02 | .06 | .05 | .70
Rewards
 19 | To what extent are PRRC staff rewarded for championing recovery-promoting principles and practices in the PRRC? | .04 | .05 | −.04 | .06 | .93b* | −.02 | −.03 | .55
 20 | To what extent are PRRC staff rewarded for promoting a holistic approach in the PRRC, including attention to health, home, purpose, and community? | .00 | .03 | .00 | −.03 | .98b* | .01 | .00 | .54
 21 | To what extent are PRRC staff rewarded for promoting cultural sensitivity within the PRRC? | .00 | −.03 | .07* | .00 | .94b* | −.05 | .04 | .55
Policy
 22 | Does this PRRC have a formal policy specifying that veterans who use the PRRC are asked to participate in the development and modification of program policies and procedures? | .03 | −.12 | .04 | .00 | −.15 | .67b* | −.07 | .35
 23 | Does this PRRC have a formal policy specifying that veterans who wish to use the PRRC program may do so even if they are refusing or noncompliant with other treatment? | −.07 | −.04 | −.03 | −.11 | .01 | .72b* | .09 | .28
 24 | Does this PRRC have a formal policy specifying that veterans with lived experience with serious mental illness have a role in PRRC program quality improvement and evaluation? | .01 | .04 | −.08 | .02 | −.23 | .80b* | −.06 | .39
Quality improvement
 25 | To what extent does this PRRC actively solicit feedback from PRRC participants by using surveys that go beyond completing program evaluation forms or providing access to a suggestion box? | .16* | .14 | −.02 | .05 | −.01 | −.20* | .55b* | .56
 26 | To what extent does this PRRC actively solicit feedback from PRRC participants in meetings convened specifically for that purpose, such as focus groups, roundtable discussions, and community meetings? | .32* | −.03 | −.01 | −.02 | .05 | −.06 | .72b* | .60
 27 | To what extent does this PRRC use peers to actively solicit feedback about the PRRC from PRRC participants? | .00 | .16 | .17* | .01 | .17* | .06 | .54b* | .63
 28 | To what extent does this PRRC use feedback from PRRC participants as the basis for making changes in how services are delivered in this PRRC, such as the intake process, staff-veteran roles, and the physical setup of the PRRC? | .04 | .16 | .20* | .00 | −.03 | −.03 | .70b* | .67

aEigenvalues for factors (F) 1, 12.46; F2, 2.65; F3, 2.05; F4, 1.81; F5, 1.56; F6, 1.33; F7, 1.14. Exploratory factor analysis was conducted by using weighted least squares with mean and variance adjustment and pairwise deletion followed by geomin rotation. Goodness-of-fit statistics in the seven-factor model: χ2=295.96, df=203, p<.001; comparative fit index=.995; Tucker-Lewis Index=.990; root mean square error of approximation=.043, 90% confidence interval=.032, .053; standardized root mean square residual=.028.

bFactor loading >.4, meaning 16% of the item variance could be explained by this factor, considered a strong effect between the item and factor.

*p<.05.


Internal Consistency

The internal consistency reliability (Cronbach’s α) of the final 28 items was 0.81 overall, and the reliability of the individual subscales ranged from 0.84 to 0.88 (Table 3). Cronbach’s alpha values for all subscales were above 0.8 and were considered good. Correlation among the subscales ranged from 0.16 to 0.61. No correlations were greater than 0.9, which indicated the unique content of each factor.

TABLE 3. Internal consistency reliability and correlations for domain subscales of the Recovery Climate and Culture Scalea

Subscale | N of items | M | SD | Cronbach’s α | Correlations: Expectations | Values | Leadership | Education and training | Rewards | Policy | QI
Total scale | 28 | 3.80 | .75 | .81
Expectations | 5 | 4.44 | .77 | .85 | 1
Values | 3 | 4.42 | .64 | .88 | .32 | 1
Leadership | 5 | 4.19 | .98 | .84 | .59 | .32 | 1
Education and training | 5 | 3.86 | 1.09 | .85 | .56 | .27 | .52 | 1
Rewards | 3 | 2.65 | 1.33 | .86 | .29 | .20 | .45 | .32 | 1
Policy | 3 | 3.19 | 1.61 | .87 | .20 | .16 | .32 | .34 | .28 | 1
Quality improvement (QI) | 4 | 3.90 | 1.06 | .84 | .54 | .30 | .61 | .50 | .38 | .40 | 1

aAll Pearson correlation coefficients for correlations between domains were significant (p<.05). All but 3 items were rated on a scale of 1 to 5, with higher responses indicating greater presence of the recovery element measured by the item in the program. Responses for the 3 dichotomous items, coded 0 or 1, were recoded to 1 or 5, respectively, to convert them to the same response scale as the rest of the items.


Sensitivity Analyses

We found no difference in overall score by discipline; however, we found a small, yet statistically significant, absolute difference in overall score by supervisory status. Nonsupervisors had an average overall score of 3.73±0.05 on a 5-point scale, compared with 4.03±0.10 for supervisors (p=0.012).

Site Visit Validation

Site visitors correctly identified the two sites ranked highest and the two sites ranked lowest. Further information about the site visit component of the study is reported elsewhere (41).

Discussion

In this study, we developed and validated a psychometrically sound instrument to measure recovery climate and culture in mental health programs. To our knowledge, this is the first instrument to use both a comprehensive framework guided by organizational theory on climate and culture and an empirically based conceptualization of the multiple dimensions of mental health recovery. It measures key facets of climate and culture, such as leadership and rewards, which can be targeted for intervention to achieve recovery-oriented service delivery.

The two individual items with the highest average response were items 3 (“To what extent do your coworkers in the PRRC expect each other to deliver services in a way that is sensitive to each PRRC participant’s ethnic background, race, sexual orientation, religious beliefs, and gender?” [item 6 in the original 35]) and 6 (“How important is it to you personally that the PRRC continues to work with PRRC participants even when they refuse certain other treatments?” [item 10]). This suggests that these elements have been the easiest for PRRC programs to adopt when seeking a recovery orientation. The content in these items aligns with the recovery dimensions of cultural sensitivity, empowerment, and self-direction set forth by SAMHSA (21). They also map to organizational domains of staff expectations (item 3) and staff values (item 6).

On the other hand, the two items with the lowest average response were items 20 (“To what extent are PRRC staff rewarded for promoting a holistic approach in the PRRC, including attention to health, home, purpose, and community?” [item 27]) and 21 (“To what extent are PRRC staff rewarded for promoting cultural sensitivity within the PRRC?” [item 28]). These items reside in the organizational domain of staff rewards, suggesting that rewards may be an underutilized area of development in the programs we studied. Also of note, the item about rewarding staff for promoting cultural sensitivity (item 21) was rated low while the item about staff expectations for being culturally sensitive and inclusive (item 3) was rated high. This dichotomy in the responses for these two items indicates that in the PRRCs we studied, staff expected each other to demonstrate cultural sensitivity, but reward systems for supporting and encouraging cultural sensitivity were not prominent.

The example of misalignment between staff expectations for cultural sensitivity and program reward structures that promote cultural sensitivity demonstrates how this tool, based on organizational theory of climate and culture, could be used in helping organizations develop a recovery orientation. Users of the instrument could assess program scores in each organizational domain and determine the areas in which the program performs well and areas that warrant improvement. These data could inform the program about how to target change efforts. In the example highlighted above, PRRCs could revamp staff reward and recognition practices to better align with recovery principles, such as by publicly recognizing staff for their contributions to recovery-oriented service, instituting achievement awards for promoting recovery principles, or encouraging medical center leadership to send a thank you note to staff for promoting elements of recovery in practice (42).

The impetus for developing this instrument was to advance research on the relationships between climate and culture and recovery-oriented outcomes. As the preceding discussion illustrates, the instrument also has the potential to identify domains of organizational climate and culture that leaders can use to transform their organizations into recovery-oriented care systems. For that reason, this instrument could prove useful in VA mental health settings other than PRRCs, such as mental health outpatient programs, behavioral health interdisciplinary program teams, and residential treatment programs.

This study had several limitations. First, the findings are not directly generalizable to other mental health settings because we conducted this study only in the VA. However, testing the instrument in the VA afforded access to a national sample of program staff operating under one set of policies and procedures, which would be difficult to achieve in a national sample of mental health programs. Community mental health centers, for example, are heavily influenced by state-specific mental health policies that would pose challenges to testing the instrument. Furthermore, many of the instrument’s items were adapted from instruments developed outside the VA. For these reasons, it is likely our findings will translate to non-VA settings.

Second, the study had a 36% response rate, so responses may not be representative of the entire PRRC staff population. However, such a response rate is typical of other studies (43–45). Third, we eliminated seven items to obtain unambiguous factor loadings and ensure parsimonious scales in the final instrument. Although the content in these items was determined to be important in the initial phases of instrument development, these items did not perform well in psychometric analysis. Future users of the instrument may consider capturing this content using another data collection approach, such as semistructured interviews. Fourth, we did not conduct a confirmatory factor analysis (CFA) following the EFA because we lacked sufficient respondents to split the sample. Without conducting a CFA, we cannot confirm that the factor structure identified in the EFA was correct, and we cannot determine whether each factor was unidimensional. Finally, this study focused on the content validity and internal consistency of the instrument. Future research should assess its construct validity using external criterion variables, including personal recovery outcomes.

Conclusions

In this study, we developed a psychometrically tested and validated instrument to measure recovery-promoting climate and culture in mental health programs. The instrument is a necessary first step for conducting research on the extent to which recovery climate and culture drive recovery-oriented service delivery and individual-level outcomes. Recovery climate and culture measures should be incorporated into evaluations of the recovery orientation of mental health programs, and this instrument offers a tool for doing so.

Division of Health and Environment, Abt Associates, Inc., Cambridge, Massachusetts (Evans); Center for Healthcare Organization and Implementation Research (CHOIR) (Wewiorski, Ellison) and Social and Community Reintegration Research Program (Gorman), Edith Nourse Rogers Memorial Veterans Hospital, Bedford, Massachusetts; Department of Health Law, Policy, and Management, Boston University School of Public Health, Boston (Ni, Charns); CHOIR, U.S. Department of Veterans Affairs Boston Healthcare System, Boston (Harvey, Charns); Department of Psychiatry, University of Massachusetts Medical School, Worcester (Ellison); VISN 1 Mental Illness Research, Education and Clinical Center, U.S. Department of Veterans Affairs Connecticut Healthcare System, West Haven, Connecticut, and Department of Psychiatry, Yale University School of Medicine, New Haven, Connecticut (Hunt); Department of Psychiatry, Boston University School of Medicine, Boston (Gorman). Dr. Evans was with CHOIR when the research was conducted.
Send correspondence to Dr. Evans.

This material is based on work supported by Health Services Research and Development, Office of Research and Development, Veterans Health Administration (PPO 16-135). Manuscript development was partly supported by funding from Abt Associates’ Health Services Market Center.

The authors report no financial relationships with commercial interests.

The authors thank Allie Silverman and Jacquelyn Pendergast for their research support and Laurel Radwin and Michael Shwartz for their guidance and input on survey development processes.

The views expressed in this article are those of the authors and do not necessarily reflect the position or policy of the U.S. Department of Veterans Affairs or the U.S. government.

References

1 Anthony WA: Recovery from mental illness: the guiding vision of the mental health service system in the 1990s. Psychosoc Rehabil J 1993; 16:11–23

2 Davidson L, Drake RE, Schmutte T, et al.: Oil and water or oil and vinegar? Evidence-based medicine meets recovery. Community Ment Health J 2009; 45:323–332

3 Davidson L, Tondora J, O’Connell MJ, et al.: Creating a recovery-oriented system of behavioral health care: moving from concept to reality. Psychiatr Rehabil J 2007; 31:23–31

4 Davidson L, O’Connell M, Tondora J, et al.: The top ten concerns about recovery encountered in mental health system transformation. Psychiatr Serv 2006; 57:640–645

5 Goldberg RW, Resnick SG: US Department of Veterans Affairs (VA) efforts to promote psychosocial rehabilitation and recovery. Psychiatr Rehabil J 2010; 33:255–258

6 Davidson L: The recovery movement: implications for mental health care and enabling people to participate fully in life. Health Aff (Millwood) 2016; 35:1091–1097

7 Jacobson N: In Recovery: The Making of Mental Health Policy. Amsterdam, VU University Press, 2004

8 Corrigan PW, Salzer M, Ralph RO, et al.: Examining the factor structure of the Recovery Assessment Scale. Schizophr Bull 2004; 30:1035–1041

9 McNaught M, Caputi P, Oades LG, et al.: Testing the validity of the Recovery Assessment Scale using an Australian sample. Aust N Z J Psychiatry 2007; 41:450–457

10 Sklar M, Sarkin A, Gilmer T, et al.: The psychometric properties of the Illness Management and Recovery Scale in a large American public mental health system. Psychiatry Res 2012; 199:220–227

11 Andresen R, Caputi P, Oades L: Stages of Recovery Instrument: development of a measure of recovery from serious mental illness. Aust N Z J Psychiatry 2006; 40:972–980

12 Drapalski AL, Medoff D, Unick GJ, et al.: Assessing recovery of people with serious mental illness: development of a new scale. Psychiatr Serv 2012; 63:48–53

13 Jerrell JM, Cousins VC, Roberts KM: Psychometrics of the recovery process inventory. J Behav Health Serv Res 2006; 33:464–473

14 Burgess P, Pirkis J, Coombs T, et al.: Assessing the value of existing recovery measures for routine use in Australian mental health services. Aust N Z J Psychiatry 2011; 45:267–280

15 Cavelti M, Kvrgic S, Beck EM, et al.: Assessing recovery from schizophrenia as an individual process: a review of self-report instruments. Eur Psychiatry 2012; 27:19–32

16 Shanks V, Williams J, Leamy M, et al.: Measures of personal recovery: a systematic review. Psychiatr Serv 2013; 64:974–980

17 Bedregal LE, O’Connell M, Davidson L: The Recovery Knowledge Inventory: assessment of mental health staff knowledge and attitudes about recovery. Psychiatr Rehabil J 2006; 30:96–103

18 Armstrong NP, Steffen JJ: The Recovery Promotion Fidelity Scale: assessing the organizational promotion of recovery. Community Ment Health J 2009; 45:163–170

19 O’Connell M, Tondora J, Croog G, et al.: From rhetoric to routine: assessing perceptions of recovery-oriented practices in a state mental health and addiction system. Psychiatr Rehabil J 2005; 28:378–386

20 Onken SJ, Dumont JM, Ridgway P, et al.: Mental Health Recovery: What Helps and What Hinders? A National Research Project for the Development of Recovery Facilitating System Performance Indicators Phase One Research Report: A National Study of Consumer Perspectives on What Helps and Hinders Recovery. Alexandria, VA, National Association of State Mental Health Program Directors, National Technical Assistance Center, 2002. https://www.researchgate.net/profile/Steven_Onken/publication/242469660_Mental_Health_Recovery_What_Helps_and_What_Hinders_A_National_Research_Project_for_the_Development_of_Recovery_Facilitating_System_Performance_Indicators_Phase_One_Research_Report_A_National_Study_of_/links/00b4953b5bb2ca0f00000000.pdf

21 SAMHSA’s Working Definition of Recovery. Rockville, MD, Substance Abuse and Mental Health Services Administration, 2012. https://store.samhsa.gov/system/files/pep12-recdef.pdf

22 Fisher DB: Towards a Positive Culture of Healing; in The DMH Core Curriculum: Consumer Empowerment and Recovery, Part 1. Boston, Commonwealth of Massachusetts Department of Mental Health, 1993

23 Jacobson N, Greenley D: What is recovery? A conceptual model and explication. Psychiatr Serv 2001; 52:482–485

24 Le Boutillier C, Leamy M, Bird VJ, et al.: What does recovery mean in practice? A qualitative analysis of international recovery-oriented practice guidance. Psychiatr Serv 2011; 62:1470–1476

25 Ehrhart MG, Schneider B, Macey WH: Organizational Climate and Culture: An Introduction to Theory, Research, and Practice. Abingdon, UK, Routledge, 2013

26 Ragins M: A Recovery Culture Progress Report. Los Angeles, Mental Health America of Los Angeles, 2009. https://rickpdx.files.wordpress.com/2013/12/87arecoverycultureprogressreport.pdf

27 Ridgway P, Press A: Assessing the Recovery-Orientation of Your Mental Health Program: A User’s Guide for the Recovery-Enhancing Environment Scale (REE). Lawrence, University of Kansas, 2004

28 Moos RH, Houts PS: Assessment of the social atmospheres of psychiatric wards. J Abnorm Psychol 1968; 73:595–604

29 Litwin GH, Stringer RA: Motivation and Organizational Climate. Cambridge, MA, Harvard Business School, Division of Research, 1968

30 Singer S, Meterko M, Baker L, et al.: Workforce perceptions of hospital safety culture: development and validation of the Patient Safety Climate in Healthcare Organizations Survey. Health Serv Res 2007; 42:1999–2021

31 Thomas C, Ward M, Chorba C, et al.: Measuring and interpreting organizational culture. J Nurs Adm 1990; 20:17–24

32 Gaston EH: Developing a motivating organizational climate for effective team functioning. Hosp Community Psychiatry 1980; 31:407–412

33 Ellison ML, Belanger LK, Niles BL, et al.: Explication and definition of mental health recovery: a systematic review. Adm Policy Ment Health Ment Health Serv Res 2018; 45:91–102

34 Burgess P, Pirkis J, Coombs T, et al.: Review of Recovery Measures. Parramatta, New South Wales, Australian Mental Health Outcomes and Classification Network, 2010

35 Khanam D, McDonald K, Williams Neils C: Measuring Recovery: A Toolkit for Mental Health Providers in New York City. New York, Bureau of Mental Health, NYC Department of Health and Mental Hygiene, 2013. https://facesandvoicesofrecovery.org/file_download/inline/ce1a7768-1987-47e5-a0a4-90b5c0d36d54

36 Uniform Mental Health Services in VA Medical Centers and Clinics. Report no 1160.01. Washington, DC, US Department of Veterans Affairs, 2008. https://www.va.gov/vhapublications/ViewPublication.asp?pub_ID=1762

37 Psychosocial Rehabilitation and Recovery Centers (PRRC). Report no 1163.03. Washington, DC, US Department of Veterans Affairs, 2011. https://www.va.gov/vhapublications/ViewPublication.asp?pub_ID=2428

38 Dillman DA: Mail and Internet Surveys: The Tailored Design Method–2007 Update With New Internet, Visual, and Mixed-Mode Guide. New York, Wiley, 2011

39 Cronbach LJ, Shavelson RJ: My current thoughts on coefficient alpha and successor procedures. Educ Psychol Meas 2004; 64:391–418

40 Schneider B, Ehrhart M, Macey W: Perspectives on organizational climate and culture; in APA Handbook of Industrial and Organizational Psychology. Washington, DC, American Psychological Association, 2011

41 Wewiorski NJ, Gorman JA, Ellison ML, et al.: A site visit protocol for assessing recovery promotion at the program level: an example from the Veterans Health Administration. Psychiatr Rehabil J 2019; 42:323–328

42 Phillips H, Bogdanich I, Carter K, et al.: Exploring novel approaches to staff rewards and recognition. Hosp Pharm 2017; 52:729–731

43 Huang G, Muz B, Kim S, et al.: 2017 Survey of Veteran Enrollees’ Health and Use of Health Care: Data Findings and Final Report. Rockville, MD, Westat, 2018. https://www.va.gov/HEALTHPOLICYPLANNING/SOE2017/VA_Enrollees_Report_Data_Findings_Report2.pdf

44 Helfrich CD, Dolan ED, Simonetti J, et al.: Elements of team-based care in a patient-centered medical home are associated with lower burnout among VA primary care employees. J Gen Intern Med 2014; 29(suppl 2):S659–S666

45 Teclaw R, Price MC, Osatuke K: Demographic question placement: effect on item response rates and means of a Veterans Health Administration survey. J Bus Psychol 2012; 27:281–290