Published Online: https://doi.org/10.1176/appi.ps.201400147

Abstract

Objective:

The purpose of this study was to examine influences on the sustainability of a program to implement an evidence-based psychotherapy in a mental health system.

Methods:

Interviews with program administrators, training consultants, agency administrators, and supervisors (N=24), along with summaries of program evaluation data and program documentation, were analyzed with a directed content-analytic approach.

Results:

Findings suggested a number of interconnected and interacting influences on sustainability, including alignment with emerging sociopolitical influences and system and organizational priorities; program-level adaptation and evolution; intervention flexibility; strong communication, collaboration, planning, and support; and perceived benefit. These individual factors appeared to mutually influence one another and contribute to the degree of program sustainability achieved at the system level. Although most influences were positive, financial planning and support emerged as potentially both facilitator and barrier, and evaluation of benefits at the patient level remained a challenge.

Conclusions:

Several factors appeared to contribute to the sustainability of a psychosocial intervention in a large urban mental health system and warrant further investigation. Understanding interconnections between multiple individual facilitators and barriers appears critical to advancing understanding of sustainability in dynamic systems and adds to emerging recommendations for other implementation efforts. In particular, implications of the findings include the importance of implementation strategies, such as long-term planning, coalition building, clarifying roles and expectations, planned adaptation, evaluation, diversification of financing strategies, and incentivizing implementation.

As programs to implement evidence-based psychotherapies (1–5) in mental health systems have proliferated and matured, the need for research to understand the conditions and processes necessary to sustain them also has increased. Sustainability is a multidimensional construct that has been conceptualized as being embedded within a system and involving the application of a program or an approach with ongoing fidelity, ongoing benefits to consumers, the continuation of program activities, continuous or increased capacity, and the extent to which partnerships are maintained (6,7). Recent research suggests mixed results for some aspects of sustainability, such as the degree of fidelity and penetration into the system (8,9). However, other reports of statewide implementations have documented sustained program activities, capacity, and partnerships. Consistent with implementation models (10), these reports suggest the importance of factors in the inner and outer context, such as a favorable political climate for policy reform and mandated change (2,11), which are coupled with a multifaceted strategy, including collaboration with stakeholders and sustainable funding (5,12). Much of the literature has focused on national or statewide initiatives, and little guidance exists for local mental health systems and large behavioral health organizations that are implementing these treatments (13) in the absence of a broader mandate or policy. Furthermore, despite an increasing emphasis on dynamic relationships between these factors in the implementation literature (6,10), many studies to date have described individual contributing factors but without much discussion of how their interrelationships may influence implementation outcomes. An exploration of mutual influence can facilitate the development of a comprehensive and successful plan to promote sustainability in other local systems.

This study focused on a seven-year-old community-based partnership to implement cognitive therapy (CT), with the purpose of understanding factors that may influence the continuation of program activities to promote and support implementation. Evidence that this program has been sustained includes continued funding, ongoing involvement among the majority of agencies that completed training, enrollment of new agencies, and expansions to offer CT to additional populations and settings (14). Also, although employee turnover affected sustainability at the clinic level, a high proportion of the clinicians who remained at their clinics two years after CT training in a community-based implementation program demonstrated sustained CT fidelity (Stirman SW, Pontoski K, Creed T, et al., unpublished manuscript, 2014).

Methods

Setting and Context

In 2007, the Beck Initiative (BI) was formed as a partnership between the Philadelphia Department of Behavioral Health and Intellectual disAbility Services (DBHIDS) and the University of Pennsylvania (Penn) to implement, within the city’s behavioral health provider agencies, a CT program for depression and suicide prevention and for disorders that commonly co-occur with depression (13,14). DBHIDS is an urban mental health system with over 300 provider agencies that provide mental health and substance abuse services to the city’s 470,000 Medicaid recipients and thousands of uninsured and underinsured individuals (13,14).

Data Sources

Analysis included interviews with key informants at the administrative level at DBHIDS, Penn, and participating agencies, as well as a document review. In-depth, semistructured interviews were conducted with clinic administrators as training began and again 12–24 months postconsultation. Interviews with system administrators and Penn BI personnel occurred four years after the program began. The primary interviewer was not affiliated with the BI. Interview guides (available by request) were developed to assess interrelated aspects of sustainability related to the outer context (such as in the service environment) and the inner context (including intraorganizational characteristics) and to assess intervention characteristics in the implementation and “sustainment” phases of Aarons and colleagues’ (10) implementation framework and other research and theory on sustainability (6,10,12,15). Study investigators developed targeted questions and probes by mapping questions onto each key construct (16), with a framing question (17) included at the beginning to orient participants and allow them to express opinions about key facilitators and barriers to the program’s sustainability.

A document review was conducted with 100% of the existing BI documentation to complement the interviews and provide additional insight into policies and processes that may have influenced sustainability. Available documents included publicity and Web-based materials, notes, minutes, and all available summaries from BI workshops and consultation sessions (N=16), internal meetings (N=193), BI policy documentation and reports between 2007 and 2012 (N=10), and reports and publications (N=6).

Analysis

We used a directed content analysis (16) approach to validate and extend existing theory on sustainability. Digitally recorded interviews were transcribed and checked for accuracy. We developed a preliminary codebook of operationalized, a priori codes based on existing theory and research. Using NVivo 10 software, three authors (SWS, AM, and JG) coded transcripts from different stakeholders and identified additional codes through consensus to finalize the codebook and specify links between key concepts. Two coders (AM and JG) who were not affiliated with the BI then coded all transcripts. A subsample was coded by all coders, who also audited each other’s work and resolved disagreements via consensus in weekly meetings. Policies, internal documentation, and summary memos based on reviews of the meeting minutes were coded with the same codebook and process. Queries were then run to identify codes and relationships related to program sustainability. Central themes were determined by diversity and triangulation across data sources, by the frequency with which potential influences were identified, and by attention to key decision points and interactions described in data sources.
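For readers who want a concrete picture of the tallying and triangulation step described above, the following is a minimal, illustrative sketch in Python. The export format (code, stakeholder, group, source) and all records are hypothetical; this is not the study’s actual analytic pipeline.

```python
# Illustrative tally of coded segments: count distinct stakeholders endorsing each
# a priori code and flag codes that appear in more than one data-source type
# (a simple proxy for triangulation). All records below are hypothetical.
from collections import defaultdict

# Each record: (code, stakeholder_id, stakeholder_group, data_source)
coded_segments = [
    ("collaboration_communication", "s01", "system_admin", "interview"),
    ("collaboration_communication", "s01", "system_admin", "meeting_minutes"),
    ("intervention_flexibility", "p02", "penn_admin", "interview"),
    ("financial_support", "a07", "agency_admin", "policy_document"),
]

endorsers = defaultdict(set)  # code -> stakeholder ids endorsing it
sources = defaultdict(set)    # code -> data-source types in which it appears

for code, stakeholder, group, source in coded_segments:
    endorsers[code].add(stakeholder)
    sources[code].add(source)

for code in sorted(endorsers):
    print(f"{code}: {len(endorsers[code])} stakeholder(s), "
          f"{len(sources[code])} source type(s), "
          f"triangulated={len(sources[code]) > 1}")
```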

Results

Out of 28 administrators and training consultants available at the time of the interviews, 24 participated in interviews specifically about their experiences with the BI, yielding an 86% response rate. Three additional training consultants and one DBHIDS administrator left the system after the first year of the initiative; because no additional themes were identified by the end of our interviews with the existing sample (17), we did not solicit their participation. Thus 75% of all administrators and training consultants who had been involved with the BI were interviewed. The sample comprised DBHIDS (N=3) and Penn (N=4) administrators, training consultants (N=8), and agency administrators and supervisors (N=9). Participants identified as Caucasian (75%, N=18), African American (17%, N=4), and Asian (8%, N=2). Sixty-two percent of participants were female (N=15), and 38% were male (N=9); 54% (N=13) had a master’s degree, and 46% (N=11) had an M.D. or Ph.D. degree. Data on participant ages were not collected.
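The reported rates follow directly from the counts in this paragraph; the snippet below is a minimal check using only those counts, with rounding to whole percentages as in the text.

```python
# Verify the reported response rate and demographic percentages from the stated counts.
interviewed = 24
available = 28                 # administrators and consultants available at interview time
ever_involved = available + 4  # four additional people had left the system earlier

print(f"Response rate among available informants: {interviewed / available:.0%}")     # 86%
print(f"Coverage of all ever-involved informants: {interviewed / ever_involved:.0%}")  # 75%

groups = {"Caucasian": 18, "African American": 4, "Asian": 2,
          "Female": 15, "Male": 9, "Master's degree": 13, "M.D. or Ph.D.": 11}
for label, n in groups.items():
    print(f"{label}: {n}/{interviewed} = {n / interviewed:.0%}")
```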

Analyses revealed key interconnections between constructs identified in theories of sustainability, which are described below and further illustrated in Figure 1 and Table 1. Table 2 describes the number of stakeholders in each category who endorsed each theme.

FIGURE 1. Sustainability construct for implementation of the Beck Initiative cognitive therapy (CT) program for depression

TABLE 1. Stakeholder perspectives on sustainability of the Beck Initiative (BI) cognitive therapy (CT) program for depression

Influence and examples^a | Interactions
Collaboration and communication | Facilitated program adaptation and evolution, planning and support, and evaluation
 DBHIDS administrator: Having someone on our staff [who] is dedicated to ensuring the coordination and collaboration of agencies, Penn, and DBHIDS has played a huge role in [the initiative’s] being successful.
 Penn administrator: The agencies really learn from each other. . . . [T]hey come early and stay late and are . . . chatting with each other about potential problems or questions. . . . And we find that the more senior agencies, like the ones that went through it more in the beginning, really have a lot of great tips and suggestions for the newer agencies because they’ve been through it, and they’ve kind of been in their shoes. I think that’s a great way to keep them connected and feeling like they’re part of a group as opposed to isolated agencies just doing CT work.
 Agency administrator: We’ve had these arranged meetings between DBHIDS and administrators here, and of course the folks from Penn, to make sure we are doing things that make sense and making course corrections when necessary.
Alignment with sociopolitical influences and system priorities | Facilitated program adaptation and evolution; facilitated by intervention fit and flexibility, perceived benefit, and collaboration
 DBHIDS administrator: The [politics and] advocacies and community-based pressure to improve and enhance services for children certainly played a part in wanting to extend the Beck Initiative to [school settings and] other child providers. [Also] the mayor’s initiative around ending homelessness. . . . [T]hat wasn’t . . . initially a stated goal for the city in terms of behavioral health priority. So very quickly it became a priority in [the] behavioral health system in accord with the mayor’s initiative. So there’s another opportunity where political and “critical” need aligned with the opportunity to use the Beck Initiative.
 DBHIDS administrator: Over time, the focus of the network has changed, and . . . it’s been important to be flexible and to go with those needs within the network and sort of [be] there as partners. I’m afraid that if we had stayed in that adult outpatient box, I don’t think [the BI] would still be where it is today. Maybe it would be, but not [at] such a broad level. It would have been more of an afterthought. I think for it to truly be an example for future and current evidence-based practices, . . . it’s been pivotal that it’s able to reach so many different levels of care and so many different diagnostic areas. There are other evidence-based practices that we are currently implementing or looking to implement that are not as broad as CT, and the question about sustaining them over time in the network has been brought up. They are more short term, like, “What are we going to do once the contract [for training] ends?”
Intervention flexibility | Facilitated alignment with sociopolitical influences, system needs, and program evolution; interacted with planning and technical support
 BI training consultant: The types of issues [we addressed] were issues where there had been work done with cognitive therapy to address those issues. Everything from anger and bipolar disorder and . . . significant anxiety. Certainly there was some depression, psychosis—and these were all areas where people had done work to develop [CT], so we were really just talking about how to apply interventions and conceptualize around those issues and across multiple issues as well.
 Agency administrator: [The new BI program] followed the [initial] training, and about a year later we started that group [for a specific population]. [The BI training consultant] brought the material to us. She brought the concepts and reintroduced the group at the [agency name], and we were able to mold and shape the group to fit our specific program, and it’s been pretty successful ever since.
Program adaptation and evolution | Facilitated perceived benefit; facilitated by evaluation and perceived benefit, sociopolitical influences, intervention fit, and flexibility
 Penn training consultant: We have . . . a pretty fluid process where we were asking for lots of feedback from primarily the therapists during the training—lots of feedback about how is this fitting for you. How does this apply with somebody that you would work with . . . so . . . a lot of the adaptation happens in the training because it is a very bidirectional thing. It’s very conversation guided. I think that the therapists often initially looked surprised that [we] were asking. I think it’s not a typical experience for them in training, but [it’s] then ultimately received really well. Feedback that we receive . . . is that they really appreciate that we want to make sure that it is a fit for all of the realities of where they are.
 Agency administrator: The Beck folks were very flexible. It’s terribly important to take this model and see how we can adapt it [here]. We first try to do the model as true as we can, but as time [goes] on to see how we can make it work so we can use it more effectively, more often here.
Evaluation and perceived benefit | Facilitated program adaptation and interaction; planning and support (financial); facilitated by communication and collaboration; program adaptation and evolution; planning and support (technical)
 BI training consultant: [Clients at the agency] . . . made some improvements. I think some people were really impressed with some of the improvements that they saw. Others I think it was only smaller gains, but in general I think people felt positive about what was happening.
 DBHIDS administrator: From our standpoint the first measure of success would be in number of clinicians—number and diversity and the capacity building—[so] that we’re able to say, “we’re in this many agencies, we have this many clinicians, and they’re all on these levels of care.” The next chain of outcomes that we don’t have is . . . the critical outcomes with individuals.
 DBHIDS administrator: It’s more anecdotal than I’d like, but [agency-level data are] from a mixture of sources and information. Some [come] from provider meetings that we have and some . . . from what was credentialing and what is now the Network Improvement and Accountability Collaborative reviews.
 DBHIDS administrator: It would be very nice, for example, to see how many patients are actually being treated . . . and [it] also would be interesting to know what impact it has on the patient in terms of improvement. We have attempted to get some sense of that through giving BDIs, but there have been some administration problems in terms of that, so at this moment we don’t have really good data in terms of the patients.
 Penn administrator: We keep running into problems in terms of getting statistical analyses [on DBHIDS client databases], and it has been a stumbling block for us. For a number of years we’ve been wanting more analyses, and it seems that they are understaffed and overextended.
 DBHIDS administrator: I think we are learning a lot; I just think we need to be more systematic about documenting [what we learn] and then packaging it in such a way that it can be useful to other initiatives that we’re doing in the system. But I do think that we are learning a lot. I think that our provider world is benefiting from the training. I think that we are having an impact, but again I think we need to be able to document that.
Planning and support: technical assistance | Resulted from collaboration and communication, program adaptation, and evolution; affected by planning and financial support, evaluation, and perceived benefit
 DBHIDS administrator: What has helped . . . is that Penn has been going out and visiting each of the agencies every 6 to 8 weeks, sitting at the internal group, bringing some CT material, and just being there to keep the focus.
 DBHIDS administrator: The majority of the issues that I see really have to do with how we handle the uncertainty of, that is, assuring the sustainability going forward, how we’re going to structure these things, how far out [we can] plan in terms of training, how we’re going to deal with issues of attrition and loss of capacity.
 Agency administrator: Whenever we vocalize that someone feels stuck with a certain issue, [DBHIDS administrator and Penn personnel] have been really helpful in trying to put something together for us.
 DBHIDS administrator: [Sometimes] I don’t know what we would have done without [the Web-based training]. It’s a really good way for people to get up to speed on the basics of CT to the point that they can join the internal group and be mentored [by previously trained clinicians].
Planning and support: financial support | Affected by sociopolitical influences, evaluation and perceived benefit, and program adaptation and evolution
 DBHIDS administrator: As a department we’re unique in a city this size to have the county authority and funding and [be] the policy-setting entity also in control of the Medicaid funding. . . . We have a mission and can also ensure that the funding streams are available to support it.
 DBHIDS administrator: There might be populations [for] the initiatives that we want to [reach] that we think will take several years . . . because it’s difficult to make those kinds of decisions [when funding decisions can be made only on a year-to-year basis]. . . . I think that’s one of the reasons why we had to take things in a more stepwise fashion . . . based on the finances.

^a DBHIDS, Philadelphia Department of Behavioral Health and Intellectual disAbility Services; Penn, University of Pennsylvania; BDI, Beck Depression Inventory


TABLE 2. Frequency at which stakeholders endorsed individual influences on sustaining the Beck Initiative cognitive therapy program

Influence | System administrators (N=3) | Penn administrators (N=4) | Training consultants (N=8) | Clinic administrators (N=9) | Total stakeholders who endorsed (N=24)
Collaboration and communication | 3 | 4 | 7 | 6 | 20
Alignment with sociopolitical influences and system priorities | 3 | 3 | 4 | 4 | 14
Intervention flexibility | 3 | 4 | 6 | 4 | 17
Program adaptation and evolution | 2 | 2 | 7 | 6 | 17
Evaluation and perceived benefit | 3 | 4 | 3 | 7 | 17
Planning and support: technical assistance | 3 | 4 | 8 | 7 | 22
Planning and support: financial support | 3 | 4 | 2 | 6 | 15
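As a quick consistency check on Table 2, each total equals the sum of the per-group counts; a minimal sketch using the counts copied from the table above:

```python
# Per-influence endorsement counts from Table 2, ordered as
# (system admins, Penn admins, training consultants, clinic admins).
endorsements = {
    "Collaboration and communication": (3, 4, 7, 6),
    "Alignment with sociopolitical influences and system priorities": (3, 3, 4, 4),
    "Intervention flexibility": (3, 4, 6, 4),
    "Program adaptation and evolution": (2, 2, 7, 6),
    "Evaluation and perceived benefit": (3, 4, 3, 7),
    "Planning and support: technical assistance": (3, 4, 8, 7),
    "Planning and support: financial support": (3, 4, 2, 6),
}

for influence, counts in endorsements.items():
    print(f"{influence}: {sum(counts)} of 24 stakeholders endorsed")
```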


Collaboration and Communication

The strong collaborative relationship between DBHIDS, Penn, and participating agencies was noted to be a fundamental facilitator of program sustainability by the majority of stakeholders (Table 2). As a Penn administrator stated, “With DBHIDS administration, we’ve had a very positive, very reciprocal, and very collaborative, very respectful relationship. I don’t think we could have done this without that very strong relationship with them.”

Also noted in interviews, and apparent in policy documentation and meeting minutes, were processes and policies that clarified the roles and responsibilities of BI personnel. For example, the DBHIDS program director was a key liaison between the agencies, Penn, and DBHIDS and was tasked with ensuring that appropriate expectations were set and that training and posttraining activities were running smoothly. Challenges were generally resolved by program directors and agency administrators through established procedures, without escalation to higher levels of administration. Meeting minutes and policy documents clearly indicated who would collect and maintain different types of evaluation data, which were to be shared between Penn and DBHIDS for quality improvement. This clarity helped reduce the potential problem of “too many cooks in the kitchen.” Collaborators also demonstrated trust in the abilities of all of the individuals within the partnership to fulfill their unique responsibilities. Transitions in leadership went smoothly, according to all Penn and DBHIDS administrators; for example, “If anything [after a leadership transition], we’re meeting more frequently now . . . to make sure that we’re laying down the proper rails for communication.”

The collaborative relationship extended to multiple levels within and across agencies. Meeting minutes and administrator interviews indicated that the commissioner of DBHIDS routinely discussed BI activities at networkwide meetings to share information and publicize successes. Operational meetings to assess agency needs, clarify what participation in the program entailed, and troubleshoot throughout the training phase further fostered the collaboration. Agency directors and clinical supervisors also participated in quarterly BI meetings to facilitate cross-agency communication (Table 1). Nearly all key informants noted that these strategies to promote communication and collaboration were critical to the initiative’s success. As described below and in Table 1, the collaborative relationships had a positive impact on other facilitators by promoting further planning and support, which contributed to perceived benefits and program evolution.

Sociopolitical Influences and Alignment With System and Agency Priorities

Document review, supported by interviews with all three DBHIDS administrators and other stakeholders (Table 2), indicated that the initiative was viewed as an important early component of a broader effort within the system. A DBHIDS administrator stated, “The department has made an explicit priority of identifying, implementing, disseminating, and evaluating evidence-based and emerging best practices. The Beck Initiative for a number of years has been the crown jewel of the project in this regard.” Positioning the program within these broader efforts ensured that as priorities emerged from sociopolitical influences (Table 1), the system looked for ways to expand the BI to support these goals.

Intervention Fit and Flexibility

BI training consultants and Penn and DBHIDS administrators noted that CT’s applicability to a broad range of presenting problems and populations allowed the program to evolve and change in response to sociopolitical influences and system- and agency-level needs. Administrators at four agencies noted the different programs and populations that CT reached through the BI at their agencies (Table 1). As a DBHIDS administrator explained, “There’s something about the adaptability and also the universality of cognitive therapy that has blended itself extremely well and has also expanded across the system.”

Program Adaptation and Evolution

As Figure 1 indicates, each of the aforementioned influences played a role in the ways in which the program evolved and the model was adapted. The collaborative relationship and confidence in the flexibility of CT led DBHIDS to ask the BI to address emerging system priorities and to meet the needs of the populations served by DBHIDS. Multiple data sources indicate that the program evolved primarily to offer CT to additional populations, and at times the expansion necessitated additional context-level adaptations, such as the delivery of CT outside of an outpatient setting (including inpatient stays and home visits) and the training of multiple disciplines on treatment teams (for example, psychiatric assistants and vocational specialists) to support CT delivery by clinicians. As noted by a BI training consultant (Table 1), collaboration between participating agencies, Penn, and DBHIDS ensured that these expansions went smoothly and increased perceived benefits.

Evaluation and Perceived Benefit

Meeting minutes and program evaluation data documented consistent tracking of clinicians’ satisfaction with training, fidelity scores during training and every two years thereafter, and information on the continuation of CT activities and consultation within the clinics. This evaluation indicated that training and program-level goals were largely met (Table 1). However, there was agreement that consumer-level evaluation was a challenge. Table 1 outlines efforts and challenges related to individual-level outcomes. Administrators acknowledged that a standard system for evaluation at the consumer level had not been developed and expressed a commitment to developing evaluation procedures. However, DBHIDS administrators and seven agency administrators noted important benefits to the BI, including the development of a trained workforce and benefits to consumers. These perceived benefits appeared to positively affect the collaboration and to result in ongoing support and further program expansion (Figure 1).

Planning and Support for Sustainability

The supports and policies put into place by DBHIDS and Penn leadership within the first year, and documented throughout the years in internal meeting minutes, indicate that sustaining the program was not an afterthought and that considerable effort was made to support and sustain the overall program and the use of CT within the agencies.

Ongoing technical support and consultation.

A requirement was put into place early in the initiative that participating agencies commit to ongoing internal CT consultation and periodic fidelity assessment after the training phase had ended, with ongoing support from Penn, which was perceived to be essential (Tables 1 and 2). Agencies also needed support to expand or maintain their cohort of CT providers because of turnover (Table 1). All Penn and DBHIDS administrators and four agency administrators discussed the impact of the Web-based training developed for this purpose. All DBHIDS administrators also noted plans to require both sustainability and evaluation plans in requests for proposals (RFPs) for participation in the BI or other programs, and they pointed to a new Evidence-based Practice and Innovation Center developed to support implementation of evidence-based practices throughout the system.

Financial support.

Financial support was primarily a facilitator of sustainability, although some aspects of the current funding strategy were noted to be potential and actual challenges (Table 1). Funding for program administration, training, and reimbursement to agencies for lost productivity during training and consultation activities has been provided by DBHIDS through annual reinvestment dollars accrued through DBHIDS’ administrative services organization’s effective management of the Medicaid behavioral health contract. Medicaid billing covered the delivery of CT by trained providers in most cases. Challenges that were noted included uncertainty related to availability of reinvestment funds, constraints of state Medicaid budgets, and the impact of lost revenue at the agency level associated with staff’s participation in BI activities after training ended. This uncertainty limited the scope and scale of new initiative activities at times (Table 1).

Penn and DBHIDS administrators recognized that without a financing model independent of reinvestment funds, the initiative would be in jeopardy in the absence of an annual surplus. Meeting minutes and early memos indicated that planning for such a model had been under way since the inception of the program, and in the last year of the study DBHIDS began to experiment with alternative funding models (Table 1). New models could also help agencies absorb costs related to training and ongoing consultation, given that meeting minutes and interviews with three agency administrators noted that these costs were a challenge to sustainability. In addition, because the use of evidence-based treatments is now a core expectation, a DBHIDS administrator noted that evaluation of responses to RFPs includes prioritization of providers who demonstrate that they use them, stating, “We’ve actually changed the incentives within our system such that being trained . . . puts you at a competitive advantage within the system.”

Discussion

This study identified a number of interrelated contextual factors that emerged as potential influences on sustainability of an evidence-based treatment in a city’s behavioral health system. Many findings, particularly those related to the outer context (sociopolitical influences, system needs, funding, and collaboration), as well as intervention fit and the need for ongoing support, were consistent with Aarons and colleagues’ model (10), and our findings also support Pluye and colleagues’ (18) emphasis on planning for sustainability. Some aspects, such as perceived benefits, are captured in other implementation models (19), and the concepts of evaluation and program adaptation map most closely onto Chambers and colleagues’ dynamic sustainability framework (20). That the findings do not map exactly onto a single model of implementation or sustainability suggests that different aspects of individual implementation models may be more or less relevant, depending on the context and nature of the intervention.

It is noteworthy that sociopolitical influences could have shifted priorities away from CT as new or broader initiatives occurred (21), particularly because CT implementation was not a state or national mandate. In this case, CT was not supplanted by new system-level priorities, but its flexibility allowed it to be applied to differing contexts to meet emerging needs. The need to broaden the application of CT to ensure sustainability is consistent with some of the recommendations regarding the use of practice-based efforts to adapt and optimize interventions (20,22), although evaluation of program impact on consumer outcomes has been a considerable challenge for the BI as well as other programs (23). Such evaluation remains an important future direction, especially because policy makers may require evidence of consumer-level benefits before supporting ongoing implementation. In addition, our findings suggest that to be sustained, implementation programs may require openness to mutual adaptation, which is closely linked with quality management strategies, such as the development of processes and tools for evaluation, small cycles of change and evaluation when adapting interventions, and provision of ongoing consultation (24). Although a majority of themes that we identified appeared to be positive influences on program sustainability, funding appeared to be both a facilitator and a barrier. Our findings suggest that several finance strategies are indicated for sustainability, including incentives for evidence-based practice delivery, increasing referrals, and diversifying funding sources (24). Development of a workable funding model can take time, so early planning and piloting of funding strategies can reduce the likelihood that programs will end when the initial funding source is no longer available.

Finally, our findings support the recent literature that suggests that understanding interconnections between individual influences is critical to advancing understanding of sustainability in dynamic systems (6,20). When selecting implementation strategies to promote sustainability, considering how each strategy may ultimately interact with and affect other key facilitators of sustainability is important. For example, during planning, it may be important to anticipate shifts in priorities and emerging needs to select treatments that may lend themselves to adaptation or flexible application. Similarly, when selecting strategies for quality assurance or financing, understanding how they will meet immediate needs and understanding how they may fit over the long term and affect collaborative relationships at all levels are important. Relationship development strategies, such as coalition building, formal agreements regarding roles and activities, and community-academic partnerships (24), may facilitate this understanding.

This article makes a unique contribution to the literature in its focus on interrelated influences on the sustainability of a program to implement evidence-based mental health treatments at the local system level in the absence of a broader mandate. However, some limitations are to be noted. First, this study focused on a single program within a single mental health system, and some potential key informants were not interviewed. The perspectives of clinicians and consumers of mental health services were not included, because preliminary coding of interviews with them through a larger program of research identified little material relevant to sustainability at the system level, as opposed to the organization level or individual level. The qualitative nature of the study precludes conclusions about causal relationships between factors identified here and sustainability. Further, this study did not investigate all aspects of sustainability. It is possible that despite fairly robust activity at the system level, variation will be present at agency or provider levels and that different influences will be more salient. Investigation of the extent to which the factors identified in the interviews are associated with quantitative measures of sustainability (such as long-term CT fidelity and sustained improvement in client outcomes) requires other types of research designs.

Conclusions

Through an analysis of program documentation and stakeholder attributions, this study investigated potential facilitators of sustained activities in a program to implement evidence-based psychotherapy in a city’s behavioral health program. How the successes at this level, and the decisions to expand and broaden the program, influence activities at organization and provider levels is an important topic of ongoing research, particularly as established implementation efforts mature and seek to sustain their achieved capacity. Findings from this program of research may inform efforts of other local systems to sustain implementation programs and may contribute to the refinement of sustainability models that are applicable to local and regional implementation.

Dr. Stirman and Ms. Gamarra are with the Women’s Health Sciences Division, U.S. Department of Veterans Affairs (VA) National Center for PTSD, Boston (e-mail: ). Dr. Stirman is also with the Department of Psychiatry, Boston University. Dr. Matza is with the Lesbian, Gay, Bisexual, and Transgender Program, Office of Patient Care Services, U.S. Department of Veterans Affairs Central Office, Washington, D.C. Ms. Toder, Dr. Beck, Dr. Crits-Christoph, and Dr. Creed are with the Department of Psychiatry, University of Pennsylvania, Philadelphia. Ms. Xhezo, Dr. Evans, and Dr. Hurford are with the Philadelphia Department of Behavioral Health and Intellectual disAbility Services, Philadelphia.

Funding for this research was provided by grant K99/R00 MH 080100 from the National Institute of Mental Health (NIMH). When the research was conducted, Dr. Stirman was a fellow at NIMH and the Implementation Research Institute (R25 MH080916).

Opinions expressed in this article do not necessarily reflect the viewpoints of the Veterans Health Administration or the National Institutes of Health.

Dr. Crits-Christoph has received research funding from Alkermes, Inc., to conduct analyses of a naturalistic database of treatments for substance abuse in Missouri. The other authors report no financial relationships with commercial interests.

The authors gratefully acknowledge the agencies, clinicians, and consumers who have participated in the Beck Initiative for their contributions to this project.

References

1 Becker KD, Nakamura BJ, Young J, et al.: What better place than here, what better time than now? Advancing the dissemination and implementation of evidence-based practices. Behavior Therapist 32:89–96, 2009

2 Karlin BE, Cross G: From the laboratory to the therapy room: national dissemination and implementation of evidence-based psychotherapies in the US Department of Veterans Affairs Health Care System. American Psychologist 69:19–33, 2014

3 McHugh RK, Barlow DH: The dissemination and implementation of evidence-based psychological treatments: a review of current efforts. American Psychologist 65:73–84, 2010

4 Ehlers A, Grey N, Wild J, et al.: Implementation of cognitive therapy for PTSD in routine clinical care: effectiveness and moderators of outcome in a consecutive sample. Behaviour Research and Therapy 51:742–752, 2013

5 Nakamura BJ, Chorpita BF, Hirsch M, et al.: Large-scale implementation of evidence-based treatments for children 10 years later: Hawaii’s evidence-based services initiative in children’s mental health. Clinical Psychology: Science and Practice 18:24–35, 2011

6 Wiltsey Stirman S, Kimberly J, Cook N, et al.: The sustainability of new programs and innovations: a review of the empirical literature and recommendations for future research. Implementation Science 7:17, 2012

7 Shediac-Rizkallah MC, Bone LR: Planning for the sustainability of community-based health programs: conceptual frameworks and future directions for research, practice and policy. Health Education Research 13:87–108, 1998

8 Shiner B, D’Avolio LW, Nguyen TM, et al.: Measuring use of evidence based psychotherapy for posttraumatic stress disorder. Administration and Policy in Mental Health and Mental Health Services Research 40:311–318, 2013

9 Wilk JE, West JC, Duffy FF, et al.: Use of evidence-based treatment for posttraumatic stress disorder in Army behavioral healthcare. Psychiatry 76:336–348, 2013

10 Aarons GA, Hurlburt M, Horwitz SM: Advancing a conceptual model of evidence-based practice implementation in public service sectors. Administration and Policy in Mental Health and Mental Health Services Research 38:4–23, 2011

11 Schoenwald SK, Garland AF, Southam-Gerow MA, et al.: Adherence measurement in treatments for disruptive behavior disorders: pursuing clear vision through varied lenses. Clinical Psychology: Science and Practice 18:331–341, 2011

12 Stroul BA, Manteuffel BA: The sustainability of systems of care for children’s mental health: lessons learned. Journal of Behavioral Health Services and Research 34:237–259, 2007

13 Stirman SW, Buchhofer R, McLaulin JB, et al.: The Beck Initiative: a partnership to implement cognitive therapy in a community behavioral health system. Psychiatric Services 60:1302–1304, 2009

14 Creed TA, Stirman SW, Evans AC, et al.: A model for implementation of cognitive therapy in community mental health: the Beck Initiative. Behavior Therapist 37(3):56–64, 2014

15 Maher L, Gustafson D, Evans A: Sustainability Model and Guide. Leicester, England, National Health Service, Institute for Innovation and Improvement, 2004

16 Hsieh HF, Shannon SE: Three approaches to qualitative content analysis. Qualitative Health Research 15:1277–1288, 2005

17 Carlson NM, McCaslin M: Meta-injury: an approach to interview success. Qualitative Report 8:549–569, 2003

18 Pluye P, Potvin L, Denis JL: Making public health programs last: conceptualizing sustainability. Evaluation and Program Planning 27:121–133, 2004

19 Damschroder LJ, Aron DC, Keith RE, et al.: Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implementation Science 4:50, 2009

20 Chambers DA, Glasgow RE, Stange KC: The dynamic sustainability framework: addressing the paradox of sustainment amid ongoing change. Implementation Science 8:117, 2013

21 Martin GP, Weaver S, Currie G, et al.: Innovation sustainability in challenging health-care contexts: embedding clinically led change in routine practice. Health Services Management Research 25:190–199, 2012

22 Chorpita BF, Daleiden EL: Doing more with what we know: introduction to the special issue. Journal of Clinical Child and Adolescent Psychology 43:143–144, 2014

23 Institute of Medicine: Treatment for Posttraumatic Stress Disorder in Military and Veteran Populations: Final Assessment. Washington, DC, National Academies Press, 2014

24 Powell BJ, McMillen JC, Proctor EK, et al.: A compilation of strategies for implementing clinical innovations in health and mental health. Medical Care Research and Review 69:123–157, 2012