
The IPS Learning Community: A Longitudinal Study of Sustainment, Quality, and Outcome

Abstract

Objective:

Implementations of evidence-based mental health practices often disappear quickly, and few studies have examined sustainment. Since 2001, the Individual Placement and Support (IPS) learning community has promoted dissemination, implementation, sustainment, and expansion of IPS by using multiple strategies: online training, in-person training and technical assistance, technical assistance teleconferences, annual meetings, stakeholder conference calls, fidelity assessments, and transparency of outcomes. This study examined sustainment of IPS over a two-year period among programs in the learning community in the United States.

Methods:

The authors interviewed IPS team leaders in 129 programs actively participating in the learning community in 2012 and 2014. The structured interview addressed questions about program status, funding, and quality improvement activities. Simultaneously, the learning community tracked program-level data on IPS fidelity and employment rates. The study examined two-year program sustainment and changes in employment rates, fidelity, funding, and quality improvement activities.

Results:

In 2012, 129 participating sites had been active for an average of 4.5 years. At two-year follow-up, 124 (96%) sites were sustained. The sustaining sites maintained quality improvement activities, expanded funding sources, and increased employment rates (41% to 43%; p=.04) and fidelity scores (103.8 to 108.4; p=.002).

Conclusions:

Nearly all programs participating in a learning community in 2012 continued to provide IPS services over the next two years, exceeding sustainment rates for evidence-based practices reported in the literature. Quality indicators also improved, suggesting that learning community activities fostered sustainment and quality. Controlled studies are needed to compare specific learning community approaches with usual methods of sustainment.

Even though many evidence-based practices have been developed, few clients with severe mental illness receive effective services (1). Evidence-based practices are often implemented poorly and rarely endure beyond initial enthusiasm and grant funding. The empirical literature on sustainment of evidence-based practices is weak and fragmented (2). Factors commonly hypothesized to influence sustainment include political support, funding stability, community partnerships, organizational capacity, fidelity monitoring, technical assistance, workforce development, and supervision (3,4). Although many individual factors have modest empirical support (3), research findings have been inconsistent.

Monitoring the quality of services is crucial; without mechanisms to ensure adherence to model standards, programs will vary widely in how services are delivered (5). Therefore, fidelity scales, defined as measures that assess adherence to a program model, have become essential tools for program implementation (6) and sustainment (7). Strategies to maintain and improve the quality of services include establishment of learning collaboratives (8) and technical assistance centers to provide training, resource materials, fidelity monitoring, on-site consultation, and other sources of support (9,10). At the local agency level, quality improvement efforts, including fidelity and outcome monitoring, are associated with better implementation (11). The frequency and quality of supervision also affect the quality of implementation and sustainability of services (3,12). Another factor, workforce development, ensures that practitioners have the skills to do the practice (13). At the state level, ensuring a well-trained workforce requires systematic and large-scale methods for initial and booster training; online training is one such method (14).

One evidence-based practice increasingly implemented throughout the United States is the Individual Placement and Support (IPS) model of supported employment (15,16). In recognition of the many factors influencing the long-term survival of any program model, a comprehensive learning community was developed to sustain IPS. Beginning in 2001, the Dartmouth Psychiatric Research Center and the Johnson & Johnson Office of Corporate Contributions partnered to develop a comprehensive program to strengthen state and local infrastructures to promote access to IPS throughout the United States. After starting as a small demonstration in three states, the program has evolved into a network of 20 states and three European countries known as the IPS learning community (17).

Historically, the term “learning collaborative” has been used to define a network of organizations with a shared goal of improving treatment for a specific medical condition, facilitated by regular communication (for example, meetings, teleconferences, and newsletters) and collection and dissemination of objective information about procedures and outcomes, typically over a few months (18,19). Some collaboratives provide training and technical assistance and facilitate research and innovation (20). The IPS group adopted the term “learning community” to signify its long-term commitment to quality and expansion; the term differentiates our approach from time-limited quality improvement learning collaboratives (21).

The purpose of this descriptive study was to examine sustainment of U.S. programs in the IPS learning community over a two-year period. We also examined changes over that time in the infrastructure supporting sustainment. We hypothesized that participation in the learning community would promote sustainment, program fidelity, and employment outcomes.

Methods

Overview

This prospective study examined changes over a two-year period in 129 programs participating in the IPS learning community in 2012. In addition to examining sustainment, fidelity, and employment outcomes, we examined funding and quality improvement efforts. The Dartmouth Institutional Review Board approved the study, which followed the principles outlined in the Declaration of Helsinki.

IPS Learning Community

The IPS learning community provides a set of strategies, interventions, and activities intended to promote the dissemination, implementation, and sustainment of IPS services. The learning community has encompassed a two-tiered, decentralized approach. In the United States, Dartmouth trainers and researchers bring together state leaders and help them build a viable infrastructure for implementing and sustaining IPS services within their states (17). Dartmouth provides resources, such as brochures, posters, policy bulletins, newsletters, videos, and online training for employment specialists and supervisors. In each state, the leadership team includes three key roles: liaisons from two state agencies responsible for employment services (that is, mental health and vocational rehabilitation) and one or more state trainers. State leaders create parallel learning communities consisting of IPS programs within their public mental health systems. As part of their participation in the learning community, state leaders submit quarterly employment outcome reports for IPS programs within their states; Dartmouth analyzes and distributes the data back to the states (22). State trainers conduct periodic fidelity reviews of both new and established IPS programs by using a validated fidelity scale (23). IPS programs are considered active participants once they begin submitting outcome reports, typically about nine months after start-up.

Sample

The sample consisted of 129 U.S. sites in 13 states that were active in the IPS learning community as of January 2012. All sites meeting this criterion agreed to participate in the study. At the 2012 interview, the sites had participated in the learning community for a mean±SD of 4.5±2.7 years (median=3.9). The 13 learning community states were Connecticut, District of Columbia, Illinois, Kansas, Kentucky, Maryland, Minnesota, Missouri, Ohio, Oregon, South Carolina, Vermont, and Wisconsin. The number of sites per state ranged from three to 21; four states had 16 or more sites, and six states had six or fewer sites. Sites were located in both urban and rural communities (23). Most local programs had a single IPS team (N=105, 81%), but 24 sites had two (N=12, 9%), three (N=7, 5%), or four or more (N=5, 4%) teams. The mean job tenure for IPS team leaders was 5.0±4.9 years (median=3.3). Although the learning community does not compile statistics on client background characteristics, the people served in these programs reflect the clients served by the public system.

Measures

We operationally defined sustainment as follows: a program is sustained if it continues to employ staff, maintains an active client caseload, and provides direct services. Programs sometimes continue in name only, without adhering to the program model that they originally implemented (3). Therefore, a more meaningful standard is sustaining a program at good fidelity (that is, adhering to the core principles of the program model).

Fidelity.

The 25-item Individual Placement and Support Fidelity Scale (IPS-25) assesses adherence to the evidence-based principles of IPS (24). Each item is rated on a 5-point behaviorally anchored dimension, ranging from 1, representing lack of adherence, to 5, indicating close adherence to the model. The total score on the IPS-25, the sum of item scores, ranges from 25 to 125. A score of 100 or more is considered good fidelity.
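To make the scoring rules concrete, here is a minimal sketch of computing an IPS-25 total and applying the good-fidelity threshold; the function name and data layout are illustrative assumptions, not part of the published manual.

```python
# Illustrative sketch of IPS-25 scoring; the function name and data layout
# are assumptions for demonstration, not part of the published manual.

def score_ips25(item_ratings: list[int]) -> tuple[int, bool]:
    """Sum 25 item ratings (each 1-5) and flag good fidelity (total >= 100)."""
    if len(item_ratings) != 25:
        raise ValueError("IPS-25 requires exactly 25 item ratings")
    if any(r < 1 or r > 5 for r in item_ratings):
        raise ValueError("Each item is rated on a 1-5 scale")
    total = sum(item_ratings)   # possible range: 25-125
    return total, total >= 100  # 100 or more is considered good fidelity

# Example: a program rated 4 on twenty items and 5 on five items
total, good = score_ips25([4] * 20 + [5] * 5)
print(total, good)  # 105 True
```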

The fidelity manual recommends that two trained fidelity assessors conduct a 1.5-day site visit to collect the necessary information to complete the IPS-25 (www.dartmouthips.org). The assessment procedures include interviews with the IPS program leader, employment specialists, clinicians, agency leaders, family members, and clients; observation of team meetings and community contacts with employers; and review of client charts. For quality improvement purposes, assessors prepare a fidelity report with fidelity ratings for each of the 25 items and recommendations for improvement.

Competitive employment rate.

Competitive employment is defined as employment in integrated work settings in the competitive job market at prevailing wages, with supervision provided by personnel employed by the business. The learning community tracks the quarterly competitive employment rate, which is based on at least one day of competitive employment during a specified three-month period (that is, a calendar quarter). All competitive jobs are counted, regardless of hours per week worked. The site-level competitive employment rate is calculated as the number of clients employed divided by the number of clients active on the caseload during the quarter in which the fidelity assessment is completed. The benchmark for good employment outcome is a quarterly employment rate of 41% or more (17).
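As a worked illustration of this definition, the sketch below computes a site's quarterly rate and checks it against the 41% benchmark; the variable names and example counts are assumptions for demonstration.

```python
# Minimal sketch of the quarterly competitive employment rate as defined
# above; variable names and example counts are illustrative assumptions.

def quarterly_employment_rate(n_employed: int, n_caseload: int) -> float:
    """Share of active clients with >= 1 day of competitive work in the quarter."""
    if n_caseload <= 0:
        raise ValueError("Caseload must be positive")
    return n_employed / n_caseload

BENCHMARK = 0.41  # a quarterly rate of 41% or more counts as a good outcome (17)

rate = quarterly_employment_rate(n_employed=34, n_caseload=80)
print(f"{rate:.0%}, meets benchmark: {rate >= BENCHMARK}")  # 42%, meets benchmark: True
```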

Data collection procedures.

As part of the IPS learning community, individual sites agree to participate in annual fidelity assessments and submit quarterly competitive employment outcomes (17,22). Each state identifies a group of qualified fidelity reviewers who independently assess IPS programs in the state on adherence to IPS standards. The fidelity assessors for each state include trainers from technical assistance centers, coordinators from state mental health and vocational rehabilitation agencies, and IPS supervisors from other programs. Dartmouth provides training and technical assistance in performing fidelity assessments through a three-day workshop, bimonthly teleconferences, individual consultation, and on-site training.

Also, as part of the agreement for participating in the learning community, each state appoints a coordinator to compile quarterly outcome data and report them to the Dartmouth team, which prepares and distributes a detailed quarterly report of outcomes aggregated at the state and learning community levels (17). For this study, we used quarterly outcomes for June 2012 and June 2014. For three sites missing second-quarter outcome data, we substituted data submitted in the following quarter.
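As a rough sketch of this aggregation step (not the learning community's actual reporting pipeline), the following pools hypothetical site-level quarterly counts into state-level and learning-community rates; all column names and values are invented for illustration.

```python
import pandas as pd

# Hypothetical site-level quarterly reports; columns and values are invented
# stand-ins, not the learning community's actual data.
reports = pd.DataFrame({
    "state":    ["VT", "VT", "KS", "KS"],
    "site":     ["A", "B", "C", "D"],
    "employed": [30, 18, 25, 22],  # clients competitively employed in quarter
    "caseload": [70, 50, 60, 55],  # clients active on caseload in quarter
})

# State-level rates: pool clients across sites within each state.
by_state = reports.groupby("state")[["employed", "caseload"]].sum()
by_state["rate"] = by_state["employed"] / by_state["caseload"]
print(by_state)

# Learning-community rate: pool across all sites.
print(reports["employed"].sum() / reports["caseload"].sum())
```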

For this study, state coordinators provided site-level fidelity data by using a standardized spreadsheet from their most recent on-site fidelity reviews. With the exception of Maryland, all states followed the above fidelity procedures with minor variations. (Maryland used an earlier version of the IPS Fidelity Scale [25] not directly comparable to the IPS-25.) The dates for the site fidelity reviews ranged from January 2011 to August 2012 for the initial time period and from January 2013 to August 2014 for the second time period.

Interview Procedures

Initial team leader interviews.

Between February and May of 2012, interviewers contacted team leaders for the 129 IPS programs active in the IPS learning community as of January 2012. Telephone interviews averaged one hour and consisted of open-ended questions and factual short-answer questions. [The interview guide is included in an online supplement to this article.]

The interview included a checklist of funding sources, and respondents were asked to identify which sources were used by their agency to fund IPS services and to rank the top three revenue sources. The interview guide also included questions about use of technical assistance, training, fidelity monitoring, outcome monitoring, and field mentoring. A final question asked, “Do you have any worries about IPS being discontinued in the next year?” These responses were dichotomized as either yes (worried about discontinuation) or no (not worried about discontinuation).

Follow-up team leader interviews.

Between February and July of 2014, interviewers recontacted program coordinators from the same sites included in the 2012 sample. We interviewed the current IPS team leader for 122 sites that were sustained by 2014, in addition to the team leader for one site representing the merger of two sites that also continued to offer IPS services. The interview protocol was a shortened version of the initial interview with the key questions unchanged. We completed initial and follow-up interviews with IPS team leaders (or another knowledgeable staff member) at all the study sites.

Results

Sustainment

Of the 129 sites active in 2012, 124 (96%) continued to offer IPS services in 2014. Two of the 124 continuing sites were no longer functioning as independent programs because their parent organizations had merged in 2013. The five sites that discontinued IPS services were located in three states. In this article, we report changes over time for the 122 sites active at both time periods and operating as independent sites.

Funding Sources for IPS

According to 2012 team leader interviews, the most common funding sources for IPS were vocational rehabilitation agencies, state or county mental health budgets, Medicaid, and Ticket to Work (a Social Security Administration program for people receiving disability payments), as shown in Table 1. Other revenue sources were rare. Of the 122 active sites in 2014, the four most common funding sources remained the same. Utilization of Medicaid and vocational rehabilitation funding significantly increased, and utilization of Ticket to Work funding significantly declined during this time.

TABLE 1. Funding characteristics at two time points of 122 sustaining programs participating in the Individual Placement and Support learning community

| Characteristic                       | 2012 N (%) | 2014 N (%) | p^a  |
|--------------------------------------|------------|------------|------|
| Funding source                       |            |            |      |
|  Medicaid                            | 63 (52)    | 80 (66)    | .002 |
|  Vocational rehabilitation agencies  | 98 (80)    | 106 (87)   | .04  |
|  State or county budgets             | 90 (74)    | 97 (80)    | .23  |
|  Ticket to Work                      | 44 (36)    | 30 (25)    | .02  |
|  Federal block grant                 | 8 (7)      | 3 (3)      | .18  |
|  Private foundation                  | 25 (21)    | 15 (12)    | .03  |
|  Client payment                      | 7 (6)      | 16 (13)    | .05  |
|  Other                               | 24 (20)    | 16 (13)    | .19  |
| Diversity of major funding sources   |            |            |      |
|  1 source                            | 26 (21)    | 21 (17)    |      |
|  2 sources                           | 61 (50)    | 41 (34)    |      |
|  3 sources                           | 34 (28)    | 60 (49)    |      |
|  None of the 3 sources               | 1 (1)      | 0 (0)      |      |
| Top-ranked funding source            |            |            |      |
|  Medicaid                            | 36 (30)    | 45 (37)    |      |
|  Vocational rehabilitation agencies  | 28 (23)    | 31 (25)    |      |
|  State or county budgets             | 52 (43)    | 43 (35)    |      |
|  None of the 3 sources               | 6 (5)      | 3 (3)      |      |

^a Proportions were compared by McNemar’s tests.

The importance of different funding sources (that is, the proportion of total program revenue) yielded a different rank order, as shown in Table 1. In 2012, team leaders most often ranked state or county budgets as their top revenue source for IPS programs, followed by Medicaid and vocational rehabilitation agencies. By 2014, Medicaid had overtaken state or county budgets as the most frequently top-ranked revenue source. Dependence on funding sources varied by state; in 2012, the top revenue source named by most sites within a state was state or county budgets in four states, Medicaid in four states, and vocational rehabilitation agencies in four states. In one state, 13 (68%) of 19 sites relied exclusively on state funding for IPS services, whereas in the other states nearly all sites reported multiple funding sources.

We defined “funding diversity” as the number of funding sources used among the three major funding sources (that is, vocational rehabilitation, state or county budgets, and Medicaid). Funding diversity increased between 2012 and 2014 (t=4.17, df=121, p<.001). The percentage of sites accessing all three sources of funding increased from 28% to 49%.
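For readers who want to reproduce this kind of paired comparison, the sketch below runs scipy's paired t-test on hypothetical per-site counts of the three major funding sources at the two time points; the data are invented stand-ins, not the study data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical per-site counts (1-3) of major funding sources used in 2012
# and 2014; invented data standing in for the 122 sustaining sites.
n_sources_2012 = rng.integers(1, 4, size=122)
gain = rng.integers(0, 2, size=122)               # some sites add a source
n_sources_2014 = np.minimum(n_sources_2012 + gain, 3)

# Paired t-test on within-site change, analogous to the t=4.17, df=121
# comparison reported above.
res = stats.ttest_rel(n_sources_2014, n_sources_2012)
print(f"t={res.statistic:.2f}, df={len(n_sources_2012) - 1}, p={res.pvalue:.4f}")
```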

Quality Improvement Activities

From 2012 to 2014, the number of sites completing fidelity reviews in the past year increased from 79 (65%) to 92 (75%) (McNemar’s test [N=122], p=.07), as shown in Table 2. The number of sites completing online training and the number receiving on-site technical assistance were similar in 2012 and 2014.

TABLE 2. Quality improvement activities at two time points of 122 sustaining programs participating in the Individual Placement and Support learning community

| Activity                                   | 2012 N (%) | 2014 N (%) | p^a  |
|--------------------------------------------|------------|------------|------|
| Fidelity review within the past 12 months  | 79 (65)    | 92 (75)    | .07  |
| Online training (at least 1 staff member)  | 52 (43)    | 51 (42)    | 1.00 |
| On-site technical assistance in past year  | 89 (73)    | 83 (68)    | .38  |

^a Proportions were compared by McNemar’s tests.

Fidelity and Employment Outcomes

Between 2012 and 2014, the mean IPS-25 fidelity scores showed a modest but significant increase, from 103.8±9.5 to 108.4±7.6 (t=3.25, df=62, p=.002) at the 63 sites with fidelity reviews in both years. Among sites with fidelity reviews in both years (including Maryland), the percentage of sites meeting the standards for good fidelity increased from 56 (78%) to 65 (90%) (McNemar’s test [N=72], p=.05). Among the 122 sites that sustained IPS services in 2014, the mean quarterly employment rate increased significantly from 41%±15.0% to 43%±13.1% (t=2.13, df=121, p=.04). The number of sites achieving the benchmark standard for good employment increased from 62 (51%) to 77 (63%) (McNemar’s test [N=122], p=.02).
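The paired-proportion comparisons above use McNemar's test, which depends only on the sites that changed status between the two years (the discordant pairs). A self-contained sketch of the exact version follows; the discordant counts are invented for illustration, since the article reports only the marginal frequencies.

```python
from math import comb

def mcnemar_exact(b: int, c: int) -> float:
    """Two-sided exact McNemar p-value from discordant-pair counts:
    b = sites meeting the benchmark only at time 1,
    c = sites meeting it only at time 2."""
    n = b + c
    k = min(b, c)
    # Two-sided exact binomial test of k successes in n trials at p = 0.5
    tail = sum(comb(n, i) for i in range(k + 1)) / 2 ** n
    return min(1.0, 2 * tail)

# Invented discordant counts: say 5 sites dropped below and 20 rose above
# the good-employment benchmark between 2012 and 2014.
print(round(mcnemar_exact(5, 20), 3))  # 0.004
```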

Worries About Discontinuation of IPS Services

Between 2012 and 2014, the number of IPS team leaders worried about IPS services being discontinued declined from 29 (24%) to 17 (14%) (McNemar’s test [N=121], p=.06). The most common worry was future funding.

Discussion

The two-year sustainment rate was 96% for 129 programs in the IPS learning community. This sustainment rate is higher than the 80% rate over a two-year period after the termination of the formal implementation phase in a national study of 49 sites implementing a new evidence-based practice aided by systematic technical assistance (7). Data on sustainment of evidence-based practices are rarely published. Many studies make it clear, however, that enthusiasm for an innovative program model, as well as model fidelity, often fades over time (3,26–28). Funding initiatives targeting specific program models often spur growth, followed by rapid dissolution when state-sponsored funding ends. For example, over a span of less than a decade, one state experienced a cycle of rapid growth followed by a collapse of services for an evidence-based practice when the targeted funding for this program was abruptly curtailed (3,29). To our knowledge, no one has examined the empirical literature on sustainability to establish benchmarks for target rates for sustaining programs over time (2).

Beyond the high sustainment rate, most sustaining sites met benchmark standards for good fidelity and good employment outcomes. Improvements on both criteria suggest that the learning community enhanced the quality of services at participating sites. Model drift has not occurred in the IPS learning community, probably because of fidelity and outcome monitoring. Fidelity monitoring is crucial for gauging how well programs are sustained; once a state discontinues fidelity reviews, program leaders may introduce changes that compromise services (3).

Another key to maintaining IPS services has been ongoing attention to funding. Developing IPS services commensurate with need is especially formidable because the model lacks a single reliable funding source (30,31). Fortunately, all of the states in the IPS learning community have been able to survive state budget cuts and other challenges, primarily through the work of creative and persistent state leaders who have ensured continuous (and increasing) funding for IPS. Support and advice from the network of trainers and other state leaders in the learning community have helped avert program discontinuations precipitated by budgetary shortfalls. During the study period, sites diversified their funding sources, with more sites accessing funding from vocational rehabilitation agencies and Medicaid over time. Despite similarities in the broad categories of funding sources used to fund IPS programs, the specific state funding algorithms varied greatly (31). State regulations vary considerably, and states need to be creative in developing viable funding models.

During the two-year period, states maintained the frequency of quality improvement activities, such as fidelity reviews, training, and technical assistance. Moreover, national, state, and program leaders provided extensive technical assistance during this time that was not captured by the survey. The finding of improved employment outcomes between 2012 and 2014 is especially encouraging, because it points to positive changes in the lives of clients, which is the reason for providing IPS services.

Recently the New York State Office of Mental Health funded an initiative modeled on the IPS learning community (32). This initiative provides technical assistance and a process for monitoring fidelity and employment outcomes, and it has achieved outcomes similar to those in the national learning community. New York also has targeted funds for IPS, promoting the growth and sustainment of IPS services. By the end of 2014, 59 (69%) of 86 eligible programs had joined the New York initiative.

This study had several limitations. First, the interview data relied on a single respondent from each site, without confirmed interrater reliability and validity. Second, the sampling method may have biased the sustainment rate because dropouts prior to 2012 were not included. Third, fidelity reviews were not completed for all sites. Fourth, the relatively brief follow-up period of two years warrants caution. Fifth, increased employment rates could have been affected by the economic recovery under way during this period. Sixth, generalizability was limited by sampling states that were among early adopters of IPS. Finally, because this study included no comparison group, no causal inferences can be made about the impact of the learning community.

Conclusions

Sustainment of evidence-based practices appears to be enhanced through the mechanism of a learning community. Although the learning community model remains relatively untested with other evidence-based practices, its basic concepts are promising. Controlled studies comparing long-term learning communities with usual methods are needed before firm conclusions can be drawn.

The authors are with the Dartmouth Psychiatric Research Center, Lebanon, New Hampshire.

The Dartmouth Psychiatric Research Center receives an annual gift from Johnson & Johnson Office of Corporate Contributions for disseminating IPS supported employment. The authors report no financial relationships with commercial interests.

References

1 Drake RE, Essock SM: The science-to-service gap in real-world schizophrenia treatment: the 95% problem. Schizophrenia Bulletin 35:677–678, 2009

2 Wiltsey Stirman S, Kimberly J, Cook N, et al.: The sustainability of new programs and innovations: a review of the empirical literature and recommendations for future research. Implementation Science 7:17, 2012

3 Bond GR, Drake RE, McHugo GJ, et al.: Long-term sustainability of evidence-based practices in community mental health agencies. Administration and Policy in Mental Health and Mental Health Services Research 41:228–236, 2014

4 Schell SF, Luke DA, Schooley MW, et al.: Public health program capacity for sustainability: a new framework. Implementation Science 8:15, 2013

5 Corbière M, Lanctôt N, Lecomte T, et al.: A pan-Canadian evaluation of supported employment programs dedicated to people with severe mental disorders. Community Mental Health Journal 46:44–55, 2010

6 McHugo GJ, Drake RE, Whitley R, et al.: Fidelity outcomes in the National Implementing Evidence-Based Practices Project. Psychiatric Services 58:1279–1284, 2007

7 Swain K, Whitley R, McHugo GJ, et al.: The sustainability of evidence-based practices in routine mental health agencies. Community Mental Health Journal 46:119–129, 2010

8 Van Duin D, Franx G, Van Wijngaarden B, et al.: Bridging the science-to-service gap in schizophrenia care in the Netherlands: the Schizophrenia Quality Improvement Collaborative. International Journal for Quality in Health Care 25:626–632, 2013

9 Rapp CA, Goscha RJ, Carlson LS: Evidence-based practice implementation in Kansas. Community Mental Health Journal 46:461–465, 2010

10 Salyers MP, McKasson M, Bond GR, et al.: The role of technical assistance centers in implementing evidence-based practices: lessons learned. American Journal of Psychiatric Rehabilitation 10:85–101, 2007

11 Torrey WC, Bond GR, McHugo GJ, et al.: Evidence-based practice implementation in community mental health settings: the relative importance of key domains of implementation activity. Administration and Policy in Mental Health and Mental Health Services Research 39:353–364, 2012

12 Carlson L, Rapp CA, Eichler MS: The experts rate: supervisory behaviors that impact the implementation of evidence-based practices. Community Mental Health Journal 48:179–186, 2012

13 Leeman J, Calancie L, Hartman MA, et al.: What strategies are used to build practitioners’ capacity to implement community-based interventions and are they effective? A systematic review. Implementation Science 10:80, 2015

14 Covell NH, Foster FP, Margolies PJ, et al.: Using distance technologies to facilitate a learning collaborative to implement stagewise treatment. Psychiatric Services 66:645–648, 2015

15 Druss BG: Supported employment over the long term: from effectiveness to sustainability. American Journal of Psychiatry 171:1142–1144, 2014

16 Marshall T, Goldberg RW, Braude L, et al.: Supported employment: assessing the evidence. Psychiatric Services 65:16–23, 2014

17 Becker DR, Drake RE, Bond GR: The IPS supported employment learning collaborative. Psychiatric Rehabilitation Journal 37:79–85, 2014

18 Nadeem E, Olin SS, Hill LC, et al.: A literature review of learning collaboratives in mental health care: used but untested. Psychiatric Services 65:1088–1099, 2014

19 O’Connor GT, Plume SK, Olmstead EM, et al.: A regional intervention to improve the hospital mortality associated with coronary artery bypass graft surgery. JAMA 275:841–846, 1996

20 Hulscher ME, Schouten LM, Grol RP, et al.: Determinants of success of quality improvement collaboratives: what does the literature show? BMJ Quality and Safety 22:19–31, 2013

21 Kilo CM: A framework for collaborative improvement: lessons from the Institute for Healthcare Improvement’s Breakthrough Series. Quality Management in Health Care 6:1–14, 1998

22 Becker DR, Drake RE, Bond GR, et al.: A national mental health learning collaborative on supported employment. Psychiatric Services 62:704–706, 2011

23 Bond GR, Peterson AE, Becker DR, et al.: Validating the revised Individual Placement and Support Fidelity Scale (IPS-25). Psychiatric Services 63:758–763, 2012

24 Becker DR, Swanson S, Reese SL, et al.: Evidence-Based Supported Employment Fidelity Review Manual, 3rd ed. Lebanon, NH, Dartmouth Psychiatric Research Center, 2015

25 Bond GR, Becker DR, Drake RE, et al.: A fidelity scale for the Individual Placement and Support model of supported employment. Rehabilitation Counseling Bulletin 40:265–284, 1997

26 Fairweather GW: The Fairweather lodge: a twenty-five year retrospective; in New Directions for Mental Health Services, vol 7. San Francisco, Jossey-Bass, 1980

27 McFarlane WR, McNary S, Dixon L, et al.: Predictors of dissemination of family psychoeducation in community mental health centers in Maine and Illinois. Psychiatric Services 52:935–942, 2001

28 McGrew JH, Bond GR, Dietzen LL, et al.: Measuring the fidelity of implementation of a mental health program model. Journal of Consulting and Clinical Psychology 62:670–678, 1994

29 Moser LL, DeLuca NL, Bond GR, et al.: Implementing evidence based psychosocial practices: lessons learned from statewide implementation of two practices. CNS Spectrums 9:926–936, 2004

30 Hogan MF, Drake RE, Goldman HH: A national campaign to finance supported employment. Psychiatric Rehabilitation Journal 37:73–75, 2014

31 Karakus M, Frey W, Goldman H, et al.: Federal Financing of Supported Employment and Customized Employment for People With Mental Illnesses: Final Report. Rockville, Md, Westat, 2011

32 Margolies PJ, Broadway-Wilson K, Gregory R, et al.: Use of learning collaboratives by the Center for Practice Innovations to bring IPS to scale in New York State. Psychiatric Services 66:4–6, 2015