Strategies for Disseminating Evidence-Based Practices to Staff Who Treat People With Serious Mental Illness

According to Goethe, "Knowing is not enough; we must apply." Psychiatric Services, in conjunction with the Center for Mental Health Services, the National Alliance for the Mentally Ill, and the Robert Wood Johnson Foundation, has dedicated 2001 to reviewing evidence-based practices for treating people who have a serious mental illness. The series of journal articles has reviewed empirically based practice guidelines for supported employment (1), dual diagnosis services (2), case management and assertive community treatment (3), pharmacologic treatment (4), treatment for posttraumatic stress disorders (5), and family-based services (6). The vision of these articles is clear: mental health systems must adopt evidence-based practices to ensure that an effective set of treatment services is available for people with mental illness.

This paper reviews the research on dissemination strategies that facilitate the transfer of research-based practices from academic settings to public-sector psychiatry. How does a team of diverse mental health providers develop and maintain these evidence-based practices in the real world?

There are multiple reasons why evidence-based strategies have not been implemented at satisfactory levels. These reasons reflect almost every conceivable factor that influences the provision of services: federal and state laws, local ordinances, administrative policies, funding priorities, community resources, the concerns of advocates, the interests of local consumers, and program staffing. Strategies to disseminate evidence-based practices to staff focus largely on the last factor; given that certain practices have been shown to be effective in helping some populations with specific problems, why don't treatment teams who are responsible for assisting these populations use these practices? The dissemination strategies described in our paper target this concern.

A common element of the articles in the evidence-based practices series is that services for people with serious mental illness must be based on rigorous research. Unfortunately, the effort that goes into developing and evaluating treatment practices has not been matched by a comparable research enterprise examining dissemination strategies. The National Institute of Mental Health recognized this concern in a 1999 program announcement for dissemination research, which sought to stimulate investigations of the array of influences that beneficially or adversely affect the adoption in practice of valid mental health research findings.

To that end, we have reviewed two strands of the literature: investigations that examine barriers to using evidence-based practices in real-world settings and studies of strategies that seek to overcome these barriers and foster the dissemination of effective practices. Although this body of investigations provides some direction for dissemination of evidence-based practices to staff, several challenges remain for future research in this area. We end the paper with a brief discussion of these challenges and directions for overcoming them.

Barriers to dissemination of evidence-based practices

Proponents of innovation are often dismayed that despite the millions of dollars and the years of effort spent in the development and evaluation of treatments for people with mental illness, service providers may take a decade or more to incorporate these treatments into their day-to-day service armamentarium (7,8). Two sets of barriers specifically related to dissemination and implementation might account for this delay.

First, individual service providers may lack the basic knowledge and skills required to assimilate evidence-based practices into their regular approach to treatment. Moreover, work-related variables such as job burnout undermine some staff members' interest in new and innovative practices. Second, many evidence-based practices require a team of service providers. Organizational barriers, such as poor leadership, a change-averse culture, insufficient collegial support, and bureaucratic constraints, can hinder the team's effort to implement and maintain such practices.

Individual clinicians' lack of necessary knowledge and skills

Service providers who are expected to assimilate evidence-based programs into their day-to-day practice need to have mastered a basic set of competencies. Staff members who lack these skills are not able to carry out either the simple interventions that constitute the status quo or the new ways of providing service outlined by innovative practices (9,10). Job task analyses have outlined several levels of competency that are necessary to implement the evidence-based practices that have been highlighted in the Psychiatric Services series (11).

Three clusters of competencies emerge from these job analyses (12,13). First, service providers need to acquire the attitudes that are the foundation of evidence-based practices (14,15,16). Most important among these attitudes is a change from viewing treatment as mostly custodial—that is, "the goal is to provide asylums where people with serious mental illness can live out their lives protected from their community"—to perceiving services as adjuncts to helping people regain a place in the community.

Service providers also need a broad range of knowledge to be able to assimilate evidence-based practices. Two specific knowledge bases are especially important: information about the impact of serious psychiatric disabilities—for example, psychiatric symptoms, social dysfunction, course of the disorders, and impact on family—and information about pharmacological and psychosocial interventions. Finally, service providers need to master a series of skills, basic behavioral tools that are essential for actual implementation of evidence-based practices (17). These skills include interpersonal support, instrumental support, goal setting, and skills training.

The depth and breadth of knowledge and skills needed by individual service providers will vary depending on their role in the service plan. These roles are not necessarily wedded to professional disciplines but rather represent the provider's level of responsibility in specific treatment plans. In a classic paper, Bernstein (18) made a distinction about key roles in behavior therapy that applies here. She distinguished between the behavioral engineers—experts who are charged with developing an intervention plan—and behavioral technicians—service providers who will carry out the plan.

At a minimum, behavioral technicians must have mastered the skills necessary to implement assigned components of the treatment plan. Behavioral engineers need a much broader perspective; not only must they be expert in specific skills, but they also must have mastered the range of theories that guide the specific treatment plan.

A third person has a key role in the implementation and maintenance of evidence-based practices: the individual consumer (19,20). Rather than being passive vessels of treatment, consumers of mental health services are active members of the team and take part in deciding on goals and designing interventions.

There are several reasons why individual staff members may lack the appropriate attitudes, knowledge, and skills. Some never participated in formal (preservice) training to learn them. Many staff members who provide psychosocial services to persons with mental illness have little more than a high-school education. Others receive training that is not germane to the principles and practices outlined by evidence-based practices. For example, students in some psychological training programs learn projective testing and psychodynamic therapy techniques (21,22), neither of which has been shown to be useful in treating disabilities among people with serious mental illness (23).

Training in specific disciplines may also have unintended consequences for some evidence-based practices (24,25). For example, the kind of quantitative assessment that is fundamental to many evidence-based practices may seem incongruent with some of the basic tenets learned in nursing and medical schools (26,27); some providers believe that quantification is contrary to caregiving.

Some service providers have the necessary knowledge and skills but are unable to implement evidence-based guidelines because they are burned out. Of the various models of job burnout, the paradigm of Maslach and colleagues (28) has been especially useful for understanding the experiences of health care providers. Their model includes three components: reduced personal accomplishment, or absence of feelings of competence and success because of job stress; emotional exhaustion, or feelings of being emotionally overextended and physically drained from work; and depersonalization, or an impersonal response style to health care consumers.

Research has repeatedly shown that staff members who have high levels of emotional exhaustion and depersonalization are less likely to be aware of or implement innovative approaches to human service (29,30). Moreover, they are less interested in learning new treatment approaches, which is unfortunate, because these approaches could provide the type of knowledge and skills that would help them address work stressors and counteract burnout. Some staff members adopt an attitude that if they ignore the evidence-based treatment, it will go away.

Provider teams' difficulty in developing a cohesive service plan

Staff burnout is also associated with diminished collegial support; mental health providers who report a lack of cooperation and collaboration with peers on the treatment team are likely to be emotionally exhausted at work (31,32). This is a troublesome phenomenon, because the success of many evidence-based practices requires the coordinated efforts of the treatment team. Assertive community treatment, services for people with dual diagnoses, and supported employment all require a multidisciplinary group of service providers who can integrate their unique skills into an effective and dynamic plan for each consumer. Service providers who are unable to work together as a team will not be able to develop a plan that is sufficiently broad to meet the exigencies of most evidence-based guidelines (33,34).

Moreover, these providers are unlikely to be able to follow a plan dynamically—that is, to change key parameters as the needs, resources, and skills of the individual consumer become more apparent. Finally, a disjointed team may not develop a collaborative relationship with the consumer but will develop instead a unilateral plan, in much the same way as the team members work with each other (19).

The research has identified several reasons—in addition to individual staff burnout—why collegial support on a treatment team fails to develop. Often, team members perceive a lack of control over programmatic decisions. They believe that service innovations, and their corresponding training initiatives, reflect the interests of the administrators rather than representing what line-level service providers believe to be the key needs and concerns of their clients (33,35). Administrative priorities are often perceived as reflecting abstract political interests rather than the more pressing needs of the team and its clients. Furthermore, treatment teams often report that their efforts are bogged down by bureaucratic constraints (33,36), including the paperwork and other documentation that are needed to track the implementation and impact of services. Staff members frequently feel that these kinds of efforts take time away from the essential aspect of their job, which is interacting with consumers.

A third variable that is critical to teamwork is leadership. Does the person who is responsible for administering the team and supervising team members have the necessary skills to do so? Two leadership styles clearly undermine teamwork. Passive management-by-exception leaders respond to organizational issues only when they arise as barriers to performance or exceptions to standard practices (37,38). This approach can be troublesome because it focuses on staff errors; overreliance on corrective management in the absence of positive feedback may demoralize staff. Laissez-faire leaders are uninvolved in, or indifferent to, the day-to-day activities of their staff (39). This kind of hands-off leadership typically undermines collegiality among team members, which in turn diminishes the implementation of effective treatments.

Limited time for training

The above review clearly suggests a direction for training: interventions that foster individual staff training and organizational development. Unfortunately, this need presents an interesting conundrum. The group that most requires training and development is staff who work in public-sector programs that are overwhelmed by the number of clients needing service and by the lack of resources to support these services. These staff members typically report that more than eight hours of their workday are filled with direct service activities. How, then, can they take time away from the field to participate in training?

Administrators will echo this concern. Service providers who are tied up in training programs are not providing billable services; hence agency income, which is already meager, is further limited. Any effort to boost the skills of individual staff and the treatment team must take this dilemma into account. In particular, training technologies that can be provided at the job site and that are quickly transferable to the practice environment have priority.

Strategies to facilitate dissemination

A variety of strategies enhance the dissemination and regular implementation of evidence-based practices. Three are reviewed here: manual- and guideline-based strategies that make evidence-based practices more user-friendly to line-level staff; education-based approaches that increase staff members' knowledge and skills; and organization-based strategies that enhance the team's ability to work collaboratively.

Making manuals and guidelines user-friendly

Some evidence-based practices have components that interfere with their transfer from the academic settings in which they were developed to real-world settings. For example, the resources and policies that foster development of an innovative practice in a research environment may not parallel the demands of consumers, providers, and administrators in the public mental health system (33). Moreover, evidence-based practices that have survived clinical trials are often steeped in jargon and principles that are unintelligible to those who do not work in academic settings (36,40). The development of treatment manuals and practice guidelines is a key strategy for making evidence-based practices more accessible to line-level staff.

Treatment manuals originally were designed to ensure that clinicians participating in efficacy studies were implementing interventions according to the protocol (41,42). These manuals spell out the specific steps through which therapists must guide consumers to accomplish the goals of the service. As the manuals were being developed, dissemination investigators realized that such manuals provide a technology for overcoming some of the translation barriers posed by evidence-based treatments. Treatment manuals and practice guidelines have been successfully developed and disseminated for social skills training programs (43,44), family treatment (16,45), supported employment (46), and assertive community treatment (47,48).

Manuals for evidence-based treatments serve several purposes (49). The microskills contained in many treatment manuals can be quickly learned by line-level generalists, so there is no need to hire specially trained and costly professionals to implement the programs. Treatment manuals often have high face validity, which improves the likelihood that provider staff will understand the treatment's rationale and implement the technology. The manuals typically have built-in fidelity systems that practitioners can use to ensure that they are implementing the program correctly. These systems may also include outcome assessments that assist staff in determining whether consumer goals are being accomplished.
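
In practice, such a built-in fidelity system can be as simple as a checklist that a practitioner or supervisor scores at regular intervals. The sketch below is purely illustrative; the checklist items, the 80 percent threshold, and the reporting format are assumptions and are not drawn from any particular published manual.

```python
# Hypothetical fidelity checklist of the kind a treatment manual might include.
# Items and the 80 percent threshold are illustrative assumptions only.

FIDELITY_ITEMS = [
    "weekly team meeting held",
    "consumer present when goals were set",
    "skills practiced with role play",
    "homework assigned and reviewed",
    "progress recorded on outcome form",
]

def fidelity_score(ratings: dict[str, bool]) -> float:
    """Return the proportion of checklist items implemented as intended."""
    completed = sum(1 for item in FIDELITY_ITEMS if ratings.get(item, False))
    return completed / len(FIDELITY_ITEMS)

def fidelity_report(ratings: dict[str, bool], threshold: float = 0.8) -> str:
    """Summarize fidelity and flag items that need corrective attention."""
    score = fidelity_score(ratings)
    missed = [item for item in FIDELITY_ITEMS if not ratings.get(item, False)]
    status = "adequate" if score >= threshold else "needs corrective review"
    return f"Fidelity {score:.0%} ({status}); missed items: {missed or 'none'}"

if __name__ == "__main__":
    week_ratings = {
        "weekly team meeting held": True,
        "consumer present when goals were set": True,
        "skills practiced with role play": False,
        "homework assigned and reviewed": True,
        "progress recorded on outcome form": True,
    }
    print(fidelity_report(week_ratings))  # Fidelity 80% (adequate); one item missed
```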

Despite these benefits, manuals have some limitations that need to be addressed in future work. Practice guidelines and treatment manuals vary in level of detail and specific guidance. Most were not developed with the high school-educated worker as the intended provider audience and therefore require additional translation and elaboration. Manuals were not meant to be stand-alone dissemination strategies. Rarely do innovators believe that the information in these manuals can be implemented without some basic training. Education strategies are often combined with manuals to enhance dissemination and implementation.

Educating staff on skills and principles

Education programs target two different groups of people: those in preservice training—for example, students preparing for a career in mental health services—and those in in-service training—for example, paraprofessionals and professionals who must learn recently developed evidence-based approaches to update their practices. In terms of preservice training, university-based curriculum programs have identified a variety of competencies needed for contemporary positions (50,51,52). Innovators in this area who are training future behavioral technicians prefer to target discrete skills rather than to overwhelm students with the breadth of facts and principles that constitute the body of knowledge related to mental health. Follow-up research has shown that students who complete these curricula actually pursue careers in their discipline and report that they were adequately prepared for their jobs (52).

A larger body of research has examined the impact of in-service training on the day-to-day practices of mental health treatment providers. Like preservice education, in-service programs for line-level staff have focused on teaching discrete skills that make up the evidence-based practice (53,54). Trainers use such learning activities as modeling, role play, feedback, and homework to help staff learn new skills and apply them in their treatment settings. Frequently these programs are paired with education for staff on how to use manual-based practices.

Research has shown that service providers who complete these kinds of training programs have improved attitudes about innovative practices (55,56,57,58,59,60), learn more skills (61,62,63), and show some use of the skills at their practice setting (33,44,45,62). Skills learned in in-service training are likely to be maintained over time when training is paired with ongoing, regular consultation (16,64,65).

Although education-based approaches are a necessary first step in disseminating evidence-based practices, they have two significant limitations. First, many professionals and paraprofessionals decide not to participate in staff education programs or drop out before training is completed (54,65,66). As a result, a significant portion of the provider population never receives training. Second, treatment providers who learn new skills during in-service training fail to develop enduring treatment services based on these skills (44,49).

In an attempt to identify factors that promote technology transfer, researchers have compared programs that seem to embrace innovative ideas with those that remain stuck in old ways and hence fail to benefit from education approaches (49). They found that in the first case, "innovators" within the program had sufficient organizational commitment and administrative support to introduce and maintain innovations within their teams. Industrial and organizational psychologists have developed a variety of strategies to foster this kind of commitment and support. Some of these strategies are reviewed below.

Education programs are more successful when the use of new skills by staff members is reinforced. Organizational behavior management is the application of behavior modification principles to reinforce individual and group behaviors within organizations (67,68,69). For example, organizational consultants may teach supervisors and staff how to use work-related rewards, such as extra days off, monetary bonuses, and prime parking spaces, to increase the use of newly learned skills. Organizational behavior management, which is based on B. F. Skinner's form of applied behavior analysis, yields several benefits as a dissemination strategy.

First, it provides the staff developer with a broad and empirically well-supported theoretical perspective for understanding staff behaviors. Professionals who are familiar with behavior modification can quickly master the fundamentals of the strategy (70). Second, this theoretical perspective provides a useful method for tracking the effects of organizational behavior management as well as a useful "bag of tricks" from which training consultants might select such interventions as goal setting (71) and performance feedback (72) to effect desired staff behavior.
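
To make the goal-setting and performance-feedback elements concrete, the following minimal sketch tracks one staff member's weekly use of a newly learned skill and generates reinforcing or corrective feedback. The weekly goal, the reward rule, and all names are hypothetical illustrations rather than elements of the programs cited above.

```python
# Minimal sketch of goal setting and performance feedback in an
# organizational behavior management program. The weekly goal of five
# skill uses and the reward rule are illustrative assumptions.

from dataclasses import dataclass, field

@dataclass
class StaffRecord:
    name: str
    weekly_goal: int = 5              # target number of skill uses per week
    weekly_counts: list[int] = field(default_factory=list)

    def log_week(self, count: int) -> str:
        """Record one week's count of newly learned skill uses and return feedback."""
        self.weekly_counts.append(count)
        if count >= self.weekly_goal:
            return f"{self.name}: goal met ({count}/{self.weekly_goal}); schedule agreed-upon reward."
        shortfall = self.weekly_goal - count
        return f"{self.name}: {shortfall} short of goal; review barriers with supervisor."

if __name__ == "__main__":
    staff = StaffRecord(name="case manager A")
    print(staff.log_week(6))   # goal met -> positive reinforcement
    print(staff.log_week(3))   # below goal -> corrective feedback, not punishment
```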

Improving organizational dynamics

A team of treatment providers may not be interacting cohesively for several reasons, as outlined above. Each of these reasons suggests necessary foci for organization-based strategies. Research conducted by organizational psychologists and management experts from the business sector points to some of these strategies.

Improving team leadership. Research has identified two types of leadership skills that are especially effective for the services team: transformational and transactional (73,74). Leaders who use transformational skills encourage team members to view their work from more elevated perspectives and to develop innovative ways to deal with work-related problems. Specific skills related to transformational leadership promote inspiration, intellectual stimulation, and individual consideration. Transactional leadership skills include goal setting, feedback, self-monitoring, and reinforcement strategies that help team members maintain effective programs.

Two studies have examined whether leadership models that were primarily developed in business and military settings are relevant for mental health and rehabilitation teams (75,76). Because the goals and tasks that define mental health settings are different from those in industrial and military systems, one might expect that the leadership needs of mental health teams would not be explained by investigations conducted in those systems. However, the research does not support this concern. Findings from studies of more than 1,000 staff members working in human service settings showed that independent groups of mental health providers (77) and rehabilitation providers (76) identified leadership factors that paralleled transformational and transactional leadership. Findings from these studies were then discussed with eight focus groups comprising team members and leaders to develop a curriculum for mental health team leaders (77). Subsequent research found that team leaders who participated in training in this curriculum showed significant improvement in individualized consideration and supervisory feedback (78). Improved leadership has also been associated with consumer satisfaction and quality of life (78).

Total quality management. Targeting leadership skills may often be insufficient to improve teamwork and collaboration. Organizational psychologists have developed total quality management strategies that are useful for facilitating a team's ability to work together and implement effective intervention programs (79,80,81,82). Three principles are central to total quality management. First, total quality management is a set of organizational development strategies that attempts to improve the quality and productivity of the work environment from the bottom up—for example, from the level of the case manager, the job coach, and the rehabilitation counselor charged with the day-to-day implementation of the program. Supervisors and administrators are frequently removed from day-to-day affairs and therefore are not aware of immediate programmatic needs (83,84,85).

Second, development efforts need to be driven by data rather than by opinion (86,87). Hence employees need to collect objective information to identify program needs and client progress. Employees must also collect data to assess the impact of any program development. Finally, total quality management values continuous quality improvement. Staff members are therefore required to make explicit decisions about the program that will improve the quality and productivity of the work environment over time.
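
The data-driven principle can be illustrated with a minimal sketch in which staff log an objective indicator each month and compare recent performance with an initial baseline period. The indicator, the three-month baseline, and the simple decision rule are assumptions made for illustration, not part of any published total quality management protocol.

```python
# Illustrative sketch of data-driven continuous quality improvement:
# compare recent months of an objective indicator with an initial baseline.
# The indicator and decision rule are assumptions for illustration.

from statistics import mean

def quality_trend(monthly_values: list[float], baseline_months: int = 3) -> str:
    """Compare recent months against the initial baseline period."""
    if len(monthly_values) <= baseline_months:
        return "Keep collecting data; baseline period not yet complete."
    baseline = mean(monthly_values[:baseline_months])
    recent = mean(monthly_values[baseline_months:])
    if recent > baseline:
        return f"Improving: baseline {baseline:.1f} -> recent {recent:.1f}; continue current plan."
    return f"Not improving: baseline {baseline:.1f} -> recent {recent:.1f}; revisit program decisions."

if __name__ == "__main__":
    # e.g., percentage of consumers with an up-to-date, consumer-approved treatment plan
    print(quality_trend([52.0, 55.0, 54.0, 61.0, 66.0, 70.0]))
```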

Organizational decision-making efforts, such as those supported by total quality management, often fall short when they are general and not specific to the needs of the staff or when they are not conducted for a significant length of time (88,89,90). Therefore, total quality management efforts that seek to increase evidence-based approaches need to focus specifically on composite skills for an extended period. Few studies of this management style have addressed evidence-based mental health practices, although there has been some study of efforts that affected charting and data-gathering activity in service settings (91,92).

Interactive staff training. Interactive staff training represents an integration of the education approach and the total quality management approach to dissemination (93). As such, it varies from more traditional training efforts in two important ways. First, training focuses on the team in its practice setting; in this way, team members can work together to learn new practices and form them into a viable plan for their agency. Second, it encourages the development of user-friendly programs. Interactive staff training accomplishes these goals by walking the team through four stages.

Stage 1 provides an introduction to the system. Consultants who provide interactive staff training usually come from outside the team. Hence they need to gain the trust of team members before significant training and program development occurs (94). This trust can often be obtained by beginning the training effort with a needs assessment; the message here is that the team best knows its own training needs. Individuals from within the existing team are then assembled as a program committee charged with making preliminary decisions about how to implement the selected intervention. One person from the committee is chosen to champion the training and development effort (21).

In stage 2 a program is developed. Interactive staff training consultants work with the program champion and committee to make specific decisions about which evidence-based practice best meets their needs. The training consultant uses this opportunity to educate committee members and other key staff about the principles and services of the selected intervention. The consultant then engages the program committee in making decisions about how the ideal program will be adapted to meet the needs of participants and staff. Consultants use their expertise to help the committee evaluate initial decisions. Socratic questioning is a useful means for accomplishing this goal (95); rather than pointing out a program's weaknesses or limitations, the consultant uses questions to help the champion and the program committee evaluate for themselves the costs and benefits of specific program choices.

Stage 3 focuses on program implementation. Before a full-fledged trial of the program occurs, the committee pilots a draft program to uncover potential weaknesses. Pilot programs are conducted with a subgroup of team members and a subset of program participants. The program committee then uses a problem-solving approach to resolve difficulties discovered in the pilot program. Through this process, program committees and treatment teams are taught that limitations in an evidence-based program are problems that can be fixed, rather than overwhelming difficulties that indicate that the program should be abandoned.

Stage 4 covers program maintenance. In the final stage, the team sets up structures that help maintain the newly developed package over the long term. Staff members are encouraged to brainstorm to produce questions about the efficacy of the program that lead to suggestions for correcting problems. The program committee then collects data to determine program efficacy in terms of the specific questions (96,97). The committee uses the data to adjust the program where needed.

Three studies have examined the impact of interactive staff training on participating staff and their clients. The first study evaluated the impact of nine months of interactive staff training on attitudes and burnout levels of 35 participating staff members (98). The results showed significant reductions in burnout and improvements in collegial support and in attitudes about program development.

The second study used a team level of analysis to examine whether interactive staff training led to actual change in the behavior of staff who were conducting a rehabilitation program in a residential setting (93). The results showed increases in staff participation in evidence-based services from zero to more than 75 percent of team members. Moreover, the proportion of consumers who participated in targeted strategies rose from less than 10 percent to more than 85 percent of program participants.

The third study obtained similar results with a time series design that measured changes in staff behavior related to the evidence-based program and consumers' response to that program (93). In that study, burnout diminished among all team members. The staff's attitudes about innovations and the actual implementation of the innovations improved significantly. Consumer satisfaction with the program improved, and overall consumer outcome, as measured with the Global Assessment of Functioning, showed significant improvement.

Limitations to research on dissemination programs

There is a meta-message in this paper, namely, that we need to adopt an evidence-based approach to evaluating the dissemination of evidence-based treatments. Although some of the strategies used to evaluate clinical services might be relevant for evaluating dissemination practices, a paradigm for the complete evaluation of transfer strategies would not have the same basic assumptions as one for clinical research (42,93).

Dissemination researchers agree that the comprehensive assessment of training efforts must include five progressively important levels of measurement (99): Did staff participants find the training program interesting and satisfactory? Did training increase the staff's knowledge and skills? Did increased knowledge and skills lead to real changes in the service program? Did consumers report greater satisfaction with the program as a result of these changes? Were consumers better able to cope with their disabilities as a result of program changes?

Much of the dissemination research reviewed in this paper targets the first levels of impact without determining whether staff practices or consumer outcomes actually improve after the dissemination effort.

The standard design for outcomes evaluation in clinical and services research is the randomized controlled trial. However, two obstacles interfere with randomization in dissemination research. First, administrative rules—including union contracts—may prevent the assignment of staff to conditions that represent independent arms of a dissemination impact study. Even when such assignments are permitted, administrators might restrict the breadth and depth of process and outcome measures so as not to interfere with work. Similar concerns are periodically raised as reasons to constrain randomized controlled trials in clinical and service settings, and these limitations have been overcome when administrators and policy makers have recognized the importance of this kind of research design.

Second, the rationale behind randomization requires individual service providers to be assigned to different service teams, each corresponding to a unique dissemination strategy. However, the unit of interest in dissemination research may be the impact of a specific dissemination strategy on the service team; for example, does a specific educational approach improve the assertive community treatment team's ability to provide case management services? Randomly assigning treatment providers to different teams would be like randomly assigning relatives to different families. Moreover, the required size of a randomized trial increases dramatically when the unit of analysis is shifted from individual providers to treatment teams. A study with 60 case managers randomly assigned to a dissemination strategy may have adequate statistical power, but a sample of 60 teams would require many hundreds of staff members across several institutions.
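
One standard way to quantify this inflation, although it is not discussed in the paper, is the design effect used for cluster-randomized designs. The sketch below assumes an intraclass correlation of 0.15, a value chosen purely for illustration, and shows how the number of effectively independent providers shrinks once providers are nested within teams.

```python
# Sketch of the sample-size inflation that comes from treating teams,
# rather than individual providers, as the unit of analysis. Uses the
# standard design effect for cluster-randomized designs,
#   DEFF = 1 + (m - 1) * ICC,
# where m is providers per team and ICC is the intraclass correlation.
# The ICC of 0.15 is an assumption for illustration only.

def design_effect(providers_per_team: int, icc: float) -> float:
    return 1 + (providers_per_team - 1) * icc

def effective_n(total_providers: int, providers_per_team: int, icc: float) -> float:
    """Number of statistically independent providers after accounting for clustering."""
    return total_providers / design_effect(providers_per_team, icc)

if __name__ == "__main__":
    # 60 case managers analyzed individually vs. nested in teams of 10
    print(effective_n(60, providers_per_team=1, icc=0.15))   # 60.0: no clustering penalty
    print(effective_n(60, providers_per_team=10, icc=0.15))  # about 25.5 effective providers
```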

Industrial and organizational psychologists—for example, Dansereau and colleagues (100)—have developed research strategies that address this sampling problem. Moreover, quasi-experimental research methods, such as the time series design, or single-subject design methods, such as the multiple-baseline approach, may have to be considered more thoroughly for staff dissemination research.

Evidence-based practices offer great promise for helping people who have serious mental illnesses accomplish their life goals. Some of the staff dissemination practices outlined in this paper will enable service providers in real-life programs to help clients in their efforts. Ongoing research to address the concerns outlined here will ensure that mental health systems use evidence-based dissemination strategies to guide the transfer of effective practices into the practice world.

Acknowledgments

This paper was made possible in part by a grant from the Illinois Office of Mental Health to establish the Illinois Staff Training Institute for Psychiatric Rehabilitation at the University of Chicago.

Dr. Corrigan, Dr. McCracken, Ms. Blaser, and Dr. Barr are with the University of Chicago Center for Psychiatric Rehabilitation, 7230 Arbor Drive, Tinley Park, Illinois 60477 (e-mail, ). Dr. Steiner is with the Illinois Office of Mental Health.

References

1. Bond G, Becker D, Drake R, et al: Implementing supported employment as an evidence-based practice. Psychiatric Services 52:313-322, 2001

2. Drake RE, Essock SM, Shaner A, et al: Implementing dual diagnosis services for clients with severe mental illness. Psychiatric Services 52:469-476, 2001

3. Phillips SD, Burns BJ, Edgar EJ, et al: Moving assertive community treatment into standard practice. Psychiatric Services 52:771-780, 2001

4. Mellman T, Miller A, Weissman E, et al: Evidence-based pharmacologic treatment for people with severe mental illness: a focus on guidelines and algorithms. Psychiatric Services 52:619-625, 2001

5. Rosenberg SD, Mueser KT, Friedman MJ, et al: Developing effective treatments for posttraumatic disorders among people with severe mental illness. Psychiatric Services 52:1453-1461, 2001

6. Dixon L, McFarlane WR, Lefley H, et al: Evidence-based practices for services to families of people with psychiatric disabilities. Psychiatric Services 52:903-911, 2001

7. Kuipers E: Psychological treatments for psychosis: evidence-based but unavailable? Psychiatric Rehabilitation Skills 4:249-258, 2000

8. Steiner L: Psychiatric rehabilitation in the state mental health system. Psychiatric Rehabilitation Skills (special issue), in press

9. Blair CE, Eldridge EF: An instrument for measuring staff's knowledge of behavior management principles (KBMQ) as applied to geropsychiatric clients in long-term care settings. Journal of Behavior Therapy and Experimental Psychiatry 28:213-220, 1997

10. Dipboye R: Organizational barriers to implementing a rational model of training, in Training for a Rapidly Changing Workplace: Applications of Psychological Research. Edited by Quinones MA, Ehrenstein A. Washington, DC, American Psychological Association, 1997

11. Giffort D: A systems approach to developing staff training. New Directions for Mental Health Services, no 79:25-34, 1998

12. Jonikas J: Staff competencies for service delivery staff in psychosocial rehabilitation programs: a review of the literature. Unpublished manuscript, 1994

13. Torrey W, Bebout R, Kline J, et al: Practice guidelines for clinicians working in programs providing integrated vocational and clinical services for persons with severe mental disorder. Psychiatric Rehabilitation Journal 2:388-393, 1998

14. Haddow M, Milne D: Attributes to community care: development of a questionnaire for professionals. Journal of Mental Health 4:289-296, 1995

15. Good T, Berenbaum H, Nisenson L: Residential caregiver attitudes toward seriously mentally ill persons. Psychiatry 63:23-33, 2000

16. McFarlane WR, McNary S, Dixon L, et al: Predictors of dissemination of family psychoeducation in community mental health centers in Maine and Illinois. Psychiatric Services 52:935-942, 2001

17. Jahr E: Current issues in staff training. Research in Developmental Disabilities 19:73-87, 1998

18. Bernstein G: Training behavior change agents: a conceptual review. Behavior Therapy 13:1-23, 1982

19. Onyett S: Understanding relationships in context as a core competence for psychiatric rehabilitation. Psychiatric Rehabilitation Skills 4:282-299, 2000

20. Rapp C, Wintersteen R: The strengths model of case management: results from twelve demonstrations. Psychosocial Rehabilitation Journal 13(2):23-32, 1989

21. Corrigan PW: Wanted: champions of rehabilitation for psychiatric hospitals. American Psychologist 50:514-521, 1995

22. Corrigan PW, Hess L, Garman AN: Results of a job analysis of psychologists working in state hospitals. Journal of Clinical Psychology 54:1-8, 1998

23. Mueser K, Berenbaum H: Psychodynamic treatment of schizophrenia: is there a future? Psychological Medicine 20:253-262, 1990

24. Dickerson F: Hospital structure and professional roles, in Handbook of Behavior Therapy in the Psychiatric Setting. Edited by Bellack AS, Hersen M. New York, Plenum, 1993

25. Swiezy N, Matson J: Coordinating the treatment process among various disciplines: behavior analysis and treatment, in Applied Clinical Psychology. Pacific Grove, Calif, Brooks/Cole, 1993

26. Hersen M, Bellack A, Harris F: Staff training and consultation, in Handbook of Behavior Therapy in the Psychiatric Setting. Edited by Bellack AS, Hersen M. New York, Plenum, 1993

27. Silverstein SM, Bowman J, McDugh D: Strategies for hospital-wide dissemination of psychiatric rehabilitation interventions. Psychiatric Rehabilitation Skills 2:1-24, 1997

28. Maslach C, Jackson S, Leiter M: Maslach Burnout Inventory: third edition, in Evaluating Stress: A Book of Resources. Edited by Zalaquett CAP, Wood RD. Lanham, Md, Scarecrow Press, 1997

29. Corrigan PW: Differences between clinical and nursing inpatient staff: implications for training in behavioral rehabilitation. Journal of Behavior Therapy and Experimental Psychiatry 25:311-316, 1994

30. Donat D, McKeegan G: Behavioral knowledge and occupational stress among inpatient psychiatric caregivers. Psychiatric Rehabilitation Journal 21:67-69, 1997

31. Corrigan PW, Holmes EP, Luchins D, et al: Staff burnout in psychiatric hospitals: a cross-lagged panel design. Journal of Organizational Behavior 15:65-74, 1994

32. Corrigan PW, Williams OB, McCracken SG, et al: Staff attitudes that impede the implementation of behavioral treatment programs. Behavior Interventions 9:1-12, 1998

33. Milne D, Gorenski O, Westerman C, et al: What does it take to transfer training? Psychiatric Rehabilitation Skills 4:259-281, 2000

34. Walko S, Pratt C, Siiter R, et al: Predicting staff retention in psychiatric rehabilitation. Psychosocial Rehabilitation Journal 16(3):150-153, 1993

35. Reid D, Everson J, Green C: A systematic evaluation of preferences identified through person-centered planning for people with profound multiple disabilities. Journal of Applied Behavior Analysis 32:467-477, 1999

36. Corrigan PW, Kwartarini WY, Pramana W: Staff perceptions of barriers to behavior therapy in a psychiatric hospital. Behavior Modification 16:132-144, 1992

37. Bass B: From transactional to transformational leadership: learning to share the vision. Organizational Dynamics 18:19-31, 1990

38. Howell J, Hall-Merenda K: The ties that bind: the impact of leader-member exchange, transformational and transactional leadership, and distance on predicting follower performance. Journal of Applied Psychology 84:680-694, 1999

39. Sosik J, Dionne S: Leadership styles and Deming's behavior factors. Journal of Business and Psychology 11:447-462, 1997

40. Barlow DH: On the relation of clinical research to clinical practice: current issues, new directions. Journal of Consulting and Clinical Psychology 49:147-155, 1981

41. Addis M, Wade W, Hatgis C: Barriers to dissemination of evidence-based practices: addressing practitioners' concerns about manual-based psychotherapies. Clinical Psychology 6:430-441, 1999

42. Anthony W: Psychiatric rehabilitation technology: operationalizing the "black box" of the psychiatric rehabilitation process. New Directions for Mental Health Services, no 79:79-87, 1998

43. Eckman T, Liberman R, Phipps C, et al: Teaching medication management skills to schizophrenic patients. Journal of Clinical Psychopharmacology 10:33-38, 1990

44. Wallace CJ, Liberman RP, MacKain SJ, et al: Effectiveness and replicability of modules for teaching social and instrumental skills to the severely mentally ill. American Journal of Psychiatry 149:654-658, 1992

45. Kavanagh DJ, Piatkowska O, Clark D, et al: Application of cognitive behavioral interventions for schizophrenia in multidisciplinary teams: what can the matter be? Australian Psychologist 28:181-188, 1993

46. Drake R, Becker D, Clark P, et al: Research on the individual placement and support model of supported employment. Psychiatric Quarterly 70(4):289-301, 1999

47. Cohen M, Farkas M, Nemec P: Psychiatric rehabilitation programs: putting concepts into practice? Community Mental Health Journal 24:7-21, 1988

48. Test MA, Stein L: Practical guidelines for the community treatment of markedly impaired patients. Community Mental Health Journal 36:47-60, 2000

49. Corrigan PW, MacKain SJ, Liberman RP: Skills training modules: a strategy for dissemination and utilization of a rehabilitation innovation, in Intervention Research. Edited by Rothman J, Thomas E. Chicago, Haworth, 1994

50. Anthony W, Cohen M, Farkas M, et al: The chronically mentally ill case management: more than a response to a dysfunctional system. Community Mental Health Journal 24:219-228, 1988

51. Backs AB, Giffort DW, McCracken SG, et al: Public academic training partnership for paraprofessionals who provide psychiatric rehabilitation. Psychiatric Rehabilitation Skills, in press

52. Gill KJ, Pratt CW, Barrett N: Preparing psychiatric rehabilitation specialists through undergraduate education. Community Mental Health Journal 33:323-329, 1997

53. Pratt C, Gill K: Profit sharing in psychiatric rehabilitation: a five-year evaluation. Psychosocial Rehabilitation Journal 17(2):33-41, 1993

54. Rogers ES, Cohen BF, Danley KS, et al: Training mental health workers in psychiatric rehabilitation. Schizophrenia Bulletin 12:709-719, 1986

55. Addleton R, Tratnack S, Donat D: Hospital-based multidisciplinary training in the care of seriously mentally ill patients. Hospital and Community Psychiatry 42:60-61, 1991

56. Barnett JE, Clendenen F: The quality journey in a comprehensive mental health center. JCAHO Journal on Quality Improvement 22:8-17, 1996

57. Berryman J, Evans IM, Kalbag A: The effects of training in nonaversive behavior management on the attitudes and understanding of direct care staff. Journal of Behavior Therapy and Experimental Psychiatry 25:241-250, 1994

58. Cook J, Yamaguchi J, Solomon M: Field-testing a post-secondary faculty in-service training for working with students who have psychiatric disabilities. Psychosocial Rehabilitation Journal 17(2):157-170, 1993

59. Liberman RP, Eckman T: Dissemination of skills training modules to psychiatric facilities: overcoming obstacles to the utilisation of a rehabilitation innovation. British Journal of Psychiatry 155(suppl 5):117-122, 1989

60. Lam C, Chan F, Hillburger J: Canonical relationships between vocational interests and attitudes. Vocational Evaluation and Work Adjustment Bulletin 26:155-160, 1993

61. Cook J, Horton-O'Connell T, Fitzgibbon G, et al: Training for state-funded providers of assertive community treatment. New Directions for Mental Health Services, no 79:55-64, 1998

62. Fadden G: Implementation of family interventions in routine clinical practice following staff training programs: a major cause for concern. Journal of Mental Health (UK) 6:599-612, 1997

63. Rubel E, Sobell L, Miller W: Do continuing education workshops improve participants' skills? Effects of a motivational interviewing workshop on substance abuse counselors' skills and knowledge. AABT Behavior Therapist (Association for Advancement of Behavior Therapy) 23:73-80, 2000

64. Donat D: Impact of a mandatory behavioral consultation on seclusion/restraint utilization in a psychiatric hospital. Journal of Behavior Therapy and Experimental Psychiatry 29:13-19, 1998

65. Liberman RP, Eckman T, Kuehnel T, et al: Dissemination of new behavior therapy programs to community mental health programs. American Journal of Psychiatry 139:224-226, 1982

66. Liberman RP, Nuechterlein K, Wallace C: Social skills training and the nature of schizophrenia, in Social Skills Training: A Practical Handbook for Assessment and Treatment. Edited by Curran JP, Monti PM. New York, New York University Press, 1986

67. Bucklin B, Alvero A, Dickinson A, et al: Industrial-organizational psychology and organizational behavior management: an objective comparison. Journal of Organizational Behavior Management 20:27-75, 2000

68. Frederiksen L: Handbook of Organizational Behavior Management. New York, Wiley, 1982

69. Reid D, Parsons M: Organizational behavior management in human service settings, in Handbook of Applied Behavior Analysis. Edited by Austin J, Carr JE. Reno, Nev, Context Press, 2000

70. Milne D: Organizational behavior management in a psychiatric day hospital. Behavioral Psychotherapy 16:177-188, 1988

71. Calpin J, Edelstein B, Redmon W: Performance feedback and goal setting to improve mental health center staff productivity. Journal of Organizational Behavior Management 9:35-58, 1988

72. Green C, Reid D, Perkins L, et al: Increasing habilitative services for persons with profound handicaps: an application of structural analysis to staff management. Journal of Applied Behavior Analysis 24:459-471, 1991

73. Avolio B, Bass B, Jung D: Re-examining the components of transformational and transactional leadership using the multifactor leadership questionnaire. Journal of Occupational and Organizational Psychology 72:441-462, 1999

74. Bass B: Current developments in transformational leadership: research and applications. Psychologist-Manager Journal 3(1):5-21, 1999

75. Corrigan PW, Garman AN, Lam C, et al: What mental health teams want in their leaders. Administration and Policy in Mental Health 26:111-124, 1998

76. Corrigan PW, Garman AN, Canar J, et al: Characteristics of rehabilitation team leaders: a validation study. Rehabilitation Counseling Bulletin 42:186-195, 1999

77. Garman AN, Corrigan PW: Developing effective team leaders. New Directions for Mental Health Services, no 79:45-54, 1998

78. Corrigan PW, Lickey SE, Campion J, et al: A short course in leadership skills for the rehabilitation team. Journal of Rehabilitation 66:56-58, 2000

79. Deming WE: Out of the Crisis. Cambridge, Mass, Massachusetts Institute of Technology, Center for Advanced Engineering Study, 1986

80. Victor B, Boynton A, Stephens-Jahng T: The effective design of work under total quality management. Organizational Science 2:102-117, 2000

81. Priebe S: Ensuring and improving quality in community mental health care. International Review of Psychiatry 12:226-232, 2000

82. Sluyter G: Total quality management in behavioral health care. New Directions for Mental Health Services, no 79:35-43, 1998

83. Marks ML, Mirvis P, Hackett E, et al: Employee participation in a quality circle program: impact on quality of work life, productivity, and absenteeism. Journal of Applied Psychology 71:61-69, 1986

84. Yeager E: Examining the quality control circle. Personnel Journal 58:682-708, 1979

85. Zemke R: Honeywell imports quality circles as long-term management strategy. Training 17:6-10, 1980

86. Barter JT, Lall K: Accreditation, in Textbook of Administrative Psychiatry. Edited by Talbott JA, Hales RE, Keill SL. Washington, DC, American Psychiatric Press, 1992

87. Fauman M: Quality assurance monitoring, in Manual of Psychiatric Quality Assurance. Edited by Mattson MR. Washington, DC, American Psychiatric Association, 1992

88. Bowditch JL, Buono AF: A Primer on Organizational Behavior, 3rd ed. New York, Wiley, 1994

89. Glaser EM, Backer TE: Organization development in mental health services. Administration in Mental Health 6:195-215, 1979

90. Pearlstein R: Who empowers leaders? Performance Improvement Quarterly 4(4):12-20, 1991

91. Hunter ME, Love CC: Total quality management and the reduction of inpatient violence and costs in a forensic psychiatric hospital. Psychiatric Services 47:751-754, 1996

92. Sluyter GV, Mukherjee AK: Total Quality Management for Mental Health and Mental Retardation Services: A Paradigm for the '90s. Annandale, Va, American Network of Community Options and Resources, 1993

93. Corrigan PW, McCracken SG: An interactive approach to training teams and developing programs. New Directions for Mental Health Services, no 79:3-12, 1998

94. Brooker C, Tarrier N, Barrowclough C: Training community psychiatric nurses for psychosocial intervention. British Journal of Psychiatry 160:836-844, 1992

95. Griffith B, Frieden G: Facilitating reflective thinking in counselor education. Counselor Education and Supervision 40:82-93, 2000

96. Bickman L, Noser K: Meeting the challenges in the delivery of child and adolescent mental health services in the next millennium: the continuous quality improvement approach. Applied and Preventive Psychology 8:247-255, 1999

97. Wandersman A, Imm P, Chinman M, et al: Getting to outcomes: a results-based approach to accountability. Evaluation and Program Planning 23:389-395, 2000

98. Corrigan PW, Holmes EP, Luchins D, et al: The effects of interactive staff training on staff programming and patient aggression in a psychiatric unit. Behavioral Interventions 10:17-32, 1995

99. Thomas EJ, Rothman J: Intervention Research: Design and Development for Human Service. New York, Haworth, 1994

100. Dansereau F, Yammarino F, Kohles J: Multiple levels of analysis from a longitudinal perspective: some implications for theory building. Academy of Management Review 24:346-357, 1999