Over the past decade, journals in both health and mental health services research have paid increasing attention to the potential contribution that qualitative methods can make to health services research (1,2,3,4,5). Some of this focus may be attributed to encouragement from the Services Research Branch of the National Institute of Mental Health (NIMH), whose staff have emphasized the importance of findings offered by qualitative approaches as a means of determining "what works for whom, under what circumstances, and why" (6). Bolstered by this support, many researchers have forged ahead in such endeavors (7,8,9,10,11,12), and their studies have highlighted important and previously unrecognized factors affecting service use and delivery.

Despite this growing interest, mixed-methods research is too often designed and implemented without guidance on the underlying epistemological differences that can make multimethod studies complex and problematic (13). The authors of this article—anthropologists, psychologists, and psychiatrists who conduct mixed-methods research—came together at an NIMH working conference in September 2006 to share our experiences participating in studies in which such underlying differences were not acknowledged at the outset. Our objective at the conference was to identify critical incidents in our research endeavors that illustrated clashes of epistemological assumptions. We then developed a list of lessons learned that could mitigate such difficulties in future mixed-methods investigations. In this article we share the results of that two-day discussion.

Maximizing the qualitative contribution

Focused on first-hand experiences, our dialogue revealed several challenges to incorporating qualitative research into mixed-methods designs in a way that enhances rather than weakens the overall study. A typical experience was the receipt of post hoc invitations to collaborate in complex mixed-methods studies. Several of us recounted instances of readily accepting invitations from colleagues to participate in a research project. The study sounded intriguing, the inclusion of qualitative methods was exciting and novel, and we were eager to establish collaborative relationships with others in the field of mental health services research. Upon meeting the research team, however, we often discovered that the study design had long since been approved, surveys or other quantitative data collection instruments were in the field, and sometimes highly detailed treatment interventions were waiting in the wings.

Of equal concern—and somewhat more puzzling—was discovering that assumptions about what constituted qualitative research led to limited definitions of the work to be carried out. A common occurrence, for example, was learning that a small number of focus groups had been proposed and accepted as the sole qualitative contribution to a services research project. Focus groups may be proposed so often because the format is a familiar means of qualitative data collection and because such groups can be conducted in a way that is both cost-effective and time-effective. We agree that focus groups are useful for producing surface impressions of a phenomenon or for generating a short list of sentinel issues that merit further exploration through either qualitative or quantitative means. But this discussion format is not appropriate for gaining an in-depth understanding of treatment processes, exploring sensitive topics (14), or appreciating the impact of the larger context in which service delivery is situated.

Thus the best design for answering the study question might include focus groups along with participant observation and in-depth individual interviews; alternatively, the best design might include no group discussions at all. Colleagues might appreciate and be convinced by a suggestion to use different or additional data collection approaches. However, suggested changes may be very difficult to implement late in the process, particularly if the budget for the qualitative component of the study has already been set on the basis of a different conception of the scope of the qualitative work.

The role of qualitative researchers may be constrained in other ways, for example, by limiting qualitative inquiry to the conduct of interviews aimed solely at eliciting "stakeholder perspectives" (consumers' experiences with a service delivery system or clinicians' perspectives on implementing a new treatment modality). An understanding of consumers' experiences is a valuable addition to many services research projects. Nevertheless, qualitative methods can offer more to a project than just insight into stakeholder perspectives. The methods are ideal for generating formative contextual data for intervention development purposes. Qualitative fieldwork during effectiveness trials can also identify obstacles to an intervention; a protocol that works in a controlled trial may simply not be feasible in real-world settings. In addition, qualitative research can delineate key research constructs, such as the mechanisms of action of an intervention, and can contribute to the validation and interpretation of quantitative findings. If the qualitative researcher is brought in after the study is already under way, however, such challenging and robust discoveries can rarely be accommodated under the existing design.

Minimizing the potential for conflict

The late inclusion of qualitative researchers can create some of the technical challenges described above and can lead to communication problems among team members. Indeed, our discussions suggested that many difficulties in mixed-methods research are not the result of misunderstandings or points of confusion but rather emerge from different worldviews that are deeply rooted in the philosophies of knowledge that researchers bring to their work. Left unresolved—or worse, unaddressed—these differences can significantly complicate the implementation of a study, as illustrated in the following example.

A community mental health clinic and multidisciplinary research team collaborated on a project to improve the relevance of the clinic's services to community members. The study proposal described a project that would involve community members in exploring how services could be modified. Early in the project, however, it became clear that researchers had different interpretations of "community participation." The quantitative researchers interpreted it to mean that front-office clinic staff would recruit potential participants (clinic users) who in turn would act as key informants on community issues; consumer surveys and interviews would round out the data collection. For the qualitative team members, however, the term encompassed service users and potential service users (that is, anyone in the clinic service area)—the understanding being that both of these groups would participate in discussions about improving the clinic's relevance to the community.

Would the contributions of frontline staff, consumer surveys, and consumer interviews be sufficient to represent the community perspective? Or as the qualitative researchers envisioned, did the study require input from the community at large (for example, persons who lived in the clinic's catchment area who were not receiving services there)? The different assumptions about the meaning of "community" and "community participation" were not discussed by team members at the outset of the study, resulting in significant impediments to collaboration and effective implementation of the study design.

Researchers have long recognized that qualitative and quantitative research methods are based on different philosophies of knowledge, but little attention has been paid to how these differences can create real-world dilemmas for a mixed-methods research team. For example, quantitatively trained services researchers, following the lead of their counterparts in clinical trials research, define their task as discovering "truths" about the natural and social worlds. Unbiased collection of objective data provides the researcher with "just the facts," in which statistically meaningful differences offer insights into empirical phenomena. By contrast, many qualitative researchers come from a social constructionist perspective wherein the social and natural worlds are perceived only through a culturally given lens. Within this framework, the goal of qualitative research is not to seek "truth" but to gain an understanding of how individuals' worldviews affect their behaviors.

As anthropologist Clifford Geertz (15) argued, culturally given models provide people with a way to make sense of others' behaviors ("models of") and, at the same time, offer people blueprints for acting on the world around them ("models for"). As a consequence, conflicts emerge along such fundamental fault lines as the investigator's role in data collection. The qualitative researcher who is "always in the field" may be regarded as "overengaged" with subjects, leading to suspicion that his or her results are "biased" (16). Conversely, the quantitative researcher who administers a survey to study subjects through a third party may be regarded as "detached from" the study population. Conflict among the team members is sure to result when these fundamental—and fundamentally different—research behaviors are judged on the basis of very different philosophical perspectives.

The "flow" of the research process also differs significantly in these two worlds. The randomized controlled trial, which serves as the model for many funded mental health services research projects, operates in a distinctly linear fashion: baseline measurement, then application of an intervention (versus a comparison group), which is followed by a second set of measurements to assess group-level changes. Intermediate measurements, when they are taken, are intended to serve as additional documentation of the overall linear movement of the randomized controlled trial. In contrast, many contemporary qualitative researchers ascribe to hermeneutic philosophy, in which coming to understand the Other (that is, the worldview of the individuals who are the focus of the study), involves rethinking the Self (that is, the taken-for-granted worldview of the research team). This self-reflection, in turn, provides new insights into the Other—and so on in an iterative fashion. In this recursive framework, researchers anticipate learning, understanding, assessing, and then reapproaching the object of study from a slightly new perspective ( 17 ). The following example illustrates how a "linear" study design may veer off course when some fundamental assumptions by the research team (in this case, the assumed meanings of "culture") are called into question in the course of the research process.

One research team developed a "culturally relevant" intervention in which community health workers collaborated with physicians to address the mental health needs of local community members. Initially, the intervention was considered culturally relevant precisely because the community health workers shared the same ethnicity as patients and could communicate with them in their native language. Through the ethnographic evaluation of the intervention, however, it became apparent that the community health workers, who had grown up in the United States, did not always share the social and cultural backgrounds of their largely immigrant clientele. Other cultural factors not related to ethnicity also affected the intervention. For example, the clinical milieu in which the intervention was implemented presented unanticipated challenges for the community health workers, who had to learn "on the job" how to "fit in" with fellow staff members and providers. In addition, the community health workers lacked formal training and experience in mental health services research but were expected to serve as full participants in the research team. This expectation intimidated some of the community health workers—who were reluctant to express their unease to other team members—and also contributed to low morale.

Insights gleaned from qualitative data may be shunted aside when the study design is unable to accommodate a "midcourse correction." Grounded theory—a long-standing and important approach to qualitative analysis (18)—uses a constant comparison method that allows for ongoing refinement throughout the research endeavor. In the above example, the research team has continued to build on ethnographic insights into the meaning of cultural relevance to adapt and tailor the intervention to other populations. In particular, the team has since committed itself to conducting formative ethnographic research on the target population and the clinical milieu in which the intervention is to be implemented, to help ensure that subsequent intervention planning proceeds in a culturally responsive manner. Other contemporary studies have used similar emergent insights to appropriately modify the course of the research process (19).

Contemporary researchers who undertake mixed-methods studies maintain that the worldviews that motivate qualitative and quantitative investigation are not an "either-or" proposition. In fact, a careful combination of approaches may prove far more fruitful than either methodological approach alone. We agree that bridging the two cultures of research can enlarge the knowledge base: together we can ask questions that affect what and how much we can learn about a particular phenomenon. However, sound services research designs require qualitative investigators to participate in the discussion during the study planning stages to help determine which data collection methods are best suited to a particular investigation and how the study will incorporate emergent findings from the qualitative work. Inclusion of qualitative researchers from the outset is not just good practice; it leads to more robust mixed-methods designs that maximize the contributions of all members of the research team.

Lessons learned

Although misunderstandings between qualitative and quantitative researchers may continue to arise in mixed-methods studies, many are predictable and, therefore, potentially preventable. Even when such conflicts do arise, they are usually amenable to negotiation, to the benefit of the research project as a whole. On the basis of our experiences, we offer the following recommendations for mixed-methods projects.

Early collaboration

The point at which qualitative and quantitative researchers are brought together for a project will vary. Whenever possible, however, qualitative experts should be involved in the design as well as the execution of the qualitative components of mixed-methods studies. Early involvement of qualitative experts helps ensure that the goals of the qualitative work are clearly defined and that the methods selected are appropriate for meeting the study objectives. Bringing in experts midway through a project to resolve problems after they occur makes for frustration and disappointment for all concerned and undermines the collaborative effort.

Willingness to negotiate emerging problems

Even collaborators with the best intentions who engage in careful planning and communication may run into problems once the research enters the data collection phase. Team members must accept that such challenges are inevitable and be willing to talk through these differences in a collegial manner. Fundamental philosophical differences may not be resolved, but workable solutions can be found if the challenges are viewed as philosophical rather than personal.

This is one area in which the fieldwork tradition of anthropology can provide a useful model for health services research. As Camino (20) noted, "in traditional, extended fieldwork, anthropologists continually negotiate with people to solve practical problems, obtain information, and resolve ethical dilemmas." Approaches developed to manage this cultural bridging process include "understanding concepts and values, reflected in common expression" by the local culture and respecting and including the perspectives of the various stakeholders. Collaboration across qualitative and quantitative ways of knowing can be viewed as akin to the ethnographic process of bridging cultures, and as such it should concentrate on including all perspectives and discovering common ground for understanding.

Third-party assistance to resolve problems

Conflict among collaborators is not unique to mixed-methods research and may occur in any project requiring a team approach. Frank dialogue is an important first step in addressing internal problems. However, as suggested above, philosophical differences among research team members may lead to misunderstandings over terms whose meanings are rooted in underlying epistemological differences. Some difficulties can be ameliorated if a third party is available to help identify problem areas and talk through the different meanings with collaborators. Having such a "translator" available could help ensure that all team members share an understanding of key concepts embedded in the study design. In extreme cases, it may be necessary to call on an outside mediator or ombudsman to help resolve issues that are adversely affecting the team's ability to collaborate, as in the following example.

A misunderstanding over authorship between a junior ethnographer and a senior services researcher led to substantial unrest within a large multimethod research team. Other team ethnographers interpreted this misunderstanding as a potential affront to their own autonomy as authors; a few even considered leaving the project. The lead investigators asked an independent mediator from the university's conflict resolution unit to help defuse the deepening tension. The formal mediation process proved successful, resulting in the creation of publication guidelines upon which all parties could agree. These guidelines remain in use nearly a decade later. (Indeed, the senior researcher was so impressed by this process that he participated in mediator training and became a designated mediator for the university's faculty conflict resolution unit.)

Seldom do collaborations become so strained that formal mediation is needed to resolve the conflict. Study directors might do well, however, to establish a plan for including a neutral third party early in the collaborative process, so that if such problems are encountered, a mechanism for resolution is already in place.

Conclusions

A combination of qualitative and quantitative research methods can provide a more robust understanding of services than either method used alone. However, there has been little guidance on how to blend these methods in ways that build on the strengths of their respective epistemologies. Casual attempts at integrating methods may engender real conflicts when underlying philosophical principles collide. Yet the usefulness of mixed-methods approaches should encourage research teams to work through their differences instead of simply abandoning these collaborations. We suggest that dialogue among collaborators take place in the early planning stages of a study, during which team members discuss their philosophical assumptions and the ways in which their work reflects those epistemologies. Being able to explain why one is making a particular suggestion or decision and how it serves the goals of the collaboration builds appreciation and respect among collaborators and leads to better research results.

With advance planning and communication—and a plan for mediating philosophical roadblocks—a team can spend more time conducting research and less time ameliorating internal conflicts. The result will be a robust field of services research that is timely and informative and that can make a maximum contribution to improving treatment for psychiatric disorders.

Acknowledgments and disclosures

The authors thank Ann A. Hohmann, Ph.D., M.P.H., for her mentorship, encouragement, and long-standing dedication to mental health services research. They also thank Howard Waitzkin for his helpful feedback on earlier versions of the manuscript.

The authors report no competing interests.

Dr. Robins is affiliated with Westat, 1650 Research Blvd., Rockville, MD 20850 (e-mail: [email protected]). Dr. Ware is with the Department of Psychiatry and the Department of Social Medicine, Harvard Medical School, Boston. Dr. dosReis is with the Division of Child and Adolescent Psychiatry, Johns Hopkins University School of Medicine, Baltimore. Dr. Willging is with the Behavioral Health Research Center of the Southwest, Albuquerque, New Mexico. Dr. Chung is with the Department of Psychiatry, Georgetown University Medical Center, Washington, D.C. Dr. Lewis-Fernández is with the Department of Psychiatry, Columbia University, and the New York State Psychiatric Institute, New York City.

References

1. Devers KJ, Sofaer S, Rundall TG (eds): Qualitative methods in health services research. Health Services Research 34:1153–1188, 1999

2. Barbour RS: Checklists for improving rigour in qualitative research. British Medical Journal 322:1115–1117, 2001

3. Mays N, Pope C: Rigour and qualitative research. British Medical Journal 311:109–111, 1995

4. Pope C, Mays N: Assessing quality in qualitative research. British Medical Journal 320:50–52, 2000

5. Pope C, Mays N: Reaching the parts other methods cannot reach: an introduction to qualitative methods in health and health services research. British Medical Journal 311:42–44, 1995

6. Hohmann A, Shear MK: Community-based intervention research: coping with the "noise" of real life in study design. American Journal of Psychiatry 159:201–207, 2002

7. Guarnaccia P, Rodriguez O: Concepts of culture and their role in the development of culturally competent mental health services. Hispanic Journal of Behavioral Sciences 18:419–443, 1996

8. Ware NC, Tugenberg T, Dickey B, et al: An ethnographic study of the meaning of continuity of care in mental health services. Psychiatric Services 50:395–400, 1999

9. Weine SM, Ware NC, Klebic A: Converting cultural capital among teen refugees and their families from Bosnia-Herzegovina. Psychiatric Services 55:923–927, 2004

10. Hopper K, Barrow SM: Two genealogies of supported housing and their implications for outcome assessment. Psychiatric Services 54:50–54, 2003

11. Hopper K, Jost J, Hay T, et al: Homelessness, severe mental illness, and the institutional circuit. Psychiatric Services 48:659–665, 1997

12. McMillen C, Fedoravicius N, Rowe J, et al: Crisis of credibility: concerns about the psychiatric care received by consumers of the child welfare system. Administration and Policy in Mental Health and Mental Health Services Research 34:203–212, 2007

13. Agar MH: Know when to hold 'em and when to fold 'em: qualitative thinking outside the university. Qualitative Health Research 14:100–112, 2004

14. Agar MH, MacDonald J: Focus groups and ethnography. Human Organization 54:78–86, 1995

15. Geertz C: The Interpretation of Cultures. New York, Basic Books, 1973

16. Jivanjee P, Robinson A: Studying family participation in system-of-care evaluations: using qualitative methods to examine a national mandate in local contexts. Journal of Behavioral Health Services and Research 34:369–381, 2007

17. Alvesson M, Skoldberg K: Reflexive Methodology: New Vistas for Qualitative Research. London, Sage, 2000

18. Strauss A, Corbin J: Grounded theory methodology, in Handbook of Qualitative Research. Edited by Denzin NK, Lincoln YS. Thousand Oaks, Calif, Sage, 1994

19. Walker JS, Koroloff N: Grounded theory and backward mapping: exploring the implementation context for wraparound. Journal of Behavioral Health Services and Research 34:443–458, 2007

20. Camino L: What can anthropologists offer ethnographic program evaluation? in Practicing Anthropology in a Postmodern World: Lessons and Insights From Federal Contract Research. Edited by Reed MC. Arlington, Va, National Association for the Practice of Anthropology, 1997