
Guidelines to Establish an Equitable Mobile Health Ecosystem

Published Online: https://doi.org/10.1176/appi.ps.202200011

Abstract

Mobile health (mHealth)—that is, use of mobile devices, such as mobile phones, monitoring devices, personal digital assistants, and other wireless devices, in medical care—is a promising approach to the provision of support services. mHealth may aid in facilitating monitoring of mental health conditions, offering peer support, providing psychoeducation (i.e., information about mental health conditions), and delivering evidence-based practices. However, some groups may fail to benefit from mHealth despite a high need for mental health services, including people from racially and ethnically disadvantaged groups, rural residents, individuals who are socioeconomically disadvantaged, and people with disabilities. A well-designed mHealth ecosystem that considers multiple elements of design, development, and implementation can afford disadvantaged populations the opportunity to address inequities and facilitate access to and uptake of mHealth. This article proposes inclusion of the following principles and standards in the development of an mHealth ecosystem of equity: use a human-centered design, reduce bias in machine-learning analytical techniques, promote inclusivity via mHealth design features, facilitate informed decision making in technology selection, embrace adaptive technology, promote digital literacy through mHealth by teaching patients how to use the technology, and facilitate access to mHealth to improve health outcomes.

HIGHLIGHTS

  • Mobile health (mHealth) is a promising approach to the provision of support services that may help facilitate monitoring of mental health conditions, offer peer support, provide psychoeducation, and deliver evidence-based practices.

  • Some groups may fail to benefit from mHealth despite having a high need for mental health services.

  • This article describes an mHealth ecosystem of equity, based on the following design principles: adopt a human-centered design, reduce bias in machine-learning techniques, promote inclusivity via mHealth design features, facilitate informed decision making in technology selection, embrace adaptive technologies, promote digital literacy by teaching patients how to use mHealth, and facilitate mHealth access to improve outcomes.

Mobile health (mHealth) includes the use of mobile devices, such as mobile phones, monitoring devices, personal digital assistants, and other wireless devices, in medical care and is a promising approach to the provision of support services. mHealth may enable improvements in monitoring mental health conditions, offering peer support, providing psychoeducation (i.e., information about conditions), and delivering evidence-based practices (1–4). mHealth technologies offer an opportunity to overcome barriers to services, such as challenges related to transportation to and from services, and may also address linguistic and literacy barriers (5, 6). Moreover, the COVID-19 pandemic has expanded the need to offer mHealth. Use of mHealth has the potential to provide care for service users beyond the pandemic and is rapidly transforming mental health care delivery (7).

However, some limitations of mHealth need to be overcome before it can be recognized as a credible and effective service for achieving positive mental health outcomes. Many “well-being” apps on the market do not meet academic standards for clinical interventions and lack evidence-based research to inform their content (8). This disconnect between availability and the evidence base is also apparent for apps targeting specific mental disorders, including bipolar disorder, posttraumatic stress disorder, and bulimia nervosa (9). Additional concerns about the larger issue of safety include an overreliance on apps and users’ increased anxiety when apps result in self-diagnosis (9). One systematic review of consumer-facing mHealth apps found considerable safety concerns with the quality of information (e.g., incorrect or incomplete information and inconsistencies in content) and with app functionality (such as gaps in features, lack of user input, and other limitations) (10). Many such apps are not backed by empirical research, and users may be subjected to deceptive marketing practices. For some individuals, the benefits of using apps may be largely a placebo response. Apps also have the potential to harm certain high-risk populations.

Privacy is also a particular concern, because marginalized groups can be more susceptible to the effects of privacy violations than other groups. For example, people with mental health challenges may avoid or delay treatment because of fear of sharing stigmatizing information (11) or fear of potential repercussions from sharing information, such as being treated differently or losing one’s job (11, 12). Additionally, digital technologies may increase coercion in psychiatric care (13). To improve the self-determination of service users who interact with mHealth, it is essential to increase transparency about the use of various technologies, allowing service users to opt out of services and to edit or delete their data; it is also important to train clinicians about the implications of using these technologies (13). Much more comprehensive risk assessment metrics are needed to regulate the production and recommendation of mHealth apps (14).

Furthermore, some groups may fail to benefit from mHealth despite having a high need for mental health services, including people from ethnically and racially disadvantaged groups, rural residents, those who are socioeconomically disadvantaged, and people with disabilities. For example, mental health conditions such as major depressive disorder and posttraumatic stress disorder are highly prevalent in Rwanda, especially among survivors of genocide (12). Mental health resources and facilities are scarce in many regions, but mHealth may offset this scarcity by offering services such as automated chatbots as part of a stepped care model, which automatically escalates service users to the attention of a mental health provider when a user needs more intensive services.

In the United States, one in five adults (52.9 million in 2020) has a diagnosis of a mental disorder (15). Even though technology adoption is higher in the United States than in resource-poor nations, technology ownership levels are lower among disadvantaged U.S. groups. For example, 80% of White adults in the United States reported having home broadband, compared with 71% of Black and 65% of Hispanic adults with such service (16). Although 79% of people in suburban communities and 76% of those in urban communities have access to broadband Internet, only 63% of people in rural communities do (17). Individuals living in rural communities also own disproportionately fewer smartphones and tablets (17). Similarly, 23% of individuals with disabilities in the United States do not access the Internet, nearly three times the percentage in the general population (8%) (18). Additionally, when specific groups, such as individuals from racially and ethnically disadvantaged groups in the United States, access in-person mental health care, they often receive poor-quality care, compared with groups that are not racially and ethnically disadvantaged (19). This phenomenon may also be true for disadvantaged groups accessing care via mHealth; however, limited knowledge currently exists regarding disparities in quality of mHealth care by racial and ethnic groups.

mHealth Ecosystem of Equity

A well-designed mHealth ecosystem that considers multiple elements of design, development, and implementation can afford disadvantaged populations the opportunity to address inequities and facilitate access to and uptake of mHealth. This article describes an equitable mHealth ecosystem with the purpose of guiding industry and nonindustry entities, scientists, administrators, policy makers, educators, clinicians, lay interventionists (e.g., peer support specialists), and service users in their mHealth efforts to facilitate inclusion and equity. We, the authors of this article, convened over the course of 1 month through e-mail and developed recommendations through discussion and collaboration. To ensure that all suggestions were documented and included, the authors checked in with one another so that everyone took part in forming the recommendations. This article proposes the inclusion of the following principles and standards in the development of an mHealth ecosystem of equity: adopt a human-centered design, reduce bias in machine-learning analytical techniques, promote inclusivity via mHealth design features, facilitate informed decision making in technology selection, embrace adaptive technology, promote digital literacy through mHealth by teaching service users how to use the technology, and facilitate mHealth access to improve health outcomes (Figure 1).

FIGURE 1. The mHealth ecosystem of equity

Adopt a Human-Centered Design Approach

Adopting a human-centered design approach in the creation of mHealth products involves increasing the participation of service users—especially those from disadvantaged backgrounds—in the discovery, design, and development process of mHealth apps to gain insights pertaining to their preferences and priorities. We hope that by identifying and addressing the needs of a diverse set of users, future mHealth products will have a universal design. Furthermore, the optimal process for the discovery, development, testing, iteration, and implementation of mHealth solutions to mental health challenges resides at the intersection of traditional behavioral science research and design thinking (20–22). The traditional behavioral health approach to intervention development begins with professionals recognizing a problem that needs to be addressed and then creating a solution for it. However, the lack of widespread uptake and practical effectiveness and the high disengagement found with many mHealth interventions (23) highlight the limitations of this traditional expert-driven design approach. Design thinking offers an alternative to this approach. Design thinking integrates the scientific method with end-user engagement and experience to provide an evidence-based and humanistic foundation to problem solving. This method is directly relevant to mHealth development and testing because it places patients at the center of the process (20, 22).

We therefore recommend that software developers incorporate a design-thinking approach focused on understanding the needs and experiences of service users and on building and testing solutions (e.g., prototypes) with them as partners throughout the software development and implementation process. Specifically, we recommend that product management teams incorporate patient-centered approaches for obtaining data to inform design decision making. Moreover, we recommend that clinical teams regularly liaise with service users, social workers, and certified peer support specialists to ensure continuous interaction and that clinical teams also utilize accountability measures of service user engagement that promote feedback throughout the process (24). For instance, the Quality of Patient-Centered Outcomes Research Partnerships instrument was coproduced by service users and scientists and designed to improve the quality of community engagement research by providing feedback on the extent to which stakeholders report being involved in research activities (24).

Reduce Bias in Machine-Learning Analytical Techniques

Eliminating or reducing bias in artificial intelligence (AI) is of increasing importance. Despite the seemingly objective nature of AI, bias and subjectivity in this technology can influence findings in many subtle ways that affect equity. Researchers and clinicians must be mindful of the nature and limitations of the data that are used to train algorithms and from which conclusions about service users’ health and well-being are drawn (24). One example is a commercial algorithm used in the U.S. health care system that uses health care costs as a proxy for health needs (25). Because less money is spent on Black service users who have the same level of needs as White service users, the algorithm incorrectly determines that Black service users are healthier than White service users when both are equally sick (25). As a result, the needs of Black service users may be underestimated.
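
As a minimal, synthetic illustration of this failure mode (not the commercial algorithm itself), the sketch below trains a model to predict future cost from prior cost and then audits who gets flagged for extra services; the group labels, spending gaps, and data are invented.

```python
# Synthetic sketch: why training on cost as a proxy for health need can
# understate the needs of a group whose care is historically under-funded.
# All groups, coefficients, and data below are invented for illustration.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 20_000
group = rng.integers(0, 2, n)                    # 0 = group A, 1 = group B (hypothetical)
need = rng.gamma(shape=4.0, scale=10.0, size=n)  # true, unobserved health need

# Assume spending on group B is ~30% lower at the same level of need,
# both historically (the feature) and in the outcome year (the label).
access = np.where(group == 1, 0.7, 1.0)
prior_cost = need * access + rng.normal(0, 5, n)   # observable feature
future_cost = need * access + rng.normal(0, 5, n)  # label used as a "need" proxy

model = LinearRegression().fit(prior_cost.reshape(-1, 1), future_cost)
score = model.predict(prior_cost.reshape(-1, 1))   # algorithm's "risk" score

# Audit: among people flagged for extra services (top 10% of scores),
# group B members are substantially sicker, i.e., B must be sicker than A
# to receive the same score, so B's needs are underestimated overall.
flagged = score >= np.quantile(score, 0.90)
for g, name in [(0, "group A"), (1, "group B")]:
    sel = flagged & (group == g)
    print(f"{name}: mean true need among flagged = {need[sel].mean():.1f}")
```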

Participant recruitment, sampling framework, data collection procedures, and a host of other methodological decisions made by researchers can also undermine the ability of AI analyses to produce conclusions that are valid for disadvantaged populations. For example, passive data collection via smartphones has become an increasingly rich resource for researchers (26); however, these devices are not generally developed with disadvantaged populations in mind. As a result, data collection may be biased from the outset by selecting only service users who have the physiological, cognitive, and functional capacity to participate (27). Scrutinizing sampling frameworks and recruitment strategies, and working alongside community partners from underrepresented, disadvantaged, or vulnerable groups, may offset some of these biases.
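
One lightweight way to begin that scrutiny is to compare the demographic mix of an enrolled sample against population targets. The sketch below does this for a single, hypothetical geography variable; the target shares, enrollment counts, and under-representation threshold are illustrative assumptions.

```python
# Hedged sketch: a simple representativeness check comparing the composition
# of a passively collected sample against assumed population targets.
from collections import Counter

population_share = {"rural": 0.19, "urban": 0.31, "suburban": 0.50}  # assumed targets
sample = ["urban"] * 620 + ["suburban"] * 740 + ["rural"] * 90       # hypothetical enrollees

counts = Counter(sample)
total = sum(counts.values())
for stratum, target in population_share.items():
    observed = counts.get(stratum, 0) / total
    # Flag strata enrolled at less than 80% of their population share
    # (the 80% cutoff is an arbitrary, illustrative choice).
    status = "UNDER-REPRESENTED" if observed < 0.8 * target else "ok"
    print(f"{stratum:9s} target {target:.0%}  observed {observed:.0%}  {status}")
```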

To minimize bias in AI, we recommend that scientists be deliberate in data collection, engineers be mindful of the ways in which the data are analyzed and interpreted, and marketing teams be careful about the specific services they are promoting and for what demographic groups. When dealing with “medical interventions” and “medical devices” in which AI underpins the mechanism of action, researchers must be particularly diligent. Perhaps because of AI’s intangible nature, it can be easier for bias to creep into its applications. If a physical device were designed with a clear bias against specific groups, it is unlikely that health care regulators would grant approval of the device. Therefore, the same level of scrutiny must be applied when AI is used in the context of health care. Research protocols for producing new apps should include plans to develop algorithms from more diverse data sets.

Promote Inclusivity via mHealth Design Features

Service users with mental disorders may have comorbid neurocognitive deficits that may vary by race, ethnicity, or gender. Such deficits are also present among older adults and individuals with cognitive impairments. Design features have been tested in a series of studies to inform guidelines for developing mHealth tools and resources for people living with mental health conditions or cognitive impairments (Table 1) (28–32).

TABLE 1. Summary of mHealth design guidelines

Vision: Features should include the use of large navigational buttons, a shallow navigational hierarchy, pop-up menus that appear when hovering above with cursor (28), variable font size and type, variable light and contrast settings, and adaptations to the type of interface based on user preferences (29, 30).
Cognitive: Text content should be written at a reading level and in the language of its intended users and should use explicit or concrete wording for headings (28). Further, text should avoid jargon and diagnostic labels (31), use simple sentence structure and common terms, focus on a single topic at a time, set minimal time frames for in-app tasks, and allow the user to set a self-adopted pace (29, 30).
Auditory: Features should include in-app volume control, closed captioning, voice command, and video-to-video for people who may not be able to read or write (32).
Memory: Build in repetition to facilitate retention; enable a self-adopted pace and the capacity to review; ensure variability in the presentation of information, summaries, prompts, and reminders; and include suggestions to engage in nontechnical activities to encourage multiple forms of learning and retention (3).
Mobility: Allow for short interactions with technologies, such as 2-minute time frames, and provide reasonable accommodations when using technologies. For example, do not require a camera to be on during videoconferencing in cases where a service user may be bedbound.


We recommend that software developers and system engineers incorporate design features that improve app usability across diverse diagnostic groups. Moreover, to facilitate design equity and to make more informed and inclusive recommendations to industry partners, we suggest that social scientists research and consider the effects of culture, language, race-ethnicity, and gender when service users interact with digital technologies.
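
As one way to make these guidelines concrete, the sketch below represents the Table 1 domains as per-user preferences that an app could read at runtime; the field names, defaults, and mapping to UI settings are our own illustration, not a published specification.

```python
# Hypothetical sketch: Table 1's design domains expressed as per-user
# accessibility preferences an mHealth app could honor at runtime.
from dataclasses import dataclass


@dataclass
class AccessibilityPreferences:
    font_scale: float = 1.0            # vision: variable font size
    high_contrast: bool = False        # vision: light/contrast settings
    plain_language: bool = True        # cognitive: no jargon or diagnostic labels
    self_paced: bool = True            # cognitive/memory: user-set pace
    closed_captions: bool = True       # auditory: captions on video content
    voice_commands: bool = False       # auditory: hands-free interaction
    reminder_frequency: str = "daily"  # memory: prompts and summaries
    max_session_minutes: int = 2       # mobility: allow short interactions
    camera_required: bool = False      # mobility: reasonable accommodations


def apply_preferences(prefs: AccessibilityPreferences) -> dict:
    """Translate preferences into concrete UI settings (illustrative only)."""
    return {
        "font_px": round(16 * prefs.font_scale),
        "theme": "high-contrast" if prefs.high_contrast else "default",
        "captions": prefs.closed_captions,
        "session_limit_min": prefs.max_session_minutes,
        "camera_required": prefs.camera_required,
    }


print(apply_preferences(AccessibilityPreferences(font_scale=1.5, high_contrast=True)))
```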

Additionally, not all users will need the same level of repetition and reminders. Potentially, the integration of precision medicine and multiphase optimization strategy (MOST) study designs can help determine the level or “dose” of mHealth features needed to optimize outcomes. MOST is a framework for building and optimizing multicomponent behavioral interventions similar to those in many mHealth apps (33). MOST involves establishing a theoretical model, identifying a set of intervention components to be examined, and experimenting to examine the impact of individual intervention components (33).
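
For instance, a team following MOST might cross a handful of candidate components in a factorial experiment. The sketch below enumerates a full 2^4 design and randomizes hypothetical participants across the resulting conditions; the component names are invented for illustration.

```python
# Illustrative sketch in the spirit of MOST: each candidate mHealth component
# is toggled on/off, and participants are randomized across the conditions.
import itertools
import random

components = ["reminders", "peer_messaging", "psychoeducation", "mood_tracking"]
conditions = [dict(zip(components, levels))
              for levels in itertools.product([0, 1], repeat=len(components))]
print(f"{len(conditions)} experimental conditions")  # 2**4 = 16

random.seed(0)
participants = [f"p{i:03d}" for i in range(32)]
assignment = {p: random.choice(conditions) for p in participants}
print("p000 ->", assignment["p000"])
```

In practice, MOST studies often use fractional factorial designs to keep the number of conditions manageable; reference 28 describes a 2^12-4 example.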

Facilitate Informed Decision Making in Technology Selection

In a qualitative study with 40 service users with serious mental illness (34), users reported that they did not feel informed regarding which mHealth technologies they could use for treatment at community mental health centers. When service users are not informed, mHealth’s intended benefits may not be achieved.

Decision support in selecting technologies may strengthen informed decision making and emphasize choice, engagement, and decision making by service users, clinicians, and certified peer support specialists (35). Current decision support interventions in mental health focus on treatment and medication choice (36), psychiatric rehabilitation decisions, or care transition determinations. Decision support within mental health has been found to promote engagement in services and treatment adherence (37). To date, few decision support frameworks regarding the choice of mHealth tools exist. For example, the American Psychiatric Association initiated a framework for selecting smartphone apps for use in clinical settings (38) that includes suggestions for informing decision making, such as asking a professional, reviewing research supporting the app, or reaching out to the app developer. Although these guidelines may be feasible for some patients, interpreting them can still require extensive functional or cognitive resources, which may make them difficult for individuals with mental health challenges to use. Nevertheless, efforts to increase accessibility to reviews of apps are already underway. For example, Wykes and Schueller (39) call for app stores to take responsibility for what they call the “transparency for trust principles,” including providing information on privacy and data security, on how the technology was developed, on the feasibility of the tool, and on benefits to individuals. All of these details can be provided in a plain language summary to facilitate understandability.

Another example to facilitate understandability is a framework called “Decision-Support Tool for Peer Support Specialists and Service Users” that was initiated by patients (or peers) who worked together to facilitate shared decision making in selecting technologies to support mental health (40). Patient-identified decision domains include privacy and security, cost, usability, accessibility, inclusion and equity, recovery principles, personalized mHealth for patient needs, and ease of device setup. All questions are in a simple form so that all consumers can understand them. Moreover, online services, such as PsyberGuide (https://onemindpsyberguide.org) in the United States and ORCHA (https://us.orchahealth.com) in the United Kingdom, provide detailed and accessible reviews of online health apps that users can utilize depending on their individual needs. Backed by scientific advisory boards and structured by various parameters, services such as these are personalized tools that can help users navigate the untested space of mHealth with greater confidence and clarity.
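
As one way to operationalize these domains, the sketch below scores two hypothetical apps against the patient-identified decision domains and applies user-chosen weights; the apps, scores, and weights are illustrative only and do not reproduce the published tool.

```python
# Hedged sketch: comparing candidate apps across the patient-identified
# decision domains named above. Apps, scores (1-5), and weights are invented.
domains = ["privacy_security", "cost", "usability", "accessibility",
           "inclusion_equity", "recovery_principles", "personalization", "setup_ease"]

candidate_apps = {
    "App A": dict(zip(domains, [5, 3, 4, 4, 3, 4, 3, 5])),
    "App B": dict(zip(domains, [3, 5, 5, 3, 4, 3, 4, 4])),
}

# A service user and peer support specialist might weight domains together;
# here privacy and accessibility matter most to this hypothetical user.
weights = {d: 1.0 for d in domains}
weights["privacy_security"] = 2.0
weights["accessibility"] = 1.5

for name, scores in candidate_apps.items():
    total = sum(weights[d] * scores[d] for d in domains)
    print(f"{name}: weighted score {total:.1f}")
```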

We recommend that software developers create features that enable service users to have access to and control of the features they want to use, which can influence satisfaction and willingness to remain engaged; such control could include, for example, giving users a way to opt in or out of creation of a medications list (i.e., a feature that would require a service user to enter a medication regimen). Other examples include service users’ ability to control whether they receive notifications or alarms for treatment (41), as well as other privacy and security features (42, 43). We also advise that more clinicians and social workers collaborate with developers in creating evidence-based decision support strategies within the selection process to better support service users’ mental health. Because service users commonly disengage from therapeutic digital technologies after 2 weeks or before intervention effects take place (44), the incorporation of decision support strategies may counteract premature disengagement and guide individuals in selecting technologies according to available research, their preferences and unique needs, and socioenvironmental characteristics. Moreover, we recommend that app stores integrate the aforementioned reviews and frameworks into their ratings and overviews for mHealth apps. Information such as privacy policies, user experience, and credibility scores can provide accessible guides for potential users.

Finally, we suggest that legal teams for mHealth products provide plain language summaries of privacy statements. Recently, the Cochrane Group has created guidelines to develop “plain language summaries” for all published scientific reviews to make research findings accessible to nonscientists (45). Such plain language summaries can help guide the development of a more understandable framework (46).

Embrace Adaptive Technology

Adaptive technologies can be products or modifications to existing services that provide enhancements or different ways to interact with a certain technology. For instance, developing mHealth applications that are accessible on multiple platforms and that can work without the Internet or that use limited data might support engagement among disadvantaged service users regardless of their location. One example is WhatsApp, a mobile instant-messaging system that offers smartphone-based communication for free across and within countries. WhatsApp also adjusts to inconsistent Internet service by sending messages as soon as the signal returns. This approach is applicable to many mental health apps that require content updating. Moreover, adaptive technologies can help strengthen the engagement of service users with self-management practices and reduce the likelihood of secondary complications. iMHere is an adaptive mHealth system that helps address the dynamic changes in self-management needs that can arise for individuals with chronic conditions and disabilities (47). Its architecture includes cross-platform client and caregiver apps, a Web-based clinician portal, and a secure two-way communication protocol. The system can suggest treatment regimens tailored to individuals’ conditions (with or without support from caregivers) and allows clinicians to flexibly modify these modules in response to their service users’ changing performance or needs over time (47).
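
The store-and-forward behavior described above can be approximated with a small outbox abstraction, sketched below; the transport callback is a placeholder rather than a real messaging API.

```python
# Minimal store-and-forward sketch: messages composed while offline are queued
# locally and flushed once a connection returns. The send function is a
# stand-in for whatever transport a real app would use.
from collections import deque


class OfflineOutbox:
    def __init__(self, send_fn):
        self._send = send_fn   # injected transport (hypothetical)
        self._queue = deque()

    def submit(self, message: str, online: bool) -> None:
        """Queue the message; deliver everything immediately if online."""
        self._queue.append(message)
        if online:
            self.flush()

    def flush(self) -> None:
        """Deliver all queued messages, oldest first (call when signal returns)."""
        while self._queue:
            self._send(self._queue.popleft())


outbox = OfflineOutbox(send_fn=lambda m: print("sent:", m))
outbox.submit("daily mood check-in", online=False)      # no signal: held locally
outbox.submit("medication reminder ack", online=True)   # signal back: both delivered
```

A production app would likely also persist the queue to local storage so that pending messages survive app restarts.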

We recommend that clinicians, social workers, and certified peer support specialists embrace such adaptive solutions in their work within communities. In the development of mHealth technologies, companies should similarly consider the integration of adaptive features to offset challenges, such as limited Internet connection, and to increase flexibility in interacting with the product to ensure long-term engagement. Nevertheless, we note that WhatsApp’s parent company, Facebook, has faced major backlash for allegedly collecting and using private user data. WhatsApp also could hinder service users’ autonomy, because users may be unable to refuse a service recommended by their health care provider despite a user’s hesitations about the service (48). Thus, although we reference these services and their role in adaptive technologies, it is important to review privacy statements to determine the privacy standards of each technology before using it or before recommending it to service users.

Promote Digital Literacy by Teaching Service Users How to Use the Technology

It is necessary to develop mHealth features that facilitate service users’ learning how to use mHealth platforms. Adults from higher socioeconomic strata are much more likely to have greater familiarity with mHealth than are low-income, Black or Hispanic, disabled, or homebound adults (49). In the promotion of digital literacy, it is necessary to understand how adults learn and especially how adults’ learning needs differ from those of other age groups (such as children and adolescents and older adults), taking into account previous experiences and factors associated with mental disorders that can impede learning, such as neurocognitive deficits (50).

We recommend that clinicians and social scientists incorporate andragogical learning theory—a theory of how adults learn through the comprehension, organization, and synthesis of knowledge (51)—when teaching service users or other participants how to use new technologies. We also encourage various peer-to-peer networks and groups to facilitate digital literacy training.

Facilitate mHealth Access to Improve Health Outcomes

Consistent with the concept of social determinants of health (52), which suggests that conditions in an individual’s life (e.g., housing, socioeconomic status, and education) affect the person’s overall health, having access to mHealth may also affect the overall health of service users. Thus, mHealth utilization is potentially mediated by social forces, institutions, ideologies, and processes that interact to generate and reinforce inequity among groups (1). Institutional infrastructure and processes espousing mHealth as a public health facilitator or a health care access facilitator may merely produce “digital redlining”—perpetuating unequal access for already marginalized populations. mHealth may also be used to perpetuate a “separate but equal” health care interface, allowing or justifying structural barriers to health care (53).

Nevertheless, throughout the world, governments are increasingly offering income-based access to free smartphones and data plan services, which can help reduce the digital divide. Thus, smartphone ownership among disadvantaged groups is steadily increasing, including among people with serious mental illness (54, 55), people with disabilities (56), and rural residents (17).

Access to mHealth is vital to care, and an equitable offering of services through mHealth is equally important. Specifically, an equitable and inclusive mHealth ecosystem must always include protocols to facilitate timely crisis responses. Although some mHealth technologies may discourage the use of mHealth for crises, it is possible that service users in crisis may still seek out care on publicly available platforms. Therefore, development of crisis response protocols that align with state, county, and legal regulations can support service users in crisis. For example, possible solutions include an on-call provider to support service users in crisis in real time, integration of natural language processing to predict suicidal ideation through text message interactions, or a feature that allows for immediate connection with a local authority by dialing 988 (in the United States) or connecting service users to a warmline to work through a crisis.
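
A deliberately simplistic sketch of such a protocol hook appears below: it screens free-text messages for a short list of phrases and, when flagged, surfaces 988 and warmline resources and routes to an assumed on-call workflow. This is an illustration only, not a clinical tool; a deployed system would rely on validated models, clinical oversight, and applicable regulations.

```python
# Highly simplified illustration (not a clinical tool): a message screen that
# surfaces crisis resources and routes to an assumed on-call protocol when
# concerning language appears. The phrase list and escalation label are
# placeholders, not validated instruments.
CRISIS_PHRASES = {"want to die", "kill myself", "end my life", "hurt myself"}


def screen_message(text: str) -> dict:
    lowered = text.lower()
    flagged = any(phrase in lowered for phrase in CRISIS_PHRASES)
    return {
        "flagged": flagged,
        "action": "notify_on_call_provider" if flagged else "none",
        "resources_shown": ["Call or text 988 (US)", "Local warmline"] if flagged else [],
    }


print(screen_message("I feel like I want to die tonight"))
print(screen_message("I slept badly but feel okay today"))
```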

Inclusive and appropriate mHealth utilization requires not merely contemplating individuals’ comfort with and access to mHealth but also scrutinizing the systems and processes introducing the mHealth tool. We advise that policy makers work in conjunction with social scientists to study data plan use at the population level and to determine the minimum amount of data required to use mHealth products effectively. Moreover, government programs can be expanded to allow more individuals to qualify for services (i.e., allowing more than one subscriber per household to accommodate individuals in shared housing and group homes). Software developers of mHealth products can use knowledge of government policy during the development process to ensure compatibility with free government programs and also include in-app protocols to facilitate timely crisis responses.

Limitations and Future Scope

We represent diverse stakeholder groups, including social workers, mental health service users, social scientists, certified peer support specialists, clinicians, software developers, industry partners, and systems engineers. However, as a group we are not fully representative of the backgrounds, diagnoses, socioeconomic status, professions, geography, nationality, age, and gender of the various communities of service users. This is an important limitation, and these recommendations should therefore be interpreted with caution.

Limitations regarding implementation of these recommendations include poverty and access, which create challenges in developing an equitable mHealth ecosystem. Much of the data on access to smartphones and the Internet come from online surveys. Therefore, it is highly likely that access levels are currently overestimated. Additionally, not all smartphone owners have data plans that give them access to apps, and some may have plans that do not provide sufficient data.

Also, it is not practical for all commercial apps to include all the resources necessary to support people with cognitive, hearing, and vision impairments. Adding all the necessary requirements may increase the cost of app development.

Nevertheless, an analysis of the long-term return on investment may show cost effectiveness. Examining the economic impact of dollars spent on mHealth development and mHealth’s impact on health services outcomes, such as hospitalizations and medication adherence, is important for determining the potential return on investment in incorporating the features delineated above. It is also necessary that future research move beyond studying the feasibility and acceptability of mHealth, because it has become quite apparent that people with mental health challenges can use and are interested in using these services. Instead, we urge a greater focus on investigations of the clinical effectiveness of these technologies in addressing health concerns through more rigorous randomized controlled trials and meta-analyses (9).

Department of Psychiatry (Fortuna) and Center for Technology and Behavioral Health (Barr), Geisel School of Medicine, Dartmouth College, Hanover, New Hampshire; BRITE Center, University of Washington, Seattle (Kadakia); Gerontology Research Centre, Simon Fraser University, Vancouver, and Oxford Institute of Population Ageing, University of Oxford, Oxford (Cosco); Center for Health Equity Research and Promotion, Mental Illness Research, Education and Clinical Center, Department of Veterans Affairs Pittsburgh Health Care System, and Center for Behavioral Health, Media, and Technology, University of Pittsburgh, Pittsburgh (Rotondi); Heller School for Social Policy and Management, Brandeis University, Waltham, Massachusetts (Nicholson, Myers); School of Social Work, University of Illinois, Urbana (Mois); College of Applied Health Sciences Human Factors and Aging Laboratory, University of Illinois, Champaign (Mois); College of Social Work, University of Kentucky, Lexington (Hamilton); Department of Cardiovascular Medicine, Mayo Clinic College of Medicine, and Center for Health Equity and Community Engagement Research, Mayo Clinic, Rochester, Minnesota (Brewer); Psychology Department, University of Colorado, Colorado Springs (Collins-Pisano); Department of Medicine, University of South Carolina School of Medicine, and Prisma Health, Greenville (Hudson); Centre for Mental Health, University of Rwanda, Kigali (Joseph); Psychiatric Rehabilitation Division, Vinfen, Cambridge, Massachusetts (Mullaly); Clarity Health, Nashua, New Hampshire (Booth); College of Nursing and Health Sciences, University of Vermont, Burlington (Lebby); Office of Recovery and Empowerment, Massachusetts Department of Mental Health, Boston (Walker).
Send correspondence to Dr. Fortuna ().

Dr. Fortuna partners with Emissary Health, Inc. and offers consulting services through Social Wellness. The other authors report no financial relationships with commercial interests.

References

1. Fortuna KL, DiMilia PR, Lohman MC, et al.: Feasibility, acceptability, and preliminary effectiveness of a peer-delivered and technology supported self-management intervention for older adults with serious mental illness. Psychiatr Q 2018; 89:293–305

2. Huckvale K, Nicholas J, Torous J, et al.: Smartphone apps for the treatment of mental health conditions: status and considerations. Curr Opin Psychol 2020; 36:65–70

3. Anderson K, Burford O, Emmerton L: Mobile health apps to facilitate self-care: a qualitative study of user experiences. PLoS One 2016; 11:e0156164

4. Torok M, Han J, Baker S, et al.: Suicide prevention using self-guided digital interventions: a systematic review and meta-analysis of randomised controlled trials. Lancet Digit Health 2020; 2:e25–e36

5. Gammon D, Strand M, Eng LS, et al.: Shifting practices toward recovery-oriented care through an E-recovery portal in community mental health care: a mixed-methods exploratory study. J Med Internet Res 2017; 19:e145

6. Slade M, Amering M, Farkas M, et al.: Uses and abuses of recovery: implementing recovery-oriented practices in mental health systems. World Psychiatry 2014; 13:12–20

7. Druss BG: Addressing the COVID-19 pandemic in populations with serious mental illness. JAMA Psychiatry 2020; 77:891–892

8. Livingston NA, Shingleton R, Heilman ME, et al.: Self-help smartphone applications for alcohol use, PTSD, anxiety, and depression: addressing the new research-practice gap. J Technol Behav Sci 2019; 4:139–151

9. Leigh S, Flatt S: App-based psychological interventions: friend or foe? Evid Based Ment Health 2015; 18:97–99

10. Akbar S, Coiera E, Magrabi F: Safety concerns with consumer-facing mobile health applications and their consequences: a scoping review. J Am Med Inform Assoc 2020; 27:330–340

11. Pescosolido BA: The public stigma of mental illness: what do we think; what do we know; what can we prove? J Health Soc Behav 2013; 54:1–21

12. Joseph K, Nicole I, Leon M, et al.: Impact of COVID-19 on mental health in Rwanda. Rwanda Public Health Bull 2020; 2:7–12

13. Morris NP: Digital technologies and coercion in psychiatry. Psychiatr Serv 2021; 72:302–310

14. Terry NP, Gunter TD: Regulating mobile mental health apps. Behav Sci Law 2018; 36:136–144

15. Key Substance Use and Mental Health Indicators in the United States: Results From the 2020 National Survey on Drug Use and Health. Rockville, MD, Substance Abuse and Mental Health Services Administration, 2021

16. Atske S, Perrin A: Home Broadband Adoption, Computer Ownership Vary by Race, Ethnicity in the US. Washington, DC, Pew Research Center, 2021. http://pewresearch.org/fact-tank/2021/07/16/home-broadband-adoption-computer-ownership-vary-by-race-ethnicity-in-the-u-s/. Accessed Aug 27, 2021

17. Vogels EA: Some Digital Divides Persist Between Rural, Urban and Suburban America. Washington, DC, Pew Research Center, 2021

18. Anderson M, Perrin A: Americans With Disabilities Less Likely Than Those Without to Own Some Digital Devices. Washington, DC, Pew Research Center, 2021. https://www.pewresearch.org/fact-tank/2021/09/10/americans-with-disabilities-less-likely-than-those-without-to-own-some-digital-devices. Accessed Sept 27, 2021

19. Satcher DS: Executive summary: a report of the Surgeon General on mental health. Public Health Rep 2000; 115:89–101

20. Adams C, Nash JB: Exploring design thinking practices in evaluation. J Multidiscip Eval 2016; 12:12–17

21. Dorst K: The core of “design thinking” and its application. Des Stud 2011; 32:521–532

22. Tantia P: The new science of designing for humans. Stanford Soc Innov Rev 2017; 15:29–33

23. Eysenbach G: The law of attrition. J Med Internet Res 2005; 7:e11

24. Fortuna KL, Myers A, Brooks J, et al.: Co-production of the Quality of Patient-Centered Outcomes Research Partnerships instrument for people with mental health conditions. Patient Exp J 2021; 8:148–156

25. Brewer LC, Fortuna KL, Jones C, et al.: Back to the future: achieving health equity through health informatics and digital health. JMIR mHealth uHealth 2020; 8:e14512

26. Obermeyer Z, Powers B, Vogeli C, et al.: Dissecting racial bias in an algorithm used to manage the health of populations. Science 2019; 366:447–453

27. Keusch F, Struminskaya B, Antoun C, et al.: Willingness to participate in passive mobile data collection. Public Opin Q 2019; 83(suppl 1):210–235

28. Rotondi AJ, Grady J, Hanusa BH, et al.: Key variables for effective eHealth designs for individuals with and without mental health disorders: 2^12-4 fractional factorial experiment. J Med Internet Res 2021; 23:e23137

29. Rotondi AJ, Sinkule J, Haas GL, et al.: Designing websites for persons with cognitive deficits: design and usability of a psychoeducational intervention for persons with severe mental illness. Psychol Serv 2007; 4:202–224

30. Rotondi AJ, Eack SM, Hanusa BH, et al.: Critical design elements of e-health applications for users with severe mental illness: singular focus, simple architecture, prominent contents, explicit navigation, and inclusive hyperlinks. Schizophr Bull 2015; 41:440–448

31. Bakker D, Kazantzis N, Rickwood D, et al.: Mental health smartphone apps: review and evidence-based recommendations for future developments. JMIR Ment Health 2016; 3:e7

32. Fortuna KL, Lohman MC, Gill LE, et al.: Adapting a psychosocial intervention for smartphone delivery to middle-aged and older adults with serious mental illness. Am J Geriatr Psychiatry 2017; 25:819–828

33. Collins LM, Baker TB, Mermelstein RJ, et al.: The multiphase optimization strategy for engineering effective tobacco use interventions. Ann Behav Med 2011; 41:208–226

34. Venegas MD, Brooks JM, Myers AL, et al.: Peer support specialists and service users’ perspectives on privacy, confidentiality, and security of digital mental health. IEEE Pervasive Comput 2022; 25:41–50

35. Hamann J, Leucht S, Kissling W: Shared decision making in psychiatry. Acta Psychiatr Scand 2003; 107:403–409

36. Zisman-Ilani Y, Shern D, Deegan P, et al.: Continue, adjust, or stop antipsychotic medication: developing and user testing an encounter decision aid for people with first-episode and long-term psychosis. BMC Psychiatry 2018; 18:142

37. Tlach L, Wüsten C, Daubmann A, et al.: Information and decision-making needs among people with mental disorders: a systematic review of the literature. Health Expect 2015; 18:1856–1872

38. Torous JB, Chan SR, Gipson SY-MT, et al.: A hierarchical framework for evaluation and informed decision making regarding smartphone apps for clinical care. Psychiatr Serv 2018; 69:498–500

39. Wykes T, Schueller S: Why reviewing apps is not enough: transparency for trust (T4T) principles of responsible health app marketplaces. J Med Internet Res 2019; 21:e12390

40. Mbao M, Zisman Ilani Y, Gold A, et al.: Co-production development of a decision support tool for peers and service users to choose technologies to support recovery. Patient Exp J 2021; 8:45–63

41. Hilliard ME, Hahn A, Ridge AK, et al.: User preferences and design recommendations for an mHealth app to promote cystic fibrosis self-management. JMIR mHealth uHealth 2014; 2:e44

42. Kotz D, Gunter CA, Kumar S, et al.: Privacy and security in mobile health: a research agenda. Computer 2016; 49:22–30

43. Atienza AA, Zarcadoolas C, Vaughon W, et al.: Consumer attitudes and perceptions on mHealth privacy and security: findings from a mixed-methods study. J Health Commun 2015; 20:673–679

44. Adelufosi AO, Ogunwale A, Adeponle AB, et al.: Pattern of attendance and predictors of default among Nigerian outpatients with schizophrenia. Afr J Psychiatry 2013; 16:283–287

45. Standards for Reporting Plain Language Summaries (PLS) for Cochrane Diagnostic Test Accuracy Reviews. London, Cochrane Collaboration, 2014

46. Wada M, Sixsmith J, Harwood G, et al.: A protocol for co-creating research project lay summaries with stakeholders: guideline development for Canada’s AGE-WELL network. Res Involv Engagem 2020; 6:22

47. Setiawan IMA, Zhou L, Alfikri Z, et al.: An adaptive mobile health system to support self-management for persons with chronic conditions and disabilities: usability and feasibility studies. JMIR Form Res 2019; 3:e12982

48. Silva DS, Snyder J: The ethics of new technologies to track drug adherence. CMAJ 2018; 190:E1209–E1210

49. Choi NG, Dinitto DM: The digital divide among low-income homebound older adults: Internet use patterns, eHealth literacy, and attitudes toward computer/Internet use. J Med Internet Res 2013; 15:e93

50. Cosco TD, Firth J, Vahia I, et al.: Mobilizing mHealth data collection in older adults: challenges and opportunities. JMIR Aging 2019; 2:e10019

51. Knowles MS: Informal Adult Education, Self-Direction, and Andragogy. New York, Association Press, 1950

52. Marmot M, Allen J, Bell R, et al.: WHO European review of social determinants of health and the health divide. Lancet 2012; 380:1011–1029

53. Piat M, Sabetti J, Bloom D: The transformation of mental health services to a recovery-orientated system of care: Canadian decision maker perspectives. Int J Soc Psychiatry 2010; 56:168–177

54. Ben-Zeev D, Davis KE, Kaiser S, et al.: Mobile technologies among people with serious mental illness: opportunities for future services. Adm Policy Ment Health 2013; 40:340–343

55. Brunette MF, Achtyes E, Pratt S, et al.: Use of smartphones, computers and social media among people with SMI: opportunity for intervention. Community Ment Health J 2019; 55:973–978

56. Morris JT, Sweatman MW, Jones ML: Smartphone use and activities by people with disabilities: 2015–2016 survey. J Technol Pers Disabil 2017; 5:50–66