Mental Health Apps Operating Outside Ethical Boundaries

Shyanne Dougherty questions the ethics of unregulated mental health apps. __________________________________________

A major part of the 21st century’s technological boom has been the development of smartphones, which provide application (app) platforms for almost anything imaginable, including apps aimed at improving physical health, fitness, and mental health. During the COVID-19 pandemic, healthcare professionals were encouraged to get creative in delivering care through remote and virtual means. As of 2023, more than 350,000 health apps were available globally. The emergence of these apps helps alleviate barriers to accessing healthcare and closes the knowledge gap between the healthcare field and the general public, thus promoting social interest and the principle of justice. However, the development of these apps, particularly those aimed at mental health, raises ethical concerns about providing competent care, respecting self-determination, and protecting patient confidentiality.

Healthcare is a regulated field, meaning that practitioners are held to educational requirements, standards for professionalism and scope of practice, ethical guidelines, and continuing education requirements. In Canada, each province or territory regulates the mental health professions. Across North America, becoming a mental health practitioner usually involves graduate-level education in the field, licensure with a governing body, demonstration of specific skills, and supervised practice under an already qualified mental health professional. None of this is needed to create a mental health app; in fact, anyone with programming skills can create one! While not every app submitted will make it onto an app store, the teams deciding which apps to accept focus on consumer-related criteria such as design, innovation, and user experience, not the qualifications of the developer or the validity of the health-related information behind or within the app.

Photo Credit: Marco Verch Professional Photographer/flickr. Image Description: Hands holding a smartphone against a street background.

Furthermore, most mental health apps are not based on empirical research into the safety and effectiveness of the interventions and techniques they employ. Essentially, these are apps created by anyone, dispensing mental health advice that may never have been researched or proven effective. Although the app store does provide information on each app, that information consists of a description and user reviews; it offers no transparency about the qualifications of the app’s creator.

Another aspect of providing competent care is offering full transparency, acknowledging the multi-faceted nature of mental health, and having emergency preparedness plans in place. Some mental health apps mislead clients into believing that the app alone can effectively treat mental health disorders, which can deter individuals from seeking the proven, effective treatment they need. Many apps carry no disclaimer with instructions on what to do in a mental health emergency, and those that do often present the information in dense, formal language. Given that the purpose of these apps is to explore mental health, some of their content may be triggering to users and could precipitate a mental health emergency, yet the apps themselves provide no guidance on how to handle one.

The health field also strongly values confidentiality, and every jurisdiction has healthcare confidentiality laws; Newfoundland and Labrador, for example, has the Personal Health Information Act. Mental health apps raise various concerns about their ability to guarantee confidentiality. If an app does not encrypt user data or build in proper confidentiality safeguards, that information could be accessed by outside individuals, other businesses, and third parties looking to sell it. Even when apps disclose their inability to guarantee confidentiality, the disclosures tend to be perfunctory, written in complicated language, and not user friendly, undermining users’ ability to provide appropriate informed consent.

As healthcare services move onto technological platforms, we need to consider how to carry our carefully formed ethical principles onto these platforms as well, and to ensure proper regulation of the services offered through technology. Relevant principles include professional competency, protection of confidentiality, and respect for patient autonomy. The first step is to determine who should regulate these apps, whether they fall under different authorities, and whether regulation should differ by app type or function. The primary concern of regulation should be the approval of apps for public use. Currently, mental health apps are largely unregulated, operating outside the carefully deliberated ethical foundation that grounds healthcare.

 __________________________________________

Shyanne Dougherty is a Master of Health Ethics student at Memorial University.