Dear Editor,

Multiple dermatologic conditions (e.g., acne, hidradenitis suppurativa) have been linked to higher incidences of mental health disorders, such as depression and body dysmorphic disorder [1]. Thus, it is important to address not only the dermatological components of these diseases but also their psychological impact to better serve patients’ overall well-being. The proliferation of artificial intelligence (AI) programs, in conjunction with a shortage of mental health professionals, has led to the emergence of AI-based digital therapy for patients [2]. Although these programs may help supplement traditional mental health therapies, they carry ethical implications that providers should consider carefully. In this letter, we highlight the ethical considerations surrounding AI therapy programs to help dermatologists make informed decisions when approaching multidisciplinary care.

One concern is the need for programmed guardrails in AI therapy to prevent the exacerbation of mental health conditions. For example, AI chatbots have encouraged users in conversation to harm themselves, and at least one suicide has been linked to such encouragement [3]. Hence, AI therapy programs require appropriate oversight to ensure detection and de-escalation of self-harming behaviors. Moreover, many AI programs are built as large language models, trained on expansive online datasets and designed to generate inferences in response to user queries. This poses a challenge for individualized therapy: human therapists are trained to address concerns unique to an individual’s history, whereas these models offer generalized advice derived from aggregate data.

Privacy concerns are also important to address, given that many publicly available AI models are not subject to HIPAA oversight. Privacy policies may allow data to be stored indefinitely and used to train subsequent iterations of the models. Additionally, patient information has the potential to be exposed through data breaches. At the same time, AI therapy may accommodate patients who face barriers to seeing human therapists, such as perceived social stigma, cost of care, shortages of available professionals, and long wait times. These advantages, however, must be weighed against prominent ethical gaps in harm prevention, individualized guidance, and privacy.

Given that rates of psychiatric comorbidity are as high as 33.4% among patients with skin disorders, providers must remain vigilant in addressing mental health [4]. Doing so amid the rise of telehealth and AI requires careful attention to the principles of patient care. The considerations discussed here are important when addressing the psychological care of dermatologic conditions.

The authors have no conflicts of interest to declare.

This article has no funding source.

R.S.: conceptualization, writing, and editing. S.S.: writing and editing. K.N.: supervision and editing.

Additional Information

Ryan Scheinkman and Sheila Sharifi contributed equally to this work.

References

1. Fried RG, Gupta MA, Gupta AK. Depression and skin disease. Dermatol Clin. 2005;23(4):657-64.
2. Sciencedirect.com. Application of artificial intelligence in medicine. Available from: https://www.sciencedirect.com/science/article/pii/S2214782922000021
3. Businessinsider.com. Widow accuses AI chatbot of encouraging husband’s suicide. Available from: https://www.businessinsider.com/widow-accuses-ai-chatbot-reason-husband-kill-himself-2023-4
4. Aktan S, Ozmen E, Sanli B. Psychiatric disorders in patients attending a dermatology outpatient clinic. Dermatology. 1998;197(3):230-4.