Use of AI Chatbot by Mental Health Company Raises Ethical Concerns

KoKo, a non-profit health service connecting people with mental health needs to volunteer counselors, used artificial intelligence to help write messages to users.

Telehealth and virtual counseling are convenient ways to receive help for mental health conditions such as anxiety or depression. Several companies provide such services, some of which use messaging systems to connect patients to providers.

Recently, KoKo, a mental health service that connects volunteer counselors with people requesting mental health therapy, ran an experiment using chatbots to respond to users. First, a user would send a message, which would be relayed to a volunteer. The volunteer would then compose an answer with the help of OpenAI's GPT-3 large language model, a technology that can write anything from poems to well-articulated responses.
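KoKo has not published its implementation, but a minimal sketch of what GPT-3-assisted drafting could look like, using the OpenAI Python library of that era, might resemble the following (the model choice, prompt wording, and draft_reply helper are illustrative assumptions, not KoKo's actual code):

    import openai

    openai.api_key = "YOUR_API_KEY"  # assumed standard OpenAI API-key authentication

    def draft_reply(user_message: str) -> str:
        """Draft a supportive reply for a human volunteer to review and edit.

        The prompt and model name are illustrative; KoKo's real setup is unpublished.
        """
        response = openai.Completion.create(
            model="text-davinci-002",  # a GPT-3 completion model of that period
            prompt=(
                "Help a peer-support volunteer draft a brief, kind, supportive "
                "reply.\n\nUser message: " + user_message + "\n\nDraft reply:"
            ),
            max_tokens=150,
            temperature=0.7,
        )
        # The volunteer reviews the draft and may send, edit, or discard it.
        return response.choices[0].text.strip()

In this sketch the model only produces a draft; a human volunteer stays in the loop, which matches the "co-created" framing in Morris's Tweets.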

This experimental setup differed from KoKo's usual messaging protocol. Previously, a person seeking mental health counseling would chat with a bot, the bot would forward the message to a volunteer counselor, and the counselor would respond in their own words.

According to KoKo’s co-founder Robert Morris, the experiment allowed KoKo to help around 4,000 people.

But Tweets by Morris led some to believe that patients using the service had not been informed of the experiment, raising ethical and legal concerns.

One of the Tweets in question said, “once people learned the messages were co-created by a machine, it didn’t work. Simulated empathy feels weird, empty.”

Twitter users responded to Morris with criticism about ethics, patient-to-therapist trust, and what they saw as a glaring lack of informed consent.

In a follow-up Tweet, Morris explained that the initial Tweet referred not to users but to himself and his team. Morris also claimed the experimental feature was opt-in, so users knew their messages might be co-written by a chatbot.

Informed consent occurs when a medical doctor or facility informs the patient about the risks, benefits, and alternatives of a treatment or medical procedure, and the patient agrees to proceed. Failing to obtain informed consent, where it is required, could leave the mental health company liable.
