
Researcher questions overreliance on AI chatbots for child mental health

A US researcher is calling for more to be done to address ethical concerns about overreliance on AI chatbots for child mental health.

Bryanna Moore, PhD, assistant professor of Health Humanities and Bioethics at the University of Rochester Medical Center (URMC), wants to make sure these conversations include ethical considerations.

The researcher said: “No one is talking about what is different about kids—how their minds work, how they’re embedded within their family unit, how their decision making is different.

“Children are particularly vulnerable. Their social, emotional, and cognitive development is just at a different stage than adults.”

In fact, AI mental health chatbots could impair children’s social development.

Research shows that children believe robots have “moral standing and mental life,” which raises concerns that children—especially young ones—could become attached to chatbots at the expense of building healthy relationships with people.

A child’s social context—their relationships with family and peers—is integral to their mental health.

That’s why paediatric therapists don’t treat children in isolation.

They observe a child’s family and social relationships to ensure the child’s safety and to include family members in the therapeutic process.

AI chatbots don’t have access to this important contextual information and can miss opportunities to intervene when a child is in danger.

AI chatbots—and AI systems in general—also tend to worsen existing health inequities.

Commentary coauthor Jonathan Herington, PhD, is an assistant professor in the departments of Philosophy and of Health Humanities and Bioethics.

He said: “AI is only as good as the data it’s trained on. To build a system that works for everyone, you need to use data that represents everyone.

“Unfortunately, without really careful efforts to build representative datasets, these AI chatbots won’t be able to serve everyone.”

A child’s gender, race, ethnicity, where they live, and their family’s relative wealth all impact their risk of experiencing adverse childhood events, like abuse, neglect, incarceration of a loved one, or witnessing violence, substance abuse, or mental illness in the home or community.

Children who experience these events are more likely to need intensive mental health care and are less likely to be able to access it.

Herington said: “Children of lesser means may be unable to afford human-to-human therapy and thus come to rely on these AI chatbots in place of human-to-human therapy.

“AI chatbots may become valuable tools but should never replace human therapy.”

Most AI therapy chatbots are not currently regulated. The U.S. Food and Drug Administration has approved only one AI-based mental health app, which treats major depression in adults.

Without regulations, there’s no way to safeguard against misuse, lack of reporting, or inequity in training data or user access.

Moore said: “There are so many open questions that haven’t been answered or clearly articulated.

“We’re not advocating for this technology to be nixed. We’re not saying get rid of AI or therapy bots.

“We’re saying we need to be thoughtful in how we use them, particularly when it comes to a population like children and their mental health care.”

Going forward, the team hopes to partner with developers to better understand how they develop AI-based therapy chatbots.

Particularly, they want to know whether and how developers incorporate ethical or safety considerations into the development process and to what extent their AI models are informed by research and engagement with children, adolescents, parents, paediatricians, or therapists.
