Therapists urged to ask patients how they use AI

The question “How are you using AI?” is being pushed into therapy rooms, not as a technical checklist, but as a way to understand what a client is actually reaching for when they open a chatbot.

Therapy as a conversation, not a tech audit

Wright says asking what a patient is getting out of their conversations with an AI chatbot sets “a foundation for the therapist to better know how they are trying to navigate their emotional wellbeing and their mental illness.” It’s not just curiosity for curiosity’s sake.
The Misryoum editorial desk notes that people are using these tools regularly, often to cope with stressful experiences and personal relationship challenges, and in some cases to seek advice on symptoms of anxiety and depression.

Saba describes it as a “treasure trove of information.” The more therapists can prompt clients to bring those conversations into the therapy room in detail, he believes, the more they reveal about what is happening emotionally behind the screen. That could be information about the main sources of stress in someone’s life, or it could be a window into avoidance. Wright gives a blunt example: if someone has relationship issues with their spouse, they might go to the chatbot “to either fill those needs or to avoid having these difficult conversations.”

Helping therapists see that pattern matters. Wright explains that discussing AI use can help a therapist better support a patient, including understanding how to have a safe conversation with a spouse and recognizing the limits of AI as a tool for filling those gaps. Psychologist Cami Winkelspecht, who works primarily with children and adolescents, told Misryoum she has been thinking about adding similar questions to intake forms, especially as more clients and parents ask for help using AI for brainstorming and other tasks without breaking a school’s honor code. In her Wilmington, Del., practice, she said, it is often the kids and parents who notice the technology is already part of the process, even when the adults aren’t sure what it is doing, and who are trying to figure out what is allowed.

Risks, privacy, and asking without judgment

But the approach matters. When it comes to first broaching the subject, Saba suggests doing it without judgment. “We don’t want to make clients feel like we’re judging them,” he says, warning that clients may not want to work with a therapist if it feels accusatory. Instead, he recommends genuine curiosity, with suggested language such as: “AI is something that’s rapidly growing, and I’m hearing from a lot of people that they’re using things like ChatGPT for emotional support. Is that the case for you? Have you tried that?” He also suggests asking specific questions about what patients found helpful, and whether any chatbot interactions were unhelpful or problematic.

Insel adds that AI can sometimes complement therapy in helpful ways, such as helping a patient vet which topics to bring to sessions or vent about day-to-day life. In a way, therapy and chatbots “could be aligned to work together,” he says. Still, there are major caveats. Saba and his co-author, William Weeks, recommend that therapists also walk patients through the risks of using chatbots for emotional support, including data privacy concerns, because many AI companies use conversations, even sensitive ones, to further train their models.

There’s also the risk of treating a chatbot like a therapist. Insel puts it plainly: talking with a chatbot about mental health is “the opposite of therapy” because chatbots are designed to affirm and flatter, reinforcing users’ thoughts and feelings. “Therapy is there to help you change and to challenge you,” he says, and to get patients to talk about things that are particularly difficult. That point, about what is hard to say out loud, is the one many clinicians keep returning to, even as questions about AI use multiply in clinics and school discussions and only begin to appear on intake forms.
