AI Group Chats Are Now Therapy’s Third Wheel

According to Forbes, the latest advance in generative AI is the arrival of group chat features in large language models, with OpenAI launching the capability for ChatGPT on November 13, 2025. Multiple human participants and the AI can now be active in a single dialogue, where the AI can be told to listen quietly or to intervene when called upon. One of the most significant applications being explored is mental health therapy, transforming the classic therapist-client dyad into a therapist-AI-client triad. This is happening as millions already use AI like ChatGPT for mental health guidance, with the platform alone boasting over 800 million weekly active users. The move also follows an August 2025 lawsuit against OpenAI over a lack of safeguards in the AI's mental health advice, highlighting the risks even as the technology races forward.

The AI Third Wheel

So here’s the thing. For years, using an LLM for anything was a solo act: you talked to the bot, it talked to you. Bringing a therapist into that same digital room, with the AI listening in, was clunky; basically, the two humans would have to share a login and pretend to be one person. Now, with a dedicated group chat feature, that logistical hurdle is gone. The therapist and client enter as themselves, and the AI is just… there. A silent note-taker, a clarifier on standby, or even an active co-therapist, depending on the settings. It’s a fundamental shift from an after-the-fact, copy-paste workflow to a live, integrated one.
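To make those “settings” concrete, here is a minimal sketch of how an AI participant’s role in a group session might be modeled. This is purely illustrative, not OpenAI’s actual API; AIMode, GroupSession, and every other name here are invented for the example.

```python
from dataclasses import dataclass, field
from enum import Enum, auto

class AIMode(Enum):
    """Hypothetical participation levels for the AI member of a group chat."""
    SILENT = auto()    # transcribe only; never speaks
    ON_CALL = auto()   # responds only when addressed directly
    ACTIVE = auto()    # may interject at its own discretion

@dataclass
class GroupSession:
    participants: list[str]
    ai_mode: AIMode = AIMode.SILENT
    transcript: list[tuple[str, str]] = field(default_factory=list)

    def post(self, speaker: str, text: str) -> str | None:
        """Record a message; return an AI reply only if the mode allows it."""
        self.transcript.append((speaker, text))
        addressed = text.lower().startswith("@ai")
        if self.ai_mode is AIMode.SILENT:
            return None
        if self.ai_mode is AIMode.ON_CALL and not addressed:
            return None
        return self._ai_reply(text)

    def _ai_reply(self, text: str) -> str:
        # Stand-in for a real model call; a real system would send the
        # transcript to an LLM and return its message.
        return f"[AI] Noted: {text!r}"

session = GroupSession(["therapist", "client"], ai_mode=AIMode.ON_CALL)
session.post("client", "I keep having flashbacks.")           # AI stays quiet
print(session.post("therapist", "@ai define PTSD briefly."))  # AI responds
```

The point of the sketch is that third participant: the same object in the room can be dialed from stenographer to co-therapist without changing anything else about the session.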

Convenience Vs. The Creep Factor

And look, the potential benefits are obvious. The AI can instantly define terms like PTSD if a client gets lost. It can produce a verbatim transcript and an instant summary of the session for later review. It might even gently interject if it detects a misunderstanding. That’s powerful. But doesn’t this also feel a bit eerie? You’re having your most vulnerable conversation with a human professional, and a corporate-owned language model is parsing every word. The Forbes piece rightly flags this “Big Brother” vibe. Who owns that data? What are the privacy implications? The trust between therapist and client is sacred, and adding an unfeeling, error-prone algorithm into that mix is a massive gamble. After all, these are the same systems recently sued for potentially fostering delusional thinking.

A Tool, Not A Replacement

The critical perspective here is that this only works if the AI remains a tool under the therapist’s strict control. The Forbes analysis outlines a spectrum, from a totally passive listener to an autonomous participant. I think the passive mode is where the near-term value lies—an advanced recording device. Letting the AI intervene at its own discretion? That seems like a recipe for disaster. Therapy is nuanced, human, and messy. An LLM might correctly define a term but completely miss the emotional subtext of why that term is frightening a client. The therapist’s expertise isn’t just in knowing concepts; it’s in knowing when and how to deliver them. The AI can’t replicate that judgment. Not yet, anyway.
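Forbes describes that spectrum only in prose. As a back-of-the-envelope policy sketch (all names hypothetical; nothing here is a real product feature), the therapist-in-the-loop idea might look like this:

```python
from enum import Enum, auto
from typing import Callable, Optional

class Autonomy(Enum):
    """Hypothetical policy levels for the spectrum Forbes outlines."""
    PASSIVE = auto()      # record and summarize only
    ON_REQUEST = auto()   # may speak, but only past a human gate
    AUTONOMOUS = auto()   # interjects at will (the risky end)

def gate_interjection(level: Autonomy, draft: str,
                      therapist_ok: Callable[[str], bool]) -> Optional[str]:
    """Decide whether a drafted AI remark ever reaches the client.

    `therapist_ok` stands in for an approval prompt shown to the
    clinician; short of AUTONOMOUS, nothing is delivered without
    a human in the loop.
    """
    if level is Autonomy.PASSIVE:
        return None                                   # transcript only
    if level is Autonomy.ON_REQUEST:
        return draft if therapist_ok(draft) else None
    return draft                                      # AUTONOMOUS: no gate

# The drafted remark is held until the therapist approves it.
held = gate_interjection(Autonomy.ON_REQUEST,
                         "PTSD stands for post-traumatic stress disorder.",
                         therapist_ok=lambda d: True)
print(held)
```

The design point: “passive” isn’t a weaker product, it’s the safety property. It is the only level at which the model’s judgment never substitutes for the therapist’s.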

The Inevitable Rush Forward

Now, despite the huge red flags, this is absolutely going to happen. Forbes predicts that group chat will become “table stakes” for all major LLMs. Once one platform has it, they all will. And in a field like mental health, where access is a huge problem, the siren song of an “AI-assisted” session that’s more efficient and maybe cheaper will be irresistible to many. But we’re rushing headlong into this. The safeguards clearly aren’t robust, as the recent lawsuit shows. We’re basically conducting a massive, real-time experiment on vulnerable populations. The convenience is seductive, but the risks—to privacy, to effective care, to the very humanity of therapy—are profound. The tech is advancing faster than our ability to understand its consequences. Again.
