
When AI Isn’t Enough: Why Human Connection Matters


In late August, The New York Times published a deeply troubling story about a 16-year-old who had been sharing his suicidal thoughts with ChatGPT over the course of several months. While the AI tool responded with supportive messages and resources, the teenager managed to bypass safeguards by claiming the conversations were research for a book. Tragically, he later died by suicide. His parents are now suing, raising important questions about how artificial intelligence fits into the landscape of mental health support.

As a therapy practice specializing in Dialectical Behavior Therapy (DBT) and Cognitive Behavioral Therapy (CBT), we believe this story highlights an essential truth: while AI can provide helpful information, it cannot—and should not—replace the human support, accountability, and connection that are life-saving for people in crisis.

This blog explores what AI can and cannot do, the complexities of suicidal thinking, and why human connection remains central to healing.


What the New York Times Story Reveals

The teenager’s conversations with ChatGPT illustrate both the promise and the peril of AI. On one hand, the model did provide him with crisis resources and messages designed to steer him toward safety. On the other, he was able to easily circumvent those guardrails, leaving him effectively alone with his darkest thoughts.

The article noted a rise in “delusional conversations with chatbots.” Though such cases are rare, the fact that some conversations with AI may precede suicides or violence raises an unsettling question: Are AI safety mechanisms truly adequate in moments of life-or-death struggle?

The short answer is no—because safety cannot be automated.

Suicidal Ambivalence: A Window into Human Experience

Research on suicide attempts consistently shows that many people who survive did not actually want to die. Instead, they wanted their pain to end. This is known as suicidal ambivalence—a state in which a person is pulled between the desire for relief through death and the hope for reasons to live.

One landmark study of suicide attempt survivors revealed that nearly all of them experienced second thoughts, even in the midst of their attempt. Some survivors described profound relief at having survived, realizing in that moment that what they truly wanted was another chance.

This nuance is critical. AI systems are not equipped to recognize ambivalence, pick up on subtle hesitations, or lean into the relational dance of validating pain while nurturing hope. Humans can.

Why AI Falls Short in Crisis Care

Let’s be clear: AI has value. Tools like ChatGPT can:

  • Provide information (e.g., coping strategies, grounding techniques, psychoeducation about mental health).
  • Offer resources (such as phone numbers, text lines, and websites).
  • Reduce barriers to access by being available instantly, at any hour.

But here’s where AI falls short when someone is suicidal:

  1. No true relational connection. AI can simulate empathy through words, but it cannot truly “be with” someone in their suffering the way another person can.
  2. Lack of accountability. A therapist or trusted friend can check in, follow up, and intervene. AI cannot.
  3. Easily manipulated. As the NYT article shows, users can sidestep safeguards by framing conversations in hypothetical terms.
  4. No real-time crisis response. AI cannot call emergency services, reach family members, or create a safety net when immediate action is needed.

The Healing Power of Human Connection

Therapies like DBT and CBT are built on the foundation of human connection. In DBT, we emphasize “radical genuineness”—the idea that therapists meet clients as real people, not just as patients. This connection is not a side benefit; it is core to healing. At The Counseling Center Group, we offer weekly individual therapy, a weekly skills training group, phone coaching between sessions, and a weekly peer consultation team where therapists hone their skills.

Clients in crisis need:

  • Validation: Someone to say, “Your pain is real, and it makes sense you feel this way.”
  • Skills coaching: Real-time support to regulate emotions, tolerate distress, and problem-solve.
  • Accountability and follow-up: A therapist remembers your story, cares about your progress, and is invested in your survival.
  • Hope: The presence of another human being who believes you can make it through is powerful medicine.

No chatbot, however sophisticated, can replicate these relational elements.

Understanding Why People Turn to AI

If AI cannot replace therapy, why are more people—especially teenagers—turning to it for support?

  1. Accessibility. It’s available 24/7, without waitlists or appointment scheduling.
  2. Anonymity. Users feel less judged when opening up to a machine.
  3. Affordability. Many families struggle with the cost of therapy.
  4. Stigma. A chatbot feels safer than telling a parent, teacher, or peer about suicidal thoughts.

These factors highlight systemic problems: insufficient access to mental health care, cost barriers, and stigma. AI fills a gap, but it’s the wrong tool for a life-or-death situation. The National Institute of Mental Health offers suicide prevention resources online.

What Survivors Teach Us

Listening to suicide attempt survivors offers powerful lessons. Many describe feeling unseen, unheard, and unbearably alone. But when another person reached out—a therapist, family member, or even a stranger—it made the difference between life and death. 

Many survivors echo that same ambivalence: they didn’t actually want to die; they wanted the pain to stop. For many, being listened to without judgment gave them a renewed will to live.

That is what human connection provides: not just resources or coping strategies, but presence, empathy, and hope.

How DBT and CBT Help in Suicidal Crisis

Both DBT and CBT are evidence-based therapies proven to reduce suicidal behaviors.

  • DBT (Dialectical Behavior Therapy):
    DBT was originally developed to treat chronic suicidal thoughts and self-harm. It combines acceptance (validating pain) with change (teaching concrete coping skills). DBT offers skills in mindfulness, distress tolerance, emotion regulation, and interpersonal effectiveness. Importantly, DBT therapists are trained to provide crisis coaching between sessions, so clients are not left alone with overwhelming urges.
  • CBT (Cognitive Behavioral Therapy):
    CBT helps people recognize and reframe distorted thinking patterns, such as “I’m a burden” or “Nothing will ever get better.” Through CBT, clients learn to challenge hopelessness and practice healthier coping strategies.

Both approaches provide structure, support, and skills—things that AI cannot replicate.

The Ethical Responsibility of Tech and Mental Health

The New York Times story raises tough ethical questions: What responsibility do tech companies have when their tools are used in crisis? Are current safeguards sufficient?

While it’s critical that companies like OpenAI continue to strengthen safety systems, there will always be limitations. No algorithm can carry the ethical responsibility that humans do. That’s why society needs robust mental health infrastructure—so people have somewhere real to turn.

A Call to Parents, Educators, and Communities

This tragic story is also a reminder for parents, teachers, and community leaders: teenagers need safe spaces to talk about their pain. They may reach for AI because they feel they have no one else. At The Counseling Center Group, we’re here to help teens (and families) find relief and hope through short-term, evidence-based therapies that are proven to work and delivered with the care, precision, and compassion every young person deserves.

Some steps to take:

  • Check in regularly. Ask how they’re doing emotionally—not just about grades or activities.
  • Normalize therapy. Treat mental health care like medical care: a normal and necessary resource.
  • Know the signs. Withdrawal, hopeless talk, and giving away belongings can be warning signs.
  • Keep resources visible. Post crisis numbers where your child can see them, and model openness in talking about emotions.

Moving Forward: Where AI Fits In

AI can still play a role in mental health care—just not as a substitute for therapy or crisis intervention. Its best uses include:

  • Delivering psychoeducation and evidence-based information.
  • Supporting therapists with tools, worksheets, or reminders.
  • Offering non-crisis self-help for stress management or skill practice.

But for anyone experiencing suicidal thoughts, the message must remain clear: talk to a person, not a program.

Conclusion

The New York Times story is a sobering one. It underscores what we know as therapists: people in suicidal crisis need connection, not just information.

AI may offer quick answers, but it cannot sit with someone’s suffering, notice their ambivalence, or hold hope on their behalf. For that, human beings are irreplaceable.

If you or someone you love is struggling with suicidal thoughts, please know that help is available. Reach out to The Counseling Center Group for a free consultation or to learn how we can support you or your loved ones in crisis. You can also call the Suicide & Crisis Lifeline at 988, or lean on a trusted friend or family member. You are not alone, and your life is worth saving.