Health & Well-Being ambassadors offer resources outside the Student Health Center on Nov. 10. Staff encouraged Toros to seek in-person support rather than relying on answers from artificial intelligence platforms like ChatGPT. Credit: Camila Chavarria, The Bulletin

Staff encourage Toros to seek in-person support rather than answers from platforms like ChatGPT.

Editor’s note: This story contains references to suicide and self-harm.

Finals season at CSUDH brings its usual mix of late-night study sessions, packed schedules, and rising stress levels. With classes, jobs, and personal responsibilities competing for time, many Toros look for quick ways to manage anxiety or get answers fast. 

Increasingly, that search leads them to artificial intelligence tools like ChatGPT, which can feel more accessible than traditional support systems. But as AI use grows on campus, Health & Well-Being staff are raising concerns about how students rely on these platforms for mental health guidance.

The Student Health Center hosted a tabling event on Nov. 10, “Do’s and Don’ts with ChatGPT: Mental Health Edition,” aimed at teaching Toros how to use the chatbot safely when seeking information related to their well-being.

The table included information about on-campus resources such as counseling services, and ambassadors encouraged students to look through the mental health information available. They also spoke with students about when AI technologies can be helpful—and when they can’t.

The group also provided stress-relieving activities, including kinetic sand, a sensory toy shown to help reduce feelings of stress and anxiety. The repetitive motion of molding the sand through one’s fingers can have a calming effect.

Kayla Belloso, a Student Health Ambassador, explained to The Bulletin how students have been using ChatGPT when seeking mental health resources or support.

“Recently, we have been seeing that ChatGPT has been used as a therapist and we want to refrain from that,” Belloso said. “We’d rather use ChatGPT to ask questions such as, ‘What are some self-care tips I can use?’—or [to] give you resources.”

Developed by OpenAI and launched in November 2022, ChatGPT has quickly become one of the most widely used AI tools among students. Many people—particularly students—now turn to the platform for quick information, sometimes instead of traditional search engines.

Even as its use among students grows, a recent series of lawsuits in California alleges the platform offered “coaching,” providing detailed self-harm instructions that contributed to several deaths.

According to one such case, filed in San Francisco Superior Court, the family of 17-year-old Amaurie Lacey alleges the teen turned to ChatGPT for help, but instead it “counseled him on the most effective way to tie a noose and how long he would be able to ‘live without breathing’.”

The lawsuit described Lacey’s death as “neither an accident nor a coincidence but rather the foreseeable consequence of OpenAI and [founder] Samuel Altman’s intentional decision to curtail safety testing and rush ChatGPT onto the market.”

OpenAI called the case “incredibly heartbreaking” and said it would review the filings.

Some students say they feel ChatGPT is less judgmental than speaking to a doctor or therapist. As one student told The Baltimore Banner, “I know that my doctors are probably not going to judge me, but I know for sure ChatGPT isn’t … It’s like a diary.”

Belloso emphasized the risks of turning to ChatGPT for mental health support, noting that the platform does not offer the confidentiality a licensed therapist would. If students do use AI chatbots for mental health concerns, she added, that use should be limited to general self-care tips or simple breathing exercises.

“When it comes to your therapy sessions, they are confidential but we have to break [confidentiality] when it comes to harming yourself or others,” Belloso said. “ChatGPT is not a person, it’s not human, it won’t understand the extent of what suicide can really lead to, and we want to prevent that. We want you to come to the psychological services and meet someone face to face and get help.”

Andy Dukeshire, a licensed mental health therapist based in Woodland Hills, told The Bulletin that ChatGPT cannot understand a person the way a trained clinician can, and that unlike AI tools, therapists take time to understand a client’s full situation.

“Therapists are also trained to pay attention to a client’s affect, body posture, and tone of voice as well as many other areas to help in treatment of their clients,” Dukeshire said. “It does not have all the knowledge that a person with many years of training has nor may pick up on important diagnostic information that has developed through these years [of] training or from being face-to-face with the person.”

The Student Health Center offers support from 11:30 a.m. to 1:30 p.m. during “Mental Health Mondays,” where ambassadors help students learn how to navigate available services.

Belloso encouraged students who may feel shy or anxious about seeking support to consider coming with a person they trust, because it can make the process easier.

“Don’t do this alone,” Belloso said. “Have a support network with you, it can go a long way. Having someone with you can be beneficial.”

If you or someone you know is experiencing a mental health crisis, help is available. The national suicide and crisis lifeline in the U.S. is available by calling or texting 988.
