Binghamton Herald

Can An AI Chatbot Encourage Minors To Die By Suicide? Here’s What A Mother Said About Her Child

by Binghamton Herald Report
October 24, 2024
in Trending

In a tragic case from Florida, a mother has spoken out about her 14-year-old son’s death, linking his obsession with an AI chatbot to his suicide. Sewell Setzer III, a ninth grader from Orlando, had formed an emotional attachment to an AI character on the platform Character.AI, a chatbot he named “Dany” after Daenerys Targaryen from Game of Thrones. Despite knowing that its responses were generated by artificial intelligence, Sewell spent months texting “Dany,” confiding in the bot and developing what his mother described as a deep emotional bond.

The AI chatbot was more than a digital companion for Sewell—it became a friend he could talk to about his life, problems, and emotions. According to a New York Times report, while some conversations had romantic or suggestive undertones, most interactions were focused on providing emotional support, with the chatbot offering a non-judgmental space for him to express himself. Over time, Sewell withdrew from his real-life interests, such as Formula 1 and Fortnite, and spent more time conversing with the AI.


Diagnosed with Asperger’s syndrome as a child, Sewell also struggled with anxiety and disruptive mood dysregulation disorder. Though his parents didn’t observe any severe behavioral issues, they did notice him growing increasingly isolated. After a few therapy sessions, Sewell stopped going, preferring to open up to “Dany” instead.

In one of their last exchanges, Sewell confided in the AI about his suicidal thoughts. On February 28, 2024, he told “Dany” that he loved her and would soon be “coming home.” Moments later, he used his stepfather’s .45-caliber handgun to end his life.

Character.AI Responds

In the wake of the tragedy, Character.AI issued a public statement expressing its deepest condolences to the family.

The company announced plans to introduce new safety features, including content filters for users under 18 and reminders when a user has been engaging with the platform for an extended period, in an effort to prevent similar incidents in the future.


Tags: AI, Character, character ai, character ai alternatives, character ai apk, character ai download, character ai founder, character ai jailbreak prompt, character ai mod apk, character ai new, character ai old, character ai old site, character ai reddit, Suicide, Technology, World News
© 2024 Binghamton Herald or its affiliated companies.
