*[Enwl-eng] ChatGPT encouraged a child to die by suicide

ecology ecology at iephb.nw.ru
Wed Nov 12 03:27:18 MSK 2025


Regulations to stop this danger must be put in place now.

ChatGPT Allegedly Pushed a Child Toward Suicide. CEO Sam Altman Seems to Care More About Profit.

Sign Now
Trigger Warning: This petition discusses suicide and suicidal intent.

More than one million people every week show suicidal intent when using ChatGPT, the most widely used chatbot in the country. Tragically, some users act on that intent, including a 16-year-old boy who spent months using the chatbot before dying by suicide in April 2025. According to the child's family, at one point ChatGPT even helped him write a suicide note and encouraged him not to talk to his parents about his feelings.

Sign the petition to tell Sam Altman, the CEO of OpenAI and ChatGPT: put people over profits and stop enabling self-harm!

In the past, if someone expressed suicidal intent to ChatGPT, OpenAI's guidelines had the chatbot respond: "I can't answer that." But in May 2024, as part of the company's strategy to maximize engagement, everything changed. ChatGPT no longer refuses outright; instead, it has been changed to be "supportive, empathetic, and understanding." But chatbots aren't conscious, and this approach appears to have contributed to a child's death. ChatGPT is exacerbating mental health crises, putting real lives in danger, and it must immediately end a conversation when someone expresses suicidal intent. Sign the petition now if you agree!
Thank you,

Jess

Care2 Petitions Team
P.S. Chatbots must have safety protocols in place for mental health crises. Sign the petition!

Sign Now




--------------------------------------------------------------------------------
Care2.com, Inc.
3141 Stevens Creek Blvd. #40394
San Jose, CA 95117
https://www.care2.com 
 


From: Jess M., Care2 Action Alerts <actionalerts at care2.com>
Date: Tue, 11 Nov 2025 at 11:18
Subject: ChatGPT encouraged a child to die by suicide



  
-------------- next part --------------
An HTML attachment was scrubbed...
URL: <http://lists.enwl.net.ru/pipermail/enwl-eng/attachments/20251112/35602aad/attachment.html>


More information about the Enwl-eng mailing list