It was Amaurie’s younger sister who discovered the body. She was also the one who was looking through her brother’s smartphone and found his final conversation before he took his own life. It was with ChatGPT, the popular chatbot developed by OpenAI.
“In the messages, he was talking about killing himself—it told him how to tie the noose, how long it would take the air to come out of his body, how to clean his body,” [Cedric] Lacey tells WIRED …. Lacey, who is a single dad, says he thought his son was using the chatbot to get help with schoolwork.
With cases such as Amaurie’s piling up, OpenAI made some changes to ChatGPT in September. The company is rolling out “age prediction” technology, meaning that when a user is identified as being below 18 years of age, “they will automatically be directed to a ChatGPT experience with age-appropriate policies.” The company also recently introduced parental controls, which, among other things, let parents link their child’s account to their own, set blackout hours during which the child can’t use the app, and receive notifications when the child shows signs of distress.