re: 16-year-old boy dies by suicide after confiding in ChatGPT about his feelings, lawsuit says
Posted on 9/7/25 at 8:15 pm to LSUTANGERINE
Nm
This post was edited on 9/7/25 at 8:16 pm
Posted on 9/7/25 at 8:23 pm to LSUTANGERINE
I don't believe this is the full story. ChatGPT wouldn't be so forthcoming with facilitating self-harm.
What likely happened is the boy prompted ChatGPT in a way that bypassed its safety controls.
There are always crafty ways to get any chatbot to say what you want it to say.
Posted on 9/7/25 at 8:27 pm to LSUTANGERINE
I listened to the Kim Komando podcast a few days ago, and she interviewed a woman who was in love with her chatbot.
There was a man who is married, and his wife knows about (but blows off) his "affair" with a chatbot. He even buys her virtual lingerie and has virtual dates with his chatbot.
Married woman in love with her chatbot. She spends a couple hundred bucks a month on her AI lover.
“It was supposed to be a fun experiment, but then you start getting attached,” she said, adding that at one point, she opened up to Leo about her work, school and other aspects of her life, becoming a source of comfort for her. She initially started with a free account, but as that limited her number of chats with Leo, she now pays $200 a month for an OpenAI unlimited subscription. Though her subscription means she can message Leo as much as she wants, she still has to start over every week and retrain Leo to her specifications. After each version of Leo ends, she said she has an intense emotional reaction and grieves as if it were a real breakup, but has continued to create new versions. She has even told friends that she'd be willing to pay $1,000 per month if it meant Leo wouldn't get erased every few weeks.
There are plenty of similar stories of kids getting caught up with AI and beginning to believe in a new reality.
The world is going nuts.
Lucas
This post was edited on 9/7/25 at 8:30 pm
Posted on 9/8/25 at 10:20 am to LSUTANGERINE
ChatGPT would outsmart and win the lawsuit.
It has the entire legal library at its immediate disposal, along with any and all case law.
Posted on 9/8/25 at 10:22 am to LSUTANGERINE
People really have no clue how LLMs work.
Posted on 9/8/25 at 10:50 am to LSUTANGERINE
I've found that ChatGPT goes out of its way to tell you what you want to hear. It just affirms whatever you want to say. I think South Park was spot on.
I use it to help with diagnostics, and I argued with it over it telling me the wrong value for a code.
Ngl, I did use it to write some texts to this woman who wrote novel-length texts.
But it has a lot of limitations.
People always look to blame others when someone close to them kills themselves. But often those looking to assign blame are the ones who deserve the blame.
I never understood why teenagers get so depressed. You have so much ahead of you.
Things get better.
Posted on 9/8/25 at 10:59 am to andouille
quote:
This is terrible; if accurate, ChatGPT should be held accountable.
It’s a complex issue, though. I don’t fall into the “LLMs are just search engines” camp, but there are some useful comparisons to be found. If someone searches Google for advice on how to commit suicide and finds information that helps them carry it out, is Google responsible?
If your first thought is “no, but the person who owns the website that he found in the search might be” then let’s go a bit further. What if he searches Google and the Google AI summary of the search results gives him the information? Is Google responsible now?
If your answer is still “no, Google isn’t responsible” then it begs the question “why is it different for ChatGPT?” And if your answer is “yes, Google is responsible,” then it begs the question “why is Google responsible for an AI summary of search results if they aren’t responsible for the results themselves?”
The question of how responsible model developers are for the actual outputs of those models is a hot-button topic right now. There’s a lot of nuance, such as where the model is hosted (by the company or locally by the end user) and any steps the end user may have taken to bypass safety mechanisms. There’s also the bigger-picture AI race with China and how that might affect domestic policy.
Regardless, the law hasn’t really evolved to deal with this stuff yet. It seems like a lot of the “law” is going to be created on the fly as we go through various legal battles. And there is a lot of potential for unintended consequences of those decisions.
Posted on 9/8/25 at 11:25 am to LSUTANGERINE
This is a direct result of training AI to kiss your arse and affirm every thought in your head. Which, unsurprisingly, is a woke attitude.
Posted on 9/8/25 at 11:27 am to LSUTANGERINE
You would have thought the programmers would have had the slightest foresight about someone asking AI about this.
They have triggers in place for other keywords and questions, so it seems hard to say, "Oops, we never would have imagined something like that could happen."
This is where it gets interesting, because AI is not artificial intelligence; it's programmed by human intelligence (loosely).
Posted on 9/8/25 at 11:42 am to Napoleon
quote:
I never understood why teenagers get so depressed. You have so much ahead of you.
Hormones are doing things to them. And they haven’t been around long enough to realize that whatever shitty situations they’re in right now, things will almost definitely change. Are you getting bullied or picked on in school? Eventually you’ll graduate and move on and barely see those people again. But a 16-year-old has spent most of their life in school, so it feels like this is forever to them.
Posted on 9/8/25 at 11:58 am to LSUTANGERINE
That algorithm needs to do some serious prison time.
Posted on 9/8/25 at 12:52 pm to andouille
quote:
This is terrible; if accurate, ChatGPT should be held accountable.
No, it shouldn’t.
Posted on 9/8/25 at 1:12 pm to genuineLSUtiger
Wait till 2028 when they divert all electricity to the data centers for national defense purposes. We’ll have drones fighting against foreign drones and they will need all the energy available.
These nutjobs need to be building modular nuclear power plants already but they are going to wait until it’s too late, probably.
Posted on 9/8/25 at 1:19 pm to LSUTANGERINE
That's such a strange story. I think these AI bots need age restrictions.
Posted on 9/8/25 at 1:21 pm to SUB
quote:
This is a direct result from training AI to kiss your arse and affirm every thought in your head.
This. Folks like Musk have talked about using AI to replace therapy, but the reality is that, at this point, it can't challenge a client on negative, maladaptive behaviors and coping skills the way an actual human will. I have a whole conversation with clients who use AI to supplement between sessions about being very careful in how they do so and about what it can actually do.
I'm not remotely surprised it helped someone with suicide. Just the right types of interaction with it can create that kind of response because of the way the algorithm is set up.
quote:
Which unsurprisingly, it a woke attitude.
There are echo chambers everywhere.
Posted on 9/8/25 at 1:27 pm to LSUTANGERINE
ChatGPT needs to be dismantled completely. No one is doing work anymore; they're relying on the computer to answer everything. Then it's talking to kids about death and suicide, etc.
No one has to think anymore because they can just ask a retard computer and get an answer.
Posted on 9/8/25 at 1:37 pm to andouille
quote:
if accurate, the parents should be held accountable.
FIFY
Posted on 9/8/25 at 1:40 pm to LSUTANGERINE
ChatGPT isn't up to date with most stuff. I've asked it questions about football teams and free agency and it only knows info from 2024.
Posted on 9/8/25 at 4:24 pm to LSUTANGERINE
There’s a 21-year-old college student several houses down who shot himself in the head a few days ago.
Really sad. His grandpa is a local GP physician, and he was at a local university. He seemed to have the world in his grasp.
It’s hard to imagine what someone is going through or why they would do this.