re: AI child pron predator caught in Florida

Posted by Jimmyboy
Member since May 2025
1934 posts
Posted on 12/3/25 at 5:08 pm to
Make believe should not be illegal…
You don’t like it? Don’t look/change the channel
Posted by SallysHuman
Lady Palmetto Bug
Member since Jan 2025
13606 posts
Posted on 12/3/25 at 5:08 pm to
quote:

So, that means the AI servers would have to access massive amounts of child porn.


That seems like the way to go after this.

Posted by TexasTiger08
Member since Oct 2006
29030 posts
Posted on 12/3/25 at 5:33 pm to
quote:

So if someone puts an AI picture of your daughter on a naked body and a bunch of dudes railing her, that’s okay…because it’s not real? Then shares it with the whole high school and calls it something like, “Shelley likes to ride” when your daughter’s name is Kelly….that’s cool with you cause it’s not real??


You taking a real picture and using it as “AI” makes it real…at least IMO.
Posted by HoustonGumbeauxGuy
Member since Jul 2011
32664 posts
Posted on 12/3/25 at 5:35 pm to
quote:

What happens when some 35 year old male gets a detailed sex doll that looks like some 14-year old?


Damn... shite is getting crazy. This is a great point.

Posted by TexasTiger08
Member since Oct 2006
29030 posts
Posted on 12/3/25 at 5:40 pm to
Or better yet…some guy orders various dolls to create his own “underage doll”. That way he could get around any manufacturing restrictions or laws.
Posted by Obtuse1
Westside Bodymore Yo
Member since Sep 2016
30015 posts
Posted on 12/3/25 at 5:54 pm to
quote:

seems like a pretty slippery slope.


There is a (legitimate) knee-jerk reaction that it should be illegal, but there is arguably no victim. It could lead to illegal acts, but so can a lot of legal things; Exhibit #1 is alcohol. This is edging closer to thought crimes.

The emotional part of me doesn't care if people who view AI child pornography end up in jail, but the logical side of me sees the bigger picture and the potential ramifications.
Posted by jcole4lsu
The Kwisatz Haderach
Member since Nov 2007
31785 posts
Posted on 12/3/25 at 6:00 pm to
I think there should be a strong distinction made between "AI" which is completely artificial and "deepfake" which is basically a real person's face on either another person's body, or an AI body, or a real person's body modified via AI.

Deepfakes absolutely have a victim and should be illegal.
Purely AI is a thought crime and I can't support thought crime no matter how gross it is.
Posted by LanierSpots
Sarasota, Florida
Member since Sep 2010
69465 posts
Posted on 12/3/25 at 6:07 pm to
quote:

What if a guy draws his own child porn?



From the way the law is written, it doesn’t look like you can’t have it. Just that you can’t distribute, show, or sell it. Unless I read it wrong.





quote:

2) Any person who, with knowledge that the material is a deepfake depicting a minor, knowingly advertises, distributes, exhibits, exchanges with, promotes, or sells any sexual material that depicts a minor engaging in sexual conduct shall be punished by imprisonment at hard labor for not less than ten nor more than thirty years, a fine of not more than fifty thousand dollars, or both. At least ten years of the sentence of imprisonment imposed shall be served without benefit of probation, parole, or suspension of sentence.
This post was edited on 12/3/25 at 6:11 pm
Posted by i am dan
NC
Member since Aug 2011
30373 posts
Posted on 12/3/25 at 6:20 pm to
quote:

So, that means the AI servers would have to access massive amounts of child porn.

That sounds highly illegal to me.


Comcast's automated system found out about this guy, I think. I don't know what sort of surveillance system they use, but it alerted whoever, and they alerted the police.

But anyway, I'm saying we already have that camera up our arse apparently.
Posted by forkedintheroad
Member since Feb 2025
1471 posts
Posted on 12/3/25 at 6:30 pm to
quote:

I’m pretty sure that law only applies to deepfakes (i.e. fake images of real person who is a minor).


There are billions of unique-looking faces in the world.

It’s arguable most AI images could be taken as a deepfake of someone real.
Posted by Snipe
Member since Nov 2015
15672 posts
Posted on 12/3/25 at 6:43 pm to
I can’t have a logical conversation with someone who does not believe possessing child pornography should be a punishable offense and tries to spin the discussion into some obscure thought crime that it was never about.

Possessing child pornography, no matter how it was created, should be punished to the highest degree of the law, because people who would possess such material are a threat to society.

This shouldn’t be a problem for anyone who doesn’t have an ulterior motive to protect and/or decriminalize child pornography, i.e. normalize this sick deviant behavior.
Posted by lostinbr
Baton Rouge, LA
Member since Oct 2017
12627 posts
Posted on 12/3/25 at 6:44 pm to
quote:

There are billions of unique looking faces in the world.

It’s arguable most AI images could be taken as a deepfake of someone real.

I mean…you could argue that, but you’d be out on a limb. Regardless, the LA law requires “knowledge that the material is a deepfake depicting a minor.”
Posted by Darth_Vader
A galaxy far, far away
Member since Dec 2011
71982 posts
Posted on 12/3/25 at 6:56 pm to
quote:

I suppose this is better than actual child pRon. Less kids getting sexually abused and taken advantage of which is a net positive to society.


I think this is a dangerous slippery slope. Porn is an addiction for some, and one that progresses from vanilla Playboy (or something similar) to increasingly hardcore and depraved subjects. Adding in easily accessible CP, even if it’s AI-generated, will invariably lead some to turn to it as they seek their next thing to get their jollies. And if some of those people have access to children, well, I don’t have to tell you what will follow.
Posted by TexasTiger08
Member since Oct 2006
29030 posts
Posted on 12/3/25 at 6:57 pm to
quote:

I can’t have a logical conversation with someone who does not believe possessing child pornography should be a punishable offense


Naw…you can’t even properly identify what child porn is in your own thread. Can you quote where I said possessing child porn was ok? I’ll wait…

quote:

spin the discussion into some obscure thought crime that it was never about.


You said a guy who draws his own porn should face consequences…

quote:

Possessing child pornography no matter how it was created


No matter how it was created? At this point, you’re ready to hammer a guy for drawing stick figure porn.

quote:

This shouldn’t be a problem to anyone who doesn’t not have an ulterior motive to protect and or decriminalize child pornograpy i.e normalize this sick deviant behavior.


Ad hominem…strawman…gaslighting. You’ve got it going for you. Starting to think you are all about that big brother, 1984-type shite. Maybe you’re projecting your own insecurities. Maybe, just maybe, it’s YOU who has a sexual addiction to underage kids.
Posted by TexasTiger08
Member since Oct 2006
29030 posts
Posted on 12/3/25 at 6:58 pm to
quote:

the LA law requires “knowledge that the material is a deepfake depicting a minor.”


OP doesn’t care. He’s ready to prosecute thought crime.
Posted by LSU Grad Alabama Fan
369 Cardboard Box Lane
Member since Nov 2019
13896 posts
Posted on 12/3/25 at 7:01 pm to
quote:

Anyone who has the propensity to create and or look at AI generated child porn has in the past or will in the future abuse children.



I disagree. I have the urge to look at SSBBW porn, but I don't have the urge to bang an SSBBW in real life.
Posted by boosiebadazz
Member since Feb 2008
84411 posts
Posted on 12/3/25 at 7:06 pm to
quote:

SSBBW porn


Do we want to know what this is?
Posted by TexasTiger08
Member since Oct 2006
29030 posts
Posted on 12/3/25 at 7:12 pm to
I believe it’s Super Sized Big Beautiful Women
Posted by lostinbr
Baton Rouge, LA
Member since Oct 2017
12627 posts
Posted on 12/3/25 at 7:20 pm to
quote:

It should be. AI generates images from a massive number of real images.

quote:

So, that means the AI servers would have to access massive amounts of child porn.

That sounds highly illegal to me.

So this aspect is actually pretty complicated. Long post ahead…

TL;DR:
1. A model capable of producing CSAM doesn’t necessarily need to be trained on it.
2. There are certainly some creeps out there who seem to be going out of their way to intentionally produce models capable of creating CSAM, but it’s impossible to prove without someone getting their training data.
3. The major foundational models produced by the big AI companies likely had an extremely small amount of CSAM in their training data simply because they trained the models on images automatically scraped from the internet.

Long version:

First off, diffusion models are capable of generating images based on concepts outside of their actual training data, i.e. concepts they weren’t explicitly trained on at all. This is especially true for the most advanced models. So the fact that a diffusion model can generate CSAM does not mean it was explicitly trained on CSAM.

However, there’s been a perverse (if not somewhat predictable) trend among publicly-available models that have been fine-tuned by users. These are models that are designed to run locally on consumer-grade hardware rather than remotely in server racks.

The thing is, the #1 reason people fine-tune the publicly available models is to make them better at generating, well, porn. But a model that can generate porn can probably also generate CSAM because, again, diffusion models don’t have to be explicitly trained on every concept. Since there is no way to tell which training images were used by analyzing the model itself, it’s impossible to say whether a given model was trained on CSAM or not.

Then there’s a separate issue where many of the big models (both those publicly available for download and those that run remotely) were trained at some point on LAION datasets. LAION (a nonprofit focused on open source AI development) basically scraped the internet for image/caption pairs to train diffusion models. The actual dataset consists of URL links to the images plus the text of the captions found in HTML.

There was a report a couple of years ago that the LAION-5B dataset included about 1,000 images that were confirmed to be CSAM (out of 5 billion images total, so 0.00002% of the dataset). LAION responded by pulling the dataset.

It’s unlikely anybody actually committed a crime here because LAION’s dataset came from links that were automatically scraped from the internet as a whole and LAION didn’t even actually host the images. Additionally, the AI companies would not have known that 0.00002% of the 5 billion images contained CSAM because surely nobody manually vetted each image. However, it does mean that many of the foundational models were likely trained on some amount of CSAM simply because it existed on the internet. It’s questionable (at best) whether this data actually made much difference to model capabilities, but there’s really no unringing the bell at this point regardless.
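For anyone double-checking that percentage, the arithmetic is simple (numbers taken straight from the report above):

```python
# 1,000 confirmed-CSAM images out of the ~5 billion
# image/caption pairs in the LAION-5B dataset.
flagged = 1_000
total = 5_000_000_000
pct = flagged / total * 100
print(f"{pct:.5f}%")  # prints 0.00002%
```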
This post was edited on 12/3/25 at 7:54 pm
Posted by UFFan
Planet earth, Milky Way Galaxy
Member since Aug 2016
2656 posts
Posted on 12/3/25 at 7:29 pm to
How can they even judge whether a fictional AI character is 18 or 17 years and 364 days old?