
Apple will scan photos stored on iPhones and iCloud for child abuse imagery

Posted on 8/5/21 at 4:00 pm
Posted by DemonKA3268
Parts Unknown
Member since Oct 2015
19177 posts
Nothing to see here

quote:

Apple plans to scan photos stored on iPhones and iCloud for child abuse imagery, according to the Financial Times. The new system could help law enforcement in criminal investigations but may open the door to increased legal and government demands for user data.
quote:

The system, called neuralMatch, will “proactively alert a team of human reviewers if it believes illegal imagery is detected, who would then contact law enforcement if the material can be verified,” the Financial Times said. neuralMatch, which was trained using 200,000 images from the National Center for Missing & Exploited Children, will roll out first in the US. Photos will be hashed and compared with a database of known images of child sexual abuse.
quote:

“According to people briefed on the plans, every photo uploaded to iCloud in the US will be given a ‘safety voucher,’ saying whether it is suspect or not,” the Financial Times said. “Once a certain number of photos are marked as suspect, Apple will enable all the suspect photos to be decrypted and, if apparently illegal, passed on to the relevant authorities.”
quote:

Johns Hopkins University professor and cryptographer Matthew Green raised concerns about the system on Twitter Wednesday night. “This sort of tool can be a boon for finding child pornography in people’s phones,” Green said. “But imagine what it could do in the hands of an authoritarian government?”
LINK
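For anyone curious how the "hashed and compared with a database of known images" part works mechanically, here's a rough sketch. This is an illustration only, not Apple's actual method: real CSAM scanners use perceptual hashes (Apple's is called NeuralHash) so that resized or re-encoded copies still match, whereas the SHA-256 stand-in below matches exact bytes only. The `threshold` parameter mirrors the article's "once a certain number of photos are marked as suspect" rule.

```python
import hashlib

def photo_hash(photo_bytes: bytes) -> str:
    # Stand-in for a perceptual hash: SHA-256 only matches byte-identical
    # files, but it shows the lookup-against-a-known-list mechanic.
    return hashlib.sha256(photo_bytes).hexdigest()

def scan_library(photos, known_hashes, threshold=3):
    """Return (flagged, matches).

    An account is flagged only once the number of suspect photos
    reaches `threshold`; individual matches alone do nothing, which
    is the point of the article's 'safety voucher' counting scheme.
    """
    matches = [p for p in photos if photo_hash(p) in known_hashes]
    return len(matches) >= threshold, matches
```

Note the key design point the article glosses over: everything on the user's side is a blind hash comparison, and human review only enters once the match count crosses the threshold.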
This post was edited on 8/5/21 at 4:06 pm
Posted by CrimsonFever
Gump Hard or Go Home
Member since Jul 2012
17911 posts
Posted on 8/5/21 at 4:01 pm to
Good.
Posted by red sox fan 13
Valley Park
Member since Aug 2018
15325 posts
Posted on 8/5/21 at 4:01 pm to
1984
Posted by MrJimBeam
Member since Apr 2009
12235 posts
Posted on 8/5/21 at 4:02 pm to
I'm all about catching predators, but you know...WCGW
Posted by TigerMan79
Lake Charles
Member since Jul 2014
808 posts
Posted on 8/5/21 at 4:03 pm to
Good? You are one pathetic person. I honestly feel sorry for you
Posted by TomBuchanan
East Egg, Long Island
Member since Jul 2019
6231 posts
Posted on 8/5/21 at 4:03 pm to
(no message)
This post was edited on 11/8/23 at 2:02 am
Posted by SouthernStyled
Member since Apr 2021
1307 posts
Posted on 8/5/21 at 4:04 pm to
Man if only there were other options for smart phones.
Posted by CrimsonFever
Gump Hard or Go Home
Member since Jul 2012
17911 posts
Posted on 8/5/21 at 4:04 pm to
Are you a child abuser? I don't abuse children so I have nothing to worry about if they look for those images on my phone.
Posted by cable
Member since Oct 2018
9614 posts
Posted on 8/5/21 at 4:04 pm to
so some dweeb out in California might be looking at the dirty pics my gf sends me? frick that.
Posted by NYCAuburn
TD Platinum Membership/SECr Sheriff
Member since Feb 2011
57002 posts
Posted on 8/5/21 at 4:05 pm to
quote:

a team of human reviewers


Going to be just like the team of reviewers for Google Home and Amazon Echos. Never any issues with those....
Posted by MrJimBeam
Member since Apr 2009
12235 posts
Posted on 8/5/21 at 4:05 pm to
quote:

Are you a child abuser? I don't abuse children so I have nothing to worry about if they look for those images on my phone.



It's sad you don't see that this is big government infringing further and further into our lives. Information is power, for good and for bad.

I say big government as a Freudian slip, but I'm probably not wrong
This post was edited on 8/5/21 at 4:07 pm
Posted by TigerMan79
Lake Charles
Member since Jul 2014
808 posts
Posted on 8/5/21 at 4:06 pm to
Nothing funny about this
Posted by back9Tiger
Mandeville, LA.
Member since Nov 2005
14123 posts
Posted on 8/5/21 at 4:06 pm to
Does not baffle me one bit that a Bama fan does not understand the entire picture here....not one bit.
Posted by Clockwatcher68
Youngsville
Member since May 2006
6897 posts
Posted on 8/5/21 at 4:06 pm to
quote:

Good.


That’s not as effective as preemptively locking everybody up though. Just to be safe… since they can’t read our minds… yet.
Posted by red sox fan 13
Valley Park
Member since Aug 2018
15325 posts
Posted on 8/5/21 at 4:06 pm to
quote:

so some dweeb out in California might be looking at the dirty pics my gf sends me? frick that.
Posted by WaWaWeeWa
Member since Oct 2015
15714 posts
Posted on 8/5/21 at 4:07 pm to
quote:

Are you a child abuser? I don't abuse children so I have nothing to worry about if they look for those images on my phone.


What if I have a picture of my 3 year old playing in the bath? I’m not worried about me but do you really want some weirdo in California looking at your private pictures and making some type of moral decision?
Posted by CrimsonFever
Gump Hard or Go Home
Member since Jul 2012
17911 posts
Posted on 8/5/21 at 4:07 pm to
They can look all they want, I don't have anything to hide from them. I hope they catch the people that do have something to hide if they're a terrorist, child abuser, sex trafficker, etc.
This post was edited on 8/5/21 at 4:08 pm
Posted by rattlebucket
SELA
Member since Feb 2009
11411 posts
Posted on 8/5/21 at 4:08 pm to
Privacy and Home Ownership gone in the same day
Posted by DemonKA3268
Parts Unknown
Member since Oct 2015
19177 posts
Posted on 8/5/21 at 4:08 pm to
quote:

What if I have a picture of my 3 year old playing in the bath? I’m not worried about me but do you really want some weirdo in California looking at your private pictures and making some type of moral decision?


Bingo
Posted by LSU2001
Cut Off, La.
Member since Nov 2007
2388 posts
Posted on 8/5/21 at 4:08 pm to
Problem is some folks deeply believe that pics like below are child porn or evidence of child abuse.