Apple confirms it will begin scanning iCloud Photos for child abuse images

biometrics

Well-Known Member
Joined
Oct 17, 2019
Messages
20,267
Later this year, Apple will roll out a technology that will allow the company to detect and report known child sexual abuse material to law enforcement in a way it says will preserve user privacy.


Apple intends to install software on American iPhones to scan for child abuse imagery, according to people briefed on its plans, raising alarm among security researchers who warn that it could open the door to surveillance of millions of people’s personal devices.


Many articles on this. They will scan images on iCloud and on your phone. Obviously if you aren't their target it shouldn't matter. But in my experience of biometrics and AI, false positives are a regular occurrence.

This doesn't sit well with me. How do you feel about this?
 

Venomous

Well-Known Member
Joined
Mar 4, 2020
Messages
1,349




Many articles on this. They will scan images on iCloud and on your phone. Obviously if you aren't their target it shouldn't matter. But in my experience of biometrics and AI, false positives are a regular occurrence.

This doesn't sit well with me. How do you feel about this?
While I don't agree with what happens to any data "in the cloud"... people often agree to terms they either turn a blind eye to or simply couldn't be arsed to read.

These terms very often include a variety of things that can be done to your data....
 

biometrics

Well-Known Member
Joined
Oct 17, 2019
Messages
20,267
This is not going down unnoticed. I hope this blows up big time.
 

biometrics

Well-Known Member
Joined
Oct 17, 2019
Messages
20,267
Remember, this is the equivalent of your personal hard drive scanning your data, scoring it, then comparing it to some online database and reporting you if triggered.
 

Venomous

Well-Known Member
Joined
Mar 4, 2020
Messages
1,349
Remember, this is the equivalent of your personal hard drive scanning your data, scoring it, then comparing it to some online database and reporting you if triggered.
Never said I agree with their policies on data handling.

People often believe their "stuff in the cloud" is untouchable even when they've agreed to data-handling terms.
 

mikewazar

Member
Joined
Aug 4, 2021
Messages
49
Location
Gauteng
This opens the door just enough for an absolute sh*t storm since Apple already buckled under the Chinese iCloud fiasco (https://support.apple.com/en-hk/HT208351)

iCloud in China mainland is operated by GCBD (AIPO Cloud (Guizhou) Technology Co. Ltd). This allows us to continue to improve iCloud services in China mainland and comply with Chinese regulations.

iCloud services and all the data you store with iCloud, including photos, videos, documents, and backups, will be subject to the terms and conditions of iCloud operated by GCBD.

 

scudsucker

Well-Known Member
Joined
Jun 16, 2020
Messages
1,517
Slightly off topic: my sister once sent a pic of her son, aged 2 at that point, naked and playing in the garden, to a family whatsapp group.

My uncle, the Catholic priest, could not leave that group fast enough, and was super pissed off with her - imagine if he was caught with that on his phone!

But I have a number of absolutely innocent naked pics of my kids stored in Google Photos and I'd be pretty pissed off if they got deleted or I got flagged as a paedophile.
 

mikewazar

Member
Joined
Aug 4, 2021
Messages
49
Location
Gauteng
Slightly off topic: my sister once sent a pic of her son, aged 2 at that point, naked and playing in the garden, to a family whatsapp group.

My uncle, the Catholic priest, could not leave that group fast enough, and was super pissed off with her - imagine if he was caught with that on his phone!

But I have a number of absolutely innocent naked pics of my kids stored in Google Photos and I'd be pretty pissed off if they got deleted or I got flagged as a paedophile.

Remember, the photo hashes are compared to a known list of CSAM material (provided by whom...?), so unless you're storing literal CP it won't flag you. The issue is that building the infrastructure to do this on-device means the scope of what counts as abuse material can only be broadened, and the only thing stopping that is Apple's pinky promise not to scan for anything other than CSAM.
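
To make the matching step concrete, it's conceptually just set membership on image hashes. A minimal sketch in Swift, with a hypothetical loadKnownHashDatabase() and an ordinary SHA-256 digest as a stand-in; Apple's actual system uses NeuralHash, a perceptual hash that survives resizing and recompression, plus a blinding step, so this is only the shape of the idea:

```swift
import Foundation
import CryptoKit

// Hypothetical loader -- stands in for the hash table that would ship inside iOS.
// In the real design the entries are blinded NeuralHashes, opaque to the device owner.
func loadKnownHashDatabase() -> Set<String> {
    return [] // placeholder
}

// SHA-256 of the file bytes, used here purely as a stand-in for a perceptual hash.
func hashForPhoto(at url: URL) throws -> String {
    let data = try Data(contentsOf: url)
    return SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
}

// The "scan" is nothing more than checking membership in the known-hash set.
func isKnownMatch(_ url: URL, against known: Set<String>) -> Bool {
    guard let digest = try? hashForPhoto(at: url) else { return false }
    return known.contains(digest)
}
```

A cryptographic hash like the one above only catches exact file copies; that's exactly why the real system uses a perceptual hash, and it's also where the false-positive worry raised earlier comes in, since perceptually "close" images can collide.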

In the past Apple has shown it will follow money rather than morals/ethics (see the example above).
 

Paul Hjul

Well-Known Member
Joined
Apr 9, 2020
Messages
483
I am actually in favour of the approach to the problem which Apple is advancing, as it moves the locus of analysis from where it is intrinsically insecure to where it can be secure(r). This is not only applicable in respect of CSAM but is quite generally the case. The privacy and related concerns do not arise from the device performing an on-device search but rather from it reporting the results of that search. The way Apple is framing this is as a feature rather than as something done to comply with a CSAM mandate.

BUT there is an interesting legal fiasco which isn't being given much coverage at all, namely that the proposed principal application involves disseminating CSAM.

Under many legal systems, disseminating child sexual abuse content for any purpose requires very specific authorizations and so on. In SA, the FPB framework for example requires immediate reporting and non-dissemination when CSAM is identified. While the data Apple would be disseminating to its devices may not be accessible to users in a form in which it can be consumed as CSAM, the statutory prohibitions on distribution aren't universally (or even generally) written to distinguish between CSAM a human can actually view and derived data such as hashes.

Beyond CSAM, the approach aligns well with Apple's general strategy of presenting itself as affording the user privacy and ownership of their data, while reducing the incentive for outside parties to break Apple's encryption. Suppose a person is accused of harbouring trade-secret information on their Apple device. At present the course of action by the courts (as a civil matter) or law enforcement is through Anton Piller orders and having to "break in" to the device, exposing all of its contents to whatever eyes are prying - thereby also creating an incentive for efforts to undermine Apple's on-device encryption. On the other hand, if Apple has perfected on-device scanning for identification of unlawfully possessed content, the device downloads the relevant hash of what it has to look for, does the search, and if it finds something it performs an on-device notification. This allows for narrow orders rather than fishing expeditions.

The incentives for various industries to build more heavily on Apple are huge, and the incentive to undermine Apple's encryption is reduced (cutting what is spent in a cat-and-mouse scenario). In theory your MacBook could tell you it thinks you have copyright-infringing material in your music library and give you the option to purchase it via the store; because the search runs on the device, pulling from a database, there is no third-party disclosure baked in.

For all of this, Apple gets to market features that protect your on-device private data, and companies can roll out Apple devices with an app installed that tells the user when they've got material tied to a specific company NDA, with the notification settings living in the app on the device. It pushes everything from the cloud, where Alphabet, Microsoft and Amazon can be king, to hardware and a software ecosystem, which is Apple's strong suit.
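
A minimal sketch of that local-only flow, with hypothetical names; the key point is that the match result stays on the device unless a reporting step is deliberately bolted on:

```swift
import Foundation
import CryptoKit

// Hypothetical local-only search: hash each file and compare against a narrowly
// scoped list supplied under a court order. Nothing here leaves the device.
func scanLibrary(_ urls: [URL], against targetHashes: Set<String>) -> [URL] {
    urls.filter { url in
        guard let data = try? Data(contentsOf: url) else { return false }
        let digest = SHA256.hash(data: data)
            .map { String(format: "%02x", $0) }
            .joined()
        return targetHashes.contains(digest)
    }
}

// The caller decides what happens with matches -- e.g. show the user a local
// notification. Third-party disclosure only exists if code is added to send
// the result somewhere, which is exactly the contested part.
```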
 

Johnatan56

Well-Known Member
Joined
Jun 22, 2020
Messages
1,530
Location
Vienna
The issue is that the same mechanism can be pointed at other things; the example often brought up is Winnie the Pooh imagery in connection with China's leader. Apple could be ordered to compare against that as well, and it would then show up that someone has that image.

The ramifications of this feature are just a bit too huge. It's great in the sense that your own images stay private etc., but if it's a known public image, it makes it easier to find content other than CSAM. I'm fine with this on something public like Facebook walls - that makes sense - but not for private storage.

I think this is probably the only way Apple can see to make iCloud end-to-end encrypted, since they would no longer need to scan content on the server, which is where they have a mandate; they have no mandate over what's on your phone or anything not stored on their servers.

And you can also do profiling, e.g. a certain picture that only people from a certain background would share - now you can track all of those people down.
 

Paul Hjul

Well-Known Member
Joined
Apr 9, 2020
Messages
483
The presumption being made is that Apple is dictating or controlling which databases the devices pull data from. If Apple is compelled to add Winnie the Pooh to its image base, that doesn't mean all devices pull from that database. The issue would arise if the device were locked to a particular externally controlled database.

Take out the reporting-back functionality and most, if not all, of the concerns about the core approach evaporate.

The point I am making - and how I see this fitting within Apple's approach to its products - is that Apple wants to move as much control as possible to devices. They want devices to put encrypted data onto their cloud, but they also want devices to tell them if that encrypted data will be a problem. Therefore, to use iCloud, your device needs to certify in advance of uploading that the data isn't CSAM.
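
A rough sketch of that "certify before upload" idea, with hypothetical names. In Apple's actual design the device attaches an encrypted safety voucher it cannot read itself, and private set intersection plus threshold secret sharing mean even Apple learns nothing until a threshold of matches is reached; the plain Bool here is only to show where the check sits in the flow:

```swift
import Foundation
import CryptoKit

// Hypothetical stand-in for the encrypted safety voucher.
struct UploadVoucher {
    let assetID: UUID
    let matchedKnownHash: Bool
}

// Runs on the device, before the photo ever leaves the phone: the upload is
// accompanied by the device's own certification of whether the hash matched.
func prepareForUpload(photoAt url: URL, knownHashes: Set<String>) -> UploadVoucher {
    let digest = (try? Data(contentsOf: url)).map { data in
        SHA256.hash(data: data).map { String(format: "%02x", $0) }.joined()
    } ?? ""
    return UploadVoucher(assetID: UUID(), matchedKnownHash: knownHashes.contains(digest))
}
```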
 

Johnatan56

Well-Known Member
Joined
Jun 22, 2020
Messages
1,530
Location
Vienna
Take out the reporting-back functionality and most, if not all, of the concerns about the core approach evaporate.
Which would mean they're not detecting CSAM anymore, and they've just proved they can do this kind of scanning if they want to, so they won't be able to back out of it anymore.
 

Düber

Well-Known Member
Joined
Jul 20, 2020
Messages
1,533
This is definitely one of those things where the aims are noble but the method has me feeling uncomfortable. People that are in the Apple system have some tough choices to make.

Just a question, would a jailbroken (?) phone be able to bypass this?
 

biometrics

Well-Known Member
Joined
Oct 17, 2019
Messages
20,267
This is definitely one of those things where the aims are noble but the method has me feeling uncomfortable. People that are in the Apple system have some tough choices to make.

Just a question, would a jailbroken (?) phone be able to bypass this?
Don't think so, will be baked into the OS.

I don't have a problem with the noble intention, but what next? Governments insisting that their own databases of hashes be scanned for and reported on.

Fuck this shit.
 

Düber

Well-Known Member
Joined
Jul 20, 2020
Messages
1,533
I don't have a problem with the noble intention, but what next? Governments insisting that their own databases of hashes be scanned for and reported on.

Fuck this shit.
Exactly that, it has a very Orwellian feel about it.
The worst part is that the actual perpetrators will move on to some other means, leaving everyone else stuck with it.
 
