I am actually in favour of the approach Apple is advancing, as it moves the locus of analysis from where it is intrinsically insecure (the server) to where it can be more secure (the device). This is not only applicable to CSAM but is quite generally the case. The privacy and related concerns do not arise from the device performing an on-device search, but rather from it reporting the results of such a search. Apple is framing this as a feature rather than as something built to comply with a CSAM mandate.
BUT there is an interesting legal problem which isn't being given much coverage at all: the proposed principal application involves disseminating CSAM (in derived, hashed form) to user devices.
Under many legal systems, disseminating child sexual abuse content for any purpose requires very specific authorisations. In SA, for example, the FPB framework requires immediate reporting and non-dissemination once CSAM is identified. While the data Apple disseminates to its devices may not be accessible to users in a form in which it can be consumed as CSAM, the statutory prohibitions on distribution aren't universally (or even generally) drafted to distinguish between CSAM which a human can consume and derived data which no human can.
Beyond CSAM, the approach aligns well with Apple's general strategy of presenting itself as affording users privacy and ownership of their data, while not painting a target on itself by making it trivial to break Apple encryption. Suppose a person is accused of harbouring trade-secret information on their Apple device. At present the courts (in a civil matter) or law enforcement proceed through Anton Piller orders and "breaking in" to the device, exposing all of its contents to whatever eyes are prying, and thereby also creating an incentive for efforts to undermine Apple's on-device encryption initiatives. If, on the other hand, Apple has perfected on-device scanning for unlawfully possessed content, the device downloads the relevant hashes of what it has to look for, performs the search, and, if it finds something, raises an on-device notification. This allows for narrow orders rather than fishing expeditions. The incentives for various industries to build more heavily on Apple are huge, and the incentive to undermine Apple encryption is reduced (reducing what they spend in a cat-and-mouse scenario).

In theory your MacBook could tell you it thinks you have copyright-infringing material in your music library and give you the option to purchase it via the store; because the search is on-device, with the device pulling from a database, there is no third-party disclosure baked in. Through all of this Apple gets to market features that protect your on-device private data, while companies can roll out Apple devices with an app installed that tells the user when they've got material tied to a specific company NDA, with the notification settings living in the app on the device. It pushes everything away from the cloud, where Alphabet, Microsoft and Amazon can be king, to hardware and a software ecosystem, which is Apple's strong suit.
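The loop described above — device downloads a hash list, scans its own storage, and only notifies locally on a match — can be sketched roughly as below. Everything here is hypothetical illustration: a plain SHA-256 stands in for a perceptual hash like NeuralHash, and the threshold/private-set-intersection machinery of Apple's real design is omitted.

```python
import hashlib
from pathlib import Path

def file_hash(path: Path) -> str:
    """Hash a file's bytes. A stand-in for a perceptual image hash;
    a real system would match near-duplicates, not exact bytes."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def scan_device(library_dir: Path, flagged_hashes: set[str]) -> list[Path]:
    """Compare every file in the local library against the downloaded
    hash list. Matching happens entirely on-device; only a local
    notification would result -- no third-party disclosure is baked in."""
    return [p for p in sorted(library_dir.rglob("*"))
            if p.is_file() and file_hash(p) in flagged_hashes]
```

A narrow court order in this model amounts to adding a specific hash to `flagged_hashes` rather than imaging the whole device — which is exactly why the contents of that list, and who controls it, carry all the policy weight.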
The issue is that the same mechanism can be pointed at other content. The example often brought up is the Winnie the Pooh meme used to mock China's leader: Apple could be ordered to compare against that image, and it would then surface anyone who has it.
The ramifications of this feature are just a bit too huge. It's great in that your own images stay private, but for any publicly circulating image it makes it easier to find content other than CSAM. I'm fine with this on a public surface like FB walls and the like — that makes sense — but not for private storage.
I think this is probably the only way Apple can see to make things E2E-encrypted on their servers, since they would then no longer need to scan server-side — which is the one place where they have a mandate. They have no mandate over what's on your phone, or over anything not stored on their servers.
And it also enables profiling: take a picture that only people with a certain background would share, and now you can track all of them down.