Regardless of the larger conversation, why do you think that adding friction wouldn't stop people from engaging in an activity? I mean, generally speaking, A/B testing is used to find the path to the most engagement and doesn't get this kind of pushback... but as soon as you want to make something harder, you get people saying "but that won't do anything." It demonstrably does! We call things 'trivial to circumvent' in a world where tech support regularly gets called out just to plug something back in, and where "have you checked the power button and the power cord" has been a common trope for over two decades!
I'm genuinely curious about your thoughts, but to be clear, my focus is on this very narrow, nigh-nitpick tangent.
> Regardless of the larger conversation, why do you think that adding friction wouldn't stop people from engaging in an activity?
See the war on drugs, the war on piracy, or alcohol prohibition, for instance.
Now there's a million ways to share pictures online in a manner that bypasses the few big platforms' interference, and you really don't need to be a genius to use (say) password-protected archives. That's how a lot of casual online piracy happens.
This thing does very little to prevent spreading CSAM pictures, and it does nothing to prevent actual child abuse.
Well, those things actually were pretty effective at stopping people. Some people clearly circumvented the restrictions, but from my perspective the number of people who smoke weed probably went up after it became easy to access in my state, and the quality and variety of products increased enormously. The bans weren't effective at achieving any broader aim, like stopping or slowing drug production, but frankly that wasn't even the point. Regardless, I think we can both agree that these broad bans decreased the number of people participating in that behavior. I think the shift in our society's viewpoint on them came about as an awareness that they were ineffective at achieving any moral or justifiable aim. So the correct criticism of them isn't that they failed to prevent or impede a given behavior, but that they weren't worth the negative externalities (if you'll forgive the understatement).
Which brings me to your example of a password-protected archive. I'm ignorant of the specifics, so I'm just going to sketch a broad argument and trust that you'll correct me if my premise is incorrect or my argument otherwise flawed. Essentially, if something is opt-in instead of opt-out, a non-trivial portion of the population won't do it. Especially if it's even slightly technical: there are just a lot of people who stop thinking as soon as they encounter a word they don't already know. So if the preventative measure is not automatic and built into whatever tool they're using to share images, then that measure will not protect most of them. To bring it full circle, I think scanning would do much to impede the spread of CSAM, because other bans have been effective, and I don't think most people have the technical literacy to even be aware of how to protect themselves from surveillance. As you say, you don't have to be a genius, but I'd suggest you'd need to be above average, which rules out over half the population.
Also, thanks for responding. I hope I'm not coming across as unpleasantly argumentative; I mean all this in the most truth-seeking sense of a discussion (I was going to say 'sporting sense of a debate', but then I realized I had never been part of a formal debate team and that phrase might not mean what I thought it meant, heh).
Sorry, I haven't had the time to write a thoughtful reply until now.
Are you talking about a state where cannabis was decriminalized?
I agree that ease of access does have an effect on people, but the effect of iCloud scanning is so marginal in the grand scheme of things that it's almost like fighting drugs by installing surveillance cameras in malls: people just go trade and smoke elsewhere. The friction added is virtually zero, while the privacy cost of scanning on half a billion Apple devices far outweighs it.
It's worth keeping in mind that CSAM is already highly illegal and banned; whether Apple scans iCloud photos makes no difference on that front. So it's nothing like the difference between weed being decriminalized or not.
Also, the fact is you already have to jump through hoops to obtain CSAM. It's very rare to stumble upon it being casually shared online (the last times I witnessed it were around 15 years ago on 4chan and somewhere between 5 and 10 years ago in a spam attack on freenode). Trying to search for it on the clearnet mostly yields nothing.
In general, people also tend to know when they're doing something highly illegal, and yet they still do it, just taking steps to try to avoid being caught. No difference with CSAM: they will jump through hoops, and "don't store child porn on iCloud" is the tiniest of hoops to jump, for real.
Password-protected archives were meant as just one example of how to bypass scanning on cloud platforms, and one that happens to be widely used among casual pirates. Google Drive might be one of the biggest pirate services around these days. The bigger point I was trying to make is that there are countless ways to share files without exposing their contents to scanning; none of that is any obstacle for people who are willing to jump through hoops to get CSAM.
Finally, one point I've had to try to make over and over again is that detecting the storage or distribution of old (catalogued) CSAM photos is only very tangentially related to actual abuse of children. Unfortunately, that abuse keeps happening even if you destroy the internet and make sure no photo is ever stored in the cloud again.
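(To make the "catalogued" point concrete, here's a minimal sketch of how this kind of scanning works. I'm using the open-source imagehash and Pillow packages purely as stand-ins; real deployments use proprietary perceptual hashes like Microsoft's PhotoDNA or Apple's NeuralHash, but the shape of the check is the same: hash the image, then test for membership in a fixed database of hashes of already-known images. The hash value below is a made-up placeholder, not a real catalogue entry.)

    # Minimal sketch of catalogue-based matching with a perceptual hash.
    # Assumes the third-party "imagehash" and "Pillow" packages.
    from PIL import Image
    import imagehash

    # Placeholder catalogue: perceptual hashes of already-known images.
    KNOWN_HASHES = {imagehash.hex_to_hash("d879f8f89b939b93")}

    def is_catalogued(path, max_distance=4):
        """True only if the image is a near-duplicate of a catalogued one."""
        h = imagehash.phash(Image.open(path))
        # Hamming distance tolerates re-encoding or resizing of a known
        # image, but a genuinely new image almost never lands within range.
        return any(h - known <= max_distance for known in KNOWN_HASHES)

The point being: the check can only ever say "this matches something we already knew about." A brand-new image hashes to something nowhere near the database, which is exactly why detecting catalogued material says nothing about new abuse.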
I've said it before: child abuse and violence existed before cameras and the internet, and they will continue to exist. Detecting images of abuse or violence is not going to stop the abuse or violence itself.
And if someone builds a system that is effective at detecting all catalogued (i.e. old) images of CSAM, that might just create a larger market for "fresh" (uncatalogued) child abuse material. Credit to nullc for realizing this.
And on the protecting-the-children front, there are much bigger problems than stashes of old CSAM, like grooming, or chatrooms where child prostitutes are forced to stream for an audience...