Apple's new user espionage protocol: I can't wait for all the abuses to start

I’m sure many of you are familiar with rule 34 of the Internet. The scope creep here can get really sticky: it’s very easy to say “oh, this is also objectionable, we should check for this too…” For example, if someone draws something sexual featuring a Disney character (and Disney characters frequently have a young audience), I have to wonder whether Disney won’t say “Apple, you need to root out these things too.” Then someone makes a sexual parody of some politician, and that could be called abusive, and oh boy, we’d better root that out too. I am convinced they won’t be able to reject every request for “just one more thing” that needs to be detected and eliminated… because that’s how the thought police work.

Here’s a real example from my youth: in grade 10 I used to hang around with a talented artist. When I first got to know him, he used to draw Garfield strips at least as well as the original author (in my opinion at the time). One day we were messing around at a table in the library (when we should have been studying), and he drew something sexual involving Garfield characters. I honestly don’t even remember what it was; I just remember we all had a laugh and then he crumpled it up and threw it away. Imagine his surprise when someone hauled it out of the garbage and reported it to the school administration. He got kicked out of school for that (which ended our friendship :frowning: )

From my story above: I feel like there needs to be room for unpopular thought in private, on my own device in my own private space. Depending on what it is, it should be blocked from sharing, but its merely existing on my device doesn’t necessarily mean I won’t eventually reject that line of thinking (i.e. grow up/out of it). It’s probably no different than a child’s right to privacy in their personal diary… mom and dad don’t need to be the thought police. Catch bad actors, but be very wary about trying to police bad thought.

I thought it was just in iCloud… my mistake

No, the checks are local on the device, but only active if you use iCloud Photo storage.

I think the fact that you authorize iCloud means you are also authorizing this inspection on your phone. I don’t think the phone scanning will be limited to just files that go to the cloud, but will include any file on the phone. Basically, by enabling iCloud, you’re paying Apple to spy on your phone. I’m sure they argue they don’t have access to your files in iCloud, so they have to do the scanning at the source. This seems like a big lie though, given that they have given China access to pretty much everything in their cloud. Like you suggest, there is more going on here, and it’s probably their attempt to sow as much confusion as they can, to give them as much cover as they can get.

Okay, here’s yet another explanation. There is so much confusion around this, and I don’t understand why it’s been reported in so many different ways.

They released an FAQ

This part makes me feel a little better:

“Could governments force Apple to add non-CSAM images to the hash list?

Apple will refuse any such demands. Apple’s CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future. Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it. Furthermore, Apple conducts human review before making a report to NCMEC. In a case where the system flags photos that do not match known CSAM images, the account would not be disabled and no report would be filed to NCMEC.”

This addresses the concern that I had when I spoke to @Leo, but I am still reluctant to believe that they would choose to pull out of major markets just to keep their promise.

Also worth noting that “… other child safety groups” can mean many things. Russia’s anti-gay law that passed in 2013 did so to “protect children” from homosexual imagery and to prohibit the distribution of “propaganda of non-traditional sexual relationships.”

That is what I said, as soon as the news hit.

The downside of that is that they couldn’t warn minors when they are sending/receiving nude images - but that has nothing to do with the CSAM database…

Facebook and Google have been using it for several years and there haven’t been any stories about people being arrested on the basis of false positives.

That is definitely the problem with this!

If you have iCloud photos turned off, but “my photo stream” on, is or will Apple scan those photos?

They will scan them locally to provide recommendations (“memories”) but that data doesn’t leave your phone.

Thanks for the info. I guess they’ve had access all along. Don’t know why I should be surprised. They’ve been sorting by faces for a long time now also. Microsoft and Google do it too. Do you think turning off Photo Stream would do any good? I suppose turning off iCloud might help. I am just feeling disheartened about the loss of the privacy that Apple has promised through the years. It has always creeped me out when I get notified that they have a memory for me to look at in my photo library.

Depends on what you are trying to do. If the CSAM scanning worries you, disabling iCloud photos will do the trick.

Yes, but that just provides tags to the photos on the device itself. It doesn’t send Apple a message, along with the photos, when you’ve received more than 30 images of your granddaughter… (We’ve taken or received around 800 photos of the granddaughter so far this year.)
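
The distinction being drawn here - local tagging for your benefit versus threshold-gated reporting against a known-hash database - can be sketched as a toy check. This is a simplified illustration, not Apple’s actual NeuralHash or private set intersection protocol: the hash values, function names, and the plain set lookup are all stand-ins, and real perceptual hashes tolerate resizing and recompression.

```python
# Toy sketch: matches are only counted against a database of *known* hashes,
# only when iCloud Photos is enabled, and nothing is flagged for human
# review until a match threshold is crossed. All names here are hypothetical.

KNOWN_HASHES = {"hash_a", "hash_b", "hash_c"}  # stand-in for the CSAM hash list
REPORT_THRESHOLD = 30  # matches required before anything is flagged for review


def scan_library(photo_hashes, icloud_photos_enabled):
    """Count database matches and decide whether the account gets flagged."""
    if not icloud_photos_enabled:
        # Per Apple's FAQ, the check is inactive without iCloud Photos.
        return {"matches": 0, "flag_for_review": False}
    matches = sum(1 for h in photo_hashes if h in KNOWN_HASHES)
    return {"matches": matches, "flag_for_review": matches >= REPORT_THRESHOLD}


# 800 family photos that match nothing in the database never trigger a flag,
# no matter how many there are:
family_photos = [f"family_{i}" for i in range(800)]
print(scan_library(family_photos, icloud_photos_enabled=True))
```

The point of the sketch is that volume alone does nothing; only hashes already in the database count toward the threshold - which is also exactly why the contents of that database matter so much.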

That is the problem bit. Adding names is for your benefit. Reporting you to the police is not for your benefit - and, while CSAM is a great cause, it is the opening of the floodgates that is the problem.

Apple has demonstrated that it can do this, so it will be pressured to do more, especially by more restrictive regimes - like Romania, which implemented anti-LGBT laws earlier this year. Any LGBT photos or chat about LGBT topics might become reportable in that country. “Update your database or your products are banned in our country!”

Romania? Okay, a small country, probably not a major profit centre for Apple, but China? Russia? Europe? Germany has twice tried to get the Staatstrojaner (state trojan) installed on all devices, but it was rejected by the constitutional court. They can now install it on devices once they get a warrant - which I actually have no issue with; probable cause is fine for installing spyware, with a warrant. But to install it on every device? No way!

This Apple system could provide the access that they need. Again, if it means they can get access to the device with a warrant, fine; and if it doesn’t involve installing malware to do it, even better. But the risks of mass surveillance are too great.

Am I going to throw my new iPad in the bin? No. Not yet, anyway.

And I was looking at switching back to an iPhone this year, but now I’m in a wait-and-see mode.

Unfortunately, I don’t know that the alternative is any better.

To my knowledge, Android does not actively spy on its users for the government despite the presence of privacy issues in Google’s bundled services.

Users who are extremely concerned about privacy can opt for any of several de-Googled Android builds. That comes with extra effort, of course, and may not have all the functionality some people would prefer.

Yeah, that’s the problem. A lot of the functionality and apps worth running require Google Play Services.

It’s on my to-do list to see whether, using microG, I’d lose anything I can’t live without.

I think that it depends on what you mean by “Android.” If by Android you mean AOSP, sure.