TWIG 624: The Ladies' Menu

Beep boop - this is a robot. A new show has been posted to TWiT…

What are your thoughts about today’s show? We’d love to hear from you!

@leo I’m afraid you erred in your recap of Apple’s new initiatives. You said the first was iCloud CSAM hash scanning, the second was on-device CSAM hash scanning, and the third was scanning kids’ devices for pornographic images, then warning the kids and notifying the parents if the kid chooses to view anyway.
I believe the initiatives are actually as follows:

  1. An update to Search and Siri that will surface resources if someone searches for topics related to child sexual abuse.
  2. Opt-in parental controls in the Messages app that scan photos for explicit content, blur the image, and provide a warning with an option to view; if the child is 12 or under and chooses to view, the parents get an alert. This feature uses on-device machine learning to determine explicitness; Apple never gets access to the images.
  3. On-device CSAM hash scanning (only if the user has iCloud Photos enabled), which triggers an alert to Apple for human review once a match threshold is met; if manual review confirms CSAM, Apple disables the user’s account and files a report with NCMEC. (A rough sketch of the threshold mechanic follows below.)
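
To make the threshold mechanic in point 3 concrete, here is a deliberately rough sketch. Apple’s published design actually uses NeuralHash plus cryptography (private set intersection and threshold secret sharing) so that nothing is learned about matches below the threshold; the toy version below just counts plaintext hash matches, and every name in it is hypothetical.

```swift
import Foundation
import CryptoKit

/// Stand-in for a perceptual image hash. A real perceptual hash survives
/// resizing and recompression; SHA-256 is used here only so the sketch runs.
func toyImageHash(_ imageData: Data) -> String {
    SHA256.hash(data: imageData)
        .map { String(format: "%02x", $0) }
        .joined()
}

/// Toy model of initiative 3: count matches against a known-hash database and
/// flag for human review only once a threshold is crossed.
struct ThresholdMatcher {
    let knownHashes: Set<String>   // hypothetical database of known CSAM hashes
    let reviewThreshold: Int       // illustrative; Apple described a threshold around 30
    var matchCount = 0

    /// Called only for photos queued for iCloud upload. Returns true once
    /// enough matches have accumulated to trigger manual review.
    mutating func scanBeforeUpload(_ imageData: Data) -> Bool {
        if knownHashes.contains(toyImageHash(imageData)) {
            matchCount += 1
        }
        return matchCount >= reviewThreshold
    }
}
```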

There are a few issues with the discussion of the 4th Amendment as it applies to Apple. Apple is a private actor, not the Government. Just as the First Amendment doesn’t require Facebook, Twitter, etc. to host your speech, the 4th Amendment does not protect you against a private search by Apple.

I don’t think there is anything wrong with analogizing between Apple’s actions and what law enforcement can do under the 4th Amendment, but they are very different issues.
As you noted, Facebook already runs these scans in the cloud, as do Dropbox, Microsoft, Google, etc., at least for data stored in the cloud. While Apple’s runs on device, it only runs IF photos are syncing to iCloud. So we are talking about scanning data that is on its way to Apple’s servers.

There are arguments being made that companies doing these scans are Government actors because they are being compelled by the Government to run the scans. However, I’m not yet aware of any case where a judge has agreed with that reasoning.

Is on-device scanning a slippery slope? Maybe. But it’s not really that different from what all of the other companies are doing on their services: plenty of messaging apps run scans on our messages, and file-sharing services run these scans on the files we share.
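
For comparison, the cloud-side checks those services run are conceptually even simpler: hash each file at upload time and look it up in a database of known material. A minimal sketch, assuming a hypothetical knownHashes set; real services use robust perceptual hashes such as PhotoDNA rather than SHA-256, which appears here only to keep the example self-contained.

```swift
import Foundation
import CryptoKit

/// Hypothetical server-side check run when a user shares a file. Real services
/// use perceptual hashes (e.g. PhotoDNA) that survive re-encoding; SHA-256 is
/// used here only so the sketch is runnable.
func shouldBlockAndReport(_ fileData: Data, knownHashes: Set<String>) -> Bool {
    let digest = SHA256.hash(data: fileData)
        .map { String(format: "%02x", $0) }
        .joined()
    // Unlike Apple's threshold design, a single match is flagged immediately.
    return knownHashes.contains(digest)
}
```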

That is correct.

It will also be interesting to see what Apple does in Germany. The CSAM technology is US-only at the moment, but it could be used elsewhere.

But in Germany, you cannot force changed terms and conditions onto existing users/customers. So anyone upgrading an iOS 14 device to iOS 15 cannot have this feature automatically turned on; the user has to explicitly agree to the change. Likewise, if the CSAM scan on upload is switched on at a later date, any iOS 15 devices sold before that date are automatically opted out of the check unless the user explicitly agrees to the new terms of service. (The same goes for iPadOS, watchOS and macOS, naturally.)
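
Mechanically, that kind of consent gate is trivial; the point is legal, not technical. Still, here is a minimal sketch of what “off until the user explicitly agrees” looks like, with every name hypothetical:

```swift
import Foundation

/// Sketch of a consent-gated rollout. Everything here is hypothetical; it just
/// illustrates keeping the feature off until the user explicitly agrees.
enum ScanConsent {
    static let requiredTermsVersion = 15   // the terms revision that introduces the scan

    static var acceptedTermsVersion: Int {
        get { UserDefaults.standard.integer(forKey: "acceptedTermsVersion") }
        set { UserDefaults.standard.set(newValue, forKey: "acceptedTermsVersion") }
    }

    /// The scan may run only if the user accepted the new terms. Dismissing
    /// the dialog without accepting leaves this false.
    static var scanningAllowed: Bool {
        acceptedTermsVersion >= requiredTermsVersion
    }

    /// Call when the user taps "Accept" in the terms dialog.
    static func userAcceptedTerms() {
        acceptedTermsVersion = requiredTermsVersion
    }
}
```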

This requirement is something that caught Facebook/WhatsApp out a few years ago. They wanted users to accept new conditions for WhatsApp that were detrimental to the users’ privacy. In most countries it was just a fait accompli: the change simply turned on and the user tapped an “OK” button to accept. In Germany, they had to provide a way to dismiss the dialog without accepting, and until the user accepted, they couldn’t apply the change to that account. It got really annoying after a while (every time you went to the home screen of WhatsApp, it displayed the dialog and you had to dismiss it before you could continue, ISTR), so I expect most people either stopped using WhatsApp or (possibly accidentally) tapped OK.

Apple got into hot water over OS X about a decade ago as well. They sued a maker of Hackintoshes for installing genuine OS X onto non-Apple devices, complaining that this was against the terms and conditions in the OS X license agreement, namely that the OS could only be installed on Apple-branded kit.

The Hackintosh maker pointed out that he was buying retail OS X licenses, and nowhere on the outside of the sealed box was there any restriction on their use.

The court agreed: only those terms available at the point of purchase (the checkout in the electronics store) are enforceable. Any additional terms and conditions inside the sealed carton are not, because the purchaser could not reasonably have been aware of them when they finalized the contract with Apple. The contract is made at the point of purchase, as soon as money changes hands, not when the user clicks past the legal waffle while installing the OS.

That was a big wake-up moment for the industry. A lot of online services used to have you sign up first and only show the legal waffle after the account had been created; any legal waffle that comes after the account is created is null and void. Now most services, at least in Germany, show the legal waffle before the account is created.


What all of this Apple/CSAM debate really is, is another reminder that we don’t own what we own. We THINK we do, but government neglect at every level while electronic hardware and software pivoted from a purchase model to a rental model leaves us in a situation where Apple can safely wait for this to blow over, and no one but a few outliers will decide to move to another platform. Apple is globally too big to fail, as are Facebook, Google, etc.