TWIT 835: A Straw Man Without Legs

Given the evidence so far, the government needs to be watching private islands owned by billionaires if they are serious about protecting the children.
It seems this image cache is just for known images, not new ones, so it’s targeting the collectors and not the producers…on their private islands.


Again, crime is crime, as defined by a society, and I am not advocating for ignoring crime. What I am questioning is what efforts are reasonable versus the results they achieve. It’s about effectiveness versus effort. We could probably eliminate a lot of crime entirely by requiring, say, strip and cavity searches before entering any public space. Would that be a good use of money and effort for the results achieved? Probably not. I am much more concerned about children being abused as child labour than I am that there are rampant cases of children being forced into sexual acts that can be detected or prevented by technology (assuming we’re not willing to be filmed and monitored in every possible way, 24/7).

If the system does find a match, it gets reviewed by a human who makes the final decision, so, as far as I’m concerned, that is a non-issue.

My problem with arguing that it can be abused is that all technology can be misused. The example I often go to is the hammer. You can use a hammer to bash a nail into a wall or to clock someone over the head, but you don’t ban people from owning hammers. Likewise, in the UK, as a result of rising knife crime, the sale of knives is restricted, so maybe the answer is to regulate the use of this technology.

In my mind it is similar to the backdoor debate. Everyone seems to criticise it, but I haven’t heard the industry suggest an alternative solution apart from “send the phone to Apple”.


One option was to not build this capability at all, and instead fully end-to-end encrypt backups to the point where it is impossible to read their contents. That comes with its own problems, but it’s an option.

The problem is, there is no solution. Encryption is mathematically “perfect”: it conceals the stored or transmitted text, and without the private key you can’t view it. Only the recipient or owner of the data has the private key; everybody else has the public key, so they can encrypt things and send them to the key owner, but even with the public key they can’t see what the data contains.
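A minimal sketch of that asymmetry, using the PyNaCl library (any public-key library would show the same thing):

```python
# Minimal sketch of public-key encryption using PyNaCl (pip install pynacl).
# Anyone holding the public key can encrypt; only the private-key holder can decrypt.
from nacl.public import PrivateKey, SealedBox

recipient_key = PrivateKey.generate()   # private key: stays with the owner
public_key = recipient_key.public_key   # public key: handed out freely

ciphertext = SealedBox(public_key).encrypt(b"secret backup data")

# Holding only the public key and the ciphertext gets you nowhere;
# decryption requires the private key.
plaintext = SealedBox(recipient_key).decrypt(ciphertext)
assert plaintext == b"secret backup data"
```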

The only way to get around that is to use a different encryption method that is deliberately “broken” to allow a second private key. The problem is, that second key gets shared with the authorities, then shared again and again, and somewhere along the line the bad guys get hold of it and the whole thing is worthless. Every message ever sent and every piece of data ever encrypted is then no better protected than plain text.

The NSA tried this back in the 90s with the Clipper chip. Luckily, that didn’t go very far: researchers showed its key-escrow mechanism could be bypassed before the chip went into widespread production.

Even if NSA-backdoored encryption is pushed on us plebs, the real criminals already have access to non-backdoored encryption. The mathematics is public and cannot be un-invented. The average citizen would end up using “encryption” that isn’t worth the bytes used to represent it, whilst criminals would still be able to communicate in complete secrecy - just as they did before the computer world came along, using secret codes, dead drops, couriers and so on. It wasn’t as fast or efficient, but that tradecraft has existed for at least 3,000 - 4,000 years.

The only way around this would be something like the system Apple is proposing for CSAM detection and the nudity protection for minors: something on the phone that checks the data directly on the device, before it is encrypted or after it has arrived and been decrypted.

And that should only ever be allowed with a valid warrant.
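To make the idea concrete, here is a simplified sketch of where such an on-device, pre-encryption check would sit. Apple’s actual proposal uses its NeuralHash perceptual hash plus a private set intersection protocol; the plain SHA-256 lookup below only catches byte-identical copies and is purely illustrative, with a hypothetical hash database:

```python
# Simplified sketch of an on-device "scan before encrypt" check.
# A real system would use a perceptual hash, not SHA-256, so that
# resized or re-encoded copies still match; this shows only where
# the check sits in the upload pipeline.
import hashlib

# Hypothetical database of hashes of known illegal images.
KNOWN_BAD_HASHES = {
    "9f2c...",  # placeholder entries, not real hashes
}

def check_before_upload(image_bytes: bytes) -> bool:
    """Return True if the image may proceed to encryption and upload."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest not in KNOWN_BAD_HASHES

# Only images that pass the check are encrypted and uploaded;
# matches are queued for human review instead.
```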

The German government has been trying to implement this through the “Staatstrojaner” (state trojan), which would be smuggled onto every device in Germany. Luckily, the constitutional court told them where to go on that one. Now they want to do it again, but with a valid warrant.

I am okay with that. If they have found enough evidence to warrant “tapping” the suspect, then that could include planting a “tapping device” on their phone - in this case, a software shim to get encrypted messages off the device, probably a mixture of keyboard capture and screen grabbing.

If a company deems it worth the effort to check images being uploaded to its servers against a set of known child pornography images, isn’t that decision about effectiveness versus effort already made?

That’s the most extreme slippery-slope argument I’ve ever heard. (Just to clarify, I was referring to cavity searches in all public spaces.) :slight_smile:

Jerry

Apart from slippery slope arguments, how do you think this particular implementation of checking images against a hashed database before uploading to iCloud can be misused?

Jerry

That’s sort of my point.

Chiming in here: I don’t have anything to hide, but as an adult who has taken photos of a naked GF and a naked wife, the thought of an AI scanning my device for anything that looks like a known image worries me. My concern is that an explicit image I have of someone could be mistakenly flagged and that photo uploaded for someone at Apple to look at and decide whether it’s legal or illegal. In most of the photos you can see the face of the adult, so it would be clearly legal. But I don’t like the idea of them taking an image off my device to be reviewed because of a false positive. The fact that there is zero way to opt out of this, other than switching to Android where Google will scan it anyway, just screams Big Brother to me. I’m all for protecting children, as I work with a summer camp. But perfectly legal photos of consenting adults that could trigger a false positive frighten me. Just my two cents.
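For what it’s worth, that false-positive worry comes from the fact that these systems compare perceptual hashes by distance, not exact equality. A rough illustration using the imagehash library (Apple’s NeuralHash differs in detail but rests on the same near-match principle; the file names and threshold here are hypothetical):

```python
# Rough illustration of why perceptual matching can flag similar-looking
# images: hashes are compared by Hamming distance, not exact equality.
# Requires: pip install imagehash pillow
import imagehash
from PIL import Image

hash_a = imagehash.phash(Image.open("photo_a.jpg"))
hash_b = imagehash.phash(Image.open("photo_b.jpg"))

# Subtracting two imagehash values yields the Hamming distance between them.
distance = hash_a - hash_b

THRESHOLD = 8  # hypothetical cut-off; real systems tune this carefully
if distance <= THRESHOLD:
    print("near-match: would be flagged for human review")
```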

I was glad to hear @Leo being harshly critical of Apple’s client-side CSAM scanning plan, but disappointed nobody on the show called it what it is: spyware.

Software built into the operating system of a computer that actively works against the interests of that computer’s owner is a bright line, and one I’m surprised to see Apple cross in this way. I hope this serves to push users toward options that make stronger privacy guarantees, such as de-Googled Android builds.

Honestly, the more I read about this, the less I care :slightly_smiling_face:


The PinePhone and related products seem to be coming along fine :smiley: