TWIT 835: A Straw Man Without Legs

Beep boop - this is a robot. A new show has been posted to TWiT…

What are your thoughts about today’s show? We’d love to hear from you!


@Leo Just in case you didn't see it in Discord, I'll restate my question here:

If there’s a meetup with Stacey pre-cruise, could people attend the meetup who can’t go on the cruise? I’d so love to go on the cruise, but I’m saving for the down payment on my next home, so the most I could do would be a quick trip to Seattle for the meetup.

Google has been using the same hashes since 2008. They scan your photos and emails. We haven't heard of any false-positive issues from them.

Great, let's criticize them as well. False positives are only one of the potential problems; in my opinion, they're the least problematic aspect of this.


For the government to change the hash database, Apple, Google, Microsoft, Adobe, Facebook, and many other companies would all have to agree. They all use the same database.

I hear what you're saying, but it's important to note that most people don't live in the United States. For many people, "the government" is one that outlaws anti-government sentiment and homosexuality. Such a government can create its own hash database and legally demand that Apple use it.


They can already do that to Google, Microsoft, etc. So why would this be different?

Like I said, let’s criticize everyone who does this, including all of the companies you mentioned.


The other thing is, if you don't store photos on those services, they won't be checked. Apple, by contrast, is moving the check onto the local device.


I don't see any reason to criticize them. When you choose to use someone else's computers to store your stuff, you come under that company's TOS and the laws they have to abide by. Apple only does the check if you upload to iCloud.
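To make the architectural difference concrete, here's a minimal sketch in Python of where the check happens in each model. The `perceptual_hash` stub is a toy stand-in (real systems use PhotoDNA or NeuralHash, which survive resizing and re-encoding, and Apple's actual protocol hides the match result from the device); every name here is hypothetical:

```python
import hashlib

def perceptual_hash(image_bytes: bytes) -> str:
    """Toy stand-in for a perceptual hash; a real one (PhotoDNA,
    NeuralHash) tolerates resizing and re-encoding, this does not."""
    return hashlib.sha256(image_bytes).hexdigest()

# Hash list supplied by NCMEC and similar bodies, shared across vendors.
known_hashes: set[str] = set()

def server_side_scan(uploaded_image: bytes) -> bool:
    """Google/Microsoft/Facebook model: the provider hashes each photo
    after it has already arrived on their servers."""
    return perceptual_hash(uploaded_image) in known_hashes

def on_device_scan(image: bytes, icloud_photos_enabled: bool) -> bool:
    """Apple's model: the phone hashes the photo before upload, and only
    photos queued for iCloud Photos are ever checked."""
    if not icloud_photos_enabled:
        return False  # no iCloud upload, no check at all
    return perceptual_hash(image) in known_hashes
```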

Oh we absolutely wouldn’t limit it in any way. But it’s going to be up to Stacey whether we do it. I’ll ask again when we get closer.


Why would a government do that, though? To be of value, the hash database has to be built from a restricted set of image files. This is where the show's discussion fell down the slippery slope of conflating Apple's neural hash with image recognition. Using an example from the show, let's say a government compiles a database of hashed images of all the homosexuals in that country. To serve the government's nefarious purposes, the database given to Apple would need to either contain hashes of every picture ever taken of each person, or there would need to be a niche market for trading images of each person from a small set. Neural hash can't detect any arbitrary picture of a homosexual person, because it isn't image (or facial) recognition. It can only detect whether an image on someone's phone was previously shared and hashed into the database, and neither of those conditions exists in this example.
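The distinction the show blurred fits in a few lines: hash matching is a set-membership test against specific, previously catalogued files, not a classifier. A toy illustration, using the same hypothetical `perceptual_hash` stand-in as above:

```python
import hashlib

def perceptual_hash(image_bytes: bytes) -> str:
    return hashlib.sha256(image_bytes).hexdigest()  # toy stand-in

# The database holds hashes only of images already catalogued,
# i.e. copies known to have circulated.
database = {perceptual_hash(b"previously-shared-image")}

# A brand-new photo of the same subject is different bytes, hence a
# different hash. It cannot match, because nothing here recognizes
# faces or objects; only re-shared copies of catalogued files match.
fresh_photo = b"newly-taken-image-of-same-person"
assert perceptual_hash(fresh_photo) not in database
assert perceptual_hash(b"previously-shared-image") in database
```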

Sadly, child porn images fall into the latter category, because the images are the equivalent of trading cards. And since the act of taking the images is itself illegal, the same images get shared over and over. The numbers grow over time, but not at the explosive rate of selfies or other photos.

For an in-depth look at how one social media company has become a haven for pedophiles, listen to Jack Rhysider's Darknet Diaries episode 93: "Kik".

Jerry

If the hashes are used by everyone, let's please stop giving Apple credit for inventing anything new. If they ARE novel, then Apple needs to explain how their process differs from what the other companies are doing, and why that difference requires it to be done on the device.

A valid point. Two things come to mind:

  1. Pornographic images that are widely distributed. Not child pornography but pornography nonetheless.
  2. Widely distributed images of protests (Hong Kong, for example).

I think this "think of the children" argument is used by governments to justify nearly any privacy invasion they come up with. They want you to believe pedophiles are so common that using a word like "haven" makes sense. If there were a serious pedophile problem, there would be arrests daily, filling the news with stories. While a certain amount of effort needs to be expended on trying to eliminate them, I don't think it's any more likely to succeed than the efforts to control illicit drugs have. And if there is a market for new content, then scanning for old content certainly won't catch anyone making new content. I suspect the real culprits creating the new content that becomes available to the "low lifes" are organized-crime types, who can spend the money to have cops look the other way.

tl;dr I think we need to be skeptical that these efforts are anything more than "security theatre," making people feel good because they believe evil is "contained," when in all likelihood the amount of evil is greatly exaggerated.

My guess is that the novel part is this voucher thing that gets uploaded with the photo.
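For what it's worth, Apple's technical summary describes the voucher as an encrypted payload (the NeuralHash plus a low-resolution "visual derivative") that the server can only open after a threshold number of matches, using private set intersection and threshold secret sharing. Here's a heavily simplified sketch, with a naive counter standing in for the real cryptography; every name below is mine, not Apple's API:

```python
from dataclasses import dataclass

REVIEW_THRESHOLD = 30  # Apple cited roughly 30 matches before human review

@dataclass
class SafetyVoucher:
    encrypted_payload: bytes  # hash + low-res "visual derivative"
    is_match: bool  # simplification: under the real PSI protocol,
                    # neither the device nor the server sees this bit
                    # for any single photo

def account_flagged_for_review(vouchers: list[SafetyVoucher]) -> bool:
    """Below the threshold the real server learns nothing at all;
    counting matches here just illustrates the thresholding idea."""
    matches = sum(1 for v in vouchers if v.is_match)
    return matches >= REVIEW_THRESHOLD
```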

I just wonder how, now that Apple has shown this capability, they can legitimately say they won't let governments have access to the system, or to a system like it. Especially China: Apple would have no choice because they're so beholden to China. To be fair, I don't doubt that a system like this already exists, since iCloud data must be stored in China with China holding the keys.

#1: In the many countries where distributing pornographic images is outlawed, the government would need to build and maintain its own database of forbidden image files, which could become a Sisyphean task.
#2: For what purpose would the government do this? To identify co-conspirators or pro-democracy citizens who upload any of these images to iCloud? (I didn't see where Apple is applying the image hash scan to iMessage; the Parental Controls feature they also announced is different.) So, yes…this is possible, but improbable given the other invasive measures a government can already take, like tracking the metadata of phone calls and text messages to build a relationship graph.

I suppose the slippery slope argument is this: authoritarian governments around the world will pressure Apple into expanding the image hash check into other features on the phone: selecting images for any app, sending images via iMessage, or storing images. And then Apple will somehow cave in to this pressure despite taking such cautious measures with this one use case: uploading images to iCloud servers. If that happens, then I agree that Apple would deserve all of the hate and condemnation being heaped upon it now.

The measure they're taking is flawed if the true intent is to identify people sharing known images of child pornography:

  • The phone's owner can turn off "store in iCloud".
  • The "security token" likely isn't updated after the image is uploaded, so images that haven't yet been catalogued in the database will be tagged "clean".

The inflamed rhetoric is overblown imo for the small step Apple is taking.

Jerry

The safety voucher is flawed, since the comparison is made against the database of "known" CSAM images: unknown images will pass this check and be uploaded. But since the risk of being caught is still high, people will learn to turn off "upload to iCloud".
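If the assumption above holds that a voucher is never re-evaluated after upload, the gap is easy to see: the match is computed once, against whatever version of the database the device has at upload time. A toy timeline:

```python
db_at_upload_time: set[str] = set()  # image not yet catalogued
photo_hash = "abc123"

# The voucher is sealed at upload; this result is never revisited.
voucher_marked_match = photo_hash in db_at_upload_time  # False -> "clean"

# Later the image is added to the database...
db_after_update = db_at_upload_time | {photo_hash}
# ...but nothing re-checks the already-uploaded copy against the new
# database, so on the server it stays marked "clean".
```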

I wonder if Google Photos on the iPhone does such a check. Google's page on CSAM only mentions YouTube.

In your opinion, how many children being forced into sexual acts every day would it take for the problem to no longer be "exaggerated"? Were the 82,000 claims reported against the Boy Scouts too much exaggeration?

Apple's small step of checking images being uploaded to iCloud isn't really an attempt to contain this particular form of evil. There are other things they could have done (e.g., scan every image on the phone, or any image selected by an app), and those would obviously have been an invasion of people's privacy, worthy of the outrage and hate being voiced. But they didn't take those steps. Will it eliminate the evil of child pornography and pedophilia? No…nor is Apple claiming it will.

Jerry