TWIT 889: The Thin Green Line

Beep boop - this is a robot. A new show has been posted to TWiT…

What are your thoughts about today’s show? We’d love to hear from you!

The piece about the photographing of the toddler got me thinking.

We had the baptism of our granddaughter a couple of months back, followed by a garden party. I was photographing and filming the event and shared the photos afterwards. One of the nieces played in the pool naked the whole afternoon. It never even occurred to me, until this piece, that some of those photos could possibly have gotten me into hot water, if providers are so sensitive about natural nudity. There are several shots where she and her brothers are blowing bubbles and she is facing the camera; out of modesty, I cropped most of the shots as best I could before sending them on to the family. Luckily, I took them on my Sony Alpha in RAW, and only the cropped images have been uploaded to iCloud in JPEG format, but society is becoming ever more crazy about such things.

Where is the line drawn between kids playing naked or taking a bath and child pornography? It is quite natural here for toddlers to run around naked on the beach, lakeside, poolside, etc., let alone on the FKK beaches, where everybody is naked (nudist beaches; FKK = free body culture), although you generally aren't allowed to photograph on those beaches.

Do we have to start worrying about losing our accounts just because a naked toddler walks in front of us on the beach while we are photographing the sunset? By the sounds of the report, the police would find no offence had been committed, but our electronic history and presence could well be gone forever.


The last 15 minutes or so of the Club TWiT version seem to be messed up: there is the beginning of an ad, then it skips to something random, then jumps into the house ad for the other shows. Not quite sure what happened there. Maybe the show ended late and the editor fell asleep on the console. @Leo, are you overworking them again? :wink::joy:


Ugh. Thanks David. I’ll check. Sorry about that.


No problem, had me confused for a few seconds.


Indeed. I have no problem with nudity at all, but I really do have a problem with exploitation. I had this thought… what if some agency employed evolving AI models (like DALL-E) to overwhelmingly flood the Internet with "garbage quality" faked images? (Assuming it's possible, it would need a source of images that didn't involve the original exploitation.) If the only thing that perverts can find is "imaginary", it might significantly hinder the "market" for anything made real, as it would be impossible to stand out in a sea of fakes…?


They did screw up. We’re uploading a new edit now. Thanks for the heads up.


The problem is, as far as I understand it, that these sorts of images are not on the open internet to be searched for; they are shared privately between "like-minded" individuals, on dark-web forums, etc., so flooding the normal Internet with "garbage quality" fake images wouldn't help.

The last big group that got taken down was run out of an allotment shed. The son of the owner had a rack server in there with hundreds of thousands of images and videos of abuse; he and his friends even allegedly filmed scenes in the allotment shed (more of a small house), and the images were shared over a dark-web forum.

I would think that, with CSAM filters on big-name cloud services and things like the Photos app on an iPhone, you are only going to catch innocent people photographing their own kids at play or for medical reasons, plus the dumbest of dumb abusers, the ones who forget to turn off synchronisation on their mobile devices and keep their material on those devices instead of on a separate burner device that isn't connected to any sort of cloud service…
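For context, the filters mentioned above typically work by matching uploads against databases of fingerprints of already-known abuse imagery. Here is a toy sketch of that idea, assuming a simplified exact-match scheme with made-up example bytes; real systems such as Microsoft's PhotoDNA use perceptual hashes that survive resizing and re-encoding, and Google additionally uses ML classifiers for previously unseen material:

```python
import hashlib

# Hypothetical stand-in for the hash databases (e.g. NCMEC-sourced
# lists) that real services match against. The "flagged" bytes here
# are invented purely for illustration.
KNOWN_HASHES = {hashlib.sha256(b"known flagged bytes").hexdigest()}

def file_hash(data: bytes) -> str:
    """Exact fingerprint of an upload's bytes (toy version only:
    any re-encode or crop changes a cryptographic hash entirely)."""
    return hashlib.sha256(data).hexdigest()

def scan_upload(data: bytes) -> bool:
    """True if the upload matches a known hash, i.e. would be flagged."""
    return file_hash(data) in KNOWN_HASHES

print(scan_upload(b"known flagged bytes"))   # True  - matches the list
print(scan_upload(b"family holiday photo"))  # False - unknown content
```

This is also why such filters mostly catch already-catalogued material synced to a cloud service, which fits the point above about who actually gets swept up.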

The main takeaway for me from that CSAM story was that Google's own process failed:

‘When CSAM is identified in a user’s Google Account, we send a CyberTipline report to NCMEC and may disable the account. Users are notified of the account termination and are given the opportunity to appeal.’

Doctors shouldn't request images like this anyway. The website of our medical centre, where you can send a message to a doctor and attach photos, explicitly says not to send any explicit images. I assume this is why.