Hacker News

superkuh today at 1:18 PM

They're both underdefining what "intimate images" means and using the term "images" instead of "photos". This means they want it to apply to anything that can be represented visually, even if it has nothing to do with anything that happened in reality. Which means they don't care about actual harms. The way they're using the word "harm" seems to be more in line with the word "offend". So now in the UK, if an offensive image (like a painting) is posted on a website (or over another internet protocol), it will be "treated with the same severity as child sexual abuse and terrorism content". That's wild. And dangerous. This policy will do far more damage than any painting or other non-photo image could.


Replies

noobermin today at 1:59 PM

I had to google a bit, but this Guardian article[1] goes into a lot more detail than the Register piece here. When I first read the Register piece I thought this sounded too onerous and ill-defined, especially with censorship on the rise in Europe recently, but the Guardian piece made me side more with this particular policy. It doesn't sound as broad as the Register piece makes it out to be: it's specifically aimed at revenge porn and non-consensually generated deepfake porn, not any "intimate image", which I agree would be far too broad. Of all governments, I'd suspect the current UK government is among the most likely to one day expand these powers to cover speech they don't like, or general pornography, etc., but according to the Guardian article this specific policy isn't that broad yet. I think the Register piece is using "intimate image" loosely, whereas the intent of the policy is more defined and specific.

[1] https://www.theguardian.com/society/2026/feb/18/tech-firms-m...

iMark today at 1:33 PM

I agree that laws like this need to be defined very carefully, but I think "images" is the appropriate term rather than "photos". LLMs make it near-trivially easy to render a photo in countless styles, after all, such as paintings or sketches.

SamoyedFurFluff today at 1:58 PM

If someone produced inappropriate images identifiable as a specific child victim, who obviously cannot consent to having inappropriate images generated of their likeness, then "images" versus "photos" is a distinction without a difference.

femiagbabiaka today at 2:11 PM

Blame xAI. It has to be worded in this way to capture the behavior they allowed to persist.

vr46 today at 1:38 PM

What bit of "intimate images shared without a victim's consent" is lacking context in the article?

actionfromafar today at 1:37 PM

I hate to appear to defend this, but generative AI has sort of collapsed the distinction between a photo and an image. I could generate an image from a photo that told the same story, then delete the photo, and now everything is peachy fine? That could have been the motivation for "images".

Though I wonder whether existing frameworks around slander and libel couldn't be made to address the brave new world of AI-augmented abuse.
