Hacker News

A simpler way to remove explicit images from Search

36 points by gnabgib today at 4:15 AM · 24 comments

Comments

dannyw today at 6:31 AM

Looks like a nice and well designed improvement that will help people.

I can see this is related to the sad and ongoing ‘purification’ of the internet, but still, not going to get upset over better UX for taking down deepfakes or non-consensual explicit images which do hurt people.

xeyownt today at 6:32 AM

So you can pinpoint for Google which images are of high (damaging) value, and Google will show you more of them.

What could go wrong?

etchalon today at 6:28 AM

"Please don't regulate us" step 6,438.

vasco today at 6:04 AM

I don't see how the religious groups that forced card payment processors to ban Pornhub et al. aren't going to abuse this by mass-reporting any nude picture they find as their own.

mlindner today at 6:05 AM

Google practically never shows explicit images to anyone anymore anyway. Even Bing doesn't. I feel like we've returned to a more prudish society, at least on the mainstream internet.

guessmyname today at 5:34 AM

Why is Google indexing these harmful images in the first place?

Microsoft, Google, Facebook, and other large tech companies have had image recognition models capable of detecting this kind of content at scale for years, long before large language models became popular. There’s really no excuse for hosting or indexing these images as publicly accessible assets when they clearly have the technical ability to identify and exclude explicit content automatically.

Instead of putting the burden on victims to report these images one by one, companies should be proactively preventing this material from appearing in search results at all. If the technology exists, and it clearly does, the default approach should be prevention, not reactive cleanup.
