Hacker News

cbeach, yesterday at 12:22 PM (4 replies)

> when notified, doing nothing about it

When notified, he immediately:

  * "implemented technological measures to prevent the Grok account from allowing the editing of images of real people in revealing clothing" - https://www.bbc.co.uk/news/articles/ce8gz8g2qnlo 

  * locked image generation down to paid accounts only (i.e. individuals who can be identified via their payment details).

Have the other AI companies followed suit? They were also allowing users to undress real people, but it seems the media is ignoring that and focussing their ire only on Musk's companies...

Replies

afavour, yesterday at 12:29 PM

You and I must have different definitions of the word “immediately”. The article you posted is from January 15th. Here is a story from January 2nd:

https://www.bbc.com/news/articles/c98p1r4e6m8o

> Have the other AI companies followed suit? They were also allowing users to undress real people

No they weren’t? There were numerous examples of people feeding the same prompts to different AIs and having their requests refused. Not to mention, X was also publicly distributing that material, something other AI companies were not doing. Which is an entirely different legal liability.

freejazz, yesterday at 6:55 PM

Kiddie porn but only for the paying accounts!

derrida, yesterday at 12:29 PM

The other LLMs probably don't have the training data in the first place.

techblueberry, yesterday at 12:29 PM

[flagged]